Course 374 - Cryptography and Information Theory

Lecturer: Michael Purser and Timothy Murphy

Date: 1996-97

Groups: JS and SS Mathematics

Prerequisites:

Duration: 21 weeks

Lectures per week: 3

Assessment: Cryptography will account for 60% of the overall mark; Information Theory for 40%. Cryptography will be marked entirely by Exam. There may be a Project element in Information Theory.

Examinations: One 3-hour examination

This course is in 2 independent parts: Cryptography, given by Dr Purser; and Information Theory, given by Dr Murphy.

Dr Purser's part of the course will be marked by Examination; Dr Murphy's part will also be marked by Examination, with a possible contribution by Project.

Dr Purser's part of the course will start in November.

Cryptography

This course discusses cryptography with particular reference to computer networks.

Topics (not necessarily in order of appearance):

  1. Introduction: Confidentiality and Authenticity
  2. Shannon's Theory
  3. Block Encryptors and Stream Encryptors
  4. ECB, CBC, CFB modes
  5. Integrity checks
  6. MDCs and MACs
  7. Identification, Authentication and Authorisation
  8. Access control procedures: 1-way, 2-way
  9. Public (Asymmetric) Key Cryptology vs Private (Symmetric) Key Cryptology
  10. Non-repudiation
  11. 3-way access control
  12. Digital signatures
  13. Diffie-Hellman Key Exchange
  14. Some algorithms
    1. Vigenère, Vernam
    2. Enigma
    3. DES
    4. A stream encryptor based on maximum length sequences
    5. RSA
    6. IDEA
    7. Fiat-Shamir
    8. Hashing algorithms
  15. Relevant Mathematics
    1. Generating Primes
    2. Testing Primes
    3. Factorising
    4. Random Numbers
    5. Discrete Logarithms
  16. Cryptanalysis
    1. Statistical Analyses
    2. Brute force
    3. Differential Cryptanalysis
  17. Key Management
    1. Key Distribution
    2. Certification
    3. Sharing Keys
    4. PINs
    5. Chipcards
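To give a flavour of one of the topics above, here is a toy sketch of the Diffie-Hellman key exchange (topic 13). The parameter values (p = 23, g = 5, and the secret exponents) are illustrative assumptions only; real deployments use primes of many hundreds of digits.

```python
# Toy Diffie-Hellman key exchange over a small prime field.
# All numbers here are deliberately tiny for illustration.
p = 23        # public prime modulus (toy size)
g = 5         # public generator

a = 6         # Alice's secret exponent
b = 15        # Bob's secret exponent

A = pow(g, a, p)   # Alice publishes A = g^a mod p
B = pow(g, b, p)   # Bob publishes B = g^b mod p

# Each party raises the other's public value to its own secret
# exponent; both arrive at the same shared key g^(ab) mod p.
shared_alice = pow(B, a, p)
shared_bob = pow(A, b, p)
assert shared_alice == shared_bob
print(shared_alice)   # prints 2
```

An eavesdropper sees p, g, A and B, but recovering a or b from them is the discrete logarithm problem (topic 15.5), which is believed to be hard for large p.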

Information Theory

This course will cover Algorithmic Information Theory, a subject which marries Shannon's original Statistical Information Theory to the concepts of computability and Turing machines.

According to Algorithmic Information Theory, the informational content, or entropy, of a string may be measured by the minimum length to which the string can be compressed.
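The true minimum description length (Kolmogorov complexity) is uncomputable, but an ordinary compressor gives a crude, computable upper bound on it. The sketch below is only an illustration of the idea, not part of the course notes; the helper name approx_entropy and the use of zlib as the compressor are assumptions for the example.

```python
import os
import zlib

def approx_entropy(s: bytes) -> int:
    # Compressed length as a rough stand-in for the (uncomputable)
    # minimum length to which the string can be compressed.
    return len(zlib.compress(s, 9))

regular = b"a" * 1000       # highly regular: compresses to a few bytes
random_ = os.urandom(1000)  # random bytes: essentially incompressible

# The regular string has far lower approximate entropy than the
# random one, even though both are 1000 bytes long.
print(approx_entropy(regular), approx_entropy(random_))
```

A string of 1000 repeated characters compresses to a handful of bytes, while 1000 random bytes compress to slightly more than 1000: the compressed lengths separate low-entropy from high-entropy strings exactly as the definition above suggests.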

Notes for the course are available on the Maths Unix System in "/usr/local/pub/AlgorithmicInformationTheory". Read "README" before printing them out.