Module MA346H: Algorithmic Entropy
- Credit weighting (ECTS)
- 5 credits
- Semester/term taught
- Hilary term 2012-13
- Contact Hours
- 10 weeks, with 3 lectures (including tutorials) per week
- Lecturer
- Prof. Timothy Murphy
- Learning Outcomes
- Module Content
The second law of thermodynamics states that entropy is increasing - or more precisely, that the entropy of a closed system cannot decrease.
This determines the direction of time, and distinguishes the past from the future.
But what exactly is entropy? It is often described as a measure of disorder: a well-ordered system has low entropy; a disordered system has high entropy.
The concept of entropy has passed through four phases:
- Thermodynamics 1820--1860
- Carnot, Clausius (who coined the word `entropy') and others studied the amount of work that could be extracted from a given quantity of heat - a very important economic issue at the time. Their studies showed that there is a theoretical maximum to the proportion of heat energy that can be converted into work; the precise bound, Carnot's formula, is recalled after this list.
- Statistical Mechanics 1860--1900
- Boltzmann showed that the entropy of a gas could be described in terms of the motion of its constituent molecules.
- Information Theory 1950--1960
- Claude Shannon studied the electronic communication of messages --- again, a very important economic issue at the time --- and established that there is a limit to the extent to which a given message can be compressed, rediscovering Boltzmann's formula \[ H = - \sum_i p_i \log p_i, \] but now expressing the entropy $H$ of a random variable; a short numerical sketch of this formula follows the list.
- Algorithmic Information Theory 1960--1970
- Andrei Kolmogorov and Gregory Chaitin independently had the idea of defining entropy in terms of compressibility, basing the latter concept on Turing's notion of an ideal computer, and his discovery that one can construct a universal Turing machine $U$ able to imitate any other Turing machine $T$. Thus the entropy $H(s)$ of a string $s$ --- no longer regarded as part of a statistical ensemble as in Shannon's theory --- is defined as the length of the shortest input to $U$ which will output the string $s$.
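The theoretical maximum mentioned under Thermodynamics above is Carnot's bound, quoted here for orientation (it is standard thermodynamics rather than part of the syllabus): an engine working between a hot reservoir at absolute temperature $T_h$ and a cold one at $T_c$ can convert at most \[ \eta_{\max} = 1 - \frac{T_c}{T_h} \] of the heat it draws into work.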
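As a numerical sketch of Shannon's formula, the following Python fragment computes $H$ in bits (taking the logarithm to base 2); the function name and the example distributions are illustrative choices, not taken from the course notes.

```python
import math

def shannon_entropy(probs):
    """Shannon entropy H = -sum p*log2(p) of a distribution, in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

print(shannon_entropy([0.5, 0.5]))   # fair coin: 1.0 bit per symbol
print(shannon_entropy([0.9, 0.1]))   # biased coin: about 0.47 bits
```

A fair coin is maximally unpredictable, so it attains the maximum entropy of one bit per symbol; the biased coin is more predictable, which is exactly why its messages can be compressed.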
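In symbols, the definition just given reads \[ H(s) = \min\{\, |p| : U(p) = s \,\}, \] where $p$ ranges over the input strings (programs) fed to the universal machine $U$ and $|p|$ denotes the length of $p$.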
This module will study the Kolmogorov/Chaitin theory of algorithmic entropy.
The module will be in four parts:
- The definition of a Turing machine, and the `game' of constructing Turing machines to carry out simple algorithms, e.g. adding or multiplying integers (a minimal simulator sketch appears after this list).
- The proof that there exists a universal Turing machine.
- The definition and basic properties of the entropy $H(s)$ of a string $s$.
- The application of algorithmic information theory to prove Gödel's Unprovability Theorem, that in any non-trivial axiomatic system there will be propositions that can neither be proved nor disproved.
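To give a flavour of the `game' in the first part, here is a minimal Turing machine simulator in Python together with a small machine that increments a binary integer. The tape encoding, the state names, and the transition format are illustrative assumptions, not the formalism of the course notes.

```python
# A minimal Turing machine simulator (an illustrative sketch): the tape
# is a dict from position to symbol, '_' stands for blank, and the
# transition table maps (state, read) -> (new state, write, head move).
def run(transitions, tape_str, start, blank="_"):
    tape = dict(enumerate(tape_str))
    state, pos = start, 0
    while state != "halt":
        state, write, move = transitions[(state, tape.get(pos, blank))]
        tape[pos] = write
        pos += 1 if move == "R" else -1
    cells = range(min(tape), max(tape) + 1)
    return "".join(tape.get(i, blank) for i in cells).strip(blank)

# Increment a binary integer (most significant bit first): scan right
# to the end of the input, then propagate a carry back to the left.
INCREMENT = {
    ("scan",  "0"): ("scan",  "0", "R"),
    ("scan",  "1"): ("scan",  "1", "R"),
    ("scan",  "_"): ("carry", "_", "L"),
    ("carry", "1"): ("carry", "0", "L"),  # 1 plus carry: write 0, carry on
    ("carry", "0"): ("halt",  "1", "R"),  # absorb the carry and stop
    ("carry", "_"): ("halt",  "1", "R"),  # ran off the left end: new leading 1
}

print(run(INCREMENT, "1011", "scan"))  # prints 1100, i.e. 11 + 1 = 12
```

Writing such tables by hand for addition or multiplication is precisely the `game' referred to above, and it quickly shows why a universal machine, which reads a table of this kind as part of its own input, is such a powerful idea.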
Notes for the course can be found at http://www.maths.tcd.ie/pub/Maths/Courseware/AlgorithmicEntropy/
- Module Prerequisite
- Assessment Detail
- This module will be examined in a 2-hour examination in Trinity term. Continuous assessment will contribute 20% to the final grade at the annual examination session.