There are numerous examples of how the interplay between information theory and statistical mechanics/thermodynamics has been fruitful. Jaynes's approach to statistical mechanics (now textbook knowledge) was heavily inspired by Shannon's information theory. Going in the other direction, the von Neumann entropy, extensively used as the canonical information measure in quantum information theory, was originally justified by von Neumann using thermodynamic arguments.
In quantum information theory we have developed a comparatively sophisticated understanding of entropy, at the conceptual and mathematical level as well as the physical one. In the spirit of the above, this understanding should allow us to better understand the behaviour of thermal systems, particularly at small scales.
One-shot statistical mechanics
There are occasions when the average (mean) quantities of a system fail to characterise important aspects of its behaviour. This matters, for example, for systems with an activation energy threshold and an uneven spread of energies. If most of the energy values lie below the threshold but a single value is extremely high, the mean can still lie above the threshold. Looking only at the mean, one might then incorrectly conclude that the system is likely to meet the activation energy, when in fact most of the time it does not. (See diagram to the left.)
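The point can be made numerically. A minimal sketch (our own illustrative numbers, not from any specific system): a skewed energy distribution whose mean exceeds a hypothetical activation threshold even though almost no individual event does.

```python
# Illustrative sketch: 99 low-energy events plus one extreme outlier.
energies = [1.0] * 99 + [500.0]
threshold = 4.0  # hypothetical activation energy

mean_energy = sum(energies) / len(energies)
fraction_above = sum(e >= threshold for e in energies) / len(energies)

print(mean_energy)     # 5.99 -- the mean sits above the threshold
print(fraction_above)  # 0.01 -- yet only 1% of events actually exceed it
```

The mean suggests activation is typical, while in 99% of single shots it never occurs; this is exactly the gap that a one-shot analysis is designed to capture.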
Experimental control over individual atoms is now so precise that it is quite realistic to envisage engines composed of only a few atoms. At this scale quantum effects are very important, and we study what such engines could do. The standard theoretical methods for analysing heat engines are not very useful at this level, so we are instead developing an alternative approach called one-shot statistical mechanics.
Free energy and entropy
A key challenge of our times is to identify sources of free energy and to use these efficiently. Although quantum information science has been predominantly focused on information processing, we believe that the kind of understanding of entropy developed in this field can also help solve problems related to energy transfer and generation. The link between entropy and energy centres on the notion of free energy, the energy which is available for doing work.
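As a reminder, the textbook relation behind this (standard thermodynamics, not specific to the results discussed here): for a system with internal energy $U$ and entropy $S$ in contact with a heat bath at temperature $T$,

```latex
F = U - TS, \qquad W_{\text{extracted}} \le -\Delta F,
```

so the decrease in free energy bounds the work that can be extracted from the system.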
In the two papers above, we show that the free energy should indeed also be quantified using the smooth entropy, at least where the physical models of work extraction we used apply. It really does matter which S one uses: the version of S we propose can deviate arbitrarily from the von Neumann entropy in general. We moreover considered a strange quantum effect whereby entropy (of the conditional type) can be negative, and gave this an interpretation in terms of free energy: it corresponds to settings where one gets work out by reducing the entropy of a system (resetting/erasing it in the Landauer sense). These results do not contradict existing statistical mechanics but rather extend it to new regimes.
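The gap between entropy measures is easy to see already classically. A minimal sketch (our own illustrative construction, not from the papers, and using the plain min-entropy as a stand-in for its smoothed version): compare the von Neumann (here Shannon) entropy with the min-entropy H_min = -log2(max_i p_i) for a distribution with one likely outcome and the rest spread over d - 1 others.

```python
import math

def shannon_entropy(p):
    """Shannon entropy in bits (the von Neumann entropy of a diagonal state)."""
    return -sum(x * math.log2(x) for x in p if x > 0)

def min_entropy(p):
    """Min-entropy in bits: determined only by the largest probability."""
    return -math.log2(max(p))

# One outcome with probability 1/2, the remaining 1/2 spread uniformly.
for d in (2, 1024, 2**20):
    p = [0.5] + [0.5 / (d - 1)] * (d - 1)
    # Shannon entropy grows roughly like (1/2) * log2(d),
    # while the min-entropy stays pinned at 1 bit.
    print(d, shannon_entropy(p), min_entropy(p))
```

As d grows, the two quantities separate without bound, which is why single-shot statements cannot in general be phrased using the von Neumann entropy.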
The quantum versions of entropy are thus important to understand and use when talking about free energy. Modern energy technology is, after all, increasingly becoming quantum technology.
Energy-efficient bit reset in finite time
Landauer's principle states that resetting a bit or qubit in the presence of a heat bath at temperature T costs at least kTln2 of work, which is ultimately dissipated as heat. It represents the fundamental limit to heat generation in irreversible computers, a limit extrapolated to be reached around 2035. The validity of this result is of great importance to the one-shot approach to statistical mechanics, where it forms a key assumption in derivations such as the free-energy results above. However, it is usually derived under the assumption that the reset protocol is infinitely slow, which, whilst mathematically convenient, is unphysical.
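For a sense of scale, the Landauer bound is tiny in everyday units. A quick numerical sketch, taking T = 300 K as a stand-in for room temperature and the (since 2019, exact) SI value of the Boltzmann constant:

```python
import math

k_B = 1.380649e-23   # Boltzmann constant, J/K (exact SI value)
T = 300.0            # K, roughly room temperature

# Minimum work cost of resetting one bit in contact with a bath at T.
landauer_bound = k_B * T * math.log(2)
print(landauer_bound)  # approx. 2.87e-21 J per bit reset
```

Per bit this is negligible, but at the reset rates of modern processors it sets a hard floor on heat generation, which is why the extrapolated date at which chips approach it matters.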
In our recent research, we show that one can reset a qubit in finite time at a guaranteed work cost exponentially close to kTln2, not just on average but in every single shot. The optimality statements in the literature on one-shot statistical mechanics are accordingly also relevant for physical experiments, which take place in finite time.