Jason Palmer reports for the BBC:
The idea of entropy is fundamentally an intuitive one – that the Universe tends in general to a more disordered state.
The classic example is a dropped cup: it will smash into pieces, but those pieces will never spontaneously recombine back into a cup. Analogously, a hot cup of coffee will always cool down if left – it will never draw warmth from a room to heat back up.
But the idea of “causal entropy” goes further, suggesting that a given physical system not only maximises the entropy within its current conditions, but that it reaches a state that will allow it more entropy – in a real sense, more options – in the future.
Alex Wissner-Gross of Harvard University and the Massachusetts Institute of Technology in the US, and Cameron Freer of the University of Hawaii at Manoa, have now put together a mathematical model that ties this causal entropy idea – evident in a range of recent studies – into a single framework.
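At the heart of their framework is what the authors call a "causal entropic force": a push towards macrostates from which the greatest variety of future paths remains open. As a sketch of the kind of formula involved (notation follows the published paper; here \(T_c\) is an effective "causal" temperature setting the strength of the force, and \(S_c\) is the entropy of possible future paths):

```latex
% Causal entropic force on a system at macrostate X_0,
% over a future time horizon tau (sketch, not a derivation):
F(X_0, \tau) = T_c \, \nabla_X S_c(X, \tau) \big|_{X = X_0}
% S_c(X, \tau) is the causal path entropy: the entropy of the
% distribution of all possible trajectories of duration tau
% that the system could follow starting from macrostate X.
```

Intuitively, the gradient points towards configurations that keep more future options available, which is the "more options in the future" idea described above.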