This episode breaks down the blog post "The First Law of Complexodynamics", which explores the relationship between complexity and entropy in physical systems. The author, Scott Aaronson, is prompted by a question posed by Sean Carroll at a conference: why does complexity seem to increase and then decrease over time, whereas entropy increases monotonically? Aaronson proposes a new measure of complexity, dubbed "complextropy", based on Kolmogorov complexity. Complextropy is defined as the size of the shortest computer program that can efficiently sample from a probability distribution with respect to which a target string is not efficiently compressible. Aaronson conjectures that this measure would explain the observed trend in complexity: low in the initial state of a system, high in intermediate states, and low again at late times. He suggests that this "First Law of Complexodynamics" could be tested empirically by simulating systems such as a coffee cup undergoing mixing. The post sparks a lively discussion in the comments section, where readers propose alternative measures of complexity and debate the nature of entropy and the validity of the proposed "First Law".
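The coffee-cup experiment described above can be sketched in code. Since true Kolmogorov complexity is uncomputable, a common practical stand-in is compressed file size; the sketch below (an illustration, not Aaronson's actual method) simulates cream mixing into coffee on a small grid and tracks two gzip-based proxies: the compressed size of the raw grid as an entropy proxy, and the compressed size of a coarse-grained version as a rough complextropy proxy. All parameter choices (grid size, step count, block size, quantization levels) are hypothetical.

```python
import random
import zlib

def run_mixing(n=32, steps=2000, seed=0):
    """Simulate cream (1) mixing into coffee (0) on an n x n grid
    via random adjacent swaps.

    Returns two lists sampled over time:
    - entropy proxy: gzip-compressed size of the raw grid
    - complextropy proxy: gzip-compressed size of a coarse-grained
      (block-averaged, quantized) view of the grid
    """
    rng = random.Random(seed)
    # Initial state: cream in the top half, coffee in the bottom half.
    grid = [[1 if r < n // 2 else 0 for c in range(n)] for r in range(n)]

    def compressed_size(g):
        data = bytes(v for row in g for v in row)
        return len(zlib.compress(data, 9))

    def coarse(g, b=4):
        # Average each b x b block, then quantize to 3 levels:
        # mostly coffee, mixed, mostly cream.
        m = n // b
        out = []
        for i in range(m):
            row = []
            for j in range(m):
                s = sum(g[i * b + di][j * b + dj]
                        for di in range(b) for dj in range(b))
                frac = s / (b * b)
                row.append(0 if frac < 1 / 3 else (1 if frac < 2 / 3 else 2))
            out.append(row)
        return out

    entropy_proxy, complextropy_proxy = [], []
    for t in range(steps + 1):
        if t % (steps // 10) == 0:
            entropy_proxy.append(compressed_size(grid))
            complextropy_proxy.append(compressed_size(coarse(grid)))
        # Mixing step: swap a random cell with a random neighbor
        # (periodic boundaries, for simplicity).
        r, c = rng.randrange(n), rng.randrange(n)
        dr, dc = rng.choice([(0, 1), (1, 0), (0, -1), (-1, 0)])
        r2, c2 = (r + dr) % n, (c + dc) % n
        grid[r][c], grid[r2][c2] = grid[r2][c2], grid[r][c]
    return entropy_proxy, complextropy_proxy

if __name__ == "__main__":
    fine, coarse_sizes = run_mixing()
    print("entropy proxy over time:     ", fine)
    print("complextropy proxy over time:", coarse_sizes)
```

The conjecture predicts the entropy proxy rises monotonically toward the fully mixed state, while the coarse-grained proxy should rise and then fall, since both the neatly separated start and the uniform gray end compress well but the intermediate tendrils do not.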
Audio (Spotify): https://open.spotify.com/episode/15LhxYwIsz3mgGotNmjz3P?si=hKyIqpwfQoeMg-VBWAzxsw
Blog post: https://scottaaronson.blog/?p=762