Abstract: We derive a generalization of the Second Law of Thermodynamics that uses Bayesian updates to explicitly incorporate the effects of a measurement of a system at some point in its evolution. By allowing an experimenter's knowledge to be updated by the measurement process, this formulation resolves a tension between the fact that the entropy of a statistical system can sometimes fluctuate downward and the information-theoretic idea that knowledge of a stochastically-evolving system degrades over time. The Bayesian Second Law can be written as $\Delta H(\rho_m, \rho) + \langle \mathcal{Q}\rangle_{F|m}\geq 0$, where $\Delta H(\rho_m, \rho)$ is the change in the cross entropy between the original phase-space probability distribution $\rho$ and the measurement-updated distribution $\rho_m$, and $\langle \mathcal{Q}\rangle_{F|m}$ is the expectation value of a generalized heat flow out of the system. We also derive refined versions of the Second Law that bound the entropy increase from below by a non-negative number, as well as Bayesian versions of the Jarzynski equality. We demonstrate the formalism using simple analytical and numerical examples.
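As a minimal numerical sketch of the cross entropy appearing in the inequality (not taken from the paper; the two-state distributions below are hypothetical numbers chosen only for illustration), one can compute $H(\rho_m, \rho) = -\sum_x \rho_m(x)\ln\rho(x)$ for a discrete system and check Gibbs' inequality, which guarantees the cross entropy is never smaller than the Shannon entropy of the updated distribution:

```python
import numpy as np

def cross_entropy(p, q):
    """Cross entropy H(p, q) = -sum_x p(x) ln q(x) for discrete distributions."""
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    return -np.sum(p * np.log(q))

# Toy two-state system (hypothetical values, for illustration only):
# rho   - the experimenter's original phase-space distribution
# rho_m - the distribution after a Bayesian update on a measurement outcome
rho = np.array([0.5, 0.5])
rho_m = np.array([0.9, 0.1])

H = cross_entropy(rho_m, rho)

# Gibbs' inequality: H(rho_m, rho) >= H(rho_m, rho_m), the Shannon entropy of rho_m.
assert H >= cross_entropy(rho_m, rho_m)
```

Here the Bayesian update makes $\rho_m$ sharper than $\rho$, and the cross entropy $H(\rho_m, \rho)$ tracks how the prior $\rho$ scores against the updated state of knowledge.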
