
We propose the entropy model of uncertainty (EMU), an integrative theoretical framework that applies the idea of entropy to the human information system to understand uncertainty-related anxiety. By placing the discussion of uncertainty management, a fundamental biological necessity, within the framework of information theory and self-organizing systems, our model helps to situate key psychological processes within a broader physical, conceptual, and evolutionary context. Entropy is a measure of the disorder of a system. In equations it is usually denoted by the letter S and has units of joules per kelvin (J K⁻¹), equivalently kg m² s⁻² K⁻¹. It is an extensive property of a thermodynamic system, which means its value changes with the amount of matter present. Four major tenets of EMU are proposed: (a) uncertainty poses a critical adaptive challenge for any organism, so individuals are motivated to keep it at a manageable level; (b) uncertainty emerges as a function of the conflict between competing perceptual and behavioral affordances; (c) adopting clear goals and belief structures helps to constrain the experience of uncertainty by reducing the spread of competing affordances; and (d) uncertainty is experienced subjectively as anxiety and is associated with activity in the anterior cingulate cortex and with heightened noradrenaline release.
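The information-theoretic sense of entropy invoked here can be made concrete with Shannon's formula. The sketch below is purely illustrative, not the authors' formal model: it treats a set of competing affordances as a probability distribution and shows that entropy is highest when the affordances conflict equally (tenet b) and drops when a clear goal concentrates probability on one option (tenet c). The specific probability values are invented for illustration.

```python
import math

def shannon_entropy(probs):
    """Shannon entropy in bits of a discrete probability distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Four equally plausible competing affordances: maximal conflict.
conflicted = [0.25, 0.25, 0.25, 0.25]

# A clear goal concentrates probability on a single affordance.
goal_directed = [0.85, 0.05, 0.05, 0.05]

print(shannon_entropy(conflicted))     # 2.0 bits, the maximum for 4 options
print(shannon_entropy(goal_directed))  # ~0.85 bits, much lower uncertainty
```

On this reading, adopting a goal reduces psychological entropy in exactly the way narrowing a distribution reduces Shannon entropy.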

The concept comes out of thermodynamics, which deals with the transfer of heat energy within a system. Entropy is defined as a quantitative measure of disorder or randomness in a system: the greater the disorder in an isolated system, the higher its entropy. Because entropy is a state function, the change in entropy between two states is the same regardless of whether the process connecting them is reversible or irreversible. Self-organizing systems engage in a continual dialogue with the environment and must adapt themselves to changing circumstances to keep internal entropy at a manageable level.
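These standard thermodynamic statements can be written compactly. Clausius's definition gives the entropy change for a reversible heat transfer, and because S is a state function the integral between an initial state i and a final state f depends only on the endpoints, not the path:

```latex
\[
dS = \frac{\delta Q_{\mathrm{rev}}}{T},
\qquad
\Delta S = S_f - S_i = \int_{i}^{f} \frac{\delta Q_{\mathrm{rev}}}{T}.
\]
Boltzmann's statistical definition makes the link to disorder explicit:
\[
S = k_{\mathrm{B}} \ln W,
\]
where $W$ is the number of microstates consistent with the macrostate and
$k_{\mathrm{B}}$ is Boltzmann's constant (units of J\,K$^{-1}$, matching those given above).
```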

Summary: "Entropy" is a short story by Thomas Pynchon. It is part of his collection Slow Learner and was originally published in the Kenyon Review in 1960, while Pynchon was still an undergraduate. In his introduction to the collection, Pynchon refers to "Entropy" as "the work of a beginning writer" (12). The title concept, derived from thermodynamics and information theory, describes the amount of uncertainty and disorder within a system.
