Entropy


Entropy is a scientific concept, as well as a measurable physical property, that is most commonly associated with a state of disorder, randomness, or uncertainty. The term and the concept are used in diverse fields, from classical thermodynamics, where it was first recognized, to the microscopic description of nature in statistical physics, and to the principles of information theory. It has found far-ranging applications in chemistry and physics, in biological systems and their relation to life, in cosmology, economics, sociology, weather science, climate change, and information systems including the transmission of information in telecommunication.

The thermodynamic concept was referred to by Scottish scientist and engineer Macquorn Rankine in 1850 with the names thermodynamic function and heat-potential. In 1865, German physicist Rudolf Clausius, one of the leading founders of the field of thermodynamics, defined it as the quotient of an infinitesimal amount of heat to the instantaneous temperature. He initially described it as transformation-content, in German Verwandlungsinhalt, and later coined the term entropy from a Greek word for transformation. Referring to microscopic constitution and structure, in 1862, Clausius interpreted the concept as meaning disgregation.
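In modern notation, Clausius's quotient definition is usually written in the standard textbook form (given here as an illustrative sketch, not a quotation from Clausius):

    dS = \frac{\delta Q_{\mathrm{rev}}}{T}

where \delta Q_{\mathrm{rev}} is an infinitesimal quantity of heat transferred reversibly and T is the absolute temperature at which the transfer takes place.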

A consequence of entropy is that certain processes are irreversible or impossible, aside from the requirement of not violating the conservation of energy, the latter being expressed in the first law of thermodynamics. Entropy is central to the second law of thermodynamics, which states that the entropy of isolated systems left to spontaneous evolution cannot decrease with time, as they always arrive at a state of thermodynamic equilibrium, where the entropy is highest.
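For an isolated system, the second law is commonly expressed as the inequality below (a standard formulation added for clarity, not taken from the passage above):

    \Delta S \geq 0

with equality holding only for reversible processes; spontaneous, irreversible processes strictly increase the entropy until thermodynamic equilibrium is reached.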

Austrian physicist Ludwig Boltzmann explained entropy as the measure of the number of possible microscopic arrangements or states of individual atoms and molecules of a system that comply with the macroscopic condition of the system. He thereby introduced the concept of statistical disorder and probability distributions into a new field of thermodynamics, called statistical mechanics, and found the link between the microscopic interactions, which fluctuate about an average configuration, and the macroscopically observable behavior, in the form of a simple logarithmic law, with a proportionality constant, the Boltzmann constant, that has become one of the defining universal constants for the modern International System of Units (SI).
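The simple logarithmic law mentioned here is conventionally quoted in the form engraved on Boltzmann's tombstone (standard notation, assumed rather than drawn from this text):

    S = k_{\mathrm{B}} \ln W

where W is the number of microscopic arrangements (microstates) consistent with the macroscopic state of the system and k_{\mathrm{B}} is the Boltzmann constant.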

In 1948, Bell Labs scientist Claude Shannon applied similar statistical concepts of measuring microscopic uncertainty and multiplicity to the problem of random losses of information in telecommunication signals. Upon John von Neumann's suggestion, Shannon named this entity of missing information, in analogous manner to its use in statistical mechanics, entropy, and gave birth to the field of information theory. This description has been identified as a universal definition of the concept of entropy.
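Shannon's measure of missing information for a discrete set of messages is conventionally written as (standard information-theoretic notation, shown here as an illustration):

    H = -\sum_{i} p_i \log p_i

where p_i is the probability of the i-th message; taking the logarithm to base 2 expresses the result in bits.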

Etymology


In 1865, Clausius named the concept of "the differential of a quantity which depends on the configuration of the system," entropy, after the Greek word for 'transformation'. He gave "transformational content" as a synonym, paralleling his "thermal and ergonal content" as the name of U, but preferring the term entropy as a close parallel of the word energy, as he found the concepts nearly "analogous in their physical significance." This term was formed by replacing the root of ἔργον ('ergon', 'work') by that of τροπή ('tropy', 'transformation').

In more detail, Clausius explained his choice of "entropy" as a name as follows:

I prefer going to the ancient languages for the names of important scientific quantities, so that they may mean the same thing in all living tongues. I propose, therefore, to call S the entropy of a body, after the Greek word "transformation". I have designedly coined the word entropy to be similar to energy, for these two quantities are so analogous in their physical significance, that an analogy of denominations seems to me helpful.

Leon Cooper added that in this way "he succeeded in coining a word that meant the same thing to everybody: nothing."