Entropy comes from the Greek word "ἐντροπία" ("transformation"). In thermodynamics, the branch of physics that studies the mechanical action of heat and the other forms of energy, entropy is the physical magnitude that measures the part of the energy stored in a system that cannot be used to perform work.
Measuring entropy makes it possible to establish the degree of order a system has in a given configuration compared with the order it has, or could have had, in another. The entropy difference can thus be determined for the formation of a system from its separate components, and also for any process that occurs in an already constituted system.
Entropy is a state function: an extensive physical magnitude whose value characterizes the equilibrium state of a macroscopic system. Because it depends only on the state and not on the path taken, the change in entropy distinguishes reversible processes from irreversible ones.
The entropy of formation of a chemical compound is the entropy difference in the process of forming the compound from its constituent elements; the greater the entropy of formation, the more favorable the formation of the compound.
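The entropy difference just described can be sketched numerically. The following Python snippet is an illustrative example, not part of the original text: the function name and the standard molar entropy values (approximate textbook figures for H2(g), O2(g), and liquid H2O) are assumptions introduced here for illustration.

```python
def entropy_of_formation(s_compound, element_entropies):
    """Entropy of formation: S of the compound minus the summed
    standard entropies of its constituent elements, weighted by
    their stoichiometric coefficients. All values in J/(mol*K)."""
    return s_compound - sum(n * s for n, s in element_entropies)

# H2(g) + 1/2 O2(g) -> H2O(l), with approximate standard entropies:
# S(H2O, l) ~ 69.9, S(H2, g) ~ 130.7, S(O2, g) ~ 205.2 J/(mol*K)
ds_f = entropy_of_formation(69.9, [(1.0, 130.7), (0.5, 205.2)])
print(round(ds_f, 1))  # negative: the liquid is more ordered than the gases
```

A negative result here reflects the loss of disorder when two gases condense into a liquid; by the rule stated above, a larger (more positive) entropy of formation would make the formation more favorable.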
In information theory, entropy is a measure of the uncertainty that exists before a set of possible messages of which only one will be received; equivalently, it is a measure of the information needed to reduce or eliminate that uncertainty.
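This notion of uncertainty can be made concrete with Shannon's formula, H = -Σ p·log2(p), measured in bits. A minimal sketch (the function name is chosen here for illustration):

```python
import math

def shannon_entropy(probabilities):
    """Shannon entropy H = -sum(p * log2(p)) in bits.
    Zero-probability messages contribute nothing and are skipped."""
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

# Two equally likely messages: maximum uncertainty, 1 bit of
# information is needed to identify which message was received.
print(shannon_entropy([0.5, 0.5]))  # 1.0

# A heavily biased source: less uncertainty before receiving
# the message, so less information is required to resolve it.
print(shannon_entropy([0.9, 0.1]))
```

The first case matches the intuition in the paragraph above: with two equally likely messages, exactly one yes/no question (one bit) eliminates the uncertainty; a biased source needs less.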