Defining "disorder" in order to understand "entropy" must be done carefully. A better way to characterize entropy is to say that it is a measure of the "multiplicity" associated with the state of a system. When we roll two dice, the probability of getting a seven is higher than that of getting a two. This is because a seven can be obtained in six different ways, while a two can be obtained in only one way. A seven therefore has a higher multiplicity than a two, so we can say that a seven represents higher "disorder", or higher entropy.
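The multiplicity of each two-dice sum can be counted directly. The following short Python sketch (an illustration added here, not part of the original text) tallies how many of the 36 equally likely outcomes produce each total:

```python
from itertools import product

# Count how many of the 36 equally likely two-dice outcomes produce each sum.
ways = {}
for a, b in product(range(1, 7), repeat=2):
    ways[a + b] = ways.get(a + b, 0) + 1

print(ways[7])  # a seven can occur in 6 ways
print(ways[2])  # a two can occur in only 1 way
```

Running this confirms the counts used above: six ways to make a seven, one way to make a two.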

The concept of entropy captures nature's tendency toward disorder in a finite system. Entropy, energy, and the relationship between them are important for understanding not just physics but life itself. The second law of thermodynamics was long regarded as a 'law of disorder'; a more recent development is the recognition of the 'law of maximum entropy production' (MEP), which has changed how the second law is viewed. Let's study entropy in detail.

Entropy Definition

Entropy (S) is a measure of the number of specific ways in which a thermodynamic system can be arranged, and is often described as a measure of disorder; equivalently, it measures how far a system has progressed toward thermodynamic equilibrium.
The entropy of an isolated system never decreases, because isolated systems spontaneously evolve toward thermodynamic equilibrium, the state of maximum entropy. When heat is transferred reversibly, the change in entropy is given as,
$\Delta S$ = $\frac{\Delta Q}{T}$
where $\Delta Q$ is the heat transferred to the system
and T is the thermodynamic temperature.
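As a quick numerical illustration of this definition (the function name and example values are chosen here, not taken from the original text), the entropy change for a reversible heat transfer at a fixed temperature is just the ratio of the two quantities:

```python
def entropy_change(delta_q, temperature):
    """Return the entropy change dS = dQ / T for a reversible
    heat transfer dQ (in joules) at absolute temperature T (in kelvin)."""
    return delta_q / temperature

# Example: 1000 J of heat absorbed reversibly at 300 K.
print(entropy_change(1000.0, 300.0))  # ≈ 3.33 J/K
```

Note that heat leaving the system (a negative ΔQ) gives a negative entropy change for the system itself.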

Entropy Equation

Entropy is the variable used in a precise statement of the second law of thermodynamics. It is defined as a thermodynamic function whose change is independent of the path of transformation of the system and is given by,

$\Delta S$ = $\frac{\Delta Q}{T}$

The change in entropy $\Delta S$ is the ratio of the heat transferred to the temperature at which the transfer takes place in a reversible process. In terms of the initial and final states,

$\Delta S$ = $S_{f}$ - $S_{i}$ = $\frac{\Delta Q}{T}$

where $S_{f}$ is the final entropy and $S_{i}$ is the initial entropy.

For a finite process, the entropy difference between any two states is obtained by integrating $\frac{\Delta Q}{T}$ over a reversible path between them. Statistically, entropy is given by Boltzmann's formula,
S = k ln W
where S is the entropy,
k is the Boltzmann constant,
and W is the number of microstates consistent with the given macrostate.
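Boltzmann's formula can be sketched in a few lines of Python (an added illustration; the function name is chosen here). It also ties back to the dice picture at the start: a macrostate realizable in more ways carries more entropy.

```python
import math

K_B = 1.380649e-23  # Boltzmann constant in J/K (exact SI value)

def boltzmann_entropy(w):
    """Return S = k ln W for a macrostate with W microstates."""
    return K_B * math.log(w)

# A macrostate with a single microstate (W = 1) has zero entropy,
# and entropy grows with the number of microstates.
print(boltzmann_entropy(1))                         # 0.0
print(boltzmann_entropy(6) > boltzmann_entropy(1))  # True
```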

Change in Entropy

Entropy is a state function: it depends only on the initial and final states of the system, regardless of the path by which the change takes place. The change in entropy is given as

$\Delta S$ = $S_{f}$ - $S_{i}$ = $\frac{\Delta Q}{T}$

where $S_{f}$ is the final entropy and $S_{i}$ is the initial entropy.

For an ideal gas, the entropy change at constant volume is given by,
$\Delta S$ = n$C_{V}$ ln $\frac{T}{T_{0}}$

The entropy change at constant pressure is given by,
$\Delta S$ = n$C_{P}$ ln $\frac{T}{T_{0}}$

The entropy change at constant temperature is given by,
$\Delta S$ = nR ln $\frac{V}{V_{0}}$

Here $C_{V}$ is the molar heat capacity at constant volume, $C_{P}$ is the molar heat capacity at constant pressure, and R is the gas constant.
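These three ideal-gas results can be collected into a small sketch (function names and the worked example are added here for illustration):

```python
import math

R = 8.314  # molar gas constant, J/(mol·K)

def ds_constant_volume(n, cv, t, t0):
    """ΔS = n·C_V·ln(T/T0) for heating n moles at constant volume."""
    return n * cv * math.log(t / t0)

def ds_constant_pressure(n, cp, t, t0):
    """ΔS = n·C_P·ln(T/T0) for heating n moles at constant pressure."""
    return n * cp * math.log(t / t0)

def ds_constant_temperature(n, v, v0):
    """ΔS = n·R·ln(V/V0) for an isothermal ideal-gas expansion."""
    return n * R * math.log(v / v0)

# Example: 1 mol of ideal gas doubling its volume at constant temperature.
print(round(ds_constant_temperature(1.0, 2.0, 1.0), 2))  # ≈ 5.76 J/K
```

The isothermal example gives R ln 2, the familiar entropy of doubling a gas's volume.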

Law of Entropy

The second law of thermodynamics can be expressed through two equivalent statements:

Kelvin–Planck statement :
It is impossible for any device operating in a complete cycle to convert all of the heat it absorbs into work without releasing some heat to the surroundings.

Clausius statement : It is impossible to construct a device which, operating in a cycle, produces no effect other than the transfer of heat from a cooler body to a hotter one. In other words, heat cannot flow spontaneously from a colder body to a hotter one; some amount of work must be done to achieve that transfer.

Entropy vs. Enthalpy

Let's look at some differences between entropy and enthalpy:

1. Entropy is a measure of the disorder in a system, while enthalpy is a measure of the total heat content of a thermodynamic system.
2. Entropy is measured in joules per kelvin (J/K), while enthalpy is measured in joules (J).
3. Entropy is expressed as a function of thermodynamic variables such as temperature and pressure, while enthalpy is expressed as the internal energy of the system plus the product of its pressure and volume (H = U + PV).

Negative Entropy

A negative entropy change means that the disorder of a system decreases during a process; for example, when water freezes into ice, its molecules settle into a more ordered arrangement. This should not be confused with a negative enthalpy change, which marks an exothermic reaction: when such a reaction takes place, energy is released to the surroundings, as in the burning of a fuel.

Examples of Entropy

Let's go through some solved examples:

Solved Examples

Question 1: The probability of destroying a target in a single attempt is 0.36. Compute the probability that the target is destroyed on the second attempt.
The probability of destroying the target in one trial is p = 0.36

The probability of failure q is calculated by,
q = 1 - p
q = 1 - 0.36
q = 0.64

By the geometric distribution, the probability of x failures preceding the first success is calculated using the formula

P(X = x) = $q^{x}$ p, where x = 0, 1, 2, ...

If the target is destroyed on the second attempt, the first attempt fails, so x = 1.

P(X = 1) = (0.64)(0.36)
P(X = 1) = 0.2304

The probability of destroying the target on the second attempt is 0.2304.
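The geometric-distribution step can be checked numerically. This sketch (function name chosen here for illustration) makes the bookkeeping explicit: success on attempt k corresponds to x = k − 1 preceding failures.

```python
def geometric_pmf(x, p):
    """P(X = x) = (1 - p)^x * p: probability of x failures before the first success."""
    return (1 - p) ** x * p

p = 0.36  # probability of destroying the target on any one attempt

# Success on attempt k corresponds to x = k - 1 preceding failures.
print(round(geometric_pmf(1, p), 4))  # second attempt: 0.64 · 0.36 = 0.2304
print(round(geometric_pmf(2, p), 4))  # third attempt: 0.64² · 0.36 ≈ 0.1475
```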

Question 2: What is the entropy change if $\Delta H$ = -95 kJ mol$^{-1}$ and T = 30 $^{\circ}$C?
$\Delta H$ = -95 kJ mol$^{-1}$ = -95000 J mol$^{-1}$, T = 30 $^{\circ}$C = 303 K

Entropy change = $\frac{\Delta H}{T}$
                       = $\frac{-95000}{303}$
                       = -313.5 J K$^{-1}$ mol$^{-1}$.
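The arithmetic can be verified directly once $\Delta H$ is converted to joules and the temperature to kelvin (a quick checking sketch, with the variable names chosen here):

```python
delta_h = -95_000.0  # J/mol  (-95 kJ/mol)
t = 30.0 + 273.0     # K      (30 °C, using the 273 offset from the worked solution)

delta_s = delta_h / t
print(round(delta_s, 1))  # ≈ -313.5 J/(K·mol)
```

The negative sign matters: an exothermic ΔH at constant temperature corresponds to a negative entropy change of the surroundings-facing term in this ratio.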