ENTROPY & LAW OF
INCREASE OF ENTROPY

Khemendra Shukla
M.Sc. II Semester
Applied Physics
B.B.A. University
Lucknow
Contents:

• Entropy: Statistical Interpretation

• Principle of Entropy Increase

• Equilibrium Conditions
  i) Thermal Equilibrium, ii) Mechanical Equilibrium, iii) Concentration Equilibrium

• Gibbs Paradox
Entropy:

• Physically, the entropy of a system is a measure of the disorder of its molecular motion.

• The greater the disorder of the molecular motion, the greater the entropy.

• The connection between statistical mechanics and thermodynamics is provided by entropy.

• Natural processes tend to move toward a state of greater disorder.
Entropy in Statistical Mechanics

From the principles of thermodynamics, the thermodynamic entropy S has the following important properties:

• dS is an exact differential and is equal to dQ/T for a reversible process, where dQ is the heat quantity added to the system.

• Entropy is additive: S = S1 + S2.

• ΔS ≥ 0. If the state of a closed system is given macroscopically at any instant, the most probable state at any other instant is one of equal or greater entropy.
State Function

This statement means that entropy is a state function, in that the value of the entropy does not depend on the past history of the system but only on the actual state of the system.

The entropy σ of a system (in classical statistical physics) in statistical equilibrium can be defined as

    σ = ln Ω

where Ω is the volume of phase space accessible to the system, i.e., the volume corresponding to the energies between E and E + ΔE.
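As a toy illustration of σ = ln Ω, one can count microstates directly. This is a sketch only: the Einstein-solid model (q energy quanta shared among N oscillators) and the numbers are illustrative assumptions, with a discrete microstate count standing in for the phase-space volume of the slides.

```python
from math import comb, log

def multiplicity(N, q):
    """Omega: the number of ways to share q energy quanta among N oscillators."""
    return comb(q + N - 1, q)

def entropy(N, q):
    """Dimensionless entropy sigma = ln(Omega)."""
    return log(multiplicity(N, q))

# More energy -> more accessible microstates -> larger entropy,
# matching the qualitative "more disorder, more entropy" statement above.
for q in (1, 10, 100):
    print(q, multiplicity(3, q), round(entropy(3, q), 3))
```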
Change in entropy

Let us show first that changes in the entropy are independent of the system of units used to measure Ω. As Ω is a volume in the phase space of N point particles, it has the dimensions

    (momentum × length)^(3N) = (action)^(3N)

Let ω denote the unit of action; then Ω/ω^(3N) is dimensionless. If we were to define

    σ = ln (Ω/ω^(3N)) = ln Ω − 3N ln ω

we see that for changes

    Δσ = Δ ln Ω

independent of the system of units. ω = h, Planck's constant, is a natural unit of action in phase space.

It is obvious that the entropy σ has a definite value for an ensemble in statistical equilibrium; thus the change in entropy is an exact differential.

We see that, if Ω is interpreted as a measure of the imprecision of our knowledge of a system, or as a measure of the "randomness" of a system, then the entropy is also to be interpreted as a measure of the imprecision or randomness.
Entropy is additive

It can easily be shown that σ is additive. Let us consider a system made up of two parts, one with N1 particles and the other with N2 particles. Then

    N = N1 + N2

and the phase space of the combined system is the product space of the phase spaces of the individual parts:

    Ω = Ω1 Ω2

The additive property of the entropy follows directly:

    σ = ln Ω = ln Ω1Ω2 = ln Ω1 + ln Ω2 = σ1 + σ2
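The additivity can be checked numerically. This is a sketch; the Einstein-solid microstate counts (and the subsystem sizes chosen here) are illustrative stand-ins for the phase-space volumes Ω1 and Ω2.

```python
from math import comb, log

def omega(N, q):
    # Microstate count of a toy subsystem (N oscillators, q energy quanta).
    return comb(q + N - 1, q)

# Independent subsystems: the combined count is the product of the counts,
# so sigma = ln(omega) is the sum of the subsystem entropies.
omega1, omega2 = omega(10, 20), omega(15, 5)
sigma_combined = log(omega1 * omega2)
sigma_sum = log(omega1) + log(omega2)
print(sigma_combined, sigma_sum)
```

The two printed values agree to floating-point accuracy, which is just the identity ln(Ω1Ω2) = ln Ω1 + ln Ω2 from the slide.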
Principle Of Entropy Increase

• The entropy of a system remains constant in reversible processes but increases inevitably in all irreversible processes. Since a reversible process represents a limiting ideal case, all actual processes are inherently irreversible.

• That means after each cycle of operation the entropy of the system increases and tends to a maximum.

• "The entropy of an isolated system either increases or remains constant according as the processes it undergoes are irreversible or reversible."
Equilibrium conditions

We have supposed that the condition of statistical equilibrium is given by the most probable condition of a closed system, and therefore we may also say that the entropy σ is a maximum when a closed system is in the equilibrium condition.

The value of σ for a system in equilibrium will depend on the energy E (≡ ⟨E⟩) of the system; on the number Ni of each molecular species i in the system; and on external variables, such as volume, strain, magnetization, etc.
[Fig. 3.1: two subsystems, 1 and 2, separated by a barrier, the whole enclosed in insulation.]

Let us consider the condition for equilibrium in a system made up of two interconnected subsystems, as in Fig. 3.1. Initially a rigid, insulating, non-permeable barrier separates the subsystems from each other.
Thermal equilibrium

Thermal contact: the systems can exchange energy. In equilibrium there is no net flow of energy between the systems. Let us suppose that the barrier is allowed to transmit energy, the other inhibitions remaining in effect. If the conditions of the two subsystems 1 and 2 do not change, we say they are in thermal equilibrium.

In thermal equilibrium the entropy σ of the total system must be a maximum with respect to small transfers of energy from one subsystem to the other. Writing, by the additive property of the entropy,

    σ = σ1 + σ2

we have in equilibrium

    δσ = δσ1 + δσ2 = 0

    (∂σ1/∂E1) δE1 + (∂σ2/∂E2) δE2 = 0

We know, however, that

    δE = δE1 + δE2 = 0

as the total system is thermally closed, the energy in a microcanonical ensemble being constant. Thus

    (∂σ1/∂E1 − ∂σ2/∂E2) δE1 = 0
As δE1 was an arbitrary variation, we must have

    ∂σ1/∂E1 = ∂σ2/∂E2

in thermal equilibrium. If we define a quantity τ by

    1/τ = ∂σ/∂E

then in thermal equilibrium

    τ1 = τ2

Here τ is known as the temperature and will be shown later to be related to the absolute temperature T by τ = kT, where k is the Boltzmann constant, 1.380×10⁻²³ J/K.
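The equal-temperature condition ∂σ1/∂E1 = ∂σ2/∂E2 at the entropy maximum can be checked numerically. This is a sketch only: the two Einstein solids, their sizes, and the finite-difference stand-in for the derivative are illustrative assumptions, not from the slides.

```python
from math import comb, log

def sigma(N, q):
    # Dimensionless entropy of a toy subsystem with N oscillators and q quanta.
    return log(comb(q + N - 1, q))

N1, N2, q_total = 30, 60, 90

# Total entropy for every way of splitting the energy between the subsystems.
total = [sigma(N1, q1) + sigma(N2, q_total - q1) for q1 in range(q_total + 1)]
q1_star = max(range(q_total + 1), key=lambda q1: total[q1])

# At the most probable partition the slopes (each subsystem's "1/tau")
# agree; forward differences stand in for the derivatives d(sigma)/dE.
ds1 = sigma(N1, q1_star + 1) - sigma(N1, q1_star)
ds2 = sigma(N2, q_total - q1_star + 1) - sigma(N2, q_total - q1_star)
print(q1_star, round(ds1, 4), round(ds2, 4))
```

The maximizing split puts energy roughly in proportion to subsystem size, and the two slopes agree closely there, which is the τ1 = τ2 condition.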
Mechanical equilibrium

Mechanical contact: the systems are separated by a mobile barrier. Equilibrium is reached in this case by equalization of the pressure on both sides of the barrier.

We now imagine that the wall is allowed to move and also passes energy, but does not pass particles. The volumes V1, V2 of the two systems can readjust to maximize the entropy. In mechanical equilibrium

    (∂σ1/∂V1) δV1 + (∂σ2/∂V2) δV2 + (∂σ1/∂E1) δE1 + (∂σ2/∂E2) δE2 = 0
After thermal equilibrium has been established, the last two terms on the right add up to zero, so we must have

    (∂σ1/∂V1) δV1 + (∂σ2/∂V2) δV2 = 0

Now the total volume V = V1 + V2 is constant, so that

    δV = δV1 + δV2 = 0

We have then

    (∂σ1/∂V1 − ∂σ2/∂V2) δV1 = 0
As δV1 was an arbitrary variation, we must have

    ∂σ1/∂V1 = ∂σ2/∂V2

in mechanical equilibrium. If we define a quantity π by

    π/τ = (∂σ/∂V)_(E,N)

we see that for a system in thermal equilibrium the condition for mechanical equilibrium is

    π1 = π2

We show now that π has the essential characteristics of the usual pressure p.
Concentration Equilibrium

The systems can exchange particles. Let us suppose that the wall allows diffusion through it of molecules of the i-th chemical species. We have

    δN_i1 = −δN_i2

For equilibrium

    (∂σ1/∂N_i1 − ∂σ2/∂N_i2) δN_i1 = 0

or

    ∂σ1/∂N_i1 = ∂σ2/∂N_i2
We define a quantity μi by the relation

    μi/τ = −(∂σ/∂N_i)_(E,V)

The quantity μi is called the chemical potential of the i-th species. For equilibrium at constant temperature

    μ_i1 = μ_i2
Gibbs Paradox

i) Mixing of two different ideal gases

It can be regarded as the expansion of each of the gases to the volume

    V = V1 + V2

(gas 1: N1 particles in V1; gas 2: N2 particles in V2; both at the same temperature T).

The change in entropy on mixing is

    Δσ = σ12 − (σ1 + σ2)
       = (N1 ln V + N2 ln V) − (N1 ln V1 + N2 ln V2)
       = N1 ln(V/V1) + N2 ln(V/V2) > 0

For the case

    V1 = V2 = V/2  and  N1 = N2 = N

we get

    Δσ = 2N ln 2

This gives the entropy of mixing for two different ideal gases and is in agreement with experiment.
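The mixing-entropy formula above can be evaluated directly. This is a sketch: only the configurational N ln V terms are kept, since the momentum-space contributions cancel at fixed temperature, and the numbers are arbitrary.

```python
from math import log

def mixing_entropy(N1, V1, N2, V2):
    """Delta sigma = N1 ln(V/V1) + N2 ln(V/V2) for two different ideal gases."""
    V = V1 + V2
    return N1 * log(V / V1) + N2 * log(V / V2)

# Equal halves, V1 = V2 = V/2 and N1 = N2 = N, give Delta sigma = 2 N ln 2.
N = 1000
print(mixing_entropy(N, 1.0, N, 1.0), 2 * N * log(2))
```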
ii) Mixing of one ideal gas with the same ideal gas

The removal of the partition should not affect the distribution of the system over the accessible states, so the change in entropy should be

    Δσ = σ12 − (σ1 + σ2) = 0

In particular, for the case

    V1 = V2 = V/2  and  N1 = N2 = N

the same calculation as in case (i) gives an unobserved, and therefore unaccountable, increase of entropy 2N ln 2 when a partition is removed from a box containing the same gas on both sides, although the entropy change should be zero. This is the Gibbs paradox.
Resolution Of Gibbs Paradox

• In case (i), removal of the partition leads to diffusion of the molecules, which is an irreversible process; the increase of entropy therefore makes sense.

• We can imagine that mixing of the gases interchanges the positions of molecules and creates new states.

• Therefore the number of accessible states increases, or equivalently, the entropy increases.
Cont'd…

• In case (ii) no new state is created when the partition is removed.

• If there are N molecules, N! permutations are possible, but the molecules are indistinguishable, so these permutations do not give distinct states.

• Therefore the number of accessible states is too large by a factor of N!:

    Ω → Ω/N!

    σ = ln (Ω / (N! h^(3N))) = ln (Ω / h^(3N)) − ln N!
      ≈ ln (Ω / h^(3N)) − (N ln N − N)

This extra term resolves the Gibbs paradox; it makes the entropy properly additive.
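The N! correction and the Stirling approximation ln N! ≈ N ln N − N used above can be checked numerically. This is a sketch under stated assumptions: only the configurational N ln V part of σ is kept, and the particle numbers are arbitrary.

```python
from math import lgamma, log

def ln_factorial(N):
    # Exact ln(N!) via the log-gamma function: lgamma(N + 1) = ln(N!).
    return lgamma(N + 1)

def stirling(N):
    # Stirling's approximation used on the slide: ln N! ~ N ln N - N.
    return N * log(N) - N

def sigma_corrected(N, V):
    # Configurational entropy with the N! correction: sigma = N ln V - ln N!.
    return N * log(V) - ln_factorial(N)

# The relative error of Stirling's approximation shrinks as N grows ...
for N in (10, 100, 10000):
    print(N, (ln_factorial(N) - stirling(N)) / ln_factorial(N))

# ... and the corrected entropy is (almost exactly) extensive: doubling
# both N and V doubles sigma, so joining two identical boxes changes nothing.
print(sigma_corrected(2000, 2000.0) / sigma_corrected(1000, 1000.0))
```

Without the ln N! term, σ = N ln V is not extensive, and that non-extensivity is precisely the spurious 2N ln 2 of case (ii).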