Today finish the discussion of phase diagrams and continue the discussion of the lack of diffusion in the solid state. Last time a phase diagram was drawn, lines were extrapolated, and semi-qualitative information was obtained.

Spinodal Decomposition

Consider instabilities. Metastable points are local minima. There is complete instability at spinodal points. Consider a system with immiscibility; there are two distinct regimes. States inside the miscibility gap are reached by taking a high-temperature state and quenching down.


Metastable does not mean unstable: a metastable system is stable against small fluctuations. The free energy of a small composition fluctuation can be found from a chord, and where the chord lies above the curve, small fluctuations raise the free energy. If the sample is quenched to another part of the free energy curve, where the chord lies below the curve, small changes lower the free energy. Spinodal points are points where the following relation is true.

<center>

<br>

<math>\frac{\partial^2 G}{\partial x_B^2} = 0</math>

<br>

</center>

Consider two cases.

  • Inside spinodal: The solid solution cannot be retained unless the quench is very cold.
  • Outside spinodal: The solid solution is metastable. A composition fluctuation must be large enough to lower the free energy; there must be a fluctuation of critical size.

There is discussion in 3.21 about nucleation and growth. A material may need to form a nucleus that is sufficiently large.
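As an illustrative sketch (not part of the lecture), the spinodal condition can be evaluated in closed form for a regular solution model; the interaction parameter omega and the chosen kT are hypothetical inputs.

```python
import math

def spinodal_compositions(omega, kT):
    """Spinodal points of a regular solution
    G(x) = omega*x*(1-x) + kT*(x*ln x + (1-x)*ln(1-x)).
    Setting d2G/dx2 = -2*omega + kT/(x*(1-x)) = 0 gives
    x*(1-x) = kT/(2*omega); between these compositions the
    solution is unstable to arbitrarily small fluctuations."""
    disc = 0.25 - kT / (2.0 * omega)
    if disc < 0:
        return None  # above the critical point kT_c = omega/2: no spinodal
    r = math.sqrt(disc)
    return (0.5 - r, 0.5 + r)

lo, hi = spinodal_compositions(1.0, 0.4)  # kT below kT_c = 0.5*omega
```

The two spinodal points are symmetric about <math>x = 0.5</math>, and they merge at the critical point as the quench temperature rises.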


Spinodal transformations are wavelike. It is a very different kinetic regime: a concentration wave passes through the sample. There is a spinodal equivalent of many transformations. Spinodal transformations are a good way to make small-scale inhomogeneity.

Ordering

Explain based on mean-field theory. Consider <math>B_2</math> ordering. Break the symmetry between two sublattices. Minimize the free energy <math>F(\eta)</math> with respect to an order parameter, and look at how <math>F(\eta)</math> evolves with temperature. Disorder is associated with <math>\eta = 0</math>.


A minimum at <math>\eta = 0</math> corresponds to disorder. At higher temperatures there can be a local minimum that corresponds to an ordered state but does not have the lower free energy. At the transition, the two minima have the same free energy. The positive curvature at <math>\eta = 0</math> can disappear, leaving an unstable disordered state. Below is the spinodal condition.

<center>

<br>

<math>\left. \frac{\partial^2 G}{\partial \eta^2} \right|_{\eta = 0} = 0</math>

<br>

</center>

In a system with order-disorder, there is a spinodal analog. In some cases there needs to be an ordering fluctuation of critical size. If a system is quenched into the spinodal region, the system is unstable. There is spinodal behavior in systems of copper and gold. The local driving force is dramatic. This ends the discussion of non-equilibrium phenomena.
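A minimal Landau sketch of this picture, with hypothetical coefficients a, b, c (none from the lecture): a cubic term makes the transition first order, and the ordering spinodal is where the curvature at <math>\eta = 0</math> vanishes.

```python
def F(eta, a, b, c, T, T0):
    """Illustrative Landau free energy: F = a*(T - T0)*eta^2 - c*eta^3 + b*eta^4."""
    return a * (T - T0) * eta**2 - c * eta**3 + b * eta**4

def landau_temperatures(a, b, c, T0):
    """T_t: first-order transition, where the ordered minimum at eta = c/(2b)
    becomes degenerate with the disordered minimum at eta = 0.
    T_sp = T0: the ordering spinodal, where d2F/deta2|_{eta=0} = 2*a*(T - T0)
    vanishes and the disordered state becomes unstable."""
    T_t = T0 + c * c / (4.0 * a * b)
    return T_t, T0

T_t, T_sp = landau_temperatures(1.0, 1.0, 1.0, 500.0)
eta_ord = 0.5  # c/(2b) for these coefficients
```

Between <math>T_{sp}</math> and <math>T_t</math> the disordered state is metastable and an ordering fluctuation of critical size is needed; below <math>T_{sp}</math> it is unstable.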

Entropy

Link microscopic phenomena with macroscopic phenomena in the remaining lectures. How does one know if there is a positive enthalpy of mixing? When is enthalpy important? Today and Friday, look at the microscopic origin of entropy. What are the physical sources of entropy? Any physical degree of freedom that can take up energy is a source.

  • Configurational: main source of entropy
  • Translational: gases
  • Vibrational: solids
  • Electronic

A stiff material is associated with a high vibrational frequency and low entropy.


Consider a model for electronic entropy. How does one make a model? Try to turn the system into independent degrees of freedom. What are the independent subsystems? The eigenstates act independently, so there is a need to know the eigenstates. They are enumerated by the density of states.

Whether a state is occupied or not can be regarded as a variable; knowing whether each state is occupied is enough to completely describe the system. Sum over the independent subsystems and calculate the entropy. The probability that a state is occupied is given by the Fermi-Dirac function.

<center>

<br>

<math>S = -k_B \int n( \epsilon ) \left[ P_{occ} \ln P_{occ} + P_{unocc} \ln P_{unocc} \right] d \epsilon</math>

<br>

<math>S = -k_B \sum_i P_i \ln P_i</math>

<br>

<math>S_{el} = -k_B \int n( \epsilon ) \left[ f \ln f + (1-f) \ln (1-f) \right] d \epsilon</math>

<br>

</center>

Analyze what this looks like. Below is a graph of the density of states with the Fermi function superimposed. When <math>f=1</math> and when <math>f=0</math>, there is no entropy. The only electrons that contribute to the entropy are those within <math>k_B T</math> of the Fermi level. Estimate the entropy by looking at the number of states within this width.

<center>

<br>

<math>\frac{ S_{el} }{ k_B } \approx k_B T \, n (\epsilon_F)</math>

<br>

<math> S_{el} \approx k_B^2 T \, n (\epsilon_F)</math>

<br>

</center>
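The integral above can be checked numerically (my sketch, not from the lecture) for a constant density of states, in units with <math>k_B = 1</math>; the exact Sommerfeld result in this case is <math>S_{el} = \frac{\pi^2}{3} n_0 T</math>, which confirms the order-of-magnitude estimate.

```python
import math

def fermi_dirac_entropy(T, mu=1.0, band=(0.0, 2.0), n0=1.0, npts=20001):
    """Trapezoid evaluation of S_el = -Int n(e)*[f ln f + (1-f) ln(1-f)] de
    for a constant density of states n(e) = n0 (units with k_B = 1).
    Only states within ~T of the Fermi level mu contribute."""
    a, b = band
    h = (b - a) / (npts - 1)
    total = 0.0
    for i in range(npts):
        e = a + i * h
        f = 1.0 / (math.exp((e - mu) / T) + 1.0)
        s = 0.0
        if 0.0 < f < 1.0:  # f*ln f -> 0 at f = 0 and f = 1
            s = -(f * math.log(f) + (1.0 - f) * math.log(1.0 - f))
        w = 0.5 if i in (0, npts - 1) else 1.0  # trapezoid endpoint weights
        total += w * s
    return n0 * h * total

T = 0.05
S_el = fermi_dirac_entropy(T)  # close to (pi**2/3)*T
```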

The contribution of each state is of order one. Consider a related point: the failure of the Drude model. That model treats the electrons as classical particles, each contributing <math>\frac{3}{2} k_B</math> to the heat capacity. An expression of the classical electronic heat capacity is below.

<center>

<br>

<math>C_{el} = \frac{3}{2} N_e k_B</math>

<br>

</center>

This model considers all electrons contributing to the heat capacity, but not all do. Consider the fraction of electrons that contribute.

<center>

<br>

<math>\frac{ k_B T }{ \epsilon_F }</math>

<br>

</center>

The order of <math>\epsilon_F</math> is several electron volts, so only a small fraction of the electrons contributes to the heat capacity. Evaluate the number of electrons that contribute in different material classes.
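A quick numerical illustration of this fraction, with an assumed Fermi energy of 5 eV (my number, not from the lecture):

```python
K_B_EV = 8.617e-5  # Boltzmann constant in eV/K

def contributing_fraction(T, eps_F_eV):
    """Rough fraction of electrons within ~k_B*T of the Fermi level."""
    return K_B_EV * T / eps_F_eV

frac = contributing_fraction(300.0, 5.0)  # about half a percent at room temperature
```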

Insulators

If a band gap is large enough, the contribution is zero.

<center>

<br>

<math>S_{el} \approx 0</math>

<br>

</center>

Semiconductors

The entropy is high per carrier, but there are not many carriers. There is an insignificant number of carriers on a per mole basis.

<center>

<br>

<math>S_{el} \rightarrow \mbox{small}</math>

<br>

</center>

Metals

There are wide bands in a metal, and the states are spread out; they are not piled up at the Fermi level.

<center>

<br>

<math>S_{el} \rightarrow \mbox{small/medium}</math>

<br>

</center>

Oxides

<center>

<br>

<math>MgO \mbox{ and } Al_2O_3</math>

<br>

<math>S_{elec} = 0</math>

<br>

<math>\mbox{Conducting Oxides}</math>

<br>

<math>S_{el} \rightarrow \mbox{very high}</math>

<br>

</center>

Why is the entropy higher? The states are much more localized. Bandwidth is proportional to the overlap squared, and there is much less bandwidth in oxides. The entropy can be made extremely high in mixed-valence conductors.


The wavevector <math>k</math> is no longer a good descriptor of the states, and it is not known how to treat systems between the localized and delocalized limits. Calculate the entropy of localized systems. Consider a perovskite, <math>LaMnO_3</math>. The ions are <math>La^{3+}</math> and <math>Mn^{3+}</math>, and doping with <math>Sr^{2+}</math> is charge compensated with <math>Mn^{4+}</math>.

<center>

<br>

<math>\left ( La_{1-x}^{3+} Sr_x^{2+} \right ) \left ( Mn_{1-x}^{3+} Mn_x^{4+} \right ) O_3</math>

<br>

</center>

States on <math>Mn</math> are fully localized. The entropy pertaining to the first portion of the compound is the standard entropy of mixing. How is the contribution of <math>Mn</math> to the entropy described? Think of the variables needed to characterize the state, and then apply combinatorics. The ions <math>Mn^{3+}</math> and <math>Mn^{4+}</math> must be distributed. There is configurational entropy.

<center>

<br>

<math>-k_B \left[ x \ln x + (1-x) \ln (1-x) \right]</math>

<br>

</center>
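This ideal mixing expression is easy to evaluate per site (a sketch; the function name is mine):

```python
import math

def mixing_entropy_per_site(x):
    """Ideal configurational entropy of mixing per site, in units of k_B:
    S/k_B = -(x ln x + (1-x) ln(1-x)).  The same expression applies to the
    (La, Sr) sublattice and to the Mn3+/Mn4+ distribution at doping x."""
    if x <= 0.0 or x >= 1.0:
        return 0.0  # pure end members carry no mixing entropy
    return -(x * math.log(x) + (1.0 - x) * math.log(1.0 - x))

s_half = mixing_entropy_per_site(0.5)  # maximum disorder: ln 2 per site
```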

Consider starting to delocalize, wherein the eigenstates are no longer confined to individual <math>Mn^{3+}</math> sites. In the fully delocalized limit the eigenstate is a Bloch wave. The interesting properties are in the regimes in between.

Thermoelectrics

Consider moving electrons between a cold piece and a hot piece. Heat is moved by electrons: each carrier transports entropy, and entropy carries heat. The relevant quantity is the heat carried divided by the current, which is the entropy per carrier. Conducting oxides are good candidates for this application. When carriers are localized, however, it is hard to optimize conductivity and high entropy simultaneously. Novel thermoelectrics are marginally conductive oxides.
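For localized (hopping) carriers, the entropy per carrier shows up directly in the thermopower. A sketch of the standard Heikes expression (not derived in the lecture) ties this to the mixing entropy above:

```python
import math

K_B_OVER_E = 8.617e-5  # k_B/e in volts per kelvin

def heikes_seebeck(x):
    """Heikes formula for localized carriers at concentration x per site:
    S = (k_B/e) * ln((1 - x) / x), the configurational entropy carried
    per unit charge.  Large for dilute carriers, zero at half filling."""
    return K_B_OVER_E * math.log((1.0 - x) / x)

S_dilute = heikes_seebeck(0.1)  # roughly 1.9e-4 V/K for a lightly doped oxide
```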

Vibrational Effects

There tends to be stabilization of less dense phases at high temperature. Consider a close-packed transition, such as a transition from <math>fcc</math> or <math>hcp</math> to <math>bcc</math>. There is less dense packing in the body-centered cubic structure. Estimate where such transitions occur.

<center>

<br>

<math>E_{Ti}^{hcp} - E_{Ti}^{bcc}</math>

<br>

<math>\theta_D^{hcp} - \theta_D^{bcc}</math>

<br>

</center>

Make a model of the transition. See where the transition occurs.

<center>

<br>

<math>\theta_{hcp} = 470 K</math>

<br>

<math>\theta_{bcc} = 375 K</math>

<br>

</center>

Estimate the Einstein frequency and calculate the free energy curves. Consider the phase diagram of iron. Iron is body-centered cubic at low temperatures. There is a high-temperature transition, and a magnetic transition at a temperature close to the <math>bcc</math> to <math>fcc</math> transition.
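The transition estimate can be sketched in the classical high-temperature limit, where the vibrational entropy difference per atom is <math>3 k_B \ln(\theta_{hcp}/\theta_{bcc})</math>; the energy difference used here is a hypothetical 0.05 eV/atom, not a value from the lecture.

```python
import math

K_B_EV = 8.617e-5  # Boltzmann constant in eV/K

def hcp_bcc_transition_T(dE_eV, theta_hcp, theta_bcc, modes=3):
    """Classical (high-T) estimate with dE = E_bcc - E_hcp > 0 per atom:
    F_bcc - F_hcp ~ dE - T * modes * k_B * ln(theta_hcp / theta_bcc),
    so the softer bcc phase (lower Debye temperature, higher vibrational
    entropy) wins above T_t = dE / (modes * k_B * ln(theta_hcp/theta_bcc))."""
    return dE_eV / (modes * K_B_EV * math.log(theta_hcp / theta_bcc))

# Hypothetical energy difference of 0.05 eV/atom (illustrative only)
T_t = hcp_bcc_transition_T(0.05, 470.0, 375.0)
```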

Configurational Entropy

Configurational entropy drives mixing and disorder. Is disorder inevitable? Consider the inequality below.

<center>

<br>

<math>S_{config}^{dis} \gg S_{config}^{ord}</math>

<br>

</center>

The structure sets the energy scale, which is independent of temperature. Consider a system of copper and zinc with equal amounts of each component. There is a second-order transition.


Consider a 50-50 <math>CuPd</math> system. Both components are <math>fcc</math>. There is <math>B2</math> ordering on a <math>bcc</math> lattice, and it does not disorder on <math>bcc</math>: the order-disorder transition couples to a change of lattice. The <math>bcc</math> structure allows unfrustrated ordering, while the ordering interaction cannot be satisfied on <math>fcc</math>. With disorder, the system goes back to <math>fcc</math>.


Is <math>S_{vib}^{dis}</math> greater than or less than <math>S_{vib}^{ord}</math>? Are the mode frequencies higher or lower in the ordered state? Either is possible. Consider an example where <math>S_{vib}^{dis}</math> is less than <math>S_{vib}^{ord}</math>. Mix atoms of different size. In the ordered state the bond lengths are uniform, while in the disordered state some bonds shrink and some stretch. The entropy loss from the stiffened bonds may be more important than the gain from the softened ones.


Add the entropies. Compare <math>S_{config}^{ord} + S_{vib}^{ord}</math> with <math>S_{config}^{dis} + S_{vib}^{dis}</math>. What is the sign of the difference? Typically the disordered state has the higher total entropy at high temperature. Can the disordered entropy be lower? This depends on whether <math>S_{vib}^{dis}</math> is greater than or less than <math>S_{vib}^{ord}</math>.


A lower entropy associated with the disordered state has not been seen in atomic systems. In polymeric systems, there is a phenomenon of inversion: a transition from solid solution to phase separation. Configurational entropy is proportional to the number of polymer chains, while the vibrational term is proportional to the number of monomers. With a higher degree of polymerization, the vibrational entropy determines the behavior.
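A toy illustration of this scaling argument (all numbers hypothetical): per monomer, the configurational term falls off as one over the degree of polymerization while the vibrational term does not.

```python
def entropy_per_monomer(dp, s_conf_per_chain, s_vib_per_monomer):
    """Configurational entropy scales with the number of chains, vibrational
    entropy with the number of monomers (chains * dp); dividing the total
    by the number of monomers, the configurational share goes as 1/dp."""
    return s_conf_per_chain / dp + s_vib_per_monomer

short = entropy_per_monomer(10, 1.0, 0.01)    # mixing term dominates
long_ = entropy_per_monomer(1000, 1.0, 0.01)  # vibrational term dominates
```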


Inverse solidification has been observed, wherein it is possible to melt by cooling. When the material demonstrating this phenomenon is heated, it takes up more volume; there are more vibrations and higher vibrational entropy.
