Author Topic: brightsky's Chem Thread

scribble

  • is sexier than Cthulhu
  • Victorian
  • Forum Leader
  • ****
  • Posts: 814
  • Respect: +145
  • School Grad Year: 2012
Re: brightsky's Chem Thread
« Reply #165 on: July 03, 2013, 11:00:10 pm »
+1
entropy is VERY hard to explain. my understanding of it isn't very good tbh, so I'm probably not a good person to ask.
My lecturer hated the disorder definition of entropy, though. He said that entropy is more to do with probability, or the number of microstates corresponding to a macrostate. which is a dodgy definition to me, because in a chemical system it's hard to say what a macro/microstate even is. :S
But entropy (S) is defined to be S = kB*ln(W), where W is the number of microstates corresponding to a given macrostate. again this is dodgy. but we don't really work with entropy itself; we're usually working with the change in entropy, which, of course, comes with more concepts and formulas that I don't really want to go into/probably can't go into because I don't understand it well myself. My lecturer was pretty much like "hey guys, here's a formula, I'm not gonna explain it because it's weird, but trust me it's true (Y)(Y) here have some chocolate frogs!!"
But what is good to know is that the state of highest probability will be the state of highest entropy. So if ΔS is positive, you're going to a more probable state, and if ΔS is negative, you're going to a less probable state.
and I guess what the disorder thing is saying is that the more disorder there is, the higher the entropy, because it's the most probable arrangement. If you have some gas in a container, chances are the gas will be "disordered".
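To put a rough number on that last point, here's a small Python sketch (my own illustration, not rigorous stat mech): the chance of catching all N gas particles in one half of the container is (1/2)^N, so the spread-out, "disordered" macrostate wins by a ridiculous margin even for modest N.

```python
from math import log

KB = 1.380649e-23  # Boltzmann constant, J/K

# Probability that all N ideal-gas particles happen to sit in the left half
# of the box at a given instant: (1/2)^N.
for n in (10, 100, 1000):
    log10_p = n * log(0.5) / log(10)
    print(f"N = {n:5d}: P(all in left half) ~ 10^{log10_p:.0f}")

# Equivalently, confining the gas to half the volume halves the number of positions
# available to each particle, so by S = kB ln W the entropy change is N * kB * ln(1/2):
n = 1000
delta_S = n * KB * log(0.5)
print(f"delta_S for squeezing {n} particles into half the box: {delta_S:.2e} J/K")
```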


psyxwar, I think you're on the right track, but whether a reaction is spontaneous or not is to do with the change in Gibbs free energy, which is related to enthalpy, entropy and temperature.
ΔG = ΔH - TΔS

really guys, if you're really, really, REALLY keen to learn entropy, pick up a first-year chem book. my uni tells us to use Zumdahl's Chemical Principles (which you may or may not be able to find online if you're one to choose the thug lyf ;) ). though I don't see why you would want to learn about entropy; entropy is yucky and makes everyone's heads hurt.

lzxnl

  • Victorian
  • ATAR Notes Legend
  • *******
  • Posts: 3432
  • Respect: +215
Re: brightsky's Chem Thread
« Reply #166 on: July 03, 2013, 11:28:50 pm »
0
What I see entropy as is a measure of how smeared out the energy of a system is. It is, as scribble says, defined by S = k ln W, where W is the greatest multiplicity of the system. Here, multiplicity means the number of energy configurations (microstates) that share the same total energy.

Now if you think about it, if we have 3 energy states and 3 electrons, entropy is greatest when each energy state has one electron. Similarly, in a room of a certain volume, entropy is greatest when the particles are spread out evenly. Entropy can be thought of in terms of probability. The second law of thermodynamics arises not because that configuration is the only possible one, but because it is the most likely one.

If we had 23 million energy spaces and 23 million electrons, how many ways could we fill these energy spaces with one electron per space, versus all 23 million electrons in the same space? You do the maths; it gives you an idea of how entropy works.
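If you actually want to do that maths, here's a short Python sketch (my own illustration; I've used 1,000 particles rather than 23 million, and treated the particles as distinguishable so that W is the multinomial coefficient N!/(n1!·n2!·...)):

```python
from math import lgamma

def log_multiplicity(occupancies):
    """ln W for distinguishable particles with the given occupancy numbers:
    W = N! / (n1! * n2! * ... * nM!)."""
    n_total = sum(occupancies)
    return lgamma(n_total + 1) - sum(lgamma(n + 1) for n in occupancies)

N = 1000                      # try 23 million if you're patient
spread_out = [1] * N          # one particle per energy space
piled_up = [N]                # every particle crammed into the same space

print(f"ln W (one per space)    = {log_multiplicity(spread_out):.1f}")  # ln(N!) ~ 5912
print(f"ln W (all in one space) = {log_multiplicity(piled_up):.1f}")    # ln(1) = 0
# S = kB ln W, so the evenly spread arrangement has enormously higher entropy.
```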

psyxwar, there are two things to consider. Firstly, there is an entropy change of the system itself. This is obviously just delta S. Then, there's the entropy change of the surroundings. As delta H is the negative of the heat released into the surroundings, and because the surroundings basically stay at the same temperature, the entropy change of the surroundings is -del H/T.
Summing these yields the entropy change of the universe in this reaction, which is S - H/T (I've left out the deltas for clarity)
So the reaction is spontaneous when this is greater than zero.
Or, multiplying this by -T, we get H - T delta S, which is the Gibbs free energy; since we multiplied by a negative number the condition flips, so the reaction is spontaneous when delta G is less than zero.

And as for the relation between the equilibrium constant and temperature...here are two equations which may be of interest.
1. delta G (standard) = -RT ln K
2. delta G (standard) = delta H (standard) - T*delta S (standard)

Using these equations (and assuming delta H and delta S stay roughly constant over the temperature range), you can quite easily show that ln(K2/K1) = delta H(standard)/R * (1/T1 - 1/T2)
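A quick Python sketch of what that relation does in practice (the reaction and its numbers are made up purely for illustration):

```python
from math import exp

R = 8.314  # J mol^-1 K^-1

def K2_from_K1(K1, delta_H, T1, T2):
    """van 't Hoff: ln(K2/K1) = (delta_H/R) * (1/T1 - 1/T2),
    assuming delta_H (standard) is roughly constant between T1 and T2."""
    return K1 * exp((delta_H / R) * (1.0 / T1 - 1.0 / T2))

# Hypothetical exothermic reaction: delta_H = -50 kJ/mol, K = 100 at 298 K.
K1, dH = 100.0, -50e3
for T2 in (298.0, 310.0, 350.0):
    print(f"T = {T2:5.1f} K -> K ~ {K2_from_K1(K1, dH, 298.0, T2):.3g}")
# K drops as T rises for an exothermic reaction, consistent with Le Chatelier.
```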

I don't like entropy either. I really don't like it. Especially when it gets more complex and you have to figure out what you're integrating over -.-
2012
Mathematical Methods (50) Chinese SL (45~52)

2013
English Language (50) Chemistry (50) Specialist Mathematics (49~54.9) Physics (49) UMEP Physics (96%) ATAR 99.95

2014-2016: University of Melbourne, Bachelor of Science, Diploma in Mathematical Sciences (Applied Maths)

2017-2018: Master of Science (Applied Mathematics)

2019-2024: PhD, MIT (Applied Mathematics)

Accepting students for VCE tutoring in Maths Methods, Specialist Maths and Physics! (and university maths/physics too) PM for more details

lzxnl

  • Victorian
  • ATAR Notes Legend
  • *******
  • Posts: 3432
  • Respect: +215
Re: brightsky's Chem Thread
« Reply #167 on: July 03, 2013, 11:30:51 pm »
0
Oh, and as for why like dissolves like:
If you try to dissolve something that's non-polar in something that's polar, you have to break the polar-polar interactions first, and these are much stronger than the polar-nonpolar interactions that replace them. This is energetically unfavourable, giving a more positive delta H and hence a more positive delta G => not spontaneous.
You can do the reasoning for polar dissolving polar now.
2012
Mathematical Methods (50) Chinese SL (45~52)

2013
English Language (50) Chemistry (50) Specialist Mathematics (49~54.9) Physics (49) UMEP Physics (96%) ATAR 99.95

2014-2016: University of Melbourne, Bachelor of Science, Diploma in Mathematical Sciences (Applied Maths)

2017-2018: Master of Science (Applied Mathematics)

2019-2024: PhD, MIT (Applied Mathematics)

Accepting students for VCE tutoring in Maths Methods, Specialist Maths and Physics! (and university maths/physics too) PM for more details

scribble

  • is sexier than Cthulhu
  • Victorian
  • Forum Leader
  • ****
  • Posts: 814
  • Respect: +145
  • School Grad Year: 2012
Re: brightsky's Chem Thread
« Reply #168 on: July 03, 2013, 11:42:14 pm »
0
actually, ΔS = q/T, if the process is reversible (q is the heat added to the system).
and because ΔE = q + w: when the volume is constant, w = 0, so ΔE = q and ΔS = ΔE/T;
if the pressure is constant, q = ΔH, so ΔS = ΔH/T.
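To put a number on that constant-pressure case, here's a tiny Python sketch using the textbook enthalpy of fusion of ice (about 6.01 kJ/mol at 273.15 K):

```python
# Entropy change for melting one mole of ice reversibly at 0 degrees C.
# At constant pressure q_rev = delta_H_fus, so delta_S = delta_H_fus / T.
delta_H_fus = 6.01e3   # J/mol, standard enthalpy of fusion of water
T_melt = 273.15        # K

delta_S = delta_H_fus / T_melt
print(f"delta_S(fusion) = {delta_S:.1f} J K^-1 mol^-1")  # ~22 J/(K mol), positive: solid -> liquid
```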

what i would like to know is where the ΔS=q/T thing came from to begin with.

lzxnl

  • Victorian
  • ATAR Notes Legend
  • *******
  • Posts: 3432
  • Respect: +215
Re: brightsky's Chem Thread
« Reply #169 on: July 03, 2013, 11:48:17 pm »
0
Oh you mean that? Dear me.
I was shown a statistical mechanics proof that scares me. It can be derived from S = k ln W together with the statistical mechanics expressions for energy levels and microstates (as you can see, I've forgotten a lot of it already), and the result is literally delta S = q/T. Apologies for not being more helpful; I can only remember so much from the summer school.
2012
Mathematical Methods (50) Chinese SL (45~52)

2013
English Language (50) Chemistry (50) Specialist Mathematics (49~54.9) Physics (49) UMEP Physics (96%) ATAR 99.95

2014-2016: University of Melbourne, Bachelor of Science, Diploma in Mathematical Sciences (Applied Maths)

2017-2018: Master of Science (Applied Mathematics)

2019-2024: PhD, MIT (Applied Mathematics)

Accepting students for VCE tutoring in Maths Methods, Specialist Maths and Physics! (and university maths/physics too) PM for more details

scribble

  • is sexier than Cthulhu
  • Victorian
  • Forum Leader
  • ****
  • Posts: 814
  • Respect: +145
  • School Grad Year: 2012
Re: brightsky's Chem Thread
« Reply #170 on: July 03, 2013, 11:58:12 pm »
+1
i thought the k ln(W) thing was literally just the definition of S though. like they went, W is too big, so let's take the natural logarithm of it so we get numbers that are nicer to deal with. oh and let's throw Boltzmann's constant in there because we like to use moles and NA*kB = R, yay!
kind of like how we use a pH scale because talking about [H+] gives us yucky numbers to look at.
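(For what it's worth, that compression in one line of Python, with arbitrary example concentrations:)

```python
from math import log10

# -log10 turns awkward concentrations into tidy pH values.
for h_conc in (1.0e-1, 3.2e-5, 7.9e-12):   # mol/L, arbitrary examples
    print(f"[H+] = {h_conc:.1e} M  ->  pH = {-log10(h_conc):.2f}")
```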

ahaha statistical mechanics, no thank you. i don't even know what that means. now i see why my poor chem lecturer didn't explain these things... we'll just stick with "hey here's a formula, use it. trust me it works, the guys who came up with it have PhDs" :'D

lzxnl

  • Victorian
  • ATAR Notes Legend
  • *******
  • Posts: 3432
  • Respect: +215
Re: brightsky's Chem Thread
« Reply #171 on: July 04, 2013, 12:26:25 am »
0
k ln W is the statistical mechanics definition of entropy. As to why they defined it like that, I have no idea. I've seen a rationale that goes like this:
Obviously, the more heat you add, the bigger the entropy change. However, if the surroundings are at a high temperature, then this heat doesn't change the occupation of the energy states much. Therefore, if we quantitatively define a function, entropy, given by q/T, where does this lead us?

Five pages of working later: oh look, S = k ln W

I would not be surprised if that was what happened.

The problem is, pH isn't even -log[H+]. It's -log of the activity of the H+ ion, which can depend on the ions around it. From a source I found on the internet, adding some MgCl2 solution to an HCl solution can decrease the pH. It's because adding more Cl- ions increases the chemical potential of the H+ ion and hence its chemical activity, by more than the dilution of the solution decreases it.

Statistical mechanics is a bit weird for me to define as well. I just see it as mechanics applied to large numbers of particles, based on common-sense notions of what happens to them. Like PV = nRT.
2012
Mathematical Methods (50) Chinese SL (45~52)

2013
English Language (50) Chemistry (50) Specialist Mathematics (49~54.9) Physics (49) UMEP Physics (96%) ATAR 99.95

2014-2016: University of Melbourne, Bachelor of Science, Diploma in Mathematical Sciences (Applied Maths)

2017-2018: Master of Science (Applied Mathematics)

2019-2024: PhD, MIT (Applied Mathematics)

Accepting students for VCE tutoring in Maths Methods, Specialist Maths and Physics! (and university maths/physics too) PM for more details

scribble

  • is sexier than Cthulhu
  • Victorian
  • Forum Leader
  • ****
  • Posts: 814
  • Respect: +145
  • School Grad Year: 2012
Re: brightsky's Chem Thread
« Reply #172 on: July 04, 2013, 12:36:52 am »
0
pffttt I'D BE HIGHLY SURPRISED IF THEY ONLY NEEDED FIVE PAGES OF WORKING. XDXD

wut. ._____. go away let me believe that pH is -log[H+], you're shattering all my hopes and dreams </3
do you still have the link though? sounds like an interesting read.

i hear the word mechanics and want to cry. physics is not phun. :'(

Mao

  • CH41RMN
  • Honorary Moderator
  • Great Wonder of ATAR Notes
  • *******
  • Posts: 9181
  • Respect: +390
  • School: Kambrya College
  • School Grad Year: 2008
Re: brightsky's Chem Thread
« Reply #173 on: July 04, 2013, 02:29:16 am »
+3
1. Forget about entropy and free energy. You don't need it yet, and I'd rather you learn it properly at university than hurriedly via a few forum posts.

2. Le Chatelier's principle is not a rule of thumb; it's a mathematical consequence of the way chemical reactions take place, i.e. collisions. It is the result of a rigorous, mathematical proof, and should be taken as such.

3. The way K values change with temperature is a result of the Arrhenius equation, which is a consequence of the Boltzmann distribution, i.e. classical thermodynamics. Why this comes about is too difficult to explain, and again, it is much better to learn it properly at university.

4. Equilibrium is a result of chemical kinetics. It is, by definition, a mathematical artefact. Don't interpret it more than such.

For now, it suffices to interpret the interaction between heat and equilibrium as:

A + B + heat <--> C + D : endothermic. Increasing heat pushes the system to the right; decreasing heat pushes it to the left.
A + B <--> C + D + heat : exothermic. Vice versa.
« Last Edit: July 04, 2013, 02:39:40 am by Mao »
Editor for ATARNotes Chemistry study guides.

VCE 2008 | Monash BSc (Chem., Appl. Math.) 2009-2011 | UoM BScHon (Chem.) 2012 | UoM PhD (Chem.) 2013-2015

Mao

  • CH41RMN
  • Honorary Moderator
  • Great Wonder of ATAR Notes
  • *******
  • Posts: 9181
  • Respect: +390
  • School: Kambrya College
  • School Grad Year: 2008
Re: brightsky's Chem Thread
« Reply #174 on: July 04, 2013, 02:48:17 am »
+3
I will now address a couple of specific points with regards to entropy. This may help the individuals I'm replying to, mostly in the sense that it'll better prepare them for future lectures, but I don't recommend reading these for VCE.

entropy is VERY hard to explain. my understanding of it isn't very good tbh, so I'm probably not a good person to ask.
My lecturer hated the disorder definition of entropy, though. He said that entropy is more to do with probability, or the number of microstates corresponding to a macrostate. which is a dodgy definition to me, because in a chemical system it's hard to say what a macro/microstate even is. :S
But entropy (S) is defined to be S = kB*ln(W), where W is the number of microstates corresponding to a given macrostate. again this is dodgy. but we don't really work with entropy itself; we're usually working with the change in entropy, which, of course, comes with more concepts and formulas that I don't really want to go into/probably can't go into because I don't understand it well myself. My lecturer was pretty much like "hey guys, here's a formula, I'm not gonna explain it because it's weird, but trust me it's true (Y)(Y) here have some chocolate frogs!!"
But what is good to know is that the state of highest probability will be the state of highest entropy. So if ΔS is positive, you're going to a more probable state, and if ΔS is negative, you're going to a less probable state.
and I guess what the disorder thing is saying is that the more disorder there is, the higher the entropy, because it's the most probable arrangement. If you have some gas in a container, chances are the gas will be "disordered".

Almost.

The Boltzmann definition S = k ln(W) is nice, but you need to be careful about how you count the number of microstates. If we have one particle in a continuous 1-D space, then we don't have a finite number of microstates. This is one of the finer points in statistical mechanics that you will learn to work around. The Boltzmann definition is a beautiful definition, but often not practical.

Also, entropy is defined for an ensemble of microstates. You don't calculate the entropy of a particular configuration (a microstate). Rather, a microstate belongs to one or more different ensembles, at different probabilities, and each of these ensembles has a different energy level. It is meaningless to talk about the state of highest probability. Rather, it is more meaningful to talk about ensembles that have the highest entropy. You will often find that the set of microstates doesn't change, but their probabilities change between ensembles, and so give rise to a different entropy.

Lastly, in the ideal gas sense, entropy is 'disorder', but there are cases when entropy should not be interpreted as order/disorder. A stricter definition of entropy would invite me to discuss the definition of energy. What is "energy"? Energy is a simple, mathematical description of a certain quantity (not necessarily physical) that is conserved in our universe. Look up "Noether's theorem" if you are interested. Consequently, the conservation of energy is useful for solving physics problems, and we come up with various interpretations of what "energy" is, even though it is just a mathematical artefact. Entropy is much the same thing. It arises because of a mathematical convenience (I forget exactly which theorem it was), and one of its interpretations is Boltzmann's statistical mechanics definition. Entropy and energy are both, at the end of the day, mathematical artefacts. We teach these interpretations rather than definitions because they are pragmatic, but these interpretations should not be taken as laws; "entropy is disorder" is not a totally correct statement.

For a reaction to be spontaneous, its change in entropy for the universe must be a positive value.
Not quite. The definition is much stricter than that. The change in entropy of the universe is always positive. Any system that violates this is not only non-"spontaneous", but also creates a perpetual motion machine.

What you should say is:
For a reaction to be physically possible, its change in entropy for the universe must be a positive value.

Therefore, change in entropy of surroundings can be modelled by (-deltaH)/T.
This is only true if deltaH = Q, and you have defined your system as such. Not all of deltaH results in a heat change; e.g. a photoabsorption has a massive deltaH, but there is little heat exchange.
« Last Edit: July 04, 2013, 03:04:03 am by Mao »
Editor for ATARNotes Chemistry study guides.

VCE 2008 | Monash BSc (Chem., Appl. Math.) 2009-2011 | UoM BScHon (Chem.) 2012 | UoM PhD (Chem.) 2013-2015

Mao

  • CH41RMN
  • Honorary Moderator
  • Great Wonder of ATAR Notes
  • *******
  • Posts: 9181
  • Respect: +390
  • School: Kambrya College
  • School Grad Year: 2008
Re: brightsky's Chem Thread
« Reply #175 on: July 04, 2013, 02:52:32 am »
0
I assume that, if you increase the temperature of the equilibrium mixture, the increased level of energy now in the environment will push the backwards reaction (endothermic) as its activation energy has been reached. Therefore, the forward reaction rate will start to increase (as there are more molecules of reactants that are now going to collide, meaning more frequent successful collisions) until it again equals the rate of the backwards reaction, so in the end there has been a net backwards reaction, thus increasing the concentration of reactants and decreasing the concentration of products. This in the end decreases the K value (I'm assuming this is basically the same as the Ka, just not dealing with acids, although this may have the potential to be of no help at all).

I've written this whole thing and then realised a potential flaw in the first sentence. Now I too am looking for an explanation lol. I'll leave what I wrote in case it does help in some way to reach a conclusion.

this is almost spot on. You shouldn't say "as its activation energy has been reached". Rather, a greater proportion of collisions now exceed this activation energy (according to how the Boltzmann distribution varies with temperature).

This increase in the proportion of successful collisions goes both ways: it helps both the forwards and backwards reactions. However, its impact is greater for the endothermic direction (compared with its respective backwards, exothermic direction), since its activation energy is larger.
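A rough Python sketch of that last sentence (the barrier heights are hypothetical, and exp(-Ea/RT) is only the Arrhenius-style estimate of the fraction of sufficiently energetic collisions):

```python
from math import exp

R = 8.314  # J mol^-1 K^-1

def boltzmann_fraction(Ea, T):
    """Arrhenius-style estimate of the fraction of collisions with energy >= Ea."""
    return exp(-Ea / (R * T))

T1, T2 = 300.0, 320.0
# Hypothetical barriers: the endothermic direction has the larger Ea.
for label, Ea in (("endothermic direction, Ea = 80 kJ/mol", 80e3),
                  ("exothermic direction,  Ea = 50 kJ/mol", 50e3)):
    boost = boltzmann_fraction(Ea, T2) / boltzmann_fraction(Ea, T1)
    print(f"{label}: rate boost from {T1:.0f} K to {T2:.0f} K ~ x{boost:.1f}")
# The higher-barrier (endothermic) direction gets the bigger boost,
# which is why heating shifts the equilibrium position the endothermic way.
```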
Editor for ATARNotes Chemistry study guides.

VCE 2008 | Monash BSc (Chem., Appl. Math.) 2009-2011 | UoM BScHon (Chem.) 2012 | UoM PhD (Chem.) 2013-2015

Nobby

  • Guest
Re: brightsky's Chem Thread
« Reply #176 on: July 04, 2013, 03:40:15 am »
0
This increase in the proportion of successful collisions goes both ways: it helps both the forwards and backwards reactions. However, its impact is greater for the endothermic direction (compared with its respective backwards, exothermic direction), since its activation energy is larger.

On a somewhat related note, why is it that the addition of a catalyst doesn't affect the equilibrium constant of a reaction?

I haven't actually done any integration with the Boltzmann distribution (largely because I wouldn't know what to do) to investigate, but it seems highly unlikely that the catalysed forward and reverse rate constants are going to have the same relationship to each other as the uncatalysed ones.

Mao

  • CH41RMN
  • Honorary Moderator
  • Great Wonder of ATAR Notes
  • *******
  • Posts: 9181
  • Respect: +390
  • School: Kambrya College
  • School Grad Year: 2008
Re: brightsky's Chem Thread
« Reply #177 on: July 04, 2013, 03:57:56 am »
0
On a somewhat related note, why is it that the addition of a catalyst doesn't affect the equilibrium constant of a reaction?

I haven't actually done any integration with the Boltzmann distribution (largely because I wouldn't know what to do) to investigate, but it seems highly unlikely that the catalysed forward and reverse rate constants are going to have the same relationship to each other as the uncatalysed ones.

I'm unsure why we would need to integrate anything here; the Arrhenius equation is enough.

Now, denote the reduction in the activation energy by the catalyst as δ (the catalyst stabilises the same transition state for both directions, so both the forward and reverse barriers drop by the same δ):

If we are at equilibrium, we are more interested in seeking rate forwards = rate backwards. Assuming the activities are all equal to 1, then we have k(forward) = A(f)·exp(-(Ea,f − δ)/RT) and k(backward) = A(b)·exp(-(Ea,b − δ)/RT), and the term containing δ cancels out of the ratio, yielding a Q that is the same as the original K (in terms of the Arrhenius equation), so the equilibrium position is unaffected.
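Here's a small Python sketch of that cancellation (pre-exponential factors, barriers and the catalytic lowering δ are all made-up numbers):

```python
from math import exp

R, T = 8.314, 298.0  # J mol^-1 K^-1, K

def arrhenius(A, Ea):
    """Arrhenius rate constant k = A * exp(-Ea / RT)."""
    return A * exp(-Ea / (R * T))

A_f, Ea_f = 1.0e13, 80e3   # hypothetical forward pre-exponential factor and barrier
A_b, Ea_b = 1.0e12, 60e3   # hypothetical backward values
delta = 20e3               # catalyst lowers BOTH barriers by the same amount

K_uncat = arrhenius(A_f, Ea_f) / arrhenius(A_b, Ea_b)
K_cat = arrhenius(A_f, Ea_f - delta) / arrhenius(A_b, Ea_b - delta)

print(f"K without catalyst: {K_uncat:.3e}")
print(f"K with catalyst:    {K_cat:.3e}")   # identical: exp(delta/RT) cancels in the ratio
# Both rate constants get bigger, so equilibrium is reached faster, but K itself is unchanged.
```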
Editor for ATARNotes Chemistry study guides.

VCE 2008 | Monash BSc (Chem., Appl. Math.) 2009-2011 | UoM BScHon (Chem.) 2012 | UoM PhD (Chem.) 2013-2015

Alwin

  • Victorian
  • Forum Leader
  • ****
  • Posts: 838
  • Respect: +241
Re: brightsky's Chem Thread
« Reply #178 on: July 04, 2013, 10:25:07 am »
+1
I have another question, related to thermodynamics. Is chemical energy the same as internal energy? The textbook says that "the chemical energy of a substance is the sum of its potential energy and kinetic energy". I was always under the impression that this was the definition of internal energy. In fact, I've never really heard of the term 'chemical energy' being used in a technical way. Does 'chemical energy' have a formal definition?
Why does an increase in the temperature result in a decrease in the Ka value for an exothermic reaction? Or rather, how do you explain the effects of temperature on the Ka value using common sense? Currently, I have to resort to a whole bunch of fancy formulas, the derivations of which I do not yet understand. Can anyone provide an intuitive explanation of the phenomena outlined above?

Hmm, I'll kind of disregard the other posts for now (sorry), haven't posted in a while and I'm all out of touch with things :P Hopefully my explanation can help clear up some issues. Also note my chemistry understanding is partly based on Singaporean textbooks I flicked through a while back, so that's just a warning. Someone please correct me if I'm wrong, it's been a couple of years.

The majority of the answer comes under the topic of Chemical Energetics.

Q1
Most simply put, chemical bonds are a source of (potential) energy, the movement of molecules through space is (kinetic) energy, and the vibrations and rotations of molecules are another source of chemical energy. All of these forms of chemical energy contribute in one way or another to chemical reactions. Look at parts of Q2 for a more sophisticated answer.

Q2
My ‘intuitive’ method for determining the change in K for chemical systems is based upon the fact that high temperatures favour endothermic reactions, whereas low temperatures favour exothermic reactions. I will explain more shortly. But first, for any chemical equilibrium there is a forward and a reverse reaction.
Now, if the forward reaction is exothermic (ΔH < 0), that means the reverse reaction must be endothermic (ΔH > 0). At increased temperature the endothermic reaction is favoured, so the reverse reaction occurs faster (greater rate of reaction) than the forward, exothermic reaction does. This means the reactants are now favoured relative to the products. Hence, the equilibrium position shifts to the left, meaning a smaller value of K. You can use this limited logic to deduce the effect on K for different systems under different temperatures.
Look at the spoiler for more:
Table of Answers
Forward reaction | Increase temperature | Decrease temperature
Exothermic       | Smaller K            | Larger K
Endothermic      | Larger K             | Smaller K

Now, a rudimentary explanation of entropy:

Entropy
Entropy, S, is a measure of the disorder (or randomness) in a system.
In nature, any system in random motion tends to become more ‘mixed up’ or disorderly as time passes, i.e. nature tends towards maximum entropy in isolated systems.
Note: An isolated system is one which can exchange neither matter nor energy with its surroundings!

Factors Affecting Entropy
The entropy of a chemical system is affected by:
  • Change in temperature
    1.   Entropy increases as temperature increases.
    2.   This is because as temperature increases, the molecules or ions undergo greater vibration (solids) or rapid motion (liquids and gases) and this reduces overall orderliness
    3.   Note: at absolute zero, a substance has maximum order and so has zero entropy
  • Change in phase
    1.   S(gas) >> S(liquid) > S(solid)
    2.   A solid has low entropy since its crystalline structure is ordered and regular
    3.   A liquid has higher entropy since molecules or ions in liquid state display less order
    4.   A gas has much higher entropy since its molecules have free movement
  • Change in number of particles
    1.   Especially for gaseous systems, increasing number of particles means the system becomes less orderly so larger entropy
  • Mixing of particles
    1.   Mixing processes lead to disorder, and so entropy increases
    2.   Entropy increases when two pure gases are mixed together because overall orderliness is reduced as the molecules become randomly mixed
    3.   Entropy increases when a solid dissolves in a liquid. This is because the original crystal becomes particles scattered throughout the solution

Change in entropy:
The change in entropy, ΔS = S(final) − S(initial)
Some brief examples are:
Water (s) at 0°C to water (l) at 0°C: ΔS is positive
Water (s) at 25°C to water (l) at 25°C: ΔS is positive
Cl2 (g) to 2Cl (g): ΔS is positive since the number of particles increases, 1 mol -> 2 mol
Ar (g) at 2 atm to Ar (g) at 1 atm: ΔS is positive since the Ar atoms are free to move in a larger volume (less order)

Standard Gibbs Free Energy
Every chemical reaction is accompanied by a heat (or energy) change, ΔH, and a redistribution of matter, ΔS. The combined effect of these two factors is expressed by the quantity free energy, G.

The standard Gibbs free energy change, ΔG°, is a state function of a system and is defined by means of the equation:
ΔG° = ΔH° − TΔS°, where T = temperature in kelvin
A state function is one whose value is determined solely by the state of the system (amount of substance, temperature, pressure) and is independent of how the change is brought about.
Note: there are many more ways of expressing ΔG°, most of which can be found here: Gibbs free energy

The sign of ΔG° can be used to deduce whether a reaction or process will be spontaneous:
If ΔG° < 0, the reaction is feasible and could take place spontaneously. The reaction is said to be exergonic (aka energy-giving).
If ΔG° > 0, the reaction is not feasible and cannot take place spontaneously. The reaction is said to be endergonic (aka energy-requiring).
If ΔG° = 0, the reaction is at equilibrium (aka no net reaction).

Reference for the bit above: A-level Study Guide Chemistry (H2) Edition 3.03 by CS Toh ©2006-2009 for formal definitions that made sense and examples, rather than me blithering on about Gibbs and signs making no sense :P
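To make the sign test concrete, here's a small Python sketch. I've plugged in rough literature values for the ammonia synthesis example that appears in the table below (ΔH° ≈ −92 kJ/mol, ΔS° ≈ −199 J/(mol·K)); treat them as illustrative only.

```python
def gibbs(delta_H, delta_S, T):
    """delta_G = delta_H - T * delta_S (standard-state values, SI units)."""
    return delta_H - T * delta_S

# Exothermic reaction with a decrease in entropy (the delta_H < 0, delta_S < 0 case),
# roughly N2 + 3H2 -> 2NH3.
dH, dS = -92e3, -199.0    # J/mol and J/(mol K), approximate literature values
for T in (298.0, 400.0, 600.0):
    dG = gibbs(dH, dS, T)
    verdict = "feasible (exergonic)" if dG < 0 else "not feasible (endergonic)"
    print(f"T = {T:5.1f} K: delta_G = {dG/1e3:7.1f} kJ/mol -> {verdict}")
# delta_G changes sign near T = dH/dS ~ 462 K: spontaneous only at lower temperatures,
# exactly the behaviour described in the table below.
```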

Exothermic and Endothermic reactions
From the equation ΔG° = ΔH° − TΔS°, the value of ΔG° is dependent on the temperature T.
Hence, ΔG° may be negative when:

ΔH < 0, ΔS > 0: Exothermic reaction accompanied by an increase in entropy.
Reaction is spontaneous at all temperatures T (eg organic combustion; explosives).
Eg 1/8 S8 (g) + O2 (g) → SO2 (g)

ΔH < 0, ΔS < 0: Exothermic reaction accompanied by a decrease in entropy.
BUT ΔG° < 0 only when |ΔH| > |TΔS| (from the state function equation).
Reaction is only spontaneous at low temperatures T (eg condensation and freezing; addition reactions; precipitation); at high temperature |TΔS| > |ΔH| and the reaction is not feasible.
Eg H2O (l) → H2O (s)
N2 (g) + 3H2 (g) → 2NH3 (g)

ΔH > 0, ΔS > 0: Endothermic reaction accompanied by a larger increase in entropy.
BUT ΔG° < 0 only when TΔS > ΔH (from the state function equation).
Reaction is only spontaneous at high temperatures T (eg melting and boiling; decomposition; electrolysis; dissolving).
Eg H2O (s) → H2O (g)
AgNO3 (s) + aq → AgNO3 (aq)

ΔH = 0, ΔS > 0: The reduction in Gibbs free energy arises completely from the increase in entropy.
Eg CHCl3 dissolved in CCl4
In this example, the intermolecular forces of the two species are similar, so ΔH = 0. The dissolution also results in greater randomness in the solution, hence entropy increases.

I realise that this is far from a complete introduction to entropy or chemical energetics, but it is all I have time for – I’ve been working on this for a day now haha, saving it in a text file and adding more when I get the chance.
The main point is to look at the last table regarding the sign of ΔG˚ which tells you if the reaction is feasible or not. Remember that when ΔG˚ < 0 the reaction is feasible, and then look at the condition, exothermic or endothermic etc. Hope it helps!

Going out soon, so if anything doesn’t make sense, or I’ve gone wrong somewhere, please let me know! I’ll reply when I can, sorry if I can’t be more helpful.


EDIT 1: Hmm, only just got around to reading all the other posts. For some reason, I get the feeling that integration might just, possibly, not be on the chem 3/4 course ;) And I'd listen to Mao; he always knows what he's talking about, and will most likely correct me wherever I went askew :P
EDIT 2: Didn't quite answer the question, so I've put subtopics in spoilers to make the post less monstrous :P
« Last Edit: July 05, 2013, 12:42:30 pm by Alwin »
2012:  Methods [48] Physics [49]
2013:  English [40] (oops) Chemistry [46] Spesh [42] Indo SL [34] Uni Maths: Melb UMEP [4.5] Monash MUEP [just for a bit of fun]
2014:  BAeroEng/BComm

A pessimist says a glass is half empty, an optimist says a glass is half full.
An engineer says the glass has a safety factor of 2.0

brightsky

  • Victorian
  • ATAR Notes Legend
  • *******
  • Posts: 3136
  • Respect: +200
Re: brightsky's Chem Thread
« Reply #179 on: July 04, 2013, 02:15:37 pm »
0
This might sound like a dumb question, but why can't ethanoic acid function as a buffer by itself? Is it simply because there ain't enough ethanoate ions to consume extra hydronium ions introduced into solution with the addition of an acid?
2020 - 2021: Master of Public Health, The University of Sydney
2017 - 2020: Doctor of Medicine, The University of Melbourne
2014 - 2016: Bachelor of Biomedicine, The University of Melbourne
2013 ATAR: 99.95

Currently selling copies of the VCE Chinese Exam Revision Book and UMEP Maths Exam Revision Book, and accepting students for Maths Methods and Specialist Maths Tutoring in 2020!