OMG, thank you so much for helping with the +C common denominator error I had. I wasn't going to sleep if I couldn't get that solution.
Still a little fuzzy on the +C idea though, which is a little embarrassing as a 4u student, but was the problem here that I was trying to multiply +C by a variable/unknown term? Because it's okay to do if it's just a constant like \(2\) or \(-e^{0.1}\), since \(C\) represents any constant anyway?
You introduce a new constant that depends on the old constant. So say your original constant was \(C\). Your new constant would be a new arbitrary constant, say \(C_0\), instead. The idea is that while \(C\) is arbitrary, you can't use the same \(C\) to represent two different things.
-----------------------------------------------------
Basically, what happened was that when you multiplied everything else by a term, namely \(k(1-\alpha)\), you didn't multiply \(C\) by it at all. That's no longer mathematically correct.
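To sketch the idea (using a generic antiderivative \(F(x)\) as a stand-in, since I don't have your exact working in front of me): multiplying both sides means \(C\) gets multiplied too, and the result is then relabelled as a fresh constant.
\[
\begin{aligned}
y &= F(x) + C \\
k(1-\alpha)\,y &= k(1-\alpha)\,F(x) + k(1-\alpha)\,C \\
k(1-\alpha)\,y &= k(1-\alpha)\,F(x) + C_0, \quad \text{where } C_0 = k(1-\alpha)\,C.
\end{aligned}
\]
The relabelling \(C_0 = k(1-\alpha)C\) is legitimate precisely because \(k(1-\alpha)\) is itself a constant, so \(C_0\) is still an arbitrary constant. That wouldn't work if you multiplied \(C\) by a variable term.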