I think you're confusing two concepts.
The transmission voltage is the potential difference between the two wires, measured at either end of the line. The voltage lost along the transmission wires is a different quantity entirely.
Think of the entire transmission system as a series circuit with a weird transformer in the middle, and think of the power source as just a voltage source.
Then the potential difference across the voltage source is constant. The voltage loss, however, is the voltage across the wires, which act as a resistor in their own right. That is not the same as the voltage across the transformer, which is yet another quantity: a transformer has two voltages, one for each coil, and each is the potential difference between the two ends of that winding. So really, when we get to a transformer, it's as if we start a new circuit with a new voltage source.
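Here's a minimal sketch of that series model, just to label which voltage is which. All the numbers (generator voltage, turns ratio, power, wire resistance) are made up for illustration, the transformer is assumed ideal, and the line is treated as purely resistive:

```python
# Toy series model: generator -> ideal step-up transformer -> resistive line.
# Every number below is an assumption chosen only to make the arithmetic easy.

V_source = 10_000.0      # generator / primary-side voltage (V)
turns_ratio = 25.0       # step-up ratio of the transformer
P_sent = 5_000_000.0     # power pushed down the line (W)
R_line = 10.0            # total resistance of the transmission wires (ohms)

V_transmission = V_source * turns_ratio    # secondary-side (transmission) voltage
I_line = P_sent / V_transmission           # line current, from P = V * I
V_drop_wires = I_line * R_line             # IR drop across the wires themselves
V_far_end = V_transmission - V_drop_wires  # what is left at the receiving end

print(f"primary (generator) voltage : {V_source:>10.0f} V")
print(f"secondary (transmission)    : {V_transmission:>10.0f} V")
print(f"line current                : {I_line:>10.1f} A")
print(f"IR drop across the wires    : {V_drop_wires:>10.1f} V")
print(f"voltage at the far end      : {V_far_end:>10.1f} V")
```

Four different "voltages" show up, and only the IR drop across the wires is the loss.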
As you know, when the transformer steps the voltage up, the current drops (for a fixed transmitted power P = VI, doubling the voltage halves the current). The current is really what determines the voltage drop along the wires. Try not to think about transmission questions in terms of voltages, because you'll end up confusing the various voltages with one another. Just think in terms of current, as SocialRhubarb said. There is nothing trippy about the currents, except that they rise as the voltage across the transformer (NOT across the wires! that drop is still just IR) falls. Increase the voltage across the transformer => the current drops => the voltage loss across the power lines drops.
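Using the same made-up numbers as the sketch above, here is that chain run at two different transmission voltages:

```python
# Compare two transmission voltages for the same (assumed) power and wire
# resistance: higher transformer voltage -> lower current -> smaller IR loss.

P_sent = 5_000_000.0   # W, power pushed down the line (assumed)
R_line = 10.0          # ohms, total wire resistance (assumed)

for V_transmission in (25_000.0, 250_000.0):
    I_line = P_sent / V_transmission   # current falls as the voltage rises
    V_drop = I_line * R_line           # IR drop across the wires
    P_lost = I_line**2 * R_line        # I^2 R heating in the wires
    print(f"{V_transmission/1000:>5.0f} kV: I = {I_line:>6.1f} A, "
          f"drop = {V_drop:>7.1f} V, loss = {P_lost/1000:>7.1f} kW")
```

With the power and wire resistance held fixed, ten times the transmission voltage means one tenth the current, one tenth the IR drop, and one hundredth the I²R loss in the wires.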