Verification, please?
As long as we're almosting and close-ing...
During Run, the coil is supplied with voltage reduced from 14 volts by a 1/2-ohm resistor, while the 4-pin ECU is supplied with the full 14 volts. The original factory 5-pin ECUs, by contrast, were supplied from 14 volts greatly reduced by a 5-ohm resistor.
When referring to the 5-pin ECU as having a reduced voltage supplied during Run through the 5-ohm side of a dual ballast, keep in mind that pin 1 on both the 4- and 5-pin ECUs is also/still supplied with system voltage.
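To put rough numbers on that (just a sketch; the 1.5-ohm coil primary and the half-amp ECU draw below are assumed illustration values, not measurements from any particular setup), the Run voltage at coil+ works out as a series divider across the 1/2-ohm side, and the drop on the 5-ohm side is just Ohm's law times whatever the 5-pin ECU pulls through it:

# Rough Run-circuit arithmetic for the dual-ballast setup.
# Assumed values (NOT measurements): coil primary resistance and the
# current the 5-pin ECU draws through the 5-ohm side of the ballast.
SYSTEM_V = 14.0            # charging-system voltage during Run
BALLAST_COIL_SIDE = 0.5    # ohms, the 1/2-ohm side feeding coil+
BALLAST_ECU_SIDE = 5.0     # ohms, the side feeding the 5-pin ECU
COIL_PRIMARY = 1.5         # ohms, assumed coil primary resistance
ECU_AUX_CURRENT = 0.5      # amps, assumed draw through the 5-ohm side

# Coil side: a simple series divider while the coil primary is conducting.
coil_plus_v = SYSTEM_V * COIL_PRIMARY / (BALLAST_COIL_SIDE + COIL_PRIMARY)

# ECU side: the reduction is just I * R across the 5-ohm half.
ecu_aux_v = SYSTEM_V - ECU_AUX_CURRENT * BALLAST_ECU_SIDE

print(f"coil+ during Run: about {coil_plus_v:.1f} V")        # ~10.5 V with these numbers
print(f"5-ohm feed to 5-pin ECU: about {ecu_aux_v:.1f} V")   # ~11.5 V with these numbers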
Also, in the interest of understanding:
"When the resistor is cold, almost no voltage reduction takes place, so it works just right."
Not really. ;)
That sort of thing might happen when there is a poor connection that generates heat; when the connection is cold it may show less of a voltage drop than when it's hot.
In the case of a wire-wound resistor (the ballast), as long as it's operating correctly the resistance it shows, and therefore the voltage drop it creates, will be close to the same whether it is hot or cold. A cold 1.25 ohms won't be different from a hot 1.25 ohms. All of this holds within design limits: a resistor rated to handle 50 watts might get real hot and work just fine dissipating 40 watts. Make it try to dissipate 70 watts...THEN heat is a problem.
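To make those limits concrete (the 1.25-ohm figure is borrowed from above; the 50-watt rating and the currents are just illustration numbers, not specs for any particular ballast), dissipation is current squared times resistance, so you can check a load against the rating like this:

# Quick I^2 * R check against a resistor's power rating.
# The resistance, rating, and currents are illustrative only.
def ballast_watts(current_a, resistance_ohms):
    """Power the resistor must dissipate, in watts."""
    return current_a ** 2 * resistance_ohms

RATING_W = 50.0
R = 1.25  # ohms, same figure used above

for amps in (5.7, 7.5):  # roughly 40 W and 70 W worth of current
    watts = ballast_watts(amps, R)
    status = "hot but fine" if watts <= RATING_W else "over the rating -> trouble"
    print(f"{amps:.1f} A through {R} ohms = {watts:.0f} W ({status})")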
One other aside (having nothing to do with the original question) that may be helpful: in an OEM setup, the Start wire has a tie-point in the harness, which you don't see, that feeds coil+. That's how both Start and Run end up on the single wire attached to coil+ while taking two paths, Start around the ballast and Run through it.