Excessive ballast resistor resistance?

Chained_360 (Indianapolis, IN):
So I recently started hooking up a relay for my electric choke, and I'm using the blue wire from the alternator to the coil to turn on the switch. However, in my investigation as to where the actual ballast resistor IS (wiring diagrams DO NOT represent actual physical locations...), I tested the factory ballast resistor. Now, my '68 shop manual says that it should be 0.5-0.6 ohms of resistance, but I was reading 1.5-1.6 ohms.

How does this affect engine performance? Should I replace the resistor?
 
Should be less than 1 ohm for a factory coil.


Is this a two-terminal ballast? I don't understand "blue wire from alternator to coil." There should not be such a wire.

The blue field wire, if factory, goes from alternator field to the KEY side of the ballast, not the coil

What are you using for a coil? Some aftermarket coils use proprietary resistors. Some use the factory resistor PLUS a dedicated resistor, such as the old rectangular Mallory
 
Probably using the alt field wire as a supply for the electric choke because it's hot with the key on?
 
Yeah, Trailbeast is right, sorry for being vague. I'm using the wire from the field end of the alternator to turn on the relay, because it's only hot when the alternator is turning. I was just wondering if I should replace the resistor because it's over specification on resistance.
 
If you are sure you measured it correctly, I'd say "yes." Is this a two terminal resistor?
 
Measure it when cool; that is when the 0.5 to 0.6 ohm spec is valid. Also, first connect the 2 leads of the ohmmeter directly together and see how much resistance the leads have by themselves. Do this a few times. Then measure the resistor and subtract out the lead resistance to get the real ballast resistance.
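The lead-resistance correction described above is just a subtraction, sketched below (the 0.2-ohm lead value and 1.6-ohm reading are illustrative assumptions, not measured values):

```python
# Subtract the ohmmeter's own lead resistance from the reading to get the
# true ballast resistance. Numbers below are illustrative only.

def true_resistance(measured_ohms, lead_ohms):
    """Measured reading minus lead/contact resistance."""
    return measured_ohms - lead_ohms

# Example: meter reads 0.2 ohm with the leads shorted together,
# and 1.6 ohms across the cold ballast resistor.
lead_ohms = 0.2
reading = 1.6
print(true_resistance(reading, lead_ohms))  # about 1.4 - still well above the 0.5-0.6 spec
```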

If it is indeed 1.5 ohms cold, then it will be about 5 ohms hot, and that is much too high. OEM ballasts were less than 2 ohms hot. That high a ballast will drop spark energy by a factor of 4 or more.
 
If it is indeed 1.5 ohms cold, then it will be about 5 ohms hot, and that is much too high. OEM ballasts were less than 2 ohms hot. That high a ballast will drop spark energy by a factor of 4 or more.

I am curious about your assumption of a temperature increase big enough to cause a 4x resistance change. My thinking was a resistance increase of about 25%; much more than that and the paint on the firewall might burn. I am thinking they should stay below 250°F.
 
No assumptions being made at all, Kit. I measured several different types in an operating stock Mopar points system, and that is the resistance rise I got cold to hot.

Some numbers cold /hot:
1.7/7 (BWD RU4) Do not use
0.6/3.5 (BWD RU19) OK if you have to
0.6/2.0 (NOS Mopar PN 2095501) This is the best part to use; I got some on eBay.

The MSD 0.8 ohm ballast should be OK too, but I have never measured one.

None were excessively hot. You may be making the wrong assumption about the resistance wire being used if you are expecting a lower resistance rise versus temperature. The NOS Mopar ballast is in the car and is not too hot at all.
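The "factor of 4 or more" spark-energy claim can be sketched with nominal numbers. Stored primary energy goes as the square of the primary current, and the current is set by the total primary resistance. The 12 V supply and 1.5-ohm coil primary below are assumptions for illustration, not values from this thread; the hot ballast values are the ones measured above.

```python
# Rough sketch of why a high ballast kills spark energy.
# Stored primary energy E = 0.5 * L * I^2, so E scales with I^2,
# and steady-state I = V / (R_ballast + R_coil_primary).
# Assumed: 12 V supply, ~1.5-ohm coil primary (typical points coil).

V = 12.0
R_COIL = 1.5  # ohms, assumed coil primary resistance

def primary_current(r_ballast):
    return V / (r_ballast + R_COIL)

good = primary_current(2.0)   # NOS Mopar ballast, hot
bad = primary_current(7.0)    # BWD RU4, hot
print(f"energy ratio: {(good / bad) ** 2:.1f}x")  # roughly 6x less spark energy
```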
 
I just for fun looked up a '70 Plymouth RR 383 at O'Reilly's, and they show a (BWD) RU-6, but I cannot find a listed spec for resistance.

NAPA shows two, both incorrect and both way over 1 ohm; one is 1.85. Christ O'mighty, whatever HAPPENED to parts in this country??
 
I think the resistors are a coil of nichrome wire. The resistance of nichrome used for heating elements is well characterized and documented for resistance vs. temperature. The problems lie in the junction between the nichrome and the terminals, along with the ratio of nickel to chromium in the alloy.

Over time and heat cycles, the spot weld on the terminals can fail, and the nichrome can oxidize, changing its properties toward increased resistance. When it's made in China, who controls the alloy?

It is possible to make an active current source with a transistor; I do that on the low switched side of my electronic ignitions. In normal operation the source resistance is quite low; when the current limit is reached, the current holds at the set limit. The dwell time is controlled as well: the current ramps up with time and voltage, and controlling that properly limits the current. At low RPM extra dwell is required because of significant RPM variations; it is hard to predict a few milliseconds over hundreds of milliseconds.

The standard Mopar control box does not control dwell time; it is similar to points, but it extends the dwell by limiting the ignition period, similar to a dual-points setup. The ballast limits the current.
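The dwell issue above comes from the inductive ramp-up of primary current. A minimal sketch, with assumed coil values (the 8 mH inductance and 3.5-ohm total resistance are illustrative, not from the thread):

```python
import math

# Primary current in an inductive coil ramps as
#   I(t) = (V/R) * (1 - exp(-t*R/L)),
# so at high RPM there may not be enough dwell time to reach the
# current limit. All values below are illustrative assumptions.

V = 12.0    # supply volts
R = 3.5     # ohms, ballast + coil primary (assumed)
L = 0.008   # henries, assumed coil primary inductance

def coil_current(t):
    """Primary current (A) after t seconds of dwell."""
    return (V / R) * (1.0 - math.exp(-t * R / L))

def dwell_to_reach(i_target):
    """Dwell time (s) needed for the ramp to hit i_target amps."""
    return -(L / R) * math.log(1.0 - i_target * R / V)

# e.g. time to reach 3 A out of a ~3.43 A steady-state maximum:
t = dwell_to_reach(3.0)
print(f"{t * 1000:.1f} ms")  # about 4.8 ms
```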

Enough about all that for now, I need sleep.
 
I just for fun looked up a 70 Plymouth RR 383 at O'Reallys and they show an (BWD) RU-6, but I cannot find a listed spec for resistance.
I looked all over the BWD site and never found complete specs either. BTW, the BWD RU12's and RU23's are not right for this application either.

NAPA shows two, both incorrect, and both way over 1 ohm, 1 is 1.85. Christ O'mighty, what ever HAPPENED to parts in this country??
Specs seem to matter less these days than part listings that serve the distribution chain. I have found just one parts-counter guy in recent years who even had an inkling that the right ballast resistance can be important, but I don't think he knew why!
 
I think the resistors are a coil of nichrome wire. The resistance of nichrome used for heating elements is well characterized and documented for resistance vs temperature. The problems lie in the junction between the nichrome and the terminals, along with the ratio of nickel and chromium alloy.
Well, that is what I don't know for sure... whether it is nichrome, or another alloy with a higher R vs. T coefficient. BTW, most of the resistance rise seemed to take place in the first 30 seconds or so.
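The nichrome question above can be checked with a back-of-envelope temperature-coefficient calculation. Using the linear model R(T) = R0 * (1 + alpha * dT) and the 0.6/2.0 ohm cold/hot figures measured earlier in the thread, the implied temperature rise for nichrome's low tempco is implausibly large, while a high-tempco metal like pure nickel lands in a believable range. The alpha values are rough textbook figures, and the linear model is crude over wide temperature ranges.

```python
# Back-of-envelope check on whether the element could be nichrome,
# using the linear approximation R(T) = R0 * (1 + alpha * dT).
# Tempco (alpha) values are rough textbook numbers.

R_COLD, R_HOT = 0.6, 2.0  # ohms, NOS Mopar ballast from the thread

def implied_delta_t(alpha):
    """Temperature rise (deg C) needed to explain the cold-to-hot ratio."""
    return (R_HOT / R_COLD - 1.0) / alpha

print(f"nichrome (alpha ~0.0004/C): {implied_delta_t(0.0004):.0f} C")  # thousands of degrees - implausible
print(f"pure nickel (alpha ~0.006/C): {implied_delta_t(0.006):.0f} C")  # a few hundred degrees - plausible
```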
 