engine temp sensor specs

buck351

Anyone know the specs on this for resistance versus temperature? I was checking the resistance of the old and new sensors at 65 degrees and they read 315 versus 275 ohms. I wanted to compare them to the spec for the sensor.

I'm wondering which is closer to spec so the temp gauge is more accurate. Well, as accurate as it can be.
 
Should be the same for the fuel and optional oil gauges. Mopar used to sell a tester with three resistors inside, which simulate the empty/cold, middle-of-scale/half-tank, and full/hot marks.

All you need to do is scare up the resistors from someplace like Mouser Electronics. I did mine this way because I have a "junk box."

http://www.forabodiesonly.com/mopar/showthread.php?t=179517

The basic resistances are

L = 73.7 Ohms (empty)
M = 23.0 Ohms (1/2)
H = 10.2 Ohms (full)
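If you're scaring up resistors to build your own tester, a quick sketch of picking the nearest standard 5% (E24 series) value for each of the three test points. The target resistances are the ones quoted above; the snippet itself is just an illustration, not anything from the Mopar tester:

```python
# Nearest standard 5% (E24) resistor for each of the three test points.
# Target values are the Mopar tester resistances quoted above.
E24 = [1.0, 1.1, 1.2, 1.3, 1.5, 1.6, 1.8, 2.0, 2.2, 2.4, 2.7, 3.0,
       3.3, 3.6, 3.9, 4.3, 4.7, 5.1, 5.6, 6.2, 6.8, 7.5, 8.2, 9.1]

def nearest_e24(target_ohms):
    # Expand the E24 base values across the 10-100 and 100-1000 ohm
    # decades and return the closest one to the target.
    decade = [v * 10 for v in E24] + [v * 100 for v in E24]
    return min(decade, key=lambda r: abs(r - target_ohms))

for label, target in [("L (empty/cold)", 73.7),
                      ("M (1/2)", 23.0),
                      ("H (full/hot)", 10.2)]:
    print(f"{label}: target {target} ohms -> use {nearest_e24(target)} ohms")
```

On the 5% series this lands on 75, 22, and 10 ohms, all within a couple of ohms of the targets, which is close enough for checking a range-indicator gauge.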
 
I decided to try one of the newer replacement sensors I bought, the one with the lower resistance compared to the original. As I expected, the gauge reads higher than it did with the original sensor at 180. After years of looking at the 180 point with the original sender, the higher reading does bother me some. The original read right around the start of the barred range on the temp gauge, and the new one is around the "M" in TEMP. Since both new sensors were lower resistance than the original, I'm thinking this is a reproduction-parts thing; they aren't always exactly like the original.
 
It's too bad I don't know where to find the resistance-to-temperature plot for these. It would be listed for any stand-alone thermistor.

Then, with a temp gauge and an ohmmeter, you could verify that the sender was "sending" the correct resistance for the engine temp.

The entire original gauge system is not known for accuracy, either.
 
If these instruments were anything more than range indicators, they might have had numbers on the faces rather than just hash marks. Therefore the senders don't need to be identical to reach the store shelves. Anyway... since you've asked, I'll give up a little of the information that I paid dearly for:
at 120 degrees 80 ohms
at 170 degrees 32 ohms
at 230 degrees 13.5 ohms
If you do the math you'll find the resistance within the normal operating range changes approximately 3.1 ohms per 10 degrees. That in itself gives the sender makers an out-the-door tolerance of +/- 3 ohms, or +/- 10 degrees. When a gauge has numbers on the face, the matched senders are held to a much tighter tolerance.
If you consider the factory's tester using 73 ohms, 23 ohms, and 10 ohms, you might assume that 23 ohms is the center of the scale and figure out what temperature that is. It wouldn't be correct, though, because 23 ohms is not the center of every gauge scale.
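Out of curiosity, those three points happen to fit a simple thermistor beta model (R = R0 * exp(B * (1/T - 1/T0)), temperatures in kelvin) pretty well, and extrapolating it down to the 65-degree bench test from the first post gives a rough guess at which of the two senders is closer to the original curve. The beta model is my assumption here, not anything from a Mopar spec, so treat this as a sketch:

```python
import math

def f_to_k(f):
    """Convert degrees Fahrenheit to kelvin."""
    return (f - 32.0) * 5.0 / 9.0 + 273.15

# Sender data quoted above: {deg F: ohms}
points = {120: 80.0, 170: 32.0, 230: 13.5}

# Fit the beta coefficient from the two endpoints:
# R = R0 * exp(B * (1/T - 1/T0)), T in kelvin.
t1, r1 = f_to_k(120), points[120]
t2, r2 = f_to_k(230), points[230]
beta = math.log(r1 / r2) / (1.0 / t1 - 1.0 / t2)

def resistance(f):
    """Predicted sender resistance at f degrees F."""
    return r2 * math.exp(beta * (1.0 / f_to_k(f) - 1.0 / t2))

# Sanity check against the middle point (quoted as 32 ohms at 170 F):
print(f"predicted at 170 F: {resistance(170):.1f} ohms")   # ~33 ohms

# Slope over the normal operating range, quoted as ~3.1 ohms per 10 F:
slope = (points[170] - points[230]) / (230 - 170) * 10
print(f"slope 170-230 F: {slope:.2f} ohms per 10 F")       # ~3.08

# Rough extrapolation down to the 65 F bench test:
print(f"predicted at 65 F: {resistance(65):.0f} ohms")     # ~258 ohms
```

On that model the 275-ohm sender from the first post sits closer to the extrapolated curve than the 315-ohm one, but a two-point fit stretched 55 degrees below the data is thin evidence, so take it as a ballpark only.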
I kinda doubt any of this will help figure out your problem.
 