S1 temp gauge calibration?

The temp gauge on my ‘new’ ‘67 S1 reads a few degrees high, according to two separate thermometers. It appears to have its original sender unit, as far as I can tell.

I’m fine making the mental adjustment, but it would be great if there were an ‘adjustment’ that could be made.

Has anyone done such a thing?

Thanks

If you look at the back of the gauge you should see two holes beside the electrical connectors. A small flat-head screwdriver inserted and turned moves the needle. I did it on my '67 last year, adjusting the gauge to match what my infrared thermometer read at the transmitter.


All of which is probably why Jaguar eventually eliminated the numbers:

Wouldn’t you like a gauge that assured you that you are ‘normal’?

Yeah well with the usual level of accuracy it’d be lying.


I calibrated mine similarly to Terry, but used a thermometer suspended in the coolant in the header tank to avoid IR measurement issues.

The calibration points are accessed by popping out the little cork plugs; one adjusts the zero point and the other is the full scale adjustment.

The picture below gives a better idea of what you’re actually adjusting. It isn’t an adjustment screw that is turned - you’re sliding the little plates inside the gauge that pivot around a fixed point.

image

Great, thanks guys. I had a foggy recollection that this was the case, so this is great.

Yes, I suspended two thermometers in the header tank, and both read 180F while the temp gauge read 190F…so I guess it’s off by more than the few degrees I first indicated.

Depending on where the sender is relative to the header tank, it may not be off. When I checked mine I pulled the sending unit, ran longer wires to the stock connections, and suspended the sender in a pan of water on a hot plate. Then I compared the water temp in the pan to the gauge reading. That was the most accurate way I could think of.

I’d like to think it wasn’t a coincidence, but my IR reading matched the thermostat’s rated temperature.

How do these methods compare to a Series 2? I believe I acquired a Series 1 sensor (likely my ordering mistake). When the car is idling in the garage and my IR gun reads far under 180 (like 140-150) anywhere on the cooling system (intake manifold near the sensor, water pump, any hose, radiator, etc.), the gauge needle reads right up to and past the right-hand red line. Meanwhile I have a Hayden switch set for 180 degrees that doesn’t turn on the electric fan until my IR gun actually reads closer to 180.

I have a legit Series 2 temp sensor on order, but wonder about the ability to adjust the gauge if necessary.

I didn’t adjust mine. I just mapped the temperature readings to a drawing of the gauge face so I’d know in the future that if the needle was pointing to the “M” in Normal, the water temperature was 192F. I was buying myself a little peace of mind.
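If you want that map in machine-readable form rather than on a drawing, a small lookup table works. A minimal sketch; only the “M” = 192F point comes from the post above, and every other letter/temperature pair here is a made-up placeholder to be replaced with your own measurements:

```python
# Hypothetical needle-position-to-temperature table for a gauge face
# that reads "NORMAL". Only "M" = 192 F is an actual measurement from
# this thread; the rest are placeholders.
calibration = {
    "N": 160.0,
    "O": 175.0,
    "R": 185.0,
    "M": 192.0,  # measured: needle on the "M" = 192 F water temp
    "A": 200.0,
    "L": 210.0,
}

def temp_at(letter: str) -> float:
    """Return the mapped water temperature (F) for a needle position."""
    try:
        return calibration[letter]
    except KeyError:
        raise ValueError(f"no calibration point for {letter!r}") from None

print(temp_at("M"))  # 192.0
```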


Great idea, once I get the correct sensor. If I’d done that yesterday, it would have been 135 on the “L”.


If your new sender continues to over-read, I would check where the gauge is getting its power supply from - the Green/Black wire. It (and the fuel gauge) should be connected to the regulated (10v) output of the voltage regulator mounted on the rear of the gauge panel. If it is mis-connected to the unregulated supply (12v), or the voltage regulator isn’t working properly, the gauge will mis-read. The gauge itself can be calibrated and adjusted using a bench power supply - there are some small dots on the dial face that correspond to particular input voltages. I don’t have the details to hand, but they are in the archives somewhere.
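As a rough sanity check on why the wrong supply matters, a sketch assuming the gauge drive scales roughly linearly with average supply voltage (a simplification of the real heated-element behaviour):

```python
def overread_factor(supply_v: float, regulated_v: float = 10.0) -> float:
    """Rough ratio of gauge drive at a given supply voltage versus the
    intended regulated 10 V output (linear approximation only)."""
    return supply_v / regulated_v

# A gauge fed straight, unregulated battery voltage instead of 10 V:
print(overread_factor(12.0))  # 1.2 -> roughly 20% too much drive
```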


I tested the voltage today at the water temp gauge terminal and it was fluctuating between 7 and 10 volts every few seconds. Is that normal behavior? (car was running)

That sounds plausible. The voltage regulator generates an “average” of 10v, by alternating between the battery voltage (nominal 12v) and 0v. The gauges use a heated element to drive the needle position which responds very slowly, so the fluctuating supply voltage is not an issue. Sounds like your gauge is connected to the correct supply.
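The averaging described above can be sketched numerically. The 12v battery and 10v target figures are from this thread; the switching period and sample counts are arbitrary:

```python
battery_v = 12.0   # nominal battery voltage with the car running
target_v = 10.0    # average the voltage stabilizer aims for

# The original stabilizer simply switches between battery voltage and 0 V;
# the long-term average is battery_v * duty_cycle.
duty_cycle = target_v / battery_v   # fraction of each cycle spent "on"

# Average the switched waveform over many samples, the way the gauge's
# slow heated element (or an old analog meter's needle) effectively does.
period = 600                        # samples per on/off cycle (arbitrary)
samples = [battery_v if (i % period) / period < duty_cycle else 0.0
           for i in range(2 * period)]
average = sum(samples) / len(samples)
print(round(average, 2))  # 10.0
```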

Excellent. Thank you.

I’m guessing you were using a digital multimeter and have an original design stabilizer (not one of the solid state versions). The original design switches on and off so measured over time it averages 10V, and an old style analog meter has enough weighting so it will read right.

Replacement solid state stabilizers are nice in that they actually create a stable 10 VDC output instead of a 10 V average of a switched waveform. That makes troubleshooting a bit easier too, since pretty much everyone uses a DVM these days.

Correct on all counts. Thanks for the info.

I loaned my 50 year old Radio Shack multi-meter to my son a few years ago and he “lost” it… that still burns my butt.

LOL. I wish I had kept one of my old analog meters as well. Sometimes they’re just a better tool for the job.


The S2 has the same adjustment mechanism. If you look carefully at the back of the gauge, there are two stoppers covering the adjustment holes. Just bear in mind that you aren’t turning a screw, but moving a cam. So gentle is the word.
