Okay, I partially understand what is going on with the engine sensors now.
I removed the signal lead from the oil pressure gauge on another car (this one is a 1927 Ford with a 1967 Ford 289 hi-perf engine) and measured the voltage between the lead and the gauge. With the engine off it was just about 10 VDC (~9.99-10.03 V); with the engine running it was about 10.63 VDC.
Now, since this is a 12 VDC system, I assume the ~10 VDC when not running represents the minimum voltage (maximum resistance at the sensor), and that as the pressure approaches its maximum (minimum resistance at the sensor), the voltage sent to the gauge approaches 12 VDC.
Is this correct so far?
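To make the interpretation concrete, here is a quick sanity check in Python. It assumes (and this is only my assumption, matching the reasoning above, not anything measured) that the signal swings linearly from the ~10 VDC engine-off reading up to the full 12 VDC supply at maximum pressure:

```python
# Rough sanity check of the readings above, ASSUMING the signal moves
# linearly from ~10 VDC (engine off / max sensor resistance) up to the
# full 12 VDC supply (max pressure / min sensor resistance).
V_MIN = 10.0   # measured with the engine off
V_MAX = 12.0   # assumed ceiling: full system voltage

def pressure_fraction(v_signal):
    """Fraction of full-scale pressure implied by a gauge-lead voltage."""
    return (v_signal - V_MIN) / (V_MAX - V_MIN)

print(pressure_fraction(10.63))  # running reading -> roughly 0.315 of full scale
```

If the assumptions hold, the 10.63 VDC running reading would sit about a third of the way up the scale; if the sender's response is not linear, the fraction would be off even though the endpoints are right.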
Here is the hurdle for me:
I think I have the voltage divider construction figured out. Simplistically speaking, you connect two resistors in series and take the signal lead from the junction between them. The free end of one resistor goes to the engine sensor (sending unit) and the free end of the other goes to ground.
Am I still on track?
Now for the big question: how do I go about sizing the resistors?
From what I have read, it looks like I will need R1 = 700 ohms and R2 = 500 ohms. Here is my reasoning for that:

output voltage = input voltage × (R2 / (R1 + R2))
So, is this correct or do I need different size resistors here? Should they be 5 and 7 ohms? How about 5K and 7K ohms?
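For what it's worth, plugging those candidate values into the formula above shows two things: with 12 V in, 700/500 ohms gives 12 × (500/1200) = 5 V, not ~10 V; and an (ideal, unloaded) divider's output depends only on the resistor ratio, so 5 and 7 ohms, 500 and 700 ohms, and 5K and 7K ohms all produce the same voltage. A small sketch, assuming the gauge draws negligible current from the divider:

```python
# Ideal (unloaded) voltage-divider output: Vout = Vin * R2 / (R1 + R2).
# Assumes the gauge has a high input impedance; a real gauge will draw
# current and pull the actual voltage below this figure.
def divider_out(v_in, r1, r2):
    return v_in * r2 / (r1 + r2)

for r1, r2 in [(7, 5), (700, 500), (7000, 5000)]:
    # Same 5:7 ratio each time, so the output is 5.0 V in every case;
    # only the absolute values change the current drawn from the source.
    print(r1, r2, divider_out(12.0, r1, r2))
```

The absolute resistor size is still a real design choice: smaller values waste more current through the divider, while very large values make the output sag more under the gauge's load.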
Having said (asked) all of that, am I correct in assuming the measured signal coming from the FB would be somewhere between 10 VDC and 12 VDC, with ~10.63 VDC indicating normal oil pressure (in this instance)?
My brain hurts...