As I mentioned previously, one of the possibilities was that the wrong procedure gave you the wrong value, and that turned out to be the case: you followed flawed instructions. In fact, I take issue with a lot of what I saw in that article.
First of all, to test the ground wire, they had you put the meter's lead on the battery's positive terminal. Of course you'll have 12 volts that way, which is what you listed. That IS a valid test if you're checking a ground wire with a test light, which needs voltage to work, but it is not the way to take a normal voltage reading. Think of starting with 100 pounds of water pressure in a pipe. You connect a gauge between the source and a point a mile away, find the difference is 20 psi, and from that calculate you actually have 80 pounds at that point. Who would do that? You would just put a gauge at the point and measure 80 pounds directly.
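To translate that back into electrical terms, the article's method reads the difference between battery positive and the ground wire, then leaves you to subtract that reading from battery voltage in your head to figure out what is actually on the wire. My way is the gauge right at the point: put the meter's probe on the ground wire, reference the battery's negative post, and read the answer directly.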
I understand the thinking, but voltage is electrical pressure, and it is always measured in relation to another point. On cars we measure in relation to ground, which is the engine block and the battery's negative post. That is how we take electrical pressure readings directly. Now, to be fair, testing the way your author described does have a legitimate benefit, but it wasn't stated, and that's why most people with some understanding of basic electrical theory are going to be confused.

A ground wire is supposed to have 0.0 volts on it, and it will if it isn't "open", as in cut. Even if it is cut, you'll still have 0.0 volts on the "good" side of the wire, but there will be some voltage on the part of the wire on the other side of the break.

To add to the confusion, the author gave you an awful lot of leeway on what's acceptable for the 5.0 volt supply voltage, which implies you get the same leeway on the ground wire; by that logic, finding 11.5 volts on it would be okay. It absolutely is not. The 5.0 volt supply is very carefully regulated on all cars, and I'd raise an eyebrow if I found 4.8 volts, much less 4.5 volts. The same precision is needed on the ground wire. As I mentioned earlier, you should find 0.2 volts on it. Normally two tenths of a volt indicates a loose or corroded connection, and it's that very tiny voltage that identifies bad connections in high-current circuits like starter circuits, but in this case that 0.2 volts is dropped across the monitoring circuitry inside the Engine Computer, so it is normal. 0.4 volts is not normal and indicates a problem. Usually that's not an issue because most Engine Computers have four ground wires. Each pair serves a different purpose, but two of the wires are for the sensors, and failure of both isn't common.
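If it helps to see where a steady two tenths of a volt can come from, here's a made-up illustration with numbers I'm inventing just for the math: if roughly 10 milliamps of sensor return current flows through a 20 ohm resistor the computer uses to monitor its ground circuit, Ohm's Law gives 0.010 amps x 20 ohms = 0.2 volts. The real circuitry varies by manufacturer, but the principle is the same: a small, consistent drop inside the computer, not in the wire.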
Another problem with using the battery's positive post as the reference is that battery voltage itself varies. A fully-charged battery will read 12.6 volts, so if the ground wire has the normal 0.2 volts on it, you're going to measure 12.4 volts. If the battery was just charged with a portable charger, the surface charge can make it read as high as 13.0 volts. If it is partially run down, it might read 12.2 volts. All of those are legitimate values and do not indicate a problem, but when used as the reference for other voltage readings, you can see how they could vary by six or eight tenths of a volt. When only 0.2 volts (relative to ground) is acceptable, you can't interpret a reading as good or bad when the reference can legitimately vary that much.
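Run the numbers and you'll see the problem. With the normal 0.2 volts on the ground wire, a surface-charged battery gives you 13.0 - 0.2 = 12.8 volts, a fully-charged one gives 12.6 - 0.2 = 12.4 volts, and a partially run-down one gives 12.2 - 0.2 = 12.0 volts. That is a 0.8 volt spread caused by the battery alone, four times the size of the 0.2 volt reading you're trying to judge.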
Measuring the voltage on the ground wire my way will give you 0.2 volts; 0.0 volts and 0.4 volts are not okay. Doing it as described in the article, you said the author called "11 to 12 volts" okay. No it isn't. The only acceptable reading would be 12.4 volts, and even then the battery-voltage variable makes the reading useless.
So, instead of following the directions to move the meter's ground probe to the battery's positive terminal, leave it on ground or the negative post, and take all three voltage readings directly. If you want to experiment, after you read the signal voltage with the plug connected to the sensor, unplug it and read the voltage on the terminal in the connector. The first way you'll find close to 0.5 volts; the 0.9 volts the author mentioned is too high. With the plug removed, you should find the full 5.0 volts, though on some vehicles it will go to 0.0 volts instead. There will be a pull-up or pull-down resistor to force the signal voltage to one of those extremes and set a fault code. That way it can't "float" to some random value that the computer would try to use.
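In case the pull-up idea is new, here's a rough sketch of what's inside the computer: a resistor ties the signal wire either to the 5.0 volt supply (a pull-up) or to ground (a pull-down). With the sensor plugged in, the sensor and that resistor form a voltage divider, and the signal sits somewhere in the normal operating range. Unplug the sensor and essentially no current flows through the resistor, so almost nothing is dropped across it, and the signal terminal gets pulled all the way to 5.0 volts or all the way to 0.0 volts, depending on which end the resistor is tied to. Either extreme is outside the valid range, which is what triggers the fault code.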
The other red flag in that article was when the author said to pierce the wire's insulation. That made the hair stand up on the back of my neck! There are test probes made to do exactly that, but in my Electrical classes I watched very closely for students doing it on my prepared cars. If they did, they were allowed to replace that entire piece of wire from one terminal to the other. Most importantly, that hole is where moisture is going to sneak in and corrode the wire. Before the wire breaks completely it will develop some resistance, and the circuit can still work, just not correctly. In a high-current circuit that could mean a starter motor that cranks too slowly but still cranks, or a heater fan or wiper motor that doesn't run as fast as it should but still runs. In sensor circuits, an error of as little as 0.01 volt can affect what the computer thinks is happening and how it reacts.
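To put numbers on that, suppose a pierced spot corrodes and develops just 0.01 ohm of resistance. That sounds like nothing, but a starter motor can draw on the order of 100 amps while cranking, and Ohm's Law says 100 amps x 0.01 ohm = 1.0 volt lost across that one bad spot. Those are round figures I picked for illustration, but they show how a tiny resistance steals a meaningful chunk of a 12 volt system when the current is high. The starter still cranks; it just cranks slowly.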
People who use "Scotch-Lok" connectors for trailer wiring run into the same problems because they don't seal out moisture. To add to the misery, GM used aluminum wires on some of their vehicles, and those corrode very quickly. If you pierce that insulation, you can expect to find a broken wire within a few weeks.
The other issue on my "bugged" cars was that a wire with a hole poked in it was a clue someone else had already been looking at that wire while trying to diagnose the problem. Absolutely do not poke holes in any wires. If you already have, use some RTV gasket sealer from a small tube to seal the hole when you're done. Electrical tape should never be used on a car or truck; it will unravel into a gooey mess on a hot day.
The last concern is your last sentence, "I was able to test the OHMS and the readings for all 3 wires is .7mV". Ohms and volts are two different things. If you simply mixed up the terminology, I'll get over it, but voltages are read in a circuit that is powered up and working. Resistance (ohms and kilohms) must always be read in a dead circuit. The meter has its own internal battery for that purpose.
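The reason for that rule is in how the ohm meter works: it pushes a small, known current from its internal battery through the part being tested and calculates the resistance from the voltage that develops. Any outside voltage in the circuit gets added into that calculation, so the reading becomes meaningless, and on many meters it can damage them.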
What I suspect you were doing was measuring the continuity of the three wires between the sensor and the computer, and you found 0.7 ohms. You'll actually have more resistance than that just in the meter's leads, so 3 or 4 ohms would not be unexpected. Regardless, you don't have to do those tests unless the preliminary voltage readings give you reason to suspect a broken wire. If any one of those three wires failed its continuity test, you would have found a seriously incorrect voltage on that terminal at the sensor. The voltages can only be correct if the wires are okay.
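One quick habit worth forming before any low-ohm reading: touch the two meter probes together first and see what the meter shows with nothing else in the circuit. Whatever you read there is built into every measurement you take, so subtract it from your readings. Many meters have a "relative" or "zero" button that does that subtraction for you.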
Also watch the "volts" and "millivolts" designations; there's a big difference between the two. If you have an auto-ranging meter, it can be really easy to overlook the scale the meter picked for you. I have over a dozen digital meters I used in 35 years of TV repair, and I never used an auto-ranging meter. It is way too easy to make a mistake when you're in a hurry and taking lots of readings quickly.
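Keep the scale factor in mind: 0.7 millivolts is 0.0007 volts, one thousand times smaller than 0.7 volts. If the meter quietly switched to the millivolt scale and you read the display as whole volts, you'd be off by a factor of 1,000, which is more than enough to make a good circuit look bad on paper.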
Wednesday, May 7th, 2014 AT 5:13 PM