In the procedures of Experiment 3, I found that placing a 2.2k resistor in each ignition output will not stop the engine from starting, but a pair of 15k resistors and a 3.3k resistor in parallel (totalling 2292 ohms) will keep it from starting. I also found that, with resistors in the lines, the voltage across the added resistance and the ignition's input impedance is 9.3v with the engine running and 8.0v with it not running (both measured with the engine hot).
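The 2292-ohm figure quoted above follows from the standard parallel-resistance formula; a quick sketch to check it (the `parallel` helper is just for illustration):

```python
# Sanity-check the parallel combination quoted above:
# two 15k resistors and one 3.3k resistor in parallel.
def parallel(*resistors):
    """Equivalent resistance of resistors in parallel, in ohms."""
    return 1.0 / sum(1.0 / r for r in resistors)

total = parallel(15000, 15000, 3300)
print(round(total))  # about 2292 ohms
```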
2.2k works, but is it the best resistance for the job? We don't want the 8v, non-running signal to fire the ignition, but we also don't want the 9.3v running signal to fail to fire it in a lower-than-normal voltage situation (cold, etc.)
It's possible that there is a specific resistance that, when placed inline with the ignition signals, will divide both voltages down so that they land on opposite sides of the "fire" threshold.
Based on some measurements, I've arrived at a loosely calculated input impedance of about 2.4k for the ignition system. From the failed 2292-ohm test, it's already given that the maximum resistance that can be used is less than 2.3k. The minimum resistance has to be enough that the 8v signal is divided down below the 4.82v that is known to fire the coil, and using the standard series-resistance math, that comes out to about 1.6k ohms.
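Taking the loosely calculated 2.4k input impedance and the 4.82v fire threshold at face value, the inline resistor and the ignition input form a simple voltage divider, and the usable resistance window can be sketched out (the variable names here are my own, not from the experiment notes):

```python
# Model the inline resistor and ignition input as a voltage divider:
#   V_at_coil = V_signal * Z_IN / (R + Z_IN)
# Assumed values from the measurements above:
Z_IN = 2400.0    # loosely calculated ignition input impedance, ohms
V_FIRE = 4.82    # voltage known to fire the coil
V_OFF = 8.0      # signal with engine not running
V_RUN = 9.3      # signal with engine running

# Minimum R: divide the 8.0v signal down below the fire threshold.
r_min = V_OFF * Z_IN / V_FIRE - Z_IN
# Maximum R: keep the 9.3v signal above the fire threshold.
r_max = V_RUN * Z_IN / V_FIRE - Z_IN

print(round(r_min), round(r_max))  # roughly 1583 and 2231 ohms
```

The computed window of roughly 1.6k to 2.2k agrees with the measured results: the 2.2k resistors allowed starting, while 2292 ohms did not.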
The Fly in the Ointment:
The problem with all of this is that, while the resistance discussed would give correct behavior with the car either running or not running, it does not account for the engine's third state - cranking. I'm calling this experiment a dud. (9/24/06)