I can tell you from experience that I have seen a lot of weird electrical failures, from light bulbs to computers, caused by low or bad batteries. Don't screw with them; just charge them up and test them, and if they fail, replace them.
Sound advice. We are very much in agreement on that, even if for different reasons.
In hopes of promoting a better understanding of DC circuit principles, though, I hope you'll permit me to digress a little bit and explain in a little more detail why increased current is not the culprit.
Using I = P / E
If you have a 35 watt draw and your battery voltage is 12 volts, then your current is approx. 3 amps. Same 35 watts now with a low battery at 3 volts, and your amperage is approx. 12 amps.
This power formula is well established in science, but the flaw is in how you're applying it. The formula allows you to find the current drain if and only if you know both of the other variables, P and E. The thing is, you cannot assume that power consumption remains constant with changing voltage! It doesn't. In fact, it can't.
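To make that flawed premise concrete, here's a quick Python sketch (my own illustration, nothing from your post beyond the 35-watt figure):

```python
# Holding P fixed at 35 W and solving I = P / E does make the computed
# current climb as voltage falls -- which is exactly the calculation quoted:
P = 35.0  # watts, assumed constant (this is the flawed premise)
for volts in (12.0, 3.0):
    amps = P / volts
    print(f"assuming a constant {P:.0f} W load at {volts:.0f} V: I = {amps:.1f} A")
# But a real resistive load holds R constant, not P, so the premise fails.
```

The arithmetic is fine; it's the "assumed constant" line that doesn't hold for real loads.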
The rated power draw of any given device is the power it consumes only at the given supply voltage for which it is rated. As voltage decreases below the standard test condition, most electrical devices consume LESS current, and dissipate less power as a result.
This is because of Ohm's Law. In any device that presents a constant resistive load, current is always directly proportional to voltage:
I = E / R
The higher the applied voltage, the higher the current through the device; the lower the voltage, the lower the current. Most electrical components do present a fairly constant resistance to their voltage source.
The most notable exceptions are devices like transistors, which can be forced by their associated circuitry to try to maintain a constant current flow despite changing voltage; and tungsten light bulbs, whose resistance is not constant but varies with the filament temperature. Circuits involving these components can TRY to maintain a constant power drain with changing voltage, but only succeed within a fairly limited range of voltages. Real-world circuit resistances then take over to limit current as voltage drops further.
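Here's a minimal Python sketch of that proportionality, assuming a fixed 3-ohm load (an example value of my choosing, matching the headlight filament discussed below):

```python
# Minimal sketch: for a fixed resistive load, I = E / R means current can
# only FALL as supply voltage falls (3 ohms is an assumed example value).
def current_through(volts, ohms=3.0):
    """Ohm's Law for a constant resistance."""
    return volts / ohms

for volts in (12.0, 11.0, 6.0, 3.0):
    amps = current_through(volts)
    watts = volts * amps  # power falls even faster -- with the SQUARE of voltage
    print(f"{volts:>4.0f} V -> {amps:.2f} A, {watts:.1f} W")
```

Halve the voltage and a constant-resistance load draws half the current, not double.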
Take a headlight bulb for an example. To keep the starting numbers simple, let's say it's rated for 48 watts at 12 volts. When powered by that voltage, then, a nice round 4 amperes of current will flow through it, per the formula you quoted. Let's say it's a bright and very pure white light, and deduce a filament temperature around 6500K. At that temperature, by measuring the current and voltage, we know that the filament possesses three ohms resistance. (I = E/R, or 4 amps = 12 volts / 3 ohms)
Now, let's let the battery gradually run down.
At 11 volts, the light gets just perceptibly dimmer, though most people would not yet notice much yellowing of the light. The very fact that there IS any change in the output tells us that the power of the bulb is decreasing, though--it's not a constant.
Interestingly, if you were to measure current through the filament, you'd find it has dropped less than the voltage has, because the cooling of the metal has allowed its resistance to decrease a little. The voltage has dropped just over 8 percent, but the current has dropped only somewhere between 4 and 4.5 percent, to about 3.83 amps. Plugging in the numbers, we now have the bulb drawing about 42 watts (42 W = 11 V * 3.83 A), and the filament resistance is down to 2.87 ohms (R = E/I, or 2.87 ohms = 11 V / 3.83 A) because its temperature is now below 6000K. And THAT is a direct result of the filament having LESS electrical power to convert to light energy.
That trend continues as the voltage falls. At 6 volts, the output of the bulb will be very visibly dimmer, and distinctly orange. The much cooler filament (in the mid-2000K range) will now have a much lower resistance... approximately 2.1 ohms. The current then is I = E/R, or 6 / 2.1 = 2.8 amps. Not falling off nearly as fast as the voltage, due to the very non-linear resistance characteristics of tungsten with temperature, but still falling... not increasing. And the wattage is down to under 18, well under half what it was at the rated 12 volts.
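The three operating points above can be checked with a few lines of Python. The voltage/current pairs come straight from the example; resistance and power then follow from R = E/I and P = E*I:

```python
# Recomputing the headlight example at each voltage: resistance and power
# both follow directly from the measured volts and amps.
measurements = [      # (volts, amps) -- figures from the example above
    (12.0, 4.00),     # rated operation: bright white light
    (11.0, 3.83),     # just perceptibly dimmer
    (6.0,  2.86),     # visibly dim and distinctly orange
]
for volts, amps in measurements:
    ohms = volts / amps   # R = E / I
    watts = volts * amps  # P = E * I
    print(f"{volts:>4.0f} V: {amps:.2f} A, R = {ohms:.2f} ohm, P = {watts:.1f} W")
```

Every column falls together: voltage, current, resistance, and power.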
Point being, the bulb CANNOT force itself to somehow draw the same power despite the voltage change. If it could, it would never start to get dim until the voltage dropped to zero*--and we all know that's not how it works.
The wattage consumed continually drops with decreasing voltage because the device doesn't have any way to force itself to somehow "pull" a correspondingly greater amount of current.
(And that's not even taking into account the fact that as the battery loses charge, its own internal resistance is increasing. That also ultimately limits how much current it can source as the voltage gets lower and lower.)
I used the light bulb example for the very reason that its resistance is not constant, and it CAN self-compensate somewhat for decreasing voltage--but most other electronic devices don't come anywhere close to that behavior. They actually do follow Ohm's Law more linearly, so that their current falls off in substantially direct proportion to a voltage decrease, not merely the square root of it like a tungsten filament does over part of its operating range. These are the sort of devices to which the ECU connects: relays, small lamps that have tiny current draws under worst-case conditions (and LEDs, which stop conducting entirely when voltage drops too low), fuel injectors, stepper motors, sensors, etc.; all devices whose current consumption drops as voltage does.
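For contrast, here's a small Python sketch of the two scaling behaviors. The square-root relation for tungsten is only a rough approximation over part of its range, and the 12 V / 4 A rated point is just the headlight figure reused as an assumption:

```python
# Comparing two load types as voltage falls from an assumed 12 V / 4 A rating:
# a plain resistive load follows I proportional to V exactly; a tungsten
# filament behaves roughly like I proportional to sqrt(V) (approximation only).
import math

V_RATED, I_RATED = 12.0, 4.0  # assumed rating, matching the headlight example

def resistive_amps(volts):
    return I_RATED * (volts / V_RATED)           # linear in voltage

def tungsten_amps(volts):
    return I_RATED * math.sqrt(volts / V_RATED)  # rough filament behavior

for volts in (12.0, 9.0, 6.0, 3.0):
    print(f"{volts:>4.0f} V: resistive {resistive_amps(volts):.2f} A, "
          f"tungsten ~{tungsten_amps(volts):.2f} A")
# Both curves fall as voltage falls; neither one ever rises.
```

The filament's current falls more slowly than a plain resistor's, which is the "self-compensation" described above--but it still falls.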
(*Remind me some time, in some other thread, to tell the story of driving a Triumph GT-6 automobile back from Birmingham, AL, to Warm Springs, GA, in the middle of the night with no alternator! Quite unrelated to [because it'd be completely impossible with!] modern EFI and electronic ignition systems, of course, but a beautiful illustration of power consumption decreasing as voltage does.)