Does fuse amperage required change with voltage


mcnoople (Well-Known Member, joined Mar 11, 2012, Illinois):
My question is this: does a circuit require a larger fuse at 12 V than it would at 14 V? I am planning a complete underhood rewire and figured I would ask. What I mean is, if your headlamps draw 22 amps at 12.7 V, would they draw fewer amps at 14.2 V? And if that is how it works, what voltage do you use when testing current draw? I am assuming fast idle with all loads off (other than the tested load), but I would hate to find myself under-fused in a low-battery condition, like a cold start in winter. So this is mainly a fuse selection question: how much extra amperage do you allow when choosing a fuse rating? For a 22 amp load plus 10%, would a 25 amp fuse be sufficient?

None of the posted numbers are the result of any test; they are numbers I made up for the question.
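The sizing arithmetic in the question can be sketched in a few lines. The margin and the list of standard blade-fuse ratings below are illustrative assumptions, not a recommendation:

```python
# Hypothetical fuse-sizing helper: pick the smallest standard blade-fuse
# rating at or above the load current plus a safety margin.
# The margin and size list are illustrative assumptions only.
STANDARD_FUSES = [5, 7.5, 10, 15, 20, 25, 30, 40]  # common blade-fuse ratings (amps)

def pick_fuse(load_amps, margin=0.10):
    """Return the smallest standard fuse >= load * (1 + margin)."""
    needed = load_amps * (1 + margin)
    for rating in STANDARD_FUSES:
        if rating >= needed:
            return rating
    raise ValueError("load exceeds largest standard fuse")

# The question's example: 22 A load plus 10% -> 24.2 A -> 25 A fuse
print(pick_fuse(22, 0.10))  # -> 25
```

With a 25% margin instead, the same 22 A load would round up to a 30 A fuse.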
 
Things can get a little complicated, but the simple answer is NO.

Complicated because a lamp's resistance increases with heat; that is, lights draw more current when cold than when lit. But in the range where they are lit, the resistance doesn't change much, so we can assume it stays the same.

So you will have to allow enough "fudge factor" to cover the power-up surge when you first turn them on, as they will draw more current for a short time.

Google up "ohms law."

Simply, we assume the lamp resistance "stays the same" when at operating brilliance.

Since R (resistance) is now assumed to be (pretty much) constant, this means that if voltage goes UP, then current goes UP as well.

Ohms law is I = E / R, which means

Current in amps = Voltage divided by Resistance

So if E (Voltage) goes up, I (Current) must go up

http://hvacwebtech.com/images/pie.gif
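The constant-resistance reasoning above can be checked numerically. A minimal sketch using the made-up numbers from the question (22 A at 12.7 V), deriving the lamp resistance and reusing it at charging voltage:

```python
# Ohm's law with constant resistance: I = E / R.
# 22 A at 12.7 V are the made-up figures from the question above.
E1, I1 = 12.7, 22.0
R = E1 / I1           # lamp resistance, assumed constant when lit (~0.577 ohm)

E2 = 14.2             # typical charging voltage
I2 = E2 / R           # current at the higher voltage
print(round(I2, 1))   # -> 24.6, i.e. higher voltage means MORE current, not less
```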
 
Hmmm... something's not making sense to me. I agree with everything you said until you got to the part that if E (voltage) goes up I (current) goes up too.

Current and voltage are inversely proportional, so since wattage = current x voltage, given the same wattage, if current goes up, voltage goes down, and vice versa. What am I missing?
 
No, it's R that's inverse: if R goes up, less current is drawn at any given voltage. And the power formula is not inverse either; the wattage here is not a fixed quantity, it changes along with voltage and current. You need a formula where one value is the divisor (below the fraction line) for the relationship to be inverse.

Current cannot be "changed" except by the EFFECT of changing the voltage or resistance.

Obviously, you cannot manually change the resistance of something such as a light bulb. But an example is the dash dimmer control on your headlight switch: it is nothing more than a variable resistance. To make the lamps brighter, you manually DEcrease the resistance of the dimmer control.

This increases voltage TO the lamps

and CAUSES current to increase.
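The dimmer behaves like a rheostat in series with the lamps, and the effect described above can be sketched as a voltage divider. The supply and resistance values below are made up for illustration:

```python
# Toy model of a rheostat dimmer in series with a lamp (made-up values).
V_SUPPLY = 12.0   # volts
R_LAMP = 6.0      # lamp resistance, treated as constant (ohms)

def lamp_state(r_dimmer):
    """Return (current, lamp_voltage) for a given dimmer resistance."""
    i = V_SUPPLY / (R_LAMP + r_dimmer)  # Ohm's law over the whole series loop
    return i, i * R_LAMP                # lamp voltage = I * R_lamp

i_dim, v_dim = lamp_state(6.0)        # dimmer turned up (more resistance)
i_bright, v_bright = lamp_state(0.0)  # dimmer resistance cut to zero

print(i_dim, v_dim)        # 1.0 A across 6.0 V: dim
print(i_bright, v_bright)  # 2.0 A across 12.0 V: less dimmer R -> more V and I
```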

With a load whose resistance is constant, like a baseboard heater, where the voltage is known and can be changed, current can be affected in that way.

If you could "lash up" a tapped transformer (or use what is called an autotransformer, or Variac) you can manually change the voltage delivered to the load, in this case a heater.

As you increase voltage, R stays about the same, and current must increase. With a constant R, Ohm's law tells us that current bears a simple relationship to voltage: if we double the voltage, the current doubles.

This means wattage (power) will NOT merely double with a 2x voltage increase; it will increase by a factor of FOUR.

I=E/R

If E is 110V, and R is 55 ohms, I will be 2A

If E is 220V, R stays the same, I will be 4 A

So our power will be

P=IE

P=110V x 2A or 220 watts
P=220V x 4A or 880 watts.
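The heater arithmetic above can be replayed directly, confirming that doubling the voltage quadruples the power:

```python
# Constant-resistance heater: doubling the voltage doubles the current
# (I = E / R) and quadruples the power (P = I * E).
R = 55.0  # ohms, constant

for E in (110.0, 220.0):
    I = E / R
    P = I * E
    print(E, I, P)  # 110 V -> 2 A, 220 W;  220 V -> 4 A, 880 W
```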
 
After all that, the simple answer is still no. A fuse is a circuit protection device. Unless you plan to upgrade the entire circuit, the amp load it can handle remains unchanged.
And by the way, a fast-blow fuse rated at 30 amps can pass more than 50 amps for X milliseconds before opening.
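That "passes more than its rating for a while before opening" behavior is often approximated with a melting I²t constant. The constant below is invented purely for illustration, NOT real fuse data; real fuses publish time-current curves in their datasheets:

```python
# Toy fuse-opening model using a melting I^2*t approximation.
# I2T is a hypothetical value for illustration only -- real fuses
# publish measured time-current curves.
I2T = 300.0  # amp^2-seconds, invented for a "30 A fast-blow"

def melt_time(current):
    """Approximate seconds to open under a sustained overcurrent."""
    return I2T / current**2

print(melt_time(50))  # -> 0.12: a 50 A fault opens in a fraction of a second
print(melt_time(31))  # just over rating: takes noticeably longer to open
```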
 
Fuses are often sized to protect the wiring, not the loads. Protecting loads with fuses is not foolproof.

For loads such as motors, at a given mechanical load, increasing the voltage reduces the current, because power is V*I. Starting a motor results in high current, which falls as the motor speed increases; it is back-EMF that reduces the current in a full-speed, lightly loaded motor.

For lighting loads, the turn-on current is high for a cold filament. When the filament is hot enough to produce light, the current drops.

When initial starting currents are much greater than normal loads, it becomes obvious why fuses are not used to protect loads, just wiring. Fuses are selected by wire gauge; wire gauge is selected by current requirement and permissible voltage drop.
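The "fuse protects the wire" rule can be sketched as a lookup: size the wire for the load, then pick the largest standard fuse that stays at or below the wire's ampacity. The ampacity figures below are rough illustrative numbers; real ratings depend on insulation, bundling, and ambient temperature, so consult an actual chassis-wiring table:

```python
# Illustrative only: ampacity varies with insulation, bundling, and
# temperature -- consult a real chassis-wiring ampacity table.
AMPACITY = {18: 16, 16: 22, 14: 32, 12: 41, 10: 55}  # AWG -> amps (assumed)
STANDARD_FUSES = [5, 7.5, 10, 15, 20, 25, 30, 40]

def fuse_for_wire(awg):
    """Largest standard fuse that does not exceed the wire's ampacity."""
    limit = AMPACITY[awg]
    return max(f for f in STANDARD_FUSES if f <= limit)

print(fuse_for_wire(14))  # 14 AWG (assumed 32 A ampacity) -> 30 A fuse
```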

Reliable load protection is often with electronic means, using solid-state devices for current sensing and control.
 