View Full Version : bulbs burn then smoke and go out.

n2h20

06-08-2005, 10:58 AM

I have a client whose lighting in the front of the house does not work. About 10 lights with a cheapo transformer (300 or 500 watt). So she calls me and tells me that if she replaces a bulb it burns right out. So I tried it, and the bulb burns on the inside (kind of a yellowish smoke), then gets bright and burns out. What could cause this? I have not checked the wiring underground, but I assume it is OK. Would this be a transformer problem?

NickN

06-08-2005, 11:24 AM

Sounds like too much current. Measure the current flow at the fixture and see what you get. Remember, current must be measured "in line".

Take your power (wattage) and divide by your voltage. That is the maximum current flow you should have with zero resistance. If the bulb isn't rated for that, it will blow.

Also, measure the voltage at the fixture. Divide the power (watts) there by the voltage and see what you've got. Then check the amp rating on your bulb.
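A quick sketch of the arithmetic NickN describes: divide the transformer's power by its voltage to get the maximum current it can deliver, then compare that against what a lamp actually draws. The transformer and lamp values below are made-up examples, not numbers from this thread's system.

```python
def max_current(power_watts: float, volts: float) -> float:
    """Maximum current (amps) a supply can deliver: I = P / E."""
    return power_watts / volts

# Hypothetical values: a cheap 300 W transformer with a 12 V tap.
transformer_watts = 300.0
output_volts = 12.0

i_max = max_current(transformer_watts, output_volts)
print(f"Max current available: {i_max:.1f} A")        # 25.0 A

# One 20 W lamp on that same 12 V tap draws far less:
lamp_watts = 20.0
lamp_amps = lamp_watts / output_volts
print(f"One 20 W lamp draws about {lamp_amps:.2f} A")  # 1.67 A
```

The point of the comparison: the transformer rating is a ceiling on available current, not a push; each lamp only draws its own share.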

n2h20

06-08-2005, 07:02 PM

So is that saying that if I had a 500 watt transformer and only one 20 watt bulb, it would blow?

dvmcmrhp52

06-08-2005, 08:11 PM

So is that saying that if I had a 500 watt transformer and only one 20 watt bulb, it would blow?

Most likely it would. Transformers should be sized close to the total wattage of all the bulbs running off of them; otherwise the life of the bulb is decreased.

NickN

06-09-2005, 08:48 AM

<i>So is that saying that if I had a 500 watt transformer and only one 20 watt bulb, it would blow?</i>

No. The current is what's important. Wattage is power. Yes, a 20 watt bulb off a 500 watt transformer will decrease its life cycle, but it won't blow like you're talking about. If you're pulling too much current (amps) at the fixture, this will cause the bulbs to overheat and blow. Think of current flow like water going through a hose. It flows from negative to positive. Voltage is consistent and is felt across the circuit, but current flows like water. Since the negative particles are attracted to the positive particles, you get flow from negative to positive. Now, if for some reason the current flow were disrupted or was too high at that fixture, POP, out goes the bulb.

Check the current at the fixture and look at the specs on your lamp. I'm guessing the current at that fixture is higher than the lamp's amp rating.

Frog Lights, LLC

06-10-2005, 06:38 PM

Check the voltage at the bulb; maybe it's too high.

Check the voltage rating of the bulb; maybe she bought 6 volt bulbs. Wattage is not a factor in this case.

dvmcmrhp52

06-13-2005, 06:28 PM

O.K. frog and Nickn,

How would the amperage at the fixture be too high?

How would the voltage be too high at the fixture outside of the possibility of the customer buying the wrong bulbs?

Not busting on you here, just interested in your responses...............

NickN

06-14-2005, 08:27 AM

A short in the fixture would cause the current to increase. A bad transformer could cause a current increase. Even a bad bulb could (not necessarily the wrong one). By Ohm's law (E=IxR), current is inversely proportional to resistance, so when the resistance drops, the current increases. And for a fixed power (P=IxE), current is inversely proportional to voltage: drop the voltage, and the current increases to deliver the same power.

Take P (watts) and divide by E (volts) and you get the max current (I). If he has a 500 watt transformer and 15 volts, his max current would be 33.3 amps. I'm assuming his lights are wired in parallel. In parallel, the current increases with each fixture installed (It=I1+I2+I3, etc.) (as does resistance), and the voltage remains near the same. (Think two 12 volt batteries in parallel.) So, if a fixture were bad, it could cause an increase in current which would be too high for the lamp to take.
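The parallel-circuit point above can be sketched with a few lines of arithmetic: branch currents add (It = I1 + I2 + ...), while the voltage across each fixture stays roughly the same. The fixture count and wattages are hypothetical; the 500 W / 15 V ceiling matches the example in the post.

```python
volts = 15.0
fixture_watts = [20.0] * 8           # eight hypothetical 20 W lamps in parallel

# Each branch draws I = P / E; the transformer supplies the sum.
branch_currents = [w / volts for w in fixture_watts]
total_current = sum(branch_currents)

print(f"Each fixture: {branch_currents[0]:.2f} A")        # 1.33 A
print(f"Total from transformer: {total_current:.2f} A")   # 10.67 A

# The 500 W transformer's maximum at 15 V, for comparison:
print(f"Max available: {500.0 / volts:.1f} A")            # 33.3 A
```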

I'll try to come up with some basic electronic formulas for this page as time allows and explain different laws and functions. It helps that I have an electronic engineering degree. :D

NickN

06-14-2005, 12:09 PM

Another example of what could have happened is in the drawing here. A short is created across the wire (in red), so instead of the circuit being parallel, it is now a series circuit. The short will provide very little resistance, approx. .1 ohm. So, using Ohm's law, 15 volts / .1 ohm = 150 amps. This would be way over the 33.3 amps and would cause his bulb to blow.

edit: The drawing didn't show up too well. I had to resize it and lost a lot of info.
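The short-circuit case described above is plain Ohm's law, and works out as follows (the 0.1 ohm figure is the post's own approximation of a dead short):

```python
def ohms_law_current(volts: float, ohms: float) -> float:
    """I = E / R."""
    return volts / ohms

volts = 15.0
short_resistance = 0.1   # approximate resistance of a dead short

i_short = ohms_law_current(volts, short_resistance)
print(f"Current demanded by the short: {i_short:.0f} A")  # 150 A

# Far beyond the 33.3 A the 500 W transformer can supply. In practice
# the transformer sags, overheats, or trips, but a bulb in the path
# can still blow first.
```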

dvmcmrhp52

06-14-2005, 08:30 PM

Thanks NickN,

I'm following you somewhat.........No electrical engineering degree here........LOL.

But........one thought for me was a short, but I think ALL low voltage lighting has shorts to a degree..........

However, for argument's sake, let's say she only had a 100 watt transformer. Would it be as likely to have too much current for the specific bulbs, even with a short?

Again, I'm trying to get it straight for myself............

n2h20

06-14-2005, 10:22 PM

I don't know if it makes a difference, but now all (6-8) fixtures have no bulbs in them, and she says they all blow out no matter which one you put the bulb in. I will try to stop by tomorrow to get the exact wattage of the transformer and how many fixtures there are.

Thanks all for your help.

fixer67

06-14-2005, 10:54 PM

A lot of people here keep talking about "wattage" being the problem. It is not. The problem here is voltage. I can take a 12 volt 7 watt bulb and hook it up to a car battery, which can put out over 12,000 watts of power for a short time (12 volt, 1000 amp battery), and by the way, I have an 800 amp battery in my car right now. Anyway, the 7 watt bulb will run just fine with no problem. It is the VOLTS that is causing this problem. Either you have 6 volt bulbs on a 12 volt transformer, or 12 volt bulbs on a 24 volt transformer, or the transformer is messed up and needs to be replaced. I once saw a mess like this where there was no transformer. The man thought he could just plug the lights straight into the 120 VAC wall outlet.

NickN

06-15-2005, 07:56 AM

Well, with a 100 watt transformer and, let's say, 15V output, you would have 6.66 amps max.

As for all systems having a short, no, they don't. Unless you're talking about the filament in the bulb as a short. The filament, though, is creating resistance, so it's not a short. It acts like a resistor. A short has very little resistance, approx. .1 ohm. So, even with a 100 watt transformer, if there was a short (.1 ohms), yes, you would still blow the bulbs, because then you would be creating 1000 amps of current. (100 watts/.1 ohm)

You would create even more current than the 500 watt transformer, because current is inversely proportional to power. (P=IxE)

As the power increases, the current and voltage will decrease, but as the power decreases, the current and voltage will increase.

This is why 110 volts AC is more deadly than 220V AC. It carries more current. Current always finds the shortest path to ground. When someone is electrocuted, they are the shortest path to ground. They are the conductor.

dvmcmrhp52

06-15-2005, 06:30 PM

<i>Well, with a 100 watt transformer and, let's say, 15V output, you would have 6.66 amps max.</i>

<i>If he has a 500 watt transformer and 15 volts, his max current would be 33.3 amps. I'm assuming his lights are wired in parallel.</i>

<i>If you're pulling too much current (amps) at the fixture</i>

O.K.,

If current is measured in amps, as you've stated, how can a 100 watt transformer put out more current than a 500 watt transformer?

Or are we only speaking of more current out of a 100 watt transformer than a 500 watt when a short is present?

Let's see...........at 100 watts it takes more current to produce the same result compared to 500 watts?

When I said I think most systems have shorts I did not mean an absolute short......more like the spark plug wires on a vehicle when they are "leaking" and throwing "sparks"..........

Not an absolute short to my way of thinking but a short that reduces efficiency............Make any sense?

One last question................maybe........lol.

Why will a larger than necessary transformer reduce the life of the bulbs?......( a 500 watt transformer running only twenty 7 watt bulbs)

Maybe I'm just too feeble for all of this, but............lernin is good............

:)

T Edwards

06-15-2005, 09:10 PM

n2h20

I would check the voltage across the output terminals of the transformer. Sometimes weird things happen in the secondary side. On the primary side if the windings are open you get zero output, and if they short it will trip the disconnect unit.

Sometimes you may find a transformer that has no business being in that circuit.

n2h20

06-15-2005, 10:28 PM

Thanks again, I will get over there this weekend to take a look...

NickN

06-16-2005, 08:23 AM

Because current is inversely proportional to power. It's basic Ohm's law.

Let me see if this will work.

P

------

I * E

P, being power, is over E (voltage) and I (current). In this formula, the current and voltage are proportional, but both are inversely proportional to power.

You can use this formula to find power by multiplying I * E. You can also use it to find the current (I) if you know the power and voltage (E), by dividing the voltage (E) into the power (watts).

Or, if you know the power and the current (I), you can find the voltage by dividing the current (I) into the power. Hopefully the formula above helps.
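The P over I * E "triangle" above, solved each of the three ways it can be read (cover the quantity you want, and the layout tells you whether to multiply or divide):

```python
def power(i_amps: float, e_volts: float) -> float:
    return i_amps * e_volts      # P = I * E

def current(p_watts: float, e_volts: float) -> float:
    return p_watts / e_volts     # I = P / E

def voltage(p_watts: float, i_amps: float) -> float:
    return p_watts / i_amps      # E = P / I

# Round-trip check with made-up numbers: 2 A at 12 V.
p = power(2.0, 12.0)
print(p)                         # 24.0 (watts)
assert current(p, 12.0) == 2.0   # recover the current
assert voltage(p, 2.0) == 12.0   # recover the voltage
```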

NickN

06-16-2005, 08:30 AM

<i>Let's see...........at 100 watts it takes more current to produce the same result compared to 500 watts?</i>

In a way, yes, but it's more of a byproduct created by the power and voltage.

It's not necessarily needed, it just happens. Again, Ohm's law.

NickN

06-16-2005, 08:34 AM

" Why will a larger than necessary transformer reduce the life of the bulbs?......( a 500 watt transformer running only twenty 7 watt bulbs)"

Too much power, just like a 500 watt amplifier will eventually blow a 10 watt speaker. The bulb is rated (or made) to withstand a certain amount of voltage, current, and power. If any of these is too high, it will reduce the life of the bulb, because the bulb wasn't designed to withstand the load.

NickN

06-16-2005, 08:37 AM

I should have also stated that the power itself being higher won't shorten the life a whole lot.

NickN

06-16-2005, 10:01 AM

Here's a neat website I found that may help explain things.

http://jersey.uoregon.edu/vlab/Voltage/

NickN

06-16-2005, 10:18 AM

n2h20,

Use a multimeter. Check the voltage output of the transformer by hooking the negative lead to the negative side of the transformer and the positive to the positive side. Next, check the current by reversing the leads (negative to positive, positive to negative) and setting your meter to measure amps. (Because current flows negative to positive, you must complete the circuit using your multimeter this way. It's the only way to measure current.) Once you have these two readings, use the P=I*E formula to calculate the power (watts). Now, compare that to what the output power (watts) of the transformer is labeled as. If it coincides with what the transformer is supposed to be, then the transformer is OK and you have another problem. (Make sure to disconnect the wires coming from the transformer to the lights while doing this.)
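The last step of NickN's procedure, multiplying the two meter readings and comparing against the nameplate, can be sketched like this. The readings, nameplate wattage, and 10% tolerance below are all hypothetical:

```python
# Hypothetical meter readings and label value.
measured_volts = 14.2
measured_amps = 20.5
nameplate_watts = 300.0

# P = I * E from the two readings.
measured_watts = measured_volts * measured_amps
print(f"Calculated output: {measured_watts:.0f} W")  # 291 W

# Allow some tolerance before declaring the transformer bad.
tolerance = 0.10
if abs(measured_watts - nameplate_watts) <= tolerance * nameplate_watts:
    print("Output is in line with the label; look elsewhere for the fault.")
else:
    print("Output is well off the label; suspect the transformer.")
```

(Note a later post in this thread disputes the in-line measurement method itself and recommends a clamp-on ammeter; this sketch only covers the arithmetic once you have readings.)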

yz250fpilot

06-22-2005, 09:09 AM

Like Noel said, check your voltage at the fixture. The voltage rating of the bulb is probably less than the voltage being provided at the fixture.

yz250fpilot

06-22-2005, 09:37 AM

" Why will a larger than necessary transformer reduce the life of the bulbs?......( a 500 watt transformer running only twenty 7 watt bulbs)"

The bulb is rated(or made) to withstand a certain amount of voltage,current,and power.If any of these is too high,it will reduce the life of the bulb,because the bulb wasn't designed to withstand the load.

Twenty 7 watt bulbs designed to run at the transformer's rated voltage will theoretically operate with a 500 watt transformer as long as they will with a 175 watt transformer, with no ill effect. Actually, the 500 watt transformer would probably run cooler.

Not to stray from the subject, but I would actually probably prefer the 500 watt transformer because this would give you the flexibility to add more lights at a later time. The 175 watt transformer would already be at 80% of its rated power, and it's a good idea to leave 20% for overhead.

A couple of good things to keep in mind are that current takes the path of least resistance, and the smaller the load on the circuit (light bulbs in this case), the lower the consumption of power. Twenty 7 watt bulbs = (7 watts) x (20) = 140 watts. There might be a tad bit of heat expelled due to the resistance of the circuit wire, but that would be very negligible regarding the effect on power consumption.

Steve
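The headroom arithmetic from the post above, worked out: twenty 7 watt bulbs total 140 W, which is 80% of a 175 W transformer but only 28% of a 500 W one.

```python
def load_fraction(bulb_watts: float, count: int, transformer_watts: float) -> float:
    """Fraction of the transformer's rating consumed by the lamp load."""
    return (bulb_watts * count) / transformer_watts

total = 7.0 * 20
print(f"Total lamp load: {total:.0f} W")                             # 140 W
print(f"175 W transformer: {load_fraction(7, 20, 175):.0%} loaded")  # 80%
print(f"500 W transformer: {load_fraction(7, 20, 500):.0%} loaded")  # 28%
```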

yz250fpilot

06-22-2005, 09:58 AM

<snip>I'm assuming his lights are wired in parallel. In parallel the current increases with each fixture installed (It=I1+I2+I3, etc.) (as does resistance) and the voltage remains near the same. (Think two 12 volt batteries in parallel.)<snip> It helps that I have an electronic engineering degree. :D

Resistance does not increase in a parallel circuit.

dvmcmrhp52

06-22-2005, 06:11 PM

Twenty 7 watt bulbs designed to run at the transformer's rated voltage will theoretically operate with a 500 watt transformer as long as they will with a 175 watt transformer, with no ill effect. Actually, the 500 watt transformer would probably run cooler.

Not to stray from the subject, but I would actually probably prefer the 500 watt transformer because this would give you the flexibility to add more lights at a later time. The 175 watt transformer would already be at 80% of its rated power, and it's a good idea to leave 20% for overhead.

A couple of good things to keep in mind are that current takes the path of least resistance, and the smaller the load on the circuit (light bulbs in this case), the lower the consumption of power. Twenty 7 watt bulbs = (7 watts) x (20) = 140 watts. There might be a tad bit of heat expelled due to the resistance of the circuit wire, but that would be very negligible regarding the effect on power consumption.

Steve


Please explain, then, why the manufacturers recommend keeping the wattage rating of the transformer closer to the actual intended usage so as not to decrease bulb life?

NickN

06-23-2005, 08:54 AM

"Resistance does not increase in a parallel circuit."

You are correct. In a parallel circuit, the resistance will DECREASE with each resistor installed, and in turn your current will increase.

Don't know what I was thinking.

dvmcmrhp52

06-23-2005, 05:19 PM

By the way...........Good info guys.................

steveparrott

06-27-2005, 11:47 AM

Reading through this thread, I see a lot of unnecessary confusion about low voltage. Here are some comments in response to previous posts:

1. There's no minimum lamp load requirement for quality low voltage transformers. A 1500 watt transformer can be run 24/7 powering only a 20 watt lamp. However, if a poor quality transformer is used, then the manufacturer may have boosted the voltage at the taps to compensate for the low transformer efficiency; in other words, the 12 volt tap may actually be delivering 14 volts under no load, and when it's loaded up with lamp wattage, the actual voltage can drop 2 volts to the required 12 volts. All EI type transformers (the cheaper types) are low efficiency; toroidal transformers have higher efficiency and will drop only about 0.5 volts under full load. A voltmeter will tell the story by measuring voltage at the fixture.

2. Contrary to one post, the only accurate way to measure amps on a low voltage circuit is with a clamp-on type ammeter. For a wire run, clamp around one wire at the common or at the voltage tap; the entire transformer can be measured at the photocell loop.

3. Contrary to one post, the resistance of the wire is very important to include in calculations since low voltage loses voltage very rapidly according to wire gauge and distance.

4. Keep in mind that when checking voltage at the fixture, you need to have all lamps lit (even at the socket you're testing). If one lamp is out then the voltage reading could be off by as much as 0.5 volts. CAST has pigtails that give you an extra socket at fixtures for testing while the lamp is lit.
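Point 3 above, that voltage drop over a low-voltage run depends heavily on wire gauge and distance, can be sketched numerically. The resistances below are approximate standard values for copper wire in ohms per 1000 ft; the run length and load current are made-up examples, not figures from this thread's system:

```python
# Approximate resistance of solid copper wire, ohms per 1000 ft, by AWG.
OHMS_PER_1000FT = {14: 2.525, 12: 1.588, 10: 0.999}

def voltage_drop(gauge: int, one_way_feet: float, amps: float) -> float:
    """Round-trip drop: V = I * R, with R counted over both conductors."""
    r = OHMS_PER_1000FT[gauge] * (2 * one_way_feet) / 1000.0
    return amps * r

# A hypothetical 100 ft run feeding 8 A (roughly 96 W at 12 V):
for gauge in (14, 12, 10):
    drop = voltage_drop(gauge, 100.0, 8.0)
    print(f"{gauge} AWG: {drop:.2f} V dropped, "
          f"{12.0 - drop:.2f} V left at the fixture")
```

At 12 volts, losses of this size are a large fraction of the supply, which is why gauge and distance matter far more here than on 120 V circuits.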
