# bulbs burn then smoke and go out.

Discussion in 'Landscape Lighting' started by n2h20, Jun 8, 2005.

2. ### NickN (LawnSite Bronze Member, from Alabama, Posts: 1,010)

n2h20,
Use a multimeter.Check the voltage output of the transformer by hooking the negative lead to the negative side of the txformer and the positive to the positive side.Next,check the current by reversing the leads(negative to positive,positive to negative) and setting your meter to measure amps.(Because current flows negative to positive,you must complete the circuit using your multimeter this way.It's the only way to measure current.)Once you have these two readings,use the P=I*E formula to calculate the power(watts).Now,compare that to what the ouput power(watts) of the transformer is labeled as.If it coincides with what the transformer is supposed to be,then the transfomer is ok and you have another problem.(Make sure to disconnect the wires coming from the transformer to the lights while doing this.)

3. ### yz250fpilot (LawnSite Member, from North Alabama, Posts: 168)

Like Noel said, check your voltage at the fixture. The voltage rating of the bulb is probably less than the voltage being provided at the fixture.

4. ### yz250fpilot (LawnSite Member, from North Alabama, Posts: 168)

Twenty 7 watt bulbs designed to run at the transformer's rated voltage will theoretically operate just as well with a 500 watt transformer as with a 175 watt transformer, with no ill effect. In fact, the 500 watt transformer would probably run cooler.

Not to stray from the subject, but I would actually prefer the 500 watt transformer, because it gives you the flexibility to add more lights later. The 175 watt transformer would already be at 80% of its rated power, and it's a good idea to leave 20% for overhead.

A couple of good things to keep in mind: current divides among parallel paths in inverse proportion to their resistance, and the smaller the load on the circuit (light bulbs in this case), the lower the power consumption. Twenty 7 watt bulbs = (7 watts) x (20) = 140 watts. A small amount of heat is dissipated by the resistance of the circuit wire, but its effect on power consumption is negligible.
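The load and headroom arithmetic above can be sketched like this (a minimal example, using the bulb counts and transformer sizes already mentioned):

```python
def load_summary(bulb_watts, count, transformer_watts):
    """Total lamp load and the fraction of transformer capacity it uses."""
    load = bulb_watts * count
    return load, load / transformer_watts

# Twenty 7 W bulbs on a 175 W vs. a 500 W transformer
print(load_summary(7, 20, 175))  # (140, 0.8)  -> already at 80% of rating
print(load_summary(7, 20, 500))  # (140, 0.28) -> plenty of room to add lights
```

The 175 W unit is right at the 80% mark with no room left, while the 500 W unit leaves generous headroom for future fixtures.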

Steve

5. ### yz250fpilot (LawnSite Member, from North Alabama, Posts: 168)

Resistance does not increase in a parallel circuit.

6. ### dvmcmrhp52 (LawnSite Platinum Member, from Pa., Posts: 4,205)


Please explain, then, why the manufacturers recommend keeping the wattage rating of the transformer close to the actual intended usage, so as not to decrease bulb life?

7. ### NickN (LawnSite Bronze Member, from Alabama, Posts: 1,010)

"Resistance does not increase in a parallel circuit."
You are correct. In a parallel circuit, the total resistance will DECREASE with each resistor added, and in turn the current will increase.
Don't know what I was thinking.
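A quick sketch of that point, using the standard parallel-resistance formula 1/R = sum(1/Ri). The per-lamp resistance here is a hypothetical estimate derived from a 7 W, 12 V bulb (R = V^2 / P), not a measured value:

```python
def parallel_resistance(resistances):
    """Equivalent resistance of resistors wired in parallel: 1/R = sum(1/Ri)."""
    return 1.0 / sum(1.0 / r for r in resistances)

# A 7 W, 12 V lamp presents roughly 12**2 / 7, about 20.6 ohms, when lit
r_lamp = 12.0 ** 2 / 7.0

# Total resistance drops, and total current rises, as lamps are added
for n in (1, 5, 10, 20):
    r_eq = parallel_resistance([r_lamp] * n)
    amps = 12.0 / r_eq
    print(n, round(r_eq, 2), round(amps, 2))
```

Each added lamp lowers the equivalent resistance, so the transformer sees a steadily larger current draw.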

8. ### dvmcmrhp52 (LawnSite Platinum Member, from Pa., Posts: 4,205)

By the way... good info, guys.