06-15-2005, 07:30 PM
dvmcmrhp52
LawnSite Platinum Member
Join Date: Jul 2003
Location: Pa.
Posts: 4,210
[QUOTE=NickN]Well, with a 100 watt transformer and, let's say, a 15 V output, you would have 6.66 amps max.
If he has a 500 watt transformer and 15 volts, his max current would be 33.3 amps. I'm assuming his lights are wired in parallel.
If you're pulling too much current (amps) at the fixture[/QUOTE]
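For reference, the arithmetic in that quote is just power = volts x amps, so amps = watts / volts. A quick sketch of it, assuming the transformer is loaded to its full rated wattage at a 15 V tap:

[CODE]
# Max continuous current a transformer can supply: I = P / V.
# Assumes full rated load at a 15 V output tap, per the quote above.
def max_current_amps(rated_watts, volts):
    return rated_watts / volts

print(f"{max_current_amps(100, 15):.2f} A")  # 6.67 A from the 100 watt unit
print(f"{max_current_amps(500, 15):.2f} A")  # 33.33 A from the 500 watt unit
[/CODE]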

If current is measured in amps, as you've stated, how can a 100 watt transformer put out more current than a 500 watt transformer?
Or are we only speaking of more current out of a 100 watt transformer than a 500 watt one when a short is present?

Let's say at 100 watts: does it take more current to produce the same result compared to 500 watts?
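To put numbers on what I'm asking, here's a sketch using the twenty 7 watt bulbs from my question below, wired in parallel on a 15 V tap (the bulb count and voltage are just for illustration):

[CODE]
# The lamps set the current actually drawn; the transformer's wattage
# rating only caps what it can supply. Illustrative numbers only.
lamp_watts = 7
lamp_count = 20
volts = 15

load_watts = lamp_watts * lamp_count      # 140 W total load
current_drawn = load_watts / volts        # 9.33 A either way

for rated_watts in (100, 500):
    capacity_amps = rated_watts / volts   # 6.67 A vs. 33.33 A ceiling
    status = "OK" if current_drawn <= capacity_amps else "overloaded"
    print(f"{rated_watts} W transformer: {capacity_amps:.2f} A capacity, "
          f"load draws {current_drawn:.2f} A -> {status}")
[/CODE]

Run that and the 100 watt unit comes up overloaded while the 500 watt unit loafs, yet the current the bulbs draw is identical in both cases.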

When I said I think most systems have shorts, I did not mean an absolute short... more like the spark plug wires on a vehicle when they are "leaking" and throwing "sparks".
Not an absolute short, to my way of thinking, but a short that reduces efficiency... Make any sense?

One last question:
Why will a larger-than-necessary transformer reduce the life of the bulbs? (A 500 watt transformer running only twenty 7-watt bulbs.)
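One explanation I've run across, sketched below: a transformer loafing well under its rating can sit at the high end of its output voltage range, and incandescent lamp life falls off steeply with overvoltage. The 13th-power rule and the 5% overvoltage figure here are textbook rules of thumb, not measurements from anybody's system:

[CODE]
# Rough incandescent-lamp rule of thumb: life scales with about the
# 13th power of (rated volts / actual volts). Illustrative only.
def relative_lamp_life(rated_volts, actual_volts):
    return (rated_volts / actual_volts) ** 13

# A lightly loaded tap delivering 12.6 V to 12 V lamps (5% over):
print(relative_lamp_life(12.0, 12.6))  # ~0.53 -> roughly half rated life
[/CODE]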

Maybe I'm just too feeble for all of this, but... lernin is good...