Have you ever wondered how changes in transformer input voltage affect the total amperage draw? Knowing this lets you evaluate changes in amperage readings over time. Say, for example, you install a system and take the following initial measurements:

Voltage at the GFCI (under full load): 115V
Primary transformer amperage: 8.0 amps

You return to the site a year later and take new measurements:

Voltage at the GFCI (under full load): 126V (about 10% higher)
Primary transformer amperage: ???? amps

What would you predict, assuming only the voltage has changed? Would the amperage be higher, lower, or the same? If it changes, by how much?

Tip: Amps = Watts / Volts, but that alone will not give the correct answer unless you consider what is happening with the lamps.
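To see why the bare Amps = Watts / Volts shortcut can mislead, here is a minimal sketch comparing two simplified load models using the example readings above: a constant-wattage load (what the shortcut implicitly assumes) and an ideal fixed-resistance lamp load, where current rises in proportion to voltage. The values and both models are illustrative assumptions, not a statement of what the real system will read.

```python
# Illustrative sketch: estimate the new primary amperage under two
# simplified load models, using the example readings from the text.

V1, I1 = 115.0, 8.0   # initial voltage (V) and primary current (A)
V2 = 126.0            # voltage measured a year later (V)

# Model 1: constant-wattage load (what a bare Amps = Watts / Volts
# calculation implies). Power stays fixed, so current drops as voltage rises.
P = V1 * I1                     # apparent load at the first visit, in watts
I2_constant_power = P / V2

# Model 2: ideal fixed-resistance lamp load. Resistance stays fixed,
# so current rises in proportion to voltage (I = V / R).
R = V1 / I1                     # effective load resistance, in ohms
I2_fixed_resistance = V2 / R

print(f"Constant-power model:   {I2_constant_power:.1f} A")   # about 7.3 A (lower)
print(f"Fixed-resistance model: {I2_fixed_resistance:.1f} A") # about 8.8 A (higher)
```

The two models predict changes in opposite directions, which is the point of the tip: real incandescent and halogen filaments fall somewhere between these extremes, because a hot filament has a higher resistance than a cool one, so neither simple model gives the exact reading on its own.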