Originally Posted by FIMCO-MEISTER
The point of any ET-based smart controller is to perform as close to ideal as possible.
The point of these smart controllers is to provide hands-off irrigation adjustments in response to changes in weather .... which "should" lead to significant water savings. This study is pointless with regard to field conditions, where we have highly variable system performance, mixtures of smart and standard controllers, and, in the vast majority of cases, incorrect programming of all controllers.
This is a controller evaluation study under ideal virtual conditions, some of which are completely unrealistic. The first year of the study attempted to evaluate the controllers in the field .... and failed miserably.
Even if a controller is programmed correctly, and the lawn boys and homeowners stay out of it, it still requires adjustments throughout the season (smart or not), which hardly ever happen. Since the "industry standard" is to program a controller for peak water use, controllers that go unadjusted (the majority) waste huge amounts of water whenever conditions are below peak. The smart controllers should easily outperform a standard controller in this case, as they will self-adjust to some extent (perfectly accurate or not) based on current weather conditions .... assuming, of course, that they were programmed correctly.
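To make the waste concrete, here is a minimal sketch of the idea behind ET-based self-adjustment: the run time programmed for peak conditions gets scaled by the ratio of current ET to peak ET. All names and the peak-ET baseline value here are illustrative assumptions, not taken from any actual controller.

```python
# Hypothetical sketch of ET-based run-time adjustment.
# PEAK_ET_IN_PER_DAY is an assumed peak-season reference ET, not a
# value from any real product or study.

PEAK_ET_IN_PER_DAY = 0.25  # assumed peak ET (inches/day)

def adjusted_runtime(base_runtime_min, current_et_in_per_day):
    """Scale the peak-season run time by current ET / peak ET.

    base_runtime_min: minutes per cycle programmed for peak conditions.
    current_et_in_per_day: weather-derived ET estimate for today.
    """
    ratio = max(0.0, current_et_in_per_day) / PEAK_ET_IN_PER_DAY
    # Cap at 1.0 so the controller never exceeds its peak program.
    return base_runtime_min * min(ratio, 1.0)

# An unadjusted standard controller runs 20 minutes no matter what;
# the ET-based scaling cuts that back in milder conditions.
print(adjusted_runtime(20, 0.25))  # peak conditions -> 20.0 minutes
print(adjusted_runtime(20, 0.10))  # mild spring day -> 8.0 minutes
```

The gap between those two numbers is exactly the water a fixed peak-programmed schedule wastes on a mild day.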
Now, if you want a realistic evaluation of controllers under typical field conditions, then you need to perform the study in the field, with the system conditions and controller programming typically found there.