Many Better Buildings Neighborhood Program partners found it important to communicate during the program design phase with the organizations and individuals that would collect or supply data for the evaluation. That way, those individuals and organizations understand why the data is being collected and are invested in the process. Involving evaluators in program design also helps ensure that the right data is collected from the start.

  • The Southeast Energy Efficiency Alliance (SEEA) programs experienced challenges collecting the information needed for third-party evaluation, measurement, and verification activities. SEEA contracted a third party, Cadmus Group, to evaluate its programs’ activities. The inputs to Cadmus’s analysis (programmatic data from the quarterly program reports to DOE and utility bill data) did not always align fully with the data the programs were collecting from homeowners and contractors. SEEA and Cadmus worked with each of the programs to fill in the gaps. Some programs were able to provide the missing information; others were not, either because it existed only in paper format or because they lacked the time or resources to supply it. To address this issue, Cadmus developed a flexible approach to determining savings for each program, based on database reviews, billing analysis, technical desk reviews, whole-house energy modeling, and engineering analysis (the resulting method-selection logic is sketched after this list).
    • Cadmus conducted technical desk reviews when sufficient billing data was not available but SEEA could provide data files from simulation models or other project documentation for a sufficient sample of participant homes.
    • Cadmus used whole-house energy modeling to assess reported savings when sufficient billing or simulation data was not available but the program tracking data was complete enough to populate the required model inputs.
    • Cadmus used engineering analysis to estimate savings for measures not incorporated into the models, or to evaluate the applicability of data from other program locations, when little or no data beyond the program tracking data was available.
    • When data was not available at all, Cadmus looked for comparable program locations by checking weather and participant profiles for similarities.
  • For EnergyWorks in Philadelphia, problems with data tracking became apparent during the data analysis conducted for the program evaluation. EnergyWorks determined that these problems could have been avoided by clearly articulating and communicating data collection goals to program staff and by defining key program performance indicators early in the program. The program established an overarching goal of 2,000 upgrades but found that smaller, performance-based milestones would also have helped it understand trends as they emerged over the life of the program and how those trends were affecting progress toward the overall goal (see the milestone-tracking sketch below).
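
The SEEA/Cadmus fallback hierarchy above can be read as a simple decision sequence. The sketch below is illustrative only: the data model, field names, and the notion of what counts as "sufficient" data are assumptions for the example, not part of Cadmus's actual evaluation protocol.

    from dataclasses import dataclass

    @dataclass
    class ProgramData:
        # Hypothetical summary of what a participating program can supply.
        sufficient_billing_data: bool      # enough utility bills for a billing analysis
        simulation_files_available: bool   # modeling outputs or project documentation
        tracking_data_complete: bool       # enough fields to populate a whole-house model

    def select_savings_method(data: ProgramData) -> str:
        """Choose a savings-estimation approach based on available data,
        following the fallback order described in the list above."""
        if data.sufficient_billing_data:
            return "billing analysis"
        if data.simulation_files_available:
            return "technical desk review"
        if data.tracking_data_complete:
            return "whole-house energy modeling"
        return "engineering analysis / comparison with similar program locations"

    # Example: no usable billing data, but simulation files were provided.
    print(select_savings_method(ProgramData(False, True, False)))
    # -> technical desk review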
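
For the EnergyWorks lesson about interim goals, a minimal milestone-tracking sketch is shown below. The quarterly breakdown and the completion figures are hypothetical; only the 2,000-upgrade overall goal comes from the program.

    # Hypothetical interim milestones toward a 2,000-upgrade goal over eight quarters.
    OVERALL_GOAL = 2000
    QUARTERS = 8
    quarterly_target = OVERALL_GOAL / QUARTERS  # 250 upgrades per quarter

    # Hypothetical completions reported in each quarter so far.
    actual_by_quarter = [180, 210, 260, 240]

    cumulative = 0
    for quarter, completed in enumerate(actual_by_quarter, start=1):
        cumulative += completed
        expected = quarterly_target * quarter
        status = "on pace" if cumulative >= expected else "behind pace"
        print(f"Q{quarter}: {cumulative} of {expected:.0f} expected upgrades ({status})")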