Conducting an evaluation requires that you work with a number of different parties, both internal and external, to ensure that:
- You and your evaluator(s) maintain a shared vision of the goals of the evaluation, and how you will get there
- Data transfers happen in a timely fashion
- Stakeholders are engaged in the act of conducting evaluations and are given an opportunity to provide input on scope changes and interim deliverables, so they are more likely to find the results credible and act on the recommendations.
This handbook describes the steps necessary and resources available for successful third-party evaluations, including overseeing evaluation activities, reviewing evaluation deliverables, identifying and managing potential risks and evaluation scope changes, and communicating progress.
Conducting an evaluation builds on the preparations you made before this stage. Refer to the Evaluation & Data Collection – Develop Evaluation Plans handbook for guidance on identifying the right questions to ask, appropriate metrics to collect, and the processes needed to initiate third-party impact and process evaluations. See the Evaluation & Data Collection – Develop Resources handbook for information on how to identify and implement data collection systems and tools for an effective evaluation.
This handbook discusses the steps you should take to manage third-party impact and process evaluation activities. These steps include:
- Oversee evaluation activities
- Review evaluation deliverables
- Identify and mitigate potential risks
- Adjust scope and timeline to accommodate evaluation changes
- Communicate progress throughout the evaluation process
Oversee evaluation activities
In addition to monitoring the progress of the evaluation and confirming that staff and the evaluation team are performing as agreed during contract negotiations, you will need to take several steps to oversee evaluation activities once they begin:
- Review each program component’s evaluation plan, including the key data, metrics, and measurement strategies, and ensure that the evaluation team implements it.
- Host an initial meeting with the evaluation team. This should be in-person and should include at least the evaluator’s project lead and your team’s key contacts for the evaluation process. Use this meeting to make any necessary clarifications to the scope of work and timeline, as well as roles and responsibilities of key team members. Sharing information about the program itself is an important part of this meeting because it provides valuable context and perspective to the project team.
- Schedule periodic check-in meetings with the evaluation team to answer questions or provide clarification on evaluation deliverables and ensure that the original plans and any subsequent changes are understood and appropriately implemented.
- Coordinate the transfer of information from staff and subcontractors to the evaluation team to ensure that staff and subcontractors are complying with the protocols and procedures agreed upon in the contract negotiation and final scope of work. As the effectiveness of the evaluation depends on the quality and timeliness of information from your team, this coordination will be necessary throughout the evaluation period.
- Arrange for the evaluation team to contact and interview a sample of your program’s contractors and customers. To be considerate of their time, ask contractors and customers if they are willing to be contacted about the program at a later date. Customers could be asked during the rebate or loan application process if they are willing to be contacted in the future. Be cognizant of how often contractors and customers are contacted for surveys or interviews, whether they are for quality assurance or an evaluation. If they are contacted too often to provide feedback, they may be less likely to participate.
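The contact-frequency concern above can be enforced with a simple eligibility check against whatever tracking system you use. The sketch below is illustrative only; the customer IDs, field layout, and 90-day minimum gap are hypothetical assumptions, not requirements from this handbook:

```python
from datetime import date, timedelta

# Hypothetical contact log: customer ID -> date of most recent survey or interview.
last_contacted = {
    "cust-001": date(2017, 9, 1),
    "cust-002": date(2017, 11, 20),
    "cust-003": None,  # opted in to future contact but never yet surveyed
}

def eligible_for_survey(customer_id, as_of, min_gap_days=90):
    """Return True if the customer has not been contacted within the minimum gap."""
    last = last_contacted.get(customer_id)
    if last is None:
        return True
    return (as_of - last) >= timedelta(days=min_gap_days)

today = date(2017, 12, 14)
# Only customers outside the 90-day window are sampled for the evaluator.
sample = [c for c in last_contacted if eligible_for_survey(c, today)]
```

A check like this lets quality-assurance surveys and evaluation interviews draw from the same log, so no customer is asked for feedback too often.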
Where to Find Evaluation Reports from Residential Energy Efficiency Programs
Reviewing evaluation reports from other residential energy efficiency programs can provide insights into the evaluation activities you should consider for your program. Here are some national and regional repositories of energy efficiency program evaluations:
- The Better Building Neighborhood Program’s Evaluation Report webpage provides links to grantee evaluation reports.
- The California Measurement Advisory Council (CALMAC) provides a searchable database of evaluation reports on energy efficiency programs in the state.
- The Northeast Energy Efficiency Partnerships (NEEP) EM&V Forum’s Repository of State EM&V Studies contains links to historical and recently released studies and evaluation reports from across the Northeast.
- The Northwest Energy Efficiency Alliance (NEEA) has completed several market research and evaluation reports, including reports for initiatives in the residential sector.
- The U.S. Energy Information Administration (EIA) State Energy Efficiency Program Evaluation Inventory provides a report and searchable spreadsheet of EM&V reports, including annual reports and impact and process evaluation reports.
Review evaluation deliverables
Depending on the scope and scale of your evaluation, you may have several deliverables including interim and final reports. Take the time to review these to ensure that they meet the goals identified in your evaluation plan and provide the information expected by stakeholders.
You may need to distribute evaluation deliverables to key stakeholders for review. Reviewers should be identified and confirmed when you are developing your evaluation plans (refer to the Evaluation & Data Collection – Develop Evaluation Plans handbook).
- Provide reviewers with clear guidance on what to review, the review and evaluation schedule, and the process for providing feedback.
- Compile feedback for the evaluator to consider.
- Thank stakeholders and address any questions they raise with the evaluation team.
Identify and mitigate potential risks
Identify issues, events, or other circumstances that could put your evaluation project at risk of not meeting milestones.
- Certain events can have a major impact on evaluations—for example, government rule changes, budget cuts, or significant market changes can affect your original plan to complete the evaluation. Accommodate these events by adjusting your original evaluation plan.
- While it is important to involve stakeholders in the early phases of scoping evaluations, their expectations can change, especially when there is turnover in stakeholder staff. Stay in routine contact with key stakeholders, so that you can identify and act on any changes in their expectations whenever possible.
Continually monitor for risks and consider how to mitigate any issues that arise. Depending on their nature, these issues may be simply addressed through proactive communication, or may require some level of scope change and/or contract renegotiation.
Adjust scope and timeline to accommodate evaluation changes
Given the many moving parts and potential risks discussed above, you must be sure to capture their effect on the evaluation—from minor adjustments to the schedule to more substantial changes that could require contract modifications.
- Adjust timelines as necessary, updating every part of the evaluation scope of work that is affected by a change to any one element. Review your entire timeline so that a change to one element is reflected in all related elements; Gantt chart software can propagate schedule changes to dependent tasks automatically.
- If contract modifications are necessary, follow your organization’s contracting protocols to ensure that changes to scope, timeline, or budget are formalized and approved by both the contractor and your management.
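The ripple effect described above is the same dependency propagation a Gantt tool performs automatically. As a rough sketch (the task names, durations, and dependencies below are hypothetical), a slip in one task can be walked forward through everything that depends on it:

```python
# Hypothetical evaluation tasks: name -> (duration in weeks, prerequisite tasks).
tasks = {
    "data collection": (8, []),
    "analysis": (4, ["data collection"]),
    "interim report": (2, ["analysis"]),
    "final report": (3, ["analysis", "interim report"]),
}

def finish_week(name, delays=None, memo=None):
    """Earliest finish (weeks from kickoff), including any per-task delays."""
    delays = delays or {}
    memo = {} if memo is None else memo
    if name in memo:
        return memo[name]
    duration, prereqs = tasks[name]
    # A task starts when its latest prerequisite finishes.
    start = max((finish_week(p, delays, memo) for p in prereqs), default=0)
    memo[name] = start + duration + delays.get(name, 0)
    return memo[name]

baseline = finish_week("final report")                                 # 17 weeks
slipped = finish_week("final report", delays={"data collection": 2})   # 19 weeks
```

Here a two-week slip in data collection pushes the final report out by the same two weeks, because every downstream task inherits the delay; reviewing the whole timeline catches exactly these cascades.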
Communicate progress throughout the evaluation process
Throughout the evaluation process, from kickoff to final report, it is critical to communicate progress to your staff, subcontractors, and stakeholders. As noted in previous steps, any changes to evaluation scope or timeline may impact the program resources you have identified to support evaluation activities. Sharing evaluation progress with program managers during the evaluation process allows them to learn about results on an ongoing basis and gives them an opportunity to ask questions. Provide regular updates with information relevant to the audience, such as:
- Any schedule changes that impact the timing of key staff interactions with evaluators.
- Any changes to scope or schedule that affect the type or schedule of deliverables and any necessary review from stakeholders.
- Any necessary framing of evaluation results so that they are well understood (see the Evaluation & Data Collection – Communicating Impacts handbook for more information).
- Interim report findings and recommendations that could guide process improvements (see the Program Design & Customer Experience – Assess and Improve Processes handbook for more information).
Tips for Success
Measure and evaluate performance at key points in the process
Measuring performance at key points in the upgrade process (e.g., assessments, conversion rates, and financing applications) has helped programs understand where their processes are working smoothly and where they are not. This information has helped them continuously improve their program design and implementation. To monitor progress, successful programs have combined information from their project tracking systems with customer surveys, information from call centers, and feedback from contractors and lenders to understand the customer experience. Make data accessible for program staff to track progress, identify successful strategies, and detect points of failure.
- Enhabit, formerly Clean Energy Works Oregon, established an extensive process for getting customer feedback at key points in the program delivery process to evaluate customer satisfaction and better understand why some homeowners chose to undertake upgrades while others did not. The program identified seven points in the program delivery process to gather information through feedback surveys and phone interviews: application, assessment, bid, drop-out, financing, completion, and experience after 12 months. The program credited this kind of customer communication and feedback as one of the keys to its ongoing success.
Source: Clean Energy Works Research Planning, Will Villota, CEWO, 2012 (Presented during January 19, 2012 Better Buildings Residential Neighborhood Program peer exchange call).
- Boulder County’s EnergySmart program sent an online customer feedback survey to homeowners who had completed upgrades. Among other things, the customer surveys affirmed customer satisfaction and identified the opportunity for word-of-mouth marketing. Surveys found that the vast majority of the respondents would recommend the EnergySmart service to a friend or neighbor. The surveys also surfaced some weaknesses that the program resolved. For example, some respondents noted contractors’ lack of responsiveness and professionalism as an issue, which led the program to develop guidelines for professionalism and customer contact. Surveys also noted that the assessment report was long and confusing, leading the program to develop a new, customized report that was easier to follow and clearer about next steps.
- Connecticut’s Neighbor to Neighbor Energy Challenge used qualitative contractor and customer feedback combined with quantitative data to evaluate how well its outreach efforts led to home energy assessments. When informal contractor feedback alerted program managers that relatively few interested customers were following through to have assessments conducted on their homes, the program analyzed project data and found that only around a quarter of customers who expressed interest in an assessment had completed one. To diagnose the problem, the program analyzed data to see how customers were acquired, how long it took to send leads to contractors, and how long it took contractors to follow up with customers to arrange for an assessment. Through qualitative analysis, the program found, among other things, that customers didn’t understand what they were signing up for and may have been unwilling to say “no” to young and enthusiastic outreach staff. The program also found that its staff wasn’t following up quickly enough with people who wanted more information. In response, the program improved its process for distributing leads to contractors (e.g., linking contractors to homeowners in 1-2 days), created a “receipt” for interested customers outlining next steps, and set up a system to call non-responsive leads after two weeks. With these and other steps, the program increased its close rate by 35% within a month of implementing the changes.
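The stage-by-stage analysis these programs performed amounts to computing a conversion funnel over project tracking records. The sketch below is illustrative only; the stage names and counts are invented, not actual program data:

```python
# Illustrative funnel counts pulled from a project tracking system.
funnel = [
    ("expressed interest", 400),
    ("assessment completed", 100),
    ("bid accepted", 60),
    ("upgrade completed", 45),
]

def conversion_rates(stages):
    """Rate of each stage relative to the stage immediately before it."""
    rates = []
    for (prev_name, prev_n), (name, n) in zip(stages, stages[1:]):
        rates.append((f"{prev_name} -> {name}", n / prev_n))
    return rates

rates = conversion_rates(funnel)
# The step with the lowest rate is where the process is leaking customers.
weakest = min(rates, key=lambda r: r[1])
```

With these illustrative numbers, the interest-to-assessment step converts at 25%, the same kind of drop-off the Neighbor to Neighbor program diagnosed before reworking its lead follow-up process.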
Ask customers about their program experience and for feedback on how your program can improve—and listen to their responses
Better Buildings Neighborhood Program partners found that conducting surveys of program participants that focus on tangible, easy-to-answer questions, such as the timeliness of service and the quality of work, resulted in better feedback. By including open-ended questions and questions about non-energy benefits, partners were able to garner a broader range of information and a better understanding of who their customers are and what they value (e.g., comfort, cost savings). Partners also found that administering customer surveys during or immediately following completion of the customer’s energy upgrade led to a higher rate of response.
- Enhabit, formerly known as Clean Energy Works Oregon, requests feedback from all customers during the upgrade process to help assess how contractors can improve their customer service. Quarterly customer surveys of participants who have completed assessments and upgrades include questions about customer satisfaction with the contractor’s work. This feedback enables the program to track what is working and what is not, and to respond with improvements quickly.
- Local Energy Alliance Program (LEAP), serving Charlottesville and Northern Virginia, dramatically modified its home energy upgrade process in response to homeowner feedback. Recognizing that many homeowners found a several-thousand-dollar investment challenging, LEAP implemented a “staged upgrade” process that allowed homeowners to implement home energy upgrades over a period of time, dividing the financial investment into smaller payments.
Use compatible formats for data sharing and reporting, and work with partners to implement standard data exchange protocols
Many Better Buildings Neighborhood Program partners found that it was critically important to use compatible formats for data sharing and reporting with partners. Aligning data formats and collection plans with national standards (e.g., Home Performance XML (HPXML), the Standard Energy Efficiency Data (SEED) platform, the Building Energy Data Exchange Specification (BEDES)) ensured compatibility for aggregation and reporting.
- For Arizona Public Service’s (APS) Home Performance with ENERGY STAR® Program, a lack of transparency and access to data meant it took hours each month to compile progress reports. Coordination with trade allies was difficult for similar reasons: both the utility and its contractors lacked visibility into project status and task assignment, as well as the ability to identify program bottlenecks, which impacted APS customer service. Program delivery metrics, from administrative overhead to customer and trade ally satisfaction, were lower than expected. APS began searching for a more dynamic software platform to engage customers, track and manage projects, empower trade allies, and analyze and report results. The program required the platform to support HPXML, an open standard that enables different software tools to easily share home performance data. The new HPXML-compliant platform, EnergySavvy’s Optix Manage, resulted in higher cost effectiveness and greater satisfaction for the program, including 50% less administrative time to review and approve projects, a 66% reduction in data processing time for APS reporting, 31% less contractor administrative time to submit projects, and a three-fold increase in trade ally satisfaction. HPXML also had the added benefit that contractors could choose their own modeling software.
- The New York State Energy Research & Development Authority (NYSERDA) heard from home performance contractors and other stakeholders that a more streamlined data collection process was needed to reduce the paperwork burden and time spent on a per-project basis. In response, the program launched the NY Home Performance Portal in July 2013. This web-based interface made it easier for customers to choose and apply for the home performance program and made the application process for a home energy assessment clear, fast, and simple. In 2015, NYSERDA further refined its data collection process and began processing all projects in a web-enabled interface designed to facilitate program coordination. This new platform allowed NYSERDA to automate project approvals for 85-90% of projects. In addition, the platform supported HPXML, which facilitated data sharing among multiple New York programs, thereby reducing the administrative burden for contractors participating in multiple programs. It allowed NYSERDA to automate the work scope approval process through validation of standardized data. An additional benefit of HPXML for NYSERDA was the creation of an open market for modeling software.
- Massachusetts Department of Energy Resources (MassDOER) provides statewide oversight to energy efficiency programs administered by utilities under the Mass Save brand. Originally, contractors from Conservation Services Group, Inc. and Honeywell International Inc. used audit software customized for the program in their home energy assessments. When Mass Save piloted the Home MPG program, contractors were also required to generate an Energy Performance Scorecard for each home. The existing audit software, however, did not have this capability. To address this problem, software developers added the Energy Performance Scorecard capability, so the contractors could use the same software to record the audit results and generate the scorecard. Despite implementation delays, this solution allows the use of Energy Performance Scorecards to potentially expand statewide.
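The value of a shared schema like HPXML is that any compliant tool can read a project record the same way. As a rough illustration of what that parsing looks like, the snippet below pulls fields from a simplified XML project record; the element names here are invented placeholders and do not follow the actual HPXML schema:

```python
import xml.etree.ElementTree as ET

# Simplified, HPXML-like project record. These element names are placeholders
# for illustration only; the real HPXML schema defines its own elements.
doc = """
<Project>
  <Building id="bldg-1">
    <YearBuilt>1978</YearBuilt>
    <ConditionedFloorArea>1850</ConditionedFloorArea>
  </Building>
  <Measure>
    <Type>air sealing</Type>
    <EstimatedAnnualSavings units="kWh">950</EstimatedAnnualSavings>
  </Measure>
</Project>
"""

root = ET.fromstring(doc)
# Because the schema is shared, every program and software tool extracts the
# same fields with the same paths, making aggregation and reporting mechanical.
record = {
    "building_id": root.find("Building").get("id"),
    "year_built": int(root.find("Building/YearBuilt").text),
    "measure": root.find("Measure/Type").text,
    "savings_kwh": float(root.find("Measure/EstimatedAnnualSavings").text),
}
```

Once records from different tools land in one common structure like this, aggregating them across contractors or programs is a straightforward merge rather than a format-by-format translation effort.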
Establish data sharing relationships as early as possible
Though potentially challenging, establishing relationships for sharing energy consumption data is critical for evaluating program impact on energy and cost savings. Many Better Buildings Neighborhood Program partners found success by approaching utilities during the program planning phase, or at least several months in advance of when they planned to start collecting data, to outline shared goals, assets, tools, needs and constraints. Clear and concise data requests helped speed up utilities’ response times for providing the data and alleviated utility concerns and questions regarding data needs.
- Energize Phoenix formed a partnership with the local electric utility, Arizona Public Service (APS), while designing the program and coordinated with them throughout program development. Energize Phoenix found that understanding Arizona Public Service’s concerns and challenges related to data sharing was a key ingredient in forging a successful partnership, as was instituting a formal agreement to clarify roles and responsibilities.
- Southeast Energy Efficiency Alliance (SEEA) found that not all of their programs were successful in obtaining utility bill data. Common obstacles included that the utility did not have the technology infrastructure to easily export the information, would only release data for a fee (based on how many records were pulled), or simply did not have the time or resources to provide the information even if the program had a signed client release form from the homeowner. Among SEEA's programs, those that were most successful in obtaining utility billing information (including NOLA WISE in New Orleans, Louisiana; Local Energy Alliance Program (LEAP) in Charlottesville, Virginia; Atlanta SHINE in Atlanta, Georgia; and DecaturWISE in Decatur, Georgia) consulted with the utilities to determine what information the program needed to include in the client release form. Additionally, some programs developed a written memorandum of understanding with the utility specifying data collection and transfer roles and responsibilities. SEEA programs also found it best to make data requests to utilities on a quarterly basis to minimize the burden on the utility, as many utilities do not have staff dedicated to data exporting. Some programs received data more frequently, but in these situations the utility had the means to easily pull and export data.
- When local utilities Philadelphia Gas Works (PGW) and Philadelphia Electric Company (PECO) shared customers’ energy usage data with EnergyWorks, all parties made sure that the proper data sharing requirements were observed and signed the necessary forms. Philadelphia EnergyWorks built its customer data release approval language into the program’s loan application form to minimize the number of additional forms that a customer or contractor would need to handle.
- EnergySmart in Eagle County, Colorado, successfully developed partnerships with utilities during and after the Better Buildings Neighborhood Program grant period, but in hindsight found it would have been more beneficial to engage utilities prior to submitting the original DOE grant application. By not fully engaging utilities up front, EnergySmart created an environment in which the utilities were only partially included in the program and retained similar or redundant in-house services. As EnergySmart moved forward, it gained the trust of the utilities by offering help, data, and information, and by sharing its results with the utilities’ management and boards of directors. Through this trust, the utilities became more willing to share data.
Last Updated: 12/14/2017