Description
Evaluation plans are critical tools used to measure the success of residential energy efficiency programs, and touch all aspects of program design and delivery. Your evaluation plan should include:
- An explanation of program goals
- The metrics to determine whether those goals are being achieved. See the Marketing & Outreach – Develop Evaluation Plans, Financing – Develop Evaluation Plans, and Contractor Engagement & Workforce Development – Develop Evaluation Plans handbooks for more information
- Designs for data collection systems and processes
- Criteria for determining cost-effectiveness
- Communication channels to share the plans and results with stakeholders
- The scope, schedule, and budget for evaluation activities.
You need to plan carefully to ensure that your evaluation activities are aligned with your program design and implementation activities. The graphic below, from the State and Local Energy Efficiency Action Network (SEE Action) Energy Efficiency Program Impact Evaluation Guide, illustrates the importance of this integration. The graphic shows the residential energy efficiency program implementation cycle, emphasizing the role of evaluation activities in providing feedback to current and future program decisions.
- Program goal setting. Both program goals and evaluation goals should be developed when a program is first envisioned, in order to adequately plan and budget for the activities necessary to measure progress towards goals.
- Program design. Evaluation planning should begin during the program design phase in order to ensure that processes and systems to collect data are integrated to support program administration and evaluation.
- Program launch. A detailed program evaluation plan should ideally be in place when the program is launched. This helps ensure program preparedness to evaluate program activities from the outset.
- Implement Evaluation. Program administrators should evaluate their programs' success as a standard business practice; however, formal evaluation activities are also recommended, usually conducted by an experienced third party to provide an unbiased perspective. While it can be valuable to coordinate third-party evaluations with program launch activities (see the Clean Energy Works Oregon example on the Step-by-Step tab), third-party evaluations typically begin after a period of program performance.
Program Implementation Cycle with High-Level Evaluation Activities
Source: Energy Efficiency Program Impact Evaluation Guide, State and Local Energy Efficiency Action Network, 2012.
This handbook addresses two types of external evaluations, described in the Overview handbook and defined below:
- Impact evaluations determine program-specific induced effects, such as reductions in energy use and demand, as well as non-energy benefits such as avoided greenhouse gas emissions, economic impacts, or job creation.
- Process evaluations assess how program performance can be improved and are an important adjunct to impact evaluations.
The Evaluation & Data Collection Overview handbook also describes market assessments, which look at your program’s impact on market transformation. This type of assessment is further discussed in the Market Position & Business Model – Assess the Market handbook.
This handbook discusses the steps you should take to develop evidence-based insights into your program’s performance through third-party impact and process evaluations. These steps include:
- Determine the type(s) of evaluation
- Determine the scope of the evaluation(s)
- Develop a timeline for evaluation activities
- Decide how to determine your program’s cost-effectiveness
- Solicit proposals for evaluation services
- Evaluate proposals
- Negotiate and execute the contract.
Subsequent handbooks in this component area discuss how to develop resources to support the evaluation plans, execute the evaluation plans, and communicate the results of evaluation reports.
Step by Step
Several public resources are available to help you plan your evaluation, including the Energy Efficiency Program Impact Evaluation Guide (see below) and other reports, tools, and case studies referenced in this handbook.
Energy Efficiency Program Impact Evaluation Guide
Chapter 8 in the SEE Action Guide presents impact evaluation concepts and the steps involved in the planning process, including the development of evaluation approaches, budgets, and schedules. This chapter also discusses how evaluation planning and reporting are integrated into the program implementation process and presents issues and questions to help determine the scope and scale of an impact evaluation. The end of the chapter includes a checklist for preparing an evaluation plan.
Source: Energy Efficiency Program Impact Evaluation Guide, State and Local Energy Efficiency Action Network, 2012.
You will need to determine which type(s) of evaluation will best serve your program. The three common types of energy efficiency program evaluations are:
- Impact evaluations help you understand the extent to which program products and services are meeting the stated goals of your program, and may be necessary based on your program’s funding requirements. Impact evaluations may also measure the effects of the program on non-energy goals, such as job growth, economic impacts, water savings, and greenhouse gas reductions. Impact evaluations are often the highest-priority third-party evaluations because they provide objective analysis of your program’s success at meeting goals.
- Process evaluations can help you identify the extent to which existing processes and practices are, or are not, contributing to your program’s performance and gain insight into aspects of your program design that you can and should enhance.
- Market evaluations can help you gain market insights that are critical to designing and implementing effective residential energy efficiency programs. They can be useful for strategic planning and during your initial program design phase.
Spotlight on the Better Buildings Neighborhood Program Evaluation
DOE administered the Better Buildings Neighborhood Program (BBNP) to support programs promoting whole building energy upgrades. BBNP distributed a total of $508 million to support efforts in hundreds of communities served by 41 grantees. DOE commissioned an independent evaluation to determine the direct impacts, market effects, and lessons learned from the Better Buildings Neighborhood Program.
The impact evaluation developed independent, quantitative estimates of the impacts of BBNP. Third-party evaluators conducted the impact evaluation of the three-year program in two phases and combined the findings to develop a verified energy savings estimate for BBNP. The impact evaluation comprised two broad activities to determine gross verified savings:
- Measurement and verification of a sample of BBNP grantees and upgrade projects using an ex-post analysis (actual savings based on post-upgrade conditions); and
- Billing regression analysis on projects from grantees with sufficient utility bill data.
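Billing regression is conceptually straightforward: model energy use as a function of weather before and after the upgrade, then compare both models under the same typical weather year. The sketch below (Python, with invented data and a deliberately simplified single-home model) illustrates the idea; actual impact evaluations use larger samples, comparison groups, and more sophisticated specifications.

```python
# A minimal sketch of a pre/post billing regression on invented data.
# Monthly usage is modeled as: usage = base_load + heating_slope * HDD,
# fit separately for pre- and post-upgrade periods; both fitted models are
# then evaluated against the same "normal" weather year to estimate savings.
import numpy as np

def fit_usage_model(hdd, usage):
    """Fit usage = b0 + b1 * HDD by ordinary least squares."""
    X = np.column_stack([np.ones_like(hdd), hdd])
    coef, *_ = np.linalg.lstsq(X, usage, rcond=None)
    return coef  # [base_load, heating_slope]

# Illustrative monthly heating degree days and usage (kWh) for one home.
hdd_pre  = np.array([900, 750, 600, 300, 100, 20, 0, 0, 50, 250, 550, 850], float)
use_pre  = 400 + 1.10 * hdd_pre + np.random.default_rng(0).normal(0, 30, 12)
hdd_post = np.array([880, 770, 580, 310, 90, 10, 0, 0, 60, 240, 560, 840], float)
use_post = 400 + 0.85 * hdd_post + np.random.default_rng(1).normal(0, 30, 12)

pre_coef = fit_usage_model(hdd_pre, use_pre)
post_coef = fit_usage_model(hdd_post, use_post)

# Normalize both models to the same typical-weather year of monthly HDDs.
hdd_normal = np.array([870, 760, 590, 305, 95, 15, 0, 0, 55, 245, 555, 845], float)
nac_pre  = pre_coef[0] * 12 + pre_coef[1] * hdd_normal.sum()
nac_post = post_coef[0] * 12 + post_coef[1] * hdd_normal.sum()
print(f"Weather-normalized annual savings: {nac_pre - nac_post:,.0f} kWh")
```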
The impact evaluation also constructed an economic impact model of the U.S. economy and estimated the economic impacts of BBNP, including jobs, economic output, income (personal and business), and tax revenue that result from the program spending relative to a base case scenario where BBNP did not exist. The impact evaluation is presented in Savings and Economic Impacts of the Better Buildings Neighborhood Program (Final Evaluation Volume 2).
The market effects evaluation sought to identify early indications that BBNP had an effect on the local building improvement markets in which the program operated, and to understand how and why home performance contractors and equipment distributors changed their business practices in a way that promoted greater adoption of energy efficiency. It explored changes in the market for energy efficient products and services and in market actors’ behavior resulting from BBNP activities. The study drew on multiple data sources, including phone surveys with contractors and distributors, in-depth interviews with contractors, surveys with homeowners, and in-depth interviews with financial institutions. The market effects evaluation is presented in Market Effects of the Better Buildings Neighborhood Program (Final Evaluation Volume 5).
The process evaluation drew on interview and survey information collected from the grantees, DOE program staff and contractors, program participants and nonparticipants, home performance contractors serving homeowners, and financial institutions working with the grantees. In addition, an extensive literature review informed the evaluation. The process evaluation had two broad objectives:
- To assess the degree to which BBNP met its goals and objectives related to program processes and grantee program activity.
- To identify the most effective approaches – including program design and implementation activities – to completing building energy upgrades that support the development of a robust retrofit industry in the U.S.
The process evaluation and detailed analyses of specific program strategies are presented in three volumes:
- Drivers of Success in the Better Buildings Neighborhood Program – Statistical Process Evaluation (Final Evaluation Volume 3)
- Process Evaluation of the Better Buildings Neighborhood Program (Final Evaluation Volume 4)
- Spotlight on Key Program Strategies from the Better Buildings Neighborhood Program (Final Evaluation Volume 6).
A synthesis of the entire BBNP evaluation study is presented in Evaluation of the Better Buildings Neighborhood Program (Final Synthesis Volume 1).
Next, you will need to determine the scope of each type of evaluation you plan to conduct. The scope of your evaluation should derive from your program’s goals and objectives. It must clearly articulate what questions you would like answered:
- Will you be evaluating your program’s energy savings and achievement of other goals (i.e., impact)?
- You will need to list each of the program’s goals and the time period you want the evaluator to examine.
- Will you be looking at your program’s operational efficiency and ability to serve the market?
- Your scope of work will need to clearly state your program goals, as well as the areas you would like examined that can be improved through operational changes. For example, how easy or difficult is it for a customer to participate in the program? Are there ways to streamline processes for subcontractors that would make them more effective or more likely to participate? Could you lower the administrative cost of your program by eliminating ineffective processes?
Guide for Benchmarking Residential Energy Efficiency Program Progress with Examples
The guide provides an inventory of recommended Residential Program Progress Metrics, describes approaches for using them effectively, and gives examples of peer benchmarks from the Better Buildings Neighborhood Program for comparison. Appendix C of the guide provides planning worksheets to help identify the types of information that would be useful for documenting success in meeting your program goals.
Third-party evaluations can only occur after the program has been operating for some time, generating enough data to make evaluations meaningful. Your evaluation plans should include a list of interim activities to ensure that program evaluation will be efficient and effective, such as:
- Communicating with your evaluator to ensure that you are collecting the right types of data, in a format that is easily transferrable once the evaluation begins
- Communicating with program participants for potential participation in evaluation activities (e.g., including authorizations on program forms)
- Identifying and preparing program staff who will be responsible for supporting evaluation activities
The Develop Resources handbook discusses the systems and tools that will support data collection and data quality necessary for efficient and effective evaluation. The Conduct Evaluation handbook discusses the steps program managers need to take to properly oversee evaluation activities and which steps you can plan for when you are developing your evaluation plan.
When sequencing events, consider the process and time to procure evaluation services and finalize a contract with a third-party evaluator. If you decide to run impact and process evaluations concurrently, you may want to include scopes for both activities in a single RFP and allow bidders to submit proposals for one or the other, as many evaluation firms specialize in one type of evaluation and you want to ensure you get the best value from the evaluation activity.
Phased third-party evaluation activities can also inform program design and implementation activities as your program ramps up. A phased evaluation approach engages an evaluator before program launch and evaluates program processes and impact as the program launches. This type of ongoing evaluation is ultimately what you would want your internal evaluation activities to look like; however, if you are on a tight launch timeframe in a complicated market, engaging a third-party evaluator to help you identify issues and improvements during the launch process can be very effective.
Clean Energy Works Oregon Used a Three-Phase Evaluation Process
Clean Energy Works Oregon conducted a three-phase evaluation of its pilot program, Clean Energy Works Portland (CEWP).
- The phase 1 (August–September 2009) evaluation provided rapid feedback about the initial CEWP 50-home test pilot, which allowed the program to adjust screening criteria, recruit additional home performance contractors, and build on experience with CEWP without a gap in program activity.
- The phase 2 (February–March 2010) evaluation documented participants’ experiences and satisfaction with CEWP and summarized lessons learned. These were applied during phase 3.
- The phase 3 (June–August 2010) evaluation documented the status of projects, and the evaluators created a report with detailed results for all three phases.
Third-party evaluation of program activities while the pilot was phasing in allowed the program to identify many process and policy issues in order to develop a program design that could attract the support to expand the program statewide.
Source: Clean Energy Works Portland Pilot Process Evaluation, Research Into Action, Inc., 2010.
Assessing the cost-effectiveness of energy efficiency resources involves comparing the costs and benefits of these resources with other resources that meet energy and other applicable objectives. Historically, energy efficiency has been assessed through standard tests defined in the California Standard Practice Manual. These assessments entail comparing the cost of energy efficiency resources to forecasts of avoided supply-side resources and other relevant costs and benefits. The National Efficiency Screening Project's National Standard Practice Manual builds and expands upon the California Manual, reflecting current experience and best practices.
Since cost-effectiveness can be defined in numerous ways, it is important to understand what is required by your program's utility regulators, legislation, or funders. For example, if you receive public funding to administer a residential energy efficiency program, make sure to understand what cost-effectiveness test(s) will be used by regulators or funders to evaluate the program's performance. Even if cost-effectiveness tests are not a required measure of performance, they can provide an assessment of program performance and perhaps open up opportunities for financial support.
Measuring the Cost-Effectiveness of Your Program
Five cost-effectiveness tests have been developed to consider efficiency costs and benefits from different perspectives.
Source: U.S. Department of Energy, 2014.
Three tests—the Program Administrator Cost Test, the Total Resource Cost test, and the Societal Cost test—are predominantly used by states as the primary way to screen efficiency programs.
While the choice of test is important, it is even more important to ensure that you are applying the tests properly—that each test is being applied in a way that achieves its underlying objectives, is internally consistent, accounts for the full value of energy efficiency resources, and uses appropriate planning methodologies and assumptions.
Best practices to use in applying the cost-effectiveness tests include:
- Fully accounting for other program impacts where appropriate
- Properly estimating avoided costs
- Using the most appropriate discount rate
- Capturing spillover effects
- Fully accounting for the risk benefits of energy efficiency.
Source: Best Practices in Energy Efficiency Program Screening, Synapse Energy Economics, Inc., 2013.
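As a concrete illustration of what a screening test computes, the sketch below works through a simplified Program Administrator Cost Test with invented numbers. Real screening must follow the avoided-cost forecasts, discount rate, and test definition required in your jurisdiction.

```python
# A minimal sketch of a Program Administrator Cost Test calculation,
# using invented numbers; actual screening inputs come from your
# jurisdiction's avoided-cost studies and test definitions.
def present_value(cash_flows, discount_rate):
    """Discount a list of annual cash flows (year 1, 2, ...) to present value."""
    return sum(cf / (1 + discount_rate) ** (t + 1) for t, cf in enumerate(cash_flows))

measure_life = 15              # years over which savings persist (assumed)
annual_avoided_cost = 120_000  # $/yr of avoided supply-side costs (assumed)
program_cost_year1 = 900_000   # administrator spending, incentives, etc. (assumed)
discount_rate = 0.05           # the choice of rate materially affects the result

benefits = present_value([annual_avoided_cost] * measure_life, discount_rate)
costs = present_value([program_cost_year1], discount_rate)
print(f"PAC benefit-cost ratio: {benefits / costs:.2f}")  # a ratio > 1.0 passes
```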
Improving Cost-Effectiveness Tests
The National Efficiency Screening Project is improving the way that utility-funded energy efficiency resources are screened for cost-effectiveness through the Resource Value Framework. The Resource Value Framework is a collection of principles and recommendations to provide guidance for states to develop and implement tests that are consistent with sound principles and best practices. The National Standard Practice Manual further provides information, templates, and examples that can support a state or jurisdiction in applying the universal principles, and also in constructing appropriate tests in a structured, logical, and documented manner.
The Resource Value Framework recommends that states that use the framework to design or modify their cost-effectiveness screening tests take the following steps:
- Decide which overall perspective - utility or societal - is appropriate for the state.
- Identify the state’s energy policy goals that are relevant to, and might be affected by, energy efficiency resources. For example: assist low-income customers with high energy burdens, increase the diversity of energy resources, and reduce energy price volatility.
- Identify a way of accounting for those energy policy goals in the state’s screening test.
- Use the Resource Value Framework template to explicitly identify the assumptions and methodologies necessary to ensure that the test is balanced, transparent, and takes the appropriate energy policy goals into account.
Source: The Resource Value Framework: Reforming Energy Efficiency Cost-Effectiveness Screening, National Efficiency Screening Project, 2014.
As a program administrator, you will likely find value in evaluating your program from many different perspectives; however, there are constraints that limit the effectiveness of program evaluations and it will be necessary to decide where your evaluation resources are best spent. Here is a series of questions to help you identify how to evaluate the cost-effectiveness of your program:
- What cost-effectiveness tests should be used?
- The criteria for determining cost-effectiveness may be prescribed by a public utility commission or state legislation.
- Even if you’re not required to measure cost-effectiveness, you may want to regularly evaluate your program’s performance using multiple tests. This information can be used to communicate your program's results to potential funders.
- Should savings be assessed for the first year, over the estimated life of the measure, or for a target year for savings/reductions?
- Considering lifetime savings will encourage investments that maximize long-term energy savings.
- Some funders and stakeholders may be interested in first-year savings, which show the magnitude of annual program activities (a simple comparison of the two views appears after this list).
- What non-energy benefits should be included (e.g., water savings, comfort, carbon emissions reduction) and how should they be quantified?
- Usually, a legislative or regulatory body decides whether cost-effectiveness screening must include non-energy benefits. However, even if capturing non-energy benefits is not mandated, you may wish to do so to demonstrate the broader value of your program.
- Consider the cost of capturing the data necessary to measure non-energy benefits, and compare it to the benefits of your program.
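To make the first-year versus lifetime distinction concrete, here is a minimal sketch with invented numbers. Whether and how to discount future savings is a policy choice that should match your cost-effectiveness framework.

```python
# Illustrative comparison of first-year vs. lifetime savings for one measure,
# with invented numbers. Lifetime figures reward long-lived measures.
first_year_kwh = 500_000   # annual savings claimed in year 1 (assumed)
measure_life = 15          # expected useful life in years (assumed)
discount_rate = 0.03       # optional: discount future savings

lifetime_kwh = first_year_kwh * measure_life
discounted_lifetime_kwh = sum(
    first_year_kwh / (1 + discount_rate) ** t for t in range(1, measure_life + 1)
)
print(f"First-year savings:            {first_year_kwh:>12,.0f} kWh")
print(f"Lifetime savings (simple):     {lifetime_kwh:>12,.0f} kWh")
print(f"Lifetime savings (discounted): {discounted_lifetime_kwh:>12,.0f} kWh")
```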
Ultimately you will need to enter into a formal partnership with an evaluator to conduct your evaluation. Most organizations that operate residential energy efficiency programs issue formal requests for proposals (RFPs) to solicit, assess, and select an evaluator. This process is typically used to encourage market competition. Some organizations have strict rules for when you must seek a competitive bid for services, so be sure to look for these when you are designing your evaluation.
- Typically, any organization that receives federal funding is subject to financial audits that require competitive procurement policies.
- State and local governments may have procurement policies that require competitive bids for work above a certain dollar threshold.
- An RFP is not necessary in all situations; your program’s procurement rules may allow a less rigorous process, such as using a request for information (RFI) or request for qualifications (RFQ).
If you determine that an RFP is necessary or preferable for your purposes, your RFP should include a scope of work (SOW) that documents the work you are asking the evaluation team to do. The SOW should include:
- Key questions that the evaluation should attempt to answer
- How evaluation findings will be used
- How evaluators can access the available data and other sources of information
- Any regulatory reporting requirements
- A list of work products and the associated delivery schedule
- The review process for the evaluation reports.
In addition to the SOW, your RFP should clearly articulate how the proposals will be assessed. This will make your decision-making process transparent and may avoid challenges from non-winning bidders, which can delay the start of the evaluation work.
Some organizations have RFP templates that include approved legal requirements. Visit Examples for sample RFPs that you may find useful in drafting your own.
After you issue a solicitation for evaluation services, the next task is to assess the proposals, considering your program’s budget, and select an evaluation firm.
Best practice for assessing proposals is to develop a scoring sheet that gives members of the proposal review team a way to rank the qualifications of bidders objectively. You should carefully assess the proposal for:
- Whether it adequately addresses the key objectives in the SOW
- Bidder experience evaluating residential energy programs, including the experience of the lead staff person named in the bid and any other staff resources
- Value the evaluation can provide, within the proposed schedule and the budget available.
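A scoring sheet can be as simple as a weighted rubric. The sketch below shows one hypothetical way to structure it; the criteria mirror the bullets above, but the weights and the 0-10 scale are assumptions you would tailor to your SOW and procurement rules.

```python
# A minimal sketch of a weighted proposal scoring sheet. The criteria names,
# weights, and 0-10 scale are hypothetical -- tailor them to your SOW.
CRITERIA_WEIGHTS = {
    "addresses_sow_objectives": 0.40,
    "relevant_experience":      0.35,  # firm plus named lead staff
    "value_within_budget":      0.25,  # value within schedule and budget
}

def weighted_score(reviewer_scores):
    """Combine one reviewer's 0-10 scores per criterion into a weighted total."""
    return sum(CRITERIA_WEIGHTS[c] * s for c, s in reviewer_scores.items())

bids = {
    "Firm A": {"addresses_sow_objectives": 8, "relevant_experience": 9, "value_within_budget": 6},
    "Firm B": {"addresses_sow_objectives": 7, "relevant_experience": 6, "value_within_budget": 9},
}
for firm, scores in sorted(bids.items(), key=lambda kv: -weighted_score(kv[1])):
    print(f"{firm}: {weighted_score(scores):.2f}")
```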
When making your final selection, you must also consider your program's budget for evaluations. Some programs have mandated budgetary guidelines for evaluation activities (e.g., a percentage of the total annual program budget is set aside for program evaluation). The amount spent on evaluation varies widely, depending on the level of evaluation activity and whether the program has been evaluated before (in which case the evaluator can build on previous evaluations rather than starting from scratch).
Once you have chosen your evaluator, be sure to use the contract negotiation period to confirm data that must be gathered, data transfer protocols, changes to routines to accommodate the evaluation activities, and the final list of deliverables.
- Identify whether any data must be gathered in addition to those already used for internal purposes. Evaluators know the best types of data to measure program progress toward goals, so consider engaging your evaluator early in the program design process. Ideally, data collection systems have been designed with both internal and external evaluation needs in mind, so your need for additional data should be minimal. If your evaluator requests data that your systems were not designed to collect, use the contract negotiation period to determine if the program can accommodate the need for more data.
- Determine the data transfer protocols and timeline. Data that do not meet the evaluator’s specifications or are not transferred in a timely fashion will slow down the evaluation process. This can make results less accurate.
- Determine how you will need to adjust existing routines to accommodate the evaluation. Process evaluations rely heavily on interviews with staff and customers to understand how the program operates and where there are opportunities for improvement. Therefore, you will need to ensure that the appropriate staff will be available for interviews or to provide the evaluator with customer contact information, or perhaps contact customers on behalf of the evaluator. Use the contract negotiation period to finalize these processes.
- Negotiate the scope of interim and final deliverables, and their due dates. When determining due dates, identify if outside stakeholders should be involved in the review of interim and final deliverables. Include time for stakeholder review or even a peer review; the latter can take two months or more.
Tips for Success
In recent years, hundreds of communities have been working to promote home energy upgrades through programs such as the Better Buildings Neighborhood Program, Home Performance with ENERGY STAR, utility-sponsored programs, and others. The following tips present the top lessons these programs want to share related to this handbook. This list is not exhaustive.
Many Better Buildings Neighborhood Program partners found that it was important to communicate during the program design phase with organizations and individuals that will collect or supply data for the evaluation. In this way, the involved individuals and organizations understand why the data is being collected and are invested in the process. Also, involving evaluators in the design of the program ensures that the right data is collected from the start.
- The Southeast Energy Efficiency Alliance (SEEA) programs experienced challenges collecting the information needed for third-party evaluation, measurement, and verification activities. SEEA contracted with a third party, Cadmus Group, to evaluate its programs' activities. The evaluator found that its inputs (programmatic data from the quarterly program reports to DOE and utility bill data) did not always align with the data that programs were collecting from homeowners and contractors. SEEA and Cadmus worked with each of the programs to fill in the gaps. Some programs were able to provide the missing information; others were not, because it existed only in paper format or the program did not have the time or resources to provide it. To address this issue, Cadmus developed a flexible approach to determine savings for each program, based on database reviews, billing analysis, technical desk reviews, whole-house energy modeling, and engineering analysis.
- Cadmus conducted technical desk reviews when sufficient billing data was not available, but when SEEA provided data files from simulation models or other project documentation for a sufficient sample of participant homes.
- They used whole-house energy modeling to assess the reported savings when sufficient billing or simulation data was not available, but program tracking data was complete enough to populate required model inputs.
- Engineering analysis was used to estimate savings for measures not incorporated into the models or to evaluate the applicability of data from other program locations when little or no data other than the program tracking data was available.
- When data was not available, Cadmus compared weather and participant profiles to other program locations for similarities.
- For EnergyWorks in Philadelphia, problems with data tracking became apparent during the data analysis conducted for the program evaluation. EnergyWorks determined that these problems could have been avoided by clearly articulating and communicating data collection goals to program staff and defining key program performance indicators early in the program. The program established an overarching goal of 2,000 upgrades but found that smaller, specific performance goals would have helped it understand trends as they occurred over the life of the program and how those trends affected the overall goal of 2,000 upgrades.
Many Better Buildings Neighborhood Program partners found that setting up their information technology (IT) systems early in the program design stage ensured that data terms and data entry procedures were consistently applied by all system users. Reaching agreement with stakeholders (e.g., contractors, lenders, marketing partners, evaluators, program staff) on what data the data system would collect, known as system requirements, and how the collected data would be used to evaluate the program helped programs ensure that the data collected was complete. Programs have also found that they receive data of the quality needed for graphs and cost-effectiveness calculations when stakeholders agree up front that the data will be used for these purposes and not just to track energy savings and expenditures.
- Be SMART Maryland found that transitioning from a spreadsheet-based data collection system to a customized energy IT system was crucial to administering a multifaceted energy efficiency program with rigorous data collection requirements. Investing in the system while still designing the program allowed Be SMART to smoothly integrate it into the program's operations and to ensure quality data collection and integrity over time. Be SMART also found that while spreadsheets were useful tools for collecting data, their use in analyzing data and generating reports was limited, since the program had to go through a time-consuming consolidation process to combine data from different sources and spreadsheets.
- In Boulder County, Colorado, EnergySmart found that it took four to six months for a database developer and coding consultant to fully develop and test the data system because of its high level of complexity and the customization required. The program also found that having actual users test the system with real inputs and real reporting requirements helped ensure better data quality and user-friendliness. In addition, EnergySmart found that before beginning database development, it was important to reach agreement among stakeholders on what reporting would be expected, and to design the database to facilitate building and exporting the reports. For EnergySmart, it was important to set expectations with report recipients about the IT system's reporting capabilities early in the process, so recipients did not expect reports that the system was unable to produce.
Though potentially challenging, establishing relationships for sharing energy consumption data is critical for evaluating program impact on energy and cost savings. Many Better Buildings Neighborhood Program partners found success by approaching utilities during the program planning phase, or at least several months in advance of when they planned to start collecting data, to outline shared goals, assets, tools, needs and constraints. Clear and concise data requests helped speed up utilities’ response times for providing the data and alleviated utility concerns and questions regarding data needs.
- Energize Phoenix formed a partnership with the local electric utility, Arizona Public Service (APS), while designing the program and coordinated with them throughout program development. Energize Phoenix found that understanding Arizona Public Service’s concerns and challenges related to data sharing was a key ingredient in forging a successful partnership, as was instituting a formal agreement to clarify roles and responsibilities.
- Southeast Energy Efficiency Alliance (SEEA) found that not all of its programs were successful in obtaining utility bill data. Common obstacles included utilities that did not have the technology infrastructure to easily export the information, would only release data for a fee (based on how many records were pulled), or simply did not have the time or resources to provide the information even when the program had a signed client release form from the homeowner. Among SEEA's programs, those that were most successful in obtaining utility billing information (including NOLA WISE in New Orleans, Louisiana; Local Energy Alliance Program (LEAP) in Charlottesville, Virginia; Atlanta SHINE in Atlanta, Georgia; and DecaturWISE in Decatur, Georgia) consulted with the utilities to determine what information the program needed to include in the client release form. Additionally, some programs developed a written memorandum of understanding with the utility specifying data collection and transfer roles and responsibilities. SEEA programs also found it best to make data requests to utilities on a quarterly basis to minimize the burden on the utility, as many utilities do not have staff dedicated to data exporting. Some programs received data more frequently, but in these situations the utility had the means to easily pull and export data.
- When local utilities Philadelphia Gas Works (PGW) and Philadelphia Electric Company (PECO) shared customers’ energy usage data with EnergyWorks, all parties made sure that the proper data sharing requirements were observed and all parties signed the necessary forms. Philadelphia EnergyWorks built its customer data release approval language into the program’s loan application form to minimize the number of additional forms that a customer or contractor would need to handle.
- EnergySmart in Eagle County, Colorado, successfully developed partnerships with utilities during and after the Better Buildings Neighborhood Program grant period, but in hindsight found it would have been more beneficial to engage utilities prior to submitting the original DOE grant application. By not fully engaging utilities up front, EnergySmart created an environment in which the utilities were only partially included in the program and retained similar or redundant in-house services. As EnergySmart continued forward, it was able to gain the trust of the utility by offering help, data, and information. EnergySmart also shared its results with the utility's management and board of directors. Through this gained trust, utilities became more willing to share data.
Many Better Buildings Neighborhood Program partners found that it was critically important to use compatible formats for data sharing and reporting with partners. Aligning data formats and collection plans with national data standards (e.g., Home Performance XML schema (HPXML), Standard Energy Efficiency Data platform (SEED), Building Energy Data Exchange Specification (BEDES)) ensured compatibility for aggregation and reporting; a minimal parsing sketch follows the examples below.
- For Arizona Public Service's (APS) Home Performance with ENERGY STAR® Program, a lack of transparency and access to data meant it took hours each month to compile progress reports. Coordination with trade allies was difficult for similar reasons: both the utility and its contractors lacked visibility into project status and task assignment, as well as the ability to identify program bottlenecks, which impacted APS customer service. Program delivery metrics, from administrative overhead to customer and trade ally satisfaction, were lower than expected. APS then began the search for a more dynamic software platform to engage customers, track and manage projects, empower trade allies, and analyze and report results. The program needed HPXML, an open standard that enables different software tools to easily share home performance data. The new HPXML-compliant platform, EnergySavvy's Optix Manage, resulted in higher cost effectiveness and greater satisfaction for the program, including 50% less administrative time to review and approve projects, a 66% reduction in data processing time for APS reporting, 31% less contractor administrative time to submit projects, and a three-fold increase in trade ally satisfaction. HPXML also had the added benefit that contractors can choose their own modeling software.
- The New York State Energy Research & Development Authority (NYSERDA) heard from home performance contractors and other stakeholders that a more streamlined data collection process was needed to reduce the paperwork burden and time spent on a per project basis. In response, the program launched the NY Home Performance Portal in July 2013. This web-based interface made it easier for customers to choose and apply for the home performance program and made the application process for a home energy assessment clear, fast, and simple. In 2015, NYSERDA further refined its data collection process and began processing all projects in a web-enabled interface designed to facilitate program coordination. This new platform allowed NYSERDA to automate project approvals for 85-90% of projects. In addition, the platform supported HPXML, which facilitates data sharing among multiple New York programs, thereby reducing the administrative burden for contractors participating in multiple programs. It also allowed NYSERDA to automate the work scope approval process through validation of standardized data. An additional benefit of HPXML for NYSERDA was creating an open modeling software market.
- Massachusetts Department of Energy Resources (MassDOER) provides statewide oversight of energy efficiency programs administered by utilities under the Mass Save brand. Originally, contractors from Conservation Services Group, Inc. and Honeywell International Inc. used audit software customized for the program in their home energy assessments. When Mass Save piloted the Home MPG program, contractors were also required to generate an Energy Performance Scorecard for each home. The existing audit software, however, did not have this capability. To address this problem, software developers added the Energy Performance Scorecard capability, so the contractors could use the same software to record the audit results and generate the scorecard. Despite implementation delays, this solution allows the use of Energy Performance Scorecards to potentially expand statewide.
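To show what standardized, machine-readable exchange buys you, here is a minimal sketch that parses an HPXML-style document. The element names below are simplified placeholders rather than the actual HPXML schema (see the HPXML resources under Resources for the real data dictionary), but the pattern is the same: once every tool emits the same structure, aggregation and reporting become a few lines of code.

```python
# A minimal sketch of reading home performance data from an HPXML-style
# XML document. Element names are illustrative placeholders, not the
# actual HPXML schema.
import xml.etree.ElementTree as ET

SAMPLE = """
<HomePerformanceProject>
  <Building id="bldg-1">
    <YearBuilt>1962</YearBuilt>
    <ConditionedFloorArea units="sqft">1850</ConditionedFloorArea>
  </Building>
  <Upgrade measure="attic insulation" estimatedSavingsKWh="1200"/>
  <Upgrade measure="air sealing" estimatedSavingsKWh="800"/>
</HomePerformanceProject>
"""

root = ET.fromstring(SAMPLE.strip())
building = root.find("Building")
print("Building:", building.get("id"), "built", building.findtext("YearBuilt"))

# Because the structure is standardized, aggregation is trivial.
total = sum(float(u.get("estimatedSavingsKWh")) for u in root.iter("Upgrade"))
print(f"Estimated savings across upgrades: {total:,.0f} kWh")
```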
Many Better Buildings Neighborhood Program partners found that it is important to get buy-in from program staff and contractors on the importance of data integrity to the program mission, and then to invest time to develop materials and train everyone who has a role in data collection and analysis. In this way, program administrators can ensure that staff understand the what, how, and why of data collection and analysis requirements.
- In the Maryland Department of Housing & Community Development's (DHCD) commercial, multifamily, and residential programs, collectively known as Be SMART Maryland, discrepancies arose in the ways contractors determined energy savings and the type of energy modeling software used. Contractors utilized diverse auditing software, which in turn used different factors and algorithms to determine overall energy savings. At times, the upgrade reports submitted to the program omitted many input fields used to determine the energy cost savings and did not include units of measure (therms, kWh, etc.). Be SMART Maryland developed a standard methodology for estimating energy savings by requiring certain information fields, such as estimated energy savings percent, and having DHCD quality assurance inspectors (who are Building Performance Institute, Inc. certified and trained in energy auditing) review energy audit reports for reasonableness. For the multifamily energy efficiency programs, the program developed a standard energy audit guide, and energy auditors were required to follow one of several DOE-approved audit approaches.
- The Missouri Agricultural Energy Saving Team — A Revolutionary Opportunity (MAESTRO) program faced a challenge concerning the level of specificity in auditors' reports and the precision used in determining audit results. In many of the most rural areas of the state, auditors were simply unaccustomed to the level of precision required by the program. To address the problem, the program identified the utility company supplying energy to the audited homes, identified the utility rates at the time of the audit, and then applied these data to a data-driven model established by auditors who worked at the required level of precision.
- Seattle’s Community Power Works (CPW) program provided contractors with multiple trainings and a well-staffed hotline for communication and assistance throughout the grant period. Training began with a program overview session for new participating contractors focused on program logistics that informed contractors of the program’s intent, rebate structures, and expectations. Other trainings covered basic safety, quality oversight, and use of their mandatory reporting database, which the program used to track projects and workforce development. To help contractors improve their performance, program staff used survey results to provide feedback to contractors on quality assurance, customer satisfaction, and compliance with program requirements.
Paper-based or spreadsheet-based information collection processes can be low cost to develop and easy to roll out, but more often than not, aggregating and storing the data they produce from many sources becomes cumbersome. Many Better Buildings Neighborhood Program partners found that investing in information and communications technologies (ICT) eased program implementation and was well worth the effort, because they were able to regularly monitor progress and automate what would otherwise be time-intensive, manual processes. For more information on the future of ICT, see ACEEE's How Information and Communications Technologies Will Change the Evaluation, Measurement, and Verification of Energy Efficiency Programs.
- Garfield Clean Energy in Garfield County, Colorado, at first used a series of Excel spreadsheets and hardcopy file folders to track participants, their energy upgrade measures, and resulting energy savings. As the number of participants reached into the hundreds, the program realized that spreadsheets did not offer the level of sophisticated searching and reporting that was needed to analyze the results of their work. They explored several online customer relationship management systems and contracted with a third-party developer to customize their selected system so that it could track building and energy data, energy upgrades, contractors, dollars spent, rebates awarded, and deemed energy savings. The customization and data entry work, which took several months to complete, enabled Garfield Clean Energy to create detailed reports based on a wide variety of reporting parameters, and to better analyze the effectiveness of program activities.
- When Enhabit, formerly Clean Energy Works Oregon, scaled up their pilot program to the entire city of Portland, it was clear to them that an IT solution was needed to meet the demands of funding agencies, media requests, and good project management from the customer perspective. Enhabit worked with software company EnergySavvy to develop a unified service delivery platform to manage the home energy upgrade process from application to completion. The software platform provides a web-based interface between homeowners, contractors, and lenders, enabling each party to document progress through the Enhabit program. The platform also streamlined data collection and analysis.
- In Boulder, Colorado, EnergySmart used spreadsheets to manage data for its predecessor energy efficiency programs. As the program expanded under the Better Buildings Neighborhood Program, it became clear that EnergySmart needed to pursue a more user-friendly, real-time, cloud-based IT system for tracking customers through the implementation process. They selected a system to allow for tracking of many metrics in a much more consistent, accurate, and organized fashion than the previously used spreadsheet. The system can be accessed in the field by EnergySmart Energy Advisors using iPads or tablets to enter basic customer information, building baseline information, assessment findings for upgrade opportunities, completed upgrades with associated energy and cost savings, rebates and financing received, and the supporting documentation. The collected data is compiled for reporting to various stakeholders, including the U.S. Department of Energy, county commissioners, and city staff and leaders. The ability of Energy Advisors to access the system in the field allows for much greater efficiency and accuracy than the static logging of data upon returning to the office.
Resources
The following resources provide topical information related to this handbook. The resources include a variety of information ranging from case studies and examples to presentations and webcasts. The U.S. Department of Energy does not endorse these materials.
Homeowner data collection survey created by RePower.
This data intake template spreadsheet provides a way to track home energy performance metrics.
Survey for Minnesota home owners participating in Community Energy Services pilot program about their experience at their home visit.
Questionnaire for contractors participating in the Green Madison program about their overall experience, level of participation, training, and available resources.
Participant survey sent to Me2 customers that have completed at least the initial Energy Advocate visit.
Survey for people who signed up to participate in the Me2 program for home performance assessments, but ultimately decided not to participate. The goal of the survey is to help improve services for future participants.
Data release form that allows the Connecticut Clean Energy Finance and Investment Authority (CEFIA) to obtain customer utility account and actual energy usage data, energy costs, underwriting and loan repayment records, and data on energy savings measures installed.
This report for the Connecticut Energy Efficiency Board provides a review of best practices in impact evaluation and recommendations for calculating oil and propane savings, and discusses the impact evaluation findings for the Home Energy Services (HES) and Home Energy Services-Income Eligible (HES-IE) Programs. The best practices review provides an overview of key evaluation protocol and guideline documents.
This is a recording of a webinar from August 2015, in which Home Performance with ENERGY STAR hosted a panel on HPXML, the value it can bring to businesses, and implementation methods. Interested organizations can use this resource to learn more about HPXML and its potential benefits.
Home Performance (HP) XML is transforming the way home energy upgrade programs collect and transfer information from one software system to another, leading to improved contractor satisfaction, lower administrative costs, and technological advancements in the home performance industry. This presentation provides an overview of HPXML and its benefits, and discusses how the data standard is facilitating technological and process improvements among home energy upgrade programs and software developers in the United States.
This presentation covers the National Standard Practice Manual (NSPM) which provides a comprehensive framework for cost-effectiveness assessment of energy resources, with a focus on energy efficiency. The manual describes the principles, concepts, and methodologies for sound, balanced assessment of resource cost-effectiveness.
This summary from a Better Buildings Residential Network peer exchange call focused on energy modeling in multifamily homes.
This summary from a Better Buildings Residential Network peer exchange call focused on evaluating and demonstrating the cost-effectiveness of energy upgrades to programs.
This presentation includes a brief introduction to Northeast Energy Efficiency Partnerships (NEEP) and the Regional Evaluation, Measurement and Verification Forum (EM&V Forum), covers Regional Energy Efficiency Database (REED) development and content, and explains how to access the REED reports and underlying data.
This webinar series is intended for state officials starting or expanding their EM&V methods for a wide range of efficiency activities including utility customer-funded programs, building energy codes, appliance and equipment standards, energy savings performance contracting, and efficiency programs that support pollution reduction goals or regulations.
This presentation describes how programs have leveraged data to increase program energy savings, with a spotlight on advanced and real-time monitoring and verification (M&V 2.0), contractor scorecards, and intelligent quality assurance (QA) and monitoring.
This presentation covers a pilot project testing M&V 2.0 as an evaluation tool, facilitated by the Connecticut Department of Energy and Environmental Protection (CT DEEP). Speakers on this panel presented examples of how whole building modeling is currently being used for M&V and its potential future applications. Speakers also discussed benchmarking, data access and other protocols, and what experience with efficiency programs can teach us so that future efforts can build on current experience.
The ultimate objective of the protocol is to develop a system that can be used to guide the production of a readily usable dataset that can leverage project data from future meter-based measurement and evaluation studies, or metering studies, to develop end-use load shapes. The protocol includes a "NEEP Residential Data Collection Protocol Report" template.
Form used by the Colorado Public Utilities Commission for consent to disclose utility customer data.
Homeowner survey created by the utility to inform their whole home upgrade program.
This sample email survey template, created by the Better Buildings Neighborhood Program, was designed for programs to develop their own survey of successful program participants in order to assess customer experience.
This sample phone survey template, created by the Better Buildings Neighborhood Program, was designed for programs to use with applicants who have been screened out from participating in a program.
This sample phone survey template for program drop-outs, created by the Better Buildings Neighborhood Program, was designed for programs to find out why applicants that applied to participate in a program ultimately dropped out.
Sample phone survey template for program contractors.
The U.S. Department of Energy's Better Buildings Residential Program released version 2.0 of a user-friendly tool for estimating the cost-effectiveness of a residential energy efficiency program based on program administrator inputs. Cost-effectiveness analysis compares the benefits (i.e., outputs or outcomes) associated with a program or a measure with the costs (i.e., resources expended) to produce them. Program cost-effectiveness is commonly used by public utility commissions to make decisions about funding programs or program approaches. Program designers, policy makers, utilities, architects, and engineers can use this tool to estimate the impact of different program changes on the cost-effectiveness of a program.
REED serves as a dashboard for the consistent reporting of electric and natural gas energy efficiency program energy and demand savings and associated costs, avoided emissions and job impacts across the Northeast and Mid-Atlantic region. REED is a project of NEEP's Regional Evaluation, Measurement and Verification Forum (EM&V Forum) and is based on the EM&V Forum's Common Statewide Energy Efficiency Reporting Guidelines.
The State and Local Energy Efficiency Action Network (SEE Action) Evaluation, Measurement, and Verification (EM&V) Resource Portal serves as an EM&V resource one-stop shop for energy efficiency program administrators and project managers. The resources focus on tools and approaches that can be applied nationwide, address EM&V consistency, and are recognized by the industry.
The Building Energy Data Exchange Specification (BEDES, pronounced "beads" or /bi:ds/) is designed to support analysis of the measured energy performance of commercial, multifamily, and residential buildings, by providing a common data format, definitions, and an exchange protocol for building characteristics, efficiency measures, and energy use.
Home performance extensible markup language (HPXML) is a national Building Performance Institute Data Dictionary and Standard Transfer Protocol created to reduce transactional costs associated with exchanging information between market actors. This website provides resources to help stakeholders implement HPXML and stay updated on its development.
The Standard Energy Efficiency Data (SEED)™ Platform is a software application that helps organizations easily manage data on the energy performance of large groups of buildings. Users can combine data from multiple sources, clean and validate it, and share the information with others. The software application provides an easy, flexible, and cost-effective method to improve the quality and availability of data to help demonstrate the economic and environmental benefits of energy efficiency, to implement programs, and to target investment activity.
The energy efficiency reporting tool for public power utilities is an Excel-based template designed to produce consistent, useful metrics on program investments and performance for small to medium-sized administrators of public power efficiency programs.
Better Buildings Residential Energy Efficiency Cost-Effectiveness Tool Version 2.0: Introduction and Demonstration
This presentation provides an introduction and demonstration of DOE's Better Buildings Residential Energy Efficiency Cost-Effectiveness Tool Version 2.0, a user-friendly tool for estimating the cost-effectiveness of a residential energy efficiency program based on a program administrator's inputs.
Presentation on the Energy Efficiency Reporting Tool for Public Power Utilities
This presentation discusses the energy efficiency reporting tool for public power utilities. The tool is an Excel-based template designed to produce consistent, useful metrics on program investments and performance for small to medium-sized administrators of public power efficiency programs.
Partnering with Utilities Part 1 -- Successful Partnerships and Lessons from the Field
Partnering with Utilities Part 2 -- Topics for Local Governments: Creating Successful Partnerships with Utilities to Deliver Energy Efficiency Programs
This webcast focused on advanced topics for local government-utility partnerships, with presentations from local governments and their partnering utilities that have well-developed, multi-year relationships and programs.
EM&V Basics, Tools and Resources to Assist EECBG and SEP Grantees
This webinar offers an introduction to EM&V basics, including data collection, tracking tools, M&V approaches, and reporting energy savings.
Door-to-Door Outreach and Tracking Impacts
This webcast discusses door-to-door campaigns and how to track the impacts of these campaigns.
Guidelines for Retrieving Customer Usage Data from Utilities
Leveraging EPA's Portfolio Manager in Benchmarking and Disclosure Policy
This comprehensive national guide provides a step-by-step process to apply the Resource Value Framework and allow jurisdictions to develop their own primary cost-effectiveness test -- the Resource Value Test. It provides guidance using lessons learned in state and local jurisdictions over 20 years.
New advanced information and communications technologies (ICT) are pouring into the marketplace, stimulating new thinking and a shift in the energy efficiency EM&V paradigm. These emerging technologies, including advanced data collection and analytic tools, are purported to provide timely analytics on program results and efficacy. This report reviews how new data analytic tools help identify savings opportunities and engage customers in programs like never before, and explores the potential for advanced data collection (e.g., AMI, smart meters) and data analytics to improve and streamline the evaluation process.
This report contains guidance on issues and policy options related to providing access to customer energy use information that can be used to support and enhance the provision of energy efficiency services while protecting customer privacy.
Volume 1 of the Better Buildings Neighborhood Program Evaluation Report provides findings from a comprehensive impact, process, and market effects evaluation of the program period, spanning from September 2010 through August 2013.
Volume 6 of the Better Buildings Neighborhood Program Evaluation Report provides findings from a comprehensive impact, process, and market effects evaluation of the program period, spanning from September 2010 through August 2013. This volume includes case studies that describe successful strategies that programs used during the evaluation period.