As the manager of an energy efficiency program, you need to ask questions about your program's performance and outcomes:
- How well is the program accomplishing its objectives?
- How effective are the marketing campaigns?
- How satisfied are participants with the available incentives, including any financing offered?
- How much of a difference is the program making in the local home improvement market?
- Is the program cost-effective?
- How can we improve program performance?
Questions like these—about performance and outcomes—ultimately involve evaluation of the effects of a program.
There are three common types of energy efficiency program evaluations: impact, process, and market evaluations. These can include a wide range of assessment studies to determine the effects of a program.
Common Types of Energy Efficiency Program Evaluations
The three evaluation types are not mutually exclusive. While their focus may differ, they each have important interconnections.
Impact evaluation: An evaluation of the program-specific changes (e.g., changes in energy and/or demand use), directly or indirectly induced, associated with an energy efficiency program.
Impact evaluations can help you:
- Determine the extent to which your program has achieved the projected energy savings.
- Understand the non-energy benefits (or co-benefits) such as avoided emissions and job creation that directly or indirectly result from a program.
- Quantify program outcomes for use in communications, marketing, and outreach.
Process evaluation: A systematic assessment of an energy efficiency program. Its purpose is to document program operations and to identify and recommend improvements that increase the program's efficiency or effectiveness in acquiring energy resources while maintaining high levels of participant satisfaction.
Process evaluations can help you:
- Identify the extent to which existing processes and practices are, or are not, contributing to your program’s performance.
- Identify ways to improve program performance.
Market evaluation: An evaluation of the change in the structure or functioning of a market, or the behavior of participants in a market, that results from one or more program efforts. Typically, the resultant market or behavior change leads to an increase in the adoption of energy efficient products, services, or practices.
Market evaluations can help you:
- Gain insights into the relevant market for your program.
- Identify how the market or market participants have changed as a result of your program.
- Understand what approaches had a positive effect on the market.
Source: Energy Efficiency Program Impact Evaluation Guide, State and Local Energy Efficiency Action Network, 2012.
Although active program management will involve data collection and analysis for continuous improvement, evaluations of impact, process, and market effects are typically conducted independently from day-to-day program management. These evaluations may be conducted by a third-party evaluator or an internal team.
Results from third-party evaluations not only provide actionable information for program improvement but also credibly demonstrate results to funders, stakeholders, and the public. For example, a program that can show its cost-effectiveness—in terms of progress toward goals compared to program expenditures—is more likely to be viewed as successful by appropriators and stakeholders.
Internal process optimization is performed in-house, and more frequently than formal third-party process evaluations, so that program managers can make timely changes to improve the program. Routine internal evaluation is a critical program management tool that will help improve program performance, and thus improve the outcomes of third-party evaluations. See the Program Design & Customer Experience handbook for information on how to build internal evaluation into program management.
This handbook introduces Evaluation & Data Collection and leads to a series of handbooks exploring this area in detail. Taken together, these handbooks provide guidance for program administrators on how to:
- Plan evaluations, focusing on external evaluations conducted by third-party evaluators. Topics include setting the scope, schedule, and budgets; designing appropriate data collection systems and processes; developing appropriate cost-effectiveness criteria; and engaging evaluators and other partners. This handbook also discusses how to integrate evaluation with all program aspects, especially the daily activities that affect data collection quality and relevance. Internal evaluations are covered in the Program Design & Customer Experience handbooks.
- Develop the resources needed to support external evaluation activities, including implementation of tools and systems that will support data collection and quality assurance necessary for effective evaluation, and structuring of data release agreements.
- Conduct and manage evaluation activities, including working with external evaluation partners and other stakeholders, as well as with all internal organization actors with a role in affecting evaluation activities and quality assurance.
- Communicate evaluation results to contractors, funders, regulators, the public, and other audiences. This handbook discusses how to use third-party evaluations, particularly impact evaluations, to help positively influence these stakeholders and provide necessary validation of the program.
Better Buildings Neighborhood Program Evaluation
To understand how the Better Buildings Neighborhood Program affected the residential energy efficiency market and industry, the U.S. Department of Energy worked with third parties to conduct a comprehensive impact, process, and market effects evaluation. The full suite of evaluation reports, completed in June 2015, includes:
- Evaluation of the Better Buildings Neighborhood Program (Final Synthesis Report Volume 1)
- Savings and Economic Impacts of the Better Buildings Neighborhood Program (Final Evaluation Volume 2)
- Drivers of Success in the Better Buildings Neighborhood Program—Statistical Process Evaluation (Final Evaluation Volume 3)
- Process Evaluation of the Better Buildings Neighborhood Program (Final Evaluation Volume 4)
- Market Effects of the Better Buildings Neighborhood Program (Final Evaluation Volume 5)
- Spotlight on Key Program Strategies from the Better Buildings Neighborhood Program (Final Evaluation Volume 6)
For more information about the evaluation, see the Better Buildings Neighborhood Program Accomplishments webpage.
The following are important stages for successful program administrators to follow when implementing Evaluation & Data Collection activities; however, no two programs are the same, and program administrators need to take into account the unique aspects of their market to create the most effective approach possible. Select each stage to access its handbook.
- Develop Evaluation Plans
Identify the right questions to ask, appropriate metrics to collect, and the processes needed to initiate third-party impact and process evaluations.
- Develop Resources
Identify and implement systems and tools that will support data collection and data quality necessary for effective evaluation.
- Conduct Evaluation
Manage third-party impact and process evaluation activities by coordinating with evaluators, transferring data, and overseeing evaluation deliverables.
- Communicate Impacts
Communicate pertinent results of evaluations to program staff, partners, and stakeholders.
Key steps and topics for evaluating programs, including collecting data needed for evaluations
In recent years, hundreds of communities have been working to promote home energy upgrades through programs such as the Better Buildings Neighborhood Program, Home Performance with ENERGY STAR, utility-sponsored programs, and others. The following tips present the top lessons these programs want to share related to this handbook. This list is not exhaustive.
Many Better Buildings Neighborhood Program partners found that it was important to communicate during the program design phase with organizations and individuals that will collect or supply data for the evaluation. In this way, the involved individuals and organizations understand why the data is being collected and are invested in the process. Also, involving evaluators in the design of the program ensures that the right data is collected from the start.
- The Southeast Energy Efficiency Alliance (SEEA) programs experienced challenges collecting the information needed for third-party evaluation, measurement, and verification activities. SEEA contracted a third party, Cadmus Group, to evaluate its programs' activities. Cadmus found that its analysis inputs (programmatic data from the quarterly program reports to DOE and utility bill data) did not always align with the data that programs were collecting from homeowners and contractors. SEEA and Cadmus worked with each of the programs to fill in the gaps. Some programs were able to provide the missing information; others could not, either because the data existed only in paper format or because the program lacked the time or resources to provide it. To address this issue, Cadmus developed a flexible approach to determine savings for each program, based on database reviews, billing analysis, technical desk reviews, whole-house energy modeling, and engineering analysis.
- Cadmus conducted technical desk reviews when sufficient billing data was not available but SEEA could provide data files from simulation models or other project documentation for a sufficient sample of participant homes.
- They used whole-house energy modeling to assess reported savings when sufficient billing or simulation data was not available but the program tracking data was complete enough to populate the required model inputs.
- Engineering analysis was used to estimate savings for measures not incorporated into the models or to evaluate the applicability of data from other program locations when little or no data other than the program tracking data was available.
- When no such data was available, Cadmus compared weather and participant profiles with those of similar program locations.
- For EnergyWorks in Philadelphia, problems with data tracking became apparent during the data analysis conducted for the program evaluation. EnergyWorks determined that these problems could have been avoided by clearly articulating and communicating data collection goals to program staff and defining key program performance indicators early in the program. The program established an overarching goal of 2,000 upgrades but found that it would also have been useful to set smaller, specific performance goals in order to track trends as they occurred over the life of the program and to understand how those trends affected the overall goal.
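Cadmus's tiered approach of choosing a savings-estimation method from the best data available, as described above, can be sketched as a simple fallback selection. This is a hypothetical illustration only; the actual evaluation applied program-specific sufficiency criteria:

```python
def choose_savings_method(has_billing_data, has_simulation_files,
                          tracking_data_complete):
    """Select a savings-estimation method from the best available data.

    Hypothetical sketch of the tiered fallback described above, not
    Cadmus's actual decision logic.
    """
    if has_billing_data:
        return "billing analysis"             # preferred: measured consumption
    if has_simulation_files:
        return "technical desk review"        # review simulation model files
    if tracking_data_complete:
        return "whole-house energy modeling"  # model from tracking data
    return "engineering analysis"             # fall back to comparable sites
```

The point of the sketch is simply that each method is attempted only when the data for the stronger methods above it is missing.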
Though potentially challenging, establishing relationships for sharing energy consumption data is critical for evaluating a program's impact on energy and cost savings. Many Better Buildings Neighborhood Program partners found success by approaching utilities during the program planning phase, or at least several months before they planned to start collecting data, to outline shared goals, assets, tools, needs, and constraints. Clear and concise data requests helped speed up utilities' response times for providing the data and alleviated utility concerns and questions regarding data needs.
- Energize Phoenix formed a partnership with the local electric utility, Arizona Public Service (APS), while designing the program and coordinated with the utility throughout program development. Energize Phoenix found that understanding APS's concerns and challenges related to data sharing was a key ingredient in forging a successful partnership, as was instituting a formal agreement to clarify roles and responsibilities.
- Southeast Energy Efficiency Alliance (SEEA) found that not all of its programs were successful in obtaining utility bill data. Common obstacles included utilities that did not have the technology infrastructure to easily export the information, would release data only for a fee (based on how many records were pulled), or simply did not have the time or resources to provide the information, even if the program had a signed client release form from the homeowner. The SEEA programs that were most successful in obtaining utility billing information (including NOLA WISE in New Orleans, Louisiana; Local Energy Alliance Program (LEAP) in Charlottesville, Virginia; Atlanta SHINE in Atlanta, Georgia; and DecaturWISE in Decatur, Georgia) consulted with the utilities to determine what information the program needed to include in the client release form. Additionally, some programs developed a written memorandum of understanding with the utility specifying data collection and transfer roles and responsibilities. SEEA programs also found it best to make data requests to utilities on a quarterly basis to minimize the burden on the utility, as many utilities do not have staff dedicated to data exporting. Some programs received data more frequently, but in those situations the utility had the means to easily pull and export data.
- When local utilities Philadelphia Gas Works (PGW) and Philadelphia Electric Company (PECO) shared customers' energy usage data with EnergyWorks, all parties made sure that the proper data sharing requirements were observed and signed the necessary forms. EnergyWorks built its customer data release approval language into the program's loan application form to minimize the number of additional forms that a customer or contractor would need to handle.
- EnergySmart in Eagle County, Colorado, successfully developed partnerships with utilities during and after the Better Buildings Neighborhood Program grant period, but in hindsight found it would have been more beneficial to engage utilities before submitting the original DOE grant application. By not fully engaging utilities up front, EnergySmart created an environment in which the utilities were only partially included in the program and retained similar or redundant in-house services. As EnergySmart continued forward, it gained the utilities' trust by offering help, data, and information, and by sharing its results with the utilities' management and boards of directors. Through this trust, utilities became more willing to share data.
Paper-based or spreadsheet-based information collection processes can be low cost to develop and easy to roll out, but aggregating and storing data from many sources often becomes cumbersome. Many Better Buildings Neighborhood Program partners found that investing in information and communications technologies (ICT) eased program implementation and was well worth the effort, because they were able to regularly monitor progress and automate what would otherwise be time-intensive, manual processes. For more information on the future of ICT, see ACEEE's How Information and Communications Technologies Will Change the Evaluation, Measurement, and Verification of Energy Efficiency Programs.
- Garfield Clean Energy in Garfield County, Colorado, at first used a series of Excel spreadsheets and hardcopy file folders to track participants, their energy upgrade measures, and resulting energy savings. As the number of participants reached into the hundreds, the program realized that spreadsheets did not offer the level of sophisticated searching and reporting that was needed to analyze the results of their work. They explored several online customer relationship management systems and contracted with a third-party developer to customize their selected system so that it could track building and energy data, energy upgrades, contractors, dollars spent, rebates awarded, and deemed energy savings. The customization and data entry work, which took several months to complete, enabled Garfield Clean Energy to create detailed reports based on a wide variety of reporting parameters, and to better analyze the effectiveness of program activities.
- When Enhabit, formerly Clean Energy Works Oregon, scaled up their pilot program to the entire city of Portland, it was clear to them that an IT solution was needed to meet the demands of funding agencies, media requests, and good project management from the customer perspective. Enhabit worked with software company EnergySavvy to develop a unified service delivery platform to manage the home energy upgrade process from application to completion. The software platform provides a web-based interface between homeowners, contractors, and lenders, enabling each party to document progress through the Enhabit program. The platform also streamlined data collection and analysis.
- In Boulder, Colorado, EnergySmart used spreadsheets to manage data for its predecessor energy efficiency programs. As the program expanded under the Better Buildings Neighborhood Program, it became clear that EnergySmart needed a more user-friendly, real-time, cloud-based IT system for tracking customers through the implementation process. The program selected a system that allows tracking of many metrics in a much more consistent, accurate, and organized fashion than the previously used spreadsheets. The system can be accessed in the field by EnergySmart Energy Advisors using iPads or tablets to enter basic customer information, building baseline information, assessment findings for upgrade opportunities, completed upgrades with associated energy and cost savings, rebates and financing received, and supporting documentation. The collected data is compiled for reporting to various stakeholders, including the U.S. Department of Energy, county commissioners, and city staff and leaders. The ability of Energy Advisors to access the system in the field allows for much greater efficiency and accuracy than logging data after returning to the office.
Many Better Buildings Neighborhood Program partners found that setting up their information technology (IT) systems early in the program design stage ensured that data terms and data entry procedures were consistently applied by all system users. Reaching agreement with stakeholders (e.g., contractors, lenders, marketing partners, evaluators, program staff) on what data the data system would collect, known as system requirements, and how the collected data would be used to evaluate the program helped programs ensure that the data collected was complete. Programs have also found that they receive data of the quality needed for graphs and cost-effectiveness calculations when stakeholders agree up front that the data will be used for these purposes and not just to track energy savings and expenditures.
- Be SMART Maryland found that transitioning from a spreadsheet-based data collection system to a customized energy IT system was crucial to administering a multifaceted energy efficiency program with rigorous data collection requirements. Investing in the system while still designing the program allowed Be SMART to smoothly integrate it into the program's operations and to ensure quality data collection and integrity over time. Be SMART also found that while spreadsheets were useful tools for collecting data, their usefulness for analyzing data and generating reports was limited, since the program had to go through a time-consuming consolidation process to combine data from different sources and spreadsheets.
- In Boulder County, Colorado, EnergySmart found that it took four to six months for a database developer and coding consultant to fully develop and test the data system because of its high level of complexity and the customization required. The program also found that having actual users test the system with real inputs and real reporting requirements helped ensure better data quality and user-friendliness. In addition, EnergySmart found that before beginning database development, it was important to reach agreement among stakeholders on what reporting would be expected, and to design the database to facilitate building and exporting those reports. For EnergySmart, it was also important to set expectations with report recipients about the IT system's reporting capabilities early in the process, so recipients did not expect reports that the system could not produce.
Many Better Buildings Neighborhood Program partners found that it was critically important to use compatible formats for data sharing and reporting with partners. Aligning data formats and collection plans with national data formats (e.g., Home Performance XML schema (HPXML), Standard Energy Efficiency Data platform (SEED), Building Energy Data Exchange (BEDES)) ensured compatibility for aggregation and reporting.
- For Arizona Public Service's (APS) Home Performance with ENERGY STAR® Program, a lack of transparency and access to data meant it took hours each month to compile progress reports. Coordination with trade allies was difficult for similar reasons: both the utility and its contractors lacked visibility into project status and task assignment, as well as the ability to identify program bottlenecks, which affected APS customer service. Program delivery metrics, from administrative overhead to customer and trade ally satisfaction, were lower than expected. APS then began the search for a more dynamic software platform to engage customers, track and manage projects, empower trade allies, and analyze and report results. The program needed HPXML, an open standard that enables different software tools to easily share home performance data. The new HPXML-compliant platform, EnergySavvy's Optix Manage, resulted in higher cost-effectiveness and greater satisfaction for the program, including 50% less administrative time to review and approve projects, a 66% reduction in data processing time for APS reporting, 31% less contractor administrative time to submit projects, and a three-fold increase in trade ally satisfaction. HPXML also had the added benefit that contractors can choose their own modeling software.
- The New York State Energy Research & Development Authority (NYSERDA) heard from home performance contractors and other stakeholders that a more streamlined data collection process was needed to reduce the paperwork burden and time spent per project. In response, the program launched the NY Home Performance Portal in July 2013. This web-based interface made it easier for customers to choose and apply for the home performance program and made the application process for a home energy assessment clear, fast, and simple. In 2015, NYSERDA further refined its data collection process and began processing all projects in a web-enabled interface designed to facilitate program coordination. This new platform allowed NYSERDA to automate project approvals for 85-90% of projects and to automate the work scope approval process through validation of standardized data. The platform also supported HPXML, which facilitates data sharing among multiple New York programs, thereby reducing the administrative burden for contractors participating in multiple programs. An additional benefit of HPXML for NYSERDA was the creation of an open modeling software market.
- Massachusetts Department of Energy Resources (MassDOER) provides statewide oversight of energy efficiency programs administered by utilities under the Mass Save brand. Originally, contractors from Conservation Services Group, Inc. and Honeywell International Inc. used audit software customized for the program in their home energy assessments. When Mass Save piloted the Home MPG program, contractors were also required to generate an Energy Performance Scorecard for each home. The existing audit software, however, did not have this capability. To address this problem, software developers added the Energy Performance Scorecard capability so that contractors could use the same software to record audit results and generate the scorecard. Despite implementation delays, this solution allows the use of Energy Performance Scorecards to potentially expand statewide.
The following resources provide topical information related to this handbook. The resources include a variety of information ranging from case studies and examples to presentations and webcasts. The U.S. Department of Energy does not endorse these materials.
The Connecticut Neighbor to Neighbor Energy Challenge developed this form for authorization to obtain household energy information.
Homeowner data collection survey created by RePower.
This report for the Connecticut Energy Efficiency Board provides a review of best practices in impact evaluation, recommendations for calculating oil and propane savings, and discusses the impact evaluation findings for the Home Energy Services (HES) and Home Energy Services-Income Eligible (HES-IE) Programs. This best practices review provides an overview of key evaluation protocol and guideline documents.
The Multi-State Residential Retrofit Project is a residential energy-efficiency pilot program, funded by a competitive U.S. State Energy Program (SEP) award through the U.S. Department of Energy. The Multi-State Project operates in four states: Alabama, Massachusetts, Virginia, and Washington. During the course of this three-year process evaluation, Cadmus worked closely with NASEO and the four states to collect information about the programs from many perspectives, including: State Energy Office staff, program implementers, homeowners, auditors/contractors, real estate professionals, appraisers, lenders, and utility staff. This report discusses: the project’s context; its goals; the evaluation approach and methods; cross-cutting evaluation results; and results specific to each of the four states.
EnergySmart Colorado uses surveys and a customer database to get feedback from homeowners that helps fine-tune program services and operations.
This report provides an independent analysis of the job creation impact of DOE's investment in energy efficiency programs, from 2010 to 2013. The analysis calculates the job creation results that would have occurred in the Southeast, based on the prevailing economic conditions from 2010 to 2013, had DOE invested in sectors other than energy efficiency.
This presentation describes how programs have leveraged data to increase program energy savings, with a spotlight on advanced and real-time monitoring and verification (M&V 2.0), contractor scorecards, and intelligent quality assurance (QA) and monitoring.
This presentation covers a pilot project testing M&V 2.0 as an evaluation tool, facilitated by the Connecticut Department of Energy and Environmental Protection (CT DEEP). Speakers on this panel presented examples of how whole-building modeling is currently used for M&V and its potential future applications. Speakers also discussed benchmarking, data access and other protocols, and what experience with efficiency programs can teach us so that we can build on it.
This webinar series is intended for state officials starting or expanding their EM&V methods for a wide range of efficiency activities including utility customer-funded programs, building energy codes, appliance and equipment standards, energy savings performance contracting, and efficiency programs that support pollution reduction goals or regulations.
This presentation covers the National Standard Practice Manual (NSPM) which provides a comprehensive framework for cost-effectiveness assessment of energy resources, with a focus on energy efficiency. The manual describes the principles, concepts, and methodologies for sound, balanced assessment of resource cost-effectiveness.
This summary from a Better Buildings Residential Network peer exchange call focused on energy modeling in multifamily homes.
This summary from a Better Buildings Residential Network peer exchange call focused on how organizations can utilize energy modeling tools like the Asset Score for multifamily buildings into their program offerings, narrow the gap between predicted and actual energy savings, and use program data to increase program productivity and quality. It features speakers from Pacific Northwest National Laboratory, American Council for an Energy-Efficient Economy, and OptiMiser.
This presentation covers the importance of collecting and evaluating program data, including data related to marketing efforts.
The Evaluation & Data Collection Implementation Plan Template will help you develop a strategy for planning, operating, and evaluating your data collection and evaluation activities.
The U.S. Department of Energy's Better Buildings Residential Program released version 2.0 of a user-friendly tool for estimating the cost-effectiveness of a residential energy efficiency program based on program administrator inputs. Cost-effectiveness analysis compares the benefits (i.e., outputs or outcomes) associated with a program or a measure with the costs (i.e., resources expended) to produce them. Program cost-effectiveness is commonly used by public utility commissions to make decisions about funding programs or program approaches. Program designers, policy makers, utilities, architects, and engineers can use this tool to estimate the impact of different program changes on the cost-effectiveness of a program.
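At its core, a cost-effectiveness screen compares discounted benefits to discounted costs. The sketch below is illustrative only; the DOE tool and standard cost-effectiveness tests use many more inputs and benefit categories:

```python
def present_value(annual_flows, discount_rate):
    """Discount a series of end-of-year cash flows to today's dollars."""
    return sum(flow / (1 + discount_rate) ** year
               for year, flow in enumerate(annual_flows, start=1))

def benefit_cost_ratio(annual_benefits, annual_costs, discount_rate=0.03):
    """A ratio above 1.0 means discounted benefits exceed discounted costs."""
    return (present_value(annual_benefits, discount_rate)
            / present_value(annual_costs, discount_rate))

# Illustrative numbers only: $40,000/year in avoided energy costs over
# 5 years against $150,000 of program spending in year 1.
ratio = benefit_cost_ratio([40_000] * 5, [150_000, 0, 0, 0, 0])
```

The discount rate used (3% here, purely as an assumption) materially affects the result, which is one reason cost-effectiveness criteria should be agreed on with regulators and evaluators up front.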
REED serves as a dashboard for the consistent reporting of electric and natural gas energy efficiency program energy and demand savings and associated costs, avoided emissions and job impacts across the Northeast and Mid-Atlantic region. REED is a project of NEEP's Regional Evaluation, Measurement and Verification Forum (EM&V Forum) and is based on the EM&V Forum's Common Statewide Energy Efficiency Reporting Guidelines.
The Energy Data Accelerator Toolkit is a collection of resources featured in the Better Buildings Solution Center that will enable other utilities and communities to learn and benefit from the work of the Accelerator. It describes the best practices that enabled cities, utilities, and other stakeholders to overcome whole-building data access barriers.
List of building energy software packages, some of which are available free of charge or for a small fee.
The Regional Evaluation, Measurement and Verification Forum (EM&V Forum) works to support use and transparency of current best practices in evaluation, measurement, verification, and reporting of energy and demand savings, costs, avoided emissions and other impacts of energy efficiency, while also advancing the development of strategies and tools to meet evolving policy needs for efficiency.
A pro forma is a tool for forecasting the impact that adjustments to a business model can have on future financials, using a set of assumptions and inputs. In the residential energy efficiency industry, programs can use pro forma tools to forecast the impact that marketing campaigns, incentive restructuring, or other program changes will have on the program budget and results. Example assumptions include the number of homeowner registrations that a set of marketing activities generates in a year, the average assessment-to-upgrade conversion rate, and the average incentive per project. By applying assumptions such as these, a pro forma tool can also help your program determine how effective various strategies are at achieving program goals and objectives. Program administrators can also help contractors by supporting them with their own business pro forma. To help you get started, here are a few useful resources: the National Home Performance Council developed a presentation on its Integrated Pro Forma Project; for an example program pro forma, see the presentation by Virginia's Local Energy Alliance Program (LEAP-VA); the National Home Performance Council also developed the Contractor Pro Forma Tool.
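The assumption-driven arithmetic behind a simple program pro forma can be sketched as follows. All figures and parameter names are illustrative and not taken from the tools mentioned above:

```python
# Illustrative planning assumptions (not from any actual pro forma tool).
assumptions = {
    "registrations_per_year": 1000,  # homeowner registrations from marketing
    "assessment_rate": 0.6,          # share of registrants completing an assessment
    "conversion_rate": 0.4,          # average assessment-to-upgrade conversion rate
    "avg_incentive_usd": 1500,       # average incentive per completed project
}

assessments = assumptions["registrations_per_year"] * assumptions["assessment_rate"]
upgrades = assessments * assumptions["conversion_rate"]
incentive_budget = upgrades * assumptions["avg_incentive_usd"]
# → 600 assessments, 240 upgrades, a $360,000 incentive budget
```

Varying one assumption at a time (say, the conversion rate) shows how sensitive the budget and results are to each driver, which is exactly the question a pro forma is meant to answer.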
The Building Energy Data Exchange Specification (BEDES, pronounced "beads" or /bi:ds/) is designed to support analysis of the measured energy performance of commercial, multifamily, and residential buildings, by providing a common data format, definitions, and an exchange protocol for building characteristics, efficiency measures, and energy use.
Home Performance eXtensible Markup Language (HPXML) is a national Building Performance Institute data dictionary and standard transfer protocol created to reduce the transactional costs of exchanging information between market actors. This website provides resources to help stakeholders implement HPXML and stay updated on its development.
The Standard Energy Efficiency Data (SEED)™ Platform is a software application that helps organizations easily manage data on the energy performance of large groups of buildings. Users can combine data from multiple sources, clean and validate it, and share the information with others. The software application provides an easy, flexible, and cost-effective method to improve the quality and availability of data to help demonstrate the economic and environmental benefits of energy efficiency, to implement programs, and to target investment activity.
A comprehensive source of data on the environmental characteristics of almost all electric power generated in the United States.
A tool that provides information on the air emissions attributable to the electricity used in a home or business during one year, along with a description of what these numbers mean in everyday terms and information on how to be more energy efficient or buy green power.
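The core arithmetic behind such an estimate is simple: annual electricity use multiplied by a regional grid emission factor. A minimal sketch, using a hypothetical emission rate rather than an actual regional value:

```python
# Sketch of an annual emissions estimate: electricity use multiplied by
# a regional grid emission factor. The rate below is hypothetical.

LBS_CO2_PER_MWH = 850  # hypothetical regional CO2 emission rate (lbs/MWh)

def annual_emissions_lbs(monthly_kwh):
    """Estimate annual CO2 emissions (lbs) from monthly kWh usage."""
    return sum(monthly_kwh) * LBS_CO2_PER_MWH / 1000  # convert kWh to MWh

usage = [900] * 12  # hypothetical household using 900 kWh per month
print(annual_emissions_lbs(usage))  # 9180.0
```

A real tool of this kind looks up the emission factor for the user's grid region and reports multiple pollutants, but the per-pollutant calculation follows this same usage-times-factor pattern.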
The Buildings Performance Database (BPD) is the largest national dataset of real building performance data, and enables users to perform statistical analysis on an anonymous dataset of hundreds of thousands of commercial and residential buildings from across the country. One of the most powerful applications of the tool is custom peer group analysis, in which users can examine specific building types and geographic areas, compare performance trends among similar buildings, identify and prioritize cost-saving energy efficiency improvements, and assess the range of likely savings from these improvements.
Better Buildings Residential Energy Efficiency Cost-Effectiveness Tool Version 2.0: Introduction and Demonstration
This presentation provides an introduction to and demonstration of DOE's Better Buildings Residential Energy Efficiency Cost-Effectiveness Tool Version 2.0, a user-friendly tool for estimating the cost-effectiveness of a residential energy efficiency program based on a program administrator's inputs.
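As a rough sketch of what such an estimate involves (a simplified benefit-cost ratio with hypothetical inputs, not the tool's actual methodology, which includes many more cost and benefit components):

```python
# Simplified cost-effectiveness sketch: present value of lifetime energy
# savings divided by program cost. All inputs are hypothetical, and real
# cost-effectiveness tests include many more components.

def benefit_cost_ratio(annual_savings, lifetime_years,
                       discount_rate, program_cost):
    """Ratio of discounted lifetime savings to program cost."""
    pv_savings = sum(
        annual_savings / (1 + discount_rate) ** year
        for year in range(1, lifetime_years + 1)
    )
    return pv_savings / program_cost

ratio = benefit_cost_ratio(
    annual_savings=400,  # $ saved per participant per year (hypothetical)
    lifetime_years=15,   # assumed measure lifetime
    discount_rate=0.03,
    program_cost=3500,   # $ per participant (hypothetical)
)
print(round(ratio, 2))  # a ratio above 1.0 suggests cost-effectiveness
```

The choice of discount rate and which costs and benefits to count is exactly what distinguishes the standard cost-effectiveness tests from one another.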
Energy Efficiency and Conservation Loan Program Webinar Series: #1 Overview and Cost Effectiveness
This webinar is the first in a series of six hosted by the USDA Rural Utilities Service (RUS) focusing on the Energy Efficiency and Conservation Loan Program (EECLP). It provides an overview of the program, covers its requirements and benefits, and discusses steps you can take to evaluate the cost-effectiveness of energy program options.
Energy Efficiency and Conservation Loan Program Webinar Series: #2 Evaluation, Monitoring & Verification
This webinar is the second in a series of six hosted by the USDA Rural Utilities Service (RUS) focusing on the Energy Efficiency and Conservation Loan Program (EECLP). It covers the key concepts of Evaluation, Monitoring & Verification (EM&V) and gives an overview of the full process, from estimating savings before programs are implemented to measuring and verifying savings at the end. The webinar also covers the EM&V framework, evaluation plans, technical reference manuals, and measurement and verification studies.
EM&V Basics, Tools and Resources to Assist EECBG and SEP Grantees
This webinar offers an introduction to EM&V basics, including data collection, tracking tools, M&V approaches, and reporting energy savings.
Volume 1 of the Better Buildings Neighborhood Program Evaluation Report provides findings from a comprehensive impact, process, and market effects evaluation of the program period, spanning from September 2010 through August 2013.
Volume 2 of the Better Buildings Neighborhood Program Evaluation Report combines a measurement and verification process with billing regression analysis on projects with sufficient utility bill data to determine gross verified savings.
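A billing regression of this general kind can be illustrated with a toy weather-normalization model: fit monthly use against heating degree days (HDD) before and after an upgrade, then compare predicted consumption in a typical weather year. The data and simplifications below are hypothetical; the report describes the actual methodology:

```python
# Toy pre/post billing regression with one weather variable (HDD).
# Real evaluations add controls and data-sufficiency screens; all
# figures here are synthetic.

def fit_line(x, y):
    """Ordinary least squares for y = a + b*x (one predictor)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
         / sum((xi - mx) ** 2 for xi in x))
    return my - b * mx, b  # intercept, slope

def normalized_savings(pre_hdd, pre_kwh, post_hdd, post_kwh, typical_hdd):
    """Weather-normalized annual kWh savings from pre/post monthly bills."""
    a0, b0 = fit_line(pre_hdd, pre_kwh)   # pre-upgrade usage model
    a1, b1 = fit_line(post_hdd, post_kwh)  # post-upgrade usage model
    pre_use = sum(a0 + b0 * h for h in typical_hdd)
    post_use = sum(a1 + b1 * h for h in typical_hdd)
    return pre_use - post_use

pre_hdd  = [600, 500, 400, 200, 50, 0, 0, 0, 30, 150, 350, 550]
pre_kwh  = [500 + 2.0 * h for h in pre_hdd]    # synthetic pre-upgrade bills
post_hdd = [620, 480, 390, 210, 40, 0, 0, 0, 25, 160, 340, 560]
post_kwh = [450 + 1.5 * h for h in post_hdd]   # synthetic post-upgrade bills
typical  = [580, 490, 400, 200, 45, 0, 0, 0, 30, 155, 345, 555]

savings = normalized_savings(pre_hdd, pre_kwh, post_hdd, post_kwh, typical)
print(round(savings))  # ~2000 kWh/yr under these synthetic assumptions
```

Evaluating both fitted models against the same typical-year weather is what separates weather effects from the savings attributable to the upgrade itself.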
Volume 3 of the Better Buildings Neighborhood Program Evaluation Report statistically identifies factors associated with successful residential energy upgrade programs using survey sampling, cluster analysis, and multivariate regression.
Volume 4 of the Better Buildings Neighborhood Program Evaluation Report assesses the degree to which the Better Buildings Neighborhood Program met its process goals and objectives to identify the most effective program design and implementation approaches.
Volume 5 of the Better Buildings Neighborhood Program Evaluation Report provides findings from a comprehensive impact, process, and market effects evaluation of the program period, spanning from September 2010 through August 2013.
Volume 6 of the Better Buildings Neighborhood Program Evaluation Report provides findings from a comprehensive impact, process, and market effects evaluation of the program period, spanning from September 2010 through August 2013. This volume includes case studies that describe successful strategies that programs used during the evaluation period.
This comprehensive national guide provides a step-by-step process for applying the Resource Value Framework, allowing jurisdictions to develop their own primary cost-effectiveness test: the Resource Value Test. Its guidance draws on 20 years of lessons learned in state and local jurisdictions.
New advanced information and communications technologies (ICT) are pouring into the marketplace, stimulating new thinking and a shift in the energy efficiency EM&V paradigm. These emerging technologies, including advanced data collection and analytic tools, are purported to provide timely analytics on program results and efficacy. This report reviews how new data analytic tools help identify savings opportunities and engage customers in programs like never before, and explores the potential for advanced data collection (e.g., AMI, smart meters) and data analytics to improve and streamline the evaluation process.
This report provides guidance and recommendations to help residential energy efficiency programs more accurately estimate energy savings. It identifies steps program managers can take to ensure precise savings estimates, apply impact estimates over time, and account for and avoid potential double counting of savings.
Information and communications technologies (ICT) can automate and transform the evaluation, measurement, and verification (EM&V) of energy efficiency programs. ICT enables the remote monitoring and sophisticated analysis of energy usage, increasing the speed and scale of many EM&V activities. This report reviews traditional EM&V practices, explores new enabling technologies including the Internet of Things and remote building analysis, and describes the application of ICT to each stage of the EM&V process. The report then projects ways forward through a number of challenges (e.g., data overload) and concludes that ICT-enabled EM&V could eventually change the design of efficiency programs and the responsibilities of program administrators, implementers, and evaluators.