U.S. Department of Energy Energy Efficiency & Renewable Energy

Description


As you implement your program, you will want to regularly assess the efficiency of your operations and the effectiveness of your strategies and tactics in leading you to meet your program goals. Continuous improvement may involve making small adjustments in your internal processes. It may also mean refining your original program design and revisiting aspects of your implementation plan, as well as adapting the program to new needs and opportunities in the market. Although this ongoing assessment and improvement is related to formal process evaluation, it is distinct in that it is ongoing, less formal, and generally done as part of ongoing internal program management.

To effectively assess and improve your program, you need to build in systems and processes to collect information, review it, and make decisions about program refinements. A common and useful structure for describing this process of continuous improvement is the Plan-Do-Check-Act Cycle shown below.

Plan-Do-Check-Act Cycle of Continuous Improvement

Cycle of continuous improvement

Source: U.S. Department of Energy, 2014

The steps are:

  • Plan. For your initial program launch and ongoing operation, develop your implementation plan. Over time, update or implement plans for refining your program, following decisions to make changes.
  • Do. Launch, then implement your program on an ongoing basis.
  • Check. Assess how your program is working, including any new approaches you have implemented. Through a conscious effort of information collection and review, look for trends and patterns that inform adjustments to program design and operational processes. Use resources such as program dashboards, data reports and other resources that help you track and communicate program performance. This step in the Plan-Do-Check-Act Cycle is the main focus of this handbook.
  • Act. Make decisions about program refinements and communicate them to all relevant program staff and partners. The cycle then continues with planning how to undertake refinements (“plan” step), implementing these refinements (“do” step), assessing them (“check” step), and making further decisions about program refinement (“act” step).

As you go through this continuous improvement process, you will collect and assess information from all program components as inputs to program decisions, including Marketing and Outreach, Financing, and Contractor Engagement & Workforce Development. Electronic program management dashboards and/or regular reports (e.g., on program metrics, customer feedback, quality assurance) will help you quickly synthesize this information. Manage your program by assimilating all of this information, identifying opportunities for improvement, assessing the potential positive and negative impacts of any changes you might want to make, and deciding on a course of action. You will also want to communicate major program design decisions to staff, partners, and stakeholders before implementing changes.

The following steps will guide you in assessing and improving your program:

  • Track quantitative and qualitative information for assessing program process and impacts
  • Establish internal and external processes for reviewing and communicating program performance
  • Regularly review and assess program metrics and feedback to determine what is working well and what is not
  • Make decisions about program design and implementation changes and communicate them.

Better Buildings Neighborhood Program Evaluation

The U.S. Department of Energy (DOE) administered the Better Buildings Neighborhood Program (BBNP) to support programs marketing and conducting whole-building energy upgrades. DOE provided grant funding through this program from 2010 to 2013 to energy efficiency programs throughout the United States. To understand the program’s success, DOE worked with third parties to conduct a comprehensive impact, process, and market effects evaluation.

The most successful BBNP grantees had higher market penetration, better savings-to-investment ratios, more contractor job hours invoiced, and higher energy savings, among other factors. From these factors, the evaluation identified several key predictors of successful program outcomes:

  1. Offering multiple pathways to program participation, including offering multiple types of assessments and providing direct install activities
  2. Offering contractor training and having a large number of contractors available to perform upgrades
  3. Targeting outreach to recruit participants from groups with shared social networks, and tailoring messages to resonate with those groups
  4. Engaging with community-based organizations to drive demand for homeowner upgrades and to stimulate the supply of contractors
  5. Encouraging homeowners to install deep retrofits by using marketing techniques, tiered incentive structures, financing options, and contractor and participant support

For more on the evaluation, see the Better Buildings Neighborhood Program Accomplishments.

 


Step-by-Step

There are several steps that will help you assess and improve your program.

Track quantitative and qualitative information for assessing program process and impacts

Your program should draw on many sources of information for tracking program process and impacts. The sources described in greater detail below are:

  • Customer, contractor, and partner feedback
  • Metrics on work flows
  • Budget tracking information

Customer, Contractor, and Partner Feedback

In addition to quantitative information about program accomplishments and customers, you should collect and use qualitative information about your program, including the types of feedback described below.

Customer feedback. Customer feedback can give you information about:

  • Customer satisfaction (with the program overall as well as with specific program components or partners)
  • The clarity and efficiency of the customer process
  • Customer motivations
  • Customer interest in various types of services and upgrades
  • Other aspects of your program 

Customer feedback can be gathered via:

  • Customer surveys
  • Tracking of common questions and feedback received through call centers
  • Contractors, who can provide insights from their direct contact with customers

For more on collecting information from customers, see the handbook on developing program metrics for evaluation plans.

Contractor feedback. As with customers, you can survey contractors. Many programs also establish ongoing interactions with contractors to get their feedback on program processes and strategies. More information on contractor feedback can be found in the handbook on assessing and improving strategies for contractor engagement & workforce development.

Feedback from partners. Your partners can be an excellent source of feedback about your program based on their interactions with you as well as their ongoing interactions with customers, contractors, and others. Many programs establish advisory committees or other mechanisms for getting feedback directly from partners. More information on getting feedback from specific partners can be found in handbooks on marketing & outreach, financing, and contractor engagement & workforce development.

Strategies for Getting Customer Feedback

Customer satisfaction is critical to a successful program. Customers who are not satisfied may drop out of the process before they conduct an upgrade, and they certainly will not bring more customers to your program by word of mouth to neighbors, friends, and family. There are many causes of customer satisfaction and dissatisfaction, and using multiple methods to get feedback can help you understand these causes and where there may be opportunities to improve.

Actively seek out information from your customers through your website, call centers, and contractor conversations, and by making your program’s email contact information and phone number easily available. Note that these strategies can be used to get feedback from contractors and partners as well.  

Let customers know you want their feedback and ask them for it directly. For example, Efficiency Maine called all of the participants in its residential direct install program to gauge satisfaction and see if these customers were interested in moving forward with additional measures identified in home assessments.

Surveys of customers who have completed the program. Surveying these customers can help you understand what is working well and what drove them to upgrade their homes. The U.S. Department of Energy has developed a phone-based survey example for homeowners who have completed the assessment and upgrade process. Useful survey questions include:

  • Questions that help you understand and categorize customer needs (e.g., “Was the availability of financing important to you?”)
  • Questions that help you understand why customers were interested in an upgrade (e.g., “Were you interested in saving money, increasing the comfort of your home, decreasing your environmental impact, etc.?”)
  • Open-ended questions about overall satisfaction with the program and opportunities for improvement.

Surveys of customers who did not complete the program. Surveying these program drop-outs will help you find opportunities to improve and identify process failure points that need special attention. DOE has developed a phone-based survey example for program drop-outs.

Focus groups. Assembling focus groups or panels of customers can help you test and refine program delivery strategies and techniques. Focus groups work best when you are seeking specific feedback—for example, about the effectiveness of a marketing message or customer interest in a new program service. They can provide relatively rapid feedback from a cross-section of customers, saving you the time and frustration of trying an untested strategy on your full customer base.

Call center information. Call centers can be very effective at drawing out useful feedback, especially information that customers may not think to put in a survey. Standardized templates or forms for call center staff can help record and communicate key feedback to the program team.

Put yourself in your customer’s shoes. Have staff walk through the process as if they were a customer, and assess how the process would work for them. Better yet, encourage staff to actually go through an assessment and upgrade process themselves to experience the process from the customer's perspective.

 

Enhabit Gathered Feedback from Customers at Key Points in the Program Delivery Process

Enhabit, formerly Clean Energy Works Oregon, established an extensive process for getting customer feedback at key points in the program delivery process to judge customer satisfaction and understand why some homeowners chose to undertake upgrades while others did not.

As illustrated in the graphic below, the program identified seven points in the program delivery process to gather information through feedback surveys and phone interviews:

  1. Application—asking how customers heard about the program, why they signed up, their expectations, and the application experience
  2. Assessment—asking about satisfaction with the assessment, contractor, and report
  3. Bid—asking about how the bid compared with their expectations, their satisfaction with the bid, and their satisfaction with the contractor
  4. Drop-out (targeted to homeowners who received bid and were approved for financing but did not undertake upgrade projects)—asking why these customers dropped out of the program, how satisfied they were, and how likely they were to undertake upgrade measures in the future
  5. Financing—asking about the clarity and ease of the loan process and satisfaction with the financing options
  6. Completion—asking about overall benefits compared to expectations and about satisfaction with the program and contractors
  7. Experience after 12 months—asking about satisfaction with the project and program 12 months after the upgrade.

Enhabit Feedback Surveys


Source: Enhabit (formerly Clean Energy Works) Research Planning, Will Villota, CEWO, 2012 (Presented during January 19, 2012 Better Buildings Residential Neighborhood Program peer exchange call)

Information from the assessment informed the design of the program as it broadened its geographic reach throughout Oregon.

While it is important to gather survey data, do not give your customers “survey exhaustion.” Make sure you are not asking too much of your customers and make sure that you have a plan to use all of the information you are collecting.

 

Customer Surveys for EnergySmart in Boulder, Colorado

The EnergySmart program in Boulder County, Colorado, surveyed customers about program satisfaction at the end of the upgrade process. The survey request was sent by the program’s energy advisors, who had been working with homeowners throughout the process. Questions included:

  • “What appeals to you most about this service in terms of helping your home?” (Responses included increased comfort, reducing impact on the environment, lower energy bills, and others.)
  • “Please rate your experience with your contractor” (from “1” for very poor to “10” for excellent).
  • “Would you recommend this service to a friend, neighbor, or coworker?”

The survey also asked about areas where services could be improved, and it inquired about next steps homeowners planned to take.

Program staff members were able to access updated survey responses on a weekly basis, including summary graphs of results that illustrate trends in satisfaction.

Among other things, the surveys helped the program modify the assessment report provided to each homeowner, reducing it from 20 pages of detailed building science information to three pages of key data, the top five recommendations for upgrades, and information about next steps.

For more about EnergySmart’s approach to customer surveys and tracking customer data, see the presentation “Using Data to Monitor Market Transformation”.

Metrics on Work Flows

Metrics on work flows help you assess how smoothly the assessment and upgrade process is working. Having many customers move smoothly through the process is a strong indicator that processes are working well not just for customers but also for the contractors and lenders with whom they interact.

Metrics often focus on whether customers are moving from one step to the next and the amount of time these steps take. They can also track contractor, lender, energy advisor, or other process and workflow steps. For example, metrics can count the number of customers who move from one step to another, and how long each transition takes, flagging steps where engagement may drop off or where the program plays a role. Typical steps include:

  • Initial customer inquiry
  • Scheduling an assessment
  • Completing an assessment
  • Receiving an assessment report
  • Scheduling upgrade work
  • Completing upgrade work
  • Completing QA/QC on upgrade work
  • Submitting rebate paperwork
  • Receiving rebate.
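To make these metrics concrete, step-to-step conversion rates and transition times can be computed from simple per-project records. The sketch below uses hypothetical step names and dates; in practice, the records would come from your project tracking system.

```python
from datetime import date

# Hypothetical per-customer records: the date each workflow step was reached
# (None means the customer has not reached that step yet).
projects = [
    {"inquiry": date(2024, 1, 2), "assessment": date(2024, 1, 10), "upgrade": date(2024, 2, 20)},
    {"inquiry": date(2024, 1, 5), "assessment": date(2024, 1, 15), "upgrade": None},
    {"inquiry": date(2024, 1, 8), "assessment": None, "upgrade": None},
]

def step_conversion(projects, from_step, to_step):
    """Share of customers reaching from_step who have also reached to_step."""
    reached = [p for p in projects if p.get(from_step)]
    advanced = [p for p in reached if p.get(to_step)]
    return len(advanced) / len(reached) if reached else 0.0

def avg_days_between(projects, from_step, to_step):
    """Average days between two steps, over customers who completed both."""
    spans = [(p[to_step] - p[from_step]).days
             for p in projects if p.get(from_step) and p.get(to_step)]
    return sum(spans) / len(spans) if spans else None

print(step_conversion(projects, "inquiry", "assessment"))  # 2 of 3 customers
print(avg_days_between(projects, "assessment", "upgrade"))
```

A low conversion rate or a long average lag between two steps flags that transition as a potential bottleneck worth investigating.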

Programs often collect data for metrics through project tracking systems, which are described in further detail in the Develop Resources handbook. These systems are helpful for ongoing monitoring and program improvement.

  • For example, Seattle’s Community Power Works has a web-based information system that program administrators use to track the number of projects at several key steps and the time it takes for customers to move from one stage of the assessment and upgrade process to the next. The system also allows homeowners to keep track of their project status through a customer portal. 

Using Data as “Anticipatory Metrics”

Metrics are particularly helpful for tracking progress toward your objectives by measuring not only completed projects, but also the intermediate achievements that indicate whether or not you are being successful even before all of the completed upgrades have been counted. You can think of these as anticipatory metrics. In economic terms, they are similar to a “leading indicator,” a metric whose change usually anticipates a change in direction in the economy. 

In program management terms, anticipatory metrics can help predict final results. They are anticipatory because they can be measured before final results (e.g., the number of upgrades completed) are available. Over time, you can track how well these anticipatory metrics serve your needs by seeing how well they correlate with your results.

If your program requires a comprehensive assessment as part of the upgrade process, the number of assessments completed could be an anticipatory metric for your program. 

  • Identify anticipatory metrics by understanding the relationship between steps. If your objective is 100 completed upgrades, you know that at least 100 assessments need to be done. Realistically, most programs will need at least two to three times that number of assessments because only a portion of customers will complete an upgrade project.
  • Track anticipatory metrics. If you keep track of the number of assessments that are done, you can use that information to help predict how close you may be to meeting your 100-upgrade objective.
  • Use information about timing to anticipate trends in program activity. If you know how long it takes on average to complete an upgrade after the assessment, you can understand when spikes or lulls in upgrades may occur, and you can predict whether and when you will meet your objectives.
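The arithmetic behind these bullets is straightforward. The sketch below uses entirely hypothetical figures (goal, conversion rate, and assessment count):

```python
# Hypothetical program figures for an anticipatory-metric sketch.
UPGRADE_GOAL = 100
conversion_rate = 0.40  # historical share of assessments that become upgrades

def projected_upgrades(assessments_completed, conversion_rate):
    """Upgrades that the current assessment count anticipates."""
    return assessments_completed * conversion_rate

def assessments_needed(goal, conversion_rate):
    """Assessments required to expect `goal` completed upgrades."""
    return goal / conversion_rate

# 180 assessments done so far anticipate roughly 72 upgrades, and meeting
# the 100-upgrade objective implies roughly 250 assessments overall.
print(projected_upgrades(180, conversion_rate))
print(assessments_needed(UPGRADE_GOAL, conversion_rate))
```

Comparing projections like these against actual completions over time shows how well the anticipatory metric correlates with your results.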

 

The Connecticut Neighbor to Neighbor Energy Challenge Collects Metrics

The Connecticut Neighbor to Neighbor Energy Challenge had a sophisticated approach to collecting and assessing metrics. To assess its program delivery strategy, the program measured several aspects of the interactions between customers, the program, and contractors, including:

  • Assessment close rate
  • Number of days to schedule and complete an assessment
  • Bid rate
  • Number of days to deliver a bid
  • Average age of bids outstanding (i.e., how long the bid has been in the hands of customers who have not yet made a decision to pursue an upgrade)
  • Upgrade rate
  • Average age of completed upgrade (from home performance assessment).
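A metric like the average age of bids outstanding can be computed directly from bid records. The sketch below uses hypothetical dates and field names:

```python
from datetime import date

# Hypothetical bid records: when each bid was delivered and whether the
# customer has made a decision (to proceed or decline).
bids = [
    {"delivered": date(2024, 5, 1), "decided": True},
    {"delivered": date(2024, 5, 10), "decided": False},
    {"delivered": date(2024, 5, 22), "decided": False},
]

def avg_age_of_outstanding_bids(bids, as_of):
    """Average days that undecided bids have been in customers' hands."""
    ages = [(as_of - b["delivered"]).days for b in bids if not b["decided"]]
    return sum(ages) / len(ages) if ages else 0.0

print(avg_age_of_outstanding_bids(bids, date(2024, 6, 1)))
```

A rising average age of outstanding bids suggests customers are stalling between bid and upgrade, which may call for follow-up outreach.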

For more about the Neighbor to Neighbor Energy Challenge's approach to data collection and program assessments, see the presentation “Technology Solutions and Programmatic Approaches: Driving Innovation in Residential Energy Efficiency Strategies”.

Budget Tracking Information

While it is vital that your program measure program progress, customer satisfaction, and workflow efficiency, you also need to track your internal operational budget to ensure that you are investing enough to get the results you want and that you are not spending beyond your means. Budget information helps programs track costs against overall budgets and find opportunities to reduce costs or increase investments.

Examples of key costs include:

  • Staff salaries and benefits
  • Costs for program implementers or consultants
  • Cost of incentives
  • Resource development costs (e.g., generating marketing materials)
  • Program implementation costs (e.g., marketing campaign expenses)
  • Capital expenses (e.g., equipment, computers and software).

Examples of valuable budget information include:

  • Overall spending relative to budget, including spending for completed upgrade projects and anticipated spending for projects that have been initiated but not completed
  • Overall spending relative to project outcomes (e.g., number of upgrades, energy savings)
  • Costs of specific activities (e.g., cost per lead, average onsite upgrade costs)
  • Costs of specific program components (e.g., marketing, contractor training, quality assurance) 
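These budget metrics reduce to simple ratios. The sketch below, with hypothetical budget lines and outcome counts, computes overall spending relative to budget and cost per completed upgrade:

```python
# Hypothetical budget lines (in dollars) and program outcomes.
budget = {"staff": 200_000, "incentives": 150_000, "marketing": 50_000}
spent = {"staff": 120_000, "incentives": 90_000, "marketing": 45_000}
upgrades_completed = 300

def percent_spent(budget, spent):
    """Overall spending relative to the total budget."""
    return sum(spent.values()) / sum(budget.values())

def cost_per_upgrade(spent, upgrades):
    """Total program spending per completed upgrade."""
    return sum(spent.values()) / upgrades if upgrades else None

print(f"{percent_spent(budget, spent):.1%} of budget spent")
print(f"${cost_per_upgrade(spent, upgrades_completed):,.0f} per upgrade")
```

Tracking these ratios over time, and by program component, helps spot categories that are overspending relative to the results they deliver.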

Establish internal and external processes for reviewing and communicating program performance

As you collect information about your program, establish internal mechanisms for communicating and reviewing information that can support real-time assessment of program performance as well as mechanisms for communicating information externally. Communicating program metrics and component-specific metrics related to marketing and outreach, financing, and contractor engagement & workforce development allows program personnel and partners to monitor activity, understand progress toward goals, and assess and improve program design and delivery over time.

To communicate program information internally and externally:

  • Design communication tools to present information in a format that highlights key data, trends, and analysis.
    • Some data are best expressed as numbers reflecting magnitude (e.g., total energy savings).
    • Others are better expressed as relationships such as percentages or ratios (e.g., percent achievement of numeric objectives or cost per upgrade).
    • Still other information is best shown as a trend (e.g., upgrades over time).
  • Where possible, highlight progress toward specific goals and objectives.
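The same underlying data can feed all three presentations. A minimal sketch with hypothetical monthly counts and an assumed annual objective:

```python
# Hypothetical monthly upgrade counts and an annual objective.
monthly_upgrades = {"Jan": 20, "Feb": 35, "Mar": 45}
annual_objective = 400

total = sum(monthly_upgrades.values())   # magnitude: total upgrades to date
pct_of_goal = total / annual_objective   # ratio: progress toward the objective
trend = list(monthly_upgrades.values())  # trend: month-by-month counts

print(total)
print(f"{pct_of_goal:.0%} of objective")
print(trend)
```

Choosing among these formats depends on the audience: funders may want the magnitude and ratio, while program managers may act on the trend.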

These approaches will help program managers and partners quickly review information (i.e., the “check” step of the continuous improvement process) and make decisions about program refinements (i.e., the “act” step).

In designing a system for internal communication and review, you should consider:

  • Who. Determine who should get what type of internal information (e.g., program staff, contractors).
  • When. Determine how often information should be generated and communicated (e.g., monthly, quarterly, semi-annually).
  • What. Determine what information should be communicated (e.g., metrics on upgrades, energy savings).
  • How. Determine how metrics should be communicated (e.g., by reporting period, year-to-date, program-to-date).

For external communications, key partners that may need to see various types of program data on an ongoing basis include:

  • Marketing partners interested in the effectiveness of their approaches and progress toward program participation goals
  • Financing institutions interested in demand for residential energy efficiency financing products
  • Contractors seeking to track the level of program-generated demand and gauge their activity relative to other contractors
  • Utilities interested in program-generated demand for rebates and other programs
  • Regulatory bodies requiring information on program cost-effectiveness

Common approaches for sharing program data are described below.

Dashboards

Dashboards allow quick access to key data to help inform program management. Among other things, they can provide:

  • Automatically updated and aggregated project data to illustrate how the program is performing on key metrics (e.g., percent progress toward objectives)
  • Summarized workflow metrics to help you understand process bottlenecks
  • Time series data showing, for example, how outreach, events, service offerings, operational changes, and other factors are influencing program outcomes and the flow of projects.

It is important to note that dashboards are only as good as the data they present—current and accurate data are critical for your program.

RePower Bainbridge Uses a Dashboard to Monitor Progress and Results

RePower Bainbridge used a program dashboard, shown below, to track its progress on a monthly basis toward assessment, upgrade, energy savings, and carbon reduction goals.


Source: Progress Dashboard, RePower Bainbridge, 2013.

Internal Reports and Partner Communication

Regular written reports are another common strategy for sharing program information. These reports do not require special software systems but often use a standard template. Some programs produce monthly management reports showing key program metrics and budget data. Such reports are often circulated to program staff and the management team prior to monthly program meetings.

The U.S. Department of Energy developed an internal report template that programs can adapt and use. It provides a standard format for highlighting:

  • Recent program activity
  • Key program summary metrics (e.g., number of assessments, loans, upgrades)
  • Data about how many projects are in various stages from application to completion
  • Status updates on metrics for key program components: financing, marketing and driving demand, workforce, and data & evaluation.

These internal reports can be adapted to the specific needs of a program. Some programs have used them to track leads generated by different outreach events, program activity by outreach staff member, referrals of customers to other programs, contractors attending training, and a variety of other data points.

Share program data with key partners as part of ongoing communications, coordination, and consultation about efforts to assess and improve programs.  For specific strategies for communicating these data to partners, see handbooks on Contractor Engagement & Workforce Development, Financing, and Marketing and Outreach.

Community Power Works Used a Dashboard and Other Internal Reports to Track Progress

The Community Power Works (CPW) program in Seattle, WA used a program dashboard and custom reports, charts, and maps to track progress against targets. Several examples are provided below.

Monthly Indicators Dashboard

CPW’s monthly indicators dashboard showed progress toward goals for sign-ups with the program, energy assessments, home energy upgrades, and energy savings.


Program Pipeline Graphic

CPW used the program pipeline chart below to review the long-term progress of CPW customers by stage. The chart below represents data from 2011-2015.


Premises by Location Map

CPW used the premises map below to review where CPW sign-ups occurred over time. Potential customers could set the map for any given period (below is April 2011-May 2015) and zoom in or out to adjust the level of detail. In this way, homeowners could see if their neighbors were improving the energy efficiency of their homes.


Source: Community Power Works Program, 2015

External Reports

Your program may also want to generate routine reports for partners, funders, and others to maintain ongoing support for program implementation, provide accountability, and/or get feedback on program implementation.

  • Funders may be interested in reports that clearly show the metrics of immediate relevance to them (e.g., kilowatt-hour or therm energy savings, demographic information, non-energy benefits). This reporting can help sustain funder support and/or provide an opportunity for feedback on program strategy and results.  
  • Similarly, you may want to provide routine reports to legislatures, city councils, and other decision-making bodies to show the benefits of public investments and get feedback on what kind of impacts in the community would help encourage future support.

Regularly review and assess program metrics and feedback to determine what is working well and what is not

By tracking program operations closely, you will be able to identify issues that arise, make necessary changes, and respond quickly (i.e., the “act” step of the continuous improvement process). When developing program metrics and measurement strategies, you assigned responsibilities for reviewing and assessing evaluation metrics. Many programs use a monthly or twice-monthly management team meeting (coupled with frequent informal communications) for reviewing program data to inform design and operational decisions.

Review and assessment should draw on all of the information sources described above as well as insights from other program components, including Marketing and Outreach, Financing, and Contractor Engagement & Workforce Development.

As you review data, you should focus on identifying:

  • Where the program is and is not meeting its goals.
    • For example, if energy savings from upgrades are significantly off target, focus on determining why and develop strategies to address the problem. You could look at opportunities to target high-energy-use households or revisit your program’s incentive structure to encourage measures that deliver deeper energy savings.
  • Process bottlenecks and failure points where the program may be losing customers.
    • You should seek opportunities to streamline program steps that turn out to be more time-consuming or onerous than necessary. Look for chances to eliminate extraneous processes and procedures not critical for achieving program goals. Assess whether bottlenecks can be addressed with a technology solution (e.g., tablets for contractors to automatically transfer data from the field, electronic transfer of utility data upon customer release) or whether they need to be addressed through new staff roles, responsibilities, or procedures.
  • Where customers, contractors, or partners are highly satisfied or dissatisfied.
    • Understand what aspects of the process customers, contractors, and partners are satisfied or dissatisfied with and why.  Seek advice from them about how to sustain and enhance the positive aspects of your program and how to address areas that are not working well.

The Connecticut Neighbor to Neighbor Energy Challenge Used Extensive Data Collection and a Consistent Review Schedule to Assess Its Program Delivery Process

The Connecticut Neighbor to Neighbor Energy Challenge used extensive quantitative and qualitative data collection to assess its program delivery process, from initial outreach strategies in 14 towns around the state through assessments to upgrades. This approach identified several areas for improvement.

The program used a customer relationship management system to manage and analyze data. Quantitative data collection focused on interactions between customers, the program, and contractors. Some key metrics included:

  • The rate of customers moving from assessments to upgrades
  • Time taken at various steps in the upgrade process, such as scheduling and conducting an assessment, issuing a bid for upgrade work, and performing the upgrade

Quantitative data were supplemented by qualitative information from customer surveys conducted online and by phone, and from staff debriefs following customer events.

Data were used to develop management reports comparing metrics against weekly, monthly, and quarterly goals across multiple strategies. Program administrators used a dashboard to view high-level metrics and issued more detailed pipeline reports (measuring project-level data) for contractors. The program also shared progress reports with utility ratepayer fund administrators that oversaw program funding.

To assess its delivery process, the program engaged in what it described as an “unwrapping” process, taking a close look at all of the interactions between customers, contractors, and the program. Examples of issues identified and addressed through project improvements included:

  • Poor-quality leads at program launch. Feedback from contractors, customer surveys, and data on how many projects resulted in assessments and upgrades highlighted that only about a quarter of customers interested in the program (i.e., “leads”) actually followed through with an assessment (the program tracked the lead-to-assessment ratio as a metric called the “close rate”).
    • To address the issue, the program engaged in intensive retraining of outreach staff, developed contractor scorecards, and made other program changes that led to an increase in the close rate to 60%.
  • Low assessment-to-upgrade conversion rate. Project pipeline reports revealed that the number of customers moving from assessments to upgrades was low.
    • To address the issue, the program created a contractor liaison position to interact directly with contractors and understand their bid and sales process, instituted a publicly available contractor scorecard, and held monthly contractor meetings to discuss what was working well and what was not.

The program also created a “swim lane” strategy, in which program staff determined which customers were already interested in upgrades and which were mainly interested in home assessments.

  • Customers already interested in upgrades were matched with contractors experienced in doing upgrade work.
  • Customers mainly interested in assessments were matched with contractors experienced in home assessments that could help homeowners identify what kind of residential energy efficiency measures would be most appropriate for them.

For more about the Neighbor to Neighbor Energy Challenge approach to data collection and program assessments, see the presentation “Technology Solutions and Programmatic Approaches: Driving Innovation in Residential Energy Efficiency Strategies”.

 

Programs Learned from Contractors That They Needed “Progress Payments” to Cover Initial Costs for Materials and Labor

Residential energy efficiency programs in San Diego, Seattle, and Oregon maintained active lines of communication with their contractors to get feedback and assess how well the programs were working for contractors. In all three cases, contractors suggested an improvement: provide contractors with incremental “progress payments” throughout upgrade projects rather than a single payment at the end of the project.

Progress payments would allow contractors to buy materials and cover other expenses incurred early in a project. Because contracting companies are often small businesses, they may not have enough capital to cover costs that are repaid only at the end of a project.

In response to the feedback from contractors, all three programs implemented changes.

  • San Diego’s Energy Upgrade California program established progress payments as part of its financing approach.
  • Seattle’s Community Power Works and Enhabit set up small revolving lines of credit to cover up to a 50% down payment on upgrade project costs.
  • All three programs established rules for who would repay the funds if the project was not completed: homeowners in San Diego, and contractors in Seattle and Portland.

Make decisions about program design changes and communicate them

In the “act” step of the Plan-Do-Check-Act Cycle, program managers—often in collaboration with staff and partners—make decisions about program design and operational changes.

Being flexible and embracing change requires an entrepreneurial approach from program managers and staff. DOE’s experience with the Better Buildings Neighborhood Program has shown that programs that adjusted quickly to changing conditions tended to be more successful.

At the same time, programs need to be judicious about when to make changes—frequent adjustments can confuse customers and frustrate contractors. You should also understand how much leeway your program has to make changes. For example, utility-run programs may need regulatory approval for program design changes.

To make sure that your program is regularly and systematically reviewing information and making decisions:

  • Establish a regular cycle of program design review and refinement, such as monthly or quarterly meetings
  • Establish clear decision-making processes, responsibilities, and approaches for internal and external collaboration
  • Understand who needs to approve program design changes, whether internal management or a board or public utility commission, as well as which potential changes require approval and which the program may implement without approval
  • Establish clear lines of communication among program staff, contractors, and partners

Denver Energy Challenge Turns the Program Around with Innovation

The Denver Energy Challenge began serving customers in 2011. By September 2011, 2,100 residents had signed up, but only 100 had undertaken assessments and none had moved on to upgrades. In the fall of 2011, the program reassessed its approach and committed to improving its results. The Challenge launched several new features, including having “green teams” of outreach staff canvass neighborhoods. By the following summer (July 2012), residents had completed over 1,000 upgrades.

Here is the program’s advice on key elements that helped turn the program around:

  • Conduct smart, strategic marketing and outreach
  • Carefully track and measure data
  • Work collaboratively with contractors
  • Make the process as easy as possible for customers and contractors
  • Leverage other resources and partnerships
  • Keep innovating
  • Do not give up!

For more on the approach used by the Denver Energy Challenge and its results, see the presentation “Turning around your residential program: Lessons Learned.”

 

Green Jobs–Green New York Adjusted Loan Eligibility Requirements Based on Approval Rates for Lower-Income Households

The Green Jobs–Green New York program, run by the New York State Energy Research and Development Authority (NYSERDA), found that lower-income residents who most needed financial support were not qualifying for loans to complete upgrades. As a result, NYSERDA adjusted its financing program to establish two tiers of loans.

Tier 1 loans followed traditional underwriting standards, requiring a minimum FICO credit score of 640 and a maximum debt-to-income (DTI) ratio of 50%. Tier 1 loans were financed through capital markets.

Tier 2 loans used alternate underwriting criteria that were easier to achieve for lower-income residents. These loans were financed by a revolving loan fund managed by NYSERDA. 

  • For households with credit scores below 640, NYSERDA Tier 2 standards increased the maximum DTI to 55% and used utility bill repayment history in lieu of credit score to assess creditworthiness.
  • For households with FICO scores above 680 that were rejected from Tier 1 because their DTI ratios were above 50%, Tier 2 standards increased the maximum DTI to 70% and used utility bill repayment history to assess creditworthiness.

To be eligible for Tier 2 loans, homeowners also needed to be current on their mortgage payments for the previous twelve months. The program revised the Tier 2 criteria three times—what it described as “gradually lowering the bar”—based on loan application approval/denial rates. As of December 2013, the approval rate for Tier 2 loans was as high as 77%.
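The two-tier criteria described above amount to a simple decision rule. The sketch below is illustrative only, not NYSERDA's actual underwriting code; the function name and boolean inputs are assumptions:

```python
def loan_tier(fico, dti, good_utility_history, mortgage_current_12mo):
    """Classify a hypothetical applicant under the two-tier criteria
    described above (a sketch, not NYSERDA's actual underwriting)."""
    # Tier 1: traditional underwriting (FICO >= 640, DTI <= 50%).
    if fico >= 640 and dti <= 0.50:
        return "Tier 1"
    # Tier 2 requires mortgage payments current for the prior 12 months
    # and uses utility bill repayment history in lieu of credit score.
    if mortgage_current_12mo and good_utility_history:
        if fico < 640 and dti <= 0.55:
            return "Tier 2"
        if fico > 680 and dti <= 0.70:
            return "Tier 2"
    return "Denied"

print(loan_tier(700, 0.45, True, True))   # Tier 1
print(loan_tier(620, 0.52, True, True))   # Tier 2 (low FICO path)
print(loan_tier(690, 0.65, True, True))   # Tier 2 (high-DTI path)
print(loan_tier(600, 0.60, True, True))   # Denied
```

Encoding the criteria this way also makes "gradually lowering the bar" concrete: each revision of the Tier 2 standards is just a change to the thresholds, which can then be checked against subsequent approval and denial rates.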

As of December 2013, with nearly $44 million in total Tier 1 and Tier 2 loans issued, the delinquency rate was slightly higher for Tier 2 than Tier 1, but the total delinquency rate was only 3.6%. Among other things, the program learned that utility bill repayment history was a strong indicator of whether customers would repay loans.

Loans Closed and Loan Value for NYSERDA Green Jobs–Green New York (November 2010–December 2013)

Source: Green Jobs–Green New York, 2014.


Tips for Success

In recent years, hundreds of communities have been working to promote home energy upgrades through programs such as the Better Buildings Neighborhood Program, Home Performance with ENERGY STAR, utility-sponsored programs, and others. The following tips present the top lessons these programs want to share related to this handbook. This list is not exhaustive.

Keep the program simple for your customers

Given all of the other things that compete for your audience’s attention, it is critical that program participation steps are straightforward and easy to understand. Many programs have found that complexity makes it harder for interested homeowners to complete upgrade projects. These programs have focused on streamlining services, requiring as few steps as possible for customers, and keeping the message about the upgrade process simple.

  • Enhabit, formerly Clean Energy Works Oregon, provided a “One-Stop Shop” Home Energy Remodel process that guided customers through four steps: apply, assess, finance, and transform. This simple process gave customers access to a comprehensive package of services that included low-interest financing and rebates, free energy assessments, assistance from an independent energy advisor, and the option to repay monthly loan obligations through their heating utility bills. To keep the process simple for customers and, in the process, improve program administration efficiency, Enhabit focused on process automation through its internal project tracking system.
  • The EnergySmart program in Boulder County, Colorado, found that having an energy advisor assigned to each program participant throughout the home upgrade process was a key to keeping the program simple for customers and for overall program success. Energy advisors offered easily accessible subject-matter expertise, project management support, and encouragement to help customers make decisions and complete their upgrades. They installed low-cost energy savings measures and helped homeowners review assessment reports, determine which home improvements to pursue, select contractors, and apply for rebates and financial incentives. EnergySmart enjoyed a robust conversion ratio; nearly 70 percent of enrolled homeowners completed a home energy upgrade. For more on energy advisors, see Energy Advisors: Improving Customer Experience and Efficiency Program Outcomes.
  • Recognizing that many different types of energy efficiency financing and rebates were available to its customers—but that it could be overwhelming to sort through them all—RePower Bainbridge helped customers access aggregated information by creating a consumer-friendly guide to all utility and non-utility incentives in its service area. The local utility benefited from the guide as well—it made the guide available to all of its customers.

Provide customers with a single point of contact to help them through the upgrade process

While homeowners may be interested in the benefits of an energy upgrade, many are deterred from completing an upgrade project because the process is complex and unfamiliar. Often, a significant portion of homeowners who receive energy assessments do not continue with the upgrades. As part of the Better Buildings Neighborhood Program, multiple programs across the country tested a range of customer service strategies built around a single point of contact who guides homeowners through the entire upgrade process. These program staff members are often called energy advisors or energy coaches and can provide a combination of services to help customers overcome barriers to home energy upgrades.

This approach – identifying barriers and providing targeted services through dedicated energy advisors to overcome them – has produced higher conversion rates and more satisfied customers; however, these services can also be time-intensive and increase the cost of program delivery. For more information on utilizing energy advising services to minimize informational, decision-making, and transactional barriers faced by homeowners, see Energy Advisors: Improving Customer Experience and Efficiency Program Outcomes.

  • EnergySmart in Boulder County, Colorado, found that having an energy advisor assigned to each program participant through the home energy upgrade process was a key to program success. Energy advisors built trust with the customer during an initial home visit and maintained a one-on-one relationship with homeowners throughout the process. Energy advisor services included installing low-cost measures, reviewing the assessment report and work scope, assisting with contractor selection, and helping with program paperwork. The relationship endured after the upgrade: after they completed their first upgrade, program participants frequently continued to stay in communication with energy advisors about additional projects and questions. Through customer surveys, Boulder found that 97% of customers rated their energy advisor as professional, knowledgeable, and timely. These customers agreed that “working with my Energy Advisor has been worth my time and effort.” In Boulder, around 60-70% of homeowners enrolled in the program took actions to upgrade their homes.
  • Energy advisors for Enhabit, formerly Clean Energy Works Oregon, provided education, objective advice on the assessment report and work scope, and quality control to customers across nearly half of the state. Program staff helped customers initiate the process by scheduling a home energy assessment, and they provided a quality control review following upgrades. Advisors also monitored the progress of each project through internal project pipeline status reports, which helped reduce bottlenecks and minimize customer frustration. The energy advisor strategy helped Enhabit achieve a 94% customer satisfaction rating during the program pilot. Enhabit found that in some cases—such as having energy advisors present at assessments conducted by high-performing contractors—the program could reduce energy advisor services without impacting customer satisfaction or reducing the number of upgrades completed. This knowledge allowed the program to reallocate its resources.
  • The Denver Energy Challenge provided customers with free energy advisor services starting with an initial phone call. The energy advisors helped customers by identifying available rebates and financing options, finding qualified home improvement contractors, reviewing bids, providing education on energy improvements, and even connecting qualified residents with other free or subsidized energy improvement services outside of the Denver Energy Challenge. As a result of this support, nearly 75% of customers who worked with an energy advisor went on to complete a home energy upgrade.
  • NeighborWorks of Western Vermont staff scheduled all contractor visits for its customers residing in small towns across Rutland County. Once contractors completed home energy assessments, energy advisors reviewed assessment reports with customers. This review helped customers understand the content of the reports and prioritize improvements to be undertaken based on their needs and budgets. Energy advisors helped customers apply for financing (as needed) – a common point in the upgrade process where projects stall – and move on to the next steps. The energy advisor acted as the customer’s primary point of contact for information about the assessment and upgrade process. This approach contributed to the program’s success in completing over 600 upgrades from 2010 through 2013.
  • Greater Cincinnati Energy Alliance (GCEA) energy advisors helped homeowners through every aspect of the upgrade process, from requesting an assessment to hiring a contractor. The program found that offering energy advising services through one individual person – the energy advisor – made potential customers more comfortable with the program, even if many customers did not actually contact the advisor. This hands-on customer service increased the number of completed upgrades and ensured that a high standard of quality was maintained throughout the process.

Make upgrade options clear and concise for customers

Programs in many regions of the U.S. find that the concept of home performance is new to homeowners. Homeowners may not know how energy efficiency measures compare (e.g., the energy savings of insulation versus new windows) or may not have heard of some effective measures, such as air sealing. Programs can help customers overcome decision paralysis with a prioritized list of upgrade recommendations and help deciding which measures to undertake. Several programs have devised simple approaches to help customers understand the energy savings, cost savings, and other benefits of various types of measures, so homeowners can choose what is best for them. Recognize that customers may have other priorities when considering an assessment’s proposed measures (e.g., improving the look of their home with new windows or replacing an aging furnace before winter weather sets in).

  • Austin Energy developed a form to estimate energy savings using a point system that contractors could use with residents during a home assessment. The form helped contractors and customers quickly determine which measures would achieve 15% energy savings in the home. Texas A&M’s Energy Systems Laboratory validated the point system for the program to ensure its accuracy and integrity. The program found that this streamlined approach was appealing to customers and contractors.
  • Los Angeles County’s Energy Upgrade California implemented the Flex Path program, which used a point system to show the energy savings from a menu of energy upgrade measures. To be eligible for program rebates, residents selected the measures they would like to undertake that together would total over 100 points and achieve 15% energy savings.
  • Michigan Saves, formerly BetterBuildings for Michigan, provided customers with a “base package” that included an energy assessment, direct installs of compact fluorescent light bulbs and water-saving devices, and basic measures like air and duct sealing. Customers could then choose to undertake additional measures (e.g., insulation, furnace replacement) beyond the base package. The program found that the clear and concise base package was a good way to get people into the program, but it wasn’t sufficient to reach the program’s goal of 15% energy savings in upgraded homes. Getting homeowners to achieve higher energy savings through additional measures required incentives, such as rebates and low-interest financing. For more information, see the case study Experiment to Find the Right Mix of Incentives.
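A point system like those above is easy to express as a small lookup and threshold check. The measure names and point values in this sketch are made-up illustrations, not the programs' actual assignments:

```python
# Illustrative point values only; actual program point assignments differed.
POINTS = {
    "attic_insulation": 40,
    "air_sealing": 30,
    "duct_sealing": 25,
    "hvac_replacement": 50,
    "water_heater": 20,
}

def rebate_eligible(selected_measures, threshold=100):
    """Sum points for the chosen measures; a total at or above the
    threshold (roughly 15% modeled energy savings) qualifies for rebates."""
    total = sum(POINTS[m] for m in selected_measures)
    return total, total >= threshold

total, eligible = rebate_eligible(["attic_insulation", "air_sealing", "duct_sealing"])
print(total, eligible)   # 95 False: just short of the threshold

total, eligible = rebate_eligible(["attic_insulation", "air_sealing", "duct_sealing", "water_heater"])
print(total, eligible)   # 115 True
```

The appeal of the approach for customers and contractors is that eligibility can be checked on the spot during an assessment, without waiting for detailed energy modeling.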

Keep the program simple for your contractors

Program administrators learn early on that they should minimize the burden for contractors entering and participating in the program. Satisfied contractors are a key to satisfied customers and successful programs. Many programs have realized that they should streamline program processes, minimize changes over time, and communicate early with contractors about new offerings and potential changes. To reduce contractors’ reporting costs and help ensure timely and complete reporting, these programs have streamlined contractor reporting forms and requirements as much as possible. Many programs also avoid making contractors meet locally specific certification requirements, instead requiring certification from nationally recognized programs. For more on working effectively with contractors, see the Contractor Engagement and Workforce Development handbooks.

  • NeighborWorks of Western Vermont focused on listening to the needs, wants, and issues of contractors, so the program could help them serve customers most effectively. The NeighborWorks program held individual monthly meetings with each contractor to review client status, as well as bi-weekly group contractor meetings to review program issues, alert contractors to any changes in the program, and provide learning opportunities.
  • Enhabit, formerly Clean Energy Works Oregon, has been very successful in engaging contractors in regular, ongoing communication and making adjustments to the program in response to contractor feedback. For example, when Enhabit engaged a new financing partner, the program asked contractors to examine the loan product and approval process. Leadership of the Home Performance Contractors Guild of Oregon, an organization that provided a unified voice and formal role for program contractors, identified that the timing of loan signings came too late in the contractor sales process. The guild said the financing product would not be of much use to contractors because contractors would have to expend considerable effort in a project before knowing if their customer could get a loan to pay for it. As a result, Enhabit renegotiated with the financing partner to put the loan signing earlier in the sales process. For more information, see the case study Making the Program Work for Contractors.

Measure and evaluate performance at key points in the process

Measuring performance at key points in the upgrade process (e.g., assessments, conversion rates, and financing applications) has helped programs understand where their processes are working smoothly and where they are not. This information has helped them continuously improve their program design and implementation. To monitor progress, successful programs have combined information from their project tracking systems with customer surveys, information from call centers, and feedback from contractors and lenders to understand the customer experience. Make data accessible for program staff to track progress, identify successful strategies, and detect points of failure.

  • Enhabit, formerly Clean Energy Works Oregon, established an extensive process for getting customer feedback at key points in the program delivery process to evaluate customer satisfaction and better understand why some homeowners chose to undertake upgrades while others did not. The program identified seven points in the program delivery process to gather information through feedback surveys and phone interviews: application, assessment, bid, drop-out, financing, completion, and experience after 12 months. The program credited this kind of customer communication and feedback as one of the keys to its ongoing success.

CEWO Feedback Surveys

Source: Clean Energy Works Research Planning, Will Villota, CEWO, 2012 (Presented during January 19, 2012 Better Buildings Residential Neighborhood Program peer exchange call).

  • Boulder County’s EnergySmart program sent an online customer feedback survey to homeowners who had completed upgrades. Among other things, the customer surveys affirmed customer satisfaction and identified the opportunity for word-of-mouth marketing. Surveys found that the vast majority of the respondents would recommend the EnergySmart service to a friend or neighbor. The surveys also surfaced some weaknesses that the program resolved. For example, some respondents cited contractors’ lack of responsiveness and professionalism, which led the program to develop guidelines for professionalism and customer contact. Surveys also noted that the assessment report was long and confusing, leading the program to develop a new, customized report that was easier to follow and clearer about next steps.
  • Connecticut’s Neighbor to Neighbor Energy Challenge used qualitative contractor and customer feedback combined with quantitative data to evaluate how well its outreach efforts led to home energy assessments. When informal contractor feedback alerted program managers that relatively few interested customers were following through to have assessments conducted on their homes, the program analyzed project data and found that only around a quarter of customers who expressed interest in an assessment had completed one. To diagnose the problem, the program analyzed data to see how customers were acquired, how long it took to send leads to contractors, and how long it took contractors to follow up with customers to arrange for an assessment. Through qualitative analysis, the program found, among other things, that customers didn’t understand what they were signing up for and may have been unwilling to say “no” to young and enthusiastic outreach staff. The program also found that its staff wasn’t following up quickly enough with people that wanted more information. In response, the program improved its process for distributing leads to contractors (e.g., linking contractors to homeowners in 1-2 days), created a “receipt” for interested customers outlining next steps, and set up a system to call non-responsive leads after two weeks. With these and other steps, the program increased its close rate by 35% in the month after the changes were implemented.

Examples

The following resources are examples from individual residential energy efficiency programs, which include case studies, program presentations and reports, and program materials. The U.S. Department of Energy does not endorse these materials.

Case Studies

  1. Author: U.S. Department of Energy
    Publication Date: 2011

    This case study describes Austin Energy's short-term, comprehensive rebate/financing offer to jump-start participation and valuable lessons learned along the way.

  2. Author: U.S. Department of Energy
    Publication Date: 2011

    This case study describes an innovative program design used by BetterBuildings for Michigan to "sweep" neighborhoods in order to effectively reach its residential audience and achieve an 80% participation rate among those canvassed.

Program Presentations & Reports

  1. Author: Lilah Glick, Greater Cincinnati Energy Alliance
    Publication Date: 2011

    Presentation from the Greater Cincinnati Energy Alliance on how to conduct a real-time evaluation of programs and services.

  2. Author: Energy Trust Oregon (Prepared by Johnson Consulting Group)
    Publication Date: 2012

    This report presents key findings and recommendations from the process evaluation of Clean Energy Works Oregon's (now Enhabit's) energy efficiency financing program. Table 1 provides a good list of key process evaluation research questions which may help others scope comprehensive process evaluations.

  3. Author: Research Into Action, Inc.
    Publication Date: 2010

    This report describes the process evaluation of a pilot project in Portland, Oregon, that informed the refinement and expansion of the program statewide into Clean Energy Works Oregon (now Enhabit).

  4. Author: Cynthia Adams, Local Energy Alliance Program; Larry Earegood, Consumers Energy (MI); John Schott, NYSERDA; Gavin Hastings, Arizona Public Service; Emily Salzberg, Washington State University Energy; Adam Buick, Community Power Works (WA); Bob Knight, BKi
    Publication Date: 2014

    Quick summaries of strategies various programs have used to improve the efficiency of delivering energy efficiency services.

  5. Author: Kat Donnelly, EMpowerDevices; Kerry O'Neill, Earth Markets
    Publication Date: 2012

    Connecticut's Neighbor to Neighbor Energy Challenge uses dashboards that display key project data for administrators and contractors to monitor progress over time. The program has evaluated performance at different steps in the process and identified strategies to improve performance where needed, such as sales training for contractors, energy advisors, monthly contractor scorecards, and multiple customer "touches." These improvements increased the close rate from 26 to 60 percent in one year.

  6. Author: Elizabeth Babcock, City and County of Denver, Colorado
    Publication Date: 2012

    This presentation highlights key plan elements that helped a Denver energy efficiency program reorient toward success.

  7. Author: Melissa Glickman, Boulder County, Colorado (now EnergySmart)
    Publication Date: 2012

    EnergySmart Colorado uses surveys and a customer database to get feedback from homeowners that helps fine-tune program services and operations.

  8. Author: Betsy Kleinfelder, The Sustainability Institute
    Publication Date: 2012

    As part of its "intentional learning" process, Charleston WISE collects information from homeowners that helps the program systematically test assumptions and implement continuous improvement.

  9. Author: Mary Templeton, Michigan Saves; George Clark, Energy Efficiency Contractor
    Publication Date: 2014

    Overview of Michigan Saves' employer outreach initiative to drive uptake of home energy upgrades.

Program Materials

  1. Author: Wisconsin Energy Conservation Corporation
    Publication Date: 2011

    An example process evaluation plan from the Me2 and Green Madison programs for conducting an in-depth investigation and assessment of the major program areas.

  2. Author: RePower Bainbridge
    Publication Date: 2012

    Homeowner data collection survey created by RePower.

  3. Author: Washington State University Energy Program
    Publication Date: 2012

    This mid-program evaluation includes extensive analysis of program sectors, including results of surveys of participants, and summarizes lessons learned to date.

  4. Author: Community Power Works
    Publication Date: 2015

    The Community Power Works program in Seattle, WA uses a program dashboard to track progress against targets. This is an example dashboard from March 2015, which is updated on a monthly basis with progress toward goals for sign-ups, energy audits, home energy upgrades, and energy savings.

Toolbox

The following resources are available to help design, implement, and evaluate possible activities related to this handbook. These resources include templates and forms, as well as tools and calculators. The U.S. Department of Energy does not endorse these materials.

Templates & Forms

  1. Author: U.S. Department of Energy
    Publication Date: 2011

    This simple paper-based dashboard template is a useful tool for developing internal and external reports that communicate key program activities and accomplishments on a regular basis.

  2. Author: Connecticut Neighbor to Neighbor Energy Challenge
    Publication Date: 2011

    Short survey for Connecticut's Neighbor to Neighbor Energy Challenge workshop participants. The workshop allowed the program to share its energy efficiency offerings with homeowners.

  3. Author: U.S. Department of Energy
    Publication Date: 2011

    Sample phone survey template for program contractors.

  4. Author: U.S. Department of Energy
    Publication Date: 2011

    This sample email survey template, created by the Better Buildings Neighborhood Program, was designed for programs to develop their own survey of successful program participants in order to assess customer experience.

  5. Author: U.S. Department of Energy
    Publication Date: 2011

    This sample phone survey template for program drop-outs, created by the Better Buildings Neighborhood Program, was designed for programs to find out why applicants that applied to participate in a program ultimately dropped out.

  6. Author: U.S. Department of Energy
    Publication Date: 2011

    This sample phone survey template, created by the Better Buildings Neighborhood Program, was designed for programs to use with applicants who have been screened out from participating in a program.

  7. Author: Los Angeles County, California
    Publication Date: 2010

    Sample script Los Angeles County used to survey homeowners about energy issues.

  8. Author: U.S. Department of Energy
    Publication Date: 2011

    This document provides a menu of initial questions for a program administrator or implementer to build on and use in developing a real-time evaluation survey to collect qualitative data from program participants.

  9. Author: U.S. Department of Energy
    Publication Date: 2011

    This document provides a menu of initial questions for a program administrator or implementer to build on and use in developing a real-time evaluation survey to collect qualitative data from contractors.

Tools & Calculators

None available at this time.

Topical Resources

Topical Presentations

  1. Author: Patrick Roche, Conservation Services Group
    Publication Date: 2012

    Presentation describing how Conservation Services Group uses data to monitor market transformation and for internal QA/QC purposes.

  2. Author: Jane Peters, Research Into Action, Inc.
    Publication Date: 2011

    This presentation describes steps programs can take to obtain useful feedback from customers regarding their programs.

  3. Author: Jane Peters, Research Into Action, Inc.
    Publication Date: 2010

    This presentation covers the importance of collecting and evaluating program data, including data related to marketing efforts.

Publications

  1. Author: U.S. Department of Energy
    Publication Date: 2006

    This guide details and explains the five types of general program evaluations and provides guidance on selecting the type of evaluation suited to the program to be evaluated, given the type of information required and budget limitations. It is intended for use by managers of both deployment and R&D programs within the U.S. Department of Energy's Office of Energy Efficiency and Renewable Energy (EERE), although most of the examples of evaluations pertain to deployment programs.

  2. Author: Lawrence Berkeley National Laboratory
    Publication Date: 2013

    This policy brief presents a program typology and standardized data metrics for assessing energy efficiency program characteristics, costs, and impacts. Based on a review of nationwide regulatory filings, the research is part of an effort to analyze the cost per unit of savings for utility customer-funded energy efficiency programs. The paper discusses the program categories and definitions, which are based primarily on a review of several years of annual energy efficiency reports from 108 program administrators in 31 states, covering approximately 1,900 unique programs.

  3. Author: Consortium for Energy Efficiency
    Publication Date: 2010

    This guide provides background on the home improvement market in the U.S. and Canada and end users and systems in existing homes, as well as a description of energy efficiency program approaches and strategies.

  4. Author: Oak Ridge National Laboratory
    Publication Date: 2011

    The Residential Retrofit Program Design Guide focuses on the key elements and design characteristics of building and maintaining a successful residential energy upgrade program. The material is presented as a guide for program design and planning from start to finish, laid out in chronological order of program development.

  5. Author: Karen L. Palmer and Margaret A. Walls, Resources for the Future
    Publication Date: 2015

    This article presents the results of a household survey showing that many homeowners have never had an energy audit, and that many of those who have did not follow through with the recommended upgrades.

Webcasts

  1. Integrating Experimental Design Into Your Program
    Author: Annika Todd, Lawrence Berkeley National Lab
    Publication Date: 2011
    Presentation, Media (79 MB), Transcript

    Experimental design is often used to increase certainty about the actual impacts of a program and what strategies are worth repeating going forward. This webcast reviewed some experimental design techniques and gave examples of how they might fit into your programs.

Last Updated: 02/22/2016