Team-5_Abstracts

TEAM 5
Paper ID Number: PPR-304
When Data Will Be Available: Will be available by March 2015
Has This Been Presented Before? No, it has not been and will not be presented prior to the conference
Agree to Publish: Yes I agree
Best Evaluation Area: Impacts
Best Fit: Information/Education
Best Focus: Findings
Abstract Title: Evaluating Strategic Energy Management: Guidelines and Lessons Learned from the Pacific Northwest
Efficiency programs focused on changing behavior and organizational practices are growing in both the residential and non-residential sectors. Different flavors of Strategic Energy Management (SEM) for
industrial customers have been implemented by a program administrator in the Pacific Northwest since 2009. In that time, SEM has grown tremendously to include over 100 industrial sites, and savings have increased
accordingly; the industrial program derived 28% of electric savings and 8% of gas savings from SEM in 2013.
The first impact evaluation of SEM projects was completed as part of a program impact evaluation covering the
2009-2011 program years. This evaluation looked at 15 SEM projects, and found a very high realization rate
(105%). However, the process of evaluating SEM projects revealed a number of challenges. The most
fundamental challenge was a lack of clarity and consensus around how SEM projects would be evaluated, as no
SEM evaluation guidelines currently exist. Given the growth in non-residential SEM, the program
administrator’s evaluation team held two workshops focused on evaluation of SEM. Each workshop was led by a
different evaluation consulting firm with experience evaluating SEM, and the workshops involved 28 program
designers, implementers, and evaluators. The goal of the workshops was to discuss issues with evaluating SEM
and how best to address them, focusing on identifying areas of agreement and any outstanding issues requiring
additional discussion or research. The workshops resulted in a series of guidelines for evaluating SEM, a list of
outstanding issues related to evaluating SEM, and a roadmap for addressing these issues. These guidelines are
being used and tested as part of an impact evaluation of the program administrator’s 2012 industrial efficiency
program, which is set to be complete in March 2015. This evaluation includes a significant emphasis on SEM,
and a key goal is assessing the savings function of SEM, i.e., how savings change over time. This paper will
first summarize challenges associated with evaluating SEM gleaned from past evaluations. Then, we will outline
the SEM evaluation guidelines, outstanding issues, and roadmap resulting from the workshops. Finally, we will
share results and lessons learned from the SEM portion of the 2012 impact evaluation.
2015 International Energy Program Evaluation Conference
Review Sheet for Paper Proposals
Paper Number/Title: 304 – Evaluating Strategic Energy Management: Guidelines and Lessons Learned from the Pacific Northwest
Recommendation: Accept / Possibly Accept / Reject (Rejection reason required to advise author – see below)
Reviewer:
RELEVANCE
The topic of the abstract must match the focus of the Conference. The Conference serves its participants as a
forum for the presentation and discussion of important evaluation results, issues, methodologies,
implementation, techniques or applications as they apply to conference topics.
ORIGINALITY
The abstract should reflect original work done by the author and be something new to the field. We are about
measurement for impacts, not descriptions about worthy programs—How do we know we got the savings or
changes etc.
SIGNIFICANCE
The abstract must represent a significant contribution to the field.
Is it a new and original contribution?
Are the conclusions sound and justified?
Is it clearly presented and well organized?
Do you believe that results will be ready by the
time of the conference?
Comments for Committee Review and to Advise Authors:
A. Reasons to accept or possibly accept: (Why you think this abstract should be presented and any major problems that need resolution, etc.)
B. Reasons to reject: (One or two sentences to explain why this proposal should not be accepted—e.g., outside
of conference topics, results may/will not be ready before May, not new or of significant interest,
questionable methodology, not enough information to make a determination, etc.) Reasons will be emailed
to the author.
Comments for Author and Session Moderator:
(Suggestions for writing or focusing the paper, etc.)
TEAM 5
Paper ID Number: PPR-308
When Data Will Be Available: Is available now
Has This Been Presented Before? No, it has not been and will not be presented prior to the conference
Agree to Publish: Yes I agree
Best Evaluation Area: Behavior change
Best Fit: Information/Education
Best Focus: Findings
Abstract Title: Connecting Behavior Change to Energy Savings through Big Data
This paper will explore the ways in which Connected technology and big data can open new opportunities to achieve behavior-based energy savings, as well as how these technologies can assist in the evaluation of behavioral efforts and the attribution of energy savings to behavior. As energy efficiency programs apply behavioral approaches to increase savings, the use of two-way communication technologies has emerged as a valuable tool for achieving this goal. We propose a paper and presentation that would highlight innovative ways program administrators are leveraging new technologies and data opportunities both to change energy use behavior and to quantifiably demonstrate that energy savings have occurred as a result. The paper and
presentation would focus on two different pilots currently underway among program administrators in the U.S.
and Canada. The pilots featured facilitate two-way interaction via a variety of different technologies including (1)
smart phone apps that encourage remote adjustment of thermostats, (2) web portals that provide detailed
energy-use information and compare customers’ usage to that of others, and (3) real-time feedback provided
through displays in customers’ homes or businesses. These pilots provide actionable information in a way
informed by social science research in order to encourage customer engagement and reduce energy
consumption. The focus would be on how these technologies can change behavior and the methods program
administrators have used to quantify and evaluate these impacts. The paper would shed light on some of the
doors that Connected technologies and big data can open in terms of facilitating behavioral program evaluation,
as well as the lessons that have been learned to date from pilots employing these approaches.
2015 International Energy Program Evaluation Conference
Review Sheet for Paper Proposals
Paper Number/Title: 308 – Connecting Behavior Change to Energy Savings through Big Data
Recommendation: Accept / Possibly Accept / Reject (Rejection reason required to advise author – see below)
Reviewer:
RELEVANCE
The topic of the abstract must match the focus of the Conference. The Conference serves its participants as a
forum for the presentation and discussion of important evaluation results, issues, methodologies,
implementation, techniques or applications as they apply to conference topics.
ORIGINALITY
The abstract should reflect original work done by the author and be something new to the field. We are about
measurement for impacts, not descriptions about worthy programs—How do we know we got the savings or
changes etc.
SIGNIFICANCE
The abstract must represent a significant contribution to the field.
Is it a new and original contribution?
Are the conclusions sound and justified?
Is it clearly presented and well organized?
Do you believe that results will be ready by the
time of the conference?
Comments for Committee Review and to Advise Authors:
C. Reasons to accept or possibly accept: (Why you think this abstract should be presented and any major problems that need resolution, etc.)
D. Reasons to reject: (One or two sentences to explain why this proposal should not be accepted—e.g., outside
of conference topics, results may/will not be ready before May, not new or of significant interest,
questionable methodology, not enough information to make a determination, etc.) Reasons will be emailed
to the author.
Comments for Author and Session Moderator:
(Suggestions for writing or focusing the paper, etc.)
TEAM 5
Paper ID Number: PPR-316
When Data Will Be Available: Is available now
Has This Been Presented Before? No, it has not been and will not be presented prior to the conference
Agree to Publish: Yes I agree
Best Evaluation Area: Impacts
Best Fit: Information/Education
Best Focus: Methods
Abstract Title: How Much Fire in the Hole: Commercial Gas EFLH on the Cheap
Objective: Programs nationwide struggle each year to estimate how much and at what output their rebated
commercial natural gas heating equipment runs. While direct metering remains the Holy Grail, it also remains
out of reach due to the complexity and expense of installing gas metering equipment. Often, estimates rely on survey data, which are not reliable. This paper will present a template for an evaluation approach that could
finally produce accurate and affordable EFLH estimates by climate zone for commercial natural gas heating
equipment. Results: For many years, cold-weather utilities struggled to find an accurate and inexpensive
method for estimating equivalent full load hours (EFLH) for commercial natural gas heating equipment. The
challenges include the following:
· Design redundancy and multiple-unit configurations,
· Oversizing,
· The expense of sub-metering natural gas,
· The difficulty of isolating single heating systems for billing analysis in buildings with multiple end uses.
Recently, the author performed a commercial gas program
evaluation for a cold climate utility. The evaluation used on-site inspections to evaluate two key inputs:
operational efficiency and the percentage of heating load used by the program-rebated measures. We then
paired this on-site data collection effort with results from site-level billing analysis to produce estimates of both
savings and EFLH. This effort produced promising and reasonable EFLH estimates. Performed with one utility
and a small sample, this effort represents a template for an evaluation approach which – over time and with
moderate improvements – could produce increasingly accurate and affordable EFLH estimates by climate zone
for commercial natural gas heating equipment. Worthiness: With this evaluation, we begin to overcome the
significant barriers to determining accurate commercial gas heating EFLH values. With the help of utility
program staff, we will discuss and debate the various approaches we considered to overcome these barriers and
explain the compromise between accuracy and affordability that was ultimately agreed upon.
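For readers unfamiliar with the metric, the arithmetic behind EFLH is simple once the billing analysis and on-site data are in hand. The sketch below is a hypothetical illustration, not the author's method: it assumes the billing analysis has already produced a weather-normalized annual heating fuel figure, and that the on-site inspection supplied the share of heating load served by the rebated unit.

```python
# Hypothetical illustration of the EFLH arithmetic (not the author's method).
# Assumes billing analysis already yielded weather-normalized annual heating
# therms, and the on-site inspection yielded the rebated unit's load share.

BTU_PER_THERM = 100_000

def eflh(annual_heating_therms: float, rated_input_btuh: float,
         heating_load_share: float = 1.0) -> float:
    """EFLH = annual fuel input attributable to the unit / rated input capacity."""
    annual_input_btu = annual_heating_therms * BTU_PER_THERM * heating_load_share
    return annual_input_btu / rated_input_btuh

# Example: 3,000 therms/yr of heating, a 200,000 Btu/h furnace serving 80% of
# the load -> 3,000 * 100,000 * 0.8 / 200,000 = 1,200 hours.
print(eflh(3_000, 200_000, 0.8))  # 1200.0
```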
2015 International Energy Program Evaluation Conference
Review Sheet for Paper Proposals
Paper Number/Title: 316 – How Much Fire in the Hole: Commercial Gas EFLH on the Cheap
Recommendation: Accept / Possibly Accept / Reject (Rejection reason required to advise author – see below)
Reviewer:
RELEVANCE
The topic of the abstract must match the focus of the Conference. The Conference serves its participants as a
forum for the presentation and discussion of important evaluation results, issues, methodologies,
implementation, techniques or applications as they apply to conference topics.
ORIGINALITY
The abstract should reflect original work done by the author and be something new to the field. We are about
measurement for impacts, not descriptions about worthy programs—How do we know we got the savings or
changes etc.
SIGNIFICANCE
The abstract must represent a significant contribution to the field.
Is it a new and original contribution?
Are the conclusions sound and justified?
Is it clearly presented and well organized?
Do you believe that results will be ready by the
time of the conference?
Comments for Committee Review and to Advise Authors:
E. Reasons to accept or possibly accept: (Why you think this abstract should be presented and any major problems that need resolution, etc.)
F. Reasons to reject: (One or two sentences to explain why this proposal should not be accepted—e.g., outside
of conference topics, results may/will not be ready before May, not new or of significant interest,
questionable methodology, not enough information to make a determination, etc.) Reasons will be emailed
to the author.
Comments for Author and Session Moderator:
(Suggestions for writing or focusing the paper, etc.)
TEAM 5
Paper ID Number: PPR-319
When Data Will Be Available: Will be available no later than April 15, 2015
Has This Been Presented Before? No, it has not been and will not be presented prior to the conference
Agree to Publish: Yes I agree
Best Evaluation Area: Behavior change
Best Fit: Rates/Pricing
Best Focus: Findings
Abstract Title: The Impact of Time-of-Use Rates on Residential Customers in Ontario, Canada
Impact Evaluation of Ontario's Full-Scale Time-of-Use Program
The power market in Ontario features retail competition, but some 90 percent of residential customers have elected to stay with the regulated price option. Aside from Italy, the Canadian province of Ontario is the only jurisdiction in the world to roll out smart
meters to all its residential customers and to deploy Time-of-Use (TOU) rates for generation charges to those
customers who stay with the regulated supply option. TOU rates were deployed as a load shifting measure in
Ontario, to incentivize customers to curtail electricity usage during the peak period and/or to shift that usage to
less expensive mid-peak and off-peak periods, and possibly to reduce overall electricity usage. This paper
reports on an impact evaluation of Ontario’s full-scale roll-out of TOU rates. This is a three-year project with the
following objectives: (i) Quantify the change in energy usage by pricing period for the residential and general
service customers using a few carefully chosen local distribution companies (LDCs); (ii) Estimate the peak period
impacts using the OPA’s definition of summer peak demand; (iii) Estimate the elasticity of substitution between
the pricing periods and the overall price elasticity of demand. This paper presents the findings from the second
year of the study, examining impacts of TOU rates from their inception through the end of 2013.
1. Impacts are estimated for four regions within Ontario. Each region has a distinctive climate and census profile.
2. Impacts are allowed to vary by socio-demographic factors corresponding to census districts.
3. Regional impacts are calculated to represent the corresponding regional populations, by weighting the impacts by regional customer count shares.
4. Several LDCs are included in the study.
We employ a two-pronged approach in the econometric analysis. First, we estimated an
advanced model of consumer behavior called the “xxxx system” to discern load shifting effects that are triggered
by the TOU rates and to estimate inter-period elasticities of substitution. Second, we estimated a simple
monthly consumption model to understand the overall conservation behavior of the customers and estimate an
overall price elasticity of demand. By using the parameter estimates from these two models and solving them
together, we calculated the impact that TOU rates have had on energy consumption by period and for the
month as a whole. Load shifting impacts are split into three separate periods: pre-2012, 2012, and 2013. The pre-2012 period reflects all of the years that LDCs within a region were on TOU rates prior to 2012. Some LDCs started TOU as early as 2009, while others only began in 2012, resulting in compositional changes potentially affecting the comparison between pre-2012 and later years. We find that residential customers
show relatively consistent patterns of load shifting behavior across regions and study years. The load shifting
model parameters are generally well-behaved and have magnitudes that have been observed in other pilots. We
do not find any evidence of energy conservation.
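As an illustration of how the two estimated models can be combined, the sketch below uses a simple constant-elasticity approximation. The actual study estimates a full demand system (the anonymized "xxxx system") plus a monthly consumption model; the function and all numbers here are hypothetical assumptions, not study results.

```python
import math

def tou_impacts(price_ratio_new: float, price_ratio_old: float,
                substitution_elasticity: float,
                overall_price_elasticity: float,
                avg_price_change_pct: float) -> dict:
    # Load shifting: the % change in the peak/off-peak usage ratio responds to
    # the % change in the peak/off-peak price ratio via the substitution
    # elasticity (log-change approximation).
    price_ratio_change = math.log(price_ratio_new / price_ratio_old)
    usage_shift_pct = -substitution_elasticity * price_ratio_change * 100
    # Conservation: % change in total monthly usage from the overall elasticity.
    total_change_pct = overall_price_elasticity * avg_price_change_pct
    return {"peak_vs_offpeak_usage_shift_pct": round(usage_shift_pct, 2),
            "total_usage_change_pct": round(total_change_pct, 2)}

# Example assumptions: peak/off-peak price ratio rises from 1.0 to 1.8,
# substitution elasticity 0.05, overall price elasticity -0.02, prices up 5%.
print(tou_impacts(1.8, 1.0, 0.05, -0.02, 5.0))
```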
2015 International Energy Program Evaluation Conference
Review Sheet for Paper Proposals
Paper Number/Title: 319 – The Impact of Time-of-Use Rates on Residential Customers in Ontario, Canada
Recommendation: Accept / Possibly Accept / Reject (Rejection reason required to advise author – see below)
Reviewer:
RELEVANCE
The topic of the abstract must match the focus of the Conference. The Conference serves its participants as a
forum for the presentation and discussion of important evaluation results, issues, methodologies,
implementation, techniques or applications as they apply to conference topics.
ORIGINALITY
The abstract should reflect original work done by the author and be something new to the field. We are about
measurement for impacts, not descriptions about worthy programs—How do we know we got the savings or
changes etc.
SIGNIFICANCE
The abstract must represent a significant contribution to the field.
Is it a new and original contribution?
Are the conclusions sound and justified?
Is it clearly presented and well organized?
Do you believe that results will be ready by the
time of the conference?
Comments for Committee Review and to Advise Authors:
G. Reasons to accept or possibly accept: (Why you think this abstract should be presented and any major problems that need resolution, etc.)
H. Reasons to reject: (One or two sentences to explain why this proposal should not be accepted—e.g., outside
of conference topics, results may/will not be ready before May, not new or of significant interest,
questionable methodology, not enough information to make a determination, etc.) Reasons will be emailed
to the author.
Comments for Author and Session Moderator:
(Suggestions for writing or focusing the paper, etc.)
TEAM 5
Paper ID Number: PPR-322
When Data Will Be Available: Is available now
Has This Been Presented Before? No, it has not been and will not be presented prior to the conference
Agree to Publish: Yes I agree
Best Evaluation Area: Behavior change
Best Fit: Integrated policy/planning
Best Focus: Policy
Abstract Title: 10 (Electricity)-10 (Water)-10 (Gas) Multi-family Competition and ES Portfolio Manager Benchmarking Pilot Program Design
California has a long history of implementing successful energy efficiency and IDSM programs and policies. To address the state's critical water shortage and drought conditions, there is an urgent need to design and deliver innovative, integrated IDSM programs that produce energy and resource savings for electricity, water, and gas. The 10 (Electricity) – 10 (Water) – 10 (Gas) Multi-family (MF) Competition and Benchmarking Pilot program is designed with a multi-year, multi-phased approach to engage MF complexes in reducing energy and water usage by 10%. The utility program team understands that a 10% behavior-only reduction in electricity usage (i.e., without plug load or appliance upgrades) may be difficult to achieve, but a 10% reduction in water usage should be easily achievable. This pilot program is designed to engage multi-family complexes (i.e., a combination of rental dwelling units and common areas) to reduce usage of electricity, gas, and water by 10% from existing usage within a 12-month period. The pilot will work through apartment associations and property owners/managers and use common areas to engage individual renters, employing tactics such as common-area signage and door hangers to communicate and rally complex-wide engagement and support. Rewards can be offered to the MF community to motivate engagement and the desired outcome. Ongoing feedback from the MF Energy Star Portfolio Manager can reinforce the desired behavior.
This pilot will engage competitive behavior at three or more levels by utilizing the capabilities of the MF Energy Star Portfolio Manager software:
· MF complex-wide self-competition (i.e., % reduction from all dwelling and common-area meters combined),
· MF complex-to-complex competition (i.e., apartment-A competes with apartment-B in the same city or in different cities),
· City-to-city competition, enabled by grouping the participating MF complexes together in MF Energy Star Portfolio Manager,
· Aggregation of additional city groups into an even bigger territory.
In summary, this innovative pilot program design uses the best available behavior theory, benchmarking, and software capabilities to motivate competitive behavior at the multi-family complex level. Because the pilot cleverly uses the resources available in the DOE's recently released MF Energy Star Portfolio Manager, no additional investment is needed to develop complicated software to support MF program implementation and scalability. The MF market barriers are well documented,
especially the split incentive barrier between property owners and renters. By engaging the apartment complex as a whole,
this program design will be able to engage both property owners and renters to collectively achieve energy efficiency and
conservation goals.
2015 International Energy Program Evaluation Conference
Review Sheet for Paper Proposals
Paper Number/Title: 322 – 10 (Electricity)-10 (Water)-10 (Gas) Multi-family Competition and ES Portfolio Manager Benchmarking Pilot Program Design
Recommendation: Accept / Possibly Accept / Reject (Rejection reason required to advise author – see below)
Reviewer:
RELEVANCE
The topic of the abstract must match the focus of the Conference. The Conference serves its participants as a
forum for the presentation and discussion of important evaluation results, issues, methodologies,
implementation, techniques or applications as they apply to conference topics.
ORIGINALITY
The abstract should reflect original work done by the author and be something new to the field. We are about
measurement for impacts, not descriptions about worthy programs—How do we know we got the savings or
changes etc.
SIGNIFICANCE
The abstract must represent a significant contribution to the field.
Is it a new and original contribution?
Are the conclusions sound and justified?
Is it clearly presented and well organized?
Do you believe that results will be ready by the
time of the conference?
Comments for Committee Review and to Advise Authors:
I. Reasons to accept or possibly accept: (Why you think this abstract should be presented and any major problems that need resolution, etc.)
J. Reasons to reject: (One or two sentences to explain why this proposal should not be accepted—e.g., outside
of conference topics, results may/will not be ready before May, not new or of significant interest,
questionable methodology, not enough information to make a determination, etc.) Reasons will be emailed
to the author.
Comments for Author and Session Moderator:
(Suggestions for writing or focusing the paper, etc.)
TEAM 5
Paper ID Number: PPR-328
When Data Will Be Available: Is available now
Has This Been Presented Before? Yes, it has been presented – please contact me
Agree to Publish: Yes I agree
Best Evaluation Area: Behavior change
Best Fit: Information/Education
Best Focus: Findings
Abstract Title: Long-Run Savings of Home Energy Reports Programs
Home energy report (HER) programs are a cornerstone of many utility energy efficiency portfolios. These
programs involve sending electronic or paper reports to residential customers, educating them about their
energy use and encouraging them to conserve electricity or natural gas. Dozens of utilities in the United States
send energy reports to their residential customers, and millions of utility customers receive these reports.
Recently, utilities have begun launching energy report programs aimed at commercial customers. Since
utilities launched the first large-scale HER programs in 2008, the utility industry has collected considerable
evidence about the savings gained through these programs. Impact studies of one vendor’s programs (XXX)
revealed that HERs typically resulted in average electricity savings between 1.5% and 2.5% of energy use during
the first and second program years. Now that many utility HER programs have been implemented for several
years, we can assess savings over a longer term. In particular, we reviewed studies of mature HER programs—
those running for three or more years—to evaluate the industry knowledge about HER savings, both while
homes continue to receive reports and (for several utilities) after homes have stopped receiving them. This
white paper addresses three primary questions about electricity savings from longer-running HER programs and
savings after the end of treatment:
1. How do HER programs perform over time, and how does the program design (e.g., frequency of report delivery) affect savings?
2. What happens to savings when the program administrator stops sending HERs? In particular, do savings decay and, if so, how quickly? What effects result from continuing to send HERs?
3. How does the persistence or decay of HER savings after treatment ends affect program savings, measure life calculations, and cost-effectiveness?
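Question 3 turns on how quickly savings decay once reports stop. A minimal sketch of that calculation, assuming a constant geometric decay rate (an assumption for illustration, not a finding of the studies reviewed), shows how a decay rate maps to an effective measure life:

```python
# Assumed constant geometric decay of post-treatment savings (illustrative only).

def effective_measure_life(annual_decay_rate: float, horizon_years: int = 20) -> float:
    """Sum over the horizon of each year's savings as a fraction of the
    savings in the last treated year."""
    return sum((1.0 - annual_decay_rate) ** t for t in range(horizon_years))

# Example: savings decaying 20%/yr after reports stop imply roughly
# 1 / 0.20 = 5 effective full years of savings for measure-life purposes.
print(round(effective_measure_life(0.20), 1))  # ~4.9
```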
2015 International Energy Program Evaluation Conference
Review Sheet for Paper Proposals
Paper Number/Title: 328 – Long-Run Savings of Home Energy Reports Programs
Recommendation: Accept / Possibly Accept / Reject (Rejection reason required to advise author – see below)
Reviewer:
RELEVANCE
The topic of the abstract must match the focus of the Conference. The Conference serves its participants as a
forum for the presentation and discussion of important evaluation results, issues, methodologies,
implementation, techniques or applications as they apply to conference topics.
ORIGINALITY
The abstract should reflect original work done by the author and be something new to the field. We are about
measurement for impacts, not descriptions about worthy programs—How do we know we got the savings or
changes etc.
SIGNIFICANCE
The abstract must represent a significant contribution to the field.
Is it a new and original contribution?
Are the conclusions sound and justified?
Is it clearly presented and well organized?
Do you believe that results will be ready by the
time of the conference?
Comments for Committee Review and to Advise Authors:
K. Reasons to accept or possibly accept: (Why you think this abstract should be presented and any major problems that need resolution, etc.)
L. Reasons to reject: (One or two sentences to explain why this proposal should not be accepted—e.g., outside
of conference topics, results may/will not be ready before May, not new or of significant interest,
questionable methodology, not enough information to make a determination, etc.) Reasons will be emailed
to the author.
Comments for Author and Session Moderator:
(Suggestions for writing or focusing the paper, etc.)
TEAM 5
Paper ID Number: PPR-331
When Data Will Be Available: Is available now
Has This Been Presented Before? No, it has not been and will not be presented prior to the conference
Agree to Publish: Yes I agree
Best Evaluation Area: Behavior change
Best Fit: Information/Education
Best Focus: Findings
Abstract Title: Estimated impacts of installation best practices on the realized energy efficiency of
residential AC systems: a before and after test of the effectiveness of training in improving energy
efficiency
Industry experts estimate that as much as 30% of the energy efficiency of central air conditioning systems can be lost due to poor installation practices related to specification of system size, coil matching, air flow, duct design and sealing, refrigerant charge, and commissioning. The exact efficiency potential lost to poor installation practices is unknown at present but believed to be significant. As part of its Capability Building initiative (in which training is supplied to various trades operating in the energy efficiency space), The Authority sponsored an HVAC contractor training course in cooperation with the Heating, Refrigeration and Air Conditioning Institute (HRAI). The course explained how to implement best practices in AC system replacement and emphasized the need to follow best practices to provide comfort and economic performance to customers. The course was offered over a two-year period and trained thousands of AC technicians across the
Province. To assess the magnitude of AC efficiency lost due to installation practices, and the improvement
obtained through its training program, The Authority commissioned a study of the energy efficiency of systems
installed by technicians before and after they were trained. One hundred technicians were selected for the study. Then, using records provided by The Authority, customers whose air conditioning systems were replaced
before and after each technician was trained were recruited into the study sample. Customers received an
incentive to allow an independent engineer to inspect their system and install monitoring equipment needed to
measure the energy efficiency ratio (EER) of each system. The measurements required to estimate EER
included: temperature and relative humidity on the supply and return side of the cooling coil, air flow
measurements for all fan speeds, energy use of the compressor and blower motors at all common fan speeds
and temperature and relative humidity outdoors. The measurement system was left in place for at least 30 days
to collect the information required to reliably calculate the EER of each unit. This paper will provide an in-depth
discussion of the methods and results of the study including a comparison of the EER of installed units with
factory specified values and a comparison of the EER of units installed by technicians before and after they have
been trained.
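The EER calculation implied by this measurement list is straightforward: cooling output computed from airflow and the enthalpy drop across the coil, divided by electrical input. The sketch below is illustrative only; enthalpies are taken as inputs (in practice they come from psychrometric calculations on the measured temperatures and humidities), and all example values are invented.

```python
# Illustrative EER arithmetic; enthalpies are inputs here (in practice derived
# psychrometrically from the measured temperatures and humidities). The 4.5
# factor is the standard air constant (60 min/h x ~0.075 lb/ft^3, sea level).

def eer(airflow_cfm: float, h_return_btu_lb: float, h_supply_btu_lb: float,
        compressor_watts: float, blower_watts: float) -> float:
    """EER = cooling output (Btu/h) / electrical input (W)."""
    cooling_btuh = 4.5 * airflow_cfm * (h_return_btu_lb - h_supply_btu_lb)
    return cooling_btuh / (compressor_watts + blower_watts)

# Example: 1,200 CFM, 6 Btu/lb enthalpy drop, 2,800 W compressor + 400 W blower
# -> 4.5 * 1,200 * 6 / 3,200 = 10.1.
print(round(eer(1200, 30.0, 24.0, 2800, 400), 1))
```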
2015 International Energy Program Evaluation Conference
Review Sheet for Paper Proposals
Paper Number/Title: 331 – Estimated impacts of installation best practices on the realized energy efficiency of residential AC systems: a before and after test of the effectiveness of training in improving energy efficiency
Recommendation: Accept / Possibly Accept / Reject (Rejection reason required to advise author – see below)
Reviewer:
RELEVANCE
The topic of the abstract must match the focus of the Conference. The Conference serves its participants as a
forum for the presentation and discussion of important evaluation results, issues, methodologies,
implementation, techniques or applications as they apply to conference topics.
ORIGINALITY
The abstract should reflect original work done by the author and be something new to the field. We are about
measurement for impacts, not descriptions about worthy programs—How do we know we got the savings or
changes etc.
SIGNIFICANCE
The abstract must represent a significant contribution to the field.
Is it a new and original contribution?
Are the conclusions sound and justified?
Is it clearly presented and well organized?
Do you believe that results will be ready by the
time of the conference?
Comments for Committee Review and to Advise Authors:
M. Reasons to accept or possibly accept: (Why you think this abstract should be presented and any major problems that need resolution, etc.)
N. Reasons to reject: (One or two sentences to explain why this proposal should not be accepted—e.g., outside
of conference topics, results may/will not be ready before May, not new or of significant interest,
questionable methodology, not enough information to make a determination, etc.) Reasons will be emailed
to the author.
Comments for Author and Session Moderator:
(Suggestions for writing or focusing the paper, etc.)
TEAM 5
Paper ID Number: PPR-332
When Data Will Be Available: Will be available no later than April 15, 2015
Has This Been Presented Before? No, it has not been and will not be presented prior to the conference
Agree to Publish: Yes I agree
Best Evaluation Area: Behavior change
Best Fit: Information/Education
Best Focus: Findings
Abstract Title: Assessing the Potential of Social Networks as a Means for Information Diffusion - the
Weatherization Experiences (WE) Project
A research team recently conducted a national evaluation of the Weatherization Assistance Program, as tasked by the
Department of Energy. A component of the evaluation, a Social Network Analysis (“Weatherization Experiences
Project”) addressed the potential for two groups, Program recipients and weatherization staff, to influence
energy savings beyond their homes and day jobs as members of social networks. This analysis explored linkages
between individual households, weatherization staff and agencies as nodes within a multi-relational social
system. The project goals were to: 1) explore impacts of communication from a trusted source on program utilization, household energy-consuming behavior, and investment in energy-efficiency measures; and 2) explore the feasibility of participatory research techniques through structured interviews administered by Program recipients and weatherization staff. The five main questions the project sought to answer are: 1) Who did you tell? 2) What did you say? 3) What did they hear? 4) What did they do? 5) Why? This reading of the community helps us understand: what type of information is being shared (i.e., health and
safety benefits, energy savings, cost savings); what core values are in place that might support or hinder
adoption of new energy usage behaviors; and the motivating factors for taking action based on information received from a trusted source. The study was fairly extensive: 85 interviewers participated, resulting in 777 completed interviews. Both the quantitative and qualitative findings revealed by this study will be presented.
2015 International Energy Program Evaluation Conference
Review Sheet for Paper Proposals
Paper Number/Title: 332 – Assessing the Potential of Social Networks as a Means for Information Diffusion - the Weatherization Experiences (WE) Project
Recommendation: Accept / Possibly Accept / Reject (Rejection reason required to advise author – see below)
Reviewer:
RELEVANCE
The topic of the abstract must match the focus of the Conference. The Conference serves its participants as a
forum for the presentation and discussion of important evaluation results, issues, methodologies,
implementation, techniques or applications as they apply to conference topics.
ORIGINALITY
The abstract should reflect original work done by the author and be something new to the field. We are about
measurement for impacts, not descriptions about worthy programs—How do we know we got the savings or
changes etc.
SIGNIFICANCE
The abstract must represent a significant contribution to the field.
Is it a new and original contribution?
Are the conclusions sound and justified?
Is it clearly presented and well organized?
Do you believe that results will be ready by the
time of the conference?
Comments for Committee Review and to Advise Authors:
O. Reasons to accept or possibly accept: (Why you think this abstract should be presented and any major problems that need resolution, etc.)
P. Reasons to reject: (One or two sentences to explain why this proposal should not be accepted—e.g., outside
of conference topics, results may/will not be ready before May, not new or of significant interest,
questionable methodology, not enough information to make a determination, etc.) Reasons will be emailed
to the author.
Comments for Author and Session Moderator:
(Suggestions for writing or focusing the paper, etc.)
TEAM 5
Paper ID Number: PPR-334
When Data Will Be Available: Is available now
Has This Been Presented Before? No, it has not been and will not be presented prior to the conference
Agree to Publish: Yes I agree
Best Evaluation Area: Behavior change
Best Fit: Information/Education
Best Focus: Findings
Abstract Title: Evaluating a Behavioral Demand Response Trial in Japan: Evidence from the 1-year
Experiment
This paper presents the latest results of a trial, started in August 2013, to evaluate the impact of peak saving
interventions: (1) a tiered rate with increasing-tier prices that apply to usage during each 30-minute period, (2)
real-time feedback on electricity usage via an in-home display, (3) weekly reports that provide neighbor
comparison and peak saving information, and (4) an email alert to reduce peak usage. In particular, we
developed a prototype advice-report generating system, which analyzes the electricity data of each resident and automatically chooses the report likely to be most preferable from among the prepared templates. Each template contained four different types of modules, some visualizing electricity usage and others suggesting ways to reduce usage effectively. The four modules were ordered to construct a “story” to strengthen the impact on the awareness and behavior of the residents during peak time. Through a randomized experiment targeting almost 500 residents of a condominium in Chiba Prefecture, Japan, we found that the total average treatment effect of the four measures was around 10% during peak time, which was statistically significant. The saving effects of the treatment groups showed varying trends over time, implying changes in awareness of the interventions over time or seasonal differences in energy efficiency behaviors. In addition, we
evaluated the relationships between treatment effects and households’ demographic characteristics. The
tendencies observed through the analysis were beneficial in terms of targeting households with higher
treatment effects for the interventions, leading to improved trial cost effectiveness. Furthermore, the results
from analyzing air conditioner usage and questionnaire survey data suggested that roughly half of the peak saving is likely to come from reduced air conditioner usage, for instance, by adjusting temperature settings.
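For readers who want the core of the estimator, the sketch below shows the simplest version of the average-treatment-effect comparison behind a trial like this one, with invented data. The authors' actual analysis is surely richer (e.g., panel regressions over 30-minute interval data); this is only the headline arithmetic.

```python
import statistics

def peak_ate_pct(treated_kwh: list[float], control_kwh: list[float]) -> float:
    """Average treatment effect on peak usage, as % of the control-group mean."""
    t_bar = statistics.mean(treated_kwh)
    c_bar = statistics.mean(control_kwh)
    return 100.0 * (t_bar - c_bar) / c_bar

treated = [1.78, 1.85, 1.72, 1.90]   # mean peak-period kWh per household (invented)
control = [2.01, 1.95, 2.10, 1.98]
print(round(peak_ate_pct(treated, control), 1))  # about -10%, as reported above
```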
2015 International Energy Program Evaluation Conference
Review Sheet for Paper Proposals
Paper Number/Title: 334 – Evaluating a Behavioral Demand Response Trial in Japan: Evidence from the 1-year Experiment
Recommendation: Accept / Possibly Accept / Reject (Rejection reason required to advise author – see below)
Reviewer:
RELEVANCE
The topic of the abstract must match the focus of the Conference. The Conference serves its participants as a
forum for the presentation and discussion of important evaluation results, issues, methodologies,
implementation, techniques or applications as they apply to conference topics.
ORIGINALITY
The abstract should reflect original work done by the author and be something new to the field. We are about
measurement for impacts, not descriptions about worthy programs—How do we know we got the savings or
changes etc.
SIGNIFICANCE
The abstract must represent a significant contribution to the field.
Is it a new and original contribution?
Are the conclusions sound and justified?
Is it clearly presented and well organized?
Do you believe that results will be ready by the
time of the conference?
Comments for Committee Review and to Advise Authors:
Q. Reasons to accept or possibly accept: (Why you think this abstract should be presented and any major problems that need resolution, etc.)
R. Reasons to reject: (One or two sentences to explain why this proposal should not be accepted—e.g., outside
of conference topics, results may/will not be ready before May, not new or of significant interest,
questionable methodology, not enough information to make a determination, etc.) Reasons will be emailed
to the author.
Comments for Author and Session Moderator:
(Suggestions for writing or focusing the paper, etc.)
TEAM 5
Paper ID Number: PPR-338
When Data Will Be Available: Will be available no later than April 15, 2015
Has This Been Presented Before? No, it has not been and will not be presented prior to the conference
Agree to Publish: Yes I agree
Best Evaluation Area: Impacts
Best Fit: Information/Education
Best Focus: Findings
Abstract Title: Leaving the Rearview Mirror Behind: Assessing the Effectiveness of a Concurrent
Impact Evaluation Process
A traditional post-installation impact evaluation of industrial energy efficiency programs is conducted multiple
years after the completion of energy projects. Because this rearview mirror approach has its limitations, the
State Authority Industrial and Process Efficiency (IPE) program has implemented a unique concurrent evaluation
process in which impact evaluators work side by side with program implementers on the largest and most
complex projects in real time. In general, projects that are chosen for this process have large preliminary savings
and complicated baselines. This often means the selected projects have the highest risk of differences between
projected and realized savings. The concurrent review process seeks to reduce this risk by including evaluator
input throughout the project life cycle – from incentive commitment to final reported savings. Concurrent
evaluation also benefits the program and evaluation process by requiring fewer touch points for the customers
and improving engineering rigor and quality. This paper leverages the results of a recently completed postinstallation impact evaluation of the State Authority’s IPE program to assess the effectiveness of the concurrent
evaluation process at the site and program levels. It builds on previous work summarizing the early lessons
learned through the concurrent evaluation process and discusses how these changes have affected the overall
realization rate of the program as well as the accuracy of the program implementation practices. In addition, the
paper provides both implementer and evaluator perspectives on the concurrent review process itself, its
mission, and its impact on the realization of industrial energy efficiency savings in New York State.
2015 International Energy Program Evaluation Conference
Review Sheet for Paper Proposals
Paper Number/Title: 338 – Leaving the Rearview Mirror Behind: Assessing the Effectiveness of a Concurrent Impact Evaluation Process
Recommendation: Accept / Possibly Accept / Reject (Rejection reason required to advise author – see below)
Reviewer:
RELEVANCE
The topic of the abstract must match the focus of the Conference. The Conference serves its participants as a
forum for the presentation and discussion of important evaluation results, issues, methodologies,
implementation, techniques or applications as they apply to conference topics.
ORIGINALITY
The abstract should reflect original work done by the author and be something new to the field. We are about
measurement for impacts, not descriptions about worthy programs—How do we know we got the savings or
changes etc.
SIGNIFICANCE
The abstract must represent a significant contribution to the field.
Is it a new and original contribution?
Are the conclusions sound and justified?
Is it clearly presented and well organized?
Do you believe that results will be ready by the
time of the conference?
Comments for Committee Review and to Advise Authors:
S. Reasons to accept or possibly accept: (Why you think this abstract should be presented and any major problems that need resolution, etc.)
T. Reasons to reject: (One or two sentences to explain why this proposal should not be accepted—e.g., outside
of conference topics, results may/will not be ready before May, not new or of significant interest,
questionable methodology, not enough information to make a determination, etc.) Reasons will be emailed
to the author.
Comments for Author and Session Moderator:
(Suggestions for writing or focusing the paper, etc.)
TEAM 5
Paper ID Number: PPR-339
When Data Will Be Available: Will be available by March 2015
Has This Been Presented Before? No, it has not been and will not be presented prior to the conference
Agree to Publish: Yes I agree
Best Evaluation Area: Impacts
Best Fit: Home audit and weatherization program evaluation
Best Focus: Methods
Abstract Title: KPI: Not Just Key Performance Indicators but Keen Portfolio Intelligence
Evaluations routinely focus resources on estimating savings by conducting field work and collecting primary
data. Utilities and program administrators usually collect and track extensive participant data for residential
audit and weatherization programs; energy auditors spend hours in customer homes collecting information such
as housing characteristics and recording their recommendations for energy-efficient measure installations. All
too often, however, these data end up on a shelf collecting dust and do not lead to actionable intelligence.
Mining of existing data can result in a more informed evaluation and ultimately provide program administrators
with information that can be used to better manage their programs. In a northeastern state, the evaluation
team’s use of a wealth of program tracking data to calculate key performance indicators (KPIs) had a ripple
effect throughout the evaluation, ultimately allowing us to gain deeper insights into program functionality, draw
evidence-based conclusions, and recommend program improvements with objectivity and specificity. In this
paper, we describe our data-driven assessment of a home energy services program. We discuss our methods for
developing the quantitative KPIs, trace their effect on subsequent evaluation data collection activities, and
describe how this data-driven, integrated approach led to more pointed recommendations. Our methodology
began with combining three years of program tracking data across eight program administrators and
approximately 190,000 participating customers and then merging these data with other program data, such as
heating and cooling rebates and energy-efficient equipment loans. Then, we identified and calculated the KPIs
and used the results to sharpen our remaining evaluation activities. For example, the evaluation team calculated
a KPI that revealed a discrepancy in the uptake of financing by different delivery channels. We then prepared
specific interview and survey questions to learn how implementers and contractors promote and discuss
financing options with customers. Finally, we paired the qualitative findings from over 1,000 customer surveys
and over 80 in-depth interviews with these KPI results to provide context and produce stronger evidence to
improve portfolio performance. This paper also explains how this approach could be applied in other
jurisdictions and for other types of demand side management programs. We discuss the lessons learned through
the study process, the importance of accurate and thorough tracking data, integration of KPI results into other
evaluation activities, and use of qualitative data to contextualize findings. By conducting data-driven evaluations
and going beyond savings goals, budgets, and customer satisfaction, evaluators can produce stronger evidence-based conclusions and offer recommendations that program administrators can incorporate into program
design and track over time.
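To make the KPI workflow concrete, here is a hypothetical sketch of the merge-then-compute step described above, using the financing-uptake example. File names, column names, and the pandas toolchain are all assumptions for illustration; the paper does not specify the team's actual tooling.

```python
import pandas as pd

# Assumed inputs: per-participant audit tracking data and loan records.
tracking = pd.read_csv("audit_tracking.csv")    # customer_id, delivery_channel, ...
loans = pd.read_csv("efficiency_loans.csv")     # customer_id, loan_amount, ...

merged = tracking.merge(loans[["customer_id", "loan_amount"]],
                        on="customer_id", how="left")
merged["took_financing"] = merged["loan_amount"].notna()

# KPI: share of participants using financing, by delivery channel.
kpi = (merged.groupby("delivery_channel")["took_financing"]
             .mean()
             .mul(100)
             .round(1)
             .rename("financing_uptake_pct"))
print(kpi)
```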
2015 International Energy Program Evaluation Conference
Review Sheet for Paper Proposals
Paper Number/Title: 339 – KPI: Not Just Key Performance Indicators but Keen Portfolio Intelligence
Recommendation: Accept / Possibly Accept / Reject (Rejection reason required to advise author – see below)
Reviewer:
RELEVANCE
The topic of the abstract must match the focus of the Conference. The Conference serves its participants as a
forum for the presentation and discussion of important evaluation results, issues, methodologies,
implementation, techniques or applications as they apply to conference topics.
ORIGINALITY
The abstract should reflect original work done by the author and be something new to the field. We are about
measurement for impacts, not descriptions about worthy programs—How do we know we got the savings or
changes etc.
SIGNIFICANCE
The abstract must represent a significant contribution to the field.
Is it a new and original contribution?
Are the conclusions sound and justified?
Is it clearly presented and well organized?
Do you believe that results will be ready by the
time of the conference?
Comments for Committee Review and to Advise Authors:
U. Reasons to accept or possibly accept: (Why you think this abstract should be presented and any major problems that need resolution, etc.)
V. Reasons to reject: (One or two sentences to explain why this proposal should not be accepted—e.g., outside
of conference topics, results may/will not be ready before May, not new or of significant interest,
questionable methodology, not enough information to make a determination, etc.) Reasons will be emailed
to the author.
Comments for Author and Session Moderator:
(Suggestions for writing or focusing the paper, etc.)
TEAM 5
Paper ID Number: PPR-341
When Data Will Be Available: Is available now
Has This Been Presented Before? No, it has not been and will not be presented prior to the conference
Agree to Publish: Yes I agree
Best Evaluation Area: Impacts
Best Fit: Information/Education
Best Focus: Methods
Abstract Title: Lighting Evaluation: Does Added Rigor Lead to Added Value?
The prevalence and repetitiveness of lighting evaluations often lead program implementers and evaluators to question how much rigor yet another evaluation really requires. At what point does additional
rigor lead to negligible added value? How much simplification can be done while upholding confidence in the
accuracy of the results? In the realm of measurement and verification, we can simplify these questions into
three standard choices: no metering, short-term metering and long-term metering. This paper will highlight the
results of a multi-stage lighting evaluation that allowed for a direct comparison of each scenario – verification
only, 3-month metering, and 1-year metering. The evaluation enabled the comparison of accuracy, budget, and
timeline for insight into the optimal approach depending on the specific project needs and budget. The
project’s findings show there are added benefits in accuracy moving from no metering to short-term metering.
While not revolutionary, the evaluation's multi-stage design enables clear identification of the added value that can be expected from varying levels of rigor and of the parameters that affect the accuracy of the results. These
parameters include population characteristics and budget constraints unique to each evaluation. In turn, program administrators and evaluators will be provided with a reliable framework for assessing the added value from short-term metering and determining an appropriate metering duration with regard to the program's budget and the required accuracy of the results. The results of this project show that long-term metering adds a
nearly trivial improvement to accuracy. This paper will demonstrate the marginal added value of long-term
metering for lighting measures and clearly define the boundaries of added value for metering duration.
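A toy version of the rigor comparison makes the stakes concrete: annualizing hours-of-use (HOU) from a 3-month window can pick up seasonal bias that a full year of metering avoids. The logger data below are fabricated, and the simple scaling rule is an assumption, not the evaluation's actual annualization method.

```python
# Fabricated monthly hours-of-use from a year of logging; the scaling rule is
# an assumption, not the evaluation's actual annualization method.

def annualized_hou(monthly_hou: list[float]) -> float:
    """Scale the average metered month (assumed 30.4 days) to a full year."""
    return 365.0 * (sum(monthly_hou) / len(monthly_hou)) / 30.4

year_of_hou = [310, 305, 298, 280, 270, 260, 255, 262, 275, 290, 300, 312]
short_term = year_of_hou[:3]   # a 3-month metering window

full = annualized_hou(year_of_hou)
short = annualized_hou(short_term)
print(round(full), round(short), f"{100 * (short - full) / full:+.1f}%")
```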
2015 International Energy Program Evaluation Conference
Review Sheet for Paper Proposals
Paper Number/Title: 341 – Lighting Evaluation: Does Added Rigor Lead to Added Value?
Recommendation: Accept / Possibly Accept / Reject (Rejection reason required to advise author – see below)
Reviewer:
RELEVANCE
The topic of the abstract must match the focus of the Conference. The Conference serves its participants as a
forum for the presentation and discussion of important evaluation results, issues, methodologies,
implementation, techniques or applications as they apply to conference topics.
ORIGINALITY
The abstract should reflect original work done by the author and be something new to the field. We are about
measurement for impacts, not descriptions about worthy programs—How do we know we got the savings or
changes etc.
SIGNIFICANCE
The abstract must represent a significant contribution to the field.
Is it a new and original contribution?
Are the conclusions sound and justified?
Is it clearly presented and well organized?
Do you believe that results will be ready by the
time of the conference?
Comments for Committee Review and to Advise Authors:
W. Reasons to accept or possibly accept: (Why you think this abstract should be presented and any major problems that need resolution, etc.)
X. Reasons to reject: (One or two sentences to explain why this proposal should not be accepted—e.g., outside
of conference topics, results may/will not be ready before May, not new or of significant interest,
questionable methodology, not enough information to make a determination, etc.) Reasons will be emailed
to the author.
Comments for Author and Session Moderator:
(Suggestions for writing or focusing the paper, etc.)
TEAM 5
Paper ID Number: PPR-343
When Data Will Be Available: Is available now
Has This Been Presented Before? No, it has not been and will not be presented prior to the conference
Agree to Publish: Yes I agree
Best Evaluation Area: Behavior change
Best Fit: Information/Education
Best Focus: Methods
Abstract Title: East West Bank Community Alliance and Energy Star Pledge Behavior Pilot
One size does not fit all for residential customers. The California IOUs have been very successful with their current comparative energy usage behavior implementations using Home Energy Reports, but they continue to explore new behavior program designs and evaluation alternatives. EW Bancorp is a publicly owned company with
$27.4 billion in assets and is traded on the Nasdaq Global Select Market under the symbol “EWBC”. The
Company’s wholly owned subsidiary, EW Bank, is one of the largest independent banks headquartered in
Pasadena, California, serving a diverse multi-lingual and multi-ethnic population. This pilot is designed to support opt-in customers from EWB, alongside a separate RCT design. The same Energy Star Pledges will be mailed to both the opt-in and the treatment groups. The subsequent evaluation will compare program results across the opt-in, treatment, and control groups. This behavior program approach requires less data analytics support. If proven effective, it could be another cost-effective behavior program implementation option.
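The evaluation design described above boils down to a three-way comparison. As a hypothetical sketch (all usage figures invented), the RCT treatment-versus-control difference isolates the pledge effect, while comparing the opt-in group against the RCT control shows how much self-selection inflates apparent savings:

```python
def pct_savings(group_kwh: float, control_kwh: float) -> float:
    return 100.0 * (control_kwh - group_kwh) / control_kwh

control_mean = 620.0   # monthly kWh, RCT control (invented)
treated_mean = 607.0   # RCT treatment (mailed pledge)
optin_mean = 583.0     # opt-in EWB customers

print(f"RCT pledge effect: {pct_savings(treated_mean, control_mean):.1f}%")
print(f"Opt-in vs control: {pct_savings(optin_mean, control_mean):.1f}%  (includes self-selection)")
```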
2015 International Energy Program Evaluation Conference
Review Sheet for Paper Proposals
Paper Number/Title: 343 – East West Bank Community Alliance and Energy Star Pledge Behavior Pilot
Recommendation: Accept / Possibly Accept / Reject (Rejection reason required to advise author – see below)
Reviewer:
RELEVANCE
The topic of the abstract must match the focus of the Conference. The Conference serves its participants as a
forum for the presentation and discussion of important evaluation results, issues, methodologies,
implementation, techniques or applications as they apply to conference topics.
ORIGINALITY
The abstract should reflect original work done by the author and be something new to the field. We are about
measurement for impacts, not descriptions about worthy programs—How do we know we got the savings or
changes etc.
SIGNIFICANCE
The abstract must represent a significant contribution to the field.
Is it a new and original contribution?
Are the conclusions sound and justified?
Is it clearly presented and well organized?
Do you believe that results will be ready by the
time of the conference?
Comments for Committee Review and to Advise Authors:
Y. Reasons to accept or possibly accept: (Why you think this abstract should be presented and any major problems that need resolution, etc.)
Z. Reasons to reject: (One or two sentences to explain why this proposal should not be accepted—e.g., outside
of conference topics, results may/will not be ready before May, not new or of significant interest,
questionable methodology, not enough information to make a determination, etc.) Reasons will be emailed
to the author.
Comments for Author and Session Moderator:
(Suggestions for writing or focusing the paper, etc.)
TEAM 5
Paper ID Number: PPR-351
When Data Will Be Available: Will be available by March 2015
Has This Been Presented Before? No, it has not been and will not be presented prior to the conference
Agree to Publish: Yes I agree
Best Evaluation Area: Behavior change
Best Fit: Information/Education
Best Focus: Methods
Abstract Title: Harder than a Rock: Industrial Energy Management Evaluation
Industrial energy management (IEM) programs, which focus on behavioral and operations and maintenance
activities, are becoming more popular nationally. The Northwest Power and Conservation Council’s 6th Power
Plan determined there was significant potential in IEM programs. Based on that, a program administrator in the
Northwest developed a program targeting IEM in 2010. The first impact evaluation of the IEM program was
conducted in 2011. The evaluation re-ran the regression models for a census of the program, conducted a
documentation review of the capital projects, and ultimately verified the program's savings estimation
methodology. Yet the results carried significant uncertainty given the small sample size and lack of data granularity.
In 2014, the program had experienced significant program uptake and had four years of data tracking for the
first set of participants. Therefore, we worked to develop an evaluation plan focusing on persistence and
allocating savings between the energy management and capital projects. We thought that the planning would
be relatively easy. Yet, as we delved into the issues relevant for 2014, we were unpleasantly surprised by the
number of evaluation issues that are not settled and still require significant effort and consideration before the
evaluation methodologies can be leveraged nationally and cost-effectively. This paper will outline the key
evaluation issues that our 2014 evaluation is testing, including:
1) Persistence: Verifying savings over time is one of the most important evaluation activities related to IEM
programs. Yet, methods to estimate persistence have not been fully tested by evaluators and require specific
consideration in methodologies.
2) Sampling: Given the heterogeneity of the population and small sample sizes, evaluations must determine at
what point sampling can be conducted and what sampling strategies might work.
3) Allocation of savings: The whole-site regression models employed by evaluations of IEM assess total
savings, which include both energy management activities and concurrent capital projects (often funded by
custom project incentives). Because energy management savings are calculated by subtracting capital savings
from the total savings, any deviation of the capital project savings estimates from the actual savings is reflected in
the O&M savings. Therefore, we initially planned to conduct a detailed evaluation of the capital measures
involved, but changed plans to a test case when we realized that the effort was unlikely to improve the savings
estimate enough to justify its cost.
4) Participants with negative savings: The regression models estimated negative O&M savings for a small, but
significant, portion of the program population. This could be due to variation in the regression model or to
inaccurate capital savings estimation.
This paper will outline a) the key evaluation planning issues that must be considered and solved in industrial
energy management planning, b) the approach used by our 2014 evaluation, and c) the results of the 2014 IEM
study. Given the
interest and emergence of IEM programs across the country, we believe our evaluation methods and results can
be leveraged by other program administrators and evaluators.
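The allocation issue in item 3 can be illustrated with a toy simulation (all numbers invented): because O&M savings are backed out as total minus capital, both the regression noise and any error in the capital estimates land in the O&M term, which is one way negative O&M savings can appear.

    import numpy as np

    rng = np.random.default_rng(1)
    n_sites = 40
    true_om = rng.uniform(50, 300, n_sites)         # true O&M savings (MWh)
    true_capital = rng.uniform(100, 1000, n_sites)  # true capital savings (MWh)

    # Whole-site regression estimates total savings with model noise;
    # capital savings come from engineering estimates with their own error.
    total_est = true_om + true_capital + rng.normal(0, 80, n_sites)
    capital_est = true_capital * rng.normal(1.0, 0.15, n_sites)

    # O&M savings are computed by subtraction, so both error sources land here.
    om_est = total_est - capital_est
    print(f"Sites with negative estimated O&M savings: {(om_est < 0).mean():.0%}")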
2015 International Energy Program Evaluation Conference
Review Sheet for Paper Proposals
Paper
Number/Title:
Recommendation:
351
Harder than a Rock: Industrial Energy Management Evaluation
Accept
Possibly Accept
Reject (Rejection reason required to advise author – see below)
Reviewer:
RELEVANCE
The topic of the abstract must match the focus of the Conference. The Conference serves its participants as a
forum for the presentation and discussion of important evaluation results, issues, methodologies,
implementation, techniques or applications as they apply to conference topics.
ORIGINALITY
The abstract should reflect original work done by the author and be something new to the field. We are about
measurement for impacts, not descriptions about worthy programs—How do we know we got the savings or
changes etc.
SIGNIFICANCE
The abstract must represent a significant contribution to the field.
Is it a new and original contribution?
Are the conclusions sound and justified?
Is it clearly presented and well organized?
Do you believe that results will be ready by the
time of the conference?
Comments for Committee Review and to Advise Authors:
AA. Reasons to accept or possibly accept: (Why you think this abstract should be presented and any major
problems that need resolution, etc.
BB. Reasons to reject: (One or two sentences to explain why this proposal should not be accepted—e.g., outside
of conference topics, results may/will not be ready before May, not new or of significant interest,
questionable methodology, not enough information to make a determination, etc.) Reasons will be emailed
to the author.
Comments for Author and Session Moderator:
(Suggestions for writing or focusing the paper, etc.)
TEAM 5
Paper ID Number: PPR-372
When Data Will Be Available: Is available now
Has This Been Presented Before? No, it has not been and will not be presented prior to the conference
Agree to Publish: Yes I agree
Best Evaluation Area: Impacts
Best Fit: Information/Education
Best Focus: Methods
Abstract Title: Comparison of Bayesian Billing Analysis to Pooled Fixed Effects and Variable Base
Degree Day
Billing analysis has historically relied on two methods: pooled fixed effects (PFE) and two-stage variable base
degree day (VBDD) analysis. These frequentist methods are well documented and widely used, but, like all
models, each has limitations. Frequentist billing analysis methods lack good estimates of error and the results
can be biased, particularly for smaller data sets. Further, advances in computational power have allowed other
methods to become more feasible. In this paper, a Bayesian inference model (BIM) is developed to overcome
some of the limitations of the frequentist methods, and results of BIM are compared to PFE and VBDD. VBDD
fits, via regression, a physical heat loss model to each building for both the pre- and post-intervention periods. Billing
data for VBDD can be incomplete or lost in the analysis due to an insufficient number of bills or poor model fits. It is
not uncommon to discard 25% or more of the sites, which could bias the results. PFE is a single regression model
of all the bills across all buildings and does not have a relationship to the physical model of each building. There
is limited data screening for PFE so data loss is minimal. Both VBDD and PFE can include segmentation to
account for savings for different measures or building characteristics. However, neither of these frequentist
methods provides good insight into the uncertainty in the results. Failing to account for the variance in
estimating segmentation variables, or for the error introduced by staging the VBDD regression fits,
contributes to incorrect error estimates. BIM combines the physical heat loss model of individual buildings with
the benefits of pooling data to fill in the gaps, while also preserving an estimate of uncertainty of the result. BIM
does not suffer from the bias of having to discard sites because buildings lacking data “borrow” properties from
the rest of the population. The model can also include segmentation for measures and building characteristics.
Unlike PFE, BIM does not have issues with over-weighting or under-weighting individual buildings based on the
number of bills per building. BIM is also unbiased with respect to sample size, so the approach can be used for
smaller evaluation efforts. To compare the billing analysis techniques, two real world data sets are used: a
field-metered data set and a pre/post data set from ductless heat pump (DHP) retrofit installations. The metered
data are from a whole-building measurement project covering most energy end uses of 104 residential buildings
in the Pacific Northwest. Utility bills from these buildings are processed using PFE, VBDD, and BIM, and analysis
results are compared to the end use metered data. For the DHP data set, PFE, VBDD, and BIM are used to
estimate energy savings, and results from the three methods are compared to each other. While it is always
useful in evaluations to run multiple models for comparison, BIM can help overcome limitations of PFE and
VBDD approaches.
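As a concrete illustration of the partial pooling described above, here is a minimal hierarchical model sketch in Python using PyMC, with synthetic bills and a simple heating-slope model (kWh = base + slope * HDD); the priors, data, and choice of PyMC are our assumptions, not the authors' specification.

    import numpy as np
    import pymc as pm

    # Synthetic monthly bills for 20 buildings (12 bills each)
    rng = np.random.default_rng(0)
    n_bldg = 20
    bldg = np.repeat(np.arange(n_bldg), 12)
    hdd = rng.uniform(0, 900, size=bldg.size)
    kwh = 400 + rng.normal(1.5, 0.3, n_bldg)[bldg] * hdd + rng.normal(0, 150, bldg.size)

    with pm.Model():
        # Population-level distribution of heating slopes (kWh per HDD)
        mu_slope = pm.Normal("mu_slope", mu=1.0, sigma=1.0)
        sd_slope = pm.HalfNormal("sd_slope", sigma=1.0)
        # Building-level slopes "borrow" from the population, so buildings with
        # few usable bills are shrunk toward the pooled mean, not discarded
        slope = pm.Normal("slope", mu=mu_slope, sigma=sd_slope, shape=n_bldg)
        base = pm.Normal("base", mu=500.0, sigma=300.0, shape=n_bldg)
        noise = pm.HalfNormal("noise", sigma=200.0)
        pm.Normal("obs", mu=base[bldg] + slope[bldg] * hdd, sigma=noise,
                  observed=kwh)
        trace = pm.sample(1000, tune=1000, chains=2)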
2015 International Energy Program Evaluation Conference
Review Sheet for Paper Proposals
Paper
Number/Title:
Recommendation:
372
Comparison of Bayesian Billing Analysis to Pooled Fixed Effects and Variable
Base Degree Day
Accept
Possibly Accept
Reject (Rejection reason required to advise author – see below)
Reviewer:
RELEVANCE
The topic of the abstract must match the focus of the Conference. The Conference serves its participants as a
forum for the presentation and discussion of important evaluation results, issues, methodologies,
implementation, techniques or applications as they apply to conference topics.
ORIGINALITY
The abstract should reflect original work done by the author and be something new to the field. We are about
measurement for impacts, not descriptions about worthy programs—How do we know we got the savings or
changes etc.
SIGNIFICANCE
The abstract must represent a significant contribution to the field.
Is it a new and original contribution?
Are the conclusions sound and justified?
Is it clearly presented and well organized?
Do you believe that results will be ready by the
time of the conference?
Comments for Committee Review and to Advise Authors:
CC. Reasons to accept or possibly accept: (Why you think this abstract should be presented and any major
problems that need resolution, etc.
DD. Reasons to reject: (One or two sentences to explain why this proposal should not be accepted—e.g., outside
of conference topics, results may/will not be ready before May, not new or of significant interest,
questionable methodology, not enough information to make a determination, etc.) Reasons will be emailed
to the author.
Comments for Author and Session Moderator:
(Suggestions for writing or focusing the paper, etc.)
TEAM 5
Paper ID Number: PPR-373
When Data Will Be Available: Is available now
Has This Been Presented Before? No, it has not been and will not be presented prior to the conference
Agree to Publish: Yes I agree
Best Evaluation Area: Impacts
Best Fit: Information/Education
Best Focus: Methods
Abstract Title: The Keystone of Energy Efficiency: New Approaches to Data Analysis Methods in a
Mid-Atlantic State
In Pennsylvania’s Act 129 statewide Energy Efficiency and Conservation programs, a Technical Reference Manual
(TRM) is used to guide the calculation and reporting of energy and demand savings. Since the programs’
inception in 2009, the savings assumptions for commercial lighting measures, such as hours-of-use (HOU) values
and coincidence with the system peak, have been taken from secondary research based on studies conducted in
other jurisdictions, with several daisy-chained references. In September 2013, the Pennsylvania Statewide
Evaluation Team embarked on one of North America’s largest Commercial Light Metering studies in order to
update measure assumptions used in the calculation of savings. Using primary data collection coupled with
detailed load shape analyses, the Statewide Evaluation Team developed Pennsylvania-specific load shapes,
operating hours, coincidence factors, and HVAC interactive factors for the 10 most prevalent building types. The
use of this primary research provides a substantial improvement over the previous use of values adopted from
secondary research by taking into account relevant considerations such as building stock and geographic
location. The Commercial Light Metering Study involved the installation of over 2,300 light loggers at 498
facilities across the state. Each site received a general survey and complete lighting inventory using the
consultant’s analysis tool, a tablet-based sampling and data collection tool. The general survey provided heating
and cooling information required to calculate HVAC interactive effects. The lighting inventory was later mapped
to the wattage table in the TRM to calculate facility lighting loads. The consultant’s analysis tool also contained a
random selection algorithm which selected fixtures within the facility for metering by the field engineer. Sites
received an average of four to five loggers, each installed for a minimum of 45 days. The data collected
during the logging period, in conjunction with the calculated lighting loads, were annualized taking into account
any seasonal operation of the business. The data from each logger were weighted by the logged space type's
relative contribution to the lighting load in the facility to develop weighted average hours of use and
coincidence factors for each building type. Hourly (8,760) load profiles were also developed for each facility type for use
in cost effectiveness calculations for the programs. The analysis of the collected data was completed in
September and will be presented in Pennsylvania’s 2016 TRM. In addition, the information gathered will be used
to shape future programs by serving as a new holistic resource for regulator-mandated annual cost effectiveness
tests, and resource planning for the next phase of the Act 129 Energy Efficiency and Conservation program.
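A minimal sketch of the load-weighted roll-up described above, with invented space types, logger results, and connected-load shares (pandas assumed):

    import pandas as pd

    # One row per logger: space type, annualized hours of use, coincidence factor
    loggers = pd.DataFrame({
        "space_type": ["office", "office", "corridor", "storage"],
        "annual_hou": [2900, 3100, 8400, 700],
        "cf": [0.75, 0.80, 0.95, 0.10],
    })
    # Connected lighting kW per space type from the facility inventory (invented)
    load_kw = {"office": 12.0, "corridor": 3.5, "storage": 1.5}

    # Average loggers within a space type, then weight each space type by its
    # share of the facility's connected lighting load
    by_space = loggers.groupby("space_type")[["annual_hou", "cf"]].mean()
    w = pd.Series(load_kw) / sum(load_kw.values())
    facility_hou = (by_space["annual_hou"] * w).sum()
    facility_cf = (by_space["cf"] * w).sum()
    print(facility_hou, facility_cf)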
2015 International Energy Program Evaluation Conference
Review Sheet for Paper Proposals
Paper
Number/Title:
Recommendation:
373
The Keystone of Energy Efficiency: New Approaches to Data Analysis
Methods in a Mid-Atlantic State
Accept
Possibly Accept
Reject (Rejection reason required to advise author – see below)
Reviewer:
RELEVANCE
The topic of the abstract must match the focus of the Conference. The Conference serves its participants as a
forum for the presentation and discussion of important evaluation results, issues, methodologies,
implementation, techniques or applications as they apply to conference topics.
ORIGINALITY
The abstract should reflect original work done by the author and be something new to the field. We are about
measurement for impacts, not descriptions about worthy programs—How do we know we got the savings or
changes etc.
SIGNIFICANCE
The abstract must represent a significant contribution to the field.
Is it a new and original contribution?
Are the conclusions sound and justified?
Is it clearly presented and well organized?
Do you believe that results will be ready by the
time of the conference?
Comments for Committee Review and to Advise Authors:
EE. Reasons to accept or possibly accept: (Why you think this abstract should be presented and any major
problems that need resolution, etc.
FF. Reasons to reject: (One or two sentences to explain why this proposal should not be accepted—e.g., outside
of conference topics, results may/will not be ready before May, not new or of significant interest,
questionable methodology, not enough information to make a determination, etc.) Reasons will be emailed
to the author.
Comments for Author and Session Moderator:
(Suggestions for writing or focusing the paper, etc.)
TEAM 5
Paper ID Number: PPR-374
When Data Will Be Available: Is available now
Has This Been Presented Before? No, it has not been and will not be presented prior to the conference
Agree to Publish: Yes I agree
Best Evaluation Area: Markets
Best Fit: Demand Response
Best Focus: Policy
Abstract Title: A Data Driven Approach to Demand Response Dispatch Criteria for Ratepayer Funded
Programs
Our team has developed a data-driven approach for estimating demand response program potential and
identifying dispatch conditions which will maximize program benefits. Demand response participation in ISO
markets has historically been largely focused on emergency programs with infrequent dispatch for conditions
threatening system reliability, but significant savings opportunities may also exist for ratepayer-funded state
and utility programs if targeted reductions can exert downward pressure on the peak load forecast. In such a
situation, savings are generated both through a reduced capacity obligation and the lower clearing prices that
come with avoiding high cost marginal generation assets. We used historic load data to quantify the limits on
DR potential imposed by program design constraints. Our simulation calculates the expected performance of
1,600 hypothetical program designs, each of which is based on a different combination of values for program
design attributes. The results provide a ranking of program designs against a performance target. Further, the
results provide data that help program designers balance the tradeoffs inherent to DSM program design.
Assuming a cost-effective level of compensation for demand reductions during peak periods has been identified,
the economic potential of demand response is further constrained by the limits imposed by specific program
designs. These limits include program budget constraints or transaction costs that may limit market
participation. We quantified these program design elements using the following program design variables:
dispatch criterion, the frequency of dispatch, the time window of dispatch, and the total number of dispatch
hours. We applied these constraints to historical RTO system loads for the PJM to calculate and compare the
effects of program design decisions on demand response potential. This year’s D.C. Court of Appeals decision
overturning FERC Order 745 has focused attention on the role of demand response as a demand-side
management tool. The uncertainty created by this ruling has the potential to switch DR governance from
market-based ISO structures to state-level policy. Such a shift would create the need for high quality, data-driven analysis that can be used to set program goals for demand response. Our paper will review the context
of demand response in the Mid-Atlantic region and discuss the data and methods we applied to our simulation.
We will provide summaries of the results, a discussion of the insights provided by the results, and an evaluation
of how the results can inform demand response program planning. We will also comment on the role of data-driven analysis for estimating demand response potential within the framework of economic theory. We
anticipate the information we provide will generate discussion about, and evaluation of, typical and alternative
approaches for estimating demand response program potential.
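A simplified sketch of this kind of design-space enumeration (synthetic hourly loads, far fewer than the 1,600 combinations in the study, and a coverage-of-peak-hours metric standing in for the paper's performance target):

    import itertools
    import numpy as np

    # One year of synthetic hourly system loads (MW); real work would use
    # historical RTO data
    rng = np.random.default_rng(2)
    hourly_load = 80_000 + 20_000 * rng.random(8760)

    triggers = [0.95, 0.97, 0.99]   # dispatch when load exceeds this quantile
    max_events = [5, 10, 20]        # events allowed per year
    window_hours = [2, 4, 6]        # hours per event
    annual_caps = [40, 80, 120]     # total dispatch hours per year

    results = []
    for q, ev, win, cap in itertools.product(triggers, max_events,
                                             window_hours, annual_caps):
        threshold = np.quantile(hourly_load, q)
        n_peak = int((hourly_load > threshold).sum())
        # Hours the design may cover, limited by events and the annual cap;
        # this simplification ignores the contiguity of real event windows
        dispatchable = min(ev * win, cap)
        results.append((q, ev, win, cap, min(dispatchable, n_peak) / n_peak))

    # Rank designs by the share of peak hours they can cover
    results.sort(key=lambda r: r[-1], reverse=True)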
2015 International Energy Program Evaluation Conference
Review Sheet for Paper Proposals
Paper
Number/Title:
Recommendation:
374
A Data Driven Approach to Demand Response Dispatch Criteria for
Ratepayer Funded Programs
Accept
Possibly Accept
Reject (Rejection reason required to advise author – see below)
Reviewer:
RELEVANCE
The topic of the abstract must match the focus of the Conference. The Conference serves its participants as a
forum for the presentation and discussion of important evaluation results, issues, methodologies,
implementation, techniques or applications as they apply to conference topics.
ORIGINALITY
The abstract should reflect original work done by the author and be something new to the field. We are about
measurement for impacts, not descriptions about worthy programs—How do we know we got the savings or
changes etc.
SIGNIFICANCE
The abstract must represent a significant contribution to the field.
Is it a new and original contribution?
Are the conclusions sound and justified?
Is it clearly presented and well organized?
Do you believe that results will be ready by the
time of the conference?
Comments for Committee Review and to Advise Authors:
GG. Reasons to accept or possibly accept: (Why you think this abstract should be presented and any major
problems that need resolution, etc.
HH. Reasons to reject: (One or two sentences to explain why this proposal should not be accepted—e.g., outside
of conference topics, results may/will not be ready before May, not new or of significant interest,
questionable methodology, not enough information to make a determination, etc.) Reasons will be emailed
to the author.
Comments for Author and Session Moderator:
(Suggestions for writing or focusing the paper, etc.)
TEAM 5
Paper ID Number: PPR-380
When Data Will Be Available: Is available now
Has This Been Presented Before? No, it has not been and will not be presented prior to the conference
Agree to Publish: Yes I agree
Best Evaluation Area: Impacts
Best Fit: Information/Education
Best Focus: Methods
Abstract Title: Comparison of Three Methods to Evaluate Energy Savings Benefit of Commercial
HVAC Maintenance Programs
There are many benefits of HVAC quality maintenance (QM) measures, including improved thermal comfort,
enhanced indoor air quality, and reduced energy consumption. Quantifying the benefits of these QM measures,
especially the energy savings benefit, has proven to be a challenging part of commercial QM program
evaluation. The ASHRAE/ACCA Standard 180-2008 lays the foundation for implementing QM programs and
addresses some market barriers. However, this standard does not establish a method to quantify the energy
efficiency benefits of QM programs. The deemed savings potential of commercial QM measures was calculated
from various eQUEST prototype building models and assumptions about maintenance faults. Existing in-situ
performance studies quantifying combined QM measure energy benefits are very limited. This paper identifies
three methods to analyze unit-specific annualized program energy impacts based on in-situ performance
monitoring of selected commercial rooftop units. A set of sample sites were monitored before and after service
to characterize the performance changes caused by QM activities on sampled units. Data loggers were installed
to collect one-minute interval data consisting of power and air temperatures for at least one month before the
service and at least two months after the service. In addition, spot power measurement and airflow tests were
performed before and after service. The first method uses all collected trending data to generate DOE-2
packaged unit biquadratic performance curves for each unit during each service period. Trending data were also
used to generate simplified load profiles for the metered units. In addition, unit economizer operation strategy
was identified based on the trending temperatures. Using the unit performance model and the economizer
control data, it should be possible to model system performance for any combination of space cooling load and
ambient conditions. The second method generated linear regression models based on monitored unit power
and some independent variables such as weather temperatures, day of week, and others. The second method is
consistent with the DOE Uniform Methods for Unitary HVAC, although that method was designed for equipment
replacement and not maintenance. The third method used engineering analysis of each unit. This method used
the Beta-version Universal Translator 3 to separate unit operations into five different modes: off, fan on, AC on,
AC stable, and heat on. To remove transient operating data when a unit compressor staged on, the first five
minutes of data were removed when analyzing the unit cooling performance. By comparing baseline and post-service
unit operation in different modes, it is possible to identify changes in unit cooling performance, unit control,
space cooling load, operating schedules, and space set point. The results of these three methods were
compared and analyzed. This paper recommends how to quantify and evaluate energy savings potential of
commercial HVAC maintenance programs based on a discussion on the potential issues and benefits of each of
the three analysis methods applied to the same data.
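For the first method, a biquadratic performance curve of the DOE-2 form can be fit by ordinary least squares; the sketch below uses invented operating points (numpy assumed):

    import numpy as np

    # Trend data (invented): return-air wet bulb (Twb) and outdoor dry bulb
    # (Todb) in deg F, with a measured, normalized performance value
    Twb = np.array([61.0, 63.5, 66.0, 68.5, 71.0, 61.0, 66.0, 71.0])
    Todb = np.array([75.0, 85.0, 95.0, 105.0, 115.0, 95.0, 85.0, 75.0])
    perf = np.array([0.92, 0.95, 1.00, 1.04, 1.08, 0.97, 0.98, 1.02])

    # DOE-2 biquadratic: a + b*Twb + c*Twb^2 + d*Todb + e*Todb^2 + f*Twb*Todb
    X = np.column_stack([np.ones_like(Twb), Twb, Twb**2, Todb, Todb**2,
                         Twb * Todb])
    coef, *_ = np.linalg.lstsq(X, perf, rcond=None)

    def biquad(twb, todb):
        # Predicted performance at any combination of load/ambient conditions
        return coef @ np.array([1.0, twb, twb**2, todb, todb**2, twb * todb])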
2015 International Energy Program Evaluation Conference
Review Sheet for Paper Proposals
Paper
Number/Title:
Recommendation:
380
Comparison of Three Methods to Evaluate Energy Savings Benefit of
Commercial HVAC Maintenance Programs
Accept
Possibly Accept
Reject (Rejection reason required to advise author – see below)
Reviewer:
RELEVANCE
The topic of the abstract must match the focus of the Conference. The Conference serves its participants as a
forum for the presentation and discussion of important evaluation results, issues, methodologies,
implementation, techniques or applications as they apply to conference topics.
ORIGINALITY
The abstract should reflect original work done by the author and be something new to the field. We are about
measurement for impacts, not descriptions about worthy programs—How do we know we got the savings or
changes etc.
SIGNIFICANCE
The abstract must represent a significant contribution to the field.
Is it a new and original contribution?
Are the conclusions sound and justified?
Is it clearly presented and well organized?
Do you believe that results will be ready by the
time of the conference?
Comments for Committee Review and to Advise Authors:
II. Reasons to accept or possibly accept: (Why you think this abstract should be presented and any major
problems that need resolution, etc.
JJ. Reasons to reject: (One or two sentences to explain why this proposal should not be accepted—e.g., outside
of conference topics, results may/will not be ready before May, not new or of significant interest,
questionable methodology, not enough information to make a determination, etc.) Reasons will be emailed
to the author.
Comments for Author and Session Moderator:
(Suggestions for writing or focusing the paper, etc.)
TEAM 5
Paper ID Number: PPR-406
When Data Will Be Available: Will be available by March 2015
Has This Been Presented Before? No, it has not been and will not be presented prior to the conference
Agree to Publish: Yes I agree
Best Evaluation Area: Survey Research
Best Fit: Services
Best Focus: Methods
Abstract Title: Want Better Response Rates? Go Mobile! Best Practices for the Mobile Era
Survey respondents are transitioning to mobile devices. It is essential that we respond to their needs. This paper
will synthesize the most current research on techniques for increasing response rates, focusing on mobile
technology. We will highlight the abilities of current software to encourage mobile and tablet participation and
increase completion rates on mobile devices. Adapting surveys for mobile respondents not only includes
formatting, but also survey design considerations, such as rotating answer choices so that the same options are
not always presented first, and utilizing 5-point scales rather than 7- or 10-point scales to reduce the amount of
scrolling required on smaller screens. Keeping surveys brief is key to capturing the smartphone market as
smartphone users are 13 times more likely than computer users to exit a survey if it takes longer than five
minutes. Best practices for email invitation subject lines and wording, the timing of emails, and reminder emails
will be examined, and we will share the most up-to-date research on motivational incentives. By leveraging
best practices and the newest technologies, response rates can be improved while decreasing project costs. On
web-based surveys with residential program participants, we’ve seen upwards of 30% of respondents entering
the survey on their smartphone or tablet device. In residential general population surveys, we’ve seen upwards
of 40% of respondents starting our web-based survey on these devices. Mobile survey respondents are eager to
share their opinions but want an interface that is user-friendly, cutting-edge and designed for their device. By
allowing customers to take surveys using a device of their choosing, we can increase response rate, decrease
cost, and reduce non-response bias. By leveraging best practices and new technologies, we have increased the
response rate of surveys. We have begun using different platforms tailored to our target market to collect the
most accurate information, in an efficient and cost-effective manner. To illustrate how these techniques can be
used in our industry, the paper will include real world examples of how we have leveraged best practices for our
clients, resulting in increased response rates and lower costs.
2015 International Energy Program Evaluation Conference
Review Sheet for Paper Proposals
Paper
Number/Title:
Recommendation:
406
Want Better Response Rates? Go Mobile! Best Practices for the Mobile Era
Accept
Possibly Accept
Reject (Rejection reason required to advise author – see below)
Reviewer:
RELEVANCE
The topic of the abstract must match the focus of the Conference. The Conference serves its participants as a
forum for the presentation and discussion of important evaluation results, issues, methodologies,
implementation, techniques or applications as they apply to conference topics.
ORIGINALITY
The abstract should reflect original work done by the author and be something new to the field. We are about
measurement for impacts, not descriptions about worthy programs—How do we know we got the savings or
changes etc.
SIGNIFICANCE
The abstract must represent a significant contribution to the field.
Is it a new and original contribution?
Are the conclusions sound and justified?
Is it clearly presented and well organized?
Do you believe that results will be ready by the
time of the conference?
Comments for Committee Review and to Advise Authors:
KK. Reasons to accept or possibly accept: (Why you think this abstract should be presented and any major
problems that need resolution, etc.
LL. Reasons to reject: (One or two sentences to explain why this proposal should not be accepted—e.g., outside
of conference topics, results may/will not be ready before May, not new or of significant interest,
questionable methodology, not enough information to make a determination, etc.) Reasons will be emailed
to the author.
Comments for Author and Session Moderator:
(Suggestions for writing or focusing the paper, etc.)
TEAM 5
Paper ID Number: PPR-408
When Data Will Be Available: Is available now
Has This Been Presented Before? No, it has not been and will not be presented prior to the conference
Agree to Publish: Yes I agree
Best Evaluation Area: Behavior change
Best Fit: Information/Education
Best Focus: Methods
Abstract Title: Counting more than Energy: Preparing for direct measurement of non-energy impacts
from small residential energy efficiency programs in New York
Quantifying Non-Energy Impacts (NEIs) is important both for increasing the value of efficiency investments to
consumers, and for improving the cost-effectiveness of energy efficiency projects. To increase the scale and
pace of market transformation, energy efficiency program managers are looking to use NEIs to expand the value
of efficiency in both of these respects. Increased durability, comfort, health, and other impacts associated with
energy efficiency seem to garner appeal among homeowners, but may require additional substantiation to be
incorporated into marketing, cost-effectiveness testing, and program design. Research conducted on NEIs to
date uses many different methods, including contingent valuation, direct query, conjoint analysis, and direct
estimation/measurement. The majority of identified NEI values are based on self-report, whereas the research
described in this paper focuses on objective primary data collection methods, like direct
estimation/measurement, to provide further substantiated NEI values for consideration in total program
benefits.
This effort consisted of reviewing 84 published papers on NEIs and resulted in a database that
parses out NEIs and their associated values from each study. Currently the database contains 303 measure-level
NEIs and their associated values, adjusted to New York (NY) based on inflation, cost of living, and climate. The
database is a living tool that will continue to be updated as new research is published. It allows program staff to
record program implementation counts of individual measures and analyze the additional impacts of energy
efficiency work. This database is for program managers, regulatory agencies, and other stakeholders to better
understand the influence of NEIs and enable change in cost-effectiveness testing, program design, and program
marketing. For this project, data from several small residential energy efficiency programs in NY were put into
the database to understand which measures had the greatest potential impact. The top eight measures and the
associated NEIs were then paired with possible primary research methods and, using a multi-attribute utility
(MAU) model, prioritized based on program prominence/potential measure-level NEI, quality of
existing data, reliability of the primary research method, and cost of the primary research method. For the top three
measures (insulation, whole-home design, and air sealing), the two primary data collection approaches
recommended for cost-effective research and further substantiation of NEIs are modeling and performance
measurement. Direct measurement methods were given more weight in the MAU model because they are
more objective than their commonly used counterpart, self-report methods. The modeling method would use
physics equations and available industry data to quantify processes like water transfers through home materials,
air pollutant cycling in the home, and equipment life-spans as related to maintenance. Performance
measurement would be used to refine or create models, but instead of using physics and industry data, an
evaluator would go into the field and measure real-world conditions related to similar processes listed above.
The goal of these methods and future research is to increase program manager and regulator confidence in the
characteristics and values of NEIs to homeowners, at the measure, whole-home, program, or even portfolio
level, thereby increasing the pace and scale of market transformation.
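A minimal sketch of a weighted-sum MAU scoring step like the one described, with invented attribute scores and weights (the study's actual attributes, scales, and weights are not reproduced here):

    import numpy as np

    # Attributes scored 0-1, oriented so higher = higher research priority:
    # [measure prominence/NEI potential, gap in existing data,
    #  reliability of research method, affordability of research method]
    candidates = {
        "insulation / modeling":     [0.9, 0.7, 0.8, 0.7],
        "air sealing / measurement": [0.8, 0.6, 0.9, 0.4],
        "whole-home / modeling":     [0.7, 0.8, 0.7, 0.8],
    }
    # Method reliability carries extra weight, mirroring the preference for
    # direct measurement over self-report noted above
    weights = np.array([0.3, 0.2, 0.35, 0.15])

    scores = {k: float(weights @ np.array(v)) for k, v in candidates.items()}
    for name, s in sorted(scores.items(), key=lambda kv: -kv[1]):
        print(f"{s:.2f}  {name}")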
2015 International Energy Program Evaluation Conference
Review Sheet for Paper Proposals
Paper
Number/Title:
Recommendation:
408
Counting more than Energy: Preparing for direct measurement of non-energy impacts from small residential energy efficiency programs in New
York
Accept
Possibly Accept
Reject (Rejection reason required to advise author – see below)
Reviewer:
RELEVANCE
The topic of the abstract must match the focus of the Conference. The Conference serves its participants as a
forum for the presentation and discussion of important evaluation results, issues, methodologies,
implementation, techniques or applications as they apply to conference topics.
ORIGINALITY
The abstract should reflect original work done by the author and be something new to the field. We are about
measurement for impacts, not descriptions about worthy programs—How do we know we got the savings or
changes etc.
SIGNIFICANCE
The abstract must represent a significant contribution to the field.
Is it a new and original contribution?
Are the conclusions sound and justified?
Is it clearly presented and well organized?
Do you believe that results will be ready by the
time of the conference?
Comments for Committee Review and to Advise Authors:
MM. Reasons to accept or possibly accept: (Why you think this abstract should be presented and any major
problems that need resolution, etc.
NN. Reasons to reject: (One or two sentences to explain why this proposal should not be accepted—e.g., outside
of conference topics, results may/will not be ready before May, not new or of significant interest,
questionable methodology, not enough information to make a determination, etc.) Reasons will be emailed
to the author.
Comments for Author and Session Moderator:
(Suggestions for writing or focusing the paper, etc.)
TEAM 5
Paper ID Number: PPR-412
When Data Will Be Available: Will be available by March 2015
Has This Been Presented Before? No, it has not been and will not be presented prior to the conference
Agree to Publish: Yes I agree
Best Evaluation Area: Behavior change
Best Fit: Information/Education
Best Focus: Methods
Abstract Title: Matching and VIA: Quasi-experimental methods in a world of imperfect data
In their 2012 publication on EM&V of behavioral programs, the State and Local Energy Efficiency (SEE) Action
Network ranks various evaluation designs on their appropriateness for behavior change programs. It strongly
advises the use of randomized controlled trials (RCTs) in evaluations of behavior change programs, awarding
RCTs a five-star ranking. For situations where an RCT is not possible, SEE Action reviews and ranks several
quasi-experimental methods. In descending rank order, the report discusses regression discontinuity, variation in
adoption (VIA), and matched control groups. Program and design conditions often restrict impact analysis
methods to quasi-experimental designs. Options are further restricted by the particular requirements of the quasi-experimental
designs. For example, regression discontinuity can be applied only if the eligibility requirement for households to
participate in a program is a cutoff value of a characteristic that varies within the population. Consequently, for
many programs, the two evaluation methods most highly rated by SEE Action are not possible to implement,
leaving VIA and matching as possible evaluation methods. While matching and VIA also have restrictive
assumptions, certain multi-year programs are compatible with both types of analysis, providing an opportunity
to assess the approaches by comparing the execution of and results from each method. In this paper we use a
longitudinal data set of energy use and program participation from an opt-in behavior program that was
deployed by four utilities over six years. We compare impact estimates of the program that were derived using a
matching method to those derived from using a VIA approach. We explore the strengths and limitations of each
method, detail the extent to which the data fulfill the assumptions of each method, and delve into the practical
consequences of choosing one method over the other. In addition, we compare the sensitivity of each method
to sample size, variability, and program-specific deployment and treatment conditions.
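A bare-bones illustration of one of the two approaches, matching: nearest-neighbor matching on pre-period usage followed by a difference-in-differences contrast (hypothetical columns; a real application would match on multiple covariates and report diagnostics):

    import pandas as pd

    # Hypothetical columns: pre_kwh, post_kwh, treated (0/1)
    df = pd.read_csv("program_usage.csv")  # hypothetical file
    treated = df[df["treated"] == 1].reset_index(drop=True)
    pool = df[df["treated"] == 0].reset_index(drop=True)

    # Nearest-neighbor match on pre-period usage, with replacement
    match_idx = [(pool["pre_kwh"] - x).abs().idxmin() for x in treated["pre_kwh"]]
    matches = pool.loc[match_idx].reset_index(drop=True)

    # Difference-in-differences on the matched sample
    did = ((treated["post_kwh"] - treated["pre_kwh"]).mean()
           - (matches["post_kwh"] - matches["pre_kwh"]).mean())
    print(f"Matched DiD impact estimate: {did:.1f} kWh")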
2015 International Energy Program Evaluation Conference
Review Sheet for Paper Proposals
Paper
Number/Title:
Recommendation:
412
Matching and VIA: Quasi-experimental methods in a world of imperfect
data
Accept
Possibly Accept
Reject (Rejection reason required to advise author – see below)
Reviewer:
RELEVANCE
The topic of the abstract must match the focus of the Conference. The Conference serves its participants as a
forum for the presentation and discussion of important evaluation results, issues, methodologies,
implementation, techniques or applications as they apply to conference topics.
ORIGINALITY
The abstract should reflect original work done by the author and be something new to the field. We are about
measurement for impacts, not descriptions about worthy programs—How do we know we got the savings or
changes etc.
SIGNIFICANCE
The abstract must represent a significant contribution to the field.
Is it a new and original contribution?
Are the conclusions sound and justified?
Is it clearly presented and well organized?
Do you believe that results will be ready by the
time of the conference?
Comments for Committee Review and to Advise Authors:
OO. Reasons to accept or possibly accept: (Why you think this abstract should be presented and any major
problems that need resolution, etc.
PP. Reasons to reject: (One or two sentences to explain why this proposal should not be accepted—e.g., outside
of conference topics, results may/will not be ready before May, not new or of significant interest,
questionable methodology, not enough information to make a determination, etc.) Reasons will be emailed
to the author.
Comments for Author and Session Moderator:
(Suggestions for writing or focusing the paper, etc.)
TEAM 5
Paper ID Number: PPR-422
When Data Will Be Available: Will be available no later than April 15, 2015
Has This Been Presented Before? No, it has not been and will not be presented prior to the conference
Agree to Publish: Yes I agree
Best Evaluation Area: Impacts
Best Fit: Emerging Technologies
Best Focus: Methods
Abstract Title: Right Sizing Unitary AC Unit Replacement: A Simplistic Study
As buildings age, infrastructure needs to be upgraded and replaced. The challenge is to perform the best upgrade
possible in the most cost-effective manner. When it comes to buildings with packaged HVAC systems, how is
one to know what is the best, most cost-effective upgrade? Typically, packaged HVAC units are replaced on a
"like-for-like" basis: when the old unit fails, a new unit of the same size and type is installed. Energy savings are
realized because the newer unit is more efficient. But what if the original unit was not the correct size, or if
there have been building improvements that affect HVAC sizing? Downsizing to a right-sized unit can pay for the
differential cost of a more efficient system, while saving more energy. Without proper sizing, an A/C program
may lead to the purchase, installation, and operation of a larger unit than is required, while simultaneously
consuming more energy than a smaller “right-sized” unit. This paper will explore an option to quantify in-situ
unitary HVAC unit sizing by utilizing communicating thermostats coupled with power monitoring.
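One plausible screening computation under the monitoring approach described (hypothetical columns; the 0.5 kW on-threshold and the hottest-1%-of-hours window are assumptions, not the paper's method):

    import pandas as pd

    # One-minute interval data: timestamp, unit_kw, outdoor_temp (hypothetical)
    df = pd.read_csv("rtu_monitoring.csv", parse_dates=["timestamp"])

    # Use the hottest 1% of observations as a proxy for design conditions
    design = df[df["outdoor_temp"] >= df["outdoor_temp"].quantile(0.99)]
    duty_cycle = (design["unit_kw"] > 0.5).mean()

    # A unit running well below continuously at near-design conditions is a
    # candidate for a smaller "right-sized" replacement (a screen, not a proof)
    print(f"Duty cycle at near-design conditions: {duty_cycle:.0%}")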
2015 International Energy Program Evaluation Conference
Review Sheet for Paper Proposals
Paper
Number/Title:
Recommendation:
422
Right Sizing Unitary AC Unit Replacement: A Simplistic Study
Accept
Possibly Accept
Reject (Rejection reason required to advise author – see below)
Reviewer:
RELEVANCE
The topic of the abstract must match the focus of the Conference. The Conference serves its participants as a
forum for the presentation and discussion of important evaluation results, issues, methodologies,
implementation, techniques or applications as they apply to conference topics.
ORIGINALITY
The abstract should reflect original work done by the author and be something new to the field. We are about
measurement for impacts, not descriptions about worthy programs—How do we know we got the savings or
changes etc.
SIGNIFICANCE
The abstract must represent a significant contribution to the field.
Is it a new and original contribution?
Are the conclusions sound and justified?
Is it clearly presented and well organized?
Do you believe that results will be ready by the
time of the conference?
Comments for Committee Review and to Advise Authors:
QQ. Reasons to accept or possibly accept: (Why you think this abstract should be presented and any major
problems that need resolution, etc.
RR. Reasons to reject: (One or two sentences to explain why this proposal should not be accepted—e.g., outside
of conference topics, results may/will not be ready before May, not new or of significant interest,
questionable methodology, not enough information to make a determination, etc.) Reasons will be emailed
to the author.
Comments for Author and Session Moderator:
(Suggestions for writing or focusing the paper, etc.)
TEAM 5
Paper ID Number: PPR-433
When Data Will Be Available: Is available now
Has This Been Presented Before? No, it has not been and will not be presented prior to the conference
Agree to Publish: Yes I agree
Best Evaluation Area: Behavior change
Best Fit: Information/Education
Best Focus: Methods
Abstract Title: Methodologies for Evaluating Behavior and Operations Savings as a DSM Resource:
Growing and sustaining a high-performance energy conservation culture in schools
Behavior change programs in the residential sector, generally in the form of home energy reports, have become
an established utility resource program accepted by both regulators and utilities. In the commercial sector,
however, behavior programs are just beginning, and bring different challenges for evaluation. Home energy
reports can use large control groups for evaluation, but commercial programs don't have this luxury and require
that each building's performance stand on its own. Within the commercial building sector, schools represent both
the potential to achieve significant long-term savings, and to produce non-energy benefits with long term
implications. In addition to science, technology, engineering and math (STEM) learning, these benefits include
creating a culture in which students understand the connections between energy and the environment and even
become passionate about energy efficiency. They then use the school building as a learning laboratory,
collecting and analyzing data about their school’s energy use and waste, which leads to in-school campaigns that
eliminate the identified energy waste. Every one of these participating students takes this knowledge into their
future. But can the savings from these programs be evaluated and pass muster as utility resource programs?
This presentation will describe an evaluation protocol for measuring energy savings from the behavior and
operational changes that result from creating a high-performance energy conservation culture, following the
International Performance Measurement and Verification Protocol, Option C – Whole Building Analysis.
Bottom line: after making adjustments for the appropriate governing variables such as weather, floor space, and
number of school days, the energy savings must be visible in the monthly energy bills in order to be counted as
actual savings:
Base Year - Current Year = Total Energy Savings/Reduction for the whole building
Total Energy Savings - Retrofit Reduction (energy efficiency measures determined by pre-installation
calculations) = Behavior Savings
In addition to measurement of energy savings from behavior one year at a time, the presentation will
include a discussion of persistence of behavior savings over time. The authors will present documentation of
school energy savings for up to 10 years after establishment of a high-performance energy conservation culture
and environmental stewardship. The paper (presentation) will include:
1) A detailed description of applying IPMVP Option C to determine a school's total energy savings.
2) Three case studies of schools growing a high-performance energy conservation culture, showcasing the
actions taken and savings achieved.
3) Real-life examples of students taking the knowledge learned during the process of growing their school's
high-performance energy conservation culture into their homes, communities, and futures.
4) Documentation of persistence of savings over 4 to 10 years, based on continuous tracking of energy use.
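A minimal numerical sketch of the Option C arithmetic above, assuming a monthly baseline model regressed on heating degree days (all numbers invented):

    import numpy as np

    # Monthly base-year and current-year use (kWh) and heating degree days,
    # plus a retrofit reduction from pre-installation calculations (invented)
    base_hdd = np.array([800, 650, 400, 150, 50, 10, 0, 0, 40, 250, 500, 750])
    base_kwh = np.array([52, 47, 38, 30, 27, 25, 24, 24, 26, 33, 41, 50]) * 1000.0
    curr_hdd = np.array([820, 600, 420, 140, 60, 5, 0, 0, 30, 260, 480, 700])
    curr_kwh = np.array([48, 42, 35, 26, 23, 21, 20, 20, 22, 29, 37, 45]) * 1000.0
    retrofit_reduction = 30_000.0

    # Fit baseline model kWh = a + b * HDD, then project it onto current-year
    # conditions to form the weather-adjusted baseline
    b, a = np.polyfit(base_hdd, base_kwh, 1)
    adjusted_baseline = a + b * curr_hdd

    total_savings = adjusted_baseline.sum() - curr_kwh.sum()
    behavior_savings = total_savings - retrofit_reduction
    print(f"Behavior savings: {behavior_savings:,.0f} kWh")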
2015 International Energy Program Evaluation Conference
Review Sheet for Paper Proposals
Paper
Number/Title:
Recommendation:
433
Methodologies for Evaluating Behavior and Operations Savings as a DSM
Resource: Growing and sustaining a high-performance energy conservation
culture in schools
Accept
Possibly Accept
Reject (Rejection reason required to advise author – see below)
Reviewer:
RELEVANCE
The topic of the abstract must match the focus of the Conference. The Conference serves its participants as a
forum for the presentation and discussion of important evaluation results, issues, methodologies,
implementation, techniques or applications as they apply to conference topics.
ORIGINALITY
The abstract should reflect original work done by the author and be something new to the field. We are about
measurement for impacts, not descriptions about worthy programs—How do we know we got the savings or
changes etc.
SIGNIFICANCE
The abstract must represent a significant contribution to the field.
Is it a new and original contribution?
Are the conclusions sound and justified?
Is it clearly presented and well organized?
Do you believe that results will be ready by the
time of the conference?
Comments for Committee Review and to Advise Authors:
SS. Reasons to accept or possibly accept: (Why you think this abstract should be presented and any major
problems that need resolution, etc.
TT. Reasons to reject: (One or two sentences to explain why this proposal should not be accepted—e.g., outside
of conference topics, results may/will not be ready before May, not new or of significant interest,
questionable methodology, not enough information to make a determination, etc.) Reasons will be emailed
to the author.
Comments for Author and Session Moderator:(Suggestions for writing or focusing the paper, etc.)
TEAM 5
Paper ID Number: PPR-434
When Data Will Be Available: Will be available by March 2015
Has This Been Presented Before? No, it has not been and will not be presented prior to the conference
Agree to Publish: Yes I agree
Best Evaluation Area: Impacts
Best Fit: Information/Education
Best Focus: Methods
Abstract Title: Stuck in the Middle: Designing the Ideal Multifamily Impact Evaluation
The term “multifamily” carries a
number of different connotations among utilities—is it more similar to residential or commercial? Does it
include the low income sector? Does it require five units or more in a building? The multifamily sector accounts
for approximately 15% of non-commercial building energy use nationwide and features significant savings
potential, but its classification in energy efficiency often remains in a gray area. To reduce the multifamily
sector’s substantial energy contribution, utilities have tailored energy efficiency programs to best serve their
regions’ market, demographics, and climate. The result is a broad suite of multifamily energy efficiency
programs that vary in size and delivery method, from simple CFL giveaways to comprehensive, multi-year audit
projects with several interactive measures. Given the differences in how the multifamily sector is classified by
utilities, and the subsequent variance in program design, there is no “right way” to evaluate a multifamily energy
efficiency program. This paper considers the design and delivery method of three multifamily efficiency
programs before comparing the impact evaluation methodology constructed for each. To allow a fair climate
comparison, only multifamily efficiency programs in New York State are examined, including: (1) a prescriptive
program that focuses on common-area and in-unit lighting retrofits; (2) a whole-building program with multiple,
custom measure offerings intended to reach a specific energy reduction goal; and (3) a program tailored to
low-income municipal housing authorities that targets natural gas savings only. Each program features unique
characteristics (customer bases, measure offerings, delivery methods, goals) and therefore a wide variety of
program staff concerns. A properly designed impact evaluation plans for early collaboration with program staff
to ensure that these concerns are addressed within the evaluation methodology and that the study results in
specific recommendations, leading to more optimized program design. During the evaluation planning phase,
evaluators often receive project data at varying levels of resolution and quality, also influencing the approach.
The ideal evaluation design is shaped by a number of different considerations, such as program design,
participant demographics, data quality, and the concerns of program staff: Does the program incent in-unit
measures? Is program-incented equipment accessible and measurable with data loggers? Does the program
offer incentives for multiple measures at once, and are those measures interactive? Do program measures
typically reduce facility energy use by 10% or more? Do participating facilities often experience fluctuations in
occupancy or demographics? Are participating buildings master-metered? What are program staff most
interested in learning (e.g., tenant acceptance of a specific technology, proper equipment operation, differences
in contractor performance)? Considerations such as these have influenced the impact evaluation approach for
the three multifamily program examples offered in this paper and provide a roadmap for selecting the optimal
impact evaluation design (ideally leading to improved program design) for multifamily programs nationwide.
2015 International Energy Program Evaluation Conference
Review Sheet for Paper Proposals
Paper
Number/Title:
Recommendation:
434
Stuck in the Middle: Designing the Ideal Multifamily Impact Evaluation
Accept
Possibly Accept
Reject (Rejection reason required to advise author – see below)
Reviewer:
RELEVANCE
The topic of the abstract must match the focus of the Conference. The Conference serves its participants as a
forum for the presentation and discussion of important evaluation results, issues, methodologies,
implementation, techniques or applications as they apply to conference topics.
ORIGINALITY
The abstract should reflect original work done by the author and be something new to the field. We are about
measurement for impacts, not descriptions about worthy programs—How do we know we got the savings or
changes etc.
SIGNIFICANCE
The abstract must represent a significant contribution to the field.
Is it a new and original contribution?
Are the conclusions sound and justified?
Is it clearly presented and well organized?
Do you believe that results will be ready by the
time of the conference?
Comments for Committee Review and to Advise Authors:
UU. Reasons to accept or possibly accept: (Why you think this abstract should be presented and any major
problems that need resolution, etc.
VV. Reasons to reject: (One or two sentences to explain why this proposal should not be accepted—e.g., outside
of conference topics, results may/will not be ready before May, not new or of significant interest,
questionable methodology, not enough information to make a determination, etc.) Reasons will be emailed
to the author.
Comments for Author and Session Moderator:
(Suggestions for writing or focusing the paper, etc.)
TEAM 5
Paper ID Number: PPR-439
When Data Will Be Available: Will be available by March 2015
Has This Been Presented Before? No, it has not been and will not be presented prior to the conference
Agree to Publish: Yes I agree
Best Evaluation Area: Behavior change
Best Fit: Information/Education
Best Focus: Findings
Abstract Title: Measuring impacts from personalized Business Energy Reports: Are commercial
customers motivated by peer comparisons?
After years of research, the evaluation community has developed a solid understanding of the impacts of home
energy reports in motivating residential customers to save 1-2% of annual energy consumption. Despite this,
little is known about the efficacy of comparative energy reports with more heterogeneous commercial
customers. This paper will fill this gap by presenting results from one of the first impact evaluations of such a
program. This paper will focus on the first impact evaluation of a pilot program, in which the utility sends
personalized Business Energy Reports, with information on how their usage compares to similar businesses, to a
sample of small business customers. The pilot was designed and implemented as a randomized controlled trial,
with 20,000 small commercial accounts divided into 48 business types, ranging from used car dealerships to hair
salons. Three quarters of each type were assigned to the treatment group and received a monthly report
targeted to their business type. The remaining 25% were designated as a comparison group. We are currently
conducting a billing analysis to measure the impacts of receiving the reports for one year. Using a
difference-in-differences regression design, we will measure how much more energy consumption changed for the treatment
group than for the control group after the reports were sent. We will also compare traditional rebate program
participation rates between the treatment and control groups and adjust program savings to account for savings
being claimed through other programs. We expect to complete our analysis by the end of 2014. In this paper,
we will highlight results that show how Business Energy Report programs can be more complicated to
administer and evaluate than residential energy report programs. For one thing, business programs must
determine the best way to define the “customer” who will receive a report. If a customer is defined too
narrowly, such as one business within a larger building or franchise controlled by a central customer, then a
billing analysis could underestimate the impact of the reports sent to that one single business. We will present
results showing how the method used to define customers relates to energy savings. In a related challenge,
preliminary results show that how business comparison groups are assigned is very important. When customers
feel that the business type they are being compared to is not accurate or is too general to be applicable, they
may discount information in the report. To explore this further, we will present results showing how savings vary
as customers’ business type assignments become more accurate. Because this evaluation is part of a portfolio of
research for the utility, we have the unique opportunity to augment and explain our impact results with data
from other efforts. In this paper we will tie in results from in-depth qualitative usability studies with report
recipients, as well as surveys with report recipients who do and do not participate in traditional rebate
programs, to explain not just if, but how personalized Business Energy Reports influence commercial customers.
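For reference, a difference-in-differences estimate like the one described can be sketched with statsmodels (hypothetical panel columns account, business_type, kwh, treat, post; clustering standard errors on account is our assumption, not necessarily the evaluators' specification):

    import pandas as pd
    import statsmodels.formula.api as smf

    # Monthly billing panel: one row per account per month (hypothetical file)
    df = pd.read_csv("ber_billing_panel.csv")

    # Difference-in-differences: the treat:post coefficient is the average
    # change in usage for treated accounts relative to the control group
    model = smf.ols("kwh ~ treat * post + C(business_type)", data=df)
    result = model.fit(cov_type="cluster", cov_kwds={"groups": df["account"]})
    print(result.params["treat:post"])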
2015 International Energy Program Evaluation Conference
Review Sheet for Paper Proposals
Paper Number/Title: 439 – Measuring impacts from personalized Business Energy Reports: Are commercial customers motivated by peer comparisons?
Recommendation: Accept / Possibly Accept / Reject (Rejection reason required to advise author – see below)
Reviewer:
RELEVANCE
The topic of the abstract must match the focus of the Conference. The Conference serves its participants as a
forum for the presentation and discussion of important evaluation results, issues, methodologies,
implementation, techniques or applications as they apply to conference topics.
ORIGINALITY
The abstract should reflect original work done by the author and be something new to the field. We are about
measurement for impacts, not descriptions about worthy programs—How do we know we got the savings or
changes etc.
SIGNIFICANCE
The abstract must represent a significant contribution to the field.
Is it a new and original contribution?
Are the conclusions sound and justified?
Is it clearly presented and well organized?
Do you believe that results will be ready by the
time of the conference?
Comments for Committee Review and to Advise Authors:
WW. Reasons to accept or possibly accept: (Why you think this abstract should be presented and any major problems that need resolution, etc.)
XX. Reasons to reject: (One or two sentences to explain why this proposal should not be accepted—e.g., outside
of conference topics, results may/will not be ready before May, not new or of significant interest,
questionable methodology, not enough information to make a determination, etc.) Reasons will be emailed
to the author.
Comments for Author and Session Moderator:
(Suggestions for writing or focusing the paper, etc.)
TEAM 5
Paper ID Number: PPR-442
When Data Will Be Available: Will be available by March 2015
Has This Been Presented Before? No, it has not been and will not be presented prior to the conference
Agree to Publish: Yes I agree
Best Evaluation Area: Impacts
Best Fit: Information/Education
Best Focus: Policy
Abstract Title: Duplicating Successful Energy Efficiency Projects Using Quantitative and Qualitative
Methods
What makes energy efficiency projects successful, and how can those successful projects be generalized and
duplicated to impact future project success? The implementation contractor explored various Commercial &
Industrial (C&I) projects to define measures of success using both quantitative and qualitative methods. These
measures of success were then applied to historical C&I projects to determine which projects met the definition
of success, helping to inform future program design. This project implemented various research activities,
including in-depth interviews with Program Administrators (PAs), implementers, account representatives,
customers and project champions. In addition, this project analyzed billing and program tracking data from
multiple PAs. To accomplish this, the billing and tracking data for over 70,000 energy-efficiency measures was
linked and rolled up to over 16,000 projects. Four quantitative metrics were developed that indicated, and
potentially quantified, project success. These four metrics were: depth-of-savings (amount of lifetime energy
savings in relation to customer-size), breadth-of-savings (measure type diversity), multi-year repeat customers,
and customers who sign Memorandum of Understanding (MOU) agreements. These metrics were leveraged to
identify a sample of successful projects. Projects specifically identified by PAs as successful were also blended
into this sample, providing a fifth indicator of success, which was qualitative in nature. A comparison group of
average and less-successful projects was also sampled for interviews. In-depth interviews were administered
to PAs and the two aforementioned customer sample groups. The interviews covered topics including: customer
decision-making, project implementation, customer and contractor relationships, energy and non-energy
impacts, and free-ridership. This paper includes a summary of findings for the sample groups and a similar
analysis applied across all utility C&I projects logged in the 2012 program participation year. The analysis
compares quantitative and qualitative indicators of project success across an array of firm-o-graphics and
“proje-graphics” including customer size, fuel-type, energy end use, initiatives/program-type, size of utility, and
industry-sector. Finally, the differences and similarities in how respondents define and experience success are
presented in this paper. Together, these analyses will help to inform future project design.
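As an illustration of how the measure-to-project rollup and the four quantitative success metrics might be computed, here is a minimal pandas sketch; all column names and the top-quartile screen are hypothetical, not the implementation contractor's actual definitions.

```python
# Sketch: roll measure-level tracking data up to projects and score the
# success metrics named above. Column names are hypothetical.
import pandas as pd

measures = pd.read_csv("tracking_data.csv")  # one row per installed measure

projects = measures.groupby("project_id").agg(
    lifetime_kwh=("lifetime_kwh", "sum"),
    customer_kwh=("annual_customer_kwh", "first"),
    measure_types=("measure_type", "nunique"),   # breadth-of-savings
    program_years=("program_year", "nunique"),   # multi-year participation
    has_mou=("mou_signed", "max"),               # MOU indicator
)

projects["depth_of_savings"] = projects["lifetime_kwh"] / projects["customer_kwh"]
projects["repeat_customer"] = projects["program_years"] > 1
# One possible screen: flag the top quartile of depth-of-savings as "deep".
projects["deep"] = projects["depth_of_savings"] >= projects["depth_of_savings"].quantile(0.75)
```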
2015 International Energy Program Evaluation Conference
Review Sheet for Paper Proposals
Paper Number/Title: 442 – Duplicating Successful Energy Efficiency Projects Using Quantitative and Qualitative Methods
Recommendation: Accept / Possibly Accept / Reject (Rejection reason required to advise author – see below)
Reviewer:
RELEVANCE
The topic of the abstract must match the focus of the Conference. The Conference serves its participants as a
forum for the presentation and discussion of important evaluation results, issues, methodologies,
implementation, techniques or applications as they apply to conference topics.
ORIGINALITY
The abstract should reflect original work done by the author and be something new to the field. We are about
measurement for impacts, not descriptions about worthy programs—How do we know we got the savings or
changes etc.
SIGNIFICANCE
The abstract must represent a significant contribution to the field.
Is it a new and original contribution?
Are the conclusions sound and justified?
Is it clearly presented and well organized?
Do you believe that results will be ready by the
time of the conference?
Comments for Committee Review and to Advise Authors:
YY. Reasons to accept or possibly accept: (Why you think this abstract should be presented and any major problems that need resolution, etc.)
ZZ. Reasons to reject: (One or two sentences to explain why this proposal should not be accepted—e.g., outside
of conference topics, results may/will not be ready before May, not new or of significant interest,
questionable methodology, not enough information to make a determination, etc.) Reasons will be emailed
to the author.
Comments for Author and Session Moderator:
(Suggestions for writing or focusing the paper, etc.)
TEAM 5
Paper ID Number: PPR-448
When Data Will Be Available: Is available now
Has This Been Presented Before? No, it has not been and will not be presented prior to the conference
Agree to Publish: Yes I agree
Best Evaluation Area: Impacts
Best Fit: Information/Education
Best Focus: Methods
Abstract Title: The Latest Weather Data is Like Your New iPhone - It's Already Outdated!
Our climate is changing rapidly. According to the National Oceanic and Atmospheric Administration, the warmest 10 years since records began in 1880 have all occurred since 1998, with 9 of the 10 occurring this century. 2013 marked the 37th consecutive year that the annual global temperature was above the long-term average (1). In
addition to global averages, we are constantly hearing of droughts, hurricanes, tornadoes, floods and other
natural disasters that are impacting communities on a more local level with more intensity and more frequency
than ever before. All of these indicators point to a climate that is changing so quickly that long-term historical
weather data models no longer represent an accurate prediction of present-day or future weather scenarios.
Energy efficiency programs may be needed even more in the future, and our estimates of impacts need to use the
best climate data available. This paper will compare the traditional method of estimating energy savings using
historical weather data (e.g., Typical Meteorological Year (TMY)) with an approach that “adjusts” the historical
TMY data for present-day and future climate considerations. TMY data is a collection of weather data from
localized weather stations spanning multiple years. The most recent TMY data available (TMY3) uses weather
data collected between 1976 and 2005. A TMY represents one year of data that pulls individual months of
weather data from different years within the available range. It is intended to represent the range of possible
weather while still giving accurate long-term averages for a given weather station. However, as we are now
beginning to understand, 6 of the 10 warmest years on record have occurred since the latest TMY data became
available, meaning it is no longer an accurate proxy for estimating present-day or future weather scenarios. The
Climate Change World Weather File Generator (CCWorldWeatherGen) tool allows users to generate climate
change weather files by transforming historical weather files (e.g., TMY data) into future climate predictions for
their area (2). This allows for a more accurate representation of present-day or future scenarios compared with
historical weather data files that are constrained by the years from which their data are pulled. This paper will
compare energy savings results using the traditional approach (i.e., unadjusted TMY data), with the “adjusted”
approach (i.e., adjusted TMY data) for climates across the nation for various energy efficiency measures that rely
on weather data to estimate their savings (e.g., cooling and heating measures, building envelope). Will a warmer
climate result in reduced energy savings for insulation or air sealing measures in the winter? Will it dramatically
increase the savings for cooling measures in the summer? Will increased temperature extremes amplify peak
demand savings from cooling measures? This paper will investigate these questions, make recommendations, and chart a path forward for accurately modeling and predicting energy savings impacts for present-day and future climate scenarios.
1. http://www.ncdc.noaa.gov/sotc/global/2013/13
2. http://www.energy.soton.ac.uk/ccworldweathergen/
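To illustrate the kind of comparison the paper proposes, the sketch below computes heating and cooling degree-days from an unadjusted TMY3 series and from a climate-adjusted series. The file names, column layout, and the 65°F base are assumptions for illustration, not the authors' actual inputs.

```python
# Sketch: compare degree-days from unadjusted vs. climate-adjusted weather.
# Hypothetical CSVs with a timestamp index and a drybulb_f column.
import pandas as pd

BASE_F = 65.0  # commonly used degree-day base temperature

def degree_days(hourly_temps_f):
    """Return (HDD, CDD) from hourly dry-bulb temperatures in deg F."""
    daily_mean = hourly_temps_f.resample("D").mean()
    hdd = (BASE_F - daily_mean).clip(lower=0).sum()
    cdd = (daily_mean - BASE_F).clip(lower=0).sum()
    return hdd, cdd

tmy = pd.read_csv("tmy3_station.csv", index_col="timestamp", parse_dates=True)
adj = pd.read_csv("tmy3_adjusted.csv", index_col="timestamp", parse_dates=True)

for label, df in [("unadjusted TMY3", tmy), ("climate-adjusted", adj)]:
    hdd, cdd = degree_days(df["drybulb_f"])
    print(f"{label}: HDD={hdd:.0f}, CDD={cdd:.0f}")
```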
2015 International Energy Program Evaluation Conference
Review Sheet for Paper Proposals
Paper Number/Title: 448 – The Latest Weather Data is Like Your New iPhone - It's Already Outdated!
Recommendation: Accept / Possibly Accept / Reject (Rejection reason required to advise author – see below)
Reviewer:
RELEVANCE
The topic of the abstract must match the focus of the Conference. The Conference serves its participants as a
forum for the presentation and discussion of important evaluation results, issues, methodologies,
implementation, techniques or applications as they apply to conference topics.
ORIGINALITY
The abstract should reflect original work done by the author and be something new to the field. We are about
measurement for impacts, not descriptions about worthy programs—How do we know we got the savings or
changes etc.
SIGNIFICANCE
The abstract must represent a significant contribution to the field.
Is it a new and original contribution?
Are the conclusions sound and justified?
Is it clearly presented and well organized?
Do you believe that results will be ready by the
time of the conference?
Comments for Committee Review and to Advise Authors:
AAA. Reasons to accept or possibly accept: (Why you think this abstract should be presented and any major problems that need resolution, etc.)
BBB. Reasons to reject: (One or two sentences to explain why this proposal should not be accepted—e.g.,
outside of conference topics, results may/will not be ready before May, not new or of significant interest,
questionable methodology, not enough information to make a determination, etc.) Reasons will be emailed
to the author.
Comments for Author and Session Moderator:
(Suggestions for writing or focusing the paper, etc.)
TEAM 5
Paper ID Number: PPR-450
When Data Will Be Available: Will be available no later than April 15, 2015
Has This Been Presented Before? No, it has not been and will not be presented prior to the conference
Agree to Publish: Yes I agree
Best Evaluation Area: Impacts
Best Fit: Information/Education
Best Focus: Findings
Abstract Title: Empowering Residential and Small Commercial Customers with Energy Management
Tools
In the evolving New York energy market, a key focus for program administrators is empowering customers to
manage their energy consumption and costs through enhanced energy information and management tools.
Currently, most residential and small business customers see their energy consumption on a monthly basis when
they receive their utility bills. With real-time energy and cost data, customers can proactively manage their
energy consumption by making informed decisions about investing in upgrades or changing their behavior to
reduce energy costs. To examine customer acceptance and the potential savings impacts of off-the-shelf
energy management (EM) tools, a New York utility is conducting market research and pilot installations for
selected tools that are appropriate for the residential and small business sectors. The market research includes
an assessment of market-ready EM tools appropriate for residential and small commercial customers, including
ratings in the categories of usability, ease of installation, data security, compatibility with other equipment, and
cost. The pilot includes field-testing of selected energy management tools with participating residential and
small business customers and direct measurement of the energy and demand impacts influenced by the tools.
The utility is conducting surveys, remote monitoring (through web-enabled dashboards), and on-site
measurement of key end-uses to answer the following questions: How do residential and small business
customers utilize real-time access to energy consumption information to save energy? How do residential and
small business customers respond to varying levels of utility engagement utilizing EM devices? What energy
and demand impacts are achieved by providing customers with energy information and management tools? In
this paper, the authors will present their methods for evaluating both qualitative (e.g., customers’ reported
behavior changes) and quantitative (e.g., measured changes in operating schedules) impacts through the pilot
period, and they will present key findings about the impact of the tools, including the types of tools and data that are
most effective in creating change. As energy markets and energy management tools continue to equip both
customers and utilities with more access to real-time end-use data and opportunities for two-way
communication, this research provides both program administrators and evaluators with valuable insights for
future program design with integrated EM&V techniques. The study sponsors will complete data collection in
March 2015 and will complete the final assessment of this approach in April 2015.
2015 International Energy Program Evaluation Conference
Review Sheet for Paper Proposals
Paper Number/Title: 450 – Empowering Residential and Small Commercial Customers with Energy Management Tools
Recommendation: Accept / Possibly Accept / Reject (Rejection reason required to advise author – see below)
Reviewer:
RELEVANCE
The topic of the abstract must match the focus of the Conference. The Conference serves its participants as a
forum for the presentation and discussion of important evaluation results, issues, methodologies,
implementation, techniques or applications as they apply to conference topics.
ORIGINALITY
The abstract should reflect original work done by the author and be something new to the field. We are about
measurement for impacts, not descriptions about worthy programs—How do we know we got the savings or
changes etc.
SIGNIFICANCE
The abstract must represent a significant contribution to the field.
Is it a new and original contribution?
Are the conclusions sound and justified?
Is it clearly presented and well organized?
Do you believe that results will be ready by the
time of the conference?
Comments for Committee Review and to Advise Authors:
CCC.
Reasons to accept or possibly accept: (Why you think this abstract should be presented and any major
problems that need resolution, etc.
DDD. Reasons to reject: (One or two sentences to explain why this proposal should not be accepted—e.g.,
outside of conference topics, results may/will not be ready before May, not new or of significant interest,
questionable methodology, not enough information to make a determination, etc.) Reasons will be emailed
to the author.
Comments for Author and Session Moderator:
(Suggestions for writing or focusing the paper, etc.)
TEAM 5
Paper ID Number: PPR-451
When Data Will Be Available: Will be available by March 2015
Has This Been Presented Before? No, it has not been and will not be presented prior to the conference
Agree to Publish: Yes I agree
Best Evaluation Area: Behavior change
Best Fit: Information/Education
Best Focus: Findings
Abstract Title: If You Build It, Will They Come? New Opportunities for Feedback Interventions and
Their Potential for Energy Savings from AMI
As we move into an age where real-time data can both support home energy automation and motivate energy
management choices, what can we offer program administrators in terms of program ‘redesign?’ What role can
new technologies, tools and systems play to engage and motivate customers? AMI data promises myriad
opportunities for program administrators to develop new programs, or enhance existing ones, that leverage an
array of behavioral interventions. However, many program administrators are left scratching their heads about
how to use energy usage information to encourage customers to save. In this paper, the authors will argue that
the advent of the smart grid and the ability for third parties to access customers’ AMI data have led to new
opportunities for targeting customer behavior using an array of behavioral theories in support of energy
management (gamification, feedback, competitions and rewards, amongst others) coupled with new
technologies, applications and software to engage the customer (Green Button Connect, disaggregation
software, smartphone applications, etc.). In this paper, the authors will present results from a behavioral
market characterization study conducted for the age of AMI. We will catalog current AMI-enabled utility and
third-party offerings in the United States, showcasing these offerings by behavioral intervention strategy as well
as potential energy savings. Our paper will highlight gaps and opportunities in the residential behavioral market,
and the potential savings from leveraging AMI data, by intervention strategy. This review will update and expand
existing research by documenting results from promising new programs, pilots and market-based initiatives.
Evaluators will leave this session with greater insights into optimizing behavioral intervention strategies to
realize the benefits of AMI data.
2015 International Energy Program Evaluation Conference
Review Sheet for Paper Proposals
Paper Number/Title: 451 – If You Build It, Will They Come? New Opportunities for Feedback Interventions and Their Potential for Energy Savings from AMI
Recommendation: Accept / Possibly Accept / Reject (Rejection reason required to advise author – see below)
Reviewer:
RELEVANCE
The topic of the abstract must match the focus of the Conference. The Conference serves its participants as a
forum for the presentation and discussion of important evaluation results, issues, methodologies,
implementation, techniques or applications as they apply to conference topics.
ORIGINALITY
The abstract should reflect original work done by the author and be something new to the field. We are about
measurement for impacts, not descriptions about worthy programs—How do we know we got the savings or
changes etc.
SIGNIFICANCE
The abstract must represent a significant contribution to the field.
Is it a new and original contribution?
Are the conclusions sound and justified?
Is it clearly presented and well organized?
Do you believe that results will be ready by the
time of the conference?
Comments for Committee Review and to Advise Authors:
EEE. Reasons to accept or possibly accept: (Why you think this abstract should be presented and any major problems that need resolution, etc.)
FFF. Reasons to reject: (One or two sentences to explain why this proposal should not be accepted—e.g., outside
of conference topics, results may/will not be ready before May, not new or of significant interest,
questionable methodology, not enough information to make a determination, etc.) Reasons will be emailed
to the author.
Comments for Author and Session Moderator:
(Suggestions for writing or focusing the paper, etc.)
TEAM 5
Paper ID Number: PPR-452
When Data Will Be Available: Will be available no later than April 15, 2015
Has This Been Presented Before? No, it has not been and will not be presented prior to the conference
Agree to Publish: Yes I agree
Best Evaluation Area: Impacts
Best Fit: Information/Education
Best Focus: Methods
Abstract Title: Already Decommissioned: Better Methods for Evaluating Data Center Projects
Data centers are complex facilities for evaluators. Not only do site privacy and security restrictions make it
challenging or impossible for evaluators to install metering equipment, but technologies that evolve faster than
program rules and regulations result in issues of rapidly evolving baseline characterization and even the
replacement of equipment before the evaluation cycle begins. During the recent evaluation for the State
Authority Industrial and Process Efficiency (IPE) program, the team examined nine data centers throughout New
York and encountered a host of roadblocks to accurately assessing the program’s impact. After completing the
evaluation, the team identified the following areas in data center sites where improved and standardized
methodology in the technical review and M&V phases would have a huge and positive impact in the evaluation
process:
• Quickly evolving technologies mean changing practices, equipment lifespans shorter than program evaluation cycles, and constant re-training for reviewers and evaluators.
• Site security measures and operational policies disrupted M&V activities by changing equipment or preventing the evaluators from accessing the equipment directly.
• Load growth and decline are hard to quantify without a physical product to use as a basis for production efficiency.
In light of these problem areas, this paper uses the team’s
experiences to propose possible methods, such as concurrent evaluation and processing power measurements,
that evaluators, program administrators, and technical reviewers can use to ensure consistent quality and
accuracy in the savings analysis for data center sites.
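The "processing power measurements" idea, treating computational throughput as the production unit, can be sketched as a simple normalization; the figures and names below are invented for illustration, not data from the nine evaluated sites.

```python
# Sketch: normalize data center energy by computational throughput so that
# load growth or decline does not mask (or inflate) an efficiency gain.
def energy_per_unit_compute(kwh, throughput_units):
    """kWh per unit of throughput (e.g., normalized CPU-hours)."""
    return kwh / throughput_units

# Hypothetical pre- and post-retrofit observations for one site.
baseline = energy_per_unit_compute(kwh=1_200_000, throughput_units=450_000)
post = energy_per_unit_compute(kwh=1_150_000, throughput_units=520_000)

# Savings normalized to post-period production.
savings_kwh = (baseline - post) * 520_000
print(f"{baseline:.3f} vs {post:.3f} kWh/unit; normalized savings {savings_kwh:,.0f} kWh")
```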
2015 International Energy Program Evaluation Conference
Review Sheet for Paper Proposals
Paper Number/Title: 452 – Already Decommissioned: Better Methods for Evaluating Data Center Projects
Recommendation: Accept / Possibly Accept / Reject (Rejection reason required to advise author – see below)
Reviewer:
RELEVANCE
The topic of the abstract must match the focus of the Conference. The Conference serves its participants as a
forum for the presentation and discussion of important evaluation results, issues, methodologies,
implementation, techniques or applications as they apply to conference topics.
ORIGINALITY
The abstract should reflect original work done by the author and be something new to the field. We are about
measurement for impacts, not descriptions about worthy programs—How do we know we got the savings or
changes etc.
SIGNIFICANCE
The abstract must represent a significant contribution to the field.
Is it a new and original contribution?
Are the conclusions sound and justified?
Is it clearly presented and well organized?
Do you believe that results will be ready by the
time of the conference?
Comments for Committee Review and to Advise Authors:
GGG. Reasons to accept or possibly accept: (Why you think this abstract should be presented and any major problems that need resolution, etc.)
HHH. Reasons to reject: (One or two sentences to explain why this proposal should not be accepted—e.g.,
outside of conference topics, results may/will not be ready before May, not new or of significant interest,
questionable methodology, not enough information to make a determination, etc.) Reasons will be emailed
to the author.
Comments for Author and Session Moderator:
(Suggestions for writing or focusing the paper, etc.)
TEAM 5
Paper ID Number: PPR-454
When Data Will Be Available: Will be available by March 2015
Has This Been Presented Before? No, it has not been and will not be presented prior to the conference
Agree to Publish: Yes I agree
Best Evaluation Area: Behavior change
Best Fit: Rates/Pricing
Best Focus: Findings
Abstract Title: Baby, It’s Cold Outside: 2014 Polar Vortex Impacts on Residential Dynamic Electricity
Pricing Programs
During the winter of 2014, the Polar Vortex created extreme and sustained cold weather that greatly impacted
the electricity markets in the Midwest. For residential participants in market-based pricing programs, the
extreme winter weather drove electricity prices to a record high of almost $2 per kilowatt-hour. Customers are
typically accustomed to shifting electricity away from peak hours during summer months, but what happens
when winter prices spike at unpredictable times and remain high throughout the cooler months? How does this
impact participant behavior and retention, and what can administrators do to protect participants from high
winter prices going forward? Based on data analysis, we will take a deep dive into the impacts of the 2014 Polar Vortex by analyzing household types, hourly interval usage data, billing data and high price alert settings to
determine what types of residential customers are able to reduce or shift load during high winter prices and
what impacts this could have on the need for system voltage reduction. This paper will also highlight the
response to high prices over a long duration, how the time of day of a high price impacts response, whether
customers respond more strongly to day-ahead or real-time hourly prices, how price elasticity in the winter months
compares with summer months and whether there is any overall conservation effect from sustained high winter
prices. As extreme weather events occur with greater frequency, it’s imperative to continuously evaluate and
innovate to deliver the best program offering to customers. The 2014 Polar Vortex greatly impacted participant
enrollment discussions and ongoing education on the ups and downs of market-based pricing, as well as tools to
support customers going forward. This paper will also focus on the administrative changes made to
accommodate future extreme weather events and dynamic pricing administration in the age of increasing
climate volatility.
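One simple way to frame the winter-versus-summer elasticity question is a log-log regression with a seasonal interaction; the sketch below assumes hypothetical column names and is not the authors' actual model.

```python
# Sketch: compare winter vs. summer price elasticity from hourly data.
# Hypothetical columns: timestamp, kwh, price_per_kwh.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("hourly_usage_prices.csv", parse_dates=["timestamp"])
df["ln_kwh"] = np.log(df["kwh"])
df["ln_price"] = np.log(df["price_per_kwh"])
df["winter"] = df["timestamp"].dt.month.isin([12, 1, 2]).astype(int)

# ln_price = summer elasticity; ln_price:winter = the winter difference.
model = smf.ols("ln_kwh ~ ln_price + winter + ln_price:winter", data=df).fit()
print(model.params[["ln_price", "ln_price:winter"]])
```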
2015 International Energy Program Evaluation Conference
Review Sheet for Paper Proposals
Paper Number/Title: 454 – Baby, It’s Cold Outside: 2014 Polar Vortex Impacts on Residential Dynamic Electricity Pricing Programs
Recommendation: Accept / Possibly Accept / Reject (Rejection reason required to advise author – see below)
Reviewer:
RELEVANCE
The topic of the abstract must match the focus of the Conference. The Conference serves its participants as a
forum for the presentation and discussion of important evaluation results, issues, methodologies,
implementation, techniques or applications as they apply to conference topics.
ORIGINALITY
The abstract should reflect original work done by the author and be something new to the field. We are about
measurement for impacts, not descriptions about worthy programs—How do we know we got the savings or
changes etc.
SIGNIFICANCE
The abstract must represent a significant contribution to the field.
Is it a new and original contribution?
Are the conclusions sound and justified?
Is it clearly presented and well organized?
Do you believe that results will be ready by the
time of the conference?
Comments for Committee Review and to Advise Authors:
III. Reasons to accept or possibly accept: (Why you think this abstract should be presented and any major problems that need resolution, etc.)
JJJ. Reasons to reject: (One or two sentences to explain why this proposal should not be accepted—e.g., outside
of conference topics, results may/will not be ready before May, not new or of significant interest,
questionable methodology, not enough information to make a determination, etc.) Reasons will be emailed
to the author.
Comments for Author and Session Moderator:
(Suggestions for writing or focusing the paper, etc.)
TEAM 5
Paper ID Number: PPR-461
When Data Will Be Available: Is available now
Has This Been Presented Before? No, it has not been and will not be presented prior to the conference
Agree to Publish: Yes I agree
Best Evaluation Area: Behavior change
Best Fit: Information/Education
Best Focus: Findings
Abstract Title: Been There, Done That: What’s New in Behavioral Program Evaluation?
In the past few years, program administrators, implementers and evaluators have learned a lot about what
behavioral programs can achieve. However, we have to wonder if we are asking the right questions: we have calculated savings, but not what drives those savings; we have calculated savings on an annual basis, but not how long savings will persist; and we have calculated energy savings, but not reductions in demand. This
paper will answer these questions to help facilitate further savings that can be achieved through behavioral
programs. Aggressive savings goals, coupled with a decline in traditional widget-based programs, have
accelerated the number of behavioral interventions and programs offered by program administrators. While
some jurisdictions are just starting to develop behavioral programs with these new interventions, in others they
are maturing. For certain behavioral interventions, such as paper-based reports, there is a body of research on
savings achieved during the first few years of the program. For other interventions, such as those leveraging
Advanced Metering Infrastructure (AMI) data, more research is beginning to surface about how AMI data can
predict when savings occur during the course of the program. As such, with the evolution of these interventions,
the evaluation industry has also adopted new methodologies to estimate impacts and optimize delivery of these
programs. The authors will discuss how we can use various evaluation approaches to enhance our
understanding of the emerging questions associated with behavioral programs. The paper will also address potential
changes in program design to optimize enduring, cost-effective, customer-driven savings. Specifically, using a
mix of our own evaluations, as well as other research, we will discuss how survey instrument design and survey
data analysis can help determine what is driving savings (i.e., actions participants report taking as compared to
control groups, the proportion of actions that are reported to be equipment-based versus behavior-based, etc.).
We will also discuss how evaluators can use real-time AMI data to understand
consumption patterns and load shifting that can be attributable to the behavioral intervention and help
determine when (based on time of year/day) the interventions are most effective. Further, we will review
recent results regarding persistence in savings from treatment reduction experiments. These findings are not
meant to be the definitive answer for a particular type of program, but rather, to illustrate the various
approaches available to evaluators.
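As one example of using interval data to see when an intervention is effective, a treatment-versus-control load profile by hour of day can be computed as below; the columns and group labels are hypothetical, not the authors' actual method.

```python
# Sketch: average AMI load by hour of day, treatment vs. control, to show
# when savings occur. Hypothetical columns: timestamp, group, kwh.
import pandas as pd

ami = pd.read_csv("ami_intervals.csv", parse_dates=["timestamp"])
ami["hour"] = ami["timestamp"].dt.hour

profile = ami.groupby(["hour", "group"])["kwh"].mean().unstack("group")
profile["impact_kwh"] = profile["control"] - profile["treatment"]
print(profile["impact_kwh"])  # positive values indicate savings in that hour
```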
2015 International Energy Program Evaluation Conference
Review Sheet for Paper Proposals
Paper Number/Title: 461 – Been There, Done That: What’s New in Behavioral Program Evaluation?
Recommendation: Accept / Possibly Accept / Reject (Rejection reason required to advise author – see below)
Reviewer:
RELEVANCE
The topic of the abstract must match the focus of the Conference. The Conference serves its participants as a
forum for the presentation and discussion of important evaluation results, issues, methodologies,
implementation, techniques or applications as they apply to conference topics.
ORIGINALITY
The abstract should reflect original work done by the author and be something new to the field. We are about
measurement for impacts, not descriptions about worthy programs—How do we know we got the savings or
changes etc.
SIGNIFICANCE
The abstract must represent a significant contribution to the field.
Is it a new and original contribution?
Are the conclusions sound and justified?
Is it clearly presented and well organized?
Do you believe that results will be ready by the
time of the conference?
Comments for Committee Review and to Advise Authors:
KKK. Reasons to accept or possibly accept: (Why you think this abstract should be presented and any major problems that need resolution, etc.)
LLL. Reasons to reject: (One or two sentences to explain why this proposal should not be accepted—e.g., outside
of conference topics, results may/will not be ready before May, not new or of significant interest,
questionable methodology, not enough information to make a determination, etc.) Reasons will be emailed
to the author.
Comments for Author and Session Moderator:
(Suggestions for writing or focusing the paper, etc.)
TEAM 5
Paper ID Number: PPR-475
When Data Will Be Available: Will be available by March 2015
Has This Been Presented Before? No, it has not been and will not be presented prior to the conference
Agree to Publish: Yes I agree
Best Evaluation Area: Impacts
Best Fit: Information/Education
Best Focus: Methods
Abstract Title: Match for Match: How Good Are Statistically Created Control Groups Versus A True
Control?
How good are current propensity score matching methodologies in creating appropriate matched control groups
for impact evaluations? The goal of statistical methodologies is to mimic the effectiveness of a classical
randomized experiment in teasing out causality. However, in reality a truly randomized treatment and control
group is not possible for many reasons, including funding and ethics. An alternative approach is to create a
matched control group that serves as a stand-in for a true control group when one could not have been feasibly
created. In order for a matched control group to be truly effective as a control condition, it must look, sound,
and quack like the treatment group. We benchmark a number of matching techniques, including the most widely used methods in the field, against a truly randomized control group from a 2012-2013 behavioral field
experiment.
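A minimal sketch of the benchmarking exercise follows, assuming the data contain a treatment group, the true randomized control group, and a nonparticipant pool from which matched controls are drawn; the covariates, file name, and 1:1 nearest-neighbor choice are illustrative assumptions.

```python
# Sketch: benchmark a propensity-score-matched control group against the
# true randomized control. Hypothetical data with group in
# {"treatment", "rct_control", "nonparticipant"} and pre/post usage columns.
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import NearestNeighbors

df = pd.read_csv("field_experiment.csv")
covars = ["pre_kwh_mean", "pre_kwh_summer", "pre_kwh_winter"]

treat = df[df["group"] == "treatment"]
rct_ctrl = df[df["group"] == "rct_control"]
pool = df[df["group"] == "nonparticipant"]

# Propensity of treatment vs. nonparticipant, from pre-period usage only.
fit_df = pd.concat([treat, pool])
lr = LogisticRegression(max_iter=1000).fit(
    fit_df[covars], (fit_df["group"] == "treatment").astype(int)
)
treat_ps = lr.predict_proba(treat[covars])[:, 1].reshape(-1, 1)
pool_ps = lr.predict_proba(pool[covars])[:, 1].reshape(-1, 1)

# 1:1 nearest-neighbor match on the propensity score.
_, idx = NearestNeighbors(n_neighbors=1).fit(pool_ps).kneighbors(treat_ps)
matched = pool.iloc[idx.ravel()]

psm_effect = treat["post_kwh"].mean() - matched["post_kwh"].mean()
rct_effect = treat["post_kwh"].mean() - rct_ctrl["post_kwh"].mean()
print(f"matched-control estimate: {psm_effect:.1f} kWh; RCT benchmark: {rct_effect:.1f} kWh")
```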
2015 International Energy Program Evaluation Conference
Review Sheet for Paper Proposals
Paper Number/Title: 475 – Match for Match: How Good Are Statistically Created Control Groups Versus A True Control?
Recommendation: Accept / Possibly Accept / Reject (Rejection reason required to advise author – see below)
Reviewer:
RELEVANCE
The topic of the abstract must match the focus of the Conference. The Conference serves its participants as a
forum for the presentation and discussion of important evaluation results, issues, methodologies,
implementation, techniques or applications as they apply to conference topics.
ORIGINALITY
The abstract should reflect original work done by the author and be something new to the field. We are about
measurement for impacts, not descriptions about worthy programs—How do we know we got the savings or
changes etc.
SIGNIFICANCE
The abstract must represent a significant contribution to the field.
Is it a new and original contribution?
Are the conclusions sound and justified?
Is it clearly presented and well organized?
Do you believe that results will be ready by the
time of the conference?
Comments for Committee Review and to Advise Authors:
MMM. Reasons to accept or possibly accept: (Why you think this abstract should be presented and any major problems that need resolution, etc.)
NNN. Reasons to reject: (One or two sentences to explain why this proposal should not be accepted—e.g.,
outside of conference topics, results may/will not be ready before May, not new or of significant interest,
questionable methodology, not enough information to make a determination, etc.) Reasons will be emailed
to the author.
Comments for Author and Session Moderator:
(Suggestions for writing or focusing the paper, etc.)
TEAM 5
Paper ID Number: PPR-477
When Data Will Be Available: Is available now
Has This Been Presented Before? No, it has not been and will not be presented prior to the conference
Agree to Publish: Yes I agree
Best Evaluation Area: Impacts
Best Fit: Information/Education
Best Focus: Findings
Abstract Title: The Everlasting Low-Hanging Fruit: School Kit Programs
Programs that provide students and teachers with energy-efficiency kits and curricula are the perpetual low-hanging fruit of energy-efficiency programs. With a constant stream of new participants every semester and a
loyal base of teachers, school kit programs yield consistent and cost-effective savings. The programs not only
provide access to hard-to-reach and untapped demographics, but also result in spillover and behavioral changes
(reported by parents) at a level unmatched across program types. Free-ridership is also generally low, and
satisfaction among teachers, students, and parents is extraordinarily high. Using data from 70,000 students in
Indiana and results from multiple school kit programs across the United States, we will make the case for
emphasizing and expanding these types of programs in residential portfolios to achieve greater savings.
2015 International Energy Program Evaluation Conference
Review Sheet for Paper Proposals
Paper Number/Title: 477 – The Everlasting Low-Hanging Fruit: School Kit Programs
Recommendation: Accept / Possibly Accept / Reject (Rejection reason required to advise author – see below)
Reviewer:
RELEVANCE
The topic of the abstract must match the focus of the Conference. The Conference serves its participants as a
forum for the presentation and discussion of important evaluation results, issues, methodologies,
implementation, techniques or applications as they apply to conference topics.
ORIGINALITY
The abstract should reflect original work done by the author and be something new to the field. We are about
measurement for impacts, not descriptions about worthy programs—How do we know we got the savings or
changes etc.
SIGNIFICANCE
The abstract must represent a significant contribution to the field.
Is it a new and original contribution?
Are the conclusions sound and justified?
Is it clearly presented and well organized?
Do you believe that results will be ready by the
time of the conference?
Comments for Committee Review and to Advise Authors:
OOO. Reasons to accept or possibly accept: (Why you think this abstract should be presented and any major problems that need resolution, etc.)
PPP. Reasons to reject: (One or two sentences to explain why this proposal should not be accepted—e.g.,
outside of conference topics, results may/will not be ready before May, not new or of significant interest,
questionable methodology, not enough information to make a determination, etc.) Reasons will be emailed
to the author.
Comments for Author and Session Moderator:
(Suggestions for writing or focusing the paper, etc.)
TEAM 5
Paper ID Number: PPR-478
When Data Will Be Available: Is available now
Has This Been Presented Before? No, it has not been and will not be presented prior to the conference
Agree to Publish: Yes I agree
Best Evaluation Area: Impacts
Best Fit: Emerging Technologies
Best Focus: Methods
Abstract Title: Cage Match or Happy Couple? Engineering Simulation Models and Billing Analysis
In the Northwest, the Regional Technical Forum (RTF) estimates savings for measures with a heavy emphasis on
calibrated engineering simulation models. The simulation models allow the RTF to vary savings values by
climate zone, building construction, and measure specification, accounting for much of the variability in savings
across program participants. However, the estimated savings must be calibrated to reality. Billing analysis can
provide that reality but may not be able to resolve savings with sufficient accuracy or detail to drive changes in
savings at the level desired by the RTF. Historically, billing analysis and engineering models were used in
relative isolation in the Northwest. Building on others’ research (e.g., Crossman, IEPEC 2013), the Northwest has
taken its own approach to leveraging these two methods to accurately estimate savings. Recent examples
include ductless heat pumps, conventional air source heat pumps, duct sealing and weatherization. For these
measures, the paper will discuss the methodologies that were used to estimate savings and the ways in which
billing analysis and engineering simulation models were used together. The paper will discuss:
1) Surprising findings: In the Northwest, because we used the two methods, we were able to accurately estimate the reduction in wood burning associated with ductless heat pump installation and, to some extent, weatherization measures.
2) Baselines are crucial: Traditional billing analysis is effective when baselines are pre-conditions (retrofit). Yet, leveraging billing analysis and building simulation models allows for estimation of savings for measures where the baseline cannot be directly observed (e.g., new construction or market conditions/replace on burnout).
3) Model calibrations are key: Billing analysis results provide an opportunity to calibrate the overall results of simulation models, so that the models accurately estimate evaluated savings of a program. However, the fixed-effects billing analysis that is typically done develops results at the aggregated program level, so it is nearly impossible to use results from these pooled regression models to drive the calibration of measure-specific simulation models.
4) Site-specific billing data works: We’ve found success by using site-specific billing analysis linked to specific buildings. This allows segmentation using survey results or other secondary data (e.g., audit data) to refine calibrations or savings estimates.
5) Success by fuel type: The Northwest is focused on electricity savings; many successful examples of leveraging billing analysis and engineering modeling together are based on gas homes. This paper will provide insights into when this works for electric homes.
This paper will describe the strengths, weaknesses and best uses of engineering
simulation models and billing analysis for estimating electricity savings from residential HVAC and
weatherization measures. It will describe when they fight and when they’re a harmonious couple.
2015 International Energy Program Evaluation Conference
Review Sheet for Paper Proposals
Paper Number/Title: 478 – Cage Match or Happy Couple? Engineering Simulation Models and Billing Analysis
Recommendation: Accept / Possibly Accept / Reject (Rejection reason required to advise author – see below)
Reviewer:
RELEVANCE
The topic of the abstract must match the focus of the Conference. The Conference serves its participants as a
forum for the presentation and discussion of important evaluation results, issues, methodologies,
implementation, techniques or applications as they apply to conference topics.
ORIGINALITY
The abstract should reflect original work done by the author and be something new to the field. We are about
measurement for impacts, not descriptions about worthy programs—How do we know we got the savings or
changes etc.
SIGNIFICANCE
The abstract must represent a significant contribution to the field.
Is it a new and original contribution?
Are the conclusions sound and justified?
Is it clearly presented and well organized?
Do you believe that results will be ready by the
time of the conference?
Comments for Committee Review and to Advise Authors:
QQQ. Reasons to accept or possibly accept: (Why you think this abstract should be presented and any major problems that need resolution, etc.)
RRR. Reasons to reject: (One or two sentences to explain why this proposal should not be accepted—e.g.,
outside of conference topics, results may/will not be ready before May, not new or of significant interest,
questionable methodology, not enough information to make a determination, etc.) Reasons will be emailed
to the author.
Comments for Author and Session Moderator:
(Suggestions for writing or focusing the paper, etc.)
TEAM 5
Paper ID Number: PPR-488
When Data Will Be Available: Is available now
Has This Been Presented Before? No, it has not been and will not be presented prior to the conference
Agree to Publish: Yes I agree
Best Evaluation Area: Behavior change
Best Fit: Market Transformation
Best Focus: Methods
Abstract Title: Exploring Deep Savings: A ToolKit for Assessing Behavior-Based Energy Interventions
Behavior-based energy interventions (i.e., those targeting savings through changes in consumer energy use) are based
on the idea that people can be encouraged to use less energy if the underlying determinants of behavior change
in some way. Research on such programs suggests great potential for energy savings, but results vary and much
is still unknown about the specific variables that impact program effectiveness. This is due in part to the way
these programs are typically evaluated. As more and more utilities and regulatory agencies focus their
attention on behavior-based energy efficiency programs, there is an urgency to ensure that evaluations of such
programs are done in as rigorous a manner as possible. While the metrics used to measure whether these
various programs work are fairly standard and easy to compare between studies, the variables and metrics used to measure how and for whom they work have been left to individual researchers, with little attempt at creating a replicable model. Such standardization is common in related fields such as education and psychology, but has yet to take hold in energy program evaluation. The current paper introduces a toolkit for use in the
evaluation of behavior-based energy interventions, including but not limited to eco-feedback, home audits,
information and rebate programs, and social games. It contains individual instruments to measure: (1) context
(demographics, housing), (2) user experience (ease of use, engagement), (3) knowledge (energy literacy), (4)
attitudes (efficacy, norms), and (5) behavior (one-time, habitual). Each instrument in the toolkit can be completed via computer, paper or phone in 5-10 minutes, and the instruments are modular; they can be used as a battery or individually based on program and evaluation needs. Instruments for each variable are described and
preliminary psychometric validation and suggestions for implementation are presented. Designed to
complement rather than replace traditional measures of program effectiveness, the use of such a toolkit across
multiple behavior-based energy efficiency programs can yield useful insights into effective program design. A
standardized toolkit allows evaluators to cost-effectively and rapidly compare the relative effectiveness of
different behavioral intervention options, and identify possible interactive effects that may make one
intervention more effective for one customer segment and a different intervention more effective for another
segment. Included measures delve into different dimensions of customer characteristics that may underlie
customer receptiveness to different behavioral interventions, allowing utilities to make systematic and
intentional improvements in program design. Widespread use of standardized tools to measure behavior
intervention effectiveness across customer populations can help aggregate our overall knowledge across studies
and contribute to a more robust understanding of the reliability of energy efficiency as a resource. Such an
understanding is necessary to decrease the variability we currently see in savings from behavioral programs, so
that utilities can begin to have greater confidence in consistent and reliable EM&V findings. It is only then that
behavior-based energy interventions can begin to be valued as much as supply-side sources of energy.
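As one example of the "preliminary psychometric validation" mentioned above, internal consistency of a multi-item instrument is often checked with Cronbach's alpha; the sketch below uses simulated responses, not toolkit data.

```python
# Sketch: Cronbach's alpha for a multi-item scale (rows = respondents,
# columns = items). Responses here are simulated for illustration.
import numpy as np

def cronbach_alpha(items):
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_vars / total_var)

rng = np.random.default_rng(0)
latent = rng.normal(size=(200, 1))                         # shared attitude
responses = latent + rng.normal(scale=0.8, size=(200, 5))  # 5 correlated items
print(f"alpha = {cronbach_alpha(responses):.2f}")
```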
2015 International Energy Program Evaluation Conference
Review Sheet for Paper Proposals
Paper Number/Title: 488 – Exploring Deep Savings: A ToolKit for Assessing Behavior-Based Energy Interventions
Recommendation: Accept / Possibly Accept / Reject (Rejection reason required to advise author – see below)
Reviewer:
RELEVANCE
The topic of the abstract must match the focus of the Conference. The Conference serves its participants as a
forum for the presentation and discussion of important evaluation results, issues, methodologies,
implementation, techniques or applications as they apply to conference topics.
ORIGINALITY
The abstract should reflect original work done by the author and be something new to the field. We are about
measurement for impacts, not descriptions about worthy programs—How do we know we got the savings or
changes etc.
SIGNIFICANCE
The abstract must represent a significant contribution to the field.
Is it a new and original contribution?
Are the conclusions sound and justified?
Is it clearly presented and well organized?
Do you believe that results will be ready by the
time of the conference?
Comments for Committee Review and to Advise Authors:
SSS. Reasons to accept or possibly accept: (Why you think this abstract should be presented and any major problems that need resolution, etc.)
TTT. Reasons to reject: (One or two sentences to explain why this proposal should not be accepted—e.g., outside
of conference topics, results may/will not be ready before May, not new or of significant interest,
questionable methodology, not enough information to make a determination, etc.) Reasons will be emailed
to the author.
Comments for Author and Session Moderator:
(Suggestions for writing or focusing the paper, etc.)
TEAM 5
Paper ID Number: PPR-507
When Data Will Be Available: Is available now
Has This Been Presented Before? No, it has not been and will not be presented prior to the conference
Agree to Publish: Yes I agree
Best Evaluation Area: Impacts
Best Fit: Information/Education
Best Focus: Methods
Abstract Title: Tangled up in Green: Using System Dynamic Recovery Methods to Isolate SEM
Savings
Energy efficiency program sponsors in many jurisdictions have launched efforts to capture energy savings arising
from strategic energy management (SEM) programs, which aim to improve operating and management
procedures in industrial facilities. Through a combination of training, technical support, and development of
peer networks, these programs encourage the owners and operators of industrial plants to adopt continuous
improvement techniques in energy management. SEM programs focus on changing operational practices, but
may also encourage participants to undertake retrofit and capital improvements to energy systems supported
by other programs. One of the principal challenges in evaluating SEM-type programs is to disaggregate energy
savings attributable to behavioral and management changes directly supported by the program from savings
associated with installation of retrofit and capital measures. This task is further complicated by changes over time in production volumes or types of products processed in the facility, which have a large effect
on observed energy consumption. Therefore, methods to account for changes in production activity over time
and for capital improvements are critical to assessing savings attributable to SEM programs. This paper shows
how to use System Dynamic Recovery methods to separate the effects of more efficient operations from
equipment changes. System Dynamic Recovery allows evaluators to model how plant production, environmental effects, and the ways in which efficiency measures interact (or do not interact) with production combine to determine energy savings. The method accounts for seasonal variation in product type and raw
material inputs and allows evaluators to simultaneously estimate realization rates for energy measures and
capture savings due to behavioral changes. The paper presents the results of using this method for the
evaluation of SEM at food processing plants for a regional alliance.
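System Dynamic Recovery itself is not specified here, but the disaggregation problem it addresses can be illustrated with a simplified production-normalized regression; all column names are hypothetical, and the actual method is considerably richer than this sketch.

```python
# Simplified illustration of the disaggregation problem: regress plant
# energy on production and weather with indicators for capital measures
# and the SEM period. Hypothetical daily data.
import pandas as pd
import statsmodels.formula.api as smf

plant = pd.read_csv("plant_daily.csv")  # kwh, production_tons, hdd, flags

model = smf.ols(
    "kwh ~ production_tons + hdd + capital_measure + sem_period",
    data=plant,
).fit()

# capital_measure coefficient: savings tied to installed equipment;
# sem_period coefficient: remaining shift, read as operational/behavioral.
print(model.params[["capital_measure", "sem_period"]])
```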
2015 International Energy Program Evaluation Conference
Review Sheet for Paper Proposals
Paper Number/Title: 507 – Tangled up in Green: Using System Dynamic Recovery Methods to Isolate SEM Savings
Recommendation: Accept / Possibly Accept / Reject (Rejection reason required to advise author – see below)
Reviewer:
RELEVANCE
The topic of the abstract must match the focus of the Conference. The Conference serves its participants as a
forum for the presentation and discussion of important evaluation results, issues, methodologies,
implementation, techniques or applications as they apply to conference topics.
ORIGINALITY
The abstract should reflect original work done by the author and be something new to the field. We are about
measurement for impacts, not descriptions about worthy programs—How do we know we got the savings or
changes etc.
SIGNIFICANCE
The abstract must represent a significant contribution to the field.
Is it a new and original contribution?
Are the conclusions sound and justified?
Is it clearly presented and well organized?
Do you believe that results will be ready by the
time of the conference?
Comments for Committee Review and to Advise Authors:
UUU. Reasons to accept or possibly accept: (Why you think this abstract should be presented and any major problems that need resolution, etc.)
VVV. Reasons to reject: (One or two sentences to explain why this proposal should not be accepted—e.g.,
outside of conference topics, results may/will not be ready before May, not new or of significant interest,
questionable methodology, not enough information to make a determination, etc.) Reasons will be emailed
to the author.
Comments for Author and Session Moderator:
(Suggestions for writing or focusing the paper, etc.)
TEAM 5
Paper ID Number: PPR-513
When Data Will Be Available: Is available now
Has This Been Presented Before? No, it has not been and will not be presented prior to the conference
Agree to Publish: Yes I agree
Best Evaluation Area: Behavior change
Best Fit: Demand Response
Best Focus: Methods
Abstract Title: Measuring Behavior Change at Scale from Behavioral Demand Response
Behavioral Demand Response (BDR) is designed to motivate consumers to reduce their energy consumption on
the hottest days of the summer, when consumer demand for and utility prices of energy are highest. BDR
provides personalized, low cost recommendations for saving energy on peak days, helping to create a more
reliable electric grid and reducing the need for utilities to use additional power from costly and polluting power
plants. The program relies purely on changes in behavior, without the use of hardware. But how can we
determine whether the program actually changed behavior? Measuring the direct impact of a product on
human behavior is difficult, as fluctuations in energy consumption could be attributed to a variety of factors.
However, our company’s measurement and verification (M&V) model allows us to accurately isolate changes
in behavior attributed to the program. Our strategy utilizes a randomized controlled trial (RCT) to implement
the program. The M&V model follows an experimental blueprint, is endorsed by ACEEE and DOE, follows NAPEE
guidelines, and is used in PUC filings in dozens of states. To increase precision and control for any remaining in-sample differences between treatment and control, load reduction is estimated via regression adjustment, which
controls for recent hourly customer usage, hourly usage in the same month last year (when available), and
average seasonal usage. Pre-specified regression adjustment of experimental data is recommended practice by
the DOE. Using this method, our company determined that during the summer of 2014, its BDR solution
reduced peak load by up to 5.04%, and did so during an LA heat wave. Results were robust to estimation with
and without regression adjustment due to high covariate balance between treatment and control. The
implications of the successful M&V of our program are significant. As utilities increasingly roll out programs like
demand response to engage their customers and cut back on energy consumption, accurately measuring the
energy savings, monetary savings, and customer satisfaction driven by these programs is crucial. It also allows
utilities to project what their savings could be if the program were expanded to more households. This paper will describe how our company accurately isolates energy savings attributed to its behavioral demand response program and will take a deep dive into our company’s M&V strategy.
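In the spirit of the pre-specified regression adjustment described above, a minimal sketch follows; the covariate names, input file, and clustering choice are assumptions, not the company's actual specification.

```python
# Sketch: regression-adjusted RCT estimate of event-hour load impact.
# Hypothetical columns: household_id, kwh, treatment, and pre-period
# usage covariates.
import pandas as pd
import statsmodels.formula.api as smf

events = pd.read_csv("event_hours.csv")

# The treatment coefficient is the adjusted load impact; the covariates
# absorb residual in-sample differences and improve precision.
adj = smf.ols(
    "kwh ~ treatment + same_hour_last_week + same_month_last_year + seasonal_avg_kwh",
    data=events,
).fit(cov_type="cluster", cov_kwds={"groups": events["household_id"]})
print(adj.params["treatment"])
```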
2015 International Energy Program Evaluation Conference
Review Sheet for Paper Proposals
Paper Number/Title: 513 – Measuring Behavior Change at Scale from Behavioral Demand Response
Recommendation: Accept / Possibly Accept / Reject (Rejection reason required to advise author – see below)
Reviewer:
RELEVANCE
The topic of the abstract must match the focus of the Conference. The Conference serves its participants as a
forum for the presentation and discussion of important evaluation results, issues, methodologies,
implementation, techniques or applications as they apply to conference topics.
ORIGINALITY
The abstract should reflect original work done by the author and be something new to the field. We are about
measurement for impacts, not descriptions about worthy programs—How do we know we got the savings or
changes etc.
SIGNIFICANCE
The abstract must represent a significant contribution to the field.
Is it a new and original contribution?
Are the conclusions sound and justified?
Is it clearly presented and well organized?
Do you believe that results will be ready by the
time of the conference?
Comments for Committee Review and to Advise Authors:
WWW. Reasons to accept or possibly accept: (Why you think this abstract should be presented and any major
problems that need resolution, etc.)
XXX.
Reasons to reject: (One or two sentences to explain why this proposal should not be accepted—e.g.,
outside of conference topics, results may/will not be ready before May, not new or of significant interest,
questionable methodology, not enough information to make a determination, etc.) Reasons will be emailed
to the author.
Comments for Author and Session Moderator:
(Suggestions for writing or focusing the paper, etc.)
TEAM 5
Paper ID Number: PPR-520
When Data Will Be Available: Will be available by March 2015
Has This Been Presented Before? No, it has not been and will not be presented prior to the conference
Agree to Publish: Yes I agree
Best Evaluation Area: Impacts
Best Fit: Information/Education
Best Focus: Methods
Abstract Title: Opt-Out Behavioral Programs Using Randomized Control Trials: Best Practices for
Design and Savings Estimation
Experimental design using random assignment to treatment and control conditions, commonly referred to as
randomized control trials (RCTs), has been regarded as the “gold standard” for testing the efficacy of interventions in many fields for decades. In the past few years, RCTs have been deployed in energy efficiency with
comparative energy usage programs, commonly known as home energy reports programs. The Sacramento
Municipal Utility District was the first utility to deploy home energy reports in California on a large-scale basis, starting in
2008. RCTs became the methodology of choice to measure opt-out behavioral programs for California’s
investor-owned utilities (IOUs) when Senate Bill (SB) 488, signed into law in October 2009, mandated the
California Public Utilities Commission (CPUC) to evaluate comparative usage programs using experimental
design and to determine the energy savings potential possible “through expansion of comparative energy usage
disclosure programs.” A CPUC decision in April 2010 mandated that “savings from behavior-based energy
efficiency programs, defined as comparative energy use reporting contemplated in SB 488, shall be eligible for
counting, if evaluated consistent with experimental design methods contained within the California Evaluation
Protocols” (Ordering Paragraph 13, Decision 10-04-029). Since then the home energy reports program design
has been deployed by dozens of utilities, evaluated by many firms, and been the subject of peer-reviewed
articles. Some have asserted that the program model has been over-studied. A recent solicitation for proposals
on “social, cultural, and behavioral aspects of achieving energy efficiency potential” from the California Energy
Commission excludes “social comparison efforts as popularized by Opower.” Other industry voices are actively
calling for innovation to move beyond the home energy report design. In spite of the popularity of the opt-out
behavioral program design exemplified by home energy reports, there is no published document that provides program administrators with best practices for managing these types of interventions. While there are published protocols that offer guidelines for selecting methodologies and running RCTs, their broad scope makes
them less useful for the practitioner charged with launching or managing home energy reports or similar types
of large-scale, opt-out program interventions. This research provides program administrators with a concise set
of best practices for the design, savings estimation and reporting of savings from opt-out behavioral programs
using RCT. The report draws from key learnings based on the observation, design, and management of over a
dozen experiments conducted of home energy reports over the past four years. Recommendations are provided
to assist planners to:
• Estimate minimum desired sample sizes for treatment and control groups based on calculations of effect sizes observed in similar experiments (see the sketch after this list).
• Identify optimal sample frames and guidelines for sample stratification.
• Estimate both electric and gas energy savings, including analytical techniques to isolate savings claimed by other utility measures and avoid double-counting of savings.
• Estimate peak megawatt load reduction (demand (kW) savings) using interval data when available, and proxy techniques when not.
• Apply best practices for reporting of energy and demand savings.
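As a concrete illustration of the first recommendation, the sketch below sizes treatment and control groups with a standard power calculation. The 1.5% savings rate and 20% usage coefficient of variation are invented example values, not figures from the report.

# Hedged sketch of the minimum-sample-size step referenced in the first
# bullet above; the effect-size inputs are assumed, not the report's values.
from statsmodels.stats.power import TTestIndPower

mean_kwh = 900.0        # assumed average monthly usage
savings_rate = 0.015    # assumed 1.5% savings, from similar experiments
usage_cv = 0.20         # assumed coefficient of variation of usage

cohens_d = (mean_kwh * savings_rate) / (mean_kwh * usage_cv)

n_per_group = TTestIndPower().solve_power(
    effect_size=cohens_d, alpha=0.05, power=0.80, ratio=1.0
)
print(round(n_per_group))  # minimum households in each of treatment/control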
2015 International Energy Program Evaluation Conference
Review Sheet for Paper Proposals
Paper
Number/Title:
Recommendation:
520
Opt-Out Behavioral Programs Using Randomized Control Trials: Best
Practices for Design and Savings Estimation
Accept
Possibly Accept
Reject (Rejection reason required to advise author – see below)
Reviewer:
RELEVANCE
The topic of the abstract must match the focus of the Conference. The Conference serves its participants as a
forum for the presentation and discussion of important evaluation results, issues, methodologies,
implementation, techniques or applications as they apply to conference topics.
ORIGINALITY
The abstract should reflect original work done by the author and be something new to the field. We are about
measurement for impacts, not descriptions about worthy programs—How do we know we got the savings or
changes etc.
SIGNIFICANCE
The abstract must represent a significant contribution to the field.
Is it a new and original contribution?
Are the conclusions sound and justified?
Is it clearly presented and well organized?
Do you believe that results will be ready by the
time of the conference?
Comments for Committee Review and to Advise Authors:
YYY.
Reasons to accept or possibly accept: (Why you think this abstract should be presented and any major
problems that need resolution, etc.)
ZZZ. Reasons to reject: (One or two sentences to explain why this proposal should not be accepted—e.g., outside
of conference topics, results may/will not be ready before May, not new or of significant interest,
questionable methodology, not enough information to make a determination, etc.) Reasons will be emailed
to the author.
Comments for Author and Session Moderator:
(Suggestions for writing or focusing the paper, etc.)
TEAM 5
Paper ID Number: PPR-527
When Data Will Be Available: Will be available no later than April 15, 2015
Has This Been Presented Before? No, it has not been and will not be presented prior to the conference
Agree to Publish: Yes I agree
Best Evaluation Area: Impacts
Best Fit: Information/Education
Best Focus: Methods
Abstract Title: Improvements in SEM Program Evaluation Methods: Lessons Learned from Several
Recent Projects
The first Strategic Energy Management (SEM) program was launched about 10 years ago by the Northwest
Energy Efficiency Alliance. Since then, these early programs have evolved and more program administrators
began offering variants of the program. Many new SEM programs have started, though only a handful have
reported or verified energy savings due to the challenges involved in quantifying savings. Other program
administrators considering SEM are hesitant to implement programs without more evidence that savings are
verifiable and sustainable. This paper presents evaluation challenges and suggested solutions from the authors’
experience evaluating seven SEM programs. Typically, savings are determined for all participants using a whole-facility approach: a regression analysis estimates total facility savings, and SEM savings are then calculated by subtracting capital-measure savings from that total. Evaluation activities generally begin
with reviewing a description of each participating facility and drivers of energy use, details of implemented
activities, pre- and post-program participation billing data, and information about other factors that drive
energy use. The implementer’s energy savings estimates and methodology are also reviewed, if available. This
paper will focus on two current evaluation challenges: (1) sampling heterogeneous medium-sized populations,
and (2) improving the regression analysis using site visit and interview data. The number of participants in SEM
programs has been increasing and it is no longer cost-effective to evaluate a census of participants. However,
current program participation is still small enough that the realization rate from sampled sites is highly
dependent upon which sites are in the sample. The paper will discuss the results of a simulation which tested
the impact that sampling would have on overall program results, and the accuracy of the resulting savings
estimate. The paper will also the way in which evaluations are exploring whether additional data collected
during site visits and interviews with facility staff could improve the regression analysis. Previous evaluations
have conducted site visits to verify capital measure savings, but site visits have not been conducted in the past
with the intent of improving and adding context to the regression model used to estimate savings. The goals for
the site visits are to (1) refine the capital measure savings values and thereby calculate more accurate SEM
savings, (2) collect data that may improve the regression analysis, and (3) conduct sub-metering of some
systems or processes which may improve the regression analysis. The paper will discuss which data collected
during site visits were most valuable in improving the regression models, and whether the improvement in the
estimates justified the increased evaluation costs.
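A minimal sketch of the whole-facility calculation described above, assuming monthly billing data with production and weather drivers; all file and column names are hypothetical.

# Sketch of the whole-facility approach: a pre-period regression predicts
# counterfactual usage, and SEM savings are what remains after verified
# capital-measure savings are subtracted. Names are assumptions.
import pandas as pd
import statsmodels.formula.api as smf

pre = pd.read_csv("facility_pre.csv")    # monthly kWh plus drivers
post = pd.read_csv("facility_post.csv")

baseline = smf.ols("kwh ~ production_units + hdd + cdd", data=pre).fit()

counterfactual = baseline.predict(post)              # expected usage, no program
total_savings = (counterfactual - post["kwh"]).sum()

capital_savings = 120_000.0                          # verified capital measures, kWh
sem_savings = total_savings - capital_savings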
2015 International Energy Program Evaluation Conference
Review Sheet for Paper Proposals
Paper
Number/Title:
Recommendation:
527
Improvements in SEM Program Evaluation Methods: Lessons Learned from
Several Recent Projects
Accept
Possibly Accept
Reject (Rejection reason required to advise author – see below)
Reviewer:
RELEVANCE
The topic of the abstract must match the focus of the Conference. The Conference serves its participants as a
forum for the presentation and discussion of important evaluation results, issues, methodologies,
implementation, techniques or applications as they apply to conference topics.
ORIGINALITY
The abstract should reflect original work done by the author and be something new to the field. We are about
measurement for impacts, not descriptions about worthy programs—How do we know we got the savings or
changes etc.
SIGNIFICANCE
The abstract must represent a significant contribution to the field.
Is it a new and original contribution?
Are the conclusions sound and justified?
Is it clearly presented and well organized?
Do you believe that results will be ready by the
time of the conference?
Comments for Committee Review and to Advise Authors:
AAAA. Reasons to accept or possibly accept: (Why you think this abstract should be presented and any major
problems that need resolution, etc.)
BBBB. Reasons to reject: (One or two sentences to explain why this proposal should not be accepted—e.g.,
outside of conference topics, results may/will not be ready before May, not new or of significant interest,
questionable methodology, not enough information to make a determination, etc.) Reasons will be emailed
to the author.
Comments for Author and Session Moderator:
(Suggestions for writing or focusing the paper, etc.)
TEAM 5
Paper ID Number: PPR-535
When Data Will Be Available: Is available now
Has This Been Presented Before? No, it has not been and will not be presented prior to the conference
Agree to Publish: Yes I agree
Best Evaluation Area: Impacts
Best Fit: Information/Education
Best Focus: Methods
Abstract Title: Discrepancy Analysis: How to Show the Implementer What to do Next
After considerable effort and expense, impact evaluators deliver a verdict: a grade-like realization rate
identifying the proportion of tracking savings that can be claimed by the program. Too often, the verdict is
issued without quantitative guidance on how program implementers could improve the realization rate. The
final report may anecdotally identify sites where egregious errors were made as illustrative of the types of
factors driving realization rates. The better evaluation reports may tabulate causes of poor estimation, but stop
well short of nailing down the culprits quantitatively. What the implementer really wants to know is what to do
next. When faced with a poor realization rate, the implementer must decide, for example, whether to increase
review staff, change savings algorithms or fix the tracking system, and which of the changes are worth the
expense. While the implementer is mulling next steps, in many cases the M&V engineers are sitting on a rich
data set that could inform that decision. The engineers have recruited, visited, inspected, metered, and
analyzed a statistically representative set of projects. The information is there in nascent form explaining what
went wrong at each site and what the impact of that error was. An error in tracking may have resulted in
underestimating savings by 10% at one project, while an incorrect baseline might have caused an 80% overestimate of savings at another. A third project might have resulted in a realization rate of 100%, but with large errors that offset each other. How does one combine and synthesize these disparate site-specific errors into a useful
description of the factors driving realization rates programmatically? This paper presents a method for
combining site-by-site savings discrepancy findings into a coherent and quantitative picture of the factors driving
a program’s realization rate. The paper describes the design elements of the analysis, including a mathematical
framework, an approach to systematic site characterization, a method for synthesizing the results, and a
presentation of results from multiple programs showing how the results can be interpreted and used by an
implementer. The outcome is a powerful set of metrics that tell the implementer not only where the
discrepancies lie but how much they have affected the results. To the authors’ knowledge this type of analysis
has not previously been performed. These Pareto-type results tell the administrator exactly where it is worth it
to invest resources to improve savings estimates.
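The authors' mathematical framework is not given in the abstract, but the sketch below shows one plausible way to roll tagged site-level discrepancies up into the program-level, Pareto-style view the paper describes; the data are invented.

# Illustrative roll-up of site-level discrepancy findings (invented data,
# not the authors' framework): each site's savings gap is tagged with a
# cause, then causes are ranked by their effect on the realization rate.
import pandas as pd

sites = pd.DataFrame({
    "tracked_kwh":  [100_000, 250_000, 80_000],
    "verified_kwh": [110_000,  50_000, 80_000],
    "cause": ["tracking_error", "incorrect_baseline", "offsetting_errors"],
})

realization_rate = sites["verified_kwh"].sum() / sites["tracked_kwh"].sum()

sites["discrepancy_kwh"] = sites["verified_kwh"] - sites["tracked_kwh"]
impact_by_cause = (sites.groupby("cause")["discrepancy_kwh"].sum()
                   / sites["tracked_kwh"].sum())
print(realization_rate, impact_by_cause.sort_values())

Note that the third site nets to zero even though, per the abstract, large offsetting errors hide there; a fuller characterization would carry gross positive and negative errors separately rather than only the net discrepancy.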
2015 International Energy Program Evaluation Conference
Review Sheet for Paper Proposals
Paper
Number/Title:
Recommendation:
535
Discrepancy Analysis: How to Show the Implementer What to do Next
Accept
Possibly Accept
Reject (Rejection reason required to advise author – see below)
Reviewer:
RELEVANCE
The topic of the abstract must match the focus of the Conference. The Conference serves its participants as a
forum for the presentation and discussion of important evaluation results, issues, methodologies,
implementation, techniques or applications as they apply to conference topics.
ORIGINALITY
The abstract should reflect original work done by the author and be something new to the field. We are about
measurement for impacts, not descriptions about worthy programs—How do we know we got the savings or
changes etc.
SIGNIFICANCE
The abstract must represent a significant contribution to the field.
Is it a new and original contribution?
Are the conclusions sound and justified?
Is it clearly presented and well organized?
Do you believe that results will be ready by the
time of the conference?
Comments for Committee Review and to Advise Authors:
CCCC. Reasons to accept or possibly accept: (Why you think this abstract should be presented and any major
problems that need resolution, etc.)
DDDD. Reasons to reject: (One or two sentences to explain why this proposal should not be accepted—e.g.,
outside of conference topics, results may/will not be ready before May, not new or of significant interest,
questionable methodology, not enough information to make a determination, etc.) Reasons will be emailed
to the author.
Comments for Author and Session Moderator:
(Suggestions for writing or focusing the paper, etc.)
TEAM 5
Paper ID Number: PPR-541
When Data Will Be Available: Is available now
Has This Been Presented Before? Yes, it has been presented-please contact me
Agree to Publish: Yes I agree
Best Evaluation Area: Impacts
Best Fit: Information/Education
Best Focus: Methods
Abstract Title: Combining Multiple Data Sources to Create the Whole-Mountain Model
Snowmaking is an industrial process that shares many of the evaluation challenges inherent in a complex
system. The process is simple: combine high-speed air with water at a low enough temperature and you get
snow – but below the surface there’s a mountain-sized machine driving the process, involving complex
interactions between local weather conditions, tight production schedules, environmental compliance issues,
and energy requirements. Energy is consumed by dedicated compressed air plants, pumping systems, and
fans. To optimize snow-making efficiency, ski areas need to produce the most snow possible at the lowest cost.
In practice, the act of isolating the energy efficiency associated with snowmaking upgrades is challenging.
Factors such as weather, gun deployment decisions, the size of the ski area’s snowmaking operations, gun-operator preference, water and compressed air system inefficiencies, electric vs. diesel compressor run time,
water and compressed air flow rate, and energy-use data availability all play parts in the analyst’s ability to
quantify snowmaking energy efficiencies. The Vermont Public Service Department’s impact evaluation team
worked closely with Efficiency Vermont and the ski areas to develop a method of assessing snowmaking energy
savings that is compliant with the rigorous requirements of the ISO-NE Forward Capacity Market. To build a
model robust enough to measure and normalize the savings, data was required from the program, the
mountain, the electric utility, and the State of Vermont. However, this coordination has its issues, including
maintaining an independent evaluation review and ensuring a comprehensive and clear information exchange
between parties. The lessons learned by the evaluators, regulators and program implementers through the
process of evaluating snowmaking energy efficiency upgrades can be applied to other complex evaluations of
process-related efficiency upgrades. This paper will provide the contextual background of the snowmaking
process and energy savings opportunities. It will cover the importance of using a whole systems approach when
measuring savings for a complex process with multiple dependent variables. Additionally, the data acquisition
for a complex process often involves obtaining information from multiple sources and integrating it into a
cohesive model. The authors will explain how the involvement of not only the efficiency program staff but also
the ski area personnel and regulatory agencies is essential to obtaining the required information and
successfully evaluating complex snowmaking projects. Finally, we will discuss how the lessons learned from the
whole-systems approach are applicable to other complex process evaluations. This topic can be presented as
either a paper or a poster.
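To make the data-integration idea concrete, here is a hedged sketch of merging data from the sources named above into one weather- and production-normalized model frame; every file name, column, and date is an assumption for illustration, not the evaluators' actual model.

# Hedged sketch of a "whole-mountain" model frame: merge utility, weather,
# and production data, then estimate post-upgrade savings holding drivers
# constant. All names and the upgrade date are invented for illustration.
import pandas as pd
import statsmodels.formula.api as smf

meters  = pd.read_csv("utility_interval.csv", parse_dates=["date"])      # daily kWh
weather = pd.read_csv("state_weather.csv", parse_dates=["date"])         # wet-bulb temp
snow    = pd.read_csv("mountain_production.csv", parse_dates=["date"])   # gun hours, gpm

df = meters.merge(weather, on="date").merge(snow, on="date")
df["post"] = (df["date"] >= pd.Timestamp("2013-11-01")).astype(int)  # assumed upgrade date

# The 'post' coefficient is the daily kWh change after the upgrade,
# normalized for weather and snowmaking production levels.
model = smf.ols("kwh ~ post + wet_bulb_temp + gun_hours + water_gpm", data=df).fit()
print(model.params["post"])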
2015 International Energy Program Evaluation Conference
Review Sheet for Paper Proposals
Paper
Number/Title:
Recommendation:
541
Combining Multiple Data Sources to Create the Whole-Mountain Model
Accept
Possibly Accept
Reject (Rejection reason required to advise author – see below)
Reviewer:
RELEVANCE
The topic of the abstract must match the focus of the Conference. The Conference serves its participants as a
forum for the presentation and discussion of important evaluation results, issues, methodologies,
implementation, techniques or applications as they apply to conference topics.
ORIGINALITY
The abstract should reflect original work done by the author and be something new to the field. We are about
measurement for impacts, not descriptions about worthy programs—How do we know we got the savings or
changes etc.
SIGNIFICANCE
The abstract must represent a significant contribution to the field.
Is it a new and original contribution?
Are the conclusions sound and justified?
Is it clearly presented and well organized?
Do you believe that results will be ready by the
time of the conference?
Comments for Committee Review and to Advise Authors:
EEEE. Reasons to accept or possibly accept: (Why you think this abstract should be presented and any major
problems that need resolution, etc.)
FFFF.
Reasons to reject: (One or two sentences to explain why this proposal should not be accepted—e.g.,
outside of conference topics, results may/will not be ready before May, not new or of significant interest,
questionable methodology, not enough information to make a determination, etc.) Reasons will be emailed
to the author.
Comments for Author and Session Moderator:
(Suggestions for writing or focusing the paper, etc.)
TEAM 5
Paper ID Number: PPR-558
When Data Will Be Available: Will be available by March 2015
Has This Been Presented Before? No, it has not been and will not be presented prior to the conference
Agree to Publish: Yes I agree
Best Evaluation Area: Behavior change
Best Fit: Demand Response
Best Focus: Policy
Abstract Title: Learning from Public Health: Embedded Evaluation and its Applications to Energy
Efficiency
It is a pivotal moment in the field of energy program evaluation. With movement toward standardized methods
and evaluator certification, it is clear that we are working to solidify our profession and its role in demand side
management. However, much of this conversation leaves out critical considerations of process and what’s best
for program innovation and advancement. Other evaluation practices have moved toward standardization,
while also maintaining an eye toward program development, growth, and potential. As energy programs move
away from rebates to more sophisticated market and behavior-change models – often requiring more real-time
evaluation approaches - we would do well to look to other evaluation fields for guidance. By drawing on the
strategies and lessons learned in similarly rigorous evaluation fields, we can both professionalize and enrich our
industry. Public Health, in particular, made a clear move toward integrated evaluation in the mid-1990s. At
the time, emerging efforts, including those led by the Centers for Disease Control, highlighted the importance of
identifying and defining evaluation frameworks and standards. Chief among these standards were new
approaches that emphasized combining evaluation with program planning and management. Today, public
health evaluation efforts focus on evaluation designs and methodologies that are applied, feasible, realistic, and
timely. Instead of waiting until a program has been in the field, public health programs are now rolling out
evaluations at the time of program implementation. Trends toward participatory approaches have also aided in
engaging appropriate stakeholders, ensuring the ease of implementation and use of evaluation data. The result
is that evaluation becomes embedded within programs, leading to continual assessment and improvement of
those programs and enhanced outcomes for program participants. This paper argues that best practices for
embedded evaluation from public health should be applied to energy evaluation. First, the paper begins by
presenting an overview of evaluation theories, frameworks, and best practices from public health. Next, specific
evaluation examples, along with lessons learned, are provided. Examples include innovative process, outcome,
and impact evaluations, in addition to recent efforts in participatory and empowerment evaluation approaches.
In the last section, the paper discusses potential applications of these models to energy programs. As the
energy industry moves toward standardized evaluation approaches, it is critically important to explore
alternative models of evaluation that may help us usher in new program models and theories. To date, most
evaluations of energy programs require, and prefer, the evaluator to remain at arm’s length from program
implementation. We should question whether these are the approaches we want to set in stone, and examine
whether we can benefit from real-time embedded evaluation approaches to better support our programs, such
as the approaches used widely in public health. Drawing on the lessons learned for embedded evaluation in
public health provides a lens with which to view our own approaches, and to examine ways we might improve
the practice of energy program evaluation at this critical moment in time.
2015 International Energy Program Evaluation Conference
Review Sheet for Paper Proposals
Paper
Number/Title:
Recommendation:
558
Learning from Public Health: Embedded Evaluation and its Applications to
Energy Efficiency
Accept
Possibly Accept
Reject (Rejection reason required to advise author – see below)
Reviewer:
RELEVANCE
The topic of the abstract must match the focus of the Conference. The Conference serves its participants as a
forum for the presentation and discussion of important evaluation results, issues, methodologies,
implementation, techniques or applications as they apply to conference topics.
ORIGINALITY
The abstract should reflect original work done by the author and be something new to the field. We are about
measurement for impacts, not descriptions about worthy programs—How do we know we got the savings or
changes etc.
SIGNIFICANCE
The abstract must represent a significant contribution to the field.
Is it a new and original contribution?
Are the conclusions sound and justified?
Is it clearly presented and well organized?
Do you believe that results will be ready by the
time of the conference?
Comments for Committee Review and to Advise Authors:
GGGG. Reasons to accept or possibly accept: (Why you think this abstract should be presented and any major
problems that need resolution, etc.)
HHHH. Reasons to reject: (One or two sentences to explain why this proposal should not be accepted—e.g.,
outside of conference topics, results may/will not be ready before May, not new or of significant interest,
questionable methodology, not enough information to make a determination, etc.) Reasons will be emailed
to the author.
Comments for Author and Session Moderator:
(Suggestions for writing or focusing the paper, etc.)
TEAM 5
Paper ID Number: PPR-571
When Data Will Be Available: Will be available by March 2015
Has This Been Presented Before? No, it has not been and will not be presented prior to the conference
Agree to Publish: Yes I agree
Best Evaluation Area: Impacts
Best Fit: Information/Education
Best Focus: Methods
Abstract Title: Conducting Evaluation in an Era of Declining Telephone Survey Response Rates
The survey research industry is at a crossroads. Survey research is an important tool for energy efficiency
program planning and evaluation. Despite the importance of surveys to the energy efficiency industry, few are
paying attention to the increasing challenges of fielding quality surveys and their implications for many of the
traditional methods we use to evaluate programs. The response rates of telephone surveys, the predominant
mode of data collection for decades, have been declining for the past quarter century. The increased use of cell
phones and call screening devices has accelerated this trend. Many people simply will not answer their phone
if the call is from an unknown number. For example, the Pew Research Center for the People & the Press
recently revealed that their response rates had fallen from approximately 36% in 1997 to 9% in 2012. Given this
landscape, this paper will provide research-based solutions to the challenges of declining survey response rates
and will provide valuable information to program administrators and evaluators who make use of telephone
surveys. In this paper, we will examine the causes and effects of declining survey response rates within the
energy efficiency industry. First, we will explore the underlying causes of survey non-response and the extent to
which telephone survey response rates are declining due to respondent refusals versus technological changes in
the telecommunication industry. Low survey response rates are a concern because of potential non-response
bias and increased survey fielding costs. However, a low response rate does not necessarily mean a survey
suffers from bias. The factors that are related to survey non-response must be correlated with the study
variables of interest for the results to be biased. We will assess the prevalence of non-response bias in surveys
with program participants and non-participants using results from typical energy efficiency surveys. In addition,
we will examine the cost implications of declining survey response rates. Finally, we will suggest some
solutions to the problem of declining telephone survey response rates and provide results from a survey
experiment concluding in December 2014 that used different survey fielding modes and incentive types. The
objective of the experiment was to compare the response rates and results of a traditional telephone survey
with one that made use of multiple survey modes. The experiment made use of a split-sample design in which
we conducted a traditional telephone survey with half of the respondents and conducted a multi-method survey
with the other half. The multi-method survey invited respondents to complete the survey online or via
telephone given that people who will not answer a call from an unknown number may respond to a request to
take a survey through the mail or email. We also split the sample and provided different incentives to test the
impact of the incentive level on survey response.
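A minimal sketch of the response-rate comparison implied by the split-sample design follows; the counts are invented placeholders, since the experiment was still concluding at submission.

# Sketch of comparing response rates between the two survey arms; the
# counts below are invented placeholders, not the experiment's results.
from statsmodels.stats.proportion import proportions_ztest

completes = [90, 160]      # phone-only arm, multi-mode arm (assumed)
invited   = [1000, 1000]   # sample released to each arm (assumed)

z_stat, p_value = proportions_ztest(completes, invited)
print([c / n for c, n in zip(completes, invited)], p_value)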
2015 International Energy Program Evaluation Conference
Review Sheet for Paper Proposals
Paper
Number/Title:
Recommendation:
571
Conducting Evaluation in an Era of Declining Telephone Survey Response
Rates
Accept
Possibly Accept
Reject (Rejection reason required to advise author – see below)
Reviewer:
RELEVANCE
The topic of the abstract must match the focus of the Conference. The Conference serves its participants as a
forum for the presentation and discussion of important evaluation results, issues, methodologies,
implementation, techniques or applications as they apply to conference topics.
ORIGINALITY
The abstract should reflect original work done by the author and be something new to the field. We are about
measurement for impacts, not descriptions about worthy programs—How do we know we got the savings or
changes etc.
SIGNIFICANCE
The abstract must represent a significant contribution to the field.
Is it a new and original contribution?
Are the conclusions sound and justified?
Is it clearly presented and well organized?
Do you believe that results will be ready by the
time of the conference?
Comments for Committee Review and to Advise Authors:
IIII. Reasons to accept or possibly accept: (Why you think this abstract should be presented and any major
problems that need resolution, etc.)
JJJJ. Reasons to reject: (One or two sentences to explain why this proposal should not be accepted—e.g., outside
of conference topics, results may/will not be ready before May, not new or of significant interest,
questionable methodology, not enough information to make a determination, etc.) Reasons will be emailed
to the author.
Comments for Author and Session Moderator:
(Suggestions for writing or focusing the paper, etc.)
TEAM 5
Paper ID Number: PPR-582
When Data Will Be Available: Is available now
Has This Been Presented Before? No, it has not been and will not be presented prior to the conference
Agree to Publish: Yes I agree
Best Evaluation Area: Behavior change
Best Fit: Information/Education
Best Focus: Methods
Abstract Title: Cutting through the Noise: Gaining Insight on (Social, Web, Earned, Paid) Media
Reach
Energy efficiency program designers are working harder than ever to reach customers with messages about the
value of energy efficiency and demand response actions. As goals ratchet up and codes steadily improve, many
are turning to marketing and behavior programs as a way to leverage limited program dollars and move away
from the rebate-driven, one-by-one transactional model. Effectiveness will require energy efficiency evaluators to get smarter about how they evaluate marketing programs and to improve the way web analytics are integrated into
programmatic intelligence. Better visibility and tracking of web analytics will be critical to understanding how
customers are interacting with the increasingly detailed and tailored tools and information provided via utility
account websites. Web analytics provide valuable information to programs about the extent to which a website
is playing an important role in outreach and participation processes, and they help estimate the “buzz” generated by discrete events (like DR events, special promotions, or limited-time offers). Tracking change over time is one of
the key values of web analytics. The authors will provide a summary of recent projects evaluating marketing
efforts and then describe the approach taken to evaluate a state-wide, behavior-based demand response
program supported by both paid and earned media. In order to assess the effectiveness and reach of the
program, the authors integrated data on the program’s mass media advertising, media audience data, and
website analytics and tracking data. The authors also investigated social and earned media the program
generated, analyzing both the content of the stories and the scope of the audience they reached. The authors
will also describe how web analytics were integrated into the evaluation, review the types of web analytics data
available, and describe the insights and limitations provided by these data. By integrating data on the program’s
paid media, earned media, social media, and web analytics, the authors were able to compare the effectiveness
of the program’s various outreach efforts and identify those that most effectively supported the program’s
overall goals.
2015 International Energy Program Evaluation Conference
Review Sheet for Paper Proposals
Paper
Number/Title:
Recommendation:
582
Cutting through the Noise: Gaining Insight on (Social, Web, Earned, Paid)
Media Reach
Accept
Possibly Accept
Reject (Rejection reason required to advise author – see below)
Reviewer:
RELEVANCE
The topic of the abstract must match the focus of the Conference. The Conference serves its participants as a
forum for the presentation and discussion of important evaluation results, issues, methodologies,
implementation, techniques or applications as they apply to conference topics.
ORIGINALITY
The abstract should reflect original work done by the author and be something new to the field. We are about
measurement for impacts, not descriptions about worthy programs—How do we know we got the savings or
changes etc.
SIGNIFICANCE
The abstract must represent a significant contribution to the field.
Is it a new and original contribution?
Are the conclusions sound and justified?
Is it clearly presented and well organized?
Do you believe that results will be ready by the
time of the conference?
Comments for Committee Review and to Advise Authors:
KKKK. Reasons to accept or possibly accept: (Why you think this abstract should be presented and any major
problems that need resolution, etc.)
LLLL.
Reasons to reject: (One or two sentences to explain why this proposal should not be accepted—e.g.,
outside of conference topics, results may/will not be ready before May, not new or of significant interest,
questionable methodology, not enough information to make a determination, etc.) Reasons will be emailed
to the author.
Comments for Author and Session Moderator:
(Suggestions for writing or focusing the paper, etc.)
TEAM 5
Paper ID Number: PPR-592
When Data Will Be Available: Is available now
Has This Been Presented Before? Yes, it has been presented-please contact me
Agree to Publish: Yes I agree
Best Evaluation Area: Impacts
Best Fit: Information/Education
Best Focus: Policy
Abstract Title: What’s in Your EM&V Mix? Improving Transparency and Understanding of EM&V
Practices
The primary objective of energy-efficiency (EE) impact evaluation is to verify the impacts achieved and
attributable to the studied program for a particular program year or time period. In the past, the mere
existence of impact studies was enough to provide confidence in reported program achievements. However, as
EE resources become key components of system planning and greenhouse gas (GHG) reduction strategies, a
growing audience of stakeholders is seeking better understanding of EM&V practices and findings. A regional
organization conducted a project to build credibility of energy-efficiency as a resource by improving
transparency and understanding of EM&V practices. Through its collaborative stakeholder committee including
nine states, the team produced two standard forms that aim to (1) summarize EM&V methods in a standard
format and (2) to characterize the level of rigor associated with the evaluation findings. One form presents this
EM&V information for a single impact evaluation study, and the other form aggregates the EM&V information
for a specific program and program year. This paper presents the project objectives from multiple stakeholder
perspectives and discusses the evolution of the forms using a stakeholder process. The authors will present key
features in each form—focusing on the characterization of EM&V methods and rigor—and will discuss
challenges for both the user (inputting data into the forms) and the reader (interpreting the data in the form).
In particular, the authors will discuss the project’s approach to these challenges:
• How do we package key EM&V data (typically from a 100+ page report) in a format that is quick and simple to understand for a non-evaluation expert?
• How do the forms ensure consistent characterization of EM&V methods (e.g., to facilitate simple comparison) while maintaining flexibility for important contextual information?
• What EM&V data are required to systematically assess the accuracy, reliability, and rigor associated with EM&V results?
• How can we characterize rigor consistently and fairly across program, measure, and evaluation types?
• How do we optimize the consistency and quality of data entered in the forms?
• How do we protect against misinterpretation of the data?
• How do we address multiple stakeholder goals in a single format?
The organization adopted the forms in summer 2014, and several states have committed to using
the forms for 2013 program reporting requirements. The paper will include feedback from the forms’ early
adopters, present steps for further development, and discuss potential uses for EPA 111(d) reporting
requirements.
2015 International Energy Program Evaluation Conference
Review Sheet for Paper Proposals
Paper
Number/Title:
Recommendation:
592
What’s in Your EM&V Mix? Improving Transparency and Understanding of
EM&V Practices
Accept
Possibly Accept
Reject (Rejection reason required to advise author – see below)
Reviewer:
RELEVANCE
The topic of the abstract must match the focus of the Conference. The Conference serves its participants as a
forum for the presentation and discussion of important evaluation results, issues, methodologies,
implementation, techniques or applications as they apply to conference topics.
ORIGINALITY
The abstract should reflect original work done by the author and be something new to the field. We are about
measurement for impacts, not descriptions about worthy programs—How do we know we got the savings or
changes etc.
SIGNIFICANCE
The abstract must represent a significant contribution to the field.
Is it a new and original contribution?
Are the conclusions sound and justified?
Is it clearly presented and well organized?
Do you believe that results will be ready by the
time of the conference?
Comments for Committee Review and to Advise Authors:
MMMM.
Reasons to accept or possibly accept: (Why you think this abstract should be presented and any
major problems that need resolution, etc.)
NNNN. Reasons to reject: (One or two sentences to explain why this proposal should not be accepted—e.g.,
outside of conference topics, results may/will not be ready before May, not new or of significant interest,
questionable methodology, not enough information to make a determination, etc.) Reasons will be emailed
to the author.
Comments for Author and Session Moderator:
(Suggestions for writing or focusing the paper, etc.)
TEAM 5
Paper ID Number: PPR-621
When Data Will Be Available: Is available now
Has This Been Presented Before? No, it has not been and will not be presented prior to the conference
Agree to Publish: Yes I agree
Best Evaluation Area: Behavior change
Best Fit: Services
Best Focus: Methods
Abstract Title: PROFILES OF CUSTOMER ATTRITION: WHO DROPS OUT FROM THE CARE PROGRAM?
As part of its commitment to meeting the public’s energy needs, Southern California Edison (SCE) offers its income-qualified customers much-needed bill relief through its California Alternate Rates for Energy (CARE) program. In general, eligibility guidelines require that customers meet maximum household income thresholds that vary by household size. More specifically, as of June 1, 2014, households
with at most two persons should have a total combined annual income of up to $31,460 and the income
threshold increases by $8,120 for every additional household member. Additionally, households with individuals
who participate in such so-called categorical programs as MediCal/Medicaid, CalFresh/SNAP (Food Stamps),
CalWork (TANF)/Tribal TANF, WIC, MediCal for Families (Healthy Families A&B), LIHEAP, SSI, National School
Lunch (NSL), Bureau of Indian Affairs General Assistance, and Head Start Income Eligible (Tribal only) are eligible
to sign up for CARE. Notwithstanding the CARE program’s beneficial impact in providing economic relief among
SCE’s low income customers, much remains to be known about the patterns and predictors of customer attrition
from the CARE program. In this regard, the current study will examine CARE customer churn, or drop-out, occurring within the first six months of 2014, aiming to identify significant socioeconomic and demographic predictors of customer attrition. Additionally, the analyses will examine the role of program
recertification (i.e. generally required every two years of all CARE customers) on the propensity for customers to
churn. This paper will use survival analysis (proportional hazards modeling), which not only analyzes the incidence of attrition or churn (i.e., whether a customer is terminated from the CARE program) but also takes into account the timing of that event in the customer’s program lifetime. It is hoped that the findings of the statistical analyses will inform marketing campaigns, outreach programs, and policies that stem the tide of attrition, especially among those still eligible to remain in the program.
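For readers unfamiliar with the method, a minimal proportional-hazards sketch using the lifelines library follows; the columns are hypothetical stand-ins for the socioeconomic and recertification variables the abstract names, not SCE's schema.

# Minimal sketch of the survival analysis the abstract proposes, using the
# lifelines library; column names are hypothetical, not SCE's data model.
import pandas as pd
from lifelines import CoxPHFitter

df = pd.read_csv("care_enrollment.csv")
# duration: months enrolled; churned: 1 if dropped in Jan-Jun 2014, else 0
# household_size, income, recert_due: example socioeconomic/recert covariates

cph = CoxPHFitter()
cph.fit(df[["duration", "churned", "household_size", "income", "recert_due"]],
        duration_col="duration", event_col="churned")
cph.print_summary()  # hazard ratios flag significant attrition predictors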
2015 International Energy Program Evaluation Conference
Review Sheet for Paper Proposals
Paper
Number/Title:
Recommendation:
621
PROFILES OF CUSTOMER ATTRITION: WHO DROPS OUT FROM THE CARE
PROGRAM?
Accept
Possibly Accept
Reject (Rejection reason required to advise author – see below)
Reviewer:
RELEVANCE
The topic of the abstract must match the focus of the Conference. The Conference serves its participants as a
forum for the presentation and discussion of important evaluation results, issues, methodologies,
implementation, techniques or applications as they apply to conference topics.
ORIGINALITY
The abstract should reflect original work done by the author and be something new to the field. We are about
measurement for impacts, not descriptions about worthy programs—How do we know we got the savings or
changes etc.
SIGNIFICANCE
The abstract must represent a significant contribution to the field.
Is it a new and original contribution?
Are the conclusions sound and justified?
Is it clearly presented and well organized?
Do you believe that results will be ready by the
time of the conference?
Comments for Committee Review and to Advise Authors:
OOOO. Reasons to accept or possibly accept: (Why you think this abstract should be presented and any major
problems that need resolution, etc.)
PPPP. Reasons to reject: (One or two sentences to explain why this proposal should not be accepted—e.g.,
outside of conference topics, results may/will not be ready before May, not new or of significant interest,
questionable methodology, not enough information to make a determination, etc.) Reasons will be emailed
to the author.
Comments for Author and Session Moderator:
(Suggestions for writing or focusing the paper, etc.)
TEAM 5
Paper ID Number: PPR-640
When Data Will Be Available: Will be available by March 2015
Has This Been Presented Before? No, it has not been and will not be presented prior to the conference
Agree to Publish: Yes I agree
Best Evaluation Area: Behavior change
Best Fit: Information/Education
Best Focus: Findings
Abstract Title: Timing, Longevity, Depth: Investigating Customer Engagement in Residential Behavior
Programs
Utility-sponsored residential behavior-change programs comprise a growing portion of DSM budgets. From
2010 to 2013, the number of utilities including behavior change programs in their energy efficiency portfolios
more than tripled. No longer just a paper report, newer types of behavior-change programs include a variety of
engaging features. These additional program features offer evaluators more opportunities to study the program
mechanisms and consumer characteristics that promote and dissuade energy savings. While the impact of
behavior change programs on energy usage has been consistently documented at 1-2% of usage, not as much is
known about the mechanism of those impacts. Indeed little is known about how customer engagement in
program features varies by customer or impacts energy savings. We recently reviewed a newer entrant to the
behavior-change space that enables customers to use their desktop or mobile devices to monitor how their usage changes over time; see how weather, occupancy, and appliance use affect their usage patterns; and see how they compare to their neighbors. The platform incorporates energy challenges, bill threshold alerts, peak time alerts,
energy markers, and outage alerts. Drawing on longitudinal data from over four years of energy use and
participation in this opt-in behavioral program, our paper examines how engagement in the program varies by
customer characteristics and how both customer characteristics and engagement impact energy savings.
Specifically, we report findings from an in-depth examination of customer engagement in the program, including:
· Timing: We examine the typical temporal patterns of engagement and how customers with varying patterns differ in demographics and energy use.
· Longevity: We compare the energy-use and demographic characteristics of customers who are active in the program over longer periods of time to customers who are active for shorter periods.
· Depth: We compare customers by the number of times they engage and the number of ways they engage with the program.
Finally, we examine the impact of timing, longevity, and depth of engagement on energy savings.
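As an illustration of how the three dimensions might be operationalized, the sketch below computes them from a hypothetical per-interaction event log; all file and column names are assumptions, not the program's data model.

# Sketch of timing, longevity, and depth metrics from a hypothetical event
# log with one row per customer interaction; names are illustrative.
import pandas as pd

events = pd.read_csv("engagement_log.csv", parse_dates=["timestamp"])
# columns assumed: customer_id, feature, timestamp

g = events.groupby("customer_id")
metrics = pd.DataFrame({
    "first_touch": g["timestamp"].min(),                      # timing
    "longevity_days": (g["timestamp"].max()
                       - g["timestamp"].min()).dt.days,       # longevity
    "depth_events": g.size(),                                 # depth: frequency
    "depth_features": g["feature"].nunique(),                 # depth: breadth
})
print(metrics.describe())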
2015 International Energy Program Evaluation Conference
Review Sheet for Paper Proposals
Paper
Number/Title:
Recommendation:
640
Timing, Longevity, Depth: Investigating Customer Engagement in
Residential Behavior Programs
Accept
Possibly Accept
Reject (Rejection reason required to advise author – see below)
Reviewer:
RELEVANCE
The topic of the abstract must match the focus of the Conference. The Conference serves its participants as a
forum for the presentation and discussion of important evaluation results, issues, methodologies,
implementation, techniques or applications as they apply to conference topics.
ORIGINALITY
The abstract should reflect original work done by the author and be something new to the field. We are about
measurement for impacts, not descriptions about worthy programs—How do we know we got the savings or
changes etc.
SIGNIFICANCE
The abstract must represent a significant contribution to the field.
Is it a new and original contribution?
Are the conclusions sound and justified?
Is it clearly presented and well organized?
Do you believe that results will be ready by the
time of the conference?
Comments for Committee Review and to Advise Authors:
QQQQ. Reasons to accept or possibly accept: (Why you think this abstract should be presented and any major
problems that need resolution, etc.)
RRRR. Reasons to reject: (One or two sentences to explain why this proposal should not be accepted—e.g.,
outside of conference topics, results may/will not be ready before May, not new or of significant interest,
questionable methodology, not enough information to make a determination, etc.) Reasons will be emailed
to the author.
Comments for Author and Session Moderator:
(Suggestions for writing or focusing the paper, etc.)