
Kevin Price
Consultant
T 510.899.5555
F 503.536.6637
1648 Martin Luther King Jr. Way
Berkeley, California 94709
[email protected]
evergreenecon.com
MEMORANDUM
Date: August 22, 2013
To: Caroline Chen (StatWizards), Brian Smith (PG&E)
Re: Behavior Workshop – Summary
This memorandum provides an overview of the main themes from the two-day Behavior
Workshop on June 25 and 26, 2013. This is a working document; if our understanding is
inaccurate or incomplete, we will adjust accordingly.
We organized the memo into two main sections. First, we present Challenges in defining
behavior programs. This section shows that opinions vary on how to define behavior
programs, and why defining them is difficult in the first place. Second, we discuss
Attribution – a main theme of the workshop was how to rely on behavior programs for
quantifiable and defensible energy savings.
Challenges in defining behavior programs
The main goal of the workshop was to better understand behavior-based program
strategies to ultimately expand the CPUC definition of behavior programs for future IOU
program cycles. One overarching difficulty in expanding the definition, discussed at the
workshop, is allowing trials of novel techniques while ensuring responsible use of
ratepayer dollars.
As noted throughout the workshop, many (if not all) program theories rely on end-users
behaving in ways that save energy. Therefore, it is difficult to define pure behavior
programs that are separate from other programs (i.e., “widget” programs). On one end of
the scale, a downstream clothes washer program – a typical widget based program –
requires the IOUs to affect the end-user behaviors of purchasing, installing, and using an
IOU-incentivized energy efficient clothes washer. On the other end of the scale, a home
energy report targets energy consumption behavior using social science techniques by
tracking usage over time, potentially comparing consumption with neighbors’ consumption
(or using another behavior intervention strategy), and relaying important messages to end-users (not excluding equipment purchase suggestions). In this second approach, rebates
are not the driver – a home energy report provides key information and guidance to
support energy efficient behavior. However, determining where to “draw the line” between
behavior programs and other programs remains unresolved.
Additionally, there was a strong focus on technology-based behavior intervention
strategies and programs (e.g., utilizing smart phones, internet, video games). However,
presenters and attendees of the workshop noted that many of these technologies are not
readily available to low-income and other groups in the population. There seemed to be
agreement that social science based strategies should be the focus of policy decisions and
IOU programs, and not the application of specific technologies, in order to ensure equitable
distribution of ratepayer funds.
As an appendix, we present the three proposed definitions for behavior programs
discussed during the workshop (starting with the original IOU straw proposal).
Attribution
Attribution of energy savings to programs or program elements is challenging for all
energy efficiency programs. Based on comments during the workshop, a behavior
program’s program theory should clearly define a hypothesis for deriving energy savings
using behavioral intervention strategies. Evaluation should test the hypothesis in order to
determine if any energy savings are directly attributable to the program (and if so, how
much). This is complicated – even widget-based programs include behavioral factors (e.g.,
purchase decision) that may be difficult to objectively disentangle from the effects of a pure
behavior program. Below we provide an overview of the discussion points related to
program design/theory and evaluation from the workshop.
Program Design and Theory
The White Paper differentiates between underused and traditional behavior intervention
strategies. The workshop converged on an understanding that a behavior program should
rely on the underused intervention strategies. These include: commitment, feedback,
follow-through, framing, in-person interactions, energy pricing, rewards or gifts, social
norms, and multi-pronged strategies.1 In order to claim savings from these underused
strategies, social science should be the basis for program design (relevant theories are
discussed in the white paper).
1 Patrice Ignelzi, Jane Peters, Kathrine Randazzo, and Anne Dougherty, "Paving the Way for a Richer Mix of Residential Behavior Programs" (EnerNOC, 2013).
Implementation of these underused behavior intervention strategies currently lacks the
wealth of evaluation and peer review associated with (and used to inform) the traditional
strategies. One challenge in creating effective behavioral program design is balancing the
need for data/research-driven program theories with novel techniques that may yield (an
unknown level of) energy savings in the future. Dylan Sullivan from the NRDC noted that
fundamentally sound program theories with specific behavior targets can help make a case
for implementing behavior programs based on the underused (and thus more “risky”)
strategies.
Suggestions for enhancing the program design phase included the following:
• Logical and well thought out program theory;
• Enlisting social science theories to support program theory;
• Enlisting marketing professionals (suggested by Carrie Armel, PEEC); and
• Utilizing experimental design (when possible).
In addition, during the program design process the IOUs were encouraged to refer to case
study examples of behavior programs outside of California to create effective behavior
intervention strategies and programs.
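The experimental-design suggestion above can be sketched as random assignment of customer accounts to treatment and control groups. The account IDs and function name below are hypothetical, for illustration only:

```python
import random

def assign_groups(account_ids, seed=42):
    """Randomly split accounts into treatment and control halves.

    A fixed seed makes the assignment reproducible for evaluators.
    """
    rng = random.Random(seed)
    shuffled = account_ids[:]
    rng.shuffle(shuffled)
    half = len(shuffled) // 2
    return shuffled[:half], shuffled[half:]

# Hypothetical account identifiers
accounts = [f"ACCT-{i:04d}" for i in range(10)]
treatment, control = assign_groups(accounts)
print(len(treatment), len(control))  # prints: 5 5
```

Random assignment like this is what makes the control group a defensible baseline: the two groups differ only by chance before the intervention.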
EM&V Challenges
In the state of California, the IOUs are responsible for producing and documenting
evaluable energy savings using ratepayer funding. In order to ensure the IOUs are using
ratepayer funding effectively, energy efficiency programs are evaluated on a regular basis.
Some of the same evaluation techniques currently used for widget-based programs would
be applicable to behavior-based programs. These include experimental or quasi-experimental design and billing analysis. Miriam Goldberg of DNV KEMA presented an
overview of evaluation techniques that were most effective for a range of scenarios on the
second day of the workshop.
Evaluation methods aim to determine what would have happened in absence of a program.
With behavioral programs, the challenge in attributing energy savings is complicated
because the link between information and action is harder to define than the link between
a documented purchase and installation of a widget. However, there is a sense that
disaggregated smart meter data may help evaluators understand the behaviors of end-users
before and after participation in a behavior program.
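One common billing-analysis approach consistent with this counterfactual framing is a difference-in-differences comparison: the control group's change in usage stands in for what would have happened absent the program. The sketch below uses hypothetical consumption figures and is illustrative only, not a method endorsed in the memo:

```python
def did_savings(treat_pre, treat_post, control_pre, control_post):
    """Difference-in-differences estimate of average monthly kWh savings.

    Savings = (control change) - (treatment change); a positive result
    means participants' usage fell more than the control group's.
    """
    mean = lambda xs: sum(xs) / len(xs)
    treat_change = mean(treat_post) - mean(treat_pre)
    control_change = mean(control_post) - mean(control_pre)
    return control_change - treat_change

# Hypothetical average monthly kWh per household
treat_pre = [620, 580, 640]      # participants, before the program
treat_post = [590, 555, 610]     # participants, after the program
control_pre = [615, 575, 635]    # non-participants, before
control_post = [610, 572, 630]   # non-participants, after

print(round(did_savings(treat_pre, treat_post,
                        control_pre, control_post), 1))  # prints 24.0
```

In practice the billing analysis would use a regression with weather and seasonal controls, but the logic of subtracting the control group's change is the same.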
Appendix – Straw Proposals
IOUs
• Program Categories Today
– Resource programs and measures (impact) lead to energy efficiency
purchase actions by customers
• Example: EUC-Whole House Program, Plug Load & Appliance
Program, etc.
– Non-resource programs and measures (indirect impact) also promote energy
efficiency action-taking by customers
• Example: Marketing Education and Outreach Program (MEO),
Workforce Education and Training (WE&T), etc.
– Behavior programs and measures (NEW, impact) as defined by CPUC
Decisions
• Example: Home Energy Report meeting comparative energy usage,
randomized experimental design, and ex-post evaluation requirements
– So what’s missing? – a broadly defined behavior program to address a
diversity of population based, mass-market program designs
• Behavior and Behavior Programs
– Nearly all energy efficiency programs influence behavior.
– Behavior programs, unlike more traditional DSM programs, consciously
apply an understanding of how people interact with energy.
– They draw upon an array of interventions, singly or in combination, to
influence how people think about and use energy, both at the time of the
intervention and, in the best designs, on an ongoing basis as well.
• Per the behavior white paper, behavior intervention strategies require the following:
– Target one or more specific behaviors that affect end users' energy use,
– Are rooted in social science research,
– Consciously consider which behavior(s) they will affect,
– Yield evaluable effects.
Cathy Fogel
EE Behavior Programs for 2015-17:
• Deploy one or more of the underused behavior intervention strategies: 1)
commitments; 2) feedback; 3) follow-through; 4) trusted community messenger
interactions; 5) rewards or gifts; or 6) social norms.
• May be evaluated using experimental design, quasi-experimental design, or other
evaluation methods approved by the CPUC for 2015-17.
• Are typically evaluated on an ex-post basis, but may use ex-ante savings values if
approved as part of evaluation methods.
• Utilize behavior science framing strategies.
Chris Jones and Collaborators
All existing DSM and energy efficiency programs involve behavior, and [should apply]
[benefit from] insights from social and behavioral sciences for greater impact and deeper
savings.
Additionally, CPUC defines a [separate funding category of new] Behavior-based
innovations that deliberately apply models and approaches drawn from the social and
behavioral sciences to affect energy use that [typically] do the following:
1. Identify energy usage behaviors that are intended to be changed,
2. Identify which social science theory or combination of theories the intervention is
drawing upon,
3. Deploy behavior intervention strategies such as: 1) commitments; 2) feedback; 3)
follow-through; 4) trusted community messenger interactions; 5) rewards or gifts;
6) social norms; 7) social diffusion; 8) or…
4. Utilize messaging strategies grounded in behavioral and cognitive sciences,
5. May be evaluated using experimental design, quasi-experimental design or other
evaluation methods approved by CPUC,
6. Outcomes are typically measured on an ex-post basis, using approved evaluation
methods; however, in some cases, forecasted metrics may be used.
Best practices incorporate learning from program experience and evaluation to evolve over
time.
New behavior-based innovations will inform new programs and be incorporated into
existing DSM and EE programs, where appropriate.