Delta Institute is a nonprofit working to build a resilient Great Lakes environment and economy through sustainable solutions. One of Delta's strategic objectives is to disrupt the energy status quo by helping buildings become more efficient, creating strategies to increase renewable energy generation, and, in the case of DeltaLumin, working directly with consumers to understand how they use and manage their energy and develop effective strategies for behavior change.

Delta has ten years of experience working directly with homeowners on energy conservation. For years, we implemented a home weatherization program to educate homeowners on where their homes were energy inefficient and guide them toward solutions. For DeltaLumin, we wanted to flip our top-down program approach and instead start with the energy users.

With funding from the Illinois Science & Energy Innovation Foundation (ISEIF), we set out to apply the principles of human-centered design to inform the creation and piloting of a product or service that, based on an understanding of individual users, would leverage insights from smart meter data to help consumers make smarter energy decisions. We initiated a partnership with renowned design firm IDEO, whose expertise in human-centered design informed our on-the-ground research and the design of great product concepts. Because smart meter program uptake lagged in low-income, senior, and "hard-to-reach" communities, we focused our outreach on those communities. To help us connect with those communities, we brought on one of our long-time partners, Faith in Place, who led community outreach and recruitment efforts.

This site walks through our process, findings, technical documentation, and next steps for DeltaLumin. By sharing this information, we hope that others can learn from our work and leverage those learnings to make a positive impact in their communities.

The DeltaLumin Approach

Delta worked with our partners to develop a user-focused, empathetic approach to product and program design. We began by spending time with our potential users, getting to know their relationship with energy use, technology, and preferred methods of communication. Those experiences directly informed the development of the DeltaLumin product and service.

Human-Centered Design

Delta worked with design firm IDEO to take a human-centered design approach to the development and piloting of the DeltaLumin product and service. Human-centered design is rooted in a comprehensive understanding of the needs, values, and motivations of the potential users of the product or service being designed. The process often begins with intensive field research that results in specific insights that are then combined with design strategies to develop new products or services that meet the needs of the people who use them.

The initial research efforts in the DeltaLumin approach were led by IDEO and included in-home interviews with potential users and later small group discussions around different possibilities for communicating personalized energy use information enabled by smart meter technology. Several insights surfaced as a result of this research.

Themes of human-centered design continued throughout the pilot in the implementation of monthly workshops and the iteration of the energy model used to provide energy updates and the visual language used to communicate with pilot participants.


User Research

4 weeks

IDEO took the lead on field research, including talking with potential users of the DeltaLumin product and service.


Product Development

8 weeks

IDEO began the development of features, and Delta refined the product and set up a system for incorporating participants’ personalized energy use data.


Participant Recruitment

4 weeks

Delta worked with partner Faith in Place to identify potential participants in geographies consistent with the smart meter rollout and the demographic groups identified by ISEIF.

Pilot Implementation

12 weeks

Delta, with support from Faith in Place, implemented a three-month pilot of the DeltaLumin product and service, including mailing and emailing twice-monthly energy updates to illuminate energy insights for participants, maintaining the participant dashboard, and hosting three pilot participant workshops to collect in-person feedback from participants.

Research Question

Will the illumination of energy use data lead low-income energy consumers to engage with their energy use beyond the level of engagement that occurs with the use of traditional monthly bills?


Illumination: translation of household energy use into personalized information about participants’ energy habits and patterns

Low-income energy consumers: individuals with a household income at or below 50% AMI (< 150% FPL) who are not currently receiving energy subsidies (e.g. CEDA, LIHEAP)

Engagement: demonstrating energy use reduction through action, stating an intention to reduce energy use, and/or indicating that an informed decision was made not to reduce energy use

Features Piloted

IDEO and Delta developed and tested four unique features that displayed personal energy use in different ways: across time, by energy-consuming appliance or electronic device, and in relation to existing utility programs. The features included a:

1. Spending Forecast

This feature tested whether energy spending updates at a shorter interval (daily rather than monthly) lead to increased levels of engagement compared with traditional monthly bills.

Indicators of Engagement

A human-centered design approach inherently accepts that there are many different and valid ways in which customers may value and use energy, and a design process that empathizes with their value systems will be more effective in providing overall benefit.

Results of the pilot were evaluated using a mixed methods approach, which involves employing both qualitative and quantitative strategies of inquiry (e.g. numeric information like energy use data collected concurrently with text information from open-ended survey questions). This approach was used because the indicators of successful engagement include both changes in measurable energy use and changes in non-numeric opinions, motivations, and intentions related to energy use. Indicators of participant engagement included:

Conservation action or intention

Participants demonstrate energy use reduction through behavior change within the pilot period or indicate the intent to reduce energy use through behavior change within or following the pilot period.

For example: Turning out the lights when not in the room.

Efficiency action or intention

Participants demonstrate energy use reduction through physical improvements to their home, including the replacement of an inefficient appliance within the pilot period, or participants indicate the intention to reduce energy use through physical improvements to their homes within or following the pilot period.

For example: Switching out inefficient bulbs (incandescents) with more efficient bulbs (CFLs or LEDs).

Informed decision not to act

Participants demonstrate receipt and understanding of energy use illumination information and choose not to take action to reduce energy use.

Measuring Engagement

These indicators of participant engagement were measured using a concurrent triangulation of methods including:

Monthly workshop activities & discussion

Participants were required to attend monthly in-person events to share reactions and insights related to the features they interacted with the previous month. Qualitative data was collected in the form of responses to open-ended discussion questions and reactions to interactive group activities.

Monthly surveys

Monthly surveys were distributed and completed at each monthly workshop. These were mixed method surveys with an emphasis on close-ended questions, but with open-ended questions included as necessary to collect information about opinions, motivations, and intentions related to energy use generally and treatments more specifically.

Website analytics

Email views and/or website visits were tracked by user and by date/time to aid in the analysis of changes in both energy use seen in the collected energy use data and/or changes in opinions, motivations, and intentions captured in monthly surveys.

Finding Our Participants

Faith in Place led recruitment efforts for the project with support from Delta. Recruitment involved initial research on the geography of smart meter rollout in addition to many hours on the ground talking with potential participants.


Delta focused on identifying pilot participants who had smart meters in their homes and fit ISEIF’s priority demographic characteristics, including low-income households and seniors. Delta conducted research to identify the geographic areas throughout Chicago where smart meters had already been installed and that were largely low-income communities. Faith in Place led the on-the-ground recruitment efforts, focusing their attention on places of worship in high-priority areas.

The Faith in Place recruitment team visited several places of worship throughout the recruitment period. Faith in Place was crucial in developing and communicating the value of the program to potential participants, administering eligibility surveys, and sustaining engagement through the enrollment process and beyond.

Final Pilot Cohort

The final pilot cohort consisted of a diverse group of 75 south and west side Chicago residents from many different community areas. Though the age of participants varied, the majority were over 50 years old with 36% over 65 years old. Most participating households were single- or two-person households, as many were retirees or parents with grown children. 71% of participating households owned rather than rented their homes, and most homes were between 1350 and 2000 square feet.

At the pilot kick-off event, participants completed a baseline survey for the DeltaLumin program. This survey collected information related to their homes, appliances, and interactions with ComEd, as well as actions they had taken to save energy in the past. Questions were asked both about energy-saving behaviors and about any energy-efficient appliances in their homes. This helped us understand the potential for reducing total energy consumption through behavior change and home retrofits.

Who are they? Where do they live? How do they use energy?

Level of Participant Energy Conservation (energy-saving behaviors before the pilot, from "Not an Energy Saver" to "Serious Energy Saver"):
- One behavior: 2 participants
- Two behaviors: 24 participants
- Three behaviors: 26 participants
- Four behaviors: 16 participants
- Five behaviors: 7 participants

Level of Home Energy Efficiency (energy-efficient items in the home before the pilot, from "Inefficient Home" to "Highly Efficient Home"):
- One item: 17 participants
- Two items: 24 participants
- Three items: 16 participants
- Four items: 12 participants
- Five items: 6 participants

Participants by Zip Code:
- 60638 (Garfield Ridge, Clearing): 1% of participants
- 60652 (Ashburn): 3%
- 60615 (Grand Boulevard, Hyde Park, Kenwood, Washington Park): 9%
- 60621 (Englewood, Greater Grand Crossing, Washington Park): 16%
- 60636 (Chicago Lawn, Gage Park, West Englewood): 14%
- 60649 (South Shore): 19%
- 60617 (Avalon Park, Calumet Heights, East Side, South Chicago, South Deering): 9%
- 60643 (Beverly, Morgan Park, Washington Heights, West Pullman): 4%
- 60653 (Douglas, Grand Boulevard, Kenwood, Oakland): 1%
- 60628 (Pullman, Roseland, Washington Heights, West Pullman): 4%
- 60629 (Chicago Lawn, Clearing, Gage Park, Garfield Ridge, West Elsdon, West Lawn): 3%
- 60620 (Auburn Gresham, Beverly, Chatham, Greater Grand Crossing, Roseland, Washington Heights): 1%
- 60637 (Greater Grand Crossing, Hyde Park, South Shore, Washington Park, Woodlawn): 1%
- 60619 (Avalon Park, Burnside, Chatham, Calumet Heights, Greater Grand Crossing, Roseland, South Shore): 1%

Renters vs. Owners: 29% rented, 71% owned.

Additional charts summarized home size (<700, 700-1350, 1351-2000, and 2000+ square feet), participant age (<19, 20-34, 35-49, 50-64, and 65+), household size, and household income ($18k-30k through $75k+).

Building a Prototype

The DeltaLumin team built a prototype web dashboard and communication service to test the energy illumination strategies developed in the earlier stages of the project. Developing the prototype involved building quick, low-fidelity versions of products and services we hypothesized would have an impact on our participants’ relationship with energy use. The next section will outline how the prototype was tested with users.

Calculating Energy Use

The energy model combined three data inputs: Weather Patterns (data adjusted for current temperatures), Historical Energy Use (data following individual historic patterns), and Home & Appliances (data based on current home type and appliance stock). From these inputs, the model produced four kinds of insights: Time-based Insights (energy use patterns over time), Predictive Insights (incremental energy use between bills), Appliance-based Insights (the biggest energy-using appliances and electronics), and Incentive-based Insights (energy-related programs relevant to usage patterns, such as the ComEd Lighting Discount, Central AC Cycling, and Refrigerator and Freezer Recycling programs).

Due to constraints in accessing individual smart meter data, we developed an energy model that estimated energy use data that we could then present visually to our participants.

To work around the constraints related to smart meter data availability, the DeltaLumin team collected information on: (1) current weather patterns during the pilot period, (2) individual energy use data for each participant going back two years, and (3) the current appliances in each home, including their dates and models. This information was used to simulate personalized energy updates in several forms. Based on our research, we determined that the most important data to present to our participants included:

Breaking down energy use across time

Although the lack of access to real-time, short-interval data made it more difficult to communicate interval energy use, estimating daily usage was important as a new way for participants to interact with their energy use information. This form of energy use information was used to fuel the monthly and daily energy use features found on the dashboard.

Daily usage was estimated from each participant's average monthly energy use for the same month over the previous two years, then adjusted for current weather conditions based on the number of heating or cooling degree days (HDD or CDD); weekly information was calculated from these daily averages. For monthly spending, the previous two months were determined using actual energy usage data pulled from the ComEd data portal, while the current month was estimated using the same calculation as the Spending Forecast feature, which is outlined in the next section.
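As a rough sketch, the daily estimate described above can be expressed as a pair of functions. The names and the assumed weather-dependent share of usage are illustrative, not the actual DeltaLumin code:

```javascript
// Sketch of the daily-usage estimate: average the same calendar month over
// the previous two years, divide into a per-day baseline, then scale the
// weather-dependent portion by degree days relative to a typical day.

function baselineDailyKwh(monthlyKwhSameMonthPriorYears, daysInMonth) {
  const avgMonthly =
    monthlyKwhSameMonthPriorYears.reduce((a, b) => a + b, 0) /
    monthlyKwhSameMonthPriorYears.length;
  return avgMonthly / daysInMonth;
}

// weatherShare is the assumed fraction of usage that is weather-dependent
// (e.g. AC load in summer); more degree days than typical means more load.
function weatherAdjustedDailyKwh(baseline, degreeDaysToday, typicalDegreeDays, weatherShare) {
  if (typicalDegreeDays === 0) return baseline;
  const scale = degreeDaysToday / typicalDegreeDays;
  return baseline * (1 - weatherShare) + baseline * weatherShare * scale;
}

// Example: June averaged 600 kWh over the two prior years (610 + 590), 30 days.
const base = baselineDailyKwh([610, 590], 30); // 20 kWh/day
const hotDay = weatherAdjustedDailyKwh(base, 12, 8, 0.3); // 23 kWh on a hot day
```

The weatherShare parameter stands in for the split between weather-dependent and other usage; the real model derived that split from each home's appliance stock.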


Measuring progress towards typical monthly spending

This form of energy use was illuminated to help participants better understand how to plan for their energy spending proactively rather than simply reacting to their monthly bills after they have already used energy for the month. This information feeds both the spending forecast and the energy spending goal features.

This predictive energy information was calculated by first using the average energy use during the same month for the previous two years and the cost per unit of energy for each participant’s supplier to determine total average spending for that month. This provided the predicted total spend for the month. Then, participants’ estimated spending to date was determined using current weather information and knowledge of their current appliances. The usage for non-weather-dependent appliances was derived from total monthly spending, whereas energy usage for weather-dependent appliances was adjusted for daily conditions (i.e. on especially hot days, air conditioning usage was estimated to be higher than on milder days).
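A minimal sketch of the two steps just described, with illustrative names and a simple flat split between weather-dependent and non-weather usage (the real model derived that split from each home's appliances):

```javascript
// Predicted monthly spend: prior two years' usage for the same month times
// the supplier's rate. Spend-to-date: a flat non-weather share per elapsed
// day, plus a weather-dependent share scaled by each day's cooling degree
// days (CDD) relative to a typical day.

function forecastMonthlySpend(kwhSameMonthPriorYears, ratePerKwh) {
  const avgKwh =
    kwhSameMonthPriorYears.reduce((a, b) => a + b, 0) /
    kwhSameMonthPriorYears.length;
  return avgKwh * ratePerKwh;
}

function spendToDate(forecast, daysInMonth, dailyCdd, typicalCdd, weatherShare) {
  const flatPerDay = (forecast * (1 - weatherShare)) / daysInMonth;
  const weatherPerDay = (forecast * weatherShare) / daysInMonth;
  return dailyCdd.reduce((total, cdd) => {
    const scale = typicalCdd > 0 ? cdd / typicalCdd : 1;
    return total + flatPerDay + weatherPerDay * scale;
  }, 0);
}

// Example: 1000 and 1040 kWh in the two prior Junes at $0.12/kWh.
const forecast = forecastMonthlySpend([1000, 1040], 0.12); // $122.40
// Four days into a 30-day month, with one hot day and one mild day.
const spent = spendToDate(forecast, 30, [8, 8, 10, 6], 8, 0.25); // $16.32
```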


Breaking down energy use by energy-consuming appliances and devices

The appliance-level energy information was intended to help participants more easily identify opportunities to save energy and take action. This form of energy use information was used to populate the Appliance Spending Diagnosis feature.

The foundation for our appliance-level spending diagnosis came from the Illinois Technical Reference Manual (TRM) for the Illinois Energy Efficiency Incentive Programs (EEPS). Where the Illinois TRM did not provide deemed usage and savings sufficient to provide estimates, we added appliances and average usage using datasets found on the US Department of Energy website. Rather than detailing the energy usage of all appliances in each home, we decided to focus on the largest energy consumers in the average home, as this provided an optimal amount of information without overwhelming the user. In the baseline survey completed by participants at our first workshop, they were asked about their refrigerator, air conditioner, televisions, lighting, electric range, and dishwasher. We then matched the appliance make and model to the corresponding appliance type in our appliance dataset.

To personalize each participant's appliance spending diagnosis, we altered the previously calculated “average usage” to best fit the participant’s lifestyle and current weather conditions. For each appliance, we picked the one variable that affects its usage most significantly. For air conditioners (AC), we multiplied the average usage by a factor based on the Cooling Degree Days (CDD). For lighting, refrigerators, televisions, dishwashers, and ranges, we assumed the variable that most greatly impacts usage is the occupancy of the home: the more people in a house or apartment, the more each of these appliances will be used.
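The per-appliance adjustment can be sketched as follows; the field names and the occupancy baseline are assumptions for illustration:

```javascript
// Scale each appliance's deemed average usage by the single variable judged
// to matter most: cooling degree days for AC, household occupancy for
// everything else.

function personalizedApplianceKwh(appliance, household) {
  if (appliance.type === "ac") {
    // Scale AC usage by how hot the period was relative to typical.
    const scale = household.cdd / household.typicalCdd;
    return appliance.avgKwh * scale;
  }
  // Lighting, refrigerator, TV, dishwasher, and range usage are assumed to
  // scale with occupancy relative to the average household size behind the
  // deemed estimate.
  return appliance.avgKwh * (household.occupants / household.avgOccupants);
}

const household = { cdd: 10, typicalCdd: 8, occupants: 3, avgOccupants: 2.5 };
const acKwh = personalizedApplianceKwh({ type: "ac", avgKwh: 120 }, household); // 150
const tvKwh = personalizedApplianceKwh({ type: "tv", avgKwh: 50 }, household); // 60
```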


Alignment with existing utility programs

In addition to providing estimated energy use by appliance to help participants identify savings opportunities, we also matched participants with existing utility programs based on their behavior and appliances. This was seen in the Program Match feature.

During the pilot there were three utility programs participants could be matched to, related to lighting, refrigerators, and air conditioners. Matches were determined based on participant responses to the baseline survey. If they indicated that they had an older, inefficient refrigerator, air conditioner, or light bulbs, they were told about the related program. We also provided an estimate of the monthly savings they could expect to see if they chose to participate in the program.
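The matching logic itself can be as simple as a list of predicates over baseline survey answers. In this sketch the survey field names are assumptions, while the program names are the ComEd programs referenced during the pilot:

```javascript
// Match a participant to utility programs based on baseline survey flags
// for older, inefficient versions of the relevant items.

const programs = [
  { name: "Lighting Discount", matches: (s) => s.hasIncandescentBulbs },
  { name: "Central AC Cycling", matches: (s) => s.hasOldAc },
  { name: "Refrigerator and Freezer Recycling", matches: (s) => s.hasOldFridge },
];

function matchPrograms(survey) {
  return programs.filter((p) => p.matches(survey)).map((p) => p.name);
}

const matched = matchPrograms({
  hasIncandescentBulbs: true,
  hasOldAc: false,
  hasOldFridge: true,
});
// matched → ["Lighting Discount", "Refrigerator and Freezer Recycling"]
```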

Visualizing Energy Use

Several themes in the visual design of energy use information emerged through our early conversations with our targeted audience. One consistent finding was that participants found it much easier to understand their energy use when it was presented in dollars rather than in units of energy (kilowatt-hours, or kWh). We also found that keeping information simple and clear was most effective in promoting understanding. These findings informed the design of our dashboard below.

Google Sheets provided a valuable development back end as we tweaked our energy models. The data was served to the front end, and scalable vector graphics (SVGs) were created on an as-needed basis using the JavaScript library D3.js.

The dashboard greeted participants with a welcome message and a running estimate of the month's spending against their usual total ("It looks like you usually spend about $130 on electricity during June. So far this month, we estimate you've spent about $52.40 on your electricity. Take a look around!"), along with four modules:

Spending Forecast (your estimated energy spending for the month): see how much you’ve already spent this month and how much you may spend if you continue using electricity at the same rate.

Household Energy Usage (when you spent your money): find trends in your energy use from the past few days and months.

Energy Savings Goal (goal setting helps you stay on track to spend less): your goal for the month is to save a set amount compared to your typical spending.

Appliance Spending Diagnosis (where your money goes): see how much we think you have spent on your appliances and electronics so far this month.

Communicating Energy Use

After calculating the data and using that data to build visualizations, we then built a system for communicating that information to participants through channels that were easily accessible for them, including the online participant dashboard, emails, and printed letters.


Connecting the data to the online dashboard was accomplished using Google Sheets and Tabletop, a JavaScript library that converts Google Sheets data to objects and/or arrays through an XHR. This was useful for modifying and tweaking calculations, but it is too slow for a production environment. The back end will be rebuilt to allow for more real-time data manipulation.
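In Tabletop's simpleSheet mode, the callback receives an array of plain objects keyed by column header, with every cell delivered as a string. A sketch of parsing one participant's row into the numbers the dashboard modules need (the column names are assumptions, not the pilot's actual sheet schema):

```javascript
// Parse one sheet row (all values arrive as strings from Tabletop) into
// the numeric fields the dashboard renders.

function parseParticipantRow(row) {
  return {
    id: row.participant_id,
    forecastDollars: parseFloat(row.forecast),
    spentDollars: parseFloat(row.spent_to_date),
    goalDollars: parseFloat(row.savings_goal),
  };
}

// Example row shaped like Tabletop's simpleSheet output.
const parsed = parseParticipantRow({
  participant_id: "P042",
  forecast: "130.00",
  spent_to_date: "52.40",
  savings_goal: "81.65",
});
// parsed.spentDollars → 52.4
```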

Developing individualized email and print mailings required the creation of a PNG factory using the saveSvgAsPng library. This allowed the team to reuse the dashboard’s visualizations rather than recreating them in Adobe Creative Suite. The team then used Salesforce to insert the PNGs into emails.

Refining the Model

The pilot provided valuable information about the limitations of monthly data in giving users useful energy information, even when the data is well modeled. Users need at least day-after data to take action or make purchasing decisions. The availability of this data is also important: though our development product can process monthly data into passable daily estimates, a daily feed is much more likely to return accurate data and recommendations.


Implementing the Pilot

The DeltaLumin pilot ran for three months and included an onboarding process with an enrollment packet and kick-off events, twice-monthly energy updates communicating personal energy use, monthly workshops to solicit direct feedback from participants through conversation and written surveys, and continued engagement and participant support.

Enrollment Process

Welcome Packet for new participants

In the enrollment process, we sent eligible participants their enrollment packet, which outlined the pilot objectives and schedule and asked them to: (1) attend a kick-off event, (2) collect information about their appliances, and (3) agree to share their utility information with DeltaLumin. At the pilot kick-off events, participants received more information, completed baseline surveys, completed additional paperwork, and participated in their first energy-related activity.

Energy Updates

Energy Updates were provided to all 75 participants twice a month throughout the three-month pilot period. The information that was provided varied from month to month as features were introduced to the DeltaLumin pilot. However, the delivery mechanisms remained the same throughout the pilot period.

mailed energy update example


Every two weeks, all 75 participants were mailed energy updates. At mid-month, participants received information outlining progress toward their total monthly energy spending. At the end of the month, participants received a summary of their total monthly energy spending. Energy spending information was presented in several forms through the different available features: Spending Forecast (predictive), Appliance Spending Diagnosis (appliance-level information), Energy Savings Goal (progress towards goal), and Program Match (utility program-related information). Throughout the pilot period, 46 participants reported that they referred to their mailed energy update at least once every month.

DELTALUMIN. Here is your end-of-month Energy Update. View this email in your browser.

Hello, Lucy.

Your Savings Goal: Below you can see how you did with the personalized money-saving goal we set for you for the month of June. Congratulations! You met your savings goal this month. You will have the opportunity to set a new goal at the next Pilot Participant Workshop. Your goal for June is $81.65. Spent: $63.52. $18.13 left.

Your Monthly Spending: See our estimate of how much you spent on energy in June and how it compares to previous months. Apr: $69.52. May: $58.56. Jun: $63.52.

Your Appliance Spending Diagnosis: See how much we think you spent on your appliances and electronics in June. Refrigerator: $15.34. Lighting: $7.02. AC: $6.38. TV: $5.09. Electric Range: $1.23. Update: We have updated some of our formulas in the last few weeks, so you may see some changes to this section. Thanks for your continued feedback. Want to see more?

Please note: The information DeltaLumin shares with you on your dashboard and through DeltaLumin mailings is estimated based on the types of appliances in your home as well as your past billing patterns. This information is for educational purposes and should only be used as a guideline; it will not impact your current or future bill amounts.

Have questions? Let us know with an email or a phone call: (312) 487-1087.


In addition to their mailed energy updates, participants who specified an email address received the same information in an email. Throughout the pilot, 28 participants reported that they referred to their emailed energy update at least once every month, and 24 participants reported referring to both the email and mail versions of their energy updates.

Workshop Series

The DeltaLumin pilot program included five in-person events: two kick-off events and three pilot participant workshops. At the workshops, we listened to the participants as they told us their first-hand experience with our material. Participant feedback was collected via written surveys, facilitated activities, and open-ended discussions. Each workshop is described below including its objective, a summary of the hands-on activity, and the results we drew from each experience.

Pilot Kick-Off Events: Introduction to the Program

The objective of the kick-off event activity was to introduce participants to the program, let them meet team members in person, and gather baseline information on their attitudes, opinions, and energy use. In anticipation of the first Energy Update, which would include an Appliance Spending Diagnosis outlining estimated energy use per appliance, the activity aimed to better understand how participants currently think about the energy consumption of individual appliances. Participants were asked to rank appliances and electronic devices based on how much money each contributes to their monthly energy bill.

We found that participants did not have a consistent understanding of which items in their home were contributing the most to their monthly energy spending. There was discussion not only around how much energy particular appliances and electronics use, but also around how they are used by participants and other members of their households. We found that participants made significant connections between energy use behaviors and feelings of safety (i.e. leaving TVs or lights on overnight to feel safe). We also found that there was misinformation about how appliances and electronics use energy when turned on and off.

Workshop 1: Reactions to First Energy Updates

The objective of the first pilot participant workshop was to solicit feedback from participants about the clarity and effectiveness of the energy updates they had received in the previous month. The features that were evaluated included the forecast feature and the appliance-level energy use feature. During the activity, participants discussed the layout and design of the information presented in the energy updates. Participants were also given an opportunity to change their answers to the appliance ranking activity that took place during the kick-off event based on information they had received in the first energy update, most notably the Appliance Spending Diagnosis.

We found that participants appreciated the simplicity and clarity of the information being communicated. They also liked the use of dollars rather than kWh to show the quantity of energy being consumed. Participants found the appliance diagnosis to be the most interesting; however, the accuracy of information was a concern as we continued to refine our model for predicting appliance-level energy consumption.

Workshop 2: Setting Energy Savings Goals

The objective of the second pilot participant workshop was to introduce participants to the idea of setting energy savings goals and to understand participant thought processes around setting goals in other areas of their lives. The activity presented participants with the energy use and occupancy information for a hypothetical household along with a specific energy savings goal for that hypothetical household. Participants were then asked to work together to identify the most feasible and effective ways to meet the household’s savings goals from a list of conservation and efficiency energy-saving actions. At the end of the activity, participants were asked to set their own energy savings goal. Participants were given personalized goal setting cards with suggested savings goals of 5%, 10%, and 15% with dollar savings calculated using their historic use for that month.
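The personalized goal-setting cards can be sketched as a small calculation over a participant's historic spending for the month (the function and field names are illustrative):

```javascript
// Build the three suggested goals (5%, 10%, 15%) from a participant's
// historic spending for the same month, with dollar amounts rounded to
// cents for the printed card.

function goalCard(historicMonthlySpend) {
  return [0.05, 0.10, 0.15].map((pct) => ({
    percent: pct * 100,
    savingsDollars: Math.round(historicMonthlySpend * pct * 100) / 100,
    targetDollars: Math.round(historicMonthlySpend * (1 - pct) * 100) / 100,
  }));
}

// Example: a participant who typically spends $85.95 in June.
const card = goalCard(85.95);
// 5% option: save $4.30, spending target $81.65
```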

From the hypothetical household activity, we found that participants were surprised by which conservation actions would yield the greatest savings. The savings, upfront costs, and returns on investment for large efficiency actions (all but replacing light bulbs) were generally what participants expected. The goals set at the end of this workshop were generally more feasible than the savings goals discussed in the project’s early research phase: an energy savings goal of 5% was chosen by 60% of workshop participants; in contrast, during the initial research phases of the DeltaLumin project, common responses to a similar prompt included saving $20 or more on a bill of $100, a reduction of 20% or more.

Workshop 3: Matching Participants with Existing Utility Programs

The objective of the third and final pilot participant workshop was to introduce participants to existing utility programs that match their current lifestyle, home types, and appliances, and to gauge their willingness to participate in such programs. The activity at the final workshop involved discussing several existing utility programs, providing hypothetical household scenarios, and listening to which utility programs the participants chose for their hypothetical households. We gained insight into participants’ processes for balancing the risks and benefits of participating in these programs.

We found that although some participants had participated or considered participating in the programs covered in the activity, many had not. Participants were generally hesitant to commit to ongoing programs that could result in compromised comfort, but appreciated information on one-off programs that provide one-time incentives without giving up comfort or flexibility. They also appreciated the customized information provided on the estimated savings for participating in each program.

Customer Service

To enable participants to contact DeltaLumin team members directly with questions or concerns, we created the “DeltaLumin Hotline” with a dedicated email address and phone number. The Hotline was staffed from 1–4pm, Monday through Thursday. During this time, a member of the DeltaLumin team was responsible for answering calls and emails, responding to any missed calls and emails, and manually entering any surveys or workshop RSVPs received. To ensure good customer service and consistency among team members, we created a customer service manual, which covered phone and email etiquette, FAQs, strategies for handling tough questions, desk duty responsibilities, and instructions for using Salesforce.

We used Salesforce to manage all of our participant contact information, track participant communications, and track workshop attendance and payments. Using this tool, we were able to streamline transitions between team members on desk duty and prepare for follow-up communications and troubleshooting.

Figure: Customer service statistics.

DeltaLumin staff also used the “desk duty” time to follow up with participants whose surveys were incomplete or to confirm their eligibility. During the enrollment period, having a person available to answer questions about the pilot and the expected participant commitments was an important part of building a trusting relationship with participants.

During the pilot implementation phase, the volume of calls and emails was drastically reduced. Participants would use the Hotline to RSVP for an upcoming workshop or to get workshop details. After each workshop, we reached out to all participants who did not attend and gave them the opportunity to make up the survey.

For the population enrolled in the pilot, phone was the most popular communication channel. During the pilot implementation phase, 17% of participants communicated with us via email, while 29% communicated with us via phone.

Learning from our Participants

We gained insights and reflected on user input using various methods throughout the DeltaLumin Pilot. This mixed-methods approach, which included in-person workshops, written surveys, and website analytics, allowed us to compare findings and triangulate insights.

Synthesizing the Workshop Series

Throughout the pilot, DeltaLumin team members interacted with participants directly at monthly workshops. Though we were aware that in-person interaction with participants could influence their survey responses, the workshop format allowed us to capture valuable unstructured feedback from our participants in these early stages of product development.

Several themes found in the workshops informed the iteration of the DeltaLumin suite of features throughout the pilot, and continue to influence the development of DeltaLumin. These themes include:

Survey Results

Monthly surveys were designed to collect user feedback including: (1) preferred frequency and medium of interaction with DeltaLumin materials, (2) evaluation of the clarity of the information presented in each feature, and (3) actions and intentions related to energy behaviors and decisions.

Communication Preferences

In the initial baseline survey, 49% of participants indicated that they preferred to receive energy updates from DeltaLumin by email, and 51% preferred mail. Throughout the pilot, all 75 participants received energy updates by mail, and the 53 participants who provided an email address also received the same information by email. Participants could also access their online dashboards at any time throughout the pilot period.


Reported interaction with energy updates, by channel:

                      Workshop 1   Workshop 2   Workshop 3
Opened their mail         90%          95%          90%
Opened their email        62%          63%          55%
Viewed the dashboard      67%          60%          61%
Did not open any           1%           5%           6%

At each workshop, participants were asked to report whether they interacted with their energy information by mail, email, or through the dashboard. Although in the initial baseline survey there was a near-even split between participants who preferred email communication and those who preferred to receive their energy updates via mail, throughout the pilot participants tended to refer to their mailed energy updates more than their emailed updates or their online dashboard.

Feature Evaluation: Readability

Participants generally found DeltaLumin communication materials easier to understand than their conventional utility bills. With the introduction of each new feature, participants were asked to evaluate its understandability by answering the following question: “How much do you agree or disagree with the following statement: ‘It is easy to understand my [feature].’” Note: Participants responded to this question before discussing the new feature with a DeltaLumin staff member, with the exception of the Program Match feature, which, due to programmatic timeline constraints, was surveyed after group discussion.

At the Pilot kick-off, 31% of participants reported that they agreed or strongly agreed that it is easy to understand their conventional utility bill.

After the introduction of each feature:

52% of participants agreed or strongly agreed that Spending Forecast was easy to understand,

60% of participants agreed or strongly agreed that Appliance Diagnosis was easy to understand,

68% of participants agreed or strongly agreed that Energy Spending Goal was easy to understand, and

85% of participants agreed or strongly agreed that Program Match was easy to understand.

This widespread understanding of the feature set was encouraging. Next, we looked at how participants’ comprehension translated into action or intention around energy-related behaviors and purchasing decisions.

Figure: Reported actions and intentions following each feature (Spending Forecast, Appliance Diagnosis, Energy Spending Goal, Program Match).

General Engagement

Throughout the pilot, the DeltaLumin team asked participants to report both conservation actions (changing behaviors to use less energy) and efficiency actions (acquiring energy-using equipment that uses less energy). Participants consistently reported increasing the actions they took to conserve energy, and the number of participants reporting sustained conservation actions steadily increased. This is an important finding, as conservation actions only continue to save energy if they are sustained over time.

Efficiency actions can take more time to implement since they often require a financial investment; however, we did observe an increase in the frequency of efficiency actions, even over the pilot’s short timeframe. The number of participants who increased the number of efficiency actions taken each month rose dramatically between the second and third months of the pilot.

DeltaLumin’s definition of engagement included both action and the intent to act in the future as a result of exposure to communication. This was especially important to capture because of the short duration of the pilot and the lead time required to act on some of the suggested energy saving strategies. Information on intention was only collected during the last two months of the pilot. The number of participants who reported that they intended to take more conservation actions nearly doubled between the midpoint and the end of the pilot, and the number of participants who said that they would be willing to increase efficiency actions more than doubled in the same period.

Figure: Percent of participants who increased, sustained, or decreased conservation and efficiency actions (months 1–3) and intentions (months 2–3; no intention data available for month 1).

Spending Forecast and Engagement

In an attempt to better understand why we observed this increase in energy-related action and intentions, we also looked at the relationship between the specific information communicated to participants in each feature and their reported actions and intentions.

Every month, pilot participants received a Spending Forecast in their mid-month energy update. This feature showed our estimate of their monthly spending-to-date, and what we anticipated their total spending for the month would be. In each month of the pilot, a higher percentage of participants who received communication indicating that they were on track to overspend their typical monthly bill reported increased conservation actions than participants who received communication indicating they were on track to meet or underspend their typical monthly bill.
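One simple way to produce such a projection, sketched below purely for illustration, is to extrapolate the participant's average daily spending so far across the full month. We do not claim this is the method DeltaLumin's model actually used; the function name and inputs are assumptions.

```python
from calendar import monthrange
from datetime import date

def spending_forecast(spend_to_date: float, as_of: date) -> float:
    """Project end-of-month spending by extrapolating the daily rate so far.
    A naive pro-rata estimate; a real model would account for weather,
    weekday/weekend patterns, and appliance-level usage."""
    days_in_month = monthrange(as_of.year, as_of.month)[1]
    daily_rate = spend_to_date / as_of.day
    return round(daily_rate * days_in_month, 2)

# Example: $48 spent by the 15th of a 30-day month projects to $96
print(spending_forecast(48.0, date(2016, 6, 15)))  # 96.0
```

Comparing the projected total against the participant's typical monthly bill then determines whether the update says they are on track or likely to overspend.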

In month one, 46% of participants who received communication indicating that they were likely to overspend their typical monthly bill reported increasing conservation actions, compared to 19% of participants who were told they were on track to underspend. In month two, the figures were 42% versus 22%, and in the last month of the pilot, 41% versus 11%. There was no consistent pattern in reported efficiency actions: in months one and three, participants who were told they were on track to overspend were less likely to take additional efficiency actions, while the opposite was true in month two. We hypothesize that this is a result of the higher initial monetary cost and effort required to implement efficiency actions compared to conservation actions.

Figure: Percent of participants who increased, sustained, or decreased conservation and efficiency actions in each month, comparing participants whose mid-month update indicated they were likely to overspend with participants whose update indicated they were on track. Hypothesis: participants told they were likely to overspend will take more action than participants told they are on track.

Appliance Diagnosis and Engagement

Pilot participants received an Appliance Spending Diagnosis in each of their mid- and end-of-month energy updates. This feature illustrated the five appliances or electronic devices that were using the greatest amount of energy (and thus costing the most money to operate) that month, and estimated the participant’s monthly spending-to-date on each item. It was difficult to measure the relationship between appliance-level communication and reported intention or action; however, we did find that participants liked and appreciated the feature as a way to identify opportunities for savings.
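At its core, this feature is a ranking of appliances by estimated month-to-date spend. The sketch below, with made-up appliance names and dollar values, illustrates one way such a ranking could be computed; it is not DeltaLumin's actual disaggregation model, which must first estimate per-appliance spend from smart meter data.

```python
def top_appliances(estimated_spend: dict, n: int = 5):
    """Rank appliances by estimated month-to-date spend, highest first."""
    return sorted(estimated_spend.items(),
                  key=lambda kv: kv[1], reverse=True)[:n]

# Illustrative (fabricated) per-appliance spending estimates in dollars
spend = {"refrigerator": 11.40, "AC": 27.10, "water heater": 18.25,
         "TV": 4.80, "dryer": 9.60, "lighting": 6.30}
print(top_appliances(spend))
# [('AC', 27.1), ('water heater', 18.25), ('refrigerator', 11.4),
#  ('dryer', 9.6), ('lighting', 6.3)]
```

The hard part of the real feature is estimating the per-appliance numbers in the first place; once those exist, presenting the top five is straightforward.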

Goal Setting and Engagement

In months two and three of the pilot, participants received communication about their progress toward their energy savings goals in their mid- and end-of-month energy updates. For month two, all participants were assigned a suggested goal of 5% savings. During the workshop in advance of month three, participants were asked to set their own energy savings goal. We expected to see significantly more engagement in the month that they set their own goals, but observed no change in engagement levels corresponding to assigned vs. self-selected goals.

Website Analytics

Energy Dashboard views were tracked using Google Analytics. Higher website visit counts appeared to be associated with the delivery of monthly updates and attendance at pilot workshops. Website views also became more consistent later in the pilot.

Figure: Number of visits to the website on each day of the pilot, with markers for mid-month and end-of-month update deliveries and for Workshops 1 and 2.


Looking Forward

The first year of DeltaLumin produced a number of features that allowed users to better understand their energy use. In year 1, we also explored two other features, Flexible Payments and Giving Back, which give users motivations to save energy beyond their energy use information alone. Year 2 will focus on building out a payment platform that allows flexible payments and on refining our year 1 feature set. We will also build out our onboarding program to meet the unique circumstances of low-income customers.


Get In Touch

Meet the Team

Kevin Dick, LEED AP

Project Management & Product Ownership

Colleen McGinnis

Visual Design & Development

Martin Brown

Data Management & Energy Modeling

Nishaat Yunus

Program Communications & Outreach

Ryan Anderson

Financial, Business, & Energy Modeling

Helen Behnke-Hanson

Payments Research & Administration