Not all renewables are created equal: quantifying the emissions benefits of institutional renewable energy purchasing options

By Gavin McCormick and Chiel Borenstein, in partnership with Jaclyn Olsen and Caroleen Verly from the Harvard University Office for Sustainability and Chad Laurent from Meister Consultants Group (A Cadmus Company)

In recent years, institutional climate action targets, renewable energy subsidies, and the rapidly falling costs of wind and solar have led more and more large institutions to begin purchasing significant quantities of off-site renewable energy. The practice has grown rapidly, from 70 megawatts purchased in 2012 to over 2,780 megawatts, as of February 2018. Naturally, all these new renewables are reducing pollution. But…exactly how much pollution?

The Boston Green Ribbon Commission Higher Education Working Group, an alliance of leading sustainability-minded institutions, aimed to find out. The Working Group’s chair, Harvard University, partnered with Meister Consultants Group (a Cadmus Company), and RMI subsidiary WattTime to conduct a study exploring methods for quantifying the actual emissions impacts of institutional renewable energy purchases. The results were intriguing.

Notably, the study, entitled Institutional Renewable Energy Procurement: Quantitative Impacts Addendum, found that the answers may be less straightforward than they initially appear. Evidently, not all renewable energy projects are equally effective at reducing emissions. (Currently, the most common emissions accounting framework treats all renewable energy projects as equally reducing emissions.) Better measuring this variation in impact between projects could soon create new opportunities for renewable energy buyers to reduce emissions faster, more cheaply, more reliably, and more credibly, thanks to an evidence-based approach.

The Higher Education Working Group—consisting of Boston College, Boston University, Harvard University, MIT, Northeastern University, Tufts University, and the University of Massachusetts, Boston—had already been active in illuminating and streamlining institutional renewable energy purchasing. In 2016, the group authored a report in partnership with Meister Consultants Group offering detailed background information on renewable energy procurement options, as well as guidance on impact claims for institutions already making or looking to make renewable energy purchases.

While attending an RMI Business Renewables Center (BRC) member event, Jaclyn Olsen, Associate Director of Harvard’s Office for Sustainability (OFS), met Gavin McCormick, co-founder and Executive Director of WattTime, and became intrigued by the work WattTime was doing on quantifying the carbon impacts of renewable purchases. Jaclyn proposed a partnership to build on the research the Working Group had already done on the topic, and the result was a collaboration between OFS, WattTime, and Meister Consultants Group to create a report that brings this new way of assessing the emissions reduction impacts of renewable purchases to the Working Group members and other potential purchasers.

Three Ways to Count Emissions

Most institutions today report their greenhouse gas emissions using the carbon footprinting approach, as laid out in the Greenhouse Gas Protocol (GHGP). While the process involves multiple methods, hierarchies of emissions factors, and other complexities, at a high level it’s a simple approach: organizations count how much regular electricity they purchase from the grid, subtract the amount of renewable energy they purchase, and multiply the remainder by the average emissions intensity of the local grid. This framework allows for straightforward comparison of renewable energy commitments across institutions; however, it does not differentiate between the varying carbon impacts of different renewable energy projects.
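
For example, a minimal sketch of that carbon footprinting arithmetic, using purely illustrative numbers rather than figures from the GHGP or the report, might look like this in Python:

    # Illustrative carbon footprinting sketch (simplified; numbers are hypothetical).
    grid_purchases_mwh = 50_000        # electricity bought from the grid this year
    renewable_purchases_mwh = 20_000   # off-site renewable energy purchased
    grid_avg_intensity = 800           # average grid emissions intensity, lbs CO2 per MWh

    net_mwh = grid_purchases_mwh - renewable_purchases_mwh
    footprint_lbs = net_mwh * grid_avg_intensity
    print(f"Reported footprint: {footprint_lbs:,} lbs CO2")  # 24,000,000 lbs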

Before we describe the study’s findings, it is important to note that carbon footprinting is not the only way to measure emissions. The Quantitative Impacts Addendum study identifies three different ways institutions can measure the emissions impacts of renewable energy purchases: (1) the status quo, carbon footprinting; (2) avoided emissions; and (3) quantification through the generation of carbon offsets. Each has its own benefits and drawbacks.

The study’s primary goal was to uncover the implications of these differences, so that institutions making renewable energy purchasing decisions will have a broader and deeper understanding of the emissions impacts of the projects they are considering.

1) The Status Quo: Counting Megawatt-hours, Not Emissions

The simplicity of carbon footprinting comes at a cost. The GHGP is very explicit that this approach measures the change in emissions that an institution “owns” in an abstract accounting sense, not necessarily the actual real-world emissions reductions caused by renewable energy purchases.

The reason this distinction matters is that the real-world emissions reductions can vary widely. After all, adding renewable energy to the grid only reduces emissions if it displaces existing power plants. But which power plants are displaced? A renewable energy project that displaces mostly coal will reduce considerably more emissions than one that displaces natural gas, or even other emissions-free resources like hydropower.

2) A Measurement Change: Avoided Emissions

The avoided emissions method is also defined under the GHGP, and is classified as an optional calculation. This method establishes a framework for measuring not megawatt-hours, but emissions. By measuring which existing or future power plants a renewable energy project displaces, it measures the actual emissions impacts of a project.
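
A rough sketch of that calculation, assuming you already have the project's hourly output and an estimate of the emissions rate of the generation it displaces each hour (both hypothetical below), could look like:

    # Avoided emissions weight each MWh by the emissions rate of the generation it
    # actually displaces, rather than by a single grid-average factor.
    # All inputs are hypothetical and for illustration only.
    hourly_generation_mwh = [0, 0, 5, 12, 15, 10, 3, 0]                # project output by hour
    displaced_intensity = [900, 900, 1800, 1100, 950, 900, 1700, 900]  # lbs CO2/MWh displaced

    avoided_lbs = sum(gen * rate for gen, rate in zip(hourly_generation_mwh, displaced_intensity))
    print(f"Avoided emissions: {avoided_lbs:,} lbs CO2")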

Employing this methodology, the differences in emissions impacts between renewable energy projects can be substantial. The report finds that renewable energy purchases by Boston-area schools could avoid anywhere from 791 to 2,187 pounds of carbon dioxide per megawatt-hour—nearly a threefold difference among projects of identical size—depending on the power plants being displaced.

It’s important to note that while the GHGP allows organizations to measure avoided emissions, the GHGP does not allow organizations to use these calculations in their main emissions inventory. So organizations that declare carbon targets and choose to voluntarily define them in terms of the emissions inventory cannot use the avoided emissions method. This could lead to a situation where the claimed emissions reduction is higher or lower than a more accurately calculated value.

3) Carbon Offsets: Counting Emissions Towards Declared Targets

Unlike the avoided emissions method, projects measured using carbon offsets can be “counted” towards an institution’s official emissions inventory. To ensure the integrity of that system, projects are only eligible for carbon offsets if they pass a series of tests verifying that they are valid and additional (truly reducing emissions beyond what would have occurred in the project’s absence). While it ensures the highest levels of accuracy, the carbon offset process is also much more time-consuming and administratively burdensome than the avoided emissions approach. It is also very difficult to prove additionality for renewable energy projects, so many will not be eligible.

Pros and Cons of Each Method

There are clearly pros and cons to each approach. In determining which method to use, key factors institutions could consider include the following:

Where Next?

The main reasons to measure emissions are 1) to ascertain as accurately as possible whether we are collectively moving towards the emissions reductions we all know are needed, and 2) to allow actors to make accurate comparisons of the impacts of different choices.

When some institutions use one method and others use a different method, it is difficult to accurately compare the impact of individual actions or to calculate the collective impact. There is a need for a clear and consistent way for institutions to accurately measure the impacts of renewable purchases. It would certainly be possible for the GRC Higher Education Working Group member institutions to collectively define a new standard that draws on the best elements of the three methods and discards the drawbacks. Regardless of the method schools select (or create), acting together maximizes transparency and reduces administrative costs. The report recommends that, whatever approach the Working Group takes, the members decide on it collectively.

Do you really know where your electricity is coming from?

Four years ago, I was part of a group of graduate students from UC Berkeley and software engineers from Google and Climate Corporation who met at a hackathon. We unexpectedly discovered that by pooling our combined skills, we could solve a problem that hadn’t been cracked. For the first time, we could know, when we flip on a light switch, exactly where that power comes from.

We wanted to know this because with the rise of energy storage and smart devices, it was getting easier and easier for us to automatically set our equipment to use energy at any particular time we liked. But as environmentalists, we were struck that no one had ever answered this question: if I want to run a device when the grid is providing the cleanest energy, “watt time” is that?

We knew that, more and more often, power grids were experiencing brief moments of surplus clean energy. But when? To find out, we built our own software tool to determine—in real time—where our power was coming from. Soon, we had an app that could tell us specific times that we could use, say, our own laundry machines so that they would be running on surplus wind power. We were thrilled to become the first people on a modern power grid to not just passively consume energy, but to actively choose where and how our energy was being made.

Afterwards, we marveled that it had been so effortless to choose clean energy at the simple press of a button. But although the technology could let anyone just say no to polluting, it didn’t save any money. And as the economists well knew, no energy technology had ever scaled that didn’t save money. We assumed that was the end of it.

But the team couldn’t stop thinking about it, and more and more volunteers with deep technical expertise in the energy industry joined the effort. We started hearing from team members with day jobs at the World Resources Institute, the U.S. Department of Energy, Navigant, MIT, Stanford, PG&E, and countless other institutions. Two hundred and thirty volunteers and a lot of customer research later, we belatedly realized we had been wrong. People wanted this technology. A lot of people.

They showed us just how many thermostats, appliances, batteries, lighting systems, and other types of commercial devices (23 billion of them worldwide) were connecting to the internet in order to make smart choices. Nearly every one of those devices could be “WattTime-enabled” to effortlessly, instantly allow its owner to choose energy that fit the owner’s values. And because it ran in the cloud, our solution was fully capable of cutting the carbon footprint of a million-device fleet in minutes.

Stunned by the potential impact, we decided to build a system we call automated emissions reduction (AER). AER distills the massively complex problem of identifying where your power comes from into a simple data feed that a smart device can read with the addition of two lines of code. With AER, WattTime is making consuming cleaner energy simple, effortless, cheap, and automatic.
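
To make the “two lines of code” claim concrete, here is a minimal sketch of what a device-side integration could look like. Every name in it (the signal function, the region string, the 0.5 threshold) is invented for illustration and is not WattTime’s actual API:

    def get_relative_intensity(region: str) -> float:
        """Stand-in for a real-time marginal-emissions feed (hypothetical, not WattTime's API)."""
        # A real integration would make one request to the provider's documented data feed here.
        return 0.42  # the grid's marginal intensity right now, relative to its typical level

    def start_flexible_load() -> None:
        """Stand-in for whatever the device already does (start a charge, a cooling cycle, etc.)."""
        print("Running the load now")

    # The "two lines" a device maker adds to an existing control loop:
    if get_relative_intensity("MY_GRID_REGION") < 0.5:   # is the grid cleaner than usual?
        start_flexible_load()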

Our earliest AER implementations automatically reduced emissions from humble electric golf carts for the UC Merced sustainability team. We steadily progressed to automatically reducing emissions from refrigerators, then air handlers, and soon, entire building energy management systems for UC Berkeley.

Over time, we began working more and more with one of the most respected institutions in sustainability, Rocky Mountain Institute. RMI carefully validated our algorithms, examined our code and our potential impact, and helped us workshop the fledgling AER industry with 60 interested organizations in Chicago in spring 2017.

Today we are thrilled to announce that, after thoroughly vetting the tech, RMI has both validated our work and decided to bet big on it. This week, RMI is formally incorporating WattTime as a subsidiary organization. We’ve gone from a team of volunteers to a nonprofit tech startup with a mission to allow the most accurate and credible measurement possible of emissions reductions. And we will benefit from the resources, the network, and the objectivity of the RMI team.

This partnership is a match made in heaven. RMI’s high-level vision of a next-generation, customer-centric electricity system aligns perfectly with WattTime’s dogged pursuit of a disruptive technology solution that gives anyone who uses energy, from people to large corporations, the right and the tools to choose for themselves how their energy should be made.

A concept launched by a small team of committed volunteers has become a reality and a movement around a common-sense idea: electricity users need the freedom to choose their power. Microsoft has joined our efforts, as have sustainability leaders from Kaiser Permanente to the City of Austin. Will you join us?

Email us today at contact@watttime.org

Gavin McCormick is cofounder and executive director of WattTime.

Beyond energy efficiency – using the power of data to find the cleanest hours of the day

By Rob Bernard, Josh Henretig, TJ DiCaprio -- Microsoft
Originally posted on the Microsoft Green Blog

On an average morning, you turn off your alarm, turn on the lights, power on your smartphone that was charging overnight, take a hot shower, make a cup of coffee, all while watching the local news. This morning routine is all powered by electricity. The green-minded citizen will turn those lights and appliances off quickly, take a shorter shower, and make sure everything is off before leaving the house. Taking those energy-efficient steps is helpful.

But what if you wanted to do more to help the environment by changing not only how much energy you consume, but what kind of energy you consume? That’s a bit more challenging. At present, most households have no choice or ability to directly influence their individual energy mix—but thanks to big data that’s all about to change.

The Smart Energy Azure Demonstration platform is user-friendly and available to anyone with an Azure subscription. The solution builds on the tremendously innovative work done by WattTime. Their API provides data on the generation mix down to the megawatts generated from each fuel source; average carbon emissions; and marginal carbon emissions, which is the part of the carbon footprint that you can actually affect by using or conserving energy at a particular place and time. And because the grid’s energy mix changes with the weather, the platform also pulls in global weather data and forecasts from the Wunderground API.

With data sets customized to their local power grids, consumers can make much more informed decisions about how to adjust their energy consumption and cut energy costs. But knowing this information is just the beginning. By combining these insights with a Microsoft IoT suite that enables users to sync their home devices with the system’s data, users will soon be able to optimize their homes’ energy use in real time. (The steps for getting the system up and running are clearly detailed in the GitHub page for the solution.) By doing this, households can leverage new solutions, like smart thermostats and smart home apps, to tailor their individual energy use even further and proactively align it with the times of day when more clean energy is available on the grid.
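
As a sketch of what that optimization amounts to, suppose a device has an hourly forecast of marginal emissions for its grid region (the numbers below are invented; the real platform assembles this from the WattTime and Wunderground data described above). Picking the cleanest hour is then a one-line decision:

    # Choose the cleanest upcoming hour to run a flexible appliance (e.g., a dishwasher).
    # Hypothetical forecast of marginal emissions (lbs CO2/MWh) by hour of day.
    forecast = {13: 950, 14: 820, 15: 610, 16: 580, 17: 720, 18: 1150, 19: 1300}

    cleanest_hour = min(forecast, key=forecast.get)
    print(f"Schedule the appliance for {cleanest_hour}:00 "
          f"({forecast[cleanest_hour]} lbs CO2/MWh on the margin)")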

These small changes can make a big impact. According to the Rocky Mountain Institute (RMI), enabling water heaters and air conditioners to adjust their timing just slightly could reduce carbon emissions in the United States by over six million metric tons per year—the equivalent of taking one million cars off the road. In addition, RMI found that carbon emissions from loads connected to the PJM grid in Chicago, IL, can be reduced by 5 to 15 percent simply by prioritizing energy usage for periods when coal plants are not on the margin.

To put this theory into practice, we’re working to test the Smart Energy Azure Demonstration platform in enterprise-level applications, like universities. This year, we’re teaming up with Princeton University on a “Marginal Carbon Emissions Project” to see how the platform performs in a larger, multi-building campus setting and to co-develop new projects, including one that would allow the university to measure the CO2 emissions of using the grid compared to tapping Princeton’s onsite power generation at any given time. This will allow the university to further customize its energy utilization and drive daily efficiency.

At Microsoft, our goal is to empower our customers with the tools and technology to achieve more, sustainably. We’re excited by the potential of this and other new technology to help consumers make more informed energy decisions by bringing data to their fingertips—so that running a greener home is as easy as making your morning coffee.

Combating climate change by measuring carbon emissions correctly

By Jamie Mandel and Gavin McCormick. Originally posted on RMI Outlet.

Carbon emissions are arguably the most important thing for our society to learn how to manage in the coming years. The largest single source of U.S. carbon emissions is our electricity system. And yet, we do not measure emissions from our electricity use correctly, meaning we cannot manage our emissions effectively.

But now, thanks to a new technology that accurately measures moment-to-moment carbon emissions on our electricity system, we can unlock a whole host of new opportunities to manage emissions creatively and with less effort. With new software that automatically tracks the actual emissions impacts associated with specific actions on the electricity system, both in real time and ahead of time, we can now use our appliances at times when our electricity is the cleanest.

End-use flexibility

Many uses of electricity have inherent flexibility—that is, the timing can be changed by small or large amounts without impacting the quality of the service that device is providing. As Rocky Mountain Institute explored in The Economics of Demand Flexibility, harnessing this flexibility can save consumers and companies money while lowering grid costs.

The same is true of carbon emissions—harnessing the flexibility of end-use devices can make them run, on average, 15 percent cleaner than a “dumb” device, at no cost or quality impacts for the end-user.

Millions of people and thousands of corporations try every day to manage their carbon emissions. Unfortunately, much of this effort occurs without measuring these emissions correctly. Personal and corporate efforts to manage carbon emissions from electricity typically happen in one of two ways:

1) Without any measurement, by focusing on efforts that are generally associated with reduced emissions. For example, many corporations invest in things like efficiency, solar PV, and grid-sourced clean energy, but do not attempt to quantify the emissions savings associated with specific investments.

2) With coarse measurement of average emissions intensity, primarily by using eGrid historical data to estimate averages for electricity-related emissions. For example, a corporation might deliberately site a data center at a location on the grid that is, on average, cleaner than other options and claim some associated carbon emissions savings.

Thanks to new technology, it is now possible instead to know the actual emissions impacts associated with specific actions at a specific place on the electricity system, in real time—and even ahead of time through predictive algorithms. More importantly, it is now possible to assess future decisions based on marginal—rather than average—emissions factors, which, according to most economists, is the correct way to properly understand emissions impacts.

The emissions hidden in the margins

The difference between average and marginal emissions factors can be very large, and quite important. An average factor refers to the amount of emissions generated over a given time, divided by the amount of energy produced in that time. For example, the U.S. Pacific Northwest gets most of its electricity from hydropower, a low-emissions energy resource, and thus its average emissions factor is very low.

A marginal emissions factor refers to the rate at which emissions would change with a small change in electricity load. Continuing the simplified Pacific Northwest example, imagine a time when hydropower is providing 75 percent of the region’s power and gas-fired power plants are providing the remaining 25 percent. This means that the average emissions factor of power in the Pacific Northwest would be very clean, at 25 percent of the emissions intensity of natural gas, or approximately 210 lbs. of CO2 per megawatt-hour (MWh). So at first glance, a great way to reduce a company’s or a person’s carbon footprint would be to move to the Pacific Northwest, where the electricity is very clean.

Yet in many cases, natural gas is the marginal resource, meaning that if a new kilowatt-hour of electricity is needed at a certain time, it will be provided by natural gas. So a company or an individual moving to the Pacific Northwest would increase carbon emissions at a rate equal to 100 percent of natural gas (840 lbs. CO2 per MWh)—a very big difference! Thinking in marginal rather than average carbon emissions can dramatically change which choices a company or a person sees as best for the environment.
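
The arithmetic behind that example is worth seeing side by side:

    # Simplified Pacific Northwest example from the text.
    gas_intensity = 840    # lbs CO2 per MWh for gas-fired generation
    hydro_intensity = 0    # treat hydropower as emissions-free for this illustration

    average_factor = 0.75 * hydro_intensity + 0.25 * gas_intensity   # 210 lbs CO2/MWh
    marginal_factor = gas_intensity                                  # 840 lbs CO2/MWh: gas serves the next kWh

    print(average_factor, marginal_factor)  # 210.0 840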

Estimating emissions impacts based on average emissions factors can have these types of effects on a recurring basis, across the U.S. This is because the portfolio of generators dispatching energy into the grid changes every five to 15 minutes, changing the marginal resource. For example, Midwest utilities mostly burn coal at night; if you own an electric vehicle there, you would have lower CO2 emissions if you deliberately charged it during the day. On the other hand, California’s electricity market has more efficient gas plants on the margin at night than during the day, so you should charge your electric vehicle (EV) in the evening to minimize your CO2 emissions. And with an Internet-connected EV charger, you can cut emissions even further with micro-timing. For example, you can time the EV charging to shut down when less-efficient peaking plants briefly kick on (say when the wind subsides or a cloud passes over), and turn it back on five minutes later when the wind returns or the cloud moves on and the marginal generator is cleaner.
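
A sketch of that micro-timing logic, with a hypothetical random signal standing in for live marginal-emissions data:

    import random
    import time

    DIRTY_THRESHOLD = 1000  # lbs CO2/MWh; hypothetical cutoff for "a peaker is on the margin"

    def get_marginal_intensity() -> float:
        return random.uniform(400, 1600)  # placeholder for a live marginal-emissions feed

    def charge_loop(minutes: int) -> None:
        """Re-evaluate every five minutes and pause charging during brief dirty spells."""
        for _ in range(minutes // 5):
            if get_marginal_intensity() < DIRTY_THRESHOLD:
                print("Charging")
            else:
                print("Paused: waiting for a cleaner marginal generator")
            time.sleep(1)  # stand-in for waiting five minutes

    charge_loop(30)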

Things you can do when you account for carbon emissions correctly

Accounting for carbon emissions correctly unlocks a whole host of new emissions-management opportunities. You can:

RMI and WattTime are working together to measure carbon emissions correctly and reduce them cost-effectively.

WattTime is a California-based nonprofit that has developed software to accurately forecast carbon emissions on the margin, in real time. This data can be used to control the timing of device charging, apply carbon emissions data to the models that renewable energy developers use to site projects, provide strategic advice to corporations on how to most cost-effectively reduce emissions, and provide more accurate reporting and verification of emissions.

RMI is using this new technological tool to unlock new markets for carbon reduction, and to maximize the value of these reductions. This technology can be used to improve the profitability of distributed energy resource companies and retail energy providers by lowering customer acquisition costs, accelerating corporate sustainability efforts, and improving the way that carbon emissions are measured and, ultimately, priced.

For example, 240 EV customers nationwide are charging their EVs with cleaner energy than their neighbors. Thousands of thermostat customers in Chicago are learning that cooling their houses with fewer carbon emissions is as easy as pushing a button. By using WattTime, millions of independent devices can be seamlessly checking the emissions content of the grid and making small decisions about the timing of electricity use to lower carbon emissions.

A key founding principle at RMI is that people don’t want raw kilowatt-hours. They want hot showers, cold beer, and illumination. Similarly, the planet doesn’t care how many kilowatt-hours we reduce. It cares how much we reduce CO2 emissions. So why not start measuring them directly? Together, we will help people and companies easily reduce their carbon emissions to help create a world that’s thriving, verdant, and secure, for all, for ever.

How we use Librato to monitor data quality

What’s the problem?

WattTime analyzes power grid data in real time from dozens of open data sources. Because the data is used to optimize the behavior of smart devices in real time, it is very important that we always have the most accurate up-to-date data. This means that we have had to tackle a classic engineering problem: building a highly reliable system out of less reliable components.

There are two main sources of unreliability in our data ingestion system. First, any of our incoming data sources can go down for a period of time without warning, creating a gap in our data record. Second, our cluster of worker servers may not run the data scraping jobs for any number of reasons, deepening the potential gaps. You can see how we’re in a tricky predicament of always needing the most accurate data and yet having a number of reasons for something to go wrong.

As the saying goes, if you can’t measure it, you can’t manage it. Today’s post explains how we built a monitoring and alerting system to detect gaps in our data ingestion pipeline using Librato.

Designing the Solution

Here are the primary characteristics we wanted in a monitoring and alerting system:

  1. The solution needs to run 24/7. We want to provide our end-users with the cleanest available energy, and whether the current energy supply is clean can change every five minutes. If our data is not up-to-date, then the energy supply can change without our knowledge, and we might miss an opportunity to give our users a choice to save carbon.
  2. The solution needs to run in a way that's isolated/decoupled from how the data scraping tasks normally get run. We don’t want our monitoring system to be dependent on the system it’s supposed to be monitoring! This means that we’d either have to spin up our own separate monitoring service (and maybe a monitor for the monitor…) or use a reliable third-party SaaS tool.
  3. The solution needs to have a way to identify any new gaps in our data as soon as possible, so we can triage and fix the problem before it gets worse. If we’re using a third-party tool, that means we need to give it a way to hook into our data pipeline.
  4. The solution needs to have a way to send us alert messages when a problem occurs. We like the workflow of Slack, but email would be OK as a fallback.
  5. The solution should make it easy to configure the frequency and thresholds for triggering alerts. Overly noisy alerting systems get ignored, so we wanted a system that didn’t bombard us with notifications, sending only the important, actionable ones.

The WattTime API is currently running on Heroku, a popular cloud platform-as-a-service provider. Heroku has a great ecosystem of high-quality third-party “add-ons” that we can trust to have good uptime (satisfying criterion #1), even if something bad happens on our end (satisfying criterion #2). We decided to start by surveying Heroku’s add-ons to see which ones would help us satisfy our other design criteria: data pipeline integration (#3), Slack integration (#4), and configurability (#5).

After some doc hunting, we established that many add-ons had Slack integration, so that didn’t narrow our solution space very far. Instead, we decided to make our choice primarily based on the mechanism of integrating with our data pipeline. There was a wide range of options here: some add-ons would collect data from us if we printed it to our logs, others would collect data if we raised an exception, etc. Exception-based add-ons would be a great fit for detecting failed requests during data ingestion, but they wouldn’t help us monitor failures in our worker cluster overall. A sufficiently configurable log-based add-on, on the other hand, would be able to send alerts either if a problematic value appeared in the logs, or if the log stream stopped getting updated altogether. If such an add-on existed, it would allow us to meet all five design criteria.

And it does: Librato! Librato is a service for visualizing and creating alerts based on metrics. The Heroku add-on comes preconfigured to read metrics from Heroku log streams. In fact, we were already using Librato to visualize dyno performance metrics that Heroku printed to our log stream—but we hadn’t tried configuring any alerts. Reading Librato’s docs on alerts, we discovered that it supports both kinds of alert triggers we wanted. This made it a great choice for a monitoring service that we can configure to watch our data and alert us if something unexpected happens.

Step 1: Logging data quality

Librato is organized around the concept of a “metric.” A Librato metric can be any kind of time series data: something that can be graphed with a time stamp on the x axis and a number on the y axis. When a metric hasn’t been reporting, or hasn’t logged a new datapoint within a certain period of time, visualizations of the metric make it easy to see what is going on.

We chose “lag time” as the metric to track. We define lag time as the age of the most recently ingested data point of a particular type. Lag time serves as a good metric because it gets to the heart of the problem: if our goal is to have the most accurate, up-to-date data, we’d like to know when gaps start and how long they last.

To implement data quality logging, every time we collect data, we figure out how old the newest data point is, and print that number of minutes to the logs in Librato’s specific format. Here’s a screenshot of what the output looks like in our Papertrail logs:

Log messages formatted as Librato metrics, in Papertrail
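
In code, that step is just a timestamp comparison plus a print. The sketch below assumes the l2met-style measure#name=value convention that the Librato Heroku add-on parses from the log stream; the metric names are illustrative, not our production names:

    from datetime import datetime, timedelta, timezone

    def log_lag_time(metric_name: str, newest_point_time: datetime) -> None:
        """Emit the age (in minutes) of the newest ingested data point as a Librato log-line metric."""
        lag_minutes = (datetime.now(timezone.utc) - newest_point_time).total_seconds() / 60
        print(f"measure#{metric_name}.lag_minutes={lag_minutes:.1f}")

    # After each scraping job, log how stale that data source is, e.g.:
    log_lag_time("caiso.genmix", datetime.now(timezone.utc) - timedelta(minutes=7))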

Step 2: Setting up alerts

We have several ways to detect if something has gone wrong with our data collection. If a new datapoint has not been added within an acceptable amount of time, we want to be alerted so we can see if there is anything we can do on our end. To do this, we set “condition type” to “goes above” when creating an alert:

Example Condition

We can also check whether our metric stops reporting entirely. In that case, we set “condition type” to “stops reporting”. Both alerts serve to inform us as quickly as possible if something goes wrong.
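
Conceptually, the two alert conditions together amount to a check like this (the thresholds are illustrative; Librato evaluates the real conditions server-side from the settings above):

    MAX_LAG_MINUTES = 60        # "goes above" threshold (illustrative)
    MAX_SILENCE_MINUTES = 30    # "stops reporting" window (illustrative)

    def needs_alert(lag_minutes: float, minutes_since_last_report: float) -> bool:
        data_is_stale = lag_minutes > MAX_LAG_MINUTES
        metric_went_silent = minutes_since_last_report > MAX_SILENCE_MINUTES
        return data_is_stale or metric_went_silent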


Step 3: Receiving alerts

To set up Slack integration, Librato requires you to retrieve the webhook URL for your Slack channel. In the Slack desktop app, go to the top left and click on Apps & Integration. This will take you to a directory of different apps that can integrate with Slack. Search for "Incoming WebHooks."


From there, go to "add configurations" and you should be able to find the webhook URL.

Once you have the URL, you can go to Librato’s Integrations tab and click “add configuration” on the Slack tab. Paste in the URL and give it a title you’ll be familiar with.


Then whenever an alert is triggered, you’ll receive a Slack message showing you why the alert was triggered and other relevant information. 


What we're thinking about next

While it’s really nice for our system to alert us whenever anything goes wrong, we thought it would be even better practice if our system were self-healing. As it stands, our system sends us an alert, and from there a living, breathing human being has to take time out of what they’re doing and investigate. So for our next step, we will create a system to patch up data holes as they happen. That way, we can be more confident that we have the most accurate, up-to-date data at any time.
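
One way that self-healing step could work is sketched below; fetch_range and store are hypothetical hooks into our existing scraper and storage layer, not code we have shipped:

    from datetime import timedelta

    MAX_GAP = timedelta(minutes=10)

    def backfill_gaps(timestamps, fetch_range, store):
        """Scan the sorted timestamps already in the database and re-request any missing spans."""
        for earlier, later in zip(timestamps, timestamps[1:]):
            if later - earlier > MAX_GAP:
                store(fetch_range(earlier, later))  # patch the hole as soon as it is detected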

Partnership with Building Clouds

Do you manage a large commercial or industrial facility? Perhaps a university, apartment complex, or hotel? Today we're pleased to share that, thanks to WattTime's newest partnership with Building Clouds, you can now cut your emissions through WattTime-enabled technology in nearly any equipment.

Facility managers everywhere have learned that not all equipment manufacturers play nice with each other and that interoperability is a major problem. What we love about Building Clouds’ Strati-Fi(TM) controllers is that they allow remarkably quick and cost-effective monitoring and control of most commercial- and industrial-scale HVAC equipment. So whatever your building currently runs on, you can now WattTime-enable it by giving Building Clouds a call.

We've been working with Building Clouds for over a year to pilot this technology and work out all the kinks. Our first project began with Sutardja Dai Hall at UC Berkeley, which became the first building in the world to optimize its HVAC package's energy loads in real time to cut carbon emissions. Building Clouds President Bob Wallace met with us on the rooftop of Sutardja Dai Hall and demonstrated the technology installation, which took place in under an hour. We were able to collect data almost immediately and automatically started cutting carbon less than a week later.

The implementation and results went over so well with UC Berkeley that we recently agreed on a second project for the Residential and Student Services Building (RSSB). This deal included connecting Strati-Fi(TM) controllers to WattTime-enable two sixty-ton air handlers. Both projects have cut carbon with absolutely zero increase in the buildings' energy bills and zero impact on occupant comfort. In fact, as a mischievous test, we did not announce the project until one month after installation, during which no building occupants were even aware of any change.

After over a year of successful pilots, we're delighted to share that WattTime and Building Clouds are scaling up our partnership to release our technology to the broader market. If you're looking for cost-effective building automation solutions that also cut emissions, please take a look at buildingclouds.com or WattTime's own Shop page.

Energate Inc launches first WattTime-enabled thermostat

Happy Earth Day from WattTime! We’re delighted to celebrate it this year by launching our new partnership with Energate Inc, creator of the HōlHōm smart thermostat.

A select few of Energate’s HōlHōm smart thermostat owners in the Chicago area will soon be offered a new feature: “Clean Power Mode” by WattTime. Those of you already familiar with WattTime can guess how it works: in Clean Power Mode, these thermostats will actively prioritize electricity from environmentally friendly power plants by shifting electricity consumption to moments when those power plants have surplus energy.

As usual with WattTime, we’ve also bent over backwards to ensure that enabling this feature will be free, effortless, and will not affect how comfortable anyone’s home or office is. That’s possible because air conditioners and heaters work by continuously cycling on and off anyway, so they can easily deploy WattTime’s timing-based technology just by making those cycles happen intelligently, not at random times.

It really is environmentalism, made effortless. Sound pretty good? We know that in surveys, the vast majority of people agree, telling us that they would choose a smart thermostat with a feature like that.

But, as any good social scientist knows, it's easy to say something in a survey. You have to also check what people really do in practice. So, as part of the work we’re doing supported by the Great Lakes Protection Fund, our wonderful partner Delta Institute is helping us conduct this pilot with Energate as a careful, scientifically rigorous test. What Delta is measuring is, if two smart devices are sold side by side and only one of them offers Clean Power Mode, does it make buyers choose that one more often? If it turns out the answer is yes, we think other companies who sell smart devices will quickly get the message that choosing to go green is just plain good business. Since 40% of thermostat sales nationwide are now smart thermostats, that could add up to a lot of devices, pretty fast.

Because this pilot is a science experiment as much as a product release, not just anyone can sign up for a WattTime-enabled smart thermostat from Energate today. But if the pilot does find that Clean Power Mode is indeed popular, we’ll be expanding to other regions soon. If you’d be interested in trying the world’s first smart thermostat that automatically prioritizes clean energy, you can sign up on our mailing list here.

WattTime launches a pollution reduction collaboration in the Great Lakes

We are thrilled to announce that WattTime has received a substantial grant from the Great Lakes Protection Fund to lead a coalition of nonprofits and companies in reducing mercury pollution from coal plants. The project will be a collaboration between WattTime, Rocky Mountain Institute, National Wildlife Federation, Delta Institute, Energy Emissions Intelligence, and several corporate partners.

In 2008, a Federal court struck down the national Clean Air Mercury Rule that required coal-fired power plants to limit dangerous mercury emissions. With repeated attempts to replace the rule continuing to face uncertain political futures, badly-needed efforts to return to safe mercury levels in the Great Lakes have stalled.

Right now if you're using the power grid near the Great Lakes, you're dumping mercury in the water.

But a core WattTime value is choice. Whether it's mercury, carbon dioxide, or any other pollutant, we believe nobody should be allowed to make you pollute without your consent.

So in collaboration with this powerhouse team of leading names in environmental activism and technology, WattTime will be developing and deploying technology to make it possible for people in the area to tune their smart devices to “just say no” to drawing power from the dirtiest mercury-spewing coal plants. You can read more about the project here.

Would your smart home or smart building technology company like to showcase your eco-friendly credentials and join our pilot? There's still time to get involved: contact us to learn more.

WattTime featured at the UC Carbon Neutrality Summit

WattTime yesterday joined the University of California Carbon Neutrality Summit [link] as one of two featured startups for the Entrepreneurs forum [video]. At the conference, Governor Jerry Brown and UC President Janet Napolitano both spoke about the urgent need for more innovative climate change solutions.

Many speakers focused on Napolitano’s vow to turn the ten UC campuses into “living laboratories” to generate solutions that can be adopted on state, national and global levels.

“Climate change impacts issues as varied as disease management, food security, the preservation of water resources, the stability of fragile governments, and transportation infrastructure,” Napolitano said. “Addressing these challenges, and reducing our carbon footprint, is a moral imperative.” [source]

To address these challenges, the UC Climate Solutions Group presented 10 scalable solutions to move the world towards carbon neutrality. The group, composed of 50 experts from 10 UC campuses and national laboratories, stressed the moral implications of climate change in the executive summary of its report: "Bending the Curve: Ten scalable solutions for carbon neutrality and climate stability.”

“15 percent of us contribute 60 percent of the pollution. We’re leaving behind a planet of uncertain future for our children, grandchildren and generations unborn,” said Veerabhadran Ramanathan, chair of the UC Climate Solutions Group.

As part of the Entrepreneurs forum, WattTime Executive Director Gavin McCormick spoke in particular of the importance of the UC’s “living laboratories” concept. Panelists agreed that the concept has been invaluable for helping ideas take off by allowing university facilities to be used to test innovative new ideas early on. McCormick remarked that early adoption of new WattTime technologies at UC Merced and UC Berkeley was a crucial factor in WattTime’s ability to calibrate our new carbon saving technology [link] to the daily operating needs of users.

“This is a call to action. We put all of our best minds in California on this — a very formidable force. Nothing less than that is required,” said Brown.

Did you know these 16 surprising facts about clean energy?

I'm a long-time Harper's Magazine reader. My two favorite features are the insanely hard cryptic crossword, and the Harper's Index. So for your data-digesting pleasure, check out this mini Harper's Index of some of WattTime's favorite facts about building a smart, clean grid!
