Finally!

I am tied up doing my budget for FY 2016, but had to take a moment to post this link from DOE that discusses energy baselining and tracking.  AT LAST, a guide from the Feds that goes beyond the per-square-foot method of trying to assess performance.  This should really be a motivator for people to rethink how they measure and monitor energy utilization.  At least people who think instead of just doing what they are told.  And indeed, this could actually be a useful approach for organizations that want to effectively and (pretty) accurately assess and manage energy use.  Worth a perusal if nothing else.  The link is here:

http://energy.gov/sites/prod/files/2015/01/f19/Energy%20Intensity%20Baselining%20and%20Tracking%20Guidance.pdf

If there’s any justice in the world, this will be the first nail in the coffin of EPA’s Portfolio Manager…

Enjoy!

Pushback…

As I mentioned in an earlier Post, the Big Thinkers from the City of Boston decided it would be a great thing for building owners and operators to document and submit their utility consumption data, which would allow properties to be publicly “scored” based on the Environmental Protection Agency’s Energy Star Portfolio Manager benchmarking tool.

Portfolio Manager is a simple tool that essentially calculates utility consumption per square foot and ranks buildings on that basis.  We have discussed previously why such benchmarks are fundamentally flawed and unreliable in principle.   But now a report has been released that demonstrates such benchmarks to be flawed in practice.

Which, thinking logically, only makes sense.  How can a broken idea lead to a correct result?  Only by dumb luck.
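To make the objection concrete, here is a small hypothetical sketch of what I mean.  Every number in it is invented, and the simple `ols` helper is mine, not anything from the study or the DOE guide — but it shows how a per-square-foot EUI ranking and a weather-normalized regression baseline can point in opposite directions for the very same building:

```python
# Hypothetical illustration: a building gets 2% more efficient, but a colder
# winter pushes its raw energy-use intensity (EUI) UP, so a per-square-foot
# benchmark scores it as "worse."  A regression baseline tells the truth.
# All figures below are invented for this sketch.

FLOOR_AREA_FT2 = 100_000  # assumed building size

# Year 1 (baseline): monthly heating degree-days and metered kWh.
hdd_y1 = [800, 600, 400, 200, 100, 50, 30, 40, 150, 350, 550, 750]
kwh_y1 = [50_000 + 40 * h for h in hdd_y1]  # load driven by weather

# Year 2: 20% colder weather, but the building runs 2% more efficiently.
hdd_y2 = [1.2 * h for h in hdd_y1]
kwh_y2 = [0.98 * (50_000 + 40 * h) for h in hdd_y2]

def ols(x, y):
    """Ordinary least squares fit: y = intercept + slope * x."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    slope = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
             / sum((xi - mx) ** 2 for xi in x))
    return my - slope * mx, slope

intercept, slope = ols(hdd_y1, kwh_y1)

# Naive benchmark: annual kWh per square foot says the building got worse.
eui_y1 = sum(kwh_y1) / FLOOR_AREA_FT2
eui_y2 = sum(kwh_y2) / FLOOR_AREA_FT2

# Regression baseline: predict year-2 use from year-2 weather, then compare
# the prediction with what was actually metered.
predicted_y2 = sum(intercept + slope * h for h in hdd_y2)
avoided_kwh = predicted_y2 - sum(kwh_y2)

print(f"EUI year 1: {eui_y1:.2f} kWh/ft2, year 2: {eui_y2:.2f} (EUI rose)")
print(f"Weather-adjusted avoided energy: {avoided_kwh:,.0f} kWh (real savings)")
```

Rank this building against its neighbors by EUI and it slips in the standings; baseline it against its own weather-adjusted history and it shows genuine savings.  That is the whole argument in a dozen lines.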

There is a description of this study in today’s Boston Globe:

http://tinyurl.com/brngkca

If I find a link to the actual study I will add it as well.

I think this is it:

http://tinyurl.com/czk9sam

The money quote in the article is quite to the point:

“‘The benefits it [the reporting ordinance] achieves may be nil and the cost possibly significant,’ said [Robert] Stavins, who conducted the study in conjunction with Analysis Group, a Boston consulting firm.”

Hmm.  Well, that sounds helpful, no?

It is striking that the benefits are potentially “nil” according to this study, given the huge investment of time and effort that various constituencies have already expended collecting and entering data.  Surely someone, somewhere in the City of Boston carefully evaluated the Portfolio Manager benchmarking tool and worked through a range of “what if” scenarios to prove the efficacy and reliability of its benchmarks before rolling it out.

Right?

Now, no doubt, the study linked to above will be discredited by some because it was funded by a constituency that views the City’s reporting requirements as burdensome and potentially deceptive when it comes to demonstrating their environmental stewardship.  “Ah, it’s motivated by self-interest, ergo it can’t be valid.”

There are also stakeholders who have invested significant effort in establishing Portfolio Manager as the tool of choice for energy reporting and benchmarking for the City.  I have found it rare for people with that level of buy-in to reassess and change direction, even if bad results can be glimpsed on the horizon for those willing to look.

So it’ll be interesting to see if this report gets any traction.

If you’d like to see a description of the ordinance itself, it is located here:

http://m.cityofboston.gov/news/Default.aspx?id=5997

If you’d like to read my takedown of simplistic benchmarks like Portfolio Manager, you can find it here:

http://tinyurl.com/d2l5oy5

***

Please bear in mind that most facility owners and operators embrace efficiency improvements and want to be good environmental citizens.  However, it is frustrating in the extreme for them to be forced to use reporting tools that demonstrably will not deliver the kind of intelligence and results that the city claims will be achieved.  If only the policy makers had worked with the stakeholders beforehand, the current climate of discomfort and concern might have been largely avoided.  Water under the bridge, I suppose, but hopefully others will learn from our experience.

In general, though, we can take away a good lesson.  If you are going to do something, make the effort to do it carefully, accurately, and well.

Why I am Not a Fan of Most Energy Benchmarking

Benchmarking in the energy industry is generally capital-H Horrible, with all sorts of unexamined assumptions and presumptions that can readily be shown to lead to incorrect conclusions.  I have railed against tools like Energy Star Portfolio Manager, but to no avail.  It is a bill of goods that has been sold to upper management and many consultants – who always like things to be short, sweet, and simple.  In this case, way too simple.

As a management tool, most benchmarks are awful.

My reasons for benchmarking skepticism are located here:

http://tinyurl.com/d2l5oy5

This is not to say that good benchmarking metrics can’t be created.  But we need to take the process away from the well-intentioned policy types and have a discussion about what rigorous and accurate metrics should look like.