Finally!

I am tied up doing my budget for FY 2016, but had to take a moment to post this link from DOE that discusses energy baselining and tracking.  AT LAST, a guide from the Feds that goes beyond the per-square-foot method of trying to assess performance.  This should really be a motivator for people to rethink how they measure and monitor energy utilization.  At least people who think instead of just doing what they are told.  And indeed, this could actually be a useful approach for organizations that want to effectively and (pretty) accurately assess and manage energy use.  Worth a perusal if nothing else.  The link is here:

http://energy.gov/sites/prod/files/2015/01/f19/Energy%20Intensity%20Baselining%20and%20Tracking%20Guidance.pdf

If there’s any justice in the world, this will be the first nail in the coffin of EPA’s Portfolio Manager…

Enjoy!

Living with our Changing Climate

There is a column in today’s New York Times that is absolutely loaded with fantastic links pertaining to climate science. It’s written by Andrew Revkin, so the column itself is also packed with interesting and useful information as one would expect. Highly recommended and worth investigating.

http://dotearth.blogs.nytimes.com/2014/09/10/can-humans-get-used-to-having-a-two-way-relationship-with-earths-climate/?rref=science&module=Ribbon&version=origin&region=Header&action=click&contentCollection=Science&pgtype=article

Thinking Machines

There is an interesting article on the Atlantic Monthly site about Artificial Intelligence (AI) researcher Douglas Hofstadter.  The article itself is located here and is well worth a read:

http://tinyurl.com/pcfejwf

This column is relevant here for a couple of reasons.

To me, one major takeaway of the column is that Hofstadter’s research is not only conceptually distinct from what I will call “industrial AI,” it is intellectually superior.  By which I mean that, if his research is successful and understood, it will be vastly more relevant in helping humankind understand ourselves and our world – even though it is quantitatively less productive than industrial AI.

However, industrial AI (let’s abbreviate it IAI) is more effective at delivering results that can be monetized than anything Hofstadter has done.  And thus, a brute force computing monster like Google can do amazing things, and make lots of money, without really addressing deep questions about learning and thinking that Hofstadter focuses on.

I think that this column indirectly says something in general about questions that are relevant to energy and energy conservation, though I may be making a stretch.  But let me take a shot.

Hofstadter is ultimately interested in figuring out how human beings think (hint: he thinks we are amazingly evolved analogy machines), whereas IAI is principally concerned with delivering accurate and reliable results – which, I would add, is a non-trivial and intellectually interesting field of research.  Nonetheless, Hofstadter is pursuing something more akin to fundamental natural philosophy, something that could lead to astounding discoveries about how and why we think.  The results-oriented application of his findings would then be layered on top of that conceptual foundation.

Let’s get a little more specific.

I can input an English sentence into Google Translate and it will convert that sentence into serviceable Spanish or French or Chinese or any of a number of foreign languages.  But Google Translate does not know what it is translating, nor does it even know that it is translating.  It is a dumb system that provides pretty smart results.

Hofstadter is looking for something different.  An AI system that would, somehow, “understand” that it was translating, and that would rely upon associations instead of brute force to come up with a workable solution.

Just so.   But the relation to energy?

As I think I have written elsewhere, there seems to be a growing trend to treat energy conservation – via energy policy – sort of like IAI.  Namely, if you throw enough resources and regulations and credentialed “experts” at it, you will get the results that you desire.

And thus, the policy maker might (and probably would) conclude that if $100,000 on energy conservation saves one million kilowatt-hours, then $200,000 will save you two million.

Alas, this is not automatically true.  But why it is not true requires a Hofstadterian investigation into where energy is actually being used, why it is being used, and whether it can be judiciously reduced.  After that, maybe you can or maybe you can’t deliver two million kilowatt-hours of electrical energy savings.  But most essentially, you will know why you can or why you cannot.  It is a deeper and intellectually more profound understanding of the situation.

This is, of course, partially an argument for more control of the framing of technical issues by competent experts, which I have broached elsewhere.

I think, though, that we can broaden this.  The sidelining of Hofstadter (he is not in the mainstream of current AI research) parallels the sidelining of deep competence in many fields of endeavor in the United States today.  Respect for deep competence has been supplanted by respect for, or deference to, gaudy credentials and smartly packaged deliverables.  And so we have semi-competent consultants and policy makers issuing poorly informed advice that is acted upon to the detriment of the enterprise.

This is not a good trend, and it is a difficult one to reverse.  Hofstadter at least shows that the qualitatively superior path can be taken.  This path will not necessarily deliver great immediate profit, but the potential long-term benefits could be enormous.  A smart enterprise, then, should treat consultants with a wary eye, as one option that does not preclude other approaches or ideas.

The Optimism of Tom Friedman…

Today, New York Times’ columnist Tom Friedman weighed in on carbon emissions and the growing role that natural gas can play in our energy future.  The article itself is here:

http://tinyurl.com/p2wm2ja

Although he didn’t write it, the column description on the Times’ web site does more or less capture Friedman’s position:

“The Amazing Energy Race – The United States is falling behind. To catch up, we need to reorder our priorities, find cleaner and smarter fuels and develop new technologies.”

Now, I am not a big Friedman fan, but I will concede that this column makes some good points.  It is true, for example, that burning natural gas results in about half the carbon emissions of burning coal for the same amount of energy (indeed, we show the calculations elsewhere on this web site.)
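As a rough illustration – the emission factors here are approximate EIA-style figures I am supplying for context, not numbers from Friedman’s column – natural gas comes in at roughly 117 lb of CO2 per million Btu, versus roughly 205 to 230 lb for coal, depending on the type of coal:

117 lb CO2/MMBtu (natural gas) / 210 lb CO2/MMBtu (typical coal) ≈ 0.56

So “about half” is in the right neighborhood.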

It is also true that leaked and unburned natural gas (emitted into the atmosphere) has a much higher global warming potential (GWP) than carbon dioxide, which argues for great caution in natural gas extraction and transportation (GWP numbers are also located on another post on this site.)

And Friedman correctly warns that we should not be seduced by low-cost, lower-carbon-emitting domestic natural gas into not pursuing “renewable” energy opportunities such as wind and solar, since natural gas only slows the rate of carbon emissions; it does not address the underlying issue.

Friedman also makes a reasonable political and policy argument as to how a carbon emission reduction program might be implemented without insurmountable obstacles blocking any hope of achievement.

He notes too that the Germans and Chinese among others are becoming heavily invested in renewable energy even as we more or less pay it lip service.

So all in all, a pretty reasonable column.

There is one place, however, where I find Friedman a bit Pollyannaish, and that is his unquestioned faith in our ability to carry out “…continuous innovation in clean power technologies…” and to conjure up “smarter materials, smarter software or smarter designs” in order to produce products that can operate while using less energy.

It is, of course, this belief that “continuous engineering innovation” is a function of market forces rather than physics and science that kind of bugs me.  And it doesn’t bug me just because I’m an engineer.

It bugs me because when policy wonks like Friedman throw “innovation” around like it’s some limitless and undifferentiated commodity, they risk selling policy solutions that will not or cannot be met.  It bugs me because they are unwilling to get their hands dirty and learn what the real-world technical limitations on energy conversion are before they start marketing their seemingly clear-eyed solutions.  It bugs me because we can’t afford to spend a decade or two or three chasing fantasies like carbon sequestration while “realistic” solutions – whatever those may be, perhaps solar furnaces and solar hydrolysis factories – are not pursued because they are not cost competitive with the fantasy options.

And so we arrive finally at my point.

The politicization of science has already rendered the public’s discussions of such virtually settled fields as evolution, combustion science and climate science preposterously and idiotically combative.  Not to mention stupendously uninformed.

The subsequent devaluation of scientific expertise has consequently led to policy recommendations that do not defer to scientific consensus.  Rather, policy “experts” simply assume that the science will take care of itself while the “policy” can serve as an ideological stalking horse.

This is backwards.  Those with the expertise should play a much larger role in shaping policy that is realistic, hard-headed and achievable.

We recall that the Manhattan Project was directed by a physicist, not a politician or policy wonk.  That is not to say that the fruits of the Manhattan Project were not used by politicians and policy wonks.  But at least the politicians and policy people got out of the way when the hard work needed to be done to figure out the physics and science.

When it comes to “clean” energy, we cannot assume that there will be breakthroughs equivalent to nuclear fission.  Better perhaps to let the brainiacs at MIT and Caltech help us understand what nature will allow, rather than to assert that “continuous innovation” will cure what ails us.

We had a Manhattan Project.  Why not a “Cambridge Project” to map out the feasible limits of energy conversion with the physics and technology currently at our command?

Then you could map out an energy policy that, with a little luck, all Americans could invest in.

How Well Can We Do?

As is probably intuitively obvious, things never work perfectly and without friction in the real world.

What this means is that we can calculate theoretically how much energy is required to, say, accelerate still air into flowing air with a fan (technically, we are adding kinetic energy to the air), but in practice we need to use more energy, because the fan and the motor that drives the fan are not 100% efficient and waste some energy.

In an earlier post, I explained that the power required by a fan can be calculated with the simple formula:

BHP = (CFM * Pressure) / (6356 * Eff_fan)

Where BHP = brake horsepower; CFM = cubic feet of air per minute at standard conditions; Pressure = pressure increase in inches of water; 6356 is a unit-conversion constant that makes the result come out as brake horsepower; and Eff_fan = fan efficiency.

So if you set Eff_fan = 1, you can calculate the minimum power that would theoretically be needed to deliver the CFM and pressure specified.

I should add that the BHP can be directly converted to kW with the conversion factor of 0.746 kW/BHP.  For a motor that is not 100% efficient, the calculation is simply:

kW = (0.746 kW/BHP * BHP) / Eff_motor
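For anyone who would rather compute than punch a calculator, here is a minimal sketch of those two formulas in Python (the function and variable names are my own, purely for illustration):

def fan_bhp(cfm, pressure_in_wg, fan_eff=1.0):
    # Brake horsepower required by a fan; fan_eff = 1.0 gives the theoretical minimum
    return (cfm * pressure_in_wg) / (6356 * fan_eff)

def fan_kw(bhp, motor_eff=1.0):
    # Electrical input power in kW for the motor driving the fan
    return (0.746 * bhp) / motor_eff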

So what, right?  But actually, you can figure out (very roughly) some interesting things with this.  For instance.

My main campus is roughly 2,000,000 square feet.

Generally speaking, air handling systems deliver about 1.5 CFM of ventilation air per square foot of space.

Air handlers might discharge at a static pressure of 4″ water.  So:

2,000,000 SF * 1.5 CFM/SF = 3,000,000 CFM  (approximate ventilation flow for 2M square feet)

3,000,000 CFM * 4″ / (6356) = 1,888 BHP  (power requirements in BHP, theoretical minimum)

3,000,000 CFM * 4″ / (6356 * 0.65) = 2,905 BHP  (approximate power requirements in BHP, real world)

We can now very easily come up with a rough idea of how much power is required to ventilate a 2M square foot facility.  Let’s assume the motor running the fan is 95% efficient:

( 2,905 BHP * 0.746 kW/BHP ) / 0.95 = 2,280 kW (roughly)

We can do a reality check by looking at a real building.  I have a 250,000 square foot property that has about 375 HP of supply fan motors:

(250,000 SF * 1.5 CFM/SF * 4″) / (6356 * 0.65) = 363 BHP

So yeah, we’re at least in the realm of reason in our calculation results.
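Using the little helper functions sketched earlier, the whole campus estimate and the reality check collapse to a few lines (1.5 CFM per square foot, 4″ static, 65% fan efficiency and 95% motor efficiency are the same rules of thumb used above):

campus_cfm = 2_000_000 * 1.5                     # ~3,000,000 CFM of ventilation air
bhp_ideal = fan_bhp(campus_cfm, 4.0)             # ~1,888 BHP, theoretical minimum
bhp_real = fan_bhp(campus_cfm, 4.0, 0.65)        # ~2,905 BHP with 65% efficient fans
kw_real = fan_kw(bhp_real, 0.95)                 # ~2,280 kW with 95% efficient motors

# Reality check: 250,000 SF building with roughly 375 HP of supply fan motors installed
check_bhp = fan_bhp(250_000 * 1.5, 4.0, 0.65)    # ~363 BHP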

We can also make an impressively high-level, but reasonably accurate, estimate of how much energy would be saved by resetting the fan discharge pressure downward by 2/10ths of an inch across the campus:

3,000,000 CFM * 0.2″ / (6356 * 0.65) = 145 BHP  (approximate power SAVED in BHP, real world)

Which isn’t chicken feed.  If we assume our fan systems run 18 hours per day, we can even calculate the value of discharge static pressure reset.  Assume the cost per kWh is $0.12.

(( 145 BHP * 0.746 kW/BHP ) / 0.95) * 18 Hr/Day * 365 days = 748,000 kWh

748,000 kWh * $0.12/kWh = $90,000 per year savings.
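The same arithmetic in code, using the 0.2″ reset, 18 hours per day, 365 days per year and $0.12/kWh assumptions above (and the helper functions from earlier):

bhp_saved = fan_bhp(3_000_000, 0.2, 0.65)        # ~145 BHP saved
kw_saved = fan_kw(bhp_saved, 0.95)               # ~114 kW saved
kwh_per_year = kw_saved * 18 * 365               # roughly 750,000 kWh per year
dollars_per_year = kwh_per_year * 0.12           # roughly $90,000 per year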

Not bad.  Interesting what you can estimate with only a smidgen of information and a few simple rules of thumb…

Solar Panel Reliability an Issue?

Today, the New York Times published an article on defective solar panels.  The article is located here:

http://tinyurl.com/omn2x9g

At issue is the corner cutting by certain photovoltaic panel manufacturers, leading to premature panel degradation.  For instance, the article mentions one west coast installation where panels with an anticipated lifespan of 25 years began failing after just two.  This led to hundreds of thousands of dollars in missed revenue for the operator.

The solar industry is justifiably concerned with manufacturing integrity.  Solar power has been rapidly gaining credibility in both the residential and commercial sectors after early skepticism, but manufacturing defects that torpedo the lifecycle cost of solar installations could reverse that trend very quickly, and the industry knows it.  A combination of industry-wide materials and manufacturing standards, coupled with performance warranties, seems like it could address the issue, though it would probably add to the cost of solar as well.  By the way, the panel failure rates cited seem to be in the 5% to 20% range.  A review by Meteocontrol in Germany also found that 80% of the solar installations monitored in Europe were underperforming, though the article did not specify precisely what this meant.

One thing that really interested me about the article was some information about the growth of solar power generating capacity.  The Times linked to a pretty interesting informational page at the Solar Energy Industries Association (SEIA) that shows United States photovoltaic installations in 2012 totaled over 3 GW!  Ten years earlier, photovoltaic installations totaled 0.02 GW.  That’s pretty astounding growth.  While the SEIA clearly has an agenda, I still thought their web page was interesting and informative, with very good graphics and charts.  You can take a look at it here:

http://tinyurl.com/cgldhl3

For someone like me who doesn’t really spend much time analyzing solar, there are some good rule-of-thumb numbers for the cost of photovoltaic installations:

Installed Price (chart on the SEIA page; roughly $5 per watt installed)

One can also do a little math and figure out that SEIA estimates a typical installation in Massachusetts will require 4 kW of capacity.  So at about $5/Watt x 4,000 Watts we get an installed price (admittedly very rough) of about $20,000.  Now of course, tax breaks and other incentives might lower this cost, but it’s still pretty steep.

The payback on such a system is going to be dependent upon local utility rules.  If “net metering” is permitted, energy generated in excess of what is needed to operate your house can be sold back to the grid.  If net metering is not permitted, battery or some other storage strategy must be deployed to capture the full potential of the system.

If we assume that 35% of our energy use is at night or early morning (for my house, that percentage might be higher) and if our monthly electric bill is $125/month, then a non-net metered installation without energy storage will deliver the following simple payback without tax breaks or other incentives:

$20,000 / ($125/month * 12 months/year * 65%) = 20.5 years.

That’s a long time.  You are effectively operating in the red for 20 years.  So if these panels do not last well past 20 years, the net present value of the installation will be unimpressive (see earlier post on net present value calculations if you want a refresher.)
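As a quick sketch of that point in Python – the 25-year life and 5% discount rate below are my own assumptions, purely for illustration:

annual_savings = 125 * 12 * 0.65                 # $975/year of avoided purchases without net metering
installed_cost = 20_000

simple_payback = installed_cost / annual_savings # ~20.5 years, as above

# Net present value over an assumed 25-year life at an assumed 5% discount rate
discount_rate = 0.05
npv = -installed_cost + sum(
    annual_savings / (1 + discount_rate) ** year for year in range(1, 26)
)
# npv comes out several thousand dollars negative – unimpressive, as noted above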

But this also drives home why the solar industry should be anxious about premature panel failure.  Without a guaranteed useful life that is comfortably longer than the simple payback of a system, photovoltaics cannot be considered a solid investment.