UFTO NOTES 2005

UFTO Note - Waste Heat to Power
Date: Mon, 31 Jan 2005
There is a lot of interest in recovering waste heat. Combined
Heat and Power (CHP) is at the heart of a large part of the distributed
generation business. The heat can be used to heat water, provide
process heat, and even cooling. Converting waste heat to electric
power is getting attention as well.
On the bleeding edge, there are a number of efforts underway to do it
without moving parts--solid state conversion (of course not limited to
waste heat). There are programs, for example to put
thermoelectric converters on the exhaust manifold of diesel trucks,
with the goal of replacing the alternator. Thermoelectrics,
thermionics, thermophotovoltaics -- all are being pursued with renewed
vigor, in the hope that new physics can overcome the longstanding
problem of high cost and very low efficiency…a subject for another day.
Waste heat gets wasted only because it tends to be hard to use. A
diesel engine converts about 1/3 of the fuel energy to useful work
(electric power, in the case of a genset) -- the rest goes off as waste
heat in cooling water and exhaust -- unless a cost-effective means can
be found to use it, as in CHP.
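As a rough sketch of the energy flows involved, assuming the one-third figure above and a 50/50 split of waste heat between jacket water and exhaust (the split is an illustrative assumption, not a figure from this note):

```python
# Rough energy balance for a diesel genset, illustrating the
# one-third rule of thumb above. The jacket/exhaust split is an
# assumed, typical value, not a figure from this note.
def genset_energy_balance(fuel_input_kw, electrical_eff=1/3,
                          jacket_fraction=0.5):
    """Split fuel input into electric output and waste heat streams."""
    electric_kw = fuel_input_kw * electrical_eff
    waste_kw = fuel_input_kw - electric_kw
    jacket_kw = waste_kw * jacket_fraction   # low-grade heat in cooling water
    exhaust_kw = waste_kw - jacket_kw        # higher-grade heat in exhaust
    return electric_kw, jacket_kw, exhaust_kw

# A hypothetical genset burning fuel at a 3 MW rate
electric, jacket, exhaust = genset_energy_balance(3000.0)
print(f"electric {electric:.0f} kW, jacket {jacket:.0f} kW, "
      f"exhaust {exhaust:.0f} kW")
```

Two-thirds of the fuel energy leaves as heat -- the stream the rest of this note is about.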
Making more electric power with the waste heat is another matter. The
age-old Rankine cycle, the basis of all steam power plants, can be made
to work at lower temperatures by using something other than water as
the working fluid ("refrigerant"), typically an organic compound, thus
the term "organic Rankine cycle" (ORC). In effect, this is a heat
pump or refrigerator running backwards. Instead of using
mechanical energy to create a temperature difference, mechanical energy
is produced by a temperature difference.
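Why temperature matters so much can be seen from the Carnot (ideal) limit -- a textbook bound, not anything specific to the products discussed here. The sketch below evaluates it at the two source temperatures that come up later in this note:

```python
# Carnot (ideal) efficiency between a heat source and ambient,
# showing why low-temperature waste heat yields so little work.
# Real ORC systems achieve only a fraction of this bound.
def f_to_kelvin(deg_f):
    return (deg_f - 32.0) * 5.0 / 9.0 + 273.15

def carnot_efficiency(t_hot_f, t_cold_f=77.0):  # 77 deg F ~ 25 C ambient
    t_hot = f_to_kelvin(t_hot_f)
    t_cold = f_to_kelvin(t_cold_f)
    return 1.0 - t_cold / t_hot

# Diesel jacket water (~230 deg F) vs. cement-plant exhaust (~520 deg F)
for t in (230.0, 520.0):
    print(f"{t:.0f} deg F source: Carnot limit {carnot_efficiency(t):.0%}")
```

At jacket-water temperatures the ideal limit is only around 22%, and practical machines recover far less, which is why small low-temperature units are so rare.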
The main challenge isn't the theory, it's the practical difficulty of
doing it. Factors such as temperatures (inlet and outlet), flow
rates, size and type of heat exchangers, type of expander, materials,
controls, etc. must be considered in the trade-offs of cost,
performance, reliability and longevity.
It's a lot harder than it looks. Despite many attempts, and the
obviousness of the basic idea, there are actually not very many
commercial providers of such systems, particularly in smaller sizes
which can operate effectively at lower waste heat temperatures.
UTC, for example, announced its new "PureCycle" 200 kW unit only last
fall. It requires inlet temperatures above 500 deg F.
Ormat (ORA-NYSE), a long-established ORC maker for geothermal plants, is
moving into the industrial waste heat market. They too need
relatively high temperatures, for units in the 250 kW to MW range.
They also sell small standalone ORC-based generators which burn a fuel
as the external heat source.
In Europe, one can find Turboden (Italy), Triogen (Netherlands) and
FreePower (UK). All require high temperature, with the possible
exception of FreePower, who say they can operate as low as 230 deg F.
High temperature means industrial processes that put out high
temperature waste heat. Ormat, for example, has a 1.5 MW showcase
unit that takes air at 520 deg F from a cement plant in Germany.
The water jacket of the lowly diesel engine, however, can only be
allowed to go to around 230 deg F (and the water must be returned no
cooler than around 215 deg F). While such temperatures can be
readily adapted to CHP uses, power conversion is more difficult.
Cooler Power, Inc, a startup in California, has successfully built
units that work in this range. The engine's cooling water is
taken (before it goes to the engine's own radiator), and is fed to a
heat exchanger where it is heated further by the engine exhaust.
In another heat exchanger, the hot water heats and vaporizes the
organic working fluid, which then drives the expander that turns the
generator. The expander is key. In principle, any compressor technology
can work backwards to act as an expander: scroll, screw, turbine, or
piston. All have been used at one time or another. Cooler Power
initially used a scroll, but then developed its own proprietary
modification to a commercially available screw compressor, as the heart
of the system. They have a patent in final review stages covering the
modification and use of the screw expander, as well as the control
system and choice of working fluid.
Cooler Power has proprietary software to develop process flow diagrams
to size and specify components or installations. The
proprietary Program Logic Control (PLC) circuits are designed for
optimal failsafe performance and contain algorithms that are protected
from reverse engineering. Each of the key components (heat
exchangers, expanders, condensers, generators) is designed to
last 20+ years and comes from one or more sub-sectors of the
existing industrial equipment industry.
The system can be scaled to fit applications ranging from 50 kW - 1
MW. Installed costs are in the range of $1500-1800/kW.
Depending on the sales price for power, payback can happen in 2 years
or less. It's important to emphasize that this is green power,
which usually enjoys premium pricing. There is no fuel cost;
operating costs are very low; and there are also (monetizable) emissions credits.
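The payback claim can be sanity-checked with simple arithmetic; the power price and capacity factor below are illustrative assumptions, not company figures:

```python
# Simple payback for a waste-heat ORC unit, using the installed cost
# range quoted above. The power price ($0.10/kWh) and capacity factor
# (90%) are assumed values for illustration only.
def simple_payback_years(installed_cost_per_kw, price_per_kwh,
                         capacity_factor=0.9, hours_per_year=8760):
    annual_revenue_per_kw = price_per_kwh * hours_per_year * capacity_factor
    return installed_cost_per_kw / annual_revenue_per_kw

for cost in (1500.0, 1800.0):   # $/kW installed, from the note
    years = simple_payback_years(cost, 0.10)
    print(f"${cost:.0f}/kW installed -> payback {years:.1f} years")
```

At these assumptions the payback comes out near two years, consistent with the note's figure; a premium "green power" price shortens it further.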
A showcase 50 kW beta unit was installed in 1992 at the Newby
Island landfill site in Milpitas, CA, on a 1 MW engine. A new 150
kW system will come on line in March. The company anticipates
installing 10 units in 2005, with rapid growth thereafter based on
already-established customer and marketing relationships, selling both
systems and power. They are raising an equity investment round
currently, and welcome both investor and customer interest.
Ray Smith, COO
Cooler Power Inc,
Redwood City, CA
UFTO Note - Modeling the Grid -- Breakthrough
Date: 12 Jan 2005
To start the new year off with a bang, I may be going out on a limb
here, but I don't think so. I hope you'll take a close look at this one.
DOE, EPRI and the entire power industry are abuzz with talk about how
the grid can be operated better. The grand vision comes up hard against
the incredibly difficult problem of modeling. For many decades,
the best mathematicians, operations researchers, utility engineers and
others have struggled to come up with (computerized) representations of
the grid that can guide planners and operators.
Since the beginning, despite ever faster-cheaper computers, and
tremendous innovations in algorithms and computational methods, the
state of the art has been forced to make many bad compromises among
such factors as speed, accuracy, detail, breadth, time domain,
treatment of boundary effects, and applications. Unless corners
are cut, a solution might not be found at all (i.e. converge).
Areas of study and tools are stove-piped into many separate categories
of time-scale and function:
- Real time (sec. to minutes)
optimal power flow, voltage and frequency control, contingency analysis
- Short term (hours to a week)
unit commitment, thermal-hydro coordination
- Annual (1-3 years)
maintenance scheduling, rate design, production costing, hydro scheduling
- Long term (3-40 years)
generation expansion, transmission planning, etc.
(see "A Primer on Electric Power Flow for Economists and Utility
Planners" EPRI TR-104604, Feb 1995.)
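For readers unfamiliar with the "power flow" problem at the core of the real-time category, here is a minimal DC (linearized) power flow on a toy three-bus network -- the standard textbook formulation, not any particular vendor's method:

```python
import numpy as np

# Minimal DC power flow: 3 buses, 3 lines, bus 0 as slack.
# Lines: (from_bus, to_bus, susceptance B in per-unit)
lines = [(0, 1, 10.0), (1, 2, 10.0), (0, 2, 5.0)]
n = 3
# Net injections at buses 1 and 2 (loads); the slack bus balances.
injections = np.array([0.0, -0.6, -0.4])

# Build the bus susceptance matrix
B = np.zeros((n, n))
for i, j, b in lines:
    B[i, i] += b; B[j, j] += b
    B[i, j] -= b; B[j, i] -= b

# Solve the reduced system (slack bus dropped) for voltage angles
theta = np.zeros(n)
theta[1:] = np.linalg.solve(B[1:, 1:], injections[1:])

# Line flows follow from angle differences
flows = {(i, j): b * (theta[i] - theta[j]) for i, j, b in lines}
for (i, j), f in flows.items():
    print(f"line {i}-{j}: {f:+.3f} pu")
```

This linear toy solves instantly; the real difficulty the note describes comes from full AC equations, thousands of buses, contingencies, and multiple competing objectives.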
To make things worse, the industry is highly fragmented and way behind
the curve. Utilities don't have the same cadre of experts
in-house that they used to. Vendors sell "black-box" solutions
that don't live up to promises. Obsolete tools continue to be
used because "everybody else uses them" and "regulators accept them".
(Never mind the results may be worthless.) A guru of power flow
analysis, now retired, told me that much of the industry isn't even
using more powerful real time analysis tools that are over 25 years old.
So there are major institutional problems and technical ones, and the
two are intertwined. Not only is the problem fiendishly hard, but a
lot of people also have vested interests in the status quo (e.g.,
experts have devoted entire careers to it, and don't look kindly on
upstart claims of a breakthrough -- just as in every field of human
endeavor).
This is a long prologue to a story of just such a claimed
breakthrough. Optimal Technologies appeared on the scene late in
2001, announcing they had analyzed the June 14, 2000 California
blackout, and stating they could have prevented it by fine-tuning the
grid according to results from their analysis tool, AEMPFAST.
Needless to say, the world was not especially open to the idea
that a newcomer had succeeded in coming up with a methodology that did
what so many had sought for so long:
"AEMPFAST is based on a new near-real-time (solves a several thousand
bus system in milliseconds) mathematical approach to network analysis,
optimization, ranking, and prediction called QuixFlow … a proprietary
N-Dimensional (non-linear) analysis, optimization, and ranking engine
that also has defendable predictive capabilities and is applicable to
any problem that can be modeled as a network. … QuixFlow uses no
approximations; it handles multiple objectives; and is able to enforce
multi-objective inequality constraints." [from factsheet - see below]
I have been closely following the company's progress since then.
Their revolutionary claims are finally beginning to overcome the
natural skepticism and resistance. At least one major ISO/RTO is
signing up, and DOE and a number of large utilities are taking it very
seriously. The implications are, as Donald Trump would say, "huge".
Here is an introduction in the company's own words:
Optimal Technologies is a private company focused on making power-grid
systems more efficient, more reliable, and more cost effective to plan
and operate. In other words, "smarter". Think of Optimal as the
Internet for power grids [or SONET for telecommunications]:
self-healing, self-enabling, lowest-cost operation with highest
reliability.
Problem: Power system infrastructures and the grid networks that
support them are breaking down faster than solutions can be developed
to address the underlying problems.
Because of inadequate core technologies and especially slow and limited
mathematical tools, the utility industry is plagued with many tools
based on algorithms that no longer work well for their intended tasks
and that do not work well together. Last year's blackout that affected
more than 50 million people should help provide some context. Despite
new advances in materials and hardware, blackouts and brownouts are
becoming larger and more common because utility system planning and
control methods are still in the horse and buggy era -- done much as
they were 50 years ago -- fragmented and piecemeal. In other words,
even though system peripherals (such as wind energy, distributed gas
generation, fuel cell generators, meters, and demand-side management)
are improving, the core grid Operating System that makes them all work
well together doesn't exist.
New Technology: Our software and hardware solutions are based on a
revolutionary new mathematical approach to network analysis,
optimization, and management. Our technology is far better than current
approaches to understanding and managing networks, and allows for both
local and integrated, end-to-end views of Generation, Transmission,
Distribution and Load. Unlike competing products, our technology can
view the complete energy delivery supply chain as an integrated asset,
which allows for entirely new levels of risk review and risk management
-- previously not possible. Optimal's new technology should be viewed
as "Foundational" in that it has pervasive application within the power
industry and provides a common framework for many new tools.
Optimal's Solution: Think of us as the much needed underlying
"operating system engine" that integrates, defragments, and prioritizes
utility planning, operations, and business processes in the best
controllable and defendable way. Our technologies have the ability to
simultaneously analyze, optimize, and manage generation, transmission,
distribution and customer load -- down to the individual power line
and building. Instead of viewing customer load as a problem, our
technology can turn all aspects of the system, including customer
load, into potential risk-reducing resources [i.e. reliability
enhancers] -- something not otherwise possible.
Products: Applications include: Congestion Management, Locational
Marginal Pricing, Simultaneous Transfer Limits, Multi-Dimensional
Reliability, Automated Network Planning, Emergency Control, System
Restoration, and Smart Asset Management.
Beyond the scope of this note, Optimal also has a suite of software and
hardware for the demand side, which enables measurement and control --
and optimization -- down to individual loads.
There is a great deal of information on the company's website:
Roland Schoettle, CEO
Optimal Technologies International Inc.
firstname.lastname@example.org 707 557-1788
AEMPFAST FACTSHEET (good starting point)