The Distributed Storage-Generation "Smart" Electric Grid of The Future
The Pew Center on Global Climate Change and the National Commission on Energy Policy.
Figure 1. The U.S. electric grid has become more unstable since 1998, with more failures that affect large populations of customers than an extrapolation of the previous 50 years would predict (background plot from Amin, IEEE Computer Applications in Power, 2001).
A computerized control capability could be used to model and better understand the
electric grid, yet none exists to date that can visualize the entire North American grid.
Such computer assistance is particularly needed because the grid is exhibiting more and
more behavior characteristic of chaotic systems. Since electricity on the grid is free flowing, it moves as power pulses at nearly the speed of light (10^10 cm per second). In reality, however, the electrons themselves flow back and forth between the power plants and the consumers far more slowly than these power pulses. The electrons can only move along the copper and aluminum wires and transformers of the transmission grid at velocities between 10^7 and 10^8 cm per second.3 This disparity produces the nonlinear behavior of the system, affecting the flow of electricity in unpredictable ways.
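As a rough illustration of the scale of this disparity, the short calculation below compares the two speeds quoted above; the figures are the order-of-magnitude values from the text, not precise material constants.

    # Order-of-magnitude comparison of the two speeds described above.
    # Power pulses (electromagnetic waves) travel at nearly the speed of light,
    # while the electrons themselves move through the conductors far more slowly.

    pulse_speed_cm_s = 3e10        # ~speed of light, on the order of 10^10 cm/s
    electron_speed_cm_s = 1e8      # upper end of the Fermi-velocity range (10^7-10^8 cm/s)

    ratio = pulse_speed_cm_s / electron_speed_cm_s
    print(f"Power pulses outpace the electrons by a factor of roughly {ratio:.0f}")
    # Even at the most favorable end of the range the pulses lead the electrons by a
    # factor of a few hundred; at the low end of the range the gap is a few thousand.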
In addition to computer software more akin to the internet or air traffic control systems, new hardware is needed to provide buffers against the cascades of failures that are caused by congestion and disruptions of the flow of electricity and that propagate across the grid. Power switches that combine thyristors (the electrical equivalent of the transistors that direct information in a computer) to redirect flow with capacitors to provide buffering storage would give the grid operator the option of redirecting electricity around obstacles and disturbances at the speeds needed to forestall failures. Currently, flow can be redirected only through the use of mechanical circuit breakers, at speeds adequate for most present, but not future, conditions. A few of these experimental devices are installed in the United States, but their cost is currently too great for large-scale use.
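As a minimal sketch of the control idea, assuming a single congested line with one alternate path, the fragment below shows how such a controller might divert flow when a limit is exceeded; the limits, the reroute fraction, and the function itself are illustrative placeholders, not the logic of any actual device.

    # Hypothetical sketch: a solid-state power controller shifts flow off an
    # overloaded line onto an alternate path in a fraction of a cycle, instead of
    # waiting for a mechanical circuit breaker. Thresholds are illustrative only.

    LINE_LIMIT_MW = 500.0        # assumed thermal limit of the congested line
    REROUTE_FRACTION = 0.3       # assumed share of flow diverted when the limit is hit

    def control_step(primary_mw: float, alternate_mw: float) -> tuple[float, float]:
        """One control cycle: divert part of an overload onto the alternate path."""
        if primary_mw > LINE_LIMIT_MW:
            shifted = primary_mw * REROUTE_FRACTION
            primary_mw -= shifted        # thyristor switch redirects the flow
            alternate_mw += shifted      # capacitor bank buffers the transient
        return primary_mw, alternate_mw

    # A 600 MW surge on a 500 MW line triggers an immediate partial reroute.
    print(control_step(600.0, 200.0))    # -> (420.0, 380.0)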
On top of such electronic aid, distributed storage and generation hardware such as high-temperature superconducting (HTSC) storage must be added to the transmission grid to provide the capability to smooth out intermittent and unpredictable flow from large-scale wind and solar farms. HTSC devices have no resistance to the flow of electricity below their critical temperature, but if heated up, for example, by a current spike, they become resistive and limit the propagation of the power surge. Surges and sags in power could then be dealt with in fractions of a second without shutting down the whole system. This technology is currently too expensive (see Table 1), as well as improperly incentivized, to gain widespread use today. For example, regulators do not allow utilities to recover the costs of purchasing such equipment through consumer electricity rates. Yet these large-scale storage systems must be in place before we can decentralize the grid to accommodate significant amounts of smaller distributed generation (DG) and distributed storage (DS) capabilities. These kinds of new grid hardware (e.g., HTSC, thyristors, and capacitors, commonly grouped under the term "power controllers"), together with DG and DS, will make the overall grid network more efficient and stable by flattening out peak-demand spikes and load variability.
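The fault-limiting behavior of HTSC described above can be made concrete with a small, idealized model: below an assumed critical current the superconductor presents essentially zero resistance, while a surge drives it resistive and the added resistance chokes off the fault. The threshold and quench resistance below are arbitrary illustrative values, not parameters of any real device.

    # Idealized HTSC fault-current limiter: zero resistance in normal operation,
    # resistive once a surge drives the current past the quench threshold.

    CRITICAL_CURRENT_A = 1000.0      # assumed quench threshold
    QUENCH_RESISTANCE_OHM = 0.5      # assumed resistance of the quenched HTSC element

    def htsc_resistance(current_a: float) -> float:
        """Effective resistance of the HTSC element at a given current."""
        return 0.0 if current_a <= CRITICAL_CURRENT_A else QUENCH_RESISTANCE_OHM

    def limited_current(source_v: float, line_resistance_ohm: float) -> float:
        """Fault current after the HTSC element (if quenched) is in the circuit."""
        prospective = source_v / line_resistance_ohm
        return source_v / (line_resistance_ohm + htsc_resistance(prospective))

    # A fault that would otherwise draw 500 V / 0.1 ohm = 5000 A is cut to about
    # 500 / (0.1 + 0.5) = 833 A once the element quenches.
    print(limited_current(500.0, 0.1))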
These power controllers would also allow operators to charge a higher fee for high-quality power, while no additional fees would be charged to users who don't need completely stable power. The ability, provided by power controllers, to switch the flow of electrons is required for this type of dual power system, in which higher-quality, more reliable power can be delivered at an added cost only to those consumers who need it, while lower-quality, less reliable, less expensive power could be delivered to the rest of us. The home is little affected by momentary brownouts, but such brownouts can wreck a semiconductor assembly line. Added revenue sources, such as from a dual system, are needed to attract the private capital required to upgrade and maintain the long-distance transmission system so that it can accommodate vast new wind and solar farms. Presently, we all get the same high-quality, expensive power (99.999% of the time it is within a strict range of voltage and frequency, called "5 9s" in the power business).
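For a sense of what "5 9s" implies, the short calculation below converts an availability figure into the out-of-spec time it allows per year; 99.999% works out to roughly five minutes a year.

    # Convert "number of nines" availability into allowed out-of-spec minutes per year.

    MINUTES_PER_YEAR = 365.25 * 24 * 60

    def out_of_spec_minutes(nines: int) -> float:
        """Minutes per year outside the voltage/frequency spec, e.g. nines=5 for 99.999%."""
        unavailability = 10.0 ** (-nines)
        return unavailability * MINUTES_PER_YEAR

    for n in (3, 4, 5):
        print(f"{n} nines: about {out_of_spec_minutes(n):.1f} minutes per year out of spec")
    # 3 nines: ~526 min/year, 4 nines: ~53 min/year, 5 nines: ~5 min/year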
Table 1. Major new components of the new distributed Storage-Generation Smart Grid
of the future.
20-30 Year-out Technology Needs
There is great promise that high-temperature superconductivity and nanometer-scale technologies will deliver several breakthroughs that could revolutionize the grid 20-30 years out. The high-temperature superconductor/liquid hydrogen (HTS/LH2) super energy highway newly proposed by EPRI and DOE might provide the clean and green
energy in both electrical and chemical forms to power urban transportation and electricity
needs simultaneously. This Super Grid would use a high-capacity, superconducting
power transmission cable cooled within a liquid hydrogen pipeline, with the hydrogen
used in fuel cell vehicles and generators. The Super Grid would accelerate the
deployment of HTSC technology by relaxing its stringent cooling requirements and removing a major cause of failure of existing superconductors. At present, HTSC costs are excessive because of the insulation required to keep HTSC devices cold. Placing the HTSC
inside liquid hydrogen pipelines would eliminate the need to insulate them. In such a
system, electricity and hydrogen provide a joint pathway for us to become progressively
less dependent on fossil fuels, reducing GHG and pollutant emissions, and increasing the
capability of the grid to accept large contributions of renewable energy sources.
Nano-scale transmission wires, called quantum wires (QW), might revolutionize the grid even further. QW have electrical conductivity higher than that of copper at one-sixth the weight, and twice the strength of steel. A grid made up of such transmission
wires would have no line losses or weather dependencies, eliminating the need for
massive emergency generation capacity, and the grid could be buried without any special
handling. The transmission wires of the grid, if made from such QW, would be virtually immune to weather-induced outages, especially if laid underground. QW could perhaps be spun into a polypropylene-like rope that is non-corrosive and can be buried indefinitely with no need for shielding of any kind. However, massive factories
will surely be required to weave the QW in the quantities needed by the grid. There are
more than 700,000 miles of transmission lines in the United States alone at this time.
Barriers to Success
A smarter grid is required to provide efficient, clean, plentiful, safe and secure
energy to power continued economic development with lessened environmental impact.
It could even be a web-enabled, digitally controlled, intelligent delivery system for both
power and transportation services.
How we get there from here is a much harder question. Currently, there are no
incentives for fixing the grid beyond short-term patches like laying additional
transmission wires around congestion. That strategy is much like urban highway construction: the more lanes a city provides, the more traffic the road attracts, producing more congestion, requiring more lanes, and so on.
I believe we must create several national test beds to experiment with how to
deploy new smart grid technologies on a large scale and in an integrated way. Such test
beds would combine promising technologies (see Table 1) in various configurations and
experiment with how the system is improved through their use. Designing a smart grid is
difficult to do if individual technologies are deployed in isolation. Such test beds MUST be in operation within 10 years if we are to meet the power needs of the continent
20-30 years out. The grid cannot be experimented with live. We must be certain that
the grid is capable of handling each new technology BEFORE it is deployed. It is not an
option to connect new gadgets directly to the grid and accidentally cause massive, cascading blackouts. The problem with creating such national test beds is that the electricity industry has among the lowest R&D expenditures of any industry (Technology Review, 2003). The federal government must recognize the electric grid as
vital to our prosperity and national security. A DARPA-like organization is required.
DARPA (Defense Advanced Research Projects Agency) funded, among other
developments, the Internet and supercomputers. Such an effort, dedicated to
modernization of the electricity grid, is needed within the U.S. Department of Energy.
1. U.S. DOE. 2003. National Electric Delivery Technologies Vision and Roadmap. November 2003. Available for download at: http://www.energetics.com/electric.html
2. A relationship between two variables is defined as being fractal if it is linear when examined in exponential or log-log space, that is, the same across many scales.
3. Termed the Fermi velocity.