GT researchers strive to solve tomorrow's energy problems.
If big problems demand big responses, then it’s only appropriate that the nation’s biggest engineering school steps up to address perhaps the nation’s biggest engineering challenge: energy.
Few sectors have a more direct impact on our economy, or on nearly every aspect of modern life. Energy fuels our household comforts and conveniences, our vehicles, our manufacturing and our communications. And our demand for power to support our way of living and doing business is rising rapidly. The U.S. Energy Information Administration predicts that an increasing global population and growing economies will drive energy demand up more than 50 percent by 2035.
Reliably and affordably powering the present and near future—while still maximizing America’s energy independence and minimizing the risks of climate change—makes for an incredibly difficult puzzle to solve. But it’s also a challenge tailor-made for Tech, an institution known as much for its scientific and technological innovation as it is for its leadership on issues of great societal importance.
The Georgia Tech Strategic Energy Institute (SEI) brings the breadth and depth of this energy expertise together to define, design and develop a new energy supply and utilization paradigm for the 21st century and beyond. Founded in 2004, SEI engages more than 200 faculty members from diverse disciplines and serves as a conduit for connecting, coordinating and cultivating energy-related resources, expertise and infrastructure across the Institute. And its goal is nothing less than to solve the world’s most pressing energy challenges, namely:
- Developing clean, sustainable energy sources that are economically viable;
- Reducing carbon emissions;
- Improving energy consumption through greater efficiency;
- Exploring better ways to utilize existing technologies; and
- Understanding the economic and policy implications for our energy decisions.
“Georgia Tech’s energy program aims not just to address the current needs, but to address them in the context of the future—where the needs will be,” says SEI Executive Director Tim Lieuwen, MS ME 97, PhD ME 99. “We want to raise the important questions before they have been asked and shape the discussion and technology development around these complex issues as the energy landscape continues to evolve.”
Opportunities and challenges abound in all sectors. According to the International Energy Agency, the U.S. shale oil and gas revolution will next year put America ahead of Russia and Saudi Arabia as the world’s top oil producer. Yet, at the same time, Bloomberg New Energy Finance projects that nearly $700 billion will be invested in renewable energy—especially wind and solar power—over the next two decades, achieving 343 gigawatts of renewable energy capacity and revolutionizing the entire energy industry with it.
Of course, whether conventional or renewable, focusing on energy sourcing alone would leave out the equally important distribution problem. The existing electric grid, more than a century old, wasn’t designed to distribute and manage thousands of intermittent renewable energy generating sites and microgrids. To refashion a robust 21st century electric grid out of the emerging new energy infrastructure also requires careful planning and smart solutions.
From improving the technologies for conventional and renewable resources like solar, wind and nuclear, to stabilizing the grid to better support future energy needs, Tech researchers and the Strategic Energy Institute are working beyond traditional research boundaries to address our most pressing energy challenges on virtually every front.
Transforming Solar Power Into Global Power
When he came to Tech from Westinghouse in 1985, Ajeet Rohatgi became Tech’s first full-time solar energy researcher. Today Rohatgi is the John H. Weitnauer Jr. Chair in the College of Engineering and director of the University Center of Excellence for Photovoltaic Research and Education. And in no small part because of Rohatgi’s leadership in solar photovoltaic (PV) research, Tech has helped shepherd along—and develop the talent to staff—the still-nascent solar industry.
“Since 1992, I personally have collaborated with almost 50 companies in the U.S.,” Rohatgi says. “So every company that works in silicon solar cells has interacted with Tech in some form or another.”
And since 2009, Rohatgi has also served as CTO of a Tech spinoff company called Suniva, which today has 250 employees building solar PV arrays with 170 megawatts of manufacturing capacity in Georgia, with another 200-megawatt plant now slated to be built in Saginaw County, Michigan. From both the academic and the industry perspectives, solar PV today is on the cusp of a revolutionary threshold. So-called grid parity, in which solar PV costs no more than any other conventional source, is achievable in the U.S., Rohatgi says.
“The price of [PV] electricity today is between 11 and 15 cents per kilowatt-hour,” he says. “And if you’re in a sunnier climate, it is even cheaper. Now we’re within striking distance of grid parity with fossil fuels.”
According to a report issued by Deutsche Bank in October, between 36 and 47 U.S. states (including Georgia) will achieve grid parity by 2016. And just in the six states where solar is already at grid parity—where 90 percent of the U.S.’s solar energy production has been based to date—installed solar capacity is expected to grow sixfold within the next four years.
The U.S. Department of Energy has set a PV goal, Rohatgi says, of 6 cents per kilowatt-hour by 2020, which he adds is tough but achievable. By comparison, residential electric rates in Georgia this past summer averaged 12.55 cents per kilowatt-hour. (The cost to produce one kilowatt-hour of electricity is of course lower, ranging between 3.9 and 5.5 cents for coal and gas and 11.1 to 14.5 cents for nuclear.) At competitive rates, of course, the free market does much of the heavy lifting—leading to otherwise unlikely alliances such as when the Georgia Tea Party joined with local environmentalists last year to force Georgia Power to competitively procure more solar power in the state.
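To see how close these figures put solar to grid parity, here is a quick back-of-envelope comparison. The cost ranges are the ones cited above; treating the Georgia retail rate as the parity benchmark is our simplification, not the Department of Energy's definition.

```python
# Per-kilowatt-hour figures cited in the article, in U.S. cents.
generation_cost = {
    "solar PV (today)": (11.0, 15.0),
    "solar PV (DOE 2020 goal)": (6.0, 6.0),
    "coal/gas": (3.9, 5.5),
    "nuclear": (11.1, 14.5),
}
retail_rate_ga = 12.55  # average Georgia residential rate this past summer

# Call a source "within striking distance" if its low-end cost
# already undercuts the retail rate.
for source, (low, high) in generation_cost.items():
    verdict = "below" if low <= retail_rate_ga else "above"
    print(f"{source}: {low}-{high} c/kWh ({verdict} the retail rate)")
```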
To help ratchet down the cost of PV, Rohatgi’s lab has pioneered the PV application of a technique used in semiconductor manufacturing for computer chips. Called ion implantation, the process deposits charged ions (in this case boron and phosphorus) by accelerating them through an electric field and driving them into a silicon substrate. If the implantation is done right, the newly doped silicon wafer responds to sunlight striking it with a slightly stronger kick, yielding a slightly more efficient solar cell.
Such clever efficiencies are being developed and rolled out in academic and industry research labs all over the world today. And Rohatgi’s teams at Tech, as well as at Suniva, are making the Department of Energy’s 2020 goal closer and closer to reality.
“The price will continue to come down,” Rohatgi says. “There is a lot of technology innovation left. This is our expertise at Georgia Tech. We develop disruptive technologies. We develop innovative solar cell designs, and we develop novel concepts that simplify the fabrication of advanced solar cells.”
Capturing Concentrated Solar for Thermal Energy
On the other hand, says Asegun Henry, assistant professor in Tech’s mechanical engineering school, the sun is either hiding behind clouds or below the horizon at least half the time on any given day. This means for solar energy to be a robust, grid-wide energy source, it also requires solar energy storage. And that’s where his technology comes in.
“I work on concentrated solar power,” he says. “It’s different from solar panels. It’s where you use sunlight as a heat source. You run a typical thermal power plant, which is how more than 90 percent of electricity is made today—through thermal energy, through heat.”
The idea behind a concentrated solar thermal power plant, such as the 110-megawatt Crescent Dunes plant under construction in northern Nevada, is to use arrays of mirrors that track the sun’s motion throughout the day. The mirrors all concentrate the sunlight onto a single location atop a central tower, where salt is heated and then stored in large tanks. The tanks are so large that the molten salt stays hot for more than a month without freezing; Henry says they retain more than 99 percent of their heat during a typical use cycle. Then, when it’s needed, whether at night or during the day, the molten salt is used to boil water and spin steam turbines, generating electricity just as a conventional coal-fired power station does.
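The storage capacity of such tanks comes down to simple sensible-heat physics, E = m·c·ΔT. As a rough illustration, the sketch below uses typical published values for tower plants of this kind — the tank mass, salt properties and temperatures are our assumptions, not figures from the article.

```python
# Rough sensible-heat estimate for a molten-salt storage tank.
# Assumed figures (not from the article): "solar salt" specific heat
# of ~1.5 kJ/(kg*K) and the ~290 C cold / ~565 C hot temperatures
# typical of molten-salt tower plants.
cp_salt = 1.5e3               # J/(kg*K), approximate for nitrate solar salt
t_hot, t_cold = 565.0, 290.0  # degrees Celsius
mass_salt = 3.0e7             # kg, a hypothetical ~30,000-tonne tank

energy_j = mass_salt * cp_salt * (t_hot - t_cold)  # E = m * cp * dT
energy_mwh_th = energy_j / 3.6e9                   # joules -> thermal MWh
print(f"Stored thermal energy: {energy_mwh_th:.0f} MWh (thermal)")
```

Even under these rough assumptions, a single tank holds thousands of thermal megawatt-hours — which is why the plant can keep generating through the night.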
Although these types of plants are now being deployed around the world, the cost is still too high to compete directly with fossil fuels without subsidies. Henry’s group is helping to develop a new generation of solar thermal plant that he says will improve on the molten salt plant’s design and efficiency. “Our goal is to reduce the cost even more than what it is now,” Henry says. “We want to operate the plant at higher temperatures, so that it can produce power more efficiently.”
In the target temperature range his group studies, 1,350 to 1,500 degrees Celsius (2,460 to 2,730 degrees Fahrenheit), the best thermal storage medium Henry has yet found is liquid metal. It’s nowhere near its boiling point at such temperatures, and it’s otherwise very stable and unreactive, he says. However, metal pipes and valves no longer work at these temperatures, so Henry says his group is also developing ceramic pipes and valves for moving and storing the molten metal.
The good news, he says, is that conventional gas turbines also operate at these temperatures, so the equipment, designs and infrastructure for generating electricity is already well known and in place. The bad news is solar thermal probably won’t be cost effective outside arid climates such as Nevada.
Instead, Henry says, solar thermal could provide a powerful and effective new way to make fuels such as hydrogen, which can be stored and used anywhere in a fuel cell. And, of course, electricity is also transportable. In the future, desert-located solar thermal plants could generate an excess of electricity and sell the renewable resource to less-arid neighbors—something that Hydro Quebec actually does today, often generating more hydroelectric power than the Canadian province needs and selling its excess to other utilities in the U.S. and Canada.
A fellow assistant professor in mechanical engineering, Peter Loutzenhiser, has also been researching solar technologies that would make fuels such as hydrogen and carbon monoxide. These fuels can be transformed into hydrocarbons like petroleum and jet fuel via known chemical processes.
Using his lab’s seven 6-kilowatt xenon arc lamps as a source of artificial sunlight, Loutzenhiser and his students can simulate solar collector facilities that concentrate sunlight to 2,500 times its regular strength using mirrors. (As he notes in an introductory video on YouTube, his lab’s arc lamps pack enough heat that they can burn a hole in a half-inch thick steel plate in less than one minute.) His group experiments with using a two-stage process to transform water and CO2 to hydrogen and carbon monoxide by harnessing the thermal energy from his solar simulator.
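The arithmetic behind that steel-melting demonstration is straightforward: concentrating sunlight multiplies the sun's roughly one kilowatt per square meter of direct irradiance (a standard textbook value, not a figure from the article) by the concentration ratio.

```python
# Why 2,500 "suns" can burn through steel: a quick flux calculation.
one_sun = 1000.0      # W/m^2, typical clear-sky direct solar irradiance
concentration = 2500  # concentration ratio cited in the article

flux = one_sun * concentration  # W/m^2 at the focal spot
print(f"Focal flux: {flux / 1e6:.1f} MW per square meter")

# The simulator's total radiative budget: seven 6-kilowatt xenon lamps.
lamp_power_kw = 7 * 6
print(f"Solar simulator lamp power: {lamp_power_kw} kW")
```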
Loutzenhiser says he thinks this technology could in the future be competitive with traditional oil and gas for making gasoline and jet fuels, and in this case the resulting fuels would also carry no net carbon footprint.
“This is a mechanism to mitigate CO2 emission,” he says. “It also acts as a means to decrease reliance on foreign entities for our fuels. We have the potential to transform the Southwest United States into a fuel processing station where we could produce fuels during the day and transport and use them throughout the country without significant changes in existing infrastructure. We could essentially use sunlight to drive all our transportation processes.
“The potential is exciting,” Loutzenhiser says. “The challenges are enormous, but that is why we are doing it.”
Engineering Fossil Fuels for Efficiency—and a Low-Carbon Future
While solar technologies appear promising for the future, fossil fuels are, of course, still the world’s energy driving force of today. Some Georgia Tech faculty want to optimize fossil technologies to ensure we burn less for the same power output while also perhaps capturing and storing some of the carbon dioxide these fossil fuels produce.
David Sholl, the Michael Tennenbaum Family Chair and GRA Eminent Scholar in Energy Sustainability in the School of Chemical and Biomolecular Engineering, says his group is investigating a more sensible form of carbon capture for coal-fired power plants.
“In a conventional coal power plant, you burn the coal and end up with CO2 and various other gas products,” Sholl says. “If you want to avoid emitting that CO2, you then have to capture the CO2. But the alternative is to treat the coal in a pre-combustion way so you get the CO2 in a purer and more condensed form, which greatly reduces the economic cost of sequestration or processing.”
One technique Sholl’s group has studied involves developing metallic thin-film membranes through which a vapor gas stream from heated coal is passed. The membranes Sholl and his colleagues are investigating let only hydrogen gas through. The rest, including both carbon monoxide and carbon dioxide, can be diverted into other streams in which the CO2 might then be processed or sequestered. The gasified coal thus separates out into pure hydrogen, which can be burned in a fuel cell or converted to other fuels. And the CO2 stream can be stored underground or even channeled into shale formations to drive natural gas back up.
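A toy mass balance shows why this separation makes sequestration cheaper: once the hydrogen is pulled through the membrane, the leftover stream is mostly CO2. The feed composition and recovery fraction below are illustrative assumptions, not numbers from Sholl's work.

```python
# Simplified mass balance for the membrane separation step: a shifted
# coal-gas stream enters, the membrane passes only hydrogen, and
# everything else leaves as a CO2-rich stream for sequestration.
feed = {"H2": 55.0, "CO2": 40.0, "CO": 3.0, "other": 2.0}  # mole %
h2_recovery = 0.90  # assumed fraction of H2 that permeates the membrane

permeate_h2 = feed["H2"] * h2_recovery  # nearly pure H2 product stream
retentate = dict(feed)                  # what stays behind the membrane
retentate["H2"] = feed["H2"] * (1 - h2_recovery)

total_retentate = sum(retentate.values())
co2_purity = retentate["CO2"] / total_retentate * 100
print(f"H2 recovered: {permeate_h2:.1f} mol per 100 mol of feed")
print(f"CO2 share of the remaining stream: {co2_purity:.0f}%")
```

The point of the exercise: a feed that was 40 percent CO2 becomes a retentate that is roughly 80 percent CO2, a far more condensed form to process or store.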
“That’s a key part of the Georgia Tech approach to this issue,” Sholl says. “We can’t just look at it in isolation as a scientific solution. You have to think about how it scales technologically, too. It’s almost a slogan, but we really take seriously the idea that we should be trying to change the world. Our goal is not just to publish a paper; we really want to imagine technologies that people will use. We also have very strong collaborations with industrial partners. All these things involving large-scale energy generation require really enormous resources to make them work in the long run.”
Harnessing the Power of Wind
The spinning blades of a gas turbine certainly provide one unlikely application of aeronautical engineering expertise. The other is wind turbines. Lieuwen’s colleague Lakshmi Sankar, MS AE 75, PhD AE 77, Regents Professor and Associate Chair of the School of Aerospace Engineering, asserts that next-generation wind turbines can benefit from lessons learned in designing helicopter blades for the military.
Wind energy is today probably the closest any renewable power source has yet come to grid parity. One 2013 study by the U.S. Department of Energy’s Lawrence Berkeley National Laboratory pegs well-sited wind power at 2.5 cents per kilowatt-hour, cheaper than the cheapest coal. According to the U.S. Energy Information Administration, new wind farm production this year has also doubled compared to 2013, with 675 megawatts of additional capacity added, especially in California, Nebraska, Michigan and Minnesota.
Yet for all the boom times, wind energy’s price tag can soar when wear and tear on wind turbines leads not only to repair bills, but also to days or even weeks offline while operators diagnose the problem and wait for the repair.
“One of the biggest issues currently is fatigue of the blades and the gears,” Sankar says. “And when they fail, wind turbines go out of commission till they can replace it. And it takes a lot of money and a lot of down time.”
For the Army and private military contractors, Sankar and his coworkers have explored next-generation helicopter blades that contain tiny pressurized air nozzles along their surface, a little like an air hockey table. His group discovered that carefully choreographed small blasts of compressed air can stabilize the blades and greatly reduce vibrations on the moving parts, especially during high winds and gusty weather.
Sankar is now trying to convince commercial wind turbine manufacturers to recognize the wisdom of investing $1 million to $1.5 million in a system that can save a $20 million turbine from having to be overhauled or replaced entirely.
“These things have to last 30 years like a house in order to pay themselves off,” he says. “So our selling point is that you can extend the life of the system—not only the blade. But because the blades are connected to the gears, you can protect the life of the gears. And because the gears are connected to the generator, you can protect the life of the generator.”
In addition to the aerospace engineering component of wind energy research, Senior Research Associate Mary Hallisey Hunt says earth and atmospheric sciences researchers at Tech are working on wind resource assessment to help site wind farms where wind patterns and weather are most favorable. Structural and civil engineers are working on wind farm foundation design, she says, pointing to an SEI-sponsored workshop on campus last year at which international experts considered how to build offshore wind turbine foundations best able to withstand the surf and strains of the offshore environment.
“There’s a lot of work going on with different technology pieces on campus—and also policy assessment pieces,” Hunt says. “We’re working with the state to help understand permitting issues associated with siting wind energy facilities. This will allow people looking to potentially develop wind energy facilities to have more fact-based information readily available to identify the best locations.”
Georgia Tech alumni are also active in the wind energy industry. Brian O’Hara, ME 97, president of the Southeastern Coastal Wind Coalition, is guiding policy makers toward more informed choices in planning, siting and developing wind energy in the Southeast. For his organization, all eyes are now on Virginia, where the utility Dominion Virginia Power, in collaboration with the U.S. Department of Energy, is developing offshore test wind turbines in the waters off Camp Pendleton.
“There’s an economic development opportunity here,” O’Hara says. “Think about the offshore wind industry. We don’t have anything yet in the U.S. But look at the size of these machines. And we’ve seen where once there’s sufficient demand, manufacturing facilities are going to be located on the coast. Many of these machines are just too large to build inland and transport over land. So there’s a really big manufacturing and supply chain opportunity.”
Bringing on the Nuclear Renaissance
In addition to wind, solar and improved fossil, next-generation nuclear energy is also part of Tech’s and SEI’s broad-spectrum energy research. Farzad Rahnema, Chair of the Nuclear and Radiological Engineering/Medical Physics Programs, says the programs are investigating two promising nuclear reactor technologies that could yield increased safety, improved waste end products and powerful, affordable and compact design.
The Integral Inherently Safe Light Water Reactor (I2S-LWR), Rahnema says, is a promising compact pressurized water reactor (PWR) concept, one that boasts both inherent safety and enhanced accident tolerance. The I2S-LWR combines the large, economical power output of current reactors with a self-contained, “integral” reactor design that reduces the external penetrations into the vessel; as a result, the design has fewer potential points of breakage or leakage. And, he says, Georgia Tech leads the multi-university, multi-organization effort alongside a nuclear reactor vendor (Westinghouse) and a national laboratory (Idaho National Laboratory).
“This is a three-year project, and we’re 18 months into it,” Rahnema says. “By the end of three years we’ll be able to show the viability of the concept. And from there it’s just a matter of someone getting interested in that and taking that further.”
Georgia Tech is also leading a team of researchers that includes universities, a national lab (Oak Ridge) and a nuclear reactor vendor (AREVA) to advance another new reactor concept, one built around a solid fuel and high operating temperatures. It’s called the FHR (Fluoride-salt-cooled, High-temperature Reactor), and the “F” in its name comes from the main component of its coolant. Unlike most conventional water-cooled reactors, the FHR is cooled by a salt made of fluoride, lithium and beryllium. And because of the coolant’s high boiling point, the reactor can live up to the “H” in its name, too, with its high temperatures allowing it to run at higher efficiency.
“It’s inherently much safer,” Rahnema says. “And it’s also very economical because of the high efficiency. You can use the high-temperature heat for applications other than electricity, such as process heat. So it’d be a dual purpose reactor.”
The deployment of FHR technology, he says, promises benefits including passive safety, the expansion of nuclear power beyond electricity generation alone, and what he calls proliferation-resistant nuclear waste—meaning reactor end products that are even more difficult to divert or misuse than the waste streams from conventional nuclear reactors.
However, challenges remain before FHRs can be deployed, most of them related to technology readiness. The Georgia Tech team will help move FHRs toward commercialization by addressing some of these challenges: removing the radioactive hydrogen (tritium) generated by nuclear reactions in the FHR’s coolant; removing the impurities that enter the liquid salt coolant during operation; reducing the salt’s corrosion of the reactor vessel itself; and selecting and testing reactor alloys that can withstand both the radiation and the salty, corrosive operating conditions.
The FHR project, Rahnema says, will start at the beginning of 2015.
A little like Asegun Henry’s high-temperature solar thermal system, FHR could, for instance, make synthetic fuels like hydrogen while also powering conventional steam boilers to generate electricity.
“It’s just a matter of nuclear being competitive with other sources of electricity like gas,” he says. “I think we can make things more attractive to the public. Because if they’re even safer than the [present] generation reactors, the public would be more receptive to nuclear.”
Moving Energy Along on a 21st Century Grid
Whether mainstream or alternative energy, achieving the right mix of power generation technologies is just the first half of the battle. Delivering that energy to the customer via the country’s electrical grid is equally important.
Georgia Tech’s National Electric Energy Testing, Research and Applications Center (NEETRAC) is a crucial clearinghouse for grid-related research, including such important areas as reliability, security and the deployment of new technologies designed to keep the grid robust and efficient. Part of NEETRAC’s mandate is to help the electric utility industry better manage and maintain its physical infrastructure, including the power lines, transformers and substations that make up the grid. In the U.S. and Canada, 300,000 miles of high-voltage transmission lines deliver power generated from more than 2,100 power plants. Yet most of these transmission lines date from the 1960s, ’70s and ’80s, leaving our country’s aging electric grid vulnerable even as it must adjust to accept a broadening mix of alternative energy sources such as wind and solar.
“We anticipate that the nature of the grid is going to evolve tremendously over the next 20 or so years, primarily to adapt to the new idea that a lot of the sources of electricity will be widely distributed renewable generation—from solar and wind power primarily and perhaps other sources as well,” says NEETRAC Director Richard Hartlein, ME 76, MS ME 82.
“The grid was never designed to accept small inputs of generation at multiple points along the grid. It was always designed to accept generation from very large sources like nuclear, fossil and hydroelectric plants. And that energy would be transmitted at high voltage to substations and on to the customer via lower voltage distribution lines. This new concept of injecting many smaller energy sources into the grid requires us to rethink how we design and manage the grid. A lot of new, innovative technologies will be needed for the grid to operate reliably and efficiently under this new paradigm.”
NEETRAC’s activities are quite broad. The center has worked with academic faculty to develop computer models that help utilities better integrate distributed energy sources into the grid. The center also runs a high-voltage lab that studies future power line designs for both above-ground (overhead) and underground high-voltage transmission lines. According to Hartlein, the latter is becoming increasingly attractive because right-of-way for above-ground lines is harder and harder for utilities to secure.
“There’s an increasing need to put some transmission lines underground,” he says. “That means taking a bare conductor energized at 230,000 volts or 345,000 volts, covering it with insulation and installing it underground, which is a non-trivial thing to do. There are companies gearing up to supply the industry with these higher-voltage underground cable systems. Because NEETRAC has a high-voltage laboratory and significant high-voltage testing expertise, we have been involved in helping manufacturers and utilities prove out these complex, high-tech cable systems.”
On the evolution of the electricity industry, Santiago Grijalva—SEI associate director and Georgia Power Distinguished Professor of Electrical and Computer Engineering—says the electricity industry is undergoing a broad transformation that includes distributed generation sources being integrated into the grid, sensing and communication being overlaid throughout the system, and evolving utility business models.
Enabled by sensors and pervasive information, the traditional consumer is becoming more aware of energy utilization and capable of local energy management. Equipped with affordable distributed solar, demand response capability, and sometimes energy storage, the consumer is becoming a “prosumer” who not only intelligently consumes but produces, stores and offers energy and energy-related services to the grid.
Energy prosumers are emerging as intelligent, economically motivated actors in microgrids, buildings, homes and EVs. They want to be in control, want energy on their terms, want value and services, and want to contribute to broader sustainability challenges. While prosumers could disconnect from the grid, being disconnected as a default operating mode is suboptimal and unreliable. They will remain grid-connected, and the question is: How can utilities and the emerging prosumers coordinate the flow of energy, the exchange of services and the decision-making needed to maintain a reliable, profitable and sustainable grid?
A Georgia Tech project team, supported by the U.S. Department of Energy’s Advanced Research Projects Agency-Energy (ARPA-E), has been working on these very issues for the last three years. This collaborative research effort brings together experts from the domains of real-time power system operation, autonomous and networked control, cyber-physical systems, and stochastic distributed decision-making. Because the electricity infrastructure is expensive and very large, Grijalva says, it cannot simply be replaced. Thus, a primary mechanism for increasing the flexibility of the grid is to develop advanced control approaches based on a decentralized model.
“You don’t have 10,000 devices to monitor and control anymore,” Grijalva says. “You may have a billion smart energy devices. The centralized grid control architecture is not scalable to these massive numbers. So one of the things we’re developing is a decentralized control paradigm.” A redesign of the grid should more closely resemble the Internet’s decentralized structure than a centrally controlled network. In simple terms, he says, grid operators need to think of their customers less as passive nodes and more as prosumers, each—whether consciously or via enabling technologies—seeking to optimize their energy use and minimize their costs.
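To illustrate what decentralized coordination can look like, here is a textbook consensus sketch — not the Tech team's actual control scheme. Each simulated device exchanges information only with its immediate neighbors, yet the whole network settles on the grid-wide average imbalance with no central controller involved.

```python
# Toy decentralized consensus: every node repeatedly nudges its own
# value toward those of its neighbors. No node ever sees the whole
# network, yet all nodes converge to the network-wide average.
def consensus_average(values, neighbors, rounds=200, step=0.3):
    x = list(values)
    for _ in range(rounds):
        # Synchronous update: each node moves toward its neighbors' values.
        x = [
            xi + step * sum(x[j] - xi for j in neighbors[i])
            for i, xi in enumerate(x)
        ]
    return x

# Four hypothetical devices in a ring, each holding a local measurement
# (say, a kilowatt surplus or deficit).
local_imbalance = [4.0, -1.0, 2.0, -5.0]
ring = {0: [1, 3], 1: [0, 2], 2: [1, 3], 3: [2, 0]}
print(consensus_average(local_imbalance, ring))  # all values approach the mean
```

The scaling argument is the point: each device's work depends only on its neighbor count, not on the total number of devices, which is what makes billion-device coordination conceivable.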
“I see the current developments on grid modernization as the tip of the iceberg,” Grijalva continues. “We’re changing a century-old industry into something dramatically different. It’s a very exciting time, but there’s lots of work ahead. I sometimes tell my students to look at our laboratory library. Probably half of the books on power systems will have to be rewritten in the next decade.”
“It is not only an engineering problem,” he says. “There is no magic bullet that solves everything. There’s a need of architecture and system-wide understanding. Georgia Tech has the capability to understand these problems in a deep manner, going from the devices to systems, and to policy, market and business models.”
This originally appeared in Vol. 90, No. 4 of the Georgia Tech Alumni Magazine.