Any effective plan to tackle climate change hinges on a basic technology: long wires strung across tall towers.
The US needs to add hundreds of thousands of miles of transmission lines in the coming decades to weave together fragmented regional power systems into an interconnected grid capable of supporting a massive influx of renewables.
A national network of short spur lines and long-distance, high-voltage wires would deliver wind, solar, and hydroelectric power across the country to where it’s needed, when it’s available. It could also provide reliable backup power when heat waves or winter storms cause regional shortages, and keep up with soaring demand as homes and businesses increasingly rely on electricity to power their vehicles, heating systems, and more.
It’s a grand vision with a few serious flaws. For starters, it will be massively expensive. A Princeton-led study found it will take an additional $350 billion for the US to develop the transmission capacity needed just in the next nine years. That’s under a scenario in which wind and solar provide half of the country’s electricity by 2030, putting the nation on track to zero out emissions by midcentury.
Even if the government and businesses free up the necessary funds, there’s a trickier challenge ahead: states, counties, cities, and towns across the nation would need to quickly sign off on a multitude of new transmission lines. And the US has become terrible at permitting such multi-state projects.
A series of efforts to deliver cheap, clean hydro power from Canada, wind from the Great Plains, and a blend of renewables from the Southwest have been mired in legal battles for years, or rejected, often because a single region balked at having the wires cut through its land. Even those large grid projects that do get built can easily take a decade to work through the approvals process.
Some help may finally be on the way. The roughly $1 trillion infrastructure package moving forward in the Senate, which has bipartisan support, provides billions of dollars for transmission lines. It also includes some provisions that might prove even more important than the money, by enhancing and clarifying federal power over project approvals.
Still, the package would represent just a small down payment on the investments and permitting changes that will be required.
The US doesn’t have a single grid. It has three creaking, disconnected systems, largely built around the middle of the last century, with limited abilities to swap electricity across states and larger regions.
The isolated grids mean that electricity from fluctuating sources like solar and wind can only be shipped so far, wasting some portion of the output and driving down prices when generation outstrips regional demand during particularly windy and sunny periods (which is occurring more and more as those sources make up a greater share of the electricity supply). For instance, California can’t send its excess solar power to Midwest cities during the middle of a summer day, or draw on the steady wind power from, say, Oklahoma when the sun starts to dip on the West Coast.
But operators of an integrated grid could tap into the lowest-cost electricity available across a far larger area and deliver it to distant places with high demand, notes Doug Arent, an executive director at the National Renewable Energy Laboratory.
Long-range, high-voltage transmission lines also enable more development of solar, wind, hydro, and geothermal plants in the regions blessed with the weather, geology, or waterways to supply them. That’s because developers will be able to count on larger customer bases in cities that may be a time zone or two away.
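The pooling benefit is easy to sketch numerically. The toy model below uses entirely made-up hourly profiles (the regions, numbers, and shapes are illustrative, not real grid data) to show that letting one region’s midday solar surplus cover a deficit elsewhere can only reduce total unserved demand:

```python
# Toy illustration of interconnection: two regions with offset resources.
# All profiles are hypothetical, chosen only to illustrate the mechanism.
hours = range(24)
ca_solar = [max(0, 100 - abs(h - 12) * 20) for h in hours]  # peaks midday
ok_wind = [60 for _ in hours]                               # steady output
demand = [80 for _ in hours]                                # flat, per region

def unserved(supply, demand):
    """Total demand that local supply fails to cover, summed over hours."""
    return sum(max(0, d - s) for s, d in zip(supply, demand))

# Isolated grids: each region must balance alone.
isolated = unserved(ca_solar, demand) + unserved(ok_wind, demand)

# Interconnected grid: surplus anywhere can serve deficit anywhere, hour by hour.
pooled = unserved([s + w for s, w in zip(ca_solar, ok_wind)],
                  [2 * d for d in demand])

print(isolated, pooled)  # pooling never increases total unserved energy
```

The improvement comes exactly from the hours when one region’s surplus overlaps another’s deficit, which is the case long-distance transmission is built for.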
A recent Lawrence Berkeley lab presentation noted that more than 750 gigawatts of proposed power generation projects are already waiting in interconnection queues across five regions of the US for the transmission connections that could deliver their electricity to customers. The vast majority are solar and wind projects. (By way of comparison, the US’s entire fleet of large-scale plants can generate a little more than 1,100 gigawatts.)
Other countries are zipping ahead in this area. China has emerged as the world’s clear leader in high-voltage transmission, building tens of thousands of miles of these lines to connect its power plants with cities across the vast nation. While China developed 260 gigawatts of transmission capacity between 2014 and 2021, all of North America added just seven gigawatts, according to a survey conducted by Iowa State University.
“The US is lagging behind, yet it has every reason to catch up,” James McCalley, a professor of power systems engineering at Iowa State University and a coauthor of a national grid study published late last year, said in a statement.
A fraction of what’s needed
So how could the US begin to close that gap?
First, it will need more money. While the Biden administration has boasted that the infrastructure package provides $73 billion for “clean energy transmission,” those funds are spread across a wide array of efforts, including research and development as well as demonstration projects in areas like carbon capture and clean hydrogen.
The current version of the infrastructure package sets aside only about $10 billion to $12 billion specifically for erecting transmission towers and wires, notes Rob Gramlich, president of power consulting firm Grid Strategies.
That’s a fraction of the amount the Princeton study found the US will need to invest in the next nine years. While federal spending is designed to unlock private capital, the US would still need to invest tens of billions more to reach the necessary scale this decade, says Jesse Jenkins, a coauthor of the Princeton study and an assistant professor at the university.
The bill also establishes a $2.5 billion revolving loan program for projects, which effectively makes the Department of Energy the initial customer for new transmission lines. This federal financing could help get time-consuming but necessary transmission projects under way before the developer has lined up customers. That could ease the perpetual chicken-and-egg problem between building more electricity generation and constructing the lines needed to transport it, observers say.
Eventually the federal government can sell that transmission capacity to clean electricity plants that need access to the lines as they come online.
It’s a promising policy tool that “just needs another zero in that budget line,” Jenkins says.
Though short on money, the proposed infrastructure bill does address approval logjams.
A long-running challenge in many parts of the US is that electricity generating capacity and energy demands grow faster than transmission systems. People and businesses want cheap, reliable electricity, but few embrace the necessary towers and wires—especially if they seem to deliver electricity and economic benefits mostly to far-off areas. There are often aesthetic, environmental, social justice, and business competition criticisms as well.
“If we are going to meet our climate goals, we have to figure out ways to approve these big transmission projects—and historically we’ve struggled to do so,” said Lindsey Walter, deputy director of the climate and energy program at Third Way, a center-left think tank in Washington, DC, in an email.
A 2005 energy law sought to address these tensions, granting the Federal Energy Regulatory Commission (FERC) the ability to step in and sign off on projects that could alleviate transmission constraints in certain areas designated national electric transmission corridors. But so far, the Department of Energy has only designated two such areas, in the mid-Atlantic and in Southern California.
In addition, a federal court of appeals ultimately limited FERC’s authority, finding it only had the right to sign off on projects if states or other jurisdictions held up an application for more than a year. It did not have the ability to overrule state rejections of applications under the law, the court ruled.
Facebook wants machines to see the world through our eyes
For the last two years, Facebook AI Research (FAIR) has worked with 13 universities around the world to assemble the largest ever data set of first-person video, known as Ego4D—specifically to train deep-learning image-recognition models. AIs trained on the data set will be better at controlling robots that interact with people, or interpreting images from smart glasses. “Machines will be able to help us in our daily lives only if they really understand the world through our eyes,” says Kristen Grauman at FAIR, who leads the project.
Such tech could support people who need assistance around the home, or guide people in tasks they are learning to complete. “The video in this data set is much closer to how humans observe the world,” says Michael Ryoo, a computer vision researcher at Google Brain and Stony Brook University in New York, who is not involved in Ego4D.
But the potential misuses are clear and worrying. The research is funded by Facebook, a social media giant that has recently been accused in the US Senate of putting profits over people’s well-being—as corroborated by MIT Technology Review’s own investigations.
The business model of Facebook, and other Big Tech companies, is to wring as much data as possible from people’s online behavior and sell it to advertisers. The AI outlined in the project could extend that reach to people’s everyday offline behavior, revealing what objects are around your home, what activities you enjoyed, who you spent time with, and even where your gaze lingered—an unprecedented degree of personal information.
“There’s work on privacy that needs to be done as you take this out of the world of exploratory research and into something that’s a product,” says Grauman. “That work could even be inspired by this project.”
The biggest previous data set of first-person video consists of 100 hours of footage of people in the kitchen. The Ego4D data set consists of 3,025 hours of video recorded by 855 people in 73 different locations across nine countries (US, UK, India, Japan, Italy, Singapore, Saudi Arabia, Colombia, and Rwanda).
The participants had different ages and backgrounds; some were recruited for their visually interesting occupations, such as bakers, mechanics, carpenters, and landscapers.
Previous data sets typically consisted of semi-scripted video clips only a few seconds long. For Ego4D, participants wore head-mounted cameras for up to 10 hours at a time and captured first-person video of unscripted daily activities, including walking along a street, reading, doing laundry, shopping, playing with pets, playing board games, and interacting with other people. Some of the footage also includes audio, data about where the participants’ gaze was focused, and multiple perspectives on the same scene. It’s the first data set of its kind, says Ryoo.
This NASA spacecraft is on its way to Jupiter’s mysterious asteroid swarms
Lucy will take black-and-white and color images, and use a diamond beam splitter to analyze the far-infrared light emitted by the asteroids, taking their temperature and mapping their surfaces. It will also collect other measurements as it flies by. This data could help scientists understand how the planets may have formed.
Sarah Dodson-Robinson, an assistant professor of physics and astronomy at the University of Delaware, says Lucy could offer a definitive timeline of not only when the planets originally formed, but where.
“If you can nail down when the Trojan asteroids formed, then you have some information about when did Jupiter form, and can start asking questions like ‘Where did Jupiter go in the solar system?’” she says. “Because it wasn’t always where it is now. It’s moved around.”
And to determine the asteroids’ ages, the spacecraft will search for surface craters that may be no bigger than a football field.
“[The Trojans] haven’t had nearly as much colliding and breaking as some of the other asteroids that are nearer to us,” says Dodson-Robinson. “We’re potentially getting a look at some of these asteroids like they were shortly after they formed.”
On its 4-billion-mile journey, Lucy will receive three gravity assists from Earth, which will involve using the planet’s gravitational force to change the spacecraft’s trajectory without depleting its resources. Coralie Adam, deputy navigation team chief for the Lucy mission, says each push will increase the spacecraft’s velocity by anywhere from 200 miles per hour to over 11,000 mph.
“If not for this Earth gravity assist, it would take five times the amount of fuel—or three metric tons—to reach Lucy’s target, which would make the mission unfeasible,” said Adam during an engineering media briefing held on October 14.
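The fuel savings Adam describes follow from the exponential character of the Tsiolkovsky rocket equation: every extra meter per second of delta-v bought with propellant multiplies the fuel bill. A minimal sketch, using illustrative numbers rather than Lucy’s actual specifications:

```python
import math

def propellant_mass(dry_mass_kg, delta_v_ms, isp_s, g0=9.80665):
    """Tsiolkovsky rocket equation: propellant needed for a given delta-v."""
    ve = isp_s * g0  # effective exhaust velocity (m/s)
    return dry_mass_kg * (math.exp(delta_v_ms / ve) - 1)

# Hypothetical numbers: a 1,000 kg dry spacecraft with a 230 s thruster.
# Buying all the delta-v with fuel vs. getting most of it from flybys:
all_fuel = propellant_mass(1000, 5000, 230)
with_assists = propellant_mass(1000, 1000, 230)
print(f"all fuel: {all_fuel:.0f} kg, with gravity assists: {with_assists:.0f} kg")
```

Because the required propellant grows exponentially with delta-v, offloading even part of the trajectory change onto planetary flybys cuts the fuel mass by far more than a proportional amount.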
Lucy’s mission is slated to end in 2033, but some NASA officials already feel confident that the spacecraft will last far longer. “There will be a good amount of fuel left onboard,” said Adam. “After the final encounter with the binary asteroids, as long as the spacecraft is healthy, we plan to propose to NASA to do an extended mission and explore more Trojans.”
Reimagining our pandemic problems with the mindset of an engineer
The last 20 months turned seemingly everyone into an amateur epidemiologist and statistician. Meanwhile, a group of bona fide epidemiologists and statisticians came to believe that pandemic problems might be more effectively solved by adopting the mindset of an engineer: that is, focusing on pragmatic problem-solving with an iterative, adaptive strategy to make things work.
In a recent essay, “Accounting for uncertainty during a pandemic,” the researchers reflect on their roles during a public health emergency and on how they could be better prepared for the next crisis. The answer, they write, may lie in reimagining epidemiology with more of an engineering perspective and less of a “pure science” perspective.
Epidemiological research informs public health policy, with its inherently applied mandate of prevention and protection. But the right balance between pure research results and pragmatic solutions proved alarmingly elusive during the pandemic.
We have to make practical decisions, so how much does the uncertainty really matter?
“I always imagined that in this kind of emergency, epidemiologists would be useful people,” Jon Zelner, a coauthor of the essay, says. “But our role has been more complex and more poorly defined than I had expected at the outset of the pandemic.” An infectious disease modeler and social epidemiologist at the University of Michigan, Zelner witnessed an “insane proliferation” of research papers, “many with very little thought about what any of it really meant in terms of having a positive impact.”
“There were a number of missed opportunities,” Zelner says—caused by missing links between the ideas and tools epidemiologists proposed and the world they were meant to help.
Giving up on certainty
Coauthor Andrew Gelman, a statistician and political scientist at Columbia University, set out “the bigger picture” in the essay’s introduction. He likened the pandemic’s outbreak of amateur epidemiologists to the way war makes every citizen into an amateur geographer and tactician: “Instead of maps with colored pins, we have charts of exposure and death counts; people on the street argue about infection fatality rates and herd immunity the way they might have debated wartime strategies and alliances in the past.”
And along with all the data and public discourse—Are masks still necessary? How long will vaccine protection last?—came the barrage of uncertainty.
In trying to understand what just happened and what went wrong, the researchers (who also included Ruth Etzioni at the University of Washington and Julien Riou at the University of Bern) conducted something of a reenactment. They examined the tools used to tackle challenges such as estimating the rate of transmission from person to person and the number of cases circulating in a population at any given time. They assessed everything from data collection (the quality of data and its interpretation were arguably the biggest challenges of the pandemic) to model design to statistical analysis, as well as communication, decision-making, and trust. “Uncertainty is present at each step,” they wrote.
And yet, Gelman says, the analysis still “doesn’t quite express enough of the confusion I went through during those early months.”
One tactic against all the uncertainty is statistics. Gelman thinks of statistics as “mathematical engineering”—methods and tools that are as much about measurement as discovery. The statistical sciences attempt to illuminate what’s going on in the world, with a spotlight on variation and uncertainty. When new evidence arrives, it should generate an iterative process that gradually refines previous knowledge and hones certainty.
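That iterative process has a standard statistical form. A toy sketch of Bayesian updating, using entirely hypothetical batch counts, shows the estimate tightening as each new batch of evidence arrives:

```python
# Toy Bayesian updating: estimating an unknown proportion (say, the fraction
# of contacts that lead to infection) as evidence arrives in batches.
# All numbers are hypothetical, for illustration only.

def update(alpha, beta, successes, failures):
    """Beta-Binomial conjugate update: prior Beta(alpha, beta) -> posterior."""
    return alpha + successes, beta + failures

alpha, beta = 1, 1                        # flat prior: total uncertainty
batches = [(3, 17), (7, 43), (12, 88)]    # (infections, non-infections)
for s, f in batches:
    alpha, beta = update(alpha, beta, s, f)
    mean = alpha / (alpha + beta)
    # posterior variance shrinks with every batch: certainty is earned gradually
    var = alpha * beta / ((alpha + beta) ** 2 * (alpha + beta + 1))
    print(f"estimate {mean:.3f}, variance {var:.6f}")
```

Each pass refines the previous answer rather than replacing it, which is the iterative refinement the statistical sciences aim for.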
Good science is humble and capable of refining itself in the face of uncertainty.
Susan Holmes, a statistician at Stanford who was not involved in this research, also sees parallels with the engineering mindset. “An engineer is always updating their picture,” she says—revising as new data and tools become available. In tackling a problem, an engineer offers a first-order approximation (blurry), then a second-order approximation (more focused), and so on.
Gelman, however, has previously warned that statistical science can be deployed as a machine for “laundering uncertainty”—deliberately or not, crappy (uncertain) data are rolled together and made to seem convincing (certain). Statistics wielded against uncertainties “are all too often sold as a sort of alchemy that will transform these uncertainties into certainty.”
We witnessed this during the pandemic. Drowning in upheaval and unknowns, epidemiologists and statisticians—amateur and expert alike—grasped for something solid as they tried to stay afloat. But as Gelman points out, wanting certainty during a pandemic is inappropriate and unrealistic. “Premature certainty has been part of the challenge of decisions in the pandemic,” he says. “This jumping around between uncertainty and certainty has caused a lot of problems.”
Letting go of the desire for certainty can be liberating, he says. And this, in part, is where the engineering perspective comes in.
A tinkering mindset
For Seth Guikema, co-director of the Center for Risk Analysis and Informed Decision Engineering at the University of Michigan (and a collaborator of Zelner’s on other projects), a key aspect of the engineering approach is diving into the uncertainty, analyzing the mess, and then taking a step back, with the perspective “We have to make practical decisions, so how much does the uncertainty really matter?” Because if there’s a lot of uncertainty—and if the uncertainty changes what the optimal decisions are, or even what the good decisions are—then that’s important to know, says Guikema. “But if it doesn’t really affect what my best decisions are, then it’s less critical.”
For instance, consider increasing SARS-CoV-2 vaccination coverage across the population. Even if there is some uncertainty about exactly how many cases or deaths vaccination will prevent, it is highly likely to reduce both, with few adverse effects. That alone is motivation enough to decide that a large-scale vaccination program is a good idea.
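Guikema’s test (does the uncertainty actually change the best decision?) can be sketched as a simple sensitivity check. The numbers below are hypothetical, chosen only to illustrate the logic:

```python
# Sensitivity check: evaluate the decision across the whole uncertainty band.
# If the sign of the net benefit never flips, the decision is robust to the
# uncertainty. All figures are hypothetical.

def net_benefit(deaths_averted_per_1k, adverse_events_per_1k=0.01,
                adverse_weight=0.05):
    """Net benefit per 1,000 vaccinations, weighting adverse events lightly."""
    return deaths_averted_per_1k - adverse_weight * adverse_events_per_1k

low_estimate, high_estimate = 0.5, 5.0  # wide uncertainty band on deaths averted
robust = net_benefit(low_estimate) > 0 and net_benefit(high_estimate) > 0
print("decision robust to uncertainty:", robust)
```

When the conclusion holds at both ends of the band, the remaining uncertainty is real but not decision-relevant, which is exactly the distinction Guikema draws.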
An engineer is always updating their picture.
Engineers, Holmes points out, are also very good at breaking problems down into critical pieces, applying carefully selected tools, and optimizing for solutions under constraints. With a team of engineers building a bridge, there is a specialist in cement and a specialist in steel, a wind engineer and a structural engineer. “All the different specialties work together,” she says.
For Zelner, the notion of epidemiology as an engineering discipline is something he picked up from his father, a mechanical engineer who started his own company designing health-care facilities. Drawing on a childhood full of building and fixing things, his engineering mindset involves tinkering—refining a transmission model, for instance, in response to a moving target.
“Often these problems require iterative solutions, where you’re making changes in response to what does or doesn’t work,” he says. “You continue to update what you’re doing as more data comes in and you see the successes and failures of your approach. To me, that’s very different—and better suited to the complex, non-stationary problems that define public health—than the kind of static one-and-done image a lot of people have of academic science, where you have a big idea, test it, and your result is preserved in amber for all time.”