
Securing the energy revolution and IoT future



In early 2021, Americans living on the East Coast got a sharp lesson in the growing importance of cybersecurity in the energy industry. A ransomware attack hit the company that operates the Colonial Pipeline—the major infrastructure artery that carries almost half of all liquid fuels from the Gulf Coast to the eastern United States. Knowing that at least some of its computer systems had been compromised, and unable to be certain how far the intrusion reached, the company resorted to a blunt solution: shutting down the whole pipeline.

Leo Simonovich is vice president and global head of industrial cyber and digital security at Siemens Energy.

The interruption of fuel delivery had huge consequences. Fuel prices immediately spiked. The President of the United States got involved, trying to assure panicked consumers and businesses that fuel would become available soon. Five days and untold millions of dollars in economic damage later, the company paid a $4.4 million ransom and restored its operations.

It would be a mistake to see this incident as the story of a single pipeline. Across the energy sector, more and more of the physical equipment that makes and moves fuel and electricity across the country and around the world relies on digitally controlled, networked equipment. Systems designed and engineered for analog operations have been retrofitted. The new wave of low-emissions technologies—from solar to wind to combined-cycle turbines—is inherently digital, using automated controls to squeeze every efficiency from their respective energy sources.

Meanwhile, the covid-19 crisis has accelerated a separate trend toward remote operation and ever more sophisticated automation. A huge number of workers have moved from reading dials at a plant to reading screens from their couches. Powerful tools for changing how power is made and routed can now be wielded by anyone who knows how to log in.

These changes are great news—the world gets more energy, lower emissions, and lower prices. But these changes also highlight the kinds of vulnerabilities that brought the Colonial Pipeline to an abrupt halt. The same tools that make legitimate energy-sector workers more powerful become dangerous when hijacked by hackers. For example, hard-to-replace equipment can be given commands to shake itself to bits, putting chunks of a national grid out of commission for months at a stretch.

For many nation-states, the ability to push a button and sow chaos in a rival state’s economy is highly desirable. And the more energy infrastructure becomes hyperconnected and digitally managed, the more targets offer exactly that opportunity. It’s not surprising, then, that an increasing share of cyberattacks seen in the energy sector have shifted from targeting information technologies (IT) to targeting operating technologies (OT)—the equipment that directly controls physical plant operations. 

To stay on top of the challenge, chief information security officers (CISOs) and their security operations centers (SOCs) will have to update their approaches. Defending operating technologies calls for different strategies—and a distinct knowledge base—than defending information technologies. For starters, defenders need to understand the operating status and tolerances of their assets—a command to push steam through a turbine works well when the turbine is warm, but can break it when the turbine is cold. Identical commands could be legitimate or malicious, depending on context.
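As a minimal sketch of what such context-dependent screening might look like, consider the snippet below. The command name, temperature threshold, and data model are hypothetical illustrations, not anything drawn from a real control system:

```python
from dataclasses import dataclass

# Illustrative threshold: assume a turbine must be warmed above this
# temperature before full steam admission is safe. Real limits come
# from the manufacturer's operating envelope.
MIN_SAFE_STEAM_TEMP_C = 400.0

@dataclass
class TurbineState:
    """Snapshot of operational context for one asset."""
    metal_temp_c: float
    rpm: float

def classify_command(command: str, state: TurbineState) -> str:
    """Flag an identical command as benign or suspicious based on context."""
    if command == "ADMIT_FULL_STEAM":
        if state.metal_temp_c >= MIN_SAFE_STEAM_TEMP_C:
            return "benign: turbine is warm, full steam admission is routine"
        # The same command against a cold turbine risks thermal shock.
        return "suspicious: full steam commanded while turbine is cold"
    return "unclassified: no contextual rule for this command"

# Identical command, opposite verdicts, depending on operating context.
print(classify_command("ADMIT_FULL_STEAM", TurbineState(520.0, 3000.0)))
print(classify_command("ADMIT_FULL_STEAM", TurbineState(80.0, 0.0)))
```

The same command string yields opposite verdicts; only the operating context separates routine dispatch from sabotage.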

Even collecting the contextual data needed for threat monitoring and detection is a logistical and technical nightmare. Typical energy systems are composed of equipment from several manufacturers, installed and retrofitted over decades. Only the most modern layers were built with cybersecurity as a design constraint, and almost none of the machine languages used were ever meant to be compatible.

For most companies, the current state of cybersecurity maturity leaves much to be desired. Near-omniscient views into IT systems are paired with big OT blind spots. Data lakes swell with carefully collected outputs that can’t be combined into a coherent, comprehensive picture of operational status. Analysts burn out under alert fatigue while trying to manually sort benign alerts from consequential events. Many companies can’t even produce a comprehensive list of all the digital assets legitimately connected to their networks.

In other words, the ongoing energy revolution is a dream for efficiency—and a nightmare for security.

Securing the energy revolution calls for new solutions equally capable of identifying and acting on threats from both physical and digital worlds. Security operations centers will need to bring together IT and OT information flows, creating a unified threat stream. Given the scale of data flows, automation will need to play a role in applying operational knowledge to alert generation—is this command consistent with business as usual, or does context show it’s suspicious? Analysts will need broad, deep access to contextual information. And defenses will need to grow and adapt as threats evolve and businesses add or retire assets.
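At its core, a unified threat stream is a time-ordered merge of normalized event feeds from both domains. Here is one way that fusion could be sketched; the event schema and the sample feeds are invented for illustration:

```python
import heapq
from datetime import datetime
from typing import Iterable, Iterator, Tuple

# Hypothetical normalized event shape: (timestamp, domain, asset, detail).
Event = Tuple[datetime, str, str, str]

def unified_threat_stream(it_events: Iterable[Event],
                          ot_events: Iterable[Event]) -> Iterator[Event]:
    """Merge time-sorted IT and OT feeds into one chronological stream,
    so analysts and downstream alerting logic see both domains in a
    single timeline."""
    yield from heapq.merge(it_events, ot_events, key=lambda e: e[0])

it_feed = [
    (datetime(2021, 5, 7, 4, 55), "IT", "vpn-gw-1", "login from unrecognized host"),
]
ot_feed = [
    (datetime(2021, 5, 7, 5, 2), "OT", "pump-station-7", "setpoint changed"),
]

for event in unified_threat_stream(it_feed, ot_feed):
    print(event)
```

Seen side by side, an odd VPN login followed minutes later by an OT setpoint change tells a story that neither feed reveals alone.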

This month, Siemens Energy unveiled a monitoring and detection platform aimed at resolving the core technical and capability challenges for CISOs tasked with defending critical infrastructure. Siemens Energy engineers have done the legwork needed to automate a unified threat stream, allowing their offering, Eos.ii, to serve as a fusion SOC that’s capable of unleashing the power of artificial intelligence on the challenge of monitoring energy infrastructure.

AI-based solutions answer the dual need for adaptability and persistent vigilance. Machine learning algorithms trawling huge volumes of operational data can learn the expected relationships between variables, recognizing patterns invisible to human eyes and highlighting anomalies for human investigation. Because machine learning can be trained on real-world data, it can learn the unique characteristics of each production site, and can be iteratively trained to distinguish benign and consequential anomalies. Analysts can then tune alerts to watch for specific threats or ignore known sources of noise.
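A minimal sketch of that idea follows, using synthetic stand-in data and a deliberately simple linear model rather than any production algorithm:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for historical sensor data: under normal operation,
# power out tracks steam in with a roughly fixed conversion plus noise.
steam_in = rng.uniform(50.0, 100.0, 1000)
power_out = 0.8 * steam_in + rng.normal(0.0, 1.0, 1000)

# "Training" on real-world data: learn the expected relationship.
slope, intercept = np.polyfit(steam_in, power_out, 1)
residuals = power_out - (slope * steam_in + intercept)
threshold = 4.0 * residuals.std()   # tolerance would be tuned per site

def is_anomalous(steam: float, power: float) -> bool:
    """Flag readings that break the learned steam-to-power relationship."""
    return abs(power - (slope * steam + intercept)) > threshold

print(is_anomalous(80.0, 64.0))   # consistent with the learned model
print(is_anomalous(80.0, 50.0))   # inconsistent: flag for investigation
```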

Extending monitoring and detection into the OT space makes it harder for attackers to hide—even when unique, zero-day attacks are deployed. In addition to examining traditional signals like signature-based detections or network traffic spikes, analysts can now observe the effects that new inputs have on real-world equipment. Cleverly disguised malware would still raise red flags by creating operational anomalies. In practice, analysts have found the Eos.ii detection engine sensitive enough to predictively identify maintenance needs—for example, when a bearing begins to wear out and the ratio of steam in to power out begins to drift.
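That kind of drift can be approximated with nothing more than window statistics. The sketch below uses invented numbers and thresholds; a real deployment would tune both per site:

```python
import numpy as np

def ratio_drifting(steam_in: np.ndarray, power_out: np.ndarray,
                   baseline_n: int = 30, recent_n: int = 7,
                   tolerance: float = 0.03) -> bool:
    """Flag a sustained shift in the steam-to-power ratio by comparing a
    recent window against a longer baseline. A slow, persistent drift
    (rather than a one-off spike) is the signature of mechanical wear."""
    ratio = steam_in / power_out
    baseline = ratio[:baseline_n].mean()
    recent = ratio[-recent_n:].mean()
    return abs(recent - baseline) / baseline > tolerance

# A healthy unit holds its ratio; a worn bearing needs more steam per MW.
steam = np.full(60, 100.0)
power = np.concatenate([np.full(50, 80.0), np.full(10, 75.0)])
print(ratio_drifting(steam, power))   # True: efficiency has degraded
```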

Done right, monitoring and detection that spans both IT and OT should leave intruders exposed. Analysts investigating alerts can trace user histories to determine the source of anomalies, and then roll forward to see what else was changed in a similar timeframe or by the same user. For energy companies, increased precision translates to dramatically reduced risk: if they can determine the scope of an intrusion and identify which specific systems were compromised, they gain options for surgical responses that fix the problem with minimal collateral damage—say, shutting down a single branch office and two pumping stations instead of a whole pipeline.
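That investigative step might look like the following sketch, which assumes a hypothetical unified log schema (dicts with user, time, and action fields):

```python
from datetime import datetime, timedelta

def trace_user_activity(events, user, anomaly_time, window_hours=24):
    """Pull everything a suspect account touched around an anomaly, so
    responders can scope the intrusion precisely instead of taking the
    whole operation offline."""
    window = timedelta(hours=window_hours)
    return sorted(
        (e for e in events
         if e["user"] == user and abs(e["time"] - anomaly_time) <= window),
        key=lambda e: e["time"],
    )

log = [
    {"user": "oper07", "time": datetime(2021, 5, 7, 4, 50), "action": "login"},
    {"user": "oper07", "time": datetime(2021, 5, 7, 5, 2), "action": "setpoint change, pump-station-7"},
    {"user": "oper12", "time": datetime(2021, 5, 7, 5, 5), "action": "login"},
]
for entry in trace_user_activity(log, "oper07", datetime(2021, 5, 7, 5, 0)):
    print(entry["time"], entry["action"])
```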

As energy systems continue their trend toward hyperconnectivity and pervasive digital controls, one thing is clear: a given company's ability to provide reliable service will depend more and more on its ability to create and sustain strong, precise cyber defenses. AI-based monitoring and detection offers a promising start.

To learn more about Siemens Energy’s new AI-based monitoring and detection platform, check out their recent white paper on Eos.ii.

Learn more about Siemens Energy cybersecurity at Siemens Energy Cybersecurity.

This content was produced by Siemens Energy. It was not written by MIT Technology Review’s editorial staff.


Facebook wants machines to see the world through our eyes



For the last two years, Facebook AI Research (FAIR) has worked with 13 universities around the world to assemble Ego4D, the largest-ever data set of first-person video—specifically to train deep-learning image-recognition models. AIs trained on the data set will be better at controlling robots that interact with people, or interpreting images from smart glasses. “Machines will be able to help us in our daily lives only if they really understand the world through our eyes,” says Kristen Grauman at FAIR, who leads the project.

Such tech could support people who need assistance around the home, or guide people in tasks they are learning to complete. “The video in this data set is much closer to how humans observe the world,” says Michael Ryoo, a computer vision researcher at Google Brain and Stony Brook University in New York, who is not involved in Ego4D.

But the potential misuses are clear and worrying. The research is funded by Facebook, a social media giant that has recently been accused in the US Senate of putting profits over people’s well-being—as corroborated by MIT Technology Review’s own investigations.

The business model of Facebook, and other Big Tech companies, is to wring as much data as possible from people’s online behavior and sell it to advertisers. The AI outlined in the project could extend that reach to people’s everyday offline behavior, revealing what objects are around your home, what activities you enjoy, who you spend time with, and even where your gaze lingers—an unprecedented degree of personal information.

“There’s work on privacy that needs to be done as you take this out of the world of exploratory research and into something that’s a product,” says Grauman. “That work could even be inspired by this project.”


The biggest previous data set of first-person video consists of 100 hours of footage of people in the kitchen. The Ego4D data set consists of 3,025 hours of video recorded by 855 people in 73 different locations across nine countries (US, UK, India, Japan, Italy, Singapore, Saudi Arabia, Colombia, and Rwanda).

The participants had different ages and backgrounds; some were recruited for their visually interesting occupations, such as bakers, mechanics, carpenters, and landscapers.

Previous data sets typically consisted of semi-scripted video clips only a few seconds long. For Ego4D, participants wore head-mounted cameras for up to 10 hours at a time and captured first-person video of unscripted daily activities, including walking along a street, reading, doing laundry, shopping, playing with pets, playing board games, and interacting with other people. Some of the footage also includes audio, data about where the participants’ gaze was focused, and multiple perspectives on the same scene. It’s the first data set of its kind, says Ryoo.



This NASA spacecraft is on its way to Jupiter’s mysterious asteroid swarms



Lucy will take black-and-white and color images, and use a diamond beam splitter to shine far-infrared light at the asteroids to take their temperature and make maps of their surface. It will also collect other measurements as it flies by. This data could help scientists understand how the planets may have formed.

Sarah Dodson-Robinson, an assistant professor of physics and astronomy at the University of Delaware, says Lucy could offer a definitive timeline not only for when the planets originally formed, but also for where.

“If you can nail down when the Trojan asteroids formed, then you have some information about when did Jupiter form, and can start asking questions like ‘Where did Jupiter go in the solar system?’” she says. “Because it wasn’t always where it is now. It’s moved around.”

And to determine the asteroids’ ages, the spacecraft will search for surface craters that may be no bigger than a football field. 

“[The Trojans] haven’t had nearly as much colliding and breaking as some of the other asteroids that are nearer to us,” says Dodson-Robinson. “We’re potentially getting a look at some of these asteroids like they were shortly after they formed.”

On its 4-billion-mile journey, Lucy will receive three gravity assists from Earth, which will involve using the planet’s gravitational force to change the spacecraft’s trajectory without depleting its resources. Coralie Adam, deputy navigation team chief for the Lucy mission, says each push will increase the spacecraft’s velocity from 200 to more than 11,000 miles per hour.

“If not for this Earth gravity assist, it would take five times the amount of fuel—or three metric tons—to reach Lucy’s target, which would make the mission unfeasible,” said Adam during an engineering media briefing held on October 14.
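The mechanics are easy to sketch in two dimensions: relative to the planet, a flyby conserves the spacecraft’s speed and merely bends its path, but viewed from the sun, that bend can translate into a large speed change at no fuel cost. The numbers below are illustrative, not Lucy’s actual flight parameters:

```python
import numpy as np

def flyby_speed_change(v_craft, v_planet, turn_deg):
    """Toy 2-D gravity assist: in the planet's frame the flyby conserves
    speed but rotates the velocity vector by turn_deg; transforming back
    to the sun's frame yields a net speed change."""
    theta = np.radians(turn_deg)
    rot = np.array([[np.cos(theta), -np.sin(theta)],
                    [np.sin(theta),  np.cos(theta)]])
    v_rel = np.asarray(v_craft) - np.asarray(v_planet)
    v_out = rot @ v_rel + np.asarray(v_planet)
    return np.linalg.norm(v_out) - np.linalg.norm(v_craft)

# Illustrative values (km/s). The flyby geometry (sign of the turn)
# decides whether the craft gains or loses speed relative to the sun.
print(flyby_speed_change([28.0, 2.0], [29.8, 0.0], -60.0))  # ~ +2.7 km/s
```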

Lucy’s mission is slated to end in 2033, but some NASA officials already feel confident that the spacecraft will last far longer. “There will be a good amount of fuel left onboard,” said Adam. “After the final encounter with the binary asteroids, as long as the spacecraft is healthy, we plan to propose to NASA to do an extended mission and explore more Trojans.”



Reimagining our pandemic problems with the mindset of an engineer



The last 20 months turned seemingly everyone into an amateur epidemiologist and statistician. Meanwhile, a group of bona fide epidemiologists and statisticians came to believe that pandemic problems might be more effectively solved by adopting the mindset of an engineer: that is, focusing on pragmatic problem-solving with an iterative, adaptive strategy to make things work.

In a recent essay, “Accounting for uncertainty during a pandemic,” the researchers reflect on their roles during a public health emergency and on how they could be better prepared for the next crisis. The answer, they write, may lie in reimagining epidemiology with more of an engineering perspective and less of a “pure science” perspective.

Epidemiological research informs public health policy, an inherently applied mandate of prevention and protection. But the right balance between pure research results and pragmatic solutions proved alarmingly elusive during the pandemic.

“I always imagined that in this kind of emergency, epidemiologists would be useful people,” Jon Zelner, a coauthor of the essay, says. “But our role has been more complex and more poorly defined than I had expected at the outset of the pandemic.” An infectious disease modeler and social epidemiologist at the University of Michigan, Zelner witnessed an “insane proliferation” of research papers, “many with very little thought about what any of it really meant in terms of having a positive impact.”

“There were a number of missed opportunities,” Zelner says—caused by missing links between the ideas and tools epidemiologists proposed and the world they were meant to help.

Giving up on certainty

Coauthor Andrew Gelman, a statistician and political scientist at Columbia University, set out “the bigger picture” in the essay’s introduction. He likened the pandemic’s outbreak of amateur epidemiologists to the way war makes every citizen into an amateur geographer and tactician: “Instead of maps with colored pins, we have charts of exposure and death counts; people on the street argue about infection fatality rates and herd immunity the way they might have debated wartime strategies and alliances in the past.”

And along with all the data and public discourse—Are masks still necessary? How long will vaccine protection last?—came the barrage of uncertainty.

In trying to understand what just happened and what went wrong, the researchers (who also included Ruth Etzioni at the University of Washington and Julien Riou at the University of Bern) conducted something of a reenactment. They examined the tools used to tackle challenges such as estimating the rate of transmission from person to person and the number of cases circulating in a population at any given time. They assessed everything from data collection (the quality of data and its interpretation were arguably the biggest challenges of the pandemic) to model design to statistical analysis, as well as communication, decision-making, and trust. “Uncertainty is present at each step,” they wrote.
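One of those tools—estimating how fast transmission is growing—can be illustrated with a toy calculation using the textbook exponential-growth approximation R ≈ exp(rT). This is not the authors’ method, and all figures are invented; real estimates are far more careful:

```python
import numpy as np

# Toy estimate of the reproduction number R from daily case counts:
# r is the epidemic growth rate, T the mean generation interval.
cases = np.array([120, 140, 165, 190, 230, 270, 310])  # illustrative counts
days = np.arange(len(cases))

r = np.polyfit(days, np.log(cases), 1)[0]  # growth rate, log-linear fit
T = 5.0                                    # assumed generation interval, days
print(f"growth rate r = {r:.3f}/day, implied R ≈ {np.exp(r * T):.2f}")
```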

And yet, Gelman says, the analysis still “doesn’t quite express enough of the confusion I went through during those early months.”

One tactic against all the uncertainty is statistics. Gelman thinks of statistics as “mathematical engineering”—methods and tools that are as much about measurement as discovery. The statistical sciences attempt to illuminate what’s going on in the world, with a spotlight on variation and uncertainty. When new evidence arrives, it should generate an iterative process that gradually refines previous knowledge and hones certainty.
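Bayesian updating is the canonical version of that iterative process. A minimal sketch, with made-up batches of evidence about an uncertain rate:

```python
from scipy import stats

# Start from a flat Beta(1, 1) prior (near-total ignorance), then fold
# in successive batches of evidence. Each update narrows the 95%
# interval: gradual refinement, not sudden certainty.
alpha, beta = 1.0, 1.0
for positives, total in [(12, 100), (30, 250), (55, 500)]:  # invented data
    alpha += positives
    beta += total - positives
    lo, hi = stats.beta.ppf([0.025, 0.975], alpha, beta)
    mean = alpha / (alpha + beta)
    print(f"estimate {mean:.3f}, 95% interval ({lo:.3f}, {hi:.3f})")
```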

Good science is humble and capable of refining itself in the face of uncertainty.

Marc Lipsitch

Susan Holmes, a statistician at Stanford who was not involved in this research, also sees parallels with the engineering mindset. “An engineer is always updating their picture,” she says—revising as new data and tools become available. In tackling a problem, an engineer offers a first-order approximation (blurry), then a second-order approximation (more focused), and so on.

Gelman, however, has previously warned that statistical science can be deployed as a machine for “laundering uncertainty”—deliberately or not, crappy (uncertain) data are rolled together and made to seem convincing (certain). Statistics wielded against uncertainties “are all too often sold as a sort of alchemy that will transform these uncertainties into certainty.”

We witnessed this during the pandemic. Drowning in upheaval and unknowns, epidemiologists and statisticians—amateur and expert alike—grasped for something solid as they tried to stay afloat. But as Gelman points out, wanting certainty during a pandemic is inappropriate and unrealistic. “Premature certainty has been part of the challenge of decisions in the pandemic,” he says. “This jumping around between uncertainty and certainty has caused a lot of problems.”

Letting go of the desire for certainty can be liberating, he says. And this, in part, is where the engineering perspective comes in.

A tinkering mindset

For Seth Guikema, co-director of the Center for Risk Analysis and Informed Decision Engineering at the University of Michigan (and a collaborator of Zelner’s on other projects), a key aspect of the engineering approach is diving into the uncertainty, analyzing the mess, and then taking a step back, with the perspective “We have to make practical decisions, so how much does the uncertainty really matter?” Because if there’s a lot of uncertainty—and if the uncertainty changes what the optimal decisions are, or even what the good decisions are—then that’s important to know, says Guikema. “But if it doesn’t really affect what my best decisions are, then it’s less critical.”

For instance, increasing SARS-CoV-2 vaccination coverage across the population is one scenario in which even if there is some uncertainty regarding exactly how many cases or deaths vaccination will prevent, the fact that it is highly likely to decrease both, with few adverse effects, is motivation enough to decide that a large-scale vaccination program is a good idea.
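That reasoning can be made concrete with a simple Monte Carlo sketch. The ranges below are hypothetical, chosen purely for illustration; the point is that the decision holds everywhere in the sampled uncertainty:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 10_000

# Sample the unknowns instead of fixing them, then ask whether the
# decision flips anywhere in the sampled range.
averted_frac = rng.uniform(0.5, 0.95, n)      # fraction of deaths averted
deaths_no_vax = rng.uniform(100.0, 400.0, n)  # per million, no program
adverse = rng.uniform(0.1, 2.0, n)            # serious events per million

net_benefit = averted_frac * deaths_no_vax - adverse
print(f"vaccination beneficial in {(net_benefit > 0).mean():.1%} of scenarios")
```

Here the magnitude of the benefit is uncertain, but its sign is not, so the residual uncertainty does not change the best decision.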

Engineers, Holmes points out, are also very good at breaking problems down into critical pieces, applying carefully selected tools, and optimizing for solutions under constraints. With a team of engineers building a bridge, there is a specialist in cement and a specialist in steel, a wind engineer and a structural engineer. “All the different specialties work together,” she says.

For Zelner, the notion of epidemiology as an engineering discipline is something he picked up from his father, a mechanical engineer who started his own company designing health-care facilities. Drawing on a childhood full of building and fixing things, his engineering mindset involves tinkering—refining a transmission model, for instance, in response to a moving target.

“Often these problems require iterative solutions, where you’re making changes in response to what does or doesn’t work,” he says. “You continue to update what you’re doing as more data comes in and you see the successes and failures of your approach. To me, that’s very different—and better suited to the complex, non-stationary problems that define public health—than the kind of static one-and-done image a lot of people have of academic science, where you have a big idea, test it, and your result is preserved in amber for all time.” 

