Evil, Metal-Destroying Bubbles Are Hackers' New Best Friends

LAS VEGAS—In 2015, Marina Krotofil gave a memorable and astonishing Black Hat talk about how attackers could (and probably already did) attack factories and large-scale industrial infrastructure. Now the Lead Cyber Security Researcher at Honeywell, Krotofil returned to the security conference this year with a 600-pound water pump and a new weapon: bubbles.

Krotofil began her talk with a sentence perhaps never before uttered in the 20-year history of Black Hat: "Today, the topic will be bubbles." Most of us think of bubbles as harmless, soapy fun that drifts through the air and vanishes at a touch, but from a physics perspective that's not the case.

"We like bubbles...in champagne," said Krotofil. "But bubbles can be evil if applied outside of wine production."

The target for Krotofil's evil bubbles was equally unusual for Black Hat: a 610-pound, $50,000 industrial water pump, which sat next to the stage during her demonstration. In 2015, Krotofil had to demonstrate her attacks on chemical plants using a complex digital model. With the pump, she brought a small piece of the factory with her to the conference.

Pumps, Krotofil explained, are the unsung heroes of modern life, and among the most widely used pieces of equipment on Earth. Some pumps used in industrial processes take 25 to 50 weeks to deliver and are often custom-made. If one becomes damaged, it can result in costly downtime for a factory. For this reason, many larger pumps are constantly monitored.

Here's where the bubbles come in: when bubbles implode in a liquid, they do so at very high velocity and pressure, creating massive shockwaves. "When the pump is cavitating it decreases the performance," Krotofil explained. Prolonged cavitation (the formation and collapse of vapor cavities in a liquid) can even cause premature failure of the pump. All those tiny imploding bubbles create tiny pits in the metal of the pump's impeller, the rotating part that moves fluid through the pump. Eventually, these pits cause the impeller to simply fall apart.
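As a rough way to see when this happens, pump engineers commonly compare the suction head available at the pump inlet against the head the pump requires (its NPSH rating); cavitation sets in when the available figure drops below the required one. The sketch below is a generic illustration of that standard check, not anything from Krotofil's demonstration, and every number in it is made up.

```python
# Illustrative sketch of the Net Positive Suction Head (NPSH) check used to
# predict cavitation in a centrifugal pump. Not from Krotofil's talk.

G = 9.81           # gravitational acceleration, m/s^2
RHO_WATER = 998.0  # density of water at ~20 C, kg/m^3

def npsh_available(p_suction_abs_pa: float,
                   p_vapor_pa: float,
                   velocity_m_s: float,
                   rho: float = RHO_WATER) -> float:
    """NPSH available at the suction flange, in meters of liquid head."""
    static_head = (p_suction_abs_pa - p_vapor_pa) / (rho * G)
    velocity_head = velocity_m_s ** 2 / (2 * G)
    return static_head + velocity_head

# Hypothetical numbers: a pump whose datasheet requires 3.5 m of NPSH,
# fed through a throttled inlet line.
NPSH_REQUIRED = 3.5
p_suction = 30_000.0   # Pa absolute, low because the inlet is starved
p_vapor = 2_340.0      # Pa, vapor pressure of water at ~20 C
flow_velocity = 2.0    # m/s in the suction line

npsh_a = npsh_available(p_suction, p_vapor, flow_velocity)
print(f"NPSH available: {npsh_a:.2f} m")
if npsh_a < NPSH_REQUIRED:
    print("Cavitation expected: bubbles form and implode on the impeller.")
```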

Not Really About Bubbles or Pumps

Remember, this is a hacking conference, not an industrial symposium on pumps. But Krotofil's massive setup was intended to prove a point.

From her previous work, Krotofil found that making a small change in a factory set off a cascade of consequences beyond her control. The problem, she realized, was that while she had control over some critical aspects of the factory, the actual physics of the place made her attacks uncontrollable. Equipment not under attack, some of it not even connected to a network, was affected.

"I now know that equipment can communicate with each other through the physics of the process," she said. "If it is a communication medium let's deliver a payload."

In this case, the communication medium is the water within the pump, and the malicious payload isn't code, but the cavitating bubbles.

More importantly, because pumps are critical to a factory's operation, they are frequently placed under stricter security, perhaps cut off from networks entirely and unreachable from the outside. Valves, on the other hand, are not.

"A valve is a boring target," she said. "I've been in a control room several times, and I've seen broken valves." Factory managers don't worry because valves tend to open when they fail, and the flow can still be managed by increasing or decreasing power to the pumps, she said.

In the setup she had on stage, Krotofil pointed out that the valves and the pump do not communicate electronically. But by seizing control of one or both of the valves on the machine, an attacker could create cavitating bubbles that would sharply reduce the pump's efficiency and, over time, physically destroy it. The result would be similar to the Stuxnet worm, which was reportedly developed by US and Israeli intelligence services to destroy centrifuges at an Iranian nuclear enrichment facility.
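To see why a valve makes such an effective weapon here, consider a toy model: the more a suction-side valve is closed, the larger the pressure drop across it, and the less pressure remains at the pump inlet to keep the water from vaporizing. The snippet below is a back-of-the-envelope illustration of that effect with invented numbers; it is not a model of Krotofil's rig.

```python
# Toy model: throttling a suction valve starves the pump inlet until the
# NPSH margin disappears and cavitation begins. All numbers are made up.

G, RHO = 9.81, 998.0
P_ATM = 101_325.0      # Pa, open supply tank at atmospheric pressure
P_VAPOR = 2_340.0      # Pa, vapor pressure of water at ~20 C
NPSH_REQUIRED = 3.5    # m, from a hypothetical pump datasheet
BASE_LOSS = 5_000.0    # Pa, pressure drop across the valve when fully open

for opening in (1.0, 0.75, 0.5, 0.35, 0.25):
    # Pressure loss across a throttled valve grows roughly with 1/opening^2.
    valve_loss = BASE_LOSS / opening ** 2
    p_inlet = P_ATM - valve_loss
    npsh_a = (p_inlet - P_VAPOR) / (RHO * G)
    status = "cavitating" if npsh_a < NPSH_REQUIRED else "ok"
    print(f"valve {opening:>4.0%} open  NPSHa {npsh_a:5.2f} m  {status}")
```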

The Myth of the All-Powerful Attacker

Krotofil's demonstration was remarkable, showing how less secure devices can be used to affect critical pieces of infrastructure through a novel, physical attack. But in some ways, that wasn't her point. Instead, she went to great lengths to show the incredible difficulties involved in the attack.

For one thing, the attacker would have to spoof the valve position and flow data to prevent a manager from identifying the problem. Adding to the difficulty, the attacker would have no way of knowing how intense the cavitation would need to be, or how long it would have to continue, before the pump physically failed.

Krotofil also demonstrated how, by using the vibration, pressure, and temperature sensors on the pump, a manager could conceivably detect the attack. Not only that, a canny manager could even determine which valve was under attack based on its effect on the pump.
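One way to picture such a defense is a simple cross-check: if sensors mounted on the pump itself report distress while the process values the control system displays still look normal, something is probably being spoofed. The sketch below is a hypothetical illustration of that idea, with made-up thresholds and field names; it is not code from the talk.

```python
# Hypothetical detection sketch: cross-check readings an attacker at the
# valve cannot easily spoof (vibration, suction pressure) against the
# flow value the control system reports.

from dataclasses import dataclass

@dataclass
class PumpReading:
    reported_flow_m3h: float     # flow the control system claims
    vibration_rms_mm_s: float    # accelerometer on the pump casing
    suction_pressure_kpa: float  # absolute pressure at the pump inlet

# Invented alarm thresholds for a healthy baseline.
VIBRATION_LIMIT = 7.0   # mm/s RMS
SUCTION_FLOOR = 40.0    # kPa absolute

def looks_like_cavitation_attack(r: PumpReading) -> bool:
    """Flag readings where the pump is clearly distressed even though the
    reported flow still looks normal, hinting that data may be spoofed."""
    pump_distressed = (r.vibration_rms_mm_s > VIBRATION_LIMIT
                       or r.suction_pressure_kpa < SUCTION_FLOOR)
    flow_looks_normal = 90.0 <= r.reported_flow_m3h <= 110.0
    return pump_distressed and flow_looks_normal

print(looks_like_cavitation_attack(
    PumpReading(reported_flow_m3h=100.0,
                vibration_rms_mm_s=11.2,
                suction_pressure_kpa=31.0)))  # True -> investigate
```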

Far from peddling fear, uncertainty, and doubt (commonly known as FUD), Krotofil focused as much on the attack's limitations as on its novelty and complexity. "We all know what to expect in the future," she said, and for now, this kind of cyber-physical attack is far from an imminent threat. But researchers would do well to heed her work and take note of how, in a complex system, seemingly unconnected components can interact.

This article originally appeared on PCMag.com.