Lighting the Societal Forest

Our increasingly connected world makes us more vulnerable to disruption. But if we play it right, it can also make us stronger.

We must do our utmost to stop COVID-19, the healthcare worker tells you. It’s the usual speech: masks are a must to keep the virus at bay. Keep your distance from people; avoid unnecessary contact. Don’t be foolhardy after vaccines; they don’t make you invincible — but you should still take them, because something is better than nothing.

Sanitizers are provided at every corner, to keep you clean at all times. Mask dispensers are available whenever you need them; mask dustbins for when you don’t. Alternate seats are marked as such, so you remember not to sit. Payments have gone digital; temperature-scanners contactless. Want to know if you’ve come in contact with a case? There’s an app for that.

That’s cool, you say, but what about those people who’ve actually caught the virus? “Oh, we don’t really know about them,” the healthcare worker tells you. “Our job is only to prevent disease, not to help those who’ve caught it.”

Does that sound a bit ridiculous? It does to me too. But this is exactly what’s happening in another department: cybersecurity.


In December 2020, Russian hackers managed to infiltrate the computers of several U.S. government agencies. It's not clear exactly what information they accessed, but the breach affected a number of departments, including State, Homeland Security, and Health. The hackers also compromised the systems of a top cybersecurity firm and of large IT companies like Microsoft.

“The magnitude of this national security breach is hard to overstate.” — Thomas P. Bossert, former homeland security adviser to President Trump, in a New York Times op-ed.

Does this security breach mean all those computer systems were insecure? Not necessarily. They had their firewalls on: the metaphorical masks of the software world, filtering what network traffic comes in and what goes out. But the companies also used SolarWinds Orion, a piece of software that monitors servers and makes sure they're doing okay. They were told to make firewall exceptions for it, since firewalls could interfere with the software's performance. And then somebody managed to breach SolarWinds Orion itself, giving it a "get past the mask free" ticket of sorts.

No matter how secure your firewall is, the moment you make an exception, malicious actors can take advantage of it.
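
To make the idea concrete, here's a toy sketch in Python (not a real firewall, and the names are made up) of how a single "trusted" exception quietly overrides every other rule:

```python
# Toy illustration (not a real firewall): why an allow-list exception
# undermines even a strict default-deny policy.

BLOCKED_PORTS = {23, 3389}           # ports we normally refuse
TRUSTED_SOURCES = {"orion-monitor"}  # exception added for the monitoring tool

def allow_packet(source: str, port: int) -> bool:
    # The exception is checked first: traffic from the "trusted" monitoring
    # software skips every other rule -- exactly the hole an attacker walks
    # through once they control that software.
    if source in TRUSTED_SOURCES:
        return True
    return port not in BLOCKED_PORTS

print(allow_packet("orion-monitor", 3389))  # True: the exception wins
print(allow_packet("unknown-host", 3389))   # False: the normal rule applies
```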


After the SolarWinds incident, cybersecurity experts have been urging organizations to adopt zero-trust policies. There's no single "zero trust" setup; it's an overall strategy rather than a specific product.

Most private company networks use a "castle and moat" model. Only employees are allowed to sign on, and they need to enter passwords or pass other security checks to be let in: that's the moat. But once they're in, they're trusted, and they can go wherever they want and load whatever they need to: that's the castle.

But what if there's a security breach and an enemy manages to sneak past the moat? They can do endless harm, because they have unfettered access to the castle. That's why zero trust assumes a breach has already happened and treats everyone as a potential enemy. Instead of posting sentries at the castle gates, it posts them at the door of every room.
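
A rough sketch of the difference, with made-up users and resources, might look like this: the castle-and-moat check happens once at the gate, while zero trust re-checks every single request.

```python
# A minimal sketch (hypothetical users and resources) contrasting the models.

LOGGED_IN = {"alice"}                      # castle-and-moat: one check at the gate
PERMISSIONS = {("alice", "payroll-db")}    # zero trust: explicit per-resource grants

def castle_and_moat(user: str, resource: str) -> bool:
    # Once you're past the moat, every room in the castle is open.
    return user in LOGGED_IN

def zero_trust(user: str, resource: str) -> bool:
    # Every request is re-verified: being "inside" earns you nothing extra.
    return user in LOGGED_IN and (user, resource) in PERMISSIONS

print(castle_and_moat("alice", "hr-records"))  # True -- even without a grant
print(zero_trust("alice", "hr-records"))       # False -- a sentry at every door
print(zero_trust("alice", "payroll-db"))       # True -- an explicit grant exists
```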

But does this solve the problem? Take the COVID-19 example again. If I told you that, rather than monitoring temperatures for COVID, I would simply block non-essential people from entering, you might say that's a better policy for the time being. But if I then said I still don't care what happens after a person or group of people catch COVID, you would say that's still nuts!

After all, shouldn't your top priority be keeping people healthy, even if and after they've caught COVID? The same goes for cybersecurity. Comprehensive solutions should also account for building societies that are resilient to cyber-related disruptions.


The Texas power and water outages last month highlight how intricately connected networks can fail. Almost half the electricity in Texas comes from natural gas, and half of the natural gas pipelines were unable to deliver because of the extreme cold. Residents feared their water pipes would freeze without electric heating, so they left their taps running to keep the water flowing. This, combined with frozen and burst pipes, significantly reduced water pressure across the system. Boil-water advisories were issued, and people were left without safe drinking water.

In another incident this year, a hacker tried to poison a Florida town's water supply by accessing the water treatment plant's computer systems and raising sodium hydroxide (lye) levels a hundredfold. Luckily a supervisor saw the change happen on their screen and reversed it just in time, before the contaminated water entered the supply.

Yet another scenario involved a seemingly minor vendor disruption, this time in air carrier networks. On April 1st, 2019, an outage in the IT systems of a company most of us had never heard of caused delays across the networks of Delta, United, JetBlue, Southwest, and Alaska. The company, Aerodata, determines planes' weight-and-balance data, and its application is the last one to be run before a flight's takeoff. More than 50% of flights use Aerodata, which is probably why a 40-minute outage at this one company affected multiple airlines and thousands of flights through the entire day.

More recently, 10% of world trade was physically blocked by a single container ship, the Ever Given, which got stuck in the Suez Canal.

This shows how exposed global supply chains are to the unexpected weak points embedded in complex systems.


Cybersecurity isn't just about keeping computers safe. Since there's a bit of digital in so many other parts of life, all those systems are vulnerable to cyberattacks too. Connected systems are easy to disrupt, as you can see, because small incidents can have far-reaching consequences. Those consequences become even more far-reaching if you're a malicious person — or a hostile country — and know exactly the right spot to hit.

Cyberattacks leave less of a footprint than wars, but they can still cause immense harm.

This is where our COVID-19 metaphor breaks down. An infected person spreads the virus through close contact, a chain we can trace and interrupt. The same is not true for societal disruptions, where many of the chains of propagation remain unknown.


Perhaps we should think of cybersecurity more as a dark forest: one where everyone and everything is interconnected, but we don’t know exactly how.

What we see is trees upon trees: what we miss is the millions of signals being exchanged through roots and mycorrhizal fungi, each of which influences the whole in tiny, subtle ways.

The problem with the way we go about cybersecurity is that, while we do our best to block hackers, we have no plan for what happens after they get in. We try to hold the invaders out, but once they're in, they're everywhere.

What we need is resilience.


If we have a complete understanding of how the forest works — of all the sociotechnical interconnections that keep it running — then we can better defend ourselves against malicious actors who want to break in. The Florida water incident shows the need for humans in the loop, or at least for algorithms that know a 100x lye level is unsafe for humans and could only be the result of malicious intent.
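
As a sketch of what such an algorithm might look like (the numbers below are illustrative assumptions, not real water-treatment limits), even a simple sanity check on setpoint changes would have flagged the attack:

```python
# A sketch of the "human in the loop" idea as code: refuse setpoint changes
# that fall far outside the normal operating range. Illustrative numbers only.

SAFE_LYE_RANGE_PPM = (50, 200)   # assumed normal dosing range

def validate_setpoint(current_ppm: float, requested_ppm: float) -> bool:
    low, high = SAFE_LYE_RANGE_PPM
    if not (low <= requested_ppm <= high):
        return False                     # outside any plausible operating range
    if requested_ppm > 10 * current_ppm:
        return False                     # a sudden 10x jump needs a human sign-off
    return True

print(validate_setpoint(100, 11_100))  # False: a 100x jump like the Florida incident
print(validate_setpoint(100, 120))     # True: a routine adjustment
```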

So how do we begin to account for sociotechnical interconnections? Increasing connectivity and information is both a boon and a curse: it depends on how we use it.

I would argue that if we play this right, the Information Era has great potential for improved societal resilience and even equality amongst diverse people. True, there are many concerns about privacy and ethical issues, as well as what I’ve spoken about — the potential for malicious actors to take advantage of insecure information channels.

But just like the body fends off bacteria and viruses through biological sensing and defence mechanisms, why can’t our cities become resilient in the presence of increased information?

When Douglas-firs die an unnatural death, their remaining nutrients have been observed to spread to nearby trees, including other species like the ponderosa pine. These nutrients help the other trees grow better, but they also act as a warning signal for them to ramp up their defences. What if we looked out for signals in our societal forests, and used them to prepare for the next round?

Because of increasing connectivity, people are worried about malicious actors taking advantage of energy networks. But we can also use those networks ourselves, to map out the interconnections and societal perturbations from previous disasters. By knowing what happened during the Texas outages, we can put safety systems in place to make sure it doesn't happen again. And take note: it's because of this very same connectivity that we can pinpoint exactly what went wrong, rather than throwing up our hands and saying, "whoops, some random technical glitch happened and we've no clue why."
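
Here's a toy example of what that mapping could look like: a tiny dependency graph of the Texas cascade, built with the networkx library. The edges are my own simplification of what happened, not real infrastructure data, but they show how a graph like this can be queried for downstream effects and choke points.

```python
# A toy dependency map (edges are assumptions, not real infrastructure data)
# showing how past incidents can be turned into a graph we can interrogate.
import networkx as nx

G = nx.DiGraph()
G.add_edges_from([
    ("natural gas", "electricity"),    # Texas: frozen gas supply cut generation
    ("electricity", "home heating"),
    ("electricity", "water pumps"),
    ("home heating", "intact pipes"),
    ("water pumps", "water pressure"),
    ("intact pipes", "water pressure"),
    ("water pressure", "safe drinking water"),
])

# Everything downstream of a natural-gas failure:
print(nx.descendants(G, "natural gas"))

# Which nodes sit on the most failure paths -- candidates for extra protection:
print(nx.betweenness_centrality(G))
```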

Knowing how seemingly minor players like SolarWinds and Aerodata are actually so crucial — for IT monitoring and aeroplane takeoff, respectively — could help develop policies that minimize impacts. Perhaps we could have a backup calculation system to step in when Aerodata fails, or an additional layer of defence to make sure a piece of software isn't doing something it's not supposed to.
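
As a sketch of that backup idea (every name and number here is hypothetical, not how any airline actually works), the pattern is simply: try the vendor, and if it's unreachable, fall back to a coarser in-house estimate rather than grounding everything.

```python
# A sketch of the "backup calculation" idea; all names and numbers below are
# hypothetical, not a real weight-and-balance procedure.

EMPTY_WEIGHT_KG = 42_000   # assumed aircraft empty weight
STANDARD_PAX_KG = 100      # assumed per-passenger allowance, bags included

def vendor_weight_and_balance(flight_id: str) -> float:
    # Stand-in for the external service; here we simulate it being down.
    raise TimeoutError("vendor service unreachable")

def fallback_estimate(passengers: int) -> float:
    # Coarser in-house estimate based on standard weights.
    return EMPTY_WEIGHT_KG + passengers * STANDARD_PAX_KG

def takeoff_weight(flight_id: str, passengers: int) -> float:
    try:
        return vendor_weight_and_balance(flight_id)
    except TimeoutError:
        # Degrade gracefully instead of delaying the flight outright.
        return fallback_estimate(passengers)

print(takeoff_weight("XY123", 150))  # 57000.0 -- the backup steps in
```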


Ultimately, lighting up a dark societal forest could be critical for achieving a resilient society. My research team and collaborators are working in multiple directions to build societal resilience: we combine cross-disciplinary approaches with complex network theory and multidimensional data sets to light up society's interconnected networks. I must say we've had tremendous support from private companies like HERE Technologies and Twitter, as well as governmental organizations such as the Department of Transportation, in providing diverse data.

It's easy to sit back in a chair and say, "Well, eventually our society will become resilient. It might take a few large-scale attacks, and some loss of life, but it's an iterative process." As a responsible citizen, I think we can do much better and speed up the 'evolution' of society to make it more resilient. We can't afford to wait centuries or millennia when we have the power to get there now.

To this end, we can learn through modelling and data from previous perturbations, to keep ourselves prepared for the ones that are yet to come. And, since the world is so connected, we won’t even have to get up from our chairs.