Situational analysis of today’s cyberspace

Today’s cyberspace is an amalgam of extremely large-scale and complex computer and network systems and infrastructures, where classical computing devices coexist with embedded devices (many of them mobile) in a practically seamless manner; devices are highly programmable and dynamic; information processing coexists with real-time control; and computer-caused failures may be physical as well as virtual.

The value of the assets controlled by these converged critical information infrastructures (CIIs) is formidable and, in consequence, liable to attract very sophisticated targeted attacks, be they from organised crime, cyber-terrorism, cyber-hacktivist organisations or militias, or nation-state armies or agencies. Company-level systems are not exempt from this risk either.

Wrapping up, cyberspace today can be succinctly characterized by: an immense, widely interconnected, largely interdependent distributed infrastructure; large exposure to threats, aggravated by the “pressure to be on-line” coming from e-government and e-business through to social networks; steadily increasing software vulnerabilities; and a degrading threat landscape, with more powerful adversary actors and more sophisticated exploit tools on the scene.

What is Resilient Computing?

Resilient Computing is a new paradigm based on modelling, architecting and designing computer systems so that:

  • they have built-in baseline defences, from first principles and from the first hour, whatever their mission;
  • such defences cope with virtually any quality of threat, be it accidental faults, design errors, cyber-attacks, or unexpected operating conditions;
  • they provide protection in an incremental way, since not all threats are extreme, nor all systems critical (see the sketch after this list);
  • they automatically respond to threats and adapt to a dynamic range of threat severity;
  • they seek unattended and sustainable operation.
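
As a toy illustration of the incremental-protection bullet above, the following Python sketch maps a system’s criticality and an estimate of the threat level onto a protection profile. The profile names, scoring rule and thresholds are hypothetical assumptions made purely for illustration; they do not come from any existing framework.

    # Hypothetical sketch: picking an incremental protection profile.
    # Profile names and thresholds are illustrative assumptions, not a standard.
    from dataclasses import dataclass
    from enum import Enum

    class Profile(Enum):
        BASELINE = 1   # built-in defences from first principles
        HARDENED = 2   # adds intrusion tolerance (replication, diversity)
        EXTREME = 3    # adds proactive recovery and adaptive reconfiguration

    @dataclass
    class SystemContext:
        criticality: int   # 1 (low) .. 3 (safety/mission critical)
        threat_level: int  # 1 (commodity attacks) .. 3 (targeted, nation-state)

    def select_profile(ctx: SystemContext) -> Profile:
        """Not all threats are extreme, nor all systems critical:
        protection grows with both criticality and threat level."""
        score = ctx.criticality * ctx.threat_level
        if score >= 6:
            return Profile.EXTREME
        if score >= 3:
            return Profile.HARDENED
        return Profile.BASELINE

    if __name__ == "__main__":
        print(select_profile(SystemContext(criticality=3, threat_level=3)))  # EXTREME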

In essence, these attributes of computer systems correspond to a generic definition of resilience:

The property of resuming original shape or position after being bent, stretched, or compressed; elasticity and plasticity.

The results reported by several teams in recent years represent significant advances in this quest, but by no means exhaust the problem. They are just a drop in the immense ocean of opportunities the future will bring, for example through the integration of machine-learning and artificial-intelligence techniques in ways that serve two fundamental philosophical principles of the science of dependable and secure computing: there is no trusted data over non-trustworthy systems; every non-substantiated assumption is a weakness.

Why Resilient Computing?

Many of the threats and upset events described above stem from societal movements and pressures at all levels, from governments through companies to people, that are hard to control or reverse. This (degraded) landscape is not bound to improve anytime soon. In response to such challenges, protection paradigms based on classic security or classic dependability, albeit necessary, are insufficient when they work in isolation, and do so in a brittle way.

For example, the prevention and/or ad-hoc detection of intrusions caused by attacks exploiting vulnerabilities is known to have imperfect coverage. A single known and reachable vulnerability suffices to completely defeat an otherwise strong-looking bastion protecting an enterprise intranet. Intrusion detection, besides being incomplete, will prove too slow for a number of applications in the cyber-physical systems (CPS) area, such as autonomous vehicles or robots. On the dependability side, whilst faults in a well-designed car lead to an infinitesimal and acceptable probability of catastrophic failure, enough to pass safety certification, those same faults in the hands of an attacker will lead, rather sooner than later, to catastrophic failures.
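
To make the contrast concrete, intrusion tolerance masks compromised components instead of betting on detecting them in time. Below is a minimal Python sketch of the classic Byzantine quorum rule, under the standard assumptions of n = 3f + 1 deterministic, diverse replicas and at most f simultaneous intrusions: any reply vouched for by at least f + 1 replicas must include one from a correct replica, so up to f intruded replicas cannot forge a result even if they are never detected.

    from collections import Counter

    def min_replicas(f: int) -> int:
        # Classic bound for Byzantine fault/intrusion tolerance: n >= 3f + 1.
        return 3 * f + 1

    def accept_reply(replies: list[str], f: int) -> str | None:
        """Accept a value vouched for by at least f + 1 replicas.
        With at most f compromised replicas, f + 1 matching replies must
        include at least one from a correct replica, so the value can be
        trusted even though we never learn *which* replicas were intruded."""
        if not replies:
            return None
        value, count = Counter(replies).most_common(1)[0]
        return value if count >= f + 1 else None

    # Example: f = 1 requires n = 4 replicas; one compromised replica lies.
    assert min_replicas(1) == 4
    assert accept_reply(["42", "42", "42", "666"], f=1) == "42"

The point is architectural: a single reachable vulnerability may compromise one replica, but, provided the replicas are diverse, it does not compromise the quorum.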

Furthermore, even if we addressed threats under a combined paradigm (say, fault and intrusion tolerance), this would still not be enough. First, system complexity, distribution and exposure (openness) have reached a point where it is no longer possible to take as a given static architecture configurations, a known and fixed number of system nodes, stable assumptions such as synchrony, or moderate and predictable threat/fault behaviours. Second, the situation is aggravated by a corresponding evolution of the threat landscape, which grows ever more severe, uncertain, dynamic and polymorphic; short of alternatives, systems designed under classic paradigms become brittle in the face of these unforeseen scenarios or threats.
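
As a hedged sketch of what adapting to a dynamic threat severity can mean in a fault-and-intrusion-tolerant setting: the assumed number of simultaneous intrusions f stops being a static design constant and is re-estimated at run time, with the replica group resized to preserve the n >= 3f + 1 margin and proactively rejuvenated so that intrusions do not accumulate. The estimate_f, reconfigure and rejuvenate callbacks below are hypothetical placeholders, not an existing API.

    import time

    def required_group_size(f: int) -> int:
        # Preserve the Byzantine-tolerance margin as assumptions change.
        return 3 * f + 1

    def adaptation_loop(estimate_f, reconfigure, rejuvenate, period_s=60):
        """Hypothetical control loop: estimate_f observes the threat level
        (e.g. from alert rates or vulnerability disclosures); reconfigure
        grows or shrinks the replica group; rejuvenate proactively recovers
        replicas so that intrusions do not accumulate beyond f."""
        current_n = required_group_size(estimate_f())
        reconfigure(current_n)
        while True:
            time.sleep(period_s)
            n = required_group_size(estimate_f())
            if n != current_n:
                reconfigure(n)  # adapt the assumption, not just the mechanism
                current_n = n
            rejuvenate()        # proactive recovery, independent of detection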

Hence, a paradigm change in the way we protect systems is in order. We advocate research leading to the additional attributes that prefigure the resilient computing vision, as already mentioned: incremental protection; automatic reconfiguration and dynamic adaptation; and sustainable, unattended operation.