Ian Fitzgerald, CIO, Truckee Donner Public Utility District
It was 2011, a year that may be considered the beginning of IT/OT convergence. LTE was the new technology on the block, touting fast Ethernet-class speeds from almost anywhere. Machine-to-machine (M2M) communication between industrial control system (ICS) devices was becoming a cost-effective way to improve the reliability, redundancy, and operational timeliness of the nation's critical infrastructure. To pave the way for this radical change in how the devices controlling electric, gas, and water commodities communicate, an even more radical break with the status quo was required: the merging of the traditionally separate information technology and operational technology paradigms.
Traditionally, operational technology was physically air-gapped from the internet. ICS devices often lived on their own network (OT), completely separate from the corporate network (IT). This physical separation was intentional. OT systems are antiquated, often running technology far behind their IT counterparts, largely because of an "if it ain't broke, don't fix it" approach. OT systems require slow, steady reliability, and uptime is far more important than the latest way to move or view data. With a tradition of air-gapping and decades-old technology, cyber-security was never a priority in the OT realm. That is, of course, until the convergence with IT began, and protecting our critical infrastructure became that much more problematic.
It is often said that the next major war won’t be fought on the battlefield, but within the cyber world. Hospitals, banks, and critical infrastructure are the first targets a nation state will attack. You disable these three industries, and you will have crippled the country.
Hospitals and banks have taken cyber-security seriously for years, even decades. Critical infrastructure, on the other hand, has lagged far behind, mostly due to this traditional separation of IT and OT.
Early this year, a frightening rumor began to circulate: Russia had successfully hacked the electric grid of the United States and had the ability to turn off electricity. More recently, in July of this year, the Department of Homeland Security (DHS) and the National Cybersecurity and Communications Integration Center (NCCIC) provided an in-depth briefing that confirmed the rumor in great detail.
The attack was unique, sophisticated, and designed to evade all traditional cyber-security technology. It began with staging targets: smaller organizations with pre-existing relationships to the energy sector and less sophisticated networks. From these staging targets, the hackers then moved on to their intended targets: electrical generation, transmission, and distribution companies with sophisticated networks and more defensive cyber tools. In the credential-harvesting stage against the intended targets, the hackers used phishing and watering-hole techniques. They sent emails that traded on trusted vendor relationships, carrying legitimate attachments with no malware. Instead, these files, once saved to the local disk, would reach out to a file://corporation address for a normal.dot template or a shortcut image icon file over the time-tested SMB protocol. Upon that file request, the server would demand client authentication, the victim's machine would hand over the user's credential hash, and the server would then provide the requested file. Voila, full user credentials. The hackers would then proceed to a single-factor vpn.corporation.com and gain full-permission access to the converged IT/OT network. Even a full password reset would not stop the attack, since any refresh of the file would automatically send the newest credentials.
Mind-blowingly simple and sophisticated all at the same time.
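For a defender, the telltale is a benign-looking document that quietly references an external template over SMB. As a rough illustration only (the regex, the find_external_template helper, and the idea of scanning every .rels part inside the archive are my own assumptions, not tooling described in the DHS/NCCIC briefing), a short Python script can unzip an Office attachment and flag any relationship that points at a file:// or UNC target:

    # Sketch: flag Office documents whose relationship parts point to a remote
    # file:// or \\server\share target, the kind of external-template reference
    # used to trigger an SMB credential request.
    import re
    import sys
    import zipfile

    # Match relationship targets that leave the document: file:// URLs or UNC paths.
    SUSPICIOUS = re.compile(r'Target="(file://[^"]+|\\\\[^"]+)"', re.IGNORECASE)

    def find_external_template(docx_path):
        """Return external targets found in the document's .rels parts."""
        hits = []
        with zipfile.ZipFile(docx_path) as doc:   # a .docx is just a ZIP archive
            for name in doc.namelist():
                if name.endswith(".rels"):
                    xml = doc.read(name).decode("utf-8", errors="ignore")
                    hits.extend(SUSPICIOUS.findall(xml))
        return hits

    if __name__ == "__main__":
        for path in sys.argv[1:]:
            for target in find_external_template(path):
                print(f"{path}: external template reference -> {target}")

A check like this at the mail gateway would not have stopped the outbound SMB authentication by itself, but it shows how little "malware" the attachment actually needed to contain.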
To a traditional firewall, segmentation, intrusion detection, or endpoint protection tool, this transaction would look normal, and no intrusion would be detected. It is believed the Russians were embedded in our critical infrastructure for over two years, and still are today. This leads me to a fear I have had for as long as IT and OT networks have been converging: what if the hackers are already inside? How would I know?
No longer can CIOs rely on traditional methods of intrusion detection; instead, they must look outside the box, beyond the normal "signature" patterns. New technology is beginning to emerge that could have detected this sort of attack where traditional signature-based technology failed. Using artificial intelligence (AI) or machine learning to determine network baselines, even as those baselines shift, allows CIOs to identify breaches based on abnormal user behavior. Even though, in the aforementioned case, the Russians accessed ICS devices with valid credentials, connections to certain devices at abnormal hours, abnormal client-server relationships, or even abnormal user-device relationships would have been flagged in real time. Networks have a unique pattern of life that hackers are not privy to. Outsiders working in any network inherently change this unique pattern, and can be identified.
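As a rough sketch of that baselining idea (the features, sample flow records, and contamination setting below are illustrative assumptions, not any vendor's implementation), an unsupervised model such as scikit-learn's IsolationForest can learn a network's normal pattern of life from historical flow records and score new activity against it:

    # Sketch: learn "normal" from historical flow records, then score new activity.
    from sklearn.ensemble import IsolationForest
    import numpy as np

    # Each record: (hour_of_day, user_id, device_id, bytes_transferred)
    baseline_flows = np.array([
        [9, 101, 7, 12_000],
        [10, 101, 7, 15_500],
        [14, 102, 3, 9_800],
        [15, 102, 3, 11_200],
        # ... weeks of normal traffic would go here ...
    ])

    # Contamination is the expected fraction of anomalies; it would be tuned per network.
    model = IsolationForest(contamination=0.01, random_state=42)
    model.fit(baseline_flows)

    # A 2 a.m. session from a known user to an ICS device it never talks to:
    # valid credentials, but the pattern of life is wrong.
    suspect = np.array([[2, 101, 3, 250_000]])
    print(model.predict(suspect))   # -1 flags an outlier, 1 means consistent with the baseline

A real deployment would engineer far richer features (protocol, asset criticality, peer history) and retrain continuously as the baseline shifts, but the principle is the same: valid credentials do not make abnormal behavior normal.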
We can't know of, or have perfect foresight into, the next attack on our critical infrastructure. But we can identify what normal network behavior looks like, and we can start using that normality against the hacking community, adding another layer of defense. As with any security technology, the goal is to identify breaches as quickly as possible and then respond to them properly. Relying on past attacks is a poor way to defend; getting ahead of the attacks we don't yet know about is the future. Human behavior used to be a cyber-security deficit. With AI, we can turn that behavior into an advantage, and that is something every CIO should be looking to implement.