By Ben Reich, Chief Architect, OTORIO
In my previous post, I described the process of OT digital risk mitigation as an iterative one where practitioners must ask themselves: what are the most efficient mitigations that will achieve the most effective risk reduction for a specific asset, process, or an entire production floor?
However, even once mitigation steps are implemented and only an acceptable residual risk remains, there is still work to be done. The mitigation process will identify additional exposures and gaps within that newly accepted ‘acceptable’ residual risk. Treating this as an ongoing process enables operational and OT security teams to focus continuously on the vulnerabilities that attackers are most likely to exploit to inflict the greatest damage on an organization. Only by repeatedly performing this OT risk assessment loop can companies achieve business resilience, and do so with a limited amount of resources.
The main objective of the posture assessment process is to address exposed vulnerabilities in the correct priority. In this post, we will explore OT vulnerability management, the nature of vulnerabilities, the way they should be scored, and how they apply to OT cyber security.
The National Institute of Standards and Technology (NIST) defines a vulnerability as:
“A weakness in the computational logic (e.g., code) found in software and hardware components that, when exploited, results in a negative impact to confidentiality, integrity, or availability. Mitigation of the vulnerabilities in this context typically involves coding changes, but could also include specification changes or even specification deprecations (e.g., removal of affected protocols or functionality in their entirety)."1
So vulnerabilities are known defects in an organization’s security posture, and fixing them can involve mitigations like upgrading a software version, disabling a communication protocol, or strengthening a weak password.
Creating an accurate, contextualized, and detailed asset inventory is the first step when developing an effective process for analyzing OT vulnerabilities. The inventory should include software and version data, asset connections, status, and administrative information (e.g., ownership, operational role, function). Current and accurate inventory will reflect different aspects of asset states.
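As a rough sketch, an inventory record along these lines could capture the software, connection, status, and administrative details described above. The field names and values here are illustrative assumptions, not a standard schema:

```python
from dataclasses import dataclass, field

@dataclass
class Asset:
    """One asset inventory record (illustrative fields, not a standard)."""
    asset_id: str
    vendor: str
    product: str
    firmware_version: str
    connections: list[str] = field(default_factory=list)  # peer asset IDs
    status: str = "operational"
    owner: str = ""            # administrative information
    operational_role: str = ""

# A hypothetical PLC entry:
plc = Asset("plc-017", "Siemens", "S7-1500", "2.9.2",
            connections=["hmi-003"], owner="Line 4 team",
            operational_role="bottling line controller")
```

Keeping records structured like this is what makes the later, automated steps (vulnerability matching, scoring, trend comparison) possible.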
Once an initial asset inventory is performed, vulnerabilities can then be associated with each corresponding asset. This association should be achieved via an automated process, especially when dealing with a large number of assets. Doing this involves creating and using an algorithm that can link semi-structured vulnerability data to assets in the network.
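A minimal sketch of such a matching algorithm, assuming vulnerability records have already been reduced to a simplified dictionary form (the real CVE/NVD data is richer and uses CPE identifiers, so a production matcher would be considerably more involved):

```python
def matches(asset, vuln):
    """Decide whether a semi-structured vulnerability record applies to an
    asset, by normalizing vendor/product strings and checking the version.
    Both dict shapes are simplified assumptions for illustration."""
    same_vendor = asset["vendor"].lower() == vuln["vendor"].lower()
    same_product = vuln["product"].lower() in asset["product"].lower()
    version_affected = asset["version"] in vuln["affected_versions"]
    return same_vendor and same_product and version_affected

asset = {"vendor": "ExampleCo", "product": "ExampleHMI Panel", "version": "3.1"}
vuln = {"vendor": "exampleco", "product": "examplehmi",
        "affected_versions": {"3.0", "3.1"}}
# vuln applies to asset; a record listing only version 2.0 would not
```

Running every asset against every vulnerability record this way is exactly the kind of work that must be automated at scale.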
The Common Vulnerabilities and Exposures (CVE) list, reflected in NIST’s National Vulnerability Database (NVD), currently contains approximately 170,000 known IT and OT vulnerabilities, making it a major source of intelligence. This number, and the steady stream of newly published vulnerabilities, highlights the scale of the problem and the need to automate their identification.
Scoring vulnerabilities involves quantifying the severity of each one using a vulnerability index. A standard way of scoring them is NIST’s Common Vulnerability Scoring System (CVSS), an industry-standard scheme that evaluates how easily a vulnerability can be exploited and the potential impact of exploitation on confidentiality, integrity, and availability. These three factors (also known as the ‘CIA’ triad) are also the variables that measure a threat’s potential severity.
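For illustration, a CVSS v3.1 base score can be computed directly from the published metric weights. The sketch below covers only Scope: Unchanged vectors to keep it short; the full specification also handles Scope: Changed with different constants:

```python
import math

# CVSS v3.1 base-metric weights (Scope: Unchanged only, for brevity)
AV  = {"N": 0.85, "A": 0.62, "L": 0.55, "P": 0.20}  # Attack Vector
AC  = {"L": 0.77, "H": 0.44}                        # Attack Complexity
PR  = {"N": 0.85, "L": 0.62, "H": 0.27}             # Privileges Required
UI  = {"N": 0.85, "R": 0.62}                        # User Interaction
CIA = {"H": 0.56, "L": 0.22, "N": 0.0}              # C/I/A impact

def base_score(av, ac, pr, ui, c, i, a):
    """CVSS v3.1 base score for a Scope: Unchanged vector."""
    iss = 1 - (1 - CIA[c]) * (1 - CIA[i]) * (1 - CIA[a])
    impact = 6.42 * iss
    exploitability = 8.22 * AV[av] * AC[ac] * PR[pr] * UI[ui]
    if impact <= 0:
        return 0.0
    # The spec's "Roundup": round up to one decimal place
    return math.ceil(min(impact + exploitability, 10) * 10) / 10

# Vector AV:N/AC:L/PR:N/UI:N/S:U/C:H/I:H/A:H
print(base_score("N", "L", "N", "N", "H", "H", "H"))  # → 9.8
```

The 9.8 result matches the familiar "critical" score assigned to remotely exploitable, low-complexity vulnerabilities with high impact on all three CIA factors.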
Common vulnerabilities, however, are insufficient by themselves to determine a particular asset’s exposure. Another source for determining this is an organization’s internal policy. If such a policy, for example, dictates that medium-strength passwords constitute a vulnerability, this must be reflected in the asset vulnerability calculation. Organization-specific security posture defects are the primary way that practitioners can incorporate policy as a factor when scoring vulnerabilities.
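A hypothetical policy rule like the password example could be expressed as a simple check that emits organization-specific findings. The field names and the 12-character threshold below are assumptions for illustration, not a recommendation:

```python
# Hypothetical policy: passwords below the configured minimum length are
# recorded as organization-specific vulnerabilities on the asset.
POLICY_MIN_PASSWORD_LENGTH = 12

def password_findings(assets):
    """Scan asset records and return policy-based findings."""
    findings = []
    for asset in assets:
        if asset["password_length"] < POLICY_MIN_PASSWORD_LENGTH:
            findings.append({
                "asset_id": asset["id"],
                "finding": "password below policy minimum length",
            })
    return findings

assets = [{"id": "hmi-003", "password_length": 8},
          {"id": "eng-ws-1", "password_length": 16}]
# only hmi-003 is flagged
```

Findings produced this way would then be scored on the same CVSS scale as common vulnerabilities, so that one prioritization pass covers both.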
Industry standards and best practices are also important sources of vulnerabilities that contribute to risk. Examples include the international ISA/IEC 62443 series and NERC CIP in North America. Failures to follow best practices include overly permissive segmentation configurations, a lack of EDR agents, and unwarranted communication between the IT and OT areas of the network. These need to be wired into an all-encompassing vulnerability database, where domain experts can modify them as industry standards and best practices evolve.
Practitioners should score organization-specific vulnerabilities using the CVSS system, putting them on the same scale as common vulnerabilities. The vulnerabilities database should be flexible enough for the practitioner to influence vulnerability scoring based on company policy.
Since any asset state can represent a vulnerability, it is wise to deploy an algorithm that applies organizational policy across all asset states.
Thus, the basis for making the right posture decisions is to consistently use a vulnerabilities database where all vulnerabilities are scored using a standard method. This enables an organization to prioritize mitigation using risk. We will explore this further in the next blog posts of this series.
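Once everything sits on one scale, prioritization can start as simply as ordering findings by score, whichever source they came from. A minimal sketch with hypothetical finding records:

```python
def prioritize(findings):
    """Order findings so the highest-severity items are mitigated first.
    Each finding carries a CVSS-style score, whether it originated from
    the CVE list or from an organization-specific policy rule."""
    return sorted(findings, key=lambda f: f["score"], reverse=True)

findings = [
    {"id": "CVE-2021-0001", "score": 7.5},
    {"id": "policy:weak-password", "score": 5.3},
    {"id": "CVE-2020-0002", "score": 9.8},
]
# prioritize(findings) puts CVE-2020-0002 first
```

In practice, risk-based prioritization would also weigh asset criticality and exposure, which is where the rest of the risk model comes in.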
One of the things that we hear repeatedly from our customers is that confidentiality, data integrity, and availability are not a good reflection of their concerns when addressing OT environments. Instead, OT KPIs need to reflect parameters like safety and business continuity.
While this is a valid point, there are three reasons why the discussion of OT vulnerabilities centers on these definitions.
This logic does not preclude referencing OT KPIs in the risk model. The risk model takes OT KPIs into account as a consequence of confidentiality, integrity, and availability. This is done through a mapping process that I will discuss in my next post.
Vulnerabilities are one of the four components of risk and a major factor in posture analysis. A significant challenge is to build and maintain a vulnerability database that can be applied to assets in order to make decisions about prioritizing mitigation.
Mapping vulnerabilities appropriately is the basis for any good assessment. It is a process that involves a number of steps.
The best way to score vulnerabilities is to adhere to the CVSS system; this avoids the need to re-score all common vulnerabilities, while also complying with the industry standard.
The scale and magnitude of this process necessitate automation. Automating it enables an organization to assess its posture periodically in a consistent and scalable way, making it possible to compare assessments over time and surface posture trends.
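One simple way to surface such trends is to diff two assessment snapshots, for example per-asset counts of open findings. A sketch with made-up data:

```python
def posture_delta(previous, current):
    """Compare two assessment snapshots (asset_id -> open finding count)
    and report the per-asset change; positive values mean posture worsened."""
    assets = set(previous) | set(current)
    return {a: current.get(a, 0) - previous.get(a, 0) for a in assets}

# Hypothetical quarterly snapshots
q1 = {"plc-017": 4, "hmi-003": 2}
q2 = {"plc-017": 2, "hmi-003": 3, "eng-ws-1": 1}
# plc-017 improved by 2 findings; hmi-003 and the new eng-ws-1 each added 1
```

Because the scoring method stays fixed between runs, deltas like these reflect real posture change rather than measurement drift.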
1 National Vulnerability Database, National Institute of Standards and Technology, U.S. Dept. of Commerce.