The full implications of this definition will not be clear until further details are released, but responses thus far have been largely positive.
High-profile attacks have heightened demands for security measures like those outlined in the executive order.
“SolarWinds was justifiably a very good wakeup call to the risks that are present throughout, particularly on the federal side, and the risks that come from use of shared software and the risks that then presents to broader national and economic security,” said Jonathan Welburn, a RAND researcher who writes on cybersecurity, supply chain risks and systemic risk in economic systems, in a recent Government Technology interview.
The White House likely expects forthcoming EO-critical security standards to ripple out to the wider software landscape, beyond just government procurements, according to Henry Young, who previously worked at NIST and now is director of policy at the Software Alliance (BSA), a U.S.-headquartered international software industry advocacy group. The idea is that vendors working for the federal government will simply follow these rules for all their products, making safer products more easily available to everyone.
Should security rules be too rigid, however, vendors might instead create compliant products for the federal government and sell less secure alternative versions to the general public, Young told GT. He said he did not see a significant risk of this happening.
NEW VIEW OF “CRITICAL”
NIST broke with tradition when setting its definition of “critical.”
“The old way to think about criticality was related to where the software was used,” Young said. “We are now thinking about this in a fundamentally different way.”
Older thinking saw each agency determine for itself whether a piece of software was critical, based on whether its deployment made the solution’s continued secure functioning essential to agency operations. NIST decided instead that software would be EO-critical based on the kinds of capabilities and functions it performs, regardless of how it is implemented or who uses it.
In a document detailing the new definition, NIST said this choice means vendors will not have to predict in advance who would use their software and how, before determining what security standards to follow. Vendors will instead know that certain kinds of products are required to follow particular rules.
Young said the notion that this makes life simpler for vendors is reasonable but has not yet been tested.
NIST’s definition holds that software is EO-critical if it includes at least one component with any of the following attributes, or if it integrates directly with and is incapable of functioning without a component containing any of these attributes:
- Is designed to run with elevated privilege or manage privileges;
- Has direct or privileged access to networking or computing resources;
- Is designed to control access to data or operational technology;
- Performs a function critical to trust; or
- Operates outside of normal trust boundaries with privileged access.
NIST created an initial list of 11 types of software that count as EO-critical under those stipulations, including solutions used for remotely monitoring system security or authorizing users seeking access to sensitive data or operations. The Cybersecurity and Infrastructure Security Agency (CISA) is now required by the executive order to build on NIST’s efforts and create a definitive list.
Welburn said the software types NIST has highlighted are a good starting place for improving national security.
“These are the software which, if they were to fail, the whole ability to respond to and recover from a cyber incident would fail with it,” Welburn told GT. “If you’re triaging, we do need to start there.”
Still, Welburn said it is also important to pay particular attention to risks associated with any software that is widely deployed throughout federal systems and with software used by essential infrastructure providers, regardless of the software’s particular function.
“If there are dams throughout the country that end up running the same shared software, that makes that software critical,” Welburn said. “It’s not EO-critical, but it’s critical under other definitions.”
SECURITY STANDARDS
NIST’s definition was partially guided by a desire to ensure enough qualifying products would reach the market. Had it based criticality on how acquiring agencies use the software, “the Government might seek to buy a product, but no vendor anticipated it would be EO-critical, resulting in either no products or a limited number of products available for the Government to purchase,” NIST wrote.
Ensuring a sufficient supply of products also requires that NIST be realistic when setting the requirements EO-critical software must meet, Young said. Pushing industry to tighten security is valuable, but pushing too far would leave few firms able to comply, and federal officials with few or no options.
“I’m optimistic that the U.S. [government] understands this and the folks at NIST understand this,” Young said. “They’re not going to shoot the moon and try to put things in that might be realistic [only] in 2030.”
Aaron Cooper, BSA vice president of Global Policy, told GT that the security requirements NIST settles on will need to be flexible enough to ensure the guidance stays relevant and useful for the long term. Overly prescriptive rules may make sense for the present day but won’t keep up as technology and risks evolve.
For example, a hypothetical policy requiring software to verify users via four-digit PINs would have been wise 20 years ago but would be a liability if it were still in place today, because it would prevent the use of biometric or multifactor authentication, Cooper said.
NIST will initially apply its requirements only to on-premises software that controls security-essential functions or whose compromise would result in significant damage. CISA and the Office of Management and Budget (OMB) will oversee this initial phase before later rolling out requirements to other software.