Red Hat has earned a certification for its Red Hat Enterprise Linux 8.2 platform that confirms the product as secure for a range of computing environments.
The Common Criteria certification was issued by the National Information Assurance Partnership after the platform was tested and validated by Acumen Security, a federally approved laboratory, Red Hat said Monday.
Clara Conti, vice president and general manager of the North American public sector arm of Red Hat, attested that the certification “shows our continued commitment to making Red Hat Enterprise Linux a platform that not only embraces innovation but also serves as the backbone for critical and security-sensitive operations.”
Conti, who is also a 2022 Wash100 Award winner, added that the Linux operating system is a crucial element of modern technology services and the handling of sensitive data.
The company’s attainment of the certification demonstrates its commitment to keeping pace with ever-changing cyberthreats and the corresponding security demands. The certification process involves intensive, standardized and continuous testing by a third-party entity (in this case, Acumen Security) and seeks to confirm a given platform’s ability to handle classified materials on an international scale.
Red Hat Enterprise Linux 8.2 lets users scale applications and roll out technologies and workloads, and is part of a larger fabric of Red Hat services that includes technical support and extended product lifecycles, as well as a security response team.
According to Kenneth Lasoski, director of Common Criteria at Acumen Security, bearing the CC certification means Red Hat can market its product to “security conscious customers, such as national security-related agencies, finance and healthcare organizations.”
In an Executive Spotlight interview with GovConWire in February, Conti spoke about where she predicts technology trends are heading in the near future, touching on open source artificial intelligence and machine learning at the edge.
“The next frontier is doing the analysis and decision making where the data are generated — at the edge. By moving the computing to where the data is, you can gain insight faster — possibly instantaneously. This unlocks the ability to potentially solve problems that were considered impractical or even impossible,” Conti said.