With strong identity and asset management controls and full visibility into users, applications, devices, networks and data, agencies can better monitor and manage database and data security. Agencies also need to consider the tools they use for continuous, policy-based monitoring, and should run regular vulnerability and configuration scans to detect and mitigate weaknesses in their security posture.
With these controls in place, agencies can better manage potential insider threats, collect data and intelligence to prioritize the most critical vulnerabilities, and strengthen risk management and mitigation tactics, all while protecting high-value assets across their data centers.
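The policy-based scanning described above can be sketched in a few lines. This is a minimal illustration, not a real scanner: the policy baseline, setting names and observed values are all hypothetical, chosen to show how a scan compares a database's configuration against agency policy and reports the gaps.

```python
# Minimal sketch of a policy-based configuration scan for a database.
# POLICY_BASELINE and all setting names are hypothetical examples.

POLICY_BASELINE = {
    "require_tls": True,          # connections must be encrypted in transit
    "password_min_length": 14,    # credential-hardening requirement
    "audit_logging": True,        # all activity must be logged
    "public_access": False,       # no unauthenticated network exposure
}

def scan_config(observed: dict) -> list[str]:
    """Return findings where observed settings violate the policy baseline."""
    findings = []
    for setting, required in POLICY_BASELINE.items():
        actual = observed.get(setting)
        if isinstance(required, bool):
            if actual != required:
                findings.append(f"{setting}: expected {required}, found {actual}")
        elif isinstance(required, int):
            if actual is None or actual < required:
                findings.append(f"{setting}: expected >= {required}, found {actual}")
    return findings

# Example: a misconfigured database instance produces two findings.
observed = {"require_tls": False, "password_min_length": 8,
            "audit_logging": True, "public_access": False}
for finding in scan_config(observed):
    print(finding)
```

Run continuously against every database in the inventory, a check like this turns policy into the "consistent vulnerability and configuration scans" the text calls for, with each finding feeding the prioritization process.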
Strengthen Vulnerability Management to Prepare for Future Attacks
Next, agencies should perform regular testing of all layers of the infrastructure using real-world attack scenarios based on credible intelligence of how adversaries compromise their targets.
With the massive amount of data and rapid growth of IT infrastructure, it is more critical than ever for agencies to scan and test databases for vulnerabilities so they can better understand the risk if attackers were to exploit uncovered weaknesses.
While vulnerability assessment technology (VAT) has existed for more than 25 years, toolsets and designs differ. Traditional VAT solutions assess systems broadly, while others bring specific expertise to a particular set of IT assets, such as databases and data stores. The 2021 Gartner Market Guide for Vulnerability Assessment states that in-depth assessments of databases and of applications such as enterprise resource planning systems are not widely supported in traditional VA solutions.
Today, traditional, broad-based VAT solutions include database scanning only for compliance purposes and do not provide the level of protection needed to secure the data held in government databases.
Government and industry need to work together to optimize investments in threat detection and vulnerability assessment. It’s time to raise vulnerability assessment standards to ensure critical data sets are protected. Agencies should engage trusted third-party assessors to verify that security investments deliver a return and that all controls and countermeasures sufficiently reduce risk.
Zero Trust is a Journey as Databases and Threats Evolve
Adopting a zero-trust approach isn’t a one-and-done task. While agencies such as the Department of Homeland Security, the Defense Department and the Department of Health and Human Services have made significant progress in zero-trust adoption, there’s still more to do.
According to the Office of Management and Budget’s draft zero-trust strategy document, agencies have until the end of September 2024 to make headway on specific zero-trust security goals: implementing enterprisewide identity systems for application access, completing an inventory of every device used for government work, encrypting all DNS requests and HTTP traffic within their environments, treating all applications as internet-connected, and implementing an enterprisewide logging and information-sharing platform.
To accomplish this, government needs a database-specific security approach that includes continuous vulnerability and configuration assessments and remediation, database privileged access visibility and control, and continuous database activity monitoring to alert and respond to anomalous database activity.
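The last of those pieces, continuous database activity monitoring, can be illustrated with a simple rule-based sketch. Everything here is an assumption for illustration: the audit-log event shape, the thresholds and the rule set are hypothetical, standing in for whatever a real monitoring platform would derive from agency policy and baselined behavior.

```python
# Minimal sketch of rule-based database activity monitoring over a
# hypothetical stream of audit-log events; thresholds are illustrative.
from collections import namedtuple

DbEvent = namedtuple("DbEvent", ["user", "statement", "rows", "hour"])

BULK_READ_THRESHOLD = 100_000                    # unusually large result set
PRIVILEGED_STATEMENTS = {"DROP", "GRANT", "ALTER"}
BUSINESS_HOURS = range(7, 19)                    # 7:00 through 18:59

def detect_anomalies(events: list[DbEvent]) -> list[str]:
    """Return alert strings for events that match an anomaly rule."""
    alerts = []
    for e in events:
        # Rule 1: bulk reads can indicate data exfiltration.
        if e.statement == "SELECT" and e.rows > BULK_READ_THRESHOLD:
            alerts.append(f"{e.user}: bulk read of {e.rows} rows")
        # Rule 2: privileged statements outside business hours.
        if e.statement in PRIVILEGED_STATEMENTS and e.hour not in BUSINESS_HOURS:
            alerts.append(f"{e.user}: privileged {e.statement} at hour {e.hour}")
    return alerts

events = [
    DbEvent("svc_report", "SELECT", 1_200, 10),   # normal reporting query
    DbEvent("analyst7", "SELECT", 450_000, 11),   # bulk-read pattern
    DbEvent("dba2", "DROP", 0, 3),                # off-hours privileged action
]
for alert in detect_anomalies(events):
    print(alert)
```

A production system would pair rules like these with learned baselines per user and application, but the core loop is the same: watch the activity stream, compare against policy, and alert on deviations so responders can act.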
As the threat landscape evolves, so should agencies’ security methods. As part of a zero-trust model, agencies should conduct regular security testing to verify proper configurations and reduce risk to critical functions and data. Remember, strong data-centric security means securing the data and the data centers themselves. It’s time to take a holistic approach to security on our zero-trust journey.