Feb 18, 2016
Security

For Federal Agencies, Preserving Data Integrity Is a Critical Task

Tampering with data jeopardizes the public’s trust in government, so how do agencies ensure their data remains accurate?

Data integrity took center stage for federal officials in 2014 as Congress subpoenaed the email of Lois Lerner, a director at the Internal Revenue Service.

The investigation centered on agency practices and unlawful audits of political opponents; however, during that investigation, IRS officials could not produce Lerner’s email because of a crashed hard drive. Backup tapes had been erased as well, beginning a monthslong saga over data integrity.

The case shows the special obligation federal technology professionals have to preserve data integrity.

From tax records to legislative history, federal agencies preserve and protect the critical information that constitutes the nation’s history and ensures the government’s continued, efficient operation. The duty to preserve federal records represents a sacred trust between the American people and the government, and IT professionals bear significant responsibility for maintaining that trust.

How Data Integrity Can Be Compromised 

Threats to data integrity arise from either accidental or intentional causes.

In the Lerner case, the IRS said data was lost through a combination of human error and malfunctioning technology.

Accidental causes like those account for many data integrity issues, and for far more cases of information loss than deliberate attacks do.

Other common failures involve more nefarious means. Malicious actors or rogue insiders may intentionally alter or delete information to perpetrate fraud, cover up unauthorized activity or simply cause havoc.

Whatever the cause, agencies must implement a defense-in-depth approach that consists of multiple, overlapping data integrity controls. Such controls should be designed to detect integrity issues in agency records, investigate the root cause of integrity failures and recover data affected by an integrity incident.

For agencies, the obligation to implement strict controls goes beyond best practices; it may be an issue of federal law. Most regulations covering information security mandate the use of integrity controls to protect against unauthorized modifications. Applicable standards vary based on an agency’s specific mission and data operations, but may include:

  • National Institute of Standards and Technology (NIST) Special Publication 800-53, Security and Privacy Controls for Federal Information Systems and Organizations, which requires agencies to use systems with “cryptographic mechanisms to detect unauthorized changes to software, firmware and information.”
  • The Payment Card Industry Data Security Standard (PCI DSS), which requires agencies that store, process or transmit credit card information to “deploy a change detection mechanism (for example, file integrity monitoring tools) to alert personnel to unauthorized modification” of critical content.
  • The Health Insurance Portability and Accountability Act (HIPAA), which includes several mandates for integrity controls on electronic protected health information.

NIST SP 800-53, PCI DSS and HIPAA represent three examples of the compliance mandates used by federal agencies to protect the integrity of critical government records against accidental and intentional threats. Almost every agency falls under at least one of those regulations, underscoring the importance of data protections.

Monitoring Changes 

Integrity protection begins with standard operating system security functionality, such as file system access controls and strong authentication.
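
As a starting point, the short Python sketch below (using a hypothetical records directory) shows how an administrator might sweep a file tree for records writable by anyone other than their owner and tighten those permissions, a basic application of file system access controls.

```python
# A minimal sketch of a permissions sweep; the records path is hypothetical.
import os
import stat

RECORDS_DIR = "/var/agency/records"  # hypothetical location of agency records

for root, _dirs, files in os.walk(RECORDS_DIR):
    for name in files:
        path = os.path.join(root, name)
        mode = os.stat(path).st_mode
        # Flag and fix any file writable by its group or by other users.
        if mode & (stat.S_IWGRP | stat.S_IWOTH):
            print(f"Loosely permissioned record: {path}")
            os.chmod(path, stat.S_IMODE(mode) & ~(stat.S_IWGRP | stat.S_IWOTH))
```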

NIST SP 800-53 and PCI DSS mandate file integrity monitoring systems, solutions that use cryptographic hash functions to take irreversible mathematical fingerprints of critical content on a regular basis. They store those values in a read-only database for comparison to future scans. Any change to a monitored file, no matter how minor, alters its fingerprint and draws attention from the monitoring system. If a change is unexpected, the system triggers an alert for further investigation.
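
As a concrete illustration of that approach, the Python sketch below computes SHA-256 fingerprints of a watch list of files, records them as a baseline on the first run and flags any drift on later scans. The paths are hypothetical, and a production file integrity monitoring tool would also protect the baseline itself from tampering.

```python
# A simplified file integrity monitor; the baseline path and watch list
# are hypothetical stand-ins.
import hashlib
import json
import os

BASELINE = "/var/lib/fim/baseline.json"   # hypothetical baseline store
MONITORED = ["/etc/passwd", "/etc/hosts"]  # hypothetical watch list

def fingerprint(path):
    """Return the SHA-256 hex digest of a file's contents."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            digest.update(chunk)
    return digest.hexdigest()

def scan():
    current = {p: fingerprint(p) for p in MONITORED if os.path.exists(p)}
    if not os.path.exists(BASELINE):
        with open(BASELINE, "w") as f:
            json.dump(current, f)  # first run: record the baseline
        return
    with open(BASELINE) as f:
        baseline = json.load(f)
    for path, value in current.items():
        if baseline.get(path) != value:
            print(f"ALERT: unexpected change detected in {path}")

if __name__ == "__main__":
    scan()
```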

Hash-based systems provide effective protection for data stored in files, but struggle to protect information stored in agency databases. Database engines typically store data in large files that change frequently, rendering hash-based defenses ineffective. Agencies should complement file integrity monitoring solutions with database change auditing technology. Such products log every change to information stored in a database, preserving the content and the identity of the user initiating the change.
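
The sketch below illustrates the change-auditing idea with a SQLite trigger that copies every update to a records table into an append-only audit log along with the responsible user and a timestamp. The table names are hypothetical, and because SQLite has no authenticated session user, the sketch assumes the application supplies the username; server-class database engines record the authenticated user directly.

```python
# A toy change-auditing setup; table and column names are hypothetical.
import getpass
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE records (id INTEGER PRIMARY KEY, content TEXT);
CREATE TABLE audit_log (
    record_id   INTEGER,
    old_content TEXT,
    new_content TEXT,
    changed_by  TEXT,
    changed_at  TEXT DEFAULT CURRENT_TIMESTAMP
);
""")

# Assumption: the application supplies the acting user, since SQLite has no
# session identity of its own.
user = getpass.getuser()
conn.execute(f"""
CREATE TRIGGER record_update_audit AFTER UPDATE ON records
BEGIN
    INSERT INTO audit_log (record_id, old_content, new_content, changed_by)
    VALUES (OLD.id, OLD.content, NEW.content, '{user}');
END;
""")

conn.execute("INSERT INTO records (content) VALUES ('original text')")
conn.execute("UPDATE records SET content = 'altered text' WHERE id = 1")
for row in conn.execute("SELECT * FROM audit_log"):
    print(row)  # (1, 'original text', 'altered text', <user>, <timestamp>)
```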

How to Recover from a Breach 

IT professionals need the capability to detect data integrity issues but also to restore data to its intended form. A solid backup regimen that allows IT administrators to roll back unauthorized changes serves as the final, critical layer of data integrity controls.
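
A bare-bones sketch of such a regimen appears below: each run copies the records directory into a timestamped snapshot, and a restore function rolls the live data back to a chosen snapshot. The paths are hypothetical, and a real backup program would add off-site copies, retention policies and regular restore testing.

```python
# A minimal versioned-backup sketch; both paths are hypothetical.
import shutil
import time
from pathlib import Path

DATA_DIR = Path("/var/agency/records")  # hypothetical live data
BACKUP_ROOT = Path("/backups/records")  # hypothetical backup target

def snapshot() -> Path:
    """Copy the live data into a new timestamped backup directory."""
    dest = BACKUP_ROOT / time.strftime("%Y%m%d-%H%M%S")
    shutil.copytree(DATA_DIR, dest)
    return dest

def restore(snapshot_dir: Path) -> None:
    """Roll the live data back to a previous snapshot."""
    shutil.rmtree(DATA_DIR)
    shutil.copytree(snapshot_dir, DATA_DIR)

if __name__ == "__main__":
    print(f"Backup written to {snapshot()}")
```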

Administrators with confidence in an agency’s backup routine will rest easier knowing they have the ability to recover from an unexpected data integrity crisis.

Implementing database auditing, file integrity monitoring and strong backup mechanisms can spare agency IT staff from costly data integrity incidents.
