Regulatory Compliance Starts with Software

The main driver behind software development has always been functionality. Software is meant to provide new or expanded access to data and services, and the requirements for software are typically written with these needs in mind.

Data privacy and integrity, which are often neglected in functional specs, have become non-negotiable business requirements, largely due to regulations enacted in the wake of public exposures over the last five years.

These new regulations require that software behave differently: specifically, in how it identifies users, stores sensitive data, and records access to system resources. Given the volume and complexity of these regulations, translating their intent and impact into software development requirements is often frustrating for development teams.

Let’s look at a few of the most common regulations, explore their impacts on the software development lifecycle, and determine how to handle them as senior managers of information:

Got SOX?

The Sarbanes-Oxley Act (SOX) was signed into law in 2002. In the wake of the accounting scandals of the period, its purpose was to give investors more confidence in the transparency and accuracy of the financial reporting process of publicly traded US companies.

In a SOX audit, a company must be able to prove that confidential information cannot be exposed to unauthorized entities. Applications must generate enough information to demonstrate that critical data has not been modified, and must show that user roles, file access, and all other points of data access are locked down to only those users who are supposed to have them.
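
As a rough illustration of what those audit-facing requirements can mean in code, the sketch below pairs a role-based authorization check with a hash-chained access log, so that every access attempt is recorded and later tampering with the log is detectable. The role names, permissions, and storage format are assumptions made for the example, not anything SOX itself prescribes.

```python
import hashlib
import json
import time

# Hypothetical roles and permissions for a financial system; a real mapping
# would come from the business owners identified at the executive level.
ROLE_PERMISSIONS = {
    "financial_analyst": {"read_ledger"},
    "controller": {"read_ledger", "post_journal_entry"},
}

audit_log = []  # in practice this would be durable, append-only storage


def _chain_hash(previous_hash: str, entry: dict) -> str:
    """Hash each entry together with its predecessor so later tampering is detectable."""
    payload = previous_hash + json.dumps(entry, sort_keys=True)
    return hashlib.sha256(payload.encode()).hexdigest()


def authorize_and_log(user: str, role: str, action: str, resource: str) -> bool:
    """Check the role's permissions and record the attempt, allowed or not."""
    allowed = action in ROLE_PERMISSIONS.get(role, set())
    entry = {
        "timestamp": time.time(),
        "user": user,
        "role": role,
        "action": action,
        "resource": resource,
        "allowed": allowed,
    }
    previous_hash = audit_log[-1]["hash"] if audit_log else ""
    entry["hash"] = _chain_hash(previous_hash, entry)
    audit_log.append(entry)
    return allowed


# Example: the analyst can read the ledger but cannot post journal entries.
print(authorize_and_log("jsmith", "financial_analyst", "read_ledger", "GL-2023"))
print(authorize_and_log("jsmith", "financial_analyst", "post_journal_entry", "GL-2023"))
```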

The translation of these requirements into developer terms is by no means straightforward, but it is manageable. Executive-level identification of financial management systems provides the basis for development-level understanding of appropriate access and authorization control.

This means that, as senior managers, we must identify critical data and acceptable use/access cases for our development teams. Once this happens, it is easier to audit the developed system.
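
One way to hand this information to a development team, sketched below, is as an explicit policy table that names the critical data elements and the use/access cases management considers acceptable. The data elements, classifications, and roles here are invented purely for illustration.

```python
# Hypothetical access policy: critical data elements mapped to the roles and
# operations that are acceptable for each. Developers enforce the table;
# auditors compare observed behaviour against it.
ACCESS_POLICY = {
    "quarterly_revenue_figures": {
        "classification": "financial-critical",
        "allowed": {"controller": {"read", "update"}, "auditor": {"read"}},
    },
    "vendor_payment_records": {
        "classification": "financial-critical",
        "allowed": {"accounts_payable": {"read", "update"}, "auditor": {"read"}},
    },
}


def is_acceptable_use(role: str, operation: str, data_element: str) -> bool:
    """Check an attempted operation against the management-defined policy."""
    policy = ACCESS_POLICY.get(data_element)
    if policy is None:
        return False  # unclassified data: deny by default until it is classified
    return operation in policy["allowed"].get(role, set())


assert is_acceptable_use("auditor", "read", "quarterly_revenue_figures")
assert not is_acceptable_use("auditor", "update", "quarterly_revenue_figures")
```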

HIPAA

The Health Insurance Portability and Accountability Act (HIPAA) was passed in 1996 by the U.S. Congress as an attempt to guarantee that employees who changed or lost jobs would be able to carry forward their healthcare benefits to a new provider.

Doing this clearly required uniquely identifying people and associating their health records with them. Because so much of this information is private, and because of widespread concern that private health information could negatively affect insurability or employability, the HIPAA legislation also established federal regulations forcing doctors, hospitals, other healthcare providers, and insurers to meet baseline standards when handling electronic information such as medical records and patient accounts.

The security provisions of HIPAA comprise three sets of requirements (administrative, physical, and technical safeguards), each of which lists specific measures. These provisions are of primary concern to software teams since they contain specific technical compliance requirements.

These technical requirements may look very similar to those listed for SOX because the driving concerns of the two regulations overlap: in SOX the drivers are transparency, data integrity, and auditability, whereas in HIPAA they are data confidentiality and auditability. That overlap leads to a natural commonality in the underlying development strategy.
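
As a rough sketch of that commonality, and assuming the third-party cryptography package, the example below wraps sensitive records in field-level encryption (HIPAA's confidentiality driver) while funnelling every read and write through the same kind of access log used in the SOX example (the auditability driver shared by both). The class, record, and user names are invented for illustration.

```python
from cryptography.fernet import Fernet  # third-party package, used here for brevity

class SensitiveRecordStore:
    """Stores records encrypted at rest and logs every read and write."""

    def __init__(self, key: bytes):
        self._cipher = Fernet(key)
        self._records = {}      # record_id -> encrypted payload
        self.access_log = []    # shared audit trail for both regulations

    def put(self, user: str, record_id: str, plaintext: str) -> None:
        self._records[record_id] = self._cipher.encrypt(plaintext.encode())
        self.access_log.append((user, "write", record_id))

    def get(self, user: str, record_id: str) -> str:
        self.access_log.append((user, "read", record_id))
        return self._cipher.decrypt(self._records[record_id]).decode()


store = SensitiveRecordStore(Fernet.generate_key())
store.put("dr_jones", "patient-1001", "diagnosis: ...")
print(store.get("billing_clerk", "patient-1001"))
print(store.access_log)
```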