IT professionals and data security solution providers who perform hard drive sanitization are trusted to effectively and completely eradicate data, thus protecting sensitive information from compromise. Data destruction standards and guidelines outline ways to properly execute this process. From those standards and guidelines, organizations create and implement processes and deploy solutions that enable them to sanitize a wide array of formats, often in very high volumes. In this way, a data destruction process is not unlike a manufacturing process: Identify the objective, create a process, deploy the tools to meet the objective and measure the result. Unfortunately, the one area where media sanitization processes have come up short (especially compared with manufacturing processes) has been in measuring the result, a step most commonly known as quality control (QC).
In any manufacturing operation, a rigorous QC process ensures the entire manufacturing process has achieved the objective. In fact, industry certifications focus on the QC process, as it is often the best litmus test for a successful manufacturing operation. Before late 2012, such QC practices were almost completely absent from media sanitization processes; no guidelines required them, and few even recommended them. The existing sanitization process was considered to be adequate and QC redundant. That situation changed, as evidenced by recent revisions to NIST (National Institute of Standards and Technology) Special Publication 800-88, which has become the linchpin of data sanitization standards. Section 4.7 of the document focuses specifically on the verification process and has been significantly expanded. This update reflects the increasing focus on total QC for any data erasure process, reducing risk and providing a higher degree of data security.
Media sanitization is a process with several potential points of failure, and those failures have snowballing consequences, so an independent QC measure clearly is warranted. Without one, an operation can only hope that its entire process is perfect, and that it is perfect every time. But when it comes to data security, “hope is not a strategy.” Simply running a separate “verify” pass after the erasure step provides almost no added assurance that the sanitization was successful, because it is executed by the same software, on the same hardware, by the same operator; it is not an independent process.
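To illustrate what an independent spot check might involve (this sketch is not from the original article), the Python fragment below reads a random sample of sectors from a raw block device and flags any that are not zeroed. The device path, sample count and the assumption that the erasure method writes zeros are all hypothetical; a real QC tool would also have to account for hidden drive areas and vendor-specific behavior.

```python
import os
import random

def spot_check_zeroed(device="/dev/sdX", samples=1000, sector_size=512):
    """Read a random sample of sectors and report any that are not all zeros.

    Hypothetical sketch: assumes the erasure method writes zeros and that the
    full device is visible to the OS (no hidden HPA/DCO areas). Requires root.
    """
    fd = os.open(device, os.O_RDONLY)
    try:
        # Seeking to the end of a raw block device yields its size in bytes.
        total_sectors = os.lseek(fd, 0, os.SEEK_END) // sector_size
        failures = []
        for lba in random.sample(range(total_sectors), samples):
            os.lseek(fd, lba * sector_size, os.SEEK_SET)
            if os.read(fd, sector_size) != b"\x00" * sector_size:
                failures.append(lba)
        return failures
    finally:
        os.close(fd)

if __name__ == "__main__":
    bad = spot_check_zeroed()
    print("FAIL at sectors %s" % bad if bad else "PASS: sample is all zeros")
```

The point of such a check is its independence: it runs on different hardware, under different software, ideally by a different person than the erasure step it audits.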
Annual external audits performed by certifying authorities often have been regarded as the answer to this problem. The biggest shortfall of this approach is that audits occur only occasionally, while the sanitization process runs day in and day out. In other words, an audit only tells you whether you got it right that one time.
Industry-leading certification bodies have recognized this limitation and have begun developing policies designed to hold media sanitization operations to a more rigorous standard.
Organizations need a way to ensure they are consistently achieving complete data security: They need a QC process.
What does a media sanitization QC process look like?
To answer this question, the potential points of failure within a media sanitization process must be identified so the QC process can adequately address each of them. Those points of failure are:
- the software;
- the hardware; and
- the execution.
Points of failure
Software. Many organizations use reputable, industry-recognized data erasure software. For these organizations, so long as the software is current and a valid support contract is in place, there should be little to no concern about a potential failure of the software.
Other organizations, however, either have developed their own data erasure software (in-house or through a contractor or university lab) or use older tools that are no longer supported or updated. These scenarios call for more frequent auditing and QC.
The broad install base of well-known data erasure software tools is one of their biggest advantages in terms of dependability. The industry’s leading erasure packages have been exposed to far more hardware configurations and storage platforms than homegrown tools have. In addition, these packages often have been independently tested by private and government agencies to achieve accreditations that validate their effectiveness on a sample of media. All of this helps to identify anomalies or irregularities associated with particular scenarios and allows them to be corrected in the ongoing product development cycle. This process is difficult to duplicate when software is subjected only to a limited range of hardware in a handful of operating environments, or when no dedicated development team is maintaining the product.
Changes in technology may outpace the development curve for an in-house tool, while driver limitations, chipset support, drive firmware and a host of other factors could limit the software’s capabilities. In some cases, the result could be a false indication of complete erasure, whereas more extensively used software platforms are prepared for these situations.
Hardware. In addition to the software, there is always a hardware factor. Every data erasure scenario is a little different. The possible combinations of hard drive interfaces, storage platforms, interconnects, chipsets and storage formats (the components between the erasure software and the data on the drives) are virtually endless. This variation introduces uncertainty that increases the need for a QC program.
Consider that any quality manufacturing process, which by its very nature relies on the same hardware and software every day, involves QC. Why would an operation with as many variables as data erasure not also require such process validation?
The fact is the erasure software can only sanitize the storage it sees; it cannot make up for limitations associated with the hardware on which it is hosted. Not all organizations will use industrial data erasure appliances to execute data erasure, especially on PCs. So it is critical that an organization be able to verify that its process is working, on demand, every time there is a concern, such as new drives or unfamiliar hardware.
Execution. “Erased drives go here. Unerased drives go here.”
“So, what are these?”
“Um…”
Execution starts with an ironclad policy and procedure for media sanitization, from the first touch to the last. Any certifying body that deals with data protection will require an organization to have this policy in writing.
The next element is having the right tools to execute the written plan. These tools include software and hardware.
Lastly, trained, competent operators are needed: people who understand the value of the data they are protecting, who are sensitive to the consequences of failing to protect it and who have the technical and organizational skills to implement the process as written, every single time.
In many equipment processing environments where media being sanitized are sold as refurbished storage, operators are normally encouraged to maximize production to fill standing orders. Sacrificing productivity for security is a very unlikely choice for an operator to make, as throughput is the most evident measure of that operator’s value to the organization. This creates a potential disconnect and even conflict between the organization’s priorities and the operators’.
No organization can create a process that is immune to instances of improper implementation, and this is another primary driver for having a QC program. In a process that is “human-centric,” training is paramount. An operator who makes a mistake likely is unaware of it and therefore will repeat it. One step not followed correctly, one option not selected in the software, one setting not changed in the BIOS, and the execution will not be correct. The result may not compromise security every time, but the relationship is clear: The less closely execution is monitored and controlled, the greater the exposure to a potential breach.
QC challenges
Given the three main points of failure against which a QC process needs to be measured, it seems obvious that effective QC requires independence from each of these variables. In other words, we would need different software, different hardware and a different operator to say that we have effectively audited the media sanitization process and validated that it is working properly and as expected. Anything less simply invites a repeat of the same process breakdown, leaving complete assurance of process effectiveness out of reach.
Unfortunately, some very real challenges are associated with creating a media sanitization QC process to account for all of these potential points of failure.
Cost. Relying on third-party audits as the exclusive QC measure is cost-prohibitive when executed against an adequate sampling. At $300 to $500 per drive for such a service, this simply is not scalable. An independent internal process is clearly more desirable.
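To put “adequate sampling” in concrete terms (a standard acceptance-sampling calculation, not a figure from the article): to be C confident of catching a process that fails on a fraction p of drives, roughly n = ln(1 − C) / ln(1 − p) drives must be checked. The short Python sketch below computes this; the 1 percent failure rate and 95 percent confidence level are illustrative assumptions.

```python
import math

def sample_size(confidence: float, failure_rate: float) -> int:
    """Drives to spot-check so P(at least one failure is caught) >= confidence,
    assuming each drive fails independently at the given rate."""
    return math.ceil(math.log(1 - confidence) / math.log(1 - failure_rate))

# Illustrative assumptions: 1 percent failure rate, 95 percent confidence.
print(sample_size(0.95, 0.01))  # -> 299 drives per batch
```

At $300 to $500 per externally audited drive, a 299-drive sample would cost roughly $90,000 to $150,000 per batch, which underscores why an internal process is the only scalable option.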
Our ideal configuration would be a dedicated software program hosted on a dedicated hardware platform (or a series of them) run by a dedicated employee: a virtual duplication of the original data erasure operation. Yet, even though only a percentage of sanitized storage would need to go through QC, having dedicated hardware and software to perform this task could involve substantial cost. Furthermore, analyzing hard drives to validate that data were successfully erased (let alone erased using the intended method, and that other aspects of the data destruction process, such as drive fingerprinting, the clearing of host protected areas and device configuration overlays, G-list tolerances and specific wipe algorithms, were followed as specified) is likely to demand a highly trained storage engineer.
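As one example of why such analysis requires expertise (a hedged sketch, not a tool described in the article): a host protected area can silently shrink the capacity a drive reports to the operating system, so erasure software may never touch the hidden sectors. On a Linux host, the hdparm utility can report HPA status via `hdparm -N`; the fragment below simply shells out to it and scans the output, a deliberate simplification of real-world checking.

```python
import subprocess

def hpa_enabled(device="/dev/sdX"):
    """Crude check for a host protected area via hdparm (Linux only).

    Hypothetical sketch: production QC tooling would query the ATA identify
    data directly rather than scrape command-line output, and would also
    check for a device configuration overlay.
    """
    out = subprocess.run(["hdparm", "-N", device],
                         capture_output=True, text=True, check=True).stdout
    return "HPA is enabled" in out

if __name__ == "__main__":
    print("Hidden HPA present!" if hpa_enabled() else "No HPA detected.")
```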
Implementation. How much space will our QC equipment occupy? How many man-hours will it consume? If we are going to a customer location to perform on-site erasure services, how can we possibly execute this on-site?
In addition to the direct costs of adding a complete QC process to a media sanitization operation, a host of other indirect factors, such as internal space or the portability of the process, may be present.
Process control. Even though the dedicated QC equipment must use different hardware and software than the sanitization equipment, as discussed previously, it still is likely to be composed of an operating system, storage controllers, chipsets, BIOS and other “moving parts.” It is, therefore, no simpler than the original sanitization equipment and is subject to its own points of failure. In other words, auditing one complex system with another, on the dubious assumption that their points of failure will never align simply because the two systems differ, cannot assure that the QC is adequate.
What is needed, then, is a simplified design that reduces the potential for systemic issues and can be rigorously tested and deployed repeatedly.
Every important process requires some level of QC, and even the use of quality tools and solid operating procedures is no excuse to ignore this critical step. In the absence of a cost-effective, easy-to-implement process, organizations will always compromise, even at the expense of information security.
Solutions such as The Validator from Haverhill, Massachusetts-based DestructData, a provider of data destruction solutions for demanding applications, can effectively address each of the potential points of failure in a media sanitization operation by creating an independent testing environment, while eliminating the cost, implementation and process-control hurdles that otherwise would stand in the way.
The data destruction community continues to realize that data security is no less important than it is in the scores of heavily regulated industries that require strict QC measures. At the same time, certification programs, such as the National Association for Information Destruction (NAID), Responsible Recycling Practices (R2), e-Stewards and Asset Disposal and Information Security Alliance (ADISA), and standards creators, such as NIST, are recognizing the importance of data security through new, forward-thinking mandates to their respective industry segments. Solutions that provide an independent testing environment can respond to both the external pressures and the internal concerns of data destruction professionals.
This text was adapted from information prepared by DestructData Vice President of Technology and Sales Michael Cheslock and reprinted with the permission of the company. More information is available at www.destructdata.com.