The editor at Computerworld gave me permission to share my monthly column with you on my blog:
Privacy and security are foundational to health care reform. Patients will trust electronic health care records only if they believe their confidentiality is protected by sound security.
As vice chairman of the federal Healthcare Information Technology Standards Committee, I have been on the front lines in the debate over the standards and implementation guidance needed to support the exchange of health care information. Over the past few months, I've learned a great deal from the committee's privacy and security workgroup. Here are my top five lessons:
1. Security is not just about using the right standards or purchasing products that implement those standards. It's also about the infrastructure on which those products run and the policies that define how they'll be used. A great software system that supports role-based security is not so useful if everyone is assigned the same role and its accompanying access permissions (see the sketch after this list). Similarly, running great software on an open wireless network could compromise privacy.
2. Security is a process, not a product. Hackers are innovative, and security practices need to be constantly enhanced to protect confidentiality. Security is also a balance between ease of use and absolute protection. The most secure library in the world -- and the most useless -- would be one that never loaned out any books.
3. Security is an end-to-end process. The health care ecosystem is only as secure as its weakest link. Thus, each application, workstation, network and server within an enterprise must be secured to a reasonable extent. The exchange of health care information between enterprises cannot be secured if the enterprises themselves are not secure.
4. The U.S. does not have a single, unified health care privacy policy -- it has 50 of them. That means that products need to support multiple policies -- for example, those of a clinic that uses simple username/password authentication and those of a government agency that requires smart cards, biometrics or hardware tokens.
5. Security is a function of budget. Health care providers' budgets vary widely. New security requirements must take into account the implementation pace that the various stakeholders can afford. Imposing "nuclear secrets" security technology on a small doctor's office is not feasible. Thus, the privacy and security workgroup has developed a matrix of required minimum security standards to be implemented in 2011, 2013 and 2015, recognizing that some users will go beyond these minimums.
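To make the first lesson concrete, here is a minimal, purely illustrative role-based access check in Python. The roles, permissions and user names are hypothetical, and real EHR systems enforce far richer policies; the point is simply that the mechanism loses its value when everyone is assigned the same role.

```python
# Hypothetical illustration of lesson 1: role-based access control only helps
# if roles are actually differentiated. Role and permission names are made up.

ROLE_PERMISSIONS = {
    "physician":  {"read_chart", "write_orders"},
    "front_desk": {"read_demographics"},
    "billing":    {"read_demographics", "read_billing"},
}

def can_access(role: str, permission: str) -> bool:
    """Return True if the given role is granted the given permission."""
    return permission in ROLE_PERMISSIONS.get(role, set())

# Properly differentiated roles: the front desk cannot read clinical notes.
assert can_access("physician", "read_chart")
assert not can_access("front_desk", "read_chart")

# If an organization assigns everyone the "physician" role, the mechanism
# still "works" technically, but the policy it was meant to enforce is gone.
everyone = ["alice", "bob", "carol"]
lax_assignment = {user: "physician" for user in everyone}
assert all(can_access(lax_assignment[u], "read_chart") for u in everyone)
```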
In debating how to enhance security for all stakeholders without creating a heavy implementation burden, the workgroup has come up with these ideas:
All data moving between organizations must be encrypted over the wire. Data moving within an organization's own network should also be encrypted if open wireless networks could expose it in transit. There is no need to encrypt the data twice, though -- if an organization implements appropriately secure wireless protocols such as WPA Enterprise, the data can be sent within the organization without an additional layer of encryption.
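As a rough illustration of the first idea, the sketch below sends a record to another organization over TLS using only the Python standard library. The host name, path and payload are hypothetical, and a real exchange would layer on authentication, auditing and content standards; inside the organization, link-level protection such as WPA Enterprise would cover the wireless segment instead.

```python
# A minimal sketch of "encrypt over the wire" between organizations using TLS.
# The endpoint and payload are hypothetical; real health information exchange
# involves much more than a bare HTTPS POST.
import json
import ssl
import http.client

def send_between_organizations(host: str, path: str, record: dict) -> int:
    """POST a record to another organization over TLS with certificate verification."""
    context = ssl.create_default_context()  # verifies the server's certificate
    conn = http.client.HTTPSConnection(host, context=context)
    try:
        conn.request(
            "POST", path,
            body=json.dumps(record),
            headers={"Content-Type": "application/json"},
        )
        return conn.getresponse().status
    finally:
        conn.close()

# Hypothetical usage; the hostname and record are illustrative only.
# status = send_between_organizations("hie.example.org", "/records", {"id": "123"})
```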
All data at rest on mobile devices must be encrypted. Encrypting all databases and storage systems within an organization's data center would create a burden. But ensuring that devices such as laptops and USB drives, which can be stolen, encrypt patient-identified data makes sense and is part of new regulations such as Massachusetts' data protection law.
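And a minimal sketch of the second idea: encrypting a patient-identified file before it is copied to a removable drive. In practice, organizations typically rely on full-disk encryption on laptops and hardware-encrypted USB drives rather than per-file application code; the file names and key handling here are purely illustrative, and the example assumes the third-party Python cryptography package.

```python
# Illustrative only: encrypt a patient-identified file before it leaves the
# data center on a removable drive. File paths are hypothetical.
# Requires the third-party "cryptography" package (pip install cryptography).
from cryptography.fernet import Fernet

def encrypt_file(src_path: str, dst_path: str, key: bytes) -> None:
    """Write an encrypted copy of src_path to dst_path using authenticated symmetric encryption."""
    fernet = Fernet(key)
    with open(src_path, "rb") as src:
        ciphertext = fernet.encrypt(src.read())
    with open(dst_path, "wb") as dst:
        dst.write(ciphertext)

# The key must live apart from the device that carries the data -- for example
# in the organization's key-management system, never on the USB drive itself.
key = Fernet.generate_key()
# encrypt_file("discharge_summary.txt", "/media/usb/discharge_summary.enc", key)
```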
Such proposals strike a delicate balance: attaining the goal of care coordination through the exchange of health information depends on robust security technology, infrastructure and best practices, but it cannot succeed if the measures that safeguard patients' privacy become unduly cumbersome.