When I purchased a car a few months ago, I found that the asking price involved more than money. I had to provide information on my annual income, my net worth and my debts, even though I wanted to pay for the automobile with cash, rather than a loan. Concerned that the information might be used for marketing purposes, or shared with other companies without my knowledge, I held my ground. I refused to provide information that was not legally required for registering the car with the state. We did get the deal closed, but it was not an easy process. The salesman and the dealer's systems seemed to assume the dealership had a right to this information.
I'm hardly alone in this. Every day, millions of people are asked for personal data and many worry about how it will be used. Today, as companies invest in CRM, data warehouses and analytic software, and consumers grow more concerned about privacy and identity theft, a fundamental question is coming to the fore: Who owns all that personal data? Is it the consumer, patient or taxpayer who provides the data, or the company that stores and uses it?
This is an issue CIOs can't avoid. Today, security is job No. 1, and it's the CIO's responsibility not only to protect customers' and employees' personal data, but also to make sure it's accurate, available and usable. After all, this information is one of the most important assets companies possess. But is "possess" the right word? Are companies stewards of the data they hold about people, or are they owners? Who gets to decide what kind of information about a customer can be retained, how detailed it can be, and how far back in time it can go? Who gets to decide which people, companies and departments have access to the data? We must come up with much clearer answers to these questions, or we'll be endlessly arguing with angry customers, at least until Congress sets the rules for us.
To me, the solution is very simple. People, be they customers, employees, taxpayers or patients, should be allowed to make these critical decisions about their personal data, and companies should abide by their choices. This is both a practical matter and a matter of rights.
It's a matter of rights, because personal data is a form of property. I am not a lawyer, but a strong case can be made that people own the personal data about them in much the same way they own any other form of property. When companies obtain data about me, say, my address, Social Security number or credit card number, it is because I gave it to them. How I gave it to them, whether I told someone this information in a phone conversation, wrote it down on a form, or ran a card with a magnetic strip through a card reader, doesn't matter. In every case, I provided this information for a particular use and not for some other, and it is my right to decide when, where and how else it may be used, and with whom else it may be shared. A one-time use of my information shouldn't be construed as permission to use it in perpetuity, unless I grant it.
Even if the courts were to disagree with me, there's another reason companies should let people make these decisions: It's good for business. After all, people are in the best position to make sure the data is accurate; giving them a chance to do so will only improve data integrity. And if a company obtains permission to use information in a certain way, then it cannot be accused of misusing that information when it does. That precludes legal problems down the road. (There are exceptions, of course; young children, for example, are not in a position to verify information or grant permission for its use. But parents or legal guardians can act as surrogates in such cases.)
CIOs and other executives may worry that customers and patients will prevent them from making any use of the information at all. But I think most people would be willing to grant permission to use their data if they feel it would provide a benefit: Consider how many supermarket customers have signed up for loyalty programs that enable stores to track their purchases in return for discounts and coupons. In fact, CIOs, in their concern for security, may overlook the privacy trade-offs people are willing to make.
Last fall, at the Society for Information Management's annual meeting in New York City, Harvard Business School professor F. Warren McFarlan observed that sometimes people want to make it easy, not difficult, for strangers to obtain their personal information. If he were to have a health emergency when he's traveling, McFarlan said, he'd want to make it as easy as possible for a doctor to obtain information about his medical history.
In fact, there's a not-for-profit healthcare organization based in Long Beach, Calif., that's doing exactly that. MemorialCare Medical Centers, which operates five hospitals in the Los Angeles area, has a six-year-old program that provides people with free medical information cards that can be used at hospitals and doctors' offices.
These cards have a magnetic strip embedded with personal information, including emergency contact information, current medications, immunizations and past medical procedures. The choice of what information to include is left up to the individual. So far, 650,000 people have signed up for the program, including many who were not MemorialCare patients when they joined. The medical benefits in an emergency are obvious, but to MemorialCare officials, the cards also serve a marketing purpose. They help MemorialCare compete with other Los Angeles healthcare providers by bringing prospective new patients to their hospitals before they suffer a medical emergency or need care. And the convenience of the card makes it easier for people and their doctors to use MemorialCare's services.
Of course, while some people may want their medical information to be an open book, others may want to limit access to it. But that's the point: Two people may legitimately make very different and even opposite decisions about access to their personal information. That's why protecting the right of the individual who owns the data to make that choice must be a core principle for any organization and its IT function.
The real issue then for companies isn't whether their customers have that right, but how to administer it. Companies must think through their policies on who can get access to the personal information they store, craft procedures for checking its accuracy and making decisions about its use, and establish or integrate the back-end systems needed to execute these policies. For starters, customers should not be charged for access to their information, and companies should adopt an opt-in approach to the use of personal data. (In fact, according to CIO Insight's current survey on security and privacy on page 77, nearly two-thirds of IT executives support an opt-in policy.) I also believe that when people don't bother to check the accuracy of their personal data, or indicate how it is to be used, a set of default rules must be developed.
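To make the administrative model above concrete, here is a minimal sketch in Python of a per-customer consent record that implements the opt-in approach and the default rules the paragraph describes. The class and the category names are hypothetical, invented purely for illustration; they do not describe any vendor's actual system.

```python
from dataclasses import dataclass, field
from typing import Dict

# Hypothetical categories of use a company might request permission for.
USES = ("service_delivery", "marketing", "third_party_sharing")

@dataclass
class ConsentRecord:
    """Per-customer consent, opt-in by design: every use starts denied."""
    customer_id: str
    grants: Dict[str, bool] = field(
        default_factory=lambda: {use: False for use in USES}
    )

    def grant(self, use: str) -> None:
        """Record an explicit opt-in from the customer."""
        if use not in self.grants:
            raise ValueError(f"unknown use: {use}")
        self.grants[use] = True

    def revoke(self, use: str) -> None:
        """The customer can withdraw permission at any time."""
        if use not in self.grants:
            raise ValueError(f"unknown use: {use}")
        self.grants[use] = False

    def permits(self, use: str) -> bool:
        # Default rule: any use not explicitly granted is denied,
        # covering customers who never indicate a preference.
        return self.grants.get(use, False)

record = ConsentRecord("cust-1001")
record.grant("service_delivery")  # the use the data was provided for
print(record.permits("service_delivery"))  # True: explicitly opted in
print(record.permits("marketing"))         # False: never opted in
```

The design choice worth noting is that silence denies: a customer who never reviews the record has opted in to nothing, which matches the opt-in policy the survey respondents favor.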
I don't pretend to have all the answers, except that none of this can be done overnight. It will take time and energy to establish policies and principles, and a good deal of time and education to implement them. A sound first step would be the creation of a joint industry-university council that reviews the entire issue and makes recommendations. But companies should start soon, before legislators pass piecemeal legislation that deals with these complex issues one by one, rather than as a whole.
Darwin John has held CIO-level positions at the Federal Bureau of Investigation, The Church of Jesus Christ of Latter-day Saints and the Scott Paper Co. He is currently an adviser to Blackwell Consulting Services in Chicago, and was formerly a special adviser to the director of the FBI.