The CIO's code of ethical data management

All it took was one email from an irate customer for Saab Cars USA CIO Jerry Rode to realize he had a developing public relations fiasco on his hands. In 1999, Norcross, Ga.-based Saab hired four Internet marketing companies to send customers information about Saab's new models. And although the auto company had specified that the program be opt-in (meaning it would email only the people who had agreed to receive such mail), one of the marketing companies apparently had a different definition of opt-in. And that meant one ticked-off customer, with more potentially on the way.

Rode fired the errant marketing company and immediately developed a formal policy surrounding the use of customer data. "The customer doesn't see ad agencies and contracted marketing firms. They see Saab USA spamming them," he says. "Finger-pointing after the fact won't make your customers feel better."

As Rode learned to his chagrin, data itself doesn't care how it's used. It will not stop itself from spamming customers or sharing personal, identifying details with third parties. It cannot decide to delete or preserve itself. Therefore, it falls on the shoulders of those who lord over ever larger terabytes of data, the CIOs, to develop ethical guidelines on how to manage it.

But most CIOs have not picked up the gauntlet. There are any number of reasons why. For one, the stuff is hard. The issue traps CIOs between two rather loathsome roles in the company: the bad guy or the fall guy. Either the CIO stops marketers from using data for purposes some customers are not comfortable with, thereby damming potential revenue streams. Or the CIO imparts no ethical code and is left to explain why the company has allowed a customer's data to be exploited. Similarly, the CIO can tell the CEO what he can and cannot do with the company's data, thereby putting his own job in jeopardy. Or he can go along and run the risk of finding himself culpable for some misuse of data.

No matter what the situation, ad hoc decision making on data ethics isn't viable anymore. CIOs must start a meaningful conversation about ethics at their company. Hard money is at stake here. Companies such as DoubleClick and Qwest Communications, which were both forced to pull back from controversial plans to share customer data with other companies, have seen their stock prices plummet. Soft money is at stake too. Brands such as Saab Cars USA stand to suffer incalculable losses when customers feel violated. Data ethics is not just a moral issue; it's the intersection of business and morality. And if the CIO doesn't take the lead, then someone else, like the customer or the government, will.

The goal here is to explain why CIOs need a code of ethics when it comes to data management, and then to propose such a code.

Rode, for one, is all for it. "I don't see ducking the problem and blaming it on marketing or saying something slipped by as the answer," he says. "I see this as the CIO's responsibility. We need to take ownership of the problem and develop some rules."

Function Creep

Everyone wants Scott Thompson's data. Thompson is executive vice president of Foster City, Calif.-based Inovant, the company Visa set up to handle its technology. Thompson's job includes the responsibilities of Visa CIO, the job from which he was promoted last year. At Inovant, he stands atop 35 billion credit and debit card transactions per year.

Sales and marketing would love to drill into Thompson's databases. They'd love to refine the data into loyalty programs, target marketing or partnerships between Visa and retailers. "There are lots of creative people coming up with these ideas," Thompson says. "This whole area of data sharing is enormous and growing. For the marketers, the sky's the limit."

But Thompson isn't playing ball. He is determined to prevent an environmental disaster: the pollution of the Visa brand through the violation of its customers' privacy. To share the data on what people spend their money on, in what stores, at what time of day (all of which, and more, resides in Visa's databases) would violate Visa's own rules for acceptable use of customer data.

Visa, Thompson says, errs on the side of caution with its credit card data policy, devised by Thompson along with staffers who specialize in privacy. He bars any use of it outside of its intended purpose, which is mainly billing. Keeping the data off-limits, he figures, keeps customers from choosing MasterCard instead.

This is not to say Thompson's not tempted to share the data. He is, every day. "Every single day of the week, someone outside our organization comes to us and says, 'If only we could use your data to do this or profile that, it would be a license to print a zillion dollars for Visa,'" he says. "I've heard a half-dozen ideas for uses of this data where I thought, That's pretty clever, I know that would work. But as I've said, we made this decision. We're not going to do it."

However, a larger question remains: Can Thompson guarantee, as he did at one point during an interview, that some unethical use of data won't crop up at Inovant or Visa? Many experts, and lawyers, think he cannot. After all, in a large majority of cases, the unethical use of data happens not through the malicious scheming of a rogue marketeer but rather through inattention.

Here's how: Customer data is collected and stored for some purpose, such as record keeping or billing. A sales or marketing professional figures out another way to use it. Or a partner company may ask to share it. Or another company could be willing to pay for it, pure profit that would be tempting for any company.

This is known as function creep, and it's a serious concern for ethicists. The classic example of function creep is the Social Security number, which started simply as a way to identify government retirement benefits and is now used as a sort of universal personal ID, found on everything from driver's licenses to savings accounts. But the problem is exacerbated by technology that makes it incredibly easy to repurpose data.

N2H2, a company that makes Web-filtering software, fell victim to function creep. N2H2 software is popular in secondary schools for filtering inappropriate content. To do this, the software collects information about every website a student visits.

With all that data, a marketer could learn much about the students' surfing habits, and in 2000 N2H2 offered to sell its data (in the aggregate) for $10,000 a month, under the brand Class Clicks, through marketing company Roper Starch. N2H2 had two customers: a Web portal that focused on education and the Department of Defense, which wanted to use the data for recruiting programs. The plan was exposed to the public after only a month, in part by the DoD, which got cold feet about the program and notified privacy advocates. After an ensuing wildfire of publicity, N2H2 decided to stop offering the Class Clicks service.

It sounds almost quaint to say that data should be used only for the purpose for which it was collected, and nothing else, but that is conventional ethical wisdom, according to Chris Hoofnagle, legislative counsel at the Electronic Privacy Information Center based in Washington, D.C. "Somehow, technology has led to this rogue theory that acquiring data about people gives you the right to own that data, that mere collection translates to ownership," Hoofnagle says. "But there's no theory of property that says that's OK." Hoofnagle blames the new mind-set on the ease with which data is now collected and stored, via database queries and massive CRM data warehousing systems.

The only current limits to function creep seem to be the limits of marketing and sales' collective imagination. "Marketing organizations can rationalize their way to the greater good to a point where it could blind anyone, especially a CIO sitting on the fence on ethics," says Jack Cranmer, CIO of Arizona-based Mayo Clinic Scottsdale, one of three group practices that make up the Mayo Foundation. In Cranmer's case, he's handling sensitive patient information that researchers want to use for medical studies. "But if you're targeting customers based on data you've collected for some other use, that's where you should start thinking about ethics," he says.

Often, by the time the CIO finds out about the function creep, or by the time it gets to corporate counsel, the marketing group is disseminating the data already, says Rich Honen, an attorney at Albany, N.Y.-based Honen and Wood, which advises corporate clients on data ethics. But Honen maintains that it's the CIO's responsibility to understand and track the flow of information and maintain an ongoing discussion with marketing and legal executives about the ethical use of the data.

Joel Reidenberg, professor of law at Fordham University in New York City and expert on data privacy, adds, "Look, there isn't a single company today that doesn't know where its money is. Why aren't companies paying attention to the flow of information the way they do money?"

Whose Data Is It Anyway?

Last year, Reidenberg received an offer for cell phone service from AT&T Wireless. The offer revealed that AT&T Wireless had used Equifax, a credit reporting agency, to identify him as a likely customer. "We used information we obtained from a consumer reporting agency," the disclaimer read in part. This disclosure is required by the Fair Credit Reporting Act (FCRA). The FCRA also requires Equifax to disclose that it has sold a credit report and to whom, if a consumer asks. Reidenberg asked. Equifax told him it had sold his credit report to AT&T Wireless.

It was good business. Equifax made money selling data it already owned; AT&T Wireless could hone its target market to get a better response to its marketing campaign.

The bad part was that, by law, credit information can't be used to sell anything. Reidenberg cites the FCRA, which forbids such repurposing in every case except when the data is used for "a firm offer of credit or insurance." In other words, the only product you can sell based on credit data is credit.

A spokesman for Equifax says that as long as AT&T Wireless (or any company for that matter) is offering the cell phone service on a credit basis, such as allowing the use of the service before the consumer has to pay, it is in compliance with the FCRA. Since cases around unfair marketing based on credit data often languish in court, many companies may take a calculated risk that they won't run into too much legal trouble for using credit data in their targeted offers.

Reidenberg makes a distinction between what is legal and what is ethical. "The average consumer would have no idea AT&T Wireless was violating the law," he says. "But even if, and this is highly unlikely, even if the courts decided after five years that this was a legitimate use of customer data, the customers will be outraged."

In the end, customers decide whether a company has acted ethically, and very often a consumer's notion of the rules shares little in common with what's allowed under law.

Qwest Communications, a telecom company based in Denver, is learning this the hard way. Earlier this year, Qwest planned on sharing internal customer proprietary network information (CPNI) data about its 12 million customers with other companies. That means your phone bill. Whom you call. How long you talk. How much you pay. What services you select. How often you use directory assistance. Like Visa's credit card transaction data, CPNI data is a target marketer's dream. If marketers knew you had relatives in Wisconsin and that you called them on Sundays, they could tailor a long-distance service expressly for you. If they sold the data to a travel agency, you'd receive solicitations for flights to Milwaukee.

Unlike the AT&T Wireless case, using this data seems to be legal. But customers didn't care about that. When they discovered Qwest's plan, embedded in fine print in a Qwest brochure, they protested via the media, saying it was an unethical violation of their privacy. Qwest offered an opt-out service whereby consumers could call or go online and request that Qwest exclude their personal CPNI data. The process proved so cumbersome that some privacy advocates accused Qwest of making it difficult on purpose, to keep the number of customers who opted out to a minimum.

Finally, under unyielding customer pressure, Qwest dropped the idea, for now. Pending the Federal Communications Commission's review of the rules around CPNI data later this year, Qwest has said it may yet use CPNI data as a marketing tool.

After several requests from CIO to speak with Qwest's CIO and chief privacy officer, a Qwest spokeswoman declined to comment, and questions sent via email went unanswered.

Qwest could have avoided the flood of bad publicity by making the CPNI marketing program opt-in from the beginning. According to privacy advocates, opt-in is the appropriate way to handle customer data. Personal data is untouchable until you, the consumer, give us permission to use it after we, the company, tell you precisely what we want to use it for.
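In practice, that principle boils down to a check that runs before any marketing use of a record. The sketch below, a minimal illustration in Python rather than any company's actual system, assumes hypothetical fields such as marketing_consent and consent_purpose and filters a customer list down to the people who opted in for one specific, stated purpose.

    # A minimal sketch of opt-in enforcement before a marketing mailing.
    # Field names (email, marketing_consent, consent_purpose) are hypothetical.
    from dataclasses import dataclass

    @dataclass
    class Customer:
        email: str
        marketing_consent: bool  # did the customer explicitly opt in?
        consent_purpose: str     # what the customer agreed the data may be used for

    def eligible_recipients(customers, purpose):
        """Return only the customers who opted in for this specific purpose."""
        return [c for c in customers
                if c.marketing_consent and c.consent_purpose == purpose]

    customers = [
        Customer("a@example.com", True, "new-model announcements"),
        Customer("b@example.com", False, ""),
        Customer("c@example.com", True, "service reminders"),
    ]

    # Only a@example.com qualifies: opted in, and for this exact purpose.
    for customer in eligible_recipients(customers, "new-model announcements"):
        print("send to", customer.email)

The purpose field matters as much as the consent flag; it is the same point the ethicists make about function creep, that consent given for one use does not carry over to another.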

The problem is, marketers will do anything to avoid opt-in marketing. Statistics suggest that when personal data marketing becomes opt-in, more than nine out of 10 consumers will decline to join. If only 10 percent of the target market chooses to participate, the marketers are left with data that doesn't tell them very much.

The Next Branding Trend

When CIOs do take a lead on data ethics, the results are positive. Gene Elias, CIO of Quiksilver, a surfwear clothing company based in Huntington Beach, Calif., that targets 9- to 15-year-old children, has taken this thinking to the extreme. Elias has prohibited sales and marketing from using any of the customer data he possesses, which amounts to personal information collected when people join mailing lists or become members at the Quiksilver website. He says the company doesn't retain credit card numbers after transactions in Quiksilver stores, and all of his data ethics guidelines pass muster worldwide.

Clearly such limits set up an adversarial relationship between marketing, which stands to benefit greatly from collecting and repurposing data, and the CIO, who stands to lose his reputation over a privacy flap. "So far, knock on wood, marketing understands when I tell them there are pitfalls in doing certain things with data," says Saab's Rode. "But it's also helped to offer alternatives, instead of just being the 'no' man."

One alternative suggested by Rode that has paid off is an online service called Saab-i. Rode makes sure it's the ultimate opt-in: absolutely no use of data without the express consent of the consumer. In exchange for agreeing to be marketed to and have their data used in aggregate form, consumers get early notification of Saab promotions and can get their car questions answered. Rode says membership has doubled every year for three years.

Both Rode and Elias believe the next branding trend in the United States will be trust. Marketing opportunities lost will be made up through customer loyalty. Consumers will choose between vendors based on their policies around how ethically the company treats personal information, among other privacy litmus tests. And CIOs, they say, can play a crucial role in promoting this trend.

While Rode has developed formal rules on how to manage customer data, Elias admits that his are mostly informal. "If someone new came into marketing and I felt like they didn't understand how I guard this data, I might change my tune, make the rules a little more formal because the relationship is not there," he says.

A Hippocratic Oath

It might work informally at Quiksilver, but as Mayo Clinic's Cranmer notes, "not everyone has a strong sense of ethics." As a result, some CIOs say there needs to be a formal code of ethics or principles that lay out the CIO's moral obligations when it comes to data. Such a doctrine is meant to guide CIOs in the way that the Hippocratic oath guides doctors. The six commandments we suggest below address not only the issue of customer data but also data retention and deletion, and the role that the CIO should play in communicating to the rest of the organization why such guidelines are crucial.

"If we have guidelines, it will help the CIO know where the line is," says Cranmer. "It would also allow CIOs to hold up a document that says, 'This is what I ascribe to.'"

Got any ethical conundrums you'd like to share? Let Senior Writer Scott Berinato know at sberinato@cio.com.

Is it up to the CIO to start the conversation about the ethical uses of data? WEIGH IN with your opinion. To find the comment page, go to the Web Connections box at www.cio.com.

The Six Commandments of Ethical Data Management

Electronic data about customers, partners and employees has become corporate America's most valuable asset. But the line between the proper and improper use of this asset is at best blurry. Should an employer be able to search employee files without employee consent? Should a company be able to sell customer data without informing the customer of its intent? What is a responsible approach to document deletion?

The law provides guidelines in many of those areas, but how a company chooses to act within the confines of the law is up to its officers. Since CIOs are responsible for the technology that collects, maintains and destroys corporate data, they sit smack in the middle of this ethical quagmire. Or they ought to. In an effort to provide guidelines for CIOs thinking hard about ethical data management (and to nudge those who aren't), we have developed, with the help of more than 100 CIOs, principles for ethical data management.

Here's how we did it: We asked members of the CIO Best Practice Exchange, our members-only online IT executive forum, to generate and then debate a set of principles for the ethical management of data. From this online discussion and follow-up telephone interviews, we drew up a set of seven principles to guide CIOs through the murky territory of data collection, manipulation and destruction.

Next, we put those seven principles back into the Exchange for a vote. The six survivors (those principles that received more than 50 percent of member votes) are listed below.

It has been proposed, and accepted, that...

1. Data is a valuable corporate asset and should be managed as such, like cash, facilities or any other corporate asset. Members gave unanimous support to this principle. The philosophy here is simple: The better you manage your corporate data, the more valuable your corporate asset. Poor management of that data is like throwing away money.

2. The CIO is steward of corporate data and is responsible for managing it over its life cycle from its generation to its appropriate destruction.

While all voting members agree that data is an asset, only 72 percent want to be responsible for the health of that asset. This then raises the question: If not the CIO, then who?

3. The CIO is responsible for controlling access to and use of data, as determined by governmental regulation and corporate policy.

According to 73 percent of our voters, marketing, HR or anyone else who wants a piece of the corporate jewels must go through their gatekeeper, the CIO.

4. The CIO is responsible for preventing the inappropriate destruction of data. Where were the CIOs of Enron and Arthur Andersen during their massive data destruction campaigns? Most companies, on the advice of corporate counsel, destroy data on a regular basis. But when the goal is to circumvent those policies and eliminate incriminating evidence, it falls on the CIO's shoulders, according to 69 percent of voters, to keep that data safe.

5. The CIO is responsible for bringing technological knowledge to the development of data management practices and policies.

Top executives cannot develop an effective data management policy without knowing the full range of technical possibilities for slicing, dicing, collecting and trashing data. And it is the CIO who owns that knowledge and must share it with the other members of the executive committee, according to all but one of the voters.

6. The CIO should partner with executive peers to develop and execute the organization's data management policies. This statement received 100 percent voter support. It goes both ways: A company that creates data management policies without the input of its steward will wind up with a toothless policy, as will the CIO who rules over data with an iron fist.

It has been proposed, and rejected, that...

The CIO is responsible for maintaining the accuracy and integrity of data.

Fifty-two percent of our members voted this statement off the island. Why? Garbage in, garbage out, members told us. CIOs can build systems that force users to conform to format, but they can't do much about users who enter inaccurate information.

Do you accept THE CIO'S SIX COMMANDMENTS? Post your vote and comments at www.cio.com/readerpoll. What guidelines for ethical data management have we missed? E-mail Best Practice Exchange Director Martha Heller at mheller@cio.com.

To Preserve or Not to Preserve, That Is the Question

Any CIO who does not have a corporatewide policy in place governing the retention and deletion of data is in big trouble

In general, the CIO's ethic is to preserve data at all cost. In the post-Enron era, that ethos is running smack into a more powerful corporate urge to get rid of any compromising data.

"I think, because of Enron, you'll see executives more focused on making sure there's nothing that will make us look bad," says Avram Kornberg, CTO of Oppenheimer Funds in New York City.

So what's a good CIO to do?

For starters, you should initiate the development of a clear-cut corporate policy that spells out when data should be preserved and when it is appropriate to delete it, if only to save storage space for future data. And once that policy is in place, your responsibility as CIO is to raise objections if someone in the company asks you to delete data in a way that does not conform to company policy or to the law.
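As a rough illustration of what such a policy can look like once it is written down, here is a minimal sketch in Python; the record categories, retention periods and legal holds are invented for the example, not recommendations. It checks a deletion request against the policy before anything is actually destroyed.

    # A minimal sketch of checking a deletion request against a written
    # retention policy. Categories, periods and holds are hypothetical.
    from datetime import date, timedelta

    RETENTION_POLICY = {  # minimum time each category of record must be kept
        "email": timedelta(days=365),
        "customer_profile": timedelta(days=3 * 365),
        "financial_record": timedelta(days=7 * 365),
    }

    LEGAL_HOLDS = {"financial_record"}  # categories that must not be deleted right now

    def deletion_allowed(category, created_on, today=None):
        """Return True only if policy and legal holds permit deleting the record."""
        today = today or date.today()
        if category in LEGAL_HOLDS:
            return False
        minimum = RETENTION_POLICY.get(category)
        if minimum is None:
            return False  # unknown category: escalate rather than delete
        return today - created_on >= minimum

    print(deletion_allowed("email", date(2000, 1, 1)))             # True
    print(deletion_allowed("financial_record", date(1990, 1, 1)))  # False: legal hold

The value is not in the code but in the discipline it encodes: a request that fails the check goes back to counsel, as the policy prescribes, rather than quietly running overnight.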

"What happens is a CIO might get an order to delete something and he'll nod and say that's fine, thinking nothing of the request, but then the obscure machinery of fate exposes this and people think you've done something unethical," says Colin Potts, a privacy and technology expert who specializes in ethical issues as associate professor of computing at Georgia Tech. "It is not enough to have good intentions in IT, to think as long as you're not stealing or lying you're ethical. You have to look at it like engineering; you have to think about the technical consequences of what you're doing."

This means CIOs should save data, not delete it, unless there's an extremely sound reason to delete it. Even then, don't assume that data's deleted in the true sense of the word. Technology forensics experts can recover quite a bit of deleted data.

"The CIO will always be somewhat on the line for this," argues Thomas Bodenberg, a senior research associate at The Conference Board in New York City. "He has the tools."
