By the end of this year, the world's population is expected to hit seven billion. That's a huge number, but it pales in comparison to the 60 billion to 100 billion photos Facebook has reportedly stored on its servers. I bring this up, of course, because Facebook users are clicking their little fingers off tagging those billions of photos, and Facebook is happily adding those tags to its enormous databases of personal information.
Facebook launched a facial recognition system for a small number of users last December, but earlier this month it became a feature of everyone's Facebook account, at least in the U.S. What it is supposed to do sounds innocuous: suggest a tag for uploaded photos based on matches of other photos determined by the software. But it isn't innocuous. It's just plain creepy.
I'm not going off the deep end with scare stories about facial recognition that go something like this: A frat guy sees a woman in a bar that he thinks is hot, snaps her photo with his cell phone camera, and before you know it, he knows who she is, where she lives and anything else that she may have shared on Facebook -- or that her friends may have shared about her.
It doesn't work that way. Nor will you be able to go online, click on a random photo and find out the identity of everyone in it.
So what's the problem? Trust. Facebook hasn't earned it. There's enormous potential for misuse of facial recognition information, and Facebook has a long record of misusing all sorts of data.
Privacy abuse pays
The social networking giant has fooled us over and over again, blithely exposing users' private information to any advertiser who happens to get interested. It's a tired drama: Facebook messes up, it gets caught, the media freak out, Facebook apologizes.
Then the cycle starts all over, as it did last year when the Wall Street Journal learned that it's not just Facebook that's harvesting personal data but Facebook's platform developers as well. That data, some of which made it possible to identify specific users, was being shared with advertisers and Internet tracking companies, whether those users had opted for privacy or not.
Why would Facebook do such a thing? In a word, money. There's a wonderfully symbiotic relationship between Facebook and the major app developers. Apps make the Facebook service much more attractive; indeed the proliferation of cool add-ons propelled Facebook past also-rans like MySpace. And without Facebook, the developers are in Palookaville. Everybody has an incentive to just get along and keep on raking in the bucks.
And those bucks are very big indeed. Facebook is privately held, but it is widely believed to have posted revenue of about $500 million in 2009, and significantly more in 2010. A big chunk of that, maybe as much as $50 million, came from the sale of virtual goods used with various applications. A report in Advertising Age estimated that the aggregate Facebook-related revenue of third-party developers was actually larger than Facebook's own. Facebook has to be thinking of a way to cash in on that, perhaps through a revenue-sharing arrangement for the sale of virtual goods.
And of course, there's the never-ending issue of Facebook's privacy settings. Time and again we learn that some chunk of data is shared by default, which is to say, you've opted in unless you've explicitly opted out. That's exactly what's happening with facial recognition. Facebook has automatically opted you in, which means your friends will see suggestions of photos in which to tag you, unless you change the setting.
Given its record, what are the odds that Facebook will say no to a lucrative deal that monetizes that store of carefully tagged photos? Even if Facebook were scrupulous about user privacy, that data store would be a very tempting target for hackers, given how fragile security at even major financial institutions appears to be these days.
Then there's the issue of law enforcement. It is mighty easy for the feds and even local cops to get their hands on all sorts of records you might have thought were private or impossible to find. Federal officials, for example, have been grabbing location data harvested from cell phone towers for some time without getting an OK from a judge.
As I said, Facebook isn't making it possible to identify a random person by searching a database for a match to his or her appearance. But that doesn't mean it can't be done. Indeed, London has thousands of surveillance cameras scattered about the city, and police have already found ways to match faces in those crowds to identities.
Even assuming that our officials wouldn't do anything reprehensible with that data, what about foreign governments? Remember, Facebook has tens of millions of users outside the U.S. Wouldn't Hosni Mubarak's thugs have loved to plug their surveillance photos into a database and get the names of demonstrators in Cairo's Tahrir Square?
I don't think I'm being unfair to Facebook, or acting paranoid. If I am, I have plenty of company.
The Electronic Privacy Information Center filed a complaint with the Federal Trade Commission, calling Facebook's actions unfair and deceptive: "There is every reason to believe that unless the Commission acts promptly, Facebook will routinely automate facial identification and eliminate any pretence of user control over the use of their own images for online identification."
The attorney general of Connecticut has expressed concern that Facebook's facial recognition compromises consumer privacy. They are not very happy about it in Europe either, where the European Union has promised to investigate the matter.
Last week was my birthday (hold the flowers) and I was pleased when so many of my Facebook friends posted greetings on my wall. But was that good feeling really worth the risk posed by Facebook's feckless behavior? I'm not so sure.