Google’s Holiday Gift To China

There are only three shopping days remaining until Christmas Eve! Have you purchased and wrapped all of the presents on your gift list?

Some of us, of course, confront more difficult challenges than others in choosing appropriate gifts for recipients. But imagine how tough it must be to select a gift for the world’s largest communist nation!

In a sense, that’s exactly what Google may have delivered to the government of China. On December 13th, the internet services giant announced that it will open a center for basic artificial intelligence research in Beijing.

So why is this a gift? Because Google’s services, like Facebook’s, are banned in China. And on December 18th, just five days after Google’s announcement, a Chinese official confirmed the ban by declaring:

That’s a question maybe in many people’s minds, why Google, why Facebook are not yet working and operating in China. If they want to come back, we welcome (them). The condition is that they have to abide by Chinese law and regulations. That is the bottom line. And also that they would not do any harm to Chinese national security and national consumers’ interests.

It’s possible, of course, that Google’s decision will help it gain access to the Chinese market in 2018. If that occurs, its AI Center may be perceived in retrospect as a profitable investment in a new business market.

But what if the Chinese government doesn’t open its market to Google next year? Perhaps the center’s Chinese technology specialists will provide valuable developmental expertise to the American firm. And perhaps those same specialists will learn just as much from Google.

At the moment, though, Google has made a commitment to open an advanced research center in a nation that bans its services from its entire domestic economy. Unless Google’s commitment eventually “pays off” in some substantive manner, it isn’t very difficult to characterize its decision as a gift.

Apple’s Differential Privacy

Business executives at Apple have always been somewhat ambivalent about the issue of customer privacy. On the one hand, they routinely claim that they maintain a much higher standard of confidentiality for user data than many other technology firms do. And yet, on the other hand, artificial intelligence programs like Siri cannot learn the preferences of their users without accessing that personal information.

Last week, Apple drew attention to its new computer operating system by announcing that it will employ a technique known as differential privacy to balance these countervailing business imperatives. The term refers to the practice of mixing dummy (i.e., false) data into a large data set in order to make it more difficult for a party with data access to identify any particular user.

How does it work? Imagine, for instance, a bachelor who owns a single residential property. A fictitious wife and a vacation home might be added to his “big data” file without being included in his individual personal profile.
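
To make the idea concrete, here is a minimal sketch of randomized response, a classic mechanism in the differential privacy family that captures the same "mix in false answers" intuition as the bachelor example above. This is an illustration only: the function names, the p_truth parameter, and the vacation-home survey are hypothetical assumptions, not Apple's actual implementation, which is considerably more elaborate.

```python
import random

def randomized_response(true_answer: bool, p_truth: float = 0.75) -> bool:
    """Report the user's true answer with probability p_truth;
    otherwise report a random coin flip (the "dummy" answer)."""
    if random.random() < p_truth:
        return true_answer
    return random.random() < 0.5

def estimate_true_rate(reports, p_truth: float = 0.75) -> float:
    """Invert the known noise rate to estimate the real fraction of
    'yes' answers across the whole population."""
    observed = sum(reports) / len(reports)
    # observed = p_truth * true_rate + (1 - p_truth) * 0.5, solved for true_rate
    return (observed - (1 - p_truth) * 0.5) / p_truth

# Hypothetical population: 10,000 users, 30% of whom actually own a vacation home.
population = [random.random() < 0.30 for _ in range(10_000)]
reports = [randomized_response(owns_home) for owns_home in population]

print(f"raw reported rate:   {sum(reports) / len(reports):.3f}")
print(f"estimated true rate: {estimate_true_rate(reports):.3f}")
```

Because the rate at which false answers are injected is known, an analyst can still recover accurate aggregate statistics across millions of users, even though no individual report can be taken at face value.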

It’s a potentially effective strategy, but it’s a risky one as well. After all, a hacker might thwart its intent by discovering a way to identify and then delete the false content. Or the firm might mismanage its systems and lose the ability to distinguish between the true and the false data.

Given such concerns, perhaps Apple should consider a simpler approach to protecting user data. At the moment, it requires users to accept its incomprehensible, tiny-print disclosure language before they can install its software on their devices.

Instead, perhaps the firm could simply explain the benefits and risks of its data management practices in basic layperson’s language. Each prospective user could then make an informed decision about whether the benefits of utilizing the services justify the risks of doing so.

Such a policy would place Apple squarely on the side of information transparency. It would also eliminate the need to engage in differential privacy techniques.

But what if Apple doesn’t opt for this policy? Then it’s quite possible that the firm will continue to employ such techniques for the foreseeable future, mixing its good data with the bad.