Wednesday, January 27, 2016

As someone who cares about privacy, how do I feel working for Google?

(Needless to say, all opinions are mine, I don't speak for my employer or anyone else.)

A friend recently asked me this question:

"Thai: I think you have showed a very consistent view on privacy so I would like to ask a rather sensitive question (feel free to not answer): how do you feel working for Google, one of the most notorious companies in tracking and keeping people private data?"

I find my job rewarding. Google in general and my team in particular have done a lot to improve the privacy and security of not only Google users, but the Internet as a whole.

If Google hadn't been pushing hard and investing heavily over the past few years, most traffic on the Internet would still be sent unencrypted or over outdated protocols. Many core developers of OpenSSL, the software package that enables much of the Internet's encryption, are my coworkers. Many of the most important innovations and much of the research on Internet encryption came from Googlers. HSTS, public-key pinning, Certificate Transparency: you name it, we invented it. TLS 1.3, the latest version of the most important security protocol on the Internet, is inspired by QUIC, an in-house protocol that we developed to make the Internet faster and more secure. We also found and fixed many weaknesses in earlier versions of TLS, and put the final nail in the coffin of SSL 3.0, an outdated and insecure protocol that was still being used by many big websites.
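To give a flavor of how simple some of these mechanisms look on the wire: HSTS, for instance, boils down to a single HTTP response header that tells the browser to only ever reach the site over HTTPS (the max-age value below, one year in seconds, is just an illustrative choice):

```
Strict-Transport-Security: max-age=31536000; includeSubDomains; preload
```

Once a browser has seen this header over a secure connection, it remembers the policy and upgrades any plain-HTTP request to that site to HTTPS until the policy expires.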

Google employs top security researchers and allows us to work on whatever we think would make the Internet a safer place. Nowhere else will you find such a large group of people, 500 and still hiring, who care so deeply about and contribute so much to the security and privacy of the Internet. There is no other company at Google's scale that employs a group of world-class security researchers (some of them snatched from the NSA or GCHQ, because we want to deprive these agencies of security talent) and lets them do whatever it takes to kill 0-day vulnerabilities.

Have you ever heard of Neel Mehta? Probably not, but he's the guy who discovered Heartbleed, and he sits on the same floor as I do. Michal Zalewski, Tavis Ormandy, and many other big names in security work at Google. Michal's AFL revolutionized overnight the practice of finding software vulnerabilities by fuzzing; these days, if someone finds a cool bug somewhere, chances are they were using AFL. Tavis's dive into antivirus software has been a very fruitful endeavor, in which he has found numerous critical vulnerabilities affecting millions of Internet users. In total, we have found and fixed thousands of vulnerabilities in many popular software products, including Apple's and Microsoft's. If you use a computer or a smartphone at home or at work, many Googlers have worked hard to keep you from getting hacked.

It would be naive to conclude that Google has done all this purely for altruism's sake. Google needs a safe Internet to conduct our business. People won't use the Internet, or even computers, if they can be hacked easily. Google's investment in killing software vulnerabilities and making the Internet safer for everyone is therefore a win-win, and thus sustainable, investment.

Regarding tracking and keeping people's private data: I guess by private data you mean web browsing habits? Google mines this data to display better and more relevant ads, which in turn power the free Internet as we see it today. Historically, most people have not been willing to pay for Internet services, so ad-supported content has been the only sustainable business model that works at Internet scale. Google pays billions of dollars to publishers (e.g., the New York Times, or Nguyen Ha Dong of Flappy Bird) every quarter. Without this source of revenue, the Internet wouldn't have reached where it is today.

Google is not alone in this business, but perhaps we are the most successful, and thus the most criticized. Most companies track and collect user data. Have you ever wondered where the data in Big Data comes from? It's people's private data, mostly. While reading this article, a fellow Googler reminded me that mining private data is not only for displaying ads but also for improving products and, in some cases, creating new products. He wrote,

"Google Now and Google Photos are good examples. I love these products. How would I use the products without allowing Google to mine my private data? Talking about ads, there are people who don't like ads, but for people who are OK with ads like me, seeing relevant ads is much better than random ads. Leveraging private data is everywhere. You have the option of not providing private data by not using the products."

Google should do better than our competitors, as we always want to hold ourselves to a high bar. This is why we give users many choices to opt out of the personalized ads system. This is why we give users tools to delete their accounts and to completely wipe out or export their data. This is why Google employs hundreds of people whose one and only job is to ensure that our products give users better privacy controls over their private data. The aforementioned Googler works in Search, and he wrote,

"Google cares deeply about user privacy. I can tell you that all user private data is encrypted at Google. If you work in systems, you probably know that there are many tricks that do not work on encrypted data. In Search we are losing XX% capacity because of encrypted data. There are many other burdens in enforcing this policy at Google besides losing machine resources, e.g., it is hard to debug since normal developers cannot see log traces. Another big company, Facebook, does not encrypt your private data in its data centers (this was true 3 years ago; I don't know the current status). This is not to underestimate how much Facebook cares about privacy, but to show how far Google is willing to go in protecting user information."

When I joined Google four years ago, privacy was neither something I cared much about nor a hot topic. Working at Google has actually made me care more and more about privacy. As the saying goes, with great power comes great responsibility: with all the data that you have entrusted Google with, my team and I take it as our duty not to let you down. Thanks to Snowden, privacy has become an international issue, in which Google has attracted a lot of criticism, much of it groundless. There are also a lot of players out there who benefit from spreading FUD about Google. What I want to convey to you in this little rant is that we care very much about the privacy and security of our users and the whole Internet, and we very much want to offer you and other users choices, be it opting out of personalized ads, deleting your personal data, or subscribing to ads-free services.

You can learn more about how we think about and protect your privacy, and you can manage your security and privacy settings, through the privacy and security check-ups: great tools that let you protect your accounts and adjust important privacy settings to your preference. Ads are still our, and the Internet's, cash cow, but we also have many fast-growing non-ads businesses in which we generate revenue by earning user trust. We've launched many ads-free subscription services such as YouTube Red and Google for Work (a.k.a. Google Apps for Your Domain). Recently we introduced Contributor, a project I worked on, which allows users to support the websites they like and at the same time see fewer ads. In sum, if you are willing to pay for services, Google provides many ads-free choices.

I'll leave Google one day, but I'll be a loyal user for life. Google has simply earned my trust. I trust that no one, not even Google employees or any three-letter agency, can read my emails, view my photos, or download my documents without authorization, not without first stepping over the corpses of an army of very skilled engineers and lawyers working very hard to protect whatever data I've entrusted Google with. And you should trust it too.
