The practice of data science, unlike biomedical research, has no clearly demarcated standards for adequate consent and transparency. As our early class discussions have illustrated, it can be hard to determine under which circumstances consent should be required in such a rapidly evolving field. Europe’s General Data Protection Regulation (GDPR), which took effect in 2018, provides the most sweeping protection of user privacy rights enacted anywhere to date.
The ramifications of GDPR were made starkly clear on January 21, 2019, when France’s data protection authority, the CNIL, fined Google €50 million (about $57 million) for “a lack of transparency, inadequate information and lack of valid consent regarding the ads personalization.” Some students in the class have raised the point that users have no real idea what is being done with their data, and that users therefore cannot meaningfully consent. The CNIL’s decision reflects this concern, finding that “The information on processing operations for the ads personalization is diluted in several documents and does not enable the user to be aware of their extent.”
Additionally, the CNIL found that Google violated GDPR by setting ad personalization as the default option for users. Google’s business model, like those of many other large multinational tech companies, depends heavily on revenue from personalized ads; requiring users to opt in to personalization would have a significant impact on Google’s bottom line. With this largest fine yet levied under GDPR, the data science practices of many of the largest tech companies will surely come under greater scrutiny.
Although the decision currently affects only EU data subjects, it is becoming increasingly clear that similar regulations may be coming to the United States. Microsoft President and Chief Legal Officer Brad Smith has described a United States privacy law as a “historical inevitability.” GDPR and these future privacy laws will have an outsized impact on the development of norms in data science practice. It will be essential that regulators take into account many of the ethical principles discussed in our class as they move to police these companies. Given the unprecedented amount of data generated each day, the potential benefits of that data must be carefully weighed against the privacy rights of users.
As The Economist notes, “The Privacy Wars have begun in earnest.” This battle between tech companies and European regulators will be worth monitoring as further regulatory actions are taken, since each one reveals how the legal standards for data science practice are evolving. Examining both the successes and the shortcomings of GDPR’s impact on the use of data should lead to fruitful discussions.
“The CNIL’s restricted committee imposes a financial penalty of 50 Million euros against GOOGLE LLC,” CNIL, January 21, 2019.
“Google fine launches new era in privacy enforcement,” Politico, January 21, 2019.
“The French fine against Google is the start of a war,” The Economist, January 24, 2019.