Augmented Reality Must Have Augmented Privacy
Imagine walking down the street, looking for a good cup of coffee. In the distance, a storefront glows green through your smart glasses, indicating a well-reviewed cafe with a sterling public health score. You follow the holographic arrows to the crosswalk, as your wearables silently signal the self-driving cars to make sure they stop for your right of way. In the crowd ahead you recognize someone, but can’t quite place them. A query and response later, “Cameron” pops up above their head, along with the context needed to remember they were a classmate from university. You greet them, each of you glad to avoid the awkwardness of not recalling an acquaintance.

This is the stuff of science fiction, sometimes utopian, but often a warning against dystopia. Lurking in every gadget that can enhance your life is a danger to privacy and security. Either way, augmented reality is coming closer to being an everyday reality. In 2013, Google Glass stirred a backlash, but the promise of augmented reality, bringing 3D models and computer interfaces into the physical world (while recording everything in the process), is re-emerging. So is the public outcry over privacy and “always-on” recording. In the seven years since, companies have kept pushing toward augmented reality glasses, which will display digital images and data that people can view through their lenses. The Chinese company Nreal, Facebook, and Apple are all experimenting with similar technology.

Digitizing the World in 3D

Several technologies are converging to create a live map of different parts of our world, from augmented and virtual reality to autonomous vehicles. They are creating “machine-readable, 1:1 scale models” of the world, continuously updated in real time. Some implement such models through point clouds: datasets of points, collected by a scanner, that recreate the surfaces (not the interior) of objects or a space. Each point has three coordinates positioning it in space. To make sense of the millions (or billions) of points, machine learning software can help recognize objects in the point clouds, which together look like a digital replica of the world, or a map of your house and everything inside it.
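A point cloud really is that simple conceptually: a long list of (x, y, z) coordinates. As a minimal illustration (not Facebook's actual pipeline; all names here are our own), the sketch below voxel-downsamples a cloud, a common preprocessing step before a recognition model tries to pick objects out of the points:

```python
from collections import defaultdict

def voxel_downsample(points, voxel_size):
    """Collapse a point cloud onto a coarse grid: every point falling
    inside the same cubic voxel is replaced by the centroid of those
    points. `points` is an iterable of (x, y, z) tuples in meters."""
    buckets = defaultdict(list)
    for x, y, z in points:
        # Integer voxel index along each axis.
        key = (int(x // voxel_size), int(y // voxel_size), int(z // voxel_size))
        buckets[key].append((x, y, z))
    downsampled = []
    for pts in buckets.values():
        n = len(pts)
        downsampled.append((sum(p[0] for p in pts) / n,
                            sum(p[1] for p in pts) / n,
                            sum(p[2] for p in pts) / n))
    return downsampled

# Four points; the first two share a 1 m voxel and get merged.
cloud = [(0.1, 0.1, 0.1), (0.2, 0.3, 0.2), (5.0, 5.0, 5.0), (9.9, 0.0, 0.0)]
print(len(voxel_downsample(cloud, voxel_size=1.0)))  # 3 voxels survive
```

Even this toy version shows why the data is sensitive: the points are literal measurements of real rooms and streets, and downsampling only thins them, it does not anonymize them.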
The promise of creating a persistent 3D digital clone of the world, aligned with real-world coordinates, goes by many names: “world’s digital twin,” “parallel digital universe,” “Mirrorworld,” “The Spatial Web,” “Magicverse,” or a “Metaverse.” Whatever you call it, this new parallel digital world will introduce a new world of privacy concerns, even for those who choose never to wear it. For instance, Facebook’s LiveMaps will seek to create a shared virtual map, relying on crowd-sourced maps collected by users’ future AR devices with client-mapping functionality. Open AR, an interoperable AR Cloud, and Microsoft’s Azure Digital Twins likewise seek to model and create a digital representation of an environment. Facebook’s Project Aria continues this trend, and will aid Facebook in recording live 3D maps and developing AI models for Facebook’s first generation of wearable augmented reality devices. Aria’s uniqueness, in contrast to autonomous cars, is its “egocentric” data collection of the environment: the recorded data will come from the wearer’s perspective, a more “intimate” type of data. Project Aria is also a 3D live-mapping tool and software with an AI development tool, not a prototype of a product, nor an AR device, as it lacks a display. According to Facebook, Aria’s research glasses, which are not for sale, will be worn only by trained Facebook staffers and contractors to collect data from the wearer’s point of view. For example, if an AR wearer records a building and the building later burns down, the next time any AR wearer walks by, the device can detect the change and update the 3D map in real time.

A Portal to Augmented Privacy Threats

In terms of sensors, Aria’s glasses will include, among others, a magnetometer, a barometer, a GPS chip, and two inertial measurement units (IMUs).
Together, these sensors will track where the wearer is (location), where the wearer is moving (motion), and what the wearer is looking at (orientation), a much more precise way to pinpoint the wearer’s location. While GPS often doesn’t work inside a building, for example, a sophisticated IMU can allow a GPS receiver to keep working indoors when GPS signals are unavailable. A machine learning algorithm will build a model of the environment, based on all the input data collected by the hardware, to recognize specific objects and 3D-map your space and the things in it. It can estimate distances, for instance how far the wearer is from an object. It can also infer the wearer’s context and activities: Are you reading a book? Your device might then offer you a reading recommendation.

The Bystanders’ Right to Private Life

Imagine a future where anyone you see wearing glasses could be recording your conversations with “always-on” microphones and cameras, updating the map of where you are in precise detail and real time. In this dystopia, the possibility of being recorded looms over every walk in the park, every conversation in a bar, and indeed everything you do near other people. During Aria’s research phase, Facebook will be recording its own contractors’ interactions with the world. It is taking certain precautions. It asks for owners’ consent before recording in privately owned venues such as a bar or restaurant. It avoids sensitive areas, like restrooms and protests. It blurs people’s faces and license plates. Yet there are still many other ways to identify individuals, from tattoos to people’s gait, and these should be obfuscated, too. These blurring protections mirror those used by other public mapping tools like Google Street View, and they have proven reasonable, but far from infallible, in safeguarding bystanders’ privacy. Google Street View also benefits from focusing on objects, which only need occasional recording.
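Blurring works by destroying the high-frequency detail in a detected region. As a toy sketch (stdlib only; real systems use trained detectors and proper image libraries, and the region coordinates below stand in for a hypothetical face detection), the function replaces a rectangular region of an image with its mean brightness, the crudest form of redaction:

```python
def redact_region(image, top, left, height, width):
    """Replace a rectangular region of a grayscale image (a list of
    lists of 0-255 ints) with the region's mean brightness, destroying
    any identifying detail inside it. Returns a new image."""
    region = [image[r][c]
              for r in range(top, top + height)
              for c in range(left, left + width)]
    mean = sum(region) // len(region)
    out = [row[:] for row in image]  # copy so the input is untouched
    for r in range(top, top + height):
        for c in range(left, left + width):
            out[r][c] = mean
    return out

# A 4x4 "image" with a bright, detailed 2x2 "face" at the top-left.
img = [[250, 200, 10, 10],
       [180, 170, 10, 10],
       [ 10,  10, 10, 10],
       [ 10,  10, 10, 10]]
blurred = redact_region(img, top=0, left=0, height=2, width=2)
print(blurred[0][0])  # 200: every pixel in the region is now the mean
```

The catch the article points to is exactly what this sketch cannot fix: redaction only protects what the detector finds, so gait, tattoos, and other unblurred identifiers pass straight through.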
It’s unclear whether these protections remain adequate for perpetual crowd-sourced recordings that focus on human interactions. Once Facebook and other AR companies release their first generation of AR devices, it will likely take concerted efforts by civil society to keep obfuscation techniques like blurring in commercial products. We hope those products do not layer robust identification technologies, such as facial recognition, on top of the existing AR interface.

The AR Panopticon

If AR glasses with “always-on” audio-cameras or powerful 3D-mapping sensors become massively adopted, the scope and scale of the problem change as well. The company behind any AR system could then have a live audio/visual window into all corners of the world, with the ability to locate and identify anyone at any time, especially if facial or other recognition technologies are included in the package. The result? A global panopticon society of constant surveillance in public or semi-public spaces. In modern times, the panopticon has become a metaphor for a dystopian surveillance state, where the government has cameras observing your every action. Worse, you never know whether you are a target, as law enforcement looks to new technology to deepen its already rich ability to surveil our lives.

Legal Protection Against the Panopticon

To fight back against this dystopia, and especially against government access to this panopticon, our first line of defense in the United States is the Constitution. Around the world, we all enjoy the protection of international human rights law. Last week, we explained how police need to come back with a warrant before conducting a search of virtual representations of your private spaces. While AR measuring and modeling in public and semi-public spaces is different from private spaces, key constitutional and international human rights principles still provide significant legal protection against police access. In Carpenter v. United States, the U.S.
Supreme Court recognized the privacy challenges of understanding the risks of new technologies, warning courts to “tread carefully … to ensure that we do not ‘embarrass the future.’” To not embarrass the future, we must recognize that throughout history people have enjoyed effective anonymity and privacy when conducting activities in public or semi-public spaces. As the United Nations’ Free Speech Rapporteur made clear, anonymity is a “common human desire to protect one’s identity from the crowd…” Likewise, the Council of Europe has recognized that while any person moving in public areas may expect a lesser degree of privacy, “they do not and should not expect to be deprived of their rights and freedoms including those related to their own private sphere.” Similarly, the European Court of Human Rights has recognized that a “zone of interaction of a person with others, even in a public context, may fall within the scope of ‘private life.’” Even in public places, the “systematic or permanent recording and the subsequent processing of images could raise questions affecting the private life of individuals.” Over fifty years ago, in Katz v. United States, the U.S. Supreme Court also recognized that “what [one] seeks to preserve as private, even in an area accessible to the public, may be constitutionally protected.” This makes sense, because the natural limits of human memory make it difficult to remember details about people we encounter in the street, which effectively offers us some level of privacy and anonymity in public spaces. Electronic devices, however, can remember perfectly, and can collect these memories in a centralized database potentially usable by corporate and state actors. Already this sense of privacy has been eroded by public camera networks, ubiquitous cellphone cameras, license plate readers, and RFID trackers, requiring legal protections.
Indeed, the European Court of Human Rights requires “clear detailed rules…, especially as the technology available for use [is] continually becoming more sophisticated.” If smartglasses become as common as smartphones, we risk losing even more of the privacy of crowds. Far more thorough records of our sensitive public actions, including going to a political rally or protest, or even to a church or a doctor’s office, can go down on our permanent records. This technological problem was brought to the modern era in United States v. Jones, where the Supreme Court held that GPS tracking of a vehicle was a search, subject to the protection of the Fourth Amendment. Jones was a convoluted decision, with three separate opinions supporting this result. But among the three were five Justices, a majority, who ruled that prolonged GPS tracking violated Jones’ reasonable expectation of privacy, despite Jones driving in public where a police officer could have followed him in a car. Justice Alito explained the difference in his concurring opinion (joined by Justices Ginsburg, Breyer, and Kagan):

In the pre-computer age, the greatest protections of privacy were neither constitutional nor statutory, but practical. Traditional surveillance for any extended period of time was difficult and costly and therefore rarely undertaken. … Only an investigation of unusual importance could have justified such an expenditure of law enforcement resources. Devices like the one used in the present case, however, make long-term monitoring relatively easy and cheap.

The Jones analysis recognizes that police use of automated surveillance technology to systematically track our movements in public places upsets the balance of power protected by the Constitution and violates the societal norms of privacy that are fundamental to human society. In Carpenter, the Supreme Court extended Jones to the tracking of people’s movements through cell-site location information (CSLI).
Carpenter recognized that “when the Government tracks the location of a cell phone it achieves near perfect surveillance, as if it had attached an ankle monitor to the phone’s user.” The Court rejected the government’s argument that, under the troubling “third-party doctrine,” Mr. Carpenter had no reasonable expectation of privacy in his CSLI because he had already disclosed it to a third party, namely his phone service provider.

AR Is Even More Privacy-Invasive Than GPS and CSLI

Like GPS devices and CSLI, AR devices are an automated technology that systematically documents what we are doing, so AR triggers strong Fourth Amendment protection. Of course, ubiquitous AR devices will provide even more perfect surveillance than GPS and CSLI, not only tracking the user’s information, but gaining a telling window into the lives of all the bystanders around the user. With enough smart glasses in a location, one could create a virtual time machine to revisit that exact moment in time and space. This is the very thing that concerned the Carpenter court:

the Government can now travel back in time to retrace a person’s whereabouts, subject only to the retention policies of the wireless carriers, which currently maintain records for up to five years. Critically, because location information is continually logged for all of the 400 million devices in the United States — not just those belonging to persons who might happen to come under investigation — this newfound tracking capacity runs against everyone.
Likewise, the Special Rapporteur on the Protection of Human Rights explained that a collect-it-all approach is incompatible with the right to privacy:

Shortly put, it is incompatible with existing concepts of privacy for States to collect all communications or metadata all the time indiscriminately. The very essence of the right to the privacy of communication is that infringements must be exceptional, and justified on a case-by-case basis.
AR is location tracking on steroids. AR can be enhanced by overlays such as facial recognition, transforming smartglasses into a powerful identification tool capable of providing a rich and instantaneous profile of any random person on the street, to the wearer, to a massive database, and to any corporate or government agent (or data thief) who can access that database. With additional emerging and unproven visual analytics (everything from aggression analysis to lie detection based on facial expressions has been proposed), this technology poses a truly staggering threat of surveillance and bias. Thus the need for legal safeguards, as required in Canada v. European Union, is “all the greater where personal data is subject to automated processing. Those considerations apply particularly where the protection of the particular category of personal data that is sensitive data is at stake.” Augmented reality will expose our public, social, and inner lives in a way that may be even more invasive than the smartphone’s “revealing montage of the user’s life” that the Supreme Court protected in Riley v. California. Thus it is critical for courts, legislators, and executive officers to recognize that the government cannot access the records generated by AR without a warrant.

Corporations Can Invade AR Privacy, Too

Even more must be done to protect against a descent into AR dystopia. Manufacturers and service providers must resist the urge, all too common in Silicon Valley, to “collect it all,” just in case the data may be useful later. Instead, the less data companies collect and store now, the less data the government can seize later. This is why tech companies should protect not only their users’ right to privacy against government surveillance but also their users’ right to data protection. Companies must therefore collect, use, and share their users’ AR data only as minimally necessary to provide the specific service their users asked for.
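Data minimization can often be implemented in a few lines. As a hedged illustration (the function and its precision threshold are our own, not any company's actual practice), an AR service that only needs to know roughly which neighborhood a user is in could coarsen coordinates client-side, before anything leaves the device:

```python
def minimize_location(lat, lon, decimals=2):
    """Round GPS coordinates before upload. Two decimal places is
    roughly 1.1 km of latitude precision: enough to pick a
    neighborhood, not enough to pick a house."""
    return (round(lat, decimals), round(lon, decimals))

# Precise fix from the device's GPS chip (example coordinates)...
precise = (37.774929, -122.419416)
# ...reduced to neighborhood-level precision before transmission.
print(minimize_location(*precise))  # (37.77, -122.42)
```

The design point is that the precise fix never reaches the server at all, so there is nothing house-level for a later subpoena, breach, or "useful later" analysis to recover.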
Companies should also limit the amount of data transmitted to the cloud and the period for which it is retained, while investing in robust security and strong encryption, with user-held keys, to give users control over the information collected. Moreover, we need strong transparency policies that explicitly state the purposes for and means of data processing, and that allow users to securely access and port their data. Likewise, legislatures should look to the augmented reality future and augment our protections against government and corporate overreach. Congress passed the Wiretap Act to give extra protection to phone calls in 1968, and expanded statutory protections to email and subscriber records in 1986 with the Electronic Communications Privacy Act. Many jurisdictions have eavesdropping laws that require all-party consent before recording a conversation. Likewise, hidden-camera and anti-paparazzi laws can limit taking photographs and recording videos, even in places open to the public, though they are generally silent on the advanced surveillance possible with technologies like spatial mapping. Modernization of these statutory privacy safeguards, with new laws like CalECPA, has taken a long time and remains incomplete. Through strong policy, robust transparency, wise courts, modernized statutes, and privacy-by-design engineering, we can and must have augmented reality with augmented privacy. The future is tomorrow, so let’s make it a future we would want to live in.
Author: Kurt Opsahl
Date: 2020-10-16
URL: https://www.eff.org/deeplinks/2020/10/augmented-reality-must-have-augmented-privacy