In its season three finale, "Hated in the Nation," Black Mirror presented a dystopian vision of smart cities, particularly the aggregation of sensors within an urban geographic space (Ricker et al. 2014; Hara et al. 2014). Although dramatized, the episode highlighted the perils of unconsciously releasing personal geographic information for the public to freely access and repurpose. In brief, the fictional private corporation 'Granular Project' developed artificial swarm intelligence, miniature drone bees, in response to bee extinction. Because the United Kingdom government funded a significant portion of the project, it linked citizens' ID photos and names with facial recognition software so that the artificial bees, with their embedded cameras, could serve as a government tool for city surveillance. An employee who had worked on the project exploited this security breach, programming the bees to track and kill human targets in an act of terrorism. Initially unbeknownst to the general public, the targets were chosen through the hashtag #DeathTo, which would trend on a Twitter-like platform. Using the location of a target's cellphone, identified by its International Mobile Station Equipment Identity (IMEI), the swarm would automatically locate, surround, and kill the target.
Yes, it is both dramatic and terrifying, but it is also plausible as our society becomes ever more integrated with mobile technology, particularly smartphone applications that require access to the phone's location to provide services, such as Uber or Pokémon Go. Mobile applications today leverage the complex relationship between a human, their mobile technology, and their surrounding physical and social infrastructures to provide location-based services (LBS). In this relationship, the human acts as a sensor: they internalize external physical and/or institutional phenomena and then actively or passively publish these observations through their mobile GPS, camera, recorder, and internet access (Goodchild 2008). In other words, the human provides volunteered geographic information (VGI). For example, the Google Maps and Uber apps both require users to connect to the internet and share their explicit location via their device's GPS; in exchange, Google Maps provides route directions, including public transit and bike routing, while Uber provides a transportation service. These apps are also expanding their markets: Google Maps is beginning to include indoor routing, and Uber is growing as a delivery service.
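The exchange at the heart of an LBS is simple: the user surrenders a coordinate pair, and the service matches it against other coordinates it holds. A minimal sketch of that matching step, assuming a hypothetical ride-hailing backend with a small in-memory table of driver positions (all names and coordinates here are illustrative, not from any real service):

```python
from math import asin, cos, radians, sin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in kilometres between two (lat, lon) points."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = (sin((lat2 - lat1) / 2) ** 2
         + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2)
    return 2 * 6371.0 * asin(sqrt(a))  # mean Earth radius ~6371 km

def nearest_driver(user, drivers):
    """Return (name, distance_km) of the driver closest to the user.

    `user` is a (lat, lon) tuple; `drivers` maps name -> (lat, lon).
    Both names and positions are hypothetical stand-ins for what a
    real LBS would read from each phone's GPS.
    """
    name, (lat, lon) = min(drivers.items(),
                           key=lambda d: haversine_km(*user, *d[1]))
    return name, haversine_km(*user, lat, lon)
```

The same one-shot lookup, repeated continuously, is all it takes to turn a convenience feature into a movement log, which is why the rest of this piece worries about who else can read those coordinates.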
We receive services in exchange for sharing our coordinates, but we also risk losing our privacy. Consider the proliferation of check-ins on November 1, 2016, at Standing Rock Reservation, where the police were accused of monitoring Facebook users' check-ins (Levin and Woolf 2016). Facebook users' geotagged posts were reportedly repurposed by the police to monitor protesters at the reservation. Although it remains unclear whether the accusation was accurate, Facebook users around the world checked into Standing Rock to act as decoys, an act of global solidarity and technological outreach.
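Technically, such monitoring requires nothing more than filtering public geotagged posts by a bounding box around a place of interest. A minimal sketch, assuming a hypothetical list of check-in records (the usernames and coordinates below are illustrative; the box only roughly approximates the Standing Rock area):

```python
def within_bbox(posts, south, west, north, east):
    """Keep only geotagged posts whose coordinates fall inside the box.

    Each post is a dict with "lat" and "lon" keys; this mirrors how
    anyone with access to public check-in data could isolate posts
    made from one geographic place.
    """
    return [p for p in posts
            if south <= p["lat"] <= north and west <= p["lon"] <= east]
```

The point of the decoy check-ins was precisely to defeat this kind of filter: when thousands of people check in remotely, location alone no longer identifies who was physically present.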
When we blindly grant mobile applications access to our location, we allow an unknown public audience to repurpose our VGI. Whether or not the police monitored the Standing Rock check-ins, the protesters who checked in did not expect their check-ins to be monitored. At the same time, the decoy check-ins illustrate "slacktivism," where modern technology lets people from across the globe who share values about a geographic place virtually amass (CBC 2016). So, rather than continue the doomsday narrative introduced in Black Mirror, one that highlights the privacy perils of blindly releasing social media posts and VGI to the public, we can craft a new narrative that redefines privacy based on the complex relationship between humans, technologies, and geographic space.
- Julia Conzon