Improved technology has enabled states and corporations to keep citizens and consumers under constant surveillance. This idea resembles Jeremy Bentham’s Panopticon [1]: an architectural design that allows a single security guard to observe every prisoner without the prisoners being able to tell when, or even whether, they are being watched. Duffy and Chan call the resulting condition “imagined surveillance” [2], because the individual never knows when they are being seen.
Nonetheless, imagined surveillance has a massive effect: in a world where anyone can access another’s information, people are forced to monitor their own behavior, making conscious decisions about what to post, where to post it, whom to add on which platform, and whom to hide content from. The reason for this effort is that a potential employer can view an applicant’s profile, and a college admissions board can track a student’s activities through their online presence; recruiters are thus able to reject people before ever interviewing them [2].
Nation-states also turn this phenomenon to their advantage, for example through CCTV cameras. Cities such as London implement these systems, which run on citizens’ data, in order to earn the status of “smart city”. London even uses CCTV cameras to monitor people in an attempt to stop them from dropping wrappers on the ground.
Digital surveillance has led to what Brooke Erin Duffy and Ngai Keung Chan describe as “context collapse” [2], where the line between the professional and the personal is blurred, so individuals self-police their behavior even on their private social media. They often keep up “multiple personas across different (online media) platforms” [2]. For example, a consumer may post presentable, professional material on platforms they believe an employer is likely to check, such as Facebook, while posting pictures of clubbing on platforms relatively free from scrutiny, such as Instagram. The question that then arises is how far this can go: how marketable can we make ourselves without sacrificing our authentic selves? Frequently we internalize the marketable persona we showcase to the world and, without realizing it, erase who we actually are.
Digital surveillance also harms marginalized communities the most, because it gives personal biases freer rein. For example, if a corporation looks at a female applicant’s social media profile and sees that she is a mother, it may decline to hire her on the assumption that she will have to divide her time between work and household. Similarly, social media profiles can reveal an applicant’s race or ethnicity, which may feed into racial or ethnic prejudice. In both cases, the applicant is harmed by being rejected during pre-emptive screening.
Duffy and Chan’s study identifies three ways in which people monitor their own online behavior [2]. First, individuals may change their privacy settings to try to control who sees their content, often placing “social actors who they fear would monitor their activity” [2] on “limited” or “hide from” lists. The issue with this method is that people mostly block family, colleagues, or peers whom they think would “judge” them, which does nothing for their future selves, since those people do not dictate their future employability. Consumers cannot block or hide content from future employers, because they do not yet know who those employers will be.
The second way individuals self-police is by sharing only what society deems appropriate and withholding anything else [2]. The problem with this approach is the question of who defines what is appropriate. It may also burden marginalized groups more heavily, because society holds them to a different set of standards. For example, a woman posting pictures in certain attire, even on a private account, is considered inappropriate: Bailey Davis was fired for uploading a picture to her private account, yet men are not held to the same standard [2].
The third way individuals monitor their own behavior is by creating “pseudonyms and multiple aliases” [2]. Finstas are an example: secondary accounts where “individuals project a realistic version” [2] of themselves and their daily routine, as opposed to their main accounts. Finstas are effective because content shared there stays within a small, approved audience, and outsiders can see nothing beyond the bio and username. A Finsta’s username need not reveal who its owner is; people often use an unrelated display picture together with a made-up name. The problem is that, as Finstas multiply, it becomes easy to create a fake account impersonating someone else, since no one can verify an account’s authenticity unless they are added to it, and it is equally hard for the impersonated person to discover that a fake account exists.
Digital media has handed enormous power to online platforms, and as people leave more of their digital fragments online, these companies use the data they gather to further their own gains. They also create an “illusion of voluntariness” [5], whereas in reality the decision to participate in surveillance is involuntary: people comply only because corporations make them feel they will be left behind if they do not. Resisting the system is also very hard, because people “don’t know who to resist” [5]. In the contemporary world, the idea of the Panopticon has been turned into reality in the form of a digital prison, where people feel trapped and simultaneously long for a space where they can be their authentic selves without the fear of being watched.
References:
1. Jeremy Bentham, Panopticon; or, The Inspection-House.
2. Brooke Erin Duffy and Ngai Keung Chan, “You never really know who’s looking”: Imagined surveillance across social media platforms.
3. Jerome E. Dobson and Peter F. Fisher, The panopticon’s changing geography.
4. Tobias Champion, Are we living in a post-panoptic society?
5. Simon G. Davies (1997), Re-engineering the right to privacy.