
Who is affected most by the lack of data privacy?


Research shows that certain demographics, such as women, children, economically disadvantaged people, and marginalized groups, experience more severe consequences of data exploitation.

The consequences can be especially devastating for children, women, low-income populations, and marginalized communities:

Data privacy and children


Digital data privacy is especially important for children because they are often more vulnerable to online threats such as cyberbullying, online grooming, and exploitation. Addressing these challenges requires a collaborative approach that engages stakeholders across sectors, including governments, technology companies, parents, educators, and civil society. By collectively striving to safeguard children's rights, we can foster an inclusive and secure digital environment that enables children to thrive and grow while preserving their dignity, privacy, and well-being.

Risks: Children face numerous risks to their privacy in the rapidly evolving digital world, including identity theft, online harassment, and exposure to inappropriate content (Livingstone & Third, 2017; Williamson, 2017).

Datafication of education: Datafication is the practice of breaking learning outcomes down into data points in order to find patterns. This can put a child at risk when individual educational metrics are isolated from their context or analyzed through biased methods (Williamson, 2019).

Privacy comprehension: Children often lack awareness and understanding of their privacy rights online, making them vulnerable to exploitation (Selwyn et al., 2003).

Internet of toys: Internet-connected toys can collect large amounts of data about children, and this "internet of toys" exposes them to potential dangers and data breaches (McReynolds et al., 2017).

Advertisements and children: Commercial entities that advertise to children without regard for age boundaries have been shown to exploit children's developmental need for peer-group acceptance (Nairn & Dew, 2007).

Predator activity: The sharing of explicit images and the presence of pedophile networks amplify child sexual abuse (Željko & Filipović, 2020).

References:
Livingstone, S., & Third, A. (2017). Children and young people's rights in the digital age: An emerging agenda.
Marx, G., & Steeves, V. (2010). Privacy, surveillance, and the children's internet. Journal of Social Issues, 66(1), 57-77.
McReynolds, E., et al. (2017). Internet of toys: Legal and regulatory aspects of data security and privacy. IEEE Consumer Electronics Magazine, 6(2), 86-91.
Nairn, A., & Dew, N. (2007). Advertising to children on TV: Content, impact, and regulation. International Journal of Advertising, 26(3), 357-383.
Selwyn, N., et al. (2003). What do children in the digital age look like? Developing a profile of digital students. Journal of Computer Assisted Learning, 19(3), 296-306.
Williamson, B. (2017). The datafication of primary and early years education: Playing with numbers.
Williamson, B. (2019). Coding/learning: Mapping the intersections of computational thinking and progressive education. Learning, Media and Technology, 44(3), 367-382.
Željko Đ., & Filipović, M. (2020). Online child sexual abuse: The impact of sexual predators' activities on minors. International Journal of Cognitive Research in Science, Engineering and Education, 8(1), 55-61.

Data privacy and women


Women face unique risks when it comes to digital data privacy:

Technology has often been exploited to cause harm to women globally. The Plunk Foundation aims to support women through data-privacy awareness, education, policy development, and digital privacy solutions.

Gender-based discrimination and bias: Strong privacy measures empower women to have greater control over their digital identities and reduce the risks associated with the misuse of personal information. It's important that the Plunk Foundation partners with women leaders and women's organizations to effectively provide solutions for women globally.

Online harassment, stalking, and gender-based violence: Women are frequent targets of harassment and violence on social networking sites, necessitating robust data privacy mechanisms (Young & Quan-Haase, 2013). The cited study found that gender, age, and experience with Facebook were significant predictors of privacy concerns and protection strategies; for instance, women and older participants tended to express greater privacy concerns and employ more protection strategies.

Women's autonomy: Sensitive personal information, such as reproductive health data, requires special attention to ensure confidentiality and protect women's autonomy (D'Ignazio & Klein, 2016). The referenced paper argues that data visualization can perpetuate biases and reproduce existing inequalities if it fails to account for the diverse experiences and perspectives of marginalized groups.

References:
De Hert, P., & Gutwirth, S. (2016). Privacy, data protection, and discrimination: How are they connected? Computer Law & Security Review, 32(3), 256-271.
D'Ignazio, C., & Klein, L. F. (2016). Feminist data visualization. In Proceedings of the 2016 CHI Conference on Human Factors in Computing Systems (pp. 2182-2193).
Lwin, M. O., & Williams, D. (2014). Cyberbullying on social networking sites among adolescents: The role of perceived parental monitoring and adolescent's self-disclosure behavior. Computers in Human Behavior, 40, 16-27.
Nissenbaum, H. (2015). Privacy in context: Technology, policy, and the integrity of social life. Stanford University Press.
Young, A. L., & Quan-Haase, A. (2013). Privacy protection strategies on Facebook: The Internet privacy paradox revisited. Information, Communication & Society, 16(4), 479-500.

Data privacy and low-income populations

The economically disadvantaged often face limited access to resources and opportunities, making them more reliant on online platforms and services for financial transactions, education, healthcare, job opportunities, and social interaction. However, limited financial means and digital literacy can make them more susceptible to privacy risks.

Exploitation and Discrimination: Economic disparities can make individuals more vulnerable to exploitation and discrimination. Research has shown that individuals from lower socioeconomic backgrounds are more likely to be targeted by predatory lending practices, fraudulent schemes, and unfair pricing tactics based on their personal data. Protecting data privacy helps prevent such exploitative practices and promotes fairness in economic transactions (Brea, 2016).

Financial Inclusion and Opportunities: Access to financial services and economic opportunities is crucial for the economically disadvantaged to improve their circumstances. However, inadequate data privacy measures can lead to exclusion from financial systems. Scholars argue that individuals from lower-income backgrounds may face challenges in accessing credit, loans, and insurance if their personal data is misused or mishandled. Safeguarding data privacy enables the economically disadvantaged to participate fully in the digital economy and access financial services (Winkler, 2017).

Vulnerability and Stigmatization: The economically disadvantaged often face social stigmatization and discrimination. Privacy breaches can further exacerbate these challenges. Studies have shown that individuals from marginalized communities are more likely to experience harm from data breaches due to a lack of resources and support. Their personal information may be used against them, perpetuating social inequalities. Protecting data privacy helps mitigate these risks and ensures that personal information remains confidential and secure (Baruh & Popescu, 2017).

Surveillance and Power Imbalances: Data privacy is closely linked to power imbalances in society. The economically disadvantaged are often subjected to heightened surveillance and monitoring due to their socioeconomic status. This surveillance can perpetuate social control and limit their autonomy. Scholarly research highlights the importance of privacy as a fundamental right to challenge unequal power structures and protect the dignity and autonomy of individuals, particularly those who are economically disadvantaged (Lyon, 2018).

References:
Baruh, L., & Popescu, M. (2017). Discourses of privacy and the privacy paradox: Analyzing the relationship between online privacy concerns, privacy fatigue, and privacy disclosure. New Media & Society, 19(10), 1528-1546.
Brea, J. A. (2016). Discrimination, data mining, and data protection. Philosophical Transactions of the Royal Society A: Mathematical, Physical and Engineering Sciences, 374(2083), 20160123.
Lyon, D. (2018). Surveillance capitalism, surveillance culture, and social inequality. Sociology Compass, 12(8), e12592.
Winkler, T. J. (2017). Financial inclusion, data privacy, and customer empowerment. The Journal of Business, Entrepreneurship & the Law, 10(1), 51-75.

Data privacy and marginalized communities


Data privacy is not just a matter of personal choice; it is a fundamental right that plays a pivotal role in protecting marginalized communities in the digital age. Strengthening privacy protections, ensuring algorithmic transparency, and promoting inclusive policies are key steps in creating a fair and equitable digital landscape for all.

Economic Implications: Misuse and exposure of data can have severe consequences for marginalized communities. Historical and ongoing social injustices have made these communities more susceptible to discriminatory practices. Robust privacy protections are essential to address power imbalances and prevent the exploitation of personal information (Turow & Duggan, 2018).

Algorithms and bias: Search engines and algorithms can perpetuate racism, contributing to the oppression of marginalized groups. The lack of algorithmic transparency exposes these communities to biased decision-making processes, deepening their digital vulnerabilities. Data privacy is a critical factor in dismantling discriminatory systems and promoting fairness (Noble, 2018).

Diverse Risk Perceptions: Marginalized communities have different risk perceptions when it comes to data privacy due to their historical experiences of surveillance and discrimination. Recognizing and respecting these perspectives is essential in shaping inclusive privacy policies that safeguard their unique vulnerabilities (Richardson & Schultz, 2018).

Impact of Automated Decision-Making: Automated decision-making systems disproportionately affect marginalized communities, particularly those living in poverty. Such systems perpetuate social exclusion and deepen existing inequalities. Upholding data privacy becomes crucial in mitigating the disparities faced by these communities (Eubanks, 2018).
