
      Sexual and reproductive health and rights and bodily autonomy in a digital world


          Abstract

Introduction

Digital technologies can have a transformational impact on the health and lives of people, particularly those who are structurally discriminated against in different ways, for example by improving access to health information, data, and diagnostic services. Technology such as artificial intelligence (AI) has great potential to help women, girls and gender-diverse people take better control of their health, bridging gaps in access to health-related information and education by providing timely, accurate, personalised answers to health questions. This could be a game-changer, especially in settings where certain health issues, such as sexual and reproductive health and rights (SRHR), are still considered taboo. But it is now widely recognised that AI and digital technologies are neither inherently empowering nor sexist; they reflect the values of their contexts and creators. For example, the rapid adoption of digital technologies aimed at increasing the efficiency of health care delivery has led to increased inequities and inequalities in health care resources in some contexts. 1

There is grave concern that the Internet is being weaponised to silence and target women’s, girls’, and queer voices. For example, a 2021 study revealed that 73% of women across the globe have experienced some form of online violence on Twitter, 2 and a 2023 study revealed that 20% of respondents who identified as transgender or gender-diverse and had experienced tech-facilitated violence also reported severe impacts on their mental health, including their desire to live. 3,4 Evidence points towards distinct geographic, economic, and social gaps in the design of and access to these technologies, including gaps related to gender, disability and race. 5,6 For example, male (or family/community) gatekeepers often control or restrict access to devices and the internet, 7 and several studies report these restrictions to be greater for younger women and girls. 8,9

The Internet has the potential to make health-related information more accessible. However, in the context of health services such as abortion, the level of misinformation has been described as the “next infodemic” 10 and questions have been raised about the role of social media platforms in propagating it. 11 Recent data from the United States reveal that telehealth interventions, including “digital abortion clinics” that connect patients with health care providers, have been effective in increasing access to abortion care in remote areas and in places where abortion is criminalised. 12 However, a 2022 investigation revealed that Hey Jane – a US-based online abortion pill provider – used tracking tools that passed user information along to Meta, Google, and other companies. 13 Restrictive policy regimes that impair access to health care for women and gender-diverse people translate into the digital space, increasing surveillance and inhibiting access to services such as safe abortion.
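The mechanism behind findings like the Hey Jane investigation is usually mundane: a third-party tracking script or “pixel” embedded in the provider’s web pages. The sketch below is purely illustrative – the endpoint and parameter names are hypothetical, not those of Hey Jane or of any actual analytics provider – but it shows how merely loading a page that embeds such a pixel transmits the page address, and with it the nature of the visit, to a third party:

```python
# Purely illustrative: a generic third-party tracking pixel. The endpoint and
# parameter names are hypothetical, not those of any actual analytics provider.
from urllib.parse import urlencode

TRACKER_ENDPOINT = "https://tracker.example.com/pixel"  # hypothetical third party

def pixel_url(page_url: str, user_id: str, event: str) -> str:
    """Build the URL a 1x1 tracking image embedded in a page would request.

    Everything in the query string -- the page being viewed, a persistent
    user identifier, and the event name -- reaches the third party as a
    side effect of the browser loading the image.
    """
    params = {
        "id": user_id,    # cookie-derived identifier, often stable across sites
        "url": page_url,  # reveals *what* the user was reading or requesting
        "ev": event,      # e.g. "page_view", "form_submit"
    }
    return f"{TRACKER_ENDPOINT}?{urlencode(params)}"

# A consultation page that embeds such a pixel discloses the visit to the
# tracker's operator, even though the user only ever interacted with the
# provider's own site.
print(pixel_url("https://provider.example.org/abortion-pill/consult",
                "user-1234", "form_submit"))
```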
This commentary examines the challenges and implications for SRHR and bodily autonomy in a digital world, and highlights the critical knowledge and policy gaps that must be addressed to ensure that the transformational potential of technology reaches those who need it the most, while safeguarding their human rights.

SRHR and bodily autonomy in a digital world

As restrictions and attacks on bodily autonomy intensify in many parts of the world, the datafication of our lives is also accelerating. Our bodies no longer exist just in the physical world – they also exist in the digital domain: in the data about how many steps we took today, the locations we visited, the routes we travelled, the information we searched for, the food we ordered, the financial transactions we made, and the data we shared with software applications about our physical or mental health. AI can assemble these disparate pieces of data to create profiles of each of us, in which the digital puzzle pieces of our bodies and lives are put together to form personal, sensitive, identifiable wholes.

The implications are deep and require careful consideration. For instance, physical or mental health apps can share data about our bodies or minds with our health insurer, or with an existing or potential employer. Our internet search for abortion pills, and our chat-app communication with our family about our body and our personal choice, could be handed over to the government by Google and Facebook to build a case for prosecution where these services are criminalised. This is exactly what happened recently in the US, where a woman and her daughter stood trial in the state of Nebraska for performing an illegal abortion – with a key piece of evidence provided by Meta, the parent company of Facebook. 14 As abortion bans across America are implemented and enforced, TechCrunch (an online newspaper) reports that law enforcement is turning to social media platforms to build cases to prosecute women seeking abortions or abortion-inducing medication – and online platforms like Google and Facebook are helping. 15 This is not happening only in America; it is a growing trend, including in low- and middle-income countries, where the internet is being used to target both individual women and providers of safe abortion. For example, in Kenya, where abortion is criminalised, 16 the internet is one of the main sources of SRHR-related information for young people, but it also puts them at risk of surveillance and of mis- and disinformation. 17

Healthcare providers are now beginning to navigate these complex ethical and legal issues, but much remains to be done given the absence of a common global understanding, shared definitions, and data protection regulations. A 2022 study revealed that 87% of the 23 most popular women’s health apps on the international market (the App Store on the Apple operating system [iOS] and Google Play on the Android system) shared user data with third parties, yet only about half requested consent from their users. 18 As Kelly and Habib point out in their commentary in this series, data and privacy practices around fertility apps have increasingly been called into question in a post-Roe America. 19

Some of the greatest risks that women, girls and gender-diverse people face with digital technology include online harassment, cyberbullying, cyberstalking, unsolicited sexual messages or images, non-consensual sharing of intimate photos, and child sexual exploitation and abuse, as well as data security and privacy risks. 20 According to a 2020 UNICEF report, women and girls, and people of diverse genders, are at greater risk of digital harm: 52% of young women globally have experienced some form of digital harm, and 87% of them believe the problem is getting worse. 5 Social media, in particular, is perceived as an unsafe space, with 68% of online abuse of women and girls taking place on social media platforms. 21
The violation of the human rights of women, girls and LGBTI persons is widely reported; yet online environments continue to operate as an accountability-free zone. Evidence also points towards inherent bias against gender equality and SRHR in the algorithmic models used by social media platforms, which is further compounded by the refusal of big tech companies to address abuse of, and attacks on, gender and SRHR advocates. In 2021, anti-gender-equality activists conducted a targeted campaign against a well-known Somali women’s rights activist who uses her Facebook page to support survivors of domestic violence and rape in the Somali diaspora. She was restricted from posting by Facebook because her content had been repeatedly mis-flagged as inappropriate, 22 demonstrating how “community standards” can be arbitrary: tech companies can ultimately create and enforce their own norm systems with limited or no accountability, or those systems can be misused by the state. Unfortunately, this is not an isolated incident; there are many other varied and concerning examples of how social media platforms can silence specific users.

It now seems undeniable that certain online spaces are fertile ground for harmful activity targeting gender norms. A recent report by ODI, “Hidden in plain sight”, 23 demonstrates how the infrastructure of social media shapes gender norms, unpicking how technological design, profit models, and organisational hierarchies all give way to patriarchal norms and, in doing so, perpetuate sexist, heteronormative and racist stereotypes. The absence of meaningful due diligence and precautionary approaches has exposed wide accountability deficits in the legal regimes meant to address these situations. Furthermore, recent evidence points to a thriving digital economy based on gender trolling and violations of human rights, which disincentivises meaningful action on these issues by tech companies.

In addition to gender bias, AI faces other diversity bias challenges, including race. For example, the Algorithmic Justice League studied facial recognition algorithms and found that the input images on which various algorithms were based consisted of 80% images of white persons and 75% male faces. As a result, the algorithms detected male faces with a high accuracy of 99%, while their ability to recognise black women was significantly lower, at only 65%. This has two main implications. First, homogeneous datasets will lead to biased algorithms, making AI-driven digital health interventions ineffective or actively harmful for excluded communities. Second, the technology needed to drive these innovations is in most cases designed by male coders, even though this is beginning to change in the fem-tech industry. For there to be more diversity and representation in datasets, there needs to be wider access, meaningful use, and more inclusive design. 24

Thus, as efforts are made to digitise health systems, ensuring an intersectional approach goes to the heart of addressing the multitude of issues raised by AI. It is imperative that ethical issues in digital health solutions are anticipated before they arise, and that they are treated with the same commitment to review, oversight, and human rights as is commonly applied throughout other areas of health.
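A minimal sketch of the kind of disaggregated evaluation that surfaces such gaps is shown below. The toy numbers are invented for illustration (they merely echo the pattern of the figures above, not the underlying study data); the point is that an aggregate accuracy figure can look acceptable while one subgroup is badly served:

```python
# A minimal sketch, on invented toy data, of disaggregated model evaluation.
# The accuracy figures below are illustrative only, not the study's data.
from collections import defaultdict

# (subgroup, model_was_correct) pairs for a hypothetical face-recognition test set
results = (
    [("lighter-skinned men", True)] * 99 + [("lighter-skinned men", False)] * 1
    + [("darker-skinned women", True)] * 65 + [("darker-skinned women", False)] * 35
)

totals, correct = defaultdict(int), defaultdict(int)
for group, ok in results:
    totals[group] += 1
    correct[group] += ok

# Aggregate accuracy looks tolerable: (99 + 65) / 200 = 82%
print(f"overall: {sum(correct.values()) / sum(totals.values()):.0%}")

# Disaggregation exposes the gap: 99% for one group, 65% for the other
for group in totals:
    print(f"{group}: {correct[group] / totals[group]:.0%}")
```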
At the design stage, for example, there need to be validation studies that take into account the diversity of the populations informing machine learning algorithms, and that advance transparency about the underlying datasets behind the predictive models on which clinical decisions are expected to be made. An important step in this direction is the informed involvement of the served and affected communities in ideation and design processes.

Governance deficits and the business model of big tech companies

To understand the situation, one needs to look at the full spectrum of issues, which also includes the business model of big tech companies; how decisions are made with regard to gender and digital technologies; and the algorithmic models and code which underpin content on social media platforms. The depth and breadth of personal data being generated, stored, and used grow exponentially, and existing approaches to data governance have not kept pace, resulting in severe challenges to gender equality. It is evident that seemingly anonymised personal data can easily be de-anonymised by AI, facilitating the tracking, monitoring, and profiling of women, girls and gender-diverse people, as well as the prediction of their behaviours (a minimal sketch of how such re-identification works is given at the end of this section). Together with facial recognition technology, such AI systems can be used to cast a wide net of surveillance. Critically, given that the law is silent on many privacy issues, mere compliance with regulations – some of which may be significantly outdated and not aligned with technological advances – no longer suffices. In this context, while menstrual apps are often promoted and perceived as digital tools that enhance women’s bodily autonomy, the aforementioned power asymmetries highlight the need to further unpack how autonomy can truly be ensured in digital spaces.

The gender costs of business models based on data extraction, the concentration of power, and – fundamentally – privacy and data protection legislation that is not keeping up with digital transformations all require urgent action. Big tech companies often operate as what can best be described as neo-colonialist enterprises in an accountability-free zone. Clearly, data governance needs to be reimagined. The Lancet and Financial Times Commission on Governing Health Futures argues that “The governance of digital technologies in health and health care must be driven by public purpose, not private profit”. The Commission has called for “a new approach to the collection and use of health data based on the concept of data solidarity, with the aim of simultaneously protecting individual rights, promoting the public good potential of such data, and building a culture of data justice and equity”. 25

The present safeguards are woefully inadequate, and the language of data protection itself needs to be questioned. Data is often referred to as oil, and hence a majority of existing frameworks focus on the protection of data as a resource. However, in a world where physical and digital identities are inextricably linked, an embodied approach to data governance – a holistic understanding of bodily autonomy and integrity, dignity and freedom that spans digital and physical spaces – is critical to fundamentally improving things. For example, consent-based frameworks must be viewed through a prism of relational rather than individual autonomy, in that conditions must be created for informed data ownership at the individual level before consent can be given for data use.
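To make the de-anonymisation point concrete, here is the promised sketch of a classic linkage attack, on invented toy data: even with names stripped, joining “anonymous” records to a public register on a handful of quasi-identifiers (postcode, date of birth, sex) re-identifies individuals. All names, fields, and values are hypothetical:

```python
# A minimal sketch of a linkage attack on invented toy data: records with
# names removed are re-identified by joining on quasi-identifiers. All
# names, fields, and values here are hypothetical.

anonymised_records = [  # e.g. exported from a health or search-history dataset
    {"postcode": "50480", "dob": "1991-03-12", "sex": "F",
     "activity": "searched: where to buy abortion pills"},
    {"postcode": "50480", "dob": "1985-07-02", "sex": "M",
     "activity": "searched: flu symptoms"},
]

public_register = [  # e.g. a voter roll or leaked customer database with names
    {"name": "A. Hassan", "postcode": "50480", "dob": "1991-03-12", "sex": "F"},
    {"name": "B. Tan", "postcode": "50480", "dob": "1985-07-02", "sex": "M"},
]

QUASI_IDENTIFIERS = ("postcode", "dob", "sex")

def reidentify(records, register):
    """Yield (name, activity) pairs where all quasi-identifiers match."""
    index = {tuple(p[k] for k in QUASI_IDENTIFIERS): p["name"] for p in register}
    for rec in records:
        key = tuple(rec[k] for k in QUASI_IDENTIFIERS)
        if key in index:
            yield index[key], rec["activity"]

for name, activity in reidentify(anonymised_records, public_register):
    print(name, "->", activity)
```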
Conclusion

If the potential of digital technology for safeguarding and enhancing sexual and reproductive health and rights is to be realised, we must ensure the widest possible public debate about alternative approaches to data governance. Given the ubiquity of technology-facilitated gender-based violence and the existing power asymmetries in the design, deployment, and regulation of new technologies, including gender biases in big datasets and the algorithms that use them, we need to position the rights of women, girls and gender-diverse people at the centre of these debates, by ensuring their meaningful participation and voice in decision-making related to digital technologies.


Most cited references (7)


          Inequities in Health Care Services Caused by the Adoption of Digital Health Technologies: Scoping Review

Background: Digital health technologies (ie, the integration of digital technology and health information) aim to increase the efficiency of health care delivery; they are rapidly adapting to health care contexts to provide improved medical services for citizens. However, contrary to expectations, their rapid adoption appears to have led to health inequities, with differences in health conditions or inequality in the distribution of health care resources among different populations.

Objective: This scoping review aims to identify and describe the inequities in health care services brought about by the adoption of digital health technologies. The factors influencing such inequities, as well as the corresponding countermeasures to ensure health equity among different groups of citizens, were also studied.

Methods: Primary studies and literature, including articles and reviews, published in English between 1990 and 2020 were retrieved using appropriate search strategies across the following three electronic databases: Clarivate Analytics’ Web of Science, PubMed, and Scopus. Data management was performed by two authors (RY and WZ) using Thomson EndNote (Clarivate Analytics, Inc), by systematically screening and identifying eligible articles for this study. Any conflicts of opinion were resolved through discussion with the corresponding author. A qualitative descriptive synthesis was performed to determine the outcomes of this scoping review.

Results: A total of 2325 studies were collected during the search process, of which 41 (1.76%) were identified for further analysis. The quantity of literature increased until 2016, with a peak in 2020. The United States, the United Kingdom, and Norway ranked among the top three countries for publication output. Health inequities caused by the adoption of digital health technologies in health care services are reflected in two dimensions: the inability of citizens to obtain and adopt technology, and the different disease outcomes found among citizens under technical intervention measures. The factors that influenced inequities included age, race, region, economy, and education level, together with health conditions and eHealth literacy. Finally, action can be taken to alleviate inequities in the future by government agencies and medical institutions (eg, establishing national health insurance), digital health technology providers (eg, designing high-quality tools), and health care service recipients (eg, developing skills to access digital technologies).

Conclusions: The application of digital health technologies in health care services has caused inequities to some extent. However, existing research has certain limitations. The findings provide a comprehensive starting point for future research, allowing for further investigation into how digital health technologies may influence the unequal distribution of health care services. The interaction between individual subjective factors as well as social support and influencing factors should be included in future studies. Specifically, access to and availability of digital health technologies for socially disadvantaged groups should be of paramount importance.

            The Lancet and Financial Times Commission on governing health futures 2030: growing up in a digital world


              Privacy, Data Sharing, and Data Security Policies of Women’s mHealth Apps: Scoping Review and Content Analysis

Background: Women’s mobile health (mHealth) is a growing phenomenon in the mobile app global market. An increasing number of women worldwide use apps geared to female audiences (female technology). Given the often private and sensitive nature of the data collected by such apps, an ethical assessment from the perspective of data privacy, sharing, and security policies is warranted.

Objective: The purpose of this scoping review and content analysis was to assess the privacy policies, data sharing, and security policies of women’s mHealth apps on the current international market (the App Store on the Apple operating system [iOS] and Google Play on the Android system).

Methods: We reviewed the 23 most popular women’s mHealth apps on the market by focusing on publicly available apps on the App Store and Google Play. The 23 downloaded apps were assessed manually by 2 independent reviewers against a variety of user data privacy, data sharing, and security assessment criteria.

Results: All 23 apps collected personal health-related data. All apps allowed behavioral tracking, and 61% (14/23) of the apps allowed location tracking. Of the 23 apps, only 16 (70%) displayed a privacy policy, 12 (52%) requested consent from users, and 1 (4%) had a pseudoconsent. In addition, 13% (3/23) of the apps collected data before obtaining consent. Most apps (20/23, 87%) shared user data with third parties, and data sharing information could not be obtained for the remaining 13% (3/23). Of the 23 apps, only 13 (57%) provided users with information on data security.

Conclusions: Many of the most popular women’s mHealth apps on the market have poor data privacy, sharing, and security standards. Although regulations exist, such as the European Union General Data Protection Regulation, current practices do not follow them. The failure of the assessed women’s mHealth apps to meet basic data privacy, sharing, and security standards is not ethically or legally acceptable.

                Author and article information

Journal
Sexual and Reproductive Health Matters (Sex Reprod Health Matters)
Publisher: Taylor & Francis
ISSN: 2641-0397
Publication date: 6 November 2023
Volume: 31
Issue: 4
Article number: 2269003
Affiliations
[a] Director, United Nations University International Institute for Global Health (UNU-IIGH), Kuala Lumpur, Malaysia
[b] International Project Manager, UNU-IIGH, Kuala Lumpur, Malaysia
[c] Chief of Gender and Health, UNU-IIGH, Kuala Lumpur, Malaysia
Author information
ORCID: https://orcid.org/0000-0002-4960-4994
Article
DOI: 10.1080/26410397.2023.2269003
PMCID: 10629411
PMID: 37930349
                © 2023 The Author(s). Published by Informa UK Limited, trading as Taylor & Francis Group

This is an Open Access article distributed under the terms of the Creative Commons Attribution-NonCommercial License (http://creativecommons.org/licenses/by-nc/4.0/), which permits unrestricted non-commercial use, distribution, and reproduction in any medium, provided the original work is properly cited. The terms on which this article has been published allow the posting of the Accepted Manuscript in a repository by the author(s) or with their consent.

Page count
Figures: 0, Tables: 0, Equations: 0, References: 25, Pages: 5
                Categories
                Commentary
                Editorial

sexual and reproductive health and rights, digital technology, human rights
