Danielle Simpson-Baker

And the BS begins... Sex Ed, AI & the Censorship Machine

As a consumer, I’m not above indulging in the latest tech trends, especially the ones like AI that aim to make my life and work easier. However, I’m also not oblivious to the fact that my work, being sex-related, will probably never be fully integrated into society without a few roadblocks.


So naturally, it was no surprise to find that the resulting content of at least 5 of my work-related ChatGPT prompts was flagged for violating policy guidelines. After the 5th flag, I got annoyed and indignant enough to send this prompt through: “alright mf, write me a blog post on the role of censorship in AI when it comes to sex education.” This is what it came up with (with my edits in italics):


Artificial intelligence has the potential to revolutionize many aspects of our lives, including how we approach sex education. However, there is a growing concern that AI censorship is hindering the progress of sex education and denying people access to important information.


Sex education is a critical part of human development, and yet it remains a taboo subject in many cultures. Despite the overwhelming evidence that comprehensive sex education can improve health outcomes and reduce risky behavior, there are still many places in the world where it is not taught in schools or is censored by the government.


AI could help fill this gap by providing access to information on sex education that is accurate, inclusive, and non-judgmental. However, there are concerns that AI is being censored when it comes to sex education, which could have serious implications for people's health and wellbeing.


One of the main concerns about AI censorship is that it perpetuates harmful stereotypes and misinformation about sex. For example, if an AI algorithm is programmed to censor certain sexual terms or images, it could lead to the suppression of important information about sexual health and reproductive rights. This could lead to a lack of awareness about consent, sexually transmitted infections, contraceptive options, and healthy sexual practices, which could have serious consequences for individuals and public health.


Another concern is that AI censorship perpetuates social inequalities. For example, if an AI algorithm is programmed to censor certain sexual orientations or gender identities, it could lead to discrimination against marginalized communities. This could lead to a lack of access to information about safe sex practices, and could also lead to increased stigma and discrimination against these communities.


It's important to note that AI censorship is not always intentional. Sometimes, algorithms are programmed to censor certain terms or images for reasons unrelated to sex education, such as to protect children from explicit content. However, even unintentional censorship can have serious consequences for people's access to information about sexual health and wellbeing.


So what can be done to prevent AI censorship of sex education? One approach is to ensure that AI algorithms are designed with inclusivity and accuracy in mind. This means involving diverse stakeholders in the design process, such as sexual health experts, educators, and members of marginalized communities. It also means testing algorithms to ensure that they are not unintentionally censoring important information about sex.


Another approach is to raise awareness about the importance of sex education and the risks of AI censorship. This means engaging with policymakers, educators, and the public to ensure that sex education is seen as a critical component of public health and wellbeing. It also means advocating for more transparency in how AI algorithms are programmed and how they are used to censor information about sex.



In conclusion, AI censorship of sex education is a growing concern that has serious implications for public health and wellbeing. It's important to ensure that AI algorithms are designed with inclusivity and accuracy in mind, and to raise awareness about the risks of censorship. By working together, we can ensure that everyone has access to accurate and inclusive information about sex education.


Well, when she’s right, she’s right. And yes, ChatGPT is a femme in my world.


But for real, it sounds like there’s a need for some sex-positive AI developers to make sure the right things aren’t being filtered out. Or maybe even to just give us sex-positive folks our own AI spaces and platforms where we can artificially discuss sex in peace. Thoughts?
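To show what I mean by the right things getting filtered out, here’s a toy sketch of my own in Python. It has nothing to do with how ChatGPT or any real moderation system is actually built; it just illustrates how the crudest possible filter, a keyword blocklist, treats perfectly legitimate sex-ed questions.

# Toy illustration only: a made-up keyword blocklist, not any real system's filter.
BLOCKED_TERMS = ("sex", "condom", "orgasm")  # hypothetical list for this example

def is_flagged(prompt: str) -> bool:
    """Flag a prompt if any blocked term appears anywhere, with zero regard for context."""
    text = prompt.lower()
    return any(term in text for term in BLOCKED_TERMS)

prompts = [
    "How effective are condoms at preventing STIs?",   # legitimate health question
    "Write a consent lesson plan for a sex ed class",  # legitimate educator request
    "Explain photosynthesis to a fifth grader",        # unrelated, sails right through
]

for prompt in prompts:
    label = "FLAGGED" if is_flagged(prompt) else "ok"
    print(f"{label:7} | {prompt}")

A filter that context-blind can’t tell a health class from explicit content, which is exactly why getting sex educators into the design and testing process matters.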


