TikTok is one of many online social platforms that foster and unite different communities. BookTok, one of the many communities on the app, is known as a space where book lovers and others interact about current reads, book recommendations, new releases and more.
However, BookTok has been under scrutiny over a popular creator's sponsorship of Character.AI. This generative artificial intelligence app takes authors' characters, words, stories and tone without their consent or knowledge. With that stolen material, Character.AI builds character chat rooms where users can role-play new stories, talk with characters and more.
Not only is this dangerous, but it is also harmful to the book and reading community moving forward.
Marianna Moore, also known as mariannasreads on TikTok, posted a sponsorship on June 25 promoting Character.AI. Her stated reason for sharing the app was to help those experiencing a reading slump, since anyone can create stories with their favorite characters.
This caused others on BookTok to post their opinions on why the sponsorship is harmful. Many found Moore's reasoning odd and called the app a form of plagiarism. Stealing another person's work in any form is wrong, but why is this case especially harmful?
Generative AI is a technology many people now encounter daily. It draws on existing content, in this case books and novels, to create new content in response to inputs or prompts. ChatGPT is a well-known example of generative AI, pulling from online sources to respond to the prompts it is given.
Before Moore’s sponsorship, Character.AI was already facing lawsuits from parents who believe their children were exposed to inappropriate content on the website. The site's chatbots let users interact with any characters they choose; however, because the chatbots are designed to respond to any input, many minors have been exposed to harmful material.
The first lawsuit, tragically, was filed by a mother whose son died by suicide after developing an inappropriate relationship with a character on Character.AI. Two similar lawsuits were filed afterward, and Character.AI changed its safety guidelines to state that a person must be 18 years or older to access the website.
Though Character.AI modified its guidelines, a child can easily lie about his or her age and use the app without penalty. That leaves users unprotected, especially because the website strips characters of the intent and perspective their human authors created for them.
Another controversy arose when author Jessa Hastings commented on the matter. Hastings is best known for her book series "Magnolia Parks" and, more recently, her new release, "Conditions of Will." She posted about Moore’s sponsorship on her Instagram stories in a way that left fans confused.
“You don’t have a Roomba, you wouldn’t be caught dead with a Ring camera, and you definitely don’t use an Apple Watch, an Oura Ring or a Woop,” Hastings said. “So, you on your AI horse, I hope to god your hands are well clean of all the above or else you’re nothing but a giant hypocrite.”
The issue with Hastings' statement is that she is comparing two different types of AI. Roombas, Apple Watches and the other devices she listed run on machine learning (ML), a subcategory of AI that finds patterns in the data it is given.
Google Maps, for example, learns from location and traffic data so users can access directions, traffic backups, restaurants and more. Generative AI works differently: instead of simply recognizing patterns, it generates new content built from existing human work, in this case, authors' books.
This is pure plagiarism. Authors such as Victoria Aveyard responded to Hastings' and Moore’s comments on generative AI. Aveyard said, “If you use Generative AI, you’re not a writer, you’re not an artist. You’re a thief.”
As Aveyard stated, using generative AI makes someone a thief, not a writer: the material it produces is stolen, not original work. Promoting it pushes readers further from reality and makes AI feel more comfortable to use, and that comfort is difficult to argue against.
Moore posted an apology a day later, saying she didn’t fully understand the harmful impacts generative AI has on authors and readers. AI shouldn’t even be in the same conversation as writing books. The growing comfort with AI, especially in university classrooms, is eerie given the current political climate.
Reading slumps understandably happen, but try picking up a book by a favorite author or a new release with an interesting twist. Continue to support authors and local bookstores, not AI websites.
Natalie Saddler is a sophomore studying journalism at Ohio University. Please note the views and opinions expressed in this column do not reflect those of The Post. Want to talk to Natalie about their column? Email them at ns505423@ohio.edu.