
The Vocal Fries
The monthly podcast about linguistic discrimination. Learn about how we judge other people's speech as a sneaky way to be racist, sexist, classist, etc. Carrie and Megan teach you how to stop being an accidental jerk. Support this podcast at www.patreon.com/vocalfriespod
The Vocal Fries
Algospeak
Carrie and Megan talk with Adam Aleksic about his new book, Algospeak: How Social Media is Transforming the Future of Language.
Contact us:
- Threads us @vocalfriespod
- Bluesky us @vocalfriespod.bsky.social
- Email us at vocalfriespod@gmail.com
Thanks for listening and keep calm and fry on
Carrie: Hi, and welcome to the Vocal Fries Podcast, the podcast about linguistic discrimination.
Megan Figueroa: I'm Megan Figueroa.
Carrie Gillon: And I'm Carrie Gillon.
Megan: Hey, everyone. We're here.
Carrie: We're here. Yeah.
Megan: We're here. Yeah, I feel like the world is falling apart around me, but hey, we're still recording, right?
Carrie: Yeah, it really does feel like we're this far away from ecological collapse, fascist collapse. I don't know. I mean, collapsing into fascism, not fascism collapsing, unfortunately.
Megan: Yeah. Exactly.
Carrie: Yeah. It feels rough.
Megan: Yeah. But we have a nice, fun episode today. So [crosstalk] we'll be all right.
Carrie: Yeah. It is a fun episode, yeah. About the use of language on the internet, which is always a hot topic because language changes faster there, I think.
Megan: Yeah.
Carrie: And speaking of tech, this came out last week, there's an article in the Annals of Internal Medicine Clinical Cases called A Case of Bromism Influenced by Use of Artificial Intelligence.
Megan: Bromism?
Carrie: Yeah. When you ingest bromide, it causes some kind of syndrome that they call bromism.
Megan: Okay.
Carrie: And I guess it was more common in the early 20th century. I don't know why. Just maybe we weren't as careful with what we were ingesting.
Megan: Yeah, it could have been a byproduct or something, or not a byproduct, but yeah, maybe a byproduct.
Carrie: Yeah. So bromism, or bromide toxicity, apparently causes things like neuropsychiatric and dermatologic symptoms.
Megan: Whoa. Okay. So you can actually see it on a person.
Carrie: Yeah. Apparently up to 8% of psychiatric admissions were because of bromism.
Megan: At the time [inaudible]?
Carrie: Yeah.
Megan: Wow.
Carrie: Yeah. Oh, this is why. Okay, so bromide salts were in many over-the-counter medications.
Megan: There it is.
Carrie: That would target things like insomnia, hysteria, and anxiety. Yeah, so us, we would have been...
Megan: I would have been bromide.
Carrie: Maybe. But yeah, certainly we would have been targets of it, or not on purpose necessarily, but yeah.
Megan: Right.
Carrie: Oh, the FDA, the US FDA, eliminated bromide from over-the-counter products between 1975 and 1989. So actually, in our lifetimes.
Megan: Yeah. Wow. That's pretty late.
Carrie: Yeah. So now bromism has re-emerged because of dietary supplements, and yeah, there's a lot of wild supplements out there, so really be careful of the ones that you take, please.
Megan: Yeah, because none of them are FDA approved.
Carrie: Right. There's nothing really overseeing them. They only get in trouble if you make certain claims, or if they're completely toxic. Yeah. So sedatives sometimes contain bromide. If you take too much dextromethorphan, you can also get bromism.
Megan: Wow.
Carrie: Yeah. So, this is not just stuff out there, this is AI being involved. So there's a 60-year-old man with no psychiatric or medical history. I assume there was some medical history, but no relevant medical history, because people don't normally have zero medical history. But anyway.
Megan: Yeah. At 60? No.
Carrie: At 60. Come on. So, probably no relevant medical history. He presented to the ER and said, "Oh, I think my neighbor is poisoning me." So at first he said, "Oh, I'm not taking any medications. No supplements." Everything looked normal. But there was something in his blood. So he was admitted because his labs were weird, right?
Megan: Yeah.
Carrie: So they admitted him. Then he said, oh, I have multiple dietary restrictions. I distill my own water. He was very thirsty, but was also paranoid about drinking the water that was offered to him.
Megan: Oh, is that why he distills his own water?
Carrie: Probably, yeah.
Megan: Because he doesn't trust, okay.
Carrie: Yeah. The doctors are like, hmm, I wonder if you consumed some heavy metal, because that can do weird things to your blood, right? Obviously.
Megan: Yeah.
Carrie: Within the first 24 hours of being admitted, he started getting more and more paranoid. He had auditory and visual hallucinations. So then they put him on an involuntary psychiatric hold. Sounds like he needed it, unfortunately.
Megan: Yeah.
Carrie: Yeah.
Megan: This sounds like a House episode.
Carrie: Yes! Yes, it does. Yeah. So he was given a drug for psychosis. And then they treated him with IV fluids and electrolytes, etc. So then he was stable enough to be in the inpatient psychiatric unit. Everything seemed to be okay, or better anyway. They thought, oh, maybe he just wasn't feeding himself properly because he's a very restrictive vegetarian. He also had deficiencies like vitamin C deficiency, B12, folate deficiencies, etc. So, they had to refeed him. They had to give him food to make sure he was getting back to normal.
Okay. Then, now he's no longer in his paranoid state. He's getting the food that he needs and the fluids he needs. Now he's able to talk again, report things more carefully. He says, oh, I've recently started getting acne. A 60-year-old man. Fatigue, insomnia, and some other things. And the doctors are like, hmm, maybe this is bromism.
Megan: Oh, so the doctors right away, once the dermatological thing came in, they were like, this might be.
Carrie: Yeah.
Megan: Okay.
Carrie: Because there's the psychiatric stuff and now dermatological and now they're thinking, hmm, maybe this means bromism. So, yeah. Then he said, oh, I read about the fact that salt, the regular table salt, sodium chloride, can be bad for your health. And yes, it can. Too much of it can. But too little of it is also bad for us.
Megan: Yeah, it's true.
Carrie: So, I have to make sure I get enough salt personally.
Megan: Me too. Yeah.
Carrie: So, yeah. Anyway, he was surprised he could only find literature about how to reduce sodium in your body and nothing about getting rid of it entirely.
Megan: Yeah, because we need salt.
Carrie: Because you shouldn't. Yeah.
Megan: Our cells need salt to do their jobs.
Carrie: Yeah. So then, oh boy, he studied nutrition in college. Oh, no.
Megan: No.
Carrie: So then he was like, okay, I'm going to do an elimination diet and I'm going to eliminate sodium chloride. So for three months, he replaced sodium chloride with sodium bromide obtained from the internet after a consultation with ChatGPT.
Megan: Oh my god.
Carrie: In which he had read that chloride can be swapped with bromide. But for other purposes, like cleaning. Not for ingesting. But ChatGPT doesn't know this, right? Because it doesn't know anything.
Megan: Oh, yeah. Okay, so it didn't say that and he ignored it. ChatGPT just didn't say it.
Carrie: We don't know. Maybe they do have the logs. I don't know. Because you can look at the logs. Everything's public, people.
Megan: Right.
Carrie: If you're talking to ChatGPT, everyone can see it.
Megan: Yep.
Carrie: If you're not scared of using AI for other reasons, be scared of that, at least.
Megan: Yeah.
Carrie: So after three weeks of getting normal food or normal-ish food, his psychotic symptoms improved and he seemed to be stable. They took him off the antipsychotic medication and yeah, he seemed to be fine.
Megan: Oh my god. I wonder if he was distilling his own water before or if that was a new thing.
Carrie: I'm going to guess that's an older thing because that just feels like a... It seems like he was already a very strict vegetarian, right?
Megan: Yeah.
Carrie: He already seemed to really care about what he put in his body. So I could imagine that distilling water is a really old thing.
Megan: Yeah. Okay. But it also seems paranoid too, so that's why I was wondering.
Carrie: Yeah, it depends on where you live. In some places, it's necessary.
Megan: That's true. Yeah. Good point.
Carrie: So we don't know where this is, right? This is a case study with very anonymized, we just know like high-level details. So I have no idea, is this person living off of a well or something where maybe it makes more sense? I don't know.
Megan: Right. Oh my god. I'm surprised. Because this is across the spectrum of ages that people are trusting ChatGPT to do things, which is interesting.
Carrie: Yeah. I don't understand why more people aren't skeptical. People have just bought right into it. It makes me feel more misanthropic and I don't want to. I don't want to feel that way. I want to feel more connected to other humans. And it just makes me feel very disconnected to a huge percentage of the population.
Megan: Yeah. I don't trust it. I wouldn't trust it with anything concerning my body and health, [crosstalk] I'm sure.
Carrie: No. The only thing it's good at is creating sentences that sound like good English.
Megan: Yeah.
Carrie: That's all. If you want it to create some sentences that sound like good English, yes, it can do that. But it doesn't know what it's doing. It's using statistics. That's it. Just to create sentences. It doesn't understand what it's doing. I don't know. It's going to break my brain.
Megan: Yeah. I feel like your brain is already breaking.
Carrie: Yeah. I know. It is. Definitely.
Megan: Oh, my gosh. I can't believe that was in a legit medical journal, too.
Carrie: Yeah. Obviously, I don't know for sure, but I'm going to trust this journal that this is a real case. Yeah. It sounds real.
Megan: There's going to be more, not necessarily bromism, but there's going to be more cases of people hurting their bodies because of ChatGPT.
Carrie: Yeah. People are falling in love with ChatGPT or other AI, Gen AIs, other LLMs. There's a whole subreddit of people falling in love, right?
Megan: Yeah.
Carrie: They're showing off their engagement rings to their AIs.
Megan: Oh, my god. Which they bought themselves.
Carrie: Yeah. Well, the AI told them which one to buy. It's a deep sickness.
Megan: And what was it that Emily and Alex told us? I don't remember who said it, but it was that they anthropomorphize it by using pronouns. We've, not we, but some people feel like they're talking to someone out there.
Carrie: Yeah, the way that it refers to itself, it makes it seem like it has a mind. Yeah.
Megan: Right. Which was intentional.
Carrie: I think so.
Megan: They could have made it so that it wasn't like that, right?
Carrie: Yeah, there was a choice that was made. I don't know if they knew how damaging it was going to be to make that choice. Because like we talked about in our bonus episode, by the way, join our Patreon if you want to hear it. We talk about them tricking themselves, right? So the makers of the AIs being fully tricked by their own creation, like psychics. So it's hard to know exactly where the decisions were made that were malicious or calculated, maybe is a better word. And where it was just like, oh, it made sense to them. And then it tricked them. It's really hard to know.
Megan: Yeah, absolutely.
Carrie: Anyways.
Megan: Wild.
Carrie: Stay safe out there.
Megan: Yes.
Carrie: If I could ban LLMs or at least really severely limit their use, I would. But anyways.
Megan: Yeah.
Carrie: Anyways. Well, thank you. And yeah, patreon.com/vocalfriespod for all the bonus episodes. And we'll see you next month.
Megan: Yeah.
Carrie: Enjoy. So excited today to have Adam Aleksic on. He's a linguist and content creator currently living in Manhattan. He creates educational linguistic videos on TikTok, Instagram, and YouTube and holds a linguistics degree from Harvard University, where he served as president of the Harvard Undergraduate Linguistics Society. He's written for the Washington Post, discussed online language on NPR, and lectured on language and social media at Stanford, Yale, Harvard, and other top universities, including a TEDx talk at the University of Pennsylvania. He's also the author of Algospeak: How Social Media is Transforming the Future of Language. So welcome.
Adam Aleksic: Thank you. That was way too convoluted of a bio. I really appreciate it.
Carrie: I just took it from your website. So we always start our questions off with: you've written a book, so why did you want to write this book, and why now?
Adam: Yeah. Well, as a linguist and as a content creator, I can't turn off linguist brain and content creator brain. When I'm making my own videos and when I'm looking at other people's videos, I'm thinking about, what's the language choice here? I can feel my own language choice being affected by algorithms. And the idea behind this book, Algospeak, is that algorithms are shaping every aspect of our language.
It started with words like unalive. You can't say kill on TikTok because there's censorship on the platform. So creators rerouted to use the word unalive instead. And that's a literal example we can point to of algorithms shaping our speech in real time. And now there's kids in middle schools writing essays about Hamlet thinking about unaliving himself. But it's also changing every aspect of our language. That's just the tip of the iceberg. It's changing our memes, it's changing our trends. It's changing the communities that come up with language and how ideas diffuse out of those communities.
Megan: I'm just thinking right now, I think the way that the algorithm made me first start doing something different was when I didn't want it to pick up on me saying Trump, because I didn't want Trump related stuff. So I would maybe put an asterisk for one of the letters.
Adam: Yeah, that's so funny. I literally used that as an example in the first chapter Tr*mp. I've seen people use other evasive speech called Voldemorting. So instead of saying the name, you use a euphemism like he who must not be named. So I've seen people refer to Trump as 45 or Cheeto or Orange Man or something like that. That also has a double effect, right? You're not just avoiding the algorithm. You're also signaling disrespect. It's empowering to reclaim and treat that name as a taboo because you're also now using language as a form of power.
Carrie: I love that. Yeah, that's absolutely true. If someone sees that and they also agree that Trump is not someone to be respected, they'll be like, okay, you're signaling that you agree with that.
Adam: I think there's a few layers of communication happening whenever you use a word like unalive. You're literally using it to mean kill. You're also using it to signal, yeah, the algorithm isn't letting me talk about this. You're also using it to signal, I'm versed in this internet sociolect, this general way we have of communicating that other people understand because you could use a lot of different euphemisms for the word kill. But this is the one that's the most apt for this particular environment. And so we as humans are very good at molding our speech for each environment.
Carrie: Yeah. And you talk about that especially with respect to 4chan, the only way you can really signal that you're part of this group. Just the way that that platform works is to use the correct vocabulary.
Adam: Right. It's necessary on the platform to demonstrate a performative proficiency in their shared slang. And this is partially why 4chan came up with so many words, also the meme culture. I think a lot of things come out of culture and out of platform design. There's a reason a good rule of thumb for internet slang right now is that a slang word either comes from African-American English or comes from 4chan.
Carrie: Yeah. Well, it's so sad, but yes, [crosstalk] it's true.
Megan: Yeah.
Adam: There's a few exceptions, but it's shocking how much that holds true.
Carrie: Can you think of an example that's an exception?
Adam: Yeah. The word delulu was popularized by K-pop stans. The phrase in my blank era was popularized by Swifties. And each of those communities, they have their own dialects within; like, the New York Times did a piece a while ago on the Swiftie fanilect, which is, if you're really deep in Swiftie culture, they use all kinds of language and in-jokes and references that are obscure to an outsider. And in a similar way, incels, unfortunately, did the same thing. This is how new words are formed, out of a shared need to invent words to demonstrate this reality, to bond with one another, to build community. I think algorithms amplify that through their creation of echo chambers that are also porous and allow those words to filter through more easily.
Carrie: Yeah, we talked about the Taylor Swift fanilect at one point.
Megan: We did, yeah.
Adam: It's a fascinating subculture, yeah.
Carrie: Yeah, absolutely. So what is algospeak?
Adam: Algospeak, to the internet before, just meant these words used as euphemisms to avoid algorithmic censorship. So seggs is another example, instead of sex, because you can't say sex on TikTok. The watermelon emoji instead of the Palestinian flag. That's traditional algospeak. I do try to broaden the definition. I think all of our language is now molding around the algorithm, and the medium affects the message, right?
So each new medium we have of communicating affects how we're going to communicate. When we moved from oral storytelling to paper, we were allowed to invent chapters and space out our words.
When we moved to the internet, it allowed for the written replication of informal speech. Now that we're moving to algorithms, unfortunately, that means we have to abide by the platform priorities,
which is let's commodify everybody's attention so that we can sell more of your data and throw more advertisements at you. But unfortunately, if you're trying to communicate online, you have to grab people's attention. You have to play into that. And everything that goes viral conforms to platform priorities like that.
Carrie: Yeah, you even talk about how there's different ways of speaking, the different intonational patterns that influencers have. You even talk about that as being part of algospeak.
Adam: Right. Online, I use the so-called educational influencer accent. I talk quickly in real life. I do get excited about etymology, right? But online, I present this exaggerated caricature of a persona where I speak extra quickly. I'll rerecord things if I don't think I spoke fast enough. I'll stress more words to keep you watching my video. All that because it's good for attention, which is good for my video being pushed in the algorithm.
And each influencer, in the same way that different communities come up with different ways to express the reality, depending on the community, influencers in different genres will also develop accents that work for that niche. So there's the lifestyle influencer accent, the stereotypical, hey, guys, get ready with me to get on a podcast. That also is good at floor holding because dead air is terrible on social media. You're going to scroll away if you hear silence.
So the uptalk keeps you hanging on every word. And it also drags out the word to prevent that feeling of dead air. Then you'll have like the Mr. Beast style. He's shouting at you. Every word is popping. You'll have ASMR accents. You'll have a lot of different stuff like this, just depending on what's going to go viral for that particular style of video.
Carrie: Did you learn this along the way for yourself?
Adam: Yeah, it started out for me. My research into algospeak started with these words like unalive and seggs. And then the more I kept thinking about it, the more I'm like, wow, it's really not just this one sphere of language. It's really everything. And I started to notice my own patterns. Unfortunately, it's hard to do things normally when, I think that fellow linguists can just relate that when you try to have a conversation, sometimes you start thinking about the words or turn taking or something like that. In the same way, it's hard for me to film a video without thinking about what I'm doing. I am very intentional with it. I try to be authentic about that and communicate to my followers. Yes, I am manipulating you, but so is everybody. At least I'm being open about it.
Carrie: Yeah.
Megan: Yeah.
Carrie: I think that's part of the reason why I can't do the video stuff. I get too in my head.
Adam: That's totally valid. It's really not for everybody. But I don't necessarily think people should be spending more time on social media or anything. I do think it is genuinely useful for amplifying some voices.
Carrie: Yes.
Adam: I think it has a useful function, like any tool, it can be wielded for good and for bad. But there's a reason as a TikTok linguist that I went for a more analog thing by trying to write a book here.
Carrie: Yeah, fair. Yeah, it's interesting that you bring that up because a book is a very different genre versus TikTok videos especially; even YouTube videos are different.
Adam: There's stuff you just can't express. There's nuance. There's ambiguity. Etymology is always just a messy, complex story. And it's not just that a word evolved from point A to point B. Sometimes there's a word C influencing its development. Sometimes there's an aesthetic that shapes how the word evolves. There's just vibes. There are a lot of things going on. You can talk about the exact process of how the word changed. Is this metathesis? Is this reduplication? Or you can talk about the social patterns of how it changed.
There's a sociolinguistic angle. You can talk about maybe a grammatical change. So there's so many different stories you can tell, but you have to flatten it to one story if I'm trying to make a quick video about it on social media. Otherwise, it's not going to go viral, unfortunately. So I'm really glad that with the book format, I do have an ability to express that. At the same time, the book has its own limitations.
Carrie: Yes.
Adam: I have to speak in formal academic English, whereas online, at least, I can speak informally.
Carrie: I think your voice in the book is still not exactly informal, but a lot less formal than other books are.
Adam: Thank you. I try to be conversational.
Carrie: Yeah, it is more conversational, I would say. Yeah.
Adam: But that's that idea that we code switch for every single medium. I can feel myself doing that in a book versus a TikTok video. In the same way, every single influencer on social media is code switching into this algospeak dialect.
Megan: Yes.
Carrie: Is algospeak different from other forms of self-censorship?
Adam: What I keep seeing is that time and time again, we're having the same processes, the same underlying human uses of language, occurring through new mediums. The example I gave earlier was seggs. I like to analogize that to 1948, when the American author Norman Mailer tried to publish his book The Naked and the Dead. And they said, you can't publish this, you use the F word way too many times.
So he goes through and replaces all the thousand-plus instances of the F word with the word fug with a G. And really, that's no different than replacing sex with seggs with the G. It's the exact same thing that's happening. And throughout history, we've seen all of this occur again and again. In newspaper comics, we see the same kind of asterisk substitution that you mentioned, putting an asterisk in the middle of the word Trump when you're not sure whether Trump is going to be censored, because algorithms do tend to suppress politically sensitive videos, and also just as a form of reclaiming power, because that's the function.
All this is not new. We've also had evasive speech and anti-language since we've had Cockney slang. We had the criminal cants and dialects. None of that is really new. Humans are very good at expressing themselves under surveillance. And we've been doing that. So I try to emphasize a lot in the book that we're not cooked. The language is not 1984 Orwellian dystopian language. It's actually us being incredibly creative with finding ways to express ourselves.
Carrie: Yeah. So you bring up surveillance capitalism. And so what is that? What is the connection between algospeak and surveillance capitalism?
Adam: Well, we're always being watched whenever we talk to other people. That's always been a thing. And we've talked about performativity in linguistics since J.L. Austin in the 1950s. This idea that when you are being watched, you will modify your communication. We know this is true. It's just obvious. And in the same way, the algorithm is constantly watching. It's not a human person. Your FBI agent is not behind your computer screen looking at you. Instead, it's a darker, more boring Kafkaesque dystopia where your words are being surveilled.
Everything you say is transcribed through an NLP analysis when you upload a video. Everything on screen is analyzed through a computer vision program. All this turns into an embedding of what your video represents. And then they decide how to distribute that. And influencers, we don't have to know every single step of the process, but we know that the algorithm is, quote, unquote, watching our content. And we use words to signal to the algorithm. We'll use trending words specifically because words are metadata. Every single word you use online is a piece of metadata about your video.
Metadata used to be hashtags and titles, information that gives information about information.
Yeah. But now all words are literally that, in that every single word is scrutinized and we need to use these trending phrases. We also need to grab attention. We need to be signaling what kind of video this is, not only to our audience, but to the computer that's going to distribute it to our audience.
Carrie: Terrified.
Megan: I've never actually thought about that. Because I've participated in this, because when I was on Twitter, or what used to be Twitter, I'd do hashtags, right? So I knew that something was picking up on it and would distribute it to the right people or wherever that might be trending.
Adam: You don't need hashtags anymore. Hashtags honestly had a lot of flaws, right? You could have a homonym. If I do hashtag Apple, you don't know whether I'm talking about Apple the company or apple the fruit.
Megan: True, yeah.
Adam: They're limited. Sometimes there's misspelling. Sometimes there's nonstandard spellings or pronunciations. Hashtags had a lot of problems. And in some ways, the fact that all words are metadata actually makes it pretty easy to sort words. But on the other hand, it is being used, I would say, by pretty nefarious platforms that are trying to make money off of us.
Carrie: Yes.
Adam: And speaking of, you brought up surveillance capitalism, everybody should read the Shoshana Zuboff book to really understand what's happening.
Carrie: Yes, there's a lot going on and it's hard to keep track. So, yes, more readings. Always good.
Adam: I think the best thing we can do right now is just be incredibly informed about what these platforms are doing to us. I think language is incredible as a proxy for examining culture. And we can talk about the language objectively. Again, I don't think there's anything wrong with the language. The words are just ways for humans to express themselves and relate to one another. But they are little canaries in the coal mine about what's going on in society as a whole.
The fact that our language is changing to grab attention, and it is changing. It is coming from incels, maybe, sometimes. You need to be paying attention to that stuff, because there's also a vehicle there, a pipeline for incel ideas from that community to the mainstream. It's true that attention is the currency of the internet. The words are pointing to these broader issues.
Megan: Yes, absolutely.
Carrie: Wow. We are being watched. It's true. I know.
Adam: It's a scary thought, yeah.
Carrie: Every time I talk about something new... like, I mentioned we should get a green couch, and all of a sudden I'm getting tons of green couches in my Instagram feed.
Adam: Well, it's not even that your phone is listening to you. That would be far too simple. And it's not; that would be illegal. What is happening is that, let's say, your friend looked up green couches the other day and they have a conversation with you. The conversation is not being recorded. But the fact that your phone is on the same Wi-Fi network as the other person's phone means these phones can communicate through the Wi-Fi network. Now your phone knows it's in proximity to another phone that recently looked up green couches, which is why you're probably getting green couches. People don't think about that one. We write it off with a simple story.
Carrie: No.
Adam: But I think we should be radically aware to the extent to which these companies are tracking us.
Megan: Yeah.
Carrie: It's true. And yet that explanation has never fully worked for me because I'm not on my friend's Wi-Fi. I know this. And we had a conversation about something and then it came up in my... I don't know. I don't know how they figured it out because I was not on her Wi-Fi. I've never been on her Wi-Fi.
Adam: Well, there's a few things going on. That Wi-Fi thing was just one of many things they used to construct user profiles of you. They know all your standard demographics, right? They know the way your thumb rests on your screen can give them information about your consumptive behaviors, your location. They have all of that. Yeah. Cross app activity.
Carrie: Yeah, there's probably stuff that I can't even conceive of that's influencing what they're doing. Okay, so you brought up being cooked, too, and you said no. So why are we not cooked?
Adam: That's the last chapter of the book. It's titled Chapter 10, Are We Cooked? And I think language keeps doing what language has always done. Children are using slang not because they're commodified consumer drones, but because they're using it to connect with other children, because that's what language is. It's a mechanism for us to connect to other people. And we still are very good at that. We come up with new ways to avoid censorship. We come up with new ways to express our reality, which is always changing. And we do this through the constraints that we have.
Creativity isn't about everything being perfect. It's about you have a set of constraints and then you find a way to work with that. And if anything, the communication we see online is a great example of the tenacious creativity of the human spirit.
Megan: That's true.
Carrie: That's a good way of thinking about it.
Adam: So at least with language, we're not cooked. There's other questions we should be asking ourselves more broadly, like, screen time and our attention spans shrinking, our literacy rates dropping. These seem like they're probably true and we should be paying attention to that. But because it's a book about, at the end of the day, language, language is fine. I think there's a tendency to ascribe the term brain rot to some words. Neurologically speaking, there's nothing that makes your brain worse after learning one word over another word.
Megan: Right. Yes.
Carrie: I feel like it just expands it.
Adam: Literally, it gives you a new plasticity. Yeah.
Megan: Yeah. A couple of years ago, I was getting a lot of interview requests for talking about the word Latinx. And I got this hate mail about, why do we need to add another word? It's just going to confuse people. And it's just because language changes. Why is this so upsetting to people that we're broadening our language in a way that's more inclusive for people or we're broadening our language because it helps us interact with each other on social media. I don't know why it's so scary to some people. Do you have any idea?
Adam: I think, like, woke language is seen negatively because it almost feels exclusionary. It feels like there's a negative vibe, where people feel like they're left out, right? And I think good language should feel inclusive. And I know the goal of that language is literally to be inclusive. So it sounds paradoxical to say that. But what I see with, and I talk about this in the book, how do memes actually diffuse across platforms? How do ideas spread through social conduits? Why are we borrowing incel slang and African-American English?
It's because we perceive that as cool or funny, sometimes to negative effect. But it feels like these words are easily adaptable. It feels like they're not being forced onto us. It feels like they're compelling to adopt. And that's how slang spreads. But if someone's going top down from an ivory tower saying, you all are intolerant for saying Latino or Latina, you've got to start saying Latinx, that's when people feel a repulsion to it. That is top-down imposition of language versus a bottom-up, natural kind of growth through social conduits of what's funny or what's cool.
Carrie: Do you think that interacts with age? Because I'm wondering if bottom-up is still repulsive to people that are in a certain age group.
Adam: Definitely we see different age cohorts use language differently.
That's 100% true. Older generations tend to always have a more crystallized idea of what language is or what language should be. Whereas younger generations are coming up with new language specifically to annoy adults a lot of the time, to build their own identity in contrast to existing identities, to forge a community as young people in the world. And this can include adopting something like progressive language. It can include using a word like skibidi. All of that.
Carrie: Yeah, that's where my brain goes is skibidi. There's a TikToker who does an analysis of all the skibidi toilet videos. And I can't stop myself from watching them, even though I'm like, this is bizarre. I don't need to know this.
Adam: I love that man. That's Aiden Walker. He's a former meme researcher at Know Your Meme and I love his work. People should be treating skibidi toilet more seriously.
Megan: You're probably right.
Adam: Each one of those YouTube shorts has been viewed hundreds of millions of times. You can't just turn a blind eye to that. That's a huge point [crosstalk] of culture and connection for young people right now.
Megan: True.
Carrie: Yeah, that's true. So I'm glad that I do interact with TikTok because I've learned so many things through TikTok. And that keeps me from solidifying my brain too much.
Megan: Well, I don't feel like my brain is saying no to new slang or anything like that. But I feel like I'm behind. You get older, you get behind.
Carrie: You do. Yeah.
Megan: Yeah.
Adam: Well, you're probably not even being targeted for the young person brain rot videos. The algorithm is not even going to send it to you because it collects demographic information about your age and stuff.
Megan: Oh, wow. Okay.
Carrie: Oh, yeah.
Megan: Yeah. Okay, they're targeting [crosstalk] younger people with this stuff.
Adam: They take a lot into account when they build this shadow version of yourself online and decide whether or not to distribute videos to you. Critically, there's a few disconnects when that happens. One, you look at a video on your for you page, you just assume it's for you. Easily enough, it might not be the best possible video you can get, but just a video that's best at getting your attention and therefore best at commodifying your existence on the platform.
So already it's not exactly for you. But also the algorithm's shadow version of yourself is not yourself. It's a map of a territory, and the map can never be the territory. And this sort of representation is a flatter online version of what you've chosen to represent, given already flat online stimuli of what the real world represents. Yeah, your online profile, while it might match up with your in-person interests, is always going to be a different thing.
And then algorithms make mistakes as well. We don't know exactly how they work. All of this means that I, as a creator, could make a video intending it for a certain kind of audience, the algorithm could reanalyze it and send it to a completely different audience, and nobody's any the wiser.
Carrie: Yeah, I feel like the only way I learn about the young people stuff is through another older person saying, what is this? So I didn't know [crosstalk] these videos.
Adam: That's how you know you're old. Yeah. You're unc status as young people would say.
Carrie: That's how I found out what a Labubu was, is through someone going, what is a Labubu? And now my intern has a Labubu.
Adam: Honestly, that's how a lot of trends replicate, by this feeling of, I need to be caught up with whatever is happening. At one point, nobody knew what Labubus were. Everybody had their moment of understanding what a Labubu is. And in that moment, there was a feeling of joining an in-group, which makes you feel good. It feels like it satisfies a psychological need because it does. We as humans want to feel like we're part of a group, and social media plays into that tendency.
Carrie: Absolutely.
Adam: And as we feel like we join this group, now we feel like we want to participate in this trend. And then we replicate the trend for others to follow. And this has always been human behavior. We've always had fads and memes and trends throughout human history. I think algorithms compound and amplify natural human behavior. And again, it's not actually a new thing, but the medium is making it happen faster. It's making it happen more intensely than before.
Megan: Yeah, definitely. It feels like fads come and go way faster now.
Adam: And that's not a coincidence. Yeah.
Megan: No, it's not. Does that interact with surveillance capitalism? Would it be better for fads to last longer to make more money or how does that...?
Adam: No, the logic of social media is rapidly switching your attention from one product to another product, hopping between trends. And there's always a new thing to be caught up on. As soon as most people know what a Labubu is, it's no longer desirable to be in the in-group. Because the Labubu craze is probably something everybody knows about; even the boomers know about Labubus at this point. That means that there's going to be a new trend that feels more hip, more in.
And I already said words are metadata, but these are also often labels that we can attach ourselves to and detach from. There are entire aesthetics that work as trends, lifestyles like cottagecore or clean girl, and all these come and go in the span of a few months; there's a craze and then it's over. And then these lifestyle influencers hop to the next aesthetic because they are also signaling to the algorithm that they're tapping into that.
It's what I call the engagement treadmill. There's an existing meme or trend or fad; creators see that it's trending. And then they use that phrase because that's also signaling to the algorithm that you want your video to go viral. It does go viral because it is a trend. It's a real trend. And then it's amplified to your audience and becomes more of a trend, and then more creators use it. And we end up in this positive feedback loop of words being perpetuated into virality simply because people are trying to metalinguistically signal to the algorithm that they want their video to go viral.
Carrie: I just feel old. This just makes me feel very old.
Megan: It's funny because I was going to say I don't feel like I interact with the algorithm very much. But of course I do.
Carrie: Of course you do.
Megan: It's acting on me.
Carrie: Yeah.
Megan: It's not like I have to be a creator like you are and think about these things.
Adam: Right.
Megan: It's acting on us.
Adam: And I don't think this is something we can escape as humans. I walked into a coffee shop the other day. It was called Gold Coffee Shop. And as I walked in, I realized what this coffee shop is doing is they're capitalizing on the idea of gold in our collective imagination as this good thing. And they're hopping onto a trend to better sell a product to us, which is exactly what's happening on social media. So it's not actually a new thing. I do think the exact consumer pressures are making our existence feel more commodified. But it's always been a thing that people hop onto trends, co-opt them to sell us more products.
Carrie: It's just, because it's faster, it feels more wasteful, right? Like the tulip craze was also very silly.
Adam: Absolutely.
Carrie: But it took a lot longer for the craze to come and go, right?
Adam: Right. And the same with words. It might have taken a word like rizz decades to catch on, but now it could become Oxford's Word of the Year for 2023, only a year after it really hit the scene. I do think words took longer to really catch on as fads in the past.
Carrie: Yes, absolutely. Yeah.
Megan: So why hasn't the algorithm caught on to unalive yet? Why is that still working?
Adam: Oh, it has. Yeah. No, if you search up the word unalive on TikTok right now, it'll redirect you to the page saying here are some mental health resources. You shouldn't be afraid to ask for help. It'll say that. So unalive is also clearly being censored by the algorithm right now. And also creators are tapped into that. And they will often respell the word as U-N-@-L-!-V-E or something. Or they'll also do an asterisk like you did with Trump. It's a game of whack-a-mole, right? Mallet comes down. New word pops up. Mallet comes down again. New word pops up like a mole in that game.
Yeah. But people still rely on it because it's part of the internet sociolect at this point. It's because it's an accepted term, even though we know that the game of whack-a-mole has moved on. There's still more reliance on that word than there otherwise might be. And also it's found genuine offline usage as a euphemism. Last year, the Seattle Museum of Pop Culture put up an exhibit for the 30th anniversary of Kurt Cobain's suicide. Not saying that he committed suicide, but that he unalived himself at 27. And it's really being used offline by actual kids in middle schools. I think unalive is very much alive.
Carrie: Yeah, I saw that. I hadn't seen it in real life, but I saw that part of your book and I was like, oh, I didn't know they had done that. Because I've been to that museum a long time ago now.
Megan: Me too. Yeah. Do you think that was someone younger within the museum system that decided on that or they were...
Adam: They must have been because how do you even think to use that word unless you're on TikTok? It must have been a young curator or something. I don't know. Museum of Pop Culture, surely they're tapped into pop culture.
Carrie: Yeah.
Megan: That's true. Yeah.
Carrie: Yeah. I don't know about in this case, but there's lots of museums out there where they have the older person using the Gen Alpha, Gen Z speech.
Adam: All right, yeah.
Carrie: So it could be that it's an older person who is trying to stay up.
Adam: Why do you think you've seen those videos of museums using Gen Z speech go viral? It's because those are trending words that they can hack onto to push their video further in the algorithm, and then they go viral.
Carrie: Yes.
Adam: It's the engagement treadmill in action.
Carrie: Absolutely. Also, they know that I'm going to like a museum TikTok versus someone else, right?
Adam: Right. The algorithm has constructed a shadow version of yourself that also likes museums.
Carrie: Yes. And that part's accurate.
Adam: It's not all wrong. I got some excellent videos about urban design in my feed today because I like urban design. They know that. But I also have a lot of other interests that possibly can't be represented.
Carrie: Yeah. Oh, yeah, for sure. It's definitely a very flattened, like you were saying before, a very flattened version of us that it's seeing because it's not a human. So it could not possibly know what a human is.
Adam: Kyle Chayka also has an excellent analysis of this in his book Filterworld, about how algorithms homogenize culture. I believe there is a general homogenization of language happening, because if we are using words as metadata to signal certain things to the algorithm, we will use broader words rather than niche words. The example I use when I talk about cottagecore is that everything's a kind of core. Core is already signaling a fashion aesthetic.
You could use other words to describe this. You could use a word like bucolic or something. I could use a whole description, but nothing that exactly captures that aesthetic. I used to say, I like earthy tones in my clothing. But now I can say I'm goblincore. But actually that maybe limits the extent of the expression that I could have had, because earthy tones encompasses a larger range and doesn't have the same semantic weight as the word goblincore.
Carrie: Yeah, I was thinking it was more like, I don't know, autumn or something. Goblincore is not where my brain would have gone with earthy tones. That's interesting.
Adam: Oh, sure. No, there's a few directions. Yeah. Perhaps, I'm misusing that too.
Carrie: So are there any words that are up and coming or different ways of spelling words [crosstalk] that you've, yeah, that you particularly find interesting?
Adam: Always. This year alone, we're seeing the word huzz, which is an algospeak re-spelling of hoes. We see that going viral. Fine shit or fine shyt. It's S-H-Y-T. And that's the algospeak euphemism for the word shit. We see these kind of terms going viral as more examples of algospeak right now.
Carrie: Oh, yeah. The one thing I was going to ask is, were there any examples that you found researching this book that you hadn't come across naturally that you were surprised by or interested by?
Adam: I was perhaps most taken aback by the word preppy, which to us probably describes this academic aesthetic, the Ivy League style that started with stores like Brooks Brothers and Ralph Lauren. If you ask any middle school girl today, they'll tell you that preppy is bright pink clothing with smiley faces on it. And that's not at all what it meant to me. I was fascinated by this. When I first encountered it in my research, I had to text my 11-year-old cousin to confirm: is this what this actually means? And she said, yeah, whatever is cheerful and bright and pink.
Carrie: Wow.
Adam: And this is, again, there's natural semantic drift happening here to some degree. So we start with these really high-end retailers. Then we move to maybe more mid-tier brands like Abercrombie and Fitch, Hollister, Aeropostale. These are marketing to younger children than the original retailers. And now preppy becomes synonymous with what middle school girls like to wear. And in the end, that's bright pink clothing with smiley faces on it. So that's natural.
But the algorithms have compounded that. There are online stores, preppy boutiques that sell preppy clothing: this is the preppiest thing in my preppy store. And they'll actually start their videos like that. And they use preppy not only as a hashtag, but repeatedly drop it throughout the video because they know this is how the word is being used by some younger people. But in doing so, they perform it into more existence and realize the language change more than it otherwise would have been.
Megan: Yeah.
Carrie: Yeah, that one is also very jarring to me as well. As someone who grew up partially in the '80s, like preppy.
Adam: Right.
Carrie: [Inaudible]
Adam: No, it's definitely not preparatory academies anymore. Right. We've lost that association.
Megan: Yeah, no. And I grew up during Saved by the Bell where Zach Morris was called preppy by A.C. Slater because of the way he'd dress. What I think of is like Ralph Lauren-esque kind of stuff. So, yeah. So, now, they're thinking of it as, it actually reminds me of the word peppy, having [crosstalk] smiley faces.
Adam: Yeah. Like what I said earlier, it's not always word A to word B; sometimes word C is acting on it. I think it's very possible there's some sort of phonaesthetic connection there between peppy and preppy, perhaps, that in their heads they associated that way. And it makes it feel more natural to use preppy in this manner as well. But also, the word preppy has always meant... no, I am confusing it with peppy. Yeah, [crosstalk] I think people have been doing that.
Megan: You see, yeah.
Adam: Yeah.
Megan: Yeah.
Adam: Preppy [crosstalk] has always been...
Carrie: That's where my brain went too, actually. Oh, yes. Earlier, I can't remember exactly what you said...
Adam: Perky is also like that; there are a few in that genre.
Carrie: Yes.
Megan: Yeah.
Carrie: Yeah, that's right. So you also said something about how there's more of a homogeneity going on with language, at least with English anyway; it's probably happening in other languages too. And that reminded me of something else you bring up in your book, which is context collapse. We're seeing everything from everywhere and we don't necessarily understand it. I don't know, it's an interesting phenomenon to watch, that we're all interacting and not understanding each other because we're not in the right groups.
Adam: That's what I was getting at when I was talking about how the algorithm might distribute a video incorrectly and you still think it's for you on your for you page. You think an African-American English slang word you see is meant for you because the influencer is speaking directly to you. It's on your for you page. They're looking you in the eyes. It's a very intimate experience, honestly, if you're in bed lying on your side with your phone in your face. You're not even thinking that deeply about what you're consuming, but somewhere subconsciously it feels like this is an okay word to use. You might replicate that now.
Let's say you're a non-African-American person replicating that language, and then somebody else sees that video. And now we've completely lost the context entirely, where there's not even an understanding at all. That's just two degrees of separation. A lot of times these words move across filter bubbles and move across social boundaries in a way where there's 10 degrees of separation. Nobody remembers where these words came from, whether they came from Black people or from incels or whatever. And maybe that's bad. You can draw your own conclusions there.
In some ways, like, the fact that there's middle schoolers saying what the sigma, where the word sigma was originally popularized by the manosphere and incel circles, the fact that they're saying that is not that alarming, because they don't even know where it came from.
Megan: No.
Adam: They just think it's a funny word to connect over. And at the end of the day, that's another reason I don't think we're cooked. We're just using language as we always have, frequently not knowing where words come from, but nevertheless using them to bond with one another.
Carrie: Yeah. The sigma one always makes me laugh because it just feels like such a bad word. With the context I have, it's like, why would you even want to say that? But for the kids, yeah, like you said, [crosstalk] they don't talk like that.
Megan: But you have the context, though. Yeah.
Carrie: Yeah.
Adam: It's just funny for them. What the sigma?
Megan: More context.
Adam: It's a funny interjection. It doesn't refer to this hierarchy that incels built up. But maybe it potentially could make that idea more accessible. The incels certainly think that's the case. So there is an [crosstalk] argument that it dangerously opens up an ideology.
Carrie: Maybe.
Adam: But at the same time, I really don't think we should be stressing out about it too much. I'm not trying to ring any alarm bells here.
Carrie: No, for that one, I don't think it's that big of a deal. It's also a Greek letter, they could attach all kinds of [crosstalk] [inaudible] if they wanted to.
Adam: Oh, no, the word itself is fine. I'm concerned with the hierarchical mode of thinking, the philosophy of seeing the world through these social dynamics that incels have built up. I think it is definitely more normalized than it used to be.
Carrie: 100%. But just using the word as a kid.
Adam: Totally.
Carrie: I don't think, I mean, maybe. But it doesn't feel like you're going to get [crosstalk] there.
Adam: No, I don't think you should be concerned if your kid is saying what the sigma.
Carrie: No. Yeah, not that. Other things, maybe, but not that.
Megan: Do you feel like the algorithm skunks words more quickly, like woke? Does it play into that?
Adam: There's definitely something going on there. Woke, I think, already started being pejorated before the rise of algorithmic short-form video; that started with, yeah, around the first part of Donald Trump's first campaign, right? Woke was already being turned into this negative word. So that's natural. I do think phrases can be poisoned more quickly, for sure.
Carrie: Yeah, it does feel like it is quicker. But it's hard to know for sure because I feel like I'm losing touch with what is time. Everything is just happening so quick now.
Megan: So is there anything that we didn't touch on that you would like to share with our listeners?
Adam: I also want to emphasize that not only are words metadata, they're also all memes. A meme is just a unit of culture that's transmitted. Memes also have lifespans; they're fads that can die out, and words also have lifespans and die out. When a word is really tied to a comedic meme in our heads, like the word skibidi, I don't think we're going to be using that one in a few years, because it dies out when the meme dies out. And sometimes they have a longer tail. Sometimes they have a use in our language; they fill a niche, a lexical gap. But words equal memes equal metadata, and we can't talk about any of them separately if we're really trying to understand them.
Carrie: Makes sense?
Megan: Yeah, makes sense. Absolutely. Do you have anything else [inaudible]?
Carrie: All right. Well, this has been really fascinating.
Megan: This was really fun.
Carrie: Yeah.
Megan: Thank you for writing the book.
Carrie: Yes.
Adam: Yeah, well, thank you for your time. Yeah, I hope people buy the book. We'll see.
Carrie: Yes.
Megan: Yes.
Carrie: Definitely.
Megan: Yes, absolutely. Go out and get it for sure.
Carrie: Yes, for sure.
Megan: Yeah.
Carrie: Yeah. And we always leave our listeners with one final message. Don't be an asshole.
Megan: Don't be an asshole.
Adam: Don't be an asshole.
Carrie: Thank you. The Vocal Fries Podcast is produced by me, Carrie Gillon, theme music by Nick Granham. You can find us on Tumblr, Twitter, Facebook, and Instagram @vocalfriespod. You can email us at vocalfriespod@gmail.com, and our website is vocalfriespod.com.
[END]