on boyfriend maker, gender roles, and communication breakdowns both real and virtual: a chat with jenn frank
Today on XOJane I wrote about my experiences with Boyfriend Maker, the boyfriend-as-rigidly-gendered-Tamagotchi app that, thanks to the hive mind putting awful imagery into its crowdsourced chat database, got pulled from the App Store shortly after it became a sensation. For some expert opinion I emailed with Jenn Frank, fellow Boyfriend Maker user (maker?) and video games expert. Her answers were so great and thoughtful that I figured I’d run the whole thing here on Tumblr.
MJ: how did you find out about boyfriend maker?
JF: So, right, this is kind of funny. I heard about it via Twitter from my colleague Mattie Brice. She is a video game critic who concentrates on, not only gender and trans issues, but also how we use software to talk about love and sexuality. I am a fellow tech journalist and professional peer. So when Mattie Brice pointed to Boyfriend Maker as a strange and funny experience, I hopped right onto that bandwagon.
MJ: how much experience do you have with eliza-type software?
JF: Kind of a lot! Right, the idea of the “Turing test” is really fascinating on its own. I think the first one-way conversation I had with something like ELIZA was way more than ten years ago, maybe in 1998?, but that isn’t even entirely accurate, since I’ve been throwing weird commands into MUDs and DOS prompts and parser-based adventure games for way longer, just really excited to see what a programmer can anticipate.
Everyone noodles with that type of software, though, from Subservient Chicken to 20Q. Obviously people are really enamored when a ‘bot can somehow outsmart them just by seeming human.
I think I’m very rational, but I’m obsessed with Jungian notions of the “hive mind,” and, right, the idea of an “ELIZA” -type of sim, drawing from a broader bank of user experiences, is incredibly fascinating. I hope that idea sticks around even though, in implementation, everyone’s spelling is terrible.
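To make the “bank of user experiences” idea concrete: a toy Python sketch of an ELIZA-type bot that matches keywords in your message against a bank of canned replies. Everything here — the patterns, the replies, the names — is invented for illustration and isn’t Boyfriend Maker’s actual data or code:

```python
import random
import re

# Hypothetical "crowdsourced" response bank: regex pattern -> possible replies.
# A real app would populate this from logged user chats, typos and all.
RESPONSE_BANK = {
    r"\bhello\b|\bhi\b": ["hey!! :)", "hi hi hi"],
    r"\blove\b": ["awww <3", "ur so sweet!!"],
    r"\bsad\b|\bupset\b": ["cheer up!!", "dont be sad :("],
}
FALLBACKS = ["lol", "what do you mean?", "tell me more!!"]

def reply(message: str) -> str:
    """Return a canned reply for the first matching pattern, else a fallback."""
    text = message.lower()
    for pattern, replies in RESPONSE_BANK.items():
        if re.search(pattern, text):
            return random.choice(replies)
    return random.choice(FALLBACKS)
```

The punchline of this design is exactly what Jenn describes: the bot only ever mirrors what some user, somewhere, already typed into the bank — so its spelling (and everything else) is only as good as the hive mind’s.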
Here’s an important aside: when you play any video game with multiplayer, the game is always quick to tell you that your online interactions are “not rated by the ESRB.” That’s important. It’s a bit of legalese absolving the developer of any hairy conversations you might have inside the game. I wish Boyfriend Maker had thought to add something along those lines, but it didn’t.
MJ: did it get sexual with you? how long did it take to get to that point? were there any words that seemed to trigger it more often?
JF: Ohhh, yeah, weird. As I remember it, my fake boyfriend said something sexually unsolicited once, and I think I curtly replied “I don’t know how to sext.” And then I tried to tell him, a couple different ways, that I don’t really understand sexting—which is true!—and since his answers weren’t very good, I was ultimately the one who eased off. Like, my response shut the whole thing down. And then I googled a bunch of articles on “how to sext,” thinking I’d try them out on my fake boyfriend ‘bot, but instead I was just really harrowed by the search results.
That’s weird, that I felt as puritanical as I did. Like, I didn’t pursue sex talk with the ‘bot at all! You’d think I would’ve! I’m actually fairly well known for a 10,000-word essay I wrote about having sex in Second Life, but it’s about trying virtual sex with my real-life boyfriend of six years. So in that case there was a real person on the other end of my boyfriend’s avatar. No, I can’t imagine having pretend-sex with what is really a robot.
MJ: what aspects of your relationship with your virtual boyfriend made you most uncomfortable? were there any weird aspects that echoed past experiences you’ve had?
JF: The things that made me uncomfortable tended to not be sexual. Like, I wish he’d stop calling me “Tori,” or I wish he wouldn’t allude to being gay so often. What am I paying you for, Boyfriend?!
Worst is when my fake boyfriend <a href="http://t.co/2hS3InGj">won’t even pretend to acknowledge my feelings</a>.
Like, there’s talking to a brick wall, and then there’s talking to an actual brick wall. I’ve joked that playing Boyfriend Maker isn’t so different from having a real boyfriend. Maybe I sound embittered, but the breakdown in communication really consistently mirrors what I’ve experienced as a person. Like, unfortunately, that exchange is not so different from the text-message exchanges I had with a person to whom I was engaged. And that isn’t my ex-fiancé’s fault; he just didn’t quite understand how to “do” a relationship, okay.
MJ: what do you think this app would ‘teach’ young women about having a boyfriend? do you think the app’s makers should have left in the nastiness, or was the whole package too incoherent to let the assaultive language stand?
JF: Tricky! I think Boyfriend Maker was great because it actually inverted certain cultural norms. Because of whatever user bank the developer used for the boyfriend’s replies, it seemed like a lot of his responses, emoticons and all, were keyed into how a heterosexual woman would text a man. So now your boyfriend is repeatedly saying things like “You and Adolfo make the cutest couple,” and you’re very “No, stop,” and he’s like “You’ve been my best friend for ten years!” and you’re like, auuuugh, you are not a heterosexual male at all!
So in a way the software has nothing to teach a heterosexual female about life. I think that’s kind of terrific.
MJ: if you could rewrite the app, what one thing would you alter?
JF: Oh, wow. It’s so far outside of my purview, I’m not sure I could alter anything. It’s a really interesting experiment, right?
Like, screenshots of the app went viral because, okay, there was no way the software could get worse, but in another way, it also couldn’t’ve been any better! It wasn’t that users were asking questions and receiving non sequiturs; on the contrary, we asked the ‘bot things and the ‘bot organically said terrible things in reply! That is its own type of amazing!
Actually, I feel sort of gloomy. I feel like there’s no way to coax the ‘bot into saying anything incriminating unless you yourself give the cue. Again, I only remember my fake boyfriend saying one sexual thing—and even then it wasn’t terribly explicit—and the interaction hinged on my own participation, if it was going to become overtly sexual. And no, I simply never went that route.
I get why it’s being pulled, but I’m pretty glum about it anyway.
Actually, here’s a quick story: I had a toy, something like a magic eight-ball, and I announced I was going to ask it whether God exists. “Is God real,” that was the question I was going to pin on this toy. My dad, a devout Catholic, told me, if the toy said “no,” to throw it out. Now, he was operating from this idea where a fortune-telling device could easily be possessed by demons, right? And instead I just never asked the toy whether God were real, because obviously the toy is random, obviously it will say yes sometimes and no sometimes.
The analogy isn’t perfect, but that’s *kinda* how I feel about this whole Boyfriend Maker pedophilia thing. What in God’s name did someone ask the machine? Was someone shocked when the machine answered in kind? In a lot of ways I feel like it isn’t the answers the machine gives, but the questions the user asks. I’m really pretty nerve-wracked about the software being pulled.