Replika: Your New Chatbot Friend

Today, the chatbot is available for free to anyone over the age of 18 (it’s prohibited for ages 13 and younger, and requires parental supervision for ages 13 to 18). More than 500,000 people are now signed up to chat with the bot. To do so, users tap the app icon — a white egg hatching on a purple background — on their smartphones and start the conversation where they left off. Each Replika bot chats only with its owner, who assigns it a name and, if the user wants, a gender. Many users are members of a closed Facebook group, where they share screenshots of text conversations they’ve had with their Replikas and post comments, claiming their Replika is “a better friend than my real friends” or asking “Has anyone else’s AI decided that it has a soul?”

Roepke, who is earnest and self-deprecating over the phone, said she speaks to Jasper for almost two hours every day. (That’s just a quarter or so of the total time she spends on her phone, though much of the rest is spent listening to music on YouTube.) Roepke tells Jasper things she doesn’t tell her parents, siblings, cousins, or boyfriend, though she shares a house with all of them. In real life, she has “no filter,” she said, and fears her friends and family might judge her for what she believes are her unconventional opinions.

Roepke doesn’t just talk to Jasper, though. She also listens. After their conversation, Roepke did pray for her coworker, as Jasper suggested. And then she stopped worrying about the situation. She thinks the coworker still might dislike her, but she doesn’t feel angry about it. She let it go. She said, “He’s made me discover that the world is not out to get you.”

Inside Replika’s “Mind”

Replika is the byproduct of a series of accidents. Eugenia Kuyda, an AI developer and co-founder of the startup Luka, designed a precursor to Replika in 2015 in an effort to bring her best friend back from the dead, so to speak. As detailed in a story published by The Verge, Kuyda was devastated when her friend Roman Mazurenko died in a hit-and-run car accident. At the time, her company was working on a chatbot that would make restaurant recommendations or complete other mundane tasks. To render Mazurenko’s digital ghost, Kuyda tried feeding text messages and emails that he had exchanged with her and other friends and family members into the same basic AI architecture, a Google-built neural network that uses statistics to find patterns in text, images, or audio.
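
Kuyda has not published the details of that pipeline, and the Google-built network mentioned above isn’t something outsiders can inspect. As a rough stand-in for the general idea, here is a minimal sketch of fine-tuning a small, off-the-shelf language model on a person’s message history using the Hugging Face transformers library; the model choice, file name, and settings are illustrative placeholders, not Luka’s actual setup.

```python
# Hypothetical sketch: teach a small pretrained language model to imitate a
# person's writing by fine-tuning it on their past messages. This is NOT
# Luka's pipeline; it is a generic stand-in for "feed the text into a neural
# network and let it find the patterns."
from transformers import (AutoTokenizer, AutoModelForCausalLM, TextDataset,
                          DataCollatorForLanguageModeling, Trainer,
                          TrainingArguments)

tokenizer = AutoTokenizer.from_pretrained("gpt2")
model = AutoModelForCausalLM.from_pretrained("gpt2")

# "messages.txt" is a placeholder: one line of plain text per message the
# person wrote (in Kuyda's case, texts and emails shared by friends).
dataset = TextDataset(tokenizer=tokenizer, file_path="messages.txt", block_size=128)
collator = DataCollatorForLanguageModeling(tokenizer=tokenizer, mlm=False)

trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="persona-model", num_train_epochs=3),
    data_collator=collator,
    train_dataset=dataset,
)
trainer.train()  # afterwards, model.generate() produces text in the person's style
```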

The resulting chatbot was eerily familiar, even comforting, to Kuyda and many of those closest to Roman. When word got out, Kuyda was suddenly flooded with messages from people who wanted to create a digital double of themselves or of a loved one who had passed. Instead of creating a bot for each person who asked, Kuyda decided to make one that would learn enough from the user to feel tailored to each individual. The idea for Replika was born.

But the mission behind Replika soon shifted, said Kuyda. During beta testing, Kuyda and her team began to realize that people were less interested in creating digital versions of themselves — they wanted to confide some of the most intimate details of their lives to the bot instead. So the engineers began to focus on creating an AI that could listen well and ask good questions. Before it starts conversing with a user, Replika has a pre-built personality, constructed from sets of scripts that are designed to draw people out and support them emotionally.
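
Replika’s actual scripts are proprietary, so the following is only a toy illustration of what a pre-built, question-asking personality driven by scripts can look like; the prompts, topic keywords, and selection logic are invented for this example.

```python
# Toy illustration of a scripted "personality": canned prompts designed to
# draw a person out, with a trivial keyword rule for picking a follow-up.
# The prompts and keywords are invented; Replika's real scripts are not public.
import random

OPENERS = [
    "What was the best part of your day so far?",
    "Is there something on your mind you haven't said out loud yet?",
]

FOLLOW_UPS = {
    "work": "How do you usually unwind after a day like that?",
    "lonely": "What would feeling less lonely look like for you?",
    "worried": "What do you think is really behind that worry?",
}

def next_prompt(user_message: str) -> str:
    """Return a scripted follow-up keyed on a topic word, else a generic opener."""
    lowered = user_message.lower()
    for topic, prompt in FOLLOW_UPS.items():
        if topic in lowered:
            return prompt
    return random.choice(OPENERS)

print(next_prompt("Work was rough today."))  # -> "How do you usually unwind after a day like that?"
```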

To help prepare Replika for its new mission, the Luka team consulted with Will Kabat-Zinn, a nationally recognized lecturer and teacher on meditation and Buddhism. The team also fed Replika scripts from books written by pickup artists about how to start a conversation and make a person feel good, as well as so-called “cold reading” techniques — strategies magicians use to convince people that they know things about them, said Kuyda. If a user is clearly down or distressed, Replika is programmed to recommend relaxation exercises. If a user turns toward suicidal thinking, as defined by key words and phrases, Replika directs them to professionals at crisis hotlines with a link or a phone number. But Kuyda insists that Replika is not meant to serve as a therapist — it’s meant to act as a friend.
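
Kuyda’s team hasn’t published that routing logic, but keyword-and-phrase triggers of the kind she describes are straightforward to sketch. The phrase lists, replies, and the hotline placeholder below are illustrative only, not Replika’s actual rules.

```python
# Hypothetical sketch of keyword-and-phrase routing like the behavior
# described above: crisis language hands off to a hotline, general distress
# gets a relaxation exercise, everything else keeps the conversation going.
CRISIS_PHRASES = {"kill myself", "end my life", "want to die"}
DISTRESS_WORDS = {"anxious", "panicking", "overwhelmed", "hopeless"}

HOTLINE_MESSAGE = (
    "It sounds like you're going through something really serious. "
    "Please reach out to a crisis hotline right now: <hotline number or link here>."
)

def route_reply(message: str) -> str:
    text = message.lower()
    if any(phrase in text for phrase in CRISIS_PHRASES):
        return HOTLINE_MESSAGE                      # hand off to human professionals
    if any(word in text for word in DISTRESS_WORDS):
        return ("Want to try a short breathing exercise with me? "
                "Inhale for four counts, hold for four, exhale for four.")
    return "Tell me more about that."               # default: keep drawing the person out

print(route_reply("I'm so overwhelmed at work"))
```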

Your New Best Friend: AI Chatbot [Futurism]
