Listen, humans are great and all, but sometimes they’re horrible. That’s especially true if you’ve just spent 12 hours stuck in a flying aluminum tube with a few hundred of them. Now all you want to do is lock yourself in a hotel room, and for the love of all that is holy get away from humans.
Ah, but wait. The cursed fates dictate that someone’s gotta check you into your hotel. So you roll into the lobby, heartbroken, to find humans behind the counter, but also a humanoid robot called Pepper. You strike up a conversation with the robot, and one thing leads to another and it’s checked you into your room, no human interaction required.
You’ve just avoided getting cranky with a well-meaning person, sure. But you’ve also done something more subtle: You’ve interacted with a robot like few humans have before you. Because Pepper is part of the first wave of intelligent machines that promise to not only make our lives easier, but to bring a strange new form of interaction into being.
To be clear, Pepper isn’t meant to replace hotel employees altogether, but to complement them. “I think for the very near future what you’re going to see with robotics is more around replacing and automating tasks as opposed to full blown duties of different jobs and roles,” says Steve Carlin, chief strategy officer of SoftBank Robotics America, which makes Pepper.
Pepper can walk you through the check-in process either with a conversation or through a touchscreen on its chest, but it has to call a human if you want help with your bags. (The robot has arms, but they’re not meant for lifting things. Instead, they help make Pepper expressive.) The hospitality industry is just far too complex and sensitive for a robot to navigate on its own. Want to complain that you didn’t get a non-smoking room, and that instead of a new room you want a refund so you can take your business elsewhere? Good luck getting Pepper to understand.
The problem is context. Pepper can handle basic conversation—what’s your room confirmation number, do I have the right room here, would you like help with your bags. Lots of yes/no questions. “Robots don’t understand ambiguity,” says Omar Abdelwahed, head of studio for SoftBank Robotics America.
Getting robots to better understand us isn’t just about helping them parse our language, because communication is about so much more than words. Pepper will need to understand intent and attention.
“If you’re not looking at me, I might not need to listen, and so I shouldn’t respond,” says Abdelwahed. “That largely doesn’t exist today.” It’s why you have to say “Alexa” or “Pepper” in order to get their attention, and why any of these devices rudely butt in when you mention them in casual conversation. Robots also have to get better at recognizing subtle facial cues that might indicate anger or impatience or friendliness, so they might adjust the urgency of their communication, for instance.
In these early days of human-robot interaction, it helps for machines to come paired with a touchscreen. That’s especially true for Pepper, which works as a greeter not just at hotels, but hospitals, malls, and airports. It might, for instance, bring up a map on its screen to give directions, instead of just rattling them off turn-by-turn like a maps app would. The screen also works as a redundant input, so instead of confirming your room with a verbal “yes,” you can tap a “yes.”
So even though Pepper doesn’t always understand what you mean, you can fall back on the “old” ways of the touchscreen. But keep in mind these are very early days for conversational intelligences—these systems will only improve. “Ideally, the first interaction you have with a robot will be the worst, because it should be able to learn from its interactions with you and others,” says Carlin.
We might ask, though: What’s the point of embodying an AI in the first place? Alexa works just fine sitting there on your countertop, after all. Well, it’s about engagement. Pepper commands attention because it’s novel, true enough, but also because it’s tapping into our instincts for interpersonal communication. You can feel its presence, and you can’t help but look it in the eyes when it’s talking to you.
So maybe, just maybe, it can charm you out of your post-flight funk someday soon.
More from the HardWIRED series
Pepper works as a hospital greeter, but roaming the halls is another more mobile robot called Tug, which autonomously delivers drugs and food.
As Pepper shows, it’s hard as hell to get a robot to understand nonverbal cues. One robot, though, is reading the faces of deaf infants in fascinating ways.
Pepper is vaguely humanoid, though it rolls instead of walks. Because it turns out that getting a robot to move on two legs is an exceedingly difficult thing to do.