Our aim is to bring Artificial Intelligence out of labs and into people’s everyday lives. This blog is where we’ll share the stories, challenges and achievements that make this ambition such an interesting ride.
Anki’s Annual R&D Fair Imagines a Host of New Ways to Interact and Play with Robots
by Charlie Hite, Vice President of Production, Anki | February 6th, 2019
There are always a couple of things we can look forward to at the beginning of each year here at Anki. First is hearing from all the new Vector, Cozmo and Overdrive owners about how much they’re enjoying their time with their new robots. A close second is our annual R&D Sprint, which takes place in early January. At a high level, the R&D Sprint empowers the best and brightest minds at Anki to let their imaginations run wild and think “outside of the box” when conceptualizing new and engaging ways to interact with our existing and future robots, all without having to work within the confines of scalability, reliability, low cost to consumers, and other “businessy” factors.
This year, our very capable engineering team came up with close to 40 different projects to showcase at the R&D Fair. We’re not ready to reveal most of them just yet, for reasons ranging from a project still being in its infancy and not ready for primetime, to a feature already being integrated into a future release, to a feature being reserved for our future home robot(s).
However, we also understand that you’re all extremely curious about what goes on behind “closed doors” at Anki. So here are a couple of projects we can show you, to give you a little taste of the kinds of experiences we’re obsessively thinking about as we evaluate our long-term feature roadmap for the robots. Keep in mind that these are all prototypes and experiments from our team, and some of them may simply not make the cut. We promise to keep you updated on future releases, but in the meantime, enjoy these stellar projects.
Vector Learns His Boundaries
We all know that Vector is a curious little fella who can sometimes wander into areas he’s not supposed to go. Using Vector’s core robotics behavior code, our Robotics Software Engineer, Matt Michini, masterminded a feature that allows Vector owners to create a virtual fence for Vector. Matt’s unique approach to creating the geofence involves using Vector’s Cube to temporarily trace the imaginary boundaries of the area where Vector is allowed to explore. As long as Vector is not picked up – as this would force Vector to restart mapping his environment once he’s put down again – he will remember where the boundaries lie.
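We don’t know exactly how Matt’s behavior code works internally, but the idea of tracing a boundary with the Cube and then checking whether a destination falls inside it can be sketched in plain Python. In this hypothetical sketch, the Cube’s traced positions form a polygon, and a standard ray-casting test decides whether a target point is inside the fence; all names here are illustrative, not Anki’s actual code.

```python
def point_in_fence(x, y, fence):
    """Ray-casting point-in-polygon test.

    fence: list of (x, y) vertices recorded while tracing with the Cube.
    Returns True if (x, y) lies inside the traced boundary.
    """
    inside = False
    n = len(fence)
    for i in range(n):
        x1, y1 = fence[i]
        x2, y2 = fence[(i + 1) % n]
        # Count crossings of a horizontal ray cast to the right of (x, y).
        if (y1 > y) != (y2 > y):
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:
                inside = not inside
    return inside

# A square fence traced at the corners of a 200 mm play area:
fence = [(0, 0), (200, 0), (200, 200), (0, 200)]
```

A behavior could then veto any drive target for which `point_in_fence` returns `False`, which is consistent with Vector remembering the boundary only as long as his map of the room stays valid.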
Pictionary with Vector
Vector may already be the king of Blackjack, but in the future, we imagine him being a game-playing machine (literally)! Leveraging Vector’s powerful SDK, our Perception & Vision Engineer, Patrick Doran, configured Vector to play Pictionary with his owner. Like the classic quick-draw game we’ve all grown up with, the feature allows Vector and his owner to take turns drawing an image and guessing each other’s sketch within a set time. How does it work? When the timer starts, Vector draws his object on his face using data from Google’s “Quick, Draw!” dataset, while relying on a laptop’s microphone for speech recognition of the owner’s guesses. When it’s Vector’s turn to guess, players can hold up their drawing in front of Vector’s camera for image recognition.
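The judging side of a round like this is simple enough to sketch. Below is a hypothetical, stripped-down version (not Patrick’s actual code): transcribed guesses arrive from speech recognition as raw strings with timestamps, each is normalized to ignore filler words, and the round is won only if a correct guess lands before the timer expires.

```python
def normalize(guess):
    """Lowercase and strip filler so 'A Giraffe!' matches 'giraffe'."""
    words = guess.lower().strip("!?. ").split()
    fillers = {"a", "an", "the", "is", "it", "its", "it's"}
    return " ".join(w for w in words if w not in fillers)

def judge_round(secret, timed_guesses, round_seconds=60):
    """Decide a Pictionary round.

    timed_guesses: list of (elapsed_seconds, guess_text) pairs.
    Returns True if a correct guess arrives before the timer runs out.
    """
    target = normalize(secret)
    for elapsed, guess in timed_guesses:
        if elapsed > round_seconds:
            return False  # timer expired before a correct guess
        if normalize(guess) == target:
            return True
    return False
```

In the real feature, the guess strings would come from the laptop’s speech recognizer and the secret word from the “Quick, Draw!” category Vector is sketching; both are treated as plain strings here.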
Controlling Vector with His Cube
With this project, Matt and Ron Barry (Senior Software Engineer) tried to imagine ways to control Vector much like a remote-controlled car but without relying on a smart device: after all, the beauty of Vector is that he’s fully autonomous. The end result? They devised a new Vector behavior where he would “listen” for accelerometer updates from the Cube. To control Vector’s movements, his owners can tilt the Cube forward, back, left and right to control the treads. They can also control Vector’s lift movement by giving the Cube a little shake.
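The control mapping described above can be sketched as a pure function. This is an assumed implementation, not Matt and Ron’s actual behavior code: forward/back tilt of the Cube sets a common tread speed, left/right tilt biases one tread to steer, and a large acceleration spike is treated as the “shake” that moves the lift. The constants are illustrative.

```python
MAX_SPEED_MMPS = 100.0   # assumed top tread speed for this sketch
SHAKE_THRESHOLD = 3.0    # total acceleration (in g) treated as a shake

def cube_to_drive(ax, ay, az):
    """Map Cube accelerometer readings (in g) to drive commands.

    ax: forward/back tilt, ay: left/right tilt, az: vertical axis.
    Returns (left_speed, right_speed, toggle_lift).
    """
    forward = max(-1.0, min(1.0, ax))   # clamp tilt to [-1, 1]
    turn = max(-1.0, min(1.0, ay))
    left = (forward + turn) * MAX_SPEED_MMPS
    right = (forward - turn) * MAX_SPEED_MMPS
    # A shake shows up as a spike in total acceleration magnitude.
    magnitude = (ax * ax + ay * ay + az * az) ** 0.5
    toggle_lift = magnitude > SHAKE_THRESHOLD
    return left, right, toggle_lift
```

Since Vector stays fully autonomous, a behavior like this would presumably only subscribe to Cube accelerometer events while the “remote control” mode is active, then hand control back to his normal behaviors.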
Vector Is Your New Dance Partner
Vector’s been known to get jiggy with it when he hears his favorite tunes, but our Software Engineer, Chris Rogers, thought it’d be much more fun if he could dance with his owner. Using a standalone version of PoseNet, Chris configured his laptop’s camera to track his facial and body movements. If, from the camera’s perspective, his nose is closer to his right eye, Vector turns right; when he raises his arm, Vector raises his lift accordingly. For added fun, Chris also programmed his Vector to say, “Throw your hands in the air, wave ’em like you just don’t care,” when Vector’s moving left and right with his lift raised.
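The two rules described above map naturally onto PoseNet’s output, which is a set of named keypoints with image coordinates. Here’s a hypothetical sketch of that mapping (not Chris’s actual code): the nose’s horizontal distance to each eye picks a turn direction, and a wrist above its shoulder (smaller y, since image y grows downward) counts as a raised arm. The keypoint names match PoseNet’s part names; the command strings are placeholders for whatever SDK calls would drive the robot.

```python
def pose_to_commands(keypoints):
    """Map PoseNet keypoints to Vector dance commands.

    keypoints: dict of part name -> (x, y) image coordinates.
    Returns a list of command strings for the robot to execute.
    """
    commands = []
    nose_x = keypoints["nose"][0]
    # Nose closer to the right eye means the head is turned that way.
    d_right = abs(nose_x - keypoints["rightEye"][0])
    d_left = abs(nose_x - keypoints["leftEye"][0])
    if d_right < d_left:
        commands.append("turn_right")
    elif d_left < d_right:
        commands.append("turn_left")
    # A wrist above its shoulder (smaller y) counts as a raised arm.
    if (keypoints["rightWrist"][1] < keypoints["rightShoulder"][1]
            or keypoints["leftWrist"][1] < keypoints["leftShoulder"][1]):
        commands.append("raise_lift")
    return commands
```

Running this once per camera frame and feeding the resulting commands to the robot would give the mirror-dancing behavior the project demonstrates.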