At InnoRobo, I got the chance to speak with Colin Angle, CEO of iRobot, in a very candid interview about his views on the robotics industry, his vision for AVA, and how he sees the industry shaping up in the coming years.
Cool over Function
As we started out, in keeping with the presentation he gave earlier in the day, he opened with a discussion of how hundreds of millions of dollars have been spent on making cool demos – but relatively little on solving higher-value business needs.
To illustrate his point, he mentioned the incredible effort that has been undertaken on the development of humanoid robots. He calls this an exercise in “cool over utility”. As he mentioned, the challenge of building a system that supports bipedal legs and actually executes walking and balancing has been a costly adventure. Even the most exciting systems often have a team of scientists walking behind them, a mean time to failure of about 45 minutes, and limited performance – all at a cost of millions of dollars.
The iRobot Warrior system, by contrast, is what Colin considers the first practical human-sized robot ever designed. Handling drops of 10 to 20 feet, carrying payloads of over 200 pounds, and able to go where human-sized systems should go, the Warrior is a solution to a problem that does not need a bipedal system – just a solid system that solves a high-value problem.
Thoughts on Remote Presence
So, in keeping with my theme, I steered the conversation to remote presence and how he saw AVA potentially accomplishing it. He quite nicely broke down the problem and how AVA is an attempt to resolve the puzzle.
First and foremost, he wants to deliver an experience better than being there yourself – without the travel time. He wants to mimic “presence” in such a way that the experience for you (the pilot) is rich, deep and intuitive. And, in keeping with many people in this space, he does not feel that remote-controlled webcams or the Cisco telepresence solutions are solving this.
To achieve ubiquitous remote presence, an RC webcam is not effective, since the pilot has limited ability to truly understand the environment the robot is in. While a person could learn the environment over time (e.g., where the offices are, where the conference rooms are), wouldn’t it be better to have the RPS know the entire layout, let you request a destination, and simply take you there? Cisco telepresence solutions are not effective in other cases simply due to the very nature of the systems themselves – limited in freedom, tied down to a single location, and very limited in their ability to express you outside of the magic screen.
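The “ask for a place and let the robot take you there” idea can be sketched as a named-waypoint lookup on top of a map the robot already knows. This is a minimal illustration only – the location names, coordinates, and `Waypoint` type below are my own assumptions, not anything iRobot described:

```python
# Hypothetical named-destination navigation: the RPS holds a map of known
# locations, so the pilot asks for "Conference Room B" rather than driving
# there manually. All names and coordinates are illustrative assumptions.
from typing import NamedTuple

class Waypoint(NamedTuple):
    x_m: float  # position in the building's map frame, meters
    y_m: float

OFFICE_MAP = {
    "lobby": Waypoint(0.0, 0.0),
    "conference room b": Waypoint(12.5, 4.0),
    "break room": Waypoint(8.0, -3.5),
}

def resolve_destination(request: str) -> Waypoint:
    """Turn a pilot's typed/spoken request into a goal a planner could use."""
    key = request.strip().lower()
    if key not in OFFICE_MAP:
        raise KeyError(f"Unknown location: {request!r}")
    return OFFICE_MAP[key]

print(resolve_destination("Conference Room B"))  # Waypoint(x_m=12.5, y_m=4.0)
```

The point of the sketch is the division of labor: the pilot supplies intent (“go to the conference room”), and the robot supplies the map knowledge and the driving.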
Colin’s vision is the ability to have a surrogate “you” – one that could, in any location, be present and do the things that you normally would do. Go to the room you wish to visit, carry on a conversation outside a room, be aware of who is around and where they are spatially, and go to them with minimal effort.
How the AVA fits in
He was amused that people thought AVA was iRobot’s foray into robotic telepresence – he sees AVA as a “generic platform” supporting all of the robotic functions necessary for enabling remote presence, the kind of work iRobot is known for (“Do what iRobot is great at”). For instance, as we discussed the functional components of AVA, he pointed out the various features that the AVA platform supports:
- Downward facing IR for cliff detection
- Braking systems to ensure the system is not going to fall
- Small physical footprint (on the order of a human’s) to ensure a tight turning radius and strong stability
- Bumpers and upward facing sonar for detection of objects that could potentially collide with the head of the device
- Two PrimeSense sensors to enable a better understanding of the world through 3D mapping both of the navigation environment (downward facing) and the environment in front (on the camera assembly)
- A LIDAR component, which he wants to reinvent to bring the cost down (the most expensive piece of the system)
- Control surfaces for participants to move the system without physically pushing the system (through the bumper pads on the neck) to improve management of the system
- Telescoping neck (via lead screw) that keeps a low center of gravity during motion while affording a variable height for engaging with participants either standing or seated
- Positioning control for the neck/head component
- A rail mechanism on the back of the neck that supports adding manipulators to the system
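Several of the features above (cliff-detecting IR, bumpers, upward sonar, braking) are safety inputs that gate motion. A rough sketch of how such sensors might be combined on a mobile base – a minimal illustration whose sensor names, thresholds, and structure are my own assumptions, not iRobot’s AWARE2 design:

```python
# Hypothetical safety gate for a mobile base: motion is allowed only when
# the downward IR still sees floor, no bumper is pressed, and the upward
# sonar shows clearance for the head. All names/thresholds are illustrative.
from dataclasses import dataclass

@dataclass
class SensorReadings:
    cliff_ir_range_m: float   # downward IR: measured distance to the floor
    bumper_pressed: bool      # any contact bumper triggered
    sonar_headroom_m: float   # upward sonar: clearance above the head

def motion_allowed(s: SensorReadings,
                   max_floor_gap_m: float = 0.05,
                   min_headroom_m: float = 0.10) -> bool:
    """Return True only when every safety check passes."""
    if s.cliff_ir_range_m > max_floor_gap_m:  # floor dropped away: a cliff
        return False
    if s.bumper_pressed:                      # contact: stop, let planner back off
        return False
    if s.sonar_headroom_m < min_headroom_m:   # overhead obstacle near the head
        return False
    return True

# Normal driving: floor 3 cm below the sensor, no contact, 1 m of headroom.
print(motion_allowed(SensorReadings(0.03, False, 1.0)))  # True
# A stairwell edge: the downward IR suddenly reads 20 cm to the "floor".
print(motion_allowed(SensorReadings(0.20, False, 1.0)))  # False
```

The design choice worth noting is that the gate is conservative: any single failing check vetoes motion, which is how a platform vendor can offer safety guarantees independent of whatever application runs on top.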
Providing a platform for application development
Colin said that iRobot’s primary focus is on the “robotic functions” of a “generic platform” – to help others overcome the liability issues. iRobot has done a lot of work – through their previous designs and their own operating system (AWARE2) – to make as safe and reliable a platform as possible. Rather than trying to make a specific platform for remote presence, Colin said it is iRobot’s intent to build the platform and let developers/designers build on a platform they know – and let them create a solid system.
I got somewhat confused here – it sounded like he was suggesting that iRobot would not compete in application development and would not build systems for specific purposes, like remote presence. When I pressed, he clarified that iRobot would not get in the way of something like Pad-to-Internet-to-Pad communications (e.g., FaceTime, qik, Skype), but for a navigation interface (e.g., a web front-end for piloting the system) through which the pilot interacts with AVA, iRobot might offer a solution. But, like Apple, iRobot’s applications could sit alongside any third-party solutions – enabling those developers to build a better interface/application that talks to AWARE2 and controls the AVA platform just as well.
As Colin put it: “Yes, it is our intention to develop apps for AVA alongside other developers, as we need to, as you say, ‘prime the pump’. As we look at the way things are likely to play out, iRobot is committed to being best in the world at autonomy/navigation software, platforms, manipulation, and the integration of 3rd-party hardware – while we aspire to be one of many application developers.”
But for remote presence, the idea of a tablet with a camera and a large screen (like the Motorola Xoom or the iPad 2) connecting to the AWARE2 API would easily support the creation of an RPS and allow developers to rapidly iterate versions. And with an extendable head and telescoping neck, placing the pilot’s face at a natural height would be straightforward – allowing remote presence to potentially become real.
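One way a third-party pilot application on such a tablet might drive the platform is by sending simple serialized commands over the network. The command names, message schema, and parameters below are purely my assumptions for illustration – AWARE2’s actual interface was not discussed:

```python
# Hypothetical pilot -> robot command messages for a tablet-based RPS.
# The schema and command names are illustrative assumptions only.
import json

def make_command(kind: str, **params) -> str:
    """Serialize a pilot command a robot-side service could dispatch on."""
    return json.dumps({"cmd": kind, "params": params})

# Drive toward a named room; raise the telescoping neck to standing height.
goto = make_command("goto", destination="conference room b")
neck = make_command("set_neck_height", meters=1.5)
print(goto)  # {"cmd": "goto", "params": {"destination": "conference room b"}}
```

The appeal of this split is exactly what Colin described: the application layer stays thin and easy to iterate on the tablet side, while the hard autonomy work (mapping, path planning, safety) lives on the platform.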
Other juicy bits
From other conversations, I learned that a number of AVA prototypes are already out in the market space – in the midst of prototype development for various problems. I could see that Colin’s vision allows for an augmented reality for the pilot – click-and-response actions within the view of the RPS (e.g., opening doors, tagging people in a meeting, setting vision points to track where people are and turning the head to respond to them rapidly). How this comes about will be an interesting exercise in the coming years.