The single most-asked question we at Misty Robotics get is: “what can Misty do?”
This question comes at us from consumers, developers, investors, and other robot companies. There are as many answers as there are perspectives, but the shortest answer is “probably not as much as humanity expects a robot to do.” And that’s just fine.
If you think about all of the actions that humans perform on any given day, the human “Task Space” pie chart might look something like this:
Of course, those human task categories break down into a huge number of actual jobs that humans around the world are doing daily, with skill and panache.
Compared to this, the world of successfully performed robot “jobs” is very small indeed:
• driving forklifts in a warehouse (Kiva, 6 River Systems, et al)
• placing parts on an assembly line (Fanuc, Yaskawa, ABB, et al)
• assisting humans in picking and placing items into boxes (Universal, Fanuc, formerly Rethink)
• vacuuming the carpet (iRobot, Neato, et al)
That’s not a whole lot of jobs. Sure, there are dozens of startups currently exploring a few more job areas. And there are dozens more R&D efforts feverishly afoot that are either expanding the range of these current market-accepted jobs or exploring new ones. Yet, as we’ve written in the past (“Why the robot industry is so focused on single-task robots”), the robotics market is deeply mired in what we believe to be conventional “one robot, one task” thinking.
One robot, many tasks
Misty is all about doing hundreds, perhaps thousands of jobs. Misty is a robot platform built for software developers to explore the far range of the possible jobs for robots today.
What kind of jobs will Misty be able to fill? The short answer is: (almost) any job that requires independent mobility, eyes, ears and a mouth — because those are her four core competencies.
Looking at the Task Space, we can rule out many segments of human tasks that Misty cannot perform:
• Physical/manual labor that requires sophisticated manipulation with hands, lots of force, or legs that are able to climb stairs or step over significant obstacles
• Rote/manual labor tasks that require hands/legs for manipulation.
• Office work that requires lots of manipulation with hands/fingers.
• Sophisticated mental work that requires deep thinking along with hearing, speaking, etc. Misty’s not likely to replace law or economics professors any time soon.
• Empathic work that requires being highly attuned to others. Misty’s also not replacing therapists any time soon.
What does that leave? It leaves office work that doesn’t require manipulation. It leaves rote/manual work that doesn’t require walking up steps or manipulation. It leaves the empathy jobs that focus on companionship, to help with loneliness. In addition, Misty can indeed accomplish some percentage of the “long tail” of tasks that only require mobility, ears, eyes, and/or mouth.
We’ve broken these “job” types down into a few categories. But before you see them, ask yourself: “What would I do with a programmable entity that could operate and roam independently, ‘see,’ ‘speak,’ and ‘hear’?” (For some ideas, check out our Community site to see posts in the ever-growing “what will you build” topic.)
Here’s our categorization:
• Roam & collect data. Consider the HVAC person we spoke to who said, “I hate having to come back to a building I’ve repaired and walk around to check the temperature. If I had a robot to explore and collect the temperature data I could be making more money on the next job.” Or the sound engineer in an auditorium: “It’s boring to walk around the space and figure out whether the sound is reaching every area.” Roaming and collecting data is a basic manual task that many humans dislike, or that isn’t cost-effective for them to do.
• Roam & deliver. This could be envelope delivery within an office, toothpaste delivery at a hotel, or coffee delivery at home. True, Misty II is diminutive enough that she can’t carry a hefty payload — but we’re confident her payload capacity will go up as these jobs are validated.
• Roam & report. How many security officers at night have a standard path they walk to observe sights and sounds that are out of place? A lot. We also spoke to an energy pipeline company that has difficulty getting humans to go to their remote locations to “roam and report” on the condition of the pipeline (cracks, spills, environmental concerns). My favorite example here comes from a friend who sells consumer goods to retail: “We pay a lot of money to humans to walk into a store, take a picture of our end-cap and send it to us for compliance — why can’t Misty do this at night for all of the suppliers to the retail store?”
• Roam & map. Interior decorators prefer 3D models that include high-definition photo/video in order to craft their creations. Real estate professionals want high-fidelity maps of interior spaces so they can work with clients on which commercial space is the ideal fit.
• Roam & control. Many environments have elements within them to control. Offices and homes have thermostats, lights, cameras, and other manipulable elements embedded in them. We humans aren’t very good at optimizing the energy flow in our spaces by turning off lights or managing temperatures — Misty can explore her environment and manage these controls regularly.
• Roam & entertain (or just flat-out entertain). Some of the most fun examples we’ve had came from people who started with a “practical” application like the ones above, but then they got creative. “How about a robot soap opera on YouTube?” or “What if we had a remote Robot Escape Room where you have to figure out the puzzles of the room, but the robot is the one actually doing them?”
• Roam & interact. Retail stores have very high turnover, and one of the most common jobs is to direct a customer to a particular part of the store, or, for a customer standing by a particular product, to answer questions about that product. Or, in a care setting, there are numerous routine jobs that require providing medicines to disabled or elderly people and validating that they took those medicines.
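Most of these “Roam & …” jobs share the same skeleton: visit a series of waypoints, do something at each one, and record the result. Here’s a minimal Python sketch of that pattern for the temperature-collection example. The `MockRobot` class and its `drive_to`/`read_temperature` methods are placeholder names invented for illustration — they are not part of any actual Misty API, which a real skill would call instead.

```python
import random

class MockRobot:
    """Stand-in for a real robot SDK. The method names here are
    hypothetical placeholders, not actual Misty API calls."""

    def drive_to(self, waypoint):
        # A real robot would navigate here; we just pretend we arrived.
        return waypoint

    def read_temperature(self):
        # Simulated onboard sensor reading, in degrees Celsius.
        return round(random.uniform(18.0, 24.0), 1)

def roam_and_collect(robot, waypoints):
    """Visit each waypoint and record a (waypoint, temperature) pair."""
    readings = []
    for wp in waypoints:
        robot.drive_to(wp)
        readings.append((wp, robot.read_temperature()))
    return readings

if __name__ == "__main__":
    route = ["lobby", "server room", "conference room"]
    for wp, temp in roam_and_collect(MockRobot(), route):
        print(f"{wp}: {temp} °C")
```

The same loop structure covers “roam & report” or “roam & control” jobs by swapping the per-waypoint action (take a photo, toggle a thermostat) for the sensor read.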
We’re quite confident that you, the reader of this article, will invent a few dozen jobs for Misty just by yourself.
Will Misty be able to do all of them perfectly?!? Doubtful. Many of our customer-inventors will explore the boundaries of the Task Space. Some jobs will require more brain power. Some jobs will need different camera angles (a taller Misty). Some will require enhanced simultaneous object recognition or on-the-spot audio learning. Nevertheless, many of them will be highly successful in the jobs they invent for Misty to do. And when that happens, consumers will start to see Misty out and about performing her regular tasks, whether that’s working in a retail store at night, patrolling the hallways of schools or office buildings, mapping an empty home for her real estate employers, or entertaining millions of YouTube viewers with her various roles in “As the Gears Turn.”