"I have no experience with hardware, so that's what I found enticing about Misty. I could write code that causes something to happen in the world, not just on computer screens or phones."

— Cameron


Tools FAQs

What kind of SDK will you offer?
Misty II ships with an on-board SDK that exposes commands for direct control and autonomous skills. JavaScript is the primary supported language initially, but additional language wrappers have already been written and more will continue to be created.
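As an illustration, on-robot JavaScript skills typically issue commands through a global `misty` object. The command names below follow the style of Misty's documented API but should be treated as an assumed sketch, and a small stub stands in for the robot so the snippet runs anywhere:

```javascript
// Stub of the on-robot `misty` object (on Misty II the runtime
// provides the real one). It records each command for inspection.
const calls = [];
const misty = {
  ChangeLED: (r, g, b) => calls.push(["ChangeLED", r, g, b]),
  PlayAudio: (file) => calls.push(["PlayAudio", file]),
};

// Hypothetical skill body: set the chest LED to green, then play a clip.
// The audio filename is a placeholder, not a file verified to ship on Misty.
misty.ChangeLED(0, 255, 0);
misty.PlayAudio("s_Awe.wav");
```

On the robot itself you would omit the stub; the skill runtime injects `misty` and each call becomes a physical action.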
How do I see debug information for my robot skills?
The Misty APIs provide substantial error reporting, available in the console of most common JavaScript IDEs and in most browsers. Debug and log information is also available in the Misty Companion App when working with the robot in the field.
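A common pattern is to wrap a command so that failures surface in those consoles instead of silently stopping a skill. The sketch below assumes a `misty.Debug`-style logging call and a `DriveTime` command (both stubbed here so the snippet runs off-robot):

```javascript
// Stub of the on-robot object: Debug collects messages, and DriveTime
// rejects an invalid duration the way a real command validation might.
const log = [];
const misty = {
  Debug: (msg) => log.push(msg),
  DriveTime: (linear, angular, ms) => {
    if (ms <= 0) throw new Error("duration must be positive");
  },
};

// Wrap the drive command so both success and failure are logged.
function safeDrive(linear, angular, ms) {
  try {
    misty.DriveTime(linear, angular, ms);
    misty.Debug(`DriveTime ok: ${linear}/${angular} for ${ms}ms`);
  } catch (err) {
    misty.Debug(`DriveTime failed: ${err.message}`);
  }
}

safeDrive(10, 0, 1000); // logs a success message
safeDrive(10, 0, 0);    // logs the failure instead of crashing the skill
```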
What sorts of things are enabled by the Misty API Explorer?
The API Explorer is always evolving to reflect the current state of Misty's capabilities; however, you can always count on being able to test out Locomotion, Object Detection, Mapping, Computer Vision, Audio and Visual Inputs and Outputs, and Fart Sounds.
What can I do with the Misty Blockly client?
With the Misty Blockly app, developers and non-developers alike can use a block interface to chain functions together and execute simple or complex code on their Misty robot. In many cases, Blockly is a good way to quickly test something and get some code generated as a reference, even for experienced developers.
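Code generated from a chain of blocks typically reads as a flat, step-by-step sequence of commands. The snippet below imagines that output (command names assumed, with a recording stub so it runs off-robot):

```javascript
// Stub standing in for the on-robot runtime; records each command so
// the generated sequence can be inspected in order.
const sequence = [];
const misty = {
  MoveHead: (pitch, roll, yaw) => sequence.push(`MoveHead(${pitch},${roll},${yaw})`),
  Pause: (ms) => sequence.push(`Pause(${ms})`),
  ChangeLED: (r, g, b) => sequence.push(`ChangeLED(${r},${g},${b})`),
};

// The kind of flat script a Blockly chain might generate:
// tilt the head down, wait half a second, then turn the LED red.
misty.MoveHead(-10, 0, 0);
misty.Pause(500);
misty.ChangeLED(255, 0, 0);
```

Because each block maps to one call like this, the generated script doubles as a readable reference when you move on to writing skills by hand.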