
Misty II Project Directory, Part 3: Tools, Sandboxes, & Other Inventions

Welcome back!

Part 1 of this series collected a list of skills from the Misty II developer community, and Part 2 included links to sample code maintained by the Misty Robotics organization. Part 3 focuses on community-created tools, experiments, and other inventions that provide new ways to explore the platform and use Misty’s capabilities.

Each entry includes a link to the source code or web page where you can try the project for yourself. Misty Robotics does not maintain any of these projects, and you may run into issues as you explore. Proceed with caution, a tolerance for the unknown, and a healthy sense of adventure. We’ve shared links to existing discussion threads where you can chat with a project’s creator in the community forums. Join the conversation if you have questions or comments, or just to say thanks for sharing!


Python Wrappers – If you’re interested in coding Misty with Python, check out these wrappers for her REST API. The misty_py project is an async/await Python 3.7 wrapper for Misty’s REST API, and Wrapper-Python provides a slightly different implementation. Both Python wrappers are supported by the Misty community.
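
If you want a feel for what a raw REST call looks like before you pick a wrapper, here’s a minimal sketch using the plain requests library. The /api/led and /api/tts/speak endpoint paths (and the placeholder IP address) are our assumptions from Misty’s REST API reference, so double-check them against the current docs:

    import requests

    MISTY_IP = "192.168.1.100"  # replace with your robot's IP address

    # Turn Misty's chest LED green (assumed endpoint: POST /api/led)
    requests.post(f"http://{MISTY_IP}/api/led",
                  json={"red": 0, "green": 255, "blue": 0})

    # Have Misty speak with onboard text-to-speech (assumed endpoint: POST /api/tts/speak)
    requests.post(f"http://{MISTY_IP}/api/tts/speak",
                  json={"text": "Hello from Python!"})

Either wrapper packages HTTP calls like these behind friendlier function names.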

Moving Map – A mapping interface that started life as a custom extension of the Command Center web page, Misty’s Moving Map collects time-of-flight, bump sensor, and inertial measurement unit (IMU) data from Misty’s WebSocket connections to draw a top-down 2D map while you drive Misty around her environment. A Misty icon shows you the robot’s location and orientation in the map, and you can draw walls and paths based on the data from Misty’s sensors. You can even save your map as a JSON data object and upload it when you want to use it later. There’s an active discussion about the Moving Map project (with links to an earlier version) on the community forums.

Map Misty’s space, mark obstacles, and draw paths with the Moving Map tool.
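
Curious what the raw sensor feed behind a tool like this looks like? Here’s a rough Python sketch that subscribes to one of Misty’s WebSocket event streams with the websocket-client package. The ws://&lt;ip&gt;/pubsub endpoint and the fields in the subscribe message are based on our reading of Misty’s WebSocket docs, so treat them as assumptions and verify against the current reference:

    import json
    import websocket  # pip install websocket-client

    MISTY_IP = "192.168.1.100"  # replace with your robot's IP address

    def on_open(ws):
        # Ask Misty to stream time-of-flight readings every 250 ms
        # (message fields assumed from Misty's WebSocket docs)
        ws.send(json.dumps({
            "Operation": "subscribe",
            "Type": "TimeOfFlight",
            "DebounceMs": 250,
            "EventName": "tof-stream",
        }))

    def on_message(ws, raw):
        event = json.loads(raw)
        # Each event carries the reporting sensor and its measured distance
        print(event.get("message"))

    ws = websocket.WebSocketApp(f"ws://{MISTY_IP}/pubsub",
                                on_open=on_open, on_message=on_message)
    ws.run_forever()

The Moving Map tool combines streams like this one (plus bump sensor and IMU events) to place obstacles and track Misty’s pose on the map.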

misty-interact – A Node.js server that lets you control Misty via REST API commands. This project includes several example applications that use external services for interactive functionality (like speech recognition, response generation, and text-to-speech). You must supply your own credentials for these services before the examples will work. Check out the discussion in the community forums.
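
The project itself is written in Node.js, but the relay pattern is easy to picture in any language. Purely as an illustration, here’s a hedged Python/Flask sketch of a tiny server that accepts text from a client and forwards it to Misty’s text-to-speech command, pulling the robot’s address (and, in a fuller version, your external-service credentials) from environment variables. The /say route and the MISTY_IP variable are made up for this example, and the /api/tts/speak path is our assumption from Misty’s REST docs:

    import os
    import requests
    from flask import Flask, request, jsonify

    app = Flask(__name__)
    MISTY_IP = os.environ["MISTY_IP"]  # robot address supplied via the environment
    # External-service credentials (speech recognition, etc.) would be read here too.

    @app.route("/say", methods=["POST"])
    def say():
        body = request.get_json(silent=True) or {}
        text = body.get("text", "")
        # Forward the text to Misty's onboard text-to-speech (assumed endpoint)
        r = requests.post(f"http://{MISTY_IP}/api/tts/speak", json={"text": text})
        return jsonify(ok=r.ok)

    if __name__ == "__main__":
        app.run(port=5000)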

misty-client – A library for sending REST API commands, setting up WebSocket connections, and mapping, grouping, filtering, and transforming data from WebSocket messages. This library was used to build the misty-interact server linked above.

rerobots Misty II Sandbox – This sandbox creates a live connection to a remote Misty II in your web browser. Use the in-browser code editor to write real Python code for Misty’s REST API, and watch the robot run your commands in the live video feed. It’s a great way to get hands-on experience with Misty’s REST API and try out a robot before you decide to pick one up for yourself. Scott, creator of rerobots, talked with us about the sandbox in an Uplinks broadcast earlier this year. Check out the recording, and if you have questions or comments, be sure to join the discussion in the community forums.

Connect to a remote Misty II at rerobots.net to code against the robot’s REST API in real time.

mistygrind – A tool for static analysis of the JavaScript skills and REST API client code you write for Misty II. This tool is intended to reduce errors in your skill code before you deploy it to your robot. Check out the discussion in the community forums.

simpleDock – An experiment with auto-docking using Misty’s JavaScript SDK. Assuming that a) Misty’s charger is against the wall, b) Misty is slightly to either side of the charger, and c) Misty is facing in the charger’s direction, Misty uses time-of-flight sensor data and battery charge status information to attempt to drive onto the charging pad automatically. This project is highly experimental, and the code doesn’t rely on APIs that detect the charger using custom fiducial markers or the charger’s built-in infrared reflectors. We’ve included it here as an example of one way to approach the problem of auto-docking.
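
To make the approach concrete, here’s a rough Python sketch of the docking loop described above. The real project uses Misty’s JavaScript SDK; the callables here (front_distance, is_charging, drive) are hypothetical stand-ins for the SDK’s time-of-flight, battery, and drive APIs, and the 5 cm cutoff is just an illustrative number:

    import time

    def attempt_dock(front_distance, is_charging, drive, step_s=0.25):
        """Creep straight toward the wall-mounted charger until the battery
        reports charging, or give up when the front time-of-flight reading
        says we're nearly touching the wall.

        front_distance -- callable returning the front ToF distance in meters
        is_charging    -- callable returning True once the battery is charging
        drive          -- callable(linear_velocity, angular_velocity)
        """
        while not is_charging():
            if front_distance() < 0.05:   # ~5 cm from the wall: stop and give up
                drive(0, 0)
                return False
            drive(0.1, 0)                 # slow, straight forward
            time.sleep(step_s)
        drive(0, 0)                       # charging detected: stop on the pad
        return True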

Teleop – An experimental desktop application that lets you remotely control your Misty II, stream images from her camera, and issue text-to-speech commands from your PC. This app downloads pictures from Misty’s RGB camera as fast as the robot can take them, resulting in a low-framerate, low-latency livestream of whatever your robot is looking at. Use your keyboard to drive the robot, and click the screen to move Misty’s head in the direction you want to look.

Remote control your Misty II (and see what Misty sees) with the experimental Teleop desktop app.
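
Under the hood, a stream like this amounts to fetching still images in a tight loop. Here’s a rough Python sketch of that loop; the GET /api/cameras/rgb endpoint, its base64 query parameter, and the shape of the JSON response are assumptions from our reading of Misty’s REST docs, so verify them before relying on this:

    import base64
    import time
    import requests

    MISTY_IP = "192.168.1.100"  # replace with your robot's IP address

    frame = 0
    while True:
        # Ask Misty for a fresh photo from the RGB camera as a base64 string
        # (endpoint and response shape assumed; check the REST API reference)
        r = requests.get(f"http://{MISTY_IP}/api/cameras/rgb",
                         params={"base64": "true"})
        data = r.json().get("result", {}).get("base64", "")
        if data:
            with open(f"frame_{frame:04d}.jpg", "wb") as f:
                f.write(base64.b64decode(data))
            frame += 1
        time.sleep(0.1)  # poll as quickly as the robot can respond

The Teleop app adds keyboard driving and click-to-look head control on top of this kind of image loop.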

We hope the skills, sample code, tools, and projects shared in this series provide a rich collection of resources that inspires you and helps you learn about developing for the Misty platform. If you think something should be included here, please leave a comment to let us know. We can’t wait to add your project to the list!
