Misty’s Sensors at a Glance

What they do, why we put them there, and how you might use them in your skills.


Misty’s sensors are her toolkit for understanding the world. From pitch velocity to object distance, the data these sensors provide is where the rubber hits the road in coding Misty’s skills. Understanding a bit about how these sensors work (and how to use this data in your code) is a key part of programming our robot friend.

This post aims to provide a 10,000-foot overview of Misty’s sensors. It covers (at a high level) what Misty uses them for, and runs through examples of how you might use them in your own skills. Let’s get started!

Time-of-flight

A time-of-flight sensor – frequently shortened to “ToF” – measures how long it takes a pulse of light (typically from an infrared laser) to travel to an object and reflect back. In Misty’s case, that round trip spans the space between her and the nearest obstacle.

Misty uses eight time-of-flight sensors (three front, one rear, and four facing downward) to calculate the distance of objects in her path. She uses the front and rear sensors primarily for obstacle avoidance. Register for the stream of data from these sensors to know when something is too close and Misty needs to find another path.
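As a sketch of how you might consume that stream, the function below checks a single ToF event and decides whether Misty should stop. The event shape here (a dict with “sensorPosition” and “distanceInMeters” keys) and the 15 cm threshold are illustrative assumptions, not Misty’s actual defaults – adapt them to the real event messages you receive.

```python
# Sketch: decide whether Misty should stop, based on one time-of-flight
# event. The dict keys and threshold below are illustrative assumptions.

STOP_DISTANCE_M = 0.15  # stop when an obstacle is within 15 cm (tunable)

def should_stop(tof_event):
    """Return True if a front- or rear-facing reading is too close."""
    position = tof_event.get("sensorPosition", "")
    distance = tof_event.get("distanceInMeters")
    if distance is None or position.startswith("Downward"):
        return False  # downward sensors are handled separately (edge detection)
    return distance < STOP_DISTANCE_M
```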


The function of the downward-facing sensors is typically the opposite. Misty uses these to know when an object (like the floor) is too far away. They’re put in place to detect dangerous edges, and should prevent Misty from taking unnecessary tumbles. 
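The downward-facing check inverts the logic: a drop-off shows up as the floor suddenly being too far away. A minimal sketch, again assuming an illustrative event shape and threshold:

```python
# Sketch: flag a dangerous edge when a downward-facing time-of-flight
# sensor reports the floor is too far away. The key name and threshold
# are assumptions for illustration.

EDGE_DISTANCE_M = 0.08  # the floor normally sits a few centimeters below the sensor

def edge_detected(downward_tof_event):
    """Return True when the reported floor distance suggests a drop-off."""
    distance = downward_tof_event.get("distanceInMeters")
    return distance is not None and distance > EDGE_DISTANCE_M
```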

Inertial measurement unit (IMU)

Inertial measurement units (IMUs) are typically used to measure the orientation, rotational velocity, and acceleration of an object. IMUs are the backbone of navigation in all kinds of devices, like aircraft, satellites, GPS systems, and (our favorite) robots.

Misty’s IMU reports her current pitch, roll, and yaw, her rotational velocity in each of these directions, and her current acceleration along her X, Y, and Z axes. Misty uses this information to do a lot of heavy lifting behind the scenes of her locomotion API. Additionally, IMU data is integral to Misty’s simultaneous localization and mapping (SLAM) capabilities.

Coding robots to navigate figure-eight patterns? An IMU is the sensor for you.

When you plug IMU data into your own skill code, you’re ready to program precise locomotion, engaging interactions, and realistic movement. Combining IMU and face recognition data makes it possible to code Misty to turn and follow faces she detects. Or, as another example, you might stream audio localization data from Misty’s mics and teach her to run away from loud noises. 
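One small building block behind that kind of face-following behavior: given Misty’s current yaw and a target bearing, compute the signed turn that takes the short way around. This is generic angle math, not Misty-specific API:

```python
# Sketch: shortest signed turn between two headings, in degrees.
# Positive means turn one way, negative the other; result is in [-180, 180).

def shortest_turn(current_yaw_deg, target_yaw_deg):
    """Signed degrees to rotate from the current yaw to the target yaw."""
    return (target_yaw_deg - current_yaw_deg + 180.0) % 360.0 - 180.0
```

Feeding the result into a drive or turn command means Misty never spins 340 degrees when a 20-degree turn will do.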

Cameras

With a 4K RGB camera, a wide-angle fisheye lens, and stereo infrared cameras, Misty’s got cameras in spades. All of these cameras come packaged together in Misty’s visor, and they’re calibrated in parallel to improve Misty’s accuracy when performing computer vision tasks.

Misty’s RGB camera streams image data to the 820 processor on her headboard for on-board face detection, training, and recognition. You use this functionality in the skills you write by registering for face recognition events in your code. 
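Once face recognition events are flowing into your skill, a simple dispatch on the recognized label is often all you need. The “label” field and the “unknown person” value below are assumptions about the event shape, used purely for illustration:

```python
# Sketch: react differently to known and unknown faces. The event key
# "label" and the value "unknown person" are illustrative assumptions.

def greeting_for(face_event):
    """Pick a spoken greeting based on who Misty recognized."""
    label = face_event.get("label", "unknown person")
    if label == "unknown person":
        return "Hello! We haven't met yet."
    return f"Welcome back, {label}!"
```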

Other uses for the RGB camera include taking pictures and recording video. Misty stores images and video files locally, and you can do with them whatever you’d like – print a picture of someone you’ve caught snooping, offload video recordings to your PC, or send images to the cloud for additional processing.

Next up are Misty’s fisheye lens and infrared cameras, both of which are components of the Occipital Structure Core depth sensor Misty uses for simultaneous localization and mapping. If you’d like, you can also use these cameras to capture black-and-white fisheye photographs, or to generate a matrix of depth information about the scene in front of Misty.


Misty uses her depth sensor to generate an “occupancy grid” – a two-dimensional matrix of values indicating areas of open space, occupied space, and “covered” space (areas that Misty can drive beneath).
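If you pull an occupancy grid into your own code, a quick way to get oriented is to tally the cell types. The integer encoding below (0 = unknown, 1 = open, 2 = occupied, 3 = covered) is an assumption for illustration – check the mapping documentation for the actual values:

```python
# Sketch: summarize a 2-D occupancy grid. The cell encoding used here
# (0 unknown, 1 open, 2 occupied, 3 covered) is an illustrative assumption.

from collections import Counter

def grid_summary(grid):
    """Count how many cells of each type appear in the grid."""
    names = {0: "unknown", 1: "open", 2: "occupied", 3: "covered"}
    counts = Counter(cell for row in grid for cell in row)
    return {names.get(value, "other"): count for value, count in counts.items()}
```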

Capacitive touch

We’ve paneled Misty’s head with six capacitive touch sensors – four on top, one on her chin, and one near her carrying handle. These sensors generate an electric field that conductive objects (like your finger) disrupt. Each time this happens, Misty sends an event message indicating the position of the sensor and whether that sensor was touched or released. Consider using these messages in combination with face recognition data to have Misty react differently when touched by different people.
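A sketch of routing those touch events to reactions – the keys “sensorPosition” and “isContacted”, and the sensor names in the table, are illustrative assumptions about the event shape:

```python
# Sketch: route capacitive touch events to reactions. The event keys and
# sensor names below are illustrative assumptions, not Misty's actual API.

REACTIONS = {
    "Chin": "purr",
    "HeadFront": "look_up",
    "HeadBack": "look_down",
}

def react_to_touch(touch_event):
    """Return a reaction name on touch, or None on release / unmapped sensors."""
    if not touch_event.get("isContacted", False):
        return None  # the sensor was released, not touched
    return REACTIONS.get(touch_event.get("sensorPosition"))
```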


Bump

The bump sensors on Misty’s base – two in front and two in the rear – are panels that activate switches when pressed. For the most part, Misty uses these for obstacle avoidance. When Misty is right up against an object, one of her bump sensors should activate to indicate that something is in her way.

In addition to obstacle avoidance, you might consider using bump sensors as “buttons” that can initiate a new sequence of actions – like learning a new face or getting the weather from an external API. As a robot, Misty lacks interface elements we’re used to. She doesn’t have a keypad or buttons in the traditional sense. But she does have capacitive touch and bump sensors, and these input methods are readily available for kicking off other parts of your skills. 
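The “bump sensor as button” idea boils down to a lookup table from sensor to action. The sensor names and the “sensorName” key below are illustrative assumptions:

```python
# Sketch: treat each bump sensor as a "button" that kicks off a skill
# action. Sensor names and the event key are illustrative assumptions.

def start_face_training():
    return "start_face_training"

def fetch_weather():
    return "fetch_weather"

BUMP_ACTIONS = {
    "Bump_FrontLeft": start_face_training,
    "Bump_FrontRight": fetch_weather,
}

def handle_bump(bump_event):
    """Run the action bound to the pressed bump sensor, if any."""
    action = BUMP_ACTIONS.get(bump_event.get("sensorName"))
    return action() if action else None
```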

Microphones

Misty uses an array of three far-field microphones to record the sounds she hears. She processes this audio on her headboard to provide a stream of data you can use in your skills, like the volume and relative position of any sounds or voices Misty hears.


When you register for audio localization events, Misty feeds this information into your skill code, and you can use it to program unique reactions. You can have Misty drive toward friendly-sounding voices, or drive quickly away from not-so-friendly ones. Or maybe you’d prefer Misty to be curious, in which case you can program her to investigate sounds and take pictures of what she finds – it’s entirely up to you!
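A sketch of the “drive toward or away” decision: given an angle of arrival, compute the signed turn toward the sound, or rotate the result half a circle to flee. The field name “degreeOfArrivalSpeech” and the convention (0 = straight ahead, degrees increasing counterclockwise) are assumptions for illustration:

```python
# Sketch: turn toward (or away from) a sound using an audio localization
# reading. The event key and angle convention are illustrative assumptions.

def turn_command(audio_event, flee=False):
    """Signed degrees to turn: toward the sound, or away if flee=True."""
    angle = audio_event.get("degreeOfArrivalSpeech", 0.0)
    # Wrap to [-180, 180) so Misty always takes the short way around.
    turn = (angle + 180.0) % 360.0 - 180.0
    if flee:
        turn = turn - 180.0 if turn > 0 else turn + 180.0
    return turn
```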

Connect your own!

When Misty’s built-in sensors aren’t enough, you can use the UART serial port on her backpack to connect your own third-party sensors. If it can be hooked up to a microcontroller, it can be configured to communicate with your skill code.
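On the skill side, that usually means parsing whatever line format you chose on the microcontroller. The simple “name:value” convention below is an assumed protocol you would define yourself, not a Misty standard:

```python
# Sketch: parse a reading sent by a microcontroller over the backpack's
# serial port. The "name:value" line format is an assumed convention
# defined on the microcontroller side, not a Misty standard.

def parse_reading(line):
    """Parse e.g. 'temp:23.5' into ('temp', 23.5); return None if malformed."""
    name, sep, value = line.strip().partition(":")
    if not sep or not name:
        return None
    try:
        return (name, float(value))
    except ValueError:
        return None
```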

Need a light sensor, so you can code Misty to dim your Hue bulbs after dark? Check. What about a lightning detector, so Misty can remind you not to leave your umbrella at home on stormy days? Check. Or maybe a thermometer, so Misty can roam a workroom and let you know when the temperature is too high? Check. (Seriously – check it out.)


Any one of these sensor types provides ample room for exploration and experimentation. Bring in the others, and possibilities abound.

Have other questions about Misty’s sensors? Don’t be shy – join the Misty Community and ask!
