Enable Advanced AI on Your Robots and Drones

Our open-source, low-latency streaming sensor solution enables compute-intensive AI on robots and drones.

Wireless pose camera

Packed with sensors: 

  • RGB camera (synchronize multiple cameras over Ethernet for stereo or more).
  • UWB for 3D positioning.
  • IMU and magnetometer for orientation.
  • 3D ToF camera for depth images (optional).

Want to run advanced AI such as large vision transformers or even large language models (LLMs) on your robot or drone, but you're limited by edge compute?

Our low-latency streaming sensor lets you capture data quickly and send it off for remote AI processing.
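As a minimal sketch of that workflow (the frame placeholder and the inference endpoint below are hypothetical, not part of our API), a captured RGB frame can be JPEG-encoded and posted to a GPU server for remote inference:

import cv2
import numpy as np
import requests

# Hypothetical example: 'frame' stands in for an RGB image received from the
# streaming sensor, and the URL is a placeholder for your own inference server.
frame = np.zeros((480, 640, 3), dtype=np.uint8)

ok, jpeg = cv2.imencode('.jpg', frame)  # compress before sending over the network
response = requests.post('http://gpu-server:8000/infer',
                         data=jpeg.tobytes(),
                         headers={'Content-Type': 'image/jpeg'})
print(response.json())  # e.g. detections from a large vision transformer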


Pose sensor

Never lose track of your robot or drone. Get absolute position (x, y, z) and orientation (quaternion) in real time. The pose sensor is based on UWB and an IMU/magnetometer.

The wireless pose camera comes already equipped with the pose sensor for sensor fusion applications!
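For illustration (the record layout below is an assumption, not the documented format), a streamed pose can be unpacked into a position vector and a rotation matrix for fusion with your other sensors:

from scipy.spatial.transform import Rotation

# Assumed pose record: absolute position in metres and orientation as a
# quaternion in (x, y, z, w) order.
pose = {'position': [1.20, 0.35, 0.80], 'quaternion': [0.0, 0.0, 0.0, 1.0]}

x, y, z = pose['position']
rotation = Rotation.from_quat(pose['quaternion'])  # SciPy expects (x, y, z, w)
print('position:', (x, y, z))
print('rotation matrix:')
print(rotation.as_matrix())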

Build it yourself

Our tech is fully Open Source. Everything you need to build or modify the wireless pose camera or pose sensor can be found on our GitHub page.

Test it out before leveling up on additional sensors. With only a Raspberry Pi and a Raspberry Pi camera, you can run our code and get RGB streamed wirelessly into Python.

from mayfly.sensorcapture import SensorCapture

def show_data():
    # Open the stream described in config.json and read sensor data as it arrives
    cap = SensorCapture('config.json')
    while True:
        sensor_data = cap.read()
        print(sensor_data)

if __name__ == '__main__':
    show_data()

A few lines of Python

Get all your sensor data streamed wirelessly, with low latency, into Python, ready to feed into PyTorch, JAX, or TensorFlow.
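For example (assuming the RGB frame arrives as an H x W x 3 uint8 array, which is an assumption about the streamed layout), a frame can be turned into a batch tensor for any PyTorch vision model:

import numpy as np
import torch

# Assumed layout: one RGB frame as an H x W x 3 uint8 array from the stream.
frame = np.zeros((480, 640, 3), dtype=np.uint8)

# Convert to a normalised N x C x H x W float tensor, ready for a vision model.
tensor = torch.from_numpy(frame).permute(2, 0, 1).float().unsqueeze(0) / 255.0
print(tensor.shape)  # torch.Size([1, 3, 480, 640])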

Enhance your SLAM solution with UWB

– Enhance your existing SLAM solution with low-compute, low-power, instantaneous absolute positioning from our pose sensor (see the sketch after this list).

– Eliminate relocalization issues in your SLAM solution. Always know exactly where your robot is.

– Fuse mono, stereo, or depth sensors with UWB positioning.
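
As a rough sketch of the idea (a simple complementary-style correction, not our reference implementation), an absolute UWB fix can be blended with a drifting SLAM position estimate:

import numpy as np

def fuse_position(slam_xyz, uwb_xyz, alpha=0.05):
    # Pull the drifting SLAM estimate toward the absolute UWB fix.
    # alpha trades smoothness against how fast accumulated drift is corrected.
    slam_xyz = np.asarray(slam_xyz, dtype=float)
    uwb_xyz = np.asarray(uwb_xyz, dtype=float)
    return (1.0 - alpha) * slam_xyz + alpha * uwb_xyz

# Example: SLAM has drifted about 0.5 m along x; repeated updates remove it.
print(fuse_position([2.5, 1.0, 0.0], [2.0, 1.0, 0.0]))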



Stream sensor data to run advanced AI remotely. See our demos and tutorials!