Partnership Announcement

Autonomy in highly dynamic off-road environments, without network access

Augment vision and perception in highly dynamic environments.

Semantic labelling
Locate obstructions (holes, floods) or assets (rubble, plant, or equipment) on a map.
Simply annotate a map to inform machines of no-go or obstructed areas (flooding, soft ground, workers on site).
Dynamic routing
If routes are obstructed, dynamically choose another route.
Network free navigation
Store large maps onboard in low memory and navigate edge only without connectivity.
Object avoidance
Situational awareness of small static or dynamic objects in the near field, e.g. poles and wires.
Proximity detection
Augment and enhance current sensor modalities, e.g. RF, with vision.
Navigate in dirty conditions
Reduce or eliminate LiDAR and ultrasonic sensors for near-field sensing, replacing them with lowest-cost 2D CMOS camera vision.
Map maintenance
Navigate through significant scene change and dynamically maintain maps as the environment or route changes, e.g. new building work.
Small spaces
Navigate or inspect small and/or featureless spaces.

Coming next

Autonomous job prioritization. Enabling a machine to independently and automatically prioritize its most important tasks and dynamically re-route in real time.


Try the Opteran Mind

We provide several ways to test our general-purpose autonomy.

Simulation

Our algorithms, integrated with Unreal Engine, allow you to test the Opteran Mind in simulations that represent your business needs.

Reference hardware

The Opteran Development Kit (ODK), with integrated software, hardware, and pre-calibrated cameras, allows you to integrate Opteran algorithms directly into your machine. Simply plug in via USB or Ethernet to access the PX4 over MAVLink or ROS APIs and get going.
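As a concrete illustration of the MAVLink route, connecting to the autopilot might look like the sketch below. This is a minimal example assuming the open-source pymavlink library; the host, port, and helper names are our own illustrative choices, not documented Opteran defaults.

```python
# Hypothetical sketch: reaching the ODK's PX4 interface over MAVLink.
# The UDP address and port are placeholders, not Opteran-documented values.

def connection_url(host: str, port: int = 14550) -> str:
    """Build a UDP connection string accepted by mavutil.mavlink_connection()."""
    return f"udpin:{host}:{port}"

def wait_for_heartbeat(url: str):
    """Open the MAVLink link and block until the first HEARTBEAT arrives,
    confirming the autopilot is reachable."""
    from pymavlink import mavutil  # deferred import: requires pymavlink installed
    link = mavutil.mavlink_connection(url)
    link.wait_heartbeat()
    return link

# Usage (requires a live autopilot on the link):
#   link = wait_for_heartbeat(connection_url("0.0.0.0"))
```

Once the heartbeat is received, standard MAVLink telemetry and command messages can be exchanged with the vehicle; the same interface is what ground-control tools such as QGroundControl speak to PX4.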

Reference machines

To simplify this further, we have pre-integrated the ODK into a drone and a ground-based robot, allowing you to test the Opteran Mind quickly and simply.