Partnership Announcement

Opteran Mind

Opteran turns biological brain algorithms into the building blocks for machine autonomy, interfacing them with industry standards.

The software is deployed edge-only on low-cost silicon: stabilising vision and extracting motion to perceive in real time, building maps by simply recognising spaces as machines pass through them, and weighing up information to make decisions.

Machines responding adaptively, as in nature.


How it works

Our patented algorithms provide innate, edge-only autonomy for machines as a set of integrated software modules:

Opteran Vision

Passive omnidirectional vision (360° x 180°, the full 4π steradians), algorithmically stabilised at the point of capture, using low-cost 2D CMOS cameras at high frame rates (150 fps+) with no buffering.
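As a sanity check on the quoted field of view: 360° of azimuth by 180° of elevation does indeed cover the entire 4π-steradian sphere. A short, illustrative calculation (not Opteran code):

```python
import math

# Solid angle of a field of view spanning the full azimuth (2*pi rad)
# and the full polar range (0 to pi rad):
#   omega = azimuth_span * (cos(theta_min) - cos(theta_max))
azimuth = 2 * math.pi
omega = azimuth * (math.cos(0.0) - math.cos(math.pi))

print(omega / math.pi)  # 4.0, i.e. the full 4*pi steradians
```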

Opteran Perception

Detects structure and motion in variable lighting, poor weather and visual aliasing, without expensive active sensors or fragile feature extraction.

Opteran Action

Vision-only collision prediction, avoidance and motion planning without the complexity and overhead of huge 3D point clouds.

In the lab (coming soon)

Opteran Spatial Engine

Ultra-low memory (only 1 kB/m²), enabling edge-only dynamic learning and navigation indoors and outdoors with no data capture, labelling or training required.
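At the quoted ~1 kB/m², map memory scales linearly with the area covered. A back-of-the-envelope sketch (the warehouse size is an illustrative assumption, and actual usage may vary with environment complexity):

```python
def map_memory_mb(area_m2: float, kb_per_m2: float = 1.0) -> float:
    """Estimate map storage from mapped floor area, assuming the
    quoted ~1 kB per square metre figure."""
    return area_m2 * kb_per_m2 / 1024

# A 10,000 m^2 warehouse floor would need roughly 10 MB of map memory.
print(map_memory_mb(10_000))
```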

Opteran Decision Engine

Natural brain decision making that selects actions and switches tasks without condition statements or reinforcement learning.

In the lab (coming soon)

Advantages of Natural Intelligence algorithms

General purpose (air and ground), edge-only, no network, no data, no training, secure and private, simple integration, standards-based (ROS1/ROS2, MAVLink/PX4 or ArduPilot), ultra-low SWaP$ (size, weight, power and cost).

Using low-cost, low-bandwidth colour vision:

              OPTERAN      LIDAR                RADAR           CAMERA               EVENT CAMERA
Colour        at sensor    not supported        not supported   at sensor            not supported
Distance      at sensor    at sensor            at sensor       central processing   not supported
Motion        at sensor    at sensor            at sensor       central processing   central processing
Localisation  at sensor    central processing   not supported   central processing   central processing
Bandwidth     low          high                 medium          high                 high
Cost          low          high                 medium          medium               high

Deploying the Opteran Software for ultra-low SWaP$

Our software algorithms, like insect brains, are ultra-efficient.

Deployable on CPU, FPGA or ASIC, with the SWaP$ constraint set only by the hardware itself: on Xilinx FPGAs, for example, the software runs in just ~2 W.

Our reference PCB with FPGA, used for testing and validating our software, weighs only 10 g.
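At ~2 W for the autonomy stack, the battery impact is easy to estimate. A hypothetical budget (the 50 Wh pack is an illustrative assumption; motors, sensors and radios are ignored):

```python
def compute_runtime_hours(battery_wh: float, power_w: float = 2.0) -> float:
    """Hours the ~2 W autonomy workload alone could run on a battery.
    Ignores all other loads (motors, sensors, radios)."""
    return battery_wh / power_w

# An illustrative 50 Wh drone battery could power the stack for 25 h.
print(compute_runtime_hours(50))
```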

Try the Opteran Mind

We provide several ways to test our general-purpose autonomy.

01. Simulation

Our algorithms, integrated with Unreal Engine, let you test the Opteran Mind in simulations that represent your business needs.

02. Reference hardware

The Opteran Development Kit (ODK), with integrated software, hardware and pre-calibrated cameras, allows you to integrate Opteran algorithms directly into your machine. Simply plug in via USB or Ethernet to access PX4 over MAVLink, or the ROS APIs, and get going.

03. Reference machines

To simplify testing further, we have pre-integrated the ODK into a drone and a ground-based robot so you can try the Opteran Mind quickly and simply.