
Autonodyne: The sum of the parts

Autonodyne is a Boston-based AI software company specializing in the control of highly automated and uncrewed vehicles. We got our start in aviation but have since branched out into other domains spanning air, sea, and land, and we now offer real products and software.

We are building upon our extensive existing set of functionality to help foster the age of true autonomy and artificial intelligence. Our technology and architectures apply across a wide range of use cases and scenarios. We like to think we are providing “additive autonomy” and sophisticated “Reasoning-On-The-Edge” (ROTE™) capabilities to enable Uncrewed Vehicle (UV) products and services.

Below is an overview of how Autonodyne works. For a graphic version of this overview, please download the Autonodyne Overview.

Use one or many vehicles

Most users of Autonodyne’s autonomous flight technologies seek to control one vehicle for a specific task. Our portfolio of work includes control capabilities for a broad spectrum of vehicles. For many of these vehicle types we have developed autonomous behaviors that permit operating groups of them (swarms or hives) in collective formations and maneuvers.

Operators sometimes want to pilot different types of vehicles simultaneously during a mission, or need the vehicles to interact as a team. Allowing dissimilar vehicles to work together is a cornerstone of our autonomous flight technologies.

Small Uncrewed Aerial Systems (sUAS)
Cargo and Delivery Uncrewed Vehicles (UV)
Advanced Aerial Mobility (AAM) Vehicles
Simplified Vehicle Operations (SVO)
Optionally Piloted Vehicles (OPV)
Fixed-wing Uncrewed Aerial Vehicle (UAV)
Uncrewed Combat Aerial Vehicles (UCAV)
Uncrewed Underwater Vehicles (UUV)
Uncrewed Surface Vehicles (USV)
Uncrewed Ground Vehicles (UGV)
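One way to picture how dissimilar vehicles can be teamed under a single control layer is a common vehicle interface that each vehicle type implements in its own way. The sketch below is purely illustrative; every class and method name in it is hypothetical and is not Autonodyne’s actual API.

```python
from abc import ABC, abstractmethod

class Vehicle(ABC):
    """Hypothetical minimal interface shared by all vehicle types."""

    def __init__(self, vehicle_id: str):
        self.vehicle_id = vehicle_id

    @abstractmethod
    def goto(self, lat: float, lon: float, alt_m: float) -> str:
        """Command the vehicle toward a rally point; returns a status string."""

class QuadrotorUAV(Vehicle):
    def goto(self, lat, lon, alt_m):
        return f"{self.vehicle_id}: flying to ({lat}, {lon}) at {alt_m} m"

class SurfaceVessel(Vehicle):
    def goto(self, lat, lon, alt_m):
        # A surface craft simply ignores the altitude component.
        return f"{self.vehicle_id}: sailing to ({lat}, {lon})"

def command_team(team: list[Vehicle], lat: float, lon: float, alt_m: float) -> list[str]:
    """Send the same rally-point command to a mixed team of vehicles."""
    return [v.goto(lat, lon, alt_m) for v in team]

# A heterogeneous team (one UAV, one USV) accepts one shared command.
team = [QuadrotorUAV("uav-1"), SurfaceVessel("usv-1")]
print(command_team(team, 42.36, -71.06, 120.0))
```

Because the supervisory software talks only to the shared interface, adding a new vehicle type does not change how the team is commanded.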

Select the right supervisory media

Your selection of supervisory media depends on your needs: perhaps you need to control a vehicle or two from your cellphone, or you want preview capabilities for multiple participants across a variety of media. Autonodyne’s software engines run on a wide selection of display hardware, whether it is your hardware or ours.


Platform          Supported operating systems
Mobile            Android
Tablet            Android, Windows, Linux
Desktop           Windows, Linux
Mission computer  Windows, Linux, RTOS

Connect to almost anything, communicate from almost anywhere

Communication and data transfer between uncrewed vehicles and station operators are key to controlling the UVs and processing the data they provide. Autonodyne’s software engines are link-agnostic, minimizing transmission and communication problems. We use complex algorithms that allow our software to send information to any combination of UVs using different data links. The software enables Line-Of-Sight (LOS), Beyond-Line-Of-Sight (BLOS), and technical data communication. When we do not have communication connectivity with a vehicle, our on-board processing uses our “Reasoning-On-The-Edge” (ROTE™) capability, which can run the mission and will communicate back to the human supervisor to provide situational awareness when/if connectivity is reestablished.
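The link-agnostic idea can be pictured as dispatching each command over whichever datalink is currently connected, and falling back to the vehicle’s on-board autonomy when no link is available. The sketch below is a simplified illustration under those assumptions; the link names and data structures are hypothetical, not Autonodyne’s implementation.

```python
# Illustrative sketch of link-agnostic command dispatch with an
# edge-autonomy fallback. All names here are hypothetical.

def send_command(links, command):
    """Try each available datalink in priority order (e.g. LOS before
    BLOS); return the name of the link that accepted the command, or
    None if no link is currently connected."""
    for link in links:
        if link["connected"]:
            link["outbox"].append(command)
            return link["name"]
    return None  # no connectivity: the vehicle must reason on the edge

links = [
    {"name": "LOS radio", "connected": False, "outbox": []},
    {"name": "BLOS satcom", "connected": True, "outbox": []},
]

used = send_command(links, {"type": "goto", "lat": 42.36, "lon": -71.06})
if used is None:
    # Edge-autonomy fallback: the vehicle continues its stored mission
    # and reports situational awareness once a link is reestablished.
    pass
print(used)
```

Here the LOS radio is down, so the command is routed over the satcom link instead; the operator never has to pick a link by hand.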

Autonodyne already supports a myriad of datalink options and message set standards to provide as much flexibility as feasible and adaptability to existing vehicle configurations.

Many control input choices

Sometimes it makes sense to use alternative control input methods. Our systems are designed to support a variety of human-machine interface devices, ranging from the traditional mouse and keyboard to commonplace game controllers, and now mixed- and augmented-reality devices along with voice and gesture control. Please ask our sales team to help you with your selection: sales@autonodyne.ai.

Connection options: Wi-Fi, Bluetooth™, wired, and others.

Keyboard/mouse
Mixed reality (VR goggles & paddle controller)
Augmented reality
Voice command
Hand-held controllers
Wearable controllers
Gesture control

Operate from sea to stratosphere

With our software engine you can control an autonomous vehicle in multiple dimensions. While the most typical applications for autonomous flight involve control of UVs at low to medium altitudes, space is the limit. Back on the planet’s surface, you can control land rovers and other land craft. And once our current development efforts conclude, you will be able to head out to sea and control vehicles that operate on the surface, or below it.

[Diagram: operating domains, from space through high, medium, and low altitude to land, surface water*, and underwater*]

Many behaviors to increase capabilities

Autonodyne’s growing library of navigation control behaviors permits your vehicle or team of vehicles to perform a variety of mission-specific maneuvers. The behaviors are offered à la carte or in task-specific packages.

Once a vehicle is commanded with a set of behaviors, the humans in the human-machine team can make better-informed decisions, expand their reach and access, and improve safety and productivity, while the vehicles focus on what they do best. For a complete inventory of our behaviors, see Autonomy Behaviors.
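Commanding a vehicle with a set of behaviors can be pictured as queueing named maneuvers, each with its own parameters, that the vehicle then works through on its own. The sketch below is a hypothetical illustration of that pattern; the behavior names and the executor function are invented for the example, not drawn from Autonodyne’s behavior library.

```python
# Hypothetical sketch of tasking a vehicle with an ordered queue of
# named behaviors; all names here are illustrative only.

def run_behaviors(vehicle_id, behaviors):
    """Execute a queue of (name, params) behaviors in order and
    return a log of what the vehicle was tasked to do."""
    log = []
    for name, params in behaviors:
        log.append(f"{vehicle_id}: {name} {params}")
    return log

# An example mission assembled from individual behaviors.
mission = [
    ("takeoff", {"alt_m": 50}),
    ("orbit_point", {"lat": 42.36, "lon": -71.06, "radius_m": 200}),
    ("return_to_base", {}),
]
print(run_behaviors("uav-1", mission))
```

Because each behavior is self-contained, the same building blocks can be recombined into different mission packages without changing the executor.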