
Built to bring control to autonomy

Our core development areas are software-focused. Our roboticists spend most of their time on Smart Automation, multi-ship operations, and Manned-Unmanned Teaming (MUM-T), with substantial work in computer vision for precision landing, object recognition, and obstacle avoidance. The majority of our core capability is algorithm-driven, and our AI experts are constantly building new advanced autonomy behaviors and improving autonomous path planning. These efforts include the use of machine learning and sensor output to enrich our understanding of the surrounding environment, thereby continually improving the algorithms employed. Visualization of vehicle state, navigation plans, and mission planning and execution is a large part of what we do; we use it in our control stations, in Advanced Aerial Mobility, and in our augmented/virtual/mixed reality work.

Manned-Unmanned Teaming with high-performance UAS

We have been providing Human-Machine Interface software and autonomy control since 2016. In these images we are working with Kratos Unmanned Aerial Systems to control their UTAP-22 Mako air vehicles from mobile tablet controls in manned fighter and command-and-control aircraft. In one image we are simultaneously controlling two actual air vehicles and two simulated ones. Our software was hosted on the ruggedized tablet in the image below and remained connected to the airframes from the launch rail, through flight, to recovery back on the ground.

Autonodyne Archives

Smart Automation and Simplified Vehicle Operation

In aviation, Smart Automation means automating certain elements of flight preparation and flight control. Specifically, it targets context-sensitive tasks to reduce both the workload peaks and the overall workload of humans. We view this as a means to accomplish Simplified Vehicle Operations. Some examples are described below.

Autonodyne treats Smart Automation as an initial "backstop" that mitigates the risk of the human as a single point of failure in decision making. In the long run, with advances in AI technology, predominantly human-run operations will give way to ultra-reliable automation. Ultimately, this gradual evolution will enable full autonomy.

Flight system and flight mode complexity keeps increasing, as does the variation between aircraft and software versions in fielded legacy systems. These trends alone highlight the need for a simpler approach.

Autonodyne believes Smart Automation starts with flight-critical but deterministic tasks (e.g., nominal checklist usage, system monitoring, etc.). As a pilot's experience and confidence in Smart Automation increase, the automation pendulum will swing to also include non-deterministic tasks (flight/mission planning, contingency planning, decision support, self-preservation, reaction to imminent threats) in the transition from human-run operations to ultra-reliable automation.
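As an illustration only (not Autonodyne's implementation), the sketch below shows what a deterministic Smart Automation task such as automated checklist usage and system monitoring might look like; the item names and thresholds are hypothetical.

```python
# Minimal sketch of deterministic Smart Automation: a checklist runner that
# verifies system states before flight. Item names and thresholds are
# hypothetical illustrations, not Autonodyne parameters.
from dataclasses import dataclass
from typing import Callable, Dict, List


@dataclass
class ChecklistItem:
    name: str
    check: Callable[[Dict[str, float]], bool]  # returns True if the item passes


def run_checklist(items: List[ChecklistItem], telemetry: Dict[str, float]) -> List[str]:
    """Run every item against current telemetry and return the failures."""
    return [item.name for item in items if not item.check(telemetry)]


# Example: a hypothetical pre-takeoff checklist driven by telemetry values.
pre_takeoff = [
    ChecklistItem("battery above 90%", lambda t: t["battery_pct"] >= 90.0),
    ChecklistItem("GPS fix acquired", lambda t: t["gps_sats"] >= 6),
    ChecklistItem("control link healthy", lambda t: t["link_rssi_dbm"] > -85.0),
]

telemetry = {"battery_pct": 95.0, "gps_sats": 11, "link_rssi_dbm": -62.0}
failures = run_checklist(pre_takeoff, telemetry)
print("GO" if not failures else f"NO-GO: {failures}")
```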

We are currently operating an Optionally Piloted Vehicle (experimental Cessna 182) where we experiment with smart automation, simplified vehicle operations, and advanced forms of pilot assistance. We have also modified a Cirrus SR-22 to serve as an OPV and flying testbed for these technologies.

Intelligent directions

Autonodyne’s path planning algorithm, which directs vehicle control behavior, demonstrates the power of “smart automation” and serves as a logical transition into real autonomy and, eventually, full AI. In this scenario, a package is delivered to its destination via the shortest, quickest route. Other scenarios use the most energy-efficient routing, and still others use least-probability-of-detection routings (e.g., “good neighbor” paths in civil use cases and threat avoidance in defense use cases).

1. Static obstacles such as homes, towers, hills, and mountains are mapped in 2D/3D in internal memory based on GPS and sensor data. Algorithms eliminate risk by setting “no-fly” perimeters.

2. Dynamic obstacles such as weather, airspace restrictions, and airborne traffic are also mapped in 2D/3D.

3. As the drone moves forward, multiple paths around the identified obstacles are evaluated in seconds until an optimal path is determined.

4. The drone delivers its package.
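As a rough illustration of cost-driven routing (not Autonodyne's actual algorithm), the sketch below runs the same grid search with two different edge-cost functions: one for the quickest route and one that penalizes cells near obstacles to approximate a lower-profile, "good neighbor" path. The map, costs, and helper names are hypothetical.

```python
# Minimal sketch: the same planner yields different routes depending on the
# cost function supplied (quickest vs. lower probability of detection).
import heapq
from typing import Callable, Dict, List, Optional, Set, Tuple

Cell = Tuple[int, int]


def plan(start: Cell, goal: Cell, blocked: Set[Cell], size: int,
         step_cost: Callable[[Cell], float]) -> Optional[List[Cell]]:
    """Dijkstra search on a 4-connected grid; blocked cells are no-fly."""
    frontier = [(0.0, start)]
    came_from: Dict[Cell, Optional[Cell]] = {start: None}
    cost_so_far: Dict[Cell, float] = {start: 0.0}
    while frontier:
        _, current = heapq.heappop(frontier)
        if current == goal:
            break
        x, y = current
        for nxt in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
            if not (0 <= nxt[0] < size and 0 <= nxt[1] < size) or nxt in blocked:
                continue
            new_cost = cost_so_far[current] + step_cost(nxt)
            if nxt not in cost_so_far or new_cost < cost_so_far[nxt]:
                cost_so_far[nxt] = new_cost
                came_from[nxt] = current
                heapq.heappush(frontier, (new_cost, nxt))
    if goal not in came_from:
        return None  # no safe route exists
    path, node = [], goal
    while node is not None:
        path.append(node)
        node = came_from[node]
    return path[::-1]


# Hypothetical 10x10 map with a wall of no-fly cells and two routing policies.
blocked = {(5, y) for y in range(0, 8)}
quickest = plan((0, 0), (9, 9), blocked, 10, step_cost=lambda c: 1.0)


def near_obstacle(c: Cell) -> bool:
    return any((c[0] + dx, c[1] + dy) in blocked
               for dx in (-1, 0, 1) for dy in (-1, 0, 1))


low_profile = plan((0, 0), (9, 9), blocked, 10,
                   step_cost=lambda c: 5.0 if near_obstacle(c) else 1.0)
print(len(quickest), len(low_profile))
```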

SOURCE: Autonodyne archives

RCU-1000 running on a Panasonic FZ-N1 Android EUD connected to a Teal Golden Eagle sUAS

Control stations

Autonodyne designs and builds control station software (e.g., the "RCU-1000") that serves as a supervisory tool for uncrewed or non-traditionally operated/piloted vehicles. One variation is the “Common Control Station,” where we use “software wrappers” that know how to communicate with a vehicle using its existing infrastructure, hardware, and communication protocols. In these cases, we don’t add any special hardware or software to the UV. Whether using a physical Autonodyne RCU-1000 Control Station or downloading our software onto your own hardware, either approach enables a single operator to control all of the vehicles in a network at the same time.
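The "software wrapper" idea can be pictured as a thin adapter layer: a common command interface with one adapter per vehicle type that translates commands into whatever protocol that vehicle already speaks. The class and method names in this sketch are illustrative assumptions, not the RCU-1000 API.

```python
# Minimal sketch of a per-vehicle "software wrapper" behind a common interface.
from abc import ABC, abstractmethod


class VehicleAdapter(ABC):
    """Translates common commands into a vehicle's native protocol."""

    @abstractmethod
    def send_goto(self, lat: float, lon: float, alt_m: float) -> None: ...

    @abstractmethod
    def telemetry(self) -> dict: ...


class SimulatedQuadAdapter(VehicleAdapter):
    """Stand-in adapter for a simulated quadcopter (no real radio link)."""

    def __init__(self, vehicle_id: str) -> None:
        self.vehicle_id = vehicle_id
        self._target = None

    def send_goto(self, lat: float, lon: float, alt_m: float) -> None:
        self._target = (lat, lon, alt_m)                  # a real adapter would
        print(f"{self.vehicle_id}: goto {self._target}")  # encode and transmit here

    def telemetry(self) -> dict:
        return {"id": self.vehicle_id, "target": self._target}


# One operator can drive every adapter in the network through the same calls.
fleet = [SimulatedQuadAdapter("quad-1"), SimulatedQuadAdapter("quad-2")]
for uv in fleet:
    uv.send_goto(42.36, -71.06, 120.0)
```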

Autonodyne has considerable expertise in Natural User Interface (NUI) design. This technical capability enables us to provide any control station operator with an industry-leading user interface that is graphically elegant, intuitive, and user-friendly. To the extent the human operator needs to interact with the vehicle or vehicles, this interface acts as a natural extension of the operator.

Our control stations can run on a host of different hardware platforms (mobile phones, tablets, PCs, laptops, etc.) and operating systems (Android, Microsoft Windows, and Linux), and they are link-agnostic. Supported input devices include a traditional keyboard/mouse, commercial gaming controllers (e.g., Xbox), voice/gesture commands, augmented reality devices (e.g., HoloLens), and others. On touch-screen devices you can use the multi-touch gestures you have grown accustomed to, including pinch and zoom. See Product Overview for a complete listing.

SOURCE: Autonodyne archives

A mixture of portable control stations running Autonodyne software.

SOURCE: Autonodyne archives

One of the unmanned Kratos UTAP-22 Mako air vehicles flying as an unmanned wingman with this AV-8B Harrier, all while under our tablet control.

Manned-Unmanned Teaming (MUM-T)

The capabilities provided by MUM-T technologies offer a new level of interoperability (LOI) between ground forces, manned aircraft, and Uncrewed Vehicles (UVs).

The ability of both ground forces and manned aircraft to share UV products significantly increases situational awareness and improves the quality of decision-making. For example, live and still images acquired from the sensor payloads of UVs can be shared across the network. In the air domain, groups of low-cost, attritable uncrewed “loyal wingmen” act in unison with manned aircraft to serve as force multipliers by adding mass and quantity.

SOURCE: Autonodyne archives

Manned-Unmanned Teaming – a pair of Kratos UTAP-22 unmanned wingmen to an AV-8 Harrier.

Autonodyne has been building and flight-testing systems that enable Manned-Unmanned Teaming since the company’s inception. We have built the on-board Mission Computer hardware and software and the off-board Control Station software for multiple MUM-T programs. These products have been flight tested to TRL-7 (a system prototype demonstration in an operational environment).

Our designs use a task-based approach (e.g., “Follow Him”, “Loiter”, “Fly Over That”, “Surveil”, “Stage”, “Stack”, “RTB”, etc.) to provide a wide range of functionality. This enables the human supervisor, the manned part of the manned-unmanned team, to significantly reduce the cognitive bandwidth needed to control or direct the unmanned team members.

This is equally applicable across multiple domains (air, sea, and land).
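A minimal sketch of the task-based idea, assuming hypothetical handler functions (this is not Autonodyne's task vocabulary implementation): the supervisor issues a short task name and the software expands it into vehicle behavior.

```python
# Minimal sketch of task-based command dispatch; task names follow the
# examples in the text, while the handlers and their outputs are hypothetical.
from typing import Callable, Dict


def loiter(uv: str, **kw) -> str:
    return f"{uv}: orbiting at {kw.get('radius_m', 500)} m radius"


def follow(uv: str, **kw) -> str:
    return f"{uv}: following {kw['target']} at {kw.get('offset_m', 1000)} m"


def rtb(uv: str, **kw) -> str:
    return f"{uv}: returning to base"


TASKS: Dict[str, Callable[..., str]] = {"Loiter": loiter, "Follow Him": follow, "RTB": rtb}


def command(uv: str, task: str, **kw) -> str:
    """Dispatch one high-level task to its behavior handler."""
    return TASKS[task](uv, **kw)


print(command("wingman-1", "Follow Him", target="lead"))
print(command("wingman-2", "Loiter", radius_m=800))
```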

Team Power

In this Manned-Unmanned Teaming (MUM-T) scenario, a pilot involved in a search and rescue operation relays to a ground operator potential sites to search for a lost hiker. The operator immediately sends various UVs to observe those areas.

Participants: Unmanned Vehicles (drones), a human operating a plane equipped with a mission computer, and a human with a mobile control station.

1. The operator scans video being collected by the UVs in the network.

2. A swarm of drones spots the hiker. The pilot confirms the hiker from a video feed and a rescue operation begins.

Video gathered by the drones in the network is shared with the search and rescue team.

SOURCE: Autonodyne archives

Multi-ship collaborative operations from the same Autonodyne Common Control Station.

Multi-ship collaborative autonomous operations

Autonodyne’s software has been specifically designed to support multi-ship collaborative autonomous operations. This approach permits dissimilar UVs to communicate with each other to perform swarm-like behaviors. While multiple architectures are possible to enable these capabilities, to date, we’ve implemented an IP-network-based approach so that every mission computer and control station is an entity in the network.

On the swarm behavior front, example functionality supported so far includes multiple forms of x-y-z offset station keeping, such as flying in formation with other aircraft or maintaining a position or pattern referenced to a fixed-position asset. Another example we have implemented is a marsupial behavior, where an unmanned air vehicle rides on the back of an unmanned ground rover and, when the rover can go no further, automatically launches from its back to continue the mission. (See Autonomous Behaviors.)
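As a simplified illustration of x-y-z offset station keeping (geometry only, not Autonodyne's implementation), the sketch below computes a follower's target position from the leader's position, heading, and a desired offset expressed in the leader's body frame; a real system would feed this target to the vehicle's guidance loop.

```python
# Minimal station-keeping sketch: rotate a body-frame offset (right/back/up)
# into the north-east-altitude frame and add it to the leader's position.
import math
from typing import Tuple


def station_keep_target(leader_ne_alt: Tuple[float, float, float],
                        leader_heading_deg: float,
                        offset_right_m: float,
                        offset_back_m: float,
                        offset_up_m: float) -> Tuple[float, float, float]:
    """Return the follower's target (north, east, altitude) in meters."""
    n, e, alt = leader_ne_alt
    h = math.radians(leader_heading_deg)  # heading measured clockwise from north
    d_north = -offset_back_m * math.cos(h) - offset_right_m * math.sin(h)
    d_east = -offset_back_m * math.sin(h) + offset_right_m * math.cos(h)
    return (n + d_north, e + d_east, alt + offset_up_m)


# Hypothetical example: hold 50 m right, 100 m back, 10 m above the leader,
# with the leader heading due east at 500 m altitude.
print(station_keep_target((0.0, 0.0, 500.0), 90.0, 50.0, 100.0, 10.0))
```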

We are also moving to mesh network architectures that support n>>1 or swarming behaviors. To date, we routinely operate 10 dissimilar UVs simultaneously under the watchful eye of a single human operator using a mobile device to interact with the swarm when necessary or desired. The collaborative formation works as a distributed collective to find targets, reassign formation roles — if required — and share various forms of sensor-acquired data.

Working in groups, with other teams

As part of a humanitarian mission, a control operator has dispatched a swarm of autonomous quadcopters to deliver needed first aid supplies. The swarm will travel through a war-torn area in a mountainous region, so the operator has programmed a set of behaviors that allow a single drone, or the group, to follow safe routing. Other capabilities at the operator’s disposal:

(Some of these are capabilities Autonodyne supports today; others are under development.)

- The operator can share ground condition information with cargo planes.
- Other drones, or swarms, can be activated to support the mission.
- Multiple fixed-wing drones can be called to survey the damage.
- Autonomous subs can be sent to survey damage to a delivery port.
- A convoy can deliver supplies to remote areas.
- Autonomous delivery boats can move supplies along waterways.

SOURCE: Autonodyne archives

Our computer vision detection and classification algorithm running on Times Square video feed.

Artificial Intelligence

Autonodyne is stitching together artificial intelligence capability with the goal that it can one day serve as a trusted agent for humans. When we get there, this will relieve many of the cognitive and physical burdens on humans and radically expand the art of the possible. This is best understood through real-world use cases.

This is more than just a force multiplier – it has the power and potential to unlock so much more.

We are experimenting with “Colored Petri Nets”: essentially, stringing together a library of our existing and work-in-progress autonomy behaviors into logical sequences. By building upon that library of sequences we can, as in football, create “plays,” call the plays, and ultimately create full “missions.” When the mission software is grouped into an “autonomy engine,” the collective functionality of the software can pilot vehicle(s) in response to a vehicle’s sensor input and the operator’s mission objectives.
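A greatly simplified sketch of the idea follows; it is a plain sequential composition rather than a full Colored Petri Net, and all behavior names and mission state are hypothetical.

```python
# Minimal sketch of stringing autonomy behaviors into "plays" and a "mission."
from typing import Callable, Dict, List

Behavior = Callable[[dict], dict]   # each behavior updates shared mission state


def takeoff(state: dict) -> dict:
    return {**state, "airborne": True}


def search_area(state: dict) -> dict:
    return {**state, "target_found": True}        # stand-in for a real search


def report_and_rtb(state: dict) -> dict:
    return {**state, "reported": True, "airborne": False}


# A "play" is an ordered list of behaviors; a mission strings plays together.
PLAYS: Dict[str, List[Behavior]] = {
    "launch": [takeoff],
    "find_target": [search_area],
    "wrap_up": [report_and_rtb],
}


def run_mission(play_names: List[str], state: dict) -> dict:
    for name in play_names:
        for behavior in PLAYS[name]:
            state = behavior(state)
    return state


print(run_mission(["launch", "find_target", "wrap_up"], {"airborne": False}))
```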

We are now starting to fold in semantic reasoning to build “contextual awareness.” This awareness means the software is sensing and evaluating the environment of the autonomous vehicle and building a virtual scene based on what it is sensing. On that scene, it fuses data from other available inputs. Then, it can perform some at-the-edge reasoning to draw conclusions and determine on its own possible courses of action (COAs). From those COAs, it builds a game plan and uses our control station displays as a means to convey those plans to any human supervisors on-the-loop. The human can select from the offered COAs, manually construct their own mission, or just let the software run its course.

For now, our human on-the-loop serves as the sanity checker and authority figure for AI-enabled autonomous systems. As humans develop trust in these AI systems, they will gradually take the “training wheels” off the manned-unmanned team collaboration and rely on technology that aims to provide assured autonomy.

SOURCE: Autonodyne archives

We modified the software in the Avidyne IFD-series of Mission Computers to have connectivity with our mobile common control stations. These units were first tested in our lab and then installed in the company experimental Cessna 182 that we use as an Optionally Piloted Vehicle flying testbed.

Mission computers

Autonodyne designs and builds mission computer software for defense and civilian applications. The mission computers manage a broad range of functions, including vehicle navigation, vehicle health and status, control of on-board systems such as payloads, communications to/from other on-board systems such as autopilots, and external communication links.
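As one illustrative slice of those functions (hypothetical topic names, not Autonodyne's mission computer software), the sketch below routes messages between on-board producers, such as an autopilot or payload, and subscribers, such as the external control link.

```python
# Minimal sketch of on-board message routing inside a mission computer.
from collections import defaultdict
from typing import Callable, Dict, List

Handler = Callable[[dict], None]


class MessageRouter:
    """Delivers each published message to every subscriber of its topic."""

    def __init__(self) -> None:
        self._subs: Dict[str, List[Handler]] = defaultdict(list)

    def subscribe(self, topic: str, handler: Handler) -> None:
        self._subs[topic].append(handler)

    def publish(self, topic: str, msg: dict) -> None:
        for handler in self._subs[topic]:
            handler(msg)


router = MessageRouter()
router.subscribe("autopilot/status", lambda m: print("to control station:", m))
router.subscribe("payload/image", lambda m: print("to datalink:", m["frame_id"]))

router.publish("autopilot/status", {"mode": "AUTO", "alt_m": 1200})
router.publish("payload/image", {"frame_id": 42})
```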

Autonodyne’s autonomous flight control software can be bundled with mission computer hardware from Avidyne’s IFD Series (bottom row) and North Atlantic Industries’ (top) NIU1A. The flight management systems are FAA-certified avionics.

SOURCE: Autonodyne archives

We applied our Optionally Piloted Vehicle software to a Cirrus SR-22 (seen in-flight during the OPV flight tests, and parked on the ground post-flight), and a Cessna 182.

Advanced Aerial Mobility/Optionally-Piloted Vehicles

Autonodyne believes a new generation of Vertical Takeoff and Landing (VTOL) vehicles will enable a new form of urban transportation. Now referred to as Advanced Aerial Mobility (AAM), this transportation model will look much like its urban surface transportation counterparts, Uber and Lyft. Toward those ends, Autonodyne can provide several enabling capabilities.

SOURCE: NASA

NASA envisions an air transportation system that moves people and cargo between places previously not served or underserved by aviation – local, regional, intraregional, urban – using revolutionary new aircraft that are only just now becoming possible.

SOURCE: Autonodyne archives

Augmented Reality Integration (Microsoft Hololens) with Autonodyne RCU-1000.

Augmented Reality (AR)

Autonodyne has conducted considerable research and development to create an Augmented Reality (AR) control station using commercially available AR devices like the Microsoft HoloLens or Meta 2. Our AR flight-test operations have identified a few areas where AR can have a profound impact. They include: