
Photo taken at Camp Roberts, CA, during JIFX in January 2020.

The autonomy scene today

Autonomy is sometimes a difficult thing to define. There is a long-term model that may eventually eliminate all human involvement, but for the foreseeable future, Autonodyne believes a supervisory human role will be essential. We subscribe to the school of thought described in ‘Our Robots, Ourselves’: the human and the machine work together, trading control and shifting levels of automation to suit the situation at hand. At certain times and in certain places the vehicle is highly autonomous; in others, it requires more human involvement.

What is autonomy?

autonomy

The quality of being autonomous (i.e., without the need to be controlled by outside entities; self-determination).1

Autonodyne puts the human in a supervisory role commanding high-level behaviors. We use rich sensor data and algorithmically enhanced models of the environment to move across the spectrum of automation, moment by moment, driving in and out of clouds of autonomy and risk.

We make great use of agent technology in appropriate places, such as calculating and suggesting re-routing options, looking up and suggesting procedures, drawing upon a database of past events to offer situationally appropriate suggestions, and off-loading tasks that impose a high human workload but are highly deterministic.

In general, Autonodyne delivers the control systems at the heart of Uncrewed Vehicles (UVs), enabling safe and affordable transportation of significant cargo in alignment with current manned infrastructure.

1 — Autonomy definitions are from ASTM’s Autonomy Design and Operations in Aviation: Terminology and Requirements Framework, 2019; F38 Unmanned Aircraft Systems Committee of ASTM (American Society for Testing and Materials)

For more information regarding Autonomy Behaviors, please click here. Posted 26 October 2019.

Autonodyne in the autonomy spectrum

The chart below, adapted from dronelife.com, maps autonomy levels 0 through 5 by degree of automation, pilot role, UV control, and obstacle avoidance role. Autonodyne’s research and development spans the upper levels of this spectrum.

LEVEL 0 (automation: NONE). Pilot “in” the loop: control of the UV is 100% manual. UV control: none. Obstacle avoidance: none.
LEVEL 1 (automation: LOW). Pilot “in” the loop: the pilot remains in control. UV control: the UV has control of at least one vital function. Obstacle avoidance: sense & alert.
LEVEL 2 (automation: PARTIAL). Pilot “on” the loop: the pilot is responsible for safe operation. UV control: the UV can take over heading and altitude under certain conditions. Obstacle avoidance: sense & alert.
LEVEL 3 (automation: CONDITIONAL). Pilot “on” the loop: the pilot acts as a fall-back system. UV control: the UV can perform all functions “given certain conditions.” Obstacle avoidance: sense & avoid.
LEVEL 4 (automation: HIGH). Pilot “out” of the loop. UV control: the UV has backup systems that take over when one system fails. Obstacle avoidance: sense & navigate.
LEVEL 5 (automation: FULL). Pilot “out” of the loop: the pilot has no responsibility. UV control: the UV is able to use AI tools to plan its flight as an autonomous learning system. Obstacle avoidance: sense & navigate.

Human involvement dominates at the lower levels; machine involvement dominates at the higher levels.

SOURCE: dronelife.com




Autonomy technology in use

Technology is transforming how humans and machines work together. The development of optionally-crewed and uncrewed systems empowers your efforts. Here’s how: pairing humans with machines to perform tasks simplifies decision-making and speeds up execution while keeping humans out of harm’s way and sparing them travel to difficult locations. The integration can easily multiply your capabilities and increase your presence and reach. It’s a sensible partnership that will enable you to save time, money, and, in emergencies, lives.

Some example use cases where we have a capability are described below. The possibilities seem bound only by our imaginations: Autonodyne is developing the technologies of the future in six areas, and each scenario walks through examples of autonomy in action.


Package Delivery

autonomous

An entity that can, and has the authority to, independently determine a new course of action in the absence of a predefined plan to accomplish goals based on its knowledge and understanding of its operational environment and situation. Having the ability and authority to make decisions independently and self-sufficiently.

It is widely believed that package delivery via uncrewed or autonomous platforms is on the verge of explosive growth, especially for last-mile delivery to rural areas with low-to-average population densities.

Autonodyne is in strategic partnerships with companies seeking to leverage their hardware for package delivery, such as Target Arm, which builds the universal launch-and-recovery platform shown on the delivery truck, and Valqari, builder of the Smart Drone Delivery Mailbox, a receptacle for both drone and traditional deliveries.

Delivery trucks travel shorter, more direct routes through the vicinity rather than pulling into every residential street to drop a package at each home. Each truck carries a small fleet of delivery drones; one is released from the Target Arm roof-top launch platform toward the address on the package.
Sophisticated software engineered by Autonodyne guides each drone to its destination along the safest, most direct route. As the truck moves through the neighborhood, additional drones are dispatched from the platform.
Autonodyne can guide larger packages to an AprilTag, a checkered landing target similar to a QR code, via its path-planning algorithms and computer vision technology. Cameras on the drone recognize the unique checkered fiducial code before landing and releasing, or making a drop (see the detection sketch after this list).
Smaller packages or mail are delivered to a Valqari smart mailbox. As it nears the mailbox, the drone’s sensors home in on a signal broadcast from the box. The drone performs a precision landing on the mailbox, releases the package, and flies away. The Valqari mailbox automatically secures the package and notifies the intended recipient of its secure arrival.
As the drones deliver their cargo, Autonodyne software sequences them back to the delivery truck, where each is captured by the cage and prepared for its next delivery.
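To make the fiducial-recognition step concrete, here is a minimal sketch using the open-source pupil-apriltags detector with OpenCV. The function name, tag family, and camera handling are illustrative assumptions, not Autonodyne’s implementation.

```python
# Hedged sketch: fiducial recognition for precision landing, assuming the
# open-source pupil-apriltags package and OpenCV. Names are illustrative.
import cv2
from pupil_apriltags import Detector

detector = Detector(families="tag36h11")  # a widely used AprilTag family

def locate_landing_tag(frame_bgr, expected_tag_id):
    """Return the pixel center (x, y) of the landing pad's tag, or None."""
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    for det in detector.detect(gray):
        if det.tag_id == expected_tag_id:  # ignore tags belonging to other pads
            return tuple(det.center)
    return None
```

In practice, the detected tag center would feed a precision-landing controller that nulls the offset between the tag and the camera boresight during descent.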

Delivering packages using Autonodyne’s path-planning algorithm

In this animation, the calculated delivery paths to all delivery addresses in the neighborhood are shown, although only a single package is being delivered, to an address at the bottom right. The path-planning algorithm calculates various paths to the address and then selects the optimal one: the path that avoids natural and man-made obstacles while seeking the shortest route. After the drone follows the planned route and delivers its package, it returns to the delivery truck.
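As a rough illustration of this kind of obstacle-avoiding, shortest-route search, the sketch below runs A* over a simple occupancy grid. The grid model, unit step costs, and Manhattan heuristic are simplifying assumptions for illustration; the production planner is certainly richer.

```python
import heapq

def plan_path(grid, start, goal):
    """A* search on an occupancy grid (0 = free cell, 1 = obstacle).

    Returns the shortest obstacle-free path from start to goal as a list
    of (row, col) cells, or None if the goal is unreachable.
    """
    rows, cols = len(grid), len(grid[0])

    def h(cell):  # Manhattan distance: admissible on a 4-connected grid
        return abs(cell[0] - goal[0]) + abs(cell[1] - goal[1])

    frontier = [(h(start), 0, start, None)]  # entries: (f = g + h, g, cell, parent)
    came_from, g_cost = {}, {start: 0}
    while frontier:
        _, g, cell, parent = heapq.heappop(frontier)
        if cell in came_from:            # already expanded via a cheaper route
            continue
        came_from[cell] = parent
        if cell == goal:                 # walk parent links to rebuild the route
            path = [cell]
            while came_from[path[-1]] is not None:
                path.append(came_from[path[-1]])
            return path[::-1]
        r, c = cell
        for nxt in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nxt[0] < rows and 0 <= nxt[1] < cols and grid[nxt[0]][nxt[1]] == 0:
                if g + 1 < g_cost.get(nxt, float("inf")):
                    g_cost[nxt] = g + 1
                    heapq.heappush(frontier, (g + 1 + h(nxt), g + 1, nxt, cell))
    return None
```

Among all obstacle-free routes, A* with an admissible heuristic returns a shortest one, which matches the behavior described for the animation.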


Humanitarian Assistance and Disaster Relief (HADR)

autonomous flight

A flight that does not require human decision making and instead relies on automation that can independently determine a new course of action in the absence of a predefined plan to execute management or operational control of a flight.

Many natural disasters are terribly destructive. They leave in their wake a trail of injury, death, loss of livestock, property damage, and economic loss. A quick and effective first response to a disaster can make the difference between life and death and reduce suffering.

In the early stages of first-response efforts, the setting is typically emotional and chaotic. When Hurricane Maria hit Puerto Rico in 2017, wind knocked out 80 percent of the island’s power lines, and the blackout that followed lingered for months. Poor communications and limited relief-and-rescue access to rural communities, among other problems, created a humanitarian crisis in the ensuing months. Early help from autonomy technologies in Humanitarian Assistance and Disaster Relief (HADR) efforts might have eased the pain in Puerto Rico.

Some examples of how Autonodyne technologies can help in a natural disaster:

With an Autonodyne RCU-1000 Desktop at a central command post and a CBX-1000 serving remote locations, a communication relay can be created in which situation reports and critical information are coordinated and relayed to Autonodyne-enabled response teams on the ground and assets in the air.
Autonodyne’s software is link agnostic, permitting any rescue personnel with Autonodyne control stations to ingest video and sensor-feed data from nearby surveillance or monitoring sources such as aircraft or satellites.
Autonodyne’s software will be able to deconflict uncrewed vehicles from crewed platforms, enhancing safety in what is likely to be a confused and complex crisis response.
A control station operator, using Autonodyne’s extensive library of autonomy behaviors, can program a swarm of drones to canvass an area and gather intelligence on the level of destruction (a simple lane-assignment sketch follows this list).
Simultaneously, the operator can program an individual drone to monitor a critical hot spot.
A scouting expedition executed by a swarm of drones can also be programmed to look for survivors stranded by floodwaters.
If the swarm locates victims in need of assistance, the operator can dispatch and coordinate a rescue operation with water-based drones.
Finally, the operator can coordinate sorties of much-needed first-aid deliveries to a multitude of ground-based sites.
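One way to picture the canvass behavior above: carve the search region into parallel survey lanes and give each swarm member a contiguous band of lanes. The sketch below is a deliberately simplified planar illustration; the function and its parameters are assumptions, not Autonodyne’s behavior library.

```python
def canvass_routes(west, east, south, north, lane_spacing, n_drones):
    """Split a rectangular search area into back-and-forth "lawnmower"
    lanes and assign each drone a contiguous band of them.

    Coordinates are meters in a local planar frame; returns one waypoint
    list per drone so the swarm covers disjoint strips in parallel.
    """
    lanes, y, i = [], south, 0
    while y <= north:
        # Alternate sweep direction so consecutive lanes chain end-to-end.
        x0, x1 = (west, east) if i % 2 == 0 else (east, west)
        lanes.append([(x0, y), (x1, y)])
        y += lane_spacing
        i += 1
    per_drone = -(-len(lanes) // n_drones)  # ceiling division
    return [sum(lanes[d * per_drone:(d + 1) * per_drone], [])
            for d in range(n_drones)]
```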


Intelligence, surveillance, reconnaissance (ISR)

autonomous system

Hardware, software, or a combination of the two that enables a system to make decisions independently and self-sufficiently. Autonomous systems are self-directed toward a goal governed by rules and strategies that direct their behavior.

Nowhere is the ability to simplify decision-making more important than in an area of conflict. Military personnel are often in harm’s way and can easily become task saturated. In this scenario, a squad leader equipped with an Autonodyne RCU-1000 Mobile control station is studying the friendly forces deployed in his sector. From a nearby command center, a control station operator with a CBX-1000 control station has dispatched the squad leader’s unit, along with a variety of other aerial and ground assets, to the area. These quick actions have rapidly multiplied the forces at that spot.

The squad’s mission is to intercept enemy ground troops approaching their position.

How Autonodyne’s autonomy technology is being designed for military operations:

A low- or medium-flying, fixed-wing drone was initially dispatched to patrol the area in search of enemy soldiers. It spotted two large groups with its infrared sensors and camera. The operator immediately programmed the drone to loiter in the area and conduct reconnaissance (a simple loiter-orbit sketch follows this list).
A local area commander quickly dispatched a squad to an intercept point where he planned to send land- and air-based assets. The squad leader watches video feeds from the drones and tracks the progress of the other unfolding missions on his RCU-1000 Mobile.
The swarm’s mission was to arrive at the site quickly, approach the advancing enemy troops slowly, intercept them, and then hover in place at a distance in hopes that those actions would deter the advance.
As the drones hovered in place, the operator used their cameras to monitor the situation and their infrared lasers to paint potential targets, if needed.
Land-based drones were first positioned in the area from the command center and then, as they neared, steered manually into place by the squad leader. Along with the hovering airborne drones and the loitering fixed-wing aircraft, the quickly assembled assets presented a formidable force.
Information gathered by the fixed-wing aircraft and drones is fed into the ATAK system and used by the command center operator to coordinate further action, including potential sweeps by nearby fighter jets and any uncrewed loyal wingmen they control, which are also flown by Autonodyne software from the crewed cockpits in Manned-Unmanned Teaming (MUM-T) operations. The quickly coordinated actions and overwhelming show of force compelled the enemy to retreat.
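The loiter behavior in the first item above can be pictured as an orbit of waypoints around the point of interest. The sketch below uses a flat-earth approximation that is reasonable for orbit radii of a few hundred meters; it is an illustration, not Autonodyne’s behavior implementation.

```python
import math

def loiter_orbit(center_lat, center_lon, radius_m, n_points=12):
    """Generate (lat, lon) waypoints for a circular loiter around a point.

    Uses a small-offset flat-earth approximation; fine for radii of a few
    hundred meters, inaccurate near the poles or for very large orbits.
    """
    earth_r = 6_371_000.0  # mean Earth radius, meters
    pts = []
    for k in range(n_points):
        theta = 2 * math.pi * k / n_points
        dlat = radius_m * math.cos(theta) / earth_r
        dlon = radius_m * math.sin(theta) / (earth_r * math.cos(math.radians(center_lat)))
        pts.append((center_lat + math.degrees(dlat),
                    center_lon + math.degrees(dlon)))
    return pts
```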

Wildland fire management

The 2020 fire season placed considerable strain on firefighting resources, particularly in the Pacific Coast states of the western US.

Given the sheer number of fires annually and their scale, state and federal wildland fire agencies are looking to autonomy technologies for help. In particular, uncrewed drones paired with control stations are being tested and used to help protect firefighters and the public. Autonodyne is involved with this exploratory effort. Some capabilities being developed:

The wide monitor of the Autonodyne RCU-1000 Desktop gives firefighters in the command center a strategic view of all assets in the fire zone and lets them remotely control any Autonodyne-enabled assets. This theater view makes it quicker and easier to coordinate a response and execute plans.
Gathered information such as drought and weather conditions, wildland fire coordinates, intelligence from the ground, and timely plans of action can be disseminated in real time to any firefighters with Autonodyne-enabled devices such as mobile phones and tablets.
The communication network extends to aircraft and land assets with Autonodyne mission computers. Firefighting tankers can be directed to critical hot spots, and airborne UVs can be remotely controlled or programmed for autonomous flight to perform surveillance runs. Deconfliction between crewed and uncrewed airborne assets is critical, often the most important element in being able to respond with UVs at all (a minimal separation-check sketch follows this list).
The delivery of first aid or other critical supplies to front-line firefighters can be safely, quickly, and easily coordinated from a distance.
A search-and-rescue operation involving a swarm of UVs equipped with infrared and video cameras can systematically scan for civilians unaware of fast-moving fires.
Uncrewed assets equipped with infrared sensors can be safely sent out to scan for new fires in areas where lightning is striking.
Unmanned Ground Vehicles (UGVs) with video cameras can be sent to patrol roads in search of victims fleeing the fire or to deliver supplies.
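As a minimal picture of the deconfliction check mentioned above, the sketch below flags pairs of aircraft that are simultaneously inside horizontal and vertical separation minima. The thresholds and data layout are illustrative placeholders, not any agency’s separation standard or Autonodyne’s deconfliction logic.

```python
import math
from itertools import combinations

def separation_alerts(assets, horiz_min_m=500.0, vert_min_m=90.0):
    """Flag pairs of aircraft violating both separation minima at once.

    `assets` maps a call sign to (x_m, y_m, alt_m) in a shared local frame;
    the 500 m / 90 m minima are illustrative placeholders.
    """
    alerts = []
    for (a, (ax, ay, az)), (b, (bx, by, bz)) in combinations(assets.items(), 2):
        horiz = math.hypot(ax - bx, ay - by)
        if horiz < horiz_min_m and abs(az - bz) < vert_min_m:
            alerts.append((a, b, round(horiz)))  # conflicting pair, meters apart
    return alerts
```

A real system would project each track forward in time and alert on predicted, not just current, loss of separation.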

Conservation, management of coastal and maritime ecosystems and resources

Autonodyne’s common control stations are capable of controlling multiple dissimilar uncrewed (UxS) platforms for use in many areas of scientific research. One of Autonodyne’s interests is using these capabilities to aid organizations working to conserve and manage coastal and marine ecosystems and resources. Our ability to control and communicate with many uncrewed platforms through our RCU-1000 Common Control Stations lets us develop creative approaches to scientific research. The next two scenes illustrate how Autonodyne's technology can help conduct operations above and below the sea that are geared to protecting people and the aquatic environment:

Operations above the sea

The activity of a fleet of UGVs and USVs can be coordinated to monitor coastal soil erosion and measure its runoff into nearby fisheries. A mobile command station can gather the data collected by these drones and refuel them for extended missions.
Drones can routinely canvass an area of interest, and their collected samples can be safely recovered and stored in a secure smart mailbox.
Satellites overhead provide a Beyond Visual Line of Sight (BVLOS) communications relay back to the NOAA command post. An RCU-1000 COP is used along with Line-of-Sight (LOS) tactical datalink connections when LOS/mesh networks are available.
Surveys of marine life, habitat, and ecosystems can be performed with UAVs or in coordinated MUM-T operations, including surveys that assist in monitoring Bluefin Tuna and Atlantic Cod populations and ocean oxygen and CO2 levels.
Using a fixed-wing UAV or sUAS, an aerial survey of seals along the shoreline can be taken to count the population. A periodic survey can help determine the level of the primary food source that attracts great white sharks.
Manned-Unmanned Teaming (MUM-T) operations between a C-130 and other fixed-wing UAS dropping sonobuoys and dropsondes in the area can be coordinated and deconflicted.
A fixed-wing UAS with a sensor “flashlight” can sample the air and measure water temperature, salinity, and similar properties. As it runs low on energy, it can perform an airborne handoff to an inbound replacement vehicle with a fresh power supply (a simple handoff-decision sketch follows this list).
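The airborne-handoff decision in the last item can be reduced to an energy budget: hand off once the remaining charge, minus a reserve, no longer covers the trip home. The sketch below is a hedged illustration with assumed names and thresholds.

```python
def should_hand_off(battery_wh, burn_wh_per_km, dist_to_recovery_km,
                    reserve_frac=0.20):
    """Return True when a survey vehicle should hand its station to a
    replacement: usable energy (after holding back a reserve) no longer
    covers the leg back to the recovery point. Thresholds are illustrative.
    """
    energy_home = burn_wh_per_km * dist_to_recovery_km
    usable = battery_wh * (1.0 - reserve_frac)
    return usable <= energy_home
```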

Operations below the sea

In some of the examples below, our software uses collected data to direct uncrewed underwater vehicles (UUVs) where to reposition, then applies computer vision and our trained machine learning algorithms. In the others, Autonodyne software controls the movements of UUVs and uncrewed surface vehicles (USVs).

A group of sUAS hover low over the water, dipping their collection receptacles to gather samples. When they return to shore, they drop the samples at a Valqari smart mailbox for later collection and acidification measurements. The sUAS are launched and recovered from a Target Arm mobile vehicle.
Autonodyne software controls the movements of an uncrewed surface vehicle (USV) such as the Boeing Waveglider as it sails around the Gulf of Maine listening for and monitoring fisheries of Atlantic Cod or Bluefin Tuna (a minimal waypoint-steering sketch follows this list).
Our software navigates a UUV to areas of interest to collect water-column samples.
The movements of a fleet of Saildrones can be coordinated as they monitor the bay’s CO2 levels or chart its underwater bathymetry for safe navigation.
With Autonodyne’s extensive autonomy behaviors, a UUV can track and document the behavior of an endangered species such as the Kemp's Ridley sea turtle.
An operator standing on a pier can control a tethered UUV to evaluate the soundness of the pier’s pilings and remotely flag areas of interest for closer evaluation.
The contextual-awareness capabilities of Autonodyne’s software enable the UUV to visually detect the effects of acidification on oyster-bed shells.
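To suggest how the waypoint-following in these examples might look at the control level, here is one step of a proportional heading controller for a USV. The frame conventions, gain, and rate limit are assumptions for illustration, not Autonodyne’s control law.

```python
import math

def heading_rate_command(pos, waypoint, heading_deg, gain=0.8, max_rate_dps=10.0):
    """Proportional steering toward the next survey waypoint.

    `pos` and `waypoint` are (east_m, north_m) in a local frame; returns a
    commanded turn rate in degrees per second, clamped to the rate limit.
    """
    de, dn = waypoint[0] - pos[0], waypoint[1] - pos[1]
    bearing = math.degrees(math.atan2(de, dn))               # compass bearing to waypoint
    error = (bearing - heading_deg + 180.0) % 360.0 - 180.0  # wrap to [-180, 180)
    return max(-max_rate_dps, min(max_rate_dps, gain * error))
```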

A truly autonomous drone can decide on its destination and route, and it is self-controlled in the air, on land, and at sea without any human input.