
Designed for simplicity and power

Autonodyne's design places increasing autonomy in system computation. This takes more of the cognitive burden off human operators and supervisors and serves as a force multiplier, making the human and machine a more capable team. We do this via software, but of course the software needs a device to run on. So while software is our principal product, the tangible product is the combination of our software running on a mobile device, workstation, or on-board mission computer.

In the ideal scenario, the software uses various sensor input feeds to accurately sense its environment, devise a game plan, and automatically execute that plan. When it has communications reach-back to the human operators, a copy of that same software runs on a device in front of the human, providing visualizations that show what the software has concluded and is executing. The human can either let the software run its course or intervene to override it or command some alternative action or behavior. We strive to make these visualizations of the system a piece of “functional artwork.”

The human operator can place controls on the software to let it run fully autonomously, to have it provide suggested courses of action that the human must select, or to require a great deal of human input. This means system and software architectures must be able to provide trusted autonomy, and that is a big part of our designs.

These “functional artwork” control stations support pre-mission planning/rehearsal minutes/hours/days ahead of any mission, mission-time execution command/control, and post-mission debrief/evaluation/training capabilities.

We get asked all the time, “What makes you different?” Our response:

  • A streamlined and targeted common control station – it is not the QGroundControl behemoth;
  • We’re more than just a pretty face – we combine an elegant Natural User Interface with some slick autonomy algorithms and behaviors. One without the other is only a half-solution;
  • We autonomously control many vehicles and vehicle types on multiple mission types and use cases – we do this on multiple media choices using multiple optional input devices; communicate across multiple datalink and protocol options; drive vehicles across multiple domains; and apply multiple autonomy behaviors;
  • We’re small, extremely agile, and malleable.

SOURCE: Autonodyne archives

Autonodyne software running on CBX-1000 and RCU-1000 Tablet platforms.

Vehicle control & management in one screen

We are progressing from one human controller responsible for a single vehicle (“human-in-the-loop”), to that same controller simultaneously supervising multiple vehicles (“human-on-the-loop”), to the human monitoring multiple missions performed by multiple vehicles (“human-aware-of-the-loop”), to perhaps eventually no humans involved at all (“What’s-a-loop?”). Along that path, Autonodyne strives to create modern interfaces designed with the user and the mission in mind. We try not to build engineering interfaces for engineers but instead to find that perfect design blending simplicity and power, giving the human operator, supervisor, or monitor the right level of situational awareness and the ability to effect changes if and when they need to be made. From providing 100% manual input (e.g. joystick and throttle control), to approving a suggested course of action, to simply conveying “commander intent”, the Autonodyne control stations are that powerful form of functional artwork.
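The progression above can be sketched as a simple policy table. This is our own illustrative encoding, not Autonodyne's API; the level names follow the terms used above, and the approval rule is an assumption for the example:

```python
from enum import Enum

class AutonomyLevel(Enum):
    HUMAN_IN_THE_LOOP = "operator commands each action"
    HUMAN_ON_THE_LOOP = "software acts; operator supervises and may override"
    HUMAN_AWARE_OF_THE_LOOP = "software acts; operator monitors missions"
    FULLY_AUTONOMOUS = "no human involved"

def needs_operator_approval(level: AutonomyLevel) -> bool:
    """Under human-in-the-loop control, every action awaits explicit approval."""
    return level is AutonomyLevel.HUMAN_IN_THE_LOOP
```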

SOURCE: Autonodyne archives

Task-based control screen for an Unmanned Combat Aerial System (UCAS).

Command multiple vehicles in multiple missions

Though most of our customers need to control one or a few vehicles on a mission, we envision an autonomous world where an operator needs to do that while also controlling other vehicles on other missions. With that in mind, our software engineers have developed and continue to enhance that capability. Because there is considerable power when humans work together in numbers, that potential multiplies when humans can extend their reach by controlling a fleet or fleets of UVs.

Stage reserves

Use the Stage autonomy behavior to position reserve assets at desired locations.

Selecting assets is easy

With your finger or an input device, draw a circle around the assets you wish to select. We call it Lasso. It’s that simple.
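Under the hood, a lasso selection like this typically reduces to a point-in-polygon test against the drawn stroke. A minimal sketch using the classic ray-casting test; this is our own illustration, not Autonodyne's implementation, and the asset record shape is invented for the example:

```python
def point_in_polygon(pt, poly):
    """Ray casting: count how many polygon edges a rightward ray from pt crosses."""
    x, y = pt
    inside = False
    n = len(poly)
    for i in range(n):
        x1, y1 = poly[i]
        x2, y2 = poly[(i + 1) % n]
        # Edge straddles the ray's height, and the crossing lies to the right of pt.
        if (y1 > y) != (y2 > y):
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x_cross > x:
                inside = not inside
    return inside

def lasso_select(assets, stroke):
    """Return the assets whose positions fall inside the drawn stroke."""
    return [a for a in assets if point_in_polygon(a["pos"], stroke)]
```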

Command center is your media

Whether it is a mobile phone, tablet, desktop computer, or 21:9 display, the choice of the media that controls your assets is yours.

Assign missions to the vehicles in your network

Whether you are running one mission or several, you can assign the assets involved a task-based mission. Some missions may be more important than others, so prioritize the urgency of each one.
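Prioritizing mission urgency like this is naturally modeled with a priority queue. A minimal sketch using Python's `heapq`; the mission names and the lower-number-is-more-urgent scheme are invented for the example:

```python
import heapq

# Illustrative mission list: (priority, name), lower number = more urgent.
missions = [
    (2, "Area survey"),
    (1, "Search and rescue"),
    (3, "Routine patrol"),
]
heapq.heapify(missions)                  # order the queue by urgency
most_urgent = heapq.heappop(missions)    # tasked first
```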

Monitor the status of each active mission

When operating multiple missions, you will want to check how each one is doing. You can: just select the Mission Status pop-up menu and track each mission’s progress with an intuitive progress slider.
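A progress slider like this can be driven by a simple completed-over-total fraction. A sketch of our own, assuming waypoints as the unit of work:

```python
def mission_progress(completed_waypoints: int, total_waypoints: int) -> float:
    """Fraction of the mission completed, clamped to [0.0, 1.0]."""
    if total_waypoints <= 0:
        return 0.0
    return max(0.0, min(1.0, completed_waypoints / total_waypoints))
```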

Layers of information

Autonodyne’s basic features package comes with several data layers you can toggle on and off, including a base map that can switch to topographic and satellite views. The Grid Overlay layer lets you pinpoint GPS/UTM/MGRS coordinates, and the Metadata Overlay layer provides detailed information on the assets in your network.

View sensor data feeds

You can see what the UVs in your missions see. Depending on the type of sensor, you can view video and infrared imagery. Zoom in on the image, establish GPS coordinates, take a snapshot, and, if you want, securely share an image or video within your network.

Control your assets with precision

Swap the 2D map view with the 3D “out-the-window” view whenever desired.


Autonomous Drone Delivery

Here is an application being used by the US Marine Corps as they build out their tactical resupply capability. In this case, we have teamed up with Chartis Federal, using their Periscope drones to autonomously deliver up to 110 lb of cargo approximately 20 km.

Hosted on a ruggedized Windows tablet, one image depicts the creation of keep-in and keep-out geofences and the other image highlights the delivery behavior/mission.
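A keep-in/keep-out check like the one shown reduces to testing a position against both zone sets: the vehicle must stay inside every keep-in fence and outside every keep-out fence. A sketch of our own, using circular zones as a simplification (real geofences are usually polygons, and coordinates here are treated as planar):

```python
import math

def in_circle(pos, center, radius_m):
    """Planar approximation: is pos within radius_m of center?"""
    return math.dist(pos, center) <= radius_m

def position_allowed(pos, keep_in, keep_out):
    """Allowed iff inside every keep-in zone and outside every keep-out zone."""
    return (all(in_circle(pos, c, r) for c, r in keep_in)
            and not any(in_circle(pos, c, r) for c, r in keep_out))
```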

Tactical Control of multiple sUAS

These images depict a version of the software controlling multiple dissimilar air and land vehicles. In each case, we hosted the software on Android mobile phones. When it is time to work, pull the phone out of your pocket and use it or take a look at the data, then stow it away again when not needed.

Software interface

The interface of Autonodyne’s autonomy control program allows you to immediately send commands for a single vehicle, or group of vehicles, via a dynamic command wheel.

One app

Our latest product combines all of our software products into a single application. The integration permits you to execute all of our missions in just one session. It also runs on Windows, Linux, or Android devices of all shapes and aspect ratios.

The images below depict a recent program we completed in which our software was hosted on a ruggedized tablet and used to control multiple unmanned wingmen from another aircraft for some very high performance Manned-Unmanned Teaming (MUM-T).

Left, hosted on a 10.1-inch Panasonic FZ-G1; right, hosted on a 7-inch Panasonic FZ-M1.

The first image shows an in-flight handoff of control of the unmanned fighters from one mobile control station to another located elsewhere. The second image is an example of a new mission that was pushed or activated from a different control station.
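An in-flight handoff like this can be modeled as transferring an exclusive control token: only the station that currently holds it may command the vehicle or pass control onward. A minimal sketch of our own model, not the actual protocol:

```python
class ControlToken:
    """Exclusive control of a vehicle; commanding requires holding the token."""

    def __init__(self, owner: str):
        self.owner = owner

    def hand_off(self, from_station: str, to_station: str) -> None:
        if self.owner != from_station:
            raise PermissionError("only the controlling station may hand off")
        self.owner = to_station

    def may_command(self, station: str) -> bool:
        return station == self.owner
```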

Flight testing what we build

SOURCE: Autonodyne archives

Autonodyne works with a multitude of uncrewed vehicle manufacturers, adding our behavior software to a drone’s basic tool set to increase its functionality and give it the ability to perform mission-specific tasks. The photo shows some of the drones we have worked with.

Our engineers regularly flight test the software they create. Teams of 2-6 software and system engineers routinely head out to our flight test venues throughout New England and often travel to customer sites, demo locations, and large force exercises. We do a lot of validation in modeling and simulation environments and lab testing, but the real test is how the software performs in the field under real-world conditions. In addition to unencumbered outdoor venues, we are fortunate to be associated with the Kostas Research Institute in Burlington, MA, which has an enormous outdoor drone cage.

Control stations wrapped in Autonodyne software

If you wish to purchase display hardware already bundled with Autonodyne’s software, you have plenty of choices. Your selection should be based upon your immediate needs, and where and how you intend to use the hardware.

When you plan on carrying the RCU-1000 software and will be moving around as you use it, the RCU-1000 Mobile and RCU-1000 Tablet models are good fits. Both are designed for the wear and tear of daily work and can come encased in a rugged plastic case. You can also purchase the Autonodyne RCU-1000 Laptop software-only package to control Uncrewed Vehicles (UVs) from your laptop computer.

Since the RCU-1000 software was designed to take advantage of existing UV platforms and communications links/protocols, we often describe the system as a “software wrapper”. This means we do not need to put any Autonodyne hardware or software on the UV platform; instead, the “software wrapper” acts as a translator, making use of the infrastructure that is already there. If we have the luxury of adding some hardware or software to the UV, the overall system becomes that much more capable. Nice, but not necessary.
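The “software wrapper” idea can be sketched as a thin translator: common control-station commands go in, vehicle-native messages come out, with nothing added to the vehicle itself. An illustrative sketch of our own; the encoders and message formats are invented for the example:

```python
class VehicleWrapper:
    """Translate common control-station commands into a vehicle's native messages."""

    def __init__(self, name, encode):
        self.name = name
        self.encode = encode   # vehicle-specific encoder supplied per platform

    def command(self, verb, **params):
        return self.encode(verb, params)

# Two dissimilar vehicles, each with its own (invented) native message format.
quad = VehicleWrapper("quad", lambda v, p: f"Q|{v}|{p.get('alt_m', 0)}")
rover = VehicleWrapper("rover", lambda v, p: f"R;{v};{p.get('speed_mps', 0)}")
```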

SOURCE: Autonodyne archives

A display of Autonodyne’s software running on cellphones, tablets, 21:9-ratio monitors, and CBX-1000 mobile units. Also displayed are a variety of input devices that can be used to operate our software.

If you plan on working at a stationary location while using our software, the RCU-1000 Desktop and RCU-1000 Wide software-only packages are good choices. The larger screens of both permit the user to effectively manage more UVs and orchestrate their actions, since multiple windows can be viewed simultaneously in the workspace.

The 21:9 ratio display screen of the RCU-1000 Wide provides users with the most productive experience. The expanded workspace makes it easier to multi-task, allowing users to switch between applications. When using Autonodyne’s software you will have an unencumbered view of our main display window and can park other functional monitoring and management windows off to the sides.

Any hardware selected, from the RCU-1000 Mobile to the RCU-1000 Wide, can serve as the control hub in a network of other Autonodyne-driven or -enabled hardware.



[Model comparison table: the RCU-1000 Mobile, RCU-1000 Tablet, RCU-1000 Laptop, RCU-1000 Desktop, and RCU-1000 Wide, plus the CBX-1000¹, compared on delivery (hardware/software bundle or software only²), power (corded or internal battery), wireless communication, and operating system (Android, Windows, or Linux).]

1 - CBX stands for Common Operating Picture-in-a-Box

2 - Autonodyne software is downloaded to your hardware.

When a network of control stations is the best solution

Command and control of any UV or group of UVs in a network can be performed by one operator. However, Autonodyne’s software also supports a networked system presenting a common operating picture to multiple users who are not geographically co-located. It can also hand off control of a vehicle or group of vehicles from one station to another. Basically, the operator can use any media device as the hub for the spokes in the network.

This is especially useful when a team of people on the ground is working in conjunction with a variety of UVs (see Multi-Vehicles) on a mission. When Autonodyne’s software enables effective manned-unmanned teams on those missions, objectives like saving lives or protecting property become easier to achieve (see Autonomy technology in use).

Autonodyne’s most powerful networked hardware system is the portable CBX-1000. Packaged in a sturdy Pelican™ case are a 21:9 ratio display and any combination or number of tablets and/or mobile devices suitable for your purpose. If your team needs to get quickly to a remote location with no electrical service, no problem: this self-contained, pack-and-carry system brings its own power supply¹ and can go wherever you need it.

Whatever the right mix of hardware may be for your group, Autonodyne can provide it and make sure the hardware in your network can communicate with each member of the team.

Some typical network configurations:

1 - Comes with rechargeable battery packs.

[Diagrams of typical network configurations, each combining RCU-1000 Mobile, RCU-1000 Tablet, RCU-1000 Wide (or Desktop), and CBX-1000 stations in different hub-and-spoke arrangements.]

Autonodyne-enabled mission computers

Mission computers manage a broad range of functions, including vehicle navigation, vehicle health and status, control of on-board systems like payloads, communications to and from other on-board systems like autopilots, and external communication links. Autonodyne modifies existing mission computer software, has created clean-sheet mission computer software for specialized missions, and frequently interfaces with third-party mission computer systems and software.

For example, we have modified existing Avidyne FAA-certified mission computers that were originally designed as the principal device human pilots use to operate conventional aircraft, talk with air traffic control, navigate through the airspace, and perform a host of other functions. The human pilots interact directly with the devices through on-screen displays and input devices like knobs and buttons. Autonodyne was able to enhance that system to also allow full interaction and control from off-board locations. In other words, for the Optionally Piloted Vehicle (OPV) systems we have built and flown, the principal operator was located on the ground using Autonodyne RCU-1000 control stations, connected over datalink to the aircraft, and manipulated the on-board mission computer software from the ground. In this case, we were able to retain all of the FAA-certified code in the aircraft units and add software to facilitate those off-board control inputs.

In the case of clean-sheet mission computer creation, we have created software optimized for uncrewed aircraft, where the concept of a human on board looking at displays does not apply. Vehicle navigation, health and status monitoring, subsystem management, and the rest still must be performed, and we were able to do so with much more efficient software that did not need to make accommodations for humans on board.