4 January 2022 — Drone Wars
In June 2021, Skyborg took control of an MQ-20 Avenger drone during a military exercise in California.
The influential State of AI Report 2021, published in October, makes the alarming observation that the adoption of artificial intelligence (AI) for military purposes is now moving from research into the production phase. The report highlights three indicators which it argues show this development, one of which is the progress that the US Air Force Research Laboratory is making in testing its autonomous ‘Skyborg’ system to control military drones.
Skyborg (the name is a play on the word ‘cyborg’ – a biological lifeform that has been augmented with technology such as bionic implants) is intended to be an AI ‘brain’ capable of controlling an aircraft in flight. Initially, the technology is planned to assist a human pilot in flying the aircraft.
As is often the case with publicity material for military equipment programmes, it is not always easy to distinguish facts from hype or to penetrate the technospeak in which statements from developers are written. However, news reports and press statements show that over the past year the US Air Force has for the first time succeeded in demonstrating an “active autonomy capability” during test flights of the Skyborg system, as a first step towards being able to use the system in combat.
Official literature on the system states that Skyborg is an “autonomous aircraft teaming architecture”, consisting of a core autonomous control system (ACS): a ‘brain’ comprising both hardware and software components which can be used both to assist the pilot of a crewed combat aircraft and to fly a swarm of uncrewed drones. The system is being designed by the military IT contractor Leidos, with input from the US Air Force and other Skyborg contractors. It would allow the aircraft to autonomously avoid other aircraft, terrain, obstacles, and hazardous weather, and to take off and land on its own.
The system is being developed as a priority initiative by the US Department of the Air Force, which hopes that Skyborg-powered aircraft will be available for regular operations by 2023 – possibly on an F-16 or F-35 combat aircraft platform – and would eventually be adopted on the US Air Force’s planned sixth-generation Next Generation Air Dominance fighter.
To date, tests of the system have been focused on small, fast-moving drones. The US Air Force has awarded contracts for development of the drone portion of Skyborg to Boeing, General Atomics Aeronautical Systems, and Kratos Unmanned Aerial Systems, and plans to take the best aspects from each design to create an optimal system able to undertake a wide variety of mission sets.
Screengrab from US Air Force Research Laboratory video on the Skyborg system
During 2021, flight tests of the Skyborg system took place with a Kratos UTAP-22 Mako drone and a General Atomics MQ-20 Avenger drone, during which the Skyborg ACS “demonstrated basic aviation capabilities and responded to navigational commands, while reacting to geo-fences, adhering to aircraft flight envelopes, and demonstrating coordinated maneuvering,” according to the US Air Force. Using the Skyborg ACS on drones from two different manufacturers demonstrates that the system has the potential to pilot multiple types of uncrewed aircraft.
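To illustrate what ‘reacting to geo-fences’ means in practice, the toy sketch below checks an aircraft’s position against a rectangular no-fly boundary and returns an advisory. All names, interfaces and coordinates here are invented for illustration; the real Skyborg ACS interfaces are not public.

```python
from dataclasses import dataclass

@dataclass
class GeoFence:
    """A hypothetical rectangular boundary defined by lat/lon limits (degrees)."""
    lat_min: float
    lat_max: float
    lon_min: float
    lon_max: float

    def contains(self, lat: float, lon: float) -> bool:
        # True when the position lies inside the permitted box
        return (self.lat_min <= lat <= self.lat_max
                and self.lon_min <= lon <= self.lon_max)

def check_position(fence: GeoFence, lat: float, lon: float) -> str:
    """Return an invented advisory: hold course inside the fence, turn back outside it."""
    return "HOLD_COURSE" if fence.contains(lat, lon) else "RETURN_TO_FENCE"

# Example fence (coordinates invented, not the actual test range)
fence = GeoFence(lat_min=34.0, lat_max=35.0, lon_min=-118.0, lon_max=-117.0)
print(check_position(fence, 34.5, -117.5))  # inside the fence -> HOLD_COURSE
print(check_position(fence, 36.0, -117.5))  # outside the fence -> RETURN_TO_FENCE
```

A real autonomy system would of course combine many such constraint checks (terrain, traffic, flight envelope) at high frequency; this only sketches the geo-fence idea.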
In October 2021, two General Atomics Avenger drones were flown in coordination using the Skyborg ACS, a milestone towards demonstrating the system’s ability to communicate between aircraft in flight and control a drone swarm. Further tests are planned to explore teaming between crewed aircraft and multiple Skyborg-controlled drones.
In assisting the pilot of a crewed aircraft, the Skyborg ACS will be able to control the aircraft semi-independently of the human operator, who will be able to issue commands but will not have to physically fly the aircraft. In December 2020, the US Air Force conducted a test flight in which a U-2 spy plane was flown with the assistance of an AI co-pilot. The AI system, based on DeepMind’s world-leading MuZero machine learning programme and known as ARTUµ, was able to navigate the aircraft and search for potential targets while the pilot concentrated on searching for threats. In this respect, Skyborg software appears to play a similar role to ARTUµ.
The project builds on work which the Air Force Research Laboratory has already conducted to develop autonomy in high-performance aircraft, such as the ‘Have Raider’ manned–unmanned teaming project, and the Auto Ground and Air Collision Avoidance system for the F-35 aircraft.
Skyborg is also intended to be able to control a number of uncrewed drones which fly in association with the crewed aircraft. The drones are intended to be low-cost, ‘semi-attritable’ platforms that can be reused but are cheap enough that their loss in combat would be acceptable.
The programme aims to develop a family of uncrewed aerial systems that can be operated from a distance and can conduct aerial missions that might be too dangerous for human pilots to perform. Different drones would be available for different missions, with modular hardware and software based on a common AI backbone. The same approach is being taken in the development of other sixth-generation fast jet aircraft, including the British-Italian-Swedish ‘Tempest’ programme and the Franco-German-Spanish Future Combat Air System. Each system is based around the concept of an AI-enabled combat aircraft which would fly with and control a number of drones.
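The ‘modular hardware and software on a common AI backbone’ idea can be sketched in code: one shared autonomy core flies the aircraft, while mission-specific modules are swapped in per sortie. Every class and mission name below is hypothetical; this is only an illustration of the architectural pattern, not of any real Skyborg software.

```python
from typing import Protocol

class MissionModule(Protocol):
    """Interface that any swappable mission payload module must satisfy."""
    name: str
    def execute(self) -> str: ...

class ReconModule:
    # Invented example module: reconnaissance payload
    name = "reconnaissance"
    def execute(self) -> str:
        return "collecting imagery"

class JammerModule:
    # Invented example module: electronic-warfare payload
    name = "electronic-warfare"
    def execute(self) -> str:
        return "jamming emitters"

class CommonBackbone:
    """The shared autonomy core: handles flight regardless of which module is fitted."""
    def __init__(self, module: MissionModule):
        self.module = module

    def fly_sortie(self) -> str:
        # The backbone flies the aircraft; the module performs the mission task
        return f"airborne, {self.module.name} module {self.module.execute()}"

print(CommonBackbone(ReconModule()).fly_sortie())
print(CommonBackbone(JammerModule()).fly_sortie())
```

The design choice this illustrates is separation of concerns: the expensive, hard-to-certify autonomy core is written once, while cheaper mission modules can be developed and exchanged independently.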
Skyborg is not an autonomous weapon system, and may appear to be a relatively benign use of military AI. However, it is intended to allow aircraft to operate with increased effectiveness and lethality, and represents a significant step forward in the AI arms race between the world’s military powers.
Perhaps more worrying, the availability of uncrewed platforms enabled by systems like Skyborg is likely to lower a barrier to going to war for the states that possess them, in the expectation that they will lose fewer of their own troops in combat. As Drone Wars and others have consistently argued, such a situation lowers the threshold for war and shifts the burden of harm from soldiers to civilians.
The US Air Force Research Laboratory, which is funding the Skyborg programme, says that “Skyborg will not replace human pilots. Instead it will provide them with key data to support rapid, informed decisions”. But the system can clearly be used to control drones which are able to operate, at least in part, autonomously. Autonomous technologies are being presented as auxiliaries which support human controllers, rather than replace them, as a promotional tactic intended to make them more palatable to the public.
The ‘State of AI Report 2021’ warns that “while AI’s growing impact on society and the economy is now evident, our report highlights that research into AI safety and the impact of AI still lags behind its rapid commercial, civil, and military deployment”. Systems such as Skyborg highlight the need for rapid international action to regulate the use of AI to prevent it being used for purposes which are intended to harm humans.