Klepsydra AI is used as a novel approach to onboard artificial intelligence for Earth observation in space.
Copyright: klepsydra.com – “Klepsydra AI Technology Evaluation for Space Use”
A novel approach to on-board artificial intelligence
New generations of spacecraft are required to perform tasks with an increased level of autonomy. Space exploration, Earth Observation, space robotics and other growing fields in space all require more sensors and more computational power to perform their missions.
Sensors, embedded processors, and hardware in general have evolved enormously in the last decade. Near-future spacecraft will be equipped with large numbers of sensors producing data at rates not seen before, while simultaneously carrying computing power capable of large-scale data processing on-board.
Future missions such as Active Debris Removal will rely on novel high-performance avionics to support image processing and Artificial Intelligence algorithms with large workloads. Similar requirements come from Earth Observation applications, where on-board data processing can be critical to providing reliable real-time information to Earth. This new scenario brings new challenges with it: low determinism, excessive power consumption, data losses and high response latency.
In this project, Klepsydra AI is used as a novel approach to on-board artificial intelligence. It provides a sophisticated threading model that combines pipelining and parallelization techniques applied to deep neural networks, making AI applications significantly more efficient and reliable. This new approach has been validated with several DNN models on two different computer architectures. The results show that both the data processing rate and the power savings of the applications increase substantially with respect to standard AI solutions.
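Klepsydra's actual threading model is proprietary and far more elaborate than what fits here; the following is only a minimal sketch of the general pipelining idea it builds on, with toy stand-in functions in place of real DNN layers. By running consecutive layers in separate threads connected by a bounded queue, layer 1 of frame N can overlap with layer 2 of frame N-1, raising throughput without changing the per-frame computation.

```python
import queue
import threading

# Toy stand-ins for two stages of a neural network. In a real engine
# these would be dense or convolutional layers.
def layer1(x):
    return [v * 2.0 for v in x]

def layer2(x):
    return sum(x)

def pipeline(frames):
    """Run layer1 and layer2 as overlapping pipeline stages."""
    q = queue.Queue(maxsize=4)   # bounded queue decouples the stages
    results = []

    def stage1():
        for f in frames:
            q.put(layer1(f))
        q.put(None)              # sentinel: no more frames

    def stage2():
        while True:
            item = q.get()
            if item is None:
                break
            results.append(layer2(item))

    t1 = threading.Thread(target=stage1)
    t2 = threading.Thread(target=stage2)
    t1.start(); t2.start()
    t1.join(); t2.join()
    return results
```

Parallelization, the second technique mentioned above, would additionally split the work *within* a single layer across threads; the two techniques are complementary.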
On-board AI for Earth Observation
The amount of data produced by a multi-spectral camera prevents real-time data transfer to the ground due to the limitations of downlink speeds, thus requiring large on-board data storage. Several high-level solutions have been proposed to improve this, including the use of on-board artificial intelligence to filter out irrelevant or low-quality data and send only a relevant subset to the ground.
In a different field, vision-based navigation, data processing combined with AI algorithms is also a challenge. One example is rendezvous with uncooperative objects in space, e.g., debris removal. Another is autonomous pinpoint planetary landing, where the number of sensors and the complexity of the Guidance, Navigation and Control algorithms make this discipline still one of the biggest challenges in space. These two use cases share a well-known fact in control engineering: for optimal control algorithms, the higher the rate of sensor data, the better the performance of the algorithm. Moreover, communication limitations may severely delay the generation of alarms raised upon detection of critical events. With Klepsydra, data can be analysed on-board in real time and alerts can be communicated immediately to the ground segment.
Inference in Artificial Intelligence
There are several components to artificial intelligence. First, there is the design and training of the model. This activity is usually carried out by data scientists for a specific field of interest. Once the model is designed and trained, it can be deployed to the target computer for real-time execution; this step is called inference.
Inference consists of two parts: the trained model and the AI inference engine to execute the model. The focus of this research has been solely on the inference engine software implementation.
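The split between the two parts can be sketched as follows. The weights below are illustrative numbers, not a real trained network; in practice the "trained model" is a file produced offline by data scientists, and the "inference engine" is the software under study here, which simply executes it layer by layer.

```python
# Part 1: the trained model -- just numbers, produced offline.
MODEL = [
    [[1.0, -1.0], [0.5, 0.5]],   # layer 1 weights (2x2)
    [[1.0, 1.0]],                # layer 2 weights (1x2)
]

# Part 2: the inference engine -- code that executes the model.
def relu(v):
    return [max(0.0, x) for x in v]

def matvec(w, x):
    """Multiply a weight matrix by an input vector."""
    return [sum(wij * xj for wij, xj in zip(row, x)) for row in w]

def infer(model, x):
    """Apply each layer of the trained model to one input sample."""
    for w in model:
        x = relu(matvec(w, x))
    return x
```

The research described here touches only the second part: the same trained weights can be executed by different inference engines with very different speed and power characteristics.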
Figure 1. AI main components
Trends in Artificial Intelligence inference acceleration
The most common operation in AI inference is, by far, matrix multiplication. These operations are repeated constantly for each input to the AI model. In recent years, there has been substantial development in this area, with both industry and academia progressing steadily. While the current trend is to focus on hardware acceleration such as Graphics Processing Units (GPUs) and Field-Programmable Gate Arrays (FPGAs), these techniques are not yet broadly available to the space industry due to radiation issues and excessive energy consumption for the former, and programming costs for the latter.
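To make the cost concrete, here is a naive dense matrix multiply. It is not how any production inference engine implements the operation, only an illustration of why it dominates: the inner loop performs n·k·m multiply-accumulates, and this is repeated for every layer of the network and every input frame.

```python
def matmul(a, b):
    """Naive dense matrix multiply: c[i][j] = sum over p of a[i][p] * b[p][j]."""
    n, k, m = len(a), len(b), len(b[0])
    return [[sum(a[i][p] * b[p][j] for p in range(k)) for j in range(m)]
            for i in range(n)]
```

Accelerators (GPUs, FPGAs) and vectorized CPU+FPU implementations all target exactly this kernel; they differ in how the multiply-accumulates are scheduled across hardware units.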
The use of CPUs for inference, however, has also undergone an important evolution, taking advantage of the modern floating-point units (FPUs) attached to the CPU. CPUs are widely used in space due to their extensive flight heritage and ease of programming. Several AI inference engines are available for CPU+FPU setups. The results of extensive research into building a new AI inference engine show that it both reduces power consumption and increases data throughput.[…]
Read more: www.klepsydra.com