Embedded processor technology for vision-based space programs

Designing electronic systems for space applications is a huge challenge. But this article shows that even modern standard processors can be deployed in this kind of application and significantly simplify the task.


This article is contributed by Unibap and AMD.


The selection process for technology intended for space programs is constrained and dominated by the requirement to operate in extremely harsh environments. The demands associated with acceleration, shock and vibration, the ability to withstand large variations in air pressure and heat, and a tolerance to radiation often point towards solutions developed specifically for such extremes. But that isn't always necessary; a recent example has put a standard processor from the AMD Embedded G-Series family into orbit, powering a vision-based system that performs analytical tasks using deep learning technology. With such impressive credentials, the same technology is clearly applicable to any Earth-bound application.

The continuous improvement of vision systems has led to outward-looking programs such as the Space Situational Awareness (SSA) Program, and satellite-based, earthward-looking missions such as Earth Observation; both rely heavily on real-time vision data from satellites to monitor the space around us and our own fragile atmosphere. The SSA Program has the unique challenge of identifying hazardous objects that could threaten equipment and infrastructure both in orbit and on the ground. This includes monitoring the Sun and the solar wind and their effects on the Earth's magnetosphere, ionosphere and thermosphere, which can disrupt space-borne and ground-based infrastructure and endanger human life or health. It is also responsible for observing near-Earth objects such as asteroids and comets, as well as active and inactive satellites that could potentially impact the Earth.

The challenge here is the huge amount of bandwidth needed to transfer data in the highest resolution from satellites in orbit to the radar observation stations on the ground that are used to analyze it. To illustrate this, current high-end vision systems used in orbiting satellites feature CCD and CMOS sensors producing colour images of 25 megapixels at video rates; a single uncompressed image represents 75 MByte of data. With up to 30 images taken every second, transmitting the raw stream would require a bandwidth of around 18 Gbit/s. But the bandwidth of a link between a nanosatellite and an observation station on the ground is currently around 50 Mbit/s; a huge shortfall. Additionally, for applications in deeper space, latency becomes a major problem if they need to be controlled from Earth. The solution here is autonomous intelligence, allowing satellites and vehicles to self-navigate and perform in-situ cloud computing with data mining, extraction and indexing. Engineers are now developing technologies that can pre-process and analyze the massive amounts of raw data at source, alongside the vision sensors. This reduces the transmissions to only the most relevant data, instead of huge streams of raw image data.
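The arithmetic behind that shortfall can be sketched in a few lines of Python. The figures used below are simply the ones quoted above (25-megapixel colour frames, 30 fps, a roughly 50 Mbit/s downlink), not measured values from any particular mission.

```python
# Back-of-the-envelope check of the downlink shortfall described above.
BYTES_PER_PIXEL = 3          # 24-bit colour, uncompressed
PIXELS = 25_000_000          # 25-megapixel sensor
FRAME_RATE = 30              # frames per second
DOWNLINK_BPS = 50e6          # ~50 Mbit/s nanosatellite downlink (assumed)

frame_bytes = PIXELS * BYTES_PER_PIXEL            # 75 MByte per image
raw_bps = frame_bytes * 8 * FRAME_RATE            # ~18 Gbit/s of raw video

print(f"Frame size:    {frame_bytes / 1e6:.0f} MByte")
print(f"Raw data rate: {raw_bps / 1e9:.0f} Gbit/s")
print(f"Shortfall:     {raw_bps / DOWNLINK_BPS:.0f}x over the downlink")
# The raw stream is roughly 360 times what the link can carry,
# which is why data must be reduced on board before transmission.
```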

This creates a need for smart cameras and sensors able to support massively parallel processing of data, coupled with the execution of deep learning algorithms. Massively parallel processing is needed to accelerate the handling of data from any kind of sensor, from high-resolution 25-megapixel CMOS sensors to radar data streams. Conventional CPUs maintain high performance on complex instructions that operate on a single piece of data at a time. Image processing, however, calls for data parallelism: a single instruction operating on multiple data elements at the same time. Massively multicore architectures, such as General-Purpose Graphics Processing Units (GPGPUs), are used to accelerate processing throughput while at the same time lowering overall system power. Massively parallel processing is also an enabling technology for the deep learning algorithms used in machine intelligence.
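As a purely illustrative sketch of the scalar-versus-parallel distinction, the NumPy fragment below applies the same brightness threshold to a synthetic frame once pixel by pixel and once as a single vectorized, data-parallel operation. The frame size and threshold are arbitrary values chosen only to make the point.

```python
import numpy as np

# Synthetic 8-bit greyscale "sensor frame" (arbitrary test data).
frame = np.random.randint(0, 256, size=(512, 512), dtype=np.uint8)
THRESHOLD = 128

def threshold_scalar(img, level):
    """Scalar model: one instruction works on one pixel at a time."""
    out = np.empty_like(img)
    for y in range(img.shape[0]):
        for x in range(img.shape[1]):
            out[y, x] = 255 if img[y, x] > level else 0
    return out

def threshold_parallel(img, level):
    """Data-parallel model: one operation is applied to every pixel at once,
    the pattern that SIMD units and GPUs accelerate."""
    return np.where(img > level, 255, 0).astype(img.dtype)

# Both produce identical results; only the execution model differs.
assert np.array_equal(threshold_scalar(frame, THRESHOLD),
                      threshold_parallel(frame, THRESHOLD))
```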

Figure 1. Unibap uses the AMD G-Series as the intelligent computing core for its space applications. The CPU provides high computing power, high reliability and extremely high radiation resistance.

 

Deep learning is required for higher levels of abstraction, which allow decisions to be made more naturally than with simple 'If, Then, Else' rules. Deep learning enables a computer to better identify objects based on experience, drawing on hundreds or thousands of correct examples. Using deep learning, a machine can better differentiate between images of objects and the objects themselves. For example, using deep learning on a Mars mission, the equipment was able to understand that a rock with all the visual elements of a face could not, in fact, be a face. This human-like intelligence makes machines better able to make decisions, at least with respect to specific and well-defined tasks.
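As a minimal sketch of how such a decision is made, the fragment below shows only the inference step of a tiny two-layer network. It is not Unibap's software, and the weights here are random placeholders; in a real system they would be learned from thousands of labelled example images rather than written as hand-coded rules.

```python
import numpy as np

def relu(x):
    return np.maximum(x, 0.0)

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

# Hypothetical weights: in practice these are learned during training
# from many labelled examples, not chosen by hand.
rng = np.random.default_rng(0)
W1, b1 = rng.normal(size=(64, 16)), np.zeros(16)   # 64 input features -> 16 hidden units
W2, b2 = rng.normal(size=(16, 3)),  np.zeros(3)    # 3 illustrative classes

def classify(features):
    """Forward pass: feature vector -> probability for each class."""
    hidden = relu(features @ W1 + b1)
    return softmax(hidden @ W2 + b2)

scores = classify(rng.normal(size=64))
print(scores, "-> predicted class", scores.argmax())
```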

To address these demands, Unibap has developed a platform that complies with the highest NASA Technology Readiness Level, TRL-9. Employing machine learning algorithms for processing, indexing and storing data, it is built on a lightweight Ubuntu 16.04 LTS Linux operating system that has been optimized for applications such as vision processing, robot control, point-cloud handling, deep neural networks and scientific operations. It supports high-level interpreted languages including Octave and Python 3, design and simulation frameworks such as MATLAB and Simulink, and relational databases including MySQL and SQLite. The fault-tolerant system offers ECC memory error correction, 6 TByte of local storage over native SATA V3 ports (expandable with PCIe RAID controllers supporting RAID 1/5/10) and 100 GFLOPS of heterogeneous computing performance. The platform combines a multicore CPU and GPU with advanced FPGA technology, making it ideal for running deep learning algorithms, and it has already been deployed in a space information processing solution. The software provided for the platform is based on the Unibap Deep Delphi software stack, a cross-platform solution able to support x86, ARM Cortex-M3 and FPGA state machines.
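The Python sketch below illustrates the general idea of on-board processing and indexing: score each tile of a frame, keep only compact metadata in a local SQLite database, and flag interesting tiles for downlink. The table layout and the "interest" metric are invented for illustration and are not part of Unibap's Deep Delphi stack.

```python
import sqlite3
import numpy as np

# Hypothetical on-board index; a real system would keep this on local storage
# rather than in memory, and use a proper detector instead of a std() score.
db = sqlite3.connect(":memory:")
db.execute("""CREATE TABLE observations (
                  frame_id INTEGER,
                  tile_x   INTEGER,
                  tile_y   INTEGER,
                  interest REAL,
                  downlink INTEGER)""")

def index_frame(frame_id, frame, tile=512, threshold=25.0):
    """Split a frame into tiles, score each one, and record only metadata."""
    for y in range(0, frame.shape[0], tile):
        for x in range(0, frame.shape[1], tile):
            patch = frame[y:y + tile, x:x + tile]
            interest = float(patch.std())      # crude stand-in for a real detector
            db.execute("INSERT INTO observations VALUES (?,?,?,?,?)",
                       (frame_id, x, y, interest, int(interest > threshold)))
    db.commit()

index_frame(1, np.random.randint(0, 256, size=(2048, 2048), dtype=np.uint8))
flagged = db.execute("SELECT COUNT(*) FROM observations WHERE downlink=1").fetchone()[0]
print(f"{flagged} tiles flagged for downlink")
```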

Satellites with this kind of capability can enable many different mission scenarios, for instance accurate situational awareness for rapid distribution of information to warfighters; not fast enough to provide real-time data to fighter planes, but able to deliver accurate information about the bombardment of buildings, or strategic information about the movement of ballistic missiles, with a resolution of seconds. This makes it instrumental in a combat situation, allowing operatives to follow the movement of resources almost in real time. The same technology is used in bio-informatics, in-situ bio-analytics and bio-photonic processing. It is also being applied to autonomous vehicle operations on Mars, as well as interplanetary exploration. On Earth, there is a growing number of application areas for such technology, ranging from autonomous vehicles to remote video surveillance and even human-assist applications.


As the central processing core, Unibap selected technology from AMD, with good reason. First and foremost, it offers a combination of CPU and GPU processing that has already made it a preferred choice for many vision-based applications. AMD is also a leader in heterogeneous system architectures, which allow each system block to be used for the work it does best in order to deliver more performance at lower power. These attributes are the perfect foundation for vision-based space programs, where the available power is limited. When Unibap started evaluating the AMD Embedded G-Series processors for space-based customer programs, it discovered that the AMD technology excelled in another significant area: resistance to radiation. This is becoming an important attribute not only for space programs but for any Earth-based application that must preserve the highest level of data integrity, including any application where human life could be at risk if a Single Event Upset (SEU), caused by radiation originating in space, led to lost data. Guaranteed data integrity is one of the most important preconditions for meeting the highest reliability and safety standards. Every calculation and autonomous decision depends on reliable data, so data stored in RAM must be protected against corruption to keep the instructions executed by the CPU/GPU from being corrupted. Even so, SEUs can still lead to errors. They are caused by background neutron radiation, which is always present: high-energy particles from the Sun and deep space hit the upper atmosphere of the Earth, generating a flood of secondary isotropic neutrons with enough energy to reach ground and sea level.

Figure 2. Susceptibility of common electronics to background neutron radiation, expressed as the Single Event Rate (upsets per device-hour). To allow different technologies to be compared, the SER values have been normalized to a size of 1 GByte for each relevant technology.

 

For commonly used electronics, the Single Event probability at sea level is between 10⁻⁸ and 10⁻² upsets per device-hour. At the upper end of that range, a Single Event that could corrupt data and jeopardize functionality may occur once in every 100 hours of operation. It is here that AMD G-Series SoCs excel, by providing a very high level of radiation resistance (and therefore safety). Tests performed by the NASA Goddard Space Flight Center have shown that AMD G-Series SoCs can tolerate a total ionizing radiation dose of 17 Mrad(Si). This surpasses current requirements by far: a dose of 400 rad in a week is lethal to humans, components in standard space programs are usually required to withstand 300 krad, and even a space mission to Jupiter would only require a resistance of 1 Mrad. In addition, AMD supports advanced error-correcting memory (ECC RAM), which is used to detect and correct errors caused by Single Events. A Jupiter mission would, however, require the software to be small enough to run from the internal L2 cache, as there are no known DDR memories that can withstand such massive radiation.
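To show in principle how ECC corrects the single-bit flips an SEU can cause, here is a textbook Hamming(7,4) code sketched in Python. Real ECC RAM uses wider SECDED codes implemented in hardware, so this is only a simplified model of the idea.

```python
def hamming74_encode(d):
    """Encode 4 data bits (list of 0/1) into a 7-bit Hamming codeword."""
    d1, d2, d3, d4 = d
    p1 = d1 ^ d2 ^ d4
    p2 = d1 ^ d3 ^ d4
    p3 = d2 ^ d3 ^ d4
    # Codeword layout: p1 p2 d1 p3 d2 d3 d4  (positions 1..7)
    return [p1, p2, d1, p3, d2, d3, d4]

def hamming74_decode(c):
    """Return the 4 data bits, correcting any single-bit error in the codeword."""
    s1 = c[0] ^ c[2] ^ c[4] ^ c[6]          # checks positions 1, 3, 5, 7
    s2 = c[1] ^ c[2] ^ c[5] ^ c[6]          # checks positions 2, 3, 6, 7
    s3 = c[3] ^ c[4] ^ c[5] ^ c[6]          # checks positions 4, 5, 6, 7
    syndrome = s1 + 2 * s2 + 4 * s3         # position of the flipped bit, 0 = none
    if syndrome:
        c = list(c)
        c[syndrome - 1] ^= 1                # correct the single upset bit
    return [c[2], c[4], c[5], c[6]]

data = [1, 0, 1, 1]
word = hamming74_encode(data)
word[5] ^= 1                                # simulate a Single Event Upset
assert hamming74_decode(word) == data       # the flipped bit is found and corrected
```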

