Embedded processor technology for vision-based space programs

Designing electronic systems for space applications is a huge challenge. But as this article shows, even modern standard processors can be deployed in this kind of application and significantly simplify the task.


This article is contributed by Unibap and AMD


The selection process for technology intended for space programs is dominated by the requirement to operate in harsh environments. The demands associated with acceleration, shock and vibration, the ability to withstand large variations in pressure and temperature, and a tolerance to radiation often point towards solutions developed specifically for such extremes. But that isn't always necessary: a recent example has put a standard processor from the AMD Embedded G-Series family into orbit, powering a vision-based system that performs analytical tasks using deep learning technology. With such credentials, the same technology is clearly applicable to demanding Earth-bound applications.

The continuous improvement of vision systems has led to outward-looking programs like the Space Situational Awareness (SSA) Program, and satellite-based, earthward-looking missions such as Earth Observation; both rely heavily on real-time vision data from satellites to monitor the space around us and our own fragile atmosphere. The SSA Program has the unique challenge of identifying hazardous objects that could threaten equipment and infrastructure both in orbit and on the ground. This includes monitoring the Sun and the solar wind and their effects on the magnetosphere, ionosphere and thermosphere of the Earth, which can affect space-borne and ground-based infrastructure and endanger human life or health. It is also responsible for observing near-Earth objects such as asteroids and comets, as well as active and inactive satellites that could potentially impact the Earth.

The challenge here is the huge bandwidth needed to transfer data at the highest resolution from satellites in orbit to the radar observation stations on the ground that are used to analyze it. To illustrate this, current high-end vision systems in orbiting satellites feature CCD and CMOS sensors producing 25-megapixel colour images at video rates; a single uncompressed image represents 75 Mbyte of data, and at up to 30 images per second the raw stream would require a bandwidth of 18 Gbit/s to transmit. But the link between a nanosatellite and a ground station currently offers around 50 Mbit/s; a huge shortfall. Additionally, for applications in deeper space, latency becomes a major problem if they need to be controlled from Earth. The solution is autonomous intelligence, allowing satellites and vehicles to self-navigate and perform in-situ cloud computing with data mining, extraction and indexing. Engineers are now developing technologies that can pre-process and analyze the massive amounts of raw data at source, alongside the vision sensors, reducing transmissions to only the most relevant results instead of huge streams of raw image data.
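The arithmetic behind this shortfall can be checked in a few lines of Python. The figures are those quoted above; the 3-bytes-per-pixel assumption for an uncompressed colour image is ours:

```python
# Back-of-the-envelope check of the downlink shortfall described above.
MEGAPIXELS = 25e6          # pixels per frame (25-megapixel sensor)
BYTES_PER_PIXEL = 3        # assumed 8-bit RGB, uncompressed
FPS = 30                   # frames per second at video rates
DOWNLINK_BPS = 50e6        # ~50 Mbit/s nanosatellite-to-ground link

frame_bytes = MEGAPIXELS * BYTES_PER_PIXEL   # bytes per raw frame
stream_bps = frame_bytes * 8 * FPS           # raw video stream in bit/s
shortfall = stream_bps / DOWNLINK_BPS        # how far the link falls short

print(f"Raw frame size: {frame_bytes / 1e6:.0f} MByte")
print(f"Raw video stream: {stream_bps / 1e9:.0f} Gbit/s")
print(f"Downlink deficit factor: {shortfall:.0f}x")
```

The raw stream exceeds the available downlink by a factor of several hundred, which is why onboard pre-processing is the only practical option.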

This creates a need for smart cameras and sensors able to support parallel processing of data on a massive scale, coupled with the execution of deep learning algorithms. Massively parallel processing is needed to accelerate the handling of data from any kind of sensor, from 25-megapixel high-resolution CMOS imagers to radar data streams. Conventional CPUs deliver high performance when complex instructions operate on a single piece of data at a time, but image processing calls for parallelism: a single instruction operating on multiple data elements simultaneously. Multicore architectures such as general-purpose graphics processing units (GPGPUs) are used to accelerate processing throughput while lowering overall system power. Massively parallel processing is also an enabling technology for the deep learning algorithms behind machine intelligence.
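The contrast between scalar and data-parallel processing can be illustrated with a toy Python example, using NumPy vectorization as a stand-in for a GPU kernel. The threshold operation is purely illustrative, not an algorithm from any of the systems described here:

```python
import numpy as np

# A small synthetic greyscale frame stands in for sensor data.
rng = np.random.default_rng(0)
frame = rng.integers(0, 256, size=(64, 64), dtype=np.uint8)

def threshold_scalar(img, level=128):
    """Conventional-CPU style: one pixel per instruction."""
    out = np.zeros_like(img)
    for y in range(img.shape[0]):
        for x in range(img.shape[1]):
            out[y, x] = 255 if img[y, x] >= level else 0
    return out

def threshold_simd(img, level=128):
    """Data-parallel style: one vectorized expression over the whole frame."""
    return np.where(img >= level, 255, 0).astype(np.uint8)

# Both produce identical results; the data-parallel form is orders of
# magnitude faster on large frames because one operation touches all pixels.
print(np.array_equal(threshold_scalar(frame), threshold_simd(frame)))
```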

Figure 1. Unibap uses the AMD G-Series as the intelligent computing core for its space applications. The CPU provides high computing power, high reliability and extremely high radiation resistance.

 

Deep learning is required for higher levels of abstraction, which allow decisions to be made more naturally than with simple 'if-then-else' logic. Deep learning enables a computer to identify objects based on experience, drawing on hundreds or thousands of correct examples. Using deep learning, a machine can better differentiate between images of objects and the objects themselves. For example, using deep learning on a Mars mission, the equipment was able to understand that a rock with all the elements of a face could not, in fact, be a face. This human-like intelligence makes machines better able to make decisions, at least for specific and well-defined tasks.

To address these demands, Unibap has developed a platform which complies with the highest NASA Technology Readiness Level, TRL-9. Employing machine learning algorithms for processing, indexing and storing data, it is built on a lightweight Ubuntu 16.04 LTS Linux operating system optimized for applications such as vision processing, robot control, point-cloud handling, deep neural networks and scientific operations. It supports high-level interpreted languages including Octave and Python 3, design and simulation frameworks such as MATLAB and Simulink, and relational databases including MySQL and SQLite. The fault-tolerant design with ECC memory offers 6 TByte of local storage over native SATA 3 ports, expandable with PCIe RAID controllers supporting RAID 1/5/10, and delivers 100 GFlops of heterogeneous computing performance. The platform combines a multicore CPU and GPU with advanced FPGA technology, making it ideal for running deep learning algorithms, and has already been deployed in a space information processing solution. The software provided for the platform is based on the Unibap Deep Delphi software stack, a cross-platform solution supporting x86, ARM Cortex-M3 and FPGA state machines.
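As a hypothetical illustration of the kind of onboard indexing the platform's Python 3 and SQLite support enables, the sketch below stores compact per-frame detection metadata instead of raw frames, so the downlink carries kilobytes of records rather than megabytes of pixels. The schema, field names and values are invented for this example, not Unibap's actual interface:

```python
import sqlite3

# In-memory database for the sketch; on the real platform this would live
# on the SATA-attached local storage.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE detections (
        frame_id   INTEGER,
        utc_time   TEXT,
        label      TEXT,
        confidence REAL
    )
""")

# Hypothetical results produced by an onboard analysis stage.
detections = [
    (1042, "2018-04-01T12:00:00Z", "debris", 0.93),
    (1043, "2018-04-01T12:00:01Z", "satellite", 0.87),
]
conn.executemany("INSERT INTO detections VALUES (?, ?, ?, ?)", detections)

# Downlink only the high-confidence records instead of raw 75 MByte frames.
rows = conn.execute(
    "SELECT frame_id, label FROM detections WHERE confidence > 0.9"
).fetchall()
print(rows)
```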

Satellites with this kind of capability can enable many different mission scenarios, for instance accurate situational awareness for rapid distribution of information to war fighters; not fast enough to provide real-time data to fighter planes, but able to deliver accurate information about the bombardment of buildings, or strategic information about the movement of ballistic missiles with a resolution of seconds. This makes it instrumental in a combat situation, allowing operatives to follow the movement of resources in real-time. The same technology is used in bio-informatics, in-situ bio-analytics and bio-photonic processing. It is also being applied in autonomous vehicle operations on Mars, as well as interplanetary exploration. On Earth, there are a growing number of application areas for such technology, ranging from autonomous vehicles to remote video surveillance and even human-assist applications.

Figure 2. Susceptibility of common electronics to background neutron radiation, expressed as the Single Event Rate (upsets per device-hour). To allow different technologies to be compared, the SER values have been normalized to a size of 1 GByte for each technology.

 

As the central processing core, Unibap selected technology from AMD, with good reason. First and foremost, it offers a combination of CPU and GPU processing that has already made it a preferred choice for many vision-based applications. AMD is also a leader in heterogeneous system architectures, which maximize the contribution of each system block to deliver more performance at lower power. These attributes are the perfect foundation for vision-based space programs, where the available power is limited. When Unibap began evaluating the AMD Embedded G-Series processors for space-based customer programs, it discovered that the AMD technology excelled in another significant area: resistance to radiation. This is becoming an important attribute not only for space programs but for any Earth-based application that must preserve the highest level of data integrity, including any application where human life could be put at risk by a Single Event Upset (SEU), caused by radiation originating in space and leading to lost data. Guaranteed data integrity is one of the most important preconditions for meeting the highest reliability and safety standards. Every calculation and autonomous decision depends on reliable data, so data stored in RAM must be protected against corruption to prevent the CPU/GPU executing corrupted instructions. Even so, SEUs can still lead to errors. They are caused by background neutron radiation, which is always present: high-energy particles from the Sun and deep space hit the upper atmosphere of the Earth, generating a flood of secondary isotropic neutrons with enough energy to reach ground and sea level.
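ECC RAM protects against such upsets by storing extra parity bits alongside each data word, so a single flipped bit can be located and repaired on read-back. The sketch below shows the idea with a textbook Hamming(7,4) code in Python; real ECC memory implements a similar (wider) code in hardware, and this is not AMD's actual implementation:

```python
# Hamming(7,4): 4 data bits protected by 3 parity bits. The parity
# syndrome directly gives the (1-based) position of a single flipped bit.

def encode(d):
    """d: list of 4 data bits -> 7-bit codeword [p1, p2, d1, p3, d2, d3, d4]."""
    d1, d2, d3, d4 = d
    p1 = d1 ^ d2 ^ d4
    p2 = d1 ^ d3 ^ d4
    p3 = d2 ^ d3 ^ d4
    return [p1, p2, d1, p3, d2, d3, d4]

def correct(c):
    """Locate and repair a single bit error, then return the 4 data bits."""
    c = list(c)
    s1 = c[0] ^ c[2] ^ c[4] ^ c[6]   # parity over positions 1, 3, 5, 7
    s2 = c[1] ^ c[2] ^ c[5] ^ c[6]   # parity over positions 2, 3, 6, 7
    s3 = c[3] ^ c[4] ^ c[5] ^ c[6]   # parity over positions 4, 5, 6, 7
    syndrome = s1 + 2 * s2 + 4 * s3  # 0 = clean, else bit position
    if syndrome:
        c[syndrome - 1] ^= 1         # flip the corrupted bit back
    return [c[2], c[4], c[5], c[6]]  # extract d1..d4

word = [1, 0, 1, 1]
codeword = encode(word)
codeword[4] ^= 1                     # simulate an SEU flipping one bit
print(correct(codeword) == word)     # the upset is detected and repaired
```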

Figure 3. Susceptibility of common electronics to the background neutron radiation cross-section Single Event Ratio (Upset/device*hour). In order to compare different technologies, the SER values have been normalized to a size of 1 GByte for each relevant technology.

 

The Single Event probability at sea level is between 10⁻⁸ and 10⁻² upsets per device-hour for commonly used electronics. At the upper end, that means one Single Event every 100 hours could potentially corrupt data and jeopardize functionality. It is here that AMD G-Series SoCs excel, by providing the highest level of radiation resistance (and therefore safety). Tests performed by NASA Goddard Space Flight Center have shown that the AMD G-Series SoCs can tolerate a total ionizing radiation dose of 17 Mrad(Si). This surpasses, by far, the current maximum requirements: a dose of 400 rad in a week is lethal to humans, components in standard space programs are usually required to withstand 300 krad, and even a space mission to Jupiter would only require a resistance of 1 Mrad. In addition, AMD supports advanced error-correcting memory (ECC RAM), which detects and corrects the errors caused by Single Events. A Jupiter mission would, however, require the software to be small enough to run from the internal L2 cache, as there are no known DDR memories that can withstand such massive radiation.
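A quick sanity check of the upset rates quoted above, with a one-year mission duration added as our own assumption for illustration:

```python
# The article's sea-level Single Event Rate bounds for common electronics.
worst_case_rate = 1e-2   # upsets per device-hour (upper bound)
best_case_rate = 1e-8    # upsets per device-hour (lower bound)

# Mean time between upsets at the worst case.
hours_between_upsets = 1 / worst_case_rate

# Expected upset counts over a hypothetical one-year mission.
mission_hours = 365 * 24
expected_worst = worst_case_rate * mission_hours
expected_best = best_case_rate * mission_hours

print(f"Worst case: one upset every {hours_between_upsets:.0f} hours")
print(f"Expected upsets per year: {expected_best:.2e} to {expected_worst:.1f}")
```

Even a single uncorrected upset per year is unacceptable for safety-critical systems, which is why ECC protection matters well beyond space programs.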

