Embedded processor technology for vision-based space programs

Designing electronic systems for space applications is a huge challenge. But this article shows that even modern standard processors can be deployed in this kind of application and significantly simplify the task.


This article is contributed by Unibap and AMD


The selection process for technology intended for space programs is constrained and dominated by the requirement to operate in such harsh environments. The demands associated with acceleration, shock and vibration, the ability to withstand large variations in pressure and temperature, and a tolerance to radiation often point towards solutions developed specifically for such extremes. But such specialized solutions aren't always necessary; a recent example has put a standard processor from the AMD Embedded G-Series family into orbit, powering a vision-based system that performs analytical tasks using deep learning technology. With such impressive credentials, the same technology is clearly applicable to Earth-bound applications as well.

The continuous improvement of vision systems has led to outward-looking programs like the Space Situational Awareness (SSA) Program, and satellite-based, earthward-looking missions such as Earth Observation; both rely heavily on real-time vision data from satellites to monitor the space around us and our own fragile atmosphere. The SSA Program has the unique challenge of identifying hazardous objects that could threaten equipment and infrastructure both in orbit and on the ground. This includes monitoring the Sun and the solar wind and their effects on the magnetosphere, ionosphere and thermosphere of the Earth, which can disrupt space-borne and ground-based infrastructure and endanger human life or health. It is also responsible for observing near-Earth objects such as asteroids and comets, as well as active and inactive satellites, that could potentially impact the Earth.

The challenge here is the huge bandwidth needed to transfer data at the highest resolution from satellites in orbit to the observation stations on the ground where it is analyzed. To illustrate this, current high-end vision systems used in orbiting satellites feature CCD and CMOS sensors producing colour images of 25 megapixels at video rates; a single uncompressed image represents 75 MByte of data, and with up to 30 images taken every second, transmitting the raw stream would require a bandwidth of around 18 Gbit/s. Yet the bandwidth of a link between a nanosatellite and an observation station on the ground is currently around 50 Mbit/s; a huge shortfall. Additionally, for applications in deeper space, latency becomes a major problem if they need to be controlled from Earth. The solution here is to use autonomous intelligence, allowing satellites and vehicles to self-navigate and to perform in-situ cloud computing with data mining, extraction and indexing. Engineers are now developing technologies that can pre-process and analyze the massive amounts of raw data at source, alongside the vision sensors, so that only the most relevant results are transmitted instead of huge streams of raw image data.
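
To make the shortfall concrete, the arithmetic can be written out directly. The following minimal Python sketch simply restates the sensor and downlink figures quoted above; the numbers are illustrative rather than the specification of any particular satellite.

# Rough downlink budget for a 25-megapixel colour sensor at 30 fps,
# using the figures quoted in the text (illustrative only).
MEGAPIXELS = 25e6          # pixels per frame
BYTES_PER_PIXEL = 3        # 24-bit colour, uncompressed
FPS = 30                   # frames per second
DOWNLINK_BPS = 50e6        # ~50 Mbit/s nanosatellite downlink

frame_bytes = MEGAPIXELS * BYTES_PER_PIXEL    # ~75 MByte per frame
raw_rate_bps = frame_bytes * 8 * FPS          # ~18 Gbit/s of raw data
shortfall = raw_rate_bps / DOWNLINK_BPS       # how many times too much

print(f"Frame size:    {frame_bytes / 1e6:.0f} MByte")
print(f"Raw data rate: {raw_rate_bps / 1e9:.0f} Gbit/s")
print(f"The raw stream is {shortfall:.0f}x the available downlink bandwidth")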

This creates a need for smart cameras and sensors able to support parallel processing of data on a massive scale, coupled with the execution of deep learning algorithms. Massively parallel processing is needed to accelerate the processing of data from any kind of sensor, from high-resolution 25-megapixel CMOS sensors to radar data streams. Conventional CPUs deliver high performance when complex instructions operate on a single piece of data at a time, but image processing calls for data parallelism: a single instruction operating on many data elements at the same time. Many-core architectures such as general-purpose graphics processing units (GPGPUs) are therefore used to accelerate processing throughput while lowering overall system power. Massively parallel processing is also an enabling technology for the deep learning algorithms used in machine intelligence.
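
The difference between scalar and data-parallel execution can be made tangible with a small, purely illustrative Python/NumPy sketch: the same per-pixel threshold is written once as a scalar loop and once as a vectorised operation, the form that maps naturally onto the thousands of parallel threads of a GPGPU (NumPy stands in here for a real GPU framework).

# Illustrative only: one per-pixel operation expressed as a scalar loop and
# as a single vectorised (data-parallel) operation over the whole frame.
import time
import numpy as np

frame = np.random.randint(0, 256, size=(512, 512), dtype=np.uint16)

def threshold_loop(img, level=128):
    out = np.zeros_like(img)
    for y in range(img.shape[0]):
        for x in range(img.shape[1]):          # one pixel per iteration
            out[y, x] = 255 if img[y, x] > level else 0
    return out

def threshold_vectorised(img, level=128):
    return np.where(img > level, 255, 0).astype(img.dtype)   # all pixels at once

t0 = time.perf_counter(); a = threshold_loop(frame); t1 = time.perf_counter()
b = threshold_vectorised(frame); t2 = time.perf_counter()
assert np.array_equal(a, b)                    # same result, very different cost
print(f"scalar loop: {t1 - t0:.3f} s, vectorised: {t2 - t1:.3f} s")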

Figure 1. Unibap uses the AMD G-Series as the intelligent computing core for its space applications. The CPU provides high computing power, high reliability and extremely high radiation resistance.

 

Deep learning is required for higher levels of abstraction, allowing decisions to be made more naturally than with a simple 'if, then, else' structure. Deep learning enables a computer to identify objects based on experience, drawing on hundreds or thousands of correctly labelled examples. Using it, a machine can better differentiate between something that merely looks like an object and the object itself. For example, using deep learning on a Mars mission, the equipment was able to understand that a rock with all the elements of a face could not, in fact, be a face. This human-like intelligence makes machines better able to make decisions, at least with respect to specific and well-defined tasks.
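
The contrast between a hand-coded 'if, then, else' rule and a decision learned from many labelled examples can be sketched in a few lines of Python. The data below is synthetic, and the tiny logistic-regression model is only a stand-in for the deep networks used in practice.

# Toy contrast between a brittle hand-written rule and a classifier that
# learns its decision boundary from labelled examples (synthetic data only).
import numpy as np

rng = np.random.default_rng(0)
n = 500
x0 = rng.normal(loc=[-1.0, -1.0], scale=0.7, size=(n, 2))   # class 0 samples
x1 = rng.normal(loc=[1.0, 1.0], scale=0.7, size=(n, 2))     # class 1 samples
X = np.vstack([x0, x1])
y = np.concatenate([np.zeros(n), np.ones(n)])

def rule_based(sample):
    # hand-coded rule: "if both features are positive, call it class 1"
    return 1 if (sample[0] > 0 and sample[1] > 0) else 0

# Learned alternative: logistic regression fitted by gradient descent
w, b = np.zeros(2), 0.0
for _ in range(2000):
    p = 1.0 / (1.0 + np.exp(-(X @ w + b)))     # predicted probabilities
    w -= 0.5 * (X.T @ (p - y)) / len(y)
    b -= 0.5 * np.mean(p - y)

learned = (1.0 / (1.0 + np.exp(-(X @ w + b))) > 0.5).astype(int)
rules = np.array([rule_based(s) for s in X])
print("rule-based accuracy:", np.mean(rules == y))
print("learned accuracy:   ", np.mean(learned == y))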

To address these demands, Unibap has developed a platform that complies with the highest NASA Technology Readiness Level, TRL-9. Employing machine learning algorithms for processing, indexing and storing data, it is built on a lightweight Ubuntu 16.04 LTS Linux operating system optimized for applications such as vision processing, robot control, point-cloud handling, deep neural networks and scientific operations. It supports high-level interpreted languages including Octave and Python 3, design and simulation frameworks such as MATLAB and Simulink, and relational databases including MySQL and SQLite. The fault-tolerant design uses ECC memory error correction, offers 6 TByte of local storage over native SATA 3 ports (expandable with PCIe RAID controllers supporting RAID 1/5/10) and delivers 100 GFLOPS of heterogeneous computing performance. The platform combines a multicore CPU and GPU with advanced FPGA technology, making it ideal for running deep learning algorithms, and it has already been deployed in a space information processing solution. The software provided for the platform is based on the Unibap Deep Delphi software stack, a cross-platform solution able to support x86, ARM Cortex-M3 and FPGA state machines.
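
As a hypothetical illustration of that processing-indexing-storing workflow (Python 3 and SQLite are both listed above, but the table and column names below are invented for the example), per-frame analysis results could be stored in a local database so that only compact metadata, rather than raw frames, is queued for downlink.

# Hypothetical sketch: indexing on-board analysis results in SQLite so that
# only compact metadata needs downlinking. Schema names are illustrative only.
import sqlite3
from datetime import datetime, timezone

conn = sqlite3.connect("observations.db")
conn.execute("""
    CREATE TABLE IF NOT EXISTS detections (
        id          INTEGER PRIMARY KEY,
        captured_at TEXT NOT NULL,   -- UTC timestamp of the frame
        label       TEXT NOT NULL,   -- classifier output, e.g. 'debris'
        confidence  REAL NOT NULL,   -- classifier confidence, 0..1
        lat         REAL,            -- sub-satellite point, degrees
        lon         REAL
    )
""")

def record_detection(label, confidence, lat=None, lon=None):
    conn.execute(
        "INSERT INTO detections (captured_at, label, confidence, lat, lon) "
        "VALUES (?, ?, ?, ?, ?)",
        (datetime.now(timezone.utc).isoformat(), label, confidence, lat, lon),
    )
    conn.commit()

record_detection("debris_candidate", 0.93, lat=59.3, lon=18.1)

# Only high-confidence rows would be queued for transmission to the ground
rows = conn.execute(
    "SELECT captured_at, label, confidence FROM detections WHERE confidence > 0.9"
).fetchall()
print(rows)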

Satellites with this kind of capability can enable many different mission scenarios, for instance accurate situational awareness for rapid distribution of information to war fighters: not fast enough to provide real-time data to fighter planes, but able to deliver accurate information about the bombardment of buildings, or strategic information about the movement of ballistic missiles with a resolution of seconds. This makes it valuable in a combat situation, allowing operatives to follow the movement of resources in near real-time. The same technology is used in bio-informatics, in-situ bio-analytics and bio-photonic processing. It is also being applied to autonomous vehicle operations on Mars, as well as to interplanetary exploration. On Earth, there is a growing number of application areas for such technology, ranging from autonomous vehicles to remote video surveillance and even human-assist applications.

Figure 2. Susceptibility of common electronics to background neutron radiation, expressed as a Single Event Rate (upsets per device-hour). To compare different technologies, the SER values have been normalized to a capacity of 1 GByte for each technology.

 

As the central processing core, Unibap selected technology from AMD, with good reason. First and foremost, it offers a combination of CPU and GPU processing that has already made it a preferred choice for many vision-based applications. AMD also leads the field in heterogeneous system architectures, which maximize the contribution of each system block to deliver more performance at lower power. These attributes are the perfect foundation for vision-based space programs, where the available power is limited. When Unibap started evaluating the AMD Embedded G-Series processors for space-based customer programs, it discovered that the AMD technology excelled in another significant area: resistance to radiation. This is becoming an important attribute not only for space programs but for any Earth-based application that must preserve the highest level of data integrity, including any application where human life could be at risk if a Single Event Upset (SEU), caused by radiation originating in space, led to lost data. Guaranteed data integrity is one of the most important preconditions for meeting the highest reliability and safety standards. Every calculation and autonomous decision depends on reliable data, so data held in RAM must be protected against corruption to prevent the CPU/GPU executing corrupted instructions. Even so, SEUs can still lead to errors. They are caused by background neutron radiation, which is always present: high-energy particles from the Sun and deep space hit the upper atmosphere of the Earth, generating a flood of secondary isotropic neutrons with enough energy to reach ground and sea level.


The Single Event probability at sea level is between 10⁻⁸ and 10⁻² upsets per device-hour for commonly used electronics. At the upper end of that range, a Single Event could corrupt data once in every 100 hours of operation, jeopardizing functionality. It is here that the AMD G-Series SoCs excel, by providing an exceptionally high level of radiation resistance (and therefore safety). Tests performed by the NASA Goddard Space Flight Center have shown that the AMD G-Series SoCs can tolerate a total ionizing radiation dose of 17 Mrad(Si). This surpasses, by far, current requirements: a dose of 400 rad within a week is lethal to humans, components in standard space programs are usually required to withstand 300 krad, and even a space mission to Jupiter would only require a resistance of 1 Mrad. In addition, AMD supports advanced error-correcting memory (ECC RAM), which is used to detect and correct errors caused by Single Events; a Jupiter mission, however, would require the software code to be small enough to run from the internal L2 cache, as there are no known DDR memories that can withstand such massive radiation.
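
How ECC copes with a single flipped bit can be shown with a toy Hamming(7,4) code in Python. Real ECC DRAM implements wider SECDED codes in hardware, so this is only a sketch of the principle, not a model of the AMD implementation.

# Toy Hamming(7,4) code: correcting the kind of single-bit flip an SEU causes.
def encode(d):
    # Encode 4 data bits [d1, d2, d3, d4] into a 7-bit codeword.
    d1, d2, d3, d4 = d
    p1 = d1 ^ d2 ^ d4                  # parity over positions 1, 3, 5, 7
    p2 = d1 ^ d3 ^ d4                  # parity over positions 2, 3, 6, 7
    p3 = d2 ^ d3 ^ d4                  # parity over positions 4, 5, 6, 7
    return [p1, p2, d1, p3, d2, d3, d4]

def decode(c):
    # Return (corrected data bits, position of the corrected bit or 0).
    s1 = c[0] ^ c[2] ^ c[4] ^ c[6]
    s2 = c[1] ^ c[2] ^ c[5] ^ c[6]
    s3 = c[3] ^ c[4] ^ c[5] ^ c[6]
    error_pos = s3 * 4 + s2 * 2 + s1   # 0 means no single-bit error detected
    if error_pos:
        c = c[:]
        c[error_pos - 1] ^= 1          # flip the corrupted bit back
    return [c[2], c[4], c[5], c[6]], error_pos

data = [1, 0, 1, 1]
word = encode(data)
word[4] ^= 1                           # simulate an SEU flipping bit 5 in memory
recovered, fixed_at = decode(word)
assert recovered == data
print(f"single-bit upset at position {fixed_at} corrected; data intact")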

