Platforms Streamline the ADAS Development Cycle

Along with an industry-wide commitment to increasing the efficiency of automobiles, the promise of autonomous transport is closer to being realised. Systems that provide greater control over every element of an automobile not only offer better mileage, they also deliver greater safety. Indeed, the dream of a self-driving car is no longer science fiction. The trend towards advanced driver assistance systems (ADAS) has seen the adoption of adaptive cruise control, parking assistance, blind-spot detection, and lane-departure warnings.

The integration of such systems is now commonplace in luxury and high-end models, but their deployment has been less widespread in mainstream vehicles, predominantly due to cost. It is more usual to see one or two systems taken as options in mid-range purchases, where features such as reversing sensors or cameras add enough value to warrant the additional cost. Ideally, manufacturers would like to offer all systems at a price that is attractive to all buyers, something that ultimately depends on adoption in higher volumes.

Part of the problem is the relatively bespoke nature of each system. ADAS is still an emerging technology, so each system requires a significant amount of engineering effort, even within the same model family. To reduce the R&D burden, and therefore reduce ADAS costs, automakers are researching the viability of ADAS platforms that can be more easily configured and modified for different levels of functionality while meeting lower price points.

A single platform that can offer extended and configurable functionality has been almost as elusive as the self-driving car. But just as the advent of ADAS is making autonomous vehicles a reality, the development of all-programmable system-on-chip (SoC) solutions has introduced a new era in system design.

For many years, FPGAs have enabled OEMs to create bespoke hardware devices at a fraction of the cost of a full ASIC design. The extremely low non-recurring engineering (NRE) costs associated with FPGA development are among its greatest advantages. In applications that exhibit very high volumes with no variation in specification, an ASIC can still represent the most compelling business and engineering option.

However, that kind of application is increasingly rare; more often than not, some level of configurability is required. In such cases, it is more common to choose a discrete solution based on general-purpose components such as DSPs and microcontrollers. While this option offers the most flexibility, it comes at the expense of an increased bill of materials (BoM), greater board space, and higher system power, all of which conspire to raise the overall system cost.

Ideally, OEMs looking to develop a solution that offers enough performance and flexibility for a range of configurations would target a single platform that combines the performance of hardware-based functions, the flexibility of a software-based processor, and the low NRE costs of an FPGA. The Zynq All Programmable SoC from Xilinx offers this combination of features. Featuring dual ARM Cortex-A9 processors and tightly coupled programmable logic in a fully integrated device tested to AEC-Q100 and beyond, it is well suited to developing ADAS solutions.

The A9 cores can be used together or independently to run anything from bare-metal software up to real-time operating systems, allowing simple control algorithms to run alongside complex video analytics software. For maximum performance, functions can be accelerated in dedicated hardware blocks configured in the programmable logic fabric and tightly coupled to the processors.
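As a rough illustration of that tight coupling, the sketch below shows how bare-metal software on one of the A9 cores might drive a filter block implemented in the programmable logic through memory-mapped registers. The base address, register layout, and polling scheme are assumptions made for this example, not the interface of any particular Xilinx IP core; a production driver would also use interrupts and cache maintenance rather than busy-waiting.

    /* Hypothetical sketch: handing one frame buffer to an accelerator in the
     * fabric from bare-metal code on a Cortex-A9. All addresses and bit
     * fields below are illustrative placeholders. */
    #include <stdint.h>

    #define ACCEL_BASE      0x43C00000u           /* assumed AXI-mapped base      */
    #define ACCEL_CTRL      (ACCEL_BASE + 0x00u)  /* bit 0: start, bit 1: done    */
    #define ACCEL_SRC_ADDR  (ACCEL_BASE + 0x04u)  /* physical address of input    */
    #define ACCEL_DST_ADDR  (ACCEL_BASE + 0x08u)  /* physical address of output   */
    #define ACCEL_LENGTH    (ACCEL_BASE + 0x0Cu)  /* number of bytes to process   */

    static inline void reg_write(uint32_t addr, uint32_t value)
    {
        *(volatile uint32_t *)(uintptr_t)addr = value;
    }

    static inline uint32_t reg_read(uint32_t addr)
    {
        return *(volatile uint32_t *)(uintptr_t)addr;
    }

    /* Program the block, start it, and poll until the done bit is set. */
    void accel_filter_frame(uint32_t src_phys, uint32_t dst_phys, uint32_t length)
    {
        reg_write(ACCEL_SRC_ADDR, src_phys);
        reg_write(ACCEL_DST_ADDR, dst_phys);
        reg_write(ACCEL_LENGTH, length);
        reg_write(ACCEL_CTRL, 0x1u);               /* start the accelerator  */

        while ((reg_read(ACCEL_CTRL) & 0x2u) == 0) /* wait for the done bit  */
            ;
    }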

With a very high degree of parallel processing on offer, DSP blocks can be deployed to tackle complex algorithms much more efficiently than software alone. The level of integration allows engineers to partition any design with the right balance between hardware acceleration and software execution, as the sketch below illustrates.
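To make that partitioning decision concrete, the following software reference shows a 3x3 image convolution, the sort of per-pixel, multiply-accumulate-heavy kernel that maps naturally onto DSP blocks in the fabric once profiling shows the processors cannot keep up. The image dimensions and fixed-point scaling are assumptions chosen only for the example.

    /* Software reference for a 3x3 convolution over an 8-bit image.
     * WIDTH, HEIGHT, and the fixed-point shift are example assumptions. */
    #include <stdint.h>

    #define WIDTH  1280
    #define HEIGHT  720

    void convolve3x3(const uint8_t in[HEIGHT][WIDTH],
                     uint8_t out[HEIGHT][WIDTH],
                     const int16_t k[3][3], int shift)
    {
        for (int y = 1; y < HEIGHT - 1; y++) {
            for (int x = 1; x < WIDTH - 1; x++) {
                int32_t acc = 0;
                for (int ky = -1; ky <= 1; ky++)
                    for (int kx = -1; kx <= 1; kx++)
                        acc += k[ky + 1][kx + 1] * in[y + ky][x + kx];
                acc >>= shift;                       /* fixed-point scaling  */
                if (acc < 0)   acc = 0;              /* clamp to 8-bit range */
                if (acc > 255) acc = 255;
                out[y][x] = (uint8_t)acc;
            }
        }
    }

Every output pixel needs nine multiply-accumulates, which is exactly the kind of regular, data-parallel work that DSP blocks in the programmable logic can absorb while the A9 cores handle control flow and higher-level analytics.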

With legislation already announced in the United States that will make rear-view cameras mandatory on all new models, camera-based vision systems, often using multiple cameras, will feature significantly in ADAS as a market segment for automakers.

Meeting the processing demands of vision systems is challenging. Complex algorithms are used to identify and detect objects that are moving relative to the point of view, often at high speeds. The required processing power is difficult to estimate, as the algorithms used to provide better detection are constantly evolving.
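Even a back-of-envelope calculation hints at the scale involved. The figures below, a 1280 x 720 sensor at 30 frames/s and roughly 100 operations per pixel, are placeholder assumptions rather than measured requirements, but they show how quickly a single camera stream reaches billions of operations per second.

    /* Rough, assumption-laden estimate of the arithmetic load of one camera
     * stream. Resolution, frame rate, and operations per pixel are
     * placeholders chosen to show the order of magnitude, not real numbers
     * for any particular algorithm. */
    #include <stdio.h>

    int main(void)
    {
        const double width         = 1280.0;  /* assumed sensor width (pixels)  */
        const double height        =  720.0;  /* assumed sensor height (pixels) */
        const double fps           =   30.0;  /* assumed frame rate             */
        const double ops_per_pixel =  100.0;  /* assumed work per pixel         */

        double pixels_per_s = width * height * fps;          /* ~27.6 Mpixel/s */
        double ops_per_s    = pixels_per_s * ops_per_pixel;  /* ~2.8 Gop/s     */

        printf("Pixel rate: %.1f Mpixel/s\n", pixels_per_s / 1e6);
        printf("Compute:    %.1f Gop/s per camera\n", ops_per_s / 1e9);
        return 0;
    }

Multiply that by several cameras per vehicle and by more sophisticated detection algorithms, and the case for offloading the pixel-level work into programmable logic becomes clear.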

source: http://electronicdesign.com/embedded/platforms-streamline-adas-development-cycle
