March 27, 2014

Today's LiDARs and GPUs Enable Ultra-Accurate GPS-Free Navigation with Affordable SLAM

Louay Eldada
CEO, Co-Founder, Quanergy Systems, Inc.
© 2014 Quanergy Systems, Inc.

New Paradigm in Automotive Safety & Navigation

• Recent technology advances have resulted in low-cost, high-performance:
  – LiDARs
  – GPUs
  – IMUs
• These new components enable affordable ADAS and autonomous driving systems
• These advanced systems, installed on a network of vehicles, enable GPS-free navigation through SLAM

LiDAR: Light Detection and Ranging
IMU: Inertial Measurement Unit
ADAS: Advanced Driver Assistance Systems
SLAM: Simultaneous Localization and Mapping

Time-of-Flight LiDAR

[Diagram: transmitter emits a laser pulse toward an obstacle; receiver detects the reflection]

• LiDAR stands for Light Detection and Ranging
• Measures Time of Flight (TOF), the round-trip travel time of a laser pulse reflected off obstacles
• Depth perception → 3D mapping
• High-speed detection & processing → real-time detection
• Long range, high accuracy; works day & night, rain or shine
• Low-cost LiDARs benefit mapping vehicles and ADAS today, semi-autonomous vehicles in the near future, and fully autonomous vehicles beyond
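The TOF-to-range conversion behind these bullets is simple geometry: a pulse travels to the obstacle and back at the speed of light, so the one-way range is half the round trip. A minimal illustrative sketch (not Quanergy firmware):

```python
# Time-of-flight to range: the pulse covers the distance twice (out and back).
C = 299_792_458.0  # speed of light, m/s

def tof_to_range_m(round_trip_s: float) -> float:
    """Convert a round-trip pulse time to a one-way range in meters."""
    return C * round_trip_s / 2.0

# A 100 m target returns the pulse after roughly 667 ns:
t = 2 * 100.0 / C
print(round(tof_to_range_m(t), 6))  # 100.0
```

At the spec-sheet accuracy of 1.5 cm, the receiver must resolve round-trip times to roughly 100 picoseconds, which is why high-speed detection electronics matter.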


Levels of Automation for On-Road Vehicles

Human driver monitors the driving environment:

• NHTSA 0 / SAE 0 – Non-Automated: The full-time performance by the human driver of all aspects of the dynamic driving task, even when enhanced by warning or intervention systems. (Steering & acceleration/deceleration: human driver | Monitoring: human driver | Backup: human driver | Driving modes: n/a)

• NHTSA 1 / SAE 1 – Assisted: The driving mode-specific execution by a driver assistance system of either steering or acceleration/deceleration using information about the driving environment and with the expectation that the human driver perform all remaining aspects of the dynamic driving task. (Human driver and system | Human driver | Human driver | Some driving modes)

• NHTSA 2 / SAE 2 – Partial Automation: The driving mode-specific execution by one or more driver assistance systems of both steering and acceleration/deceleration using information about the driving environment and with the expectation that the human driver perform all remaining aspects of the dynamic driving task. (System | Human driver | Human driver | Some driving modes)

Automated driving system ("system") monitors the driving environment:

• NHTSA 3 / SAE 3 – Conditional Automation: The driving mode-specific performance by an automated driving system of all aspects of the dynamic driving task with the expectation that the human driver will respond appropriately to a request to intervene. (System | System | Human driver | Some driving modes)

• NHTSA 4 / SAE 4 – High Automation: The driving mode-specific performance by an automated driving system of all aspects of the dynamic driving task, even if a human driver does not respond appropriately to a request to intervene. (System | System | System | Some driving modes)

• NHTSA 4 / SAE 5 – Full Automation: The full-time performance by an automated driving system of all aspects of the dynamic driving task under all roadway and environmental conditions that can be managed by a human driver. (System | System | System | All driving modes)

(Columns in parentheses: execution of steering and acceleration/deceleration | monitoring of driving environment | backup performance of dynamic driving task | system capability, driving modes)

Source: Summary of SAE International's Draft Levels of Automation for On-Road Vehicles (July 2013)
NHTSA – National Highway Traffic Safety Administration; SAE – Society of Automotive Engineers

Why LiDAR

• LiDAR is the most reliable sensor for object detection
• LiDAR is more reliable than any existing sensing solution, including:
  – Radar (all types)
  – Video (all types, including IR)
  – Video + radar
  – Video + ultrasonic sensors
  – Stereoscopic cameras

"LiDAR is the only acceptable technology for object detection in vehicles – technologies that detect 99 out of 100 objects would be acceptable if one accepts the consequences of failing to detect 1 out of 100 pedestrians or bikers."
— Brad Templeton, Robocars (& Google car consultant)
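The quote's point can be made concrete with a back-of-the-envelope calculation. Assuming (as a simplification, not a figure from the slide) that each detection attempt is independent with a 99% success rate, the chance of at least one missed pedestrian grows quickly with exposure:

```python
# Back-of-the-envelope: why a 99% per-object detection rate is not enough.
# Assumes independent detection attempts -- a simplifying illustration.
def p_at_least_one_miss(p_detect: float, n_objects: int) -> float:
    """Probability of missing at least one of n_objects encounters."""
    return 1.0 - p_detect ** n_objects

# Over 1,000 pedestrian/cyclist encounters, a miss becomes near-certain:
print(round(p_at_least_one_miss(0.99, 1000), 4))
```

Even over just 100 encounters the miss probability already exceeds 63%, which is the arithmetic behind demanding near-perfect per-object reliability.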


The Need for Low-Cost LiDAR

LiDAR 3D sensors enable safety and efficiency in numerous applications that have gone unserved at mass scale due to cost, size, and reliability constraints:

• Safety – Monitoring; Security – Surveillance
• Automotive – Full Awareness
• Surveying
• Military – Driverless Vehicles
• Maritime – Autonomous Vessels
• Industrial – Factory/Warehouse Automation
• Simultaneous Localization & Mapping
• Automotive – Driver Assistance
• Automotive – Autonomous Driving


TEGRA K1 Based Automotive System

• Real-time object detection, tracking, identification and classification is important in ADAS and critical in autonomous vehicles
• Objects include vehicles, pedestrians, animals, road features (e.g., debris, speed bumps, potholes), infrastructure (e.g., bridges, curbs, street signs), etc.
• We are implementing this capability in CUDA on the Tegra K1 platform
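As a toy illustration of one step in such a pipeline, object segmentation can be sketched as Euclidean clustering of nearby points. The real CUDA implementation is far more elaborate; the greedy single-link approach and the 0.5 m gap threshold below are arbitrary illustration choices, not Quanergy's algorithm:

```python
# Toy point-cloud segmentation: group points whose nearest neighbor in a
# cluster lies within `max_gap` meters (greedy single-link clustering).
def euclidean_clusters(points, max_gap=0.5):
    """Return a list of clusters, each a list of 2D/3D point tuples."""
    clusters = []
    for p in points:
        for c in clusters:
            if any(sum((a - b) ** 2 for a, b in zip(p, q)) <= max_gap ** 2
                   for q in c):
                c.append(p)
                break
        else:
            clusters.append([p])  # no cluster close enough: start a new one
    return clusters

# Two tight groups of returns -> two candidate objects:
pts = [(0.0, 0.0), (0.3, 0.1), (5.0, 5.0), (5.2, 5.1)]
print(len(euclidean_clusters(pts)))  # 2
```

Each resulting cluster would then be handed to tracking and classification stages.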


Jetson TK1

• NVIDIA Jetson TK1 (Tegra K1 dev kit) was provided to Quanergy by Carlos Garcia-Sierra (thank you!)
• TK1 is a computer-on-a-board with the K1 processor, 2 GB RAM, 16 GB eMMC storage, and:
  – Half mini-PCIe slot
  – Gigabit Ethernet
  – SD card connector
  – HDMI port
  – Mic and line jacks
  – 1 micro USB 2.0 port
  – 1 USB 3.0 port
  – SATA data port
  – 4 MByte boot flash
  – Expansion port for UART, GPIO, DP/LVDS signals
• The Jetson TK1 board has NVIDIA's 4-Plus-1 quad-core ARM Cortex-A15 processor with Kepler graphics and 192 CUDA cores
• TK1 supports the CUDA 6.0 developer tool suite; it provides tools for app developers to take advantage of Tegra K1 features, and for hardware makers to evaluate the platform for chip selection


The Mark VIII: 8-Beam LiDAR

Designs focus simultaneously on cost, performance, size, and reliability
• Gen 1: mechanical LiDAR (Mark VIII)
• Gen 2: solid-state LiDAR

19 patents pending

• Lasers that see farther are spaced closer together to maintain resolution at distance
• Some beams are pointed skyward to prevent blind spots in the event of pitch or roll


Mark VIII Specifications

Laser Class: Class 1 (eye safe)
Laser Wavelength: 905 nm
Measurement Technique: Time of Flight (TOF)
Measurement Range: 300 m at 80% reflectivity, 100 m at 10% reflectivity
Range Accuracy: 1.5 cm
Angular Resolution: 0.1°
Spatial Resolution: 17.5 cm at 100 m
Sensors: 8 laser/detector pairs, 3-axis accelerometer
Field of View (FOV): Horizontal: 360°; Vertical: 20° (+3°/−17°)
Frame Rate (Update Frequency): 10-30 Hz
Operating Temperature: −40°C to +85°C (−40°F to +185°F)
Storage Temperature: −40°C to +105°C (−40°F to +220°F)
Power: 20 W
Operating Voltage: 9-32 VDC
Weight: 1 kg
Dimensions: 3.5" diameter × 3" height
Shock: 500 m/sec² amplitude, 11 msec duration
Vibration: 5 Hz to 2000 Hz, 3 Grms
Environmental Protection: IP69K – NEMA* rating for dust & water ingress protection
Functional Safety: ISO 26262 – Road Vehicle Functional Safety
Stress Resistance: AEC-Q100 – Critical Stress Test for Automotive ICs
Laser Safety: ANSI Z136.1 – American National Standard for Safe Use of Lasers
Laser Product Safety: IEC 60825 – European Standard for Safety of Laser Products
Military System Safety: MIL-STD-882E – System Safety (for Military Applications only)
Data Output Connection: 1 Gbps Ethernet
UDP** Packets: Angle, Distance, Intensity, IMU, Status, Return Classification (solid vs. aerosol)

*NEMA: National Electrical Manufacturers Association
**UDP: User Datagram Protocol
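The deck lists the UDP payload fields (angle, distance, intensity, …) but not the wire format. The sketch below therefore assumes a hypothetical fixed-size record layout, purely to illustrate how such LiDAR packets are typically parsed; it is not the real Mark VIII format:

```python
import struct

# Hypothetical record layout (NOT the actual Mark VIII wire format, which
# the deck does not specify): little-endian float32 angle_deg,
# float32 distance_m, uint16 intensity.
RECORD = struct.Struct("<ffH")

def parse_records(payload: bytes):
    """Yield (angle_deg, distance_m, intensity) tuples from a UDP payload."""
    for off in range(0, len(payload) - RECORD.size + 1, RECORD.size):
        yield RECORD.unpack_from(payload, off)

# Build a two-record payload and read it back:
pkt = RECORD.pack(123.5, 56.75, 900) + RECORD.pack(124.0, 56.5, 870)
for angle, dist, intensity in parse_records(pkt):
    print(angle, dist, intensity)
```

A real receiver would additionally validate packet length, sequence/status fields, and endianness against the sensor's interface documentation.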


LiDAR Range vs. Target Reflectivity & Incidence Angle

• For vehicles (0.8 reflectivity, the standard in range reporting), range is >400 m
• For low-reflectivity obstacles (0.1 reflectivity), range is 250 m


The Mark VIII 8-beam LiDAR

Compact size, easy to integrate discreetly in various vehicle types


Mark VIII LiDAR Mounting in Van

Small form factor allows easy fit in various locations on a vehicle


Mark VIII LiDAR Mounting in Sports Car

Small form factor allows easy fit in sports cars without impacting aesthetics or aerodynamics


ADAS Ecosystem

[Diagram: cloud-based data processing connected to the vehicle over telematics links]

• Cloud-based data processing: off-line data fusion, cloud computing
• Online map data: cloud streaming
• Telematics: cellular connectivity, Wi-Fi connectivity
• On vehicle: map data generation, safety monitoring, on-board data parsing, sensor layer


Point Cloud Fusion in the Cloud

LiDAR point cloud data is pre-processed in the system, then post-processed and fused in the cloud, creating a continuously updated 3D map for SLAM.

Vehicle Side
● Hardware:
  ‒ NVIDIA Tegra K1 integrated CUDA GPU and CPU coprocessor
● Software:
  ‒ Onboard software performing real-time sensor data preprocessing, vehicle localization, sensor fusion (with GPS and IMU data), scene segmentation, object identification, tracking and classification, scenario analysis, and preemptive safety modeling
  ‒ Written in C++ and employing extensions/customizations of the PCL (Point Cloud Library)

Server Side
● Hardware:
  ‒ Scalable cloud storage and processing using a web service such as AWS (Amazon Web Services), leveraging free data transfers between AWS modules
  ‒ Expandable storage cluster using a service such as Amazon S3 for saving raw and processed data and serving data to both end users and internal processing routines
  ‒ On-demand HPC (High-Performance Computing) cluster such as Amazon EC2 using NVIDIA CUDA GPUs for highly parallelized raw point cloud fusion, multi-dataset co-registration, filtering, scene segmentation & classification, and dataset down-sampling
● Software:
  ‒ Scalable Hadoop instance on a cluster such as EC2
  ‒ Customized C++ fusion algorithms employing HPC API wrappers, OpenCL, and PCL
  ‒ External-facing API for use of the final 3D map
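The "dataset down-sampling" step is commonly implemented as a voxel-grid filter (this is what PCL's VoxelGrid class does). A minimal Python sketch of the idea, using simple centroid averaging as an illustration of the technique rather than the production C++ code:

```python
from collections import defaultdict

def voxel_downsample(points, voxel=0.15):
    """Collapse all points falling in each voxel (cube of side `voxel`
    meters) to their centroid -- the core idea of a voxel-grid filter."""
    bins = defaultdict(list)
    for x, y, z in points:
        key = (int(x // voxel), int(y // voxel), int(z // voxel))
        bins[key].append((x, y, z))
    # One centroid per occupied voxel:
    return [tuple(sum(c) / len(pts) for c in zip(*pts))
            for pts in bins.values()]

# Two nearby returns merge into one point; the distant return survives:
pts = [(0.01, 0.01, 0.0), (0.02, 0.03, 0.0), (1.0, 1.0, 1.0)]
print(len(voxel_downsample(pts, voxel=0.15)))  # 2
```

Down-sampling this way bounds both storage and the cost of later co-registration while preserving the map's spatial structure at the chosen cell size.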


Quanergy ADAS Components

• Sensor modules
  – 8-beam Quanergy LiDAR (typically 1 or 2 units)
  – GNSS (Global Navigation Satellite System) receiver
  – IMU (Inertial Measurement Unit), includes accelerometer & gyroscope
  – Wheel encoder (optional)
  – Video camera (optional)
  – Ultrasonic sensors (optional)
  – RADAR (optional, for long-haul vehicles)
• Processing and communication modules
  – Tegra K1 processor
  – Ethernet controller
  – Cell modem
  – Wi-Fi controller
  – Data storage
  – HMI (Human-Machine Interface), e.g., onboard display, audio, buzzer
• Balance of System (BOS)
  – Power supply
  – Enclosure
  – Cabling
  – Mounting hardware


ADAS System Schematic

[Schematic legend: wired energy flow | wired data flow | wireless energy flow | wireless data flow | Quanergy module/component | third-party module/component | NVIDIA component]

• LiDAR Sensor Layer: Quanergy LiDAR sensor (optics, laser driver, TOF receiver, FPGA, encoder, BLDC control, PMU, HV converter; 12-36 V input, <20 W); LiDAR 1, 2, 3, 4, … send Ethernet UDP packets through a Gigabit Ethernet switch (100 W power supply)
• Complementary sensors: GPS/GNSS, IMU, cell/Wi-Fi, wheel encoder*, radar*, ultrasonic*, video camera* (*optional); serial data to the fusion layer
• Data Fusion & Analysis Layer: NVIDIA TEGRA K1 processor running the data fusion module (object detection, tracking, identification, classification) and system monitoring, consuming LiDAR data and serial sensor data
• Vehicle Layer: CAN bus interface to the vehicle CAN bus, carrying passive warning codes & active vehicle control commands, plus vehicle information via OBD-II (velocity, accelerator and brake pedal positions, etc.); powered from the vehicle power supply
• Active safety control: pre-crash braking, adaptive cruise control, lane maintenance, traffic jam/intersection assist, autonomous driving
• Passive warning systems: visual cues (head-up display, HUD), tactile cues (vibrating seat/steering wheel), auditory cues (beeps, vocal commands)

IMU: Inertial Measurement Unit; CAN: Controller Area Network; OBD: On-Board Diagnostic


Quanergy LiDAR Use in Truck ADAS

● 1 or 2 LiDAR sensors secured to the vehicle (side or roof mount)
● LiDAR sends pulsed laser light in a 360° swath
● Data is relayed to a central processor for data fusion, object recognition, scenario analysis and decision making

● Passive ADAS: data is sent to the HMI in the cabin (audio, visual and/or tactile feedback)
● Active ADAS: the system interfaces with the ECU and takes action when the driver does not react in time
● Autonomous Vehicle (AV): AI takes over, with driver or driverless


Conventional ADAS Sensors: Long-Range Radar

[Diagram legend: video camera, ultrasonic sensor, 50° radar sensor, 12° radar sensor, LiDAR sensor, blind spot]

Forward-looking long-range radar: most ADAS solutions require a long-range, forward-looking radar for high-speed collision warnings


Conventional ADAS Sensors: Ultrasonic Sensors

Ultrasonic sensors – impractical, with poor performance:
- Impractical and expensive to mount so many sensors and to equip every trailer
- Sensors provide low-quality data: a binary observation of object presence, and only at short range


Conventional ADAS Sensors: Radar Sensors

Radar sensors – prohibitively expensive:
- Due to the narrow field of view, 6-8 radar sensors are necessary to cover the area surrounding the vehicle
- Sensors provide medium-quality data: capable of tracking stationary and moving objects, but incapable of classifying objects, which can lead to false-positive warnings and diminished driver trust


Typical Commercial ADAS

Volvo system built by Delphi


Quanergy ADAS

Quanergy System: Configuration 1 – Side Mount

Actual LiDAR range 10× longer (400 m) than shown


Quanergy ADAS

Quanergy System: Configuration 2 – Roof Mount

Actual LiDAR range 10× longer (400 m) than shown


Quanergy ADAS Safety Features

• Front Collision Warning – up to highway speeds, detection of moving & stationary objects
• Blind Spot Detection – comprehensive front and side coverage
• Cross-Traffic Monitoring – for intersection crossings, tight turns, etc.
• Object Detection, Tracking, Identification, Classification – false alarms suppressed with high resolution under all conditions
• Lane Assistance – using robust road-surface intensity readings, not video cameras
• Scene Capture – accurate capture of accident scenes, etc.

Features offered with 3D high-resolution mapping, 360° view, long range, and object recognition, at lower cost than today's video & radar systems


Autonomous Vehicles

US Army: 600,000 non-combat vehicles self-driving by 2015
Tesla: rolling out semi-autonomous technologies by 2017
Nissan: "Commercial self-driving Nissan cars in 2020"
Mercedes: "Commercial fully autonomous Mercedes cars in 2020"
Google: developing commercial self-driving car systems


Driverless Cars Are Data Guzzlers
Wall Street Journal – March 23, 2014

One 8-beam Quanergy LiDAR sensor: 864,000 3D points/sec, 144 bytes of data per point → 1 Gbps Ethernet
Each 32-beam LiDAR sensor on the Ford experimental car: 625,000 3D points/sec, 28 bytes of data per point → 100 Mbps Ethernet

Quanergy LiDAR sensor range: 400 m (1,333 ft)
LiDAR sensor on the Ford experimental car: 70 m (200 ft)
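The quoted per-sensor data rates can be sanity-checked with simple arithmetic (points/sec × bytes/point × 8 bits):

```python
# Convert a sensor's point rate and per-point payload to megabits per second.
def mbps(points_per_sec: int, bytes_per_point: int) -> float:
    return points_per_sec * bytes_per_point * 8 / 1e6

quanergy = mbps(864_000, 144)   # 8-beam Quanergy sensor
ford     = mbps(625_000, 28)    # 32-beam sensor on the Ford test car
print(round(quanergy))  # 995 -> nearly saturates a 1 Gbps Ethernet link
print(round(ford))      # 140
```

The Quanergy figure works out to roughly 995 Mbps, which is why the sensor specifies a 1 Gbps Ethernet output.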


Benefits of Real-Time 3D Map Data

Real-time 3D maps deliver automatically:

• Quantified traffic observations (speed, density, …)
• Actual local weather and road-quality information
• Lane-level traffic flow measurements
• cm-accurate location of accidents, obstacles, debris
• Lane-level street maps
• cm-accurate position of infrastructure (road signs, …)
• Full array of LBS (location-based services)
• Vehicle parameters (speed, acceleration, braking patterns, GPS coordinates, etc.) and accurate factual accident recreation
• Rich smart data needed for autonomous vehicles
• And the first GPS-independent, cm-accurate localization system on the planet


Centimeter-Accurate Localization with LiDAR

1. A centimeter-accurate, probabilistic 3D map is created from LiDAR, GPS, and IMU data using a computationally intensive simultaneous localization and mapping (SLAM) algorithm
2. The map is loaded into a LiDAR-equipped system, which combines real-time LiDAR data with a computationally efficient localization algorithm to determine its location within the map with high accuracy
3. The approach is robust even in dynamic environments

[Figure excerpt from the cited thesis: an infrared reflectivity map of a large urban block showing the average reflectivity of each 15×15 cm cell; each cell stores both the average infrared reflectivity observed at that location and the variance of those values]
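The per-cell statistics behind such a probabilistic map (mean reflectivity plus variance) can be maintained incrementally as new laser returns arrive. A small sketch using Welford's online algorithm, shown as an illustration of the idea rather than the thesis's actual implementation:

```python
class Cell:
    """Running mean/variance of infrared reflectivity for one map cell
    (Welford's online algorithm: numerically stable, one pass)."""
    def __init__(self):
        self.n, self.mean, self.m2 = 0, 0.0, 0.0

    def add(self, reflectivity: float):
        self.n += 1
        delta = reflectivity - self.mean
        self.mean += delta / self.n
        self.m2 += delta * (reflectivity - self.mean)

    @property
    def variance(self) -> float:
        # Population variance of the observations seen so far.
        return self.m2 / self.n if self.n else 0.0

c = Cell()
for r in (0.4, 0.5, 0.6):  # three returns hitting the same cell
    c.add(r)
print(round(c.mean, 2))  # 0.5
```

Storing the variance alongside the mean is what lets the localizer weight stable cells (lane markings, curbs) more heavily than cells dominated by dynamic objects.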


In Summary

• Today's mobile 3D mapping LiDARs are low-cost and compact while delivering high performance and high reliability
• Advanced vehicular safety and navigation systems using LiDARs and GPUs with artificial intelligence deliver the most capable solution for real-time object detection, tracking, identification and classification
• Combining state-of-the-art LiDARs and GPUs enables ultra-accurate, GPS-free, full-availability, jam-proof, centimeter-accurate navigation based on SLAM


Thank You

Louay Eldada
Quanergy Systems, Inc.
265 Sobrante Way, Ste M
Sunnyvale, CA 94086, USA
+1-512-965-6151
[email protected]
www.quanergy.com

