VEHICLE AUTOMATION REPORT Tempe, AZ HWY18MH010 (16 pages)

NATIONAL TRANSPORTATION SAFETY BOARD
OFFICE OF HIGHWAY SAFETY
WASHINGTON, D.C.

VEHICLE AUTOMATION REPORT

A. CRASH INFORMATION

Location: Northbound Mill Avenue, approximately 400 feet south of the intersection with Curry Road, in Tempe, Maricopa County, Arizona.
Vehicle: 2017 Volvo XC90 with Uber ATG developmental automated driving system.
Operator: Uber ATG.
Pedestrian: 49-year-old female, walking a bicycle.
Date: March 18, 2018
Time: 9:58 p.m. Mountain Standard Time (MST)
NTSB #: HWY18MH010

B. VEHICLE AUTOMATION GROUP

Ensar Becic, Investigator, Group Chairman
NTSB Office of Highway Safety
490 L’Enfant Plaza East, S.W., Washington, DC 20594

Noah Zych
Chief of Staff
Uber Advanced Technologies Group

Jan Ivarsson
Director, Senior Technical Advisor
Volvo Car Group

C. CRASH SUMMARY

For a summary of the crash, refer to the Crash Summary Report in the docket for this investigation.

D. DETAILS OF THE REPORT

The Vehicle Automation report focuses on the autonomous operation of the 2017 Volvo XC90 equipped with the Uber Technologies, Inc. Advanced Technologies Group (ATG) developmental automated driving system (ADS). Specifically, this report discusses the testing of the ATG ADS and describes the system’s functionality, limitations, restrictions, and postcrash changes made by ATG. The vehicle was factory-equipped by Volvo with several advanced driver assistance systems (ADAS), including a forward collision warning (FCW) system and an automatic emergency braking (AEB) system. However, the Volvo collision avoidance ADAS were not active at the time of the crash; the interaction of the Volvo ADAS and the ATG ADS is further explored in section 1.9.

1. Developmental Automated Driving System

1.1 General Information

The Uber ATG developmental ADS installed on the crash-involved vehicle was designed to operate in a fully autonomous mode only on pre-mapped designated routes.
Although the system was designed to be fully automated along a specific route, a human operator located inside the vehicle was tasked with overseeing the operation of the system and monitoring the environment. Unless stated otherwise, the ADS discussed in this report refers only to the Krypton platform, software version 2018.071.3p1, that was installed on the crash-involved vehicle.1 After the crash, ATG made numerous changes, which are primarily discussed in section 1.11.

1.2 Structural Components

1.2.1 General Information

The ATG developmental ADS consists of multiple systems capable of monitoring and analyzing the surrounding environment, each consisting of hardware components and accompanying software analysis and data recording elements. As shown in figures 1 and 2, these structural components include (1) the lidar system, (2) the radar system, (3) the camera system, and (4) various telemetry, positioning, monitoring, and telecommunication systems. Additionally, as part of the development of the ADS, ATG equipped the vehicle with (1) one inward-facing camera to monitor the vehicle operator (see Human Performance and Video Summary factual reports), and (2) a human-machine interface (HMI)—a tablet—that affords interaction between the vehicle operator and the ADS (see section 1.7 for additional details).

1 The rest of the ATG fleet of 2017 Volvo XC90 vehicles at the time of the crash also operated on the same system.

Figure 1. Image of 2017 Volvo XC90 showing the location of sensor components supporting the ATG developmental ADS. (Source: Uber ATG)

Figure 2. Diagram showing the function and general description of various sensors supporting the ATG developmental ADS. (Source: Uber ATG)

1.2.2 Lidar System

This is a sensor system that uses laser light to detect and measure distance to objects by directing light and receiving it back upon its reflection from an object.
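This ranging principle, timing a light pulse’s round trip, can be illustrated with a short sketch. This is a hypothetical illustration only, not ATG’s or Velodyne’s implementation; the function name and the example timing value are assumptions.

```python
# Illustrative sketch of lidar/radar time-of-flight ranging (hypothetical,
# not ATG's or Velodyne's code). The pulse travels to the object and back,
# so the one-way distance is half the round-trip path.

C = 299_792_458.0  # speed of light, m/s

def range_from_time_of_flight(round_trip_seconds: float) -> float:
    """Distance to the reflecting object, in meters."""
    return C * round_trip_seconds / 2.0

# A return received roughly 667 nanoseconds after the pulse corresponds to
# an object about 100 m away, the stated range of the roof-mounted lidar.
print(round(range_from_time_of_flight(667e-9)))  # 100
```

The same computation applies to the radar system described below, with radio waves in place of laser light.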
Time of flight between the pulsing of the laser light and the reception of its return upon reflection from an object is used to compute distance. Data from the lidar system is used for classification of the detected objects, which the ADS uses to estimate the object’s probable path and intention (the classification is described in section 1.6.1). The lidar system consists of a single lidar, mounted on the forward half of the roof of the SUV (see figures 1 and 2). The lidar is manufactured by Velodyne, has a range of over 100 meters, and can detect objects in a 360-degree area. The initial processing of lidar data is conducted by Velodyne’s processing unit. The ADS then uses that data to build a representation of the surrounding environment, which is continually updated as new objects are detected. In addition to detection and classification of objects and obstacles, the lidar system is also used to make the initial map of a designated route and for the verification of the vehicle’s position along that route (this process is described in section 1.4).

1.2.3. Radar System

Radars use super-high-frequency radio waves to detect and measure distance to objects. Time of flight between the broadcast of the waves and the reception of their return upon reflection from an object is used to compute distance. Data from the radar system is also used for classification of the detected objects. The system consists of 8 radars with dual ranging capabilities—alternating between narrow, long-range scanning and wider, medium-range scanning—positioned around the vehicle to provide a 360-degree view of the surrounding environment. There were (a) two radars on the front end of the vehicle for forward scanning, (b) two radars on each side of the vehicle for lateral scanning, and (c) two radars on the rear end of the vehicle for rearward scanning. The long-range scan has an observational range of up to 180 meters with a 20-degree field of view.
The medium-range scan has an observational range of up to 65 meters with a 90-degree field of view. The radar processing units conduct the initial processing of the data, which the ADS then uses to build and continually update the representation of the surrounding environment.

1.2.4. Camera System

The camera system consists of 10 cameras positioned around the vehicle to provide a 360-degree view of the surrounding environment. The system includes (a) two cameras with a narrow field of view for long-range forward stereo imaging, (b) one single-lens camera with a wide field of view for medium-range forward imaging, (c) two single-lens cameras with a wide field of view for medium-range imaging of lateral areas, (d) two single-lens cameras with a wide field of view for imaging of the area to the rear of the vehicle, and (e) four surround-view cameras positioned for close-range imaging. The range at which an object can be detected is dependent on its size and environmental visibility. The visual processing of the optical data from all cameras is performed by the ADS. The primary purpose of the forward camera is to provide data for the detection of vehicles and pedestrians, and for the reading of traffic lights. Additionally, the camera system supports near-range sensing of people and objects within 5 meters of the vehicle during lane changes, parking, and when collecting passengers.2 Data from the camera system is also used for classification of the detected objects. In addition to monitoring and real-time analysis of the perceived objects, the camera system also records the visual environment. Segments of the recorded videos are regularly reviewed by ATG as part of the process of ADS development.

1.2.5. Additional Systems

The vehicle was equipped with a global positioning system (GPS), which is used for determining the vehicle’s position upon engagement of the ADS.
The GPS is not used for the verification of the vehicle’s position along the pre-mapped routes. Additionally, the vehicle was equipped with a long-term evolution (LTE) antenna, which the ADS-equipped vehicle used for securing mobile data traffic and authentication of cloud communication. Furthermore, the vehicle was equipped with 12 ultrasonic sensors located around the vehicle, with a range of 5 meters, which are primarily used to detect other vehicles during lane changes, and for detecting pedestrians, curbs, and other obstacles when parking and collecting passengers.3 Also, the vehicle is equipped with an inertial measurement unit, an electronic device that measures the vehicle’s acceleration and angular rate. This device contains accelerometers and gyroscopes and is used to refine the position of the vehicle along the route.

1.3 Sensor Maintenance and Calibration

The sensors are calibrated on a fixed interval—about every six months—or after a sensor is replaced. On the crash-involved vehicle, the last time the sensors were calibrated before the crash was on March 13, 2018.

1.4 Route Mapping, Path Guidance and Verification

The ADS is designed to be operated autonomously only on designated routes for which ATG has developed high-definition maps. These designated routes can be considered the ADS geographical operational design domain (ODD)—the roadways on which an automated system is designed to operate.4 A route is mapped while manually driving the ADS-equipped vehicle along the route and recording all aspects of the environment with the lidar, camera, and other sensor systems. These systems create a high-definition map which includes roadway markings, curbs, traffic signals, signage, roadway grade and curvature, and pertinent non-traffic static objects in the environment, such as buildings. Furthermore, the relevant sensor systems also measure velocity and yaw rate, which are then fused with the measurements pertaining to the environmental features to determine the vehicle’s exact location.

2 The near-range cameras were not in use at the time of the crash.
3 The ultrasonic sensors were not in use at the time of the crash.
4 The ODD as a concept was introduced in the first Federal Automated Vehicles Policy published by the Department of Transportation (DOT) in September 2016; this version was accessed in April 2019. Based on the first automated vehicles policy, a defined ODD should include the roadways, geographic area, environmental conditions, and speed range under which the automated vehicle system is designed to operate.

The ADS can be initiated only when the vehicle is located on a designated pre-mapped route. This is a system-based restriction, as it precludes an operator from engaging the ADS outside the pre-mapped routes. However, an operator is responsible for adhering to other ODD conditions, such as those concerning operation during inclement weather (see section 1.5.1 for operational restrictions). As the vehicle travels along the route, the various sensor systems continually scan the environment and monitor vehicle dynamics, which are then analyzed to verify the vehicle’s position. The environmental features and roadway characteristics detected by these systems are matched to the features and characteristics along the mapped route at those specific locations. This process of continuous and redundant verification of vehicle position is designed to eliminate the possibility of the vehicle venturing outside the designated path, and it also allows the system to accommodate slight deviations in the environment and adjust the ADS motion plan.

1.5 Operational Restrictions and ADS Disengagement

1.5.1. ADS Engagement and Operational Restrictions

ADS engagement is a two-step process which can be completed only when an ATG test vehicle is on a designated route.
An operator engages the ADS by (1) pulling up a red knob on the center console to the right of the shift lever, and (2) pushing a silver button behind the red knob (see figure 4 in section 1.8 for an illustration). The operation of ADS testing in the Tempe area was restricted to certain environmental, geographic, and vehicle operational conditions, including (1) a maximum vehicle speed of 45 mph; (2) urban and rural roads, but excluding highways;5 (3) all lighting conditions, including daytime, nighttime, and twilight; (4) most weather conditions, except for heavy rain and snow; and (5) most roadway conditions, such as dry and wet, except those involving snow accumulation.

1.5.2. ADS Disengagement and Operator Takeover

There are two main mechanisms by which the ADS would disengage: (1) executed by a vehicle operator, or (2) initiated by the system itself. An operator can immediately disengage the ADS by taking control of the vehicle—through braking, steering, or accelerating—or by pushing down a disengagement knob; ATG test vehicles are equipped with a red knob, located on the center console to the right of the shift lever (see section 1.8 for more details). ADS disengagement initiated by the system consists of an auditory alert to the operator and a simultaneous disengagement—return of control to the operator. Depending on the circumstances, an ADS disengagement initiated by the system can be sudden or anticipated. A sudden disengagement would be due to an operational error in any of the sensor systems, or to system faults, such as problems with data recording.

5 ATG had conducted testing on highways as well. This highway testing was conducted with a Volvo XC90, but the software was configured for the dynamics of a truck.

An anticipated disengagement,
such as when the vehicle is exiting a pre-mapped area (the geographical ODD),6 would be preceded by an early alert—before the disengagement process begins—to the operator to take control of the vehicle.

1.6 Object Detection and Hazard Avoidance

When the ADS is activated, it performs all driving tasks, including changing lanes, overtaking slow-moving or stopped vehicles, making turns, and stopping at traffic lights and stop signs.

1.6.1. Object Detection and Classification, and Path Prediction

As the ADS navigates and controls the vehicle along a designated route, the system continually monitors the environment for any objects, whether moving or stationary, on or outside a roadway. The detected objects are incorporated into the virtual environment, and the system dynamically updates the vehicle’s motion plan to avoid potential conflicts. Object detection is conducted primarily by the lidar, radar, and camera systems, each of which has different specialized functions. When an object is detected, it is tracked, its heading and velocity are calculated, and it is classified by the perception system. Detected objects can be classified as vehicles, pedestrians, or bicyclists; a detected object may also be classified as “other,” indicating an unknown object. The ADS uses a fusion of the three sensor systems to classify a detected object; the perception system uses a prioritization schema that promotes certain tracking methods over others and is also dependent on the recency of the observation. Once the perception process classifies a detected object, the system predicts its goals; an object detected in a travel lane and classified as a vehicle would generally be assigned a goal of traveling in the direction of traffic within that lane. However, the system also takes into account the previously detected locations of that object, its tracking history.
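The classification-to-goal mapping just described can be sketched roughly as follows. This is a hypothetical simplification for illustration, not ATG’s software; the function and label names are assumptions.

```python
# Hypothetical sketch (not ATG's actual code) of the goal-assignment
# behavior described in this section: vehicles and bicycles in a travel
# lane are assigned a goal of moving with traffic, "other" objects get no
# goal and are treated as static, and pedestrians receive an explicit
# goal only in the vicinity of a crosswalk.

def assign_goal(classification: str, near_crosswalk: bool = False):
    """Return a movement goal for a detected object, or None (static)."""
    if classification in ("vehicle", "bicycle"):
        return "travel_in_lane_direction"
    if classification == "pedestrian" and near_crosswalk:
        return "cross_at_crosswalk"
    return None  # "other" objects, and pedestrians away from a crosswalk

print(assign_goal("vehicle"))     # travel_in_lane_direction
print(assign_goal("other"))       # None
print(assign_goal("pedestrian"))  # None
```

An object whose goal is None is treated as occupying a static location, which matters below when the classification of a tracked object changes.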
The system then generates multiple possible trajectories—path predictions—based on the goal of the detected object and its tracking history. The path predictions are continually updated to incorporate the latest detected location. However, if the perception system changes the classification of a detected object, the tracking history of that object is no longer considered when generating new trajectories. For such a newly reclassified object, the predicted path is dependent on its classification—the object’s goal; for example, a detected object in a travel lane that is newly classified as a bicycle is assigned a goal of moving in the direction of traffic within that lane. However, certain object classifications—other—are not assigned goals. For such objects, the currently detected location is viewed as a static location; unless that location is directly on the path of the automated vehicle, the object is not considered a possible obstacle. Additionally, pedestrians outside the vicinity of a crosswalk are also not assigned an explicit goal; however, a trajectory may be predicted for them based on observed velocities, as long as they are continually detected as a pedestrian.

6 The ADS had a system-based restriction that precluded autonomous operation outside the designated pre-mapped area. If a final destination was outside the pre-mapped area, the system would initiate the anticipated disengagement protocol prior to exiting the pre-mapped area.

If a predicted path of a detected object intersects with that of the automated vehicle, the system modifies its motion plan or initiates hazard avoidance (described in the next section). Since the crash, ATG has made changes to the way the system fuses sensor information and predicts possible paths to take into account the tracking history (see section 1.11).

1.6.2.
Hazard Avoidance and Emergency Braking

As the ADS detects and tracks objects, it modulates vehicle dynamics—steering, throttle—to maintain smooth movement, devoid of abrupt changes in motion. In certain situations, such as sudden hard braking of a lead vehicle or an initially obscured pedestrian darting in front of the ATG test vehicle, gradual changes in vehicle trajectory may not be sufficient to avoid a collision. The ATG ADS, as a developmental system, is designed with limited automated capabilities in emergency situations—defined as those requiring braking greater than 7 m/s² (0.71 g) or a rate of change of deceleration (jerk) greater than +/- 5 m/s³ to prevent a collision. The primary countermeasure in such situations is the vehicle operator, who is expected to intervene and take control of the vehicle if the circumstances are truly collision-imminent, rather than due to system error or misjudgment. When the system detects an emergency situation, it initiates action suppression. This is a one-second period during which the ADS suppresses planned braking while (1) the system verifies the nature of the detected hazard and calculates an alternative path, or (2) the vehicle operator takes control of the vehicle. ATG stated that it implemented the action suppression process due to concerns about the developmental ADS identifying false alarms—detecting a hazardous situation when none exists—and causing the vehicle to engage in unnecessary extreme maneuvers. If a vehicle operator does not take control of the vehicle in an emergency, and the situation remains hazardous after the 1-second period of action suppression, the automated response is dependent on whether the collision can be avoided with the maximum braking of 7 m/s² and a jerk of +/- 5 m/s³.
The automated response options in such situations include:

• if the collision can be avoided with the maximum allowed braking and jerk, the system executes its plan and engages braking up to the maximum limit;
• if the collision cannot be avoided with the application of the maximum allowed braking, the system is designed to provide an auditory warning to the vehicle operator while simultaneously initiating a gradual vehicle slowdown. In such circumstances, the ADS would not apply maximum braking merely to mitigate the collision.

Since the crash, ATG has made changes to the way that the system responds in emergency situations, including the activation of automatic braking for crash mitigation (see section 1.11).

1.7 Data Recording Systems

ATG provided NTSB investigators with a data summary of relevant events leading up to the crash, including the time when the ADS detected the pedestrian, how the system had classified and assigned predicted paths to the pedestrian, as well as various vehicle dynamics. The ADS did not report any sensor or system failures during the crash trip. Additionally, at the request of NTSB investigators, ATG provided a playback of sensor and vehicle dynamics information showing the events leading up to the crash. The investigators examined the output from the sensor systems to create a timeline of the events (see Table 1). Data pertaining to the operator’s interaction with the HMI is presented in section 1.8.

Table 1. Selected parameters recorded by the ADS.

Time (s) relative to impact | Speed (mph) | Classification and Path Prediction (a) | Other Events / Details (b)

-9.9 | 35 | — | Vehicle begins to accelerate from 35 mph due to an increased speed limit.

-5.8 | 44 | — | Vehicle reaches the speed of 44 mph.

-5.6 | 44 | Classification: Vehicle - by radar. Path prediction: None; not on the path of the SUV. | Radar makes the first detection of the pedestrian and estimates its speed.

-5.2 | 45 | Classification: Other - by lidar. Path prediction: Static; not on the path of the SUV. | Lidar detects an unknown object; this is the first detection of that object by lidar, the tracking history is unavailable, and its velocity cannot be determined. ADS predicts the object’s path as static.

-4.2 | 45 | Classification: Vehicle - by lidar. Path prediction: Static; not on the path of the SUV. | Lidar classifies the detected object as a vehicle; this is a changed classification of the object, without a tracking history. ADS predicts the object’s path as static.

-3.9 | 45 | Classification: Vehicle - by lidar. Path prediction: The left through lane (adjacent to the SUV); not on the path of the SUV. | Lidar retains the classification “vehicle,” and based on the tracking history and the assigned goal, ADS predicts the object’s path as traveling in the left through lane.

-3.8 ⇔ -2.7 | 45 | Classification: alternated several times between vehicle and other - by lidar. Path prediction: alternated between static and the left lane; neither was considered on the path of the SUV. | The object’s classification alternates several times between a vehicle and an unknown object. At each change, the object’s tracking history is unavailable, and ADS predicts the object’s path as static. When the detected object’s classification remained the same, ADS predicted the path as traveling in the left through lane.

-2.6 | 45 | Classification: Bicycle - by lidar. Path prediction: Static; not on the path of the SUV. | Lidar classifies the detected object as a bicycle; this is a changed classification of the object, without a tracking history. ADS predicts the bicycle’s path as static.

-2.5 | 45 | Classification: Bicycle - by lidar. Path prediction: The left through lane (adjacent to the SUV); not on the path of the SUV. | Lidar retains the classification “bicycle,” and based on the tracking history and the assigned goal, ADS predicts the bicycle’s path as traveling in the left through lane.

-1.5 | 44 (c) | Classification: Unknown - by lidar. Path prediction: Static; partially on the path of the SUV. | Lidar detects an unknown object; since this is a changed classification and an unknown object, it lacks a tracking history and is not assigned a goal. ADS predicts the object’s path as static. Although the detected object is partially in the SUV’s lane of travel, the ADS generates a motion plan around the object (a maneuver to the right of the object); this motion plan remains valid—avoiding the object—for the next two data points.

-1.2 | 43 | Classification: Bicycle - by lidar. Path prediction: The travel lane of the SUV; fully on the path of the SUV. | Lidar detects a bicycle; although this is a changed classification without a tracking history, it was assigned a goal. ADS predicts the bicycle to be on the path of the SUV. The ADS motion plan—generated 300 msec earlier—for steering around the bicycle is no longer possible; as such, the situation becomes hazardous. Action suppression begins.

-0.2 | 40 | Classification: Bicycle - by lidar. Path prediction: The travel lane of the SUV; fully on the path of the SUV. | Action suppression ends 1 second after it begins. The situation remains hazardous; as such, ADS initiates a plan for vehicle slowdown. An auditory alert is presented to indicate that the controlled slowdown is initiating. (d)

-0.02 | 39 | — | Vehicle operator takes control of the steering wheel, disengaging the ADS.

0 | — | — | Impact.

0.7 | 37 | — | Vehicle operator brakes.

(a) Only changes in object classification and path prediction are reported in the table. The last reported values persist until a new value is reported.
(b) The process of predicting the path of a detected object is complex and relies on the examination of numerous factors beyond the details described in this column.
(c) The vehicle started decelerating due to the approaching intersection, where the pre-planned route included a right turn at Curry Road. The deceleration plan was generated 3.6 seconds before impact.
(d) While the system generated a plan for the vehicle slowdown, due to a slight communication delay, the data is unclear on whether the implementation of the slowdown plan started before the operator took control prior to impact.

At the time when the ADS detected the pedestrian for the first time, 5.6 seconds before impact, she was positioned approximately in the middle of the two left-turn lanes (see figure 3). Although the ADS sensed the pedestrian nearly 6 seconds before the impact, the system never classified her as a pedestrian—or correctly predicted her goal as a jaywalking pedestrian or a cyclist—because she was crossing N. Mill Avenue at a location without a crosswalk; the system design did not include a consideration for jaywalking pedestrians. Instead, the system initially classified her as an “other” object, a class that is not assigned goals. As the ADS changed the classification of the pedestrian several times—alternating between vehicle, bicycle, and other—the system was unable to correctly predict the path of the detected object. Only when the ADS determined that the object’s currently detected location was on the path of the ATG vehicle—1.2 seconds before impact—did the system recognize an emergency situation, an imminent collision. At that time, because preventing the collision would have required extreme braking or steering actions—beyond the design specifications—the ADS initiated suppression of its motion plan.
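The recorded figures can be checked with a short kinematic calculation. This is an illustrative back-of-the-envelope sketch, not drawn from the ADS data; it assumes constant deceleration and no reaction delay.

```python
# Illustrative check (not from the ADS data): at -1.2 seconds the SUV was
# traveling about 43 mph. Could braking at the 7 m/s^2 design limit have
# stopped it before covering the remaining gap to the point of impact?

MPH_TO_MS = 0.44704
MAX_BRAKING = 7.0  # m/s^2, the ADS design limit described in section 1.6.2

def stopping_distance(speed_mph: float, decel: float = MAX_BRAKING) -> float:
    """Distance (m) needed to stop from speed_mph at constant deceleration."""
    v = speed_mph * MPH_TO_MS
    return v * v / (2.0 * decel)

v = 43 * MPH_TO_MS   # ~19.2 m/s at -1.2 seconds
gap = v * 1.2        # ~23.1 m covered in 1.2 s at constant speed
print(round(stopping_distance(43), 1), round(gap, 1))  # 26.4 23.1
```

Even under these idealized assumptions, the stopping distance at the design limit exceeds the remaining gap, consistent with the report’s statement that avoiding the collision would have required braking beyond the design specifications.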
One second later, the vehicle was still on a collision path with the pedestrian, and preventing the collision still required an extreme avoidance maneuver; per design, the system did not engage emergency braking, but rather provided an auditory alert to the vehicle operator as it initiated a plan for the vehicle slowdown.

Figure 3. Aerial view of the crash location showing the path of the pedestrian as she crossed N. Mill Avenue, and the movement and speed of the vehicle at three points before impact. The pedestrian’s path depicts her position at the time of the initial detection (5.6 seconds before impact) and at the corresponding times with the vehicle’s position.

1.8 Human Machine Interface (HMI)

As part of the interaction between the vehicle operator and the ADS, ATG mounted a tablet onto the center console, covering Volvo’s infotainment screen. While the vehicle was in motion, the tablet depicted a navigation screen showing the movement of the vehicle, and its interactive functionality was limited. While in motion, the operator could tag (1) an outside event of interest, such as an encounter with a stopped school bus; (2) an unusual action of the ADS, such as the system incorrectly reacting to a situation that should be within its capabilities; or (3) an issue with equipment inside the vehicle. Each of the three allowed tagging actions required one or two taps to complete. Data from the tablet showed that 19 minutes and 25 seconds before the crash, the system alerted the operator that the ADS was disengaging, which prompted the operator, three seconds later, to take manual control of the vehicle.7 At that time, the operator tagged the disengagement event on the tablet. Ten seconds later, 19 minutes and 12 seconds before the crash, the operator reengaged the ADS. For the remainder of the crash trip, the operator did not interact with the tablet, and the HMI did not present any information that required the operator’s input.

Figure 4.
Image of the interior of an exemplar vehicle showing the locations of the cell phone slot (yellow region in center console), the ADS engagement/disengagement knob (red), the ADS engagement button (blue), and the HMI (with an inset illustrating the image).

7 ATG provided the HMI data in both quantitative and video formats.

1.9 Interaction Between Volvo ADAS and ATG ADS

When the SUV was operated in manual mode—controlled by a vehicle operator—all the Volvo ADAS components were active and operated as designed. When the SUV was operated in the autonomous mode—controlled by the ADS—all Volvo ADAS components were automatically disengaged. Volvo’s passive safety technologies, such as the seat belt pretensioners and airbag deployment systems, remained active. Volvo provided ATG with access to its ADAS to allow seamless automatic deactivation when engaging the ADS. According to ATG and Volvo, simultaneous operation of the Volvo ADAS and the ATG ADS was viewed as incompatible because (1) of the high likelihood of misinterpretation of signals between the Volvo and ATG radars due to the use of the same frequencies; and (2) the vehicle’s brake module had not been designed to assign priority if it were to receive braking commands from both the Volvo AEB and the ATG ADS.

1.10 Crash History of the ATG Fleet of ADS-Equipped Vehicles

ATG shared records of its fleet crash history with NTSB investigators. The records showed that between September 2016 and March 2018 (excluding the current crash), there were 37 crashes and incidents involving ATG test vehicles which were operating in autonomous mode at the time. Most of these crashes involved another vehicle striking the ATG test vehicle—33 such incidents; 25 of them were rear-end crashes, and in 8 crashes the ATG test vehicle was sideswiped by another vehicle. In only two incidents, the ATG test vehicles were the striking vehicles.
In one incident, the ATG vehicle struck a bent bicycle-lane bollard that partially occupied the ATG test vehicle’s lane of travel. In the other incident, the vehicle operator took control of the vehicle to avoid a rapidly approaching oncoming vehicle that had entered the ATG vehicle’s lane of travel; the vehicle operator steered away and struck a parked car. In the remaining two incidents, an ATG vehicle was damaged by a passing pedestrian while the vehicle was stopped.

1.11 Postcrash Changes

On March 26, 2018—eight days after the crash—the Arizona Governor suspended ATG’s privileges to test ADS-equipped vehicles in the autonomous mode in Arizona. Immediately after the crash, on March 19, 2018, ATG stopped testing of ADS-equipped vehicles on public roads in all operational centers as a precautionary measure while the company started the process of evaluating its testing procedures and overall operational and organizational structure. On December 20, 2018, ATG resumed ADS testing on public roads. The testing was limited to a 1-mile loop in Pittsburgh, PA, in the vicinity of Uber ATG headquarters. The speed limit on the roads of this loop is 25 mph, the maximum speed at which ATG ADS-equipped vehicles are currently—as of August 31, 2019—being tested.

As part of the process of examining the company’s safety culture and identifying potential safety deficiencies, ATG conducted an internal assessment and a voluntary external review. Both reviews made numerous recommendations in several areas, including those affecting technical performance; operational safety, covering the implementation of safety procedures and the oversight of vehicle operators; and organizational structure. Furthermore, during investigative meetings with ATG representatives, NTSB investigators communicated several safety-relevant issue areas that were uncovered during the course of the investigation.
Finally, as part of continuous ADS development, ATG had planned changes to various technical and operational areas even before this crash.

As a result of the reviews, the investigative findings, and the already planned changes, when ATG resumed testing on public roads in December 2018, it implemented extensive system changes, including the following postcrash changes in the area of technical performance:

Volvo ADAS. Several Volvo ADAS remain active during the operation of the ADS; specifically, the FCW and the AEB with pedestrian-detection capabilities are engaged during both manual driving and testing in the Uber ATG autonomous mode. ATG changed the frequency at which the ATG-installed radars supporting the ADS operate; at the new frequency, these radars do not interfere with the functioning of the Volvo ADAS. ATG also worked with Volvo to assign prioritization between the ADS and the Volvo AEB in situations in which both systems issue a braking command. Which system is assigned priority depends on the specific circumstances at the time.

Handling of Emergency Situations. ATG changed the way the ADS manages emergency situations (as described in section 1.6.2) by no longer implementing action suppression. The updated system does not suppress a system response after detection of an emergency situation, even when the resolution of that situation—prevention of the crash—exceeds the design specifications. In such situations, the system allows braking even when that action would not prevent a crash; emergency braking is engaged to mitigate the crash. ATG also increased the jerk (the rate of change of deceleration) limit to 20 m/s³.
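The practical effect of a jerk limit can be illustrated with a short kinematic sketch. This is a simplified model for illustration only, not ATG's implementation; the 7 m/s² target deceleration is an assumed value, and only the 20 m/s³ jerk limit comes from the report:

```python
# Illustrative sketch: how fast a braking system can ramp up to a target
# deceleration under a jerk (rate of change of deceleration) limit.
# Assumes a linear ramp a(t) = j * t from zero to the target deceleration.

def ramp_time(target_decel_ms2: float, jerk_limit_ms3: float) -> float:
    """Seconds needed to ramp from zero to the target deceleration."""
    return target_decel_ms2 / jerk_limit_ms3

def speed_shed_during_ramp(target_decel_ms2: float, jerk_limit_ms3: float) -> float:
    """Speed reduction (m/s) achieved during the ramp:
    the integral of a(t) = j * t over [0, T] is j * T**2 / 2."""
    t = ramp_time(target_decel_ms2, jerk_limit_ms3)
    return jerk_limit_ms3 * t * t / 2.0

# With the 20 m/s^3 limit and an assumed 7 m/s^2 emergency deceleration,
# full braking authority is reached in 0.35 s, shedding about 1.2 m/s
# of speed during the ramp alone.
t_ramp = ramp_time(7.0, 20.0)               # 0.35 s
dv = speed_shed_during_ramp(7.0, 20.0)      # 1.225 m/s
```

A higher jerk limit shortens the ramp, so emergency braking reaches its full deceleration sooner after an imminent collision is detected.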
When asked by NTSB investigators, ATG stated that at the current testing speeds of up to 25 mph, there have been no unintended consequences—such as an increased number of false alarms8—as a result of removing the action suppression process, coupled with the other technical and operational changes.

Path Prediction. ATG changed the way the ADS generates possible trajectories—predicts the path—of detected objects (as described in section 1.6.1). Previous locations of a tracked object are now incorporated into the decision process when generating possible trajectories, even when the object's classification changes. Trajectories are generated based on both (1) the classification of the object and the possible goals associated with such an object, and (2) all of the object's previous locations.

Impact of the Changes. NTSB investigators inquired with ATG about the impact that the postcrash changes would have had on the circumstances of the crash. ATG conducted a simulation using the sensor data from the crash with a September 2018 version of the ADS software that included the aforementioned technical changes, as well as other changes. ATG reported that the new software would have been able to detect and correctly classify the pedestrian at a distance of approximately 88 meters—4.5 seconds—before the original time of impact. ATG stated that based on the new software, the system would have correctly classified the pedestrian as such, would have predicted a path of crossing the street on a collision course with the SUV, and would have initiated controlled braking more than 4 seconds before the original time of impact.

8 ATG reported that there was no increase in hard braking as a result of a falsely detected emergency situation/imminent collision.

The changes that ATG implemented in the areas of operational safety and organizational structure are discussed in the Operations Factual report.

(Ensar Becic, Ph.D.)
(Vehicle Automation Investigator)