JJ Burns wrote an article entitled Automated Commercial Vehicles: Developments in 2018, recently published in the American Association for Justice’s Truck Litigation Group’s semi-annual newsletter.
Here is a copy of the article:
The use of some automated vehicles, including autonomous CMVs, will almost certainly be commonplace in the future; however, there remain significant technological and regulatory obstacles to a transportation landscape populated with driverless vehicles. Weather conditions, unpredictable pedestrians, sensory integration, environmental coordination, a variety of federal regulations, and public acceptance constitute significant hurdles to a world where a non-negligible percentage of vehicles lack a human operator.
For purposes of clarity, it is important to establish precisely what is meant by “automated vehicles.” The Society of Automotive Engineers has provided a taxonomy and definitions for terms related to driving automation systems that have been adopted by the DOT as well as most of the private stakeholders in the autonomous driving field. SAE’s taxonomy describes a spectrum of six automation levels:

Level 0 – No Automation: full-time performance by the human driver of all aspects of the “dynamic driving task,” or “DDT,” even when enhanced by warning or intervention systems.

Level 1 – Driver Assistance: the driving mode-specific execution by a driver assistance system of either steering or acceleration/deceleration, using information about the driving environment, with the expectation that the human driver performs all remaining aspects of the DDT.

Level 2 – Partial Automation: the driving mode-specific execution by one or more driver assistance systems of both steering and acceleration/deceleration, using information about the driving environment, with the expectation that the human driver performs all remaining aspects of the DDT.

Level 3 – Conditional Automation: the driving mode-specific performance by an automated driving system of all aspects of the DDT, with the expectation that a human driver will respond appropriately to a request to intervene.

Level 4 – High Automation: the driving mode-specific performance by an automated driving system of all aspects of the DDT, even if a human driver does not respond appropriately to a request to intervene.

Level 5 – Full Automation: the full-time performance by an automated driving system of all aspects of the DDT under all roadway and environmental conditions that can be managed by a human driver.
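For readers who think in code, the taxonomy can be sketched as a simple enumeration. This is purely illustrative (the names and the helper function are the author of this sketch’s shorthand, not SAE nomenclature); its one substantive point, drawn from the level definitions above, is that at Levels 0 through 3 a human must perform or stand ready to resume the DDT, while at Levels 4 and 5 the system handles its own fallback.

```python
from enum import IntEnum

class SAELevel(IntEnum):
    """SAE J3016 driving automation levels, as summarized above."""
    NO_AUTOMATION = 0
    DRIVER_ASSISTANCE = 1
    PARTIAL_AUTOMATION = 2
    CONDITIONAL_AUTOMATION = 3
    HIGH_AUTOMATION = 4
    FULL_AUTOMATION = 5

def human_fallback_required(level: SAELevel) -> bool:
    # Through Level 3, a human driver must perform the remaining DDT
    # or respond appropriately to a request to intervene; at Levels 4
    # and 5 the automated driving system needs no human fallback.
    return level <= SAELevel.CONDITIONAL_AUTOMATION
```

The legal significance of the Level 3 / Level 4 boundary is hard to overstate: it is the line at which responsibility for responding to an intervention request shifts from the human to the system.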
Obviously, from both a technological and legal perspective, there are fewer obstacles to ordinary, non-trial utilization of automated driving systems on the “lower” end of the automation spectrum; in fact, vehicles using some of these systems are in use today.
The technological obstacles to “high” or “full” automation of tractor-trailers and similar vehicles are many. First, commercial vehicle dynamics are far more complex than those of passenger vehicles; the underlying technical issues are beyond the scope of this brief update, but functionally, this means that automation for big trucks will always lag behind automation for smaller vehicles.
Second, weather conditions like rain, snow and fog pose a currently insurmountable obstacle to the development of fully-automated heavy trucks: “As things stand today, the driverless car of the future can’t handle more than a dusting of snow.” According to a study by the World Economic Forum and the Consulting Group NuTonomy, rain, fog and snow not only potentially alter a vehicle’s traction, but also change how a vehicle’s cameras and sensors perceive the roadway surface. WaveSense, an automation group in Boston, has built a radar system that scans what is below the roadway and is supposed to provide reliable roadway / location data despite adverse weather conditions. Nevertheless, it remains the case that autonomous vehicles rely on a patchwork of sensors: GPS, traditional cameras, radar and lidar technology. Unfortunately, most camera sensors are useless in fog and heavy snow, and lidar sensors are thrown off by raindrops and snowflakes. The “weather problem” is why nearly all of the more advanced self-driving vehicles are restricted to sunny, dry cities. Cross-country treks through different areas of the country by driverless trucks are currently impossible.
Third, automated vehicles fail to respond adequately to the unpredictability of humans. “Whether self-driving cars can correctly identify and avoid pedestrians crossing streets has become a burning issue since March after an Uber self-driving car killed a woman in Arizona who was walking a bicycle across the street at night outside a designated crosswalk.” According to a preliminary report from federal regulators, the Uber vehicle’s sensors detected the woman, but its decision-making software discounted the sensory data as a false positive. One presently available technological solution would entail changing the parameters that determine what data may be deemed a false positive; however, the natural result of such a change (assuming the use of the same level of technology) would be that automated vehicles would stop, and possibly remain stopped, in the roadway for all kinds of objects and under numerous circumstances in which stopping is inappropriate and even unsafe. Some proponents of the currently available technology assert that humans simply need to act more predictably (i.e., not jaywalk), but the reality is that there will always be unexpected and relatively unpredictable conduct on or near the roadway by pedestrians, bicyclists, motorcyclists, and other motorists. Public acceptance of any driverless technologies will be lacking so long as automated vehicles cannot anticipate and respond to variables that ordinary human operators have to deal with all the time.
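The false-positive tradeoff described above is, at bottom, a threshold-tuning problem. The sketch below is a deliberately simplified, hypothetical illustration (the function, the confidence scores, and the thresholds are all invented for this example, and bear no relation to Uber’s actual software): lowering the threshold so fewer detections are discarded necessarily means braking for more benign objects.

```python
def should_brake(detection_confidence: float, threshold: float) -> bool:
    """Act on a sensed object only if it clears the confidence threshold;
    anything below the threshold is discarded as a false positive."""
    return detection_confidence >= threshold

# Hypothetical confidence scores for four sensed objects.
detections = [0.2, 0.45, 0.7, 0.95]

# A strict threshold discards marginal detections (the behavior the
# preliminary report describes); a permissive one brakes far more often.
strict = sum(should_brake(d, 0.9) for d in detections)      # brakes once
permissive = sum(should_brake(d, 0.3) for d in detections)  # brakes three times
```

There is no threshold that eliminates both failure modes with the same sensing technology, which is exactly the bind the paragraph above describes.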
Fourth, many proponents of automated vehicles contend that the entire driving landscape has to shift and adapt along with new technologies. They maintain that intersections, exits, roadway surfaces, signage and lighting all have to change and be updated in order to work cohesively with autonomous vehicles. Those kinds of changes take significant time and coordination between and among various local, state and federal entities.
And fifth, there is a moral, or philosophical, aspect to the programming (the algorithms that provide the decision-making factors necessary for automation) currently being developed for autonomous vehicles. For example, in the face of a hazard or potential collision, is the automated vehicle programmed to maximize the safety of those contained within that particular vehicle, or to maximize the safety of all motorists and passengers? It makes a big difference. Obviously, the legal questions implicated by these issues are just as vexing: What kind of duties do the programmers have? What constitutes reasonable care?
On the regulatory side, the main issue that advocates for commercial driverless technology face is that many of the Federal Motor Carrier Safety Regulations assume the existence of a human operator, as well as various instruments that are necessary for human operation of a tractor-trailer. Just last month, the Department of Transportation released Automated Vehicles 3.0: Preparing for the Future of Transportation. The update focuses on safety matters relevant to autonomous vehicles, as well as “building bridges between industry and government to successfully incorporate automated vehicle technologies into the U.S. transportation system.” As part of the DOT’s overall plan, the National Highway Traffic Safety Administration is putting out an Advance Notice of Proposed Rulemaking relating to a pilot program for safe testing of automated driving systems. Additionally, the FMCSA will soon issue a notice of proposed rulemaking regarding which FMCSRs apply to automated commercial vehicles, and which regulations might need to be changed. Most notably, Automated Vehicles 3.0 clearly states that motor carriers may participate in automated vehicle testing: for automated systems that do not fully comply with the current FMCSRs, the administration has invited exemption requests.
Larry Minor, the FMCSA’s associate director for policy and program development, has stated that the FMCSA’s advance notice of proposed rulemaking is intended to identify regulatory gaps, including the areas of inspection, repair and maintenance of automated vehicles. Relatedly, the DOT indicated in Automated Vehicles 3.0 that it will broaden the definitions of “driver” and “operator” in such a way that the regulatory terminology no longer assumes or implies that a human (as opposed to an automated driving system) is operating a commercial motor vehicle.
Despite the aforementioned obstacles (and others), there are a variety of companies and stakeholders leading the pack in the area of vehicle automation. Google’s Waymo has promised to launch a self-driving taxi service in the next couple of months. General Motors has pledged a rival service, using a car without a steering wheel or pedals, sometime in 2019. It is unclear whether either fleet will be capable of operating outside of designated areas or without a safety driver who can take over in an emergency situation. At the IAA Commercial Vehicles Show in Hanover, Germany, Volvo Trucks exhibited an autonomous truck concept that entirely removes the cab from the tractor. The “Vera” would only be deployed in specific, limited applications that would complement today’s vehicles. Specifically, Volvo’s vehicle was designed for operations with dedicated routes over relatively short distances, low travel speeds and high volumes of goods; “maximum speed would depend on the application, but about 25 mph would be typical.” Volvo indicated that it foresees a need for some degree of remote control operation in the event of something unexpected, such as a fallen tree blocking the road. Ike, a newcomer to the field, intends to develop technology strictly for highway driving. Recognizing the limitations of current and near-future autonomous technology, the co-founders of Ike are working to “develop a self-driving truck for highway hauling, in which conditions are more predictable.” Numerous other companies have adopted the same highway strategy and are presently focused on “platooning” technology, where a number of automated vehicles “track” and “follow” a lead vehicle that is—according to most plans—operated by a human driver.
Other autonomous vehicle initiatives have recently lost steam. Elon Musk has shelved plans for an autonomous Tesla to drive across the U.S., and “Uber has axed a self-driving truck program to focus on autonomous cars.” Daimler Trucks, despite earlier optimism, now says that the earliest commercial driverless trucks are at least five years down the road.
Fully automated commercial vehicles are not going to be commonplace for a significant period of time. One of the co-founders of Ike, Alden Woodrow, explained that “[i]t would be very foolish of us to try and build a truck that could do all of the things a truck driver could. That is potentially impossible. It would certainly take a very long time.” Implicit in his statement, however, is the assertion that semi-automated commercial vehicles are on the horizon, and lawyers involved in truck crash litigation are going to be impacted by those associated technologies.
For example, in cases involving automation, will nearly every case against a motor carrier also be a potential products case? How will changes to the regulations (made for the benefit of automated vehicle technologies and systems) affect relevant standards? How will public attitudes about automated vehicles affect litigation decisions and strategy? Will a “safety driver” or “safety operator” (the human in the automated truck) have to be qualified in the same way as a current CDL driver? Will regulations adequately define the situations in which safety operators are required to take over some or all operation of the vehicle? Are the standards applicable to automated vehicle programming (which is intentional, and designed in a non-emergency setting) and/or programmers comparable to the standards applied to human operators in emergency situations? Are the limitations of automated vehicles in, for example, adverse weather conditions comparable to the limitations of human drivers operating in adverse weather conditions? Will the increased use of automated vehicles (and general knowledge of that increased use) affect the standard of care for pedestrians, cyclists and motorists? For example, will knowledge of the use of new vehicles with different limitations make jaywalking a more serious offense?
Some of these questions are being addressed by engineers, software developers, trucking companies, safety advocacy groups, and government agencies, but some are not, and only time will tell how they are answered. In the meantime, everyone should download a copy of Automated Vehicles 3.0 (available at www.transportation.gov/av) and pay close attention to proposed changes to vehicle equipment and driver regulations.
 Automated Vehicles 3.0: Preparing for the Future of Transportation, US Department of Transportation (October 2018).
 Kyle Stock, “Self-Driving Cars Still Trying to Navigate Bad Weather,” Transport Topics (September 17, 2018).
 Jeremy Kahn, “To Get Ready for Robot Driving, Some Want to Reprogram Pedestrians,” Transport Topics (August 16, 2018).
 See Automated Vehicles 3.0: Preparing for the Future of Transportation, US Department of Transportation (October 2018).
 Brian Straight, “FMCSA plans proposed rulemaking to address regulations slowing development of autonomous technologies,” FreightWaves (October 4, 2018).
 Seth Clevenger, “Volvo Exhibits Autonomous Vera Concept,” Transport Topics (September 20, 2018).
 Joe Antoshak, “New Autonomous Trucking Company Ike Aims to Bring ‘Some Patience’ to Field,” Transport Topics (October 23, 2018).