DETROIT • As auto accidents go, it was not much. Twelve minutes before noon on a cool June day, a Chevrolet Bolt was rear-ended as it crawled away from a stop light in downtown San Francisco.
What made this crash noteworthy was the Bolt's driver - a computer.
In California, where companies such as Cruise Automation and Waymo are ramping up testing of self-driving cars, human drivers keep running into them in low-speed collisions.
The run-ins highlight an emerging culture clash between humans, who often treat traffic laws as guidelines, and autonomous cars that refuse to roll through a stop sign or exceed the speed limit.
"They don't drive like people. They drive like robots," said Mr Mike Ramsey, an analyst at Gartner who specialises in advanced automotive technologies. "They're odd and that's why they get hit."
Companies are now testing autonomous vehicles and watching how they interact with their human-driven counterparts as the makers prepare for a future in which their vehicles will be sharing the road.
What they have found is that while the public may most fear a marauding vehicle without a driver behind the wheel, the reality is that the vehicles are overly cautious.
They creep out from stop signs after coming to a complete stop and mostly obey the letter of the law - unlike humans.
Smoothing out that interaction is one of the most important tasks ahead for developers of the technology, said Mr Karl Iagnemma, chief executive of self-driving software developer NuTonomy.
"If the cars drive in a way that's distinct from the way that every other motorist on the road is driving, there will be, in the worst case, accidents and - in the best case - frustration," he added.
"What that's going to lead to is a lower likelihood that the public is going to accept the technology."
Sensors embedded in autonomous cars allow them to "see" the world with far more precision than humans. But the cars struggle to translate visual cues on the road into predictions about what might happen next, Mr Iagnemma said.
They also struggle to handle scenarios they have not encountered before.
California is the only American state that specifically requires reports when an autonomous vehicle is involved in an accident.
The records show vehicles in autonomous mode have been rear-ended 13 times since the beginning of last year, out of 31 collisions involving self-driving cars in total, according to the California Department of Motor Vehicles.
The collisions almost always occur at intersections rather than in free-flowing traffic.
A Cruise autonomous vehicle was rear-ended last month, for example, while braking to avoid a vehicle drifting into its lane from the right as traffic pulled away from a green light.
Waymo's Firefly autonomous vehicle prototypes were rear-ended twice at the same intersection in Mountain View, California, in separate incidents less than a month apart last year. In both cases, the cars were preparing to make a right turn when they stopped to yield to oncoming traffic and were hit from behind.
Another time, a vehicle was rear-ended by a cyclist after it braked to avoid another car. And a truck racing to pass a slow-moving self-driving vehicle before a stop sign clipped it as it scooted back to the right.
The state's crash reports do not assign blame and provide only terse summaries of the incidents, but a few themes are common. They are almost always low-speed collisions with no injuries.
Elsewhere, Sergeant Alan Pfohl, a spokesman for the Phoenix Police Department, said the only crash he was aware of was one last March, in which a self-driving Volvo sport utility vehicle operated by Uber Technologies was toppled after being hit by another vehicle that failed to yield.
"Technology can always fail, but so can humans," he said.