Moral psychologist Azim Shariff believes that self-driving cars will inevitably be programmed to shift risk towards some people and away from others.

“Consider an autonomous car that is deciding where to position itself in a lane – closer to a truck to its right, or a bicycle lane on its left,” said Shariff, an associate professor at the University of British Columbia in Vancouver. “If cars were always programmed to be slightly closer to the bicycle lane, they may slightly reduce the likelihood of hitting other cars, while slightly increasing the likelihood of hitting cyclists.”

Shariff is a co-author of The Moral Machine Experiment, a new study of the ethical judgments that autonomous car manufacturers have yet to make. Published last week in Nature, the study drew on nearly 40 million decisions recorded by the online Moral Machine game, in which volunteers chose, for example, whether a car should hit a pregnant woman or swerve into a wall and kill its four passengers. Other dilemmas asked players to choose between an athlete and an overweight person, a child and a senior, and many similar trade-offs.

The results varied across regions, reflecting each culture’s preference for saving women over men or protecting the old over the young. Some regions even went as far as deprioritizing the lives of “jaywalkers” in favor of those who followed pedestrian crossing laws.

The issue posed by The Moral Machine is a version of the trolley problem, a philosophical thought experiment in which a person must choose between allowing a runaway trolley to kill the five people in its current path or switching it to another track, killing one person who would otherwise be unharmed.

Although The Moral Machine did not feature any cyclists, Shariff said that self-driving cars would have to interact with them once they are on the road. “Over millions or billions of [passing maneuvers in which cars must decide whether to drive closer to another motor vehicle or to a cyclist], either more cyclists will die, or more passengers will die.”

This follows numerous reported run-ins between autonomous cars and cyclists: a Google car stopping at an intersection behind a cyclist who was also stopped, a Nissan car passing a cyclist without the ample space mandated by law, and a Mercedes driverless-car technician allegedly claiming that the technology would always favor the lives of those in the car over those outside it.

“One of the biggest problems is people with bicycles,” said Renault Chief Executive Carlos Ghosn. “The car is confused by [cyclists] because from time to time they behave like pedestrians and from time to time they behave like cars.”

Even if self-driving cars kill some cyclists, one could argue that they are no more dangerous than human-driven cars. Shariff goes further, arguing that the net number of cyclists killed would fall. “Autonomous cars likely won’t be let on the roads in large numbers until they are significantly safer than current drivers. So, while the ratios of cyclist deaths compared to pedestrian or passenger deaths might not improve (and could potentially get worse), the total numbers of cyclist deaths will hopefully fall.”