Example: Airbags and Gender Bias in Algorithms

“Algorithms are engineered by people, at least at some level, and therefore they may include certain biases held by the people who created them. Everyone is biased about something. For example, airbags were designed around assumptions about the male body, making them dangerous for women, because the designers were men. The same sort of bias that went into designing airbags can be built into algorithms.”

Crash tests with dummies are a necessary step before a car reaches the market. Past data show that men make up the majority of people injured in car accidents, yet women are 71% more likely than men to be slightly injured, 47% more likely to be seriously injured, and 21% more likely to die. After accounting for differences in height and weight between men and women, seat-belt use, crash severity, and so on, only one explanation for the gap in injury and death rates remains –

the car design is based on the average adult male body.

According to the regulations (C-NCAP, New Car Assessment Program), when a car undergoes a frontal crash test, a male dummy is placed in the front driver and front passenger positions to measure injuries to front-row occupants. A female dummy is placed in the leftmost seat of the second row, and a child dummy in a child safety seat in the rightmost seat.

In the default test scenario, the woman is never the driver; she can only appear as a passenger in the rear.
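To make the point concrete, the seating defaults described above can be written down as data. This is a minimal sketch, not any official C-NCAP specification: the dictionary keys and dummy labels are illustrative names, but the arrangement mirrors the test scenario the regulation describes.

```python
# Illustrative encoding of the frontal-crash seating defaults described above.
# Key names and dummy labels are assumptions for this sketch, not official
# C-NCAP terminology.
FRONTAL_CRASH_SEATING = {
    "front_driver": "male_dummy",
    "front_passenger": "male_dummy",
    "rear_left": "female_dummy",
    "rear_right": "child_dummy_in_safety_seat",
}

def dummy_in_driver_seat(seating: dict) -> str:
    """Return which dummy type occupies the driver seat in a test scenario."""
    return seating["front_driver"]

# In the default scenario, the driver is always the male dummy; the female
# dummy appears only as a rear passenger.
assert dummy_in_driver_seat(FRONTAL_CRASH_SEATING) == "male_dummy"
assert FRONTAL_CRASH_SEATING["rear_left"] == "female_dummy"
```

Once a default like this is fixed in a test protocol or a dataset, every result derived from it silently inherits the same assumption, which is exactly how design bias propagates into algorithms.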
