Self-Driving Cars: Who Is Responsible for the Car's Accidents?

848 Words Feb 15th, 2016 4 Pages
of Self-Driving Cars”, sheds more light on the situation: “if we were driving [a] car in manual mode, whichever way we reacted would be understood as just that, a reaction, not a deliberate decision. It would be an instinctual, panicked move with no forethought or malice. But if a programmer were to instruct the car to make the same move, given conditions it may sense in the future, well, that looks more like premeditated homicide” (“The Ethical Dilemma of Self-Driving Cars” – Patrick Lin). Again, the questions “who should program these cars?” and “who is legally responsible for the car’s accidents?” arise.

All the scenarios presented might be unlikely to occur; however, “they illuminate hidden or latent problems in normal cases” (“The Robot Car of Tomorrow May Just Be Programmed to Hit You,” 3). The hidden problem is that self-driving cars do not possess innate ethics; therefore, their ethics must be programmed, a task that may be impossible given the obscure line between right and wrong. Ultimately, allowing these cars on the road would negatively impact our society, creating liability issues for manufacturers and owners as well as privacy issues for the cars’ owners.

Before even asking who should program these cars, we should consider whether it is possible to program morals at all. These philosophical questions are moral ones, and morality has no black or white, no right or wrong. “The thought-experiment [trolley problem] is a moral dilemma, because there’s no clearly right way to go” (“Here’s a…