Ethics 1

Example 1: Use of Facial Recognition Technology in Law Enforcement

Police departments around the world have begun using AI-based facial recognition technology to fight crime in their jurisdictions. However, this technology has been criticized as unethical because it has shown racial bias and caused other problems in the communities where it has been deployed. Although the technology's overall accuracy was measured at over 90%, the results were not uniform across groups: some companies' facial recognition systems misidentified Black women at rates up to 34% higher than other groups (Harvard). This disparity shows why police use of facial recognition can be seen as unethical, especially when Black Americans are already more likely than White Americans to be arrested and incarcerated (Harvard). As for who is at fault for the technology being seen as unethical, our group thinks the problem lies with both the software engineers and the functions the software performs. The engineers are at fault because training the algorithm on a more diverse dataset could have improved identification accuracy for all groups of people. The function the technology performs is also part of the problem, since police tend to deploy facial recognition only in high-crime areas, which are often home to the very groups the technology misidentifies most often. Even so, it remains unclear to us whether this case falls more under problematic engineer behavior or problematic product behavior.
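As a rough illustration of the kind of audit we think the engineers should have performed, the short Python sketch below computes misidentification rates per demographic group from labeled evaluation results. The data, group names, and helper function are hypothetical and purely illustrative; this is not the methodology of any vendor or study cited above.

```python
from collections import defaultdict

def misidentification_rates(results):
    """Compute the fraction of incorrect matches per demographic group.

    `results` is a list of (group, correct) pairs, where `correct` is True
    when the system matched the probe image to the right identity.
    Group labels and data here are purely illustrative.
    """
    totals = defaultdict(int)
    errors = defaultdict(int)
    for group, correct in results:
        totals[group] += 1
        if not correct:
            errors[group] += 1
    return {group: errors[group] / totals[group] for group in totals}

# Hypothetical evaluation results: each tuple is (demographic group, match correct?).
sample = [
    ("group_a", True), ("group_a", True), ("group_a", False),
    ("group_b", True), ("group_b", False), ("group_b", False),
]

for group, rate in sorted(misidentification_rates(sample).items()):
    print(f"{group}: {rate:.0%} misidentified")

# A large gap between groups (like the disparity reported in the study cited
# above) signals that the training data or model needs rebalancing before
# the system is deployed.
```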

Sources:

Example 2: “Dieselgate” – Volkswagen Emissions Scandal

“Dieselgate,” also known as the Volkswagen emissions scandal, was a corporate scandal involving “defeat device” software. In 2015, the EPA charged Volkswagen with developing and installing this software on millions of its diesel vehicles worldwide. The software's purpose was to detect when a vehicle was undergoing regulatory emissions testing (by monitoring speed, position, and other signals for “laboratory conditions”) and, when it detected a test, to alter the vehicle's nitrogen oxide emissions so that it would pass U.S. emissions tests and seem “cleaner” than it really was. In reality, not only were the vehicles not as clean as Volkswagen claimed, but the software was hiding the fact that they were emitting up to 40 times the nitrogen oxide allowed in the U.S. (Hotten).

The scandal is not problematic simply because Volkswagen cheated some regulations; the cheating, implemented in software built by a team of engineers, was genuinely harmful, which makes the software itself unethical because it intentionally hid those harmful impacts. The Volkswagen emissions scandal is therefore a notable case of unethical software development in which the problem lies both in the behavior of the software engineers and in the functions performed by the defeat device software itself. As for the functions of the software, the additional pollution it helped hide was so great that researchers estimated it would lead to 59 premature deaths, along with numerous respiratory and cardiac health issues and negative environmental impacts related to ozone exposure (Vaughan). The software's functions were clearly unethical: it was designed to intentionally hide emissions with dangerous effects on health and the environment.

However, the problem also lies with those who designed the software. While the scandal was perpetuated and pushed by various Volkswagen executives, it began with the software engineers, who engaged in unethical behavior from the start. According to James Robert Liang, a Volkswagen engineer at the time, his team was tasked with developing a new diesel engine but realized it could not build one capable of meeting U.S. emissions standards, so it instead developed the defeat device software to get around those standards and hide the increased emissions during testing. The software was then hidden from the public, and both executives and engineers who knew of its existence lied to regulatory agencies like the EPA while continuing to promote the affected vehicles as “environmentally friendly” (U.S. Department of Justice). Thus, the scandal involved unethical software created and pushed by a team of software engineers who chose unethical behavior to make their jobs and goals easier.
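To make the mechanism concrete, the deliberately simplified Python sketch below illustrates the general pattern described above: the software infers “laboratory-like” conditions from sensor readings and applies a cleaner calibration only when it believes a test is underway. The sensor fields, thresholds, and mode names are invented for illustration and do not reflect Volkswagen's actual implementation.

```python
from dataclasses import dataclass

@dataclass
class SensorSnapshot:
    # All fields and thresholds are hypothetical stand-ins for the kinds of
    # signals mentioned above (speed, position, duration, etc.).
    steering_angle_deg: float   # dynamometer tests keep the wheels essentially straight
    wheel_speed_kph: float
    elapsed_minutes: float

def looks_like_lab_test(s: SensorSnapshot) -> bool:
    """Heuristic guess that conditions resemble a standardized test cycle."""
    return (s.steering_angle_deg < 1.0
            and s.wheel_speed_kph < 120
            and s.elapsed_minutes < 30)

def select_emissions_mode(s: SensorSnapshot) -> str:
    # The unethical part: the low-NOx calibration is applied only when the
    # software believes a regulator is watching.
    return "low_nox_test_mode" if looks_like_lab_test(s) else "normal_high_nox_mode"

print(select_emissions_mode(SensorSnapshot(0.2, 50.0, 10.0)))    # -> low_nox_test_mode
print(select_emissions_mode(SensorSnapshot(15.0, 130.0, 45.0)))  # -> normal_high_nox_mode
```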

Sources:

Example 3: Therac-25

The Therac-25 was a fully computer-controlled radiation therapy machine used to treat patients' tumors. Built by Atomic Energy of Canada Limited (AECL), these machines gave massive radiation overdoses to at least six patients between 1985 and 1987, causing severe injuries and three deaths (Baase). Many things went wrong: poor safety design, lack of testing, and concurrent programming errors in the software (Leveson). The Therac-25 was meant to be an upgrade of the earlier Therac-6 and Therac-20 machines and so reused similar software designs. However, the engineers were so confident in their software that they removed the hardware safety components used on the previous machines, which had prevented radiation from being delivered under unsafe conditions independently of the software (Baase). Another result of this overconfidence was inadequate testing. After learning of the incidents, investigators found that AECL could not produce meaningful documentation of the software's specifications, nor any testing plan, from the Therac-25's development (Baase). This case study highlights the serious risks of cutting corners, being careless, and avoiding responsibility as a software engineer, especially when dealing with a machine as potentially dangerous as the Therac-25. The software's designers clearly did not take its problems seriously, and as a result some people lost their lives while others suffered very serious, lifelong injuries. The Therac-25 thus serves as an important reminder to all of us, as software engineers, to always write and design with great care, to test every component of a system thoroughly, and to resolve issues as soon as they are reported, especially when the software can have a strong impact on people's lives every day.
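To show what a “concurrent programming error” of the kind Leveson describes can look like, the simplified Python sketch below demonstrates a check-then-act race condition: a control thread validates the treatment mode, an operator thread changes it before the action happens, and the safety check ends up applying to stale data. This is a generic illustration with invented names, not the actual Therac-25 code.

```python
import threading
import time

# Shared state read and written by two threads without synchronization.
# This mirrors, in a very simplified way, the class of race condition found
# in the Therac-25 software; all names here are invented.
treatment = {"mode": "electron", "beam_ready": False}

def operator_edits():
    # The operator quickly switches the mode after the control loop has
    # already read and validated the old settings.
    time.sleep(0.01)
    treatment["mode"] = "xray_full_power"

def control_loop():
    mode = treatment["mode"]      # (1) read the mode
    time.sleep(0.05)              # (2) other work; the edit sneaks in here
    if mode == "electron":        # (3) the check uses the stale value
        treatment["beam_ready"] = True
        print(f"Beam enabled, but the actual mode is now: {treatment['mode']}")

t1 = threading.Thread(target=control_loop)
t2 = threading.Thread(target=operator_edits)
t1.start(); t2.start()
t1.join(); t2.join()

# The beam is enabled under a mode the check never saw, which is exactly why
# hardware interlocks independent of the software matter.
```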

Sources: