A1. Ethics

Cases:

  1. Elizabeth Holmes and Theranos
    • Description: Elizabeth Holmes founded Theranos, a company that claimed its technology could detect medical conditions such as cancer and high cholesterol from only a small amount of blood. Holmes sold investors and the public on this machine without showing how the technology worked or revealing any information about it. The company raised millions of dollars in funding on the promise of reducing the cost of medical testing. In fact, the technology did not even exist.
    • Unethical: Patients who submitted blood samples received diagnostic results from technology that did not work. This was unethical of Holmes and anyone else involved in persuading investors to fund, and patients to rely on, a machine that could not produce accurate results. Worse, some patients acted on these diagnoses and made medical decisions. Lying to patients about their medical conditions is harmful, especially when those patients make health-affecting decisions based on the results.
  2. Volkswagen Emissions Scandal
    • Description: Volkswagen is one of the largest automobile manufacturers in the world. Founded in 1937 under the Nazi Party, it went on to become a world innovator in diesel engines and was the top-selling car brand in 2016 and 2017. Selling cars worldwide, Volkswagen is subject to car-industry regulations such as those enforced by the US Environmental Protection Agency (EPA). In what became known as the Emissionsgate or Dieselgate scandal, the Volkswagen Group was accused in September 2015 of violating the EPA's Clean Air Act regulations. Roughly 11 million Volkswagen and Audi vehicles ran software that, when it detected a testing environment, switched the car into an eco mode that passed all the regulatory limits. Outside of testing, the software switched back to normal mode, producing emissions up to 40 times the Clean Air Act's limits.
    • Unethical: This is evidently an unethical use of software, because it disguised the cars' true emissions. As the linked article notes, it is highly unlikely that the software engineers did not know what they were building, so this is a case where both the product and the engineers' behavior were unethical. The harm is intellectual, since the data that was recorded (and most probably transmitted and used) was not real, and physical as well, since 11 million such cars contribute significantly over time to the greenhouse effect, global warming, and air pollution.
  3. TikTok’s Algorithm
    • Description: TikTok has quickly become the most popular social media platform in the world, and that happened by design. The app consists of an endless scroll of videos at most three minutes long. Its most popular feature is the “For You” page, a feed of videos completely personalized to an individual’s account. What sets it apart from other social media is that the personalization is based not only on what you like but on physical characteristics as well. The arguably addictive algorithm displays videos that make the app feel more like entertainment than social media: there is a video for any topic you can think of. The engineers designed the app so that every video you watch gives the algorithm insight into who you are and what you prefer.
    • Unethical: The extreme personalization of TikTok’s algorithm creates an echo chamber: it shows only what it believes the user wants to see and hides what they don’t, which leads to intense confirmation bias. This is unethical because an echo chamber removes an important part of socialization, exposure to varied perspectives, that produces a more well-rounded belief system. Studies have shown how easily users fall down the alt-right pipeline into bigoted videos, and the echo chamber is even more harmful when it consists of such content. Finally, heavy phone use carries dangers of its own, such as anxiety, depression (from comparing oneself to others), and lost productivity. Building an algorithm of this type clearly has an unethical side, particularly when considering younger users.
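The defeat-device behavior described in case 2 can be sketched in a few lines. This is a hypothetical illustration only: the function names, signals, and thresholds are invented assumptions, not Volkswagen's actual code. The idea it demonstrates is real, though: on an emissions dynamometer the drive wheels spin while the steering wheel stays fixed, and software can key on signatures like that to decide which mode to run.

```python
# Hypothetical sketch of a "defeat device" pattern. All names and
# thresholds are illustrative assumptions, not VW's actual implementation.

def looks_like_dyno_test(steering_angle_deg: float, wheel_speed_kmh: float) -> bool:
    """On a dynamometer the drive wheels turn while the steering wheel
    stays essentially fixed -- one signature such software could detect."""
    return wheel_speed_kmh > 0 and abs(steering_angle_deg) < 0.5

def select_emissions_mode(steering_angle_deg: float, wheel_speed_kmh: float) -> str:
    # Full exhaust treatment only when a test environment is suspected;
    # otherwise run a mode that emits far above the legal limit.
    if looks_like_dyno_test(steering_angle_deg, wheel_speed_kmh):
        return "eco"     # passes the regulatory limits
    return "normal"      # real-world driving, up to ~40x the NOx limit

print(select_emissions_mode(0.0, 50.0))   # dyno-like conditions -> eco
print(select_emissions_mode(12.0, 50.0))  # road-like conditions -> normal
```

The ethical point is visible in the structure itself: the branch exists only to make measured behavior differ from real behavior.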
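The echo-chamber feedback loop described in case 3 can also be sketched minimally. The data, topics, and scoring rule below are invented for illustration; TikTok's actual ranking system is not public. The sketch shows how a recommender that ranks purely by past engagement converges on a single topic after only a few watches.

```python
# Minimal sketch of a personalization feedback loop. The candidate videos
# and the scoring rule are invented assumptions, not TikTok's algorithm.
from collections import Counter

videos = [("v1", "politics"), ("v2", "cooking"), ("v3", "politics"),
          ("v4", "sports"), ("v5", "politics")]

def recommend(watch_history: Counter, candidates):
    # Rank candidates purely by how often the user has already watched
    # that topic: each watch makes similar videos more likely to surface.
    return max(candidates, key=lambda v: watch_history[v[1]])

history = Counter()
feed = []
for _ in range(4):
    vid, topic = recommend(history, videos)
    history[topic] += 1   # watching reinforces that topic's score
    feed.append(topic)

print(feed)  # the feed collapses onto one topic: ['politics'] * 4
```

After the very first watch the loop never escapes the initial topic, which is the confirmation-bias dynamic the case describes, compressed into four iterations.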