Ethics 1

Zillow iBuying AI

Zillow, the online real estate marketplace, set up a program called Zillow Offers. The algorithm-based Zillow Offers iBuying program would buy, renovate, and then sell houses for a profit. After some time, Zillow discovered that its house-buying algorithm was significantly overestimating house prices and, as a result, purchasing far too many homes. The algorithm drove prices in certain areas so artificially high that prospective homeowners were unable to purchase the Zillow-flipped homes. The program was seen as a failure of the AI and machine learning algorithm developed by Zillow's software engineers.

This situation rests on dubious ethical grounds because it artificially drove up real estate prices, so that fewer real people were able to buy homes to live in. In general, there are ethical issues at hand when a major corporation can have such an enormous effect on a necessary human right and resource: shelter. In this case, the software engineers were simply carrying out a task given to them by executives. The product they made, the iBuying algorithm, performs an unethical function based on the decisions of Zillow as a company.

TikTok’s Recommendation Algorithm

TikTok is a video hosting service known for its recommendation algorithm, which curates a personalized video feed tailored to each user's interests based on their previous activity. Specifically, TikTok is known to pull from the following data points:

  • Followed accounts
  • Posted comments
  • Liked and shared videos
  • Favorited videos
  • Videos reported or marked as not interested
  • Video completion rate
  • Personally created content
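To illustrate how signals like these could feed a recommendation, here is a minimal sketch that combines per-video engagement signals into a single weighted score. The signal names, weights, and scoring function are hypothetical assumptions for illustration, not TikTok's actual model:

```python
# Hypothetical sketch: combine engagement signals into one recommendation
# score. Signal names and weights are invented, not TikTok's real system.
WEIGHTS = {
    "followed_creator": 3.0,   # user follows the video's creator
    "liked": 2.0,
    "shared": 2.5,
    "favorited": 2.5,
    "commented": 1.5,
    "completion_rate": 4.0,    # fraction of the video watched, 0.0-1.0
    "not_interested": -5.0,    # explicit negative feedback
}

def score(signals: dict) -> float:
    """Weighted sum of whatever signals are present for a video."""
    return sum(WEIGHTS.get(name, 0.0) * value for name, value in signals.items())

# A fully watched video from a followed creator scores far higher than a
# half-watched video the user marked "not interested", so it would be
# ranked higher in the feed.
engaged = score({"followed_creator": 1, "liked": 1, "completion_rate": 0.9})
ignored = score({"completion_rate": 0.3, "not_interested": 1})
```

The design point is that every interaction, positive or negative, nudges the ranking, which is why the feed converges on a user's interests so quickly.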

Addiction to TikTok is caused by a number of factors inherent to the app's design:

  • Fewer Choices to Make: Since TikTok constantly provides the next video to watch, user decision-making decreases. This puts the user on “auto-pilot,” enabling them to mindlessly consume their feed for hours.
  • Infinite Scroll: The availability of endless content makes it easy to stay glued to the app.
  • Low Downtime: Video-playing never ends. After finishing a video, the next one is available instantly. Users can also like videos and read/add comments without pausing.

Naturally, higher user activity is more profitable for TikTok, so the addictive nature of the app represents an unethical profit source designed by the developers.

This problem fits better under unethical software function than engineer behavior. The software is designed to be addictive to maximize user retention and the company’s profit. Naturally, a profitable company is better for its engineers, but it would be a stretch to consider the addiction of each individual user directly profitable for the engineers or a motivation for unethical engineer behavior.

Self-Driving Cars

Self-driving cars are more accessible today than ever, thanks largely to Tesla, the largest self-driving car manufacturer, which sells its Full Self-Driving feature for $15,000. However, this raises the question of whether the technology itself is safe and reliable. An AI-driven automobile can be more precise and consistent than a human driver, making it less likely to cause an accident: an algorithm can’t fall asleep, get distracted, or exceed the speed limit. But this holds only if the algorithm is comprehensively trained.

Who, then, decides how to program a self-driving car to prevent accidents? Every decision an autonomous vehicle makes has to be intentionally programmed and trained into it, because an algorithm can’t make an instinctive decision the way a human on the road can. Even with a highly fine-tuned algorithm, accidents can never be 100% prevented. In a crash where someone is likely to be injured no matter what, whom will the algorithm prioritize? Will the program be told to protect the driver, the passengers, pedestrians, or the other drivers on the road? Subjective biases are not an ethical way to train autonomous vehicles.
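To make the dilemma concrete, here is a minimal sketch of what a hard-coded crash-mitigation priority policy could look like. The group categories, their ordering, and the maneuver-selection logic are all invented for illustration; the point is that some engineer must write an ordering like this down explicitly:

```python
# Hypothetical crash-mitigation policy for an autonomous vehicle.
# The priority ordering below is an invented example; any real system
# would have to commit, explicitly, to some subjective ordering like it.
PRIORITY = ["pedestrians", "passengers", "driver", "other_drivers"]

def choose_maneuver(maneuvers: dict) -> str:
    """Pick the maneuver that spares the highest-priority group.

    `maneuvers` maps a maneuver name to the set of groups it endangers.
    The chosen maneuver endangers groups as far down PRIORITY as possible.
    """
    def worst_rank(endangered: set) -> int:
        # Lower rank = higher priority = worse to endanger.
        ranks = [PRIORITY.index(g) for g in endangered if g in PRIORITY]
        return min(ranks) if ranks else len(PRIORITY)

    # Prefer the maneuver whose most-endangered group ranks lowest in priority.
    return max(maneuvers, key=lambda m: worst_rank(maneuvers[m]))

# Braking endangers a pedestrian; swerving endangers the driver.
# Under this (subjective) ordering, the policy chooses to swerve.
decision = choose_maneuver({
    "brake": {"pedestrians"},
    "swerve": {"driver"},
})
```

However the weights or ordering are chosen, the choice encodes a value judgment about whose safety matters most, which is exactly the ethical problem the paragraph above describes.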