Week 6 Reply

Tyler

Hello Everyone,

According to Shin, Amazon's hiring algorithm is one example of AI bias. In 2015, Amazon realized that the algorithm it used to screen resumes was biased against women: because the majority of past applicants were men, the model was trained on data that favored men, and it selected men more often than women (Shin, 2020). Another example from Shin is COMPAS, an algorithm used in sentencing to predict the likelihood that a defendant will reoffend. An analysis found that the algorithm rated Black defendants as higher risk of reoffending than white defendants (Shin, 2020).
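To make that mechanism concrete, here is a minimal sketch (the records and numbers are invented for illustration and are not Amazon's actual system or data) showing how a skewed history of hiring decisions already contains the pattern a model trained on it would learn.

```python
# Minimal sketch: measuring selection rates by group in historical hiring data.
# The records below are made up for illustration; a real audit would use the
# organization's actual applicant and outcome data.

from collections import defaultdict

# Each record: (gender, was_hired) from a hypothetical historical dataset
history = [
    ("male", True), ("male", True), ("male", False), ("male", True),
    ("male", True), ("male", False), ("female", False), ("female", True),
    ("female", False), ("female", False),
]

hired = defaultdict(int)
total = defaultdict(int)
for gender, was_hired in history:
    total[gender] += 1
    hired[gender] += was_hired

for gender in total:
    rate = hired[gender] / total[gender]
    print(f"{gender}: historical selection rate = {rate:.0%}")

# A model trained to reproduce these historical decisions will learn the same
# skew: if men were hired at a much higher rate, the model tends to score
# male-associated resumes higher, which is the pattern Amazon observed.
```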

In order to build fairness into AI systems, there needs to be awareness of bias while building them. There should also be a governance structure that can detect and mitigate bias in data collection. When it comes to bias, I feel that race or sex should never be an issue that is overlooked with AI: these biases can keep people from getting proper healthcare, make them more vulnerable in the judicial system, and deny them a fair chance at obtaining a job. At the same time, AI having biases can force us to confront those biases and make change.
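As one hedged example of what a "detect" step in such a governance structure might look like, the sketch below compares a model's selection rates across groups and flags large gaps using the four-fifths (80%) rule of thumb. The group names and predictions are hypothetical, and a real fairness review would look at more than one metric.

```python
# Illustrative pre-deployment check: compare a model's selection rates across
# groups and flag a large disparity (four-fifths / 80% rule of thumb).
# The predictions below are made up; a real review would use held-out data.

def selection_rate(decisions):
    """Fraction of candidates the model selected (1 = selected, 0 = not)."""
    return sum(decisions) / len(decisions)

predictions_by_group = {
    "group_a": [1, 1, 0, 1, 1, 0, 1, 0],   # hypothetical model decisions
    "group_b": [0, 1, 0, 0, 1, 0, 0, 0],
}

rates = {g: selection_rate(d) for g, d in predictions_by_group.items()}
highest = max(rates.values())

for group, rate in rates.items():
    ratio = rate / highest
    status = "OK" if ratio >= 0.8 else "FLAG: investigate and mitigate"
    print(f"{group}: rate={rate:.0%}, ratio to highest={ratio:.2f} -> {status}")
```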

References

Shin, T. (2020, June 4). Real-life examples of discriminating artificial intelligence. Medium. Retrieved March 23, 2022, from https://towardsdatascience.com/real-life-examples-of-discriminating-artificial-intelligence-cae395a90070
