Tuesday 4 August 2020

Coded Bias

Systems can only be as good as the data underlying them (and even then).

This documentary talks to a few different people about how machine learning algorithms are being rolled out everywhere and used to run things. Especially how the people deploying the algorithms don't really understand them. And how the algorithms encode biases that are already in the data but are socially accepted anyway (especially disfavouring women and people of colour). London already has facial identification in place, and it's going wrong. China has identification everywhere and people are happy about it (honest, guv). And America is looking into it. Not yet, but let's be honest, it isn't far off.

While we see cases where the algorithms are deficient, an algorithm being efficient isn't necessarily good either. What you get out depends a lot on what you put in, and on what was programmed by those who control the programs. Not that that's always known.

And even if it is known (a point the film doesn't address, though it brings up résumé screening and teacher assessments), what gets measured gets done. So if people find out "you need X", then "X" is what they will focus on. We already see this in what gets taught so students can pass exams; now extend that to whether or not you get hired, or can keep your job, or get a mortgage, or... That's just as unequal as everything else.

We definitely need to keep an eye on this, but also acknowledge something basic: as soon as something happens (as it already has) that lawmakers can point to and say "we need technology to do this for us", it will be implemented, with all its inherent bugs in place.

[END]