'I' for information, not intelligent
As more of our lives go online, companies and governments are turning to automated methods to keep things in check—face recognition, profiling, personal data acquisition: all very Orwellian. I don't know which is more worrying: the consequences when it works, or the consequences when it doesn't work as expected or intended.
Big IT isn't nearly as clever as it thinks it is. Hell's teeth, G∞gle can't even detect your language preference! And getting through to a real human being, who will take permanent corrective action, is no walk in the park. The chances are high that, even if real human intervention corrects the error, the oh-so-clever algorithm will come along and shit all over it again.
F*c*book
Mike Hall, a photographer based in Winchester, England, knows all about this. F*c*book has blocked his adverts for featuring "overtly sexual" or otherwise objectionable photographs of wildlife, buildings, and landscapes. Some of the reasons for these rejections are really quite hilarious, including:
- a Hong Kong skyline, on the basis that there was "nothing for sale" in the photograph itself;
- a set of tramlines, which went against its ticket sales policy;
- a neon sign of the word "DISCO", for promoting alcohol.
Mr Hall has appealed every one of the rejections, without receiving a response or being able to contact any of the morons at F*c*book.
G∞gle
Meanwhile, Kevin MacLeod at Incompetech has been having similar problems with automated rejections of G∞gle ads placement on his logarithmic graph paper generator. Not once, but four times. The reasons given were that the content was "hurtful" or that there was "no content"; make up your mind, G∞gle.
As he updates the page with each rejection, he wryly observes: "At some point, claiming there is no content here is going to provide this page with enough content."
Although annoying for the small businesses involved, the implications for future "clever" algorithmic automation are disturbing. Best not to think about it then.