Periodically, I want to copy files from OneDrive to a SharePoint Online library, or from somewhere other than OneDrive, such as Outlook. What I want is a dialog prompt to input the file metadata for that library. I've searched and found something from 2017. I also see plenty of things to send an email (it seems sending emails is all Flow does, wow). Bottom line: copy a file into the library, then give me a dialog box to input the metadata.

What exactly are the boundaries and editorial criteria of that censorship, and how do they differ among search engines? More importantly, what kinds of mistakes do these algorithms make in applying their editorial criteria? To answer these questions, I automatically gathered autosuggest results from hundreds of queries related to sex and violence in an effort to find those that are surprising or deviant.

Armed with a list of 110 sex-related words, gathered from the linguistic extremes of both academic linguists and that tome of slang, the Urban Dictionary, I first sought to understand which words resulted in zero suggestions (which likely means the word is blocked). In the following diagram, you can see words blocked only by Google or Bing, and by both or neither. (See my blog for the methodological detail.) The results aren’t always pretty.

For example, both algorithms think “prostitute” is just dandy, suggesting options for prostitute “phone numbers” or “websites.” They’re not about sexual deprivation: Bing is happy to complete searches for “masturbate” and “hand job.” Conspicuously, Bing does block query suggestions for “homosexual,” raising the question: Is there such a thing as a gay-friendly search engine? In response, a Microsoft spokesperson commented that “sometimes seemingly benign queries can lead to adult content,” and consequently such queries are filtered from autosuggest. By that logic, it would seem that “homosexual” merely leads to “too much” adult content, causing the algorithm to flag and filter it.

For something like child pornography, the legal stakes get much higher. According to Ian Brown and Christopher Marsden in their book Regulating Code, “Many governments impose some censorship in their jurisdiction according to content that is illegal under national laws.” So it’s not entirely surprising that, in order to head off more direct government intervention, corporations like Google and Microsoft self-regulate by trying to scrub their autocomplete results clean of suggestions that lead to child pornography.

As shown in the next figure, both algorithms do get much stricter when you add “child” before the search term. Bing blocks “child nipple,” for instance. But there are some conspicuous failures as well. While you might think it wry that Google and Bing suggest completions for “prostitute,” the fact that Google also offers completions of “child prostitute” for “images” or “movies” is far more alarming. Moreover, searching for “child genital” or “child lover” on Google or Bing, as well as “child lust” on Google, all lead to disturbing suggestions that relate to child pornography. Querying “child lover,” for instance, offers suggestions for “child lover pics,” “child lover guide,” and “child lover chat.” Given Google and Microsoft’s available technology and resources, combined with their ostensible commitment, it’s hard to believe that these types of errors slipped through the cracks.
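The autosuggest-gathering method described above can be sketched in a few lines of Python. This is only an illustrative sketch, not the author's actual tooling: it assumes the unofficial Google suggest endpoint at `suggestqueries.google.com` and its `client=firefox` JSON response shape `["query", [suggestion, ...]]`, both of which are undocumented and may change.

```python
"""Sketch: probe a word list against an autosuggest endpoint and flag
words that return zero suggestions (which likely means they are blocked).

Assumption (not from the article): the unofficial Google suggest
endpoint and its client=firefox response format.
"""
import json
import time
from urllib.parse import urlencode
from urllib.request import urlopen

SUGGEST_ENDPOINT = "https://suggestqueries.google.com/complete/search"


def build_suggest_url(query: str) -> str:
    """Build the request URL for one probe word."""
    return SUGGEST_ENDPOINT + "?" + urlencode({"client": "firefox", "q": query})


def parse_suggestions(payload: str) -> list:
    """Extract the suggestion list from a client=firefox JSON response,
    which looks like ["query", ["suggestion 1", "suggestion 2", ...]]."""
    data = json.loads(payload)
    return list(data[1])


def is_likely_blocked(suggestions: list) -> bool:
    """Zero suggestions likely means the probe word is filtered."""
    return len(suggestions) == 0


def gather(words):
    """Probe each word, pausing between requests to be polite."""
    results = {}
    for word in words:
        with urlopen(build_suggest_url(word)) as resp:
            results[word] = parse_suggestions(resp.read().decode("utf-8"))
        time.sleep(1.0)  # rate-limit the probes
    return results
```

Running `gather` over the 110-word list and applying `is_likely_blocked` to each result would reproduce the blocked/unblocked partition the diagram describes.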
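As for the OneDrive-to-SharePoint question at the top: Power Automate has no built-in "prompt for metadata" dialog, but one workaround is to script the copy yourself and ask for the metadata before uploading. The sketch below uses two real Microsoft Graph endpoints (small-file upload to a drive, then a PATCH of the item's `listItem/fields`); the site ID, item ID, and column names are hypothetical placeholders, and authentication is omitted.

```python
"""Sketch: copy a file into a SharePoint Online library and set its
library metadata via the Microsoft Graph REST API.

Hypothetical placeholders (not from the original question): site_id,
item_id, and the "Title" column name. An OAuth bearer token is assumed.
"""
GRAPH = "https://graph.microsoft.com/v1.0"


def build_upload_request(site_id: str, filename: str) -> tuple:
    """PUT the raw file bytes to create the drive item (files under 4 MB)."""
    url = f"{GRAPH}/sites/{site_id}/drive/root:/{filename}:/content"
    return ("PUT", url)


def build_metadata_request(site_id: str, item_id: str, fields: dict) -> tuple:
    """PATCH the drive item's listItem fields to set library column values."""
    url = f"{GRAPH}/sites/{site_id}/drive/items/{item_id}/listItem/fields"
    return ("PATCH", url, fields)


# Usage sketch: prompt for metadata, then send both requests with an
# authenticated HTTP client such as `requests`:
#   title = input("Title for this document: ")
#   method, url = build_upload_request(site_id, "report.docx")
#   requests.put(url, headers=auth, data=open("report.docx", "rb"))
#   method, url, body = build_metadata_request(site_id, item_id, {"Title": title})
#   requests.patch(url, headers=auth, json=body)
```

The two-step shape (upload content first, patch metadata second) is how Graph models library columns: they live on the `listItem` behind the `driveItem`, not on the file upload itself.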