Check out what's happened to, he said, which "just announced layoffs for half the staff. Facebook has taken their content and, with it, their audience, forcing them to pay to place their own new videos. And they keep increasing the price until it's prohibitively expensive. Your content goes viral, and you don't make any money."

But isn't complaining that the internet is a device to enrich tech moguls at public expense like objecting to progress? Aren't the internet algorithms that tell us which YouTube videos to watch, which old classmates to friend on Facebook, and which celebrities wear cooler clothes on Instagram based on math, demographics and logic?

It's worse than "garbage in, garbage out," Weinberg says. He tells how some of his heroes, investigative reporters working for old- and new-style news organizations, have found "significant bias in the algorithms." See Kashmir Hill's work for Gizmodo: her takedown of the "Internet of Things," showing how companies collect and interpret your data from devices and services you pay for, in "The House That Spied on Me," and how Google uses its power to quash ideas it doesn't like.

Government isn't ready to shield us, Weinberg says: Congress allowed the old prohibition against your internet service provider selling your browser history to expire last year. So Verizon, Comcast and other "ISPs can now collect and sell your data, on where you visit. A lot of it turns out to have been unencrypted. Hulu shows have been unencrypted." Your preferences are easier to track, list and sell.

So it's extra worrisome how even government agencies are putting personal data to work, no matter the accuracy and privacy concerns. Here's how it works: IT contractors/consultants come in to government agencies "and say, 'Your old system is not good. We'll make a better one for you,'" Weinberg says. "The algorithms are often not complicated. But there is an argument they should be made available to the public, under transparency laws."

Aren't government applications of personal data public? No: "Because they get the algorithm from a company, it's 'proprietary.' So news organizations and pro bono lawyers are suing to get the algorithms," Weinberg told me. "And every time there are unintended consequences."

Or the agency will justify a costly new personal-data processing application by saying, "This is Artificial Intelligence," as if that alone is a reason to use the new tool. But it's a tool someone put together using limited information, probably in secret, with unintended but significant consequences, putting some people at a disadvantage and biasing results. "When you hear about it, you should be saying to yourself, 'Maybe I can get this uncovered!'" Weinberg admires how nonprofit ProPublica used freedom-of-information (aka right-to-know) law requests to show how New York State judges "bought an algorithm to assist in sentencing suggestions" that had the effect of concentrating perceived bias rather than making sentencing dispassionate. Weinberg says public financing programs using similar algorithms can similarly entrench neighborhoods' economic character. (For more on how bad data science makes bad public policy, Weinberg recommends Cathy O'Neil's book "Weapons of Math Destruction.")

I asked about NeoGov, the digital hiring system that Pennsylvania and 25 other states are supposed to be using to replace the old system of state job exams. "It sounds very much worth checking for unintended bias," Weinberg says.

Or consider the political impact of YouTube: "Its goal is to get users to keep watching 'related' videos. These tend to be extreme and outrageous, and to push viewers toward extreme-outrageous positions, even if that is not intended," as Wired notes.