We should opt into data tracking, not out of it, says DuckDuckGo CEO Gabe Weinberg (Vox)
On the latest Recode Decode with Kara Swisher, Weinberg explains why it’s time for Congress to step in and make “do not track” the norm.

This is a long and very in-depth interview with Gabe Weinberg, covering several interlinked topics. First is privacy, which is DuckDuckGo’s raison d’être. Near the end of this topic, there’s some talk about why some people don’t care about the privacy impact of the data collection underpinning the mainstream web.

One of the things a lot of people do bring up with me still, though, is, “Well, I don’t really care. I don’t have much to hide. It doesn’t matter.” I get that all the time. Like, who cares if they know if I went to Best Buy and bought a, whatever I bought. Talk to why that might be not the best way to think about it.

There’s two answers to that. One is philosophical, in that privacy is a fundamental human right, and so you don’t need to care or hide anything to exercise your rights. You wouldn’t say that for speech. Just because you have nothing to say doesn’t mean you should never have free speech. That’s kind of on the philosophical side.

On the harm side, there are some that people don’t realize. A lot of people really don’t like the creepy ads following them around. Some people seem to be fine with that. At a deeper level, there’s this thing called the filter bubble, which is that recommendation algorithms, and in particular, search results, are tailored to you, and that means that you’re not seeing what everyone else is seeing, and that actually distorts the democracy. That’s a real harm to individual people and society.

I don’t think I’ve seen the “I’ve got nothing to hide” vs. right-to-privacy argument reframed as a “nothing to say” vs. right-to-free-speech argument before. I haven’t thought about it enough yet to get a feel for whether it holds up under scrutiny, but at first blush it seems good.

The next topic covered after privacy is the “filter bubble” and how the idea of it has gone mainstream in the last few years:

I’ll give you an example. We’ve been talking about the filter bubble for years. In 2012, we ran a study on Google that we think influenced the 2012 election, that’s how long ago it was, but nobody … we had to speak for 10 minutes to explain what the filter bubble was back then. But after 2016, in the last two years, now we can talk about the filter bubble, just name it and people know what it is, generally. How many people know what the filter bubble is, I’m just curious?

Explain the filter bubble.

Well, it’s the idea — first of all, that percentage is very high, so I like that — but it’s the idea that for search in particular, as an example, when you search, you expect to get the results right? If you searched for gun control or abortion, you expect, we search at the same time right here, you would expect to get the same thing. But that’s actually not what we found when we did a study on Google.

Yes, there could be different search results.

Yeah, and people don’t realize that. So in addition, we found that it varies a lot by location, and so if you take that to the extreme, let’s say that voting districts are getting different results for candidates or issues, it can skew the polarization of that district very easily over time. Because people who are undecided are actually searching for these topics, and people generally click on the first link, and if you’re controlling that first link in that district, that’s what people are going to learn about.

I haven’t had time to read the entire transcript yet (it’s pretty long), but I’m going to try to digest it over a couple of sessions.

Forget privacy: you're terrible at targeting anyway (apenwarr)
The state of personalized recommendations is surprisingly terrible. At this point, the top recommendation is always a clickbait rage-creating article about movie stars or whatever Trump did or didn't do in the last 6 hours. Or if not an article, then a video or documentary. That's not what I want to read or to watch, but I sometimes get sucked in anyway, and then it's recommendation apocalypse time, because the algorithm now thinks I like reading about Trump, and now everything is Trump. Never give positive feedback to an AI.

Every service I use which has moved to “personalised recommendations/discovery” using some ML algorithm has gotten worse by doing so.

Every. Single. One.