27 points by sampo 2 days ago | 6 comments
  • Havoc 2 days ago
    >“We know that China, Iran, Russia, Turkey, and North Korea are using bot networks to amplify narratives all over the world,”

    Add India and Israel to the list. Saw a pretty sharp uptick in supportive content for each immediately after violence broke out.

    There is also a staggering amount about the US military, though there it's a bit harder to tell whether it's organic. Content about, say, the F-22 could very well be -- it's pretty cool and presumably gets views.

  • dredmorbius 2 days ago
    One of the underappreciated aspects of information, publishing, and communications systems is that they can be used against you not only by their owners and operators, but also by other opportunists.

    Some years back a friend coined what I call "Woozle's Epistemic Paradox". Paraphrasing slightly:

    "As epistemic systems become used more widely or by influential groups, there is substantial power to be had by influencing the discussions that take place."

    The original formulation is more verbose:

    Our present epistemic systems are undergoing kind of the same shock that the online community underwent when transitioning from BBSs and Usenet to the commercial web to social media.

    We were used to a very high content-to-BS ratio because it took a certain amount of intelligence and intense domain-interest for people to be there in the first place -- and we've now transitioned to a situation where many people are there more or less accidentally and (the worst part), because of a high percentage of the population being present, there is now substantial power to be had by influencing the discussions that take place.

    Science is much the same. For a long time, it was this small thing operating off to the side; only elites could afford to indulge in it, and their discoveries affected very few -- so the truth value could remain high because there was relatively little to be gained by distortion. People's lives were largely governed by things that had been around long enough that the culture had evolved to deal with them more or less reasonably, so they didn't need advice from domain experts to provide accurate information -- and where expertise was needed, it flowed from parent to child and from master to apprentice as part of a cultural process that everyone understood.

    <https://web.archive.org/web/20210904005401/https://old.reddi...>

    (Originally posted to Google+ ~2017.)

    Which means that online forum moderation has to be performed with this in mind. Spammers and trolls are the easy stuff to root out (and they're hard enough). It's the spinners, manipulators, and propagandists who are truly insidious.

    As a related question, one might ask what the merits of the "marketplace of ideas" are, and what the ultimate value and goals of free speech itself might be.

    (Both have been recent topics of David Runciman's truly excellent Past Present Future podcast. The first in a series on bad ideas, the second on revolutionary ones. They're more closely wedded than Runciman realises (and he does at least recognise this), and are likewise related to free market advocacy itself.)

    The History of Bad Ideas: The Marketplace of Ideas: <https://www.ppfideas.com/episodes/the-history-of-bad-ideas:-...>

    The History of Revolutionary Ideas: Free Speech: <https://www.ppfideas.com/episodes/the-history-of-revolutiona...>

    (Both links are to audio, no transcript.)

    For an excellent take on the Marketplace of Ideas trope and its relationship to John Stuart Mill, I strongly recommend Jill Gordon's "John Stuart Mill and the 'Marketplace of Ideas'" (1997) <https://doi.org/10.5840/soctheorpract199723210> Social Theory and Practice, Volume 23, Issue 2, Summer 1997, Pages 235-249.

  • silexia 2 days ago
    Reddit is truly terrible with this now. Unusable.
    • junky228 2 days ago
      I largely stopped using reddit a few years ago. I notice YouTube is horrible as well. Even crazier, YT will remove legitimate comments and censor legitimate videos, but leaves up obvious scam comments, obviously inappropriate bot comments, and inappropriate videos.
  • techpineapple 2 days ago
    It’s interesting to me that individuals rarely identify as susceptible to misinformation, and yet en masse it seems pretty clear that millions of us are vulnerable.
    • bigbadfeline a day ago
      Although this problem has been around forever (Mark Twain: “It's easier to fool people than to convince them that they have been fooled.”), there's a pretty good chance it can be solved if a determined community approaches it seriously enough. That of course raises the need to resolve problems like "determination" and "community", but there things get a bit circular, in addition to the obvious resource constraints.
  • grinlif123 15 hours ago
    I feel we've known about this for years -- or have I missed something in that article? What concerns me more is how LLMs/AI have obviously enabled this on an unimaginable scale, when it was already pretty bad.

    These days, when I go on reddit, I often notice things that make me think someone has an ulterior motive behind the posts or comments, one that involves convincing me of something. I can recall noticing it a number of times with overwhelmingly pro-Israel comments: after another hospital got blown up, there would suddenly be a hundred posts of things like (I think) a Palestinian man mistreating a dog, and they all had thousands of comments saying that justified the genocide. Just to be clear, I'm not trying to make this political; it's just what I noticed. It felt similar after Trump and Zelensky met in that 'why aren't you in a suit' meeting, when I suddenly saw a bunch of subreddits with videos of Ukrainian alt-right marches -- and again, comments that resoundingly agreed that Ukraine was not the perfect place we'd apparently been told it was.

    My point, though, is that it feels more and more like everything on the internet has an agenda behind it, and for me that's making the internet feel less like a social networking community between the people of the world and more like propaganda slop. It also feels harder and harder to find the truth about things, or a balanced view. Everything is just rage-bait, like the whole world is shouting at each other. What worries me is how much more divided we're getting in the real world, because you go out thinking these threads reflect the world, and now everyone's enraged with each other. Anyway, I've waffled on and on and still can't really articulate what I mean, so I'll give up.