3 points by burnerToBetOut 25 days ago | 2 comments
  • hiAndrewQuinn 24 days ago
    This is actually one area of research for me as a recently-minted DevSecOps engineer. Most cybersecurity attacks are relatively unsophisticated, and succeed by being scaled up so broadly that they land on a few soft targets anyway - but a Ralph Wiggum loop on even a scaled-up local edge model could make those kinds of techniques much, much more terrifyingly sophisticated for actors with that kind of hardware at their disposal. Abliterated models are of special interest here [1] because they make it even cheaper to do this at scale.

    It almost feels like we are seeing the digital analogue of the evolution of the flighted stinging insect, e.g. mosquitoes, bees, etc. They don't have to be very smart individually to absolutely decimate the population of megafauna. A tiny bit of economical intelligence goes a really long way here.

    [1]: https://huggingface.co/blog/mlabonne/abliteration
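
    For anyone unfamiliar, the "Ralph Wiggum loop" above is just brute repetition: re-run the same task against a model until some success check passes. A minimal sketch of the idea (run_model here is a stand-in stub, not any real inference API; in practice it would be a call to a local model server):

    ```python
    import random

    def run_model(prompt: str) -> str:
        # Hypothetical stand-in for querying a local edge model;
        # a real loop would hit an inference endpoint instead.
        return random.choice(["FAIL", "SUCCESS"])

    def ralph_wiggum_loop(prompt, check, max_attempts=100):
        # No sophistication, just retry until the output passes a check.
        # Cheap per-attempt cost is what makes this scale.
        for _ in range(max_attempts):
            out = run_model(prompt)
            if check(out):
                return out
        return None  # gave up

    result = ralph_wiggum_loop("some task", lambda s: s == "SUCCESS")
    ```

    The point is that each attempt can be dumb and mostly fail; against enough soft targets, the loop only has to win once.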

  • mikewarot 25 days ago
    Social engineering using AI generated content is likely already happening.

    AI just makes using existing script kiddie stuff easier.

    • burnerToBetOut 24 days ago
      It's really funny that you said "script kiddie" - before I edited my original draft of this post, I referred to them too :)

      > "…Social engineering…"

      I know Mitnick is synonymous with social engineering. This article in another post is right on the money with the kind of cyber attacks I had in mind, however: https://news.ycombinator.com/item?id=46605553