96 points by barodeur · 3 months ago · 10 comments
  • crowfunder · 3 months ago
    The year is 2030. The REST API is dead. Invoking a request results in your web browser's built-in LLM guessing what the server is supposed to return. When opening github.com you occasionally see cheese, at other times an animal shelter hotline.
    • cactusplant7374 · 3 months ago
      How about submitting a prompt for a REST request and then the LLM decides what information you are allowed to access and what actions you are allowed to take? And it works like a batch job of sorts. One request but it can enter many flows.
    • arthurcolle · 3 months ago
      Nightmare fuel
  • antonvs · 3 months ago
    Relevant, from 20 years ago, "Enhancing Server Availability and Security Through Failure-Oblivious Computing", Rinard et al.: https://people.csail.mit.edu/rinard/techreport/MIT-CSAIL-TR-...

    From the abstract:

    > Failure-oblivious computing enables servers to execute through memory errors without memory corruption. Our safe compiler for C inserts checks that dynamically detect invalid memory accesses. Instead of terminating or throwing an exception, the generated code simply discards invalid writes and manufactures values to return for invalid reads, enabling the server to continue its normal execution path.

    From the conclusion:

    > Our results show that failure-oblivious computation enhances availability, resilience, and security by continuing to execute through memory errors while ensuring that such errors do not corrupt the address space or data structures of the computation. In many cases failure-oblivious computing can automatically convert unanticipated and dangerous inputs or data into anticipated error cases that the program is designed to handle correctly. The result is that the program survives the unanticipated situation, returns back into its normal operating envelope, and continues to satisfy the needs of its users.
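    The policy described in the abstract is easy to demo outside of C. A minimal Ruby sketch of the same idea (not the paper's compiler, just the semantics): out-of-bounds writes are silently discarded and out-of-bounds reads manufacture a value, so execution continues instead of crashing:

    ```ruby
    # Toy "failure-oblivious" array: invalid writes are dropped,
    # invalid reads return a manufactured value (0 here), and the
    # program continues on its normal execution path.
    class FailureObliviousArray
      def initialize(size)
        @data = Array.new(size, 0)
      end

      def [](i)
        return @data[i] if i >= 0 && i < @data.size
        0 # invalid read: manufacture a value instead of raising
      end

      def []=(i, value)
        @data[i] = value if i >= 0 && i < @data.size
        # invalid write: silently discarded
      end
    end

    buf = FailureObliviousArray.new(4)
    buf[2] = 7
    buf[99] = 42       # out of bounds: discarded, no exception
    puts buf[2]        # => 7
    puts buf[99]       # => 0, a manufactured value
    ```

    The paper's C compiler does this at the level of raw memory accesses, which is far more invasive; this sketch only captures the "keep going with a plausible value" policy.
    
    
    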

  • mowkdizz · 3 months ago
    I wrote a similar program using Ruby metaprogramming, but instead, if a function is called that doesn't exist (say, in tests), it has the LLM fix it dynamically
    • ipnon · 3 months ago
      Don't leave us hanging!
      • mowkdizz · 3 months ago
        Haha I will dig it up sometime, but it was a little prototype!
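        The prototype wasn't shared, but the trick presumably rests on Ruby's `method_missing` hook. A hypothetical reconstruction, with the LLM call stubbed out (`ask_llm` is a made-up stand-in; a real version would call an LLM API and return Ruby source for the missing method):

        ```ruby
        class SelfHealing
          # Stand-in for a real LLM call: pretend the model wrote a
          # method body that simply echoes its arguments back.
          def ask_llm(name, args)
            "def #{name}(*args); args; end"
          end

          # Called whenever an undefined method is invoked: define the
          # method from the "LLM" output, then retry the original call.
          def method_missing(name, *args)
            self.class.class_eval(ask_llm(name, args))
            send(name, *args)
          end

          def respond_to_missing?(name, include_private = false)
            true
          end
        end

        obj = SelfHealing.new
        p obj.fix_me(1, 2)  # the stub "LLM" defines fix_me, which echoes [1, 2]
        ```

        Because `class_eval` installs the generated method on the class, only the first call pays the "LLM" round-trip; later calls hit the real method directly.
        
        
        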
  • kayodelycaon · 3 months ago
    Beautiful. Unfortunately, I don’t think I can afford the Black Friday price unless I find a volunteer.
  • actsasbuffoon · 3 months ago
    This is a delightfully horrible idea. Well played.
  • knowitnone3 · 3 months ago
    Or have an LLM catch these errors during development
  • empiko · 3 months ago
    I wonder how many people will use it unironically despite the writing. It's probably going to be a nonzero number, right?
    • 3abiton · 3 months ago
      Wait till it gets gobbled up by the next generation of training data and embedded in the weights of upcoming LLMs. Now pair that with a clueless vibe coder and no token limits.
  • ramanvarma · 3 months ago
    only if your production environment is a raspberry pi under your bed haha
  • cwmoore · 3 months ago
    “Depending on what the AI thinks is funny today.”

    lol