32 points | by steveklabnik | 4 hours ago | 3 comments
  • rurban | 15 minutes ago
    Of course not. Users love the chatbot. It's fast and easier to use than manually searching for answers or stitching together reports and graphs.

    There is no latency, because the inference is done locally, on a server at the customer's site with a big GPU.

  • kami23 | an hour ago
    Love this, this is what I have been envisioning as an LLM-first OS! Feels like truly organic computing. Maybe Minority Report figured it out way back then.

    The idea of having the elements anticipated and lowering the cognitive load of searching a giant drop-down list scratches a good place in my brain. I instantly recognize it as such a better experience than what we have on the web.

    I think something like this is the long-term future for personal computing. Maybe I'm way off, but this is the type of computing I want to be doing: highly customized to my exact flow, highly malleable to improvement and feedback.

  • dhruv3006 | 2 hours ago
    This is something I agree with. It will be interesting to see if more and more people take this philosophy up.