26 points by ncvgl 3 months ago | 8 comments
  • benterix3 months ago
    > Would love feedback from the HN community. What other features would make this more useful?

    A web app

    • scottydelta2 months ago
Check out Open WebUI, it's a self-hostable web app that can connect to different providers and models.
    • natoucs2 months ago
I would if I found a way to keep access to the frontends of each LLM provider while being a web app.
  • sidcool3 months ago
Make a web app pls. I'm not going to install a native app from an unknown source.
    • mmh00002 months ago
Especially when it is just an Electron app.

      I don’t want to run your webpage in a web browser I have no control over.

      My normal browser has been tediously customized and tailored for my usability.

    • natoucs2 months ago
That makes sense. But you can't access the native frontends if it's a web app.
      • scottydelta2 months ago
Why do you need the native ChatGPT frontend specifically?

There are apps that provide a similar frontend and use API keys from ChatGPT, Gemini, and others to provide all models under one web interface.

        • natoucs2 months ago
A few reasons: to keep access to the frontend features of each provider, to have access to the chats I have in the individual frontend apps, to not have to trust a third-party provider, and to not have to update the app each time a new model comes out.
  • scottydelta2 months ago
    Open Web UI already provides this as a self hosted web solution.

One good feature I like is the ability to generate multiple responses from different models and merge them using one default model.

    • natoucs2 months ago
Very nice! Do you still get access to the frontends of the original LLM providers, and do you have to insert API keys?
      • scottydelta2 months ago
You get access to a UI similar to ChatGPT's, and you connect the models you want to use by providing an API key.

Once configured, you can choose between the models of all the providers you have connected from a dropdown in the chat.

  • BeetleB3 months ago
    Just FYI, Open WebUI has this feature built-in.
  • elsa262 months ago
    Make it a web app - there would definitely be less friction to try it out.
  • unstatusthequo3 months ago
Now you just need to add a judge node that compares the responses, fact-checks them, and outputs the best response of the three. Although this raises another issue: which model is the judge?
    • theoldgreybeard3 months ago
Pewdiepie did something like this where all the AIs looked at each other's answers and voted on the highest-quality answer.

      Democracy!

      It worked pretty well until he updated them to know that poorly performing agents would get deleted and replaced. Then they started conspiring against him.

      (19:43 for relevant part) https://youtu.be/qw4fDU18RcU

    • irilesscent3 months ago
Make a jury of blind models each making a case for the best response, and choose a random model to be the judge.
    • mschulkind3 months ago
Just give me a day to vibe-code an interface to judge the judging models side by side...
    • adamisom2 months ago
Easy, let them all judge then? You guessed it...
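  The cross-voting "jury" idea floated in this subthread could be sketched roughly like this (a minimal, self-contained sketch: `ask()` is a hypothetical stand-in for a real provider API call, with canned answers so the example runs):

  ```python
  # Sketch of the jury pattern: each model answers, then every model
  # votes on the pooled answers, and the plurality winner is returned.
  from collections import Counter

  def ask(model, prompt):
      """Hypothetical provider call. Stubbed with canned replies."""
      canned = {"model-a": "Paris", "model-b": "Paris", "model-c": "Lyon"}
      if prompt.startswith("Vote"):
          # Stub: pretend every model votes for the majority answer.
          return "Paris"
      return canned[model]

  def jury(models, question):
      # 1. Collect one answer per model.
      answers = [ask(m, question) for m in models]
      ballot = "Vote for the best answer:\n" + "\n".join(
          f"{i}. {a}" for i, a in enumerate(answers, 1)
      )
      # 2. Each model votes; the most-voted answer wins.
      votes = Counter(ask(m, ballot) for m in models)
      return votes.most_common(1)[0][0]

  print(jury(["model-a", "model-b", "model-c"], "Capital of France?"))
  ```

  With real API calls in place of the stub, the open question from above remains: the vote itself is still produced by the same models being judged.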
  • hotgeart3 months ago
Does it need an API key, or is it like an 'iframe' of the web version?
    • natoucs3 months ago
      iframe - you got it - it embeds the web apps
      • nextaccountic3 months ago
A better feature would be to select one of the responses as the best one and use it as the context for all the LLMs, as if it had been sent by each one.

        But this would require API access instead of embedding web apps

        • natoucs2 months ago
Good idea! And yes, that's the issue.
  • browningstreet2 months ago
    Ninja Chat offers this.
    • natoucs2 months ago
It doesn't let you keep access to the native frontends.