60 points by tcsenpai 12 hours ago | 5 comments
  • asdev 2 hours ago
    I built a Chrome version of this for summarizing HN comments: https://github.com/built-by-as/FastDigest
  • RicoElectrico 8 hours ago
    I've found that, for the most part, the articles I want summarized are the ones that only fit the largest-context models such as Claude; otherwise I can just skim-read the article, possibly in reader mode for legibility.

    Is llama 2 a good fit considering its small context window?

    • tcsenpai 7 hours ago
      Personally I use llama3.1:8b or mistral-nemo:latest, which have a decent context window (even if it's usually smaller than the commercial ones). I am also working on a token calculator / content-splitting method, but it's very early (a rough sketch of that idea follows after this thread).
      • garyfirestorm an hour ago
        Why not llama3.2:3B? It has a fairly large context window too.
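    A rough sketch of the "token calculator / content splitting" idea mentioned above: estimate the token count, cut the article into chunks that fit the model's window, summarize each chunk, then merge the partial summaries. This is not the extension's actual code; the model name, context size, and chars-per-token heuristic are assumptions, and the only external call is Ollama's /api/generate endpoint.

      # Minimal sketch: map-reduce summarization for small context windows.
      # Assumptions (not taken from the extension): llama3.1:8b as the model,
      # an 8k-token budget, and ~4 characters per token instead of a real tokenizer.
      import requests

      OLLAMA_URL = "http://localhost:11434/api/generate"
      MODEL = "llama3.1:8b"
      CONTEXT_TOKENS = 8192
      CHARS_PER_TOKEN = 4

      def ollama_generate(prompt: str) -> str:
          """Call Ollama's /api/generate and return the full (non-streamed) response."""
          resp = requests.post(
              OLLAMA_URL,
              json={"model": MODEL, "prompt": prompt, "stream": False},
              timeout=300,
          )
          resp.raise_for_status()
          return resp.json()["response"]

      def split_into_chunks(text: str, budget_tokens: int) -> list[str]:
          """Greedily pack paragraphs into chunks under the estimated token budget."""
          budget_chars = budget_tokens * CHARS_PER_TOKEN
          chunks, current = [], ""
          for para in text.split("\n\n"):
              # A single oversized paragraph still becomes its own chunk.
              if current and len(current) + len(para) > budget_chars:
                  chunks.append(current)
                  current = ""
              current += para + "\n\n"
          if current.strip():
              chunks.append(current)
          return chunks

      def summarize(article: str) -> str:
          # Leave roughly half the window for the prompt and the output.
          partials = [
              ollama_generate("Summarize the following text in a few bullet points:\n\n" + chunk)
              for chunk in split_into_chunks(article, CONTEXT_TOKENS // 2)
          ]
          if len(partials) == 1:
              return partials[0]
          return ollama_generate(
              "Combine these partial summaries into one concise summary:\n\n"
              + "\n\n".join(partials)
          )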
  • chx 5 hours ago
    Help me understand why people are using these.

    I presume you want information of some value to you, otherwise you wouldn't bother reading an article. Then you feed it to a probabilistic algorithm, so you cannot have any idea what the output has to do with the input. Like https://i.imgur.com/n6hFwVv.png: you can somewhat decipher what this slop wants to be, but what if the summary leaves out, invents, or inverts some crucial piece of info?

    • andrewmcwatters 5 hours ago
      People write too much. Get to the point.
      • ranger_danger 4 hours ago
        I think you just insulted every journalist on Earth.
      • throwup238 5 hours ago
        Even if I want to read the entirety of a piece of long-form writing, I'll often summarize it first (with Kagi's key points mode) so that I know what the overall points are and can follow the writing better. Too much long-form writing is written like some mystery thriller where the writer has to unpack an entire storyline before they'll state their main thesis, so it helps my reading comprehension to know what the point is going in. The personal-interest stories that precede the main content always land better that way.
      • chx 5 hours ago
        any point? regardless of what's written? does that work for you?
        • 87m78m78m an hour ago
          Why don't you try using these tools yourself so you have an understanding of them? People like to get shit summarized; it's really not as deep as you're trying to make it out to be.
        • garyfirestorm 2 hours ago
          Sometimes you don't have time to read the entirety of a large article. You want a quick summary, and some people are poor at summarizing things in their head as they go and can get lost in dense text. Extensions like these really help me with headers and the structure I want to follow, give me a quick overview, and give me an idea of whether I want to dive deeper.
          • drdaeman 31 minutes ago
            Sometimes it's not even an article, but a video. And sometimes all you care about is a single tiny fact from that video.

            Although I don't think this particular summarizer works for videos. And I don't think the Ollama API supports audio ingestion for transcription. There are some summarizers that work with YouTube specifically (using the automatic subtitles).
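    A similarly rough sketch of the YouTube-subtitle approach drdaeman describes: pull the automatic captions and feed them to the same local model. The youtube-transcript-api package and the model name are assumptions; this extension does not do any of this, and long transcripts would need the same chunking as the article sketch above.

      # Sketch: summarize a YouTube video from its (auto-generated) subtitles.
      # Uses the third-party youtube-transcript-api package (get_transcript interface)
      # and a local Ollama model; both choices are assumptions for illustration.
      import requests
      from youtube_transcript_api import YouTubeTranscriptApi

      def fetch_transcript(video_id: str) -> str:
          """Join the caption segments into one plain-text transcript."""
          segments = YouTubeTranscriptApi.get_transcript(video_id)
          return " ".join(seg["text"] for seg in segments)

      def summarize_video(video_id: str, model: str = "llama3.1:8b") -> str:
          transcript = fetch_transcript(video_id)
          resp = requests.post(
              "http://localhost:11434/api/generate",
              json={
                  "model": model,
                  "prompt": "Summarize this video transcript in a short paragraph:\n\n" + transcript,
                  "stream": False,
              },
              timeout=300,
          )
          resp.raise_for_status()
          return resp.json()["response"]

      # Example: summarize_video("dQw4w9WgXcQ")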

  • donclark 8 hours ago
    Can we get this as the default for all newly posted HN articles, please and thank you?
    • totallymike 2 hours ago
      I sincerely hope this never happens