28 points by nLight 6 hours ago | 5 comments
  • herova a few seconds ago
    I've been using it since the first release. All the best to you, and thanks!
  • nLight 6 hours ago
    Hi HN! My name is Dima, and I'm the founder of Summit.

    I built this because I kept running into a hard limit with existing meeting tools: I couldn't use them for NDA-covered calls or internal discussions, since audio and transcripts had to be uploaded to third-party servers. On top of that, juggling multiple call apps made built-in summarization hard to use even when it was technically compliant.

    That's why Summit takes a different approach: everything runs locally on macOS: recording, transcription, speaker identification, and summarization. Nothing leaves the machine, and there's no account or cloud backend.
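    For readers curious what "everything runs locally" implies architecturally, here is a minimal sketch of such a pipeline. The function names and stub bodies are hypothetical illustrations, not Summit's actual code; the point is that every stage is a local function over local data, with no network calls anywhere:

    ```python
    # Hypothetical local-only meeting pipeline: each stage consumes the
    # previous stage's output entirely on-device. All bodies are stubs.

    def record(source: str) -> bytes:
        # Capture mic/system audio locally (stubbed).
        return b"...audio..."

    def transcribe(audio: bytes) -> list[dict]:
        # On-device speech-to-text, e.g. a local Whisper-class model (stubbed).
        return [{"start": 0.0, "speaker": None, "text": "hello"}]

    def identify_speakers(segments: list[dict]) -> list[dict]:
        # On-device diarization: label each segment with a speaker (stubbed).
        return [{**s, "speaker": "S1"} for s in segments]

    def summarize(segments: list[dict]) -> str:
        # Small on-device LLM (e.g. a Qwen-class model) over the transcript (stubbed).
        return "Summary: " + " ".join(s["text"] for s in segments)

    def run_meeting(source: str) -> str:
        return summarize(identify_speakers(transcribe(record(source))))

    print(run_meeting("mic"))
    ```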

    The tradeoff is that it's more resource-intensive than cloud tools, and accuracy depends on the hardware you're running on. I spent a lot of time optimizing the local toolchain (e.g. smaller on-device models like Qwen) to make this practical on Apple Silicon. I tested it on a standard corporate MacBook Air with 16 GB of RAM and it works well; more memory lets you run larger models, but 16 GB is enough.
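    The "more memory lets you run larger models" tradeoff can be roughly quantified. As a back-of-envelope sketch (the 1.2x overhead factor for KV cache and runtime buffers is an assumption, not a measurement of Summit):

    ```python
    # Rough estimate of a quantized model's resident size in RAM.
    # overhead is an assumed multiplier for KV cache, activations, and
    # runtime buffers; real usage varies with context length and runtime.

    def model_memory_gb(params_billion: float, bits_per_weight: float,
                        overhead: float = 1.2) -> float:
        bytes_total = params_billion * 1e9 * bits_per_weight / 8
        return bytes_total * overhead / 1e9

    # A 7B model at 4-bit quantization:
    print(round(model_memory_gb(7, 4), 1))  # ≈ 4.2 GB, comfortable on 16 GB
    ```

    By the same arithmetic, a 14B model at 4-bit lands around 8 GB, which is why more RAM opens up larger (usually more accurate) models.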

    I believe in local-first AI and would love feedback from people here who've thought about it:

      – Is fully on-device processing something you'd personally value?
      – Are there privacy or compliance use cases I'm missing?
      – What would you want to inspect or control in a tool like this?
    
    Happy to answer any technical questions.
  • buschgrau 29 minutes ago
    Dima, when can we expect versions for other operating systems, if any are planned at all?
    • nLight 22 minutes ago
      For now, I'm focused on Apple's ecosystem. The iOS app will use an encrypted iCloud database to sync from Mac to iPhone. The approach is transferable, though Windows and Linux would mean a brand-new cross-platform codebase.
  • aagha an hour ago
    Source code?
    • nLight an hour ago
      The app is not open source.
  • mr_mig6 hours ago
    Woah, this is good!