13 points by kanjun 4 hours ago | 2 comments
  • Isolated_Routes 3 hours ago
    I love the idea of this. Twitter used to be the go-to place for real-time community knowledge, but the algorithm has started pushing content that I don't want, and I would love to be able to tailor it more to my needs. How are you approaching the on-device option? I'd definitely be most interested in using this in a way that doesn't send information to external servers. Thank you!
    • millanjp 3 hours ago
      On the browser extension side, we're forking WebLLM, adding support for more modern multimodal models, and doing some optimization so that an M4 chip can keep up with scrolling. You can actually use it in bouncer today by going into settings and turning on the experimental local models.

      On the mobile side, we're working to get 4B models running on the Apple Neural Engine. The main bottleneck for mobile is actually battery life. Neither path is quite optimized enough to formally brag about, but we're almost there!

      • Isolated_Routes an hour ago
        I wish you luck! This is a clever and creative approach. I feel like we are inching towards on-device solutions, and I love seeing people work the problem like this.