1 point by zakmcintyre 4 hours ago | 1 comment
  • zakmcintyre 4 hours ago
    Hey HN – I'm Zak, a solo iOS developer. I built ThoughtTrack because I kept losing ideas in the gap between having them and writing them down.

    The core problem: your brain drops thoughts within ~30 seconds. Notes apps don't solve this because they require you to organize as you capture. ThoughtTrack separates capture from organization entirely.

    How it works:

    - Tap one button to voice-capture a thought. It transcribes on-device using Apple's Speech framework.

    -The "Nebula" view shows all your thoughts as a drifting particle system — no folders, no hierarchy. Connected ideas glow brighter and drift closer.

    - A composite scoring algorithm finds "Thought Bridges" between entries using NLEmbedding for semantic similarity (0.55 weight), NLTagger for named-entity overlap (0.25), and tag Jaccard similarity (0.20). A connection forms when the composite score reaches 0.40.

    - A context-aware resurfacing engine uses activity state, focus mode, time, location, and audio routing to decide when to bring a thought back — not just which one. It adapts its weights based on whether you engage with or dismiss notifications.
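
    For the curious, the bridge scoring above can be sketched in plain Swift. The weights and the 0.40 threshold are the ones I described; the type and function names are illustrative (not the app's actual API), and the component scores are assumed to already be normalized to [0, 1]:

```swift
// Sketch of the composite "Thought Bridge" score. Weights and the 0.40
// threshold come from the description above; everything else is illustrative.
struct BridgeScore {
    var semantic: Double   // NLEmbedding cosine similarity, assumed in [0, 1]
    var entities: Double   // NLTagger named-entity overlap, assumed in [0, 1]
    var tags: Double       // Jaccard similarity of the two tag sets

    var composite: Double {
        0.55 * semantic + 0.25 * entities + 0.20 * tags
    }

    var formsBridge: Bool { composite >= 0.40 }
}

// Jaccard similarity: |A ∩ B| / |A ∪ B|; defined as 1 for two empty sets.
func jaccard(_ a: Set<String>, _ b: Set<String>) -> Double {
    let union = a.union(b)
    guard !union.isEmpty else { return 1 }
    return Double(a.intersection(b).count) / Double(union.count)
}
```

    One consequence of the heavy semantic weight: a pair with strong embedding similarity but zero entity or tag overlap (e.g. semantic 0.8) still scores 0.44 and bridges.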

    Stack: Pure SwiftUI + SwiftData + Observation framework. All NLP runs on-device via Apple's NaturalLanguage framework — nothing leaves the phone. Also has a watchOS companion for wrist capture and a widget for the daily "Spark."
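
    Since I'm asking for feedback on the resurfacing engine, here's a stripped-down sketch of the idea in plain Swift. The five signals match what I listed above, but the starting weights, threshold, and the additive engage/dismiss update rule here are simplified stand-ins for discussion, not the shipping implementation:

```swift
// Context snapshot at the moment we consider resurfacing a thought.
struct ResurfacingContext {
    var isStationary: Bool     // activity state
    var focusModeAllows: Bool  // Focus mode
    var isQuietHour: Bool      // time of day
    var isFamiliarPlace: Bool  // location
    var headphonesOn: Bool     // audio routing
}

struct ResurfacingEngine {
    // One weight per signal, nudged by engagement feedback.
    var weights: [Double] = [0.3, 0.3, 0.2, 0.1, 0.1]
    let threshold = 0.5
    let learningRate = 0.05

    func score(_ ctx: ResurfacingContext) -> Double {
        zip(weights, features(ctx)).reduce(0) { $0 + $1.0 * $1.1 }
    }

    // Focus mode acts as a hard gate; everything else is soft-weighted.
    func shouldResurface(_ ctx: ResurfacingContext) -> Bool {
        ctx.focusModeAllows && score(ctx) >= threshold
    }

    // Nudge weights toward signals active when the user engaged, away
    // from them on dismissal, then renormalize so they sum to 1.
    mutating func record(engaged: Bool, in ctx: ResurfacingContext) {
        let signals = features(ctx)
        let direction = engaged ? 1.0 : -1.0
        for i in weights.indices {
            weights[i] = max(0.01, weights[i] + direction * learningRate * signals[i])
        }
        let total = weights.reduce(0, +)
        weights = weights.map { $0 / total }
    }

    private func features(_ ctx: ResurfacingContext) -> [Double] {
        [ctx.isStationary, ctx.focusModeAllows, !ctx.isQuietHour,
         ctx.isFamiliarPlace, !ctx.headphonesOn].map { $0 ? 1.0 : 0.0 }
    }
}
```

    The real engine has more state than this, but the shape of the question is the same: which signals should gate delivery outright versus merely lower the score, and how fast should feedback move the weights.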

    The architecture challenge I'm most proud of solving: decomposing a monolithic AppState into separate ThoughtRepository, NavigationCoordinator, and NotificationCoordinator while keeping SwiftData's CloudKit sync working with conflict resolution.
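
    A minimal sketch of that decomposition, using the Observation framework. The three type names are the real ones from the post; the members are placeholders (e.g. [String] standing in for SwiftData models and a NavigationPath):

```swift
import Observation

// Three focused @Observable objects replacing one monolithic AppState.
// SwiftUI views depend only on the object(s) they actually read, so a
// navigation change no longer invalidates views that only show thoughts.
@Observable final class ThoughtRepository {
    private(set) var thoughts: [String] = []   // stand-in for SwiftData models
    func capture(_ text: String) { thoughts.append(text) }
}

@Observable final class NavigationCoordinator {
    private(set) var path: [String] = []       // stand-in for a NavigationPath
    func open(_ id: String) { path.append(id) }
    func pop() { _ = path.popLast() }
}

@Observable final class NotificationCoordinator {
    private(set) var pending: [String] = []    // thought IDs queued to resurface
    func schedule(_ id: String) { pending.append(id) }
    func clear(_ id: String) { pending.removeAll { $0 == id } }
}
```

    In the app these are injected with `.environment(_:)` and read via `@Environment`, which is what keeps each view's observation scope narrow.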

    Free 7-day trial, then 10 thoughts/month on the free tier. Would love feedback on the resurfacing algorithm — I'm still tuning the context weights and curious if anyone has worked on similar "right moment" notification systems.