Hi HN,

I spent years at Toyota Group trying to digitize the "intuition" of master craftsmen (Shokunin). We often hit a wall where a master could detect a defect but couldn't explain why.

In one case (detailed in the post), a master inspected parts by hitting them with a hammer. Frequency analysis showed no difference between "Good" and "Bad" parts, so the engineers thought he was guessing. But by applying a framework to extract his "Thinking Process" ($T$) rather than just his "Action" ($A$), we found he wasn't listening to the pitch; he was listening to the reverberation time (the decay rate). Once we changed the feature extraction to focus on the "tail" of the sound, the AI matched his accuracy.

I believe we are too focused on training AI on "Action Data" (what experts did) and missing the "Thinking Data" (what risks they simulated).

Has anyone else successfully digitized a "gut feeling" in a physical domain? I'd love to hear about it.
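For anyone curious what a "tail-focused" feature can look like, here's a minimal sketch (not our production code) of estimating a decay rate from a tap recording. It assumes a mono NumPy array `signal` at sample rate `sr` (both hypothetical names); it fits a line to the log-RMS envelope after the strike, so the slope in dB/s captures how fast the ring-down dies out:

    import numpy as np

    def decay_rate(signal, sr, frame=1024, hop=512):
        # Frame-wise RMS envelope of the recording
        frames = [signal[i:i + frame] for i in range(0, len(signal) - frame, hop)]
        rms = np.array([np.sqrt(np.mean(f ** 2)) for f in frames])
        env_db = 20 * np.log10(rms + 1e-12)

        # Keep only the tail: everything after the peak (the strike itself)
        peak = int(np.argmax(env_db))
        tail = env_db[peak:]
        t = np.arange(len(tail)) * hop / sr

        # Linear fit to the log envelope; slope is in dB/s,
        # and a more negative slope means a faster decay
        slope, _ = np.polyfit(t, tail, 1)
        return slope

The point isn't this particular estimator; it's that swapping a feature like this in for raw FFT bins is what let the model see what the master was actually hearing.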