What paper? This is slop.
No, BitNet not requiring multiplication will not put a foundation model in your pocket. Ternary models would have been great for power if they had scaled, but since a ternary model needs roughly 3x the parameters of a similarly capable model, the memory footprint and bandwidth do not scale down nearly as well as the per-weight bit count suggests.
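To put rough numbers on that: a ternary weight costs log2(3) ≈ 1.58 bits, but at 3x the parameter count (taking the claim above at face value; the 7B baseline here is a hypothetical example) the total footprint ends up worse than an ordinary 4-bit quantization of the smaller model:

```python
import math

BITS_PER_TERNARY = math.log2(3)  # ~1.58 bits per weight
base_params = 7e9                # hypothetical 7B-parameter baseline

fp16_bytes = base_params * 16 / 8            # full-precision baseline
int4_bytes = base_params * 4 / 8             # 4-bit quantized baseline
ternary_bytes = 3 * base_params * BITS_PER_TERNARY / 8  # 3x params, ternary

print(f"fp16 baseline: {fp16_bytes / 1e9:.1f} GB")    # 14.0 GB
print(f"4-bit quant:   {int4_bytes / 1e9:.1f} GB")    # 3.5 GB
print(f"ternary @ 3x:  {ternary_bytes / 1e9:.1f} GB") # ~4.2 GB
```

So versus fp16 you win, but versus the quantized small model you're moving more bytes per token, and memory bandwidth is what dominates inference on edge hardware.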
The real problem is that a classic LLM is not useful in the scenarios the author proposes. The hypothetical livestock vet is far better served by her books and a phone call to a university ag extension to confer with colleagues than by an LLM, disconnected from the internet, that will hallucinate nonsense.