I wouldn’t use the word deterministic here. I would use the word symbolic. Determinism, meaning that you always get the same output on the same input, isn’t what you want here. For instance, you can run an LLM with temperature set to zero and its output will be deterministic. Moreover, if you had a symbolic, non-deterministic algorithm, you would probably also be happy to use that.
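To make the second half of that concrete: randomized quicksort is a standard example of an algorithm that is fully symbolic (every step is inspectable and auditable) yet non-deterministic in its execution. A minimal sketch:

```python
import random

def quicksort(xs):
    """Symbolic but non-deterministic: the logic is fully inspectable,
    yet the pivot choice is random, so different runs take different paths."""
    if len(xs) <= 1:
        return xs
    pivot = random.choice(xs)  # the non-deterministic step
    left = [x for x in xs if x < pivot]
    mid = [x for x in xs if x == pivot]
    right = [x for x in xs if x > pivot]
    return quicksort(left) + mid + quicksort(right)

print(quicksort([3, 1, 4, 1, 5, 9, 2, 6]))
```

The recursion tree varies between runs, but the output is always the correctly sorted list, which is why you'd still happily use it.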
I was trying to show that determinism is not the crux by pointing out that there are ways to get deterministic output from an LLM. That thought experiment shows that determinism isn’t what’s essential.
And I will disagree that this merely narrows the outputs. If I download a local model, set the temperature to zero, and give it the same prompt twice, I will get the same output, not one of several outputs from a narrow set. LLMs are functions.
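A toy sketch of the point (the "model" here is a hypothetical stand-in, not a real LLM): with temperature zero, generation is greedy argmax over the scores, so the whole pipeline is a pure function of the prompt.

```python
def toy_logits(tokens, vocab_size=5):
    """Deterministic stand-in for a model forward pass:
    scores depend only on the input token sequence."""
    return [hash((tuple(tokens), v)) % 1000 for v in range(vocab_size)]

def generate(prompt, steps=10):
    """Temperature-zero (greedy) decoding: pick the argmax at every step.
    No sampling anywhere, so this is a pure function of the prompt."""
    tokens = list(prompt)
    for _ in range(steps):
        scores = toy_logits(tokens)
        tokens.append(max(range(len(scores)), key=scores.__getitem__))
    return tokens

# Same prompt twice yields the identical output sequence.
a = generate([1, 2, 3])
b = generate([1, 2, 3])
print(a == b)
```

(In practice, non-determinism can still creep back in through things like non-associative floating-point reductions on GPUs, but that is an implementation artifact, not a property of the decoding rule.)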
Maybe here I should emphasize the fact that it's external to any LLM? I don't know.