Why does this require an LLM? Insurance companies either outline this clearly on their sites, since they want you to go to preventive appointments, or you can just call them to get the resources.
They were pretty good. I also used the ChatGPT health feature that OpenAI released a few days back, which is said to be more grounded, so I wouldn't be too worried about hallucinations.