I may very well be wrong and have no job waiting for me in a couple of years, but I feel the goal of university should be to train the brain and build deep familiarity with software. The world of software is far too large to teach in its entirety in a few years, so the next best thing is to prepare students to make the most of all their future learning.
For AI usage in class, I would do the same as my university does: the projects you can do however you like, but the exams are on paper and without AI. So if you choose to lean on AI for your projects, get ready for the exam, because you may struggle there.
Take a subject that feels practically useless, like Theory of Computation: it has been one of my favorite subjects anyway, because it forced me to think in ways I hadn't before, and I learnt a lot from it.
- A modern, simple, established tech stack. Python or Go are good choices. Don't teach Java or C#, as some colleges probably still do.
- Add a little C programming to teach low-level fundamentals, enabling students to continue down that path if they desire.
- Developing on a Linux workstation, deploying code to Linux servers, and the basics of managing them. (Single-board computers make great servers for learning.)
- Cybersecurity fundamentals, including how to avoid supply-chain attacks.
- Exams: students can work with AI assistance. Afterwards their work is reviewed and you question them on why they made certain decisions; those answers should make up two-thirds of the final grade.