I know the job market can suck for graduates right now, but I do believe studying CS can still lead to decent-paying careers. There's always going to be demand for people who understand code, who can break down complex problems and bring a problem-solving mindset. LLMs don't solve everything.
The drop in CS students ironically may create a vacuum that allows us employed engineers to demand even higher compensation.
Even back when I was in college (graduated 2017), I noticed there was this clear bifurcation among the students. A lot of the students at that time did it because you could score a great job after college, but a smaller cohort just loved the game. And even back then, loads of students from the former group washed out, or graduated and then took other jobs after college.
It's no different today, except that the group that did it for money is washing out before they even get to college because they fear that AI will take their jobs, while the latter group is still here and is able to do more and more with AI.
It's a truly wild time to be alive in this industry. Half of us are seeing the doom and gloom of AI and the other half are seeing the "next age" happen right before our eyes.
And I'll be honest I kinda feel sad for the folks that take the negative view of AI right now. Cause I'm having more fun than I've ever had before in this industry.
You covered people washing out of CS degrees and people getting degrees and then not, ultimately, doing something in the CS field.
But what you see in our field that you don't see as often, elsewhere (or -- at all, depending on regulations) is people who ... (1) washed out of the degree because it was competing with their lucrative career as a software developer (or -- more rarely, successful entrepreneur), (2) got a degree in textual biblical studies and had a long career in software development[0] or, you know, other unrelated degree, (3) none of the above, even sometimes incomplete High School education (also[0]).
I've been hiring developers for almost 30 years, now, at a variety of employers -- one global multi-national telecom, one "we make a lot of the products other companies pass off as their own work" IoT/small shop, and a couple of video conference/remote-enabling service shops. There are far more degrees out there, today, than there were 30 years ago. My experience, however, is that the necessity of a degree at the companies I've been employed at has gone down. I suspect that's because I worked for "the giant multinational", first, and all of the rest have been startup or smaller/younger shops (typically 5-10 devs, but no more than ~20 at peak). The giant multi-national, though, during my 17 years, changed from requiring a degree (early on it added "or equivalent experience" while still rarely hiring someone without a degree for most IT positions) to routinely interviewing and hiring people without regard for their degree (and focusing on "code you've written" over "whiteboard exercises", too), while still generally favoring candidates with them. At the best shop I've worked for, it was an even mix of "none", "some", "unrelated", "+bootcamp", and "CS degrees", and filled with extremely competent, well-paid developers.
It's a whole lot harder to get the experience required to have "equivalent experience" without university/internships/the like, but getting the degree without any relevant work experience along the way isn't a good way to go, either.
Around the late 90s (until the bust), and then again a few years later, everyone was pushing kids into CS degrees, and the most "interesting" aspect to many of those kids was the starting/long-term earnings against the cost of the 4-year degree. And while, personally, I think "anyone can do it", not everyone will find it enjoyable to do the way I do.
I'm starting to believe that last part is far rarer than I'd like to think, with my 18-year-old son mostly disliking his introductory computer programming class in High School[1]. I don't push "what I do" on them, just like my Dad didn't, but I expose them to it whenever I can (like my Dad -- kind of -- didn't). And I'll never forget when their Mom looked over at my screen and said "So ... is that what you do all day?", and I beamed "Yes" because it really is the most interesting thing in the world to me, and she said "Wow ... I think I'd kill myself."
[0] Ok, so that's a specific example of someone I know.
[1] Ultimately coming around at the end when his assignment was "make something you want to make."
Yes, for countries like India.
With AI, outsourcing becomes much more effective.
Thing is, AI is taking outsourced jobs in India at a much faster rate than elsewhere.
The latest layoffs from Oracle mostly hit workers in India.
I was talking to a guy who wanted uptime monitoring. He pitched UptimeRobot to an executive, but another guy rolled out his own uptime monitor using AI in 60 minutes and deployed it along with centralized logging, and it costs the company only a $5 VPS.
And honestly it works just as well. I've seen companies refuse to pay for external tools and build leaner versions using AI.
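For a sense of scale, a "leaner version" of an uptime checker really can be tiny. Here's a minimal sketch using only the Python standard library; the names (`CHECKS`, `check_once`) and the URL are hypothetical, not the actual tool described above:

```python
# Minimal uptime check: hit each URL and report whether it answered.
# Hypothetical sketch -- real monitors add scheduling, alerting, and logging.
import urllib.error
import urllib.request

CHECKS = ["https://example.com"]  # endpoints to watch (placeholder)

def check_once(url: str, timeout: float = 5.0) -> bool:
    """Return True if the URL responds with an HTTP status below 500."""
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            return resp.status < 500
    except urllib.error.HTTPError as e:
        # 4xx still means the server is up and answering.
        return e.code < 500
    except (urllib.error.URLError, OSError):
        # DNS failure, refused connection, timeout: treat as down.
        return False

if __name__ == "__main__":
    for url in CHECKS:
        print(url, "UP" if check_once(url) else "DOWN")
```

Run from cron (or a `while`/`sleep` loop) on the cheapest VPS and you have the core of the service; everything beyond that is alerting and history.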
You can build a SaaS faster now but the need for SaaS is on decline.
I've moved to deploying on bare metal from OVH and Hetzner. Why? Because devops is completely reduced to a few minutes' worth of work using agents.
Of course, this is not a production-grade deployment. To get there, I'd need to build images on pipelines, scan them, test them, publish artefacts, write up the IaC to manage the cloud resources, add monitoring around the solution, ...
Deploying a simple piece of software on a custom server was never difficult or slow to do.
If that script works for your use case then great, but I don't see how LLMs were a game changer here.
For 5 minutes. The need for cheap SaaS that one person can build and has no uptime requirements or security requirements or legal requirements or ongoing maintenance requirements is indeed declining.
Also, why do you hate the poor so much? Do people not deserve well paying jobs because the vagina they came out of wasn’t located on the correct patch of dirt?
The fact that this changed in the last 2 years right when AI became feasible is just a coincidence. It’s actually finally outsourcing destroying the grad job market nearly 3 decades later.
Fast forward a decade or two, and it is like you said: people who don't have a strong interest in computers started taking CS as a major as a path to jobs and income.
Now, as a manager of engineering teams, I'm constantly surprised by Software Engineers that don't even own their own computers and/or have very little knowledge about how they work.
You can substitute every single possible profession for programmers above and it still holds true. Even things like groceries (everyone needs to eat) still have ups and downs: in some recessions people go to restaurants less, which is good for grocers; in others they switch to low-margin staples and profits go down.
Also during the Dot Com era. Pretty much every cycle led to more people getting into the field.
I'm not sure what the future will bring but stay humble and hungry.
These two roles are at odds with each other.
(Not a criticism! I don't personally feel informed enough to have an opinion on this subject.)
It's probably a good thing that the hype starts to die and we're seeing a market correction, hopefully back to a saner structure.
What’s happening now reminds me a lot of that.
The managers and everyone are so excited that the person did it with AI, but I just get really confused, because it seems like they made something worse that has less value, since it cannot actually correctly simulate the thing we want to test. Maybe I'm being petty and salty, but I think this is time wasted by any measure, and net-negative value, but the team wants to emphasize that we are using AI. There have been some productive uses, but the productivity trap doors are about the same as with normal development; people just seem more willing to take the trap-door ideas now.
So let’s just wait a bit before we say it hit a wall.
I think we're in for an era where many folks will be filtered out and those who know and understand code, will be in high-demand.
(One candidate example for this is the discussion I've seen in the last few days about not trying to negate something, to say "Don't do X", but instead stay positive because eventually the negation gets lost in the context window and you're better off just not putting the idea in the LLM's mind at all, where "Don't do X" comes to be seen as an LLM antipattern.)
One of the consequences of none of us having used AI for long enough is that we don't know how to onboard developers in an age of AI. This will be, by necessity, transient. Eventually we're going to max out what a person can do and we'll need more people. The supply of existing engineers will be limited. We will be forced to discover how to onboard new engineers.
But at the moment we've got our hands full, and we don't know how to do it.
The irony is, the best time to join a field is often exactly when enrollment dips, and the worst can be precisely when it is the most popular. Start a programming college program today, and the odds that in 4 years we'll have onboarding figured out and have developed some sort of need for fresh developers are pretty decent.
But I don't know what to do about the fact that the standard CS curriculum was already of debatable relevance to me in the late 90s, and I don't know what relevance it will have in four years except to guess that it's very likely to be even less. I do know that we are again affected by the fact nobody has been doing this for 20 years, like I mentioned above. There is no body of "wisdom" for an AI-powered world to draw on to construct a new curriculum. Universities would be inclined to do the obvious thing and try to chase our current practices with AI, but those aren't going to be stable enough to build a curriculum on any time soon, and a real fundamentals-based curriculum may involve less AI than people may think.
I know one advantage I have over my younger peers at this point is just a knowledge of what terms to say to the AI to get it to do what I want, words like "event sourced" or "message bus" or "stored procedures", where simply knowing that the concept exists is the bottleneck. I could see a programming curriculum based on touring through a whole whackload of concepts with their pros and cons, or at least, where that is a much larger portion of it.
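To make that concrete, "event sourced" is exactly the kind of term where knowing the concept exists is the whole battle. A toy sketch of the idea (my own hypothetical example, not from any comment above): instead of storing current state, you store the events and derive state by replaying them.

```python
# Event sourcing in miniature: state is a pure fold over the event history.
# Hypothetical toy example -- a real system would persist events durably.
from dataclasses import dataclass

@dataclass(frozen=True)
class Deposited:
    amount: int

@dataclass(frozen=True)
class Withdrew:
    amount: int

def balance(events) -> int:
    """Replay the full history to compute the current balance."""
    total = 0
    for e in events:
        if isinstance(e, Deposited):
            total += e.amount
        elif isinstance(e, Withdrew):
            total -= e.amount
    return total

history = [Deposited(100), Withdrew(30), Deposited(5)]
print(balance(history))  # replaying the log yields the current state
```

Once you can name the pattern, asking an agent for "an event-sourced account model with replay" gets you the whole structure; without the word, you're stuck describing it from scratch.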
Ask me in 5 years though and I'd almost certainly suggest a completely different curriculum than I would now, though.
Technically speaking, they are leftists who publicly oppose AI. They created a new Chief AI Officer position, but the officer has no support at all from the university and had to go to politicians for support.
Whereas the college straight up opposes AI.
But what value is any of their degrees anymore? Suspicious at best.
The "leftist" administration created a position while at the same time speaking out against AI? Doesn't seem realistic.
> Whereas the college straight up opposes AI.
Opposes AI in what way? No courses on it? Does not allow students to utilize it? I have a hard time believing they do not offer a single course on any AI subject. Many colleges are offering it as a post-grad option, at least in Canada.
> But what value is any of their degrees anymore? Suspicious at best.
In general? I don't understand what you are getting at here.
It is an interesting situation to be sure. Not my place to say what they say or do. I'm just commenting about the situation.
>Opposes AI in what way? No courses on it? Does not allow students to utilize it? I have a hard time believing they do not offer a single course on any AI subject.
Publicly opposed to the use of AI.
If a student is found using it, they are expelled. There's one public case where a student used em-dashes and got accused of cheating. GPTZero said human-written, but they got expelled. Probably just one extreme story, but I don't know.
As for courses: if you look up their 'courses', they do have a machine learning course, but it consists of a single class, "Introduction to Python".
That's it. And that's a course, not even a degree or certification.
They have a 1-year 'data analytics' masters, but it explicitly says it's not machine learning or AI. Here I was thinking university is rarely 2 years, always 3+. Yet this is just 1 year? Odd.
>Many colleges are offering it as a post-grad option, at least in Canada.
In fact, this was the more interesting case. I go to one of the networking events put on by the college, as a professional and not student. Nothing about AI technically, but the subject comes up often and almost always in negative light.
I looked into teaching AI at the college, but they have nothing at all.
>In general? I don't understand what you are getting at here.
Sorry. My bad. I have autism and don't communicate well.
Not your place to say what they say or do but you're doing exactly that? You're bringing politics into the comment and then throwing your hands up and saying "not my place to speak on this" when you are questioned about it.
> That's it. And that's a course, not even a degree or certification.
I see many universities and colleges in Ontario (where I am), for example, that offer post-grad programs on AI. I can't find a single school that has outright banned it. Do you have a source for these claims you are making? I might be uninformed if it is an American or European school but post-secondary institutions in this country seem to be generally supportive.
Teaching AI is a rather large field. Are you talking about LLMs/transformers? Are you talking about working with LLMs, which is something that seems to change every 6 months?
They do! It's a 1-year course about "Introduction to Python", and that's it. Wild.
>Teaching AI is a rather large field are you talking about LLMs/transformers? Are you talking about working with LLMs, which is something that seems to change every 6 months?
That's a valid point. But that makes education more important, not a reason to not teach it at all.