There is an understated tension here: students of every major now take the intro-level CS class because programming is integral to everything. Teaching algorithm design in that class is not particularly useful to a biologist who just wants to be able to cobble something together to analyze some data (usually in Python). As a result, the non-computer-scientists and non-software-engineers at the school would rather have a curriculum that is more "practical" and directly applies to the research (or later class projects) the students might be doing.
Some time not far in the future, we are going to accept that this is not the same thing at all as computer science, and give the computer scientists a curriculum that is a lot heavier on math and theory while adding elective or core programming courses for non-majors. That will end the compromised state that intro CS courses currently sit in. Right now, there isn't enough teaching talent to run those non-major courses (because that talent is earning the big bucks doing something else).
Why doesn’t the CS department have a different intro class for CS majors and non-CS majors?
That’s a common practice in the mathematics, physics, etc. departments.
(To have harder intro classes for students majoring in your department than for those who aren’t.)
I think these days it’s a bit different in that many schools allow high schoolers to apply directly to the CS department. But then it becomes a matter of scale, I guess; they usually have clinical professors run these classes because they demand full-time attention.
I don't think it's completely wrong. Failing a student after he or she has sunk a ton of money into it is really not ideal. IMO it seems really easy to send prospective students the first few chapters of your CS textbook/lectures and then test them on that material as a prerequisite for enrollment, before they enrol. This would be close to free, would filter out a lot of would-be dropouts, and would just save money and effort all around.
1. Back when I was in CS, the department was much smaller than it is today. This was after the 80s programmer crash, and so they just didn't have enough professors or resources to teach as many students as were interested in being CS majors.
2. You are able to teach to a higher standard if you've filtered the students that enter your department. You can have one or two people fail OS rather than half the class. At some point, it is reasonable to see if the students are committed, and they get to prove themselves a couple of years after high school, which worked better for people like me who weren't very accomplished until we got to college.
I sort of like the chance to redeem yourself during the first two years of university if you didn't have the ideal secondary school experience to get into a hot department at a hot university. CC can do that as well, I guess, but it is a much harder hill to climb.
At my university, after the usual calculus/diff eq/lin alg sequence everyone in STEM takes, we had “intro to advanced mathematics” that was proof based, taught in the spring, and a pre-req for everything higher level (abstract algebra, real analysis, etc). Most math programs have a similar “first proofs” class as their “weed out” class.
But intro STEM math is used as a weed out for other majors. You aren't going to get far in CS if you aren't able to ace your basic calculus classes.
Even when I was an undergrad (1995-1999) and a grad student (2008-2011), everyone knew that CS was a research science [that happened to turn out people with lucratively employable skills], and not a software engineering program, which existed separately at both schools.
CS 101, 102, etc. were the “non-major” intro CS classes and exactly what you describe here. They didn’t count towards a CS major at all. Neither would non-calculus-based physics classes.
Note that, at least at UNC, in every other department I know of, 400-series classes were generally graduate level. But UNC CS operated a bit differently.
I loved the challenge and the knowledge I gained, but it’s worth noting that CS at UNC was much, much harder than nearly any other major there (Chemistry may be the one exception).
The things that people are using Python for are significantly easier with Python. There are libraries designed to make these tasks super simple, and with the right IDE you wouldn't ever have to touch the shell.
Someone like you should think of it as a Jupyter notebook that doesn't need a web browser.
I'm saying that for the average scientist using these tools, a shell script is not an easier or more useful tool than Python plus libraries. Can you honestly say that shell scripts are more intuitive and easier to understand than Jupyter notebooks? Are they going to be easier to hand off to other stakeholders?
This is one of those solutions that seems better in theory, but likely isn't in practice.
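To make that concrete, here's a hedged sketch (mine, with made-up file and column names) of the kind of task in question -- a few lines of pandas instead of a shell pipeline:

    import pandas as pd

    # Hypothetical CSV of assay results; file and column names are illustrative.
    df = pd.read_csv("assay_results.csv")

    # Summarize one measurement per experimental condition.
    summary = df.groupby("condition")["expression_level"].agg(["mean", "std"])

    summary.to_csv("summary_by_condition.csv")
    print(summary)

The awk/sort/paste equivalent exists, but it is harder to read, harder to debug, and harder to hand to a labmate.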
There was no pretense that it was even remotely appropriate as a CS class.
'Algorithmic thinking' more or less doesn't exist in a lot of the classical disciplines with any degree of rigor, so a CS class is the only place a biologist is going to be exposed to it.
Specifically for utility, there is a very tight mapping between 'how do I search these strings' and 'how do I [most efficiently] design a set of experiments [searching for a result]'. Both in the literal algorithms as well as the logical framework necessary to extend past trivial examples.
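A hedged illustration of that mapping (my example, not the parent's): binary-searching a dose range for a response threshold is literally binary search over a sorted array, with "run an experiment" standing in for "compare an element":

    def find_threshold_dose(low, high, responds, tolerance=0.01):
        """Binary-search a dose range for the response threshold.
        `responds(dose)` stands in for running one experiment."""
        while high - low > tolerance:
            mid = (low + high) / 2
            if responds(mid):
                high = mid   # threshold is at or below mid
            else:
                low = mid    # threshold is above mid
        return high

    # Toy stand-in for a real assay: responds at doses of 3.7 and up.
    print(find_threshold_dose(0.0, 10.0, lambda d: d >= 3.7))

O(log n) experiments instead of a grid sweep, and the correctness argument is the same one you'd write for the string search.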
"Stats for business"
"PE for physicists"
Why isn't "programming for the sciences" a thing?
Dating myself, my college had Pascal for CS majors, and FORTRAN for physicists. The FORTRAN class was math heavy, and we had a higher level course in process control as well, plus we all took the math class in numerical analysis.
When my daughter was in college, it was Java and Python. She took Java. I had suggested to her that she could easily keep up with the CS students, which she did.
I very much favor better training in programming for scientists. Taught by a scientist. The "intro to Matlab" that many students take is really too lightweight. Also, as for languages becoming obsolete, the disciplines that I learned in the 1980s are still of value today even if the languages have changed.
You meant juggling and fencing?
A more pragmatic answer is who’s going to teach it? Someone with weak skills in the discipline? Or someone with weak skills in computing? (This is a glib answer)
Finding people at the true intersection is surprisingly hard, and those people tend to be busy.
In grad school I took an advanced statistics course in the psych department. The concepts were new to me but not to the folks in the discipline. However, the math (matrix multiplication) and basic coding were easy for me and very hard for them.
There was only one person in the department who could teach it; other faculty did that type of work entirely as "clients" who just ran tools and code from others.
It's not unreasonable for the biology department to come up with the common programming use-cases for their students, then have the CS department build a course around those skills.
My minor was Bioinformatics. The biology department taught foundational courses like biology and genetics, and the CS department taught the courses on processing data.
IME it's the same... I've also found that approach works less well at generating good learning. It (seemingly inherently) results in decontextualized knowledge and does not get students over the application gap, which is the exact problem that such things are trying to solve.
I don’t know why you’d try to throw CS majors and Biology majors looking to cobble together scripts into the same class.
And then, for everyone else, the Math department of the School of Letters and Science had their PIC (Program in Computing, I think?) series, which had two or three courses in C++ and another course in Lisp. As a physics major I took the latter.
I graduated from an Engineering-focused college in the late 1990s, and many departments offered exactly these types of courses in addition to the standard intro courses. They were typically named something like [domain] [course #] for non-majors.
Then there are the CS programs of the Ivy League schools, not as strong, but usually you have a rich parent or uncle who has already speed-tracked you into a hedge fund, so let's put those aside.
I didn't go to a top-4 CS program, and the reality is -- there is no longer a real job market for any CS grad outside the top 4. If it were not for ZIRP, it could be that there never was! There is definitely not a job market for the sheer masses graduating with CS degrees, and it will take a decade to absorb the fresh graduates.
The curriculum does not matter here, so I think all this discussion is beside the point. No curriculum stasis or change will magically lead to jobs for fresh graduates.
I say this from three perspectives:
1. Reality - just ask people if they found a job (ignore nepo-hires, also ignore startup founders with nepo-vc investments)
2. What politicians say. Both Dems and Reps have tacitly (or loudly) noted that local graduates do not cut it. It used to be subtle (https://www.fisherphillips.com/en/news-insights/biden-admini...), but it isn't any longer: https://www.bbc.com/news/articles/clyv7gxp02yo
3. How people act. Foreign workers want the jobs more and are willing to do anything and learn anything to get it. My office is 95% non-us workers. They work hard.
Firstly, there are not "plenty of jobs for CS grads" -- if there were, you would have supply (US CS grads) matching up with demand (US tech jobs). Yet you see tons of unemployed CS grads. FAANG mostly has hiring freezes, and you have to be part of the inside club to be let in. Many CS grads go into BS jobs way under their potential (random government agencies, contractors).
Walk into any US tech area and you won't see any Northeastern CS grads, or many non-top-4 grads. You will either see top-4 CS grads in leadership positions or at their own startups -- or you will see foreign grads. In some offices, 90% or more of the workers are foreign grads, not US school system grads.
Tons of job postings are fake. There was an entire HN post on this last week, companies posting fake jobs. Sometimes it is just to show a best effort before an insider is hired.
I think LLM copilots are a factor, but they are just a convenient distraction from the real problem -- a supply-demand-price mismatch. ZIRP conveniently hid the problem, but now it is out and visible.
Why are those jobs deemed bs?
> Why are those jobs deemed bs?
Because many of us went into the field bright-eyed, thinking about working on cool products, designing the latest algorithms, or being part of world-renowned teams. NOT to run the nightly batch job for the Massachusetts Department of Family Services wage garnishment system. NOT to fix bug tickets on the local tax office's COTS implementation, while all the actual software is built elsewhere. NOT to work as a contractor without benefits.
Just look on LinkedIn, filter by Northeastern CS, and see how many people have green semi-circles. Look at where people are landing jobs. Filter out nepo-hires, which are easily visible (e.g., jumping into a leadership role straight out of school).
But I started at a lame, crappy job 10 years ago after college. It's not a big deal; it gets your foot in the door, and then in 2-3 years you can move up to a better company. I did this a couple times, and after 10 years I'm now working at a FAANG-adjacent company (household name but not part of FAANG).
That's what normal life looks like for a typical CS grad. The fact you think getting a job at FAANG is expected or normal right out of the gate shows you're in a bubble. It happens, and I've worked with colleagues who did it, but it's not the norm.
And nobody is changing the world with their code unless you're Linus Torvalds or similar. Best to squash that naivety earlier rather than later, lol. It's a job, and a passion for some (like myself); that's it. It's also healthy to find hobbies outside the computer, something I learned the hard way.
Guess the smarties should have measured the patterns of physical social reality rather than get sucked in by propaganda.
What physics demands that society align with the spoken philosophy of a people? It’s kind of on the people to demand politics align with the philosophy. But go on, people; scream at your screen like gramps did at Dan Rather.
Distribution of education does not guarantee distribution of intelligence.
CS major numbers == supply
I'm speaking about jobs == demand
I had been to a Scheme conference in Washington, held alongside the Clojure conference one year, and it was attended by many undergraduates from Northeastern (including the authors of those books, whom I got a photo with).
I have to feel sympathy for those undergraduates I spoke to. They gave me a strong sense, even then (8 years ago), that it was time for the university to move on in its language choice.
I had a similar experience in the late 90s when the world was picking up Java, and our University insisted on teaching in Eiffel.
Basically, Java and OO and design patterns are taught upfront because it turned out this was a huge stumbling block for a lot of people -- bigger than, for instance, C and pointers. It just doesn't click for a lot of people, and they end up struggling a lot through the rest of the major.
So it's not that these are the most crucial concepts, but you want people to "fail fast" and have a sense if they'll succeed in the major within the first year
Later: I edited "a lot of" to "some of"; I was coming on too strong.
Furthermore, part of PLT is learning from the past: what worked and what didn't.
I watched one or two lectures and it made no sense to me, so I gave up. I had no idea WTF "objects are like nouns and methods are like verbs" was trying to teach me. I just wanted to make my computer do things.
Around 14-15 I started playing around with my TI-84 calculator, writing simple programs. The TI-84 used a form of BASIC where I could write a program that took INPUT, plugged it into an equation, and printed it to OUTPUT, and it felt so much more approachable than the neo-neo-platonist OO lectures I'd watched. From there I gradually started writing more complicated code until I eventually started to get why programmers would define functions to stop repeating themselves, or why they might implement custom types.
> So it's not that these are the most crucial concepts, but you want people to "fail fast" and have a sense if they'll succeed in the major within the first year
I'd instead posit that so many people "fail fast" with OO because they go into their class being interested in programming, but have no idea wtf is even going on with programming and drop it, because they're forced to learn all this inane trivia [0] and write all this boilerplate (define a class, methods, observability, return by value vs ref) they don't understand before they can even run a program. They think maybe they're stupid or not a good fit for programming and drop it.
IMO a better teaching language would be one that lets you opt in to OO and functional features but also lets you write really simple programs like "take a number, multiply it, and print it". I think that's why Python is so popular these days. It helps that the lack of semicolons/curly braces, optional typing, and modifiers [1] removes so many distractions and gotchas that stymie absolute novices.
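For contrast, here is the entire "take a number, multiply it, and print it" program in Python (my sketch of the point, not the commenter's):

    # The whole program -- no class, no public static void main, no semicolons.
    n = float(input("Enter a number: "))
    print(n * 2)

Functions, classes, and type hints can then be introduced one at a time, when the student actually needs them.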
I also think most CS educators do a very poor job explaining CS concepts to beginners without realizing it. "Methods are like verbs" is absolute nonsense without a moderate to large amount of computer science knowledge to contextualize it. Some of my teachers were actually pretty good, but I also don't remember much acknowledgement that programming didn't have to be this way, and that the language/tool was designed that way because the abstraction comes in handy. That'd probably help a lot in retaining students who successfully suffered through their first semester of CS 101 in Java but hated it so much they decided to swear off programming.
[0] Always start your program with "public static void main(String[] args)"! Don't worry, that'll make sense in a year -- or in six years when, one day on the toilet at your software engineering job, you realize that it really was a static function returning void that took String[] args.
[1] Static and Foo& are justifiable, although static should arguably be implicit for a beginner language. Forcing students to learn about final, const, val/var, public/private, etc. early on is just stupid. I never understood why these were actually useful, or had a good reason to use them, until I'd already graduated.
But somehow, in the curriculum, we expect complete noobs to just get these abstract, non-relatable concepts without any context.
I'm sure if we look back further, C++ displaced some other language (Pascal?) for exactly the same reason. And likely the same for the language that preceded C++. I'm just not old enough to have personal experience here.
I think a lot of 2025 developers would be alarmed to think that a project had started from an object-oriented design perspective.
Then you have data work (growing above average), scripting, and parts of frontend that are done differently, but they are still a minority of the job market.
(I'm not a Lisp or functional partisan; like I said downthread, it's Go and Rust for me these days mostly).
Keep my original point in mind here, which is simply that OOP principles are not fundamental to software development in the way algorithms, data structures, memory models, and concurrency are. We're discussing curricula that have students learning class-based object-oriented programming as expressed in Java as a requirement, and basic systems programming as an elective. That's backwards.
I don't care if you still use OOP. I'm not advocating for its removal from production codebases. There are plenty of things that earn their keep in modern product stacks that aren't fundamentals of computer science!
And when you look at what the actual classes teach: it's usually patterns-y Java. GoF patterns are effectively obsolete. You might still need to grok them, but that's trade knowledge, not fundamental computer science. You don't come out of a CS degree knowing Hibernate or SAP, either, but people figure it out on the job.
OO is important, and if in three years of computer science there's no time to teach it, then you have to ask what the heck they are doing.
OO is everywhere and it’s an important software concept and is the right solution for certain classes of problem.
Inheritance, polymorphism, and encapsulation inform every major professional programming language and framework (except C). Even Go, which in many ways is an active response to OO, uses most of those concepts extensively. One major challenge for people programming in Go is how to adapt the familiar patterns to it.
Now I’d probably not teach the whole GoF as an exercise in cataloging patterns but teaching a few of the most common while showing the concept of patterns (probably the least well understood concept in development) seems sensible.
Rust and Go have concepts that approximate class-based object-oriented programming (and: if you look at NEU's OO syllabus, that's what they're doing), but only barely. You could teach someone how to use an interface without teaching "polymorphism" as a concept. Encapsulation as a language feature is an example of an OO-ism that I think is done and dusted. We hid implementation details in C, too! The notion of an abstract data type is fine to hold on to. The idea of "friend classes" and "protected" fields and other ACLs for variables, though, I don't think we're going to see come back.
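Keeping to the thread's one running language, here is a hedged Python rendering of "use an interface without teaching polymorphism as a concept" -- typing.Protocol is Python's structural analogue of a Go interface, and the Shape/Circle/Square names are mine:

    from typing import Protocol

    class Shape(Protocol):
        def area(self) -> float: ...

    class Circle:
        def __init__(self, r: float) -> None:
            self.r = r
        def area(self) -> float:
            return 3.14159 * self.r ** 2

    class Square:
        def __init__(self, side: float) -> None:
            self.side = side
        def area(self) -> float:
            return self.side ** 2

    def total_area(shapes: list[Shape]) -> float:
        # Works for anything with an area() method; no base class required.
        return sum(s.area() for s in shapes)

    print(total_area([Circle(1.0), Square(2.0)]))

A student can write and use this without ever hearing the word "polymorphism"; the concept name can come later.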
And perhaps we’ve moved past inheritance hierarchies (though multi/parent inheritance has been interestingly used recently) but even then, knowing about it and its negatives seems as fundamental as understanding goto.
Are they on the programming side of the computer science curriculum rather than computational math? Yes! But much of the normal curriculum is. Similar to how I’d expect the relational calculus to be taught, as well as how SQL relates to it.
I think you’re pretty far out on a limb here, and I’d be very skeptical of a curriculum that didn’t have a segment on OO early on.
Whatever you want to say about the OO inspiration of how Rust and Go are structured, they are both clear and deliberate rejections of the OO orthodoxy from that period. They both deliberately don't have classes. They both deliberately eschew inheritance. They both deliberately have less-granular, less-configurable "encapsulation" rules.
And: "encapsulation" is a term we only have because of class-based object-oriented programming. As a professional C programmer in the 1990s, I'd have gotten dinged on a message board by a C++ person if I claimed that I was "encapsulating" when I hid the structs for my timer library or my trie library behind a void pointer. We had abstract data types long before mainstream class-based OO, and then class-based OO claimed credit for the idea.
I also think, as you've alluded to here, mainstream OO has a lot of bad ideas that get people in trouble. The obvious one is animal->mammal->cat->tabby, but there's also the debacles that happen when people try to unify storage with the same inheritance ideas used by compute.
More than anything else though, I think these are the kinds of details you can just learn on the job. What's fundamental to programming? Ironically, it's a lot more of the stuff you get in a Scheme-based class --- and I'm saying this as someone who does not like Lisp. I think Python is a step in the right direction here.
The two differences in Go are composition being the mechanism for inheritance, and structural typing in an otherwise strongly typed language. Both of those were available in other languages that strongly identify as OO.
My first CS class was taught in Scheme. I think that’s the _right_ way to begin to learn. But very quickly after that, OO and imperative styles ought to be introduced.
I understand that not every developer is a big fan of OO but that doesn't mean we can ignore it.
OO is out of fashion just like blockchains and NFTs are out of fashion, and the same way AI will fall out of fashion in the future. The huge hype around it will die and what's left covers the few useful scenarios.
OO isn't debunked, and it's not out of fashion.
All that's happened is that it's no longer gospel that "the only way to program is OO".
I write code that has all sorts of styles and approaches that fit the task at hand and sometimes the right tool for the job is OO.
I get the sense you're saying that OO has been proven to be hokum, that no one should learn it or do it anymore, and that all that remains of OO is the smoking ruin of 20 years of Java. That's not correct at all.
My undergraduate degree is from RPI. I have worked with many NU grads, and they are often very good, but there have been many eye-opening moments for me with them in terms of how different the material they learned was and what was left for graduate school when it comes to the core mathematical fundamentals of computer science. To be fair, I've run into engineers from other schools that leave all of this to graduate school too. At my first internship I shared a cubicle with a graduate student at Boston University. She was taking a graduate course on algorithm proofs, and the course used the same book that we had used in the major weed-out class we had in the spring of freshman year.
"Program Design" has changed almost as often as popular programming languages during my career. Almost none of those core mathematical fundamentals have changed at all.
Algorithm proofs around complexity, efficiency, etc.
At least an introduction to the design of languages, parsers, and grammars
Algorithms and concepts in the design of different database systems (not how to use specific databases)
Fundamentals of operating systems and systems programming
Some crossover with computer systems engineering courses. You must know at a basic level how logic gates are implemented, how an ALU is built, and how these blocks are built up to construct a CPU (see the sketch after this list)
These are the actual fundamentals of CS, and they change at a glacial pace compared to languages and design patterns.
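Not from the original comment, but a hedged sketch of what "logic gates up to an ALU" means in practice -- AND/XOR composed into a half adder, then a ripple-carry adder (the function names are mine):

    def half_adder(a: int, b: int) -> tuple[int, int]:
        return a ^ b, a & b          # (sum bit, carry bit)

    def full_adder(a: int, b: int, carry_in: int) -> tuple[int, int]:
        s1, c1 = half_adder(a, b)
        s2, c2 = half_adder(s1, carry_in)
        return s2, c1 | c2

    def ripple_add(x: int, y: int, bits: int = 8) -> int:
        # Chain full adders bit by bit, exactly as the hardware does.
        result, carry = 0, 0
        for i in range(bits):
            s, carry = full_adder((x >> i) & 1, (y >> i) & 1, carry)
            result |= s << i
        return result

    assert ripple_add(37, 91) == 128

The same construction, built in silicon instead of Python, is the adder inside every ALU.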
One area where NU does fall down IMO is that they offer "combined majors" with CS and other subjects that eschew some of these courses to make room for the other half of the major, which will be an unrelated subject. This offering is a mistake because you do miss out on some key concepts. If you're working with a pure CS major, however, they were required to learn all these concepts.
https://felleisen.org/matthias/Thoughts/Developing_Developer...
I always advocated a C-based language, like C or C++. That's where my program (not NEU) began, and I hated it but am grateful. We eventually moved on to Java. Later courses, through my third year, let me briefly work with functional programming. We never touched Python or web frameworks until our junior and senior year projects, and even then it was generally voluntary and depended on the project we had proposed for our databases or algorithms classes.
What I do think CS programs should be evolving for is LLMs. Python + ChatGPT is powerful without the user knowing too much of the logic off-hand. That's a problem for new CS students who need to learn the fundamentals of logic, reasoning, and programming. I don't know which languages work "less-better" with modern LLMs; all I know is that ChatGPT and Claude work exceptionally well with Python.
I suppose, as long as we keep paper exams, all hope is not lost. Maybe just a little, in my opinion.
Computer science is to programming in the same way that astronomy is to navigation.
How can a student "understand" LLMs without the background knowledge of computer science?
Programming courses can teach how to use LLMs, in the same way it can teach how to use Python, Java, language-de-jour.
Engineering continues to demand that students learn principles grounded in theory (calculus and statistics and discrete math), rigorous analysis (pattern recognition and learning), and system compositionality (design using reliable components as building blocks). CS curricula largely jettisoned this approach after higher-level languages like FORTRAN and C caught on, and have retained only vestiges of old-school mathematical rigor (basic inductive proofs and algorithmic analysis, albeit dumbed down to O(n) only). In the past 20 years, CS has even given up teaching software engineering (compositionality and reliability) as a requisite skill area. That speaks volumes about the difference that already exists between CS and engineering.
With the enormous growth in college preparation for software careers in the past decade, it's little wonder that most students prefer a less rigorous, less formal curriculum, and that colleges will choose to meet demand where it lives. Thereafter, if employers want to hire grads with math or engineering skills, they will turn back to engineering as they did long ago. I expect programs in EE and Computer Engineering will adjust their curricula (or add minors) to fill the intellectual void that 'CS For Poets' will leave.
Without knowing the new curriculum, I guarantee you that one of the biggest daily complaints about the old curriculum was the requirement that students write unit tests ("check-expects"). Do you think that's going away? Hahahaha. The laugh will be on students who think they're getting something different because the label on the can changed.
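For what it's worth, the idea transfers directly. A hedged sketch (mine, not the curriculum's) of what Racket's (check-expect (double 2) 4) becomes in a Python course -- plain asserts or pytest-style test functions:

    def double(n: int) -> int:
        return 2 * n

    def test_double() -> None:
        # Each assert plays the role of one check-expect.
        assert double(2) == 4
        assert double(0) == 0
        assert double(-3) == -6

    test_double()

Different syntax, same daily complaint.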
I know a lot of students hated it -- frankly, those seemed mostly to be the students who were only doing computer science programs because they’d heard they could make a lot of money in the field. The “real nerds” all seemed to love it, and now nearly 15 years later those are the engineers in my network who have built the most impressive systems and products.
I guess I’ll have to update my default instructions for recruiters from “automatically interview anyone with a degree from Northeastern” to add “if they graduated before 2025”
That's a good thing. I don't know whether your assertion about the breakdown between "real nerds" and the other camp is accurate or not, but I think this point stands on its own regardless--learning is hard. It's uncomfortable. It's unpleasant. If it isn't, you're not being pushed hard enough. So what's the point of asking students how they feel about it? Why make strategic decisions based on those data?
I'm genuinely curious, not trolling or anything. It seems completely baffling to me that educators behave this way, and I'd really love to understand why.
Schools already handle many cross-competing concerns across stakeholders (PTA, Taxpayers, Town Government, State Government), so I suspect they would want to reduce their enforcement & oversight load. They'll choose a teaching style that makes everyone happy or at least complacent, even if they know "fun is not learning".
If you can get college students' idle brains curiously contemplating the how and why of the subject, that's when the tuition is really worth it.
Another way to say it is "rewarding".
Fundies 1 and 2 were great, but I have always felt that the amount of delayed gratification in NU’s CS program was much too long and incongruent with the university’s focus on experiential learning. I wanted to get my hands dirty and build something and the whole curriculum felt too academic.
Northeastern CS is world class at compilers and programming language research, so I always understood that the undergrad program would tend to be academic as a trickle-down effect.
It’s a spectrum with tradeoffs, so I think balance is key. But happy to see the pendulum swing a bit and think it will be good for new grads, especially as more coding work becomes automated.
There are two separate points. The first one, almost a bit hidden, seems to be a "keep the student numbers up" change: allow placing out with AP CS credit and so skipping the intro courses; making the curriculum easier; reducing the number of students who withdraw from some of the modules; removing the teamwork (code swap) exercise; rolling back on fundamentals and design principles. This is just the word of one TA, but it's big if true.
The second one is about the language change. Look - if you're as great a TA as you say, you can teach design in any language, it just works a bit differently. You can teach good design in python, though it's a lot harder than just teaching python. You can teach design in Java, at some point you'll realise that half the design patterns book is still relevant (the other half has been eaten by streams and lambdas). You can teach design in golang, the people who wrote it at google did think of this. Using a Lisp brackets-style language doesn't give you magical powers, and I doubt the full details of macros and whatever the racket equivalent of call/cc is are that accessible in a beginner course anyway. Even SICP has a JS edition these days.
Personally I think "OOP light" is where it's at: interfaces instead of subclassing, methods that can be called on objects, encapsulation and modularity, and a package-private option so you can unit test stuff but indicate to users that this is not part of the API. Immutable objects and collections are good for many things and need to be taught, but sometimes there's good reasons just to make something with internal state and a fixed set of methods to modify it, so you can reason inductively about its invariants. You can get this out of many languages and teach it properly if you really know what you're doing. And you need actual programming projects, not just exams or "write a 20 line program" midterms. You don't need racket for this.
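As a hedged illustration of that last point (mine, not the parent's; the class is hypothetical): internal state plus a fixed set of methods, so you can reason inductively about an invariant:

    class Counter:
        """Invariant: _count >= 0, always."""

        def __init__(self) -> None:
            self._count = 0          # leading underscore: not part of the API

        def increment(self) -> None:
            self._count += 1         # preserves the invariant

        def decrement(self) -> None:
            if self._count == 0:
                raise ValueError("counter cannot go below zero")
            self._count -= 1         # guarded, so the invariant still holds

        @property
        def value(self) -> int:
            return self._count

    c = Counter()
    c.increment()
    assert c.value == 1

The induction is simple: the invariant holds after __init__, and every method preserves it, so it holds forever -- which is exactly the reasoning skill worth teaching.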
Let me instead suggest that one question we should ask is: What kind of students will be successful in the new system? What kinds of students will not?
One interesting aspect of the former curriculum is that (IMO) it provides an entry for students who are not entering with a lot of pre-existing knowledge. One of the criticisms of "old school" computer science teaching is that it privileges students who already have exposure to the material... The former curriculum is not the only way to level the playing field, but it certainly does provide a more level playing field for students who might not even be sure about the major.
I will stop with the suggestion that (IMO) pre-existing experience is definitely not the best indicator of future developer quality, so I value a curriculum that does not select for this.
(Caveats: never saw the new curriculum, it could be just fine, there are lots of other ways to accomplish this goal, but still... I am concerned.)
The student populations at MIT and NEU, particularly in CS, are fundamentally different. The majority of CS undergraduates at MIT participate in academic research, while the vast majority of CS undergraduates at NEU do not (do not let NEU's exceptionally high computer science research output [1] confuse you - the undergraduate and graduate schools are very separate). MIT educates significantly fewer students than NEU. MIT's algorithms class (6.046) is significantly more rigorous than NEU's equivalent (CS3000) - just compare the publicly available curriculum and problem sets [2,3]. In general, MIT's CS curriculum caters towards the third of the student body that goes on to do PhDs, while NEU's CS curriculum caters towards the vast majority of students that beeline towards industry [4,5]. The institutional goals and educational values between MIT and NEU could not be more different. I know all of this to be true because I've spent a significant amount of time at both institutions.
I don't know if NEU will butcher its CS curriculum. I hope not. I guess we'll just have to see.
P.S. It's worth checking out Pyret [6], essentially a functional teaching language. The language is mostly written by NEU staff, so I wager NEU's future CS curriculum plans to phase out Racket in favor of Pyret.
[1] https://csrankings.org/#/fromyear/2014/toyear/2024/index?all...
[2] https://courses.csail.mit.edu/6.046/
[3] https://tlarock.github.io/teaching/cs3000/syllabus.html
[4] https://facts.mit.edu/alumni/
[5] https://www.northeastern.edu/experiential-learning/co-op/
[6] https://pyret.org
Before the current curriculum was implemented, a good portion of students were having difficulty finding jobs and co-ops. After the switch, employers were very eager to hire these students.
The change is not a mere "switch to Python" (which itself is not nothing). It is the replacement of a good curriculum with something mediocre.
Also, I don't know how MIT is doing after the switch.
I worked at a coding bootcamp in 2017. And I have to say, I was a bit jealous. Where I had learned through methods at university that were a bit quaint, my students learned JS with the latest frameworks. And sure, you can argue whether the length of 3 months is enough, but they were surely getting a better education. What they learned in 3 months took me at least 6 months to a year to learn, in terms of how useful it is.
The thing is, especially as a beginner, learning any programming language will give you similar difficulties (glossing over some nuances). So why not just learn a practical one?
I think once a student has had a practical programming course under their belt, only after that should more esoteric languages come to showcase certain concepts. I believe that they'd be more motivated to learn them as they're more into the groove of programming.
And this is coming from someone that has programmed 2 years in Pharo.
I lol'ed.
> The “code swap” at the end of the semester, where students are required to build upon other students’ code, is one of the assignments students struggle most with
Wow, sadism may be common in academia, but that is just on a whole different level! A few hours of waterboarding would be nicer.
There is exactly zero evidence indicating that Racket would be a better introductory language than Python, so why not go with what is popular? CS students are de facto expected to already know basic programming. So with Python you can jump straight to algorithms; you don't have to waste time with a foreign syntax and an esoteric interpreter.
While noble in intent, one suspects Kaplan and Kölling may be on a quixotic quest in a money wins world, outgunned by the demands, resources, and influence of tech giants like Amazon — the top employer of Northeastern MSCS program grads — who pushed back against NSF advice to deemphasize Java in high school CS and dropped $15 million to have tech-backed nonprofit Code.org develop and push a new Java-based, powered-by-AWS CS curriculum into high schools with the support of a consortium of politicians, educators, and tech companies. Echoing Northeastern, an Amazon press release argued the new Java-based curriculum "best prepares students for the next step in their education and careers."
Links at: Should First-Year Programming Students Be Taught With Python and Java? https://developers.slashdot.org/story/25/01/05/1853210/shoul...
My main complaints center on these: there's no art in anything anymore, in that unless you're doing the lowest common denominator you are doing it wrong; companies show more and more success in influencing others (namely universities) to be their training centers for them; and universities should be about learning and exploration, not subservience to culture.
As the old Dijkstra quote correctly states, physical computing devices have as much to do with CS as telescopes have with astronomy. They're eating utensils, not the main course. Computer science (the name in English is misleading and horrible) is not about physical computing devices. Devices are merely an instrument. Knowing how to use this instrument well is very useful for people in the field, but it is incidental to the field itself.
Even the distinction between "high-level" and "low-level" languages is meaningless from a CS perspective, as it essentially presumes a target language on a target architecture (with the target "low-level" language) that is treated as normative and "real", but again, that is an implementation concern wrt the instrument. From the CS perspective, compilation is just translation from one language to another.
So I dare say that a CS curriculum that starts with digital logic is not really a CS curriculum, but an electronics engineering curriculum. This is the lesson here.
If a student knows how to use a computer (a prerequisite to starting software project) they will demonstrably have more opportunities for learning than one who doesn't. And it's not even close.
"He who seeks for methods without having a definite problem in mind seeks in the most part in vain."
A computer science class teaching how transistors work is more like a creative writing class teaching how pencils are made, and focusing on the quirks a particular language is like the creative writing class teaching a specific style of cursive.
There are degrees which blend the EE and CS worlds, commonly referred to as computer engineering. That would be more along the lines of mixing the hardware/boolean logic idea with then software development. But that's definitely a different perspective compared to what computer science historically meant.
That's how we did it in Slovenia (University of Ljubljana). You start with discrete math and Boolean algebra, and you learn how transistors work in physics; then you learn about digital circuits and write some assembly. After that it kinda feels like two parallel branches: one set of professors teaches you the math of it all, the other set teaches you how all that math works in practice.
It's a really good curriculum.
Yes, yes, you also have programming classes in parallel with the digital logic and the math and the assembly. Gotta make sure folks can get those all-important part-time jobs and feel like they're learning something useful. It's common, or at least was back then, for students to have jobs in industry while studying CS. I honestly think having a job and getting to apply what you're learning makes it stick more... even if it leads to abysmal completion rates (an average of 7.5 years to graduate when I was there).
Later on, I joined the curriculum committee and argued for it to be taught in Python. The faculty weren't convinced at the time but I see now that eventually they moved the intro course to Python and kept the later courses in C++, which seems like a wise decision. Python is far more approachable and will serve science students well if they ever need to use Pandas etc. (The second-semester C++ course has you implement the STL container types, which is an extremely valuable exercise.)
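For flavor, here is a hedged Python translation of that exercise (my sketch, not the course's actual assignment): a growable array over a fixed-size backing store, the way std::vector sits over a raw buffer:

    class Vector:
        def __init__(self) -> None:
            self._capacity = 4
            self._size = 0
            self._data = [None] * self._capacity   # fixed-size backing store

        def push_back(self, item) -> None:
            if self._size == self._capacity:
                self._grow()
            self._data[self._size] = item
            self._size += 1

        def _grow(self) -> None:
            # Doubling keeps push_back amortized O(1).
            self._capacity *= 2
            new_data = [None] * self._capacity
            new_data[:self._size] = self._data[:self._size]
            self._data = new_data

        def __getitem__(self, i: int):
            if not 0 <= i < self._size:
                raise IndexError(i)
            return self._data[i]

        def __len__(self) -> int:
            return self._size

    v = Vector()
    for n in range(10):
        v.push_back(n)
    assert len(v) == 10 and v[9] == 9

Writing this yourself is what makes "amortized constant time" stop being an incantation.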
I also like the fact that it is functional and immutable by default, and that they are working on type checking.
Seeing this, I can't wait to hear about the students' experiences with the new curriculum.
The ideas behind a database are more important than the concrete database used. If the ideas are taught, then the students can adapt when the fashion changes.
So instead of teaching something that you can use on your first job, they teach principles you can use for your entire career. And times do change.
Remember the good old languages Fortran, Algol 60 etc. That's where the jobs were...
CS (traditionally) is more about algorithms, limits, growth. There's a bunch of complexity in even just understanding and calculating what a computer can do.
The Software Engineering degree makes so much sense if you're studying about how libraries can be tied together, upgraded, unit/integration testing, container and source code management, etc.
They are truly different disciplines at this point.
When I hear "computer science" I hear the study of computers, which in my head means software, hardware, and theory. It's like having doctors that study humans theoretically and others that do surgery.
Software Engineering is the study of how to efficiently build software, typically on a medium to large team.
The Dijkstra line sums it up well: "Computer science is no more about computers than astronomy is about telescopes."
Software Engineers : Doctors :: Computer Scientists : Biologists/Biomedical Scientists
is perhaps reasonable.
And not because they were outdated and useless when the course started, though some did have issues. I think the least outdated "on the job applicable" material is Java fundamentals, and that's because you can still write Java 6-style basic stuff in the current JDK -- though a lot of the interesting and powerful stuff we learnt is no longer available in the JDK.
OTOH, fundamental principles - algorithms, including set logic in RDBMSes; low-level programming (which, to the annoyance of some, was done in SPARC assembly or a random assembly designed just for a given assignment); robotics using a programming stack and parts never seen outside the university; various in-depth studies of different theoretical or scientific areas - all of that is material I use to this day in many different jobs, when it allows me to understand and reason in ways I couldn't before I joined university - and I already "knew" how to program then.
BTW, within the 5 years I spent at university, "JavaScript on the server" turned from niche use cases within some stacks, usually running Mozilla Rhino, into a complete new ecosystem that was surviving its first big fork and becoming used in big projects. Android went from one phone and "who knows" to a pretty mature platform holding half the world, with 64-bit CPUs, tablets, and the first wearables; similar evolution for iOS. In fact, arguably over my entire university time we had an explosion from "you can now make mobile apps easier" to "mobile app developer is lucrative" all the way to "you're probably not going to make much money as a solo developer anymore".
That's the point. CS curricula are supposed to teach you deep skills and principles, not how to fiddle around with git.
This feels like a step backwards.
I think the more likely cause is pressure from their co-op program's partner employers to make the intro curriculum more 'practical' (i.e., outsourcing training from the employer to the school). It's beneficial in the short term but IMO a loss in the long term; Northeastern is a university, not a bootcamp.
It’s entirely reasonable to ask why students’ valuable learning budget is being wasted on Racket.
If you go to university you have a reasonable expectation to learn relevant things.
But that complaint can be made about any language! "This dynamically typed language won't allow students to understand type safety." "This high-level language won't allow students to learn pointers and systems programming." Etc.
I believe that an intro course should get students coding since the first major hurdle is learning how to construct any kind of program at all. The switch to a more "employable" language isn't going to make education worse.
>Racket was chosen because it has “teaching languages” that can gradually introduce features as students are taught the relevant design principles.
So no, that complaint can't be made about any language.
> I believe that an intro course should get students coding since the first major hurdle is learning how to construct any kind of program at all. The switch to a more "employable" language isn't going to make education worse.
None of this is the issue at hand. The switch to Python is because industry uses it. The article correctly makes the point that Racket was intentionally designed to get students coding as easily and quickly as possible. It has multiple steps of teaching languages for exactly that purpose, introducing concepts in ways that let students grapple with them one at a time in an interactive environment.
Meanwhile, in Python, complex topics like duck typing, object-oriented methods, exceptions, the distinction between iterables and lists, how to use a command line/terminal, and how to configure an IDE must be covered before people can start writing code for the exercises. Racket is streamlined for beginners.
No, they don't have to be at all. You might as well suggest you need to learn the JVM before writing a line of Java.
You thought that supporting multiple "programming paradigms" was a nice thing, but it's the opposite for teaching beginning students. Experienced programmers want expressivity/customization/choices to do whatever they want. That's not what newbies need when they get stuck on an assignment.
Also, see SolarNet's comment. https://news.ycombinator.com/item?id=42677918
At least "Racket" is aptly named.