114 points by jdgr 6 hours ago | 10 comments
  • ClawsOnPaws 4 hours ago
    I'm in a similar position to the OP, unemployed for about 10 months, with tons and tons of applications sent both remote and local, and yeah not sure where this is gonna go or what I'm supposed to do. Also disabled, my eyes don't work so that automatically removes many, many non-software jobs I'd otherwise do from the equation.

    Don't even really have anything else to say other than that, but maybe commenting it somewhere helps someone else realize they're not alone. I don't know how that helps you or me, but that's what I got. Maybe there's still something for us somewhere, but it is very difficult to stay motivated, and I don't have an answer.

    • MakeAJiraTicket an hour ago
      I'm not in your situation, but I've hit the bottom of despair and found the inner "fuck it, we ball" within me. I don't know what's an option for you, but I'm learning bartending, stocking shelves, and having irresponsible sex with the young women I work with in retail.

      I enjoy software development and hopefully one day I will return to it, but I am but one tiny kernel of corn in such a mighty ocean of shit, so I might as well ride the waves instead of fighting them. Maybe your calling is scamming Indians or scamming Americans or scamming Indian scammers. You aren't alone, but the attitude you have will never stop mattering. See if you want to go back to school, or start a tutoring program for kids. Motivation is for morons; do something.

  • AdieuToLogic 39 minutes ago
    From the well-written article:

      I have spent months adjusting my resume, applying for all 
      jobs where my skill set may be of use, building 
      proof-of-concepts using Claude, and doing cold outreach to 
      anyone who may be interested in my potential products or my 
      services. The well has gone dry. 
    
    A major quandary companies are finding themselves in is "resume fraud": being inundated with applicants only to find that 99%+ have used GenAI to produce a bogus work history tuned to satisfy the job posting. It has reached the point where many companies simply give up trying to identify "real" applicants via online submissions.

    It is analogous to email spam in the 90's, before anti-spam technology was mature.

  • AussieWog93 an hour ago
    Can't offer you any work unfortunately, but have an updoot. Hope this gets to the top and helps you provide for your son.
  • arkt8 6 hours ago
    More than ever, it is time to be stoic. Have things, but live as if you had nothing. And as obvious as the author says it was, it was also predictable.

    Even now, I see high prices in my country for laptops with only 4 GB of RAM and Celerons.

    That hardware could do wonderful things if, in the 2000s, people hadn't bought the argument that hardware is so cheap we might as well write inefficient code. The same hardware that could play a YouTube video in the 2000s cannot even open the website today. Electron sends hugs...

    Now people are mad about AI, but until when? Until the oceans dry up, like in the movie Oblivion?

    And professionals? The generation of specialists will pass... and people will soon depend blindly on AI if the course of things isn't stopped, or at least corrected.

    I think the author could have brighter days in the future (and some even in the present, in hidden niches), as knowledge will always be precious.

    The main lesson I take away is to buy less IT and fewer buzzword promises, and to find the place where knowledge and craft walk side by side.

  • donatj 5 hours ago
    I've come to the conclusion in the last couple years that being the guy who understands how the abstraction works under the hood is treated by companies as more of a liability than a virtue.

    More and more places just want Jira tickets done fast instead of someone who's going to push back or question whether this is the best way to build something. They want the thing; they don't care if it works well. They don't care if it's efficient. They want it now.

    We've been moving to React, replacing an internal framework we've been using for over a decade that has worked wonders for us. The biggest driver of the move is "hiring".

    My general sense is that nobody understands how React works under the hood. The answer I get when I ask questions is generally just "don't worry about it".

    Everything is giant, overbuilt, and terrible because most people never bothered to learn even a single level up from where they do most of their work. The people who do become unhirable. Everything takes hundreds or thousands of times more cycles and electricity than it should because people can't be bothered to understand what they're doing.

    • rdevilla 5 hours ago
      > I've come to the conclusion in the last couple years that being the guy who understands how the abstraction works under the hood is treated by companies as more of a liability than a virtue.

      This is one of the most alienating things about the modern software engineering industry. Someone who grew up just fucking around with computers since they were 5 is supposedly now on even footing with someone who took a 16 week bootcamp and a Claude subscription and has never seen a terminal before.

      I was at a drum and bass show recently and talked to one of the other people there. It was obvious I didn't really listen to that much drum and bass, as I couldn't name anybody except the most popular artists. You see people's reactions change slightly when they discover you are not really part of their music scene - you're an outsider, or a tourist, or even a poser. That's not even a problem, that's just the way subcultures are - you've either lived and breathed that way of life, or not.

      What LLMs are doing is they are automating the manufacture of posers and cultural appropriators at scale - you don't really understand the nooks and crannies of this territory, you never actually lived on IRC or in the bash terminal - but you can sure wave around these oversimplified maps of the territory with all the back alleys and laneways missing, and use your pocket book of translated phrases to pose as a native.

      > My general sense is that nobody understands how React works under the hood. The answer I get when I ask questions is generally just "don't worry about it".

      The problem in software is it seems that we are losing the ability to distinguish between appropriators of computer geek culture and those who do "speak" programming languages natively. The bar has fallen so low that I can't even expect people to understand the difference between runtime and compile time. Anybody who brings up such advanced and esoteric (read: high school level computing) topics is viewed with scorn, as if their ability to expose ignorance on foundational topics presents an existential (or career) threat.

      There's been a rise of anti-intellectualism in software from people with non-STEM backgrounds who actually disdain seeking out and possessing such knowledge. It's utterly useless to study - just like math. I find it harder and harder to locate hobbyists, especially here in Toronto, who bother to go below the abstractions not just because they want to, but because they are compelled to understand.

      • xpct 2 hours ago
        I can confidently say that I know almost no one truly interested in understanding technology, except for strangers online.
      • slopinthebag a minute ago
        people will accuse you of "gatekeeping" because you shouldn't need to have any knowledge or skill to do stuff. those things are unimportant, even bad, because anything requiring those is inherently exclusionary. lmao.
      • globalnode 3 hours ago
        sounds like you're working at the wrong place. detailed computing knowledge and maths are essential in some industries and, like you said, scorned in others. i couldn't think of anything worse to do with my time than spend all day with MBAs or webdevs (lol i'm sorry, that's unfair, web development is complex with all the callbacks and sync issues).
    • jongjong 5 hours ago
      This has been obvious to me since I graduated with a BIT majoring in 'Software design.' I literally went to university with software design and software architecture being my core interests.

      When I graduated, I was shocked to learn that no company cared about any of the architectural concepts that I had learned. UML class diagrams, sequence diagrams, ER diagrams, etc. were on the way out. At one point, as internet companies were scaling up, there was a brief resurgence of interest in sequence diagrams... especially as a communication method when explaining complex bugs or complex message-passing scenarios. But it didn't really last. Nowadays most software is riddled with race conditions and deep, exploitable architectural flaws. Cryptocurrencies have been victims of many such attacks. Billions of dollars have been lost to race conditions... and that's just the ones that were discovered. They are notoriously difficult to find post-implementation.

      The programming primitives that we're using today aren't optimized to avoid race conditions or even to encourage good concurrency patterns; quite the opposite: they encourage convenient but disorganized parallelization, and they're optimized to put the focus on type safety, which is a far less concerning issue. A lot of people who were rightly alarmed by gaps in schema validation (which is critical at API boundaries) became overly obsessed with type safety (which is a broader concern). I built some async primitives for Node.js, and nobody cared! NOBODY! Other developers have had the same experience with most other languages. I think only a few niche languages like Elixir actually treated it as important. But nobody even acknowledged that the problem could be remedied in existing languages. It's so bad that it seems as though some people wanted it to be that way.

      The term 'concurrency safety' doesn't even exist! Some have a vague idea about thread safety; OK, but that's very specific to one particular concurrency primitive... what about the concurrency of asynchronous logic (much more common nowadays)? I have felt thoroughly suppressed in that regard throughout my career.
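
      To make that concrete, here is a minimal sketch (my own illustration, not one of the primitives mentioned above) of a lost-update race between two awaited read-modify-write sequences in Node.js/TypeScript, plus a tiny promise-chaining mutex that serializes them:

        // Two interleaved read-modify-write sequences on shared state,
        // with an await in between: a classic async race, no threads needed.
        let balance = 100;

        async function withdraw(amount: number): Promise<void> {
          const current = balance;                     // read
          await new Promise((r) => setTimeout(r, 10)); // e.g. a DB round trip
          balance = current - amount;                  // write from a stale read
        }

        // A minimal async mutex: each caller chains onto the previous
        // operation, so critical sections run strictly one at a time.
        class AsyncLock {
          private last: Promise<void> = Promise.resolve();
          run<T>(fn: () => Promise<T>): Promise<T> {
            const result = this.last.then(fn);
            this.last = result.then(() => undefined, () => undefined);
            return result;
          }
        }

        const lock = new AsyncLock();

        async function safeWithdraw(amount: number): Promise<void> {
          await lock.run(async () => {
            const current = balance;
            await new Promise((r) => setTimeout(r, 10));
            balance = current - amount;
          });
        }

        async function main() {
          await Promise.all([withdraw(10), withdraw(10)]);
          console.log(balance); // 90: the second write clobbered the first
          balance = 100;
          await Promise.all([safeWithdraw(10), safeWithdraw(10)]);
          console.log(balance); // 80: the critical sections were serialized
        }
        main();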

      The only voice on the subject of architecture that got through to the 'mainstream' was Martin Fowler (one of the authors of the Agile Manifesto). After that, there was Dan Abramov of Redux fame. Some notable opinionated architecture books were published, but none really identified the underlying essential philosophy of good architecture.

      The best, most succinct quote I ever read on the subject was from Alan Kay (inventor of OOP) who said "I'm sorry that I long ago coined the term 'objects' for this topic because it gets many people to focus on the lesser idea. The big idea is messaging."

      I like that quote for many reasons: firstly, because it shows wisdom; secondly, because it tells you what the issue is, very simply; and thirdly, because it hints at the importance of 'focus' in this discipline, where we are saturated with thousands of complex, overlapping, and partially conflicting ideas.

      I think the FP trend was somewhat of a red herring. Same with Type Safety. Yes, they were useful to some extent, there are some really good ideas in there, but people got so caught up in them that the most fundamental area of improvement was ignored entirely. To me, the core value proposition of FP can be reduced down to "pass by value is safer than pass by reference." Consider that in the context of Alan Kay's "The big idea is messaging." - Is an object reference a message? NO! A live instance is not a message! Precisely! His point supports pass-by-value, furthermore, it encourages succinct/minimal parameters.

      Good architecture is rooted in two core concepts: 1. loose coupling and 2. high cohesion, and you achieve both by separating logic and structure from messaging. The biggest mistake people make is passing around structure and logic as parameters to other logic. You should avoid moving logic and structure around at runtime; only messages should move between objects, and the simpler the messages, the better. Note that 'avoid' doesn't mean never, but it means you have to be extremely careful when you do violate this principle, and there should be a really good commercial reason to do so. I.e., you should exhaust other reasonable approaches first.
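
      As a rough sketch of that distinction (TypeScript, names purely illustrative): passing a live instance lets the callee reach into state and logic across the boundary, while passing a flat message does not.

        // Tightly coupled: the report module receives a live instance and can
        // mutate its state from far away ('spooky action at a distance').
        class Account {
          constructor(public owner: string, public balance: number) {}
          applyInterest(rate: number): void { this.balance *= 1 + rate; }
        }

        function renderReportCoupled(account: Account): string {
          account.applyInterest(0.05); // nothing stops this from here
          return `${account.owner}: ${account.balance}`;
        }

        // Loosely coupled: only a small, read-only message crosses the boundary.
        interface BalanceMessage {
          readonly owner: string;
          readonly balance: number;
        }

        function renderReport(msg: BalanceMessage): string {
          return `${msg.owner}: ${msg.balance}`; // nothing to mutate, nothing to break
        }

        const acct = new Account('alice', 100);
        console.log(renderReport({ owner: acct.owner, balance: acct.balance }));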

      • burakemir 2 hours ago
        Yeah, passing by value or "Value semantics" can prevent many programming errors. Passing references to immutable data can serve a similar purpose. In low-level languages where memory layout and calling convention map to target hardware, there are differences in performance to consider.

        Pass by value would indeed make a big difference to how programs are structured and make it easier to reason about programs.
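
        A tiny TypeScript sketch of the difference (illustrative; `structuredClone` stands in for true value semantics):

          // Shared reference: the callee's mutation leaks back to the caller.
          function addFallbackInPlace(cfg: { endpoints: string[] }): void {
            cfg.endpoints.push('fallback'); // the caller's object is silently changed
          }

          // Approximated value semantics: work on a deep copy and return it.
          function withFallback(cfg: { endpoints: string[] }): { endpoints: string[] } {
            const copy = structuredClone(cfg); // deep copy, caller untouched
            copy.endpoints.push('fallback');
            return copy;
          }

          const config = { endpoints: ['a', 'b'] };
          const extended = withFallback(config);
          console.log(config.endpoints);   // ['a', 'b']: original preserved
          console.log(extended.endpoints); // ['a', 'b', 'fallback']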

        I just want to point out that "concurrency safety" is very much a term, although "thread safety" is more common. These fall broadly under memory safety, which is a topic mainly due to security concerns but also one of academic study.

        The two perspectives are not perfectly congruent. Non-concurrency-safe languages like Go can also be considered broadly memory safe. The pragmatic rationale is that data races in GCed languages are much less exploitable. From an academic, principle-based view this is unsatisfying and unconvincing, as one would prefer safety to be a matter of semantics. See also https://www.ralfj.de/blog/2025/07/24/memory-safety.html

        Rust uses "fearless concurrency" as a slogan. Rust offers more options than passing by value (Copy) while still guaranteeing safety through static type checking.

        There is also research on GCed languages to establish non-interference, e.g. Scala's capture checking.

        Concurrency is recognized as difficult (at least by people who are knowledgeable), and programming language design usually involves pragmatic choices if you need concurrency. If the language does not provide the primitives or a spec that enables safety, then you are left with patterns and architecture.

        The science is still evolving; it is certainly not the case that nobody cares. Rather, progress is slow, and moving ideas from research to industry is even slower. How much value we ascribe to correctness, safety, and performance in industry depends very much on the context.

      • kelsier_hathsin 3 hours ago
        > only messages should move between objects

        Can you provide an example of this?

        • aryehof 34 minutes ago
          The Alan Kay viewpoint (he is NOT the inventor of OOP [1]) is considered the least helpful viewpoint on OO design: the "magical" and unhelpful "it's all about messages" perspective, which helps you not at all unless one is talking about the internal implementation of a platform like Smalltalk. Consider the views of the real inventors, Nygaard and Dahl.

          [1] I don't think I invented "Object-oriented" but more or less "noticed" what was really powerful about just making everything from complete computers communicating with non-command messages. This was all chronicled in the HOPL II chapter I wrote "The Early History of Smalltalk". — Alan Kay

        • burakemir 3 hours ago
          Say you have Car, Engine, and Dashboard objects.

          Let's not have the dashboard access the temperature by doing `GetSurroundingCar().engine.temperature`.

          If the dashboard needs to get the temperature from a sensor in the engine, it should be able to "talk" to the sensor without going through the car object.

          In ideal OOP, a method call `o.m(...)` is considered a message m being sent to o.

          In practice, field access, values, "data objects", etc. are useful. OOP purism isn't necessarily helpful if taken to the extreme.

          The pure OOP idea emphasizes that the structure of a program (how things are composed) should be based on interactions between "units of behavior".
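
          A short sketch of the two shapes (TypeScript, names mine):

            interface TemperatureSensor { read(): number; }

            class Engine { sensor: TemperatureSensor = { read: () => 90 }; }
            class Car { engine = new Engine(); }

            // Reaching through the graph: the dashboard now depends on Car,
            // Engine, and how they happen to be wired together.
            class CoupledDashboard {
              constructor(private car: Car) {}
              temperature(): number { return this.car.engine.sensor.read(); }
            }

            // Message-oriented: the dashboard is handed only the collaborator
            // it actually talks to.
            class Dashboard {
              constructor(private sensor: TemperatureSensor) {}
              temperature(): number { return this.sensor.read(); } // one message
            }

            const car = new Car();
            console.log(new Dashboard(car.engine.sensor).temperature()); // wiring stays at the edge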

        • jongjong 3 hours ago
          1. Avoid passing live instances (by reference) to other instances as much as possible, because you don't want instance references scattered too widely throughout your codebase. This can cause 'spooky action at a distance', where the instance state is modified by interactions in one part of the code and unexpectedly breaks a different module that holds a reference to that same instance elsewhere. The more broadly scattered the reference, the harder it is to figure out which part of the code is responsible for the unexpected state change. These bugs are often very difficult to track down because stack traces tend to be misleading: they don't point you to the event that led to the unexpected state change which later caused the bug.

          2. Avoid overly complex function parameters and return values. Stick to passing simple primitives: strings, numbers, flat objects with as few fields as necessary (by value, if possible). Otherwise, you increase the coupling of your module with dependent logic, which is often a sign of low cohesion. The relationship between cohesion and coupling tends to be inversely proportional. If you spend a lot of time thinking about the cohesion of your modules (i.e., give each module a distinct, well-defined, non-overlapping purpose), the loosely coupled function interfaces will tend to come to you naturally.

          The metaphor I sometimes use to explain this is:

          If you want to catch a taxi to go from point A to point B, do you bring a steering wheel and a jerry can of petrol with you to give to the taxi driver? No, you just give them a message: information about the pick-up location and the destination. This is an easy-to-understand example; the original scenario involves improper overlapping responsibilities between you and the taxi service, which add friction. Usually it's not so simple, the problem is not so familiar, and you really need to think it through.

          We understand intuitively why it's a bad idea in this case because we understand very well the goal of the customer, the power dynamics (the convenience of the customer has priority over that of the taxi driver), the time constraints (the customer may be in a hurry), and the compatibility constraints (a steering wheel and fuel will not suit all cars). When we don't understand a problem so well, an optimal solution can be difficult to come up with, and we usually miss it by a long shot.
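
          The metaphor, as a sketch in TypeScript (hypothetical names):

            // Passing structure and logic: the customer hands over car parts
            // and a refueling routine the driver must know how to use.
            interface SteeringWheel { turn(degrees: number): void; }

            interface BadRideRequest {
              steeringWheel: SteeringWheel;
              refuel: (litres: number) => void;
            }

            // Passing a message: just the information the service needs.
            interface RideRequest {
              readonly pickup: string;
              readonly destination: string;
            }

            function requestRide(req: RideRequest): string {
              return `Driving from ${req.pickup} to ${req.destination}`;
            }

            console.log(requestRide({ pickup: 'point A', destination: 'point B' }));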

      • globalnode 3 hours ago
        nice post. lately i've been dealing with concurrency, between threads and processes, trying to keep it cross-platform as well; it's a lot to learn. if you have large buffers and want to keep some semblance of performance, it's VERY interesting understanding all the transfer mechanisms and cache levels involved. i feel these are the sorts of things my education skipped, it was all very focused on the static structure of objects, not the dynamics of data transfer.
    • skydhash 5 hours ago
      > More and more places just want Jira tickets done fast instead of someone who's going to push back or question whether this is the best way to build something.

      That's one thing I never care to do unless I'm the one making the technical decisions. What I do is build the thing, but with defensive programming in place. I take care to make sure my code is good, then harden every interface so that I can demonstrate that I'm not the cause of new bugs. People will be careless, so make sure that you have blast doors between your work and theirs.
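
      A minimal sketch of that 'blast door' style (TypeScript; the order shape is hypothetical): validate everything that crosses into the module, so failures point at the careless caller rather than the code behind the door.

        interface Order { id: string; quantity: number; }

        // The blast door: reject malformed input at the boundary, loudly.
        function parseOrder(input: unknown): Order {
          if (typeof input !== 'object' || input === null) {
            throw new TypeError('order must be an object');
          }
          const o = input as Record<string, unknown>;
          if (typeof o.id !== 'string' || o.id.length === 0) {
            throw new TypeError('order.id must be a non-empty string');
          }
          if (typeof o.quantity !== 'number' || !Number.isInteger(o.quantity) || o.quantity <= 0) {
            throw new TypeError('order.quantity must be a positive integer');
          }
          return { id: o.id, quantity: o.quantity };
        }

        // Everything past this point can assume a well-formed Order.
        function shipOrder(order: Order): void {
          console.log(`shipping ${order.quantity} of ${order.id}`);
        }

        shipOrder(parseOrder({ id: 'sku-42', quantity: 3 }));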

      And I do take the time to learn about the abstractions of the new shiny tools, even when they're overengineered. Going in blind and making mistakes is not my cup of tea.

  • soopypoos 32 minutes ago

      I spoke a million words
      They didn't mean that much to me
      They rang around my head
      Like empty tuneless harmonies
      Love's great abstraction mine
  • oxag3n 5 hours ago
    "Any problem in computer science can be solved with another layer of indirection, except of course for the problem of too many layers of indirection." Bjarne Stroustrup

    That's why you see hundred-level call stacks and polymorphism with a single implementation, and still errors are swallowed or root causes hidden behind "exception caught".
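
    The "exception caught" complaint, in miniature (a TypeScript sketch with illustrative names):

      import { readFileSync } from 'node:fs';

      type Settings = Record<string, unknown>;

      // Swallowing the root cause: callers only ever learn 'exception caught'.
      function loadSettingsBad(path: string): Settings {
        try {
          return JSON.parse(readFileSync(path, 'utf8'));
        } catch {
          console.log('exception caught'); // which file? parse error or I/O?
          return {};
        }
      }

      // Preserving it: wrap, add context, and keep the original attached.
      function loadSettings(path: string): Settings {
        try {
          return JSON.parse(readFileSync(path, 'utf8'));
        } catch (e) {
          throw new Error(`failed to load settings from ${path}`, { cause: e });
        }
      }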

  • hamasho 5 hours ago

      "Duplication is far cheaper than the wrong abstraction." - Sandi Metz
  • shadowgovt 6 hours ago
    Oof. There are two pieces to this story. One is great and one is heartbreaking.

    The fact that modern tech has disintermediated people with problems to solve from the need for a "priest class" to commune with the machine to solve the problem is a great thing. It's the goal. The more we do it the better we are making the world for humans.

    ... the fact that people need to work to eat or provide anything above a subsistence quality of life is not only tragic, it's increasingly abhorrent in a world where automation and simplification via machines has freed up this much raw resource and free time.

    If we're pitting LLMs against people's ability to provide for their families, we have lost the thread on why we're doing any of this.

    • renticulous 38 minutes ago
      > this much raw resource and free time.

      Those resources are being redirected to create entertainment areas for the rich, like golf courses, 7-star luxury hotels, and villas. This is the modern predicament.

    • arkt8 6 hours ago
      Not the automation itself, but the way of it... we have come far since the domestication of agriculture and energy... but profit as the main director is worse than suboptimal; it is tragic. Having learned about many accidents in complex systems, it is madness to see things at this point in the most complex of systems: society.
      • hgyyy 5 hours ago
        To be fair, profit is what drives the survival of the firm.

        However, there are tasteful ways of pursuing it, and Google and Meta in particular are certainly not doing so.
