7 points by astlouis44 5 hours ago | 2 comments
  • fwlr 2 hours ago
    From the post:

    “Think about what this means … the original SimCity ran on a Commodore 64. An empty Chrome tab takes more memory than that entire machine had. We’re not constrained by hardware anymore. We’re not even constrained by understanding what the code does … codebases will 10-100x in size because AI … endless bugs … the question is whether you’re building with it or explaining why you’re not.”

    Looking through the eyes of an AI champion, I see a world where the first execution of any given idea, the first product to hit the market for any given need, is guaranteed to be AI-generated - with the “10-100x size” codebase, the corresponding (and often superlinear) decrease in performance, and the attendant “endless bugs”.

  • chrisjj 5 hours ago
    I've read the article and found nothing to substantiate "without reading the code".

    But then, I suspect the article is AI slop. Take this:

    > Christopher Ehrlich just did something that would have taken a team of engineers months. He pointed OpenAI’s 5.3-codex at the entire SimCity (1989) C codebase and let it run.

    No, that wouldn't have taken engineers months.

    > Four days later: the game works in the browser.

    So someone used a (very slow) program to translate a program.

    > No code reading.

    What?? Osmosis, then?

    • skysanctuary 2 hours ago
      It's a bad title.

      I think he meant Christopher didn't read any of the original code himself. The AI certainly ingested it.

      Though, there is this part:

      "Ehrlich wrote a bridge that could call the original C code, then ran property-based tests asserting his TypeScript port performed identically."

      So, he must have had some kind of awareness of how the code worked.
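      As a rough illustration of that quoted setup (not Ehrlich's actual harness), a property-based comparison between a TypeScript port and a bridge to the original C could look something like the sketch below, using fast-check. The `loadReferenceSim` bridge and `portedSim` names are hypothetical stand-ins, as is the assumption that both engines expose the same pure step function over a tile buffer and a treasury value.

        // Sketch only: compare a TypeScript port against a reference
        // implementation bridged from the original C (e.g. a WASM build).
        import fc from "fast-check";
        import assert from "node:assert/strict";

        interface SimEngine {
          step(tiles: Uint8Array, funds: number): { tiles: Uint8Array; funds: number };
        }

        declare function loadReferenceSim(): Promise<SimEngine>; // bridge into the original C
        declare const portedSim: SimEngine;                      // the TypeScript port

        async function checkPortAgainstReference(): Promise<void> {
          const reference = await loadReferenceSim();

          fc.assert(
            fc.property(
              fc.uint8Array({ minLength: 256, maxLength: 256 }), // random slice of map state
              fc.integer({ min: 0, max: 1_000_000 }),            // random treasury value
              (tiles, funds) => {
                // Both engines get identical copies of the generated input...
                const expected = reference.step(tiles.slice(), funds);
                const actual = portedSim.step(tiles.slice(), funds);
                // ...and must produce identical output.
                assert.deepEqual(actual.tiles, expected.tiles);
                assert.equal(actual.funds, expected.funds);
              }
            )
          );
        }

      The point of a pattern like this is that agreement is checked over generated inputs (with failures shrunk to a minimal counterexample), so the person doing the port reasons about whether the two engines behave the same rather than reading the C line by line.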

    • astlouis44 4 hours ago
      The article was written by the CEO of Y Combinator, funnily enough.