Hopefully they get Mojo to a good place for more general ML, but at the moment it still feels quite limited - they've actually deprecated some of the nice builtins they had for Tensors etc... For now I'll stick with JAX and check in periodically, fingers crossed.
All the flaws I can think of in Kotlin are due to the Java compatibility. They could've made it work here by being more explicit but the way it currently works seems doomed.
Unless it's open sourced, it's a moot point, as most Python devs won't come anyway.
And it wasn't "equivalent python", whatever that means, they did loop unrolling and SIMD and stuff. That can't be done in pure Python at all, so there literally is no equivalent Python.
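To illustrate the gap the parent describes (this is a generic sketch, not the benchmark under discussion): pure Python has no way to express SIMD or unrolled loops, so the only route to speed is handing the work to compiled code, here the C-implemented `sum()` builtin.

```python
import timeit

xs = list(range(1_000_000))

def python_loop(values):
    # The interpreter processes one element per iteration; there is no
    # way to ask for vectorized or unrolled execution from pure Python.
    total = 0
    for v in values:
        total += v
    return total

# Both compute the same result, but sum() runs in compiled C code.
loop_time = timeit.timeit(lambda: python_loop(xs), number=5)
builtin_time = timeit.timeit(lambda: sum(xs), number=5)
```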
Chris Lattner talked more about the relationship between MLIR and Mojo than Python and Mojo.
That, and the not completely open source development model, is what has always felt very vaporware-y to me.
Python interop > Mojo natively interoperates with Python so you can eliminate performance bottlenecks in existing code without rewriting everything. You can start with one function, and scale up as needed to move performance-critical code into Mojo. Your Mojo code imports naturally into Python and packages together for distribution. Likewise, you can import libraries from the Python ecosystem into your Mojo code.
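The incremental-migration workflow that quote describes is commonly done with a fallback import on the Python side. A hedged sketch, where `my_mojo_kernels` is a hypothetical compiled Mojo module (not a real package):

```python
def dot_py(xs, ys):
    """Pure-Python baseline for the hot function."""
    return sum(x * y for x, y in zip(xs, ys))

try:
    # Hypothetical Mojo-built extension; name is illustrative only.
    from my_mojo_kernels import dot
except ImportError:
    # No compiled version available: fall back to the Python baseline,
    # so callers can migrate one function at a time.
    dot = dot_py
```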
That was the original claim, but it was quietly removed from the website. (Did they fall for the common “Python is a simple language” misconception?).
Now they promise I can “write like Python”, but don’t even support fundamentals like classes (which are part of stage 3 of the roadmap, but they’re still working on stage 1).
Maybe Mojo will achieve all its goals, but so far has been over-promising and under-delivering - it’s starting to remind me of the V language.
For me this was a big disappointment, and I wonder how much this has backfired with developers.
It might not have the niceties purists like, but perhaps that's exactly why it's a great language for that.
It's like executable pseudocode, and unlike other languages, all the ceremony is optional.
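A small sketch of what "optional ceremony" means in practice: the bare version reads like pseudocode, and annotations and docstrings can be layered on only where they earn their keep.

```python
# No ceremony: reads like pseudocode.
def mean(xs):
    return sum(xs) / len(xs)

# Opt-in ceremony: same function with annotations and a docstring.
def mean_annotated(xs: list[float]) -> float:
    """Arithmetic mean of a non-empty sequence."""
    return sum(xs) / len(xs)
```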
People flocked to it way before it became a "must" for ML and CS thanks to that ecosystem becoming dominant.
Even though it's not portable, it will likely see far greater usage than Mojo just by being heavily promoted by Nvidia, integrated into dev tools, and working alongside existing CUDA code.
Tile IR was more likely a response to the threat of Triton rather than Mojo, at least from the pov of how easy it is to write a decently performing LLM kernel.
Not to mention efforts like GraalPy and PyPy.
And all these efforts work today on Windows, which is quite relevant in companies where that is the device assigned to most employees, even if the servers run Linux distros.
I keep wondering if this isn't going to be another Swift for Tensorflow kind of outcome.
You always need to touch the hardware/platform APIs at some level, because even if the same code executes identically, the observed performance, or in the case of GPUs the numeric accuracy, has visible side effects.
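A concrete instance of that numeric-accuracy point, sketched in plain Python: the same mathematical sum, evaluated in two association orders. A sequential CPU loop and a GPU-style pairwise/tree reduction can legitimately disagree in floating point.

```python
vals = [1e16, 1.0, -1e16, 1.0]

# Left-to-right, like a sequential CPU loop.
sequential = ((vals[0] + vals[1]) + vals[2]) + vals[3]

# Pairwise, like a parallel tree reduction on a GPU.
tree = (vals[0] + vals[2]) + (vals[1] + vals[3])

# sequential == 1.0: the first 1.0 is absorbed by 1e16 before it cancels.
# tree == 2.0: the big terms cancel first, so both 1.0s survive.
```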
Can anyone of the AI enthusiasts here explain, why, or, what is meant by
> As a compiled, statically-typed language, it's also ideal for agentic programming.
> why, or, what is meant by

More errors caught at compile time means an agent can quickly check its work statically, without unit tests and other tests.
So, agents tend to do better the more feedback they can get. Type checking is pretty good for catching a bunch of dumb mistakes automatically.
The point is that more hints for the agent is better, most of the time.
Regarding compilation and static typing, it's extremely helpful to be able to detect issues at compile time when doing agentic programming. That way, you don't run into as many problems at runtime, which of course the agent has more difficulty addressing. Unit tests can help bridge the gap somewhat but not entirely.
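A minimal illustration of that feedback loop, using Python type hints as a stand-in for a statically-typed language: a checker such as mypy rejects the bad call below before anything runs, whereas without the static check the mistake only surfaces at runtime.

```python
def add_tax(price: float, tax: float) -> float:
    return price + tax

# A static checker flags this call without executing anything:
#   add_tax("9.99", 0.2)  # mypy: argument 1 has incompatible type "str"
# Running it instead raises TypeError only when the line executes,
# which is the slower, harder-to-localize feedback the agent would get.
```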
What's not stated on their website is that Mojo is likely a bad choice for agentic programming simply because there isn't much Mojo training data yet.
But yeah, getting a model to write Mojo 1.0 code, even with error feedback, might take a new training round, so the next or even next-next generation of models.
The Python cuTile JIT compiler allows writing CUDA kernels in straight Python.
AMD and Intel are following up with similar approaches.
Whether Mojo will still arrive in time to gain wider adoption remains to be seen.
From my experience, AI revolves a lot around building up function pipelines, computing their derivatives, and passing tons of data through them, which composability and higher-order functions from functional programming make a breeze to describe.
I also feel that fields other than AI are moving towards building up large functional pipelines to produce outputs, which would make Mojo suitable for those fields as well. I’m building in the CAD space, for example, and I’d love to use a “functional Mojo” language.
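The pipeline style described above can be sketched in a few lines of Python. This is a hedged illustration: `compose` and `numeric_grad` are ad-hoc helpers, with finite differences standing in for what JAX-style autodiff would provide.

```python
from functools import reduce

def compose(*fns):
    """Right-to-left composition: compose(f, g)(x) == f(g(x))."""
    return reduce(lambda f, g: lambda x: f(g(x)), fns)

def numeric_grad(f, eps=1e-6):
    """Central-difference derivative of a scalar function (autodiff stand-in)."""
    return lambda x: (f(x + eps) - f(x - eps)) / (2 * eps)

square = lambda x: x * x
double = lambda x: 2 * x

pipeline = compose(square, double)    # x -> (2x)^2 = 4x^2
d_pipeline = numeric_grad(pipeline)   # derivative: 8x
```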
One would want to see either a strong community build up around it, or really hard evidence for a long-term commitment to the language from Modular. And the latter will take a long time to be assured of I think.
Also, editing tools need to catch up before very wide adoption of a language with a lot of new syntax.
I bet that’s true for a great many people. There are too many wonderful FOSS languages to bother with one you can’t fix or adapt or share.
- The MLIR approach, which was also designed by Chris Lattner while at Google, has proven quite valuable for creating Python JIT DSLs
- The Python ecosystem now being taken seriously by the main GPU vendors, thanks to MLIR, as all their proprietary compilers are built on LLVM
- Others remember Swift for Tensorflow
I think that nowadays with vibe/agentic coding, high performance Python-like languages become ever more important. Directly using AI agents to code, say, C++, is painful as the verbose nature of the language often causes the context window to explode.
If Mojo succeeds, it could be the one language spanning across those levels, while simplifying heterogeneous hardware programming.
https://docs.modular.com/mojo/faq/#will-mojo-be-open-sourced
Now I will probably rewrite the model in Rust if I want to do anything with it (mostly for the WebAssembly target, as I want this thing to run in browsers), but I will for sure be using Julia for further experimentation. Lovely language.
But then I read this:
> AI native
> Mojo is built from the ground up to deliver the best performance on the diverse hardware that powers modern AI systems. As a compiled, statically-typed language, it's also ideal for agentic programming.
Well, no thank you. I know the irony here but I want nothing to do with a language made for robots.
Go on, give it a shot. It stops being intimidating soon! And remember that the uv we all love was heavily influenced by Cargo.
I remember Rust very fondly in fact. And I had the same experience as you, learning Rust made me a better JavaScript programmer. Let's see if a little neural network can be as fun.
That's a very big claim.