Sponsoring them was a no brainer for me :)
The difference (https://servo.org/sponsorship/#donation-fees) between a donation going via GitHub/Microsoft and via Open Collective (independent) is so small (~96% vs ~91%) that I'd rather not centralize FOSS funding with someone who has kind of a shitty track record with it, like Microsoft.
It made more sense in the early days of GitHub Sponsors, when Microsoft was matching donations 2x, or whatever it was. But now? I don't feel like it makes much sense anymore.
Open Collective is a fully public organization that lives and breathes FOSS.
Servo would only see 85.6% of my 5 USD/mo donation as I'm from Canada. If I used PayPal, that number would go down to 81.2%.
I do agree that I'd prefer Open Collective, fees being equal/comparable.
Every now and then you run into these small-ish expert consultancies that actually are the force behind a lot of open source.
They seem awesome.
The technical talent at Igalia runs deep.
They are self-directed contractors.
They are also responsible for huge portions of Chromium and fundamental/base open-source libraries.
If you can think of an open source library, it's highly likely Igalia has funded some development or bug fixes for it.
> Servo is a huge project. To keep it alive and making progress, we need continuous funding on a bigger scale than crowdfunding can generally accomplish. If you’re interested in contributing to the project or sponsoring the development of specific functionality, please contact us at join@servo.org or igalia.com/contact.
> Let’s hope we can walk this path together and keep working on Servo for many years ahead.
That said, in the longer term the solution to the WWW's Too Big To Fork problem surely has to involve getting much more of the "specification" expressed precisely in declarative specification languages, so as to greatly reduce the handwork involved in generating a half-decent implementation.
Once a big donation is given, you get to wonder what sort of influence that person (willingly or not) has had on the project. A much better model is a large number of small donations; the incentive then becomes serving the maximum number of those people.
Servo has no "customers" as such. It has potential future project users and there may be a support/development economic return for those users to fund further work.
It's very similar to Rust the language. Rust itself is not a startup or a company product.
The economics are completely different.
Should probably have a "(2024)" appended to the title.
I hope Servo will eventually replace Chromium in QtWebEngine and other similar cases.
It might be possible for Servo to go down the same route as Blitz and have pluggable rendering backends. If so, then the wgpu-based rendering library we are using (Vello [0] - which is an exciting project in its own right) could be an option. Servo is actively looking at potentially using this library to implement Canvas2D.
At the same time these projects are so promising (to me -- it may be purely subjective).
By these projects I mean:
Servo and Verso
Redox OS
System76's COSMIC Desktop's EPOCH
RipGrep
Deno
Zig
tree-sitter
And lots of web dev libs and frameworks: Actix, Leptos, Dioxus...
Currently a web dev stack can run on Redox OS and use significantly fewer resources than Alpine! (and this stack has not even had the years of tuning Alpine had)
Rewriting things in Rust is a reasonable thing to do. I think the hate is for people who criticize existing software for being written in C on the grounds that hypothetically someone could rewrite them in Rust.
"I rewrote SQLite in Rust" would be praiseworthy (assuming it's true). "Why don't you rewrite SQLite in Rust?" is trolling.
Also HN: "Mozilla should spend more than a decade and tens of millions of dollars on a brand new browser engine that has no hope of replacing Gecko before it reaches 100% compatibility with a spec thousands (tens of thousands?) of lines long, not to mention the kind of "quirks" you see with websites in the wild, while they already lag behind Google with the browser engine they already have."
People like cool R&D projects, and that's understandable - I like Servo too. But the fact that it was really cool doesn't compensate for the fact that it was not going to be production-ready any time soon and in that light it's understandable why it was cancelled. While some parts of Servo ended up being so successful that they were merged into Firefox, a lot of what remained only in Servo (and not in Firefox) was nowhere close.
The layout component was by far the least mature of any part of Servo at the time (unlike Stylo and WebRender, I mean) and in fact it was going through the early stages of a brand-new rewrite of that component at the time the project was cancelled, partly because the experimental architecture ended up not being very suitable.
When Servo was still managed by Mozilla, they were able to merge some components incrementally into Firefox. Most famously, Stylo and WebRender were first developed in Servo. They could have kept Servo for experimentation and merged parts incrementally.
It may also have enabled better embedding support, which is a weak point of Firefox compared to Chrome, and a long-term way to remain relevant.
The CSS engine and rendering engine are a lot easier to swap out than the remaining parts.
Again, I get why people like Servo, but "in 10 years, maybe we'll be able to take on Electron" isn't that great of a value proposition for a huge R&D project by a company already struggling to remain relevant with their core projects.
Perhaps not, but "in 10 years, we'll have a browser that's significantly faster and safer than the competition" is how you plan to still be relevant 10 years from now.
Their primary leverage is unique features and functional adblockers, neither of which is impacted by the layout engine.
And again, you're taking away resources from something that is already behind right now. The canonical example of massive long-term rewrites being a bad idea for the business is literally the precursor to Firefox. Gecko can be refactored in-place, including into Rust if they decided to do so.
Yes, unique features like being written in a memory safe language and depending on memory safe implementations of image and video decode libraries are exactly what I care about in an all-knowing sandbox which touches network services and runs untrusted code on my computer.
> And again, you're taking away resources from something that is already behind right now.
Disagree. You're talking about every Mozilla project that's not Servo. Firefox/Servo development is Mozilla's core competency. One which they've abandoned.
What does that have to do with Servo? Firefox has already been doing those things and continues to do them [0], they don't need to do them in Servo first.
We are specifically talking about the utility of rewriting a layout engine from scratch, rather than putting more resources into evolving Gecko - including rewriting small parts of Gecko in Rust incrementally.
>Disagree. You're talking about every Mozilla project that's not Servo. Firefox/Servo development is Mozilla's core competency. One which they've abandoned.
They obviously haven't abandoned it. It's not like they cancelled Gecko development too and are rebasing on top of Blink. Again, this is all just a philosophical debate over whether rewrites or refactors are more effective when it comes to the most core component of the browser.
[0] https://github.com/mozilla/standards-positions/pull/1064
Do you see those red and orange and green pie slices? 40% of the code. There be memory errors. Approximately 70% of all errors in that code will be memory-safety related and exploitable.
Fixing it looks like developing Servo.
Don't want to take my word for it? How about the US Department of Defense: https://media.defense.gov/2022/Nov/10/2003112742/-1/-1/0/CSI...
I'm not responding further until you actually read and understand what I'm saying instead of flailing at a strawman.
A rewrite is the only way to convert the codebase to Rust or any other memory safe language. Whether that happens in parallel, piecemeal, or both at the same time is down to how well you use a version control system and structure your code, as has already been shown by sharing Servo code with Firefox.
A full rewrite is particularly useful with Rust, as the language wants you to structure your code differently than most C/C++ is structured. Doesn't make sense not to have one going if that's the plan. If you're going to rewrite the whole thing anyway, might as well do it in an idiomatic way.
But just to offer another point, I also still run into memory leaks and other performance issues in long-lived Firefox processes which, based on my experience with Rust, would be unlikely to be a problem in a functional Servo. It'd be nice to have a browser I don't have to occasionally kill and restart just to keep Youtube working.
> based on my experience with Rust
This suggests that you haven't encountered reference cycles at your level of experience:
https://doc.rust-lang.org/book/ch15-06-reference-cycles.html...
Their wording was "would be unlikely", rather than "don't happen". The affine(ish) type system, along with lifetimes, makes it so most objects have a single owner, and that owner always knows it is safe to deallocate.
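The leak the linked chapter describes is easy to reproduce once you opt out of single ownership with `Rc`. A minimal sketch (the `Node` type here is illustrative, not from any real codebase): two nodes keep each other alive, so their strong counts never reach zero and both are leaked, all in safe code.

```rust
use std::cell::RefCell;
use std::rc::Rc;

// A node whose `next` link shares ownership via Rc.
struct Node {
    next: RefCell<Option<Rc<Node>>>,
}

// Build a two-node cycle a -> b -> a and return the strong counts.
fn cycle_counts() -> (usize, usize) {
    let a = Rc::new(Node { next: RefCell::new(None) });
    let b = Rc::new(Node { next: RefCell::new(Some(Rc::clone(&a))) });
    *a.next.borrow_mut() = Some(Rc::clone(&b)); // closes the cycle
    // Each node is now kept alive by one local Rc plus one link inside the
    // cycle, so neither count can drop to zero when `a` and `b` go out of
    // scope: both nodes are leaked.
    (Rc::strong_count(&a), Rc::strong_count(&b))
}

fn main() {
    assert_eq!(cycle_counts(), (2, 2));
}
```

Breaking such a cycle is what `std::rc::Weak` is for: downgrade one direction of the link so it no longer contributes to the strong count.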
> for some reason seems to define memory leaks as safe behavior
The reason is that Rust aims to prevent undefined behavior, and that is the only thing it defines as unsafe behavior.
Memory leaks cannot cause a program to, for example, start serving credit cards or personal information to an attacker. Their behavior is well defined (maybe over-complicated thanks to Linux's overcommit, but still well defined).
Rust does not protect against DoS attacks in any way. In fact it seems to enjoy DoSing itself quite a lot given how many things panic in the standard library.
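For what it's worth, most of those panicking std operations have non-panicking counterparts that return `Option` instead; a small sketch of the pattern:

```rust
// Checked lookup: returns None instead of panicking on a bad index.
fn safe_get(v: &[i32], i: usize) -> Option<i32> {
    v.get(i).copied()
}

fn main() {
    let v = vec![10, 20, 30];
    // `v[99]` would panic with "index out of bounds"; `get` does not:
    assert_eq!(safe_get(&v, 99), None);
    assert_eq!(safe_get(&v, 1), Some(20));
    // Same pattern for arithmetic: checked_add returns None on overflow
    // instead of panicking (debug builds) or wrapping (release builds).
    assert_eq!(i32::MAX.checked_add(1), None);
}
```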
"Rust’s memory safety guarantees make it difficult, but not impossible, to accidentally create memory that is never cleaned up (known as a memory leak)."
My sentiment exactly. Rust makes it difficult to accidentally create memory leaks. If you try hard to do it, it's definitely possible. But it's tremendously more difficult to accomplish than in C/C++ where it accidentally happens all the time.
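"Trying hard" mostly means reaching for the explicit leak APIs in std, which are deliberately safe since a leak is defined behavior; a quick sketch:

```rust
// Leaking on purpose is a safe, explicit API in Rust.
fn leak_a_string() -> &'static str {
    // Box::leak hands back a 'static reference; the allocation is never freed.
    Box::leak(Box::new(String::from("never freed")))
}

fn main() {
    assert_eq!(leak_a_string(), "never freed");

    // std::mem::forget also skips a value's destructor, in safe code:
    let s = String::from("also never freed");
    std::mem::forget(s);
}
```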
This makes it so that things that appear dangerous in C++ are safe in Rust, so for example instead of defensively allocating std::string to store strings (because who knows what might happen with the original string), Rust can just keep using the equivalent of std::string_view until it becomes obvious that it's no longer possible.
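As a sketch of that last point (function and strings are made up for illustration): a borrowed `&str` is roughly the equivalent of `std::string_view`, and the borrow checker is what makes keeping it around safe rather than risky.

```rust
// Returns a borrowed view into `s` (like C++'s string_view), no allocation.
fn first_word(s: &str) -> &str {
    s.split_whitespace().next().unwrap_or("")
}

fn main() {
    let owned = String::from("hello world");
    let word = first_word(&owned); // borrows, does not copy
    assert_eq!(word, "hello");
    // The borrow checker rejects any code that would free `owned` while
    // `word` is still live: adding `drop(owned);` between the two lines
    // above would be a compile error, not a dangling view.
}
```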
Things You Should Never Do, Part I [0]
[0]: https://www.joelonsoftware.com/2000/04/06/things-you-should-...
Certainly not on performance. On safety you have a chance because bugs happen.
They should just keep launching bookmarking and vpn services that might make money RIGHT NOW.
I'm trying Firefox on Android at the moment and it's noticeably less snappy than Chrome. I wonder if Servo would have changed that.
To be clear, I am not trying to claim you are wrong! It is the common wisdom that Chrome is faster on Android. But I swear, scrolling and page loading just feel faster on Firefox. My only guess is the adblocker, but I think Firefox is faster than Brave, so who knows.
I wonder if others have a similar experience and can shed some light on the situation?
Firefox is probably faster for ad-heavy sites, but it definitely isn't for sites without obtrusive ads.
But on the occasional times I launch it (Chrome), it just feels like it bogs down more often doing basic things. Someone once suggested that, paradoxically, it is slower because I don't use it very often. Something to do with ART and how the AOT compilation works on Android.
Mozilla decided that replacing Gecko as-is was not reasonably likely to actually happen, and that further efforts towards Servo would be better made by continuing to evolve Gecko.
Building these kinds of apps was commonplace in the 90s/early 2000s: photo editing apps, word processors, IDEs, 3D modeling software etc. Maybe RDBMS count as well.
In practice Rust is mostly used by web people to gain clout - rewriting microservices, which are usually under 10k and very rarely above 100k LOC, and were originally written in a very slow language, such as Python or Ruby.
Had these projects started out in an uncool, but performant language, like Java, there'd have been very little reasonable justification for these Rust rewrites.
On the performance side, raw processing power is not that important as you are waiting on I/O all the time: the main difference is not having GC and GC spikes (if you are working at significant scale) and lower memory usage all around.