Apart from this little doubt, I love it:
The APIs and std lib are so extensive that we don't need huge amounts of third-party packages. TypeScript support is great: Deno can actually type-check TypeScript (not just run it by stripping types, as Node or Bun do). Compiling (bundling) is another great feature.
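For reference, this is roughly what the type-checking and compiling features look like from the CLI (file and output names here are just illustrative):

```
# Type-check a module without running it
deno check main.ts

# Bundle the script and its dependencies into a self-contained binary
deno compile --output server main.ts
./server
```

The compiled binary embeds the runtime, so it can be shipped to a machine that has no Deno installed.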
- Automatic connection pooling
- 400+ MB/s of throughput
- 20,000+ keys and larger values (10-50 KB)
- 1,000+ concurrent reads/writes
- 200-250 MB of RAM usage, max
All of this without breaking a sweat; the limitation was my KeyDB test server.
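For context, the 1,000+ concurrent reads/writes above can be driven with a simple bounded worker pool. This is just a sketch; `op` is a stand-in for a real KeyDB client call (a GET or SET via whatever Redis-protocol client you use), not anything from the original post:

```typescript
// Run `total` async operations with at most `limit` in flight at once.
async function runPool(
  total: number,
  limit: number,
  op: (i: number) => Promise<void>,
): Promise<void> {
  let next = 0;
  const worker = async () => {
    while (next < total) {
      const i = next++; // single-threaded JS: no race between check and increment
      await op(i);
    }
  };
  // Start `limit` workers; each pulls the next index until the range is drained.
  await Promise.all(Array.from({ length: limit }, worker));
}

// Usage sketch: swap the body for a real client call, e.g. client.set(`key:${i}`, value).
await runPool(1000, 100, async (_i) => {
  // await client.set(`key:${_i}`, value);
});
```

The pool keeps at most `limit` requests outstanding, which is usually what you want when hammering a single KeyDB/Redis server.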
(Issues: .mts vs .ts, configuration problems, and lots of conflicting documentation. It doesn't help that I'm mostly DevOps and don't spend every day writing Lambdas for AWS, etc.)
Edited to add: The installation page[1] talks about asdf[2], but I find mise-en-place[3] faster than asdf.
1: https://docs.deno.com/runtime/getting_started/installation/
Mise/asdf are pretty great. A .tool-versions file in your home directory sets up default versions, and you can override them with a .tool-versions in each repo. And because two different tools can read those version files, you can commit them to the repo and pin versions.
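For anyone who hasn't seen one, a .tool-versions file is just tool/version pairs, one per line; both mise and asdf read it (the versions below are illustrative):

```
deno 2.0.0
nodejs 20.17.0
```

Put it in your home directory for defaults, or in a repo root to pin versions for that project.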
I think I wouldn't use TypeScript without Deno now :)
It's probably more about people complaining they can't run Next or some other framework on Deno which directly impacts their business (Deno Deploy).
Anyway, that backwards compatibility is huge
Does "deno install" behave like pnpm in the way it handles node_modules efficiently?
Also, is there any summary of the Deno 2 presentation? 1 hour is a bit too much to take in. It's almost like a movie.
The site presents Deno 2 as if it has finally beaten Bun in terms of performance; is that the case? Either way, I downloaded Deno 2 via asdf and I'll give it a try. Looks exciting!
Bun's HTTP server performs 51% faster even before adding parallelism (i.e. at 10 concurrent connections). Their benchmark is incorrect; they posted a correction, and the correction is also incorrect. Benchmarking correctly is hard, and we put a lot of effort into making sure our benchmarks are easily reproducible.
Bun v1.1.30: 283,386 requests per second (51% faster)
Deno v2.0.0: 187,359 requests per second
Deno v1.45.5: 185,522 requests per second
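A quick aside on the arithmetic (my reading, not from the post itself): the "51% faster" figure is the throughput ratio minus one, while the "267%" figure quoted below for the 1,000-connection run reads as the raw ratio times 100 (i.e. 2.67x as fast):

```typescript
// "X% faster" = (a / b - 1) * 100, where a and b are requests/sec.
const pctFaster = (a: number, b: number): number =>
  Math.round((a / b - 1) * 100);

console.log(pctFaster(283_386, 187_359)); // 51 -> "51% faster"

// The 1,000-connection comparison as a raw ratio: ~2.67x as fast.
console.log((209_133 / 78_289).toFixed(2)); // "2.67"
```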
The following code:
let i = 0;
export default {
  async fetch(request: Request): Promise<Response> {
    return new Response(`Hello, world! ${i++}`);
  },
};
Run with: oha -c 10 -n 10000000 --disable-compression http://localhost:{port}
This was tested on a Linux x64 machine running Debian 11 with a 32-core Intel i9-13900 CPU and 64 GB of RAM. In this benchmark the HTTP server runs on a single thread, so the CPU core count is not as relevant, but still worth mentioning.

If we increase the number of concurrent connections from 10 to 1,000, Bun's HTTP server performs 267% faster.
Bun v1.1.30: 209,133 requests per second (267% faster)
Deno v2.0.0: 78,289 requests per second
Deno v1.45.5: 76,628 requests per second
Run with:
oha -c 1000 -n 10000000 --disable-compression http://localhost:{port}
Note that we add the --disable-compression flag for Deno's benefit here (it does nothing for Bun right now).

Have not yet spent enough time investigating their other benchmarks.
It's more like 20 mins.
There's a general intro of like 10 mins on Deno and then like 30 mins of a livestream with the Deno team.
I found the FAQ in the announcement well made too, with some good questions and answers.
I'm already having a great time with Deno v1 on my side project - thank you!