cURL is not alone.
“I hear similar witness reports from fellow maintainers in many other Open Source projects,” Stenberg writes on LinkedIn.
Several of those colleagues back him up in the discussion thread — among them the maintainers of glibc, Vim, and Node.js.
“I'd say it is primarily because the tooling has improved. HackerOne did basically nothing new that could explain this (plus, this is mirrored in countless other projects, many of them not on hackerone). This is a notable change in the incoming reports,” Stenberg writes.
HackerOne is the platform cURL uses to receive bug reports.
There is an unexpected downside to being flooded with good bug reports, though — there are simply too many to handle in time.
The challenge used to be filtering out noise; now it is keeping pace with reports that actually matter. That is how Steve M. Hernandez, a code security specialist, puts it in the same LinkedIn thread.
“High quality reports at higher frequency still require the triage capacity and decision consistency to keep up. The bar is moving from filtering noise to keeping pace with real signal.”
There is also something very unsettling about how easy finding vulnerabilities has apparently become. The exact same flaw can be reported several days running. Willy Tarreau, who maintains the load balancing project HAProxy, saw it coming.
“We're all progressively killing embargoes as well, they're pointless for vulnerabilities found by widely available tools, it's just trying to hide something that can be published again the next day,” he writes.
But as Stenberg puts it elsewhere: “The AI tools are better at finding problems than they are at fixing them or writing code...”
There is also a consensus that humans need to be involved to evaluate reports and code — to filter out AI slop. And there are discussions on a more philosophical level, such as: “Sure, this is a vulnerability, but it would more properly be on the user to guard against it.”