Having said all this: nobody should be using crypto/fips140 unless they know specifically why they're doing that. Even in its 140-3 incarnation, FIPS 140 is mostly a genuflection to FedGov idiosyncrasies.
https://go.dev/doc/security/fips140
Yup.
What should folks use then?
FIPS 140-3 mode is enabled by building with GOFIPS140=v1.0.0 (or similar, see https://go.dev/doc/security/fips140), but it shares 99% of the code with non-FIPS mode.
Still, your message is right, just GOFIPS140=off (the default!), not GOFIPS140=v1.0.0.
Speaking as a sysadmin for a local government roped into FIPS requirements by way of FBI CJIS compliance I can safely say your assumption of competence is incorrect.
My personal experience disagrees.
What does "Soup-to-nuts" require in this context? Should I expect that Go has for some reason re-implemented the x86-64 machine code to do ChaCha20, or do you still consider it a "Soup-to-nuts" implementation if they reuse the same machine code you'd find in everybody else's implementation, so long as they wrote their own wrapper code?
Unlike, say, compression, there's no real room in these core features to innovate. You can't pick different "nothing up my sleeve" numbers, you can't offer more parametrisation; anything like that would be an incompatibility. It's intentionally a standardized component.
Besides integrating properly with Go applications, this lets us optimize for readability and correctness, with IMHO excellent empirical results.
https://words.filippo.io/a-literate-go-implementation-of-pol...
Nah, I'm pretty sure there's exactly as much (assembly) as I'd think because I spent some time prodding it before writing the question in the first place.
You likely understand what it does much better than I do, but I'm under no illusion that there's a lot of this code, nor indeed that it's problematic to use this code.
However I'm not at all convinced that porting "hand written assembler to higher level generators" is a good strategy for this specific code. It serves to launder the creativity, so that maybe some day one lawyer can convince another lawyer that A and B are distinct works despite every actual process along the convoluted path from A to B being mechanical... but it doesn't seem like it gets us closer to having a sustainable alternative.
And that's what I'm actually interested in here: I think we want a special-purpose language focused on emitting guaranteed constant-time machine code for an algorithm. That actually gets to the same readability and correctness goals you state, but rather than hoping to get away with writing Go (or any general-purpose language), we accept that it's never going to make sense for a general-purpose language to prioritise this. IMO we're more or less guaranteeing that one day we'll accidentally ship an oracle.
Porting the assembly to higher-level generators has nothing to do with lawyers (??), the goals are stated in https://go.dev/wiki/AssemblyPolicy.
The idea that one day we'll write All Of The Cryptography Code Once And For All In The Perfect Language and reuse that across languages comes up pretty regularly, and has never panned out.
Still, now that we're here:
It's foolish to wait for a "perfect language", but it's not foolish to look for a way to improve the status quo, and I don't buy that just because it "never panned out" previously, it never will.
Which actually gets me back to that DIV instruction. What ensured Go couldn't have this problem? Just luck? The IRA's warning to Thatcher seems very applicable.
We avoided the DIV by deliberately not using a modulus operation and doing Barrett reduction instead.
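For readers unfamiliar with the trick: Barrett reduction replaces the division with a multiply by a precomputed reciprocal plus a constant-time correction, so no data-dependent DIV instruction is ever emitted. A minimal sketch in Go (not the actual standard library code; the function name and constants are mine):

```go
package main

import (
	"fmt"
	"math/bits"
)

// barrettReduce computes x mod m using a multiply-high and a
// constant-time conditional subtraction instead of a DIV instruction.
// mu is the precomputed reciprocal floor((2^64 - 1) / m).
func barrettReduce(x, m, mu uint64) uint64 {
	q, _ := bits.Mul64(x, mu) // q = floor(x*mu / 2^64) ≈ floor(x/m), never too large
	r := x - q*m              // q undershoots by at most 1, so 0 <= r < 2m
	// Fold r into [0, m) without branching on the (potentially secret) value:
	// subtract m, then select the pre-subtraction value if it borrowed.
	t, borrow := bits.Sub64(r, m, 0)
	mask := -borrow // all ones if r < m, zero otherwise
	return (r & mask) | (t &^ mask)
}

func main() {
	const m = 1000003 // arbitrary odd modulus for illustration
	mu := ^uint64(0) / m
	x := uint64(123456789012345)
	fmt.Println(barrettReduce(x, m, mu) == x%m)
}
```

The division that computes mu happens once, on a public constant, so the per-operation work is just multiplies, subtractions, and a mask select.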
This moves the needle for Go but you still need to cut a FIPS version of your software since this crypto is still statically linked. I like this option quite a bit if the Government customers get on board with it.
There are some Go forks maintained by Microsoft and, I believe, Red Hat that do dynamic linking for crypto, which requires cgo.
To clarify, you could previously link to an external SSL library (typically something like BoringSSL) and people did do that. However, it makes cross compilation a pain compared to native Go.
The money could probably be more wisely spent if not following FIPS but without FIPS the average company wouldn't direct that money towards security at all.
But yeah, complying with FIPS doesn't necessarily mean you are secure, and it is definitely possible to be secure without being FIPS compliant.
It is of no security value.
And the average company needs to be dragged kicking and screaming to care about security at all.
That has nothing to do with FIPS 140.
FIPS 140 is just requirements for "cryptographic modules".
It specifies which algorithms are allowed and requires that you use modules that have been "validated" by going through an expensive and slow auditing process.
While I don't think it is completely useless to have those requirements, it has some problems, such as:
- it takes a very long time for anything to get validated. For example, Ubuntu 22.04 only recently got its crypto packages validated after being "in process" for years.
- bug fixes have to go through the validation process too, so if a vulnerability is found, you can be left vulnerable for a while
- For many languages and runtimes, using FIPS-certified modules is a royal pain. For example, for several versions of Node, there was no good way to run it in a FIPS-compatible way, because it required a newer version of OpenSSL than the latest FIPS-certified one. AWS Lambdas, even in GovCloud, don't include FIPS-certified crypto, so you have to bundle it in your package and make sure to use your local library instead of the system library, which can be quite difficult depending on the language. Prior to this change in Go, using FIPS in Go required using cgo to link to a FIPS-certified C library, and making sure you either did some fancy link magic to get any libraries you used to also use it, or didn't use any libraries that use the standard crypto library.
- It doesn't include many algorithms that are widely used and generally considered secure, including Ed25519, ChaCha20-Poly1305, Argon2 (along with scrypt, bcrypt, etc.), and so on. This can cause compatibility problems with other systems.
The consequences of encrypting wrongly quite possibly are worse than if you never encrypted at all.
- "Don't use unproven cryptography" is a reasonable policy.
- Policymaking can be subverted by bad actors.
FIPS validation addresses the compliance problem of needing validation. Beyond that, the benefits are ambiguous at best.
The most useful thing about FIPS 140 is that it’s a great way of quickly identifying clueless security people.
The way you get quality products is by requiring and validating quality, not by picking a lowest common denominator and then pinning it in place for a decade with paperwork and calcified standards bodies.
This also now makes Go a very convenient language to write US Gov software in.
If you have never heard of FIPS before ignore this entirely and continue to live in happiness.
This is something I wondered. Will we now see even more adoption of Go? Will Go be a more attractive option compared to language X or Y?
The system libraries approach used by Microsoft Go is cgo based IIUC, and I think derived from Go+BoringCrypto. I understand they are working on migrating their bindings to fit better downstream of the new native mode.
It's based on OpenSSL.
It depends, but if you are targeting Security Level 1 (which is what most folks think about when they think about FIPS 140) you generally don't need your entire application to be validated, only the cryptographic module.
So (again, depending on your requirements and on the Operating Environment you deploy to and on what algorithms you use and how) setting GOFIPS140 might actually be all you need to do.
LOL. At least they don't bury the lede.