(I considered JAX, but the code in question was not amenable to a compute graph. Another option was to thread by fork, and use IPC.)
I liked the language itself more than expected. You get something like "generics" with tensors: suppose you pass a parameter N along with a tensor, and you want to declare the tensor's shape as (N, N). You can, because a parameter's type constraints can reference other parameters.
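A minimal sketch of what that looks like (the subroutine and names are mine, for illustration): the dummy array's declared bounds reference another dummy argument, so inside the routine the compiler knows `a` is n-by-n.

```fortran
program shape_params
  implicit none
  real :: m(3, 3)
  m = 0.
  call fill_identity(3, m)
  print *, m(2, 2)
contains
  ! Explicit-shape dummy array: the declared bounds (n, n) refer to the
  ! dummy argument n, so the array's "type" depends on another parameter.
  subroutine fill_identity(n, a)
    integer, intent(in) :: n
    real, intent(inout) :: a(n, n)
    integer :: i
    do i = 1, n
      a(i, i) = 1.
    end do
  end subroutine
end program
```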
Tensors and various operations on them are first-class types, so the compiler can optimise operations easily for the system you're building on. In my case, I got an 80% improvement with ifx over gfortran.
Invocation from Python was basically the same as for a C library. Both Python and Fortran have facilities for C interop, and Numpy can be asked to lay out tensors in a Fortran-compatible way.
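For the curious, the plumbing is roughly this (the routine and library names here are made up): a `bind(c)` interface on the Fortran side, `ctypes` and a column-major array on the Python side.

```fortran
! Sketch only -- build as a shared library, e.g.:
!   gfortran -shared -fPIC scale.f90 -o libscale.so
! Then from Python (assumed names, shown for illustration):
!   lib = ctypes.CDLL("./libscale.so")
!   a = numpy.asfortranarray(a)   # guarantee column-major layout
subroutine scale_matrix(n, a, factor) bind(c, name="scale_matrix")
  use iso_c_binding, only: c_int, c_double
  integer(c_int), value :: n
  real(c_double), intent(inout) :: a(n, n)
  real(c_double), value :: factor
  a = a * factor
end subroutine
```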
Part of what eased the port was that Numpy seems to be a kind of "Fortran wrapper": the ergonomics of tensor addressing, slicing and views are nearly identical.
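A few of the parallels side by side (my own toy example; indices shift by one because Fortran is 1-based, and the comments show the NumPy spelling):

```fortran
program slices
  implicit none
  real :: a(4, 4), col(4), blk(2, 2)
  integer :: i
  a = reshape([(real(i), i = 1, 16)], [4, 4])
  col = a(:, 2)        ! NumPy: col = a[:, 1]
  blk = a(2:3, 2:3)    ! NumPy: blk = a[1:3, 1:3]
  a(1, :) = 0.         ! NumPy: a[0, :] = 0
  print *, col
end program
```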
You can do that, and it might be cleaner and fewer lines of code that way.
But you don't necessarily need to pass the array dimensions as parameters, since you can call `size` or `shape` to query them inside your function.
program main
  implicit none
  real :: a(2, 2) = reshape([1., 2., 3., 4.], [2, 2])
  call print_array(a)
contains
  subroutine print_array(a)
    real, intent(in) :: a(:, :)
    integer :: n, m, i, j
    n = size(a, 1) ; m = size(a, 2)
    write(*, '("array dimensions:", 2i3)') [n, m]
    do i = 1, n
      do j = 1, m
        write(*, '(f6.1, 1x)', advance='no') a(i, j)
      end do
      print *
    end do
  end subroutine
end program
end program

At the time everyone seemed to default to using C instead. But Fortran is so much easier! It even has slicing notation for arrays, and the code looks so much like Numpy, as you say.
It also helps that Fortran compatibility is a must for pretty much anything that expects to use BLAS.
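That compatibility is visible whenever you call BLAS directly. A sketch, using the standard `dgemm` routine (link against any BLAS, e.g. `-lblas`):

```fortran
program blas_demo
  implicit none
  real(8) :: a(2, 2), b(2, 2), c(2, 2)
  external :: dgemm          ! BLAS: C := alpha*op(A)*op(B) + beta*C
  a = reshape([1d0, 0d0, 0d0, 1d0], [2, 2])   ! identity
  b = reshape([1d0, 2d0, 3d0, 4d0], [2, 2])
  c = 0d0
  ! transa, transb, m, n, k, alpha, A, lda, B, ldb, beta, C, ldc
  call dgemm('N', 'N', 2, 2, 2, 1d0, a, 2, b, 2, 0d0, c, 2)
  print *, c                 ! equals b, since a is the identity
end program
```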
I wouldn't call multi-dimensional arrays tensors, though. That's a bit of a bastardization of the term that seems to have been introduced by the ML crowd.
It wasn't until I started using Fortran that I realized how similar it is to BASIC, which must have been a poor man's Fortran.
If you didn't have vectors, Maxwell's equations would spill all over the place. Tensors, on the other hand, are used in places like continuum mechanics and general relativity, where something more than vectors is called for but you're living in the same space(/time) with the same symmetries.
It is also a bit funny that the author complains about older Fortran programs requiring SCREAMING_CASE, when if anything this is an improvement over previous and current practice. Too many Fortran codes have overly terse variable names, often just single characters or impenetrable abbreviations for obscure terms. I have had to create cheat sheets for individual programs just to figure out what each variable was.
Sun Microsystems had a great quote about this back in the day [1]:
> Consistently separating words by spaces became a general custom about the tenth century A.D., and lasted until about 1957, when FORTRAN 77 abandoned the practice.
[1] https://docs.oracle.com/cd/E19957-01/802-2998/802-2998.pdf
Style wise, many prefer to reserve functions for things that resemble mathematical functions (i.e. only intent(in) and pure). In some sense a little bit similar to how people tend to use lambdas in python.
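A toy example of that style (my own): a `pure` function with only `intent(in)` arguments, usable anywhere a mathematical function would be, while anything stateful goes in a subroutine.

```fortran
program pure_demo
  implicit none
  print *, hypot2(3., 4.)
contains
  ! "Mathematical" style: pure, intent(in) only, no side effects.
  pure function hypot2(x, y) result(r)
    real, intent(in) :: x, y
    real :: r
    r = sqrt(x*x + y*y)
  end function
end program
```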
Waitaminit, is that why we have "sub" in Visual Basic ?
The NUMERICAL RECIPES in Fortran 90 book[0]
The NUMERICAL RECIPES website[1]
0 - https://www.cambridge.org/us/universitypress/subjects/mathem...

* Nothing wrong with that as a reason, of course
The trickiest part of really learning Fortran today is that it is hard to define what the language is, apart from the practical definition imposed by what its seven or so surviving compilers accept and how they interpret it. There are near-universally portable features that are not part of the ISO standard; there are standard features that are not at all portable, or not implemented anywhere at all. So what one should know as "Fortran" is the reasonably portable intersection of features across multiple compilers, and there isn't a good practical book that will teach you that.
0 - https://docs.open-mpi.org/en/main/developers/bindings.html#f...
Fortran in Modern Scientific Computing: An Unexpected Comeback - https://medium.com/@stack1/fortran-in-modern-scientific-comp...
5 Reasons Why Fortran is Still Used - https://www.matecdev.com/posts/why-fortran-still-used.html
Is Fortran better than Python for teaching the basics of numerical linear algebra? - https://loiseaujc.github.io/posts/blog-title/fortran_vs_pyth...
I take back everything i said about FORTRAN - https://x.com/ThePrimeagen/status/1745542049284423973
Modern Fortran online tutorial - https://wvuhpc.github.io/Modern-Fortran/
I recently learned Fortran IV to build a backpropagated neural network for the IBM 1130 (1965) and was amazed to see it compile with no warning on both the punched card compiler from IBM and on a modern Fortran compiler (gfortran).
Some Fortran II conventions, like the arithmetic IF, are now deprecated, but -std=legacy is all it takes to make them compile.
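For anyone who hasn't met it, the arithmetic IF is a three-way branch on the sign of an expression. A fixed-form sketch (my own, in a `.f` file) that gfortran still accepts with `-std=legacy`:

```fortran
c     Three-way branch: jump to label 10/20/30 when n is
c     negative/zero/positive.
      program arithif
      n = -3
      if (n) 10, 20, 30
   10 print *, 'negative'
      go to 40
   20 print *, 'zero'
      go to 40
   30 print *, 'positive'
   40 end
```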
FLOW-MATIC's claim to fame was beating Fortran at releasing a working implementation (and having syntax that looked like English, but that's not something to be proud of). Plankalkül, however, has not yet been implemented so if we're only counting releases of working software, it isn't a contender.
Pretty much sums this one up. I can't say I agree that if/select/stop are "modern" features.
> Next time, we’ll talk more about...
Alas, there was no next time.
Edit: and just three paragraphs in, the author admits they didn't even bother using the oldest version of FORTRAN itself.
So, "oldest" means "rather early".