1200 years for superhuman programmers but two for human-level coders seems strange. Why is an arbitrary human ballpark such a big benchmark? Or, as they put it: a workload that takes x time must keep doubling every seven months for 1200 years. That's a strange leap, as it amounts to a factor of about 1.4·10^42, while there have only been about 4.3·10^17 seconds since the Big Bang.
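A quick sanity check of the scales involved (a sketch, not from the original source: the 13.8-billion-year age of the universe is standard cosmology, and "doubling every seven months for 1200 years" is read literally):

```python
import math

SECONDS_PER_YEAR = 365.25 * 24 * 3600
age_of_universe_s = 13.8e9 * SECONDS_PER_YEAR    # ~4.35e17 seconds, matching the 4.3e17 in the comment

# Doubling every 7 months, sustained for 1200 years:
doublings = 1200 * 12 / 7                        # ~2057 doublings
log10_factor = doublings * math.log10(2)         # growth factor of roughly 10^619

print(f"age of universe: {age_of_universe_s:.2e} s")
print(f"doublings: {doublings:.0f}, growth factor: ~10^{log10_factor:.0f}")

# For comparison, the factor quoted above, 1.4e42, corresponds to about
# 140 doublings (2**140 ~ 1.39e42), i.e. roughly 82 years at one doubling
# per seven months:
print(f"140 doublings: {2.0 ** 140:.2e}")
```

Either way the point stands: any of these factors dwarfs the number of seconds the universe has existed.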