My father points out that the monopoly deal with the US govt turned Bell Labs into the profit center. AT&T charged its customers a fixed percentage of its costs … from 3 years prior. The only way to make more profit was to make today’s costs cheaper than those of 3 years ago. This led to: innovate, innovate, innovate.
Many companies admire research. For AT&T at that time, it was their lifeblood.
No, this still allows the standard method of increasing profits under a cost-plus structure: all it takes is increasing your costs. The more you spend, the better off you are. Doesn't matter what you get for your money.
With a three-year lag, you need to be able to pay your operating costs out of savings over a three-year window, but that's it. It's purely a cash-flow restriction.
By contrast, lowering your costs nets you a one-time cash bonus this year, at the cost of making you permanently poorer. This is not a sensible trade to make.
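The arithmetic above can be made concrete with a toy model. This is purely illustrative, not actual AT&T rate-setting: the 10% markup and the cost figures are invented, and the only thing carried over from the thread is the structure (revenue = fixed markup on costs from 3 years earlier).

```python
# Toy model of cost-plus pricing with a 3-year lag (invented numbers).
# Revenue in year t is a fixed markup on costs from year t-3, so cutting
# costs pays off for three years, then the lower cost base flows into the
# rate base and profit is permanently smaller.
MARKUP = 1.10  # hypothetical 10% allowed return on costs

def profits(costs):
    """Yearly profit = MARKUP * cost from 3 years ago - this year's cost."""
    return [MARKUP * costs[t - 3] - costs[t] for t in range(3, len(costs))]

steady = [100] * 10             # keep costs flat: ~10/year forever
cut    = [100] * 4 + [90] * 6   # cut costs 10% starting in year 4

print(profits(steady))  # flat markup profit every year
print(profits(cut))     # a 3-year windfall while the old (high) costs
                        # still set revenue, then permanently lower profit
```

Under these made-up numbers the cost cut yields ~20/year for three years instead of ~10, then drops to ~9/year forever, which is the "one-time bonus, permanently poorer" trade described above.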
By comparison, a construction company cannot stop buying materials in order to continue generating revenue, and they cannot reduce their workforce significantly and continue taking on the same types of jobs.
In the instance of Bell Labs, those researchers were the 'unnecessary to business' route through which to increase revenue, both by increasing costs and by building a technology-based moat. Since those who control a company typically profit only by 'skimming the cream', increasing revenue without increasing profit seems like an excellent way to increase the value of the company to those interested in doing so.
Richard Hamming worked on the Manhattan Project, and John Tukey developed fast Fourier transforms (FFT) in service of the US nuclear program. John von Neumann shows up as a hidden presence, working with Tukey when Tukey coined the term "bit", and informing Claude Shannon that the mathematical quantity he was working on was a special case of the concept of entropy.
There are periods of history like this where major breakthroughs (quantum mechanics and also what we now call computer science) enable new technologies (like physically realizable Turing machines) and those technologies are suddenly in high demand (because of WWII and then the threat of nuclear war). Suddenly there are lots of new ideas, insane amounts of money, and the survival of humanity requires getting things done.
The other interesting thing to me is that Bell had experience with communication at scale due to their telephone networks. And a lot of early computer hardware and software concepts echo designs from the telegraphy days. Think of things like UART and the progression from telegraphy to teleprinters to keyboards. Concepts like Clos Networks from the 1930s are still used in modern data centers today. So having all these large communication problems at scale fed into what we now think of as computer science but really predated computers.
So you may not get the same mileage asking the right questions as Bell Labs did. The environment they were working in was pretty remarkable. We may be getting there eventually with AI, and perhaps climate change will be a pressing enough problem the way nuclear war was.
Yes. The heyday of big corporate research labs was roughly from WWII until the mid-1980s. There was a big open space for pushing things that seemed possible all the way through to commercial products. From synthetic rubber to transistors, the basics had been done, but there was plenty of development ahead: lots of things that were possible but hadn't been done yet. Sometimes it was too early, and only a non-cost-effective demo resulted. Sometimes there was a big payoff. Overall, corporate R&D had a positive payoff in that era.
I was lucky enough to visit or deal with many of the big corporate labs of that era. They were quite varied in outlook. The Bell Labs people were telco guys; their approach to networking fit the Bell System centralized model. The PARC people got personal computing and local area networking working, but somehow were obsessed with discrete-event simulation as an application. They didn't foresee how it would all be used. The SRI International people were too into abstraction, and kind of hand-wavey. I once tried using string substitution to remove "virtual" and "abstract" from one of their documents, and it got better. The IBM guys had the coolest location, a glass box on a hill overlooking a park. IBM happened to exit the disk drive business the day I visited. That place had invented the disk drive. Their day was over. The Ford Scientific Research Lab people were theoreticians who were also car guys. The back side of the building had a long, long row of garage spaces with tools and lifts, and everybody had a car to which they were doing something. Their in-car fiber-optic network was forty years too early.
Common factors?
- Steady funding. Not many ups and downs.
- A profitable parent company with a stable business model.
- Good support. Unlike academia, where grad students tried to do machining, badly, the corporate labs had competent people to build stuff.
- Not too rigorous cost accounting. I was once told by a manager of a major lab that "your accounting system is too good", because it kept people on defined tasks.
- A focus on good demos. Production was the responsibility of others. R&D did not ship products.
https://www.goodreads.com/work/editions/937703-crc-handbook-...
which I have found quite useful for its overview of trigonometry and so forth when machining.
Goodhart’s law states: “When a measure becomes a target, it ceases to be a good measure.”
If your accounting is too fine-grained, there is an inescapable pull to point at a metric and declare it the goal.
Once that happens, engineering tasks can become so narrowly defined that they act like tunnel vision.
Instead of finding improvements or innovations generally, the focus is entirely on moving a specific metric.
You see this issue in highly metric driven orgs as well.
One thing people don't realize is that we had anti-submarine aircraft that would fly themselves on autopilot using sea-scan radar to attack submarines, radar-guided anti-aircraft batteries that would automatically shoot down aircraft (the only thing the humans were doing is loading the shells), proximity-detecting shells that would explode when they got near a target, and many other amazingly high-tech weapons during WWII. Most of them came out of the Rad Lab and a few other places.
Solid-state electronics using semiconductors and masers were both explored at the Rad Lab, but it was obvious that they would not be useful for creating weapons, so they were set aside; they were, however, taken back to institutions like Bell for development post-war.
My grandfather designed, and was in charge of production of, the Chain Home transmitters at Metrovick [2]. He worked on centimetric systems after this; there is a family story of him sleeping with one of the prototype cavity magnetrons under his pillow to keep it safe from air raids.
He also built the transmitter for the first AWACS, a Wellington bomber, and helped fit it into the aircraft, but this is presented as just a TRE project [1].
[1] https://en.wikipedia.org/wiki/Telecommunications_Research_Es... [2] https://en.wikipedia.org/wiki/Metropolitan-Vickers
The anti-submarine radar was a good example: the British developed the sea-scan radar, but testing showed it was very difficult for a bomber crew to use it to actually get the plane on-target so the Rad Lab developed the radar-driven autopilot and integrated it with the B-24s.
I think the H2S radar was developed entirely by the British with limited US/Rad Lab involvement though.
Amazing story about your grandfather! Actually, the cavity magnetron played a key role in starting the Rad Lab. The US was still neutral at this point, and Churchill decided to "gift" one of the prototypes, along with some scientists to explain how it worked, to the US in hopes that the US would create radar sets. On the way over, the scientists decided to give the crew of the convoy ship a physics lecture, but they couldn't talk about what they worked on, so they decided to talk about something that could not possibly be relevant to the war: nuclear fission.
To circle back to the subject of the thread, after WWII my grandfather was in charge of the research lab at Metrovick, his final job before retirement was to close it down.
Precisely. Such as:
> the threat of nuclear war
One would not happen without the other. Invention is neutral. Humanity puts it to good or bad uses.
> and the survival of humanity requires getting things done.
Well, that's what they tell themselves, looking back on their works, at least.
The part of the article that mentioned Bell Labs’ philosophy for choosing research problems reminds me of Richard Hamming’s famous “You and Your Research” talk, where Hamming talked about the importance of working on important problems, and how he asked his colleagues about the importance of their problems.
Come to think of it, part of the reason why Xerox PARC and Bell Labs are so renowned is the importance of the problems they worked on. Transistors, Unix, Smalltalk, the Xerox Alto, GUIs, word processing, Ethernet; the list goes on.
You mean the Mergenthaler Linotron 202? The one that cost nearly $60,000 but was considered the lowest-priced high-resolution phototypesetter available in 1978?
Just the small work they did on supporting mice so early on (the third system to integrate a mouse?) was pretty massive in changing the way we interact with computers.
I know the mouse concept appears in "the mother of all demos", but the first real integration of the mouse into a system probably belongs to PARC.
But GUIs, OOP, etc.: Xerox had no business model around any of that, and it was just sitting around until someone "stole the loot".
The brains are the most precious resource, not the money, and they should not be bothered with trivialities.
It takes a whole lot of implementation and organisation to make any innovation tangible, so in some ways a bigger overall society slows things down as much as you gain from the total number of researchers.
Competition from foreign societies is also a part of this.
Nice example of a simple policy that changes the culture in exactly the way you want it to. The university researcher types want to say "ugh, stop interrupting me," but when that's not an option everybody keeps each other on track.
Seems like a general recipe useful in startups, science and life.
Afterthought: we are not doing this now, either in science research or in startups.
The current VC model of funding startups is too few large bets, and too many hoops for small startups to jump through to get to VC money.
Wouldn't a more efficient betting strategy be to place a small bet on a promising team/idea/market, then follow up with a larger bet on proof of progress?
I think we are missing out on funding a long-tail of small startups using Reinforcement Learning techniques to solve real problems in engineering / logistics etc.
The impedance mismatch means we don't get the useful new tech built, we don't develop talent, and VCs take on more risk than needed and miss out on high-growth startups.
This might be structural in the money supply: we might need smaller boutique VCs to fill the gap between large capital and small startups. Perhaps more post-exit founders should start small angel firms and mine the RL niche.
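The "small bet, then follow on after proof of progress" question above can be sketched as a toy Monte Carlo. Every number here is invented (hit rates, payoff multiple, capital split), and the advantage of staging in this model comes entirely from one assumption: follow-on money earns the same payoff multiple while only going to teams that cleared a milestone.

```python
# Toy Monte Carlo (invented numbers) comparing two ways to deploy the
# same capital: a few large one-shot bets vs. many small seed bets with
# a larger follow-on reserved for startups that hit a milestone.
import random

random.seed(0)
CAPITAL = 100.0
P_HIT   = 0.1   # assumed chance a startup reaches its milestone
P_WIN   = 0.5   # assumed chance a post-milestone startup succeeds
PAYOFF  = 30.0  # assumed multiple on capital invested in a winner

def one_shot(n_bets, trials=20_000):
    """Put CAPITAL / n_bets into each startup up front."""
    stake, total = CAPITAL / n_bets, 0.0
    for _ in range(trials):
        for _ in range(n_bets):
            if random.random() < P_HIT and random.random() < P_WIN:
                total += stake * PAYOFF
    return total / trials

def staged(n_seeds, seed_frac=0.2, trials=20_000):
    """Small seed cheques; split the reserve among milestone hitters."""
    seed = (CAPITAL * seed_frac) / n_seeds
    reserve, total = CAPITAL * (1 - seed_frac), 0.0
    for _ in range(trials):
        hits = sum(1 for _ in range(n_seeds) if random.random() < P_HIT)
        if hits:
            follow = reserve / hits
            for _ in range(hits):
                if random.random() < P_WIN:
                    total += (seed + follow) * PAYOFF
    return total / trials

print(one_shot(5))   # expected return of a few big up-front bets
print(staged(50))    # expected return of many small staged bets
```

Under these made-up parameters the staged portfolio comes out well ahead, simply because the milestone acts as a filter before most of the capital is committed; it is a sketch of the argument, not a claim about real VC returns.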
- standardized questionnaire / pitch format
- standardized terms
- trust ratings on both sides, like Airbnb reviews
- common tag set for searching / matching
All in an auction-style market to mix and match small investors and small startups. A similar market for cofounders / early hires might work too.
YC cofounder matching has a great pool of talent, but my guess is tag search would be a better way to find good matches.
In a war of desperation, those playing it safe and conservative are instantly out, and all the experts get their day in the sun. All the safe choices become coffins for the ideas of the past.
Contrast this with the NIH, where the science also has a goal - improving human health - but the system to be improved was not engineered. Curing a disease, which has a natural origin, is quite different from improving communications channel capacity.
I suspect that managing engineering research is much more amenable to process analysis than research on biological systems.
And you work at a phone company, and phone companies have interesting problems. I'm sure at some point everyone there felt somewhat obligated to do something to contribute. And there's plenty of interesting things to do at the phone company.
Then the MBAs come in, try to reduce the "lose some" with tightly managed processes and you slowly start losing the wins.
A similar thing happened at my company. During summer they would let people go at noon on Fridays resulting in a 36 hour week. Productivity stayed the same, people were happy. New VP comes in "won't we be more productive if we worked Friday afternoon?", back to 40 hours. Productivity still the same, people less happy, more people leaving.
Oh but how true this still is today!
Bell Labs and Xerox PARC are examples of this.
So while there were not significant pressures to produce products there was a culture of working on relevant problems or in areas connected to relevant problems. ... rather than going full open-loop gazing deep into the category theory of their navels -- which appears to be an occasional failure mode of academia.
In working with academics in niche areas there often is a thirst for applications-- like you've found some interesting idea but where to go next? What constraints need to be solved for to make this interesting property into something useful. Applications would be a good guide if anyone would provide some, but the connections often don't exist. Part of the magic sauce of Bell labs must have been this enormous science and engineering driven industrial corporation that could just flood real problems at people who needed them.
They were also allowed to explore areas where they reasonably believed value might exist, without being forced to ship quarterly or to pursue and sell whatever would look sexy to investors.
Rather than problems in search of a solution, possibly related to this technology.
Sadly, our academic institutions, and how they are funded by the govt, are also hyper-focused, with grants and goals required to "prevent waste" and demanding specific results, serving as blinders to wider-ranging exploration.
Modern companies are under much tighter control.
Government funding tends to have a problem with bureaucratic overreach, the need to persuade committees mostly composed of old ossified has-beens, and the need to produce papers at any cost to "prove" you are not slacking off.
France redistributes over 50 per cent of domestic GDP. It has some scientific successes (as measured, say, by Nobel Prizes), but not dramatically more than other comparably developed countries.
Sure, LLMs are impressive, but they don’t yet feel as foundational as Unix, transistors, GUIs, editors, or word processors. But give it a few decades, and who knows? In hindsight, we might look at OpenAI, Anthropic, or DeepMind with the same awe we reserve for Bell Labs and PARC today.
So, for example, rather than succinctly stating "The quick brown fox jumped over the lazy dog", this author adopts a style more akin to: "The quick brown fox, which grew up in Oakville near the rushing river, jumped, or rather leaped mightily, over the lazy dog rather than simply walking around the dog like most would normally do".
Similar to Blaise Pascal's sentiment: 'I have made this letter longer than usual, only because I have not had the time to make it shorter.'
Indeed, Thomas Jefferson writes about this skill: “The most valuable of all talents is that of never using two words when one will do.”
As a counter, readers of this board love some Tolkien, yet that writing goes on and on and on and on with so much extra text.
This describes the majority of all literature, works of the written word, and perhaps human communication in general. For Tolkien, it could be argued that the whole purpose of his books is to entertain with the medium of language, so all that "extra text" is not extra at all, since they're all supposed to be there to be appreciated and enjoyed. If you took away the "extra", there would be no literature there, just a plot summary.
It wasn't just extra prose. I found it genuinely difficult to read.
Maybe Bell Labs was lucky to be the right place at the right time for a lot of brilliant people.
High corporate profit taxes lead to lots of innovation because what else are you gonna do with all that money? Give it to the govt? No no, better pay a bunch of super smart people to just do stuff, maybe they’ll find something.
Everything else is downstream of that
https://slate.com/business/2012/07/xerox-parc-and-bell-labs-...
With high corporate tax rates, companies are incentivized to invest in tax-deductible expenses: research, employee training and retention benefits, building factories, etc.
Higher taxes are good for the country. Lower taxes are only good for wealthy individuals, and only in the short term.
That sort of topic requires an active argument of the form "this is why they did that". IE, identifying how Bell's owners and operators were profiting from the work. It probably wasn't by running skunkworks and research labs.
And, carthago delenda est, the big change in the US has been the transition to a service economy and financialisation. Since research isn't likely to help that sort of economy but is in an industrial economy this change seems like an obvious candidate for why it got rolled back. Doing research while trying to be a cutting edge producer makes sense.
Yet Bell Labs was jointly owned by AT&T, which provided telephone service, and Western Electric, which made the equipment used by AT&T to provide telephone service.
Maybe your claim is the opposite of the evidence.
I also think dividends should not be taxed at the corporate level, and should only be taxed as ordinary income by stockholders.
I’ve been told I’m a crazy socialist.
How else are they supposed to raise the stock price without actually improving the business in any way?
(I know it's more than just a "mindset" and is driven by real, rational financial incentives, but nonetheless it's a cultural phenomenon.)
Even if you set up Bell2 Labs, the people you're hiring are still the standard 21st century STEM workers/academics.
Perhaps Bell2 Labs would do better if set up in Japan?
Also, companies would rather spend money on acquisitions than retention, and that choice keeps reinforcing the cycle.
They also hired literally thousands upon thousands of researchers.
Consider that fusion power is an inevitability, but you don’t know whose name will be associated with it, nor the correct stock to invest in prior to their success.
Hindsight is always 20/20, remember?