I know Southwestern Bell bought a number of them and stuffed them in a closet north of downtown Dallas. During the install I remember having to explain what Ethernet was to their techs. They were EXCELLENT at phone standards, but had decided the data world was threatening and were determined to never learn anything about it.
I know that between around '93 and '97 if you dialed AOL from D/FW there was a good chance your call would be terminated somewhere within a mile or two of your house and the bits flowing between your Compaq Presario and AOL would be sent digitally from the local CO to AOL's data center in Sterling, VA.
This line of business was (of course) destroyed by consumer DSL and cable modems, but for about 5 years it was fairly popular with the phone companies. ISDN at the time was a bit pricey for most households, and a modem is a one-off purchase. Most people I knew using things like AOL or CompuServe were using a hand-me-down 33.6k modem on a crappy 33 MHz 486SX running DOS / Win3.1 / Win95 and were fairly cost-sensitive.
We had techs come to our home in Canada in the 1990s, and I remember being fascinated with their mystical toolbox phone that seemed to uncover hidden phone line functionality. Almost like the whip in Indiana Jones.
https://en.wikipedia.org/wiki/Toll-free_telephone_numbers_in....
Why use real physical modems when you already have the subscribers' signals converted to convenient digital form in a DS1 bundle? Wouldn't it make more sense to put one fat DSP doing all 24 modems in bulk inside a box with DS1 and Ethernet sockets at the ISP location instead?
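That's essentially what the later generation of access servers did. A minimal sketch of the demux side of that idea (illustrative Python, assuming the DS1 framing bit has already been stripped; not anyone's actual firmware):

    CHANNELS = 24        # DS0 channels in a DS1
    FRAME_OCTETS = 24    # one 8-bit PCM sample per channel per frame, 8000 frames/s

    def demux_ds1(payload: bytes) -> list[bytearray]:
        """Split a byte-aligned DS1 payload into 24 per-channel DS0 streams."""
        channels = [bytearray() for _ in range(CHANNELS)]
        for off in range(0, len(payload) - FRAME_OCTETS + 1, FRAME_OCTETS):
            frame = payload[off:off + FRAME_OCTETS]
            for ch in range(CHANNELS):
                channels[ch].append(frame[ch])
        return channels

Each returned stream is 8000 bytes/second of u-law PCM, which is exactly what a DSP-based "digital modem" bank demodulates in bulk with no analog front end.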
These were great boxes, and the only way you could get 56K was to call into an ISP with one of these or similar on their end -- the trickery that allowed 56K relied on one end being fully digital.
I was working for an ISP around that time and we had a bunch of Portmaster 2s connected via RS-232 cables to piles of modems, some rackmount some just stacks and stacks of US Robotics Sportsters. Sometimes modems would get wedged and we'd have to reboot them or "busy out" the line that they were on. Harder for the modems that were an hour away.
When the transition happened we were able to get rid of all those wires and just plug in one small phone cable for the T1, another for Ethernet, and terminate 23 lines. The Portmaster would treat all the modems as a pool and route calls to whichever was available, and once a call was done would run some testing on the modem before putting it back into the pool. It was like a space age rocket ship! At one point I was driving around with $50K worth of Portmasters in the trunk of my car, hoping I didn't get rear-ended. They were not at all cheap, but they were worth it.
Yeah, friend. I’m very familiar, and it was amazing tech. It sure kept the data center cooling system busy, though.
I was installing Windows 2000 on a PC in the lab (manual disk swapping required) when, hidden behind another rack, several shelves full of physical modems all started calling the box at once, speakers on.
If you don't live in that jurisdiction, it's the same but your ISP and your phone company are the same company.
Also you don't dial your ISP with a number.
I also remember back in the day that my 56k modem would often only connect at like 48k or so, especially when it was raining. I guess living far out from the city made the connection more noisy?
https://www.microsoft.com/en-us/microsoft-365/business-insig...
They really do sound a lot better. It always reminds me of the first time I ever made a FaceTime call, in 2010, and the high quality audio was just as interesting as the video.
Of course it has to be pre-planned; someone needs to have the hardware with them. So sometimes there's a spontaneous connection over a normal mobile phone. That's something that everyone has with them at all times.
I was talking to a sales rep. At the time I worked for US West, or Qwest, or whatever they were called then, which may have helped, and the sales rep told me "we are being told to actively discourage people from buying ISDN".
I get the impression that the phone company hated consumer modem use of any kind, because it tied up CO equipment 24x7, and they liked the returns on investment they got with people paying $25/mo for resources that were used an hour or less a day, sometimes with extra revenue from long distance calling. And ISDN was just another representation of that.
Humans can't hear above 20 kHz. Adult humans can't hear above 16 kHz or so; we lose the top end before age 20. This means that the standard 48 kHz sampling rate covers the entire human hearing range and then some (0-24 kHz). Any sampling rate over 48 kHz for sound intended for human hearing is a total waste.
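The arithmetic behind that is just the Nyquist relationship; a trivial sketch:

    # Nyquist: a sampled signal can only represent content below half its rate.
    def max_representable_hz(sample_rate_hz: float) -> float:
        return sample_rate_hz / 2.0

    print(max_representable_hz(44_100))  # 22050.0 Hz, already past adult hearing
    print(max_representable_hz(48_000))  # 24000.0 Hz, covers 0-20 kHz with margin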
Also, you might possibly be sensitive to resampling artifacts if your output device runs at 44.1kHz and your file is 48kHz or vice versa.
Audio testing is hard, and testing on yourself is tricky... But if you have a sample that you're convinced sounds better at high rates than lower rates, I would urge you to put it through a tool to resample it down to lower rates and see if/when you can tell the difference. If the rate isn't an even multiple, it's worth using a tool that can dither; dithered resampling artifacts are less abrasive than undithered ones. I had some voice recordings to play over the phone, and everything needed to be 8 kHz u-law; the 48 kHz original recordings sounded better than the 44.1 kHz ones because one is an even multiple of 8 kHz and the other isn't, but either way the waveforms looked worse than they sounded.
This seems to be mixing up two things: proper interpolation and dithering.
If you have limited bit depth (in practice, 16 bits or worse), you should pretty much always dither, ideally with noise shaping as well. This is independent of the interpolation you're using; having a rational relationship between the original and downsampled rates makes some of the implementation a bit easier, but even for something like 48000 -> 24000, you'll end up with effectively a float signal that you need to convert to your chosen bit depth somehow, and that should be done better than just truncating/rounding.
And even for interpolating between two prime rates, or even variable-rate interpolation, you can and should get great interpolation (typically by picking out polyphase filtering coefficients from a windowed sinc of some sort).
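A minimal sketch of both points together, assuming NumPy/SciPy (no noise shaping here, and the function name is mine): polyphase windowed-sinc resampling followed by TPDF dither before truncating to 16 bits.

    from math import gcd
    import numpy as np
    from scipy.signal import resample_poly

    def resample_and_dither(x: np.ndarray, rate_in: int, rate_out: int) -> np.ndarray:
        """x is float audio in [-1, 1]; returns dithered int16 at rate_out."""
        g = gcd(rate_in, rate_out)
        y = resample_poly(x, rate_out // g, rate_in // g)  # polyphase windowed-sinc

        lsb = 1.0 / 32768.0  # one 16-bit quantization step
        # TPDF dither: two independent uniform noises summed, spanning +/- 1 LSB
        dither = (np.random.uniform(-0.5, 0.5, y.shape) +
                  np.random.uniform(-0.5, 0.5, y.shape)) * lsb
        y = np.clip(y + dither, -1.0, 1.0)
        return (y * 32767.0).astype(np.int16)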
"Headroom"
And the idea that humans can't hear over 20 kHz is like the old tongue-map myth that we taste 'sweet' on the tip of the tongue and 'bitter' on the sides.
As we get older the hairs in our ears break or whatever and our perception decreases, but I could hear the flybacks in my old monitors, I used to be able to see the flicker in 3 kHz PWM LEDs, and my induction hob drives my kids crazy but it's merely mildly annoying to me.
Get a real soundcard and some young people, play square (PWM) and sine tones starting at 16 kHz, and find out where they can't hear it anymore. I find studio monitors with tweeters that are not paper are the best.
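If anyone wants to run that experiment, here's a rough sketch of generating the test tones (assuming NumPy/SciPy; the level and filename are placeholders, and a naively generated square wave will alias somewhat near Nyquist):

    import numpy as np
    from scipy.io import wavfile
    from scipy.signal import square

    RATE = 96_000     # high rate so the square's harmonics aren't all folded away
    DURATION = 1.0    # seconds per burst

    def tone_burst(freq_hz: float, shape: str) -> np.ndarray:
        t = np.arange(int(RATE * DURATION)) / RATE
        w = np.sin(2 * np.pi * freq_hz * t) if shape == "sine" else square(2 * np.pi * freq_hz * t)
        return (0.3 * w).astype(np.float32)  # moderate level: protect ears and tweeters

    bursts = [tone_burst(f, s)
              for f in (16_000, 17_000, 18_000, 19_000, 20_000)
              for s in ("sine", "square")]
    wavfile.write("hearing_test.wav", RATE, np.concatenate(bursts))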
The extra headroom can indeed be useful for some kinds of processing, but you can safely discard it for actual listening.
Are they the exact same volume? We perceive slightly louder audio as higher quality.
Is it a double-blind test, i.e. an ABX test?
Are the bit depths the same? Many 96 kHz sampled files use 24 bits per sample, whereas 48 kHz usually uses 16 bits per sample.
But you do need audiophile-enough gear (minus the gilded pebbles hot-glued onto circuit breakers).
The big issue with analogue landline phone calls is that the audio bandwidth is so limited: roughly 300 Hz to 3.4 kHz, so most of the audible spectrum is cut off.
EDIT: I do agree that lossless (or at least high bitrate modern lossy, like 256k Opus which is basically transparent) should be available in many more situations though.
https://en.wikipedia.org/wiki/Comparison_of_audio_coding_for...
https://en.wikipedia.org/wiki/Opus_(audio_format)#Quality_co...
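On the band-limiting point above, here's a small illustrative sketch of what a voiceband channel does to full-range audio (assuming NumPy/SciPy; the 300-3400 Hz figures are the classic voiceband, the filter order is arbitrary):

    import numpy as np
    from scipy.signal import butter, sosfilt

    def telephone_bandlimit(audio: np.ndarray, rate: int) -> np.ndarray:
        """Apply a ~300-3400 Hz bandpass to mimic a classic voiceband phone channel."""
        sos = butter(4, [300, 3400], btype="bandpass", fs=rate, output="sos")
        return sosfilt(sos, audio)

Run any music through it and the "most of the spectrum is cut off" claim becomes very audible.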
24-bit samples are ridiculous overkill for delivery. That's a huge dynamic range that's completely unnecessary.
At 192 kHz you'd be able to capture 96 kHz signals, far, FAR outside the range of human hearing. Human hearing peaks around 20-22 kHz, so you only need a sample rate of ~44 kHz to capture the total range of human hearing.
For human voice you don't really need better than 16-bit samples at 12 kHz or so. That's for great quality voice.
The only reason audio mastering is done at huge sample sizes and sampling frequencies is to prevent aliasing during mixing and to preserve higher frequency harmonics. There's absolutely no need for such rates when delivering to human beings.
Also, higher-fidelity audio codecs are available for phone calls. The issue is more political than technical. Cellular carriers don't like to negotiate higher quality calls between one another, so inter-carrier calls tend to fall back on the lowest-common-denominator AMR-NB codec. Intra-carrier calls don't even reliably pick AMR-WB, let alone the EVS available with VoLTE.
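For scale on the bit-depth point above, the usual back-of-the-envelope figure for an ideal quantizer is about 6.02*N + 1.76 dB of dynamic range (a sketch, not a measurement of any real converter):

    def dynamic_range_db(bits: int) -> float:
        # Ideal-quantizer approximation: ~6.02 dB per bit plus 1.76 dB
        return 6.02 * bits + 1.76

    print(dynamic_range_db(16))  # ~98 dB, roughly quiet-room noise up to painfully loud
    print(dynamic_range_db(24))  # ~146 dB, far beyond any playback chain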
We could do this since local loops to most folks were about $150-200/mo, and we already had a channelized DS3 terminated at our rack at a local datacenter for our phone banks. If you bought your own DS1 retail you'd be paying upwards of $1k/mo back then to a provider.
It was by far the best "stickiness for dollar" investment into employee benefits I've ever found back then or since.
And I accessed the heck out of that connection (until the ISP went bust, wonder why?), and was very much a Q3A LPB during that time.
Fast-forward to 2025, and I now have dual 1Gbps symmetric fiber connections (AT&T, GFiber) into my home from opposite sides of the house. (It's totally gratuitous and I'll probably cancel GFiber in a few months, but I wanted to have it wired up so I could more quickly start service in the future.)
https://en.wikipedia.org/wiki/Ricochet_(Internet_service)
Wireless at 56 kbps. So you could take your luggable laptop circa 1994 with you and dial in to work... provided you lived in SF.
One of my internships in college was at Sun Microsystems in the org that provided this connectivity to employees. My job was to automate pushing updates to connection software and modem firmware down to clients, but I ended up doing a lot of technical support as well.
The other (often overlooked) benefit that ISDN provided was 24/7 connectivity in an age of dialup.
Oh and you could spoof your outgoing phone number for caller ID ]:D
If they had gotten out of their own way when the internet came around they could have charged a small monthly fee to upgrade to a "digital phone line". Lots of people would have switched.
My dad ran a BBS from like 1992 to 1995, which started falling out of favor, especially as the users were getting more busy signals because the modem phone line was tied up with the internet connection.
It was truly astonishing to be up there, checking email.
A few, what seems very short, years later .. and now it is just normal.
While today you'd probably expect that the IP network could connect to every phone branch office where the local loops were connected to the phone network, that wasn't necessarily the case - Internet data would usually have to travel some of its way over the phone network backbone, with the problematic digital encodings. V.90 and related standards allowed the phone network to accept the digital data directly from ISPs and send it in digital form to the branch office, without attempting a digital to analog to digital conversion to inject it as digital voice into the phone network. That's why the upload speed couldn't be improved via this method - it would still need to undergo analog-digital conversion to travel across the phone network to the ISP where it could enter the IP network. (V.92, a later standard, improved upload speed to 48 Kb/s via fancier signal processing trickery that could survive the digital voice conversion.)
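The back-of-the-envelope arithmetic behind the 56k number (a simplification of the full V.90 story, but these are the standard figures):

    SYMBOLS_PER_SECOND = 8000   # one PCM sample per DS0 slot, 8000 times a second
    USABLE_BITS_PER_SYMBOL = 7  # the 8th bit is unreliable due to robbed-bit signaling

    print(SYMBOLS_PER_SECOND * USABLE_BITS_PER_SYMBOL)  # 56000 bps
    # US transmit-power rules then capped real-world connects at about 53.3 kbps.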
The phone companies had enormous sway over the development of these longer haul protocols. The debate around packet sizing almost always favoured smaller cells (especially ATM), which was more ideal for voice - with the added overhead for more standard IP packets. They were also often very connection-oriented, with all the extra equipment overhead required.
But if it was a direct copper pair end-to-end, then attenuation and other electrical characteristics would have made it hard to achieve the higher speeds; these are the Shannon limits they mention.
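For reference, the Shannon limit is easy to play with (illustrative numbers, not measurements of any particular loop):

    from math import log2

    def shannon_capacity_bps(bandwidth_hz: float, snr_db: float) -> float:
        snr_linear = 10 ** (snr_db / 10)
        return bandwidth_hz * log2(1 + snr_linear)

    # A ~3.1 kHz voiceband channel at ~35 dB SNR lands right around V.34 modem speeds:
    print(shannon_capacity_bps(3_100, 35))      # ~36 kbps
    # Drive the same pair as a wideband line (DSL-style) and the ceiling moves way up,
    # but only if loop length and condition keep the SNR decent:
    print(shannon_capacity_bps(1_000_000, 30))  # ~10 Mbps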
Ok. Searched around. Here is an article that states old copper could have carried 1 gigabit.
https://www.newscientist.com/article/2317040-ordinary-copper...
Keep in mind that at the time, LAN speeds over controlled twisted copper pairs over short distances (100 m) were 100 Mbps to 1 Gbps.
If you've ever seen the physical condition of the telephone company's outside subscriber wiring (what they call "outside plant") -- and particularly the intermediate splices between central office and subscriber -- you would quickly disabuse yourself of the notion that you could transmit anything close to 1gbps over a twisted pair.
https://www.revk.uk/2017/12/its-official-adsl-works-over-wet...
A quick read of the linked article seems to indicate that it's BS, as it doesn't account for the real topology of the local loop. In particular, in older neighborhoods you had a bundle of pairs going down the street, and a new connection was made by patching in to a free pair, creating a "T" shaped circuit. When a house was disconnected, part of this "stub" might have been left attached; over time a single pair might accumulate multiple disconnected stubs. The capacity of that copper circuit is far lower than a straight run.
In addition in many cases corrosion and water cause noise, further reducing bandwidth - I can remember having noise so bad on rainy days that I had to call and get them to fix it. (I assume they patched us onto a free pair and abandoned the noisy one)
Of course none of this is related to the end-to-end bandwidth of the old telephone system. Starting in the 50s a longer-distance phone call would get a single-sideband channel on a microwave link, with about 3 kHz allocated. Later on calls got sampled at 8 kHz with 8-bit mu-law (logarithmic) encoding, or A-law in Europe, and transmitted digitally.
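The mu-law idea is simple enough to sketch (this is the continuous companding curve; real G.711 uses a segmented 8-bit approximation of it):

    import numpy as np

    MU = 255.0

    def mu_law_compress(x: np.ndarray) -> np.ndarray:
        """x in [-1, 1] -> companded value in [-1, 1]; quiet signals get finer steps."""
        return np.sign(x) * np.log1p(MU * np.abs(x)) / np.log1p(MU)

    def mu_law_expand(y: np.ndarray) -> np.ndarray:
        return np.sign(y) * ((1 + MU) ** np.abs(y) - 1) / MU

Quantizing the companded value to 8 bits gives speech roughly the perceived quality of 12-13 linear bits, which is why 64 kbps per call was considered enough.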
Over what distance?
Make that distance short enough, as has happened with FTTN or FTTC deployments in a whole heap of places, and you're basically building a network that is, to keep this very brief, subpar.
Since you mentioned a UK context there: Openreach rolled out an upgrade that kept the last mile of copper, but now, just about a decade later, they're rolling out full fibre. Whatever argument copper had, it went out the window near enough a decade ago.
In other words, instead of losing 99.7% of the signal over that distance, it'll only lose 99% of the signal. Sure, it helps, but consider me underwhelmed.
The very first part of a dialup modem sound? Where it's playing a tone that reverses phase at regular intervals? That tone is actually designed to disable all the repeaters and echo cancelers that are in your switched circuit.
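If you want to hear just that part in isolation, here's a rough sketch of generating it (numbers from memory of the V.25/G.165-style answer tone, so treat them as approximate): a steady 2100 Hz carrier whose phase flips 180 degrees roughly every 450 ms, which is the cue echo cancellers listen for before switching themselves off.

    import numpy as np

    RATE = 8000
    TONE_HZ = 2100
    REVERSAL_PERIOD_S = 0.45
    TOTAL_S = 3.0

    t = np.arange(int(RATE * TOTAL_S)) / RATE
    segment = np.floor(t / REVERSAL_PERIOD_S)                 # 0, 1, 2, ... per 450 ms chunk
    tone = np.sin(2 * np.pi * TONE_HZ * t + np.pi * segment)  # 180-degree flip per chunk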
Also, two parallel phone lines are prone to capacitive coupling. I had a case so bad once that one office could pick up the phone and nearly perfectly couple onto their neighbor's line and hear all their conversations. It was 50/50 which port on the PBX recognized the tones and started the call when either of them picked up to dial out.
A reasonable decision at the time.
Australia tried this; it's physically impossible.
The fact that all of this worked continues to amaze me, but then, so do mobile phones. I understand at a high level how CDMA works, but it’s just so insane…
I remember us getting our first modem; it was 800 baud! Then we moved to 2400, 14.4, 33.6 and eventually all the way to 56k.
Like some other commenters I also fondly remember ISDN. Overall, though, I found it to be finicky. Sometimes one channel would just drop, even if a phone call wasn't coming in. And, in order to use a traditional analog phone with your ISDN line, you needed a special powered "TA" (terminal adapter) or the phone wouldn't ring when a call came in.
DS0 is not an encoding. It's (pseudo) framing.
> phone calls became digital with
The G.711 encoding, in either A-law or mu-law format.
It was crazy to think about trying to get your page to load in less than 64k a few years back.
Around 2000, I saw a crew pulling new phone lines through the neighborhood because everybody was getting a second line and they were running out, but even after switching to the new copper, we were still stuck at 26400. 20+ years later, it looks like 25/5 ADSL is now available at that address, so the new copper wasn't a complete waste.
As there was no legal compulsion to get them to act, BellSouth wouldn't do anything to help with slow connect speeds for internet dialup. The trick was to lie and say you were having problems sending a fax; then they were required to act. They wouldn't even worry about testing first, as it was quicker to just re-engineer the line to the best practices of the day.
And then I had the satisfaction of explaining that Ma Bell cheaped out and used pair-gain equipment to build out their block cheaply. Sorry; call them to complain, or move.
What I loved the most about ISDN was the quick call setup. Took like 1 second max and you had a 64kbit channel. 56k modems went through a dialling phase, a connection phase, endless handshaking...
It was cool having a fast downstream, but the slow upstream over finicky dial-up was a pain in the ass. If the dial-up dropped, the in-progress downloads all died because no ACKs could be sent. Gaming was no better than plain dial-up, since your upstream had the same shitty latency as plain dial-up.