“Conventional linear TV services alone (albeit ultra-high-definition) may not be sufficient to sustain the terrestrial broadcasting business which requires a large amount of highly coveted spectrum resources. Intelligent media delivery and flexible service models that maximize the network Return on Investment (ROI) is of paramount importance to the broadcasting industry in the new era.”
That's a lot of fancy words to say ‘we're doing this because it makes us more money’ lol
“Recent studies have shown that interactivity between media customers and service providers and between users themselves will be one of the most important features in the next-generation media service. In this document, this unique opportunity is addressed by defining a Dedicated Return Channel (DRC) system for the next-generation broadcasting system.”
Almost everything I've seen (besides BPS, and maybe HDR if you're one of the few who has a really good home theater setup) is a benefit for broadcasters and advertisers, and a bit worse for consumers (especially requiring new hardware/decoders... and sometimes persistent Internet connections!).
There is 1 operating system for ATSC3 DRM: Android.
There are several SoCs that can be used for "Level 1 Widevine".
When a SoC is compromised and the key is leaked from the TEE, all models of that device with the key are now untrusted for Level 1.
I think people should just be aware of the state of play.
Concerning, because we could end up in a situation like with some 4K Blu-ray discs: your hardware becomes obsolete because DRM requires that cat-and-mouse game...
It would be a different story if the DRM were available ubiquitously, e.g. in the way that arguably Widevine is for online streaming (but certainly not broadcast TV). Are rightholders that afraid of unauthorized out-of-market rebroadcasts that they'd rather obliterate their reachable market with stunts like that?
This should be a big clue that the spying is the point, and all the DRMs of the world are justification for spying instead of the other way around. Total Information Awareness is the path to completing the Great Work.
Has this actually happened? Especially for "appliances" like set-top boxes or blu-ray players, as opposed to something like a tablet which are presumably easier to hack.
L1 devices get remotely downgraded to L3 devices. This has different effects depending on the content provider, ranging from "totally unavailable" to "lower resolution".
It's both DRM and planned-obsolescence in one.
The qualifier: https://developer.android.com/privacy-and-security/security-...
The list: https://android.googleapis.com/attestation/status
The dev docs: https://github.com/doridori/Android-Security-Reference/blob/...
I'm not sure if it's confirmed, but it's believed Level 1 video output contains a watermarking scheme that ties the key to the media, so if content is leaked they can disable the key it was leaked through.
You can search around and find tons of angry consumers shouting into the void about widevine errors on older consumer devices.
Why would anyone use atsc 3? It's not free over the air, you can't spoof it?
TV networks in the US are a living proof that, with enough marketing spend and a pinch of confusion in the offer structure, you can sell people on anything. Half the time you can just offer sportsball access and people will switch.
It's how previous versions of broadcaster overreach happened, and why Smart TVs succeed despite their shortcomings.
That’s what happened to LG [0]. They dropped ATSC 3.0 tuners. I’m sure this cost them precisely 0 sales while the industry's own incompetence destroys broadcast TV.
[0]: https://www.theverge.com/2023/9/30/23897460/lg-drops-atsc-3-...
That kind of pumping makes me sad and sick.
The cratering market values say otherwise. Few people under 40 even care about “TV”, and live sports distribution contracts (and the associated gambling) are the only thing holding it up.
We've finally come full circle: Linear broadcast TV -> TiVo (finally, no more missed episodes!) -> VOD, i.e. Netflix, Max, Hulu etc. (why linear broadcast everything to everyone if we can just OTT stream everything individually?) -> FAST, i.e. Pluto TV etc.
The crux of the matter being that even if OTA channels didn’t track people’s location, it wouldn’t matter, since OTA itself is going the way of the dodo.
ATSC first generation will probably outlive this DRM-driven abomination.
For a few weeks it was glorious. I had no problem picking up CBS (it was broadcast from the same antenna as ATSC 1.0 - so it was the modulation that was helping out). And then, after a little over a month: whack! No more CBS. They turned on DRM. They are still the only network in my area with DRM. Ughh.
Under the previous administration I filed a few complaints with the FCC about this from a public safety perspective - I live in an area with unreliable power. During severe weather, we often lose Internet and power (which knocks out cable TV too). Requiring working internet to watch TV to monitor the progress of a tornado in your area seems stupid and dangerous. Unfortunately, nothing happened then regarding the issue, and given the way that Brendan Carr is taking the FCC, I don’t think there will be any progress on this.
They’re all a huge improvement over older cards like the pcHDTV or the really old Hauppauge WinTV PVR 250 that I fought with for Freevo and MythTV so long ago. The switch to Channels was a huge quality of life improvement too.
Just listen to radio?
In the event of a truly severe weather event (the ones where they hit the alert to make every cell phone go off), the visuals provided by television are hugely helpful.
You say that as if they're using lots of words to obfuscate that fact, but the quote you pasted has them saying entirely directly "maximize the network Return on Investment", which is just normal business terminology (and only one word more than your "it makes us more money"!)
Obviously this has no impact on whether that's a good or bad thing, I'm just pointing out that they weren't using a lot of words to hide that fact.
The fact that it seemed like its intention was to obfuscate is just a sign that you're not familiar with basic business terminology, nothing more.
(And there's no shame in that, nobody knows the common phrases in every area of society, if you've never taken any business classes nor been involved in running a business there's no reason you should know it - but that doesn't mean people who use those words are trying to hide what they mean.)
Edit: and unsurprisingly it's therefore also frequently seen on HN:
* https://www.youtube.com/watch?v=cw3W7MoafR4
* https://www.youtube.com/@AntennaMan/videos
ATSC 3.0 allows for DRM/encryption as the parent comment mentions.
Wow, that's one of the best uses of corporate-speak euphemism I've seen. Everybody who reads it knows what it really means, but if you just don't say it, it's fine. Recent studies indeed!
I mean, let's keep in mind, even ATSC 1.0 had really awful reception issues; compared to analog NTSC where there was enough redundancy that you could just tune into a garbage station from way too far away and see something. Now imagine trying to make that already unreliable channel bidirectional. I just really hope all the return channel stuff is optional, because it sure as hell isn't going to work without way more stations broadcasting on more channels, and OOPS you've reinvented LTE.
In the end the company that the government selected to start the rollout of DTT went bust and I don't think the system was used anywhere else. The developer of the technology abandoned it in 2006 as other connection methods (broadband/mobile data) were preferred.
In ATSC there are two types of forward error correction on the digital bitstream. The problem it faces is that it sits in the same channel allocations as NTSC while having to deliver significantly more information than NTSC did. On top of that, the digital modulation used is not ideal for receivers to capture.
In ATSC the tradeoff between compression and error correction is such that a noisy channel is far more likely to cut out or otherwise be unusable than it would have been in NTSC.
“The fundamental record that captures consumption information is called a Consumption Data Unit (CDU). For a streaming A/V channel, each CDU identifies a reporting interval during which a service is accessed. Such a CDU includes the service identifier, the time the service access started and the time the service access ended. If any Applications are active during the report interval, it also records when the Applications are active (whether on a primary device or a “second screen”, companion device), including the Application Identifier, the time the Application started being active, and the time it stopped being active.”
“For services, events logged into a CDU shall correspond to all usage intervals of no less than 10 seconds and may correspond to shorter usage intervals. For Application activity, events logged into a CDU shall correspond to all usage intervals of no less than 5 seconds and may correspond to shorter usage intervals. The precision and accuracy of start times and end times in the CDUs should be within 1 second.”
The payload schema is a 4651-byte JSON structure, so I would imagine that a response-payload fitting this schema would be same-order-of-magnitude size. With 10-second granularity that works out to roughly a half-kilobyte-per-second data rate, and according to the DRC spec the maximum payload size of one DRC message is 2048 bytes.
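Quick back-of-envelope sketch in Python (assuming each actual report is roughly the size of the schema, which is my guess, not something the spec says):

    # rough CDU reporting-rate sketch; the 4651, 10 and 2048 come from the spec text above
    import math
    schema_bytes = 4651       # size of the payload schema
    interval_s = 10           # minimum usage-interval granularity for services
    print(schema_bytes / interval_s, "bytes/s")            # ~465 B/s, i.e. roughly half a kB/s
    print(math.ceil(schema_bytes / 2048), "DRC messages")  # a full-size report spans ~3 messages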
It will also report when you play something back from a DVR:
“Component.SourceDeliveryPath – Delivery path used for or the source of the content component indicated by the parent Component element.
SourceDeliveryPath.type –
0 – Broadcast delivery (content component is delivered by broadcast)
1 – Broadband delivery (content component is delivered directly by broadband by broadcaster)
2 – Time-shift-buffer source (content source is local time shift buffer)
3 – Hard-drive source (content source is local hard drive)
4 – Delivery via direct connection (HDMI)
5 – Alternate IP delivery (content component is delivered via intermediary)”
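To make the DVR point concrete, here's a hypothetical receiver-side view in Python - the names are mine, only the numeric codes come from the spec quote above:

    # hypothetical decoder-side mapping of SourceDeliveryPath.type
    SOURCE_DELIVERY_PATH = {
        0: "broadcast",
        1: "broadband (served by broadcaster)",
        2: "time-shift buffer (DVR pause/rewind)",
        3: "hard drive (DVR recording playback)",
        4: "direct connection (HDMI)",
        5: "alternate IP delivery (intermediary)",
    }

    def is_time_shifted(path_type: int) -> bool:
        # types 2 and 3 are the ones that reveal DVR / recorded viewing
        return path_type in (2, 3)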
Ok, this is entirely paranoid conspiracy theory rumblings at this point, but why does this sound like a way to catch unlicensed rebroadcasters?
Search for "mIoT" or "mMTC"
OP is complaining about the spec incorporating usage monitoring.
5G was designed with a very public and explicit goal for IoT of allowing many more devices to connect than 4G could, and more conveniently. Nothing unexpected or user-harming, and nothing new as 4G was already used for IoT.
The one thing that was really new was support for super high density environments with mmWave. That would have been ideal for stadiums etc. With regular tech the networks get overwhelmed. mmWave offers more cells in a tiny area. But here in Europe that's been given up for good. Phones don't even come with mmWave antennas anymore.
One of the best adverts for a long time - if you knew about the technical advantages of 5G, it had a real double meaning.
If you knew 90s British dance music, it had a further humorous double meaning promoting the telco (an oblique reference to The Shamen's Ebenezer Goode)
That didn’t even cross my mind and Rich West (Mr. C) is a friend of mine and I know the song well. I, of course, get the reference now it has been pointed out, but you’re right that it’s pretty damn oblique!
https://www.fastcompany.com/90314058/5g-means-youll-have-to-...
5G does not always result in more towers, it just supports closer towers for better performance.
"Anyone with access" to Google or Apple data also knows exactly where you are. "Anyone with access" to <social media app of choice> knows exactly where you are. "Anyone with access" to police databases know exactly where you drove. "Anyone with access" to VISA/Mastercard data knows exactly where you've shopped. "Anyone with access" to Find my data will know exactly where your phones, headphones, and other trackable items are at any point in time.
This isn't a complaint about 5G, but about cellular anonymity which has never existed.
We should make things so useless and annoying for them, as they did for us.
Then again, thanks to modern modulation techniques, digital steering etc., battery-powered smartphones can talk to satellites 36 thousand kilometers away these days as well, so maybe this is just a thing now? The spec also does mention receive repeaters for complicated non-line-of-sight propagation scenarios.
[1] https://www.atsc.org/wp-content/uploads/2024/04/A323-2024-04...
So you got that right.
Right now it's in the experimental stage, with only 6 towers total deployed (only 5 were operational during NAB, and only one in Nevada... so timing, not navigation yet).
The ultimate plan—which is probably dependent on how well ATSC 3.0 rolls out (which has plenty of hurdles[1])—is to encourage broadcasters to add on the necessary timing equipment to their transmitter sites, to build a mesh network for timing.
That would allow the system to be 100% independent of GPS (time transfer could be done via dark fiber and/or ground-satellite-ground directly to some 'master' sites).
The advantages for BPS are coverage (somewhat) inside buildings, the ability to have line of sight nearly everywhere in populated areas, and resilience to jamming you can't get with GPS (a 100 kW transmitter signal 10 miles away is a lot harder to defeat than a weak GPS signal from a satellite more than 12,000 miles up).
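Rough link-budget sketch of that claim, in Python - free-space loss only, and the ~16 km / ~600 MHz numbers are my illustrative assumptions; the GPS figure is the published L1 C/A minimum received power:

    # compare a 100 kW UHF broadcast ~10 miles away with GPS at the surface
    import math

    def fspl_db(d_km, f_mhz):
        # standard free-space path loss formula
        return 32.45 + 20 * math.log10(d_km) + 20 * math.log10(f_mhz)

    tv_tx_dbm = 10 * math.log10(100_000 * 1000)   # 100 kW ERP -> 80 dBm
    tv_rx_dbm = tv_tx_dbm - fspl_db(16, 600)      # ~10 mi, ~600 MHz -> about -32 dBm
    gps_rx_dbm = -128.5                           # GPS L1 C/A spec minimum at Earth's surface
    print("TV is ~%d dB stronger at the receiver" % (tv_rx_dbm - gps_rx_dbm))  # ~96 dB

Real terrain and antenna gains move those numbers around, but the ballpark difference is enormous.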
The demo on the show floor was also using eLoran to distribute time from a site in Nevada to the transmitter facility on Black Mountain outside Vegas, showing a way to be fully GPS-independent (though the current eLoran timing was sourced from GPS).
[1] ATSC 3.0, as it is being rolled out in the US, doesn't even add on 4K (just 1080p HDR), and tacks on 'features' like 'show replay' (where you tap a button and an app can stream a show you're watching on OTA TV through the Internet... amazing! /s), DRM (at stations' discretion, ugh), and 'personalized ad injection' (no doubt requiring you to connect your TV to the Internet so advertisers can get your precise location too...). Because ATSC 3.0 requires new hardware, consumers have to be motivated to buy new TVs or converter boxes—I don't see anything that motivates me to do so. I feel like it may be a lot like the (forever ongoing) HD Radio rollout.
I was hoping to get better fidelity from the roughly 2x bitrate per channel and the video codec update. And, probably overly optimistically, I was hoping the 1080p source feed was progressive so there wouldn't be a deinterlacing step.
Otoh, local broadcasters use an audio codec I can't easily use, integration with mythtv is poor, and there's no sign anything is going to get better soon.
Maybe if I had a tv with an atsc 3 tuner, live tv would be an option, but I'm not buying a tv for that.
ATSC 1.0 took a while before gathering momentum, so maybe that's going to be the same here, and in another few years, it might make sense to consider a transition. OTOH, maybe the writing is on the wall and OTA broadcasting will die on this hill. I was an OTA enthusiast, but between ATSC 3 being terrible, and the reallocation of spectrum that means cellular base stations sometimes overwhelm my pre-amp, it's not much fun anymore. (I have a filter post-pre-amp but it'd be better if I got on the roof to put it pre-pre-amp, but roofs are scary) Maybe I'm just getting curmudgeonly though.
Even if someone mandated it as a requirement for TVs sold next year, all the tech inside is at least 10 years old (HEVC?). Not to mention the rollout time. Do Americans only watch cable and Netflix, and not free-to-air TV? That's what I believe most of the world still does to a large extent, aside from Internet streaming.
They might as well look into the standards before mandating them.
To the north, competition from a huge installed base of last-gen technology, which is mostly good enough.
To the south, streaming services, youtube and cable. These let people watch whenever they want (nobody has VCRs any more) and they've offered 4k for over a decade.
To the east, the industry's dumb decision to build the 'next gen' technology atop a patent minefield, and load it with DRM. So if you manufacture this tech, you can face huge surprise bills because in implementing the spec you've unknowingly infringed on some nonsense patent.
And to the west, the commercial reality that showing someone an advert in 4K isn't any more profitable than showing the advert in 1080p. If you're a broadcast TV station when you up your quality everything gets more expensive but you don't make any extra money. So why bother?
In a functioning, competitive market, the answer to this is "Customers choose a competing broadcast TV station with higher quality." Unfortunately what we have is far from that.
https://www.nielsen.com/insights/2024/beyond-big-data-the-au...
They're not that popular but there are similar bundles available over the Internet, YouTube TV is one.
All the national broadcast networks have Internet options, Hulu has shows from multiple networks, Paramount+ has CBS. Both also have shows made for cable & satellite channels and Internet-only programming.
There's a lot more than Netflix too, Amazon Prime, AppleTV+, Disney+, Max, and many others.
Live sports kept people signed up to cable & satellite for a long time, I think now there are Internet options (and probably exclusives, I don't watch sports).
This is from 2022: https://www.nielsen.com/insights/2022/broadband-only-tv-home...
At the time of the switchover in the early 2000s I lived about 40 miles from a major metropolitan area, Minneapolis, which is pretty close in US terms. We spent hundreds of dollars on different antennas (indoor and outdoor) and signal boosters and what not and it was simply never reliable.
In 2008 I moved to my current location, three miles outside of downtown Minneapolis. Again I tried a number of antennas and still found operation to be anything but reliable. I gave up and began just watching Netflix.
The people who live close enough to the broadcasts to pick it up have easy access to cable TV. The people who live in the countryside who used to depend on it can't pick it up. There's just no place for the TV system we were given.
That is the first time I've heard that. Everything I've heard has been positive - people amazed that others aren't doing it. Are there any numbers on user satisfaction?
I used it myself once or twice and it worked simply with antennas that were relatively cheap (<$50 iirc). Maybe there was a problem in Minneapolis?
> The people who live close enough to the broadcasts to pick it up have easy access to cable TV.
Cable is expensive for many people and broadcast is free, of course. (Also, Broadcast is more private, for now.)
I don’t know about Minneapolis in particular, but 40 miles is far enough that it gets tricky, and in my Canadian city, I’m close to the towers as the crow flies, but in the RF shadow of a huge hill - so I’d actually get better reception if I was further away from the broadcast towers.
Fun: if you’ve got Apple Maps (I’m sure Android has this as well), ask for walking directions from Minneapolis to something 35-40 miles away. I chose “Elko New Market” - 36 miles from downtown. Click on the walking details and you can see the elevation change. You’re going from around 800 feet above sea level to 1100 feet above sea level, a difference of ~300 feet. But the total change over the course of the walk is nearly 4000 feet!
There's been a consistent call by many people that there needs to be a diversity of options for navigation and timing:
* https://rntfnd.org/2025/02/04/pnt-gps-critical-issue-for-new...
China has GNSS (BeiDou, plus plans for LEO), plus terrestrial navigation (eLoran), plus a fibre-based network for accurate timing:
* https://rntfnd.org/2024/10/03/china-completes-national-elora...
* https://rntfnd.org/2024/03/01/patton-read-their-book-chinas-...
* https://rntfnd.org/2024/11/29/china-announces-plan-to-furthe...
Russia has a Loran-equivalent:
Edit: Broadcast Positioning System for anyone that didn't figure it out.
In a true "end of history" moment, the US and other NATO members discontinued both of their ground-based systems (which are inherently harder to jam due to their much higher transmission power, since transmitters are not power limited) – Omega in the late 1990s and Loran-C in the early 2010s – in favor of GPS, while Russia kept their equivalent functional, and China completed an eLoran network last year.
Add to that the FAA's reduction of their ground-based VOR/DME station network that lets planes navigate when GPS is unavailable...
GPS jamming, and much more concerningly spoofing, will probably quickly come within reach of non-nation-states and smaller groups of all kinds, and ultimately individual actors, and that can't possibly end well for civil aviation if robust countermeasures don't become available very soon.
Aircraft and military positioning concepts are evolving towards more map-matching and dead reckoning, lessening the benefit of GPS jamming.
Current systems drift by about 0.5 miles per hour. And that's normal commercial grade systems, I'm sure the military has an option for better systems if they need them.
A missile would simply have to follow the jammer’s signal.
Jammers often move. Your missile often cannot maneuver well enough to hit. Jammers often turn off - if your missile is detected they turn the jammer off and move it. They are often running more than one jammer, so getting one to turn off isn't going to matter.
It may already be so:
GPS is such a critical infrastructure component to modern society- knowing that a redundancy system like this is in the works is great.
It's a travesty that this was ever approved.
If it’s not intrinsic to FM, why not use existing cellular towers to do this? They’re everywhere, and phones already receive broadcast messages (like Amber Alerts) even without a SIM (I think) — so it feels like this could be done without needing new radios.
What makes this more accurate than cell tower triangulation today? Is the limitation in timing sync across towers, or something else in how cell networks are structured?
And for indoor use — how does this handle multipath? Reflections from walls or even atmospheric bounce seem like they’d throw off timing accuracy, similar to what messes with GPS in dense areas.
They use precise timing to coordinate timed broadcast slots between base stations with overlapping coverage.
Due to how OFDM works, I suppose the idea here is to intentionally send a heterogeneous signal on a few non-overlapping subcarriers (for single-frequency networks) or on different transponders at different locations (since single frequency networks aren't as common in the US due to how broadcasting evolved there, although ATSC 3.0 apparently also allows single frequency networks).
for people who don't want to watch videos
I'm sure the average reader who deals with broadcast signal electronics knows what's going on here but I just walked away from it confused. It looks like terrestrial broadcasters sending out time codes for triangulation?
It sounds like a Hail Mary. How many people use the FM radio receivers in their smartphones?
Maybe it's a total re-imagining of what to do with high power terrestrial band broadcasting?
Every time I show someone I first have to introduce the concept of it existing. And these are technical people.
https://www.sparkfun.com/sparkfun-gps-rtk2-board-zed-f9p-qwi...
The datasheet: https://cdn.sparkfun.com/assets/f/8/d/6/d/ZED-F9P-02B_DataSh...
A typical high-end microwave measurement system costs as much as a Ferrari.
Good cables and connectors can set you back several thousand dollars.
It's a very good business space, ripe for disruption (hint: SDR, or software-defined radio).
Fun fact: the granddaddy of Silicon Valley start-ups is HP (then Agilent, and now Keysight), which got its start selling signal generators.
Another domain where that is true involves logic analyzers. A few years ago, on a bit of a lark, I bought a (used) fairly high-end Keysight logic analyzer. The kind of thing that cost like $20,000 or more when it was brand new. But I got a sweet deal on it, so I bought it. Only... it came with no test leads. And then I started shopping for the leads.
Yikes.
I forget the exact numbers now, but as best as I can recall, the leads came in 64pin sets, where the device supported up to 4 test lead sets, for 256 total channels. And just one of the 64pin test lead sets cost something like $1500. So a full set would cost another $6000 on top of the device itself. I think that was about what I paid for the analyzer itself in the first place!
Now I don't regret buying it and in truth I never needed to use 256 channels anyway, so I only bought 1 of the test lead sets so far. But yeah... test leads / cables /etc. for high bandwidth / low latency / high frequency applications get pretty damn expensive.
But the concepts translate to other technologies, for example mobile phone network signals (even without decrypting them), which in most populated areas span hundreds of frequencies by themselves.
There are literally thousands of radio signals around us which can be used for various unintended / non-cooperative purposes. And not only ground-based signals: satellites are transmitting all kinds of signals towards Earth, some for communication, some for remote sensing / Earth observation.
Not only is it possible to use non-cooperative signals for timing, but also for passive radar. For example DVB-T - you receive bounces/echoes of the signal off airplanes and drones and measure their characteristics.
NATO public document - UAV Detection and Localization Using Passive DVB-T Radar MFN and SFN - https://www.sto.nato.int/publications/STO%20Meeting%20Procee...
There's a good community around GNU Radio; they have all kinds of enthusiast and professional use cases, explorations, videos, ...
Or with just a simple $30 RTL-SDR + laptop, you can sit next to a road and listen for tire pressure monitoring sensor data; the packets contain unique IDs, so you can know when the postman enters your street...
It is possible to listen to small parts of the spectrum through receivers that enthusiasts have connected to the internet, without buying anything - for example http://kiwisdr.com/public/
but "high performance signals" are not in frequency range of those radios. but it is possible to hear ham radio, aviation, military, maritime, not only voice but weather fax, other digital signals, all sorts of timing signals...
• ATSC 3.0's physical layer can already transmit GPS time in a way that receivers could get it back out. What BPS brings to the table is a requirement and specification for accurately and consistently filling in the physical layer preamble fields containing the time data, along with a new physical layer pipe (think "low-level data stream") that contains additional information about the transmitter and, optionally, its neighboring transmitters.
• BPS is capable of producing time fixes when the receiver only has a lock on one source. This isn't surprising at all — GPS receivers can do the same thing. But either type of receiver with only one source would see a clock offset proportional to the path delay, which it wouldn't be able to compute and back out without knowing its position.
• BPS is only designed for 2-D position fixes. While that's a reasonable design decision (the vertical position error would be massive), it also makes BPS less useful for the NAB's "indoor positioning for first responders" use case, especially in areas with multi-story buildings.
• The need to receive and process/decode multiple, most likely non-adjacent 6 MHz channels for positioning increases receiver complexity and cost.
• The NAB claims that 1 kilometer of separation between two BPS transmitters is "sufficient for useful position determination." I don't buy it, especially in the face of poor transmitter geometry.
• They note that 16 TV stations in the New York City area broadcast from One World Trade Center, so for the purposes of BPS, they're effectively one station. This kind of transmitter colocation is incredibly common, both in urban areas (ten TV stations broadcast from Sutro Tower in San Francisco) and in more rural areas (six TV stations in the Roanoke-Lynchburg DMA broadcast from towers within ~1 mile of each other on the ridgeline of Poor Mountain). Even if every ATSC TV station became an ATSC 3.0 w/ BPS transmitter, bad transmitter geometries would destroy BPS's position accuracy in lots of markets (rough dilution-of-precision sketch below).
• What's the business case for broadcasters? BPS won't be free for broadcasters to implement, and there doesn't seem to be a path to it generating revenue except for a hand-wavy "maybe one day televisions will be able to determine their locations without Internet connections using BPS, and then broadcasters can do location-targeted advertising with those TVs!"
My uncharitable take is that BPS will never be a usable standalone PNT system. A timing system in the "rebroadcasts GPS" sense? Maybe. Standalone positioning? No way. Broadcasters implementing BPS (or ATSC 3.0 at all) without being forced to by the government? I don't see it.
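To put numbers on the transmitter-geometry bullet, a minimal dilution-of-precision sketch in Python (2-D position plus receiver clock bias; the coordinates are made up):

    import numpy as np

    def gdop(rx, txs):
        # geometry matrix: unit line-of-sight vector per transmitter, plus a clock column
        H = np.array([np.append((rx - tx) / np.linalg.norm(rx - tx), 1.0) for tx in txs])
        return np.sqrt(np.trace(np.linalg.inv(H.T @ H)))

    rx = np.array([0.0, 0.0])                                       # receiver position (km)
    spread    = [np.array(p) for p in [(20, 0), (-15, 12), (-5, -18), (8, 22)]]
    colocated = [np.array(p) for p in [(20, 0), (20.3, 0.2), (19.8, -0.1), (20.1, 0.4)]]
    print("GDOP, spread sites:    %.1f" % gdop(rx, spread))         # small -> usable fix
    print("GDOP, colocated sites: %.1f" % gdop(rx, colocated))      # enormous -> position error blows up

Same timing error, wildly different position error, purely because of where the towers sit.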
My uneducated guess is government funding, plus becoming part of a new "essential backbone" infrastructure, thus guaranteeing incentives to stay operational for a longer period of time.
Current planning is public availability in 2027-2029.
A good gov presentation with an overview and technical details is here [1].
[1] https://www.gps.gov/governance/advisory/meetings/2022-11/mat...
There are places, especially in the mountains, where you don’t get the requisite number of towers, but large portions of the US will, and the signal-to-noise ratio required is lower than what is needed to decode regular TV signals, so the covered area is larger than for TV.
Also, it smelled a bit like wishful thinking to assume the high precision clock would not be driven by GPS on real world deployments. I know some cell towers synchronize via PTP, but a great many others use GPS as their time source.
Holdover can only help so much; if there's a persistent jamming effort, it can wreak havoc on many time-critical systems.
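For a sense of scale, here's a rough holdover sketch with ballpark oscillator stability numbers (my assumptions, not vendor specs):

    # accumulated time error during holdover ~= fractional frequency offset * elapsed time
    day_s = 86_400
    for name, ffo in [("TCXO ~1e-7", 1e-7), ("OCXO ~1e-10", 1e-10), ("Rubidium ~1e-12", 1e-12)]:
        print("%-15s -> about %.2f microseconds/day" % (name, ffo * day_s * 1e6))

So even a decent free-running OCXO walks off by microseconds per day - fine for riding out hours of jamming, not weeks, if you need microsecond-level timing.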
Poor form. Do better.
No idea how it is in the US, but in EU you can't really easily choose, it will by default use GPS+Galileo combined for better accuracy.
I guess you can through developer settings on android though.
Also you should be able to look up your phone specifications to see what GNSS services it's capable of using on GSMArena
Edit:oh I missed that you're on Apple so that app is useless for you. Still might be of some use to others though.