While I agree, I think this solution is some nonsense. I bought a “TV” and paid for all the hardware and software that went into it, but I essentially have to use it as a monitor with my own hardware to escape the enshittification.
I also agree, but I view it more as ‘I bought a TV, and that’s all I want it to be’.
I don’t care about the built-in software features foisted on me because I wanted an OLED panel: they’ll be abandoned within 1-2 years, they run on some anaemic chipset that’s already multiple generations behind what’s sitting in my TV stand, and they’ll likely end up as an attack vector into my network somewhere down the road.
The article mentions that TV manufacturers make ~$5 a quarter from selling your data. So those ‘features’ aren’t even free; they come at the expense of your personal information, your privacy, and likely your security as a result.
So, to quote a famous Dave Chappelle skit: “fuck ‘em, that’s why!”
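Taking the article’s ~$5-a-quarter figure at face value, a quick back-of-the-envelope sketch shows what that data stream is worth over a set’s life (the 10-year lifespan is my assumption, not the article’s):

```python
# Back-of-the-envelope: what the "free" smart features earn the
# manufacturer from one TV's data, using the article's figure.
REVENUE_PER_QUARTER = 5.00   # USD, per the article (~$5/quarter)
LIFESPAN_YEARS = 10          # assumed typical service life

lifetime_revenue = REVENUE_PER_QUARTER * 4 * LIFESPAN_YEARS
print(f"Data revenue over {LIFESPAN_YEARS} years: ${lifetime_revenue:.2f}")
# → Data revenue over 10 years: $200.00
```

Not a fortune per set, but multiplied across every TV sold it explains why the ‘smarts’ are so aggressively bundled in.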
You do realize all of that would probably cease to be a problem if people were able to hack their TVs and install custom OSes?
All the spyware bullshit would also be gone with a custom OS.
Literally every one of your gripes would be addressed and fixed by being able to hack your TV.
A custom OS isn’t going to address the anaemic hardware, nor do I think relying on open-source custom ROMs for a niche item is the best way to ensure hardware-level vulnerabilities are covered.
If you already have an Internet-connected device hooked up to your TV (e.g. a PlayStation), there’s no need to connect another, especially when it provides an overall worse experience.
Shit, even a basic HTPC is infinitely better: a Linux-based distribution will have far more support than a niche TV ROM, and it’ll keep being supported long after the TV’s own hardware falls behind.
Not only would it give “anemic” hardware new life, I can point at where it’s already been done for another in-home device: routers. DD-WRT/OpenWrt/Tomato do exactly that for old, otherwise-useless routers.
Literally every argument you can make against it has already been proven wrong with other devices: it can all be addressed with a custom OS/firmware that’s designed for purpose, without all the bloat and other BS.
You can adamantly say “Nuh uh!” all you want, but it doesn’t change the facts.
You can buy PS5s for every TV in your house if you want to, but not everyone has that money, luxury, or stubborn desire.
Good luck implementing all the display color calibration, pixel-refresher, and anti-burn-in features on these new TV panels. Personally, I’d rather keep my warranty and just use a separate device to run the apps.
Okay, so you buy a new TV every year just to keep a warranty.
Most people don’t have that luxury.
I’m upgrading from no TV, and I expect it to last me at least 10 years or I’ll be very disappointed.
That anemic device relies on hardware decoding to decode the video data fast enough; it’s literally unable to handle newer video encodings, because it would have to fall back to software decoding, which is where the anemic part totally kills it.
Routers, on the other hand, have done their work entirely in software for ages (with at most hardware offload for encryption in things like SSL, which hasn’t fundamentally changed in decades), and they don’t have to reliably process a 4K frame within a 20 ms (for 50 Hz) window.
Your example is very much an apples and oranges comparison.
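The 20 ms budget above can be put into rough numbers. This sketch assumes 10-bit 4:2:0 chroma subsampling (a typical HDR stream; that’s 15 bits per pixel) and only counts the raw, uncompressed output side of the decoder:

```python
# Rough per-frame budget for 4K at 50 Hz, and the raw pixel data a
# decoder must emit within that budget (uncompressed output side).
WIDTH, HEIGHT = 3840, 2160   # 4K UHD
FPS = 50                     # 50 Hz display
BITS_PER_PIXEL = 15          # assumed: 10-bit 4:2:0 (10 luma + 5 chroma)

frame_budget_ms = 1000 / FPS
raw_bytes_per_frame = WIDTH * HEIGHT * BITS_PER_PIXEL / 8
throughput_gbps = raw_bytes_per_frame * FPS * 8 / 1e9

print(f"Frame budget: {frame_budget_ms:.0f} ms")                      # → 20 ms
print(f"Raw output per frame: {raw_bytes_per_frame / 1e6:.1f} MB")    # → 15.6 MB
print(f"Sustained raw output: {throughput_gbps:.1f} Gbit/s")          # → 6.2 Gbit/s
```

A fixed-function hardware decoder block sustains that easily; a weak general-purpose CPU doing the same work in software has to hit the 20 ms deadline on every single frame, which is exactly where it falls over.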
Kind of. I haven’t had to replace my dumb TV from 2014 yet, but my understanding is that these awful smart TVs are at least cheaper because they’re subsidised by all the ads. If that’s the case, at least you didn’t fully pay for the hardware, and can hopefully afford to put your own hardware on there without being out of pocket by too much.
That’s not really true, because even the high-end, top-of-the-line Samsung QD-OLED TVs have ads on the home screen if you connect them to the Internet. If you want the latest display technology, your only options are a smart TV with ads, or spending 10x the price on a commercial display that nobody will actually sell you.
They can’t show you ads if you never connect them to the internet.
I happen to work for a commercial touch-screen and Android OEM. In my position, I needed to test a 50” 4K IDS display, and since I work from home it had to be shipped to me, and we don’t exactly have a “return” option. It’s now in my bedroom with an Apple TV 4K on it.
You’re right, this is likely the only way to get a “dumb” display, and tbh it’s not even the “best” display tech, because it’s built for commercial use: designed to run for longer hours with higher reliability, at the cost of the newer, fancier bells and whistles. But I didn’t pay for it and it’s pretty decent, so I’m not complaining. And you’re also right that no one will actually sell you one of these; you have to buy through a distributor, at volume. Getting one as a one-off outside of my weird circumstances is basically impossible.
The best solution is actually to keep the decoder smarts separate from the actual displaying of the image, because those two things have different life-cycles and different costs.
A decent TV screen will last you decades and keep doing what it does just fine. The only real pressures to upgrade are: video connectors, which change maybe once every two decades, and adaptors can usually buy them another two decades or so of life; higher resolutions, which make no difference unless you have a very large screen (something that requires a large living room to view at the optimal distance, in which case what really drove the replacement wasn’t obsolescence); and screen-tech advances, which are another of those “it changes every couple of decades but the old ones are generally still fine” kind of things.
Media playing, on the other hand, has its life-cycle tied to video encoding and compression formats, which change every 5 years or so. Either you have a seriously overpowered general-purpose CPU in there (which smart TVs do not), or you have hardware decoding, and in the latter case new video encodings require new hardware that supports them.
So a TV with built-in decoding - i.e. a “smart” TV - will need to be replaced more frequently, driven by the need to support new digital formats, even though the part that costs the most by far - the screen - is still perfectly good. If the media-player functionality is separate, on the other hand, all you have to replace with any frequency is the much cheaper media box, while the much more expensive screen only gets replaced once in a blue moon.
Smart TVs are great for manufacturers because they force people to replace the TV much more often, so they sell 2 or 3 times as many TVs. But in the mid and long term they’re a really bad option for actual buyers, who needlessly spend much more on TVs - not to mention ecologically, with all those perfectly good screens ending up in landfill because the $20 worth of “smarts” tied to a $1000 screen can’t handle new video encoding formats.
Which is exactly where you were before smart TVs.
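The replace-the-box vs replace-the-TV economics above can be sketched with rough numbers. Everything here is an illustrative assumption apart from the $1000 screen figure from the thread; the box price and replacement intervals are mine:

```python
# Illustrative 20-year cost comparison: replacing a whole smart TV each
# time its decoder goes obsolete vs. replacing only a cheap media box.
# All prices and intervals are assumptions for the sketch.
HORIZON = 20        # years considered
TV_PRICE = 1000     # mostly the screen, per the comment above
BOX_PRICE = 100     # assumed: a streaming box or small HTPC
SMART_TV_LIFE = 7   # assumed: replaced when its codecs fall behind
BOX_LIFE = 5        # assumed: box replaced as codec generations change

smart_tv_cost = TV_PRICE * -(-HORIZON // SMART_TV_LIFE)   # ceiling division
dumb_tv_cost = TV_PRICE + BOX_PRICE * (HORIZON // BOX_LIFE)

print(f"Smart-TV route: ${smart_tv_cost}")        # → $3000 (3 TVs)
print(f"Screen + box route: ${dumb_tv_cost}")     # → $1400 (1 screen, 4 boxes)
```

Even with generous assumptions for the smart TV, the separate-box route comes out well ahead, because the expensive part (the panel) is the part that ages slowest.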