201 points

I’d pay extra for no AI in any of my shit.

111 points

I would like to buy a 4k TV that isn’t smart and have yet to find one. Please don’t add AI into the mix as well :(

6 points

All TVs are dumb TVs if they have no internet access

38 points

Look into commercial displays

45 points

The simple trick to turn a “smart” TV into a regular one is to cut off its internet access.

6 points

We got a Sceptre brand TV from Walmart a few years ago that does the trick. 4k, 50 inch, no smart features.

-4 points

I don’t have a TV, but doesn’t a smart TV require internet access? Why not just… not give it internet access? Or do they come with their own mobile data plans now meaning you can’t even turn off the internet access?

Anti Commercial-AI license

7 points

A lot of TVs now require an account login before you can use them at all.

15 points

They continually try to get on the Internet; it’s basically malware at this point. The on-board SoC is also usually comically underpowered, so the menus stutter.

12 points

I was just thinking the other day how I’d love to “root” my TV like I used to root my phones. Maybe install some free OS instead

5 points

You can if you have a pre-2022 LG TV. It’s more akin to jailbreaking since you can’t install a custom OS, but it does give you more control.

https://rootmy.tv

3 points

I just disconnected my smart TV from the internet. Nice and dumb.

5 points

Still a slow UI.
If only signage displays had the fidelity of a regular consumer OLED without the business-usage tax on top.

2 points

Signage TVs are good for this. They’re designed to run 24/7 in store windows displaying advertisements or animated menus, so they’re a bit pricey, and you shouldn’t expect fancy features like HDR, but they’ve got no smarts whatsoever. What they do have is a slot you can shove your own smart gadget into, with a connector that breaks out power, HDMI, etc., which someone has made a Raspberry Pi Compute Module carrier board for. So if you’re into, say, Jellyfin, you can make it smart completely under your own control with e.g. LibreELEC. Here’s a video from Jeff Geerling going into more detail: https://youtu.be/-epPf7D8oMk

Alternatively, if you want HDR and high refresh rates, you’re okay with a smallish TV, and you’re really willing to splash out, ASUS ROG makes 48" 4K 10-bit gaming monitors for around $1700 US. HDMI is HDMI; you can plug whatever you want in there.

4 points

I’m sure that’s coming up.

3 points

As a yearly fee for DRM’d televisions that require Internet access to work at all, maybe.

1 point

Right now it’s easier to find projectors without AI and a smart OS. Before long, though, it’s gonna be hard to find those without a smart OS and AI upscaling.

68 points

I would pay for AI-enhanced hardware… but I haven’t yet seen anything that AI is actually enhancing; it’s just an emerging product being tacked onto everything they can for an added premium.

-1 points

I use it heavily at work nowadays. It would be nice to run it locally.

14 points

You don’t need AI enhanced hardware for that, just normal ass hardware and you run AI software on it.

2 points

But you can run more complex networks faster. Which is what I want.

4 points

https://github.com/huggingface/candle

You can look into this, however it’s not what this discussion is about

2 points

An NPU, or Neural Processing Unit, is a dedicated processor or processing unit on a larger SoC designed specifically for accelerating neural network operations and AI tasks.

Exactly what we are talking about.
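Those neural-network operations are, at their core, dense multiply-accumulate work (matrix-vector products). A toy sketch in plain Python, purely illustrative — real NPUs run quantized versions of exactly this at enormous scale:

```python
# Toy illustration: the heart of neural-net inference is the
# multiply-accumulate loop that NPUs exist to accelerate.
def matvec(matrix, vector):
    """Multiply a weight matrix by an activation vector."""
    return [sum(w * x for w, x in zip(row, vector)) for row in matrix]

weights = [[0.5, -1.0], [2.0, 0.25]]
activations = [4.0, 2.0]
print(matvec(weights, activations))  # [0.0, 8.5]
```

A CPU does this one multiply at a time; an NPU does thousands of them per cycle, which is the whole value proposition.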

1 point

I’m curious what you use it for at work.

1 point

I’m a programmer, so when learning a new framework or library I use it as interactive docs that allow follow-up questions.

I also use it to generate things like regexes and SQL queries.

It’s also really good at refactoring code and other repetitive tasks like that.
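For instance, asking an assistant for a regex that validates ISO-8601 dates typically yields something like the following (illustrative output, not from any particular model):

```python
import re

# The kind of regex an LLM assistant might generate when asked to
# match ISO-8601 dates (YYYY-MM-DD): month 01-12, day 01-31.
iso_date = re.compile(r"^\d{4}-(0[1-9]|1[0-2])-(0[1-9]|[12]\d|3[01])$")

print(bool(iso_date.match("2024-07-15")))  # True
print(bool(iso_date.match("2024-13-01")))  # False (month 13)
```

The draw is less the regex itself than being able to ask follow-up questions about edge cases (leap years, separators, time components) in place.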

1 point

Not the guy you were asking, but it’s great for writing PowerShell scripts.

27 points

In the 2010s, it was cramming a phone app and wifi into things to try to justify a higher price, while also spying on users in new ways. The device might even have a screen for basically no reason.
In the 2020s, it’s those same useless features, now with a bit of software with a flashy name that removes even more control from the user and allows the manufacturer to spy on the user even further.

12 points

Anything actually AI-enhanced would be advertising the enhancement, not the AI part.

19 points

It’s like rgb all over again.

At least rgb didn’t make a giant stock market bubble…

9 points

DLSS and XeSS (XMX) are AI, and they’re noticeably better than non-hardware-accelerated alternatives.

3 points

Already had that Google thingy for years now. The USB/NVMe device for image recognition. Can’t remember what it’s called now. Cost like $30.

Edit: Google Coral TPU

9 points

My Samsung A71 has had devil AI since day one. You know that feature where you can mostly use fingerprint unlock, but then once a day or so it asks for the actual passcode for added security? My A71’s AI has a 100% success rate of picking the most inconvenient time to ask for the passcode instead of letting me do my thing.

24 points

The biggest surprise here is that as many as 16% are willing to pay more…

3 points

I mean, if framegen and supersampling solutions become so good on those chips that regular versions can’t compare, I guess I would get the AI version. I wouldn’t pay extra compared to current pricing, though.

13 points

Acktually it’s 7% that would pay, with the remainder ‘unsure’

11 points

Show the actual use case in a convincing way and people will line up around the block. Generating some funny pictures or making generic suggestions about your calendar won’t cut it.

3 points

I completely agree. There are some killer AI apps, but why should AI run on my OS? Recall is a complete disaster of a product and I hope it doesn’t see the light of day, but I’ve no doubt that there’s a place for AI on the PC.

Whatever application there is in AI at the OS level, it needs to be a trustless system that the user has complete control of. I’d be all for an Open source AI running at that level, but Microsoft is not going to do that because they want to ensure that they control your OS data.

-3 points

Machine learning in the os is a great value add for medium to large companies as it will allow them to track real productivity of office workers and easily replace them. Say goodbye to middle management.

1 point

I think it could definitely automate some roles where you aren’t necessarily thinking and all decisions are made based on information internally available to the PC. Those roles certainly exist, but some decisions need human input; I’m not sure how they automate away those roles just because they can see what happens on the PC every day.

If anything, I think this feature will be used to spy on users at work and see when keystrokes fall below a certain level each day, but I’m sure companies can already do that (they just don’t).

4 points

Unless you’re doing music or graphic design, there’s no use case. And if you do, you probably have a high-end GPU anyway.

3 points

I could see a use for local text gen, but that apparently takes quite a bit more than what desktop PCs can offer if you want actually good results and speed. Generally, though, I’d rather have separate extension cards for this. Making it part of other processors is just going to increase their price, even for those who have no use for it.

1 point

There are local models for text gen - not as good as ChatGPT, but at the same time they’re uncensored - so it may or may not be useful.

2 points

Yes, I know - that’s my point. But you need the necessary hardware to run those models in a performant way. Waiting a minute to produce some vaguely relevant gibberish is not going to be of much use. You could also use generative text for other applications, such as video game NPCs; especially all those otherwise useless drones you see in a lot of open-world titles could gain a lot of depth.


PC Gaming

!pcgaming@lemmy.ca

For PC gaming news and discussion. PCGamingWiki