Panther Lake and Nova Lake laptops will return to traditional RAM sticks

34 points

Gelsinger said the market will have less demand for dedicated graphics cards in the future.

In other news, Intel has been replaced in the Dow Jones by Nvidia, a company that exclusively produces dedicated graphics cards: https://lemmy.world/post/21576540

8 points

Nvidia does more than just GPUs.

Nvidia makes both SoCs, like the Tegra series, and server CPUs (Grace, which is ARM-based and built to be used with their ML/AI cards at much higher bandwidth than regular CPUs).

Nvidia also just announced that they are working on a consumer desktop CPU.

5 points

Well I had the same thought a few years ago when APUs started getting better. But then I’m not the CEO of a huge tech company, so nobody lost their job because I was wrong.

3 points

Can you be certain that no company monitors your brain and uses that as their CEO somehow?

0 points

The Dow's historical success/performance is biased by exactly these strategic decisions: losers get replaced before their bankruptcy, but only once their decline is obvious.

108 points

coming up next: Intel fires 25% of their staff, CEO gets a quarterly bonus in the millions

3 points

Intel already laid off thousands and is still getting CHIPS Act taxpayer money.

120 points

Reverting to RAM sticks is good, but shutting down the GPU line isn't. The GPU market needs more competitors, not fewer.

30 points

Intel can’t afford to keep making GPUs because it doesn’t have the reliable CPU side to soak up the losses. The GPU market has established players and Intel, besides being a big name, didn’t bring much to the table to build a place for itself in the market. Outside of good Linux support (I’ve heard, but not personally used) the Intel GPUs don’t stand out for price or performance.

Intel is struggling with its very existence and doesn’t have the money or time to explore new markets when their primary product is cratering their own revenue. Intel has a very deep problem with how it is run and will most likely be unable to survive as-is for much longer.

1 point

Intel is too big to fail. And the defense sector needs an advanced domestic foundry. Uncle Sam will bail it out with our tax money.

1 point

The United States has a few chip fabs capable of making military-grade hardware. It helps that the defense industry uses chips which aren't the most advanced possible - they want the reliability mature tech provides. Micron, Texas Instruments, ON Semiconductor - there are a few domestic chip companies with stateside fabs.

Intel is also a valuable collection of patents and a huge number of companies would love to get them. Someone will want to step in before the government takes over.

22 points

It boggles the mind that AMD realized the importance of GPUs 20 years ago when they bought ATI and in all that time Intel still doesn’t have a competitive GPU.

7 points

Intel realized it back then too, but things didn’t pan out the way they wanted.

nVidia and AMD were going to merge while ATi was circling the drain. Then Jensen and Hector Ruiz got into their shitfight about who was going to be CEO of the merged AMD/nVidia (it should have been Jensen; Hector Ruiz is an idiot), which eventually terminated the merger.

AMD, desperately needing a GPU side for their ‘future is fusion’ plans, bought the ailing ATi at a massive premium.

Intel was waiting for ATi to circle the drain a little more before swooping in and buying them cheap; AMD beat them to it.

27 points

As a Linux user with an Intel Arc card, I can safely say that the support is outstanding. In terms of price to performance, I think it's pretty good too. I mainly enjoy having 16GB of VRAM without spending $450-$500+ to get that amount like with Nvidia. I know AMD also has cards around the same price with that amount of VRAM, though.

3 points

That’s interesting, thanks. Can I ask what that VRAM is getting used for? Gaming, LLMs, other computing?

3 points

Where can I get a sub-$400 AMD card with 16 GB of VRAM?

5 points

Basically there is only money at the top of the GPU range. Everything else is a budget card with razor-thin margins.

AI-specific chips will take off over time, but even there the ship is starting to sail. These are mostly data center projects.

2 points

besides being a big name, didn’t bring much.

Absolutely wrong. A lot of old and dated information in your post.

They have something no one else has: manufacturing, plus very low prices and great performance after recent driver updates. The only thing they lack is driver stability, which has been improving in leaps and bounds.

I do not think anyone else can enter the market, let alone with an edge.

27 points

I have no confidence in Intel’s long-term prospects.

9 points

Intel’s long term prospects rely on China invading Taiwan.

2 points

Intel is a CIA champion. Vector for backdoor spying and kill switches. Why not embed plastic explosives on every motherboard, since US/Trump praised the Israel strategy?

Taiwan declaring independence and offering to host US nuclear missile bases… incoming.

9 points

Intel has been a mistake since 1978. But evil doesn’t generally die.

95 points

Gelsinger said the market will have less demand for dedicated graphics cards in the future.

No wonder Intel is in such rough shape! Gelsinger is an idiot.

Does he think that the demand for AI-accelerating hardware is just going to go away? That the requirement of fast, dedicated memory attached to a parallel processing/matrix multiplying unit (aka a discrete GPU) is just going to disappear in the next five years‽

The board needs to fire his ass ASAP and replace him with someone who has a grip on reality. Or at least someone who has some imagination of how the future could be.

70 points

Gelsinger said the market will have less demand for dedicated graphics cards in the future.

Reminds me of decades ago when Intel didn’t bother getting into graphics because, they said, pretty soon CPUs would be powerful enough for high-performance graphics rendering lmao

The short-sightedness of Intel absolutely staggers me.

48 points

CPUs would be powerful enough for high-performance graphics rendering lmao

And then they kept making 4-core desktop CPUs even after phones had gone deca-core. 🤣🤣🤣

10 points

To be fair, the ARM SoCs in phones use big.LITTLE core layouts, where the system enables/disables cores on the fly and moves software around, so at any given second it’s running on either the big high-performance cores or the little low-power cores depending on the power budget. So effectively, not all of those 6+ cores are available and in use at the same time on a phone.
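
The clustering is easy to see on Linux, where each CPU advertises its max frequency under /sys/devices/system/cpu/cpuN/cpufreq/cpuinfo_max_freq and big vs. little cores show up as distinct frequency tiers. A minimal sketch of that grouping (the core counts and frequencies below are made-up example values, not readings from real hardware):

```python
from collections import defaultdict

def cluster_by_max_freq(max_freq_khz):
    """Group CPUs into clusters by advertised max frequency,
    mapping {cpu_index: max_freq_kHz} -> {max_freq_kHz: [cpu indices]}."""
    clusters = defaultdict(list)
    for cpu, freq in sorted(max_freq_khz.items()):
        clusters[freq].append(cpu)
    return dict(clusters)

# Hypothetical 4 little + 4 big phone SoC (example values only)
example = {0: 1800000, 1: 1800000, 2: 1800000, 3: 1800000,
           4: 2840000, 5: 2840000, 6: 2840000, 7: 2840000}
print(cluster_by_max_freq(example))
# {1800000: [0, 1, 2, 3], 2840000: [4, 5, 6, 7]}
```

On a real system you would fill the dict by reading the sysfs files above; the scheduler then decides which cluster a task lands on.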

26 points

It’s been the same “vision” since the late 90s - the CPU is the computer and everything else is peripherals.

5 points

Probably because APUs are getting better and more PC gamers are going for handhelds and APU laptops instead of dedicated desktops. PC gaming has gotten really expensive.

There’s no real comparison for at least the next 5 years: a dedicated GPU is still hands-down the better choice for gaming. Even in a lower-end build, an older GPU will beat the current best APU by a good amount. But in 10 years it may no longer be necessary to pick a GPU over an APU. GPUs are getting too power-hungry and expensive, and gamers gonna game, but they won’t all want to spend an ever-increasing amount of money on better graphics. Meanwhile Arc would need at least another 5 years to be competitive enough to claim a worthwhile market share from AMD or Nvidia, and that’s wishful thinking. That’s a long time to bleed money on a maybe.

6 points

Does he think that the demand for AI-accelerating hardware is just going to go away? That the requirement of fast, dedicated memory attached to a parallel processing/matrix multiplying unit (aka a discrete GPU) is just going to disappear in the next five years‽

Maybe the idea is to put it on the CPU/NPU instead? Hence them going so hard on AI processors in the CPU, even though basically nothing uses it.

10 points

But if he wants an NPU, then why not beef up the iGPU too? Memory exclusive to the iGPU is a good boost: look up the Intel i7-8709G, where they put in an AMD Radeon Vega iGPU with 4GB of HBM memory exclusive to it, and it did wonders. Now that AMD is winning in the APU sector, Intel could reuse the same ideas it had in the past.

7 points

Seriously, putting a couple gigs of on-package graphics memory would completely change the game, especially if it does some intelligent caching and uses system RAM as additional memory when needed.

I want to see what happens if Intel or AMD seriously lets a generation rip with on-package graphics memory for the iGPU. The only real drawback I can see is if the power/thermal budget just isn’t sufficient and it ends up with wonky performance (which I have seen on an overly thin-and-light laptop in my personal fleet: from memory it’s got a Ryzen 2600 that’s horribly thermally limited, and because of that it leaves so much performance on the table).
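
As a toy illustration of that "fast on-package pool first, spill to system RAM when it runs out" idea (the class, names, and sizes are invented for the sketch, not any real driver API):

```python
class TieredVRAM:
    """Toy two-tier graphics memory pool: prefer the small, fast
    on-package memory; fall back to slower system RAM when it's full."""

    def __init__(self, fast_mb, system_mb):
        self.free = {"on_package": fast_mb, "system_ram": system_mb}

    def alloc(self, size_mb):
        """Return the tier that served the request, fastest first."""
        for tier in ("on_package", "system_ram"):
            if self.free[tier] >= size_mb:
                self.free[tier] -= size_mb
                return tier
        raise MemoryError("out of graphics memory")

pool = TieredVRAM(fast_mb=2048, system_mb=8192)
print(pool.alloc(1536))  # on_package (fits in the fast pool)
print(pool.alloc(1024))  # system_ram (fast pool only has 512 MB left)
```

A real driver would additionally migrate hot buffers back into the fast tier, which is the "intelligent caching" part; this sketch only shows the spill-over.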

3 points

Unless he thinks he’s going to serve all that from the die in the next 5 years.

4 points

You think Intel is going to have 500-850mm^2 dies?

That’s what they need to compete in the GPU space.
