I can’t wait for the spectacular implosion

13 points

these are compute GPUs that don’t even have graphics ports

17 points

Yes, my point is that the compute from those chips can still be used. Maybe for actually useful machine learning tools developed later, or for some other technology that can make use of this kind of parallel computing.

7 points

I’m waiting on the A100 fire sale next year

6 points

I know of at least one company that uses CUDA for ray tracing, for what I believe is ground research, so there are definitely already some useful things happening.

I mean, there are a lot of applications for linear algebra, although I admit I don’t fully know in what way “AI” uses linear algebra, or what other uses overlap with it.
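For what it’s worth, the “AI uses linear algebra” part mostly boils down to matrix multiplication: a dense neural-network layer is just a matrix-vector product plus a bias, which is exactly the kind of massively parallel arithmetic these compute GPUs are built for. A minimal pure-Python sketch (no frameworks; the function name and values are made up for illustration):

```python
def dense_layer(W, x, b):
    """One dense layer: y[i] = sum_j W[i][j] * x[j] + b[i].

    On a GPU this whole double loop runs as one parallel
    matrix-vector product, which is why GPUs dominate ML workloads.
    """
    return [
        sum(w_ij * x_j for w_ij, x_j in zip(row, x)) + b_i
        for row, b_i in zip(W, b)
    ]

# Toy example: 2 inputs mapped to 3 outputs.
W = [[1.0, 0.0],
     [0.0, 1.0],
     [1.0, 1.0]]
b = [0.0, 0.0, 1.0]
x = [2.0, 3.0]

print(dense_layer(W, x, b))  # [2.0, 3.0, 6.0]
```

Any workload that reduces to big matrix products (graphics, simulation, signal processing) overlaps with the same hardware, which is the point about those chips staying useful.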


TechTakes

!techtakes@awful.systems


Big brain tech dude got yet another clueless take over at HackerNews etc? Here’s the place to vent. Orange site, VC foolishness, all welcome.

This is not debate club. Unless it’s amusing debate.

For actually-good tech, you want our NotAwfulTech community
