AFAIK most LLMs run purely on the GPU, don’t they?

So if I have an Nvidia Titan X with 12GB of VRAM, could I plug it into my laptop and offload the work to it?

I am using Fedora, so getting the NVIDIA drivers would be… fun, and probably a dealbreaker already (I wouldn’t want to run proprietary drivers on my daily system).

I know that people were able to use GPUs externally with ExpressCard adapters, and this is possible with Thunderbolt too, isn’t it?

The question is, how well does this work?

Or would it make more sense to use a small SoC to host a webserver for the interface and do all the computing on the GPU?

I am curious about the difficulties here: an ARM SoC and proprietary drivers? The laptop over USB-C (maybe not Thunderbolt?) and a GPU just for the AI tasks…
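To make the second idea concrete, here is a minimal sketch of what I’m imagining (the address is a placeholder, and it assumes the GPU box runs an OpenAI-compatible server such as llama.cpp’s llama-server, with the SoC only serving the frontend):

```python
# Sketch of the split setup: the SoC runs a tiny web frontend and just
# forwards prompts to a separate GPU machine running an OpenAI-compatible
# inference server (e.g. llama.cpp's llama-server). Address is hypothetical.
from flask import Flask, jsonify, request
import requests

app = Flask(__name__)
GPU_BOX = "http://192.168.1.50:8080"  # hypothetical address of the GPU machine

@app.route("/ask", methods=["POST"])
def ask():
    prompt = request.json["prompt"]
    # No inference happens on the SoC; it only proxies the request.
    r = requests.post(
        f"{GPU_BOX}/v1/completions",
        json={"prompt": prompt, "max_tokens": 256},
        timeout=120,
    )
    return jsonify(r.json())

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=5000)
```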


Your best bet would probably be to get a used office PC to put the card in. You’ll likely have to replace the power supply and maybe swap the storage, but with how much proper external enclosures go for, the price might not be too different. Some frameworks don’t support loading weights directly onto the GPU, so make sure that you have more RAM than VRAM.
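To illustrate the RAM/VRAM point, a rough sketch with llama.cpp’s Python bindings (the path and values are made up): the model file is loaded through system RAM before layers are offloaded, so RAM needs to cover the model size even when most layers end up in the card’s 12GB.

```python
# Rough sketch with llama-cpp-python (path and values are placeholders).
# The GGUF file is read through system RAM before layers are offloaded
# to the GPU, which is why you want RAM >= model size even when most
# layers end up in VRAM.
from llama_cpp import Llama

llm = Llama(
    model_path="/models/some-7b-q4.gguf",  # hypothetical model file
    n_gpu_layers=32,  # how many layers to push onto the card
    n_ctx=2048,
)

out = llm("Q: Will a 12GB card fit a 7B model?\nA:", max_tokens=64)
print(out["choices"][0]["text"])
```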

An ARM SoC won’t work in most cases due to a lack of PCIe bandwidth and software support. The only board I know of that can do it is the RPi 5, and that’s still mostly a proof of concept.

In general I wouldn’t recommend a Titan X unless you already have one, because it’s been deprecated in CUDA, so getting modern libraries to work will be a pain.
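If you want to check what you’re dealing with, here’s a quick sanity check (a sketch, assuming a CUDA-enabled PyTorch install; the Maxwell Titan X should report compute capability 5.2):

```python
# Quick sanity check (sketch, assumes a CUDA-enabled PyTorch install):
# compare the card's compute capability against the architectures the
# installed build was compiled for.
import torch

if torch.cuda.is_available():
    major, minor = torch.cuda.get_device_capability(0)
    print(f"{torch.cuda.get_device_name(0)}: compute capability {major}.{minor}")
    # The Maxwell Titan X reports 5.2; if "sm_52" is missing from this
    # list, the build won't target the card.
    print("Build targets:", torch.cuda.get_arch_list())
else:
    print("No usable CUDA device found by this build.")
```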


> In general I wouldn’t recommend a Titan X unless you already have one, because it’s been deprecated in CUDA, so getting modern libraries to work will be a pain.

Omg, I spent too much on this… thanks for the heads-up, that is a major fuckup.
