Thx in advance.

5 points

I actually use an AMD card for running image generation and LLMs on my Linux PC. It's not hard to set up.

2 points

Details on your setup?

4 points

I’m not the original person you replied to, but I have a similar setup. I’m using a 6700 XT, with both InvokeAI and stable-diffusion-webui-forge set up and running without any issues. While I’m running Arch Linux, I have it set up in Distrobox so it’s agnostic to the distro I’m running (since I’ve hopped between quite a few distros); the container itself is Ubuntu-based.

The only hiccup I ran into is that while ROCm does support this card, you need to set an environment variable for it to be picked up correctly. At the start of both sd-webui’s and InvokeAI’s launch scripts, I just add:

# Report the card as gfx1030 so the ROCm runtime accepts this RDNA2 GPU
export HSA_OVERRIDE_GFX_VERSION=10.3.0

That sets it up, and it works perfectly. This is the link to the Distrobox container file I use to get it all up and running.
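
If you’d rather not dig through that file, a rough equivalent with plain distrobox commands looks something like this (the container name and the rocm/dev-ubuntu-22.04 image tag are just placeholders, not the exact contents of the file):

# On the host: make sure your user can reach the GPU device nodes
# (the usual pair of groups is video and render; names can vary by distro).
sudo usermod -aG video,render "$USER"

# Create an Ubuntu-based container from AMD's ROCm dev image.
distrobox create --name rocm-ml --image docker.io/rocm/dev-ubuntu-22.04:latest

# Enter it; distrobox shares /dev with the host, so /dev/kfd and /dev/dri
# should already be visible inside, and you can install InvokeAI / sd-webui-forge as usual.
distrobox enter rocm-ml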

2 points

Thx. I’m dabbling right now with a 2015 Intel i5 SFF and a low-profile RX 6400 GPU, but it looks like I’ll be getting back to all my gear soon, and I was curious to see what others are having success running with.

I think I’m looking at upgrading to an RX 7600 or better GPU paired with a Ryzen 7, but I’m still on the sidelines watching the Ryzen 9000 rollout.

I still haven’t tried any image generation and have only used llamafile and LM Studio, but I’d like to dig a little deeper, while accounting for my dreaded ADHD that makes it miserable to learn new skills…

2 points

I have Fedora installed on my system (I don’t know how the situation is on other distros regarding ROCm) and my GPU is an RX 6700 XT. For image generation I use Stable Diffusion webui and for LLMs I use text-generation-webui. Both installed everything they needed by themselves and work perfectly fine on my AMD GPU. I can also give you more info if there’s anything else you wanna know.
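
If you want a quick sanity check that the webui is really using the GPU and not falling back to CPU, something like this from inside its Python environment works (the venv path here is just an example, adjust it to wherever you installed):

# Activate the webui's virtual environment (example path).
source ~/stable-diffusion-webui/venv/bin/activate

# The ROCm build of PyTorch reuses the torch.cuda API, so this prints the
# Radeon card's name if the GPU is picked up, or False otherwise.
python3 -c "import torch; print(torch.cuda.is_available() and torch.cuda.get_device_name(0))"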

