Wake me up when it works offline
“The Llama 3.1 models are available for download through Meta’s own website and on Hugging Face. They both require providing contact information and agreeing to a license and an acceptable use policy, which means that Meta can technically legally pull the rug out from under your use of Llama 3.1 or its outputs at any time.”
WAKE UP!
It works offline. When you use it with Ollama, you don’t have to register or agree to anything.
Once you have downloaded it, it will keep on working; Meta can’t shut it down.
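For anyone who wants to try this, here is a minimal sketch of talking to a locally served Llama 3.1 8B through Ollama’s local HTTP API (assuming the default port 11434, the /api/chat endpoint, and the llama3.1:8b model tag; check `ollama list` for what you actually have). After the initial `ollama pull llama3.1:8b`, nothing here goes beyond localhost.

    # Chat with a locally served Llama 3.1 8B via Ollama's HTTP API.
    # Assumes Ollama is running on its default port and the model has
    # already been pulled with `ollama pull llama3.1:8b`.
    import requests

    resp = requests.post(
        "http://localhost:11434/api/chat",
        json={
            "model": "llama3.1:8b",  # model tag as listed by `ollama list`
            "messages": [
                {"role": "user", "content": "Say hi in five words."}
            ],
            "stream": False,  # ask for one complete JSON response
        },
        timeout=300,
    )
    resp.raise_for_status()
    print(resp.json()["message"]["content"])  # the assistant's reply text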
It’s available through Ollama already. I am running the 8B model on my little server with its 3070 right now.
It’s really impressive for an 8B model.
Yup, an 8 GB card.
It’s my old one from the gaming PC, after switching to AMD.
It now serves as my little AI hub and Whisper server for Home Assistant.
I’m running 3.1 8B as we speak via Ollama, totally offline, and gave my info to nobody.