19 points

Wake me up when it works offline

“The Llama 3.1 models are available for download through Meta’s own website and on Hugging Face. They both require providing contact information and agreeing to a license and an acceptable use policy, which means that Meta can technically legally pull the rug out from under your use of Llama 3.1 or its outputs at any time.”

1 point

Through Meta…

That’s where I stop caring

4 points

I was able to set up a small one via Open WebUI.

It did ask me to make an account, but I didn’t see any pinging home when I did it.

What am I missing here?

33 points

WAKE UP!

It works offline. When you use it with Ollama, you don’t have to register or agree to anything.

Once you have downloaded it, it will keep on working; Meta can’t shut it down.
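
Since the thread is about running it locally, here’s a minimal sketch of what “works offline” looks like in practice. It assumes Ollama is installed with its server on the default port (11434) and that `ollama pull llama3.1` has already fetched the weights; after that, everything below talks only to localhost.

```python
# Minimal sketch: ask a locally running Ollama server for a completion.
# Assumes `ollama pull llama3.1` has already downloaded the model and
# the server is listening on its default port; nothing here leaves
# your machine.
import json
import urllib.request

payload = json.dumps({
    "model": "llama3.1",      # 8B tag from ollama.com/library/llama3.1
    "prompt": "Why is the sky blue?",
    "stream": False,          # return one complete JSON object
}).encode("utf-8")

req = urllib.request.Request(
    "http://localhost:11434/api/generate",
    data=payload,
    headers={"Content-Type": "application/json"},
)
with urllib.request.urlopen(req) as resp:
    print(json.loads(resp.read())["response"])
```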

1 point

Well, yes and no. See the other comment: 64 GB of VRAM at the lowest setting.

9 points

Oh, sure. For the 405B model it’s absolutely infeasible to host it yourself. But for the smaller models (70B and 8B), it can work.

I was mostly replying to the part where they claimed Meta can take it away from you at any point, which is simply not true.

12 points

I’m running 3.1 8B as we speak via Ollama, totally offline, and gave my info to nobody.

https://ollama.com/library/llama3.1

14 points

It’s available through Ollama already. I’m running the 8B model on my little server with its 3070 right now.

It’s really impressive for an 8B model.

1 point

Intriguing. Is that an 8 GB card? Might have to try this after all.


Yup, an 8 GB card.

It’s my old one from my gaming PC, left over after switching to AMD.

It now serves as my little AI hub and Whisper server for Home Assistant.

2 points

Did anyone get 70B to run locally?

If so, what hardware specs?

5 points

AFAIK you need about 40 GB of VRAM for a 70B model.
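
For anyone wondering where numbers like that come from, here’s a back-of-envelope sketch. The bytes-per-parameter figures are assumptions about quantization, and real usage adds overhead for the KV cache and activations:

```python
# Rough weight-memory estimate for a 70B-parameter model.
# Bytes per parameter depends on quantization; these are the
# common cases, and actual VRAM use is higher due to overhead.
PARAMS = 70e9

for label, bytes_per_param in [("FP16", 2.0), ("8-bit", 1.0), ("4-bit", 0.5)]:
    print(f"{label}: ~{PARAMS * bytes_per_param / 1e9:.0f} GB for weights alone")

# FP16: ~140 GB, 8-bit: ~70 GB, 4-bit: ~35 GB, so the oft-quoted
# ~40 GB figure corresponds to a 4-bit quant plus some headroom.
```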

3 points

Can’t you offload some of it to RAM?

7 points

Same requirements, but much slower.
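
For the curious, this is roughly how the offloading knob looks with the llama-cpp-python bindings (the model path here is hypothetical, and the right layer count depends on your card). Layers kept on the GPU run fast; the rest are evaluated on the CPU out of system RAM, which is where the slowdown comes from:

```python
# Sketch of partial GPU offloading with llama-cpp-python.
# Model path is hypothetical; n_gpu_layers controls how many
# transformer layers live in VRAM, with the remainder computed
# on the CPU from system RAM (much slower per token).
from llama_cpp import Llama

llm = Llama(
    model_path="./llama-3.1-70b-q4_k_m.gguf",  # hypothetical local file
    n_gpu_layers=40,  # as many as fit in VRAM; -1 offloads all layers
    n_ctx=4096,       # context window
)
out = llm("Q: What is 2+2? A:", max_tokens=8)
print(out["choices"][0]["text"])
```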

11 points

Kind of petty of Zuck not to roll it out in Europe due to the Digital Services Act… But it’s also kind of weird, since it’s open source? What’s stopping anyone from downloading the model and creating a web UI for European users?

16 points

Yo, this is big. Both in that it’s momentous, and in that, holy shit, that’s a lot of parameters. How many GB is this model?? I’d be able to run it if I had a few extra $10k bills lying around to buy the required hardware.

22 points

It’s around 800 GB.
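
That figure checks out as simple arithmetic, assuming the published weights are 16 bits per parameter:

```python
# 405B parameters at 2 bytes each (FP16/BF16), before any overhead.
params = 405e9
print(f"~{params * 2 / 1e9:.0f} GB")  # ~810 GB
```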

5 points

God damn.

3 points

That’s some thick model.

1 point

Time to buy a Threadripper and 800 GB of RAM so that I can run this model at 1 token per hour.

1 point

That looks good on paper, but while I find ChatGPT good for encouraging critical thinking, I’ve found Meta’s products (Facebook and Instagram) to be sources of disinformation. That makes me have reservations about Meta’s intentions with LLMs. As the article says, the model comes pre-trained, so it’s mostly made up of information gathered by Meta.

2 points

Neither Meta nor anyone else is hand-curating their dataset. The fact that Facebook is full of grandparents sharing disinformation doesn’t impact what’s in their model.

But all LLMs are going to have accuracy issues because they’re 1) trained on text written by humans who themselves are inaccurate and 2) designed to choose tokens based on probability rather than any internal logic as to whether an answer is factual.

All LLMs are full of shit. That doesn’t mean they’re not fun or even useful in some applications, but you shouldn’t trust anything they write.
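
To make the “tokens chosen by probability” point concrete, here’s a toy sketch of how sampling works. The vocabulary and logit values are invented for illustration; a real model has tens of thousands of tokens:

```python
# Toy illustration of probabilistic token selection: the model emits
# logits, softmax turns them into probabilities, and the next token
# is sampled. Nothing in this loop checks whether the answer is true.
import math
import random

vocab = ["Paris", "London", "Berlin", "banana"]
logits = [4.0, 2.5, 2.0, 0.1]  # made-up scores for the next token

def sample(logits, temperature=1.0):
    scaled = [l / temperature for l in logits]
    m = max(scaled)
    exps = [math.exp(s - m) for s in scaled]
    total = sum(exps)
    probs = [e / total for e in exps]
    return random.choices(vocab, weights=probs, k=1)[0], probs

token, probs = sample(logits)
print(token, [round(p, 3) for p in probs])
# Usually "Paris", but occasionally "banana": probable, not verified.
```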

