44 points

Technically possible with a small enough model to work from. It’s going to be pretty shit, but “working”.

Now, if we were to go further down in scale, I’m curious how/if a 700MB CD version would work.

Or how many 1.44MB floppies you would need for the actual program and smallest viable model.
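The floppy question is just division; a quick back-of-the-envelope sketch, using hypothetical model sizes for illustration:

```python
import math

FLOPPY = 1440 * 1024  # a "1.44MB" floppy actually holds 1,474,560 bytes

def floppies_needed(size_bytes: int) -> int:
    """Floppies required to hold size_bytes, ignoring filesystem overhead."""
    return math.ceil(size_bytes / FLOPPY)

# Illustrative sizes, not real downloads:
cd_model = 700 * 1000**2        # a model trimmed to fit one 700MB CD
tiny_model = 1_100 * 1000**2    # ~1.1GB, e.g. a 1B-param model at 8-bit

print(floppies_needed(cd_model))    # → 475
print(floppies_needed(tiny_model))  # → 746
```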

12 points

Might be a DVD. A 70B Ollama LLM is like 1.5GB, so you could save many models on one DVD.

5 points

It does have the label DVD-R

8 points

A 70B model taking 1.5GB? So about 0.02 bytes per parameter?

Are you sure you’re not thinking of a heavily quantised and compressed 7B model or something? Ollama’s llama3 70b is 40GB from what I can find; that’s a lot of DVDs.
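The arithmetic behind that sanity check, using the file sizes quoted in this thread:

```python
def bits_per_param(file_bytes: float, n_params: float) -> float:
    """Storage actually spent per parameter, in bits."""
    return file_bytes * 8 / n_params

# The claimed 1.5GB file for a 70B model: far below any usable quantisation.
print(round(bits_per_param(1.5e9, 70e9), 2))  # → 0.17

# The ~40GB download listed for llama3:70b: consistent with ~4-bit quantisation.
print(round(bits_per_param(40e9, 70e9), 2))   # → 4.57
```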

7 points

Less than half of a BDXL though! The dream still breathes

5 points

For some reason, triple-layer writable Blu-ray exists: 100GB each.

https://www.verbatim.com/prod/optical-media/blu-ray/bd-r-xl-tl/bd-r-xl-tl/

9 points

Ah yes, probably the smaller version, you’re right. Still a very good LLM, better than GPT-3.

8 points

It is a DVD; you can faintly see DVD+R on the left side.

8 points

Maybe not an LLM at all: https://en.wikipedia.org/wiki/ELIZA

4 points

Yes, I guess it would be a funny experiment for just a local model.

16 points

squints

That says, “PHILLIPS DVD+R”

So we’re looking at a 4.7GB model, or just a hair under the tiniest, most incredibly optimized implementation of <INSERT_MODEL_NAME_HERE>

13 points

Llama 3 8B, Phi-3 Mini, Mistral, Moondream 2, Neural Chat, Starling, Code Llama, Llama 2 Uncensored, and LLaVA would fit.
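For scale, assuming the approximate quantised download sizes Ollama listed for these models (ballpark figures, not exact), a quick check of what fits on a single-layer disc:

```python
# Approximate quantised download sizes in GB as listed by Ollama;
# treat these as ballpark figures, not exact.
models = {
    "llama3:8b": 4.7, "phi3:mini": 2.3, "mistral": 4.1, "moondream": 0.8,
    "neural-chat": 4.1, "starling-lm": 4.1, "codellama": 3.8,
    "llama2-uncensored": 3.8, "llava": 4.5,
}

DVD_GB = 4.7  # single-layer DVD+R, decimal gigabytes

# llama3:8b at a nominal 4.7GB is borderline; everything else fits comfortably.
fits = [name for name, gb in models.items() if gb <= DVD_GB]
print(fits)
```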

1 point

Just interested in the topic: did you 🔨 offline privately?

7 points

ELIZA was pretty impressive for the 1960s, as a chatbot for psychology.
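ELIZA’s trick, keyword rules plus pronoun reflection, fits in a few lines. A minimal illustrative sketch (not Weizenbaum’s actual DOCTOR script; the rules and reflections here are made up for the example):

```python
import re

# Reflect first-person words back at the speaker.
REFLECT = {"i": "you", "my": "your", "am": "are", "me": "you"}

# Keyword rules tried in order; the last one is a catch-all.
RULES = [
    (re.compile(r"i feel (.*)", re.I), "Why do you feel {0}?"),
    (re.compile(r"i am (.*)", re.I), "How long have you been {0}?"),
    (re.compile(r".*"), "Please tell me more."),
]

def reflect(text: str) -> str:
    return " ".join(REFLECT.get(w.lower(), w) for w in text.split())

def respond(line: str) -> str:
    for pattern, template in RULES:
        m = pattern.match(line)
        if m:
            return template.format(*(reflect(g) for g in m.groups()))
    return "Please tell me more."

print(respond("I feel alone"))          # → Why do you feel alone?
print(respond("I am tired of my job"))  # → How long have you been tired of your job?
```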

3 points

pkzip -& a:\chatgpt.zip c:\chatgpt\*.*

