Technically possible with a small enough model. It’s going to be pretty shit, but “working”.
Now, if we were to go further down in scale, I’m curious how/if a 700MB CD version would work.
Or how many 1.44MB floppies you would need for the actual program and smallest viable model.
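For the floppy math, here’s a quick back-of-the-envelope sketch. The model sizes are just illustrative assumptions, not real minimums:

```python
import math

FLOPPY_BYTES = 1_474_560  # formatted capacity of a "1.44MB" floppy (1440 KiB)

def floppies_needed(total_bytes: int) -> int:
    """Disks required to hold a payload, ignoring filesystem overhead."""
    return math.ceil(total_bytes / FLOPPY_BYTES)

# Hypothetical payload sizes -- substitute your own "smallest viable model":
print(floppies_needed(700 * 10**6))  # a CD-sized model: 475 floppies
print(floppies_needed(260 * 10**6))  # an arbitrary ~260MB micro-model: 177 floppies
```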
squints
That says, “PHILIPS DVD+R”.
So we’re looking at a 4.7GB budget, or just a hair under, for the tiniest, most incredibly optimized build of <INSERT_MODEL_NAME_HERE>.
Llama 3 8B, Phi-3 Mini, Mistral, Moondream 2, Neural Chat, Starling, Code Llama, Llama 2 Uncensored, and LLaVA would fit.
Might be a DVD. A 70B Ollama LLM is like 1.5GB, so you could save many models on one DVD.
A 70B model taking 1.5GB? So about 0.02 bytes, or 0.17 bits, per parameter?
Are you sure you’re not thinking of a heavily quantised and compressed 7B model or something? Ollama’s llama3 70b is 40GB from what I can find; that’s a lot of DVDs.
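The arithmetic backs this up. A quick sketch, using only the sizes quoted above:

```python
import math

GB = 10**9  # decimal gigabyte, the way disc vendors count

def bits_per_param(size_bytes: float, n_params: float) -> float:
    """Storage bits per model parameter."""
    return size_bytes * 8 / n_params

print(bits_per_param(1.5 * GB, 70e9))  # ~0.17 bits/param -- implausibly small
print(bits_per_param(40 * GB, 70e9))   # ~4.57 bits/param -- consistent with ~4-bit quantization
print(math.ceil(40 * GB / (4.7 * GB)))  # 9 single-layer DVDs for the 40GB file
```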
For some reason, triple-layer writable Blu-ray exists: 100GB per disc.
https://www.verbatim.com/prod/optical-media/blu-ray/bd-r-xl-tl/bd-r-xl-tl/
Maybe you don’t need an LLM at all: https://en.wikipedia.org/wiki/ELIZA
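ELIZA is just pattern matching plus canned responses, small enough to leave a floppy mostly empty. A toy Python sketch of the idea (not Weizenbaum’s actual DOCTOR script; the rules here are made up):

```python
import re

# A few ELIZA-style rules: a regex pattern and a response template.
RULES = [
    (re.compile(r"\bI need (.+)", re.I), "Why do you need {0}?"),
    (re.compile(r"\bI am (.+)", re.I),   "How long have you been {0}?"),
    (re.compile(r"\bmy (.+)", re.I),     "Tell me more about your {0}."),
]

def respond(utterance: str) -> str:
    for pattern, template in RULES:
        match = pattern.search(utterance)
        if match:
            return template.format(match.group(1))
    return "Please go on."  # default non-committal reply

print(respond("I am out of disk space"))
# -> How long have you been out of disk space?
```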
Not needed, I’ve got this gem.
Y’all can look, but don’t touch.
Fun fact: you can download Llama 3, an LLM made by Meta (which is surprisingly good for its size), and it’s only 4.7GB. A DVD can store 4.7GB of data, meaning you could in theory fit an LLM on a DVD.
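One catch: disc capacities are decimal gigabytes, while download sizes are often reported in binary gibibytes, so a “4.7GB” file may or may not actually fit on a “4.7GB” DVD. A quick check, assuming the download is exactly 4.7 in each unit:

```python
DVD_BYTES = 4.7 * 10**9      # single-layer DVD capacity, decimal GB

model_decimal = 4.7 * 10**9  # 4.7 GB (decimal) -- exactly fills the disc
model_binary = 4.7 * 2**30   # 4.7 GiB (binary) ~= 5.05e9 bytes -- too big

print(model_decimal <= DVD_BYTES)  # True
print(model_binary <= DVD_BYTES)   # False
```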
Thought that said cbatGPT
Please keep us updated with all future inconsequential misinterpretations.
-Management