Early speculation is that it’s an MoE (mixture of experts) of eight 7B models, so maybe not earth-shattering like their last release, but highly intriguing. Will update with more info as it comes out.

0 points

Honestly, it’s such a good idea to share models via P2P; it saves so much bandwidth. Of course there should still be a direct download (DDL) for preservation, but still.


LocalLLaMA

!localllama@sh.itjust.works

Community to discuss LLaMA, the large language model created by Meta AI.

This is intended to be a replacement for r/LocalLLaMA on Reddit.

Community stats

  • 5 monthly active users
  • 74 posts
  • 54 comments