4 points
I think it makes sense. I like ChatGPT and I appreciate having easy access to it. What I really wish for is the option to use local models instead. I realize most people don't have machines that can run inference quickly enough, but for those who do…
5 points
Seconding this. Why not allow people to run llama3 or other open source models?