The LLMs they train on their code will only be accessible internally. They won’t leak their own intellectual property.
If only we had an overarching structure that everyone in society agrees exists for the purpose of enforcing laws and regulating things. Something that governs the people living in a region… Maybe then they could be compelled to show exactly what they’re using, and what those models are being trained on.
Oh well.