The concern is that branches and commits that are not otherwise publicly visible become visible, thanks to the way GitHub handles forks.
Here’s a link to an earlier discussion on this topic: https://lemmy.ml/post/18368342
Hallucinations are an unavoidable part of LLMs, and are just as present in the human mind. Training data isn’t the issue. The issue is that the systems built around LLMs are designed to use them for more than they should be used for.
I don’t think anything short of being able to validate an LLM’s output without running it through another LLM will fully prevent hallucinations.
The main disadvantage I can think of would involve a situation where your email (and possibly other personal data) was exposed without your name attached. It’d be possible for your DLN and/or SSN (or the equivalents for other countries) and email to be exposed without your name being exposed, for example. This wouldn’t have to be a breach - it could be that, for privacy purposes, certain people working with accounts simply don’t have visibility into names.
It’s also feasible that an employee might have access to your full name but only to partially masked email addresses. So if your email is site-firstname-lastname@example.com and they see site-firstname-****@example.com, they can make an educated guess as to your full email address.
Also, if your email were exposed by itself and someone tried to phish you, it would be more effective if they knew your name.
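To illustrate the guessing risk (with a hypothetical masking scheme - I don’t know how any particular company actually masks addresses), here’s a sketch of a mask that hides only the last token of the local part:

```python
def mask_email(email: str) -> str:
    """Hypothetical masking: hide everything after the last '-' in the
    local part, keeping the rest of the address visible."""
    local, domain = email.split("@")
    if "-" in local:
        prefix, hidden = local.rsplit("-", 1)
        local = prefix + "-" + "*" * len(hidden)
    return local + "@" + domain

print(mask_email("site-firstname-lastname@example.com"))
# site-firstname-********@example.com
```

Note that even this mask leaks the length of the hidden token, so an employee who already knows your full name has everything they need to reconstruct the address.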
https://github.com/TriliumNext/Notes is a fork that appears to be actively developed. Found it near the end of the issue linked from the maintenance notice.
ACLU, is this really that high a priority in the list of rights we need to fight for right now?
You say this like the ACLU isn’t doing a ton of other things at the same time. Here are their 2024 plans, for example. See also https://www.aclu.org/news
Besides that, these laws are being passed now, and they’re being passed by people who have no clue what they’re talking about. It makes far more sense to lobby against them before they’re passed than to wait and challenge them in court afterward.
Wouldn’t these arguments fall apart under the lens of slander?
If you disseminate a deepfake with slanderous intent then your actions are likely already illegal under existing laws, yes, and that’s exactly the point. The ACLU is opposing new laws that are over-broad. There are gaps in the laws, and we should fill those gaps, but not at the expense of infringing upon free speech.