Reddit CEO Steve Huffman is standing by Reddit’s decision to block companies from scraping the site without an AI agreement.

Last week, 404 Media noticed that search engines other than Google were no longer listing recent Reddit posts in their results. This was because Reddit updated its Robots Exclusion Protocol file (robots.txt) to block bots from scraping the site. The file reads: “Reddit believes in an open Internet, but not the misuse of public content.” Since the news broke, OpenAI has announced SearchGPT, which can show recent Reddit results.
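For context, the Robots Exclusion Protocol works by publishing rules in robots.txt that well-behaved crawlers are expected to check before fetching pages. Here is a minimal sketch of that check in Python using the standard library’s urllib.robotparser; the “ExampleSearchBot” user agent is hypothetical:

```python
# Minimal sketch of how a compliant crawler honors the Robots
# Exclusion Protocol, using only the Python standard library.
# "ExampleSearchBot" is a hypothetical user agent, not a real crawler.
from urllib.robotparser import RobotFileParser

parser = RobotFileParser()
parser.set_url("https://www.reddit.com/robots.txt")
parser.read()  # fetch and parse the live robots.txt

# If the file disallows crawling for unapproved agents (as Reddit's
# updated file reportedly does), can_fetch() returns False.
if parser.can_fetch("ExampleSearchBot", "https://www.reddit.com/r/technology/"):
    print("Allowed to crawl")
else:
    print("Blocked by robots.txt")
```

Note that robots.txt is only a voluntary convention, which is presumably why Huffman describes actively blocking crawlers on top of it.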

The change came a year after Reddit began its push to stop free scraping, which Huffman initially framed as an effort to keep AI companies from making money off of Reddit content without paying. That push also led Reddit to begin charging for API access, and the high pricing drove many third-party Reddit apps to shut down.

In an interview with The Verge today, Huffman stood by the changes that left Google, temporarily, as the only search engine able to show recent discussions from Reddit. Reddit and Google signed an AI training deal in February that is reportedly worth $60 million a year. It’s unclear how much Reddit’s OpenAI deal is worth.

Huffman said:

Without these agreements, we don’t have any say or knowledge of how our data is displayed and what it’s used for, which has put us in a position now of blocking folks who haven’t been willing to come to terms with how we’d like our data to be used or not used.

“[It’s been] a real pain in the ass to block these companies,” Huffman told The Verge.

5 points

You are assuming edits overwrite existing content. Instead of overwriting, they could store each edited post as a new entry in the database with a higher version number, then show only the latest version of each post to end users while keeping the older versions available for Reddit’s own use.

In fact, it is extremely likely they do this. It is basically a necessity if you want to properly moderate a site like Reddit. Otherwise you could simply post spam or unsavory content, then overwrite it with something benign an hour or so later, before enough reports came in and a moderator had a chance to review it.
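To illustrate the idea, here is a minimal sketch of that kind of versioned storage in Python with the standard library’s sqlite3; the schema and function names are hypothetical, not Reddit’s actual design:

```python
# Minimal sketch of append-only post edits: each edit inserts a new
# row with a higher version number; readers see only the latest.
# Schema and names are hypothetical, not Reddit's actual design.
import sqlite3

db = sqlite3.connect(":memory:")
db.execute("""
    CREATE TABLE post_versions (
        post_id INTEGER,
        version INTEGER,
        body    TEXT,
        PRIMARY KEY (post_id, version)
    )
""")

def edit_post(post_id: int, body: str) -> None:
    """Store an edit as a new version instead of overwriting."""
    db.execute(
        "INSERT INTO post_versions (post_id, version, body) "
        "SELECT ?, COALESCE(MAX(version), 0) + 1, ? "
        "FROM post_versions WHERE post_id = ?",
        (post_id, body, post_id),
    )

def latest_body(post_id: int) -> str:
    """What end users see: only the newest version."""
    row = db.execute(
        "SELECT body FROM post_versions WHERE post_id = ? "
        "ORDER BY version DESC LIMIT 1",
        (post_id,),
    ).fetchone()
    return row[0]

edit_post(1, "original spam")
edit_post(1, "something benign")
print(latest_body(1))  # prints "something benign"
# Moderators could still query every older version for review.
```

An append-only design like this keeps the full edit history available for moderation at the cost of some extra storage.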

-3 points

> You are assuming edits overwrite existing content

i have seen no evidence to suggest otherwise, but thanks for sharing your theories

> In fact, it is extremely likely they do this

based on what evidence? your baseless speculation?

5 points

The fact that they managed to restore overwritten posts after people started to delete their history.

0 points

this could also be explained by sketchy scripts failing to completely delete posts/comments, which i noticed myself when checking that they had done their job properly. as i mentioned in another comment, i had to run the shredder scripts several times to get a complete overwrite/deletion. or it could be database errors failing to register edits/deletions under the extremely heavy load at the time. it could be a lot of things.

the point is that we don’t have any direct evidence of what it actually was, just a lot of circumstantial evidence and a lot of speculation. nothing definitive.

