-18 points

Technically, it could be coded to recognize the various formats of sensitive strings and blur anything that matches, indiscriminately.

21 points

That would require knowing the formats of the strings in advance. And it requires the text to actually be text.

What if you had a photo of a handwritten piece of sensitive information?

-4 points

I don’t understand your meaning. Screenshots of a photo are still screenshots, and manipulating text in a photo is already a thing (you can use a phone camera to translate text directly from a physical surface).

11 points

Handwritten. OCR isn’t perfect, especially with handwritten content.

12 points

I doubt that OCR (optical character recognition) is done on-device, so the data is likely being sent to some server for processing.

As a software engineer: in any of our corporate applications, when a user hits delete we just toggle an archived flag, but the data is still there. So I wouldn’t trust any application to actually do what it says.
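
For anyone who hasn’t seen the pattern, here’s a minimal sketch of that kind of soft delete (the table and column names are made up for illustration, not any real product’s schema):

```python
import sqlite3

# Toy schema: "deleting" a note just flips a flag.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE notes (id INTEGER PRIMARY KEY, body TEXT, archived INTEGER DEFAULT 0)")
db.execute("INSERT INTO notes (body) VALUES ('super private note')")

# What the UI calls "delete":
db.execute("UPDATE notes SET archived = 1 WHERE id = 1")

# The app only shows unarchived rows, so the user sees nothing...
print(db.execute("SELECT body FROM notes WHERE archived = 0").fetchall())  # []

# ...but anyone with access to the database still gets everything.
print(db.execute("SELECT body FROM notes").fetchall())  # [('super private note',)]
```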

There are so many technical barriers to Recall ever being able to avoid capturing your private data that I wouldn’t go anywhere near the thing.

Edit: Furthermore, what happens when MS inevitably gets hacked again, someone steals all the data it has, and then starts using it to commit fraud?

15 points

As a software engineer: in any of our corporate applications, when a user hits delete we just toggle an archived flag, but the data is still there.

What many people don’t realize is that this is how some low-level data stores work as well. Even regular ol’ file systems do this (basically): deleting a file typically just removes the directory entry, and the data blocks stick around until something overwrites them.
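
A deliberately oversimplified model, just to show the shape of it (this is not any real filesystem’s code):

```python
class ToyFS:
    """Toy filesystem: names map to blocks; delete only touches the names."""

    def __init__(self):
        self.blocks = {}    # block number -> raw bytes
        self.dirents = {}   # filename -> block number
        self.next_block = 0

    def write(self, name, data):
        self.blocks[self.next_block] = data
        self.dirents[name] = self.next_block
        self.next_block += 1

    def delete(self, name):
        # Remove the directory entry; the bytes stay until overwritten.
        del self.dirents[name]

fs = ToyFS()
fs.write("taxes.txt", b"SSN: 123-45-6789")
fs.delete("taxes.txt")

print(fs.dirents)  # {} -- the file is "gone"
print(fs.blocks)   # {0: b'SSN: 123-45-6789'} -- the bytes are not
```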

6 points

Blurring isn’t destructive.
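
Right, and if the blur kernel is known, a plain blur can even be inverted. A minimal noiseless sketch of the idea (illustrative only; real redaction recovery needs regularized deconvolution, and none of this is Recall’s actual pipeline):

```python
import numpy as np

rng = np.random.default_rng(0)
img = rng.random((64, 64))        # stand-in for a "redacted" image region

# Small Gaussian blur kernel, zero-padded to the image size.
x = np.arange(-3, 4)
g = np.exp(-x**2 / 2.0)
k = np.outer(g, g)
k /= k.sum()
kpad = np.zeros_like(img)
kpad[:7, :7] = k

# Blurring is (circular) convolution: multiplication in frequency space.
H = np.fft.fft2(kpad)
blurred = np.real(np.fft.ifft2(np.fft.fft2(img) * H))

# With the kernel known and no noise, dividing it back out undoes the blur.
recovered = np.real(np.fft.ifft2(np.fft.fft2(blurred) / H))

print(np.max(np.abs(recovered - img)))  # prints a tiny number: the image comes back
```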

5 points

In that case, instead of blurring, let’s have it turn the device into an I.E.D.

33 points
  1. OCR is never perfect.
  2. A partial credit card number or partial SSN wouldn’t match the format, but is still sensitive (see the sketch just below this list).
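
A quick sketch of point 2 (the regex is a generic hypothetical card pattern, not Recall’s actual filter):

```python
import re

# Generic "16 digits with optional separators" pattern.
CARD = re.compile(r"\b(?:\d[ -]?){15}\d\b")

full    = "4111 1111 1111 1111"   # matches the format: gets flagged
partial = "ends in 1111 1111"     # same kind of sensitive data, no match

for text in (full, partial):
    print(repr(text), "->", bool(CARD.search(text)))
# '4111 1111 1111 1111' -> True
# 'ends in 1111 1111' -> False
```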
-26 points
  1. Perfection is impossible. Demanding it is silly. Loopholes are unavoidable in everything.
  2. Context can be trained.
14 points

If you agree that it will never be perfect at filtering out sensitive information, why support it?

36 points

Perfection is impossible. Demanding it is silly.

In this case perfection is actually easy: it could avoid capturing credit card info 100% of the time by not taking screenshots of everything in the first place.

3 points

No, it cannot. That implies having samples of every possible form so the LLM can interpolate. And even then, something sensitive to me might be harmless to you. The LLM cannot know your intent.

21 points

Demanding perfection for a system as dangerous as Recall is not silly.

It’s like keeping an armed nuclear bomb in the center of a city at all times and saying “hey, it’s ok that its activation sequence isn’t perfect, it probably won’t go off”.

The only way to make it perfect is to not install the nuke/Recall at all.

8 points

Perfection is impossible. Demanding it is silly.

  1. This isn’t even a matter of perfection; this is Recall barely managing to censor the most blatantly sensitive information (see the article: “I also created my own HTML page with a web form that said, explicitly, ‘enter your credit card number below.’ The form had fields for Credit card type, number, CVC and expiration date.”)
  2. Demanding that a system protect user data is not silly; it is necessary. If a given system can’t do that, then it should never be used, especially since this is likely to end up on PCs handling extra-sensitive data with strict privacy requirements, such as medical data protected by HIPAA.

Context can be trained.

  1. Maybe Microsoft shouldn’t have released the tool until it had that context, then?

If a company releases a half-baked tool that doesn’t do what it advertises, fails even simple tests at identifying sensitive data, and makes data security nearly impossible to guarantee, then it should never be used or advertised in any context where sensitive data could ever be present.
