46 points

Why do I need to know all of this stuff, why isn’t the web safe by default?

The answer to questions like this is often that there was no need for such safety features when the underlying technology was introduced (more examples here) and adding it later required consensus from many people and organizations who wouldn’t accept something that broke their already-running systems. It’s easy to criticize something when you don’t understand the needs and constraints that led to it.

(The good news is that gradual changes, over the course of years, can further improve things without being too disruptive to survive.)

He’s not wrong in principle, though: building safe websites is far more complicated than it should be, and relies far too much on the site behaving in the user’s best interests, especially when client-side scripts are used.

-1 points

Anything that didn’t need that kind of security from the beginning also wouldn’t break if it were added.

The stuff that would break is vulnerable precisely because that security doesn’t exist.

-13 points

It’s easy to criticize something when you don’t understand the needs and constraints that led to it.

And that assumption is exactly what led us to the current situation.

It doesn’t matter why the present is garbage; it’s garbage and we should address that. Statements like this are the engineering equivalent of “it is what it is shrug emoji”.

Take a step back and look at the pile of overengineered yet underthought, inefficient, insecure and complicated crap that we call the modern web. And it’s not only the browser, but also the backend stack.

Think about how many indirections and half-baked abstraction layers are between your code and what actually gets executed.

17 points

Statements like this are the engineering equivalent of “it is what it is shrug emoji”.

No, what I wrote is nothing like that. Please re-read until you understand it better.

-16 points

Of course it is like that. You’re saying that the complaint is wrong because the author doesn’t know the history, and now you accuse me of not understanding you, because I pointed this out.

If you have to accuse everyone of “not understanding”, maybe you’re the one who doesn’t understand.

11 points

It doesn’t matter why the present is garbage; it’s garbage and we should address that. Statements like this are the engineering equivalent of “it is what it is shrug emoji”.

I don’t think your opinion is grounded in reality. The “it is what it is” actually reflects the fact that there is no way to fix the issue in a backwards-compatible way, and it’s unrealistic to believe that vulnerable frameworks/websites/webservices can be updated at a moment’s notice, or even at all. This fact is mentioned in the article. Those that could be updated have already moved to a proper authentication scheme. Those that couldn’t still have to keep working after users upgrade their browsers.

5 points

A lot of the web used to run on Flash. Then Apple came around and said “Flash is terrible and insecure”. Within a number of years everything moved away from Flash, so it’s definitely possible to force the web in new directions.

4 points

Take a step back and look at the pile of overengineered yet underthought, inefficient, insecure and complicated crap that we call the modern web…

Think about how many indirections and half-baked abstraction layers are between your code and what actually gets executed.

Think about that, and then…what, exactly? As a website author, you don’t control the browser. You don’t control the web standards.

I’m extremely sympathetic to this way of thinking, because I completely agree. The web is crap, and we shouldn’t be complacent about that. But if you are actually in the position of building or maintaining a website (or any other piece of software), then you need to build on what already exists, unless you’re in the exceedingly rare position of being able to near-unilaterally make changes to an existing platform (as Google does with Chrome, or Microsoft and Apple do with their OSes) or to throw out a huge amount of standard infrastructure and start as close to “scratch” as possible (e.g. GNU Hurd, Mill Computing, Oxide, Redox OS, etc; note that several of these are hobby projects not yet ready for “serious” use).

3 points

Okay, and how would you address it? The limitation is easy to criticize when you can think in a vacuum about it. But in the real world, we’d need to find a way to change things that can actually be implemented by everyone.

Which usually means transformative change.

1 point

It doesn’t matter why the present is garbage; it’s garbage and we should address that.

The problem is fixing it without inadvertently breaking things for someone else. Changing default behavior isn’t easy.

There are probably critical systems that rely on old, outdated practices because that’s the way things worked when they were written 20 years ago. Why should their owners go back and fix code that has worked perfectly fine for the past two decades?

0 points

If you think anything in software has worked “perfectly fine for the past two decades”, you’re probably not looking closely enough.

I exaggerate, but honestly, not much.

21 points

First and foremost _____ is a giant hack to mitigate legacy mistakes.

Wow, every article on web technology should start this way. And lots of non-web technologies, too.

17 points

Unless I’m missing something, the post is plain wrong in some parts. You can’t just POST to a cross-site API, because the browser will send a CORS preflight before sending the real request. The only way around that, IIRC, is form submits, and for those you need CSRF protection.
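For context on when a preflight actually happens: the browser only skips the OPTIONS preflight for so-called “simple” requests. Here is a rough sketch of that check, simplified from the Fetch spec’s CORS-safelisted rules (the function and names are illustrative, not a real browser API):

```javascript
// Sketch of the CORS "simple request" check: a cross-site request that
// passes it is sent WITHOUT a preflight (OPTIONS) round trip.
// Simplified from the Fetch spec's CORS-safelisted method/header rules.
const SIMPLE_METHODS = new Set(["GET", "HEAD", "POST"]);
const SIMPLE_CONTENT_TYPES = new Set([
  "application/x-www-form-urlencoded",
  "multipart/form-data",
  "text/plain",
]);
const SAFELISTED_HEADERS = ["accept", "accept-language", "content-language"];

function needsPreflight(method, headers = {}) {
  if (!SIMPLE_METHODS.has(method.toUpperCase())) return true;
  for (const [name, value] of Object.entries(headers)) {
    const n = name.toLowerCase();
    if (n === "content-type") {
      // only the three form-style content types are safelisted
      if (!SIMPLE_CONTENT_TYPES.has(value.split(";")[0].trim())) return true;
    } else if (!SAFELISTED_HEADERS.includes(n)) {
      return true; // any other custom header triggers a preflight
    }
  }
  return false;
}

needsPreflight("POST", { "Content-Type": "application/json" });
needsPreflight("POST", { "Content-Type": "application/x-www-form-urlencoded" });
```

Note that a form-style POST (application/x-www-form-urlencoded) passes the check and is sent with no preflight at all, which is exactly why classic form submits still need CSRF protection.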

Also, the CORS proxy statement is wrong, if I’m not misunderstanding their point. Proxies don’t break security, because they are obviously not the cookie domain. The browser sees only the proxy’s domain and will never send the target site’s cookies to it.

Anyways, don’t trust the post or me. Just read https://owasp.org/ for web security advice.

2 points

As a userscript author, I can confirm it is some bullshit.

1 point

Thanks, very interesting. I’m a bit confused about what this means:

explicit credentials are unsuitable for server-rendered sites as they aren’t included in top-level navigation

What does “top-level navigation” mean here?

1 point

Note: When I say “top-level” I am talking about the URL that you see in the address bar. So if you load fun-games.example in your URL bar and it makes a request to your-bank.example then fun-games.example is the top-level site.

Meaning explicit credentials won’t be sent. Even if fun-games.example knew how to send explicit credentials, it couldn’t, because it has no access to the credentials stored for your-bank.example. Suppose your-bank.example’s credentials are stored in local storage: since the current URL is fun-games.example, scripts can only access fun-games.example’s local storage, not your-bank.example’s.
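The local-storage point comes down to origin isolation: storage is keyed by the page’s origin (scheme + host + port), so code running on one origin can never read another origin’s entries. A minimal sketch of that rule, with illustrative helper names rather than a real browser API:

```javascript
// Sketch of origin-keyed storage: localStorage (and similar stores)
// are partitioned by origin = scheme + host + port. The helper names
// below are illustrative, not browser APIs.
function storageOrigin(url) {
  const u = new URL(url);
  return `${u.protocol}//${u.host}`; // the key the store is scoped to
}

function canRead(pageUrl, storeUrl) {
  // a page may only read storage belonging to its own origin
  return storageOrigin(pageUrl) === storageOrigin(storeUrl);
}

canRead("https://fun-games.example/play", "https://fun-games.example/");
canRead("https://fun-games.example/play", "https://your-bank.example/");
```

So even though fun-games.example is the top-level site, nothing it runs can reach into your-bank.example’s credential store.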

