Problem
When I limit what a script can do, I can only specify that limit in terms of the top-level domain a script is allowed to run on:

Let's take two of the worst contenders (owned by the same entity):

Why are they needed? The website owner uses ReCaptcha on the login page. However, the scripts are present on every page of the site, where they aren't needed, because ReCaptcha is only used for logging in.
However, I can't restrict a script so that it is only allowed on a specific page.
My workaround
So I keep turning these scripts on and off to retain barely enough of my navigation privacy. I do this for many websites. I wish I could configure it once and not have to worry about those scripts running on every page I visit.
More explanation
This is basically a feature on top of what Contextual Policies allows.
In my specific situation, I'm trying to get more privacy from Google, Microsoft and other providers whose tools let them track where a user is navigating.
Suggestion
Allow finer control over which URLs scripts like these are allowed to be loaded on.
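To illustrate, a page-scoped rule could pair a script origin with a URL pattern for the pages where the script may load. This is only a sketch of the idea; the names (ScriptRule, isScriptAllowed) and the glob-style matching are hypothetical, not the extension's actual API:

```typescript
// Hypothetical page-scoped script rule: a script origin plus a glob
// pattern over page URLs where that script is permitted to load.
interface ScriptRule {
  scriptOrigin: string; // domain the script is served from
  pagePattern: string;  // glob ("*" wildcard) over the page URL
}

// Convert a simple "*" glob into a RegExp, escaping everything else.
function globToRegExp(glob: string): RegExp {
  const escaped = glob.replace(/[.+?^${}()|[\]\\]/g, "\\$&");
  return new RegExp("^" + escaped.replace(/\*/g, ".*") + "$");
}

// A script is allowed only if some rule matches both its origin
// and the URL of the page currently being visited.
function isScriptAllowed(
  rules: ScriptRule[],
  scriptOrigin: string,
  pageUrl: string
): boolean {
  return rules.some(
    (r) =>
      r.scriptOrigin === scriptOrigin &&
      globToRegExp(r.pagePattern).test(pageUrl)
  );
}

// Example: ReCaptcha allowed on the login page only.
const rules: ScriptRule[] = [
  { scriptOrigin: "google.com", pagePattern: "https://example.com/login*" },
];

isScriptAllowed(rules, "google.com", "https://example.com/login");      // allowed
isScriptAllowed(rules, "google.com", "https://example.com/articles/1"); // blocked
```

With a rule like this, the script keeps working where the site genuinely needs it while being blocked everywhere else on the same domain.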
Potential issues
Nowadays many websites don't actually navigate between pages. Even though the user has arrived at a certain URL, the URL can change while the page technically stays the same: the framework of choice just swaps HTML out and in.
That can make it impossible to add and remove the external scripts on such a page change. However, I consider that an acceptable limitation of the tool.
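The reason this is hard to hook is that single-page frameworks change the URL via history.pushState, which fires no event and loads no page. A sketch of the usual workaround, wrapping pushState, is below; HistoryLike is a tiny stand-in for the browser History object so the example stays self-contained, and this is not how any particular extension implements it:

```typescript
// Minimal stand-in for the browser History object.
interface HistoryLike {
  pushState(state: unknown, title: string, url: string): void;
}

// Wrap pushState so every SPA "navigation" reports its new URL.
function hookNavigation(
  history: HistoryLike,
  onNavigate: (url: string) => void
): void {
  const original = history.pushState.bind(history);
  history.pushState = (state, title, url) => {
    original(state, title, url);
    onNavigate(url);
  };
}

// Simulated usage: the "framework" calls pushState and the hook sees it.
const seen: string[] = [];
const fakeHistory: HistoryLike = { pushState: () => {} };
hookNavigation(fakeHistory, (url) => seen.push(url));
fakeHistory.pushState(null, "", "/articles/42");
// seen now contains "/articles/42" -- but scripts already loaded on the
// page cannot be unloaded at this point, which is the limitation above.
```

Even with such a hook, the extension can only notice the URL change; scripts that loaded before the change stay resident, which is exactly the acceptable limitation described above.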
From the user's perspective, this could be misunderstood as a bug or similar. However, if this feature is marked as advanced and carries a warning about its limitation, I believe that misunderstanding is mitigated.
So
I think it's worthwhile to build such a feature to provide finer control over these last few cases that behave almost like trojan horses for data, such as ReCaptcha.