
uBlock Origin for Safari has been abandoned for about a year. :\


Safari's (WebKit's) content blockers are even better, imo: declarative rules that get compiled. You can't intercept arbitrary requests with JavaScript logic, but I don't think that's necessary, though it does make something like uMatrix harder to implement.
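(For anyone unfamiliar: a content blocker rule list is plain JSON that WebKit compiles down to an efficient matcher. A minimal sketch, with made-up domains and selectors:)

```json
[
  {
    "trigger": { "url-filter": "ads\\.example\\.com", "load-type": ["third-party"] },
    "action": { "type": "block" }
  },
  {
    "trigger": { "url-filter": ".*", "if-domain": ["example.org"] },
    "action": { "type": "css-display-none", "selector": ".banner-ad" }
  }
]
```

The first rule blocks third-party loads matching the filter; the second hides matching elements via a CSS selector on the listed domain.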


There's still a need for something like the element zapper which can remove obstructing popups that aren't implemented in Javascript.


You can implement this with the content blocker API, though: add a rule to the rule list and recompile it. It does come at the expense of UX and flexibility, for sure.
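Concretely, a zapper-style fix under this API would mean appending a hiding rule like the following (hypothetical site and selector) to the list and asking Safari to recompile:

```json
{
  "trigger": { "url-filter": ".*", "if-domain": ["example.com"] },
  "action": { "type": "css-display-none", "selector": "div.subscribe-overlay" }
}
```

The recompile step goes through SFContentBlockerManager's reloadContentBlocker(withIdentifier:completionHandler:), which is why it can't feel as instant as uBlock's zapper.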


Which has been a part of Content Blockers since day 1.


I wish there was something like uMatrix for Safari but using content blockers. Would really scratch an itch for me.


Safari’s content blockers are next to useless.

Safari is really easy to beat with anti-ad-blocking technology. The only reason more publishers won’t do it is that they don’t want to piss people off, but as the minority of people using ad blockers grows, they’ll get over that fear.

Also, FYI: injecting scripts into webpages is the only way to prevent anti-ad-blocking from working.


Safari's content blocking API has been abandoned since 2015, when it was introduced. It simply does not allow the advanced blocking techniques that Chrome/FF blockers have developed over these years.

It might be enough to handle most of the ads at the moment, but it fails miserably when a website uses any adblock circumvention.


How is it abandoned? Because the API doesn't change more often than I change my underwear? It works well, is efficient, and ensures user privacy and safety (from the sort of issues described in the article), and it gained new capabilities (forcing an HTTPS URL) after launch.
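(The HTTPS capability referred to here is the "make-https" action type; a sketch of such a rule, with a made-up host:)

```json
{
  "trigger": { "url-filter": "^http://.*\\.example\\.net/" },
  "action": { "type": "make-https" }
}
```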

As for your specific claim, 1Blocker has an entire section of blocking rules to counter anti-adblock.

Sure, if some site wants to be completely hostile to users, its ads may still show up. The little cross icon on the tab solves that issue pretty quickly.


So you are telling me that not changing the API in years despite numerous requests is a good thing? Well, I disagree.

Regarding HTTPS, that change was made in 2015.


There's a wide spectrum between "abandoned" and "totally up to date." Not everything is black and white.


Well it all depends what the requests are, doesn’t it.

I’m sure some have requested the ability to run JavaScript; that doesn’t mean it’s a good idea to add it to the API.


> Well it all depends what the requests are, doesn’t it.

Oh, there are all sorts of them.

From simple feature requests (like being able to distinguish fetch/xmlhttprequest) and bug reports (like that it considers requests to subdomains third-party) to something more complicated.

The problem is that regardless of what's requested/reported, there is no reaction.

> I’m sure some have requested the ability to run javascript, that doesn’t mean it’s a good idea to add that to the api.

Desktop Safari already allows extensions to run JavaScript, so why would anyone want to make it part of the content blocking API?


> like that it considers requests to subdomains third-party

As it should. Sometimes subdomains are third party. If I whitelist example.org, I do not want any subdomains whitelisted without my explicitly whitelisting *.example.org as a wildcard or specific subdomains.

Case in point: I regularly host what I'd consider third-party resources on subdomains for clients, e.g. billing.example.com, where the billing subdomain isn't owned or operated by example.com and could be running who knows what in terms of JavaScript and ads that I may or may not trust without whitelisting billing.example.com. The most common of these are marketing/tracking forms that clients want to run through some third-party marketing agency rather than our in-house tracking systems.
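For what it's worth, WebKit's rule format makes exactly this distinction opt-in: a bare domain in "if-domain" matches only that host, and a leading asterisk is required to pull in subdomains. A sketch with made-up domains:

```json
[
  {
    "trigger": { "url-filter": ".*", "if-domain": ["example.org"] },
    "action": { "type": "ignore-previous-rules" }
  },
  {
    "trigger": { "url-filter": ".*", "if-domain": ["*example.org"] },
    "action": { "type": "ignore-previous-rules" }
  }
]
```

The first rule whitelists example.org alone; only the second, with "*example.org", also covers billing.example.org and any other subdomain.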


Neither xhr nor subdomains sound like valid features/bugs. Just because they don’t agree with changes you want doesn’t mean it’s “abandoned”.

As for why someone would request JS - you’re the one who suggested chrome/ff blockers are “more advanced”. And yet here we are on a thread about arbitrary code execution.


I don't think our discussion can come to any compromise. You seem to believe that not doing anything is a good thing, and the API is already ideal. I have a different opinion. Well, it's okay to have different opinions.


> You seem to believe that not doing anything is a good thing, and the API is already ideal

Not really. I've submitted an enhancement request myself (the ability to auto-redirect to the canonical URL, to avoid the bullshit of AMP pages), but the changes you've suggested just don't seem worthwhile or positive IMO.

My argument wasn't "zero change is good"; my argument is that it doesn't need much change, and a stable API is hardly "abandonment".


At the same time, I merely will not use a website that is so hostile nor do I regularly encounter websites like that. uBlock Origin breaks in this way as well, and the only website I can even recall with this behavior was a shady movie-streaming site I was linked to.

What do you mean by abandoned? I use it and it works for me daily.


Adblock circumvention is a technique where a website detects that its ads were blocked and reinjects them, obfuscated/encrypted. Kinda like what Facebook does.

Most of the changes and improvements in modern ad blockers exist to deal with this kind of ads. Not in Safari, though: WebKit devs did nothing to improve or extend the content blocking API. Not to mention that it still imposes a ridiculous rule limit: you can't have more than 50k rules in a list (while Easylist alone has more than 80k).


I'm not sure what misunderstanding you think I have with regard to adblock circumvention.

I can appreciate that uBlock Origin wants to be a universal condom that works on every website for everyone, but I also don't think it's so damning if the content blocker API isn't sufficient for that. Yet uBlock Origin and its rule lists still fail at this task as well.

A website can do any number of annoying things that uBlock Origin cannot block including things that aren't even related to ads. I simply refuse to use such websites. In fact, I want to know that a website does these things rather than remain in blissful ignorance. I don't want to support such a website even with my viewership.

I've built content blockers. The rule limit is circumvented by simply bundling multiple rule lists.

Also, 80% of Easylist is crap anyways. The problem with Easylist and lists like it is that they become append-only, because nobody wants to go back and verify that rules still apply. Not to mention all the site-specific rules for sites you'll never visit, added because they were someone's random hobby horse. You can absolutely encode Easylist across two content blocker extensions bundled in one app, but in doing so you'll also realize how easy it is to prune it to fit in one rule list; there are that many rules for garbage sites.
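The bundling workaround is mechanical; a sketch in Python of splitting an oversized converted list across per-extension JSON files (the 50,000 figure is WebKit's per-list cap mentioned above; file names and rule contents are made up):

```python
import json

MAX_RULES = 50_000  # WebKit's per-rule-list limit

def split_rule_list(rules, max_rules=MAX_RULES):
    """Chunk one oversized rule array so each chunk fits one content blocker extension."""
    return [rules[i:i + max_rules] for i in range(0, len(rules), max_rules)]

if __name__ == "__main__":
    # e.g. an 80k-rule Easylist conversion becomes two lists: 50k + 30k
    rules = [{"trigger": {"url-filter": f"badhost{i}\\.example"},
              "action": {"type": "block"}}
             for i in range(80_000)]
    for n, chunk in enumerate(split_rule_list(rules)):
        with open(f"blockerList{n}.json", "w") as f:
            json.dump(chunk, f)
```

Each output file then ships as its own content blocker extension inside the one app.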


> I can appreciate that uBlock Origin wants to be a universal condom that works on every website for everyone, but I also don't think it's so damning if the content blocker API is sufficient for that. Yet uBlock Origin and its rule lists still fail at this task as well.

I'd say it is sufficient for 90% of the websites at the moment.

My point is that this is not a constant value. Advertisers adapt and evolve, and content blockers need to do the same.

> A website can do any number of annoying things that uBlock Origin cannot block including things that aren't even related to ads. I simply refuse to use such websites. In fact, I want to know that a website does these things rather than remain in blissful ignorance. I don't want to support such a website even with my viewership.

This is indeed the best way to let site owners know that this approach is unacceptable. I wish everyone were doing the same.

> The rule limit is circumvented by simply bundling multiple rule lists.

Sure, but this is a workaround, quite an ugly one in fact.

> Also, 80% of Easylist is crap anyways

It does contain a lot of redundant rules, maybe 30% of it, but not 80%, that's for sure. And what if I want to use more filter lists? The only thing I can do is the multiple-lists workaround. It is simply weird that nothing has been done about this limitation in more than 3 years.


I think the limitation is just the simplest trade-off.

For example, an unbounded rule list has unbounded serial compilation time.

A 50k rule list recompiles in a couple of seconds on my laptop and takes quite a bit longer on my iPhone, but multiple rule lists can be compiled in parallel.

I suppose your response to this would be that content blocker recompilation could be made parallel over a single list or made to support incremental recompilation.

And you're probably right. But note that even Chrome's proposed content blocker system has the same limitation. Perhaps it's not as straightforward as it seems and it's not just a result of neglect/abandonment.

I mainly responded because the language you used seemed more damning than necessary. But have you tried 1Blocker X [1], which is built on the content blocker API? It's hard to condemn the content blocking system too much when there's an app that uses it to such great effect.

You are right about Apple/Webkit moving slowly on community feedback and bug reports though. I suppose I'm used to the glacial pace of non-critical-path browser issues after my Firefox bug went three years without response until it was fixed after Quantum's release. You are probably more critical here than I am. At Amazon, anything sev3 to sev5 went into a heap so large it was a miracle if anyone opened it again.

[1]: https://itunes.apple.com/us/app/1blocker-x-adblock/id1365531...


That’s an easy one to get around. 1Blocker X has 10 sets of rules.


It is open source. You can probably make it work with a single git merge (judging from experience with uMatrix for Edge).



