Hacker News | arthuredelstein's comments

That is very sad news. I knew Ralph at Brave. He had such a calm, kind presence.


Project Timestamper is an open-source effort to digitally timestamp real human works of art, movies, literature, and science as they exist today, before AI-created content begins to pollute humanity's cultural heritage with false memories.


hakk is a new REPL-based tool for developing Node.js programs. hakk runs your .js source files and updates the running program whenever you save a change. Unlike the built-in Node.js REPL, hakk lets you re-define consts, functions, and class members while your code is running. It works with any code editor.
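For context on why this is hard in plain Node: a const binding can't be reassigned at runtime, so a vanilla process has nothing to hot-swap (presumably hakk rewrites the source to work around this). A minimal demonstration of the wall plain JS hits:

```javascript
// In plain Node, a const is frozen for the life of the process:
// reassignment is a runtime TypeError, so live-editing the value
// normally means a full restart.
const greeting = 'hello';

let caught = null;
try {
  greeting = 'bonjour'; // TypeError: Assignment to constant variable.
} catch (e) {
  caught = e.constructor.name;
}
console.log(caught); // 'TypeError'
```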

I would love to get feedback. Do you find this tool useful? What improvements would you like to see?


Is this inspired by the recent post "My Kind of REPL"[0]? :)

[0]: https://news.ycombinator.com/item?id=36600639


I hadn't seen that post, but it's great!

This project was largely inspired by my experience with Clojure, and my wanting to have a similar capability when writing JS code.


Total Cookie Protection partitions all cookie-like data ("state") between the websites you visit, not just cookies. For a technical discussion, see https://developer.mozilla.org/en-US/docs/Web/Privacy/State_P...
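The core idea can be sketched as a cookie jar keyed by both the top-level site and the site setting the cookie (a "double-keyed" store). This is an illustrative sketch, not Firefox's actual implementation:

```javascript
// Sketch of a double-keyed ("partitioned") cookie store.
const jar = new Map();

function setCookie(topLevelSite, cookieHost, name, value) {
  // Key by BOTH the site in the URL bar and the site setting the cookie.
  jar.set(`${topLevelSite}^${cookieHost}^${name}`, value);
}

function getCookie(topLevelSite, cookieHost, name) {
  return jar.get(`${topLevelSite}^${cookieHost}^${name}`);
}

// tracker.example embedded on news.example sets an ID...
setCookie('news.example', 'tracker.example', 'id', 'abc123');

// ...but the same tracker embedded on shop.example sees an empty
// partition, so it cannot link the two visits to the same user.
console.log(getCookie('news.example', 'tracker.example', 'id')); // 'abc123'
console.log(getCookie('shop.example', 'tracker.example', 'id')); // undefined
```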

You can see which browsers partition state (and which don't) in the State Partitioning section of https://privacytests.org. Firefox passes nearly all of those tests because Total Cookie Protection is enabled by default.


I have a problem with them including the "GPC" flag as a privacy feature. We know directly from "Do Not Track" that such a flag gets widely ignored, and was frequently even used to help perform tracking: because it was meant to be opt-in, it was rare, which made it an identifying signal. When it got turned on by default, a bunch of companies said "DNT is not a user choice now, so it doesn't mean anything".

I'm not sure what document.referrer blocking is meant to accomplish - if the intent is for the referrer to pass information to the loaded site, then the linking site can just include that information in the URL. If the intent is "don't let people know where you saw the link", then sadly there are plenty of sites that gate access on the referrer, and they get broken. The fact that almost no one filters the referrer kind of indicates where the cost/reward balance lies.

Calling "media queries" a privacy issue is absurd: their entire purpose is to allow sites to tailor content to the view format. Moreover, you can directly measure the same properties from JS; it's just less efficient and more annoying.

The "known tracker" blocking is more curious, as I'm unsure how that's expected to scale, or what happens if a browser ever misclassifies one (be it a resource or a query parameter). Query strings can certainly be trivially changed to make anything a major browser does break instantly, and just as trivially changed to not be statically identifiable.

I also wonder how those are tested, because browsers that do automatic "learned" identification of trackers take time to identify what they consider trackers and start blocking them. e.g. that site says Google Analytics is not blocked by Safari, yet I can look at Safari's tracker stats and see that it has identified and is blocking Google Analytics.


Hi! Author of PrivacyTests.org here. Thank you very much for the comment. (I only just saw your reply.) PrivacyTests is very much a work in progress, and all feedback is much appreciated.

GPC does differ from Do Not Track in that the former is intended to carry the weight of law. See for example: https://cheq.ai/blog/what-is-a-global-privacy-control/

Regarding document.referrer, you are absolutely right that there is a cost/reward balance and most browsers have chosen to allow cross-site passing of the referrer. However, there are browsers on Android that do block cross-site referrer altogether (see https://privacytests.org/android.html).

"Media queries" refers to the fingerprinting threat where, for example, screen width and height are divulged. You are right that JavaScript can also be easily used to get screen width and height: any fingerprinting resistance feature should protect against screen fingerprinting via both JS and media queries, in my view. Some browsers already do that, as the results show.
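For concreteness, here is roughly how media queries alone can recover screen width -- a binary search over (min-width) queries. In a real page the predicate would be w => window.matchMedia(`(min-width: ${w}px)`).matches; the fake predicate below just lets the logic run anywhere:

```javascript
// Binary search for the largest width at which a (min-width) media
// query still matches -- i.e. the screen width -- without ever
// touching screen.width in JS.
function probeWidth(matches, lo = 0, hi = 8192) {
  while (lo < hi) {
    const mid = Math.ceil((lo + hi) / 2);
    if (matches(mid)) lo = mid; // query matches: width is at least mid
    else hi = mid - 1;          // query fails: width is below mid
  }
  return lo;
}

// Simulate a 1440px-wide screen (in a browser, pass a real
// matchMedia-based predicate instead).
const fakeMatches = (w) => w <= 1440;
console.log(probeWidth(fakeMatches)); // 1440
```

This is why blocking JS access to screen dimensions alone is not enough: the same value leaks through CSS, so both channels need the same protection.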

Your question about scale is a good one. Some browsers (such as Firefox and Brave) embed fairly large blocklists. You are right that query parameters can be changed, but in practice I haven't seen any cases of that happening (yet).

As far as I am aware, Safari is (by default) blocking cookies/storage from Google Analytics and similar trackers, but not blocking the scripts themselves. You can see that cookie blocking reflected in the "Tracking cookie protection tests".


> Calling "media queries" a privacy issue is absurd

The header at media queries says “Fingerprinting resistance”. A single data point like screen width doesn’t immediately disclose your identity. But a few data points together go a long way toward fingerprinting.
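To see how the data points add up, here is a back-of-the-envelope surprisal calculation (the trait frequencies are made-up illustrative numbers, not measured values):

```javascript
// Each trait alone narrows you down only a little, but surprisal
// (bits of identifying information) adds up across independent traits.
const traits = {
  screenWidth: 1 / 20, // 1 in 20 users share your screen width
  timezone:    1 / 24,
  language:    1 / 10,
  fontList:    1 / 50,
};

// bits = sum of log2(1/p) over all traits
const bits = Object.values(traits)
  .reduce((sum, p) => sum + Math.log2(1 / p), 0);

console.log(bits.toFixed(1)); // ≈ 17.9 bits -> one user in ~240,000
```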


On mobile so I can't provide a source, but look up why Privacy Badger no longer uses a dynamic "learned" list of trackers (the way you describe Safari working) and instead uses a list shared by all users. Local learning basically reduces privacy: your list of blocked sites is likely to be unique among users (along with other data points), which unintuitively makes it easier to track you across the web.


The summary:

>By default, Privacy Badger receives periodic learning updates from Badger Sett, our Badger training project. This “remote learning” automatically discovers trackers present on thousands of the most popular sites on the Web. Privacy Badger no longer learns from your browsing by default, as “local learning” may make you more identifiable to websites. You may want to opt back in to local learning if you regularly browse less popular websites.

-- https://privacybadger.org/#How-does-Privacy-Badger-work

The full writeup:

https://www.eff.org/deeplinks/2020/10/privacy-badger-changin...


Huh, an interesting issue to have to deal with. I'm not sure it makes it easier to track you in practice, though I can see that in theory it absolutely could, so given the choice, a static block list would be better.

That said, my point stands that the bigger "reputable" trackers (e.g. the ones that exist on basically every website) are all blocked by the learned-tracker techniques, and so should be listed as blocked on that site.


I’m impressed because as far as I could tell Safari is equal or better than Firefox on all of these tests.


Let's Encrypt is incredible. Thank you!


Hi -- I'm the maintainer of privacytests.org. To be clear, most of these tests were built before I applied to work for Brave. And I am committed to keeping the tests impartial.

I do hope to include telemetry tests in the future.


Hi -- thanks for the comments!

> it would be very helpful if the table of test results were furnished with a glossary that:

There is some explanation for each test -- to see these explanations, you need to click on category titles, test titles, or test results themselves. But I take your point that these annotations need to be expanded and easier to find.

> I tried to send feedback to this effect but could not as I'm not a Twitter user (it seemingly being the only means of sending feedback to PrivacyTests.org).

Actually, in the upper-right corner of each page there is a link to an email address for feedback (contact@privacytests.org). And GitHub issues are also welcome. I gather these links could be made clearer. :)

> I was rather surprised to see how poorly Tor featured in the tracking department, it failing every test.

The tracking content section results actually stem from Tor Browser's approach: Tor Browser currently does not block any third-party content from loading in a page. Rather it prevents tracking by various policies, including always using the Tor network, providing full state partitioning, and providing strong fingerprinting resistance. Generally speaking, third-party trackers are prevented from tracking users by these measures. However, I think blocking of trackers could offer defense in depth in Tor Browser, in case any of the other measures fails.


Thank you for your reply on all points.

On the matter of testing browser exploits etc., it seems to me there's no decent comprehensive list of exploitable browser functions that's both easily accessible and comprehensible to normal users.

I reckon that a link on your homepage to a well-organized table that lists browser function names, their descriptions, and their various exploits (description, modus operandi, notes, links to more info, etc.) in an easily readable format would be very worthwhile, as it would also draw users to your site for that reason alone. Once there, they'd also find the browser tests.

(There's any amount of stuff on the web about browsers, exploits etc. but I've not seen one that's comprehensive in that it brings all three aspects into one place.)

About Tor, I would certainly agree with you. Bootstrapping with fallback security makes sense. If there's any argument, then make it optional at the click of a button, etc. In a way, your test results have acted as a review, and I reckon that's a good thing.

Frankly, I'm horribly disenchanted with most browsers and we need a site that reviews most of them in an objective and comprehensive way. Perhaps in the future you might consider doing this by reviewing browsers in the light of your tests.


Thank you for the feedback!

Granted, blocklists (lists of tracking domains or URL query parameters) can be circumvented by a determined attacker. Indeed, I agree that blocklists aren't sufficient on their own for a browser to provide solid privacy protection. In my view it's critical, primarily, to have policies that enforce privacy, including such protections as state partitioning and fingerprinting resistance. That's exactly why I included tests for such policies.

However: I do think blocklists provide substantial, though incomplete, privacy protection in practice. And, importantly, blocklists are enforced by a number of popular browsers (Brave, DuckDuckGo, Firefox Private Mode, Firefox Focus) and popular browser extensions and other services (uBlock, ClearURLs, DuckDuckGo Privacy Essentials, Disconnect, etc.). These blocklists seem to work pretty well, at least judging by the ad-free experience they provide. So I felt that to give a more complete picture I should test for blocking.

I tried to avoid cherry picking query parameters or blockers. Here's how I arrived at the current selections for these two sections:

* Tracking query parameter tests: I tried to gather all the query parameters I could find; the list on the page was my full list at the time. (If there are suggestions for more parameters, I will be happy to add them.)

* Tracker content blocking tests: I used the list of the top 20 tracking entities from https://whotracks.me. These are, roughly speaking, 20 of the most widespread third-party tracking domains on the web -- they should be a high priority for any browser respecting privacy, in my opinion. I hope testing for blocking of these 20 serves to give a sense of each browser's approach to third-party tracking scripts and pixels.
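The query-parameter stripping that the tracking query parameter tests check for can be sketched in a few lines. This is an illustrative sketch, not PrivacyTests' actual harness, and the parameter list is a small made-up subset of the real blocklists:

```javascript
// A tiny illustrative subset of known tracking query parameters.
const TRACKING_PARAMS = [
  'utm_source', 'utm_medium', 'utm_campaign',
  'fbclid', 'gclid', 'mc_eid',
];

// Remove any blocklisted parameters from a URL before navigation,
// leaving functional parameters (like id) intact.
function stripTrackingParams(href) {
  const url = new URL(href);
  for (const p of TRACKING_PARAMS) url.searchParams.delete(p);
  return url.toString();
}

console.log(stripTrackingParams(
  'https://example.com/page?id=7&utm_source=news&fbclid=XYZ'));
// https://example.com/page?id=7
```

The corresponding test is then just: load a URL carrying one of these parameters and check whether the parameter survives into the page.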


The tests are neat, thank you.

Can you update the steps for running the tests locally so one can test a browser with uBlock Origin installed against the default browser?


Agreed -- in the future I'm hoping to have a page showing results with browsers with various privacy-helpful extensions installed.


Thank you for the feedback -- I agree more context and explanation is needed for each of these tests.

In the Blob case: the test code is storing a unique string in a Blob URL under one website (first party), and then attempting to read back that string under a second, different website. (See "result, different first party".) If the string is accessible under a different first party, then it is possible to use a Blob URL to track a user between two different websites.
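To make the pass/fail logic concrete, here is a simulated version of that test (not the actual PrivacyTests harness), with a toy registry standing in for the browser's Blob URL store:

```javascript
// Simulated cross-site Blob URL test: illustrates only the pass/fail
// logic, not the real browser mechanics.

// Site A (first party #1) mints a unique value to store.
const secret = Math.random().toString(36).slice(2);

// A toy blob registry. A partitioned one keys entries by the
// top-level site, so the same blob URL is unreadable under
// a different first party.
function makeRegistry(partitioned) {
  const store = new Map();
  return {
    create(topSite, data) {
      store.set(partitioned ? `${topSite}|blob-1` : 'blob-1', data);
    },
    read(topSite) {
      return store.get(partitioned ? `${topSite}|blob-1` : 'blob-1');
    },
  };
}

function runTest(partitioned) {
  const reg = makeRegistry(partitioned);
  reg.create('site-a.example', secret);       // write under first party #1
  const leaked = reg.read('site-b.example');  // read under first party #2
  return leaked === secret ? 'fail' : 'pass'; // readable => trackable
}

console.log(runTest(false)); // an unpartitioned browser fails
console.log(runTest(true));  // a partitioned browser passes
```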


Oh ok. That does make sense. Hopefully you read my comment as feedback and not super negative.

Just some verbiage on each test would be wonderful.

You have clearly worked on it. It is a really good resource.


> Oh ok. That does make sense. Hopefully you read my comment as feedback and not super negative.

It was helpful feedback. I value all critiques because they help me make the site better.

> Just some verbiage on each test would be wonderful.

There is some explanation for each test, if you click on the test name. But it's clear I need to expand those explanations and also make them easier to find.

