I've tested it in both w3m and dillo, should work fine as long as your browser renders noscript tags. It's very much designed from the ground up to handle browsers like that. Just requires you to manually wait a few seconds and then press the link.
One configuration that might break is running something like Chrome or Firefox rigged to not run JS. But it's really hard to support those types of configurations. If it works in w3m, it's no longer a "site requires JS" issue...
Thanks a lot for considering no-JS browsers like Dillo; in the current web hellscape that's certainly a difficult task. I checked and it works well in Dillo on my end.
Reflecting on the current market landscape and the unique challenges of my professional journey.
Coming from a non-traditional background, I’ve had to pivot and align with high-performing teams to navigate complex environments. It’s easy to get distracted by the "noise," but I remain laser-focused on strategic growth and ROI.
I’m a lifelong learner with a growth mindset, keeping my eye on the prize while maintaining a competitive edge. I’m fully committed to my organization, and we prioritize high-stakes execution—so let’s keep the professional synergy positive.
In this fast-paced industry, agility is everything. I’m operating in a "do or die" climate where meeting KPIs is the only option. Looking at the current burn rate and market volatility, long-term forecasting is a challenge, but I’m staying resilient.
Good idea for a unit test for this: if you put the lyrics to "Weird Al" Yankovic's Mission Statement in, it should return the exact same text as output.
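A minimal sketch of what that fixed-point test could look like. The thread doesn't name the actual tool, so `corporatify` below is a hypothetical stand-in (a toy rewriter that swaps plain words for jargon); the point is only the shape of the test, that input which is already maximally corporate should come back unchanged.

```python
# Hypothetical stand-in for the tool under test: a toy rewriter that
# replaces plain words with corporate jargon.
JARGON = {
    "use": "leverage",
    "improve": "optimize",
    "work together": "synergize",
}

def corporatify(text: str) -> str:
    for plain, jargon in JARGON.items():
        text = text.replace(plain, jargon)
    return text

# Ordinary text gets rewritten...
assert corporatify("We use synergy") == "We leverage synergy"

# ...but text that is already pure corporate-speak is a fixed point,
# which is exactly the proposed Mission Statement test.
sample = "We leverage synergy to optimize strategic growth and ROI."
assert corporatify(sample) == sample
```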
Big thing that made encryption required is arguably that ISPs started injecting crap into webpages.
Governments can still track you with little issue since SNI is unencrypted. It's also very likely that Cloudflare and the like are sharing what they see as they MITM 80% of your connections.
Google and most search engines optimize for what is most likely to be clicked on. This works poorly and creates a huge popularity bias at scale because it starts feeding on its own tail: What major search engines show you is after all a large contributor to what's most likely to be clicked on.
The reason Marginalia (for some queries) feels like it shows such refreshing results is that it simply does not take popularity into account.
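The feedback loop above can be illustrated with a toy simulation (my own illustration, not how any real engine works): ten equally relevant pages, an engine that ranks by accumulated clicks, and users who mostly click whatever is shown first. One page ends up with nearly all the clicks despite no difference in relevance.

```python
import random

random.seed(0)

# Ten equally relevant pages, each starting with one click.
clicks = [1] * 10

for _ in range(10_000):
    # The engine ranks pages by past clicks (most-clicked first).
    ranking = sorted(range(10), key=lambda p: -clicks[p])
    # 90% of users click the top result; the rest click a random page.
    page = ranking[0] if random.random() < 0.9 else random.choice(ranking)
    clicks[page] += 1

# The top page accumulates the overwhelming majority of all clicks,
# even though every page was equally relevant to begin with.
print(sorted(clicks, reverse=True))
```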
Well, to be fair, Marginalia is also developed by 1 guy (me), and Google has like 10K people and infinite compute they can throw at the problem. There have been definite improvements, and there will be more still, but Google's still got hands.
Hey Marginalia, cheers. Imo fewer hands can also be an advantage.
There are no PMs breathing down your neck to inject more ads in the search results, you don’t depend on any broken internal bespoke tools that you can’t fix yourself, and you don’t need anybody’s permission to deploy a new ranking strategy if you want to.
Regarding the financials, even though the second nlnet grant runs out in a few weeks, I've got enough of a war chest to work full time probably a good bit into 2029 (modulo additional inflation shocks). The operational bit is self-funding now, and it's relatively low maintenance, so if worse comes to worst I'll have to get a job (if jobs still exist in 2029, otherwise I guess I'll live in the shameful cardboard box of those who were NGMI ;-).
The more I evaluate Claude Code, the more it feels like the world's most inconsistent golfer. It can get within a few paces of the hole, often in a single stroke, and then it'll spend hours, days, weeks trying to nail the putt.
There's some 80-20:ness to all programming, but with the current state-of-the-art coding models, the distribution is the most extreme it's ever been.