Hacker News | evbogue's comments

I disagree that Bluesky is in conflict with The Ecosystem Is Moving. In contrast to most decentralized/distributed protocol projects, they've managed to maintain control of almost all of their infrastructure, with the exception of the personal data servers (PDSes), of which they control 99.01%.[1]

Almost all ATProto apps just fetch posts by handle => did:plc => post type (aka "lexicon"), so they depend on what Bluesky decides to give them. If someone were to introduce unknowns into the flagship product's lexicon, they could fix that at the API or indexing level before shipping the data to the apps that depend on their API.
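That lookup chain can be sketched as plain URL construction (a sketch only; the hostnames public.api.bsky.app and plc.directory reflect the current public deployment, and are assumptions about what a given app would actually hit):

```javascript
// Hedged sketch of the handle => did:plc => record chain.
// Step 1: ask an ATProto server to resolve a handle to a DID.
function resolveHandleUrl(handle) {
  return `https://public.api.bsky.app/xrpc/com.atproto.identity.resolveHandle?handle=${encodeURIComponent(handle)}`;
}

// Step 2: fetch the DID document from the (centralized) PLC directory.
function didDocUrl(did) {
  return `https://plc.directory/${did}`;
}

// Step 3: fetch the record, typed by its lexicon, from the user's PDS.
function getRecordUrl(pds, did, collection, rkey) {
  return `${pds}/xrpc/com.atproto.repo.getRecord?repo=${did}&collection=${collection}&rkey=${rkey}`;
}
```

Every step after the first flows through infrastructure Bluesky operates, which is the point being made above.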

An actually decentralized network would have to overcome Moxie's criticism of the ecosystem. Can it be done? We'll keep trying.

[1] https://arewedecentralizedyet.online/


We can have both a directory and use content addressable storage and give people the option of using their own keypairs. They are not mutually exclusive. Bluesky chooses to have a central directory and index.

what would you do about the at-uri?

at://hash, at://pub... I guess I don't know why we need at://; that seems like something we'd need in Beaker Browser.

After reading the docs at https://atproto.com/specs/at-uri-scheme and building applications that integrate with the Bluesky API, it’s clear why this format is useful for interacting with their system. For developers working outside the ATMosphere, however, it may feel less familiar compared to more conventional REST API patterns.
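For reference, the basic shape the spec describes is at://<authority>/<collection>/<rkey>, where the authority is a DID or handle. A minimal parser sketch (ignoring the spec's finer grammar rules, so not a complete implementation) might look like:

```javascript
// Toy parser for the at:// URI scheme's basic shape:
// at://<authority>[/<collection>[/<rkey>]]
function parseAtUri(uri) {
  const m = uri.match(/^at:\/\/([^/]+)(?:\/([^/]+))?(?:\/([^/]+))?$/);
  if (!m) throw new Error('not a valid at:// URI');
  const [, authority, collection, rkey] = m;
  return { authority, collection, rkey };
}

// The collection segment is an NSID naming the record's lexicon,
// e.g. app.bsky.feed.post for an ordinary Bluesky post.
```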


Well, when you figure out why at:// exists and why hand-waving away its role with an "I guess" makes it impossible... we can chat then.

Until then, you're just slopping about like an AI.


Dude, you represent yourself as Tony the AI worm.

I'll do an:// URLs if you bring yourself and your code agent to myproto for an afternoon.


Yes. SSB and ANProto do this. We actually can simply link to a hash of a pubkey+signature which opens to a timestamped hashlink to a record. Everything is a hash lookup this way and thus all nodes can store data.

git-ssb was (and now is again, really) one of those areas where SSB was vastly superior to ATProto, since all peers hosted the repos.


What strategy will Nostr use to achieve true P2P social?


From what I see, WebRTC is the key to achieve direct P2P connections.

I'm involved in a Nostr project where, beyond the internet, connections can be made over Bluetooth, LoRa, LAN (including Wi-Fi), and radio using walkie-talkies.


This is great to hear. The part about Bluetooth/radio/LoRa sounds vaguely like Reticulum. I’ve always thought that the two projects could find alignment somewhere. Nostr with Reticulum style features, or even just Nostr over Reticulum, would be unstoppable.


Sorry to say, my opinion is that such alignment will NOT be happening in the foreseeable future unless AI can be used to program whatever is missing in Reticulum.

Reticulum is poorly implemented on the library side and assumes everything is encrypted, which is heavy on network links with poor bandwidth. Plus, on radio waves encryption is not even permitted.

Nostr can be simplified to be less verbose, function on lower-bandwidth links like Bluetooth and LoRa, and go unencrypted (just signed messages) to legally use radio waves. Reticulum has some kind of drama against using Bluetooth for data transmission. Unfortunately that doesn't seem likely to change anytime soon.


Tyr is probably overkill with DeltaChat on top of Yggdrasil. The network is already encrypted, so it's fine to send plaintext emails as long as there are no third-party email hubs.


I think DeltaChat in this case is used mainly for spam disincentivization via mandatory encryption.


Back in the day a few of us used to run SSB (secure-scuttlebutt) over Yggdrasil (and cjdns before that), and that system would distribute private messages to all of the peers within 3 hops. Offline peers would just sync up when online and then decrypt the messages sent to them.

SSB's been broken for around five years, but now that it's working again it'd be fun to try this experiment again.

2026 could be the year mesh networks finally take off!


Curious why you believe it was broken, and is now fixed. What new development are you referring to? I agree that Patchwork kinda took a dive, and functionality started to bitrot with each new maintainer...but it still replicates feeds.


I couldn't get any of their latest versions working. The ssb-server was still functioning, but had no working client that I could find. https://github.com/evbogue/ssbc is a working fork with a patchbay lite client from circa 2015/16 live at https://ssb.evbogue.com/ (with git-ssb!). I'm also recreating pfrazee's original Phoenix client from scratch.

Let's talk more on a more appropriate channel. Are you on bsky? We're having a small discussion there about "bringing open source projects back from the dead with AI" right now.


as a side note, nano do you still have a working pub with all of the historical data on it?


article is from 2016


How does this handle changes to the website? Is the entire website re-uploaded to a Nostr relay, or is there some reuse of old material that's already been uploaded before?


Nostr is append-only; if an event is published to someone else's relay, you can't delete it.


Oh yes, I get that about Nostr. My question relates to updating the website.

If I have 5 pages, which I publish to Nostr using this tool, and then I make a small change to one of the pages, do I then need to create and publish the entire project again?


It relies on the hash of the contents. For example, the first time you upload your website it gets fully uploaded, but if you've changed some content, say 1 page out of 5, it compares the hashes of the assets and only uploads the changed ones. Same for retrieval: unchanged cached content doesn't get re-fetched from the relays.


Why should a web page only have a single person generating and injecting HTML into it?


The analogy doesn't hold markup ;)

Whether I generate a whole page or generate a partial page and then add HTML to it is equivalent from a safety perspective.


A single company. Why would I let another company inject HTML into my page?


There's this newfangled concept called social media where you let other people post content that exists on your web site. You're rarely allowed to post HTML because of the associated issues with sanitizing it. setHTML could help with that.


I just had a flashback to the heyday of MySpace. Now that I think about it though, Neocities has the "social networking" of being able to discover other people's pages and give each other likes and comments.

Hmmm...


Or CMS content, or really anything that comes from the user outside of social media content and could cause a reflected XSS.

For example, a search query, or a redirect URL, or a million other things.
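For a reflected search query, the traditional fix is escaping before interpolation (a toy sketch; setHTML and the Sanitizer API are the newer, more complete option when you actually need to accept rich markup):

```javascript
// Toy HTML escaper: neutralizes the characters that let reflected
// user input break out into markup.
function escapeHtml(s) {
  return s.replace(/[&<>"']/g, (c) => ({
    '&': '&amp;', '<': '&lt;', '>': '&gt;', '"': '&quot;', "'": '&#39;',
  }[c]));
}

// Echoing a search query back into the page, safely:
const query = '<script>alert(1)</script>';
const safe = `You searched for: ${escapeHtml(query)}`;
```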

