I'm finding the business model aspect of Deno KV absolutely fascinating.
const kv = await Deno.openKv();
That's a Deno core API. It works fine in the open source version of Deno using a local SQLite database file.
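Basic usage against that local store looks something like this (the keys and values here are just illustrative):

// Using the kv handle from above - keys are arrays of parts,
// values can be any structured-clonable object
await kv.set(["users", "alice"], { name: "Alice" });
const entry = await kv.get(["users", "alice"]);
console.log(entry.value); // { name: "Alice" }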
But as soon as you deploy your application to their proprietary hosted service, that core API feature gets massively more powerful. It's no longer a SQLite database, it's now a globally distributed key/value store backed by FoundationDB, replicated around the world.
It looks like they've extended that idea further with the latest version - you can now do something like this:
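// Connect to a remote KV database over the KV Connect protocol
// (the database ID here is a placeholder)
const kv = await Deno.openKv("https://api.deno.com/databases/<database-id>/connect");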
And your local code is now able to manipulate that remote FoundationDB database as well.
I'm having trouble thinking of a precedent for this - an open source project that has a core API which is effectively a lead generator for their proprietary cloud service.
I'm not entirely sure how I feel about it. I think I like it: open source projects need a business model, and the openKv() method is still a supported, useful part of the open source offering.
Kind of fascinating pattern though.
UPDATE: I just found this page of docs https://github.com/denoland/deno/blob/be1fc754a14683bf640b7b... - which describes the "KV Connect" protocol they are using. It looks like this evens the playing field, in that anyone could implement their own alternative backend to Deno Deploy if they wanted to.
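If I'm reading it right, you could point openKv() at any server that speaks the protocol - something like this, where kv.example.com is a made-up endpoint and the access token is picked up from the DENO_KV_ACCESS_TOKEN environment variable:

// Hypothetical self-hosted backend that implements KV Connect
// Run with DENO_KV_ACCESS_TOKEN set to the token your backend expects
const kv = await Deno.openKv("https://kv.example.com/connect");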
This firmly establishes me on the "I think this is cool" side of the fence.
It has been happening for a while with a bunch of startups like Supabase claiming to be "open source" and marketing themselves as such, while making it really hard to self-host for a long time.
It wasn't just them either.
I watched with disgust as a bunch of startups used "open source" as a marketing tactic, no matter how hard their software was to set up or run without their hosted service.
It is also a perverse incentive: the harder the open source system is to run and maintain, the more you will gravitate toward their cloud. Supposedly open source companies raising a ton of money from VCs also feels strangely contrary to the open source ethos.
I am not going to make any sweeping generalizations across all products. But at least in the case of Deno KV, there doesn't seem to be lock-in. So if you run something self-hosted for KV persistence, it will continue to work unmodified.
> I watched with disgust as a bunch of startups used "open source" as a marketing tactic.
Again, not sure which bunch of startups. But I am not seeing that with this product. Seems more like a survival strategy to add some cashflow behind the developers.
I am curious what you think Open Source should be (or should not be). I think it's fair that running a service in the cloud should cost something. And I think it's fair that self-hosting requires a bit more effort than using the hosted service.
Isn't that the reverse? It's against the sentiment that being a part of a system means that you implicitly agree with it, and that any complaints you make about it are void.
Seems like you can similarly respond to any complaint with that meme too, which is also invalid. In reality, there is a lot more nuance than a pithy meme can capture.
It looks to me like they've documented the KV Connect protocol they invented to support this feature, in enough detail that anyone else could build an alternative backend for it.
This has helped me feel completely OK with how they're handling this. They get an advantage in that they've already built an extremely robust proprietary backend, but I find that acceptable given the documented protocol.
I think this is great, except it feels odd that it's just hanging around on the Deno global instead of being e.g. imported like any other database client.
import KV from "https://deno.land/kv" // for example
If their protocol is indeed open and usable with your own backend, then that library should be able to work for anyone. And if they need some fancy native performance then maybe they could intercept that import when running code on Deno Deploy?
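Something like this, say (the module URL and the open() function are entirely made up):

import KV from "https://deno.land/kv"; // hypothetical client module, as above
// On Deno Deploy, this import could be swapped for the native implementation
const kv = await KV.open("https://my-backend.example.com/connect");
await kv.set(["greeting"], "hello");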
Treating their hosted service as "part of the runtime", which is what the Deno global tends to be for, is the only remaining ick factor for me.
Yeah, this is the only thing I don't like. Making it a module import would not change anything (except one line of code), and it would feel more decoupled. But in practice I don't see a huge difference; it's not like it's polluting the global namespace with dozens of functions.
I'm really confused by this statement. What exactly is being degraded in their service? Or in their API? Or in the underlying tech they are compatible with?
All I see here is an open source system that you can manage and deploy yourself, with a 100% compatible API for a cloud service that handles that for you, should you decide to pay money to have that problem solved for you.
Good Open Source is something like PostgreSQL: It's completely free, you can host it yourself, or you can pay someone to host it, there are multiple competing service providers, most of them contribute to the project, and everyone has access to almost all of the source code. Anyone can start offering a compatible service with minimal investment. If you run into a problem, the source code for everything is public, and you can often fix it yourself.
And then there is something like Deno KV: There's an Open Source version that is designed mostly for prototyping or small scale deployments, and a closed source hosted version designed for production. If you want to use it you have to pay one company and there are no competitors. You are locked in. Theoretically, a competitor could create a compatible product, but the required effort is huge, creating a big barrier to entry. And even if competitors do show up, any new features introduced by the proprietary service will take a long time to trickle down to competing services. If you run into a problem, you have to hope the vendor fixes it.
I am definitely ready to call out shams, sellouts, and "enshittification." But I don't see how this one option is any of that.
I'm building a back end with Deno and MariaDB, and pretty psyched about it so far. I haven't built a back end since I did one with PHP 5 in 2011 or so. Thus far I've found it easier to get something working than I did with the tools years ago. So much so that I'd consider throwing these guys a bone by paying for a service.
But... I don't know that a key/value store suffices for what I want to do, which will involve relational queries.
Cloudflare KV is basically the exact same. Even the same name.
Cloudflare's pricing seems more reasonable, though a 1:1 comparison is impossible. Cloudflare reads and writes in every region for the same price, while Deno scales its price with the number of replica regions. CF is also billed per read/write operation (and supports listing), whereas Deno KV is billed per kB read/written.
That all said, if your values are bigger than ~2kB, Cloudflare is almost certainly cheaper. And the list (with metadata) operation is quite powerful.
However, single-region means you don't have to care nearly as much about synchronization, which can be quite annoying to deal with. The end user experience will suffer in areas far from your write zone, though.
A better comparison might actually be Cloudflare D1, a hosted SQL database with per-kB fees, but that's still in beta.
> I'm having trouble thinking of a precedent for this - an open source project that has a core API which is effectively a lead generator for their proprietary cloud service.
Is this not Vercel’s entire business model? I don’t think they invented it either.
I like this too... it's a practical API, but there's no way you could actually provide this production API for free. So instead of an open source project pulling back stuff you'd expect to be free (because it has zero marginal cost), they are adding stuff that makes sense to pay for.
I'm wondering if this might become interesting for a Sandstorm-like use case, where you can write a personal web app, publish it on GitHub, and other people can deploy it easily to their own Deno Deploy account. (Or they can self-host if they prefer.)
By the time you've implemented even a basic key-value store with on-disk storage you've probably written a bunch of code that would be unnecessary if you had used SQLite.
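To make that concrete, here's a minimal sketch of a KV table on top of SQLite using the third-party x/sqlite module (the table name and schema are just illustrative):

import { DB } from "https://deno.land/x/sqlite/mod.ts";

const db = new DB("kv.db");
db.query("CREATE TABLE IF NOT EXISTS kv (key TEXT PRIMARY KEY, value TEXT)");

// Upsert: insert the key, or overwrite the value if the key already exists
db.query(
  "INSERT INTO kv (key, value) VALUES (?, ?) ON CONFLICT(key) DO UPDATE SET value = excluded.value",
  ["greeting", JSON.stringify({ hello: "world" })],
);

// Read it back
const rows = db.query("SELECT value FROM kv WHERE key = ?", ["greeting"]);
console.log(rows); // e.g. [['{"hello":"world"}']]

db.close();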
I continue to watch Deno with excitement. I haven't had a good use case to play with it yet (all my free programming time has gone into my side business and I'm not ready to chance it on Deno yet) but I'll keep looking.
I find the way they handle secondary indexes very interesting. Under the hood I think DynamoDB does pretty much the same thing (stores the data multiple times), but instead of explicitly writing the data multiple times, you define fields on the data that the secondary indexes use, so the data is written there at the same time it's written to the primary (I could be a little mistaken; I'm working at a higher abstraction layer, so I don't think about that). I can't decide which approach I like more. I will say that I don't think I'd need anything but my own abstraction layer to work with Deno KV vs DynamoDB. That said, I still think DynamoDB is way more powerful overall.
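For reference, the pattern in Deno KV is to maintain the index yourself inside an atomic operation, roughly like this (the key names are my own):

const kv = await Deno.openKv();
const user = { id: "1", email: "alice@example.com", name: "Alice" };

// Write the primary record and the secondary index entry in one atomic
// commit, so the two copies can never get out of sync
await kv.atomic()
  .set(["users", user.id], user)
  .set(["users_by_email", user.email], user)
  .commit();

// Look up by email via the secondary index
const byEmail = await kv.get(["users_by_email", "alice@example.com"]);
console.log(byEmail.value);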
Deno gives me hope for web development. The security model, lack of a bundler, and general pace of progress are great. They've drastically improved Node interop, but if they could close the gaps (and sort out extensionless imports) so many more projects could finally jump ship over to Deno.
When you run `kv = await Deno.openKv()` locally it opens a SQLite database. On Deno Deploy it opens a connection to FoundationDB. How does that mechanism work? Is it using the same URL mechanism as the new Deno.openKv(URL) thing?
Yes, sort of - on Deno Deploy the authentication doesn't come from a token in env vars, but from intrinsic security tickets baked into the Deno Deploy system. Also, it's a bit faster on first connect, because compared to KV Connect we can skip the metadata exchange[1], since the information it provides is already present in Deno Deploy through other means. Both the backend service and the frontend API (JS->KV) are the same though :)
Is "Deno KV" a feature of Deno the runtime or Deno the hosting provider? The docs aren't clear about what it actually is, and that makes me a bit wary when deciding to use it.
Me. I'm excited about what they're doing with their security stuff (I love that it can only access files that were explicitly allowed) and it feels like there's a ton of good ideas in there generally. My notes so far: https://simonwillison.net/tags/deno/
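For anyone who hasn't tried it: file access has to be granted explicitly when you launch the process, so something like this (paths are illustrative):

// app.ts - run with: deno run --allow-read=./data app.ts
// Without --allow-read, Deno.readTextFile() fails with a permission error
const text = await Deno.readTextFile("./data/config.json");
console.log(text);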
Unfortunately, having a URL is only the beginning. After that it gets complicated, because the "source" you get by following a URL in Deno (or a web browser) has often been put through the blender, repackaged and minified by some CDN. It's hard to read and VS Code's debugger doesn't handle it well. Sometimes I'm left just looking at type definitions and it's unclear where the source code for the implementation is at all. Aren't source maps supposed to help here?
So, I look at the repo instead, but now I don't know how it matches up with the code I'm actually running.
I find Go modules to be easier to understand. They are similarly distributed (many source repos will work), but when you navigate to the source code, you get the actual source.