I was genuinely surprised by just how easy it was to get a fully working RAG setup with Postgres. It took a few hours over a weekend to get something "working", and probably a bit less time the following weekend to design a nicer database structure and rebuild it, learning from the mistakes of the first attempt. The harder part comes next, because that involves multiple tables of user-provided data, multi-tenancy with a shared core vector schema, and all the actual business logic, so I've put it all on hold for a proper breakdown now. But I wouldn't expect it to be much of a problem, given what I've found so far with pgvector and Postgres in general.
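As a rough sketch of what the retrieval half of such a setup does (not the commenter's actual schema; all names and vectors below are made up): pgvector boils down to ordering rows by vector distance, e.g. `ORDER BY embedding <=> query` for cosine distance. The same idea in plain Python:

```python
import math

# Minimal sketch of the nearest-neighbor lookup at the heart of a RAG
# retrieval step. pgvector does this in SQL (ORDER BY embedding <=> :q);
# here the same ranking in plain Python with cosine distance.

def cosine_distance(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return 1.0 - dot / (na * nb)

# Hypothetical document embeddings (in Postgres these would be vector columns).
documents = {
    "doc-about-cats": [0.9, 0.1, 0.0],
    "doc-about-dogs": [0.8, 0.3, 0.1],
    "doc-about-tax":  [0.0, 0.1, 0.9],
}

def top_k(query_embedding, k=2):
    ranked = sorted(documents, key=lambda d: cosine_distance(documents[d], query_embedding))
    return ranked[:k]

print(top_k([1.0, 0.0, 0.0]))  # the two "animal" docs rank first
```

In production the sort is replaced by an HNSW or IVFFlat index so Postgres doesn't scan every row, but the ranking semantics are the same.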
The article was aimed more at teams that don't have an existing Postgres setup and are evaluating standalone vector databases from scratch. If you're already running Postgres with pgvector, you're in a good spot.
On one hand, some jobs with a human element are safe, at least at first. Think of artists being made obsolete by the camera: portrait artists became mostly obsolete, but we still pay for art; it's the story behind the art that became important. Or: I still go to cafes with a nice atmosphere and friendly staff. There are restaurants with robot staff here in Japan, much cheaper, where after the meal you pay at the table without ever talking to a person. But it does not feel nice to sit in there, so I gladly pay a premium for the nice coffee.
On the other hand, it is not only software jobs that are in danger, but all office jobs. So a lot of people may suddenly be out of money. Say you open a cafe, but no one has money to come and pay. Society would have to change a lot from the current model to handle this.
Something similar is easy with Docker. Build the image when releasing to your first stage environment, then deploy the very same image to the next stage, and so on until it reaches production. Nothing can break in between, and there's enough time to test.
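The invariant behind this pattern can be modeled in a few lines (a toy sketch, not real Docker tooling: the "digest" here just stands in for an immutable image digest):

```python
import hashlib

# Sketch of "build once, promote the same artifact". The image built at
# release time is identified by an immutable content digest; every later
# stage deploys that exact digest, so nothing can change between stages.

def build_image(source: bytes) -> str:
    """Pretend build: identify the artifact by a content digest, like a Docker image digest."""
    return "sha256:" + hashlib.sha256(source).hexdigest()

def deploy(stage: str, digest: str, registry: dict) -> None:
    registry[stage] = digest  # every stage receives the very same digest

registry = {}
digest = build_image(b"app source at the release commit")
for stage in ["dev", "staging", "production"]:
    deploy(stage, digest, registry)

# One build, identical bits everywhere.
assert len(set(registry.values())) == 1
```

With real Docker the same effect comes from pushing once and deploying by digest (or an immutable tag) rather than rebuilding per environment.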
I speak multiple languages fluently, and people are always surprised when I share that my vocabulary is seriously limited. I've learned it's an advantage: I am forced to use simple words to explain things.
On the opposite end: I had a coworker, and I only ever got about 30% of what he said. I thought it was my Japanese skills; he used complicated sentences and words all over the place. But when I asked other Japanese coworkers, they told me they could not understand him either.
I preach to everyone: fail as loudly and as fast as possible. Don't try to "fix" unknown errors in code. This often catches fresh graduates off guard, but if you fail loudly and fast, most issues are found and fixed as soon as possible.
I once had to help a team clean up after a bug that silently corrupted data for a while before being found. It had been deployed too long to roll back, and they needed all the help they could get to identify which data was real and which was wrong.
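The difference between loud and quiet failure fits in a few lines (a toy example; the function names are made up):

```python
# "Fail loudly": validate input up front and raise, instead of papering
# over unknown errors with a broad except.

def parse_price(raw: str) -> int:
    # Loud: bad input stops the pipeline immediately, close to the cause.
    price = int(raw)  # raises ValueError on garbage input
    if price < 0:
        raise ValueError(f"negative price: {price}")
    return price

# The quiet alternative hides the bug until the data is corrupted downstream:
def parse_price_quietly(raw: str) -> int:
    try:
        return int(raw)
    except Exception:
        return 0  # silently wrong: "N/A" becomes a real-looking price of 0
```

The quiet version is exactly how data rots silently: every bad record becomes a plausible-looking zero, and nobody notices until much later.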
Maybe the phone market is still in its golden age? It's a wonder you can get the best phone in the world for only a little over $1,000, and everyone can buy it, online, without standing in line.
Compare that with luxury watches, which cost one or two orders of magnitude more and are beaten in precision by a $10 Casio.
I guess the thing to watch out for is technical stagnation and "good enough"? Uh..
Well it fits into the news this month:
UT2004 got its latest patch, Diablo 2 got a new expansion. Why not connect a 2003 iBook to download the latest updates?
The Game Boy Advance could run 2D games (and some 3D demos) for 16 hours on 2 AA batteries.
I wonder if we could get something more efficient with modern tech? It seems research made things faster but more power-hungry, and we compensate with better batteries instead. I guess we could, and it's a matter of design goals; I also do love a backlit screen.
E-ink uses energy when changing state, so a 30fps 3D game would require a lot of it. Also, e-ink is electromechanical in nature, so there would be a lot of wear as well.
Yes; yet... I thought efficiency per unit of compute has more to do with process-node shrinks of the die than anything else. That, and power use being divided across so many more instructions per second.
I use M-DISC and am sure the discs will stay safe for a long time. What I worry about are the drives: it seems the business of making optical drives is not profitable, so companies exit or scale back.
This is one reason I'd like to see a fully Open Source hardware+firmware optical drive. Probably best to start with CD-ROM, but DVD might also be possible. The optical and mechanical parts seem relatively simple, especially when you're not optimizing for minimum cost or minimum size (meaning you could use the original Philips-style swing-arm mechanism). From what I can tell, the most complicated part is the signal processing, and with modern hardware that looks practical to do in software. I'm not sure how far you could get with home-scale DIY construction, but CDs worked with late 70s technology, so at least that far should be possible.
Setting up an ingestion pipeline into my existing DB, versus ingesting into yet another DB: the second option doesn't solve a problem I actually have.
If there were one thing I wish pgvector were better at, it would be composite indexes (i.e., find vector where category = X). But it's a minor point.
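For context on that wish: in SQL you can already write something like `SELECT id FROM docs WHERE category = 'news' ORDER BY embedding <-> :q LIMIT 5;`, but the filter is applied around the vector index rather than inside a single composite index. The filter-then-rank logic being asked for, sketched with made-up data:

```python
import math

# Sketch of "find vector where category": filter rows by an attribute,
# then rank the survivors by vector distance. A composite index would do
# both in one structure; without it, the filter wraps the distance scan.

rows = [
    ("a", "news",   [1.0, 0.0]),
    ("b", "news",   [0.7, 0.7]),
    ("c", "sports", [0.9, 0.1]),
]

def l2(a, b):
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def search(query, category, k=1):
    candidates = [(rid, emb) for rid, cat, emb in rows if cat == category]  # filter first
    candidates.sort(key=lambda r: l2(r[1], query))                          # then rank
    return [rid for rid, _ in candidates[:k]]

print(search([1.0, 0.0], "news"))  # "a" wins; "c" is excluded despite being closer
```

The pain point is that when the filter is very selective, an ANN index scan can return too few matching rows, which is why filtered vector search is an active area of work in pgvector and elsewhere.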