It does make me question humans and thinking but in the opposite direction.
It is like sitting down at a piano, sight-reading a piece from sheet music, and then someone who has no idea what they are talking about claims you composed the music on the fly. Then when you point out the sheet music, they double down on some bullshit about why they are still right and that still counts as composing, even though it obviously does not.
Best analogy so far. I am adopting this for the next wave of "wait until the next model" and "but humans hallucinate, too" comments. Yes, when we feed our own output (language on the web) back into ourselves, things become tricky to tease apart, and it can seem like intelligence to us. Then again, the Mechanical Turk appeared intelligent, too. Once we point out how it works, the "magic" should vanish.
I think it is pretty cool for the first time trying something like this.
It seems like chain of thought combined with search. It looks for thirty-some references and then comes back with an overview of what it found. From there you can dig deeper, ask it something more specific, and get thirty more references.
I have learned a shitload already about a subject since last night and found a bunch of papers I hadn't seen before.
Of course, depressed, delusional, baby Einsteins in their own mind won't be impressed with much of anything.
You also didn't have "sampler music". People did creative things with samplers but people didn't seek out music based on the sampler.
As mentioned, current AI music sounds like it was trained on recordings of Muzak, like what they used to play in elevators and at Denny's.
I would say the progress in AI music is basically non-existent. Ironically, MusicLM can make some very unique and interesting sample material. The more popular models though are just slop.
This is my first time using anything from Perplexity and I am liking this quite a bit.
There seems to be such variance in the utility people find in these models. I think it is like how Feynman wouldn't find much value in what a language model says about quantum electrodynamics, but neither would my mom.
I suspect there is a sweet spot of ignorance and curiosity.
Deep Research seems to be reading a bunch of arXiv papers for me, combining the results and then giving me the references. Pretty incredible.
It all comes down to a religious faith in AGI or not.
There can't be things a human can program that AGI cannot, or it is not "AGI".
While I am never a true believer in AGI, it seems to go like this: I get a little faith when a new model comes out, then I become increasingly agnostic in the weeks and months after. Repeat.
It is just very random. LLMs help me write a synthesizer using an odd synth technique in an obscure musical programming language with no problem, and help me fix my broken Linux system no problem, but then can't do anything right with the Python library Pyro.
I think that is why people have such different experiences. It all comes down to how well what you happen to be solving lines up with what the models are good at.
I think you are not understanding how difficult it is for people with types of phobias about germs and illness.
Imagine being obsessive-compulsive about cleanliness and germs, and then the pandemic comes along. You would never recover from that experience psychologically.
It would be worse than if someone with ophidiophobia woke up one day and snakes were randomly falling from the sky.
The worst of all though is an infection with no symptoms. Like an invisible prison the germ freak can never escape from.
What is bizarre is that we pretend there aren't at least hundreds of thousands of people in the country with these types of phobias.
I think the use cases are just too wide and varied. If someone made me the perfect DAW for my needs, it would be so narrow they would quickly go out of business.
The DAW has to cover so many different use cases and styles that I don't see how you can get around the complexity upfront.
A six-year-old could plug in a guitar and record the audio with any editing software. Once you want multiple tracks, real-time effects on those tracks, and MIDI, though, you are already at the flavor of most DAWs.
Couldn't agree more. It might be the craziest thing I have ever seen done in the browser.
It actually demotivates me to work on music and motivates me to work on some web app ideas.
I am not sure who the audience is though either. Reaper works wonderfully on Linux.
The issue is that any DAW, or really any musical instrument, is a massive investment in time to get good at. The money isn't really the bottleneck. I can easily get a reasonably priced flute on eBay. The reason I don't play the flute is the amount of time it takes to learn.