I don’t think this is true: software often operates on external stimulus and behaves according to its programming, but in ways that were never anticipated. Neural networks are also learning systems; they learn highly non-linear responses to complex inputs and can behave in ways outside their training. The learned function they represent doesn’t have to coincide with the training data, or even interpolate it; that depends on how the loss optimization was defined. Nonetheless, the software is not programmed as such: it merely evaluates the network architecture, with its weights and activation functions, given a stimulus. The output is a highly complex interplay of those weights, functions, and inputs, and it can’t reasonably be intended or reasoned about; you can’t specifically tell it what to do. It isn’t even necessarily deterministic, since random seeding plays a role in most architectures.
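To make the “merely evaluates” point concrete, here is a minimal sketch in plain Python of what the software actually executes for a tiny, hypothetical 2-3-1 fully connected network: fixed arithmetic over learned weights, with nothing resembling explicit instructions about behavior. The network shape and the random weights are illustrative, not from any real model.

```python
import math
import random

def forward(x, weights, biases):
    """One pass through a tiny fully connected network: the 'software'
    just multiplies, adds, and applies a fixed activation function."""
    for w, b in zip(weights, biases):
        x = [math.tanh(sum(wi * xi for wi, xi in zip(row, x)) + bi)
             for row, bi in zip(w, b)]
    return x

# Hypothetical 2-3-1 network with randomly seeded weights: the output is a
# pure function of weights and input, but nobody "programmed" its behavior.
random.seed(0)
weights = [[[random.uniform(-1, 1) for _ in range(2)] for _ in range(3)],
           [[random.uniform(-1, 1) for _ in range(3)] for _ in range(1)]]
biases = [[0.0] * 3, [0.0]]
print(forward([0.5, -0.2], weights, biases))
```

Nothing in that loop encodes what the output “should” be; the behavior lives entirely in the weights produced by training.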
Whether software can be sentient or not remains to be seen. But we don’t understand what induces or constitutes sentience in general, so it seems hard to assert that software can’t do it without understanding what “it” even is.
The way NNs, and specifically transformers, are evaluated can’t support agency or awareness under any circumstances. We would need something persistent, continuous, and self-reflective about its experience, with an internal set of goals and motivations leading to agency. ChatGPT has none of this, and the architecture of modern models doesn’t lend itself to it either.
I would however note that this article is about the cognitive-psychology definition of self, which does not require sentience. It’s a technical point, but important for their results, I assume (the full article is behind a paywall, so I’m sad it was linked at all since all we have is the abstract).
There is no software. There is only our representation of the physical and/or spiritual as we understand it.
If one fully were to understand these things, there would be no difference between us, a seemingly-sentient LLM, an insect, or a rock.
Not many years ago, slaves were considered to be nothing more than beasts of burden. Many considered them to be incapable of anything else. We know that’s not true today.
That is, until either some form of controlled random reasoning - the cognitive equivalent of genetic algorithms - or a controlled form of hallucination is developed or happens to form during model training.
Depends on the user. Basic LaTeX2e/LuaTeX can be learned in 5 days; guru level, like any programming language, needs its 10K hours. Some people have an aversion to backslashes, but "\" was probably chosen because it is about the only character not commonly found in ordinary text. Others, like ":", are very common. When parsing LaTeX (and behind it, Knuth's original TeX engine), the commands are swimming in a sea of text, as the Dragon book says.
One thing that has helped with ease of use is Overleaf. It is a hosted LaTeX editor with lots of collaboration features (leaving comments, history of edits) that let people collaborate in real time on a paper. It comes with many templates to get you started on a new document. If you're working with collaborators, it has a lock on the market.
LaTeX itself can be easy for simple things (pick a template, and put text in each section). And it can grow into almost anything if you put in enough effort. It is far and away the standard way to write math equations, so if your document has lots of formulas, that's a plus.
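As a tiny example of that strength with formulas: the quadratic formula, which is fiddly in most word processors, is one readable line of LaTeX markup:

```latex
% The quadratic formula, as display math in any LaTeX document:
\[
  x = \frac{-b \pm \sqrt{b^2 - 4ac}}{2a}
\]
```

The same source renders identically everywhere, which is a large part of why journals standardized on it.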
I settled on LaTeX with Tectonic. You could also leverage Playwright or similar for easy HTML → print-to-PDF without any weird libs (not great startup time, but you can batch many operations in one session).
# justfile ── put in repo root
set shell := ["bash", "-cu"]  # one shell → predictable env, pipe-fail, etc.

# Build a PDF once
pdf:
    tectonic -X compile src-v0.1/main.tex --outdir target/pdf  # or swap for typst
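The Playwright route mentioned above can be sketched roughly like this. The helper name and usage are hypothetical; the import is kept inside the function so the sketch loads even without Playwright installed, and note that PDF export only works in headless Chromium:

```python
def html_to_pdf(html: str, out_path: str) -> None:
    """Render an HTML string to PDF via headless Chromium (hypothetical helper)."""
    # Import inside the function so the file can be loaded without the dependency.
    from playwright.sync_api import sync_playwright

    with sync_playwright() as p:
        browser = p.chromium.launch()   # startup cost is paid per session...
        page = browser.new_page()
        page.set_content(html)          # or page.goto("file:///...") for a file
        page.pdf(path=out_path)         # page.pdf() requires Chromium
        browser.close()

# ...so when converting many documents, reuse one browser session and
# call new_page()/pdf() in a loop instead of relaunching each time.
```

That amortizes the slow startup the parent comment mentions across a whole batch.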
As I had the same feeling as you, I subscribed to a quarterly news magazine called Delayed Gratification. I feel it's a good balance between keeping up to date and not letting the news interfere with my daily life and emotions.
Even 5 seconds of Trump is enough to cause rage. I'm powerless to change what he says and does; the only thing I'm empowered to do is keep it away from me.
My friend's house (and entire town) burned down, so I'm following that news. But even 2 minutes of reading Trump + Republicans saying the fires happened because the LAPD chief is a gay woman, and I had enough for the month.
I think reducing the amount of news one consumes at least slightly improves things. Unfortunately, my wife and many of my loved ones are targets of Trump and his minions, so I can't totally tune things out, but consuming less news media at least makes for less internalization.
I’ll challenge you: what action are you, your wife, and your family going to take based on the news? If the answer approximates zero, then you are only stressing yourself.
It depends on who you are. If you're a straight white guy then completely zoning out might be the right answer. If you are an immigrant then knowing that ICE is raiding Chicago on Tuesday is very important information.
It might still be of introductory help to someone who has yet to formally learn what a language model is, what large language models are, and where things might be in the future.
BTW, there's no way to navigate your blog now? Is that good design?
What do you mean? The user can just use the browser's location field and go up a directory to /blog, where they should see links to published posts for April.
Perhaps if browsers had a 'go up one folder' button like file explorers, this might be reasonable, but it's pretty disingenuous to expect users to edit the URL bar for navigation. Even the simplest Web 1.0 sites still have basic navigation.
Craigslist.com is a masterpiece of usable information architecture in the Web 1.0/1.5 era. They had very minimal styling, close to browser default, and descriptive categories with all related sub-categories appearing underneath it.
Yet I'm constantly doing this on some of the most popular websites (eg. Amazon) to remove tracking portions and other unwanted cruft when sharing or archiving links.
I used to use TamperMonkey (I even wrote a somewhat popular script a long time ago) and more plugins, but I tired of the cognitive overhead of maintaining the setup, e.g. every time I formatted/migrated, or when my browser broke backward compatibility, or when sites changed and a mod was no longer being updated, or when my browser got slow and I had to isolate which plugin(s) were at fault.
I'm sure the plugins are great; I just personally find it easier now to remove things by hand (plus other sledgehammers, like disabling JavaScript altogether for the most egregiously annoying sites).
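Doing the by-hand cleanup above programmatically is a few lines of standard-library Python. This is a sketch: the parameter list is illustrative, not a complete inventory of tracking keys, and the example URL is made up.

```python
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

# Illustrative, deliberately incomplete set of common tracking parameters.
TRACKING = {"utm_source", "utm_medium", "utm_campaign", "utm_term",
            "utm_content", "fbclid", "gclid", "ref"}

def clean_url(url: str) -> str:
    """Drop known tracking query parameters, keeping everything else intact."""
    parts = urlsplit(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query, keep_blank_values=True)
            if k not in TRACKING]
    return urlunsplit(parts._replace(query=urlencode(kept)))

print(clean_url("https://example.com/item?id=42&utm_source=news&fbclid=xyz"))
# → https://example.com/item?id=42
```

The same idea is what most of those plugins do internally; the hand-rolled version just never breaks when the browser updates.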
It is clear to me that we are in a holding pattern for web components to become the norm and for component frameworks to become decoupled from application frameworks, so I have no idea what I will be using in 5 years.
So we'll arrive at something similar to where we are with JavaScript and jQuery, where the latter is less relevant now that JavaScript is appreciably better? Meaning that with web components as the norm, we might have less need for frameworks like React or Vue?
(I might be misunderstanding due to the sleep deprivation, so apologies if what I write comes across as gibberish.)
No, I think we have just as much of a need for frameworks, so you would have the same React for example, but instead of having JSX that is a combination of html5 elements and react components, you'd have JSX that is a combination of html5 elements, web components and fewer react components. That will allow for richer component libraries that share a lot more code between frameworks. In order to build those component libraries we're going to need frameworks that enable that.
We've experimented with combining React and Stencil to build in this way, but there were just too many problems: workarounds needed to get things working, and performance that just wasn't good enough.
It cannot ever be sentient.
Software only ever does what it's told to do.