Lua, a Misunderstood Language (2021) (andregarzia.com)
178 points by synergy20 on Sept 29, 2022 | 251 comments


> We usually start to count from one. When a baby completes its first roundtrip around the Sun, they’re one year old, not zero years old.

And before they complete the first trip, they're 0 years old, showing that you start from a count of 0. The one is the result from the first increment.

If I ask you to count the number of red balls in a bag with only 3 yellow balls, then the initial count in your head is 0, you inspect the balls one by one, never encountering a red ball, and thus never incrementing the count. And then you pronounce your final count of 0. So that's counting starting from 0.

What you call "starting at 1" is not so much the start as it is the first increment, which need not arise.

For arrays, the first element is the difference between a size 0 array and a size 1 array. So the choice between 0-indexed and 1-indexed is like the difference between pre-increment and post-increment. Do we associate the element with the array size up to the element, or to the array size up to and including the element? The choice seems arbitrary (personally I prefer using the same range of non-negative numbers for both, but language has a bias for associating "first" with index 1).


Nobody says that a baby is 0 years old. They use other words instead of numbers, or if they do use a number, it is X hours / days / weeks / months old.

0 in normal usage is always a negation of something existing. You don't start counting things that exist at 0 (as you pointed out in your example of the number of red balls in a bag that has none). However, position zero in an array in zero-indexed languages is the first position within the array.

There's no getting around the fact that higher level languages inherited 0 by counting byte offsets, rather than positions, even if those languages don't require that every index within an array have the same size in memory (or occupy a contiguous block of memory), and that this inherited usage is at odds with the plain-language meaning.


> Nobody says that a baby is 0 years old.

Totally irrelevant to the actual conversation but I have in fact put my infant child’s age as 0 because I didn’t think to count in smaller increments than years. So there is at least one person who says babies are 0 years old.


You start your argument with a misconception: 0 does not relate to the existence of something but rather to the lack of it. When a bank account reaches 0 it does not mean money does not exist, nor does it mean the account does not exist; it just means there are no units there of whatever you are counting...


> Nobody says that a baby is 0 years old

That's only because age can easily be measured and described in units smaller than whole years, while array indexes can't be divided.


Of course they don't. They use decimals and say the baby is one week old or so.

As for indexing from zero, it's best understood as indexing the position just before elements in an array. Where things will be inserted if you insert an element etc...


So, to use some pseudocode:

var item = my_array[2]

should be read as

"assign to item the value in my_array in the position where you would place something new just before the third item"

1-indexing is, strictly speaking, a better design. The only real argument for 0-indexing anymore in a higher level language is "it's what I'm used to", or alternatively, "0 already won the popularity contest" (which is actually not a terrible argument in context, though it is a bad one from a purely objective design perspective).


Saying it's strictly better is exaggerated. The benefits of 0 indexing are well known and mostly revolve around the fact you tend to have far less fiddling with -1 and +1 as you write code.

How do you access an item in a 2D array represented as a flat array? my_array[x + y*width]

But what if it all starts at 1? my_array[x + (y-1)*width]

I wrote a lot of code in both systems and I remember distinctly that 1-based indexing was just more work, in turn bringing in more off-by-one bugs.
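To make the arithmetic concrete, here is a minimal Lua sketch of the 1-based version of that formula (the helper names are made up):

    -- A width x height grid stored in a flat, 1-based Lua table.
    local width = 4
    local grid = {}

    -- With 1-based x and y, cell (x, y) lives at (y - 1) * width + x.
    local function set_cell(x, y, value)
      grid[(y - 1) * width + x] = value
    end

    local function get_cell(x, y)
      return grid[(y - 1) * width + x]
    end

    set_cell(2, 3, "hello")
    print(get_cell(2, 3))  --> hello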


Dijkstra's arguments [1] about the benefits of 0-indexing are about tricky edge cases when coding with arrays. (The example you give is not a correctness problem, but a memory-efficiency issue: if width=1, you'll not be using at least half of the structure's indices that you allocated). But Lua doesn't have arrays as a type, it only has a way of iterating over the array part of tables, its map data structure.

Idiomatic Lua code for tables doesn't run into the kind of edge cases Dijkstra presents, since the most readable code uses iterators to yield the values of array indices when one wants to access the array part of tables.
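For instance, a minimal sketch of that style, assuming a plain table used as a sequence:

    local fruits = { "apple", "banana", "cherry" }

    -- ipairs yields index/value pairs for the array part of the table,
    -- starting at 1 and stopping before the first nil, so no index
    -- arithmetic is needed.
    for i, fruit in ipairs(fruits) do
      print(i, fruit)
    end

    print(#fruits)  --> 3 (length of the array part)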

It makes perfect sense to add arrays as a userdata type. For these, there are benefits of 0-indexing, as seen in https://github.com/neomantra/lds - although mixing 0-indexed userdata with 1-indexed tables brings its own confusion.

[1]: https://www.cs.utexas.edu/users/EWD/transcriptions/EWD08xx/E...


>The benefits of 0 indexing are well known and mostly revolve around the fact you tend to have far less fiddling with -1 and +1 as you write code.

Depends on what you are doing. I seldom have to do a flat 2D array these days, but array[total-1] is an almost everyday thing.


1/52 of a year

3/12 of a year

even after a year, they might say 18/12 of a year.


While I agree with the points you made, you can't take the zeroth element out of your shopping bag; you take the first one. The array indexing operator gives you access to the nth element of your data store. IMO OP's point is valid.


You created an unfair definition of the array indexing operator. I can just as easily say that the array indexing operator gives you access to the element at index I, which starts counting at 0. That's not an argument.


> While I agree about the points you made, you cant take the zeroth element out of your shopping bag, you take the first one.

Where does the first centimeter/inch start on a physical ruler?


You're both right.

You want to measure the distance from a starting point (zero), the person you're replying to wants to count/label things from a starting point (one).

The distance method doesn't work as well for picking items from a bag, and the counting method doesn't work as well for calculating offsets.

They can both work for arrays, so neither is wrong.


I'm mad at myself for getting dragged into this, but. Leaving aside the fact that array indexes are not continuous in the way ruler measurements are, what does it say that you just called it the 1st inch, not the 0th?


> Leaving aside the fact that array indexes are not continuous in the way ruler measurements are,

I have always seen a ruler as a discrete sequence of centimeters even if the physical object is continuous

> what does it say that you just called it the 1st inch, not the 0th?

but it's literally what I'm saying aha. The index of the first element is zero, even on basic household objects.


> I have always seen a ruler as a discrete sequence of centimeters even if the physical object is continuous

I have trouble believing this. Does this mean you think of a ruler as a tool to assign lengths (continuous) to discrete intervals, rather than as a tool to (perhaps imprecisely) measure the (continuous) length?


for me a ruler is something that tells me how many centimeters there is in whatever it is I want to measure - but maybe it's for linguistic reasons? e.g. in french I'd say "y'a combien de centimètres là?" when measuring something long with a tape meter and someone's help, which translates to "how many centimeters are there right now?" more or less


The sequence of the cardinal numbers, i.e. of the equivalence classes of sets having the same number of elements, is 0, 1, 2, 3 and so on and those words in any language have been used originally for cardinal numbers, not for ordinal numbers.

For the sequence of the ordinal numbers, whose purpose is to identify the position of the elements of an arbitrary sequence, any sequence of arbitrary symbols may be chosen and fixed by convention.

Most languages had at least in the beginning special words for the first and for the second elements of a sequence, without any relationship with the cardinal numbers. Even in English that remains true, even if "second" is a more recent substitution of the older word used previously. Many languages have special words for the last element and for the element before it. Some languages had special words for the third element and for the third element going backwards from the last. So in some languages it was possible to identify the elements of a 6-element sequence without using words derived from the cardinal numbers.

However inventing a very long sequence of words to be used as ordinal numbers, in order to identify positions in sequences with more than 2 to 6 elements, would have been difficult, so in most languages someone noticed that there already is a sequence of words that everybody had to memorize when learning how to count and which had rules for being extended to any length. So the ordinal numbers were derived by using a suffix or some other derivation rule from the cardinal numbers.

There is no logical reason for using 1 for the first ordinal position, this is just a historical accident.

The reason is that the children have always been taught to count by saying 1, 2, 3 and so on, instead of being taught to recite the sequence of the cardinal numbers from zero.

All the languages have always had a word for zero, but those words were normally created by applying a negation to words meaning "something", "one" or the like.

Because of this, the words for zero were not perceived as having an independent meaning and there was no need to learn them separately when the recitation of the cardinal numbers was learnt.

Nowadays we have a much better understanding of the meaning of the cardinal numbers and we are aware that 0 is a cardinal number like any other, so the children should really be taught to count 0, 1, 2, 3 ... and not 1, 2, 3, ... like 5000 years ago.

In natural languages there is huge inertia. Even if one decided that from tomorrow the ordinal numbers should be 0th, 1th, 2th, 3th, 4th and so on, everybody would still have to know that in older writings the sequence of the ordinal numbers was 1st, 2nd, 3rd, 4th and so on, so changing the convention to a more logical one would bring no simplification.

On the other hand, in the programming languages you can ignore the legacy conventions and choose the best conventions. Using for ordinal numbers the sequence 0, 1, 2, 3 ... is the best choice for many reasons, which have been explained in the literature many times, e.g. by Dijkstra.

Choosing to start the ordinal numbers from 1 in a programming language just demonstrates a lack of understanding of what the cardinal numbers and the ordinal numbers are and a lack of practical experience in programming and of understanding of how the programming language will be translated into machine language.

The first programming language for which I studied the machine code generated by its compiler, when I was young, happened to be Fortran, which uses indices starting from 1. To this day I remember how ugly and error-prone I found all the tricks that the compiler was forced to use to avoid, in many cases, extra computations caused by the poor choice of the origin of the indices.


The point is valid, but the rationale is not, let me explain:

Caring about 0 based or 1 based indexing is, to me, a sign of someone who struggles with programming in general, or is stuck doing a lot of finicky conversion between the two.

Most modern, higher-level languages have generally abandoned explicit indexing; instead (even C++) they have something like:

"for x in y do z"

1-based indexing is a bit more readable, but in the end, you get hardly any benefit. There are better paradigms and algorithms which don't require indexing at all, and that IMO is the majority of what programmers are doing anyway. Even if you need to process two same-size collections at once (the majority of the remaining legitimate uses for index-based looping), you are likely working with pre-sorted data and should consider using a zip or pair, which eliminates the need for managing indexes.

You say "but aren't you just imposing your own style on others"? Not really, if we want clean, minimal code, there should be as few references to the underlying architecture as possible, even the fact that we are dealing with a list is an implementation detail (is it actually a list, a linked list, a stream, a dictionary, an event, etc...), so in the context of implementing a higher-level language, this point is not only irrelevant, but shortsighted. If you are creating and looping over lists, you are likely not doing anything interesting, which is the whole point of programming in higher level languages, to do interesting things simply, right?

As for the remaining use case when we do actually want array based access, typically you find this in high-performance, architecture aware applications - then we actually want random memory access. We may even want to deal with explicit memory offset, which is what 0-based arrays are good at doing (often times the array is syntax sugar and we are literally assigning to/from pointer offsets).

To bring this around to my previous statement: "Caring about 0 based or 1 based indexing is, to me, a sign of someone who struggles with programming in general, or is stuck doing a lot of finicky conversion between the two."

The reason this comes up in the first place is that there is a divide. 0-based arrays are arguably much better for low level activities, and higher level languages generally left them in as a familiarity. Translating from 1 to 0 based arrays is not any easier than translating from 0 based to 1 based arrays, 1-based arrays provide just as much confusion and less familiarity in these cases. This is not a good thing.

Now I suppose Lua is trying to be something weird, a "high level" low-level language. Maybe that's fine, but it is weird, and people are right to be put off by the change. If you are looping over bananas in Lua, maybe you're not really using it the way it was intended; if you are doing memory-level access, then it's unnecessary language cruft.


Except in Korea, where babies are considered to be age 1 starting at birth! https://en.wikipedia.org/wiki/East_Asian_age_reckoning#Korea


I wonder if it has something to do with a cultural perception of how life and the world in general is perceived. As in, "He's currently experiencing his first year of life" vs. "He has lived for one year", kind of "the journey" vs. "the destination". Seeing the present more than looking back the past. In the same line of thinking as "What you're doing" vs. "what you've done", or maybe even "making progress towards goals" rather than "accomplishments".

Seems like a healthier and maybe even more productive way to see the world. I don't know enough about Korea to say whether that has anything at all to do with the way they count ages, but I feel like it's an interesting thought regardless.


"Give me the 1st item" --> a[1]

"give me the 2nd item" --> a[2]

"how many items are in the array?" 5

"what is the index of the last item?" 5

I am used to 0-indexed and prefer consistency... yet there is an undeniable consistency to 1-indexed.

You're the person who brings back a dozen bottles of milk from the store when you see that they have eggs ;)


> "what is the index of the last item?" 5

Usually, the more interesting question is:

What is the index of the next item (when extending the array)? Which is 5 in 0-based indexing.


Is it? If you're doing low-level stuff maybe, but I really haven't done much of that so I'm not even sure I know what you mean by "extending" an array (presumably has something to do with allocating contiguous memory?) and I can't actually think of any situation off the top of my head where I would ever need "the last index + 1" what with array methods existing and all.


1-based vs 0-based indexing is such a tired controversy.


And a solved problem. Pretty much all the most popular languages have settled on zero-based indexing, who the hell cares what's more "correct".

What's closer to what programmers expect is a better choice. Being pedantic about "1 is actually better" helps no one.

It's the age old theory vs practice problem. Physicists vs engineers. Etc. This is why no one really cares about tau either.


> Pretty much all the most popular languages have settled on zero-based indexing.

Maybe true depending how you count, but there is plenty of code in languages with 1-based indexing too. My day-to-day these days is all MATLAB (1-based), except for the occasional optimized algorithm written in C. So I'm equally sensitive to being pedantic about "0 is actually better". They lend themselves well to different habits of reasoning. Switching back and forth can be painful if I don't take care.


This is perhaps an unintentionally ironic comment because the only languages that have any business at all being 1-based are MATLAB, Fortran, and Julia because they’re used to directly translate math notation which for series, sequences, and matrices is 1-based.

Lua’s one big niche is interfacing with 0-based C code in plugins, and this is horribly and completely unnecessarily painful because of its 1-based array design flaw.


0 based indexing was arbitrarily chosen due to the memory model of old programming languages.


I strongly disagree. The decision was not in the least bit arbitrary. Language designers chose zero-based indexing because they believed its advantages outweighed the advantages of one-based indexing.

And it still does, in my opinion. In the end, no matter how high level your programming language is, you'll eventually be computing offsets to a base address in RAM using indexes that are zero-based. Having that be consistent at every level has great value.


It was arbitrary in that they chose to use a leaky abstraction to make it easier for themselves to make arrays using their memory model. The first person to do this could also have subtracted one from the user’s index to get the memory offset.


Or they could have saved themselves a CPU cycle every time they needed to index an array.

Don't get me wrong, non-zero-cost abstractions are great when they improve developer ergonomics commensurate to how much they cost. Garbage collection at the expense of GC overhead? Yes please. Classes and methods and inheritance at the expense of vtable lookups? Sure!

One-indexed arrays at the expense of either giving up the memory of the first item or always subtracting every index operation? Eh... I think I'd rather count from zero.


That's not what arbitrary means.

Arbitrary means there was no logical reason behind the choice, they just picked one. You can dislike the reason all you want. You can disagree with their choice, or think there are more important reasons to choose what you think they should have chosen, but that doesn't make their choice arbitrary.


It was chosen because that's how pointer math works and nothing has happened to change that.

I can see some benefits of using 1-base, but on the whole, if 0-base has to be used anyway, I'd rather only use the necessary version so I'm not making off-by-one errors when switching languages.


Yes I understand pointer array math. The zero based indexing is great for that. It’s still a leaky abstraction to make the implementation correct. I’m also not saying it’s not my preferred way, but it’s not for any really good reason, it’s cultural.


Any fixed array range (at least starting index) is an example of worse is better, pushing the work onto the user of the language to map whatever the actual range should be into a 0-based (or 1-based) form. Different problems have different natural ranges, and you shouldn't have to write a routine that manages that for you. It results in one of several possible outcomes (when your range isn't naturally starting at whatever the language forces on you):

1. You don't help your own users, instead they have to know everywhere that they need to do `histogram[c - 'a']` to calculate the actual index (clutters the code, chance to forget something).

2. You do help your own users, but now they have to remember the function/procedure call to access it: `inc_histogram(c)`. Creating a plethora of setter/getter routines to gloss over this issue and bring performance back to straight array access.

3. You do help your own users, but they realize it's "just" an array and they can use `histogram[c-'a']` to access values (and set them directly) bypassing your API.

Better languages let you do this:

  histogram[c]++
Done.
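For what it's worth, Lua tables already accept arbitrary keys, so a rough sketch of the same idea in Lua looks like this (the example string is made up):

    -- Count letter frequencies keyed directly by the character,
    -- no offset arithmetic against 'a' required.
    local histogram = {}

    for c in ("hello world"):gmatch("%a") do
      histogram[c] = (histogram[c] or 0) + 1
    end

    for c, count in pairs(histogram) do
      print(c, count)
    end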


If you have an array A with indices from S to T, you can use as its base address the address of A[S] minus S multiplied by sizeof(A[S]), then your address calculations are the same as with 0-based arrays. So I'd say it's a problem of a memory model of one old programming language, that is, C.


Not really (the memory model of old machines was chosen too), but who cares?

0 based indexing is also slightly less prone to off-by-1 errors. But again, none of that matters more than keeping it standard.


Exactly, it’s an arbitrary cultural standard now.


That's the point I was making. It doesn't matter if it's arbitrary. The world runs on 0-based index languages. I couldn't care less to give that away for another arbitrary number. There is literally zero benefits and only downsides in getting used to it for any software engineer.


But software engineers aren’t the only ones who program these days. It’s true that many popular languages used by developers are 0-based, but the most popular language, Excel, is 1-based.

Non devs seem to prefer 1-based indexing in my experience. As a teacher of Java and C++ to new programmers, the 0-based nature of those languages is always a sticking point; it causes novice programmers to write incorrect code, which frustrates them and leaves them with the perception that programming is filled with arbitrary rules that are only there to confuse people. And they’re not so wrong, seeing as even programmers here can argue endlessly about 1 vs 0 indexing with no definitive answer.


If it was chosen for a reason, then it wasn't chosen arbitrarily.


It's amazing to me that the top comment on a post that has very little to do with indexing -- basically waves at the topic in passing -- is about indexing, AND that when it's pointed out that this is not a very scintillating topic of conversation in this day and age, all of the responses to that comment are just more pedantry about indexing. How does this trivial topic hold such fascination for people?


>If I ask you to count the number of red balls in a bag with only 3 yellow balls, then the initial count in your head is 0,

Sorry, no. Humans count from 1. That's just a basic fact reflected in the history of numbers, which at early stages often didn't treat 0 as a number, but as a special case. And if you would say "I counted zero red balls" most people around will find it an unusual wording. Normal way of saying it doesn't involve mentioning 0 at all: "It's empty", "There are no red balls" etc.


That special casing in English of no/none/empty is as much an artifact of lost germanic grammar cases in English as it is anything "natural" or inherent to how English speakers count.


I'm a member of multilingual family, and I'm inclined to insist it's not about Germanic grammar cases, because it's true for non-Germanic languages as well.


I just went "nearest ancestor up the stack" as a short hand, because the evolution of languages is a huge tree and a lot to talk about. If we want to get into it deeper, Proto-Indo-European had some truly fascinating grammar cases from what we think we've reconstructed of them. Most of the stuff that PIE did seems like "natural laws" simply because of how many modern languages we regularly see branched from it and how deeply rooted a tree in the language forest it is. But then we also have had chances to study non-PIE rooted languages and the "universals" are fewer than we think they are.


I mentioned history of numbers, and it starts not from PIE speaking peoples, so I'm a bit lost as for what exactly is your point.


I have no problems with people advocating for 1-based indexing, and I often disagree with EWD, but advocating for 1-based indexing without referencing EWD's reasonably well known take[1] is going to damage the ethos of your argument.

1: https://www.cs.utexas.edu/users/EWD/transcriptions/EWD08xx/E...


Thank you so much. I love the end:

"In corporate religions as in others, the heretic must be cast out not because of the probability that he is wrong but because of the possibility that he is right."


I'll trust math more than human language evolution as a solid basis for indexing.

Also Julia's (default) 1-based indexing has become a mess.


0-based indexing is not universal in math either. We often index time from 0, but when we refer to components of a vector, we often index those from 1. For example, I'd write or see things like $\sum_{k=1}^N f(x_k)$ where $x$ is a vector. In these cases, 0-based indexing would look out of place.


Pretty much all programming languages designed for math are 1-based. Besides Julia, there's R, Mathematica (or "Wolfram language" as it is called these days), and Matlab.


I am curious, what mess did 1-based indexing create for Julia ???


My understanding (outside the community, I use it but am not hugely familiar with it) is that the 1-based indexing is the default, but you can select any range. Routines that assume 1-based indexing then break on anything that, well, isn't 1-based.

See: https://docs.julialang.org/en/v1/devdocs/offset-arrays/

Discusses how to generalize the code so it handles arbitrary ranges.


IMO 1-based and 0-based is a wash as to which is better, but the only definitively wrong answer is an arbitrary base. It may seem like a good compromise (just use whichever you prefer!), but it opens the door to a Mars Climate Orbiter disaster waiting to happen.


Arbitrary indexing is fine as long as people are sane about how they do things and don't assume. In almost every language these days you can use this or something analogous:

  for x in some_collection
Which is great if you don't need the index, and if you do need the index then you want the equivalent of:

  for i in some_collection.range
or

  for i, x in some_collection.enumerated
Being forced to use 0-based or 1-based or even strictly integer-based indexing is the wrong thing. If you want to use characters/runes as your index you should be able to. If you want to use some enumerated type, you should be able to. And if you want some subrange of one of those, you should be able to.


> as long as people are sane about how they do things and don't assume.

Ime this expectation of developers doesn’t scale reliably. We can’t even trust engineers to keep feet and meters straight. Relying on conventions for correctness is a recipe for disaster.


> If I ask you to count the number of red balls in a bag with only 3 yellow balls, then the initial count in your head is 0, you inspect the balls one by one, never encountering a red ball, and thus never incrementing the count. And then you pronounce your final count of 0. So that's counting starting from 0.

Well, just like lua:

    > a={}

    > print(#a)

    < 0


I can't believe people are still seriously trying to argue this.

You count "things."

You don't count "not things."

Arrays are containers of things. Starting with 1 is overwhelmingly likely the thing that makes sense when using them.


Arrays aren't just used for counting though. For example, the origin in cartesian coordinates is "0,0" and not "1,1". Not to say that I disagree, I don't think I've come across a situation where 0-based indexing would have been better for me personally. But there are definitely a lot of situations where I can imagine it would be preferable, like pretty much whenever you're doing math with them.


I'd say containing things is just one function of arrays. The other arguably more important function is "naming" things using numbers.


Sure, but it seems like the latter isn't hurt by 1 indexing either, especially if the "names" are arbitrary, and if not, use associative?


Author here, thanks for linking my article.

It is a bit of a curse that every time this article ends up anywhere, the discussion quickly devolves into the merits of 0-based indexing. I have a very strong opinion on this, which is that "it doesn't matter". You use whatever paradigm is real in the language you're working in. The context switch is not larger than the context switch between languages anyway. Lua is not the only 1-based indexing language out there; it used to be much more prevalent. I bet if JS had used 1-based indexing, half the people on this comment thread would just be praising the benefits of 1-based indexing. Sorry for the rant, but the reason I wrote that article in the first place is because of such threads.

One thing I agree with many people here is that my baby example is not the best example. It is not wrong, but it is phrased in a confusing way. I'm tempted to edit the article and fix it or replace the example with something better. After all, the example is not important, it was just an example.

One aspect that many people don't realise, one that I couldn't really dive into in the article, is how simple and small the Lua source-code actually is. IIRC it is about 70 files of plain old C that can be compiled by any C99 compiler. That is very refreshing. The amount of power and flexibility you have with Lua when paired with the simplicity and maintainability of the Lua source-code itself is something we should all cherish and praise. Lua is small and nimble, it is small enough that you can vendor all of it in your project if you so want.

Anyway, I like 1-based indexing; it makes my loops prettier. And yes, it was just a joke, I'm using iterators like everyone else and indexing hardly matters.


Oof. I feel your pain on the indexing arguments. Thanks for the post, I found it informative.


I think the reason is that the indexing paragraph comes across as quite misinformed. The age of the baby would be 1 year even in 0 based indexing (age is a duration, not an index). And pointers are only one of many places where 0 based indexing is more natural.


I use Lua a fair bit in my day job. The comment about it being a minimalist toolkit that allows you add scripting on top of domain specific tools is spot on. That is exactly the use case that Lua shines in.

My favorite feature is the way that tables and closures allow you build up just as much of OOP as you need for the application. It feels very empowering. I'm working mostly solo, so I've been able to establish my own conventions and stick to them. Doing this as a large team would be fraught with peril and require strict standard practices to maintain an intelligible code base.

I was surprised the author didn't mention my #1 complaint: variables default to global scope AND are implicitly initialized. If you misspell a variable name Lua will happily initialize it to nil and run with it. This almost always results in subtly incorrect behavior instead of an error.


Amusingly, in a sense, your complaint about global variables is because of what you like about how generally simple Lua is plus how powerful its tables can be.

i.e. The global scope is a table, and variables are keys on that table, so of course the value of any variable is `nil` because that's how tables work and Lua isn't going to special case a table just because it's a magic part of the environment.

...but, naturally, since the global environment is a table, you can set a metatable on it and require that variables be declared before you use them. :D

See: https://www.lua.org/pil/14.2.html
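A minimal sketch of that idea, loosely following the "strict" pattern from PIL (the declare helper is a made-up name):

    local declared = {}

    -- Globals must be introduced on purpose through this helper.
    local function declare(name, value)
      declared[name] = true
      rawset(_G, name, value)
    end

    setmetatable(_G, {
      __newindex = function(tbl, name, value)
        if not declared[name] then
          error("assignment to undeclared global '" .. name .. "'", 2)
        end
        rawset(tbl, name, value)
      end,
      __index = function(_, name)
        if not declared[name] then
          error("attempt to read undeclared global '" .. name .. "'", 2)
        end
      end,
    })

    declare("greeting", "hello")
    print(greeting)   -- fine
    print(greetng)    -- error: attempt to read undeclared global 'greetng'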


To be clear, I'd like all tables to throw an error on unresolved key names. And I can't think of any cases where the current behavior is necessary for any of the features I like.

I think the design decision to have unresolved names result in defining a new variable, initialized to nil, is the result of a different design goal. I'm not sure what that goal was, something about fault tolerance maybe. In practice I find this feature means I need very thorough test coverage, or a metatable, as you've noted, or a static analyzer, as others have noted. Or a combination of all the above.


> unresolved names result in defining a new variable, initialized to nil

This is not what Lua does. All table lookups return a value corresponding to a specified key, if there is a correspondence. Or simply nil, doing nothing in this case. There are no global variables per se, only a table that non-local lookups end up into.

> In practice I find this feature means I need very thorough test coverage, or a metatable, as you've noted, or a static analyzer, as others have noted. Or a combination of all the above.

You may choose a language that fits your needs rather than trying to use something that doesn’t. E.g. I think Python more or less matches your expectations here. Lua doesn’t have to please everyone, and trying to make X out of it makes no sense either for you or for Lua's designers, if X already exists.

That said, I see this as an overdramatization. Strict metatable on globals solves the issue with “automatic global variables”. But other things like static analysis and thorough testing will be required (and require your work too) in any language to achieve the “high-horse” level of guarantees.


Matter of taste, then, I think. Or of how you're thinking about the language, maybe?

It's not really that they're "defining a new variable" -- absent metatables, all variables always exist, and there's not a difference between one that you've accessed and one that you haven't. Behind the scenes there might be a memory allocation difference between a variable that's never been touched and one that's been used and then set back to nil, but to the user that shouldn't really matter.

I'm guessing here, but the goal is probably one of simplicity -- this means there doesn't need to be special language features around declaring variables that users need to be aware of. It ties into the article's discussion of 1-indexed arrays, as things that make sense to humans who haven't done weird things to their brains as we programmers have.

(Granted, `local` is somewhat a language feature for declaring variables, so there's some compromise here.)

I'd probably align it with OOP in the Lua discourse. It's something that the language doesn't provide by default because it's simpler not to... but if you want it, the language makes it pretty easy to add through metatables.


Also, do not forget the easy and powerful sandboxing that you can achieve with Lua by selectively including/excluding parts of its std library at compile time and running Lua itself in a thread with no (or minimal) shared state. This alone is priceless when embedding scripting into an application.
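As a rough sketch of the script-side half of that (assuming Lua 5.2+, where load accepts an environment table):

    -- Run untrusted code against a whitelisted environment only.
    local sandbox_env = {
      print = print,
      math = math,
      string = string,
      -- deliberately no io, os, require, load, dofile, ...
    }

    local untrusted = [[
      print("from the sandbox: " .. math.floor(3.7))
      -- os.execute(...) would fail here: 'os' is nil in this environment
    ]]

    local chunk, err = load(untrusted, "user_script", "t", sandbox_env)
    if not chunk then
      print("compile error: " .. err)
    else
      local ok, run_err = pcall(chunk)
      if not ok then print("runtime error: " .. tostring(run_err)) end
    end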


I always add a "__newindex" meta method to the globals table before running any code to notify when unwanted globals are accidentally created.


Regarding globals, defaulting to local scope is a thorny design choice. Try this in python:

    def make_counter():
        def incr(x):
            total = total + x
            return total
        total = 0
        return incr
I think Lua would be a better language if they were to eliminate global variables entirely, and flag as a load-time error any undeclared variable that is not present in the environment supplied to `load` or `loadfile`.

Setting __newindex on the global table to throw errors is a not-too-bad solution, but these will be run-time errors, so to catch them all you need good code coverage (yes, you want that anyway) and uncouth third-party libraries can trigger those errors.


EDIT: re-reading your comment I think I see now you were agreeing with me. :-) Leaving this here as elucidation.

That python code throws UnboundLocalError, which seems like what it should do to me. If you move total = 0 to before def incr(x): then incr closes over the local total variable. Every call to make_counter returns a new counter with its own state.

The equivalent Lua defines total as a global (whether total = 0 is above or below). Multiple instances of incr are all changing the same global variable. That doesn't seem like the intended behavior.
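A small Lua sketch of that difference (names are made up):

    -- Without 'local', total (and incr) are globals shared by every counter:
    function make_counter_global()
      function incr(x)
        total = total + x
        return total
      end
      total = 0
      return incr
    end

    -- With 'local', each call closes over its own upvalue:
    local function make_counter()
      local total = 0
      return function(x)
        total = total + x
        return total
      end
    end

    local c1, c2 = make_counter(), make_counter()
    print(c1(5), c1(5), c2(1))  --> 5   10   1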


Whoops, actually the python version only works if you move total AND add a nonlocal keyword.

https://docs.python.org/3/reference/simple_stmts.html?highli...


The existence of `nonlocal` is an indictment of the local-by-default design.

(Parenthetically, for completeness, I should note that this is a problem only when the language supports mutable variables or, more precisely, mutable captures or "upvalues", which I think are also a bad idea, but that's another bag of worms for another day.)


Python would be a better language if it had Lua's local. Because Python would have true lexically scoped variables that way, instead of the current system.


I personally like the php `use (...)` approach

https://www.php.net/manual/en/functions.anonymous.php


Total occurrences of nonlocal in Python are tiny. Typoing variables is a constant fact of life.


> If you misspell a variable name Lua will happily initialize it to nil and run with it. This almost always results in subtle incorrect behavior instead of an error.

If your code editor doesn't warn you that you're potentially reading a variable before writing to it, then you need a new code editor.


Originally it was a configuration language. That global syntax makes it friendly to people writing configuration files. Anyway, if that bothers you, you might want to use luacheck or some other linter which will pick it up.


> my #1 complaint, variables default to global scope AND initialized.

Lua, originally, was a configuration language. You know, the sort of use-case you use a .ini (or these days, YAML) file for. Except you have realized you need some logic to set your flags, so you decide to make your config file programmable.

It has evolved to its current form but these roots remain. They should have added a 'strict' option.


Lua never initializes a variable with nil behind your back. Lua simply returns nil for a variable that doesn't exist.


Are there any static analyzers that help with that or is it too dynamic?


Yes, luacheck for example.


I highly recommend if you're interested in Lua checking out a few of my favorite Lua projects...

1. Moonscript (https://moonscript.org/) the compile-to-lua language powering Itch.io, it's amazingly pleasant to work with and I was amazed porting a JavaScript tool to Moonscript that the moon version was more readable despite my lack of familiarity with the language.

2. Redbean (https://redbean.dev/) the single-file, cross-platform binary magic web server that's been all the rage lately. Super fun to embed Lua apps into a single binary and build small web apps.

3. Pico-8 (https://www.lexaloffle.com/pico-8.php) the famed "fantasy console" a fake old computer with fake old requirements that forces you to get creative as you write games and apps with the limited resources, of course, all apps for Pico-8 are written in Lua.

I think the moral here (especially with the last two projects listed) is: if you like it when computers get weird, you'll like Lua. I also recently compiled the Lua runtime into wasm to make some of my scripts portable to the browser; it was simple and easy, including modifying the language code just a bit.

Lua is a little wacky, but there's something so fun it brings me back to the early days of writing code when everything was simple and just a little off.


I'll add LOVE (https://love2d.org) to that list as well. Very powerful and flexible game framework.

Currently working on developing a game in Minetest (https://www.minetest.net/), it is also scripted in Lua.


I'll add LOVR (https://lovr.org/), the 3D analog to LOVE. Haven't used it personally so ymmv.


LOVR is amazing and getting better each day. It's just an amazing tool for throwing up a basic 3D playground for data visualization or for playing with visualized math. The interpreter is tiny in size but feature-full (drawing simple shapes & text, a powerful low-level graphics API, spatialized audio, a physics engine!). The code is simple and elegant C. The community is tiny but wonderful and welcoming.

(disclaimer: I'm a minor contributor)


Any relation with LÖVE or is it just piggybacking on its name?


It's piggybacking on its name because its API is heavily inspired by LOVE.


Odd to suggest that if they're interested in Lua, that they should check out Moonscript which is a different language altogether (although it compiles to Lua). But if you insist, something a little more Lua-ish is Teal[1] (gradual types ala TypeScript) or Pallene[2] (companion typed subset of Lua meant to generate optimized C libraries for use with Lua).

[1]: https://github.com/teal-language/tl

[2]: https://github.com/pallene-lang/pallene


There's also my IDE which just bolts on types (both structural and nominal) to regular Lua - no transpiling as the types are defined with comments (or inferred).

https://github.com/Benjamin-Dobell/IntelliJ-Luanalysis/

Unfortunately, I haven't been able to give it as much attention as it deserves recently.


Also TIC-80, a fully open "fantasy console" that uses Lua by default, but also supports other languages like JavaScript and Fennel (a Lisp-like variant of Lua).

https://tic80.com/


To clarify, Fennel compiles to Lua, so you can use it in most places that use Lua. TIC-80 appears to embed Fennel, so you don't have to bring the Fennel dependency.


These aren’t Lua projects per se but they are configurable/scriptable using Lua:

Neovim [1] and WezTerm [2].

Neovim is the "hyperextensible Vim-based text editor”--quite the understatement IMHO.

WezTerm is an incredibly well done, GPU-accelerated cross-platform terminal written in Rust and uses Lua for configuration.

[1]: https://neovim.io

[2]: https://wezfurlong.org/wezterm/index.html


Also AwesomeWM [1], to continue the "Lua scriptable Unix desktop".

[1] https://awesomewm.org/


I would add Defold to that list, it's a pretty nice little engine if you're making casual games. I used it for my team's project for this year's TOJam.

https://defold.com/


One Lua thing that is wacky is that you cannot have large integers as keys to a table. Lua converts {[2938433]="hello", [983748323]="hi"} to positions in the array portion of a table and creates a giant 99.999% empty table eating all the RAM. Unless somebody knows something I don't about how to make it work.


i don't think that's true - lua stores sparse arrays efficiently. the table above should only have 2 slots allocated. see [1] or [2] for details on how lua implements sparse arrays.

[1] https://cacm.acm.org/magazines/2018/11/232214-a-look-at-the-... [2] http://www.lua.org/doc/jucs05.pdf


You are right. This code runs just fine on both of these online Lua websites:

    local tab = {}
    tab[938388893] = "hi"
    tab[987383332] = "Hello"

    for k, v in pairs(tab) do
      print(k)
      print(v)
    end

https://www.lua.org/cgi-bin/demo https://luajit.me/

But when I try it using the Sol library (embedded in C++) it eats all my memory. Is it a Sol issue or a C Lua binding issue? Anybody know?


sorry, i'm not familiar with Sol. might be worth inquiring on the lua mailing list


> simple and just a little off.

Great way to put it. It's a very charming language. Once you learn the (very few) things that are off and can trip you up, it's really pleasant too.


I discovered Moonscript with Lapis web framework. Then OpenResty (NGINX + Lua). Love it!


The popular hit game Project Zomboid is also in Lua I believe.


I’ve just released a demo of my science fiction no combat RPG (think Disco Elysium), also made in Lua

https://spader.zone/firmament-1.0.0/


The game itself is Java, but the mods are Lua


> When a baby completes its first roundtrip around the Sun, they’re one year old, not zero years old.

Lol, well that's a terrible point to make. When a baby is born, it's zero years old. Human beings literally start at zero...


He phrased it wrong. What he probably meant was: The first year in the baby's life is – well – the first year, not the zeroth.

It doesn't matter much, because we will never reconcile the "I'm counting objects" and the "I'm pointing at addresses" (or the "I'm doing math… something something Dijkstra") people.


there's definitely something deeper and more significant between start at zero and start at one.

something which mathematics alone is not sufficient to uncover, it requires more foundations/logic to be relevant; it requires computer science (but focused as 'computology' and pure theory).

in mathematics this becomes more apparent with ring algebraic-structures; and when considering exponentials/logarithms/roots and powers.

a way to bring into focus what I'm trying to point out, is to consider the historical development of counting numbers. In a sense, it is not until the foundations are thoroughly considered (which is more logic than mathematics) that the importance of starting from zero really shines. In this 'historical sense' we started at one, and then went back and realized that it's more 'technically correct' to start at 0; this same foundational redefinition makes it important to note that zero is the only natural without a predecessor.

an alternative way to point at this 'deep and significant' issue, is to consider how zero is the additive identity and one is the multiplicative identity; algebraic-rings having both, and the distributive property being the axiom which links both. (I tend to focus on how both elements are identities; they're both instances of an 'identity' element; two sorts of identity).


To be fair, this is also a matter of culture and convention.

> In Korea, you are already one year old when you are born.

> In Korea, you "age" a year every New Year rather than on your birthday

https://www.omnicalculator.com/everyday-life/korean-age#:~:t....


If the Korean child is born on the 31st of December at 23:59 it will be 2 years old and 2 minutes old at the same time.


that's interesting. How do they figure out school cohorts with such a system?


So if you're born on the 31st of January, you're two years old one day later?


December* and yes


Having heard _insane_ crap like "No one uses Lua in production" at a Fortune company (while the genius who uttered that sentence had a device in his pocket that was _definitely_ running Lua), I agree that it's a misunderstood language.

However, if anyone went as far as learning about things like arrays indexed at 1, they are far more aware than the average software engineer. They actually spent time looking at it.

Lua is a victim of its own success. It's small (the runtime is tiny and can be made even smaller by shedding features), it's mostly uncontroversial, it does the job, it's fast. It can be easily sandboxed so you don't hear much about vulnerabilities. It's just there, invisible, in almost every modern computing device.

The games industry is an exception. They have readily embraced it, for very good reasons. Browsers should have used it from the get go, but that ship has sailed.

Next time you think about having scripting abilities in your application, consider Lua. The same way someone should consider SQLite if they are contemplating creating a new file format for their application. It's a pretty good tool.


Lua's problem is that JavaScript exists. They are fairly similar, but there are a lot more JS devs. QuickJS is about the same speed and size, so a lot of teams seem to be gravitating in that direction.


Also came here to comment about 1-based arrays :-)

It's not about which convention is "right" and which is "wrong" — they are all conventions, after all. It's about context switching, and the additional cognitive load that a programmer coming from a language where arrays start at 0 will have. An entirely unnecessary chore.


And, in particular, Lua is expressly designed to be embedded in C. By design, in order for it to be useful, you're expected to define your own functions in native code and make them accessible through Lua.

Given that, no matter what, you're going to be stuck context switching, since C is zero-based. I wish Lua had picked 0-based indexing not because it's better, but because it's consistent with the host language Lua is designed to be embedded in.


Fun fact about this: Lua is embedded in MediaWiki (the software that runs Wikipedia and also a lot of other wikis). And in MediaWiki template variables are also 1-indexed (no comment on this particular design choice). So the 1-based indexing of Lua becomes an utterly fantastic feature not just because it's convenient but also because a lot of people writing code are super new to programming, and any barrier to entry (like having to shift your indices when you go from the template to the module you're invoking) would potentially stop someone from writing any code at all in the first place.

So for this one use case at least, the 1-based indexing is pretty nice.


Wasn't Lua created by a professor? People in academia tend to develop projects aimed at people with no real-world experience, and often have these weird design choices that do not map well onto the real world with real code.

OT: which is why I think really great stuff like Racket (maybe even Haskell) have a hard time breaking through. The intended audience is people that have never seen a programming language, not tired engineers with deadlines.


There’s a history of Lua somewhere on the website. It wasn’t created as an academic exercise, but as a configuration language for the Brazilian national petrol company. It grew into a full scripting language.


Sometimes uniformly using 0-based indexing is simply moving a cognitive burden to a later time or to a different person. For example, some bright mind in my industry decided that all numbering of quite real and physical entities (not addresses or offsets), like chassis, devices, blades, ports, sessions, etc., should be 0-based. This led to a huge number of issues writing code, writing conversions, debugging, and most of all testing. Every time you need something wired to port 1, you now also need to clarify that it's not the first port, and so on. But we got a very hip and modern numbering system instead. Amazing innovation :)


One translates index into offset nicely, the other doesn't.


You never think about offsets in Lua.


Offsets are important in a programming language where you are regularly dealing with memory addresses, not nearly so much in one where memory access is almost completely abstracted away.


> they are all conventions, after all

If we discovered aliens and learned their programming languages, which is more likely: That they use 0-based arrays, or 1-based arrays?


It is ludicrously unlikely that aliens would be comprehensible at all. Several of Lem's SF novels try to get this idea across, Solaris is most famous, though I'm more of a Fiasco person myself. An octopus is very closely related to you, far more so than any conceivable space aliens - and yet we have no idea what the fuck is going on in there. When you look at an octopus in an aquarium is the octopus wondering whether you are self aware?

We have assumed that aliens would do arithmetic, because we do arithmetic, and it seems very fundamental to us, but that might just be a matter of perspective. So the aliens might have no use for programming languages, or their languages might have no more built-in concept of "0-based arrays" than our languages have of "Happiness versus sadness".


> An octopus is very closely related to you, far more so than any conceivable space aliens - and yet we have no idea what the fuck is going on in there.

If octopuses really are so closely related, why can't I see any sort of large structures or societies or vehicles or weapons created by them? They're a plateau'd species leeching off the planet barely surviving in the balance of nature, like pretty much every species except humans.


> They're a plateau'd species leeching off the planet barely surviving in the balance of nature, like pretty much every species except humans.

That is a ghost pepper spicy take.

In what way does any species plateau? An extant species is either a) Going extinct, b) Well adapted to its environment, or c) Adapting to evolutionary pressures through natural selection.

Instead of "leeching" I would describe them as living in symbiosis with their ecosystem.

Instead of "barely surviving in the balance of nature" I would describe them as... well, surviving. And that against the onslaught of the Anthropocene.

I like humanity as much as the next guy but your comment has a stunning degree of anthropocentrism.


I interpreted their comment a different way. They're raising a question I've always wondered...

Why are humans the only species to have reached such an incredible level of intelligence relative to all the other species on the planet?

Every other creature has plateaued in intelligence. Some are more intelligent than others, but have any of them even learned to create tools beyond bashing things with rocks and sticks? Have any learned to build things beyond a basic nest? None of them have even learned language beyond a few basic noises or bits of body language to express aggression, submission, and maybe happiness.

The usual answer is "Humans are the only species where learning these gave an advantage", but I think that's hogwash. Imagine how much more successful a prey species could be if they had better ways to communicate that they spotted a predator nearby, or even what a predator looks like.


> Every other creature has plateaued in intelligence

This is just not how evolution works. If you're curious about phenomena such as you're describing, I encourage you to dig into evolutionary biology.

It's more correct to say "we don't know of any creatures living or extinct that have a similar development of linguistic / symbol manipulation as modern humans".

This acknowledges that we don't know the intelligence of every species to have existed on earth. The earth is billions of years old. Quite literally, in that time span, species could have developed advanced civilizations, gone extinct, and left no evidence of themselves that we're currently capable of perceiving as such.

It also acknowledges that, unlikely as it seems, we don't know of any living species with similar intelligence to us. Since that quickly descends into sci-fi territory, I'll leave it there.

Finally, it acknowledges that at any point in the future, other creatures could evolve intelligence similar to us.

It strikes me that my whole disagreement with you hinges on the word "plateaued". In common parlance this means "has developed to a point of permanent stagnation". In evolution nothing is Permanently stagnant, thus my previous comment.

> "Humans are the only species where learning these gave an advantage"

This is another point that a deeper dive into evolutionary studies will illuminate.

Humans are the only extant species that we know of where selection pressure met a series of mutations for increased intelligence, and the cost/benefit equation worked out for the species and environment.


> Why are humans the only species to have reached such an incredible level of intelligence relative to all the other species on the planet?

Maybe they went extinct shortly after attaining similar technological levels as us, and we humans just presumed them (eg dinosaurs) to be stupid reptiles?

In fact, given that homosapiens and our close relatives roamed the earth for hundreds of thousands of years, it’s quite a spooky question to ask why only in the past couple thousand years we’ve been able to build a technologically advanced civilization.


Octopodes have short lifespans and aren't very social, otherwise I'm convinced they would have established civilization by now.


Probably 0-based? I'd assume they'd come to the same reasoning as we did, that indexing into an offset starting at zero works better for addressing memory.


We use 0-based, but we also invented the concept of 0. It's true that multiple civilizations independently invented it, but perhaps an alien civilization never invented it and got along fine without it.


Would they even have linear memory? That's a pretty big assumption. They might also have a completely different paradigm that we would never ever think of.


This would depend massively on the equally long evolutionary, cultural, and political history that they would have… just like that context is necessary to understand our own decisions. Folks assume “aliens” would be some kind of clean room, purely logical beings, but no such thing is possible and the assumption falls apart immediately.


If we discovered them, I don't know. But if they discovered us then they probably use 0-based arrays.


Right. If they used 1-based, they'd have accidentally ended up on Mars.


More likely 0-based, given our prior (n=1) that programming languages typically use 0-based indexing.


That depends. Median usage of languages over the population of devs shows that they prefer 0-based indexing. Median usage of languages across the entire population shows that they prefer 1-based indexing (Excel has more users than all other languages combined, and is a 1-based programming language).

As far as human programming languages as a population, they may tend toward 0-based because they are written by devs, who prefer 0-based. But if we're talking about aliens, are we sure that alien devs would be the ones writing programming languages? What if all aliens could code? What if their languages are expressive enough that any alien could make their own language to suit their personal preference? It's really hard to say anything here without specifying anything about the hypothetical alien species and their culture.


We’d surely discover petabytes of discussions on this and other topics mostly irrelevant to a real alien life.


With no prior knowledge of these aliens, neither is more likely. Why is 0 more special than 1 in your head?


To be fair, Zero is a thing called the Additive Identity, meaning if we keep Adding it to things, we just get the same thing back. So that's special.

One is the Multiplicative Identity, if we keep Multiplying things by One we get the same thing back, however that does feel less relevant to how an array works.


We've seen a lot of "drift" of different languages towards zero-based indexing, probably because it lets the compiler generate slightly simpler code.

For the same reason, we've seen architectures "drift" towards little-endian, even though it's undeniably more inconvenient to read hex dumps from little endian machines.


> probably because it lets the compiler generate slightly simpler code.

A compiler can generate the same code either way by shifting by 1 at compile time. You might have a point if you were to say "it lets the compiler be a little simpler" but in the grand scheme of things, it doesn't really simplify the compiler all that much.


> A compiler can generate the same code either way by shifting by 1 at compile time.

Shifting what by 1? Array addresses aren't always known at compile time. Neither are array indexes. I don't see how this could be done at compile time, in general, without a magically powerful compiler.

I think you may be imagining that in some specific scenarios, the code generation will be the same. I'm speaking about general cases--in general, in order to generate code for array access starting at 1, you have to do a little extra computation--either to subtract an offset from the pointer to the array or to subtract an offset from the index.


> subtract an offset from the pointer to the array

Yeah, you shift the array, not the index. But the instruction for accessing memory on many architectures assumes a shift, so whether it's 1 or 0, it's still just one instruction. Here's a community of programming language developers talking about exactly this question: https://www.reddit.com/r/ProgrammingLanguages/comments/x95ni...

The consensus there is: it's really not something to worry about, as it all comes out in the wash. Performance is certainly not an argument for choosing one versus the other, as there are other dimensions to the choice, like target user familiarity/comfort.


It sounds like you're definitely misunderstanding what I'm saying.

Linked thread is x86, an architecture with unusually rich addressing modes. People make compilers for all sorts of languages and architectures. The fact that in SOME cases, it's simpler not to have an offset, means that there is some pressure to use 0 as the base for your array. If you pick x86 or amd64 as your benchmark, then you're going to get a very narrow picture.

Just to pick an example, consider Arm. I'm looking at an older version of the architecture right now, but you cannot use both a register offset and an immediate offset at the same time. So if your arrays start at index 1, then you must either adjust the index or adjust the array.

Not all language designers follow the zero-overhead principle like C++ does, but you can see how this would cause some language designers to decide that 0 is a more natural array index because it results in the lowest overhead in the most scenarios (across different architectures, not just x86).

Same thing explains why little endian architectures won over big endian. Big endian is definitely more convenient to debug, it's more convenient to look at hex dumps. However, with little endian, the correspondence between memory and registers is slightly simpler. The difference is not so extreme that big endian architectures do not exist, but it is large enough that we've "drifted" into a world where big endian architectures are all but gone, and you mostly find them these days in network appliances.

And to be clear, I'm not saying that the advantage of 0-based indexes is some massive advantage that makes it a clear winner. I'm just saying that there's slight pressure to use 0-based indexes.


> If you pick x86 or amd64 as your benchmark, then you're going to get a very narrow picture.

99% of the language designers I know target x86 first and foremost. But in the given link only one person mentioned x86, while others were speaking generally and voiced an opinion that it really doesn’t matter.

I take your point that other designers might have other pressures, but the choice of 1 vs 0 in the common case (not a narrow slice as you seem to suggest) comes down to other factors. The pressure to conform to developer expectations for 0-based indexing is much greater than anything else; as demonstrated in this HN thread, some people won’t even consider using a language if it has 1-based indexing. Other communities face persistent confusion with 0-based indexing. That provides far more pressure than the ARM instruction set for language devs that I know. Maybe the ones you know feel differently.

Your point is taken on big-endian vs little-endian, but the same has not happened for indexing. Languages haven't settled on either; instead they have bifurcated between dev-targeted and end-user-targeted languages, with the former being 0-based and the latter 1-based.


And as someone who wants "programming" to NOT be the domain of "only programmers" I welcome this difficulty if it makes it easier for "regular people."

Let's humanize programming and not keep it in a tower.


This article mentions Janet as a possible "Lua replacement," which is unrealistic. It's hard to overcome Lua's inertia and familiarity.

But! If you're in the market for an embeddable-or-not scripting language, and you are turned off by Lua's various idiosyncrasies, I would encourage you to give Janet a closer look. It quickly became my favorite scripting language after I discovered it about a year ago.

It's great for embedding in a larger app -- it has a simple C API, and it's distributed as a single .h/.c file pair. It has a small standard library (the core composite types are immutable and mutable maps and vectors, rather than linked lists). Its first-class support for PEGs makes it a great choice for text-manipulation scripts. It can also compile statically-linked binaries (linking in the interpreter/runtime/gc) so it's easy to distribute Janet programs, unlike most interpreted languages. Binaries that only use the stdlib weigh about 1mb, if memory serves.

The language is sane: zero-indexed arrays, simple module/import system, variables are block-scoped by default (the language even distinguishes between bindings and reassignable variables). And it has non-hygienic "classic" lisp macros, so you can write helpers to do anything you can think of -- great for making a little embedded DSL.

No LuaJIT, though, and no metatables. And an extremely sparse package ecosystem. But it's pretty easy to interop with C/C++ libraries, so your "ecosystem" is actually broader than it looks (if you're willing to generate some bindings).


If it's a C program, I like to embed Chicken Scheme (alternatively, one could embed Gambit, or Guile). They are all pretty easy to embed with simple FFI and C APIs.


My first introduction to Lua was with an online game called Tibia when I was a teenager. One of my childhood friends was really good at scripting in Lua, but was otherwise quite computer apathetic. He didn't really care for them, it was a tool to achieve what he wanted in the game. He went on to become a tradie.

I sometimes reflect on that. I think it illustrates that there could be value in exposing DSLs for lots of aspects of everyday life, and that we could potentially take coding off its pedestal so that it doesn't have a stigma of being very hard.

Very hard problems are hard to solve in code, and we build complex pieces of software which are difficult, but the core building blocks of code are really quite easy, and you can use those to achieve all kinds of things. Sometimes a power user could utilize a DSL with ease and enable use cases that would otherwise be impossible if they only had a UI.


One of the best ways to get people coding is to not tell them they're coding. Look at all the people who excel at Excel. I've seen some amazingly clever things done in Excel by people who claim to not be 'computer people'.


As a fan of Lua, one complaint not mentioned in the article but that I think is relevant if you're starting to learn the language is that minor releases often contain breaking changes. This is fine - the maintainers can number things however they want I guess - but can be a bit annoying if you're expecting to read the latest 5.4 reference manual and then write an embedded script for something using a different 5.x runtime.

This is mostly relevant because when people talk about Lua's speed, they're usually talking about LuaJIT [1] which (iirc) was based on 5.1 and backported many 5.2 features.

So, if you're someone who's decided to use Lua for a specific use case, you may want to see if your target specifies a specific Lua version (ex. Roblox, LÖVE, PICO-8, OpenResty are mentioned in this thread and all use either LuaJIT or a subset of 5.1), or if you expect to need the performance of LuaJIT in the future.

Edit: formatting

[1]: https://luajit.org/luajit.html


Lua versions are thoroughly explained in their documentation, which includes a page on versioning:

https://www.lua.org/versions.html

The very first section in that page is explaining how the versioning works:

  The releases of Lua are numbered x.y.z, where x.y is the version and z is the release.
  
  Different releases of the same version correspond to bug fixes. Different releases of the same version have the same reference manual, the same virtual machine, and are binary compatible (ABI compatible).
  
  Different versions are really different. The API is likely to be a little different (but with compatibility switches), and there is no ABI compatibility: applications that embed Lua and C libraries for Lua must be recompiled. The virtual machine is also very likely to be different in a new version: Lua programs that have been precompiled for one version will not load in a different version.

LuaJIT is fast, but that doesn't mean that PUC Lua is slow. A lot of people say that Lua is fast when comparing it with other languages which are perceived as slow, such as Python or Ruby; they might be talking about PUC Lua. Others say Lua is fast while comparing it to JS, C, and other fast languages; there is a great chance they're talking about LuaJIT. Anyway, I just wanted to say that PUC Lua, while slower than LuaJIT and slower than many other languages, is still speedy enough for most use cases it was designed for.


I love Lua, it's one of my favorite languages to write code in. It's small and simple, but you can do so much with the few things it offers. Really well designed imho.

I also like MoonScript[1] a lot, a language that compiles to Lua. I use it even more than Lua because I prefer its brevity and it solves one design problem that I (and many others) see in Lua: in Lua variables are global by default, but local by default in MoonScript.

[1] https://moonscript.org


MoonScript seems to have been without active development since 2017; the last release, 0.5.0, was in 2016 and it never reached 1.0, which is very unfortunate.


Yes, this is unfortunately true.

There's a spiritual, mostly compatible, successor that is under active development: https://yuescript.org/


wow, not aware of this, looks very interesting!


Surprised nobody has mentioned LuaJIT. The engineering is very impressive; Mike Pall may know more about writing just-in-time compilers than anyone else.

It seemed to have more prominence many moons ago when it supported the latest Lua language definitions, and it is still actively maintained, but the author I suppose decided that Lua would routinely do things that made it a moving target, and version 5.1 would suffice.


Mike Pall complained that the introduction of the _ENV scope layer would make LuaJIT slower and therefore less interesting, so he let it diverge. But also, since LuaJIT is heavier and less portable, it kind of makes sense that it should have a more stable syntax, since that tilts the use cases towards more standalone projects.


Does it? I remember JIT issues imposed by App Store restrictions, but otherwise it’s just a small .so/.dll/.dylib of a few hundred kilobytes, iirc. It can’t run on 8-16 bit CPUs, but realistically would you run vanilla Lua there? ESP8266/NodeMCU comes to mind; not sure if one could target LuaJIT without the JIT compiled into it.


Lua 5.2 + luajit with the 5.2 compat flag seems fine to me, don't feel like I'm missing anything post-divergence


As someone who is yet to have the chance to use Lua but heard about it a lot, I'm very happy to read this article. It's so useful to hear how a programming language should be used. Understanding the philosophy behind the language and what are the use cases for it are super helpful when you want to select the right language for the job.

Unfortunately it is so common today to skip these important details when describing a language. Even the official lua.org website doesn't include the "toolkit" description, and puts the focus on the language's traits, which are only half helpful.

Thank you so much for the article!


Thanks to Lua, there have been many new plugins written for Neovim, which has a ton of momentum behind it as a modern fork of the venerable Vim editor.

Neovim comes with Lua 5.1 and LuaJIT, so these plugins are fast. And while Lua isn’t without its issues (no language is perfect after all), the majority of Neovim’s users have taken to Lua instead of VimL, which is essentially a DSL for Vim.

Neovim runs virtually all of the old configurations and plugins, and since version 0.5, Neovim can be configured almost entirely using Lua.

What’s also interesting is Neovim attracting lots of new users and creating lots of new Lua programmers. Many of the core contributors aren’t your typical grey-beard types, which is refreshing to say the least. They’re not hung-up on 0-based indexing and all of the other crap people often bring up regarding Lua.

Some of them are on YouTube and Twitch live streaming sessions of them configuring and creating plugins for Neovim, all in Lua. This alone is introducing tens of thousands of viewers to Lua programming.

It’s been quite something to watch a new generation bringing new life to the Vim/Neovim ecosystem and the role Lua is playing.


The first item can be at any index - it just has to be the first one.

Some like 0. Some prefer 1. I think my suggestion of 7, more easily reached by the right index finger from the home row, has merit.


"Should array indices start at 0 or 1? My compromise of 0.5 was rejected without, I thought, proper consideration." - Stan Kelly-Bootle


That sounds familiar, now you mention it. Fascinating. I wonder what percentage of my thoughts are actually original.


I tend to think that, as far as "this is a useful way to use them" goes, Pascal did it right.

    Type
      LatLonTemp = array[-180..180] of array[-90..90] of Real;

Arrays may start and end at any value of an ordinal type.

There are other problems with arrays in Pascal (hinted at by the type definition), but the fact that the first element of the array doesn't have to be 0 or 1 or even a positive number can make some code easier to work with.


Can I index by 100s so there is plenty of room between them to add more indexes later?


In Ada yes. It will work in Lua too but iterators aren't guaranteed to go in order if you do.
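
For instance (a quick sketch): the keys are stored just fine, but pairs gives no order guarantee and ipairs won't see them at all, since there is no element at index 1.

  local t = {}
  t[100], t[200], t[300] = "a", "b", "c"
  for k, v in pairs(t) do print(k, v) end   -- visits all three entries, in no guaranteed order
  for i, v in ipairs(t) do print(i, v) end  -- prints nothing: there is no t[1]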


Ada indexes have to be contiguous ranges. So you'd need a type that was only every 100th number in order to use them as indexes, which I don't think can be made (I've never seen it, and looking it up I'm not seeing any way to make it). But you can select any arbitrary range as your array index, very handy as it neatly resolves the 0-/1-based issue: Choose the natural range for your situation.


I really like Lua, but it is not so practical in my daily work. I have two strange loves from an earlier life: Lua and Prolog (actually Mercury), but I'm not sure how to make a living with either of them. I tried with Lua and made a game/app services company with it, and it went well; however, around that time (2011 or so), companies here started to worry about ‘the stack’ and did not want weird things they had never heard of (which is very understandable). So it worked for smaller clients who wanted solutions, but not for bigger ones who wanted solutions that were maintainable by anyone.


> Complaints about 1-based indexing for arrays

...

> Using 1 as the initial position for an array makes sense when you’re talking like a human.

If the target audience for the scripts is non-programmers, then sure.

The place where I found Lua's 1-based indexing unacceptable was when Lua was being introduced at a past job as a secondary language. The problem here is that many different programmers would be infrequently modifying the Lua code. In those cases, you need as few gotchas as possible to make sure people are successful and deviating from what everyone is used to in that environment for indexing does not lead to success.


> If the target audience for the scripts is non-programmers, then sure.

It kind of is. Lua was developed as a configuration language "on steroids" for Petrobras programs. The target users were non-programmers. Nowadays, Lua allows mildly tech-savvy persons to create or modify scripts. That includes high-school kids, who typically don't give a huck about Dijkstra's arguments [1] ;-)

The only way to avoid language "collisions" when you program in different languages in parallel is to choose languages vastly different in syntax. 1-based indexing is not "unacceptable"; it is a drop in the ocean when you use it together with C or C++. Typically you forget semicolons in C because Lua doesn't mandate them, or you forget that in Lua, zero is not "false" in an if, etc. Personally I write some Forth on the side too, and as both Forth and Lua have the "then" keyword but use it in different ways, I often misplace it.

Blaming C, Lua or Forth for any of these difficulties doesn't make sense. The only answer, really, is "deal with it, it won't be the last time unless you quit programming".

[1] https://www.cs.utexas.edu/users/EWD/transcriptions/EWD08xx/E...


I'm not blaming Lua. I'm saying that the context where it is used is important and can affect whether things are problems or not.

Semi-colons are a bad example, you are generally not going to have subtle bugs from those. Off-by-one bugs are already bad enough in programming, having people switch their base index between tasks will not help.


I agree that semantic mismatches are a worse problem than syntactic mismatches.

However in the case of Lua, because of dynamic typing and late binding, it would not be wise to not have unit tests and full test coverage. This reduces semantic mistakes (indexing, 0-is-not-false, etc.) mostly to annoyances like syntax errors. "mostly" because some "stateful" operations need additional test efforts, and finding out why a test fails is not always straightforward.


Sometimes I port some other language code to Lua and in such cases the 1-based indexing can be tricky. Though recently I discovered using a metatable to convert the index from 1-based to 0-based and vice versa can make the porting a bit smoother wrt this issue.
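
Roughly like this (a sketch of the idea rather than the exact metatable): a proxy table wraps a normal 1-based Lua table behind 0-based reads and writes.

  local function zero_based(t)
    return setmetatable({}, {
      __index    = function(_, i) return t[i + 1] end,
      __newindex = function(_, i, v) t[i + 1] = v end,
      __len      = function() return #t end,  -- __len is honored for tables only in Lua 5.2+
    })
  end

  local v = zero_based({ "a", "b", "c" })
  print(v[0], v[2])  --> a   c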


Indeed, since one of Lua's primary use cases is bindings to C/C++/etc., whenever you have C<=>Lua interop you have to adjust indices.


Discussed at the time:

Lua, a Misunderstood Language - https://news.ycombinator.com/item?id=25796852 - Jan 2021 (134 comments)


> Using 1 as the initial position for an array makes sense when you’re talking like a human.

I don't care. Every other popular language uses 0-based arrays. Using Lua is a massive pain because I have to change my mental model for a basic computer science structure compared to all other languages I use.


Massive pain is overstating it I think. Plus there's almost certainly a patch out there that changes it to zero based.


I don't know, I'm used to zero-based semantics, but I've used MATLAB and R a bit, and it's not that hard to switch gears. My biggest problem is usually figuring out which library to use in a given ecosystem, and how to use it.


Taking the opportunity of this thread to suggest a look at TekUI, a self contained Lua GUI toolkit. It's quite old but full featured and multiplatform. Years ago I played with it also on embedded boards (Allwinner A10 if memory serves) using both X and the framebuffer. Looks like it is unmaintained, which is a shame because it has great potential.

http://tekui.neoscientists.org/


Lua is one of the best looking programming languages from the aesthetic point of view. Two languages have achieved the perfect syntax, in my opinion: Lua and Standard ML (maybe also Inform 7, but this is a work of art, not a general-purpose programming language). All others have some compromises.


After using plain lua on Roblox for about 5 years I used the new Luau with type hints and found it made a huge difference. Similar to the js => typescript transition.

https://luau-lang.org/typecheck


Demo of how lua/luau profiling works in Roblox

https://www.youtube.com/watch?v=BbIPalpAfaI


>>> Lua should be considered a toolkit

Spot on. I don't see myself writing a complete app using Lua, but there are quite a few powerful examples of Lua as a DSL, running embedded in a larger app.

Openresty [1] (nginx + Lua + LuaJIT) is probably the most popular example. I have been using it for heavy network applications with great success.

Heavily opinionated, but I am yet to see a data structure more flexible and powerful than Lua tables. And I have seen a lot ... started programming with the Sinclair ZX81 ...

[1] https://openresty.org/en/


A list of things that I disliked when I used lua a long time back:

- No continue in loops. Had to move those code blocks to separate functions simply to put a "return" mimicking it.

- No ternary operator ? : or unary ++ --

- No way to count(table), which was a frequent requirement, as the basic data structure of lua is its table. For counting you need to write a function that iterates over the elements and counts them every time.

- Need to put "local" before every variable declared inside a function, otherwise by default it will turn into a global variable (see the sketch after this list).

- And the 1-based indexing. Had to put a lot of -1 in the code.
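
A minimal sketch of the "local" gotcha mentioned above (names made up for illustration):

  local function accumulate(values)
    total = 0                         -- oops: no `local`, so `total` becomes a global
    for _, v in ipairs(values) do total = total + v end
    return total
  end

  print(accumulate({1, 2, 3}))  --> 6
  print(total)                  --> 6 as well: the accumulator leaked into the global scope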


Can't speak to whether some of this is just things that've become more convenient since you used the language, but some of these aren't the case.

Ternary: you can use logical operators for it (`[condition] and [if-true] or [if false]`) in most cases.

Counting arrays: the # operator returns the size of an array. (But only a table used as an array. If it's being used as a hash then you're right and you're on your own for counting it.)
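
For example (a quick sketch of both points, including the one case where the and/or trick breaks down):

  -- "ternary" via and/or: fine as long as the middle value can never be false or nil
  local n = 5
  print((n % 2 == 0) and "even" or "odd")   --> odd
  print(true and false or "fallback")       --> fallback, not false (the gotcha)

  -- counting an array-like table
  local t = {10, 20, 30}
  print(#t)                                 --> 3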


> Counting arrays: the # operator returns the size of an array. (But only a table used as an array. If it's being used as a hash then you're right and you're on your own for counting it.)

Lua doesn't have arrays. The # operator will give you the length of a "sequence"


I was using the terminology of the parent comment. :D


I know, but the difference is important because a lot of the gotchas related to the # operator make a lot more sense and are well defined in the manual.

> A table with exactly one border is called a sequence. For instance, the table {10, 20, 30, 40, 50} is a sequence, as it has only one border (5). The table {10, 20, 30, nil, 50} has two borders (3 and 5), and therefore it is not a sequence. (The nil at index 4 is called a hole.) The table {nil, 20, 30, nil, nil, 60, nil} has three borders (0, 3, and 6) and three holes (at indices 1, 4, and 5), so it is not a sequence, too. The table {} is a sequence with border 0. Note that non-natural keys do not interfere with whether a table is a sequence.

> When t is a sequence, #t returns its only border, which corresponds to the intuitive notion of the length of the sequence. When t is not a sequence, #t can return any of its borders. (The exact one depends on details of the internal representation of the table, which in turn can depend on how the table was populated and the memory addresses of its non-numeric keys.)
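
A quick illustration (the last result is just one possible outcome, as the manual says):

  print(#{10, 20, 30, 40, 50})   --> 5: a sequence with a single border
  print(#{})                     --> 0
  print(#{10, 20, 30, nil, 50})  --> 3 or 5: both are borders of this non-sequence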


If all the users barring the evangelists misunderstand something, perhaps it's not the users?

My opinion is that Lua's OK, but it's certainly not my first choice when I have a choice.


You'd need to define "all the users".

I read this article as talking to people who're (professional?) programmers, who're evaluating Lua as a programming language that they might want to use and are comparing it to things like C / C++ / Python / etc.

But I would be willing to bet money that the vast majority of people who write code in Lua are scripters. Roblox users alone probably outnumber people who're doing general-programming in Lua. For them, things like 1-indexed arrays are a particularly good idea.


I use Lua as a user-facing extension language to write modules in my music synth app [1]. Though I'm used to zero-indexing, the one-indexing never bothered me much. None of my beta testers have complained about it. I think it's a good language for beginner programmers, which most of the users are.

[1] http://audulus.com (the Lua-based modules are in version 4, now in beta)


The Lua scripting interface for Ardour, the cross-platform GPL'ed DAW:

https://manual.ardour.org/lua-scripting/class_reference/

We bind the majority of the backend (written in C++) to Lua, and provide access to GUI actions also. It is magic, and wonderful, and we probably should have written more of the application itself using Lua.


I guess I don't see the misconception the title implies. Most people I know see Lua as a toolkit. A toolkit that is cavernous, and if you use it, say, as a way to manage your UI in a popular MMORPG, you could risk its misuse compared to something more domain-specific (as we've seen with WoW RCEs over the years).

To be clear, I'm not making a case against using Lua, just sharing that the developers I know see Lua as a toolkit as the author describes (outside of the title irking me a tad I found the details insightful and enjoyed hearing someone's perspective of using it for projects).


Lua is misunderstood because it looks superficially similar to a number of other languages but has a unique value proposition that is rarely well explained. The frustration that ensues from trying to use lua as though it is javascript or python or c is real. Because the value proposition isn't well explained, one has to use it and gain experience with it to appreciate it.

One of the key differentiators for lua is that it has multiple implementations that are both rock solid and will work on essentially any computer that you are likely to ever work with. Luajit is API-compatible with lua 5.1 (with a few 5.2 features backported). Lua 5.1 came out in 2006, which is to say that when writing lua 5.1 code, you are targeting a language that hasn't really changed in 15 years. This is incredibly liberating because you can trust that any code that you write in luajit or lua 5.1 is likely to work essentially forever (modulo security issues of course).

Lua is also one of the most simple languages out there. Pretty much everything that you need to know is in the reference documentation, which is a single file that you can easily download and store for offline use. It also hasn't changed in a decade and there is no reason to expect it will in the future.

Simplicity begets simplicity. As simple as lua is, you can make it even simpler by restricting yourself to a limited subset of the language. The more you restrict yourself, paradoxically the more liberated you are. You eventually realize that you can express any computational problem in lua. Lua provides an implicit vm that serves as an idealized model of a computer. It has numbers, strings, arrays and associative arrays (with a uniform syntax for both types of arrays). This is enough to do pretty much anything without too much pain. The lua runtime itself is fast enough to do most programming tasks faster than you can perceive. For the remaining tasks, you can just implement them in c and call them from lua. But you won't be writing normal c, you will be writing your own dialect of c that itself reflects lua. Many of the problems of c, particularly around memory management, are avoidable when the c code is explicitly designed to be embedded inside of a lua program.

Continuing on the theme of simplicity, lua is also one of the fastest scripting languages with respect to startup time:

  bash-5.1$ time lua -e 'print "Hello, world!"'
  Hello, world!
  real 0m0.004s  user 0m0.002s  sys 0m0.002s

  bash-5.1$ time python3 -c 'print("Hello, world!")'
  Hello, world!
  real 0m0.034s  user 0m0.024s  sys 0m0.007s

  bash-5.1$ time node -e 'console.log("Hello, world!")'
  Hello, world!
  real 0m0.062s  user 0m0.049s  sys 0m0.011s

  bash-5.1$ time ruby -e 'puts "Hello, world!"'
  Hello, world!
  real 0m0.086s  user 0m0.062s  sys 0m0.021s

The lua startup time is actually comparable to a standalone c executable (on macOS, some checks run the first time a new executable is called, so do one dry run before timing the result):

  bash-5.1$ time printf '#include <stdio.h>\nint main() { puts("Hello, world!"); return 0; }' | gcc -x c -o a.out - && ./a.out > /dev/null; time ./a.out
  real 0m0.074s  user 0m0.050s  sys 0m0.021s
  Hello, world!
  real 0m0.002s  user 0m0.001s  sys 0m0.001s

Comparing startup time may seem silly, but it raises the question how exactly you are benefiting from the excess complexity relative to lua that those other language runtimes have. You also can exploit the fast startup time to create a low latency development experience with minimal tooling. Configure your editor to save lua files with every keystroke. Then in a shell window, run something like:

  > ls foo.lua | entr -c time lua foo.lua

You might be amazed by how much you can do in a lua script before the latency becomes noticeable. By timing the results, you are also immediately aware of the impact on performance that any change to the script might have. This enables you to have an entirely different kind of development experience in which you are always fully aware of the state of your program because you are never waiting for compilation. The program either does what you expect it to, or it doesn't. No time is spent in the liminal state where one suspects the code may be correct (or incorrect), but you aren't really sure. You start realizing that if you organize your code in a certain way, you almost never have to wait more than say 100ms to get feedback.

On top of all that, if you don't like the syntax of lua, you can write a parser for another language in pure lua. Your new language, which will transpile to lua, will automatically inherit the good properties of the lua runtime.

Mastering lua will completely change the way that you think as a programmer and has the potential to radically improve your experience of programming. Of course, once you have mastered lua the next step is to go beyond lua...


I know this is silly, but my biggest complaint is lack of shorthand ++ and += operators.

I can get over 1-based, but I want my ++.


+= has a confusing interpretation when the lvalue is a metatable invocation or function result. For readability given the flexibility of Lua's operator overloading, it is always better to be explicit.

The ++ increment operator is even worse, because it leads to statements within statements. Even an experienced C programmer might have trouble understanding what happens when I write `y = x + x++;`. Lua's grammar achieves a simple parse with no statement delimiters because it enforces a distinction between expressions and statements.
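
To make the Lua side of that concrete, here is a sketch (a hypothetical desugaring; Lua has no `+=` today) of what `t.x += 1` would have to mean, and the metamethods it could trigger along the way:

  local t = setmetatable({}, {
    __index    = function(_, k) print("__index", k); return 41 end,
    __newindex = function(tbl, k, v) print("__newindex", k, v); rawset(tbl, k, v) end,
  })

  -- a hypothetical `t.x += 1` would have to expand to:
  t.x = t.x + 1   -- fires __index("x") for the read, then __newindex("x", 42) for the write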


All these problems could be solved by just having ++ and += not return a value, which gets rid of all the confusing cases but keeps the useful cases.


++ is useful to switch over the items of a string:

  switch (*x++) {          /* x is a char* cursor into the string */
      case '+':
          if (*x == '+') {
              op = OP_INC;
              x++;
          } else if (*x == '=') {
              op = OP_ADDEQ;
              x++;
          } else {
              op = OP_ADD;
          }
          break;
       …
  }


It may have a confusing behavior, not interpretation. += may mean “evaluate lvalue, add a value to it as usual and set it back”. If you end up getting confused by:

  t.x = t.x + v
Then it is a question to t’s metatable designer either way.


Isn't y = x + x++; undefined behavior in C, because of the lack of a sequence point? If that's true, then it makes sense that no C programmer knows what happens. I think to fix it you would have to write y = x; y = y + x++; which I think would work.


What's confusing about it?


FYI These operators are being removed from modern languages now, via style guides, linters, or the language itself (eg: Swift). Personally I actually like the increment and decrement operators, but I understand how they can be confusing (especially ++x vs x++).


This trope needs to die. If you cannot understand ++x vs x++ then you fundamentally do not understand the language - or programming in the more general sense.

Let's stop stupefying our languages for beginners. Beginners are only beginners for a short while, and increasing verbosity with zero gain is annoying at best for everyone else.


> If you cannot understand ++x vs x++ then you fundamentally do not understand the language

I hardly think this is the issue. The problem is this kind of thing becomes so easy to write but not consciously notice that it causes disproportionately costly mistakes, and I don't think the benefit is worth it.

Python and other languages don't have the increment operator (pre or postfix), but never has this been a problem for productivity in any of my teams. On the contrary, I've encountered several instances where a senior engineer with 20+ years in the industry can't for the life of them figure out why some particular value is off-by-one. It wastes hours of time and often takes a fresh set of eyes to figure it out.

I know a senior principal engineer who wasted an entire day of his time because a comma at the end of the line turned his Python scalar into a one-element tuple and was incredibly difficult to notice. "5," is semantically equivalent to "(5,)" but Python isn't strict enough in this instance. It should be strict, though, not for stupefying but for protecting against errors that we all make.


Perhaps the problem isn't really the increment operator, but rather folks writing too fancy of code while using it. One should probably not deeply nest things like increments... because they do get difficult to read. That's not an increment operator problem though, that's an engineer problem.

The example with Python doesn't apply here. We cannot just go around removing language features because sometimes people write bad code.

Further, if you read the code and walk through it in your head, finding a mistaken tuple someplace unexpected should have been easy to find... which means your senior principal engineer isn't very good at reading code, even if they are very good at writing code.

Reading code is its own skillset I've found... that and apparently the debug steppers for Python aren't very good either...


> Perhaps the problem isn't really the increment operator, but rather folks writing too fancy of code while using it

I find this is a common theme of many complaints of C++ and OOP.

For example, C++ lets you do some incredibly ugly things using operator overloading. This can be useful so you could, say, make a matrix class and allow matrix operations to just using + or * rather than .add and .multiply functions. The fact that it can be abused to do very non-sensical, unintuitive things should be a criticism of the programmer who does it, not the feature.


>... a comma at the end of the line turned his Python scalar into a unary tuple

I have been burned by this mistake a handful of times. Each instance managed to eat up a shocking amount of time. I definitely consider it a misfeature.


I'm trying to figure out how you ended up with a stray comma at the end of a line. A simple typo, bumping the key without noticing?


My guess would be a bad auto-complete suggestion that wasn't read carefully before accepting.

Loosely-typed languages often struggle with auto-complete and related tooling. I don't know enough about the current state of Python development to say with certainty, however.


Cannot offer any justification for the error. Just that I have done it more than once, and each time it has taken me too long to debug the underlying cause.


++x and x++ just seem so unnecessary. Two extra operators for a very specific case, in which they either save one character versus x += 1 or mix different mutations into the same line. Iterating a list by incrementing an index doesn't even come up that often if your language supports iterators well.

Edit: Just checked, the project I just had open has 7 instances of +=1 in 5k loc and none of them would have been improved by ++


Well, Lua also does not support += or any of its counterparts. You have to be extremely literal with x = x + 1, which is just a huge waste for very little gain.

I'm all for beginners learning easier constructs... teach them x = x + 1 if you want, just don't remove the shorter constructs for the rest of us.

Iterating a list/array is just one of many use cases for the increment/decrement operators anyway...


I like x++, but relying on the return value tends to be evil.


Didn't Adobe(?) post a full presentation about what they learned doing Lua for a full application (Lightroom maybe?)? Overall, it wasn't great.

Unfortunately, I can't seem to find it anymore. I wonder if the HN hivemind can cough it up.


Regarding tables--I just don't want to deal with tables any more. You get some extra flexibility, but the cost is just too damn steep--it's like going to a restaurant and getting escorted to the kitchen so you can design your own menu. I just want to eat, damn it.

> Many people don’t realise that Lua is extremely nimble. Lua can be built by compilers using the c89 standard (even though the default is c99 IIRC), using very little assumptions about the hardware it is running on.

There are other languages with similar size and complexity to Lua, which are also made in C.


Lua doesn’t really solve anything fundamental. It’s popular because it’s small and easy to embed. But embeddable interpreters, while fun pet projects, are redundant and trivial.

Jonathan Blow has a great explanation of this https://youtu.be/y2Wmz15aXk0

Write reusable libraries first, let people embed them into whatever scripting language/GUI app they want.

Scripting languages, Linux distros, it’s all Pokémon cards.


Important to call out the TypeScriptToLua project (https://typescripttolua.github.io/) which is incredibly robust. This is a best-of-both-worlds situation -- extremely robust typing, but able to drop down to raw Lua when needed (or integrate with a blended Lua codebase), and if you're running on LuaJIT, still performant as hell. (People tend to overlook how sophisticated LuaJIT is - in many benchmarks it keeps close pace with V8.)


I would love to use this in my C++ game engine. Can I somehow embed it in my C++ application without having to use Node?


You just compile TypeScript to Lua at build time. Then run the Lua in your Lua VM, which you've presumably embedded in your game.

If you wanna do this at runtime, well, I suspect a node-like environment would be required at runtime.


I learned Lua in the summer of ~2010 with the WoW addon book from the creator of Deadly Boss Mods https://www.amazon.com/Beginning-Lua-World-Warcraft-Add-ons/... It's a bit outdated now but WoW still supports Lua addons. And it was a useful experience, still today. Not just the language itself but how to build mods for games


Lua doesn't stand well entirely on its own. You could write command-line utilities with it, but if you're finally leaving the traditional masochism of using C and C++ for that, then you might as well go all the way to Python or Ruby.

I think Lua was always intended to be used for making other software programmable, at least that's where it seems it works best.

Something to do: count and compare the number of applications written in Lua to the number of applications using Lua.


> Lua doesn't stand well entirely on its own.

That is addressed in the article.


So Lua is the new Tcl. A lightweight scripting language designed to be embedded in other programs. Then I went and looked at http://tcl.tk and it appears the current maintainers of Tcl forgot why it exists as well. ;-)

Everyone hated Tcl back in the day for similar reasons. Too lightweight, no serious language features, etc...

Maybe we should just use Scheme.


Not to mention that Tcl also has an article about how it's been misunderstood, see https://news.ycombinator.com/item?id=4920831 and https://news.ycombinator.com/item?id=31129936 :-)


The saddest part of all this is the fact that "it is a lean platform that allows you to build what you want" (basically "no batteries included") is considered something to be praised, like "Look how small and simple its source code is"... Well, of course it will be small and simple, and there are few files that need to be compiled for the whole thing, but in the "remote case" that you want to build anything meaningful with it you will need to write the other 1500 files (most certainly with "not as clean code") that many other more mature languages already have.

Just don't be a hipster... Indeed, most languages created after C++ have added very little innovation; it's just "my own vanilla flavor" (yet it is still vanilla). Some, very few indeed, have added valuable syntactic sugar. Yet how I wish new languages stopped being created pretending to excel at a corner case that gives you "huge benefits" (which are all thrown to waste when you realize most of your application's cycles and latency actually come from places other than that corner case).

This is not an argument that new languages are bad, rather one of "don't reinvent the wheel until truly needed", and I think that will happen when we finally get to use qubits and new paradigms of coding/logic arise (basically an architecture change).


There are loads of libraries for lua. You can either get them from luarocks in modern fashion or as more ad hoc source code in the old school. They're either written in C, against a stable language API, or in lua. Or a mix, doesn't make much difference. Also you can splice them directly into lua itself, e.g. the interpreter I use has libuv and sqlite built in.

The argument against shipping everything with the language is that the standard library is where code goes to die. Once shipped, fear of breaking users tends to block all progress. See mature languages like python and c++.

And regarding lack of innovation, lua has coroutines in the base language. Real ones that don't need co_await written near them. It has a module system built on dictionaries. It ships the compiler, so writing source code that you eval on the fly works out of the box.
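
For instance, a tiny sketch of a generator written with nothing but the base library:

  local gen = coroutine.wrap(function()
    for i = 1, 3 do coroutine.yield(i) end
  end)

  print(gen())  --> 1
  print(gen())  --> 2
  print(gen())  --> 3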

Lua is spectacularly good. We are blessed to have it available.


Embedding Lua instead of inventing your own DSL is often the right choice. Two projects that come to mind in this respect are Neovim/Vim and AwesomeWM/i3.

Neovim and AwesomeWM use Lua for configuration and scripting instead of inventing their own language/format. Vim and i3 do have their own languages.

I really like having the same basic tools for scripting in multiple programs.


I'm using Lua for a game right now (sdwr.itch.io/underlod), and it's grown on me significantly. Using tables is a bit clunky, but the flexibility is insanely powerful.

The codebase I'm using has multiple inheritance, and it's implemented right out in the open, in 40 lines of code using metatables. Been dabbling with runtime function binding as well.
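
Not the 40-line multiple-inheritance version, but the core of the trick looks roughly like this (class names made up for illustration):

  local Entity = {}
  Entity.__index = Entity
  function Entity.new(x, y)
    return setmetatable({ x = x, y = y }, Entity)
  end
  function Entity:pos() return self.x, self.y end

  local Player = setmetatable({}, { __index = Entity })  -- failed lookups fall back to Entity
  Player.__index = Player
  function Player.new(x, y, name)
    local p = Entity.new(x, y)
    p.name = name
    return setmetatable(p, Player)
  end

  print(Player.new(3, 4, "hero"):pos())  --> 3   4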


I've only played with Lua a bit in Pico-8. At first I hated it, but I've grown a bit of a begrudging love for it.

The language is absurdly flexible, and that's a good and bad thing, but I'd say the language is, overall, more good than bad.


The world would be better if we had Lua in our browsers instead of Javascript.


They are both prototype-based languages influenced by Self.


I first came across Lua as the extension language for a measurement instrument from Keithley.

It was very fast to learn, pretty easy to do everything I needed (was not just a one-liner).

I can attest, at least for that use case, I found it wonderful.



Pretty sure everything about OOP in here also applies to JavaScript


My first real job was mostly working in Lua and, to me, the real warts are the weird edge-cases in how Lua behaves. It's been a while, but IIRC I remember getting caught by how Lua will not store nil among the array values of a table; it just skips that index, so you end up with gaps in your tables' values unexpectedly.

example:

   a = {1, 2, 3, nil, 4}
   for i in pairs(a) do 
      print(string.format("%i -> %i", i, a[i]))
   end
   [...]
   1 -> 1
   2 -> 2
   3 -> 3
   5 -> 4

There are just tons of little quirks like that in the language. They don't matter if you are doing light work, but they come to be really important for more developed things. I still love it, but compared to other dynamic languages you needed to be super careful about types and values and data consistency in a way that caused development to slow.
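
A small follow-up sketch to the example above: ipairs and the # operator trip over the same hole in their own ways.

  local a = {1, 2, 3, nil, 4}
  for i, v in ipairs(a) do print(i, v) end  -- stops at the hole: prints only indices 1..3
  print(#a)                                 -- may be 5 or 3; both are valid "borders" here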


Lua is simply too flexible for me. I don't like languages with that much "power". Paradoxically at the same time it feels verbose and unexpressive...


He completely missed the LuaJIT criticism of the new integer subtype (introduced in Lua 5.3). That's how Lua lost its most important implementation.


Off topic: how is the "Mentions" section on that page implemented?


It uses WebMentions. Some sites send it directly to my site; in other cases I use a SaaS such as brid.gy to implement it.


Lua is pretty terrible as an embedded language. I’ve worked on numerous applications in both gaming and outside that industry in which lua was heavily utilized to “provide flexibility.” I’ve never liked it, and I probably never will. It has too many “weird” behaviors, ranging from the not-so-bad 1-based indexing to the absolutely absurd “everything is a table, but even tables are mixed metaphors between arrays and hashtables.”



