> use of UTF-8 character for keywords [concerns me]
What do you mean? All the keywords in the standard language are ASCII. And even if you meant some other part of the standard language, they're (almost) all ASCII too.
(The most notable exceptions are the use of superscript 0 thru 9 for powers -- eg `2¹⁶-1 == 65535` -- and the `«` and `»` characters, which are part of the Latin-1 aka "8 bit ASCII" character set and are aliases for the (7 bit) ASCII `<<` and `>>` tokens. I understand these exceptions may concern you, but I can assure you they're not really a problem in practice.)
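To make that concrete, here's a tiny sketch (nothing beyond core Raku) showing a superscript power and the « » / << >> equivalence:

```raku
say 2¹⁶ - 1;               # 65535 -- superscript digits denote powers
say <a b c> «~» <1 2 3>;   # (a1 b2 c3) -- Unicode hyper string concat
say <a b c> <<~>> <1 2 3>; # (a1 b2 c3) -- identical, 7 bit ASCII alias
```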
> The problem with sigils is that they compose poorly when casting
Are you thinking Raku sigils are like sigils in other languages, eg Perl or Javascript or PHP?
From my perspective one of the several _strengths_ of Raku's sigils is that they combine succinct compile time type constraints and composition.
Just type `@` (reads as "at") to denote a compile time enforced type constraint for the `Positional` interface, an abstract/generic datatype for any iterable integer indexed collection.
So, if you have an array of points, Raku will happily let you store it in a variable named, say, `points-array`, but naming it `@points` means Raku will compile time enforce binding of that name to an integer indexed collection and visually reflect that that is so for any human glancing at the code.
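As a minimal sketch of that enforcement (the `@points` name is just illustrative):

```raku
my @points = (0, 0), (1, 2), (3, 4); # OK: an integer indexed collection
say @points[1];                      # (1 2) -- integer indexing guaranteed

# Whereas binding something that isn't Positional fails at the binding itself:
# my @oops := 42;   # dies with a type check failure: Int isn't Positional
```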
As for "casting", if you want to treat `@points` as a Single Item ("a single array of points") then just write `$@points` -- the `$` reads as Single Item -- `S` over `I`.
(Technically speaking that's not a time consuming cast, just the stripping off of an indirection, which is the optimal optimization for that scenario; but I'm guessing this is semantically the kind of thing you meant.)
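For example, here's a sketch of the difference `$@` makes to iteration (variable names are mine, just illustrative):

```raku
my @points = (0, 0), (1, 2);
for @points  { .say }   # two iterations: (0 0) then (1 2)
for $@points { .say }   # one iteration: the whole array as a Single Item
```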
> (refs, counts)
Again, are you thinking that Raku's sigils are like other languages'? (They're not.)
> and do not generalize to other types.
Again, are you thinking that Raku's sigils are like other languages' sigils? They are not.
Raku's `$` sigil is the generic Single Item interface for any datatype. (It can be parameterized with a data structure's typed structure.)
The `&` sigil is the generic Single Item interface for any function type. (It can be parameterized with a function's type signature.)
The `@` sigil covers any integer indexed collection. (It can be parameterized with the collection's fixed length for a fixed size array, or shape for a multidimensional structure. To parameterize a nested heterogeneous data structure's type signature, use `$` instead.)
The `%` sigil covers any name indexed collection. (It can be parameterized with the collection's key/value types. To parameterize a nested heterogeneous data structure's type signature, use `$` instead.)
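A sketch of some of those parameterizations (variable names are mine, not from the thread):

```raku
my Int @ints = 1, 2, 3;          # @ parameterized with an element type
my @grid[2;2];                   # @ parameterized with a fixed 2x2 shape
my Num %scores = alice => 9.5e0; # % parameterized with a value type
my Int %tally{Str};              # % parameterized with key and value types
# @ints.push: 'four';            # would die: expected Int but got Str
```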
> Plus, they seem to encourage the language designers to implement semantic that is "context aware" which would have been another billion dollars mistake if perl had become more popular.
Why are you mentioning Perl in a subthread about Raku? Are you aware the language was renamed precisely because so many people were completely misunderstanding the nature of Raku?
> In other words, that's unnecessary complexity bringing the attention to a poor type system. A bad idea that deserves to die, in my opinion.
If you're thinking of Perl's type system and applying what you know of that to Raku's, that's like thinking Python's type system is like Haskell's. They are very different.
I hear you were coming from the angle of being useful. In a sense that's what matters most, and I love that you have that spirit.
If Wikipedia has deadnamed Raku with grace then that might be a model to follow, but in general deadnaming is far from helpful unless it's done carefully. There's a reason why the community embarked on the somewhat painful multi decade process of renaming it. To try to clarify my understanding I'll summarize it here.
Because of the original name for Raku, people assumed for a long time (long after it became problematically misleading) that it shared semantics, or syntax, or compiler tech, or libraries, or something technical like that, with some other programming language(s).
This was partly because Raku did indeed take some inspiration philosophically and/or technically from some existing languages (traces of a half dozen or so are evident), and partly because Raku and its compiler tech features the ability to use Python code and compilers, and C code and compilers, and Perl code and compilers, and so on, as if they were native Raku code/compilers.
But Raku was never C, or Python, or Perl, and the old name is unfortunately a form of deadnaming -- which is to say, something that is seldom helpful, especially if some commentary like this comment is not included.
At least, that's how I experience it.
That said, regardless of my view, I love your impulse of being helpful, which is why I've written this -- and I hope it does help any readers.
Bad: A site being usable for a significant amount of time per day, but also unusable for a significant amount of time per day, and the ratio between usable and unusable time per day significantly deteriorating.
Worse: A site being usable for a significant amount of time per day, but also unusable for a significant amount of time per day, and the ratio between usable and unusable time per day significantly deteriorating _significantly faster_.
Clearly, Anubis is at best an interim measure. The interim period might not be significant.
But it might be. That is presumably the point of Anubis.
That said, the only time I've heard of Anubis being tried was when Perl's MetaCPAN became ever more unusable over the summer. [0]
Unfortunately Anubis and Fastly fought, and Fastly won. [1]
A major new version of Perl ships regularly. A few weeks ago the latest major new version shipped. From the 2025 changes document:
> Perl 5.42.0 represents approximately 13 months of development since Perl 5.40.0 and contains approximately 280,000 lines of changes across 1,600 files from 65 authors.
Skipping back 5 major new versions (to 2020):
> Perl 5.32.0 represents approximately 13 months of development since Perl 5.30.0 and contains approximately 220,000 lines of changes across 1,800 files from 89 authors.
2015:
> Perl 5.22.0 represents approximately 12 months of development since Perl 5.20.0 and contains approximately 590,000 lines of changes across 2,400 files from 94 authors.
2012:
> Perl 5.16.0 represents approximately 12 months of development since Perl 5.14.0 and contains approximately 590,000 lines of changes across 2,500 files from 139 authors.
There's been well over 10 million lines of code changed in just the core Perl codebase over the last 25 years reflecting huge changes in Perl.
----
> Perl 6 ... isn't backward compatible
Raku runs around 80% of CPAN (Perl modules), including ones that use XS (poking into the guts of the Perl 5 binary) without requiring any change whatsoever.
(The remaining 20% are so Perl 5 specific as to be meaningless in Raku. For example, source filters which convert Perl 5 code into different Perl 5 code.)
----
But you are right about one thing; no one you know cares about backwards compatibility, otherwise you'd know the difference between what you think you know, and what is actually true.
> But you are right about one thing; no one you know cares about backwards compatibility, otherwise you'd know the difference between what you think you know, and what is actually true.
What the hell is this? Even if nobody I know cares about backwards compatibility, how does this relate to whether my knowledge is true or not?
Apologies for trivializing perl5's progress in the past 25 years, but come on, chill out dude.
> dual licensing means that you don’t actually believe in software freedoms
If both licenses are accepted by essentially everyone as inherently fully free software licenses, and they don't contradict each other, then what's not to like?
Consider the appropriate example here, which isn't anything to do with AGPL (which many people do NOT accept as a free software license), but rather the AL2 / GPL pairing which was the evolution of the original pairing created by the inventor of dual licensing (Larry Wall).
What's your beef with Artistic License 2.0 / GPL dual licensing?
> I would [guess?] that [Raku(do)] is using hyper operators and internal concurrency to "cheat".
You missed a word. I've guessed it was the word "guess". :)
I haven't checked Rakudo's code but I'm pretty sure any performance optimizations of your code were not related to using hyperoperators or internal concurrency.
Here are two things I can think of that may be relevant:
* Rakudo tries to inline (calls to) small routines such as `* + *`. Wikipedia's page on inlining claims that, when a compiler for a low level language (like Rust) succeeds in inlining code written in that language it tends to speed the code up something like ten percent or a few tens of percent. In a high level language (like Raku) it can result in something like a doubling or, in extreme cases, a ten fold speed up. The difference is precisely because low level languages tend to compile to fast code anyway. So while this may explain why Raku(do) is faster than CPython, it can't explain your conclusion that Rust is half as fast as Raku. (I think you almost certainly made a mistake, but let's move on!)
* In your Raku code you've used `...`. That means all but the highest number on the command line are computed for free, because sequences in Raku default to lazy processing, and lazy processing defaults to use of caching of already generated sequence values. So a single run passed `10 20 30 40` on the command line would call the `* + *` lambda just 40 times instead of 100 (10+20+30+40) times. That's roughly a doubling of speed right there.
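That caching is easy to observe directly. A sketch (the `$calls` counter is mine, added for illustration):

```raku
my $calls = 0;
my @fib = 0, 1, { $calls++; $^a + $^b } ... Inf;  # lazy, cached sequence
say @fib[10];   # 55
say @fib[40];   # 102334155 -- only the new elements 11..40 get computed
say @fib[10];   # 55 -- a pure cache hit this time; nothing is recomputed
say $calls;     # one call per computed element (~39), nowhere near 10+40+10
```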
So if Rakudo is doing a really good job of codegen for the fibonacci code, and you removed the startup overhead from your Raku timings, then perhaps, maybe, Raku(do) really is "beating" Rust because of the `...` caching effect.
I still find that very hard to believe, but it would certainly be worth having someone reasonably expert at benchmarking try to repeat and confirm your (remarkable!) result.
> At this point in its evolution, there is still much work to be done on code optimisation.
Understatement!
It took over a decade for production JVMs and JS engines to stop being called slow, another decade to start being called fast (but not as fast as C), and another decade to be considered surprisingly fast.
Rakudo's first production release came less than a decade ago. So I think that, for now, a reasonable near term performance goal (I'd say "by the end of this decade") is to arrive at the point where people stop calling Raku slow (except in comparison to C).
> Let me try a translation:
Let me have a go too. :) But I'll rewrite the code:
    sub MAIN                    #= Print fibonacci values.
      (*@argv);                 #= eg 10 20 prints 55 6765

    print .[ @argv ]            # print fibonacci values.
      given
        0, 1, 1, 2, 3, 5,       # First fibonacci values.
        sub fib ( $m, $n )      # Fibonacci generator
          { $m + $n } ... Inf   # to compute the rest.
FWIW I find it really hard to believe your benchmarking was reliable / repeatable. Mistakes happen, and I think mistakes must have happened in this case.
I'll write more about this in another reply to another of your comments.
oh yeah - that was dumb - sorry I misread the examples in the parent
fwiw I am pointing out that
    say my $x=42;
is doing the `my` declaration in the middle of the `say` and the `$x=42`, so providing a declaration in the middle of a regular routine call such as `say $x` is kind of a ternary syntax in terms of how that stanza is parsed.
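A sketch of why that parses: `my $x = 42` is itself an expression, evaluating to the freshly declared (and assigned) variable, so it can sit inside a larger statement:

```raku
say my $x = 42;   # 42 -- the declaration is an expression; say receives $x
say $x + 1;       # 43 -- and $x remains declared in the enclosing scope
```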
> Sadly another language with concurrency support that fails to learn the lessons from occam.
Weren't the top two lessons learned from occam about indeterminacy?:
1. If you ignore indeterminacy, you haven't really tackled concurrency, so you become irrelevant. CSP 1 ignored it, so occam 1 and 2 were designed to ignore it. While they became irrelevant for several reasons, I've long thought it was the fatal mistake of ignoring indeterminacy that made the demise of the original occam series inevitable.
2. If you tackle indeterminacy, but not its deeper consequences, you remain irrelevant. occam-π, which added indeterminacy constructs as an afterthought, has the problem that it tackled indeterminacy from the outside in.
During the 2019 concurrency talk panel that brought together three concurrency giants -- the late Joe Armstrong (Erlang), Sir Tony Hoare (CSP / occam), and the late Carl Hewitt (Actor Model) -- Tony said:
> The test for capturing the essence of concurrency is that you can use the same language for design of hardware and software, because the interface between those will become fluid. We've got to have a hierarchical design philosophy in which you can program each individual ten nanoseconds at the same time as you program over a 10 year time. And sequentiality and concurrency enter in to both those scales. So bridging the scale of granularity and of time and space is what every application has to do. The language can help them do that and that's a real criterion to designing the language.
Both Joe and Carl instantly agreed about what Tony said but also instantly disagreed about the central role of indeterminacy, and one could be forgiven for thinking Tony still hasn't learned the lesson of the mistake he made with CSP 1 and occam 1.
Erlang's "let it crash" concept distilled Joe's fundamental understanding of the nature of the physical universe, and how to write correct code given the inescapable uncertainty principle aka inescapable indeterminacy.
The Actor Model is a simple, purely mathematical model of purely physical computation -- ironically the right theoretical grounding if you apply "Occam's Razor" to concurrent computation in our physical universe. It contrasts sharply with the more "airy fairy" process calculi, which abstract processes as if one could truly ignore that, for such calculi to be useful in reality, the processes they describe must occur in reality -- and then indeterminacy becomes the key characteristic to confront, not an afterthought.
At least, that's my understanding.
I recall loving occam when I first read about it in the mid 1980s, partly because I was writing BCPL for a living at the time and occam's syntax was based on BCPL's, but also because I was getting interested in concurrency, and fell in love with the Transputer, which was created by Inmos, a Bristol UK outfit -- and I lived in the UK, having spent a couple of years living a few miles from Bristol.
But one can't deny reality, and the laws of physics, and the growing complexity of software, and the consequences of those two fundamentals, so eventually I concluded the Actor Model was going to outlive CSP and occam. I still think that, but am ever open to being persuaded of the error of my ways of thinking...