
You’re assuming that people choose languages based on merit and not based on how much money someone will give them for using them.

You're assuming that something better on merit wouldn't make more money as a result, and I'm questioning the actual merits as well.

The silent assumption in both of your perspectives is that the current monetary system is an even playing field in this context (corporations and their programmers).

> What business concepts or thoughts can you express in Clojure that you can’t express in Python or Rust?

If you only think about programming languages as a way to make money, the analogy of being stuck in Flatland is perfect.


That's a bit of an ad feminam attack, isn't it? Just because I used the phrase "business concepts", somehow money is the only thing I care about when it comes to language choice? And yet, in my top-level post I said I went and learned Lisp and Clojure and read SICP, and I will add that I did both of those things for fun. So no, I don't only think of programming languages as a way to make money. Elegance and expressiveness are interesting for their own sake. I trained as a mathematician; of course I think that.

But TFA was riffing on Paul Graham's old essay Beating the Averages, which argued precisely that the expressiveness of Lisp gave his startup a business edge. That was the context of my comment. I'd add that most of what most of us do in our day jobs is to use programming languages to make money, and there's no shame in that at all. And if you want to talk about why certain languages get widespread adoption and others not, you have to talk about the corporate context: there is no way around it.

But I'll rephrase my question, just for you: "what abstract problems can you solve or thoughts can you express in Clojure that you can’t express in Python or Rust?"


I’m sympathetic to looking down on the obsession with money. But there’s something deep and important about the monetary element. Engineering is about solving real-world, practical problems. The cost is a real factor in whether a potential solution is a useful one.

I think the money question is a red herring here. I’d phrase it more like: what problem in a user’s problem space is expressible only like this? And if the only user is the programmer, that’s alright, but feels more aligned with pure academia. That’s important, too! But has a much smaller audience than engineering at large.


Some people only think about life as a way to make money. Unfortunately, coding was the best-in-slot career for too long, and these kinds of people hijacked the culture.

There are GC algorithms that don’t require stopping the world.

By your definition, is Nim a GC language?

So by that definition, yes: as of Nim 2.0, ORC is the default, and you need to opt out of it.
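
For concreteness, here is a sketch of what that opt-out looks like at the command line (app.nim is a placeholder file name; flags as of Nim 2.x):

  nim c app.nim            # Nim 2.0+ defaults to --mm:orc
  nim c --mm:arc app.nim   # reference counting without the cycle collector
  nim c --mm:none app.nim  # no automatic memory management at all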

I'm not sure this opt-in/opt-out "philosophical razor" is as sharp as one would like it to be. I think "optionality" alone oversimplifies, and a person trying to adopt that rule for taxonomy would just have a really hard time, which might be telling us something.

For example, in Nim, at the compiler CLI level there is opt-in/opt-out via the `--mm=whatever` flag, but at the syntax level Nim has both `ref T` and `ptr T` on equal syntactic footing. Then in the stdlib, `ref` types (really things derived from `seq[T]`) are used much more, since they are so convenient. Meanwhile, runtimes are often deployment properties: if every Linux distro had its libc link against -lgc for Boehm, people might say "C is a GC'd language on Linux". Minimal C runtimes vary across userspaces and OS kernel/userspace deployments. "What you can rely on/assume", which I suspect is the thrust behind "optionality", just varies with context.

Similar binding vagueness between properties (good, bad, ugly) of a language's '"main" compiler' and a 'language itself' and 'its "std"lib' and "common" runtimes/usage happens all the time (e.g. "object-oriented", often diluted by the vagueness of "oriented"). That doesn't even bring in "use by common dependencies", which is an almost independent axis/dimension and starts to relate to coordination problems of "what should even be in a 'std'-lib, or any lib, anyway?"

I suspect this rule is trying to make the adjective "GC'd" do more work in an absolute sense than it realistically can, given the diversity of programming languages (a diversity sometimes not so visible if one considers only workaday corporate languages). It's not always easy to define things!


> I think "optionality" alone oversimplifies and a person trying to adopt that rule for taxonomy would just have a really hard time and that might be telling us something.

I think optionality is what gives that definition weight.

Think of it this way: you come to a project like a game engine, find it's written in some language, and discover that for your usage you need no or minimal GC. How hard is it to minimize or remove the GC? Assume that changing build flags will also cause problems elsewhere due to the behavior change.

> Similar binding vagueness between properties (good, bad, ugly) of a language's '"main" compiler' and a 'language itself' and 'its "std"lib' and "common" runtimes/usage happens all the time (e.g. "object-oriented", often diluted by the vagueness of "oriented")

Vagueness is an intrinsic quality of human language. You can't escape it.

The logic is fuzzy, but going around saying stuff like "Rust is a GC language" because it has the optional, rarely used Arc/Rc is just an off-the-charts level of wrong.


Do you consider ORC a tracing GC, even though it only uses tracing for cyclic types?
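
For context, a minimal sketch (type names are hypothetical) of the distinction being drawn, since ORC only traces types that can actually form cycles:

  type
    Node = ref object
      next: Node        # can form a reference cycle, so ORC's tracing applies

    Leaf {.acyclic.} = ref object
      value: int        # promised cycle-free, so plain reference counting suffices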

By modularizing the compiler, it will presumably be easier to create syntactic skins for Nim. I’m planning to make an S-expression syntax if nobody else does.

On the other hand, it’s more ergonomic and readable because you don’t need to declare a new name.

  if name != nil:
    echo name
versus

  case name
  of Some(unwrappedName):
    echo unwrappedName
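
For reference, a minimal sketch of the same check using Nim's actual std/options (no pattern matching, though you still go through the option rather than a fresh name):

  import std/options

  let name = some("Ada")
  if name.isSome:
    echo name.get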

Compared to HTML, Markdown is very bad at being machine-readable.

If you need wraparound, you should not use signed integers anyway, as that leads to undefined behavior.

Presumably, since this language isn't C, they can define it however they want to; for instance, in Rust, std::i32::MIN.wrapping_sub(1) is perfectly well-defined (it wraps around to i32::MAX).

Nim (the original one, not Nimony) compiles to C, so making basic types work differently from C would involve major performance costs.

And yet, Nim does overflow checking by default.
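
A minimal sketch of what that checking looks like in practice (assuming default build flags, under which defects are catchable):

  proc addOne(x: int32): int32 = x + 1

  try:
    echo addOne(high(int32))  # the overflow check fires here
  except OverflowDefect:
    echo "overflow caught"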

Signed overflow being UB (while unsigned is defined to wrap) is a quirk of C and C++ specifically, not some fundamental property of computing.

Specifically, C comes from a world where allowing for machines that didn't use 2's complement (or 8-bit bytes) was an active concern.

Interestingly, C23 and C++20 standardized 2's complement representation for signed integers but kept UB on signed overflow.

Back when those machines existed, UB meant "the precise behaviour is not specified by the standard, the specific compiler for the specific machine chooses what happens" rather than the modern "a well-formed program does not invoke UB". For what it is worth, I compile all my code with -fwrapv et al.

> UB meant "the precise behaviour is not specified by the standard, the specific compiler for the specific machine chooses what happens"

Isn't that implementation-defined behavior?


> making basic types work differently from C would involve major performance costs.

Not if you compile with optimizations on. This C code:

  int wrapping_add_ints(int x, int y) {
      return (int)((unsigned)x + (unsigned)y);
  }
Compiles to this x86-64 assembly (with clang -O2):

  wrapping_add_ints:
          lea     eax, [rdi + rsi]
          ret
Which, for those who aren't familiar with x86 assembly, is just the normal instruction for adding two numbers with wrapping semantics.
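
If one wanted the same wrapping semantics in Nim without disabling checks globally, a sketch of the equivalent cast-through-unsigned trick (unsigned arithmetic in Nim is defined to wrap):

  proc wrappingAddInts(x, y: int32): int32 =
    cast[int32](cast[uint32](x) + cast[uint32](y))
Nim's system module also provides the `+%` family of operators, which perform wrapping arithmetic by treating their operands as unsigned.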

Presumably unsigned integers want to return errors too?

Edit: I guess they could get rid of a few numbers... Anyhow, it isn't a philosophy that is going to get me to consider Nimony for anything.


Please no. Design by committee would lead to another C++.

Languages designed by committee are plentiful, including all the mainstream ones; not a single mainstream language is still being developed by a single person.

The second or third most popular language of all time? God forbid lol

Popular does not mean good. Tobacco smoking is also popular.

Do you think this is clever? For a metaphor to be relevant to a discussion it has to be fitting, not just a dunk.

It’s not a metaphor. I was giving a counterexample to your implied claim that popularity is an indicator of quality.

That wasn't an implied claim because we're not discussing metrics for judging quality.

You're right. It's everyone's least-favorite gotcha. Reminds me of this:

Waiter: "How is everything?"

Customer: "Great!"

Waiter, disgusted: "Even war?"


It needs to be this way so that UFCS works properly. Imagine if instead of "a,b".split(','), you had to write "a,b".(strutils.split)(',').
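
For anyone unfamiliar with UFCS, a minimal sketch of the equivalence in question:

  import std/strutils

  echo split("a,b", ',')   # plain procedure call
  echo "a,b".split(',')    # the same call via method call syntax (UFCS)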

OK, I do not understand. What is preventing this:

  import std/errorcodes

from allowing me to use raise errorcodes.RangeError instead of what Nim has?

Or even: why not "import std/ErrorCodes", keeping the plural, as in ErrorCodes.RangeError? I wouldn't mind that.


Nothing, and in fact this works. To move to an example which actually compiles:

    import math
    
    echo fcNormal
    echo FloatClass.fcNormal
    echo math.fcNormal
    echo math.FloatClass.fcNormal
All of these ways of identifying the `fcNormal` enum value work, with varying levels of specificity.

If instead you do `from math import nil` only the latter two work.
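
That is, a minimal sketch of the qualified-only variant:

  from math import nil

  echo math.fcNormal             # still works
  echo math.FloatClass.fcNormal  # still works
  # echo fcNormal                # error: undeclared identifier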

