Hacker News | germandiago's comments

I think you cannot get an idea of how many ways 2. can break...

That is not even half realistic. Are you going to port all that code out there (autotools, cmake, scons, meson, bazel, waf...) to a "true" build system?

The idea alone is crazy. What Conan does is much more sensible: provide a layer independent of the build system (plus a way to consume packages and, if you want, some predefined "profiles" such as debug), leave it half-open for extensions, and let existing tools talk through that communication protocol.

That is much more realistic, and you have a way better chance of growing a full ecosystem to consume.

Also, no one needs to port a full build system or move away from perfectly working build systems.
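As a sketch of that layered approach (package names and versions are illustrative, not a recommendation), a Conan conanfile.txt only declares what to consume and which generators bridge to the native build system:

```ini
# Illustrative conanfile.txt: the build system itself is untouched.
[requires]
zlib/1.3.1
fmt/10.2.1

[generators]
CMakeDeps
CMakeToolchain
```

Then something like `conan install . --build=missing -s build_type=Debug` resolves binaries for a debug profile, while CMake (or any other supported build system) keeps working as before.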


Reflection was a desperate need: a useful and difficult-to-design feature.

There are also things like `template for` or `inplace_vector`. I think the standard has useful things; just not all of them are useful to everyone.


A bureaucratic call from the top is not the way to do it.

Just beat it. Ah, not so easy huh? Libraries, ecosystem, real use, continuous improvements.

Even if it does not look so "clean".

Just beat it, I will move to the next language. I am still waiting.


But compile-time processing is certainly useful in a performance-oriented language.

And not only for performance but also for thread safety (it eliminates initialization races for non-trivial objects, for example).

Rust is just less powerful. For example, you cannot design anything that comes even close to expression template libraries.


> And not only for performance but also for thread safety

This is already built into the language as a facet of the affine type system. I'm curious how familiar you actually are with Rust?

> Rust is just less powerful.

On the contrary. Zig and C++ have nothing even remotely close to proc macros. And both languages have to defer things like thread safety into haphazard metaprogramming instead of baking them into the language as a basic semantic guarantee. That's not a good thing.


Writing general generic code without repetition is one thing where Rust fails without specialization. It does not have variadics or such powerful compile-time metaprogramming. It does not come even remotely close.

Proc macros are basically plugins. I do not think this is even part of the "language" as such. It is just plugging new stuff into the compiler.


> For example, you cannot design anything that comes even close to expression template libraries.

You keep saying this and it's still wrong. Rust is quite capable of expression templates, as its iterator adapters prove. What it isn't capable of (yet) is specialization, which is an orthogonal feature.


Rust cannot take a const function and evaluate that into the argument of a const generic or a proc macro. As far as I can tell, the reasons are deeply fundamental to the architecture of rustc. It's difficult to express HOW FUNDAMENTAL this is to strongly typed zero overhead abstractions, and we see where Rust is lacking here in cases like `Option` and bitset implementations.

> Rust cannot take a const function and evaluate that into the argument of a const generic

Assuming I'm interpreting what you're saying here correctly, this seems wrong? For example, this compiles [0]:

    const fn foo(n: usize) -> usize {
        n + 1
    }

    fn bar<const N: usize>() -> usize {
        N + 1
    }

    pub fn baz() -> usize {
        bar::<{foo(0)}>()
    }
In any case, I'm a little confused how this is relevant to what I said?

[0]: https://rust.godbolt.org/z/rrE1Wrx36


> Rust is quite capable of expression templates, as its iterator adapters prove.

AFAIU iterator adapters are not quite what expression templates are, because they rely on compiler optimizations rather than a built-in language feature that lets you do this without relying on the compiler pipeline.


I had always thought expression templates at the very least needed the optimizer to inline/flatten the tree of function calls that are built up. For instance, for something like x + y * z I'd expect an expression template type like sum<vector, product<vector, vector>> where sum would effectively have:

    vector l;
    product& r;
    auto operator[](size_t i) {
        return l[i] + r[i];
    }
And then product<vector, vector> would effectively have:

    vector l;
    vector r;
    auto operator[](size_t i) {
        return l[i] * r[i];
    }
That would require the optimizer to inline the latter into the former to end up with a single expression, though. Is there a different way to express this that doesn't rely on the optimizer for inlining?

Expression templates do not rely on the optimizer, since you are not dealing with the computations directly but rather with expressions (nodes) through which you defer the computation until the very last moment (when you have fully built an expression of expressions, basically almost an AST). This guarantees that you get zero cost when you really need it. What you are describing is something akin to copy elision and function folding through inlining, which is pretty much basic in any C++ compiler and happens automatically without special care.

> since you're not dealing with the computations directly but rather expressions (nodes) through which you are deferring the computation part until the very last moment (when you have a fully built an expression of expressions, basically almost an AST).

Right, I understand that. What is not exactly clear to me is how you get from the tree of deferred expressions to the "flat" optimized expression without involving the optimizer.

Take something like the above example for instance - w = x + y * z for vectors w/x/y/z. How do you get from that to effectively

    for (size_t i = 0; i < w.size(); ++i) {
        w[i] = x[i] + y[i] * z[i];
    }
without involving the optimizer at all?

I am not sure what the "truly useful features are" if you take into account that C++ goes from games to servers to embedded, audio, heterogeneous programming, some GUI frameworks, real-time systems (hard real-time) and some more.

I would say some of the features that are truly useful in some niches are less important in others, and vice versa.


Adding taxes to things does not help anyone and goes against free choice.

A better alternative would be to create the incentives so that companies like these can be born in Europe.


Companies like Microsoft should not be given "incentives to exist" anywhere (at least as they exist currently). They are corrosive to the public good.

Are you sure? Enumerate all the competition and associated technology this has enabled for decades: tooling, software, etc.

I am not sure how someone should be entitled to prevent others from enjoying the benefits of better technology.

If you do not like it, just skip it, as I do.


Are you seriously trying to pitch the flaming garbage heap that is Microsoft Windows as "better technology"? Microsoft is a predator, they offer licenses to schools at a knock-down rate in order to nurture a dependency on their product. The volume of cash that has been extracted from the general populace in this way is obscene. To top it off they have gone out of their way to sabotage free and open competitors, limiting the market to their shitty and overpriced offerings.

No, what I am saying is that if you go against companies that create wealth, not only the bad ones will disappear but also the good ones.

Because you will scare all of them.


This doesn't work when a market is run by oligopolies; you have to regulate to restore some sort of normalcy and competition.

Oligopolies are the result of overregulation.

Just fix the right things, not the wrong ones.


How is this particular oligopoly the result of overregulation?

Disagree. The only alternative is to let the people decide; I don't trust a dozen men who already hold deeply undemocratic beliefs to dictate the direction of tech for society.

You are against democracy, I am not. Democracy has led to some of the best advances of civilization, all oligarchies have done is introduce mass poverty, mass misery, and mass death.

At least with democracy we went to the moon for mankind, not shareholders.


No. I am not against democracy. If you do not like what someone does, you have rights to fight for.

Your definition of not letting people choose goes more against democracy than what I mean, IMHO.

Just do not let these people collude with power. Fewer politicians would mean fewer people doing business to influence others through politics.

That is what my experience tells me.


Exhausted.

In my experience, conversions are one of the things that maximum warning levels do excellent static analysis for nowadays. In the last 15 years I have hardly had a couple of problems (init vs. paren initialization). All narrowing etc. is caught out of the box with warnings.

The right strategy to use C++ efficiently is to set warnings to the maximum, treat them as errors, take the Core Guidelines or similar, and avoid past cruft.

More often than not (except if you inherit codebases, though clang-tidy has modernize checks) most of the cruft is either avoidable or caught by analyzers. Not all of it.

But overall, I feel that C++ is still one of the most competitive languages if you use it as I said and with a sane build system and package manager.

