Hacker News | zamalek's comments

You should be comparing to AMD, at that. There's hope for the upcoming Intel chips, but anything current isn't competitive. Furthermore, M* is good for a specific form factor, but don't for a second suggest that 4 P-cores will outdo the 16 hyperthreaded cores on a 9950X.

Two weeks with Rust and you're still fighting with the compiler. I think the LLM pulled a lot of weight selling the language; it can help smooth over the tricky bits.

What people get wrong is that you don't just trip balls and get cured. Re-integration therapy is vital for lasting effects. Grabbing some shrooms and digging in is recreation, which is perfectly fine, but don't fool yourself or anyone else by suggesting it's for treatment.

I was a depressed teenager a long time ago and I am almost certain mushrooms made things worse.

I didn't need mushrooms. I needed therapy, friends, a social life, a sex life, goals, something to look forward to in the real world.

All I found on mushrooms at the time were horrible existential loops that just made things more hopeless. I would read about people having these peak wonderful experiences or McKenna-style alien experiences and just get more depressed that even the mushrooms didn't help me.

It is almost blasphemous in this space to say it, but what actually ended up changing my life were SSRIs. A little Prozac fixed something that was just chemically wrong in my head.

What seems obvious is there is enormous variability in people's brain chemistry so the tool to fix the problem has to be quite specific for the individual.


Yes. For example, IV ketamine can not only yield immediate relief in a chemical sense; the treatment itself results in a fully aware, balls-tripping, metaphor- and symbolism-filled, time- and space-warping experience in an entirely fictional space. With thoughtful guidance prior to and after each experience, a series of them can, for example, repeat a message until you "get it," or each may deliver a component of a profoundly larger message when they are combined, weeks later. What you do with it all will determine what you get from it.

I think there's simply so much value in being able to see the same thing from so many different perspectives that you never considered possible in your life before.

This is particularly true of a deep psychedelic experience "inside" with IV Ketamine.

Your own internal processing will still determine how you perceive a perspective change, but specific to this idea in particular: within the experience, you may suddenly find it obvious to think of things as being made of something different than in outside-world reality (and this sort of "change of basis" may reveal some kind of truth not otherwise visible). You may see something as formed of language instead of molecules and atoms, or vice versa.


That's not exactly it. Light gets redshifted instead of slowing down, because light will be measured to be the same speed in all frames of reference. So even though we can't actually observe it yet, light traveling towards us still moves at c.

It's a different story entirely for matter. Causal and reachable are two different things.

Regardless, such extreme redshifting would make communication virtually impossible - but maybe the folks at Blargon 5 have that figured out.
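To make the redshift point concrete: the standard relativistic Doppler formula for a purely receding source is 1 + z = sqrt((1 + β)/(1 − β)), with β = v/c. A minimal sketch (textbook formula, not something from the thread):

```python
import math

def doppler_redshift(beta: float) -> float:
    """Relativistic Doppler redshift z for a source receding at v = beta * c.

    The light still arrives at c in every frame; only its wavelength is
    stretched, by a factor of (1 + z).
    """
    if not 0.0 <= beta < 1.0:
        raise ValueError("beta must be in [0, 1)")
    return math.sqrt((1.0 + beta) / (1.0 - beta)) - 1.0

print(doppler_redshift(0.0))  # 0.0: no recession, no shift
# As beta -> 1 the redshift diverges, stretching any signal's
# frequencies toward zero, hence "communication virtually impossible."
print(doppler_redshift(0.999))
```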


> Cross-vendor GPU support: A single codebase runs on AMD, NVIDIA, and CPU via KernelAbstractions.jl

This is why I wish Julia were the language for ML and sci comp in general, but Python is sucking all of the air out of the room.


i hope you realize this is purely because julia uses LLVM and LLVM has backends for those targets (notably absent are GPUs which do not have LLVM backends). any other language which uses LLVM could do the exact same thing (and would be hampered in the exact same way).

Probably true, but one unique thing about Julia is that it exposes almost all stages of compilation to the user. From typed IR to native code generation, you can customise the compilation in many ways. Together with the power of Lisp-style metaprogramming, that's a really fine basis for powerful and performant DSLs and code transformations.

All those GPU targets are powered by libraries that are not part of Julia itself (GPUCompiler.jl). The same goes for automatic differentiation. That's remarkable, in my opinion.

So you're right that many programming languages could do it, but it's no wonder that other languages are lacking in this regard compared to Julia.
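As a loose analogy to this write-once, run-on-any-backend kernel style (a from-scratch Python sketch of the general pattern; this is not how KernelAbstractions.jl or GPUCompiler.jl actually work, and all names are illustrative): the kernel body is written once, indexed by a work-item id, and interchangeable backends decide how to launch it.

```python
# A "kernel" written once against a work-item index i, plus two toy
# backends sharing the same launch interface.
def saxpy_kernel(i, a, x, y, out):
    out[i] = a * x[i] + y[i]

class CPUBackend:
    def launch(self, kernel, n, *args):
        for i in range(n):  # a serial loop stands in for CPU threads
            kernel(i, *args)

class FakeGPUBackend:
    def launch(self, kernel, n, *args):
        # A real GPU backend would map i onto device threads; here we
        # merely reorder the iterations to show the kernel body is
        # order-independent (data-parallel) and unchanged.
        for i in reversed(range(n)):
            kernel(i, *args)

def run(backend, a, x, y):
    out = [0.0] * len(x)
    backend.launch(saxpy_kernel, len(x), a, x, y, out)
    return out

print(run(CPUBackend(), 2.0, [0, 1, 2, 3], [1, 1, 1, 1]))
# [1.0, 3.0, 5.0, 7.0] on either backend
```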


Maybe because Python can reasonably be used to make actual applications instead of just notebooks or REPL sessions.

What's stopping Julia from being reasonably usable to make actual applications? It's been a while since I've touched it, but I ain't seeing a whole lot in the way of obstacles there, just less inertia.

Presumably inertia and ecosystem size (but that's a follow-on of inertia). When Julia came out, Python already had traction for ~most things.

Keep in mind that it went with 1-based indexes to make the switch easy for Matlab types. I'm not sure if that was a good or bad move for the long term. I'm sure it got some people to move who otherwise wouldn't have, but conversely there are also people like me who rejected it outright as a result (after suffering at the hands of 1-based indexing in Matlab, I will never touch those again if I have any say in the matter).

I've considered switching to it a few times since seeing that they added variable indexes but Python works well enough. Honestly if I were going to the trouble of switching I'd much rather use Common Lisp or R5RS. The nearest miss for me is probably Chicken, where you can seamlessly inline C code but (fatally) not C++ templates.

If I ever encounter "Chicken, except Rust" I will probably switch to that for most things.


That's part of the answer, but there's a bit more to it IMO.

The syntax is a bit weird; Python, Swift, Rust, and Zig feel more parsimonious.

I absolutely love multimethods, but I think the language would have been better served by non-symmetric multimethods (rather than the symmetric multimethods which are used). The reason is that symmetric multimethods require a PhD-level compiler implementation. That, in turn, means a developer can't easily picture what the compiler is doing in any given situation. By contrast, had the language designers used asymmetric multimethods (where argument position affects type checking), compilation becomes trivial, in particular easily allowing separate compilation. You already know how: it's the draw-shapes trick, i.e., double dispatch. So in this case, it's trivial to keep what the compiler is "doing" in your head. (Of course, the compiler is free to use clever tricks, such as dispatch tables, to speed things up.)
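The "draw shapes trick" can be sketched in a few lines of Python (class and method names are illustrative): the first argument's dynamic type selects a method, which then dispatches on the second argument, so each step is plain single dispatch and there is nothing ambiguous for a compiler to resolve.

```python
class Shape:
    def intersect(self, other):
        # Reverse the call: dispatch on the *other* operand's class,
        # passing self along so both concrete types are now known.
        return other._intersect_with(self)

class Circle(Shape):
    def _intersect_with(self, other):
        # Here 'self' is known to be a Circle; 'other' is the original
        # receiver, whose concrete type we can now inspect.
        return f"{type(other).__name__} x Circle"

class Square(Shape):
    def _intersect_with(self, other):
        return f"{type(other).__name__} x Square"

print(Circle().intersect(Square()))  # Circle x Square
print(Square().intersect(Circle()))  # Square x Circle
```

Because the dispatch order is fixed by argument position, there is no ambiguity resolution to perform, which is exactly the asymmetry being described above.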

The aforementioned interacts sensitively with JIT compilation, with the net outcome that it's reportedly difficult to predict the performance of a snippet of Julia code.


Just to clarify the above:

1. I use the term "performance" slightly vaguely. It comprises two distinct things: the time it takes to compile the code, and the execution time. The issue is the compilation time: there are certain cases where it's exponential in the number of types which could unify with the callsite's type params.

2. IIRC, the Julia compiler has heuristics to ensure things don't explode for common cases. If I'm not mistaken, not only do compile times explode, but certain (very common) things don't even typecheck. There's an excellent video about it by the designer of the language, Jeff Bezanson: https://www.youtube.com/watch?v=TPuJsgyu87U . Note: Julia experts, please correct me if this has been fixed.

3. The difficulty in intuiting which combinations of types will unify at a given callsite isn't theoretical; there are reports of libraries which unexpectedly fail to work together. I want to qualify this statement: Julia is light years ahead of any language lacking multimethods when it comes to library composability. But my guess is that those problems would be reduced with non-symmetric multimethods.

4. The non-symmetric multimethod system I'm "proposing" isn't my idea. They are referred to variously as encapsulated or parasitic multimethods. See http://lucacardelli.name/Papers/Binary.pdf

I have huge respect for Jeff Bezanson, for the record!


> The syntax is a bit weird

In what way? It's more-or-less the same syntax as Ruby and Elixir, just with different keywords. Like as much as I love Zig, Zig's syntax is way weirder than Julia's IMO (and none of 'em hold a candle to the weirdness of, say, Erlang or Haskell or Forth or Lisp).


First, let's distinguish between two types of syntactic constructs: null and left denotations. (Terminology borrowed from Pratt parsers.) Null denotations can exist on their own; left denotations can't: they are inherently chained (e.g. arithmetic expressions, statements in a block, or elements of a tuple), and allow a succinct, infix notation for variable-length constructs (no lispy parenthesis hell).

Second, null denotations usually introduce names, whether for variables, types, functions, lifetimes, macros, etc. One exception to this is free-standing value expressions (a bit weird; less so when they're the last expression in a block, indicating the value returned by it). Another exception would be directive-type constructs, e.g. directives to import names from another module, or directives to give hints to the compiler. The last two exceptions are the most common ones: variable assignment and function invocation.

The golden rule of good language design, as I see it, is this: null denotations must begin with a fixed and unique token. The only permissible exceptions should be for assignment and function invocation; exceptions which exist because those use cases appear so often in a typical program that requiring a prefix would be insufferable.

Julia breaks this rule for global variables. (Fair enough, Python also commits this error, but it's a mistake and a source of bugs!) But wait: Julia also has "const" and "local" binding constructs, where it follows the golden rule, so now your syntax isn't consistent. You need to keep these nuances in your head, and know the difference between a soft and a hard scope, when you want to write a function which modifies a function using macrology.

(As a point of taste on the choice of prefix token: introduction of variables through "local" is just as weird as C++'s "auto", and at least Bjarne Stroustrup had an excuse for that choice. Anyone who introduces a global variable in a local scope should be punished, imho, so there's no need to say "this is a local variable"; it's obvious from the fact that the name is introduced inside a function. Instead, my personal preference is to introduce constants through "let" and variables through "var". The former is well known to anyone numerate, and the latter is ubiquitous in software engineering. Both read well; they're as close as you can get to constructs in English.)

Julia breaks the golden rule again with its succinct, Mathematica-style notation for function definition. I get that it wants to appeal to Mathematica users, but Python already proved you don't need to do that. This is a programming language; brainy types, like mathematicians and physicists, aren't going to be flummoxed by an unfamiliar notation for function definition, or irritated by having to type a few extra characters.

I mention macrology, but it's not just that. Let's say you want to write a syntax highlighter: you need to take into account all that weirdness. If null denotations have a fixed and unique prefix, parsing is easy-peasy. Want to add a capability to "inline" HTML code within Julia, React-style? You're going to run into similar issues. And so on...
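The nud/led terminology comes from Pratt parsing, and the "golden rule" is easy to see in a toy parser (a from-scratch Python sketch; tokens and binding powers are illustrative): every null denotation below is identified by its very first token, which is what keeps the parser trivial.

```python
import re

def tokenize(src):
    return re.findall(r"\d+|[+*()-]", src) + ["<end>"]

class Parser:
    BP = {"+": 10, "*": 20}  # binding powers for the left denotations

    def __init__(self, src):
        self.toks = tokenize(src)
        self.pos = 0

    def next(self):
        tok = self.toks[self.pos]
        self.pos += 1
        return tok

    def nud(self, tok):
        # Null denotation: starts an expression on its own; each case
        # is identified by its first token (the "golden rule").
        if tok.isdigit():
            return int(tok)
        if tok == "-":  # prefix minus binds tighter than any infix op
            return -self.parse(100)
        if tok == "(":
            val = self.parse(0)
            assert self.next() == ")"
            return val
        raise SyntaxError(f"unexpected token {tok!r}")

    def led(self, tok, left):
        # Left denotation: continues the expression to its left.
        right = self.parse(self.BP[tok])
        return left + right if tok == "+" else left * right

    def parse(self, rbp):
        left = self.nud(self.next())
        while self.BP.get(self.toks[self.pos], -1) > rbp:
            left = self.led(self.next(), left)
        return left

print(Parser("2+3*4").parse(0))    # 14
print(Parser("(2+3)*4").parse(0))  # 20
```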


I've always thought it sad that lush died; in many ways it was a spiritual predecessor to julia. here's a nice blog post about it: https://scottlocklin.wordpress.com/2024/11/19/lush-my-favori...

I was excited about julia as an application development language when it first came out, but the language and ecosystem seem to be targeting long-running processes. there was just a ton of latency in build time and startup time for things like scripts and applications, so I moved on.

https://yuri.is/not-julia/

> My conclusion after using Julia for many years is that there are too many correctness and composability bugs throughout the ecosystem to justify using it in just about any context where correctness matters.


It's actually better suited, IMO, being a compiled language. I'm not sure how anyone could consider the current train wreck of just getting Python code to run to be "actual applications." uv is great and all, but many of these "actual applications" don't use it.

https://juliahub.com/case-studies

Most "Python" applications are actually bindings to C, C++ and Fortran code doing the real work.
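That division of labor is visible in miniature even in the standard library (a toy illustration; NumPy, SciPy, and friends apply the same principle at a much larger scale): the built-in `sum` runs its loop inside CPython's C internals, while the hand-written loop executes bytecode in the interpreter.

```python
import timeit

data = list(range(100_000))

def c_backed():
    return sum(data)  # the loop runs in CPython's C implementation

def pure_python():
    total = 0
    for x in data:  # the same loop, executed as interpreter bytecode
        total += x
    return total

assert c_backed() == pure_python()
t_c = timeit.timeit(c_backed, number=20)
t_py = timeit.timeit(pure_python, number=20)
print(f"C-backed sum: {t_c:.4f}s  pure-Python loop: {t_py:.4f}s")
```

On a typical CPython build the C-backed version is several times faster, which is exactly why so much "Python" code is really a thin layer over compiled libraries.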


And it ... works?

Use C, C++ or Fortran for the heavy lifting, and Python for UI/business logic/non-high perf stuff for rapid app development.


> And it ... works?

It gets the job done, but the existence of Cython, Numba, Pythran, PyPy, and many, many others is an indication that this isn't a global optimum.


Yup, there would have been much less Git buy-in if it weren't for git flow; people grow incredibly attached to their beloved taxonomies.

> Yup, there would have been much less Git buy-in if it weren't for git flow

I don't buy this. I've never used git-flow in my life. No team I've worked for has ever used git-flow. Yet all of us have been using Git for ages. Git has been hugely successful independently, and different teams follow different Git workflows. Its success has very little to do with git-flow.


>I don't buy this.

It's not really debatable. Git flow came about because of SVN/CVS practices, and it was the first branching model, and for many still is THE branching model they use.

>Yet all of us have been using Git for ages

You say "all of us" but then you completely ignore the primary branching model the vast, vast majority of people use on Git.

Just for the record, this isn't being stated in support of git-flow; it's just a historical fact that's not really debatable.


> the primary branching model the vast, vast majority of people use on Git.

> it's just a historical fact that's not really debatable.

Over my last 15 years of software dev, I have _never_ heard of anyone actually using Gitflow in their codebase.

I'm not saying you're wrong. My experience is anecdotal. But I don't know why you say it's a "fact". Was there surveys or anything?


I'm not questioning your experience, but how "enterprise" is that experience? Gitflow was no small part of my convincing my company to move off TFVC. I doubt they still use it, but it was shallow waters for scared folk.

I strongly doubt that my story, just as much as yours, is unique.


> It's not really debatable.

Very weird for you to start a reply like this when we are literally debating it.

> You say "all of us"

Yes, I mean those of us who don't use git-flow. That's what I meant by "all of us".

> ignore the primary branching model the vast, vast majority of people use on Git.

Do you live in a git-flow bubble or what? I've been using VCS since the dark ages of CVS. Moved to SVN. Mercurial. Git. Never worked in a team using git-flow. Never used git-flow myself. Never met anyone IRL who uses git-flow. I only read about these things on HN and blogs.

What kind of stats do you have to claim that this is the primary branching model? If I go by my experience, it's a minority branching model that only people living within the bubble care about.

> it's just a historical fact that's not really debatable.

What is a historical fact? That people use git-flow? Nobody is contesting that. What I am contesting is the claim that the success of Git is connected to git-flow, as the great-grandparent comment said.


I'm not debating it... we're not debating it. You're having it explained to you.

>If I go by my experience

That would be the very definition of a bubble.


This is one of the most ridiculous comments I've ever read on Hacker News. You really think git became popular because someone wrote up a branching convention for it?

Git became popular because it was one of the first two open-source distributed version control systems. Compared to the least-bad open-source (non-distributed) version control system before it, SVN, the native branches and the ability to have a local copy of the whole tree were self-evidently a revolution.

(The other one was Mercurial by the way, released at almost exactly the same time as git. Partly git won that race because of the cachet of being written by Torvalds and being used for the kernel, but I suspect mainly it was due to the existence of GitHub.)

Aside from the above, it's also just clearly not true that git flow was particularly common. It's no good claiming anyone that disagrees is in a bubble. We all have access to GitHub! Look for yourself at some random repos (and make sure you sample a few different languages). It will verify my experience of looking at dozens, probably hundreds, of repos over many years: the number of people using git-flow is, to a first order approximation, roughly zero.


> I'm not debating it... we're not debating it. You're having it explained to you.

You have not explained anything.

> That would be the very definition of a bubble.

Just as is your bubble.


I think we're fine: https://youtube.com/shorts/3fYiLXVfPa4?si=0y3cgdMHO2L5FgXW

Claude invented something completely nonsensical:

> This is a classic upside-down cup trick! The cup is designed to be flipped — you drink from it by turning it upside down, which makes the sealed end the bottom and the open end the top. Once flipped, it functions just like a normal cup. *The sealed "top" prevents it from spilling while it's in its resting position, but the moment you flip it, you can drink normally from the open end.*

Emphasis mine.


He tried this with ChatGPT too. It called the item a "novelty cup" you couldn't drink out of :)

> being left behind (“I still can’t bring myself to vibe code. I have to at least skim every diff. Meanwhile this guy is joining OpenAI”).

I don't believe skimming diffs counts as being left behind. Survivor bias, etc. Furthermore, people are going to get burned by this (some already have been, but seemingly not enough), and a responsible mindset such as yours will be valued again.

Something that's still up for grabs is figuring out how to do fully agentic coding in a responsible way. How do we bring the equivalent of skimming diffs to that?


Intel very recently seems to be making progress thanks to what the previous CEO kicked off. People are comparing Panther Lake to M1 (but we'll see when it is in reviewers' hands).

Yup, but "normies" do need menus or at least some way to do things that has some degree of visual affordance (e.g. a persistent cmd/ctrl+p, which I think Office has/had).

not everyone can drive a Ferrari
