Hacker News

>I think Python was popular as a general-purpose language first.

What were their choices though, Perl? It's easy to see why Perl lost out. Other than PHP, I don't really know of any other JIT scripting languages they could have chosen.



I knew about Python around 2006, when I got into Linux, and at that time (when Python 2 was current) it was very popular in the FOSS world for doing apps and GUIs (I even toyed a lot with PyGTK), whereas Perl felt much more about "serious" stuff like text processing, kind of a sh on steroids. I had barely heard of Ruby and wasn't sure what its purpose was - I had heard about its "gems" but not about Rails. Still, as both Python and Perl were FOSS, I assumed their niche and user base were going to stay around it, as with many things FOSS at the time.

In 2008 I started my graphic design studies and pretty much had to forget about programming (in hindsight, if I had kept at it I would have a very strong programming background and maybe my life would be much better now, but it is what it is). I was very surprised to discover around 2011 or so that it seemed _everyone_ was using Python. Like I just blinked and it took over the world somehow.


The other strong contender at the time was Tcl, especially when combined with its graphical toolkit as Tcl/Tk. Python adopted Tk as well (as Tkinter), given its popularity.

Tcl's extensibility led to Expect, which was very useful for automating interactive sessions over telnet.

https://en.wikipedia.org/wiki/Expect


Tcl also influenced many aspects of Microsoft PowerShell.


There were always other choices. Lua is the main one that comes to mind.

The point is that data science use came quite late in the Python world, and the increase in users due to it is incremental. Python was already at 3.x before the ML world adopted it. If the ML world picked another language, Python would still be in the top 5 languages.


You arbitrarily restricted the scope. Java is what I saw.

I was in college 2006 - 2010 in CS, and while all the introductory courses were in Java, by 2008 or so a lot of the other students had switched to Python on their own, for projects where either language would work. Didn't really see anything else, just Java and Python.


No, it wasn't arbitrary. A JIT language is much easier to pick up than something needing a compiler and an executable. The focus on a scripting language was deliberate.

Edit: turns out the term I'm looking for wasn't JIT but an interpreted language.


Perl and Python have opposite philosophies with regard to standards. Python prefers a standard "pythonic" way, while Perl had its "there is more than one way to do it".

It would seem that having a standard is more popular.


Yeah, there's the ongoing Perl joke about writing a script that works today but not understanding how it works tomorrow. Too much one-liner type stuff that did not allow for maintainability.


That’s any code I haven’t looked at for a while, to be honest. I can’t count how many times I’ve looked at code I wrote or bug tickets I fixed and have absolutely no memory of doing it. It’s almost like the act of committing flushes the local storage in my brain.


I run into this problem as well. I'll often come across something I wrote a few years ago and struggle to remember why I wrote it that way.

I've learned to add comments to my code. From what I see, commenting code is frowned upon by a certain subset of developers, but I've taught myself to add a short comment whenever I am doing something subtle or unintuitive, explaining what the code is doing. For every potentially unnecessary comment I've added, I've also saved myself time when I've had to come back to something months or years later and been able to refer back to the comment.


> from what I see commenting code is frowned upon by a certain subset of developers

I am not able rightly to apprehend the kind of confusion of ideas that could provoke such a... belief.


The main argument seems to be that comments should be left in the commit message and not "inline", i.e. mixed with the code. Personally I make heavy use of inline comments.


I've worked in tech my whole life and never heard a dev make these "comments are bad" claims. I only work in the boring world though. Is this a startup or FAANG or Silicon Valley thing?


In Clean Code there is a chapter about eliminating useless comments. I think this has some merit. Consider the following made-up example:

sum = sum_two_numbers(number1, number2)

Probably doesn't need a comment.

The other argument is that comments need to be maintained and are subject to decay, e.g. when the code they are commenting on changes.

The book goes into other examples, but I think the idea isn't to eliminate comments but to be thoughtful and judicious about which ones you use.
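A small sketch of the distinction (the rounding constraint here is a made-up example, not anything from the book): a comment that restates the code adds nothing, while a comment recording the *why* is the kind that pays off later.

```python
from decimal import Decimal, ROUND_HALF_UP

price, tax = Decimal("9.20"), Decimal("0.805")

# Redundant comment: it just restates the code.
total = price + tax  # add price and tax

# Useful comment: it records a reason the code alone can't express.
# Round half-up because the (hypothetical) payment provider rejects
# totals produced by the default banker's rounding.
total = total.quantize(Decimal("0.01"), rounding=ROUND_HALF_UP)
```

The first comment should be deleted; the second is exactly the "subtle or unintuitive" case worth keeping.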


Perl is different, as people really did this on purpose, maximizing the concept of one-liners. Maybe it's a PTSD kind of effect? "Any" code, or at least good code, doesn't try to be difficult to parse for the next person. Perl developers definitely didn't adhere to the old advice to write code as if the person maintaining it is a serial killer who knows where you live.


> Perl and Python have opposite philosophies with regard to standards. Python prefers a standard "pythonic" way, while Perl had its "there is more than one way to do it".

It wasn't just "having a standard" that mattered; in Perl it was actively encouraged to find a different way of doing something.

"There is more than one way to do it" was often taken by the Perl programmer as a challenge to find all the other ways, and then use them all in different parts of the same program.


Icon? Smalltalk? Dylan? Scheme? Common Lisp?


Perl's actually excellent at processing unstructured data, and it had a strong foothold in bioinformatics for a time. I don't think the decision was as obvious as it looks.


This is true. Bioinformatics was full of Perl scripts for a variety of (text) analyses. I remember well, however, that many students began to hate it soon after working with it, as it was very difficult to understand existing code. So, when given a choice, many chose Python as an alternative. And stayed with it.


For most people the thing that is hard to understand about Perl scripts is the regexp code. However, regexps look more or less the same in any language. The thing is, most Perl scripts process things such as log files and similar data, which makes the scripts highly dependent on regexps, hence hard to read and maintain. The same goes for any code that uses a lot of regexps.

Actual Perl code, disregarding regexps, certainly isn't any more difficult to comprehend than code in most other languages.
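To illustrate that the density is a property of regexps rather than of the host language: Python's stdlib `re` module accepts the same dense patterns, and its `re.VERBOSE` flag is one mitigation, letting you annotate the pattern inline (the log-line pattern below is a made-up example):

```python
import re

# A dense pattern for an Apache-style log prefix: host, ident, user, timestamp.
dense = re.compile(r'^(\S+) (\S+) (\S+) \[([^\]]+)\]')

# The same pattern with re.VERBOSE: unescaped whitespace and # comments
# inside the pattern are ignored, so the structure can be documented.
readable = re.compile(r"""
    ^(\S+)\ (\S+)\ (\S+)   # host, ident, user (spaces must be escaped)
    \ \[([^\]]+)\]         # timestamp in square brackets
""", re.VERBOSE)

line = '127.0.0.1 - frank [10/Oct/2000:13:55:36 -0700] "GET / HTTP/1.0" 200'
assert dense.match(line).groups() == readable.match(line).groups()
```

Both patterns capture identical groups; only the readability differs.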


Python may be JIT now (is it?), but it certainly wasn’t back then.


JIT may not be the correct thing to call it. At least in my head, any script that doesn't need to be compiled by the dev or the user before running is JIT. It's the scripting vs building difference to me. If that's not correct, then I'd love to be corrected.

Here's my reference:

https://en.wikipedia.org/wiki/Just-in-time_compilation


Interpreter vs. Compiler might be closer to the distinction you are looking for:

https://en.wikipedia.org/wiki/Interpreter_(computing)


Thanks. "My interpreter doesn't care about white spaces, why should I" being something that should have clued me in.


JIT is a compilation model for interpreted languages, which translates bytecode into machine code at runtime to obtain better performance. The opposite of JIT is AOT (ahead-of-time). JIT-compiled languages go through two compilation steps: one to bytecode, and another to native code at runtime. Java is one example of a JIT-compiled interpreted language that also does ahead-of-time compilation to bytecode, while Python compiles to bytecode transparently but does not do any JIT compilation to native code, which is part of the reason it's considered a "slow language" (though this expression doesn't really make sense, as slowness is a property of the implementation, not of the language).

TLDR:

Java uses AOT bytecode compilation + JIT compilation

Python uses import-time bytecode compilation + slow interpreter
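You can see CPython's bytecode step directly with the stdlib `dis` module (the exact opcode names vary between Python versions):

```python
import dis

def add(a, b):
    return a + b

# CPython has already compiled `add` to bytecode at definition time;
# dis prints the instructions the interpreter loop executes: a LOAD_FAST
# variant for the arguments and a BINARY_* instruction for the addition.
dis.dis(add)
```

This is the bytecode that CPython then interprets instruction by instruction, with no further compilation to native code.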


> (though this expression doesn't really make any sense, as slowness is a property of the implementation, not of the language)

Yes, which is why this is the way CPython works, but PyPy uses JIT and is faster.


> and is faster.

After a very long warmup time, which may make one-shot applications much slower.



