
I appreciate the effort, but if you are unit testing shell scripts, it's probably better to use a language like Python or something similar.

I am serious.

I have decided, for my own work, to use Python for any kind of command line script, and I'm always glad I did.



I still find it a bit awkward to use Python just to run some commands in sequence with some massaging of input/output data and parameters based on simple logic. Bash scripts are great for that. And in that case it's still a good idea to have automated testing instead of relying on running the script a few times to ensure it behaves as it should. I'll definitely give this a try :)

I do agree though that if you need extensive massaging of output or arguments, python can help and make the whole thing easier.


"sh" for python is actually pretty cool

https://amoffat.github.io/sh/

Available as `python3-sh` in Ubuntu.


This seems friendlier out of the box than subprocess. Any reason to use this vs subprocess?
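For a concrete feel of the difference, here's a minimal sketch of the same command run both ways (the log file name is made up for illustration):

    import subprocess
    import sh  # pip install sh; Unix-like systems only

    # subprocess: explicit argument list, result object with separate fields
    result = subprocess.run(["wc", "-l", "app.log"],
                            capture_output=True, text=True)
    print(result.stdout, end="")

    # sh: commands are looked up as attributes; stdout is the return value
    print(sh.wc("-l", "app.log"), end="")

One practical reason to stick with subprocess: it's in the standard library and works on Windows, while sh only supports Unix-like systems.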


The trouble is that bash is necessary to set up Python. For instance, the lingua franca of Docker is bash (well, sh, but you should obviously change that).

I've been resisting learning (any more) bash for a while now, but I think I'm going to have to. This looks like it could reduce some of my terror at the insanities of shell scripts (why does one even need two quote characters that do different things?)


Ruby has two kinds of quotes as well (interpolating and non-interpolating), but that must be borrowed from sh.


I'd wager Ruby borrowed from Perl, which was the original borrower from sh.


JS has 3 kinds of quotes, of which 2 do the same thing, and the remaining one does two different things...


And that's another language I've been avoiding learning ;)


edit: minor bugfix in example code

A couple of recommendations that make bash much saner and avoid entire classes of problems:

First, whenever you are expanding a list of args, use "$@"! That exact four-character sequence expands to the properly quoted positional params ("$1" "$2" ...) with any necessary escaping included, so each param expands as a single word. Almost all of the problems you've probably heard about bash "not handling spaces properly" or otherwise having problems with whitespace or strange characters in filenames are fixed by using "$@". If you're using arrays/hashes, you can get the same effect using "${somearrayorhash[@]}" (quotes included, just like "$@"). Removing the quotes or using the traditional $* is almost always a bug.

Second, always use explicit quotes/brackets! Forget that they were ever optional. Using "$@" fixes most of the whitespace-in-filename problems; expanding your variables with explicit quotes fixes the rest. Assuming these:

    showargs() { 
        echo "$# args"
        for i in "$@" ; do
            echo "arg[${i}]"
        done
    }
    declare -- name="filename with spaces\\!.txt"
    declare -A h=([a]="b c d e" [foo]="'bar' \"baz\" qu*x")
Instead of using the traditional shortcuts (which cause problems):

    showargs $name
    # 3 args
    # arg[filename]
    # arg[with]
    # arg[spaces\!.txt]

    showargs ${h[*]}   # or ${h[@]}
    # 7 args
    # arg[b]
    # arg[c]
    # arg[d]
    # arg[e]
    # arg['bar']
    # arg["baz"]
    # arg[qu*x]
Always using quotes/brackets simply does the right thing:

    showargs "${name}"
    # 1 args
    # arg[filename with spaces\!.txt]

    showargs "${h[@]}"
    # 2 args
    # arg[b c d e]
    # arg['bar' "baz" qu*x]
Bash still has its quirks and strange historical baggage, but in my experience, using these two rules (and actually taking the time to read the bash(1) manpage...) changed writing shell scripts from an annoying mess of buggy arcane incantations into an actually sane(-ish) programming language.


One set of quotes (double quotes) allows for interpolation of commands/variables. Try:

  $ echo 'pwd'
  $ echo "pwd"


You're right, but it's neither of those for command interpolation. Double quotes enable variable expansion; single quotes do not. Command interpolation uses back quotes, which is a carryover from sh. The more modern bash way is as follows:

   $ echo $(pwd)
Incidentally: how does one type back quotes on an iPhone?!

Edited to add: shellcheck [0] will flag the back quote usage if you're writing a bash script instead of a sh script.

[0] https://www.shellcheck.net/


Longpress single-quote.


That ... lacks discoverability.

`Thank you`!


Like most iPhone tips. How was I to discover long press space bar for cursor placement?! I like it, but discoverability...


It was much better on the 3D Touch equipped models, press “through” the keyboard to get cursor placement, press “through” again while on a word to select the whole word.


ShellCheck always points out backticks as "legacy".


Thanks for the correction


If it’s just a few lines or some one-off thing, I understand the use of bash.

I start to get the feeling that if you feel the need for shell scripts, it might be wise to pause and wonder if this is really the right approach long-term. Especially if you feel the need to put it in git or something.

My experience is that often I need to put in some checks for safety, and before I know it, I've created a mess of grep, awk, cut, and sed, and I wish I had started out with Python.

Are you really that much in a hurry, or do you have the time to calmly spend a little bit more to ‘do it right’?


Or you literally just need to run a sequence of commands to create some stuff in the filesystem without doing any text processing whatsoever, which is my use case for the three or four large-ish BASH scripts I've written professionally. Make was totally inappropriate in that case because I would have had to enumerate a lot of intermediate files and would have wound up with a parallelizable series of mini-scripts that would need to run serially to work correctly. And Python would ultimately turn into a DSL that looks almost exactly like BASH because the problem domain is "run a bunch of commands in sequence" which is what BASH is designed to do.


Yes, I think this is exactly the right kind of reason to use bash.


Is there an elegant way to do unix-like piping in python?


- Use shell=True and the subprocess module, i.e., use sh for what it is good for (one-liners that run external commands) and use Python to glue it together in a sane manner (consider shell a DSL in this case).

- You could use the plumbum module to embed commands in Python itself: https://plumbum.readthedocs.io/en/latest/#piping

- For Makefile-like functionality, invoke/fab could be used (flexible configuration, composition, debugging of the commands): https://docs.fabfile.org/en/2.6/getting-started.html#addendu...
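To make the plumbum option concrete, its operator overloading lets a pipeline read almost like the shell version. A minimal sketch, close to the example in plumbum's own docs:

    from plumbum.cmd import ls, grep, wc

    # The | operator builds a pipeline object lazily;
    # nothing runs until the chain is called.
    chain = ls["-a"] | grep["-v", r"\.py"] | wc["-l"]
    print(chain())  # prints the line count as a string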


One of the biggest downsides is that the startup time for Python is significantly slower than for the shell. With just one script this isn't really noticeable, but if you're composing a lot of small scripts together, Python becomes noticeably less responsive for interactive commands than bash.

Along similar lines, bash's syntax is incredibly streamlined for composing standalone scripts and programs through pipes. A simple bash one-liner like the one below would be much more awkward to write in Python:

> diff <(netcat $server | grep town | sed 's/street/St/' | cut -f 3 | head -n 5) <(cat ./$(psql $query).dat)
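For comparison, here is a rough stdlib-only sketch of just the first process substitution (the server value is a placeholder, and tab-delimited output with at least three fields is assumed, as in the one-liner above):

    import subprocess

    server = "db.example.com"  # placeholder for $server

    # netcat $server | grep town | sed 's/street/St/' | cut -f 3 | head -n 5
    raw = subprocess.run(["netcat", server], capture_output=True, text=True).stdout
    town_lines = (line for line in raw.splitlines() if "town" in line)
    # sed replaces the first match per line; cut -f 3 takes the third tab field
    fields = [line.replace("street", "St", 1).split("\t")[2] for line in town_lines]
    print("\n".join(fields[:5]))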


I wonder if that ‘lots of small scripts together’ is a desirable situation. Can't it be just one app that performs all the steps?

It is all about context to me. A shell one-liner is not something I would replace with Python, but as soon as you start up an editor, I would say think again.


Agreed entirely. I inherited a system with 50k lines of unreliable shell scripts. I figured out ways to unit test them, do mocking, and pushed developers to unit test their scripts. I even wrote long articles on best practices for writing reliable scripts, including massive lists of gotchas.

Eventually I realised it was a lost cause and really you just shouldn't use shell scripts for anything that you want to be reliable above a trivial level of complexity.

That was with PowerShell, but I wouldn't be surprised if the same applies to bash as well.


That is what you get when you let anybody write scripts without code review.

Bash people are the worst when they switch, since they keep the same mentality and continue parsing strings or using other bashisms.

See my answer above also.


With PowerShell it is possible to write decent code, imho.

PowerShell is on another plane, more like Python and less like bash shell scripting.

I used PowerShell and Pester a lot and it worked great.

Bash becomes a mess of cat, grep, sed, awk, and cut.


I agree that you should go to a more robust language when shell scripts get bigger, especially if you are needing or wanting unit tests. I would avoid Python if there's a lot of dropping into the subprocess module; it suffers the same problem as Go: lots of ceremony and boilerplate to run shell commands. I recommend Ruby or Perl because they can drop into a shell-like mode and you can execute shell commands more “naturally.”


At this point anything else seems better than shell scripting. Sure, not Brainf@ck or COBOL, but I think the point is clear.


We have tons of PowerShell and it works like a charm, although we have some experienced posh devs.

We also test our REST backend in PowerShell using Pester and a homemade posh REST client.

PowerShell is preferred in this house because

a) you can run it on any Windows OS on the spot and modify it in an ad hoc manner; you can even debug it with breakpoints etc. easily. Also, our Linux machines have it, so it is a unifying admin interface.

b) it's powerful; you can do anything in it with a few lines of code (one case: we did 10 million SOAP requests using a certificate per day for an entire country)

c) many Windows tools use it, like SQL Server, IIS, etc., which makes management way easier. For example, we use [1] to install SQL Server on all dev/prod machines, [2] to monitor all our servers, and [3] to send CI metrics to influx (all those are just minor samples; we have a bunch of stuff like that)

d) we find it way easier to keep CI/CD vars in PowerShell hashtables than in YAML, so our YAML files are one-liners and everything works locally.

e) Python, Ruby, and friends are NOT designed for shell work. It's awkward, unfriendly, and most of all not there on Windows out of the box.

---

[1] https://github.com/majkinetor/Install-SqlServer

[2] https://github.com/majkinetor/flea

[3] https://github.com/majkinetor/psinflux


Hey, since you appear to use Chocolatey and there are few people using it, how do you feel about it? Do you think it will keep existing? I am not that into the way MS is doing the winget thing, but I also noticed it appears to impact whether people adopt Chocolatey. I see a lot of manual scripts for installing things on Windows CI systems (some using the now-defunct fciv and soon-to-be-defunct bitsadmin)... I really like Chocolatey, but I am worried it will disappear soon.


Chocolatey is the only thing I use to install stuff, both on CI and on each dev machine, and I regularly create packages for it [1]. I worked hard to make what I need stable and not depend on their existence [2], and all the repos using the same methodology release packages on GH [4], and there is a handy script to install from there. I also created AU for it [3] and managed to convince people to embed software in packages [5] so packages always work (you can cache them on your own via the file system, Artifactory, Nexus, etc.). You can also host your own gallery in a number of different ways. So, in short, there is an escape plan. TBH, it looks like choco is doing better than ever. And you simply can't find a better software repository; it's better and more up to date than most Linux package repos (on par with Arch).

> I am not that into the way MS is doing the winget thing,

That is years away IMO, there's no scripting there either, and it moves like a snail. I would really be embarrassed if I were leading that team.

> I see a lot of manual scripts for installing things on Windows CI systems

Yeah, most people suck, like their scripts :-) There is literally zero chance of making a reliable installation script in general that works in any context.

> I really like chocolatey but I am worried it will disappear soon.

Just use it. I don't work for them; I maintain the core team repo [2]. It's a great tool now. What will happen tomorrow nobody knows, but like I said, you have an escape plan, and even if they go down, your CI will still work for many years if you set it up properly.

---

[1] https://gist.github.com/majkinetor/a700c70b8847b29ebb1c918d4...

[2] https://github.com/chocolatey-community/chocolatey-coreteamp...

[3] https://github.com/majkinetor/au

[4] https://github.com/majkinetor/au-packages/tags

[5] https://github.com/chocolatey-community/chocolatey-coreteamp...


PowerShell is an entirely different matter to me.

PowerShell is so much more powerful; I've written a ton of code in it and also used Pester for unit testing.

An entirely different world compared to shell scripting.


In a similar vein, there's also babashka for these "bash+" use-cases: https://github.com/babashka/babashka


bats is used for running tests against the command-line interface of your program. This is useful for making sure weird strings and argument patterns are handled correctly.



