Using GNU Stow to manage your dotfiles (invergo.net)
188 points by shawndumas on Oct 21, 2014 | 68 comments


> but many/most dotfiles reside at the top-level of your home directory, where it wouldn't be a good idea to initialize a VCS repository.

Right. Which is why I use git with a workspace detached from the git directory. The git directory is in ~/dotfiles/.git, with "git config core.worktree ~", and I set an alias "dgit=git --git-dir ~/dotfiles/.git". The advantage of this method is that there are no symlinks to maintain; however, the initial clone is a bit painful, since you have to move the files and set the config.

It's not perfectly straightforward, but it's as close to it as I could get it.
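
For reference, a rough sketch of that setup, using the paths and alias from above (the add/commit lines are only examples):

    # create the repository so its git dir lives under ~/dotfiles
    git init ~/dotfiles
    # tell git the working tree is the home directory itself
    git --git-dir ~/dotfiles/.git config core.worktree ~
    # convenience alias so --git-dir doesn't need repeating
    alias dgit='git --git-dir ~/dotfiles/.git'
    # from then on, manage dotfiles from anywhere under ~
    dgit add ~/.bashrc
    dgit commit -m 'track .bashrc'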


The one thing I like about stow is that I can choose to only add configurations that make sense for the machine I am using. If I don't have a certain application, I don't 'stow' the configuration for that application in my home directory.


You could have a separate git branch for each machine :-)


I used to have complex configuration files that worked in all scenarios using conditionals and detecting available features, but now I just generate the configuration files, or rather specialise the parametric versions of them, and push them with chef/cfengine/ansible or even rsync.


A sibling post mentions a thing called vcsh (with no additional details) that appears to allow you to do just that with the same premise by overlapping multiple repositories onto your homedir.


You can have multiple git repositories, each for a different application. Or you can use git submodules when suitable (e.g. vim plugins, directories in ~/.config).


I've been using the same scheme for some time and it works nicely. It started with a simple alias as well, but it expanded into a bigger wrapper script over time. You can check it here:

https://github.com/marbu/dogit/blob/master/README.md

That said, I'm not sure if it would be directly usable for anyone other than me in its current state. It's tailored to my use case and there are a few use cases that aren't solved all that well.

A few additional advantages of this approach in general:

You can use a local branch which is private to the machine and a public branch. Whatever you commit ends up in the local private branch, and if you would like to share it, you need to cherry-pick the particular commit and rebase (so that I don't end up sharing a private ssh key on GitHub).

You can use any git tool directly on your dotfiles.

One can add vim plugins as submodules directly in the dotfiles repo.


This is great! It fixes the only problem I have with maintaining my dotfiles as a git repo: the .git dir in ~. That interferes with so many things, such as the CtrlP vim plugin, the integration of git with the bash prompt, and plenty more.

Thank you for this.


Yeah, the git in the bash prompt thing was the most annoying thing to me, honestly :)

Since there is interest, I'll elaborate on the understated "a bit painful" part of moving the files after an initial clone which leaves the files in the ~/dotfiles dir. The problem is that I didn't find an easy "move and overwrite recursively" command in Linux.

Here's the magic command from my script:

    git ls-files --cached -z | rsync -av --from0 --remove-source-files --files-from - ${dir} ~/ && find . -type d -empty -delete

To spell it out, it's having git list all the files in the repo (so skipping any unversioned files such as the .git dir itself), using the NUL byte as the file terminator to avoid any (most?) special-character issues in file names, and using rsync to move the files over thanks to --remove-source-files. However, rsync doesn't remove empty directories (durr), so the find takes care of that.

Also, you probably want to set "git config status.showUntrackedFiles no" ;)


I have a very similar setup to yours, but my solution is to use a bare repository. To set up dotfiles on a new computer is just:

    $ git clone --bare git@github.com:mbudde/homedir.git .homegit
    $ git --git-dir=.homegit --work-tree=~ checkout -f    # Overwrite existing files
    $ echo '*' >> .homegit/info/exclude
And then I have a simple git wrapper script [1]. I moved from using an alias to a script for some reason I can't remember (and of course I didn't write it in the commit message -_-).

[1] https://github.com/mbudde/homedir/blob/master/usr/bin/hgit


This definitely looks like a better alternative!


> I didn't find an easy "move and overwrite recursively" command in Linux.

You don't need recursion built into the commands; just use find (maybe with xargs) to generate the list of files to overwrite (or the commands to execute).


That's what I did in my first iteration, except find is hell.

No wait, I mean find is a really powerful and sophisticated tool, but it takes time and effort to get it to do precisely what you want. It turned out rsync did what I wanted more easily.


Find has an atrocious syntax. Nowadays, I seldom use any feature of find other than enumerating files. Rather, I enumerate files, create commands on them (with things like xargs and sed), and pipe the output into sh. Generating code is a very flexible technique, and it's actually easier (and safer) than trying to muck with find options.
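
A contrived example of that pattern (assuming filenames without spaces or shell metacharacters):

    # enumerate the files, turn each path into a command, and inspect the result...
    find . -name '*.orig' -type f | sed 's/^/rm -v /'
    # ...then, once it looks right, actually run it
    find . -name '*.orig' -type f | sed 's/^/rm -v /' | sh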


Since we're all talking about it...

I keep my dotfiles in a sub-directory (doesn't matter where) and use the following Rakefile to manage it:

https://github.com/tvon/dotfiles/blob/master/Rakefile

Then it's just `rake status` or `rake link` to symlink everything. I've only been using this for a few years but I haven't run into any problems yet.


This is exactly what I do and I think it works fantastically. I also have the same problem with the initial clone and I keep meaning to write a script to automate it (especially in a push sort of way, where I just specify the host that I want to convert and it does it over ssh with git push).

I also make an effort to push my config files into .config wherever possible. I really hate the ~ dotfile clutter.


I've once thought of something similar for hg, but a bit bigger, in that:

1. There could be many "hg-dirs" for one regular "workdir" (e.g. .hg_foo, .hg_bar, .hg_baz, ...)

2. There'd be a wrapper tool that, on any hg command, would autodetect which hg-dir any given affected file belongs to, if it was committed earlier (with the ability to override the autodetection by hand, including for adding brand-new files to the repo).

However, I didn't care enough to try implementing this yet. Also, I'm not quite sure how/whether it could work with git, given the existence of the staging area.

(full story: http://stackoverflow.com/a/5789827/98528)


I just made ~/.git and ignore everything with .gitignore, whitelisting files I want to commit: https://github.com/burke/dotfiles/blob/master/.gitignore

It works really well for me.
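
The linked file is the real thing, but the basic shape of a whitelist .gitignore is roughly (filenames are just examples):

    # ignore everything under ~ by default
    *
    # then re-include the files you actually want to track
    !.gitignore
    !.bashrc
    !.vimrc
    # note: files inside subdirectories also need the directories themselves re-included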


I remember giving advice once, to someone who did that and then

    git clean -dfx


That is terrifying. I just added this to my zshrc:

    git() {
      local toplevel=$(command git rev-parse --show-toplevel 2>/dev/null)
      if [[ "${toplevel}" == "${HOME}" ]] && [[ "$1" == "clean" ]]; then
        >&2 echo "Do NOT run git clean in this repository."
        return
      fi
      command git "$@"
    }


That is especially problematic with git's directory traversal behavior. If you are in the wrong directory inside your home, you might accidentally nuke your homedir instead of the intended repo.


That's exactly how vcsh works too.


I gave GNU Stow a try, but thought it did too much.

Ended up using dotfiles (https://pypi.python.org/pypi/dotfiles). My dotfiles (https://github.com/cjoelrun/dotfiles). I handle system-specific stuff in each config: OS checks in the zsh, emacs, and tmux configs. That allows me to move between OS X and Arch Linux.
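
An OS check of that kind might look like this (hypothetical snippet for a .zshrc; the aliases are just examples):

    # branch on the OS so one config works everywhere
    case "$(uname -s)" in
      Darwin) alias ls='ls -G' ;;            # OS X
      Linux)  alias ls='ls --color=auto' ;;  # Arch
    esac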


> I gave GNU Stow a try, but thought it did too much.

Not to argue with your experience, but it's hard to see how it could do less: it literally just creates a bunch of symlinks. (It seems to have grown a bunch of options since I last looked at it, but it seems that they can all be ignored if you want simplicity. I've only ever used `-d`, `-t`, and `-D`.)


I don't know whether it was just unclear to me or unclear in the article, but after checking man stow, here is the relevant section of the man page that cleared up how it worked for me:

The stow directory is assumed to be the value of the "STOW_DIR" environment variable or if unset the current directory, and the target directory is assumed to be the parent of the current directory (so it is typical to execute stow from the directory /usr/local/stow). Each package given on the command line is the name of a package in the stow directory (e.g., perl). By default, they are installed into the target directory (but they can be deleted instead using "-D").
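
In other words, using the man page's example directory:

    cd /usr/local/stow
    stow perl      # symlinks the contents of /usr/local/stow/perl into /usr/local
    stow -D perl   # removes those symlinks again

    # the dotfiles case is the same idea, with the directories given explicitly:
    stow -d ~/dotfiles -t ~ bash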


What I do is just keep all of the dotfiles in ~/dotfiles and symlink them to the home directory, then I can keep track of them with git. Doesn't have to be any more work than that. Maybe it's not an issue for me because I only use one machine, and when I switch to another, I just pull down my dotfiles from github and symlink them again. I only do this every couple years.


This is what I do as well, and I also have a little script that creates all the symlinks in my home directory. Dead simple, and it works. I was hoping Stow would somehow be an improvement over that, but I don't really see how it does anything more for you.


I agree, and I don't understand why you'd want your dotfiles to depend on Make, Ruby/rake, etc., let alone something like Stow. Git and a shell are enough; you can eliminate the Git dependency by packaging from a post-receive hook too.


Handling dotfiles doesn't have to be a hard problem. Here's a simple script I wrote that copies them to and from a git repository.

https://github.com/Lambdanaut/dotfiles/blob/master/cp_dotfil...
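
The linked script is the real one; the copy-in/copy-out idea boils down to something like this (sketch; the file list is made up):

    #!/bin/sh
    # toy version: copy a fixed list of dotfiles to or from a repo checkout
    REPO=~/dotfiles
    FILES=".bashrc .vimrc .gitconfig"
    case "$1" in
      save)    for f in $FILES; do cp ~/"$f" "$REPO/$f"; done ;;
      restore) for f in $FILES; do cp "$REPO/$f" ~/"$f"; done ;;
      *)       echo "usage: $0 save|restore" >&2; exit 1 ;;
    esac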


This is how I manage my dotfiles since I discovered stow a couple of months ago.

The only thing that bugs me is that I haven't figured out an elegant way to have some configs include machine specific configs. For example a set of bash aliases that I only use at work but don't want at home.

My best idea is to source all files in, say ~/.bashrc.d and put machine specific configs in there... Haven't tried it out yet.


I also use stow to manage my common personal dotfiles, as well as my employer-specific dotfiles. The way I do this is to keep my general files in my dotfiles repository, but keep work-specific ones in a separate repo (dotfiles-employer).

This separate repo has .private versions of all of my commonly used dotfiles (for example .bashrc.private), and my top-level files check for the existence of these and source them, if present.

Both repos will be installed using stow, with the work dotfiles generally just extending the personal dotfiles.
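
The check-and-source part is just something along these lines (simplified):

    # in ~/.bashrc: pull in the employer-specific bits only if they're installed
    if [ -f ~/.bashrc.private ]; then
      . ~/.bashrc.private
    fi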


Not sure if this would work with Stow, but in my .bashrc I just do this:

  if [ -f ~/config/bashrc_$(hostname -s) ]; then
        source ~/config/bashrc_$(hostname -s)
  fi
This gives me machine-specific configs.


You can do something similar to target *nix platforms like Darwin|FreeBSD|Linux|NetBSD|OpenBSD with:

  if [ -f ~/config/bashrc_$(uname -s) ]; then
      source ~/config/bashrc_$(uname -s)
  fi


Great idea! It wouldn't work for me though as I tend to call all my machines 'fuckup' (I know...) ;)


In that case, you could have it read or check the existence of something else, like a file in /etc.


> My best idea is to source all files in, say ~/.bashrc.d and put machine specific configs in there... Haven't tried it out yet.

A fair amount of the time when I'm setting up a .foorc, I'll want to add some functions to bashrc, or change a MANPATH or something, so this pattern is useful for that too; you could just have a ~/dotfiles/.bashrc.d/foo.sh and have your .bashrc unconditionally source everything in ~/.bashrc.d/. Then when you use stow to setup foo, its bashrc modifications get installed too.
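
Concretely, that unconditional sourcing can be a short loop (sketch; assumes the snippets are named *.sh):

    # in ~/.bashrc: source every snippet stow has dropped into ~/.bashrc.d
    if [ -d ~/.bashrc.d ]; then
      for f in ~/.bashrc.d/*.sh; do
        [ -r "$f" ] && . "$f"
      done
    fi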


Very nice! This is something I've always meant to set up but never bothered with. He mentions installing different subsets of your dotfiles on different machines. To do that easily and have it "written down", I'd put a Makefile into the dotfiles directory with .PHONY targets for each machine, and have those targets run the right `stow` commands for me.
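
Hypothetically, something like this (machine names and package lists are made up; recipe lines need literal tabs):

    .PHONY: laptop workstation
    laptop:
            stow bash vim git xorg
    workstation:
            stow bash vim git

Then `make laptop` on the laptop installs exactly that machine's subset.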


This makes no sense. Somehow having a versioned dotfiles directory with either a Makefile or a shell script is "hard" because either you need to have Python (no you don't, a shell script is fine) or you forget the name of the install script (type ls), but somehow using this software (which needs to be installed) is "easier".

With this software you also need to remember the magic invocation, a trivial thing in both scenarios, but nevertheless something the author complained about, and you also need to install stow. Previously the author complained about adding dependencies in the form of Python (which made no sense, but still, he complained about dependencies), yet now he adds this new dependency (which is rarer than Python anyway...).


This use case is one of the expressed reasons for stow's existence.

"GNU Stow is a symlink farm manager which takes distinct packages of software and/or data located in separate directories on the filesystem, and makes them appear to be installed in the same place. This is particularly useful [to] facilitate a more controlled approach to management of configuration files in the user's home directory, especially when coupled with version control systems." [1]

Installing a single binary might be easier than installing a scripting environment on a server that may have existing versioning considerations. (brew install stow | sudo apt-get install stow)

You literally type `stow bash` and you're done.

----

[1]: http://www.gnu.org/software/stow/


You literally type make and you're done too.


You might also check out https://github.com/andsens/homeshick. Similar idea, but tailored for dotfiles.


Thank the programming deities for this one. I've been doing a balancing act with VMs lately and I was about to push my dotfiles to GitHub, but it just seemed like a really bad idea.


I've been doing this for a year or two with dropbox. I've tried vcsh and others and stow has been by far the best so far.


Note that this was an official use case for Stow for at least a year before the linked post:

    This is particularly useful for keeping track of system-wide and per-user installations of software built from source, but can also facilitate a more controlled approach to management of configuration files in the user's home directory, especially when coupled with version control systems.
(from https://www.gnu.org/software/stow), where "especially …" links to http://lists.gnu.org/archive/html/info-stow/2011-12/msg00000....

(There's nothing wrong with re-discovering it, of course!)


Since I have a setup script I run on any new installation, creating symlinks from my git repo isn't a big deal.

https://github.com/aclough/dotfiles/blob/master/setup.sh


I just move stuff to my home folder with a very simple Ruby install script. It checks for the existence of tools; if they aren't there, they will be installed and the config files will be overwritten.

https://gist.github.com/kaspergrubbe/99853334b0e91b540a34


A while back I got very interested in cleaning up my configuration files. I wanted to figure out a way to store all of the configurations I use across multiple machines in a single repository, while at the same time not having to worry about symlinking and installing configurations for things I don't need on a particular machine.

I looked at a lot of different things like Stow and homeshick and the like and really didn't like any of the solutions. So I ended up writing my own tool.

I've wanted to write up something for it for a while now and post it around, but I just haven't yet. Anyway, here's my tool: https://github.com/EvanPurkhiser/dots


Hi, I'm the maintainer of homeshick and would be very interested in knowing which shortcomings of homeshick motivated you to write your own tool. Could you elaborate?


The article doesn't explain how stow 'knows' where to place the symlinks.

The answer is (of course) in the help output:

  -d DIR, --dir=DIR       Set stow dir to DIR (default is current dir)
  -t DIR, --target=DIR    Set target to DIR (default is parent of stow dir)


Recently spent some time organizing my dotfiles and discovered fresh [1], which I've been using to great success, both for servers I manage and on my Mac.

The fact is that most people are going to be reusing (and probably be better off for it) a lot of code from other well-tested dotfiles. `fresh` lets you pick and choose what you want directly from others' repos, in addition to adding whatever you want, and produces a single bundle (or multiple if you need it).

As far as I can tell, it's a self-contained shell script and does not have any dependencies (if that is a concern). It's definitely worth checking out.

[1] https://github.com/freshshell/fresh


I use a similar construct for managing config files (mostly the stuff that goes to /etc/*): put all the config for a specific program (or problem) into a VCS-handled directory, then add a Makefile with an "install" target that copies everything into its place. Using GNU install (instead of ln or cp) allows for copying files (and creating directories) with fine control about permissions and ownership.

This can also be done recursively: toplevel Makefile for a machine's configuration files calls the sub-makefiles that install apache, asterisk, openvpn configurations etc.
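
For illustration, the install(1) calls in such a target might be (paths, modes, and owners are only examples):

    # create the directory and copy the file into place with explicit mode and owner
    install -d -m 755 -o root -g root /etc/openvpn
    install -m 600 -o root -g root server.conf /etc/openvpn/server.conf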


I maintain environments on Windows (Cygwin), Linux, OS X and Solaris machines.

Some of these are work, some are home. Not all files overlap.

Additionally, I have scripts that are only relevant when I have a certain program installed, or a certain source tree available.

I ended up spending a weekend writing my own system that composed a PATH, .bashrc, .bash_profile etc. based on machine configuration. A single git clone isn't quite enough, especially if you don't want to smash everything into one repo (I don't have my home stuff installed at work, for example).


see jackalope's helpful comment -- https://news.ycombinator.com/item?id=8488828


I've been reasonably happy with my solution to this, which is https://github.com/staticshock/seashell

It's a fully contained Makefile that gives my dotfiles repo `make && make install` semantics. `make` initializes whatever needs to initialize locally on that system, and `make install` creates all the necessary symlinks in the $HOME directory.


I created something a while ago (that I haven't updated in a while, I shamefully admit) that also handles dotfiles:

http://configr.io

I also do namespaced dotfile saving (folder by application), but I focused on WGET/CURL-ability (which is unfortunately hard to divine from the current website UI: the curl command is not obvious, as you have to have content-disposition turned on).


Looks useful; I just symlink into a folder on my dropbox, which accomplishes much the same thing with about the same effort.


I prefer dedicated repositories (I use mercurial) for each app (vim, mutt, etc.) or context (X11) that include Makefiles for creating the necessary links or handling some environment-specific details. This still allows me to cherry pick what I want on each machine, but keeps the commit history in the appropriate repo.


Personally I really like https://github.com/thoughtbot/rcm. It is simple, and to the point. Using it is straightforward, and it has allowed me to change machines and migrate my dotfiles seamlessly.


I wrote my own hack to do the same: https://github.com/aksrikanth/settings/blob/master/bin/setup...

It works well enough, and I only have to set it up once.


Stow also goes nicely with managing a directory in Dropbox for syncing game saves, app config data, etc, for apps that don't have any native support for cross-machine syncing but still work well enough when you fake it with symlinks.


I have no desire to use this for dotfiles, but GNU Stow sounds like it solves a real problem for me with the programs I install from source. Being able to uninstall without manually tracking down each file would be wonderful.


For installation from source I use CheckInstall [1]. It will keep track of all the files created during a make install, and create a standard deb or rpm package that can then be installed and removed by the distribution's package manager.

[1] http://www.asic-linux.com.mx/~izto/checkinstall/docs/README
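
Typical usage is roughly (sketch):

    ./configure && make
    sudo checkinstall   # runs "make install", records the files it creates, and builds a .deb/.rpm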


For that use case, I normally use unison: http://www.cis.upenn.edu/~bcpierce/unison/

Then I have a script to automatically sync those files.


I just have a dotfiles repo on github that I clone to any new computer I'm using. Then make a symbolic link between the dotfiles and the repo (`ln -s ~/dotfiles/.whatever ~/.whatever`).

Super simple.


This is almost exactly the same strategy, only stow handles the symlinks for you, so you don't have to go through and `ln -s` all your files/directories. Your dotfiles still go in a git repo.


I put the "ln -s" commands in a shell script.

    git clone ... ~/.my-config
    ~/.my-config/setup-links
Done.

Some servers at work don't have Git installed (and certainly not Stow), so I rsync ~/.my-config from my workstation to my home directory. (There's little need for me to edit my .zshrc on a server.)


RCS is a good option for dotfiles.

RCS is a version control system that is useful for individual files which must remain in a specific location.
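
A minimal sketch of the workflow for a single dotfile (ci and rlog are the standard RCS commands):

    cd ~
    mkdir -p RCS          # optional: keeps the ,v history files out of the way
    ci -l .bashrc         # initial check-in; -l keeps the file checked out and locked
    # ...edit .bashrc...
    ci -l .bashrc         # record a new revision
    rlog .bashrc          # show the revision history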


Maybe I just don't have enough dotfiles, but I just symlink mine into Dropbox.



