TL;DR
Do you want to improve your terminal, both in terms of looks and efficiency? Try this Z Shell-based dotfiles repo, described in the step-by-step article.
Do you remember the first day you saw a computer terminal? No matter whether it was Linux or Windows, the confusion it caused was probably huge. All those commands to remember, parameters to pass... just to list your files or check the disk usage. "Who needs that knowledge if I can click all of that?" might have been your first thought. But later it turns out that using a CLI instead of a GUI is way more efficient and, let's admit it, it looks awesome. The main difference between GUI and CLI tools is that GUI belongs to the "learn how to use" category, while CLI is more like "learn how to configure".

Nowadays, in the cloud era, I don't have to worry about storing data like photos, music or code every time I change my computer. There is, however, one thing that I need to set up over and over again: my machine's configuration. Here's where dotfiles repositories come in handy. What's best about dotfiles is that they are not just a backup; the concept is closer to a shared configuration. With a dotfiles repo you can keep the same settings on your computers at home, at the office, on remote servers, or even share them with teammates. Sounds nice, right?

Let me present a short set of useful preferences which you can fork from my repository (version 1.6.0 at the time of writing) and use on your own computer. I've created it for Ubuntu and OSX. All of the installation scripts for these modules back up your current configuration and use symlinks to connect it with the files included in my repository.
First of all, let's run the terminal! On Debian-based distributions the shortcut for that is ctrl + alt + t. If you rather use OSX, press cmd + space, type "terminal" (or iTerm if you use it) inside the prompt and press enter. Now you probably see your default bash shell, like this one:
It's just basic. Nothing less than a fully functional shell and nothing more than a basic tty. Let's change that!

We are going to switch from bash to Z Shell, which can run bash scripts but also do far more interesting things. There are many community frameworks which provide ready-to-work, powerful ZSH configurations. Using one of them is good practice if you don't want to spend much time on configuration, but those solutions are a bit like closed boxes: it's hard to connect them with external plugins. That's why we are going to use ZPlug to integrate with the library of the most popular framework, Oh My Zsh, and in the process with many awesome open source tools from GitHub. Let's start!

Note: before running this script on Ubuntu, please install gawk. It's a ZPlug dependency.

[code language="bash" title="command for automated installation of dotfiles"]
bash -c "$(curl -fsSL https://raw.githubusercontent.com/mkjmdski/.dotfiles/master/install.sh)"
[/code]
Looks better now, right? It's thanks to the spaceship prompt theme, which is my default one (you can change it by setting the ZSH_THEME variable in .zshrc). You can see that it displays useful information like:
It also works well with other virtualenvs, VCS and even AWS profiles. To make it work, my script installs Powerline fonts, a very popular dependency of most ZSH themes. Let's check a bit of the installed configuration now.
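For reference, this is roughly how a prompt theme like spaceship gets wired in through ZPlug. The line below follows the theme's own documentation; the exact entry in my .zplugs.zsh may differ slightly.

[code language="bash" title="loading the spaceship theme via zplug (sketch)"]
# load spaceship as the prompt theme (as documented by spaceship-prompt)
zplug "denysdovhan/spaceship-prompt", use:spaceship.zsh, from:github, as:theme
[/code]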
There are two main files. The first one, .zshrc, is located in your $HOME directory and is responsible for the Z Shell configuration. It's sourced once a user logs in. If you want to share this file with all the other users of the machine, move it to /etc/zshrc. The second file, .zplugs.zsh, is loaded by .zshrc and stores the configuration for ZPlug plugins. If you want to learn more about file sourcing flows in ZSH, check this article on ZSH startup. To see the files loaded by your ZSH, run:

[code language="bash" title="command for tracking zsh sourcing trace"]
zsh -o SOURCE_TRACE
[/code]
Zshrc is an rc file, like bashrc or shellrc. It should contain all PATH definitions, custom aliases and exports of system variables. ZSH provides a really useful variable called commands. You can use it to check whether a program is available on the system. I have used it to resolve custom bin paths to add:

[code language="bash" title="set of $PATH extensions from the zsh/.init directory"]
# Add go binaries
if [ -d "$GOPATH" ]; then
    export PATH="$GOPATH/bin:$PATH"
elif [[ $commands[go] ]]; then
    export PATH="$(go env GOPATH)/bin:$PATH"
fi

# Add yarn global binaries
if [[ $commands[yarn] ]]; then export PATH="$(yarn global bin):$PATH"; fi
[/code]

Here you can see how to set your default language and file editor.

[code language="bash" title="example of system configs from the zsh/.postload directory"]
function _get_editor { echo $(which vim) || echo $(which vi) }
export EDITOR="$(_get_editor)"
export LANG=en_US.UTF-8 # Default language
[/code]
Now let's see some snippets from ZPlug itself. For instance, this code allows zplug to update itself: zplug first downloads its own repository, and later the hook which enables auto-updating is invoked.

[code language="bash" title="running hooks for zplugs"]
zplug 'zplug/zplug', hook-build:'zplug --self-manage'
[/code]

Based on the OS type, this part of the configuration installs the correct gopass binary (I'll talk about gopass itself later) from its GitHub releases (check it to see how the _gopass_release function works).

[code language="bash" title="downloading a precompiled release"]
function _gopass_release {
    [ "$(uname)" = "Linux" ] && echo '*linux*amd64*tar.gz' || echo '*darwin*'
}
zplug "gopasspw/gopass", from:gh-r, as:command, use:"$(_gopass_release)"
[/code]

Here you can see how to load the library from the oh-my-zsh framework together with its minimal configuration (not all of the functions are loaded).

[code language="bash" title="loading and configuring oh my zsh in .zplugs.zsh"]
HIST_STAMPS="mm/dd/yyyy" # variable used in oh-my-zsh/lib/history.zsh
zplug "robbyrussell/oh-my-zsh", use:"lib/{clipboard,completion,directories,history,termsupport,key-bindings}.zsh"
[/code]

The last snippet shows how to load the oh-my-zsh autocompletion plugins (though they don't have to be tied to any framework) based on the programs installed on the current profile.

[code language="bash" title="loading autocompletion plugins"]
zplug "plugins/docker", from:oh-my-zsh, if:'[[ $commands[docker] ]]'
zplug "plugins/docker-compose", from:oh-my-zsh, if:'[[ $commands[docker-compose] ]]'
[/code]
Now that you know how to configure zplugs, I will show you how to keep your software dependencies inside the repository without wasting time on checks at every startup. It's very important, especially if you spawn around one hundred shell sessions daily (like I do).
When you need to get a new executable, software package managers come in handy. Many distributions have their own (like apt for Ubuntu or yum for Fedora/CentOS), but for some time now we have had an awesome cross-platform tool for that. Let me present brew, together with its Linux port, linuxbrew. If you've installed the dotfiles from my script, one of them is already set up on your computer. Now, if you need a new program (e.g. the silver searcher), just add one line to the Brewfile.

[code language="text" title="example of a package included in Brewfile"]
brew "the_silver_searcher"
[/code]

Done! Now when you run brews_install from installers.zsh, your local packages will be checked against the Brewfile and the difference will be installed. If you want to do it on each startup, just set the BREW_UPDATE variable to true in .zshrc.

[code language="bash" title="brews_install code"]
if ! brew bundle check --verbose --file=${DOTFILES}/Brewfile; then
    _log_info "Install missing brew formulas? [y/N]: " # Prompt about installing plugins
    if read -q; then
        echo
        brew bundle install --file=${DOTFILES}/Brewfile
    fi
fi
[/code]
I'd like to present a curated list of plugins I have included with my ZPlug. It's quite possible that you don't need some of them or you'd like to include other ones. Well, that's the best reason why dotfiles should be forked. Most of my ZPlugins are kept as so-called "commands" inside the $ZPLUG_BIN directory. This means that after they are installed they don't have to be loaded (unless they are going to be updated). If you want to do that, set the $ZPLUG_UPDATE variable to true.

[code language="bash" title="loading zplug and its commands"]
if [ ! -d ~/.zplug ]; then
    git clone --depth=1 https://github.com/zplug/zplug ~/.zplug
fi
export ZPLUG_LOADFILE="$DOTFILES/zsh/.zplugs.zsh"
source ~/.zplug/init.zsh
zplug load
if [ "$ZPLUG_UPDATE" = true ]; then
    zplugs_install
    zplug update
fi
[/code]

Luckily, with ZPlug, including or excluding a program is a matter of just a few lines. The configurations presented below correspond to sections in .zplug.zsh commented with the same titles.
Do you like Vim? Well, feel amazed, because with this config you can type your shell commands Vim-style! You can use all three of the Visual, Insert and Normal modes, and the hjkl keys to navigate through the history.
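One way to get this behaviour is the stock oh-my-zsh vi-mode plugin; a minimal sketch of the ZPlug line (not necessarily the exact entry in my .zplugs.zsh) could look like this:

[code language="bash" title="enabling vi-style line editing (sketch)"]
# rebinds the ZSH line editor to vi keymaps (Normal/Insert/Visual, hjkl history navigation)
zplug "plugins/vi-mode", from:oh-my-zsh
[/code]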
I called this set of plugins magic because it really is a magical experience. It adds syntax highlighting to the commands you type, colors your manual pages, offers autosuggestions based on what you have typed before, and lets you search your history with the up/down arrows (or "j"/"k" in vi-mode's normal mode).
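As a hedged sketch, here is how such a set can be declared with ZPlug, using the widely used zsh-users plugins and the oh-my-zsh colored-man-pages plugin; my .zplugs.zsh may use a slightly different selection.

[code language="bash" title="a possible 'magic' plugin set (sketch)"]
zplug "zsh-users/zsh-syntax-highlighting", defer:2   # colors commands as you type
zplug "zsh-users/zsh-autosuggestions"                # suggests commands from your history
zplug "zsh-users/zsh-history-substring-search"       # search history with the up/down arrows
zplug "plugins/colored-man-pages", from:oh-my-zsh    # colorized manual pages
[/code]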
There are two basic commands in *nix systems: ls and cd. Forget about them. Now, with exa/colorls and autojump, browsing your files is way easier. Exa colors your ls output and gives you tree functionality. Autojump remembers how you change your directories, so later you just need to type j <directory name> and the program finds the directory automatically; you don't need to go through the full file tree again.
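For example (assuming exa and autojump are already on your PATH; the directory name is just an illustration):

[code language="bash" title="exa and autojump in practice (sketch)"]
exa -la --git          # long listing with a git status column
exa --tree --level=2   # directory tree, two levels deep
j grayl                # autojump: jumps to the best match, e.g. ~/Documents/devops/graylog
[/code]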
I assume you have tried to parse your outputs many times with grep, awk and other GNU tools. That's fine, especially as it is the standard approach, and when you are inside a container you probably don't have any other option. But on your own computer? Say hello to peco and jq. Peco lets you perform an easy search on any output without writing complex regular expressions, while jq is a great tool for getting data out of JSON.
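A quick illustration of both; the JSON key below is just an example:

[code language="bash" title="filtering output with peco and jq (sketch)"]
docker ps | peco                      # interactively pick a line from the docker ps output
history | peco                        # fuzzy-search your whole command history
cat package.json | jq '.dependencies' # pull a single key out of a JSON file
[/code]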
Last but not least, a tool which I have included in my setup is gopass. This program, written in Go, is a modern version of the good old pass. It allows teams to share secrets kept in git repositories and encrypted with GPG. That's great when you need to maintain sensitive data but don't necessarily want to set up a separate vault for it.
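Basic usage looks like this (the secret names are made up for illustration):

[code language="bash" title="basic gopass usage (sketch)"]
gopass insert teams/staging/db-password # add a new secret (prompts for the value)
gopass show teams/staging/db-password   # decrypt and print it
gopass ls                               # list the whole store
[/code]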
Some software can't be installed from brew or the zplug integration with GitHub. I created a function called gems_install which checks your local gems and installs the missing ones. If you want to run it on each startup, just set the GEMS_UPDATE variable to true. What's best, you can adjust this function for npm, composer or any other package manager of your choice, as sketched after the snippet below.

[code language="bash" title="automatic update of declared ruby gems"]
_log_info "Checking installed gems..."
#### DECLARE GEMS TO CHECK
local -a gems=(
    colorls
)
local -a not_installed_gems
#### CHECK WHICH GEMS ARE NOT INSTALLED
for gem in "${gems[@]}"; do
    if ! gem list -i "${gem}" &> /dev/null; then
        not_installed_gems+=("${gem}")
    fi
done
#### PROMPT ABOUT INSTALLING ALL GEMS
if [ ${#not_installed_gems[@]} -gt 0 ]; then
    echo "${not_installed_gems[@]}"
    _log_info "Install missing gems? [y/N]: "
    if read -q; then
        echo
        for gem in "${not_installed_gems[@]}"; do
            gem install --user-install "${gem}"
        done
    fi
else
    _log_info "Gems dependencies satisfied."
fi
[/code]
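For instance, a hedged sketch of the same pattern adapted to global npm packages could look like the snippet below; the package name is just an example and this function is not part of my repo.

[code language="bash" title="the same pattern adapted for npm (sketch)"]
#### DECLARE GLOBAL NPM PACKAGES TO CHECK (example names)
local -a npm_packages=(
    typescript
)
local -a not_installed_packages
#### CHECK WHICH PACKAGES ARE NOT INSTALLED GLOBALLY
for package in "${npm_packages[@]}"; do
    if ! npm list -g --depth=0 "${package}" &> /dev/null; then
        not_installed_packages+=("${package}")
    fi
done
#### PROMPT BEFORE INSTALLING
if [ ${#not_installed_packages[@]} -gt 0 ]; then
    echo "${not_installed_packages[@]}"
    _log_info "Install missing npm packages? [y/N]: "
    if read -q; then
        echo
        for package in "${not_installed_packages[@]}"; do
            npm install -g "${package}"
        done
    fi
fi
[/code]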
Git is a great tool which nowadays ships with nearly every operating system. It's ready to work out of the box, but this master of VCS can also benefit from some additional, craftsman-level setup. Enter .dotfiles/git and run install.sh. Now both global.gitconfig and global.gitignore are included in your local configuration.
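Under the hood, this kind of linking typically relies on git's include mechanism. A hedged sketch of what could end up in your ~/.gitconfig is shown below; the paths are assumptions, so check the repo's install.sh for the real ones.

[code title="including the shared config in ~/.gitconfig (sketch)"]
[include]
    path = ~/.dotfiles/git/global.gitconfig
[core]
    excludesfile = ~/.dotfiles/git/global.gitignore
[/code]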
Let's say that you need to clone a repository from Bitbucket, you know the path, but you are too lazy to write the full URL or to find it in the browser.

[code title="short urls for git"]
[url "git@bitbucket.org:"]
    insteadOf = bb:
[/code]

Now you can just type in your shell:

[code language="bash" title="using short urls for git"]
git clone bb:apptension/project.git
[/code]

Another example: it's good practice to always rebase during a pull, because it avoids artificial merge commits in your history. Tired of typing git pull --rebase all the time?

[code title="always rebase set in gitconfig"]
[pull]
    rebase = true
[/code]

Done! Tired of pushing tags in a separate command?

[code title="always push tags"]
[push]
    followTags = true
[/code]
Git allows us to create aliases. It's a really nice utility if you know your most-used git combinations. As a DevOps engineer I have to init repositories with some infrastructure setup. After generating the templates I just run git this. What is git this?

[code title="example of useful git alias"]
this = !git init && git add -A && git commit -m \"Initial commit.\"
[/code]

Or let's say it's time to clean up your work before pushing commits to the remote.

[code title="other aliases"]
undo = reset --soft HEAD^
amend = commit --amend --no-edit
log-line = log --oneline --graph --decorate
[/code]
Git smoothly integrates with external tools. In the zplug setup you can find a tool called icdiff. It's my replacement for the standard git diff command, whose output is pretty unreadable. By using the git difftool variables, icdiff can analyze data the same way git does, but present it in a more readable way.

[code title="configuration for git difftool"]
[diff]
    renames = true
    tool = icdiff
[difftool]
    prompt = false
[difftool "icdiff"]
    cmd = icdiff --line-numbers $LOCAL $REMOTE
[/code]

Let's check the results of this integration!
We can do the same integration for mergetool. I will use PyCharm because I love VCS tools shipped by JetBrains.

[code title="configuration for git mergetool"]
[merge]
    tool = pycharm
[mergetool "pycharm"]
    cmd = /usr/local/bin/charm merge "$LOCAL" "$REMOTE" "$BASE" "$MERGED"
[/code]

Done!
To make it clear: I'm not going to try to convince anybody that Vim is better than graphical IDEs. Both have their own advantages and disadvantages, and that's probably why JetBrains has a plugin for using Vim inside their tools, and why there are projects that add a graphical interface to Vim. But there are use cases where I prefer Vim over VS Code, and as a newbie I wanted a slightly friendlier experience with this program. If you want to edit a Python file in plain Vim, you will see something like this:
It ain't user-friendly at all. And that's probably why many people don't like Vim in the first place. But luckily the community around Vim is huge, and those people have developed many useful plugins. To manage the add-ons we obviously need some plugin manager; in this case it's Vundle. My installation script will link .vimrc to the one in the repository and run Vundle to install all the plugins. After this process, the same Python file looks like this:
Syntax highlighting, an error console, line numbers; you can even scroll it with your mouse! As a terminal editor it's way more useful now, isn't it?

So when do I actually want to use Vim? Mostly for SSH purposes. When you need to edit a remote server's configuration, you are going to end up in Vim or nano anyway. It's just better and easier to do it with your local setup. All you need to do is execute a command like this one:

[code language="bash" title="using vim for editing remote files"]
vim scp://remoteuser@server.tld//absolute/path/to/document
[/code]
Configuring this tool is actually a piece of cake. Most plugin managers for Vim are compatible with most implementations of Vim, and using them is really easy. You just need to find the GitHub repository with the plugin you need and stick to this simple schema:

[code title="vundle vim plugins configuration"]
set nocompatible " be iMproved, required
filetype off     " required
" set the runtime path to include Vundle and initialize
set rtp+=~/.vim/bundle/Vundle.vim
call vundle#begin()
" alternatively, pass a path where Vundle should install plugins
" call vundle#begin('~/some/path/here')

" let Vundle manage Vundle, required
Plugin 'VundleVim/Vundle.vim'
Plugin 'bash-support.vim'

" All of your Plugins must be added before the following line
call vundle#end()         " required
filetype plugin indent on " required
[/code]

Also remember that every command you can execute inside Vim, like this one for showing line numbers:

[code title="example command run inside vim"]
:set number
[/code]

can be added to your .vimrc to be loaded on program start.
So now, when we know how to set up a good Vim configuration, it's time to do the same thing for the notepad. Nowadays there are three popular lightweight, hackable code editors for programmers: Visual Studio Code from Microsoft, Atom from GitHub, and Sublime. I have tried all of them and my personal choice is VS Code. It starts up fast even with many extensions loaded, it's open source, and it has many useful plugins from the community. All settings are kept in a single JSON file. For example, this is how you turn on autosave (just like in the heavy JetBrains software):

[code title="files autosaving configuration for vs code"]
{
    "files.autoSave": "afterDelay",
    "files.autoSaveDelay": 1000
}
[/code]

If you want to link the settings and snippets directory of your VS Code (VS Code allows you to create predefined code blocks) to the repository, just enter the vscode directory in dotfiles and run install.sh. This will also prompt you about installing the extensions listed in installed_vs_extensions. So now you probably think: ok, so you want me to add each extension to this file whenever I install one?

No!

This is where git hooks come to work. Git hooks are scripts evoked after specific git actions, designed to help developers maintain the automatically generated parts of a repository (or automate actions). Those files are kept inside the .git/hooks folder, so you can't push them directly to the remote, but you can still set them up explicitly by running install.sh inside the .hooks/ directory. I have designed three hooks:
Normally, in basic terminal emulators, you have one shell session per tab. This is not efficient, because you need to switch between tabs to control those sessions. Let's change that! For Linux users I recommend installing terminator, and for Mac users iTerm 2. Terminal freaks could also use tmux (see the sketch below). Now you are able to manage a few sessions in one tab, turning your terminal into a really powerful tool:
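If you go the tmux route, the basic workflow boils down to a handful of commands and default key bindings:

[code language="bash" title="tmux basics (sketch)"]
tmux new -s work    # start a named session
# inside tmux: Ctrl-b % splits the pane vertically, Ctrl-b " splits it horizontally
# Ctrl-b + arrow keys move between panes, Ctrl-b d detaches
tmux attach -t work # re-attach to the session later
[/code]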
Let me show you a few tricks I use when working with my setup daily:
[code language="bash" title="snippets for changing directories"]cd /Users/me/Documents/devops/graylog #it goes to the directory/Users/me/Documents/devops/graylog #it goes to the directory in oh my zshj graylog #autojump finds this directory if you’ve been there before- # previous directory3 # three times previous directory… # two directories up…. # three directories upmkdir -p /usr/newbin/{groovy_bins,ruby_bins} #create two directories inside newbin[/code]
[code language="bash" title="snippets for extracting archives"]tar -zxvf archive.tar.gz #it unpacks tarball in debiangunzip archive.tar.gz | tar xopf - #it unpacks tarball in osxx archive.tar.gz #it unpacks tarball on bothx archive.zip #works with zip, rar and others toocat file # probably that’s how you get file content to copy it inside clipboardclipcopy file #not anymore# wanna paste?clippaste[/code]
Let's say you need to set up three identical servers. The first one you want to configure manually, to learn the environment, but the other two will be provisioned automatically. Use the script command!

[code language="bash" title="creating a shell commands cache record"]
script my.terminal.session
echo $PWD
date
sudo apt install -y gawk
[/code]

Now you can review all of your steps by simply running:

[code language="bash" title="running the commands cache"]
cat my.terminal.session  # the whole session redirected to output
less my.terminal.session # see the last steps of your session
more my.terminal.session # see the first steps of your session
[/code]
Now, when you have the zsh magic, this won't be hard, because autosuggestions and history search help you a lot. Still, there are a few standard shell tricks which can improve how you provide input to the shell.

[code language="bash" title="recreating commands from history"]
!!      # last command
!1345   # command number 1345 from your history
!docker # last command which contains the word docker
$?      # returns the exit code of the last command
!$      # last word from the last command; useful when you want to open the same file with a different program
[/code]

Thanks for your attention! Remember about the last command.

[code language="bash" title="see you!"]
leave +0330 # reminds you that you have to leave in 3 hours and 30 minutes
[/code]