I love using the Git branching model outlined by Vincent Driessen. My flow is almost identical to what he has described, except for where the feature branches are stored. Specifically, he states:

Feature branches typically exist in developer repos only, not in origin.

I’m a little less trusting with feature branches and instead often push them to the central remote repository. I fear that days of work could be lost by keeping the work solely in my local repository until it is ready to be merged back into the develop branch. My repository could become corrupted, my hard drive could die or my computer could be stolen, any of which would cause lost work. I don’t like to leave my hard work to Murphy’s law.

Now, one could use a backup strategy to mitigate this issue, and I highly recommend doing so. I have used CrashPlan for years and it has saved me a few times, making it worth every penny. However, I also like to know there is a central Git repository holding my code (again, a trust issue). Plus, some companies will not allow their code to be placed on any server outside of their network.

Since my entire team and I have chosen to place our “feature-*” branches on our central remote repository, a small issue has emerged: over time we end up with a lot of old and forgotten branches, both on the remote repository and on our local machines. Fortunately, Git is great at removing old branches.

Delete Local & Remote Branches

We can remove the remote branch by using the command:

git push origin --delete <branchName>

Then we can remove the local branch with the command:

git branch -d <branchName>

Great, problem solved: I can run those two commands each time I’m done working on a feature. But wait, I don’t want to remember two commands when I could run just one. So, I put together a couple of Bash functions that make this easier. Just drop these functions in your ~/.bashrc file, or in the appropriate dotfiles directory if you happen to be using my Dot File Manager.

function confirm () {
    # call with a prompt string or use a default
    read -r -p "${1:-Are you sure? [y/N]} " response
    case $response in
        [yY][eE][sS]|[yY]) true ;;
        *) false ;;
    esac
}

function git-delete-branch () {
    if [ "$#" -ne 1 ]; then
        printf "Usage: $FUNCNAME branchName\nWill delete the specified branch from both the local repository and remote\n";
        return 1;
    fi
    echo "Delete the branch '$1' from your local repository?" && confirm && git branch -d "$1";
    echo "Delete the branch '$1' from the remote repository?" && confirm && git push origin --delete "$1";
}

After adding these functions, don’t forget to use the source command (e.g. source ~/.bashrc) to load them into your shell.

Now a branch can easily be deleted from both the local and remote repository, or just one of them, by executing the command:

git-delete-branch <branchName>

Delete Merged Branches

Since we have multiple developers all pushing their branches to the central repository, we often find that over time we accumulate a long list of orphaned branches (branches that no one is working on anymore). An easy way to clean up most of these branches starts with the following command, which shows the branches that have already been merged into your current branch.

git branch --merged

Then each remote branch can be manually deleted with the command:

git push origin --delete <branchName>

Again, I like things simple so I put together the following function to streamline the process:

function git-delete-merged-branches () {
  # lists merged feature branches (skipping the current branch, which
  # git marks with "* ") and deletes them locally and from origin
  echo && \
  echo "Branches that are already merged into $(git rev-parse --abbrev-ref HEAD) and will be deleted from both local and remote:" && \
  echo && \
  git branch --merged | grep feature | grep -v '^\* ' && \
  echo && \
  confirm && git branch --merged | grep feature | grep -v '^\* ' | xargs -n1 -I '{}' sh -c "git push origin --delete '{}'; git branch -d '{}';"
}

Make sure you also include the confirm function pasted above.
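To see what the grep filter passes through, here is a quick simulation with made-up branch names. Note that git branch --merged indents each branch name with two spaces and marks the current branch with an asterisk; xargs -I strips the leading blanks, so the indentation is harmless.

```shell
# Hypothetical `git branch --merged` output; the real function pipes the
# git command itself. grep keeps only the feature branches.
printf '* develop\n  feature-login\n  feature-search\n' | grep feature
```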

Another Tool

Another possible solution to the problem stated above is to use the git-flow tool. At first glance, it seems to be a handy tool for automating many of the tasks in the GitFlow process.

However, in my short experience with it, I found it doing things I didn’t expect. I then checked the GitHub repository and noticed an unhealthy number of open issues (175) and pull requests (78), with the last commit dating to September 2012. This project is officially dead in my book.

More recently, I found out that there is a fork of this project that is being kept up to date. It’s called git-flow (AVH Edition). This one might have more potential.

However, I think the GitFlow approach is easy enough to carry out with standard Git commands that I have not yet taken the time to test this newer git-flow (AVH Edition) tool.

Have you used the newer git-flow tool? What are your experiences with it? Do you have any other git commands or tools that have helped with your day-to-day Git operations?


I’ve had my blog running on port 80 for years and have finally decided it is time to deprecate HTTP and move everything to a secure SSL connection. This decision was a lot easier to make now that Let’s Encrypt is providing free SSL certificates and has been out of beta since April. I also appreciate that the entire installation can be done via the command line and that the certificate can be renewed automatically a month before it expires. Wahoo, no more pesky calendar reminders telling me to hurry up, buy a new certificate before the old one expires and manually install it.

With that said, my blog currently runs on CentOS 6 with Apache, with vhost files placed in a non-standard directory by DirectAdmin. This means I will have to manually add the certificate information to the vhost file for each host instead of letting Let’s Encrypt do all the work for me. That’s OK though, as I will only have to do this once.

Also, CentOS 6 throws a little curve ball, as it doesn’t have Python 2.7 set up by default and depends on Python 2.6 for yum. So, care must be taken to get both versions coexisting on the machine.

Here are the steps needed to set up Let’s Encrypt. First, we need to set up the IUS repository with the following commands:

wget https://centos6.iuscommunity.org/ius-release.rpm
sudo rpm -Uvh ius-release*.rpm
rm ius-release.rpm

Then, we need to get Python set up:

sudo yum update
sudo yum install centos-release-scl python27 python27-devel python27-pip python27-setuptools python27-virtualenv

Next, we will install pip, as that gives us an easy way to install Let’s Encrypt and update it in the future.

sudo easy_install-2.7 pip

Now we will install Let’s Encrypt (which also goes by the name certbot):

sudo pip2.7 install letsencrypt letsencrypt-apache

If you happen to have a very vanilla Apache setup and are running Debian, then the following command to generate and install the certificate may magically set up everything for you. This was not the case for me, so I didn’t use this step.

sudo certbot --apache -d brett.batie.com

Or, you could try being more specific about where your Apache config files are located, as I did in the following command. However, at the time of this writing this does not work if your vhost file has more than one vhost in it. So, I didn’t use this step either.

sudo certbot --apache --apache-server-root /etc/httpd/ --apache-vhost-root /usr/local/directadmin/data/users/admin/httpd.conf -d www.brett.batie.com

The route that I actually took was to use the following command, which only generates the certificate. This is the webroot approach, which differs from the Apache approach above.

sudo certbot certonly --webroot --webroot-path /home/admin/domains/brett.batie.com/public_html/ -d brett.batie.com
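As an aside, if the site should also answer on the www name, certbot accepts multiple -d flags in a single run and puts all the names on one certificate (whether you actually serve the www subdomain is an assumption here):

```shell
sudo certbot certonly --webroot \
  --webroot-path /home/admin/domains/brett.batie.com/public_html/ \
  -d brett.batie.com -d www.brett.batie.com
```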

This provided output like the following:

 - Congratulations! Your certificate and chain have been saved at
   /etc/letsencrypt/live/brett.batie.com/fullchain.pem. Your cert will
   expire on 2016-09-01. To obtain a new or tweaked version of this
   certificate in the future, simply run certbot again. To
   non-interactively renew *all* of your certificates, run "certbot
   renew"
 - If you like Certbot, please consider supporting our work by:

   Donating to ISRG / Let's Encrypt: https://letsencrypt.org/donate
   Donating to EFF: https://eff.org/donate-le

Sweet, I have a certificate! Now I just have to tell Apache to use it. My vhost file is located at /usr/local/directadmin/data/users/admin/httpd.conf and I need to add the following four lines to the appropriate vhost in that file.

Include /etc/letsencrypt/options-ssl-apache.conf
SSLCertificateFile /etc/letsencrypt/live/brett.batie.com/cert.pem
SSLCertificateKeyFile /etc/letsencrypt/live/brett.batie.com/privkey.pem
SSLCertificateChainFile /etc/letsencrypt/live/brett.batie.com/chain.pem

Now do a graceful restart of Apache and the new SSL certificate should be up and running.

sudo service httpd graceful

You can load the site in your browser to see if it is using the new SSL certificate as well as test it at SSL Labs to see if you received a passing grade.

Since I decided to completely remove HTTP (port 80) from my site, I also added the following redirect, which can be placed in the appropriate port-80 vhost (preferred) or in a .htaccess file (less preferred).

Redirect / https://brett.batie.com/
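For reference, the whole port-80 vhost can then be reduced to little more than the redirect. A minimal sketch (the ServerName and the surrounding vhost structure are assumptions; the DirectAdmin-generated vhost will contain more directives):

```apache
<VirtualHost *:80>
    ServerName brett.batie.com
    # Permanent (301) redirect so clients and search engines update
    Redirect permanent / https://brett.batie.com/
</VirtualHost>
```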

Now, we just need to automate the certificate renewal by adding the following to a cronjob:

@monthly /usr/bin/certbot renew
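One caveat with the certonly/webroot route: certbot renew only refreshes the files under /etc/letsencrypt, and Apache will keep serving the old certificate until it is reloaded. If your certbot release supports hooks, the cron entry can handle the reload as well (a sketch; check certbot --help for the hook flags available in your version):

```shell
@monthly /usr/bin/certbot renew --post-hook "service httpd graceful"
```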

As you can see, there were a few steps involved in this process, but overall I found it easier than generating certificate requests, verifying domain ownership, receiving an email with the certificate, tracking down the entire certificate chain and then manually putting all the files where they needed to go. The fact that Let’s Encrypt is free and simplifies the process makes it a no-brainer, and it should help make the Internet a little more secure.


Dot File Manager

by Brett on October 17, 2013

Today I finished putting together a dot file manager: a tool that helps manage all of the settings and configurations for multiple computers. I’m currently storing configs and settings for vim, Sublime, Bash, aliases, custom functions, bin files, Thunar, Diffuse, Kupfer and the like. This tool allows building out a new computer with all of my settings very quickly and also helps me keep the configs of multiple computers and servers in sync. No more painful copying to USB, emailing or using scp/rsync to manually pull my settings and configs. YEAH!

The installation of this tool couldn’t be easier, with a simple one-line command:

bash <(wget -nv -O - https://raw.github.com/brettbatie/dotfiles/master/bin/dotm)

That command will download all settings/configs into a dotfiles directory and then ask if symbolic links should be created. The dot file manager tool (called dotm) can be used by anyone to store their own custom configs (without being forced to use mine). I hope others find it useful; I know I have.

More detailed instructions about this tool can be viewed on my GitHub account: https://github.com/brettbatie/dotfiles

Below are some features that this tool currently supports. Please feel free to submit suggestions or issues on GitHub.

  • Handles symlinks to files in subdirectories of the dot file directory, matching the directory structure in the user’s home directory. So ~/dotfiles/somedir/.somefile will have a link created at ~/somedir/.somefile.
  • Handles symlinks to directories in the dot files directory. Any directory name that ends in .lnk will have a corresponding symlink pointing to it from the home directory. So ~/dotfiles/somedir.lnk/ will have a symlink at ~/somedir.
  • Creates backups of files that already exist in the user’s home directory. Places backups in a user-defined directory (~/dotfiles/backup by default) and appends a timestamp to the filename.
  • Automatically creates symlink for all files in the dot files directory (~/dotfiles by default) and sub directories except for special directories.
  • Custom directory (default ~/dotfiles/custom) to put files that won’t be symlinked.
  • Bin directory (default ~/dotfiles/bin) to put scripts that won’t be symlinked. Will be added to the path via .bashrc.
  • Source directory (default ~/dotfiles/source) to put source files that won’t be symlinked. Will be sourced via .bashrc.
  • Option to ask before creating each symlink.
  • Option to create symlinks from a minimal list. Allowing for only some symlinks to be created on specific servers.
  • Updates dot files from remote repository on each run.
  • Command line options to change default settings (remote repository, dotfile directory, etc).
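The path mirroring described in the first bullet can be sketched as a tiny shell computation (the paths are illustrative and the logic is a simplification, not code from the actual dotm script):

```shell
# Given a file inside ~/dotfiles, compute where its symlink belongs:
# strip the dotfiles prefix, then re-root the relative path under $HOME.
dotdir="$HOME/dotfiles"
file="$dotdir/somedir/.somefile"
link="$HOME/${file#$dotdir/}"
echo "$link"
```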


Mysqldump Specific Table From All Databases

January 22, 2013

I recently had a task where I needed to export a specific table that was in a few hundred different databases. However, mysqldump does not have a way to specify that a specific table should be dumped out of every database. See the supported formats below:

mysqldump [options] db_name [tbl_name …]
mysqldump [options] --databases db_name […]

Read the full article →

Automatically Allocate IP Address to AWS Instances

December 31, 2012

I recently had a task where I needed to quickly start up 50 spot instances that all required an Elastic IP (EIP) address. I initially worked out the steps in the web console and determined I needed to accomplish the following: Request 50 spot instances based on an existing AMI Allocate 50 new EIPs Associate […]

Read the full article →

HydraIRC / Freenode – Auto Connect, Identify, Join

June 27, 2012

I often join IRC channels where other developers hang out. I’ve found this to be very beneficial in keeping up to speed with changing technologies. I generally stick to two servers Freenode and OFTC. Freenode is by far my favorite as it seems to be the standard server for other developers to join and create […]

Read the full article →

Disable / Enable Symantec Protection via Command Line

November 5, 2011

On occasion I need to run some software tests where Symantec gets in the way. So I put together a simple batch file that will stop and start Symantec. Just add the following commands to a symantec.bat file. Then you can run the commands symantec start or symantec stop. if "%1" == "stop" ( echo […]

Read the full article →

Mercurial Hook for Syntax Checking (PHP)

October 8, 2010

For those unfamiliar with Mercurial, it is an awesome Source Control Management (SCM) tool. One of my favorite features of Mercurial is that the repositories are distributed which allows each machine to have a full copy of the project’s history. Being distributed has many advantages such as faster committing, branching, tagging, merging, etc. since it […]

Read the full article →

Java Live Messenger (MSN) Robot

September 2, 2010

I recently had a project to set up an Instant Messenger Robot for Windows Live Messenger. An IM robot can have many purposes, such as: Keeping track of when contacts are online/offline and when they were last seen. Broadcasting a message to all contacts. Automatically answering common questions. Notifying contacts about new events. A newer site […]

Read the full article →

UltraMon Breaks After Remote Desktop Connection (RDP)

April 6, 2010

I use the application UltraMon to help manage my multiple monitor setup. Overall this application is awesome as it makes moving applications between monitors a breeze and supports a separate task bar on each monitor, among other things. However, I have had this issue for a while where UltraMon will not move applications between monitors […]

Read the full article →