Dot File Manager

by Brett on October 17, 2013

Today I finished putting together a dot file manager. This is a tool that helps manage all of the settings and configurations for multiple computers. I’m currently storing configs and settings for vim, sublime, bash, aliases, custom functions, bin files, thunar, diffuse, kupfer and the like. This tool allows building out a new computer with all of my settings very quickly and also helps me keep configs in sync across multiple computers and servers. No more painful copying to USB, emailing, or manually pulling my settings and configs with scp/rsync. YEAH!

The installation of this tool couldn’t be easier with a simple one line command:

bash <(wget -nv -O -

That command will download all settings/configs into a dotfiles directory and then ask if symbolic links should be created. The dot file manager tool (called dotm) can be used by anyone to store their own custom configs (without being forced to use mine). I hope others find it useful; I know I have.

More detailed instructions about this tool can be viewed on my github account:

Below are some features that this tool currently supports. Please feel free to submit suggestions or issues on github.

  • Handles symlinks to files in subdirectories of the dot file directory, matching the directory structure in the user’s home directory. So ~/dotfiles/somedir/.somefile will have a link created at ~/somedir/.somefile.
  • Symlinks to directories in the dot files directory. Any directory name that ends in .lnk will have a corresponding symlink pointing to it from the home directory. So ~/dotfiles/somedir.lnk/ will have a symlink at ~/somedir.
  • Creates backups of files that already exist in the user’s home directory. Places each backup in a user-defined directory (~/dotfiles/backup by default) and appends a timestamp to the filename.
  • Automatically creates symlink for all files in the dot files directory (~/dotfiles by default) and sub directories except for special directories.
  • Custom directory (default ~/dotfiles/custom) to put files that won’t be symlinked.
  • Bin directory (default ~/dotfiles/bin) to put scripts that won’t be symlinked. Will be added to the path via .bashrc.
  • Source directory (default ~/dotfiles/source) to put source files that won’t be symlinked. Will be sourced via .bashrc.
  • Option to ask before creating each symlink.
  • Option to create symlinks from a minimal list. Allowing for only some symlinks to be created on specific servers.
  • Updates dot files from remote repository on each run.
  • Command line options to change default settings (remote repository, dotfile directory, etc).
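The backup-then-symlink behavior described in the feature list can be sketched roughly like this (a minimal illustration, not the actual dotm code; the `link_dotfile` function name and its arguments are my own):

```shell
# Minimal sketch of the backup-then-symlink idea from the feature list above.
# Not the actual dotm code; the function name and arguments are illustrative.
link_dotfile() {
    # Usage: link_dotfile SRC DOTFILES_DIR HOME_DIR BACKUP_DIR
    local src="$1" dotdir="$2" home="$3" backup="$4"
    # Mirror the path relative to the dotfiles dir into the home dir,
    # so dotfiles/somedir/.somefile links from home/somedir/.somefile.
    local dest="$home/${src#"$dotdir"/}"

    mkdir -p "$(dirname "$dest")" "$backup"

    # Back up an existing regular file, appending a timestamp to its name.
    if [ -e "$dest" ] && [ ! -L "$dest" ]; then
        mv "$dest" "$backup/$(basename "$dest").$(date +%s)"
    fi

    ln -sfn "$src" "$dest"
}
```

A real run would walk the dotfiles directory and call something like this for every file outside the special custom/bin/source directories.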


I recently had a task where I needed to export a specific table that was in a few hundred different databases. However, mysqldump does not have a way to specify that a specific table should be dumped out of every database. See the supported formats below:

mysqldump [options] db_name [tbl_name ...]
mysqldump [options] --databases db_name ...
mysqldump [options] --all-databases

I was hoping for a command like: mysqldump --all-databases 'table_name'.

mysqldump does have an --ignore-table option but in my case there were too many different tables to list and I didn’t want to go there.

My next thought was to build a quick PHP script that would loop through every database, check if the desired table exists and then mysqldump it. Before I had the chance to start on this approach I realized I could accomplish this with a one line shell command. The approach I took was the following:

mysql -s -N -e "select TABLE_SCHEMA from information_schema.tables where TABLE_NAME='users'" | xargs -I % sh -c 'mysqldump % users | mysql -uUSERNAME -pPASSWORD -hHOST %'

In the example above, I got a list of all databases (TABLE_SCHEMA) that contained a “users” table. I piped that output to xargs, which runs mysqldump on each matching database’s users table. Lastly, I piped mysqldump’s output to mysql on another server so that the table could be imported in the same step.
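For readability, the same approach can also be written as an explicit loop (a sketch only; USERNAME, PASSWORD, and HOST are placeholders for the destination server’s credentials, and the `dump_table_everywhere` wrapper name is my own):

```shell
# Sketch: a loop form of the one-liner above, wrapped in a function.
# USERNAME, PASSWORD, and HOST are placeholder credentials for the destination server.
dump_table_everywhere() {
    local table="$1" db
    # Find every database that contains the table...
    for db in $(mysql -s -N -e \
        "select TABLE_SCHEMA from information_schema.tables where TABLE_NAME='$table'"); do
        # ...then dump it and load it on the destination server in one step.
        mysqldump "$db" "$table" | mysql -uUSERNAME -pPASSWORD -hHOST "$db"
    done
}
```

Invoked as `dump_table_everywhere users`, this behaves the same as the xargs pipeline but leaves room to add per-database error handling.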


I recently had a task where I needed to quickly start up 50 spot instances that all required an Elastic IP (EIP) address. I initially worked out the steps in the web console and determined I needed to accomplish the following:

  1. Request 50 spot instances based on an existing AMI
  2. Allocate 50 new EIPs
  3. Associate each EIP with one of the newly running spot instances.

Requesting 50 spot instances from the web console was quick and painless. However, the EIP allocation and association quickly became tiresome, as the Amazon web console only allows allocating and associating one EIP at a time. Repeating the following steps 50 times did not seem like a good use of time: requesting a new EIP, determining the appropriate instance ID to associate with it, and then assigning the EIP.

Instead, I decided to quickly put together a few commands to achieve the goal using the AWS API. First I issued a request for 50 instances with a command like the following:

ec2-request-spot-instances ami-1d2b34e5 --price .15 -n50 -s subnet-c1f234ae -t m2.4xlarge --kernel aki-88aa75e1

Of course the above command will need to be modified for each specific case. The specific AMI ID, max bid price, number of instances, subnet, instance type and kernel will all need their respective values modified.

Then I waited until all of the instances were running with a command like the following:

ec2-describe-instances --filter "instance-state-code=16" | grep 'spot' | grep -E '10\.0\.1\.[0-9]{1,3}\s+vpc' | wc -l

The above command lists all of the running instances (instance-state-code=16), limits it to only spot instances and then limits the output to a specific VPC that has an internal address in the 10.0.1.* range.

Once the above command displayed 50 I was ready to start allocating and associating IP addresses. I accomplished this with a combination of commands. I needed to use ec2-allocate-address to request a new EIP and ec2-describe-instances to get a list of instances that need an EIP. Last, ec2-associate-address needed to be used to associate the new EIP with a specific instance ID. The command to accomplish this looked like the following:

for ((i=0; i<50; i++)); do ec2-associate-address -a $(ec2-allocate-address -d vpc | cut -f5) -i $(ec2-describe-instances --filter "instance-state-code=16" | grep 'spot' | grep -E 'monitoring-[a-zA-Z]+\s+10\.0\.1' | cut -f2 | head -n1); done

The above runs the ec2-associate-address command 50 times. Each iteration runs two subcommands: one which requests a new EIP address in the VPC (ec2-allocate-address -d vpc) and one which gets the next running spot instance that does not yet have an EIP (ec2-describe-instances --filter "instance-state-code=16" …).

Last, the new EIPs can be listed with a command like the following:

ec2-describe-instances | grep 'spot' | cut -f17

This worked beautifully for my goal of quickly firing up 50 spot instances and assigning an EIP address. As always, there is room for improvement. If automating something like this on a regular basis, I would suggest taking the one line command and doing more validation on the output of each command. The above assumes that everything is in a good state and that no issues occur in requesting or assigning the EIPs.
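As a rough sketch of what that validation might look like (illustrative only; it wraps the same allocate/describe/associate calls from above in an `allocate_and_associate` function of my own naming, and bails out when a step returns nothing usable):

```shell
# Sketch of the validation idea above: bail out if allocation or instance lookup
# fails, instead of assuming every step succeeds. Illustrative only.
allocate_and_associate() {
    local eip instance

    eip=$(ec2-allocate-address -d vpc | cut -f5)
    if [ -z "$eip" ]; then
        echo "EIP allocation failed" >&2
        return 1
    fi

    instance=$(ec2-describe-instances --filter "instance-state-code=16" \
        | grep 'spot' | grep -E 'monitoring-[a-zA-Z]+\s+10\.0\.1' \
        | cut -f2 | head -n1)
    if [ -z "$instance" ]; then
        echo "no running spot instance without an EIP" >&2
        return 1
    fi

    ec2-associate-address -a "$eip" -i "$instance"
}
```

The loop would then call this function 50 times and stop on the first failure rather than silently continuing.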

Let me know if you find this useful!


HydraIRC / Freenode – Auto Connect, Identify, Join

June 27, 2012

I often join IRC channels where other developers hang out. I’ve found this to be very beneficial in keeping up to speed with changing technologies. I generally stick to two servers, Freenode and OFTC. Freenode is by far my favorite as it seems to be the standard server for other developers to join and create […]

Read the full article →

Disable / Enable Symantec Protection via Command Line

November 5, 2011

On occasion I need to run some software tests where Symantec gets in the way. So I put together a simple batch file that will stop and start Symantec. Just add the following commands to a symantec.bat file. Then you can run the commands symantec start or symantec stop. if "%1" == "stop" ( echo […]

Read the full article →

Mercurial Hook for Syntax Checking (PHP)

October 8, 2010

For those unfamiliar with Mercurial, it is an awesome Source Control Management (SCM) tool. One of my favorite features of Mercurial is that the repositories are distributed which allows each machine to have a full copy of the project’s history. Being distributed has many advantages such as faster committing, branching, tagging, merging, etc. since it […]

Read the full article →

Java Live Messenger (MSN) Robot

September 2, 2010

I recently had a project to set up an Instant Messenger Robot for Windows Live Messenger. An IM robot can have many purposes such as: Keeping track of when contacts are online/offline and when they were last seen. Broadcasting a message to all contacts. Automatically answering common questions. Notifying contacts about new events. A newer site […]

Read the full article →

UltraMon Breaks After Remote Desktop Connection (RDP)

April 6, 2010

I use the application UltraMon to help manage my multiple monitor setup. Overall this application is awesome as it makes moving applications between monitors a breeze and supports a separate task bar on each monitor, among other things. However, I have had this issue for a while where UltraMon will not move applications between monitors […]

Read the full article →

How to Setup BlackBerry (bb) with a Different Ringtone for each Email / Contact

April 2, 2010

The BlackBerry has the ability to set up ringtones for each application and also use different ringtones for each contact / email. These features are great for giving full control over notifications for email, phone call, SMS, MMS, IM, etc. With this article I will explain how to setup both application level notifications, contact/email level notifications […]

Read the full article →

Putty with Tango Look & Feel (using regedit script)

March 23, 2010

I ran into a powershell script that Tomas Restrepo created to set putty up with a better color scheme. The color theme he used was based on Tango. This inspired me to make a few updates to my default configuration for putty. I used the powershell script to update my color theme and then updated […]

Read the full article →