Computing

Quick-n-dirty git getting started guide

Posted in Software Development on September 7th, 2011 by Jeff – Be the first to comment

As a git neophyte I approve of this post:

http://news.ycombinator.com/item?id=2970637

UPDATE: I also found this helpful site: gitref.org

Lighting a fire under WordPress

Posted in Infrastructure Management on September 5th, 2011 by Jeff – Be the first to comment

Since I moved my personal web site from Roller to WordPress a couple of years ago, it has been a dog. After reading an article about a PHP-based web site configured to support 9 million hits per day, and knowing from experience that my site should be significantly faster, I decided it was time to light a fire under WordPress.

(Note that I’ve included gists at the bottom of the article with the important configuration files.)
read more »

Tip for optimizing MySQL data types

Posted in MySQL, Software Development on June 28th, 2011 by Jeff – Be the first to comment

This is a tip that I’ve kept forgetting to write down, so here it is:

During a system’s life cycle, requirements change and components are refactored. This applies to databases too, particularly as data grows. Decisions and assumptions made at the beginning of a system’s life cycle may or may not hold up over years of operation, and it’s good practice to continually analyze how well the initial design is working.
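One concrete way to run that analysis is to compare the range of values a column actually holds against the range its declared type can store. Here is a minimal sketch of the idea in Python; the type names and bounds are the standard MySQL integer ranges, and in practice you would feed it the results of a `SELECT MIN(col), MAX(col)` query (plus headroom for future growth):

```python
# Standard MySQL integer types: (name, signed min, signed max, unsigned max).
MYSQL_INT_TYPES = [
    ("TINYINT", -128, 127, 255),
    ("SMALLINT", -32768, 32767, 65535),
    ("MEDIUMINT", -8388608, 8388607, 16777215),
    ("INT", -2147483648, 2147483647, 4294967295),
    ("BIGINT", -2**63, 2**63 - 1, 2**64 - 1),
]

def smallest_int_type(observed_min, observed_max):
    """Return the narrowest MySQL integer type covering the observed range."""
    unsigned = observed_min >= 0
    for name, smin, smax, umax in MYSQL_INT_TYPES:
        if unsigned and observed_max <= umax:
            return name + " UNSIGNED"
        if not unsigned and smin <= observed_min and observed_max <= smax:
            return name
    raise ValueError("range exceeds BIGINT")
```

For example, a status column whose values never exceed a few dozen distinct codes fits in a `TINYINT UNSIGNED` (1 byte) rather than the default `INT` (4 bytes) — a saving that adds up across millions of rows and their indexes.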
read more »

MySQL udf_median on Windows 7 64bit

Posted in Information Technology, MySQL on May 21st, 2011 by Jeff – 6 Comments

In a minor but ongoing saga of supporting the venerable MySQL UDF function udf_median, I can now add a HOWTO for building it on Windows 7 x64 using Microsoft Visual C++ Express 2010.
read more »

Intercept HTTP requests with Squid

Posted in Systems Administration on April 20th, 2011 by Jeff – Be the first to comment

On one of my projects we had some questions about how much bandwidth was being used by requests to a third party service, but we had no visibility beyond general traffic on the network interface. I hit upon the idea of using a transparent proxy to log requests, then using log analysis to break out data transfer amounts per third party service. And since we already had Squid as part of our infrastructure applications, it seemed like a good choice.

The tricky part of this setup is that everything is hosted on the same hardware node, and we also have some web services that need to be left untouched. These requirements implied some network configuration using iptables to force outbound web requests through the proxy.
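The core of that iptables configuration looks roughly like the fragment below. This is a sketch, not the exact rules from the project: it assumes Squid listens on its default port 3128 and runs as user `proxy` — check your squid.conf and distro packaging for both.

```shell
# Let Squid's own outbound HTTP through untouched, otherwise its
# requests would be redirected back into itself in a loop.
iptables -t nat -A OUTPUT -p tcp --dport 80 \
    -m owner --uid-owner proxy -j RETURN

# Redirect every other locally generated HTTP request into Squid.
iptables -t nat -A OUTPUT -p tcp --dport 80 \
    -j REDIRECT --to-ports 3128
```

Squid itself also needs to be told it is intercepting (rather than being explicitly configured in clients) via the `transparent`/`intercept` option on its `http_port` line, or it will mangle the redirected requests.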
read more »