Wednesday, January 22, 2014

GeoGraphing with R; Part 1: Zipcode Mapping

I'd like to share some graphing work I've done with the R programming language.  I have been interested in R for a few years now, and have enjoyed the extremely intuitive platform it provides for data analysis.  Although I don't make much use of the powerful statistical tools R provides, I've found that its flexibility is the real charm of R: it provides a platform for nearly any use you could need, with an interface as intuitive as Python's.  I keep R on my personal Ubuntu and Windows machines, use it at work, and have even installed R on my Raspberry Pis.

I am a big fan of the RStudio IDE, which adds editing and data/file management services to the utilitarian basic R GUI.  I have also tested similar code on a Raspberry Pi, where R installs with a simple call to apt-get.


After seeing a demonstration of some of the geographical features of Tableau (GIS-lite within their visual analytics platform), I was inspired to experiment with mapping visuals, for free.

Using the wonderful wealth of user packages, I was able to get started on this quickly using some tutorials and documentation I found.  I am especially indebted to Jeffrey Breen, the creator of the zipcode package, whose tutorial I found immensely helpful in creating this particular chart.  This charting program is built around plotting latitude and longitude points against a contiguous United States map defined by state borders.  Since both datasets use the same latitude/longitude coordinate system, the points line up exactly with the borders.


This particular chart is a version of a project I created for work, plotting the locations of bank branches for the top five banks by number of branches.  In an era of thin branch banking, deep networks of brick and mortar branches aren't always considered key to retail banking success, but this type of analysis is still useful.  This program is based on the publicly available branch location data from the FDIC, downloaded as CSV files and parsed by R into data.frame objects.  I have yet to find a public API for this data; bonus points to anyone who has.

The code below makes use of the zipcode package mentioned above as well as the ever-useful ggplot2 graphing library (the borders() layer also relies on the maps package).  It is ready to run on any R platform with these packages installed.


#Load needed libraries (install them first with install.packages() if needed; note that zipcode is used for a dataset)
library(zipcode)
library(ggplot2)
data(zipcode)
 
#Read and format .csv's downloaded from the FDIC 
#Source http://research.fdic.gov/bankfind/
#csv's were renamed to the stock ticker of each bank but are otherwise unchanged
#The raw csv's include 7 rows of metadata, this is removed allowing row 8 to be used as headers
#Since Zip and Bank are all we care about, for now other headers are ignored
#Bank name is added to allow aggregation by entity later
#I've created a quick function for importing the data
readBank <- function(filename) {
  bank <- read.csv(paste(filename,".csv",sep=""), header=TRUE,skip=7)
  bank$Bank <- filename
  bank
}
WFC <- readBank("WFC")
JPM <- readBank("JPM")
BAC <- readBank("BAC")
USB <- readBank("USB")
PNC <- readBank("PNC")
 
#Concatenate bank files together
top5 <- rbind(WFC,JPM, BAC, USB, PNC)
#merge five bank set with zipcode to make mapping possible
top5Zip <- merge(zipcode,top5, by.x= "zip",by.y ="Zip" ) 
 
#Much of the following has been taken from Jeffrey Breen at http://jeffreybreen.wordpress.com/2011/01/05/cran-zipcode/
#Begin building the plot.  Colors denote bank names.  "size" is increased to enhance the final plot
g <- ggplot(data=top5Zip) + geom_point(aes(x=longitude, y=latitude, colour=Bank), size = 1.25)
 
#Simplify display and limit to the "lower 48"
#Some banks have Alaska branches (specifically Wells Fargo in this data), this is included, but ignored by the ggplot
g <- g + theme_bw() + scale_x_continuous(limits = c(-125,-66), breaks = NULL)
g <- g + scale_y_continuous(limits = c(25,50), breaks = NULL)
 
#Don't need axis labels
g <- g + labs(x=NULL, y=NULL)
g <- g + borders("state", colour="black", alpha=0.5)
g <- g + scale_color_brewer(palette = "Set1")
#Arbitrary title
g <- g + ggtitle("Top Five Banks by Number of Branches") + theme(plot.title = element_text(lineheight=.8, face="bold"))
g <- g + theme(legend.direction = "horizontal", legend.position = "bottom", legend.box = "vertical")
g

Following the creation of this plot I usually use the ggplot2 ggsave feature to save the plot to an image file:
ggsave("branches5.png", plot=g)


The resulting plot:


As the simplicity of the merge statement suggests, you could substitute nearly any zipcode-based data.  Other charts I've created have included asset locations and temperature data.
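For example, here is a minimal sketch of swapping in a hypothetical zip-level temperature table; the temps data frame and its column names are made up purely for illustration, but the merge pattern is identical:

#Hypothetical zip-level data: any data.frame with a zip column works
temps <- data.frame(Zip = c("55401", "10001", "94103"),
                    avgTempF = c(15, 28, 51),
                    stringsAsFactors = FALSE)
#Same merge pattern as the bank data: attach latitude/longitude by zipcode
tempZip <- merge(zipcode, temps, by.x = "zip", by.y = "Zip")
#Color by the measurement instead of by bank name
ggplot(data = tempZip) +
  geom_point(aes(x = longitude, y = latitude, colour = avgTempF), size = 1.25)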


As a preview, the next R GeoGraphing post will focus on state-level mapping data and will include some animation tricks.

Sunday, January 19, 2014

Favorite Tools: FRED and St. Louis Fed Research Tools

I'd like to use this series as a set of love notes on my favorite data tools.  Some of these I use almost constantly at work, others are personal favorites I have come across.


FRED is a tool I came across a few years ago while reading economics blogs.  The distinctive color of a standard FRED graph (with obligatory recession shading) was something I began to associate with the econ blogger crowd.  It seems this has been noticed by many; Paul Krugman, whose blog was one of the first places I noticed FRED, is quoted as saying "I think just about everyone doing short-order research — trying to make sense of economic issues in more or less real time — has become a FRED fanatic."

After using these tools at work and home I have come to feel the same way about the tool, even evangelizing its merits to my coworkers and friends.

FRED graphs are distinctive and immediately recognizable


In my work in data analysis at a national bank, I have come to greatly value FRED for two main reasons: it is a singularly well-organized and well-populated database, and it allows immediate reference to data that is often useful in a one-off fashion.  Pulling this data out during a meeting has more than once earned some recognition of my economic knowledge that might not have otherwise occurred.

The breadth of data available is somewhat astounding.  International data might usually take you all over the web and to a few commercial sites, but FRED has enough to do most high-level macroeconomic survey work.  I find the somewhat more obscure metrics very interesting at times, and it's fun to eyeball them for trends.

It's too easy to make weird charts...


After discovering FRED's website I was ecstatic to find that an Excel Add-In had been developed.  I immediately made use of the feature and made sure I spread the news around.  Being able to quickly pull in common economic data while doing simple (or complex) analysis can save a lot of time.  Outsourcing the data storage and update costs to FRED is wonderful, and cutting down on some of the user table creation and maintenance I owned was a real time saver.

In order to facilitate access to my company's internal economic data hub I even created my own version of the FRED Excel Add-In, which I named ED.  Using some simple VBA GUI elements (drop-downs, radio buttons, many MsgBoxes...) and an ODBC connection I was able to mimic the Excel Add-In functionality of FRED.  Adding in some charting code I was able to mimic the distinctive graphs as well.  Given that the data is proprietary, I don't see any issue in my imitation of FRED, and I view it as a labor of love in tribute to the data tool.
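For anyone curious about the data-pull half of that add-in, here is a rough R sketch of the same idea (R and RODBC rather than VBA), pulling one series from an internal hub over ODBC; the DSN, table, and column names are placeholders I've invented for illustration:

#Rough sketch: pull one series from a hypothetical internal data hub over ODBC
#"ED_HUB", econ_series, and the column names are made-up placeholders
library(RODBC)
ch <- odbcConnect("ED_HUB")
unrate <- sqlQuery(ch, "SELECT obs_date, value FROM econ_series WHERE series_id = 'UNRATE'")
close(ch)
head(unrate)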
Tying FRED into R was an obvious next step, and I've already begun to make use of this data.  Being able to pull FRED series down into the R environment makes it even easier to manipulate the data quickly, without worrying about Excel resources (Autosave, I'm looking at you!) or adding the data to a database structure.  An R programming project I'll detail later exhibiting geographical plotting uses similar data; maybe I'll tie FRED in to show off the functionality.
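As a minimal sketch of what that looks like, the quantmod package can read FRED series directly; I'm assuming quantmod here (other packages can do the same job), and UNRATE is just an example series ID:

#Minimal sketch: pull a FRED series straight into R with quantmod
library(quantmod)
getSymbols("UNRATE", src = "FRED")   #creates an xts object named UNRATE
head(UNRATE)
plot(UNRATE, main = "US Unemployment Rate (FRED: UNRATE)")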

I also happily own the FRED mobile app, which I find entirely too amusing, and which has come in handy for wonky discussions and for proving my data nerdiness to anyone in sight.

If they sold T-shirts, sign me up for two.


The St. Louis Fed includes three other tools: GeoFRED (data mapping), ALFRED (historical economic series), and CASSIDI (a personal favorite of mine, which details US banking industry data).  I believe I'll include love notes on these as well, CASSIDI especially.

Tuesday, January 14, 2014

Google Acquires Nest

In what seems to me like the perfect Instagram snapshot of current technology trends, Google announced yesterday that it has acquired the home automation pioneer, Nest.

Buying Nest, after forays into home energy data and hardware, seems like a great fit for Google.  Similar to the Android platform, Google is once again making use of hardware outsourcing (or corporate crowd sourcing) in order to focus on its core competency, smart data acquisition.

Nest's products are a great example of how known technology can be totally reinvented with the introduction of machine learning and UI enhancements.

The thermostat is no new product, and one of the most common technologies used to sense temperature in thermostats has remained mostly unchanged in the century since the first patent.  Simple circuits using thermistors can mimic a thermostat for less than $20, and adding a fancy microcontroller only adds to the fun.  I've built a few temperature sensing projects which I hope to share in this space.

What makes the Nest thermostats and smoke detectors so exciting is the integration of the simple technology of household appliances with the relatively new and buzzy machine learning approach.  A remote controlled thermostat is interesting, a Bluetooth controlled thermostat might be fun, but an intelligent thermostat can actually change the way we interact with the technology.  Sensing and adapting to human behavior makes Nest's products both trendy and useful.



It has been my feeling for some time that consumer technology in this decade will be defined by the marriage of smart technology and big data.  Google's announcement only cements this path, and their place in the development of the machine learning era.

2013: A Christmas Tree

After several years of fumbling with guitar electronics, playing with Arduinos, and now cooking with Raspberry Pis, my interest in the application of DIY electronics has infected the holiday rituals of my girlfriend and me.  Christmas 2013 was smart, in the trendy sense of the term.


Smart and shiny!

Raspberry Pi Powered Web Switch


To make use of my second Raspberry Pi (the first Pi's application to be detailed later!) I chose to try out some simple smart relay techniques.  Using an example and inspiration from a great Make-published Raspberry Pi book (a great resource; I found every example useful and fun to try), I decided to use my Pi and WiFi to build a hands-free Christmas light setup.  Following the timeless ideal of a creative solution to a sometimes disproportionate problem, I used my Pi to build a web server based remote for our tree's lights.

This project is a modified form of one found in Matt Richardson's Raspberry Pi book mentioned above, also available at his website (specifically the WebLamp examples).  The script used for the tree lights was gratefully adapted from the example found at these sources.


Materials (in order of coolness):
Raspberry Pi Model B: My first, totally worth the frantic refreshing and wait after pre-order
PowerSwitch Tail II: Such a great tool; makes me feel like an electrician, without trips to the ER
Adafruit T-Cobbler GPIO Breakout: I have both the standard and "T" versions; the T shape looks cool




Hardware Connection

Pin 25 of RPi/Cobbler to +in of PowerSwitch (controls the PowerSwitch relay)
Ground of RPi/Cobbler to -in of PowerSwitch



Software

I won't repeat all of the great work featured at Matt Richardson's site, except for the alterations I made.  The project is based around some simple Python work using the Flask framework, which can be used to support a simple web server and more.  The Python code is separated into the main script and a templates directory (the HTML used in creating the Christmas lights webpage).  The modular design makes it easy to modify the webpage for different applications.

By following all of the instructions at these resources you should arrive at a workable Flask-based web server accessible through your Pi's local IP address, or via http://raspberrypi.local for Apple products or Bonjour-enabled devices.

I've updated the HTML in the python files to be a little more festive, but this is purely cosmetic.

Additionally, I added a wrapper shell script, executed with sudo, to the /etc/init.d directory on the Pi and updated the default boot list to include this shell program.
The wrapper shell script includes the following command:
sudo nohup python /home/pi/WebLamp/weblamp.py &
Note that the home directory may be different based on your Linux distro and configuration.

With these steps and modifications I was able to create a cell phone (or any browser-capable device on the WiFi network; Flask is very forgiving) switch for our lights.  My girlfriend loved the functionality, and it added another personal touch to our decorations.


eOrnaments


In addition to the WiFi switch, I added a couple more electronics decorations to our tree.


Is there ever a bad time for a Ping)))?

Incorporating an earlier electronics project, this year we added an electronic advent countdown to the tree.


3 alligator clips clipping...

I made this device from Wicked Device's Day Counter kit.  I originally used the kit as a scheduling aid at the office, but liked the idea of an active decoration.  The day counter is powered by a micro-USB breakout (huge fan) and an old cell phone charger.


Almost sad to break down the project; it's definitely made us keep the tree up longer this year

Metadata

Part of the motivation for writing has been my slow realization of the impact my work in data has had on my outlook.  Instead of seeing streaming bits of green CRT nonsense floating in front of my face I have become awakened to the ubiquity of data (recorded and potential) indifferently existing in our world.

Among other straw men, I might say there are two extreme reactions to the idea of the digitization of reality.  One position may state that the mining of data from our existence has a corrupting effect on the real-time bio-analog experience.  "Why look for patterns in clouds when we can use machine learning to build then mine Clouds?"  Another viewpoint may be imagined as the enthusiastic defense of the plugged-in lifestyle, where the internet of things includes the tweeting toaster and the social media kegerator.

I have come to view the data revolution as a positive influence, and hope to learn what I can and perhaps play some part in my generation's version of the dawn of the transistor era.  I understand the legitimate concerns about enabling a sometimes narcissistic or detached attitude, or the privacy/security risks of a data rich world.  Not in spite but because of these concerns, I feel it is important to keep an optimistic view of data and electronic enhancements.  While market competition may provide some of the necessary impetus for invention, an overcautious pessimism can stifle the risks necessary to create.  Optimism and the tilting at green-energy windmills are among the most sustainable fuel sources for creative progress in data and electronics.


I hope to use this space to document my efforts in learning and building data-driven projects, and to give back to the online community of digital artists and inventors I have come to love.  I may even build a tweeting trash can...