Lance Laughlin

I'm a graduating New Media Interactive Development student from the Rochester Institute of Technology. I blog about open source software and web development.

Month: February, 2014

Entropy Status Update

This blog post is a reference to a team project I’m working on for Advanced FOSS:

So the first week of development has officially ended! Over the next week we’ll need to finish up the rest of our functionality and start preparing for testing and packaging.

So far we have a command line tool that successfully reads in an image and spits out some basic info about it, including the resolution and format. This week I worked on and completed the functionality that lets the user analyze an image that is hosted on the web and accessible by a URL. I was actually quite surprised by how easy it was to get this working. We are already using Pillow to pull some data about the image, so all I really needed to handle was pulling the image in from a URL; the code we already had worked flawlessly on it.

Here is a quick code rundown of what I needed to do:

import urllib
import cStringIO

First off, we need to import urllib and cStringIO. urllib lets us easily open a URL and pull data from whatever it serves us. cStringIO is a helper module that wraps a string in an in-memory file-like object; I'm not 100% sure of everything it can do, but it's definitely helpful in this situation.
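In case it helps, here's a minimal illustration of that file-like-object idea. I'm writing it in Python 3 syntax, where the byte-string equivalent of cStringIO.StringIO is io.BytesIO:

```python
import io

# Wrap raw bytes in an in-memory file-like object.
# (cStringIO.StringIO plays this role in Python 2; io.BytesIO in Python 3.)
data = b"\x89PNG\r\n\x1a\n"  # the magic bytes that start every PNG file
buf = io.BytesIO(data)

print(buf.read(4))   # reads like a real file: b'\x89PNG'
buf.seek(0)          # and rewinds like one too
print(buf.read() == data)  # True
```

That's really all Pillow needs here: anything with read and seek methods works where a file would.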

Our initializer function is where most of the magic happens.

def __init__(self, source): #Pass in the image location as an argument
    self.source = source
    try:
        if 'http' in self.source:
            print 'reading from url'
            file = cStringIO.StringIO(urllib.urlopen(self.source).read())
            self.image = Image.open(file) #Image comes from Pillow: from PIL import Image
        else:
            self.image = Image.open(self.source)
    except IOError:
        print "Cannot load image. Be sure to include http:// if loading from a website"

Here we are basically checking whether the value the user passed in is a web URL. There may be a better way of doing this than checking if the string contains http, but it works for now. If it is a URL, we can load that image into our program with a single line and then instantiate it as an Image object.
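For what it's worth, a slightly sturdier check could parse the URL's scheme instead of substring-matching. A quick sketch in Python 3 syntax (in Python 2 the import would be from urlparse import urlparse):

```python
from urllib.parse import urlparse

def is_web_url(source):
    """Return True if source looks like an http(s) URL."""
    return urlparse(source).scheme in ("http", "https")

print(is_web_url("http://example.com/wallpaper.png"))  # True
print(is_web_url("/home/lance/wallpapers/space.jpg"))  # False
print(is_web_url("httpdocs/space.jpg"))                # False, unlike the 'http' in check
```

The last case is where the substring check would trip up: a local folder named httpdocs would be mistaken for a web address.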

We later go on to print some data about the image using attributes of the Image object, which comes from the Pillow library:

def output(self): #Probably only print this in verbose mode in the future
    print "We're processing the image: " + self.source
    print "This " + self.image.mode + " image is in the " + self.image.format + " format"
    print self.image.size

We hopefully will finish off the rest of the required functionality this week. More to come!


Rochester Civic App Challenge

As part of our Advanced FOSS class we went to the Rochester Civic App Challenge press conference held at The MAGIC Center. If you're not sure what the RCAC is, check this out: Essentially, it's a 60-day challenge in which members of the Rochester community hack on apps that are civic in nature. The challenge requirements are in general pretty vague, and for a good reason: rules hinder creativity.

The press conference went pretty well and had a decent showing. A large portion of the audience was students from the Advanced FOSS class, which isn't necessarily a bad thing. Many of these students will be participants in the challenge, so their presence helps give the organizers a good idea of the type of people who will be competing. I use the word competing lightly here. Yes, this is a competition; however, like most things FOSS, I truly believe that most of the participants are genuinely pumped to see what their "competitors" come up with… though the prize money certainly adds a touch of competitiveness to the equation.

Unfortunately, I'm not sure if I'll be competing in the challenge myself due to the massive amount of work I have going on at the moment. This includes my school work, my on-campus job for Just Press Play, freelance work, and preparing for the career fair and entering the real world in a few short months. I'll likely take advantage of the 24-hour hackathon kick-off, though, to work on some of my school work and try to lend a hand to anyone who may need it. I'm really looking forward to seeing the project ideas people come up with and the data sets that get used. It will be interesting to see how people tackle problems that we face both in government and as Rochester citizens.

First meetup for Advanced FOSS

Now I'm not a nooby when it comes to these meetups: we went as a class every month for our HFOSS (Humanitarian Free & Open Source Software) class last semester. This semester we are rollin' deep with both this semester's HFOSS class and our Adv FOSS class, totaling about 20 people. This month was pretty beneficial for the HFOSS class, as the first hour was about data structures and basic programming logic in Python. I feel like this would have been super helpful for our HFOSS class last semester because about 85% of the class had zero Python knowledge going in. Our class did manage to pick up Python over the course of the semester; however, I really feel like we could have done so much more with our projects had we had a mini-seminar like this. We did get a pretty awesome guest lecture from Threebean, though, which personally helped clear up a few things for me.

Aside from the intro to Python, we had an hour at the end to give lightning talks to the group. Most of the Advanced FOSS students gave lightning talks about our first project for the class. I'm working on a group project with Chris Knepper and Dustin Raimondi that will help people decide whether a wallpaper is suitable for their machine. You can check out our project proposal here: I thought our lightning talk went okay; unfortunately, we only have a proposal at the moment so we couldn't talk about too much. I felt like the presentation was sort of boring because it was three people talking about something very basic, and it likely would have gone better with a bit more content or with fewer people presenting: live and learn. Check out our presentation here:

Of the other lightning talks for Advanced FOSS projects, I must say that I really liked Lindsey Ellis's (Fangy) and Joe Prezioso's (ArcticSphynx) project. It's a dog tracker that emails people when their dog(s) are being noisy. I like this idea because it's very practical and really perfect for the Raspberry Pi. It has inspired me to do something more Pi-centric for the next project rather than something that is simply written in Python and run on the Pi.

Project 1 Proposal

Title of Hack:
Entropy

Short Description of Hack:
Rates wallpapers based on a variety of criteria

Software Libraries Needed (e.g. python packages, npm, ruby gems, etc…):

  1. PIL – load local images and manipulate pixels
  2. requests – load images from URL
  3. xlib – get screen information via x session

Upstream Distribution Repository (e.g. PyPI, GitHub):

Open Hardware Needed:

Team Members (if Applicable. With Roles):

  1. Dustin Raimondi (Developer)
  2. Lance Laughlin (Developer)
  3. Chris Knepper (Developer)

Project Milestones (REMEMBER You only have 2 weeks of development!):

  1. load an image from a local source

  2. pull metadata from image

    1. resolution

    2. filetype

  3. pull various pixel data from image

    1. fill in gaps in metadata

    2. overall color temperature

  4. algorithmically calculate image score

  5. load an image from the web

  6. verbose mode

  7. process a folder of images
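For milestones 3 and 4, one dead-simple way to get an overall color temperature is to average the pixels and compare the red and blue channels. This is just a hypothetical sketch, not our actual scoring algorithm; with Pillow the pixels would come from image.getdata(), but plain RGB tuples stand in here so the example runs on its own:

```python
def average_color(pixels):
    """Average the (R, G, B) channels over an iterable of RGB tuples."""
    n = 0
    totals = [0, 0, 0]
    for r, g, b in pixels:
        totals[0] += r
        totals[1] += g
        totals[2] += b
        n += 1
    return tuple(t / n for t in totals)

def warmth_score(avg):
    """Crude warmth metric: red minus blue, scaled to [-1.0, 1.0]."""
    r, _, b = avg
    return (r - b) / 255.0

# With Pillow this would be: avg = average_color(image.getdata())
sunset = [(255, 0, 0), (200, 100, 50), (180, 90, 40)]
avg = average_color(sunset)
print(avg)                    # channel averages of the sample pixels
print(warmth_score(avg) > 0)  # True: a warm, reddish image
```

A real score would fold in more criteria (resolution match, contrast, and so on), but averaging pixels is a reasonable first pass at the "overall color temperature" milestone.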


threebean (pending approval)

Mmmm. Pi.

We took a mini field trip to the MAGIC Center today to get a bit of a treat, a Raspberry Pi Ultimate Starter Kit:

I've been eyeballing a Pi for quite some time. I debated buying one while on break but decided to hold off and get one provided by RIT. We all arrived and received our Pis pretty quickly and began digging in. We mostly went over all the components to make sure everything that was supposed to be in the kit actually was there.

Now this is the first time I've even seen a Raspberry Pi in person, let alone coded on one. Upon doing a bit more research about the Raspberry Pi, I've begun to understand how everything works together. For my New Media Team Project my group is likely going to use an Arduino to do some physical computing, which led me to this pretty good beginner overview article:
Turns out that the Raspberry Pi is a mini-computer whereas the Arduino is a microcontroller. In essence, the Pi is basically a very tiny PC, while the Arduino simply aids in electronics projects. Understanding that difference has helped me quite a bit.

I'm excited to finally get mine booted up; unfortunately, I first need to get my hands on an HDMI monitor and a keyboard. This is the first time in my college career that not having a desktop environment has been an inconvenience. Regardless, it will be fun to get started!