I was on the radio

I was a guest on today’s episode of Where We Live on WNPR. The topic was Connecticut startup culture. Give it a listen, and let the mocking commence (“Check it out, it’s awesome!” groan).

One thing we didn’t get to on the show (and, let’s be honest, there are probably a dozen conversations wrapped into this wax ball of a topic) was the demographic differences between CT and NYC or San Francisco. My sense is, and this is definitely anecdotal, that our mix of entrepreneurs in CT is probably a little older than the folks in those other cities. Probably doubly so in tech.

Before the show, I joked to Gitamba that the prototypical ramen diet isn’t really an option when you have a child at home. Accelerators like TechStars or Y Combinator target that frenetic pace and schedule. That makes sense for their programs, and there’s nothing wrong with that, but it might not be the only or best model for areas still building up their entrepreneur base and culture.

Anyway, just an extra thought on the way out the door of the station.

Sensors, LEDs & iPhones, oh my!

Over Christmas, I hacked together my first hardware/software project. It’s been a long time since I’ve picked up a soldering iron, let alone built something worthy of sharing. It turned out to be a fun little project.

Cause, effect, agency

I got the idea for the project while shopping for Christmas toys for my son. I wanted to find something that would teach him simple cause-and-effect relationships, where he could cause something (e.g. clicking red & blue blocks together) that produced an observable effect (e.g. the blocks changing color to purple). I hoped I could prime some interest in science. I also really wanted to instill a sense that he can make things happen for himself.

For a two year old, I basically came up empty.

But for a kid slightly older, we’re living in a golden age of hackable creativity. We have 3D printers that are slowly becoming affordable. The Internet makes it easy to find (and share!) instructions for building everything from customized furniture to undersea robots. Open source and community-based tools are getting cheaper and easier to use every year. Several businesses have grown up offering easy instructions and tutorials. (Come on, these look cool, don’t they?)

So, I decided I’d use the four-day Christmas weekend to hack together a hardware prototype (with help from my wife’s nephew).

The project

For the project, I set out to build a simple thermometer & barometer that I could check from my iPhone. I also wanted it to have some visible indicator that would be fun to look at so my son could check it. As a beginner, I also wanted something I thought I could pull off.

My project centered around an Arduino microcontroller board. An Arduino is an inexpensive open source “electronics prototyping platform” that can be programmed using nearly any computer and a USB cable. Because it’s cheap and freely documented, people have hooked up dozens (if not hundreds) of sensors & other electronics to it.

I had an old kit lying around that I rediscovered after I returned to Fanzter and started working with our resident hardware hacker extraordinaire, Josh. If you’re just getting into electronics projects, I recommend starting with a starter kit. You can get several decent options from Adafruit, MakerSHED, or Sparkfun. I have an older Sparkfun Inventor’s Kit, but any from these three vendors will do.

You should go through a few of the tutorials before trying the rest of this to get familiar with the basics of Arduino programming.

Here’s the full parts list:

Tools:

In addition, if you want to get an app running on your iPhone, iPad, or iPod Touch, you’ll need to have a developer account with Apple.

The build-out is really simple. The BLE shield snaps onto the Arduino board, essentially extending the Arduino’s pins and sockets through itself. Just make sure you line up all the pins and sockets. More details at RedBearLab if you want them.

For the LED matrix and the temperature sensor, there’s a little soldering involved. For both the soldering and the basic wiring setup, I followed the instructions in Adafruit’s tutorials:

Make sure you position the LED matrix correctly before soldering it. I got multiple warnings about that from people.

Here are the results of my soldering job:

Arduino project images

I’ll admit, I’m proud of how well that came out considering it was my first soldering project in 20 years.

Arduino project photo

The only difference between my final wiring and the two tutorials is that I hooked the matrix CLK and DAT pins to the same rows containing the CLK and DAT lines running from the Arduino to the temperature sensor. In the picture at left, those are the green and orange wires (click through for a larger view). This works because both devices speak a protocol called I2C and have different addresses. [1]

For power and ground, I used the breadboard instead of hooking the sensor or backpack directly to the Arduino board. This is standard, and what the kit tutorials encourage. Just thought I should mention it, since it’s not directly mentioned in the two Adafruit tutorials above.

The next step is programming the Arduino. Rather than walk you through all the details, here’s the source code. Feel free to fork the project and mess around; I’d appreciate any bug fixes if you have them. To use the source code, you’ll need to install the Arduino software and the Ino tool. I used Ino so that the github repository would have everything you need. To run the project, launch Terminal, then type ino build followed by ino upload to get the sketch onto your Arduino. If you want to see the serial output, ino serial -b 57600 will show it in your terminal.

I also have the iOS code available if you’d like to play with that. You’ll need to be comfortable with iOS development to use this. I may submit a version to the store if there’s enough interest. Let me know.

That’s it. The finished wiring looks like this:

Arduino project images

When lit up, it looks something like this (only 2 readings are displayed – normally there are 8):

The Arduino end of this, the simple temperature station.

The iOS app is really simple:

Weekend hack: arduino weather station talking to iOS app via BLE. Boom.

Drag up to trigger a connect or disconnect. Eventually, I’ll add a pull-down to trigger a temperature refresh. Otherwise, it polls every minute.

Known issues

The code isn’t perfect and, as I get free time, I’m still cleaning up a few things. Here are some known issues:

  • Bluetooth reliability: For some reason, the iPhone doesn’t seem to disconnect and/or reconnect properly to the device. Pressing the reset button on the BLE shield usually fixes it, which makes me think there’s something wrong in my code.

  • Memory usage: The main challenge in programming an Arduino is that the device has only about 2K of RAM for the sketch. Yes, two kilobytes. It’s a challenging environment when I’m used to phones with 256-512MB of RAM (or more). My code is definitely not optimized, and early on the program regularly ran out of memory. I think it’s stable now, but it’s not as good as I think I can get it.
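For anyone hitting the same wall, one common trick (a general Arduino technique, not something specific to my sketch) is the F() macro, which keeps string literals in flash instead of eating into that 2K of SRAM:

```cpp
// On the Uno-class boards in most starter kits, every plain string
// literal you print gets copied into the ~2KB of SRAM at startup.
// Wrapping literals in F() leaves them in flash (program memory) and
// streams them out byte by byte instead. Illustrative only; this is
// not the weather-station sketch itself.

void setup() {
  Serial.begin(57600);

  // This literal occupies precious SRAM:
  Serial.println("Temperature reading follows:");

  // This one stays in flash and costs almost no SRAM:
  Serial.println(F("Temperature reading follows:"));
}

void loop() {}
```

On a sketch with a lot of debug output, this alone can free hundreds of bytes.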

Next steps

I’m going to try and hook it up to a Raspberry Pi and put it in a weatherproof enclosure so I can leave it outside. My other goal is to change the LED matrix to an LED strip like this so I can make it look like an actual thermometer.

I’ll update this with photos if I get that far.

Hope that helps someone out. It was a fun project, and I’m looking forward to working on this more.


 

1 I2C is a simple two-wire interface to hardware components. I2C allows the Arduino to control multiple devices over just two pins. The Wikipedia page has the gory details, but just know that each device has an address which has to be unique, and then you just wire them up in parallel. The LED matrix backpack that Adafruit sells provides an I2C interface to the LED matrix, and the Bosch sensor comes on a board that also speaks I2C, so all the work is basically done for you.
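To make that concrete, here’s a minimal Arduino sketch (not my weather-station code) that addresses two devices sharing the same two wires. The addresses are common defaults (0x70 for Adafruit’s backpack, 0x77 for Bosch pressure sensors), and the command bytes are just examples; check your own parts’ datasheets.

```cpp
// Two I2C devices on one bus: each has its own address, and the
// Arduino's Wire library targets one device at a time.

#include <Wire.h>

const uint8_t MATRIX_ADDR = 0x70;  // typical Adafruit backpack default
const uint8_t SENSOR_ADDR = 0x77;  // typical Bosch sensor default

void setup() {
  Wire.begin();  // join the I2C bus as the master

  // Talk to the LED backpack: open a transmission addressed to it,
  // send a byte, and finish. The sensor ignores this traffic.
  Wire.beginTransmission(MATRIX_ADDR);
  Wire.write(0x21);  // example command byte
  Wire.endTransmission();

  // Now address the sensor over the very same SDA/SCL wires.
  Wire.beginTransmission(SENSOR_ADDR);
  Wire.write(0xF4);  // example register address
  Wire.endTransmission();
}

void loop() {}
```

The key point is that nothing about the wiring changes between the two transmissions; only the address does.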

That’s more detail than you probably need, but I thought it was neat.


Update: Two corrections above, both minor but notable. I accidentally described an Arduino as a microprocessor instead of a microcontroller, but then Josh pointed out that it’s really a whole platform: the microcontroller is the specific chip at the heart of the Arduino. It’s a significant detail as you get more advanced, because different versions of the Arduino may have different microcontrollers at their heart.

The other is how I described the I2C wiring in the footnote. The sensor and the LED matrix are wired in parallel, not series. I had a feeling that was the wrong word, but forgot to look that up. Minor detail, but again significant for deeper understanding.

Sorry about both of those. They’re fixed above.

A jumble of thoughts

On a normal Friday of a normal December, a bunch of families said goodbye to their kids and sent them off to school. Announcements. Meetings for the Principal. A normal Friday.

Then the abnormal sound of gunfire. Of violence. Of death.

And now, the sound of tears, of sadness, of remembering and loss… of fears and nightmares past and future.


I keep imagining what it must be like for the parents of the victims today, especially of the young children. I imagine myself in their shoes, I imagine walking by a now-always-empty room with bright paint evoking happier times.

My thoughts are with the families affected by the tragedy.


When I walk into my son’s class, the kids say “Hi Che’s Dad!” They make me smile no matter how crappy my morning has been. I keep thinking about what someone must go through to walk into a classroom full of children and open fire. Why would anyone want to hurt them?

It’s unfathomable. The loss is unfathomable. The chance and the randomness is most unfathomable of all.


I wrote this earlier today re: discussing gun control policy on the day of a tragedy (slightly edited for grammar):

My only point is that today should be about the tragedy itself, to burn in the news and to cope with the loss & fear of loss that these events bring up.

We get wrapped up in hammering the stats, the policy ideas, and how “stupid” the other side is in these debates today, but the real work is how we keep this in the public eye going forward.

I’m also seeing things like this at TPM that ask good questions. This is the nitty gritty of how we help prevent these sorts of tragedies in the future. Even on the mental care front, how would those laws work?

I’m just incredibly sad & angry about this, and I’m not ready to have those conversations today. That’s all I’m saying.

The response I got from people on Twitter & elsewhere primarily centered on getting people talking about the issues at play while attention is fully on the tragedy. They argue that tomorrow the pain will be a little less for those of us not directly affected. In a month the national media will forget about this completely.

There’s probably a lot of truth to that… the second, little tragedy that goes along with the big, evil tragedy that happened in Newtown.

“Those who cannot remember the past are condemned to repeat it.”

So, it seems our job is to keep us from forgetting this bit of our past, to keep it squarely in focus during the next Congress and the next legislative session here in CT and beyond. Take a moment and ask yourself, “What am I doing for the victims of Newtown?”

If you’re upset about this, and want the government to do something about this, learn what you can about the issues at play. What ideas do people have for reducing gun violence? What should we ask our representatives in government to do? Is gun violence reduction about more than guns? Do we need to rethink mental health policies in this country?

As you do learn about these things, contact your representatives in your state legislature, Congress, and anyone else you think is in a position to advocate for better policies. Tell them what you’re learning, ask them to look into the best ideas you find.

There’s always a risk that bad laws get rushed through in the wake of tragedies. This one will be no different, so it’s up to all of us to become smarter about the dynamics at play here and what public policy options people have considered. We need to do something. Let’s make sure that something lives up to the memories of those lost today.

We need context, not balance

Like many towns, West Hartford has an advertising-supported, free weekly paper that’s mailed to everyone. The latest issue arrived today with this as the front page article:

Fire, police overtime nears $3 million in 2010

The Town of West Hartford paid nearly $3 million in overtime compensation in 2010 to police and fire department personnel, according to data released under a Freedom of Information request.

Town administrators this year combed through expenses, trimming nearly $1 million from the original proposed budget before approving a 2012-13 spending plan. But in trying to cut, administrators lack the power to control specific items, such as overtime paid to union employees.

You should go read the article.

Here’s what stood out to me: the next several paragraphs (easily the first third of the article) include quotes from the Republican members of our town council (the minority party) and other claims that collective bargaining is to blame for the OT in 2010.

At first, I wondered why they didn’t include any quotes by the Mayor or from Democrats on the council. Then I realized that I actually didn’t really care about that. I don’t really care about balance in that sense.

What bugs me is that there’s no context given to these numbers. For example, the reporter names particular police officers and firefighters who earned a significant amount of overtime, writes about how the Republican members of the council want to go after the contracts negotiated with the unions, and gives out specific salary numbers.

I read all of this and was left with… “and so what?” I don’t know if this is a matter of lazy reporting, tight deadlines and a need to print something, or an assumption by the writer that readers are already well informed, but as a pretty regular watcher of town news, I was left with a bunch of questions. I needed context to put the numbers in perspective and to understand some of the claims made in the article (both by the people quoted and by the author).

Some questions I would love to see answered:

  • What is a typical yearly budget for overtime expenditure?
  • What is typical overtime budgeting for towns around the state with similar populations?
  • Why did the cited officers get so much overtime pay? How many hours did they work? What was the purpose of those hours?
  • What are some of these union rules that drive overtime costs? What union rules do the Republican members find objectionable?

That’s just a small list. These numbers sound big (“$3 million in OT!” – “That police officer doubled his salary!”) but given some context, they may not be crazy. For example, if the officer is working double shifts most days to make up for staffing shortages, that may be reasonable compensation. If he does it every year, maybe there’s a policy change required. How can we tell with the information provided?

I want better reporting from our papers. People might be willing to pay for them if they actually dug a little deeper than a blogger with SEO skills…

Proxigram supports Facebook

You can now pull in photos from your Facebook account. Right now, it grabs everything, but I have controls in the works to let you choose privacy settings or hide individual photos.

There’s pseudo realtime support, too. Proxigram should update basically as soon as you upload the photo.

As always, everything is on Github.

Proxigram now supports Flickr

Quick update on Proxigram: it now supports Flickr, Yahoo’s popular photo sharing service. If you’re a Pro account holder, it will even get realtime updates from Flickr, just like Instagram provides.

The “point” of the app has changed, too. The goal is to build a single API endpoint for all of your photos. While the photos will still be hosted on their respective services, you can now get one read-only API to see a normalized view of them all.

The project is still open source, so if you’re looking for a sample node app that connects multiple third party services via OAuth (using passport.js), you can get the source on Github.

Facebook support is coming next. If you want support for your favorite photo services, please let me know what you want or, if you have the ability, submit pull requests with patches.

I’ve also written a few bits of supporting code for this. I abstracted out the basic PubSubHubbub verification calls into a standalone library: node-push-helper.
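For anyone curious what those verification calls actually do, here’s a sketch of the PubSubHubbub subscriber-verification handshake that a helper like node-push-helper wraps. Per the PuSH spec, the hub calls your callback URL with hub.mode, hub.topic, and hub.challenge, and you confirm the subscription by echoing hub.challenge back with a 2xx status. The function and parameter names here are my own, not node-push-helper’s actual API.

```javascript
// Subscriber-side verification of a PubSubHubbub (un)subscribe request.
// `query` is the parsed query string from the hub's GET callback.
function verifySubscription(query, expectedTopic) {
  const validMode = query['hub.mode'] === 'subscribe' ||
                    query['hub.mode'] === 'unsubscribe';

  if (validMode && query['hub.topic'] === expectedTopic) {
    // Echoing the challenge proves we really asked for this subscription.
    return { status: 200, body: query['hub.challenge'] };
  }

  // Unknown topic or mode: refuse, and the hub abandons the request.
  return { status: 404, body: '' };
}
```

Wire that into a GET route on your callback URL and the hub takes care of the rest.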

I also have stubbed out a new node Flickr client. I made a new one because I wanted to use oAuth instead of the deprecated Flickr authentication methods. After trying to retrofit one of the other libraries, I decided to just start over. I may merge this back into one of them, but for now, expect new functionality in the coming weeks. Here it is: node-flickr.

Would love code reviews and criticism from node experts. I know the code can be better.

Proxigram – a sprint using Node.js, Express.js, & the Instagram API

I’m happy to share a little experiment I played with this week. I needed to take a look at Node.js and its family of technologies for a project, but found it hard to find good explanations of best practices. There are a half-dozen competing boilerplate/template samples with very little in the way of explanation or comments. So, I decided the best way to get familiar with the nitty gritty of building a Node/Express app was to write one.

I decided to solve a simple problem I had. I wanted to get my recent photos from Instagram onto my blog. I wanted it to be a simple JS call or plugin, and I wanted it to be smart about storing keys for read-only access to my Instagram account. It seemed like a simple proxy for the Instagram API would suffice. The OAuth credentials are stored on the proxy and a new, non-Instagram specific key gets embedded in the JS.
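To make the shape of that concrete, here’s a stripped-down sketch of the idea (not the actual Proxigram code): the Instagram OAuth credential lives only on the server, and the JavaScript embedded in the blog page carries a separate, proxy-specific read-only token. All names and tokens here are illustrative.

```javascript
// In the real app these would come from configuration, not literals.
const PROXY_TOKEN = 'public-read-only-token';      // safe to embed in page JS
const INSTAGRAM_TOKEN = 'secret-oauth-credential'; // never leaves the server

// Checks the proxy's own token on an incoming request path like
// "/photos?token=...", so the page never needs Instagram credentials.
function isAuthorized(path) {
  const token = new URL(path, 'http://localhost').searchParams.get('token');
  return token === PROXY_TOKEN;
}

// Builds the upstream Instagram API call, attaching the secret
// credential server-side, out of the browser's sight.
function upstreamUrl(userId) {
  return 'https://api.instagram.com/v1/users/' + userId +
         '/media/recent/?access_token=' + INSTAGRAM_TOKEN;
}
```

A real handler rejects requests that fail isAuthorized, and otherwise fetches upstreamUrl(…) and relays the JSON back to the page.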

And thus, Proxigram was born.

Sure, it’s a little contrived, but now that I’ve built it, I’ve got ideas for some improvements and, even better, I now have a functional, real app to share with everyone so I can get feedback about all the things I did poorly.

The source code is all on Github, both for Proxigram itself as well as the jQuery Proxigram library to access it.

The app is interesting to look at on a few levels. The package.json listing the bits and pieces I used is below. The app talks to the Instagram API and uses MongoDB to cache results locally. It keeps that cache fresh by using Instagram’s real-time API to get updates for users. It uses passport.js for authentication (though it seems like more of the cool kids are using everyauth these days). It uses less.js for the stylesheets.

So, if you need a working example for all of those things, here you go.

Please leave feedback about things that make you itch about the code. I know a bunch of you are serious Node.js mavens, so I’m really curious what you folks think and what conventions you’re following in your projects. My biggest question at this point is how to deal with making shared components available to code in different files. For example, I made some of the authentication filters global because I split my routes up into multiple “controller-ish” files. None of the boilerplate/template apps did it better, IMHO. If you have thoughts on that one, let me know.
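One alternative to the globals I mentioned, sketched below: keep shared pieces like auth filters in their own module and require them wherever routes get registered. The file layout and every name here are hypothetical, not Proxigram’s actual structure.

```javascript
// --- filters.js ----------------------------------------------------
// module.exports = { ensureAuthenticated };
function ensureAuthenticated(req, res, next) {
  if (req.user) return next(); // passport populates req.user on login
  res.statusCode = 401;
  res.end('login required');
}

// --- controllers/photos.js -----------------------------------------
// const filters = require('../filters');
// Each controller file exposes a function that receives the app and
// wires up its own routes, reusing the shared filter.
function registerPhotoRoutes(app, filters) {
  app.get('/photos', filters.ensureAuthenticated, function (req, res) {
    res.end('photo list goes here');
  });
}

// --- app.js --------------------------------------------------------
// require('./controllers/photos')(app, require('./filters'));
```

Nothing ends up on the global object, and each controller file declares exactly which shared pieces it depends on.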

PS. The images on the right are getting served through Proxigram. 🙂

Here’s the dependency list from my package.json:

{
    "name": "proxigram"
  , "version": "0.1.0"
  , "private": true
  , "dependencies": {
      "express": "2.5.8"
    , "less-middleware": ">= 0.0.1"
    , "jade": ">= 0.0.1"
    , "moment": ">= 0.0.1"
    , "passport": ">= 0.1.8"
    , "passport-instagram": ">= 0.1.1"
    , "passport-http-bearer": ">= 0.1.2"
    , "mongoose": ">= 2.5.0"
    , "connect": ">= 0.0.1 < 2"
    , "connect-redis": ">= 1.0.0"
    , "connect-heroku-redis": ">= 0.1.2"
    , "airbrake" : ">=0.2.0"
    , "instagram-node-lib": ">=0.0.7"
    , "express-messages-bootstrap": "git://github.com/sujal/express-messages-bootstrap.git#bootstrap2.0"
  }
  , "engines": {
      "node": "0.6.x"
    , "npm":  "1.1.x"
  }
}

Incentivizing individual relocation vs. corporate relocation

Where We Live ran a show today on why younger people (25-34) are leaving the state. I ended up missing the show (listening to it now!), but caught a very lively discussion on Twitter.

One side conversation (you can see it on storify here) that I joined in on was about how hard it is to convince people to move to CT.

Let’s be honest: it’s hard. Harder than it should be, considering how nice it is to live here. I had a lot of experience with this when I was hiring people at ESPN. Even with a kick ass company, a dream job for most sports fans, and great relocation packages, it was hard to convince people to move to CT. There are a lot of reasons, and they vary by person, but they all seemed to boil down to cost and opportunity.

For people moving from a relatively low cost-of-living area, like Kansas or Vermont or Nevada, it was mostly about cost. CT looks and feels expensive when you’re browsing real estate or rents even though we’re cheap-ish for the region. For whatever reason, cost-of-living adjustments never seemed to truly capture the difference people felt.

For people moving from a big city or high cost-of-living area, like New York City or Boston or San Francisco, it was mostly about opportunity. What if they didn’t like the job here? What job opportunities would they have at a similar style company? This was exaggerated in my particular industry. There aren’t many of the kind of companies that are doing consumer Internet or mobile products that ESPN makes. If you live in NYC or Boston or SF, on the other hand, you have dozens of options plus many, many strong startup and entrepreneurial companies nearby.

There are people working on the second issue (e.g. this effort). So I asked about the cost reason, specifically:

As I mentioned, you can read the whole convo if you want, but that’s the main point.

CT just authorized $291 million in spending over 10 years to incentivize Jackson Labs to build a $1 billion facility here. Of that, $192 million is a loan that is entirely forgiven if the facility creates & retains 300 jobs in 10 years.

Now, I don’t know whether that’s a good program or not for 300 jobs, but like a lot of programs, the government seems to focus on giving money to companies rather than individuals. This isn’t bad per se, but the tradeoffs are worth talking about.

For one thing, this means that the government has to choose an industry or individual company to offer these incentives to. This doesn’t prevent existing companies from leaving, nor does it help them. That seems odd to me. But it’s also exceedingly common. Remember the brouhaha about Boeing moving its HQ?

Instead, I wonder if cities or states have considered incentivizing individuals to take existing jobs. For the same expenditure ($192 million over 10 years), the state could offer 5-year rent or property tax credits of $5,000 a year to over 7,500 families or individuals ($5,000 a year for five years is $25,000 per household, and $192 million covers roughly 7,600 such credits). $5,000 would cover over 100% of the median annual property tax for the state.

On the surface, that seems like a better use of taxpayer money. That assumes there are, of course, 7,500 jobs to fill in the state. Then again, more people moving here means more people needing services and products. I suspect that, like any social network, virtual or real, there are network effects that come into play once you create some momentum.

Curious if anyone has seen studies or programs like this in the United States. I wonder how well they work, and what data we can glean from them.

What we have here is a failure to communicate

(I should point out, coincidentally, and in testament to how obvious the headline choice is, that the Courant chose a similar headline. I started writing this before I saw the Courant article, for the record. 🙂 )

I had a rather animated conversation with a friend today about CL&P’s performance during this most recent storm. I won’t bore you with the whole thing, but there are a few things I wanted to open up to a broader conversation. At this point, my focus is about how to ensure that people aren’t surprised, frustrated and without power the next time weather happens.

My expectation as a customer is that they have a plan to:

  1. … maintain the lines during normal times to minimize potential damage from a weather event.
  2. … repair the lines quickly, including how to get additional crews in if necessary.
  3. … coordinate repairs with town leaders around the state.

Reporting about the outage has called into question CL&P’s effectiveness on all three fronts. The Times published an article this weekend questioning CL&P’s maintenance budgeting and planning. Even better was this anecdote from our own Mayor Slifka in the Courant:

In West Hartford this week, when the electric company was refusing to tell town leaders what streets its crews were going to be working on, officials came up with their own improvised solution. Municipal leaders sent town police over to CL&P’s staging area at Westfarms mall in the evening to ask the crews themselves where they would be heading to work the next day.

This isn’t neurotic small town bureaucrats overreacting. This information is critical during an emergency, when fire and rescue personnel must know what streets are passable. Already, one elderly West Hartford woman without power died in a fire at her home this week.

So what did CL&P do when they found out West Hartford police were tracking down where crews were going to work?

They told their workers not to talk to the cops. Now there’s a company that cares.

The communication issues seem unforgivable. No one has come up with a plausible reason why CL&P couldn’t tell the towns where their crews would be, or in what order they were approaching the work. Either they didn’t have a plan or central coordination, or they put their corporate image above the safety of citizens. That’s basically it.

There’s going to be an investigation into CL&P’s performance, so maybe we’ll find out more about how they stack up to other utilities. Regardless, though, the episode has raised an alternate option that we should consider.

I think it’s time to consider organizing our utilities differently. For example, we handle water here in West Hartford via a public/private corporation: through the MDC, we have pooled resources with several surrounding towns and cities to provide water to our citizens via a non-profit corporation. The city of Norwich has run its own public utility corporation (for profit) for over 100 years. We let governments at all levels maintain roads, airports, and other infrastructure.

In my mind, the lines that carry telephone, cable, and Internet (particularly the local, last-mile networks) should be like roads: shared and impartially maintained, with private companies competing to provide us service. I don’t see why power lines, especially at the local level, should be any different. In all of these cases, you break monopolies, accelerate competition among private vendors for new products and solutions, and bring accountability closer to the customer. At least the mayor won’t have to send the police out to chat up crews to find out details about the repair efforts.

These seem like good things. There are certainly going to be tradeoffs. Curious how everyone else feels about something like this.

If you’re interested in learning more, the Colin McEnroe show covered this topic today, including a few towns that converted their transmission functions to a public agency or a public/private corporation. I caught part of it this afternoon, looking forward to listening to the rest later tonight (after the Eagles game – Go Eagles!)

Google I/O: An unadulterated celebration of technological imagination

That mouthful is my one sentence description of Google I/O. The demo floors and tonight’s After Hours party are full of whimsy and wonder, literal playgrounds for technology geeks of all stripes.

The atmosphere at I/O is all about the possible, the future, and the fanciful. There were companies making robots, others building home mesh networks that can control all your lights, and yet others working on all sorts of crazy gadgetry. It was all very cool. I went from demo to demo filled with a sense of wonder at what the next few years might bring. Sure, there were really practical sessions about new APIs and tech, but stepping outside of the sessions put you in a geeky wonderland.

At the same time, I can’t help but see this as a metaphor for the differences between Apple & Google or Apple & Android. At Apple’s WWDC, it’s intensely about what will get done now. It’s about making connections and learning the tech that you’re going to put into your next application. Folks on stage demo apps that will launch very soon. Apple talks about features that will be available in weeks. It’s about go, go, go and very much about making money. It’s almost businesslike during the day, with fun during the evening.

WWDC is all about execution, making money, and shipping NOW. For Android, folks know it’s going to be huge, eventually, so they know they’ll make money, eventually. Every time I saw a cool hardware demo, or a neat looking app with some advanced tech, the answer I inevitably got to my, “When can I get it (I really want it!)?” was “fourth quarter this year” or later.

If that doesn’t summarize the state of the two app markets, I’m not sure what else does.

I’ll be adding more photos to my Google I/O 2011 set as I get time. I have some great photos of some of the demos at the party and around the daytime demo area. You can see what I mean for yourself.