The Quantified Tube

My TV has its own twitter feed now, capturing my TV watching habits, backed by a Belkin smart switch and IFTTT.

One of my (sadly neglected) side projects, code-named Worldline, is to discover ways to automate the tracking of my daily activities and form an unfolding timeline scaffold of my life, both for personal retrospection and for quantitative analysis. I track steps, heart rate, and sleep with a Fitbit, geolocation with Moves, driving with Automatic, iOS device usage with Moment, laptop activity with RescueTime, and some health metrics as well. The tracking of specific human activities -- like when and what I'm eating, when I'm showering or brushing my teeth, when I'm lying on my couch reading a book, or sitting in front of my TV watching a movie -- is more difficult to automate. A significant component of my project is to address that problem using RF proximity devices to provide implicit clues about what I'm up to: with strategically placed beacons throughout a dwelling, I can determine when I'm present in a particular room, or near a specific object.

Proximity info by itself, while giving finer brush strokes than geolocation tracking, does not provide enough fidelity to discern whether I'm reading a book or watching TV while I'm sitting on my couch. However, by folding in other tracking devices, such as motion data from a Fitbit, the whole concert of data streams becomes greater than the sum of the individual devices, revealing a finer-grained picture of what is going on. I recently acquired some new tools to add to the orchestral lifelogging lineup that make it relatively easy to track when and what I'm watching on TV.

With an Apple TV at the core of my entertainment system, you might think it would be straightforward to extract my viewing history, but Apple doesn't provide this functionality. After a bit of research (thanks to @eramirez for the tip), I learned about a third-party scrobbling service from the Quantified Self presentation 'Tracking Media Consumption' by Ian Forrester (@cubicgarden). It's an app that integrates with the FireCore aTV Flash Apple TV jailbreak mod, automatically tracks (aka 'scrobbles') your viewing activity, and links it into social networks. On the surface, it wasn't clear to me exactly what raw data is made available to the end user, either through the API or the paid VIP service (which unlocks an RSS feed).

Although that scrobbling route seems worth exploring further, I decided to come up with a solution that can definitely satisfy my data needs (who doesn't like a good DIY project now and then?). My solution has three core pieces: (1) a Belkin WeMo Insight switch to detect when my TV is turned on or off, (2) IFTTT integration, and (3) the IMDB iPhone app.

I purchased the Belkin WeMo Insight switch from Amazon for $50, and it was relatively straightforward to set up. When you plug it into an outlet, it broadcasts its own WiFi signal that you connect your smartphone to. Then, using the WeMo app on your phone, you configure the device and redirect it to connect to your home's WiFi network. I have found this process can be a bit temperamental (when your WiFi goes down, for example, you have to repeat the steps from scratch). After setup, I plugged my TV into the WeMo. For my purposes, the WeMo is always left on, letting me control my TV with its remote as I normally would. When I switch my TV off, the WeMo transitions from the 'On' state to a 'Stand-by' mode (in contrast, switching the WeMo itself off puts it into the 'Off' state).

The WeMo app provides a direct route to integrate with IFTTT. Once you have registered your WeMo with IFTTT (assuming you already have an IFTTT account set up), you can then create recipes to capture the times when the WeMo enters any of the three states of 'On', 'Stand-by' or 'Off', and then react to each of those events. I set up recipes to record those events in a Google spreadsheet.
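To give a flavor of what can be done with those captured events, here's a minimal Python sketch that pairs up 'On' and 'Stand-by' rows into viewing sessions. It assumes the spreadsheet has been exported to a CSV file and that each recipe appends rows of the form (timestamp, state); the file name and the timestamp format are placeholders that would need to match what your recipes actually write.

```python
import csv
from datetime import datetime

# Placeholder timestamp format; adjust to match what the IFTTT recipe writes.
TIME_FORMAT = "%B %d, %Y at %I:%M%p"

def load_events(path):
    """Yield (datetime, state) pairs from the exported spreadsheet CSV."""
    with open(path) as f:
        for row in csv.reader(f):
            yield datetime.strptime(row[0], TIME_FORMAT), row[1]

def to_sessions(events):
    """Pair each 'On' event with the next 'Stand-by' event."""
    sessions, started = [], None
    for when, state in events:
        if state == "On":
            started = when
        elif state == "Stand-by" and started is not None:
            sessions.append((started, when, when - started))
            started = None
    return sessions

if __name__ == "__main__":
    for start, end, duration in to_sessions(load_events("tv_events.csv")):
        print(start, "->", end, "watched for", duration)
```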

But as anyone who has played around with IFTTT knows, it's kind of addicting, so I decided to also set up recipes to send out tweets when my TV is turned on or off. For this I created a separate Twitter account: @jamieinfinityTV. Capturing data in a Google spreadsheet is useful, but being connected to Twitter opens up other possibilities, since nearly every site or app has Twitter integration.

This got me thinking: why not also try to capture the content of what I watch? IMDB has a smartphone app that lets you quickly search for movies and TV shows, and then tap a button that puts that show in your Watchlist. Using an IFTTT RSS recipe, you can then react to each of those events, i.e. if a new show is added to your IMDB Watchlist feed, then tweet the title and link for that show. I sometimes watch Vimeo videos as well, and Vimeo has an IFTTT channel, so I set up a recipe so that when I 'like' a Vimeo video, it tweets the title and link.
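Incidentally, since the Watchlist additions flow through an RSS feed, the same data could also be pulled directly in Python with the feedparser library. A rough sketch (the feed URL here is purely a placeholder; you'd substitute whatever RSS URL the Watchlist export actually exposes):

```python
import feedparser  # pip install feedparser

# Placeholder URL; substitute the actual RSS export URL for your Watchlist.
WATCHLIST_RSS = "http://example.com/my-imdb-watchlist.rss"

feed = feedparser.parse(WATCHLIST_RSS)
for entry in feed.entries:
    # Each entry carries the same title and link that the IFTTT recipe tweets.
    print(entry.title, entry.link)
```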

To summarize: when I turn my TV on or off, the WeMo/IFTTT mechanism automatically captures those times, in a Google spreadsheet and on Twitter. However, in order to capture what I'm actually watching, I need to manually search for the show in the IMDB app, and then add it to my Watchlist, which, while not automated, is still pretty effortless. 

The next step is to write some Python and/or Javascript code to aggregate the @jamieinfinityTV twitter feed data and visualize it. My long-term goal is to integrate all of my various tracking feeds together into a more holistic picture of what I'm up to each day.
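As a rough first pass at that aggregation, here's a sketch using the tweepy library to pull the @jamieinfinityTV timeline and turn it back into on/off events. The credentials are placeholders, and the 'turned on'/'turned off' phrases are just assumptions about what the IFTTT recipes put in the tweet text.

```python
import tweepy  # pip install tweepy

# Placeholder credentials from a Twitter developer app.
auth = tweepy.OAuthHandler("CONSUMER_KEY", "CONSUMER_SECRET")
auth.set_access_token("ACCESS_TOKEN", "ACCESS_SECRET")
api = tweepy.API(auth)

# Pull recent tweets and bucket them into on/off events by matching the text.
events = []
for tweet in api.user_timeline(screen_name="jamieinfinityTV", count=200):
    text = tweet.text.lower()
    if "turned on" in text:
        events.append((tweet.created_at, "On"))
    elif "turned off" in text:
        events.append((tweet.created_at, "Off"))

events.reverse()  # the timeline arrives newest-first
print(len(events), "TV on/off events fetched")
```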

Codegraphy Project

Examples of relational data visualizations: CodeFlower is a D3.js module for drawing a file dependency graph; CodeCity represents classes as buildings and packages as districts laid out in a grid, color-coded with code metrics; and Circos is a tool for visualizing annotated relational data laid out on a circle.

Lately I've been doing a bit of research to find out what kind of code metrics are commonly used to better understand the structure and health of a codebase, and what tools exist for visualizing those metrics. It's a pretty vast subject (I've probably only scratched the surface in my research), but I'll try to give a summary of my findings so far, and sketch out what I hope to tackle in this area.

I would like to build a tool to visualize the relational structure and information flow in a large-scale iOS project. I'm a very visual thinker who likes to gain a big-picture understanding of things, which I find can be difficult to do when joining a new project with a large codebase. It'd be very helpful if there were ready-made tools available for Objective C to visualize a call graph or dependency matrix of the code, color-coded with metrics like lines of code, cyclomatic complexity, test coverage, or modification activity, which would help identify hot spots of potential code smells and support better-informed iterative architectural decisions.

Although code metric and visualization tools do exist for statically typed languages like C# (e.g. Visual Studio, NDepend) and Java (e.g. Sonargraph, JArchitect), as well as dynamically typed languages like Python (e.g. Radon, Python Call Graph), Ruby (e.g. Code Climate), and Javascript (e.g. JSComplexity, Code Climate), there seems to be a dearth of such tools in the land of iOS (see this Wikipedia page for a list of static code analyzers, which these tools are typically built upon). For Objective C I have unearthed a couple of tools that look worth investigating further: SonarQube is a multi-language platform for managing code quality that has an Objective C plugin. I also came across a blog post that describes how to set up iOS code metrics in Jenkins. There is also this Python script for generating an import dependency graph.

Given that a visualization tool for Objective C code structure and metrics doesn't exist (at least not in the form I have in mind), I've begun to explore what it would take to build one. The first ingredient I'll need is a tool for parsing through code and generating the relational graphs I would like to visualize. The clang compiler has a C API library called libclang that can be used to parse C/C++/Objective C code into an abstract syntax tree (AST) and to process such structures. There is also a convenient Python binding for libclang (for a helpful reference, see this blog post).
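To illustrate what that looks like in practice, here's a small sketch using the libclang Python bindings to walk a single Objective C file's AST and print the class interfaces and message sends it encounters. The file name and compiler flags are placeholders that would need to match the project's actual build settings (and libclang itself must be installed and discoverable).

```python
from clang.cindex import Index, CursorKind

# Parse one Objective C file into a translation unit (the AST root).
# The file name and the '-ObjC' flag are placeholders for real build settings.
index = Index.create()
tu = index.parse("MyViewController.m", args=["-ObjC"])

# Walk the AST, noting class declarations and message sends.
for node in tu.cursor.walk_preorder():
    if node.kind == CursorKind.OBJC_INTERFACE_DECL:
        print("class:", node.spelling)
    elif node.kind == CursorKind.OBJC_MESSAGE_EXPR:
        print("  message send:", node.spelling)
```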

So, the first step in creating a visualization tool is to use libclang to process all of the Objective C code in a project into a graph data structure (or dependency matrix). But what defines this structure? What are the nodes and links? Depending on the analysis, one could consider a node to be a file, a class, or perhaps even an object. A link corresponds to some kind of directional relation between nodes, such as when a file depends on another file, or a class calls a method in another class, or an object is injected into another object, either via constructor or method injection. I've begun to explore these different possibilities and likely more than one will turn out to be useful.
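As one concrete cut at this, the sketch below treats files as nodes and direct #import/#include relationships as links, using libclang's include tracking; the project directory and compiler flags are again placeholders.

```python
import glob
from collections import defaultdict
from clang.cindex import Index

# File-level dependency graph: node = source file, and a directed
# link A -> B means A directly #imports (or #includes) B.
index = Index.create()
graph = defaultdict(set)

for path in glob.glob("MyProject/**/*.m", recursive=True):
    tu = index.parse(path, args=["-ObjC"])
    for inclusion in tu.get_includes():
        if inclusion.depth == 1:  # keep only direct includes from this file
            graph[path].add(inclusion.include.name)

for source, targets in sorted(graph.items()):
    print(source, "->", sorted(targets))
```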

The next major step after building a relational structure will be to calculate various code metrics, such as lines of code (LOC), complexity, and code coverage, which can be incorporated via various graph element stylings, such as node size and color. Aside from the usual basic metrics, it would be interesting to consider ways to quantify properties such as code coupling and cohesion, within the source code and between source and tests, to get a sense of how flexible the code is to modification. 
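As a starting point, lines of code is easy to compute directly, and mapping it onto a node size is straightforward; here's a rough sketch where the comment handling is deliberately crude and the scaling constants are arbitrary illustrative choices.

```python
import math

def lines_of_code(path):
    """Crude LOC: count non-blank lines that aren't comment-only ('//') lines."""
    with open(path) as f:
        return sum(1 for line in f
                   if line.strip() and not line.strip().startswith("//"))

def node_style(loc, min_radius=4.0, scale=1.5):
    """Map LOC onto a node radius; sqrt keeps huge files from dominating."""
    return {"loc": loc, "radius": min_radius + scale * math.sqrt(loc)}

print(node_style(lines_of_code("MyViewController.m")))
```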

The final form of this tool will most likely be a D3.js driven interactive web page. I've come across some existing code that should serve as useful references, such as CodeFlower and DependencyWheel (which is similar to a Circos visualization). I'm also intrigued by the CodeCity project, which is based around a city metaphor, representing classes as buildings. I wonder how far one could take that metaphor, perhaps superimposing transit-like network structures to represent the flow of data through the system. 
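As a bridge to that front end, here's a quick sketch of serializing the dependency graph and metrics into the flat nodes/links JSON shape that many D3 force-layout examples consume (the field names follow those examples, not any particular library's requirements).

```python
import json

def to_d3_json(graph, metrics):
    """graph: {source_file: set(of target files)}; metrics: {file: loc}."""
    names = sorted(set(graph) | {t for targets in graph.values() for t in targets})
    index = {name: i for i, name in enumerate(names)}
    nodes = [{"name": n, "loc": metrics.get(n, 0)} for n in names]
    links = [{"source": index[src], "target": index[dst]}
             for src, targets in graph.items() for dst in targets]
    return json.dumps({"nodes": nodes, "links": links}, indent=2)
```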

RF Proximity Spikes

StickNFind, Gimbal, and RedBearLab beacons use Bluetooth LE technology to detect the proximity of a mobile device.

For more than a year I've been scoping out potential devices I could use for RF proximity sensing, a technology that has become increasingly mainstream since Apple embraced it with their Bluetooth LE-based iBeacon spec and developer API (subtly introduced in iOS 7 last summer). The basic premise is that a very small device (a 'beacon') with a radio frequency antenna periodically emits a signal broadcasting a unique identifier assigned to the device, and a mobile device such as an iPhone can then detect that signal when it is within range. By gauging the signal strength, the mobile device can estimate how close it is to the beacon. The Bluetooth LE specification is well suited for this type of application, and several manufacturers had already started bringing their devices to market before Apple formally released their proprietary iBeacon specification earlier this year.
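For reference, the usual way to turn a received signal strength (RSSI) reading into a rough distance is a log-distance path-loss model. Here's a minimal sketch; the calibrated RSSI at one meter and the path-loss exponent are assumptions that would have to be measured for a given beacon and environment.

```python
def estimate_distance(rssi, rssi_at_1m=-59.0, path_loss_exponent=2.0):
    """Rough distance in meters from an RSSI reading, using the
    log-distance path-loss model: rssi = rssi_at_1m - 10*n*log10(d)."""
    return 10 ** ((rssi_at_1m - rssi) / (10 * path_loss_exponent))

# With these assumed constants, a reading of -75 dBm maps to roughly 6.3 m.
print(round(estimate_distance(-75.0), 1))
```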

One of the earliest devices in this space was the StickNFind beacon, released commercially over a year ago with the targeted use case of finding lost objects. The company quickly began offering developer support in the form of an iOS SDK and a developer forum. I purchased 10 StickNFind beacons in bulk, for $15 per unit, last summer and began trying out their SDK shortly thereafter. Overall, I found their developer support left much to be desired: new versions of the SDK were released infrequently, and via email rather than through a hosted web site. I also found the signal strength of the beacons to undergo rather wide fluctuations, which made it difficult to estimate proximity reliably. After several months I gave up on their platform.
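Whichever platform ends up winning, one common way to tame that kind of fluctuation is to smooth the RSSI stream before estimating proximity. A small sketch of an exponential moving average filter (the smoothing factor is just an illustrative choice):

```python
def smooth_rssi(readings, alpha=0.2):
    """Exponential moving average over a stream of RSSI readings;
    a smaller alpha gives smoother output but reacts more slowly."""
    smoothed, current = [], None
    for reading in readings:
        current = reading if current is None else alpha * reading + (1 - alpha) * current
        smoothed.append(current)
    return smoothed

print(smooth_rssi([-70, -90, -65, -72, -88, -69]))
```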

Near the end of 2013, Qualcomm unveiled their Gimbal proximity beacon platform with full iOS and Android developer support. When I registered for their developer program, they shipped three of their beacon devices for free to help me get started, so I began experimenting with their devices and feature-rich SDK earlier this year. I found the iOS-facing API to be well architected and easy to use. In particular, the key ingredient for my use case is the ability to continuously monitor the signal strength of a beacon while the app is backgrounded, for which they provide a callback method. Once they made the devices available for purchase, at an astonishingly low cost of $5 per unit, I obtained enough to actually start deploying them in my home for testing. I'm not yet fully committed to Gimbal, but so far it seems like the most promising option.

Another strong contender in this space that I've only recently begun to explore is the offering from RedBearLab. These devices are quite a bit pricier at $25 per unit (prices vary based on purchase size), but what I find compelling is that they've made their platform open and they support the iBeacon spec. I believe this would allow more under-the-hood customization than Gimbal offers, but with the obvious downside of requiring more up-front investment in building out the API.

Hopefully I'll find time to explore these devices further so I can commit to one platform and move on to the next stage of my project.