Last week at Google I/O, Brad Abrams and I decided to go for the gold and give a talk on building a cloud backend for your Android app, complete with live coding on stage. To make the example more interesting, we chose to build a sample app that relies on location services and Google Cloud Messaging. It all seemed like a good idea at the time. For the benefit of other Android presenters, I wanted to give a behind-the-scenes view of how we overcame some significant challenges to pull off the demos. I'll also explain the anomalies we saw.
Device display. The easiest way to show an app is with an emulator, but the Maps V2 API is part of Google Play Services, which doesn't currently run on the Android emulator, so we needed physical devices. Droid@Screen works, but it updates slowly and requires the USB cable, which we needed for hardwired Ethernet. Some devices (the Galaxy Nexus and Nexus 10, for example) support HDMI out, which can be used with a Blackmagic or similar HDMI capture box to show the device display on your Mac screen. Note that this requires a Mac with two Thunderbolt ports (hello Retina! Blackmagic in, Mac display adapter out). We could not use this either, because the phone's HDMI connector is shared with the USB connector, which, again, we needed for hardwired Ethernet. Fortunately, we had a Wolfvision projector camera available and a 4-port video switcher to show either our Macs or the camera. For best results with a projector, set the phone's display brightness to max.
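On that note, the demo app itself can pin its own window to full brightness, so that's one less thing to remember backstage. A minimal sketch, assuming a hypothetical DemoActivity:

```java
import android.app.Activity;
import android.os.Bundle;
import android.view.WindowManager;

public class DemoActivity extends Activity {
    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        // 1.0f = maximum brightness, applied to this window only;
        // the system-wide brightness setting is left untouched.
        WindowManager.LayoutParams lp = getWindow().getAttributes();
        lp.screenBrightness = 1.0f;
        getWindow().setAttributes(lp);
    }
}
```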
The network. Gather 5000+ geeks in a massive concrete structure, give them all a new wifi device, and then try to get a working Internet connection… Despite Google's stellar efforts, getting wireless Internet at Moscone Center remains very challenging. 3G/4G/LTE coverage is inhibited by the concrete and steel, and even 5GHz wifi suffered from RF interference or sheer demand. So we knew in advance that we would need hardwired Internet. That was no problem for our laptops, but we also needed two phones for the demos to show continuous queries in action. Fortunately, Galaxy Nexus phones support wired connections using a USB OTG adapter paired with a USB-to-Ethernet adapter, and it just worked. So far so good.
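If you try this yourself, it's worth verifying before show time that traffic is really flowing over the wired adapter. Here's a hedged sketch using the framework ConnectivityManager; it's not from our demo code, whether a USB-Ethernet adapter reports as TYPE_ETHERNET varies by device, and it assumes the ACCESS_NETWORK_STATE permission is in the manifest:

```java
import android.content.Context;
import android.net.ConnectivityManager;
import android.net.NetworkInfo;
import android.util.Log;

/** Hypothetical pre-demo sanity check for the active network type. */
public class NetCheck {
    public static void logActiveNetwork(Context context) {
        ConnectivityManager cm = (ConnectivityManager)
                context.getSystemService(Context.CONNECTIVITY_SERVICE);
        NetworkInfo info = cm.getActiveNetworkInfo();
        if (info == null || !info.isConnected()) {
            Log.w("NetCheck", "no active connection");
        } else if (info.getType() == ConnectivityManager.TYPE_ETHERNET) {
            Log.i("NetCheck", "wired Ethernet is up");
        } else {
            Log.i("NetCheck", "connected via " + info.getTypeName());
        }
    }
}
```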
Deferred launch. Because we were coding live in Eclipse, we deployed the code to the phones in real time via the USB cable. When you run an Android app in Eclipse, the default action is to install and launch it. That created a problem for the simple version of our demo app, which needed an Internet connection at launch so it could send a location update immediately. We had to disconnect the USB cable, walk the phone over to the camera, plug in the hardwired Ethernet, and only then launch the app from the home screen. Fortunately, Eclipse has a way to support this: in the project's run configuration, on the Android tab, simply select "Do nothing" as the launch action. This installs the app but doesn't launch it.
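The command-line equivalent, for anyone scripting this instead, is a plain adb install, which copies the APK onto the device without starting it (the APK name here is hypothetical):

```
adb install MobileBackendDemo.apk
```

After that, we could unplug USB, connect the Ethernet adapter, and launch from the home screen.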
Live coding. Speaking of live coding, we have all seen those demos with lots of tedious cut-and-paste operations from a text file to the IDE. As a presenter, I want something visible in the IDE to prompt me through the fully manual steps, as well as an easy way to paste in blocks of code. Eclipse snippets are just the thing. You can show them off to the side of your main editor window, then double-click a snippet to paste it in. I used snippet titles without code to prompt me for the manual steps. Another challenge of live coding is that a mistake at an early stage of the demo can ruin a later stage. To make recovery easier, we created snapshots of working demo code at six different stages of our presentation. It would have been nice to preinstall these app versions on the devices; however, they all use the same package name, and the Android installer doesn't allow more than one app with the same package name at a time. Hence the need for adb install and the deferred launch technique. We also had to uninstall the prior version of the app on the device before each coding stage. Furthermore, Eclipse doesn't let you create multiple projects with the same name. There is a workaround, but it kind of spoils the magic of our talk and I've already spilled enough secrets… let's just say Eclipse workspaces and "Copy project" are a beautiful thing.
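Concretely, each stage swap came down to a pair of adb commands along these lines (the package and APK names are hypothetical):

```
# Same package name, so the previous stage has to go first...
adb uninstall com.example.mobilebackend.demo
# ...then install the next snapshot; it won't launch until tapped.
adb install stage3-snapshot.apk
```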
Location services. This is the one that snagged us in the first demo. It turns out that location services are not in fact magic: they require GPS (in Moscone?!) or wifi, which we had turned off just before speaking to eliminate any possibility of conflict with the hardwired Ethernet. Fortunately, we were able to get a good enough wifi signal in a later demo to show location working. On desktops, geolocation works over a hardwired connection, so I'm not yet sure whether this is a bug or a feature on mobile… it is admittedly an uncommon use case.
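To make the dependency concrete: the network location provider derives a fix from nearby wifi (and cell) signals, so with wifi off and no GPS reception indoors, a request like the one below simply never calls back. This is a minimal framework-API sketch for illustration, not our actual demo code; the names are made up, and it assumes ACCESS_COARSE_LOCATION in the manifest:

```java
import android.app.Activity;
import android.content.Context;
import android.location.Location;
import android.location.LocationListener;
import android.location.LocationManager;
import android.os.Bundle;
import android.util.Log;

public class LocationDemoActivity extends Activity {
    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        LocationManager lm =
                (LocationManager) getSystemService(Context.LOCATION_SERVICE);
        // NETWORK_PROVIDER triangulates from wifi and cell signals;
        // with wifi off and no GPS fix, onLocationChanged never fires.
        lm.requestLocationUpdates(LocationManager.NETWORK_PROVIDER,
                5000,  // min time between updates, ms
                10,    // min distance between updates, meters
                new LocationListener() {
                    @Override
                    public void onLocationChanged(Location location) {
                        Log.i("LocDemo", "got fix: " + location);
                        // e.g., send the update to the backend here
                    }
                    @Override public void onStatusChanged(String p, int s, Bundle b) {}
                    @Override public void onProviderEnabled(String p) {}
                    @Override public void onProviderDisabled(String p) {}
                });
    }
}
```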
The unanticipated. After recovering from the first demo glitch by turning on wifi, I demonstrated that location services worked, but the app never sent the location to the server. It turns out the code was working as intended: I had already enabled authentication for my next segment, but the app from the previous segment, which I had revisited, did not have auth turned on. Finally, the last demo showed two markers when it should have shown only one. The second marker turned out to be a fellow Googler in the audience who had helped dogfood an early version of the demo app. So once again, the code worked as intended, just not as I intended at that moment. But… this is the stuff live demos are made of. I still prefer live demos with a few glitches to an hour full of slides 🙂
So now you know what Brad and I were each busily doing while the other was talking, and why talks of this complexity are pretty much impossible without two people and a lot of presentation gear. It was a teeny bit intimidating knowing that our session was part of the Google I/O live stream with thousands of viewers, but in the end it came off (almost) without glitches, and our gracious audience was wowed by the Mobile Backend Starter project. Mission accomplished!
See my next post for a complete list of resources on Mobile Backend Starter.