It’s not every day that you are given the opportunity to help recreate the famous “bullet time” effect from The Matrix using a very different set of tools: 50 Lumia 1020s. The effect would be the star of the new advertisement for the Lumia. What came next was an interesting app development challenge that would inevitably showcase the productivity of Microsoft’s development platform and tools. The goal? Create a camera app that would take full-resolution pictures from all 50 phones at exactly the same time, capturing 50 different perspectives of the same moment. Challenge accepted.
“Fail fast” was the mantra; an aggressive schedule demanded it. We had to think of the simplest way to achieve the goal and try it. Ensuring that all the phones could capture a sequence of pictures with a latency of less than 50 ms (1 ms is a thousandth of a second) was not a simple goal. We wanted to avoid point-to-point connections to each device to push the “shutter”: the triggering app would have to enumerate each connection sequentially to send data, introducing the risk of triggering delays between the first and last devices.
Initially, we considered an approach that relied on each device’s clock: the phones would poll a cloud resource to receive the time at which they needed to take the picture. We quickly realized that the devices’ clocks were off by a few seconds despite having clock syncing enabled. We gave this approach another try by implementing a time-adjustment algorithm that uses a central point of reference in the cloud. Even though we brought the latency down to 150 ms, we were still short of our goal.
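Our apps were written against the Windows APIs, so the following is only an illustrative Python sketch of the kind of time-adjustment algorithm described above: each phone estimates its offset from a central reference clock (Cristian-style, halving the round trip to approximate one-way delay) and translates a scheduled capture time into its own clock. The `get_server_time` callable is a hypothetical stand-in for the cloud endpoint.

```python
import time

def estimate_offset(get_server_time):
    """Estimate this device's clock offset from a central reference.

    get_server_time() returns the reference clock's current time in
    seconds. Half the round-trip time approximates the one-way network
    delay (Cristian's algorithm).
    """
    t0 = time.time()
    server_now = get_server_time()
    t1 = time.time()
    one_way_delay = (t1 - t0) / 2
    # Offset = server time minus our best guess of the local time
    # at the instant the server read its clock.
    return server_now - (t0 + one_way_delay)

def local_fire_time(scheduled_server_time, offset):
    """Translate a server-scheduled capture time into local clock time."""
    return scheduled_server_time - offset
```

Even with a good offset estimate, variable network delay on each poll puts a floor on how tightly the phones can agree, which is consistent with the ~150 ms we saw.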
At this point, we iterated again and began to work on a new solution based on network multicasting. Network multicasting is a form of network communication in which data is addressed simultaneously to a group of devices.
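As a rough sketch of what joining a multicast group involves (our apps used the Windows networking APIs, not Python sockets; the group address and port below are made up for illustration), each listening device binds a UDP socket and subscribes to the group:

```python
import socket
import struct

MCAST_GROUP = "239.1.1.1"   # assumed group address for illustration
MCAST_PORT = 5007           # assumed port

def make_listener():
    """Create a UDP socket subscribed to the multicast group."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM,
                         socket.IPPROTO_UDP)
    sock.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    sock.bind(("", MCAST_PORT))
    # Join the group on all interfaces; every subscribed device
    # receives each datagram sent to the group at (roughly) once.
    mreq = struct.pack("4s4s", socket.inet_aton(MCAST_GROUP),
                       socket.inet_aton("0.0.0.0"))
    sock.setsockopt(socket.IPPROTO_IP, socket.IP_ADD_MEMBERSHIP, mreq)
    return sock
```

The key property for our purposes: the sender transmits one datagram, and the network delivers it to all 50 subscribers without the sender enumerating connections one by one.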
The new approach showed promising results in our first tests. We still had to tackle odd discrepancies between certain devices, caused by the camera’s flash and the device’s automatic lighting adjustment, but after a few iterations of tweaks we were able to meet our latency goal!
The iterative process, coupled with Visual Studio, enabled us to adapt quickly, code with superb productivity, and implement these ideas in a few lines of code.
One Controller to Control Them All
Each phone runs a custom camera app, built from the ground up for this project, that takes advantage of the Lumia 1020’s maximum picture resolution. The app listens for a network “trigger” that fires the camera and saves the picture on the device.
A key concern was avoiding manipulating the devices once they were mounted on the rig, a custom-built metal arc on which each device was purposely placed to capture a 180-degree perspective of the same moment in time. Great for making the effect possible, but not for adjusting each device individually. The ability to set the camera settings for all the devices from a central location was therefore a chief goal, so, taking advantage of the same networking approach we used for the trigger, we implemented a mechanism where each device would also “listen” for the camera settings.
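On the device side, this amounts to dispatching on the kind of message received. Here is a minimal Python sketch under assumed names: `CameraStub` stands in for the device camera, and `apply_settings`/`capture` are illustrative method names, not the real phone APIs. The message format (JSON with a `cmd` field) is also an assumption.

```python
import json

class CameraStub:
    """Stand-in for the device camera, for illustration only."""
    def __init__(self):
        self.settings = {}
        self.shots = 0
    def apply_settings(self, args):
        self.settings.update(args)
    def capture(self):
        self.shots += 1

def handle_datagram(data, camera):
    """Dispatch one received multicast message to the camera."""
    msg = json.loads(data.decode())
    if msg["cmd"] == "settings":
        camera.apply_settings(msg.get("args", {}))
    elif msg["cmd"] == "capture":
        camera.capture()
    return msg["cmd"]
```

The same listener thus serves both purposes: a “settings” message reconfigures the camera in place, and a “capture” message fires the shutter.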
Like any project, the aggressive schedule kept looming while the list of new challenges kept growing. One of those challenges was figuring out how the director would trigger the shooting sequence and broadcast the camera settings. This functionality called for a bigger form factor, and a Surface was the natural choice.
From the development perspective, this meant that we had to create another app, one that would send the execution trigger as well as the camera settings to all the phones. For this purpose, the WinRT networking APIs provided us with the stepping stone we needed to create a Windows Store app in a very short period of time.
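The controller side is even simpler: one datagram per command, sent once to the group. Again, this is a Python sketch rather than the WinRT code we actually wrote; the group address, port, and JSON message shape are assumptions, and the socket is pinned to loopback here only so the sketch runs anywhere (on the rig it would go out over the shoot’s network interface).

```python
import json
import socket

MCAST_GROUP = "239.1.1.1"   # assumed; must match the group the phones joined
MCAST_PORT = 5007           # assumed port

def broadcast(command, payload=None):
    """Send a single command datagram to every listening phone at once."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM,
                         socket.IPPROTO_UDP)
    # Keep the datagram on the local network segment.
    sock.setsockopt(socket.IPPROTO_IP, socket.IP_MULTICAST_TTL, 1)
    # Pinned to loopback so this sketch runs on any machine; in a real
    # deployment this would be the interface facing the phones.
    sock.setsockopt(socket.IPPROTO_IP, socket.IP_MULTICAST_IF,
                    socket.inet_aton("127.0.0.1"))
    message = json.dumps({"cmd": command, "args": payload or {}}).encode()
    sock.sendto(message, (MCAST_GROUP, MCAST_PORT))
    sock.close()
    return message

# Hypothetical usage: push camera settings, then fire the shutter.
broadcast("settings", {"iso": 100, "exposure_ms": 8, "flash": "off"})
broadcast("capture")
```

Because the send is a single operation regardless of how many phones are listening, there is no per-device enumeration to skew the first device relative to the fiftieth.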
Testing with 50 Devices
But wait, how do you measure latency? Especially considering that the internal clocks of the devices could be out of sync by a few seconds, which rules out any timestamp or logging mechanism generated from the phones. An impossible task? Not really. Our solution was quite simple: to test, we pointed the cameras at a large screen displaying a stopwatch running down to the millisecond. Once we triggered the capture, we checked the time value captured in each photo. Latency measured!
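The arithmetic behind this is just the spread between the earliest and latest stopwatch value visible in the photos. A tiny sketch, with hypothetical readings:

```python
def capture_spread_ms(photo_timestamps_ms):
    """Spread between earliest and latest capture, in milliseconds.

    photo_timestamps_ms: the stopwatch values (in ms) read off each
    phone's photo of the on-screen timer.
    """
    return max(photo_timestamps_ms) - min(photo_timestamps_ms)

# Hypothetical readings from five of the photos:
readings = [12345, 12351, 12349, 12360, 12347]
print(capture_spread_ms(readings))  # prints 15 -> within the 50 ms goal
```

Because every phone photographs the same external clock, the devices’ own clock skew drops out of the measurement entirely.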
With so many individual moving parts, there was no way we could anticipate the number and types of issues we were going to face once the components were integrated and tested as a whole. The first challenge we encountered was that the arm holding each phone on the rig obstructed the viewfinder. To fix this, we made the viewfinder resizable.
A more pressing concern was that we could not get a good sense of whether the devices were receiving the execution trigger and the settings. With the multicasting mechanism we had low latency at the expense of reliability, so we decided to show on each device a running log of what was received so we could check it after each shot.
Simply put, we strived towards finding simple and practical solutions to our challenges.
At the shoot, the hard work paid off and the applications performed as expected, delivering the foundation for the awesome video that Paul Trillo created.
In this blog post, I gave you a quick overview of how we approached the creation of the software for the Lumia Moments commercial. In my next blog posts, I will dive deeper into some key aspects of the software’s implementation.
Update: I made some changes to this blog post since I first published it to include some additional details about the phone app and how we tested latency.