Azure Video and the Silverlight Path


Fig 1 – Video Synched to Map Route

My last project experimented with synching Streetside imagery with a route path. Many other continuous asset collections can benefit from the same approach. Tom Churchill of Churchill Navigation develops very sophisticated video camera augmentation software, and he managed to take some time out of a busy schedule to record a simple drive video for me to experiment with.

In this scenario a mobile video camera is used along with a GPS receiver to produce both a video stream and a simultaneous stream of GPS NMEA records. NMEA GPRMC records include a timestamp, latitude, and longitude, along with a good deal of other information that I simply discarded in this project.

First the GPS data file was converted into an XML file. I could then use some existing XML deserializer code to pull the positions into a LocationCollection. These were then used in the Bing Maps Silverlight Control to produce a route path MapPolyline. In this case I didn't get fancy and just embedded the XML in the project as a resource. It would certainly be easy enough to use a GPS track table from SQL Server instead, but I kept it simple.

NMEA GPRMC Record Detail:

$GPRMC,050756,A,4000.8812,N,10516.7323,W,20.2,344.8,211109,10.2,E,D*0E

0	$GPRMC,	http://www.gpsinformation.org/dale/nmea.htm#RMC
1	050756,	time 05:07:56
2	A,		Active A | V
3	4000.8812,	Latitude
4	N,		North
5	10516.7323,	Longitude
6	W,		West
7	20.2,		ground speed in knots
8	344.8,	track angle degrees true
9	211109,	date 11/21/2009
10	10.2,		magnetic variation
11	E,		East
12	D*0E		mode indicator (D = differential) and checksum
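The field breakdown above amounts to a small parser. Here is a sketch (in Python rather than the project's C#) of the ddmm.mmmm-to-decimal-degrees conversion, keeping only the fields used in this project; the function name is illustrative:

```python
def parse_gprmc(sentence):
    """Return (time_str, lat, lon) from a $GPRMC sentence, or None if no fix."""
    fields = sentence.split(",")
    if fields[0] != "$GPRMC" or fields[2] != "A":  # A = active fix, V = void
        return None

    def to_degrees(value, hemisphere, deg_digits):
        # NMEA encodes position as (d)ddmm.mmmm: whole degrees then minutes.
        degrees = float(value[:deg_digits])
        minutes = float(value[deg_digits:])
        result = degrees + minutes / 60.0
        return -result if hemisphere in ("S", "W") else result

    lat = to_degrees(fields[3], fields[4], 2)   # ddmm.mmmm
    lon = to_degrees(fields[5], fields[6], 3)   # dddmm.mmmm
    return fields[1], lat, lon

rec = "$GPRMC,050756,A,4000.8812,N,10516.7323,W,20.2,344.8,211109,10.2,E,D*0E"
print(parse_gprmc(rec))  # UTC time string, then decimal-degree lat/lon
```

Running this over the sample record yields the same decimal coordinates that appear in the second LocationData element below.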

XML resulting from above NMEA record

<?xml version="1.0" encoding="utf-16"?>
<ArrayOfLocationData xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xmlns:xsd="http://www.w3.org/2001/XMLSchema">
  <LocationData>
    <ID>050754</ID>
    <Description>Boulder GPS</Description>
    <Location>
      <Latitude>40.0145083333</Latitude>
      <Longitude>-105.278808333</Longitude>
      <Altitude>0</Altitude>
      <AltitudeReference>Ground</AltitudeReference>
    </Location>
  </LocationData>
  <LocationData>
    <ID>050756</ID>
    <Description>Boulder GPS</Description>
    <Location>
      <Latitude>40.0146866667</Latitude>
      <Longitude>-105.278871667</Longitude>
      <Altitude>0</Altitude>
      <AltitudeReference>Ground</AltitudeReference>
    </Location>
  </LocationData>
      .
      .
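The project pulls this XML into a LocationCollection with existing C# deserializer code; an equivalent sketch in Python, assuming only the element names shown above (load_locations is an illustrative name):

```python
import xml.etree.ElementTree as ET

def load_locations(xml_text):
    """Return [(id, lat, lon), ...] from an ArrayOfLocationData document."""
    root = ET.fromstring(xml_text)
    points = []
    for loc in root.findall("LocationData"):
        points.append((
            loc.findtext("ID"),
            float(loc.findtext("Location/Latitude")),
            float(loc.findtext("Location/Longitude")),
        ))
    return points
```

Since the document declares no default namespace, the unqualified element paths are enough here.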

Once the route path MapPolyline is available I can add a vehicle icon, similar to the last Streetside project. The icon events are used in the same way to start an icon drag. Mouse moves are handled in the map to calculate the nearest point on the path and move the icon constrained to the route. The MouseUp event is handled to synch with the video stream. Basically the user drags a vehicle along the route, and when the icon is dropped the video moves to that point in the video timeline.

Video is a major focus of Silverlight. Microsoft Expression Encoder 3 has a whole raft of codecs specific to Silverlight. It also includes a dozen or so templates for Silverlight players. These players are all ready to snap in to a project and include all the audio volume, video timeline, play-stop-pause, and other controls found in any media player. The styling, however, is different with each template, which makes life endurable for the aesthetically minded. I am not, so the generic gray works fine for my purposes. When faced with fashion or style issues my motto has always been "Nobody will notice," much to the chagrin of my kids.

Expression Encoder 3 Video Player Templates

  • Archetype
  • BlackGlass
  • Chrome
  • Clean
  • CorporateSilver
  • Expression
  • FrostedGallery
  • GoldenAudio
  • Graphing
  • Jukebox
  • Popup
  • QuikSilver
  • Reflection
  • SL3AudioOnly
  • SL3Gallery
  • SL3Standard

At the source end I needed a reliable video to plug into the player template. I had really wanted to try out the Silverlight Streaming Service, which was offered free for prototype testing. However, this service is being closed down and I unfortunately missed out on that chance.

Tim Heuer’s prolific blog has a nice introduction to an alternative.

As it turns out, I was under the mistaken impression that "Silverlight Streaming" was "streaming" video. It was an unfortunate naming choice, which you can read about in Tim's blog post.

As Tim explains, Azure is providing a new Content Delivery Network CTP. This is not streaming, but it is optimized for rapid delivery. The CDN is akin to Amazon's CloudFront: both are edge cache services that boast low latency and high data transfer speeds. Amazon's CloudFront is still in beta, and Azure CDN is at the equivalent stage in Microsoft terminology, CTP or Community Technology Preview. I would not be much surprised to see a streaming media service as part of Azure in the future.

Like CloudFront, Azure CDN is a service promoted from existing Blob storage. This means that using an Azure storage account I can create a Blob storage container, enable CDN on it, and then upload data just like any other blob storage. Enabling CDN can take a while; the notice indicated 60 minutes, which I assume is spent allocating resources and getting the edge caching scripts in place.

I now needed to upload Tom's video, encoded as 640×480 WMV, to the blob storage account with CDN enabled. The last time I tried this there wasn't a lot of Azure upload software available. However, now there are lots of options:

CloudBerry Explorer and Cloud Storage Studio were my favorites, but there are lots of CodePlex open source projects as well:

  • Azure Storage Explorer
  • Factonomy Azure Utility (Azure Storage Utility)
  • SpaceBlock
  • Azure Blob Storage Client

I encountered one problem, however, with all of the above: once my media file exceeded 64MB, which is just about 5 minutes of video at the encoding I chose, my uploads consistently failed. It is unclear whether the problem was at my end or in the upload software. I know there is a 64MB limit for simple blob uploads, but most upload tools should use block mode rather than simple mode. Block mode goes all the way up to 50GB in the current CTP, which is a very long video (roughly 60 hours at this encoding).
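For reference, the block-mode bookkeeping is straightforward. Here is a hypothetical Python sketch of splitting a file into 4MB blocks with equal-length base64 block IDs, the scheme the blob REST API's Put Block / Put Block List operations expect; the HTTP calls and request signing are omitted:

```python
import base64

BLOCK_SIZE = 4 * 1024 * 1024  # 4MB max block size in the current CTP

def make_blocks(data, block_size=BLOCK_SIZE):
    """Return (block_id, chunk) pairs; all block IDs must be equal length."""
    blocks = []
    for i in range(0, len(data), block_size):
        # Zero-padded sequence number keeps every encoded ID the same length.
        block_id = base64.b64encode(b"%08d" % (i // block_size)).decode()
        blocks.append((block_id, data[i:i + block_size]))
    return blocks

# A 70MB file would yield 18 blocks: 17 full 4MB blocks plus a remainder.
```

Each chunk would be PUT individually, then the ID list committed in one final Put Block List call, which is what lets a blob grow far past the 64MB simple-upload limit.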

When I get time I'll return to the PowerShell approach and manually upload my 70MB video sample as blocks. In the meantime I used Expression Encoder to trim the video down by a minute to a 4:30 clip for testing purposes.

Here are the published Azure Storage limits for the released service:

  • Blobs:
      200GB for block blobs (64KB min, 4MB max block size)
      64MB is the limit for a single blob before you need to use blocks
      1TB for page blobs


Fig 2 – Video Synched to Map Route

Now there's a video in the player. The next task is to make a two-way connection between the route path from the GPS records and the video timeline. I used a 1 sec timer tick to check for changes in the video timeline. The current video time is used to loop through the route nodes defined by the GPS positions until reaching a node whose time delta is greater than or equal to the current video time. At that point the car icon position is updated. This positions the icon within roughly a 1-3 second radius of accuracy. It would be possible to refine this using a segment-percentage approach and get down to the 1 sec timer tick's radius of accuracy, but I'm not sure increased accuracy is helpful here.
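The tick handler's node lookup can be sketched like this (a Python sketch with illustrative names; node times here are second offsets from the start of the clip):

```python
import bisect

def node_for_time(node_times, video_seconds):
    """Index of the first node whose time >= video_seconds (clamped to the end)."""
    i = bisect.bisect_left(node_times, video_seconds)
    return min(i, len(node_times) - 1)

node_times = [0, 2, 4, 6, 8]           # one GPS fix roughly every 2 seconds
print(node_for_time(node_times, 5.3))  # -> 3, the node at t=6
```

With the node times pre-sorted, a binary search like this replaces the linear loop, though at a few hundred nodes either is cheap on a 1 sec tick.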

The reverse direction uses the current icon position to keep track of the current segment. Since currentSegment also keeps track of endpoint delta times, it is used to set the video player position in the MouseUp event. Now the route path connects to the video position, and as the video plays the icon location on the route path is also updated. We have a two-way synch between the map route and the video timeline.

  private void MainMap_MouseMove(object sender, MouseEventArgs e)
  {
    Point p = e.GetPosition(MainMap);
    Location LL = MainMap.ViewportPointToLocation(p);
    LLText.Text = String.Format("{0,10:0.000000},{1,11:0.000000}", LL.Latitude, LL.Longitude);
    if (cardown)
    {
      currentSegment = FindNearestSeg(LL, gpsLocations);
      MapLayer.SetPosition(car, currentSegment.nearestPt);
    }
  }

FindNearestSeg is the same as in the previous blog post, except I've added time properties at each segment endpoint. These can be used to calculate the video time position when needed in the MouseUp event.
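The segment-percentage refinement mentioned earlier would look something like this in planar coordinates (a Python sketch with hypothetical names; a real implementation would project the dropped icon's map coordinates first):

```python
def interpolate_time(p, a, b, t1, t2):
    """Project point p onto segment a-b and interpolate between endpoint
    times t1 and t2 by the projected fraction. Points are (x, y) tuples;
    times are seconds into the video."""
    ax, ay = a
    bx, by = b
    px, py = p
    dx, dy = bx - ax, by - ay
    length_sq = dx * dx + dy * dy
    if length_sq == 0:
        return t1  # degenerate segment: both endpoints coincide
    # Clamp the projection fraction so the result stays on the segment.
    frac = max(0.0, min(1.0, ((px - ax) * dx + (py - ay) * dy) / length_sq))
    return t1 + frac * (t2 - t1)

print(interpolate_time((0.5, 1.0), (0.0, 0.0), (1.0, 0.0), 10.0, 12.0))  # 11.0
```

Dropping the icon halfway along a segment then seeks halfway between the endpoint times, instead of snapping to the nearer endpoint's time.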

  private void car_MouseLeftButtonUp(object sender, MouseButtonEventArgs e)
  {
    if (cardown)
    {
      cardown = false;
      VideoBoulder1.Position = currentSegment.t1;
    }
  }

Silverlight UIs for continuous data

This is the second example of using the Silverlight control to synch a route path with a continuous data collection stream. In the first case it was synched with Streetside, and in this case with a GPS-tagged video. This type of UI could be useful in a number of scenarios.

There is currently some interest in various combinations of mobile video and LiDAR collections. Here are some examples:

  • Obviously both Google and Microsoft are busy collecting streetside views for ever expanding coverage.
  • Utility corridors (transmission, fiber, and pipelines) are interested in mobile and flight collections for construction, as-built management, impingement detection, as well as regulatory compliance.
    Here are a couple representative examples:
      Baker Mobile
      Mobile Asset Collection MAC Vehicle
  • Railroads have a similar interest
      Lynx Mobile
  • DOTs are investigating mobile video LiDAR collection as well
      Iowa DOT Research

Summary

Mobile asset collection is a growing industry. Traditional imagery, video, and now LiDAR components collected in stream mode are becoming more common. Silverlight's dual media and mapping controls make UIs for managing and interfacing with these types of continuous assets not only possible in a web environment, but actually fun to create.
