Silverlight 1.0
by Devin Rader, Jason Beres, J. Ambrose Little, Grant Hinkson
October 2007, Paperback


Excerpt from Silverlight 1.0

Building Silverlight Video Applications

By Jason Beres

In virtually every demo of the new Silverlight technology, a video is shown integrated into a Web page. This is intentional. The goal of the initial release of Silverlight is to provide rich multimedia experiences on Web pages, which in the case of Silverlight 1.0 means audio and video. If you look at the top 100 most-trafficked Web sites on the Internet, almost all of them have video playing on the home page or have video prevalent throughout. Adobe Flash is currently the dominant cross-platform vehicle for playing media, so if Microsoft was to take the next step of offering a complete stack of capabilities for Web pages, multimedia integration was essential.

Working with XAML in This Article

To work with the XAML in this article, you should create a new Visual Studio 2005 Silverlight project. The next few steps produce a minimal application that runs, into which you can type the XAML or JavaScript for the code you are learning about.

  1. In Visual Studio, create a new Silverlight project named SilverlightWrox1.
  2. Delete all of the XAML in the Scene.xaml file except the root Canvas declaration and the namespace references. It should look like the following code:
<Canvas xmlns="http://schemas.microsoft.com/client/2007" 
        xmlns:x="http://schemas.microsoft.com/winfx/2006/xaml">
</Canvas>
  3. Delete all of the JavaScript in the Scene.xaml.js file except the following JavaScript:
if (!window.SilverlightWrox1)
    window.SilverlightWrox1 = {};
SilverlightWrox1.Scene = function() 
{
};
SilverlightWrox1.Scene.prototype =
{
    handleLoad: function(plugIn, userContext, rootElement) 
    {
    }
};

You now have a basic template to which you can add XAML and JavaScript code to test the concepts in this article. Most of the XAML listed here can simply be typed into the Scene.xaml file inside the root Canvas object. If anything special needs to be done to get code to work, we will point it out.

Adding Silverlight Video to Web Pages

To add video or audio to a Web page, you set the Source property on the MediaElement object. The following code demonstrates playing the video file water.wmv automatically when the canvas is loaded:

    <Canvas
        xmlns="http://schemas.microsoft.com/client/2007"
        xmlns:x="http://schemas.microsoft.com/winfx/2006/xaml"
        Width="640" Height="480">
        
        <MediaElement Source="water.wmv"></MediaElement>
        
    </Canvas>

The Source property is the URI of a valid video or audio file. In the preceding code example, the source file is located in the same directory as the XAML file and the HTML page that loaded it. Your media files can be located in various places, including the folder structure of the Web site you are running the page from, or on a remote site. In either case, in order to maintain cross-platform support, you must use "/" in place of "\" in your URIs. For example:

   <MediaElement Source="..\..\media\water.wmv"></MediaElement>

should read:

   <MediaElement Source="../../media/water.wmv"></MediaElement>
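If you build media URIs in JavaScript, it is easy to normalize the separators up front. The following helper is a sketch of my own; the function name is illustrative and not part of the Silverlight API:

```javascript
// Normalize Windows-style path separators to the forward slashes
// required for cross-platform URIs. Illustrative helper, not a
// Silverlight API.
function toMediaUri(path) {
    return path.replace(/\\/g, "/");
}

// Example: a Windows-style relative path becomes a valid URI.
var uri = toMediaUri("..\\..\\media\\water.wmv"); // "../../media/water.wmv"
```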

If the Source property points to a file on a Windows Media Server using the MMS protocol, the player automatically attempts to stream the video down to the client. Otherwise, the default behavior is a progressive download, which means the audio or video begins playing immediately and continues loading in the background as it plays. The drawback to progressive downloads is that the media file keeps downloading even if you pause the video and never intend to resume playing it. With streaming media, the only data downloaded is the data you actually play, which is a more efficient use of network resources.
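During a progressive download you can report progress to the user. The sketch below assumes a TextBlock named progressText in the XAML and wires up the MediaElement's DownloadProgressChanged event; the formatting helper is my own, not part of the API:

```javascript
// Convert a 0-1 progress value into a whole-number percentage string.
// Illustrative helper, not part of the Silverlight API.
function formatPercent(progress) {
    return Math.round(progress * 100) + "%";
}

// Hedged sketch of a DownloadProgressChanged handler; assumes the XAML
// contains a MediaElement named "media" and a TextBlock named
// "progressText".
function onDownloadProgress(sender, args) {
    sender.findName("progressText").Text =
        formatPercent(sender.findName("media").DownloadProgress);
}
```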

At Mix '07, Microsoft announced a free media streaming service for Silverlight applications, named Silverlight Streaming Services. Using Silverlight Streaming Services, anyone can upload up to 4 gigabytes of Silverlight content and stream it to their pages. An SDK and other tools make it easy to work with the streaming service, but the end goal is to provide an easy-to-use service that eliminates barriers to adopting and using the Silverlight platform. To get a free account for this service, visit https://silverlight.live.com.

Supported Audio and Video Formats

The MediaElement supports the Advanced Stream Redirector (ASX) playlist file format, as well as the audio and video formats listed in Table 1.

Table 1: MediaElement Supported Formats
Audio Format  Description
WMA 7         Windows Media Audio 7
WMA 8         Windows Media Audio 8
WMA 9         Windows Media Audio 9
MP3           ISO/MPEG Layer-3 in the following configurations:
              • ISO/MPEG Layer-3 compliant data stream input
              • Mono or stereo channel configurations
              • 8, 11.025, 12, 16, 22.05, 24, 32, 44.1, and 48 kHz sampling frequencies
              • 8-320 kbps and variable bit rates
              • Free format mode (ISO/IEC 11172-3, sub clause 2.4.2.3) is not supported

Video Format  Description
WMV 1         Windows Media Video 7
WMV 2         Windows Media Video 8
WMV 3         Windows Media Video 9
WMVA          Windows Media Video Advanced Profile, non-VC-1
WMVC1         Windows Media Video Advanced Profile, VC-1

It is important to note that the actual file extension of a media file is irrelevant; the player determines the encoding and plays the media file with the appropriate codec. To encode audio and video in the formats just described, you can use the new Expression Encoder, another tool in the Expression Suite of products. Using Expression Encoder, you can encode to the aforementioned formats, as well as add markers and overlays and even crop your media files. I have attempted to play files with the WMV extension that did not play; this is most likely because the tool used to encode the video applied some other custom codec, not one of the supported encodings for Silverlight. The water.wmv video used in Listing 1 was originally an MPEG video that I recorded with my digital camera. To get a WMV file in a format optimized for broadband playback in Silverlight 1.0, I used the Expression Encoder.

Interacting with the MediaElement Object

When you add a MediaElement to a page, you can get or set a number of properties that will change the state of the media in the MediaElement. Table 2 lists some of the properties that the MediaElement exposes.

Table 2: Properties Exposed on the MediaElement
Property           Description
AutoPlay           Gets or sets a value indicating whether media will begin playback automatically when the Source property is set.
Balance            Gets or sets the ratio of volume across stereo speakers.
BufferingProgress  Gets a value that indicates the current percentage of buffering progress.
BufferingTime      Gets or sets the amount of time to buffer.
CurrentState       The current state of the MediaElement: Buffering, Closed, Error, Opening, Paused, Playing, or Stopped.
DownloadProgress   A value indicating the percentage of download completed for content located on a remote server, ranging from 0 to 1.
IsMuted            Gets or sets a value indicating whether the audio is muted.
Markers            The collection of timeline markers (represented as TimelineMarker objects) associated with the currently loaded media file.
NaturalDuration    Gets the natural duration of the media.
Position           Gets or sets the current progress through the media's playback time. If the media does not support seek operations, setting this property has no effect on playback.
Source             Gets or sets the media source on the MediaElement.
Volume             Gets or sets the media's volume, represented on a linear scale between 0 and 1.
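Because Volume is expressed on a 0-to-1 scale, volume controls you build in JavaScript should clamp their values before assigning them. The following is a minimal sketch; the helper and the "volume up" handler are my own, and the element name "media" is assumed:

```javascript
// Clamp a volume value to the 0-1 range that MediaElement.Volume expects.
// Illustrative helper, not part of the Silverlight API.
function clampVolume(v) {
    return Math.max(0, Math.min(1, v));
}

// Sketch of a "volume up" handler; assumes a MediaElement named "media".
function volumeUp(sender, args) {
    var media = sender.findName("media");
    media.Volume = clampVolume(media.Volume + 0.1);
}
```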

The MediaElement also exposes a number of methods that let you interact with it much as you would with any video or audio player, such as Play, Pause, and Stop. To create a more interactive media player, you add Image, TextBlock, or Rectangle objects to a Canvas and write JavaScript that interacts with those elements to play, pause, and stop the media. Listing 1 is the XAML needed to create a simple, interactive media player.

Listing 1: Adding Play, Pause, and Stop Capabilities to a MediaElement Object

<MediaElement x:Name="media"
     Source="water.wmv" 
     Width="480" Height="360"/>
<!-- Plays media-->
<Canvas MouseLeftButtonDown="mediaPlay"
  Canvas.Left="10" Canvas.Top="365">
    <TextBlock Canvas.Left="5" 
               Canvas.Top="5">Play</TextBlock>
</Canvas>
<!-- Pauses media playback-->
<Canvas MouseLeftButtonDown="mediaPause"
  Canvas.Left="70" Canvas.Top="365">
    <TextBlock Canvas.Left="5" 
               Canvas.Top="5">Pause</TextBlock>
</Canvas>
<!-- Stops media playback-->
<Canvas MouseLeftButtonDown="mediaStop"
  Canvas.Left="130" Canvas.Top="365">
    <TextBlock Canvas.Left="5" 
               Canvas.Top="5">Stop</TextBlock>
</Canvas>

Listing 2 is the JavaScript needed to set the actual state of the media. Notice the use of findName on the sender object passed into the function. Remember that findName searches the object tree of the entire XAML DOM to locate the elements you are attempting to find.

Listing 2: Using JavaScript to Interact with the MediaElement State

function mediaStop(sender, args) {
    sender.findName("media").stop(); 
}
function mediaPause(sender, args) {
    sender.findName("media").pause();
}
function mediaPlay(sender, args) {
    sender.findName("media").play();
}

Figure 1 shows what the output of the video player looks like.

Figure 1
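Building on Listing 2, the CurrentState property from Table 2 lets you collapse the Play and Pause buttons into a single toggle. This is a sketch of my own, not one of the book's listings; it assumes a MediaElement named "media":

```javascript
// Toggle between playing and paused based on the MediaElement's
// CurrentState. Assumes the XAML contains a MediaElement named "media".
function mediaToggle(sender, args) {
    var media = sender.findName("media");
    if (media.CurrentState == "Playing") {
        media.pause();
    } else {
        media.play();
    }
}
```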

Using Markers and Timelines in Video

Using tools like Expression Encoder, you can add markers to video files. Markers are metadata stored in the video file when it is encoded. Using markers, you can create timelines that allow seek operations, and you can use the metadata associated with a marker to display additional information during playback. Markers and timelines are especially useful for video overlays and closed captioning: based on a marker's timeline metadata, you can display custom text or images over the playing video.

When the MediaElement object reaches a marker during playback, the MarkerReached event is raised. You can handle this event in JavaScript, which allows you to perform operations based on the marker metadata. The following properties associated with the marker metadata can be accessed via JavaScript:

  • Time — A TimeSpan object that specifies the time when the marker is reached.
  • Type — A string that specifies the marker's type. This value can be any user-defined string.
  • Text — A string that specifies the marker's value. This value can be any user-defined string.
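Because the Time property is a TimeSpan, for display you will often want to turn elapsed seconds into a minutes-and-seconds string. The helper below is my own devising, not part of the Silverlight marker API:

```javascript
// Format a number of elapsed seconds as "m:ss" for display in a
// TextBlock. Purely illustrative; not part of the Silverlight API.
function formatMarkerTime(totalSeconds) {
    var minutes = Math.floor(totalSeconds / 60);
    var seconds = Math.floor(totalSeconds % 60);
    return minutes + ":" + (seconds < 10 ? "0" + seconds : seconds);
}
```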

The XAML in Listing 3 demonstrates the syntax for specifying the MarkerReached event in the MediaElement. Listing 4 demonstrates the JavaScript that updates the TextBlock elements to display the metadata associated with the markers in the video file.

Listing 3: Registering the MarkerReached Event on a MediaElement Object

<MediaElement x:Name="media"
     Source="watermarkers.wmv" 
     MarkerReached="onMarkerReached" 
     Width="480" Height="360"/>
<Canvas Canvas.Left="525" Canvas.Top="5">
    
    <TextBlock>Time:</TextBlock>
    <TextBlock x:Name="timeTextBlock"
      Canvas.Left="60"/>
    
    <TextBlock Canvas.Top="30">Type:</TextBlock>
    <TextBlock x:Name="typeTextBlock"
      Canvas.Left="60" Canvas.Top="30"/>
    
    <TextBlock Canvas.Top="60">Value:</TextBlock>
    <TextBlock x:Name="valueTextBlock"
      Canvas.Left="60" Canvas.Top="60"/>
</Canvas>

Listing 4 is the JavaScript that handles the MarkerReached event, retrieves the Time, Type, and Text values of the marker that triggered the event, and updates the TextBlock objects with the correct information. The results are shown in Figure 2.

Listing 4: The onMarkerReached Function

function onMarkerReached(sender, markerEventArgs)
{
    sender.findName("timeTextBlock").Text =
        markerEventArgs.marker.time.seconds.toString();

    sender.findName("typeTextBlock").Text =
        markerEventArgs.marker.type;

    sender.findName("valueTextBlock").Text =
        markerEventArgs.marker.text;
}

 

Figure 2

Figure 3 shows Expression Encoder, which I used to add the markers to the video. Notice the Markers window, which lists the timeline of the various markers that were added.

Figure 3

Painting Video onto Objects

XAML provides various brushes that you can apply to objects such as the TextBlock and Rectangle elements. (These brushes are covered in Chapter 2, "Building Silverlight Applications Using XAML," of the book Silverlight 1.0 (Wrox, 2007, ISBN: 978-0-470-22840-1).) The VideoBrush is a type of brush that lets you paint video onto other elements using a MediaElement as its source. You can paint video onto the Foreground of a TextBlock, the Fill of a Rectangle, or the Background of a Canvas object.

Listing 5 demonstrates applying a VideoBrush to a TextBlock object, which is reflected in Figure 4.

Listing 5: Adding a VideoBrush to a TextBlock

<MediaElement x:Name="media" 
              Source="water.wmv"
              Opacity="0"/>
<TextBlock Canvas.Left="5" Canvas.Top="30"
       FontFamily="Verdana" FontSize="90"
       FontWeight="Bold" TextWrapping="Wrap"
       Text="Video Brush">
    <!-- Add the VideoBrush object  -->
    <TextBlock.Foreground>
        <VideoBrush SourceName="media"
                Stretch="UniformToFill" />
    </TextBlock.Foreground>
</TextBlock>

 

Figure 4
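The same technique can paint video onto a Rectangle by assigning the VideoBrush to its Fill. The following fragment is a sketch along the lines of Listing 5, reusing the hidden MediaElement named media; the position and corner-radius values are my own:

```xml
<Rectangle Canvas.Left="5" Canvas.Top="150"
           Width="480" Height="120"
           RadiusX="10" RadiusY="10">
    <!-- Paint the same video onto the rectangle's fill -->
    <Rectangle.Fill>
        <VideoBrush SourceName="media"
                    Stretch="UniformToFill" />
    </Rectangle.Fill>
</Rectangle>
```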

Creating Video Reflections

In many applications, video is not presented straight-on in a 2D space; video and other objects are skewed and reflected to enhance the visual effect of the user interface. Let's look at some more advanced XAML that creates a reflection on a rotated and skewed video. Figure 5 shows a video that is reflected and skewed.

Figure 5

To accomplish what you see in Figure 5, examine the commented code in Listing 6.

Listing 6: Using Transforms, LinearGradientBrush, and Opacity to Reflect Video

    <!-- Create the main video image -->
    <Canvas Canvas.Left="86" Canvas.Top="68">
        <Canvas.RenderTransform>
            <TransformGroup>
                
                <!-- Skew and Scale the canvas -->
                <SkewTransform AngleY="-19" AngleX="0"
                               CenterX="0" CenterY="0"/>
                <ScaleTransform  ScaleY="1" ScaleX = "1"
                                 CenterX="0" CenterY="0"/>
            </TransformGroup>
        </Canvas.RenderTransform>
        
        <!-- Add the MediaElement to the Canvas -->
        <MediaElement Source="watermarkers.wmv"
                      Width="300" Height="300" />
    </Canvas>
    <!-- Create the Canvas for the reflected video -->
    <Canvas Canvas.Left="313" Canvas.Top="588">
        <Canvas.RenderTransform>
            
            <!-- Skew and Scale the canvas -->
            <TransformGroup>
                <SkewTransform  AngleY="19" AngleX="-41"
                                CenterX="0" CenterY="0" />
                
                <!-- Set the ScaleY to -1 to Flip the image -->
                <ScaleTransform ScaleY="-1" ScaleX="1"
                                CenterX="0" CenterY="0" />
            </TransformGroup>
        </Canvas.RenderTransform>
        <MediaElement Source="watermarkers.wmv"
                      Width="300" Height="300" Volume="0">
            
            <!-- Set an Opacity Mask on the Media Element -->
            <MediaElement.OpacityMask>
                <LinearGradientBrush  
                    StartPoint="0,.25" EndPoint="0,1">
                    <GradientStop Offset="0.25"
                                  Color="#00000000"  />
                    <GradientStop Offset="1"
                                  Color="#CC000000"  />
                </LinearGradientBrush>
            </MediaElement.OpacityMask>
        </MediaElement>
    </Canvas>

The two important code blocks that make this happen are the ScaleTransform on the Canvas object that contains the second video and the OpacityMask applied to the second MediaElement object.

By setting the ScaleY property on the ScaleTransform, you are "flipping" the image, which is how a reflection would be rendered:

<ScaleTransform ScaleY="-1" ScaleX="1"
                CenterX="0" CenterY="0" />

If you consider how an object reflects against water, it has a trailing transparency. This is accomplished by applying the OpacityMask on the MediaElement object:

<MediaElement.OpacityMask>
    <LinearGradientBrush  
        StartPoint="0,.25" EndPoint="0,1">
        <GradientStop Offset="0.25"
                      Color="#00000000"  />
        <GradientStop Offset="1"
                      Color="#CC000000"  />
    </LinearGradientBrush>
</MediaElement.OpacityMask>
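The gradient stops above use 8-digit ARGB hex values, where the first two digits are the alpha channel: #00000000 is fully transparent, and #CC000000 is 80 percent opaque black. The sketch below (a helper of my own, not a Silverlight API) shows how those alpha digits map onto the 0-1 opacity scale:

```javascript
// Extract the alpha component of an 8-digit #AARRGGBB hex color as a
// 0-1 opacity value. Illustrative helper, not part of the Silverlight API.
function alphaOf(argbHex) {
    return parseInt(argbHex.substring(1, 3), 16) / 255;
}

// Example: 0xCC is 204, and 204 / 255 = 0.8, i.e. 80 percent opaque.
var opacity = alphaOf("#CC000000");
```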

Just like that, you can have enhanced rendering capabilities like reflections on media. The same concept can be applied to any object, including VideoBrush, TextBlock, and Rectangle objects.

This article is excerpted from Chapter 2, "Building Silverlight Applications Using XAML," of the book Silverlight 1.0 (Wrox, 2007, ISBN: 978-0-470-22840-1) by Devin Rader, Jason Beres, J. Ambrose Little, and Grant Hinkson. This new full-color book shows all of the code samples in the same colors you'll see if you work in Visual Studio. Jason Beres is the Director of Product Management for Infragistics, the world's leading publisher of presentation-layer tools. Jason is also one of the founders of the Florida .NET User Groups, a founder of the New Jersey .NET User Group, a Visual Basic .NET MVP, and a member of the INETA Speakers Bureau and the INETA Board of Directors. Jason is the author of several books on .NET development, an international speaker, and a frequent columnist for several .NET publications. He also keeps very active in the .NET community.