I have an H264 stream (IIS Smooth Streaming) that I would like to play with Silverlight. Apparently Silverlight can do it, but how?
Note:
A VC-1 stream can be played by Silverlight, but H264 cannot.
Also, I can provide a stream and any additional information required. The H264 encoder is the one in Media Foundation (MFT). The same goes for the VC-1 stream that works (although it is impossible to create equal-sized chunks for smooth streaming, because forcing key-frame insertion makes the video jerky).
EDIT: MPEG2VIDEOINFO values for H264:
How to play H264 stream with Silverlight?
Asked by user1764961
There are 2 answers
Sergey K. answered:
Silverlight 3 can play H264 files. Use MediaStreamSource for this.
Here is the interface description: http://msdn.microsoft.com/en-us/library/system.windows.media.mediastreamsource(v=vs.95).aspx
Also, this blog entry is related to H264 playback using Silverlight 3: http://nonsenseinbasic.blogspot.ru/2011/05/silverlights-mediastreamsource-some.html. It will help you with other issues that may arise.
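For orientation, here is a rough, untested sketch of what such a MediaStreamSource subclass might look like for a single H264 video stream. The resolution values, the CodecPrivateData placeholder, and the GetNextAccessUnit helper are illustrative assumptions, not part of the answer or of the Silverlight API.

```csharp
using System;
using System.Collections.Generic;
using System.IO;
using System.Windows.Media;

// Skeleton of a custom MediaStreamSource exposing one H264 video stream.
public class H264StreamSource : MediaStreamSource
{
    private MediaStreamDescription videoStreamDescription;
    private readonly Dictionary<MediaSampleAttributeKeys, string> emptySampleAttributes =
        new Dictionary<MediaSampleAttributeKeys, string>();

    protected override void OpenMediaAsync()
    {
        // Describe the video stream. For H264 the CodecPrivateData is commonly
        // the hex-encoded SPS and PPS, each preceded by a 00000001 start code.
        Dictionary<MediaStreamAttributeKeys, string> streamAttributes =
            new Dictionary<MediaStreamAttributeKeys, string>
            {
                { MediaStreamAttributeKeys.VideoFourCC, "H264" },
                { MediaStreamAttributeKeys.CodecPrivateData, "00000001" + "<SPS hex>" + "00000001" + "<PPS hex>" },
                { MediaStreamAttributeKeys.Width, "1280" },
                { MediaStreamAttributeKeys.Height, "720" }
            };
        videoStreamDescription = new MediaStreamDescription(MediaStreamType.Video, streamAttributes);

        Dictionary<MediaSourceAttributesKeys, string> sourceAttributes =
            new Dictionary<MediaSourceAttributesKeys, string>
            {
                { MediaSourceAttributesKeys.Duration, "0" },   // duration in 100-ns ticks; 0 is often used for live
                { MediaSourceAttributesKeys.CanSeek, "False" }
            };

        ReportOpenMediaCompleted(sourceAttributes,
            new List<MediaStreamDescription> { videoStreamDescription });
    }

    protected override void GetSampleAsync(MediaStreamType mediaStreamType)
    {
        // GetNextAccessUnit is hypothetical: it should return one encoded frame
        // (length-prefixed NAL units) and its timestamp in 100-ns units.
        long timestamp;
        byte[] frame = GetNextAccessUnit(out timestamp);

        MediaStreamSample sample = new MediaStreamSample(
            videoStreamDescription,
            new MemoryStream(frame),
            0,
            frame.Length,
            timestamp,
            emptySampleAttributes);

        ReportGetSampleCompleted(sample);
    }

    protected override void SeekAsync(long seekToTime)
    {
        ReportSeekCompleted(seekToTime);
    }

    protected override void CloseMedia() { }
    protected override void GetDiagnosticAsync(MediaStreamSourceDiagnosticKind diagnosticKind) { }
    protected override void SwitchMediaStreamAsync(MediaStreamDescription mediaStreamDescription) { }

    // Placeholder for whatever pulls frames out of your smooth-streaming chunks.
    private byte[] GetNextAccessUnit(out long timestamp)
    {
        timestamp = 0;
        return new byte[0];
    }
}
```

The source is then handed to the player with MediaElement.SetSource(new H264StreamSource()).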
Just a guess, based on your question 18009152: I am guessing you are encoding H.264 using the Annex B bitstream format. According to the comments, you cannot tell the encoder to use the AVCC format, so you must perform this conversion manually (Annex B will NOT work in an ISO container). You can do this by looking for start codes in your AVC stream. A start code is 3 or 4 bytes (0x000001 or 0x00000001). You get the length of a NALU by locating the next start code, or the end of the stream. Strip the start code (throw it away) and in its place write the size of the NALU as a 32-bit big-endian integer. Then write this data to the container.

Just to be clear, this is performed on the video frames that come out of the encoder. The extra data (SPS/PPS) is a separate step that it appears you have mostly figured out (except for the NALUSizeLength). Because a 4-byte integer is used to write the NALU sizes, you MUST set NALUSizeLength to 4.
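To make the steps concrete, here is a rough C# sketch of the conversion (untested, and assuming the whole Annex B access unit is already in a byte array; the AnnexB class and ToAvcc method names are made up for this illustration):

```csharp
using System;
using System.Collections.Generic;
using System.IO;

static class AnnexB
{
    // Returns the same NAL units with each 3- or 4-byte start code replaced
    // by a 32-bit big-endian length prefix (i.e. NALUSizeLength = 4).
    public static byte[] ToAvcc(byte[] annexb)
    {
        List<int> starts = new List<int>();   // index of the first byte after each start code
        List<int> scLens = new List<int>();   // length of that start code (3 or 4)

        for (int i = 0; i + 3 < annexb.Length; i++)
        {
            if (annexb[i] == 0 && annexb[i + 1] == 0)
            {
                if (annexb[i + 2] == 1) { starts.Add(i + 3); scLens.Add(3); i += 2; }
                else if (annexb[i + 2] == 0 && annexb[i + 3] == 1) { starts.Add(i + 4); scLens.Add(4); i += 3; }
            }
        }

        MemoryStream output = new MemoryStream();
        for (int n = 0; n < starts.Count; n++)
        {
            int begin = starts[n];
            int end = (n + 1 < starts.Count) ? starts[n + 1] - scLens[n + 1] : annexb.Length;
            int naluLength = end - begin;

            // Write the NALU size as a 32-bit big-endian integer in place of the start code.
            output.WriteByte((byte)(naluLength >> 24));
            output.WriteByte((byte)(naluLength >> 16));
            output.WriteByte((byte)(naluLength >> 8));
            output.WriteByte((byte)naluLength);
            output.Write(annexb, begin, naluLength);
        }
        return output.ToArray();
    }
}
```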