I'm using the H.264 encoder from Media Foundation (an MFT).
I extracted the SPS and PPS from it, because I need them for Smooth Streaming.
MSDN says that the number of bytes used for the length field that appears before each NALU can be 1, 2, or 4, and that it is in network byte order. As you can see, the first 4 bytes in the buffer are 00 00 00 01. If we apply any of the possible length-field sizes, we get nothing sensible: with a 1-byte length field the length is zero, with 2 bytes it is zero again, and with 4 bytes the length of the first NALU would be 1, which can't be correct.
Does anybody know how I should interpret this SPS and PPS concatenated together?

SPS and PPS (aka dwSequenceHeader) in Media Foundation's H264 encoder
7.8k views. Asked by user1764961.
1 answer
The answer here is simple: the data is valid, and it is formatted according to Annex B, with each NAL unit prefixed by the start code 00 00 00 01 rather than by a length field. See "H.264 extradata (partially) explained - for dummies" for the difference between the two layouts.
More details are in the H.264 spec, which is freely available for download. Annex B, "Byte stream format", starts on page 326.