Hi all,
I am new to this forum and am now interested in multimedia development. I want to know how to play MPEG-2 streams using Visual C++. What are the basic things I need to develop such an application?
Regards,
Gaja
Hi,
I am trying to create a custom media source for a live video source that will be enumerated as a capture device when calling MFEnumDeviceSources. How do I register the media source to achieve that?
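For reference, the enumeration I am targeting is the standard video-capture query (a minimal sketch; error handling omitted):

    // Enumerate video capture sources; a correctly registered source
    // should appear in ppDevices.
    IMFAttributes* pAttributes = NULL;
    MFCreateAttributes( &pAttributes, 1 );
    pAttributes->SetGUID( MF_DEVSOURCE_ATTRIBUTE_SOURCE_TYPE,
                          MF_DEVSOURCE_ATTRIBUTE_SOURCE_TYPE_VIDCAP_GUID );

    IMFActivate** ppDevices = NULL;
    UINT32 count = 0;
    MFEnumDeviceSources( pAttributes, &ppDevices, &count );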
Thanks in advance.
I am facing a problem using IE9 under Windows 7. When opening an HTML5 page that has a video tag of type (H.264 / MP4), the video tag fails to play the source video.
The problem occurs when I register my own custom Media Foundation MP4 media source and, as documented, write my byte stream handler CLSID under the registry key:
HKEY_CURRENT_USER\Software\Microsoft\Windows Media Foundation\ByteStreamHandlers\.mp4
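(For concreteness, the registration itself is just a value under that key whose name is the handler's CLSID string and whose data is a friendly name; a minimal sketch, with a placeholder CLSID:)

    // Register a byte stream handler for .mp4 under HKCU.
    HKEY hKey = NULL;
    RegCreateKeyExW( HKEY_CURRENT_USER,
        L"Software\\Microsoft\\Windows Media Foundation\\ByteStreamHandlers\\.mp4",
        0, NULL, 0, KEY_SET_VALUE, NULL, &hKey, NULL );
    const wchar_t szDesc[] = L"My MP4 Byte Stream Handler";
    RegSetValueExW( hKey, L"{00000000-0000-0000-0000-000000000000}", // placeholder CLSID
        0, REG_SZ, (const BYTE*)szDesc, sizeof(szDesc) );
    RegCloseKey( hKey );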
IE9 tries to use my media source to play the video tag, which fails, even though my media source successfully plays the same video in any Media Foundation media player application.
Are we even allowed to write a custom media source to be used in IE9?
Is there any documentation that explains this?
I think priority should be given to the Microsoft media source; that seems to always be the case for Windows Media Player.
Is this a bug in Windows 7 (in Windows 8 the problem does not appear)?
Can I do something to set the priority for IE9 back to Microsoft’s media source?
I have noticed that I can work around the problem by first writing Microsoft's byte stream handler CLSID to the above-mentioned registry key, and then writing my CLSID. This seems to make the Microsoft MP4 byte stream handler be preferred over my own. Is this a safe and acceptable solution?
I'm working on a simple video editing application. It is required to do simple trimming and combining across multiple video files (i.e., take 10 seconds from video A and 10 seconds from video B and write them to a single file). I may assume that all video files are MP4 containers with one H.264-encoded video stream. I currently have a working build that fulfills these basic requirements. However, I must be able to combine clips from videos with different resolutions. Setting the SinkWriter's input type to have a different resolution than its output type results in an error. Attempting to set the SourceReader's output type to a different resolution than the video it is reading also results in an error. The approach I've decided to take is to use the Video Resizer DSP to manually resize the frames after I get them from the SourceReader, then pass them on to the SinkWriter.
I'm creating a Video Resizer and using it as an IMFTransform. I use its IPropertyStore to set the 4 MFPKEY_RESIZE_DST properties, then set its input and output types using the output type of the SourceReader. Finally, I call ProcessMessage with MFT_MESSAGE_NOTIFY_START_OF_STREAM. However, when I call ProcessInput with a video sample, it returns E_INVALIDARG. Error checking is omitted in the code below; all function calls return S_OK.
Creating the Resizer's input and output media types from the reader
    // Use the reader's output type as the Resizer's input type
    IMFMediaType* theInputType;
    aReader->GetCurrentMediaType( mOutputVideoStreamIndex, &theInputType );

    // Copy the reader's output type
    IMFMediaType* theOutputType;
    MFCreateMediaType( &theOutputType );
    theInputType->CopyAllItems( theOutputType );

    // Set the frame size to the desired size
    UINT32 theFrameWidth, theFrameHeight;
    MFGetAttributeSize( mUncompressedVideoType, MF_MT_FRAME_SIZE, &theFrameWidth, &theFrameHeight );
    MFSetAttributeSize( theOutputType, MF_MT_FRAME_SIZE, theFrameWidth, theFrameHeight );
ConfigureVideoResizer( theInputType, theOutputType );
Resizer Configuration
    CoCreateInstance( CLSID_CResizerDMO, NULL, CLSCTX_ALL, __uuidof( IMFTransform ), (void**)( &mVideoResizer ) );
    MFGetAttributeSize( aOutputType, MF_MT_FRAME_SIZE, &mVideoWidth, &mVideoHeight );

    IPropertyStore* thePropertyStore;
    mVideoResizer->QueryInterface( &thePropertyStore );

    // Set the destination rectangle's top-left corner to (0,0)
    PROPVARIANT theTopLeft;
    InitPropVariantFromInt32( 0, &theTopLeft );
    thePropertyStore->SetValue( MFPKEY_RESIZE_DST_LEFT, theTopLeft );
    thePropertyStore->SetValue( MFPKEY_RESIZE_DST_TOP, theTopLeft );

    // Set width and height to scale to
    PROPVARIANT theWidthVariant, theHeightVariant;
    InitPropVariantFromInt32( mVideoWidth, &theWidthVariant );
    InitPropVariantFromInt32( mVideoHeight, &theHeightVariant );
    thePropertyStore->SetValue( MFPKEY_RESIZE_DST_WIDTH, theWidthVariant );
    thePropertyStore->SetValue( MFPKEY_RESIZE_DST_HEIGHT, theHeightVariant );

    mVideoResizer->SetInputType( 0, aInputType, NULL );
    mVideoResizer->SetOutputType( 0, aOutputType, NULL );
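For completeness, after the start-of-stream message I drive the resizer with the standard synchronous-MFT loop; a minimal sketch (error handling omitted; the output-sample allocation based on GetOutputStreamInfo is my assumption, since the resizer does not allocate samples itself):

    // Feed one uncompressed frame from the SourceReader.
    mVideoResizer->ProcessInput( 0, theInputSample, 0 );

    // Allocate an output sample sized per the resizer's requirements.
    MFT_OUTPUT_STREAM_INFO theStreamInfo = {};
    mVideoResizer->GetOutputStreamInfo( 0, &theStreamInfo );

    IMFSample* theOutputSample = NULL;
    IMFMediaBuffer* theBuffer = NULL;
    MFCreateSample( &theOutputSample );
    MFCreateMemoryBuffer( theStreamInfo.cbSize, &theBuffer );
    theOutputSample->AddBuffer( theBuffer );

    MFT_OUTPUT_DATA_BUFFER theOutput = {};
    theOutput.pSample = theOutputSample;
    DWORD theStatus = 0;
    mVideoResizer->ProcessOutput( 0, 1, &theOutput, &theStatus );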
1. The obvious first question: is using the Video Resizer a viable approach to solve the problem I am facing?
2. What are the possible causes of receiving E_INVALIDARG from the VideoResizer?
Hi all.
I read the Protected Media Path documentation.
In that document it says: "An ITA is created by a trusted media source."
But I don't know how to get an instance of an ITA.
Please help me. Thanks, readers.
- JooJin
When I use Media Foundation to create a WMV file, I am not able to see any properties such as frame width, frame height, or data rate when right-clicking the created file and selecting the Details tab.
From other WMV files I see:
Length
Frame Width
Frame Height
Data Rate
Total Bitrate
Frame Rate
Audio Bitrate
Audio Channels
Audio Sample Rate
For other files I also see the codecs used.
The technique I use to create WMV files uses a Sink Writer (CLSID_MFSinkWriter), on which I call AddStream and SetInputMediaType, and then WriteSample for every video frame I have. I can provide more details about how I'm creating the WMV files if needed.
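For reference, the flow is essentially the following (a minimal sketch; error handling omitted, and pOutputType/pInputType stand for the media types I configure):

    IMFSinkWriter* pWriter = NULL;
    DWORD streamIndex = 0;

    MFCreateSinkWriterFromURL( L"output.wmv", NULL, NULL, &pWriter );
    pWriter->AddStream( pOutputType, &streamIndex );              // compressed WMV output type
    pWriter->SetInputMediaType( streamIndex, pInputType, NULL );  // uncompressed input type
    pWriter->BeginWriting();

    // ... for every video frame:
    pWriter->WriteSample( streamIndex, pSample );

    pWriter->Finalize();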
Hello, all.
I just have a question about the Protected Media Path.
As you know, if the media source is created inside the application process, the source creates a proxy for itself in the protected process.
But I don't know how to create the proxy of the media source.
Please help me.
Hello,
I want to develop a Microsoft Media Foundation (MMF) program that can capture 3D (stereo, not 3D graphics) video from two cameras, do some transforms, and output the result. But I have not seen any related MMF samples.
Can anybody help me?
I am capturing MJPEG frames from a webcam using the Media Session.
The topology is: IMFMediaSource -> h.264 Encoder -> SampleGrabberSinkActivate.
When the encoder is removed, the grabber works as expected and the IMFSampleGrabberSinkCallback methods are called.
When the H.264 encoder transform is inserted, the topology can be resolved according to MFTrace, but the IMFSampleGrabberSinkCallback methods are not called.
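For context, the encoder is inserted as a transform node between the source node and the grabber sink node, roughly like this (a sketch; pEncoderTransform stands for the H.264 encoder instance):

    // source -> encoder -> sample grabber sink
    IMFTopologyNode* pEncoderNode = NULL;
    MFCreateTopologyNode( MF_TOPOLOGY_TRANSFORM_NODE, &pEncoderNode );
    pEncoderNode->SetObject( pEncoderTransform );
    pTopology->AddNode( pEncoderNode );
    pSourceNode->ConnectOutput( 0, pEncoderNode, 0 );
    pEncoderNode->ConnectOutput( 0, pGrabberNode, 0 );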
Here is a part of the MFTrace Output:
2780,1EDC 14:21:32.06482 CTopologyHelpers::Trace @0BF20AC8 <<<<<<<<<<<<< ready topology
2780,1EDC 14:21:32.06483 CMFTransformDetours::Attach @0A883DAC Rate control @0A883DD8
2780,1EDC 14:21:32.06911 COle32ExportDetours::CoCreateInstance @ Created {ACC56A05-E277-4B1E-A43E-7A73E3CD6E6C} DeviceBroker (C:\Windows\SysWOW64\deviceaccess.dll) @0A873CB8 - traced interfaces:
2780,1EDC 14:21:32.07023 CMFTransformDetours::SetOutputType @097943D8 Succeeded MT: MF_MT_FRAME_SIZE=5497558139600 (1280,720);MF_MT_AVG_BITRATE=663552000;MF_MT_MAJOR_TYPE=MEDIATYPE_Video;MF_MT_AM_FORMAT_TYPE=FORMAT_VideoInfo;MF_MT_FIXED_SIZE_SAMPLES=1;MF_MT_FRAME_RATE=128849018881 (30,1);MF_MT_PIXEL_ASPECT_RATIO=4294967297 (1,1);MF_MT_ALL_SAMPLES_INDEPENDENT=1;MF_MT_FRAME_RATE_RANGE_MIN=128849018881 (30,1);MF_MT_SAMPLE_SIZE=2764800;MF_MT_INTERLACE_MODE=2;MF_MT_FRAME_RATE_RANGE_MAX=128849018881 (30,1);MF_MT_SUBTYPE=MFVideoFormat_MJPG
2780,1EDC 14:21:32.28836 CMFTransformDetours::SetOutputType @097943D8 Succeeded MT: <NULL>
2780,1EDC 14:21:32.28837 CMFTransformDetours::ProcessMessage @097943D8 Message type=0x10000000 MFT_MESSAGE_NOTIFY_BEGIN_STREAMING, param=00000000
2780,1EDC 14:21:32.28839 CMFTransformDetours::ProcessMessage @097943D8 Message type=0x10000003 MFT_MESSAGE_NOTIFY_START_OF_STREAM, param=00000000
2780,1660 14:21:32.28845 CMFMediaSourceDetours::EndGetEvent @09794A48 Met=205 MENewStream, value @097ADDC0,
2780,1434 14:21:32.28847 CMFTransformDetours::HandleEvent @097943D8 Met=000 MEUnknown, value (empty), MF_EVENT_MFT_INPUT_STREAM_ID=0
2780,1660 14:21:32.28853 CMFMediaSourceDetours::HandleEvent @09794A48 New stream @097ADDC0 (ID 0), MT: MF_MT_FRAME_SIZE=5497558139600 (1280,720);MF_MT_AVG_BITRATE=663552000;MF_MT_MAJOR_TYPE=MEDIATYPE_Video;MF_MT_AM_FORMAT_TYPE=FORMAT_VideoInfo;MF_MT_FIXED_SIZE_SAMPLES=1;MF_MT_FRAME_RATE=128849018881 (30,1);MF_MT_PIXEL_ASPECT_RATIO=4294967297 (1,1);MF_MT_ALL_SAMPLES_INDEPENDENT=1;MF_MT_FRAME_RATE_RANGE_MIN=128849018881 (30,1);MF_MT_SAMPLE_SIZE=2764800;MF_MT_INTERLACE_MODE=2;MF_MT_FRAME_RATE_RANGE_MAX=128849018881 (30,1);MF_MT_SUBTYPE=MFVideoFormat_MJPG
2780,1660 14:21:32.28858 CMFQualityManagerDetours::NotifyQualityEvent @0BF20068 Object=0x09794A48 Event=0x0A873F80 Type=205
2780,1660 14:21:32.28861 CMFMediaStreamDetours::EndGetEvent @097ADDC0 Met=202 MEStreamStarted, value 5190122227246,
2780,1660 14:21:32.28863 CMFMediaSourceDetours::EndGetEvent @09794A48 Met=201 MESourceStarted, value 5190122227246,
2780,1660 14:21:32.28875 CMFTopologyDetours::GetUINT64 @0BF20AC8 attribute not found guidKey = MF_TOPOLOGY_PROJECTSTART
2780,1660 14:21:32.28876 CMFTopologyDetours::GetUINT64 @0BF20AC8 attribute not found guidKey = MF_TOPOLOGY_PROJECTSTOP
2780,1660 14:21:32.28876 CMFTopologyNodeDetours::GetUINT64 @0BF20F30 attribute not found guidKey = MF_TOPONODE_MEDIASTOP
2780,1660 14:21:32.28877 CMFTopologyNodeDetours::GetUINT32 @0BF20F30 attribute not found guidKey = MF_TOPONODE_MARKIN_HERE
2780,1660 14:21:32.28877 CMFTopologyNodeDetours::GetUINT32 @0BF20F30 attribute not found guidKey = MF_TOPONODE_MARKOUT_HERE
2780,1660 14:21:32.28879 CMFTransformDetours::ProcessMessage @0A883DAC Message type=0x10000003 MFT_MESSAGE_NOTIFY_START_OF_STREAM, param=00000000
2780,1660 14:21:32.28880 CMFTransformDetours::ProcessMessage @0BF27EE8 Message type=0x10000003 MFT_MESSAGE_NOTIFY_START_OF_STREAM, param=00000000
2780,1660 14:21:32.28880 CMFTransformDetours::ProcessMessage @006AAF74 Message type=0x10000003 MFT_MESSAGE_NOTIFY_START_OF_STREAM, param=00000000
2780,1660 14:21:32.28882 CMFMediaStreamDetours::EndGetEvent @097ADDC0 Met=214 MEStreamTick, value 5190122227246,
2780,1730 14:21:32.28887 CMFTopologyNodeDetours::GetUINT32 @0DA36F58 attribute not found guidKey = {6D7E1A30-106C-43B9-ACCE-ADBA943F42EC}
2780,1730 14:21:32.28895 CMFTopologyNodeDetours::GetUINT32 @0DA36F58 attribute not found guidKey = MF_TOPONODE_RATELESS
2780,1730 14:21:32.28982 CMFPresentationClockDetours::GetTime @0BF1FDF0 Time 0hns
2780,1730 14:21:32.28987 CMFMediaSessionDetours::EndGetEvent @0BF1F120 Met=111 MESessionTopologyStatus, value @0BF20AC8, MF_EVENT_TOPOLOGY_STATUS=200
2780,1730 14:21:32.28989 CMFTransformDetours::Attach @0A883DAC Rate control @0A883DD8
2780,1730 14:21:32.29039 CMFClockStateSinkDetours::OnClockStart @0BF2049C System time 519012224ms, clock start offset 519012222ms
2780,1660 14:21:32.29042 CMFStreamSinkDetours::EndGetEvent @0BF20610 Met=305 MEStreamSinkRequestSample, value (empty),
2780,1660 14:21:32.29044 CMFTopologyNodeDetours::GetUINT32 @0DA34AC0 attribute not found guidKey = MF_TOPONODE_D3DAWARE
2780,1660 14:21:32.29051 CMFTransformDetours::ProcessOutput @006AAF74 failed hr=0xC00D6D72 MF_E_TRANSFORM_NEED_MORE_INPUT
2780,1660 14:21:32.29051 CMFTopologyNodeDetours::GetUINT32 @0DA34D18 attribute not found guidKey = MF_TOPONODE_D3DAWARE
2780,1660 14:21:32.29058 CMediaObjectDetours::ProcessOutput @0BF27F00 MediaBuffer @0BF27C90, flags 0x00000000, Time 0ms, Duration 0ms, Size 0B
2780,1660 14:21:32.29058 CMFTransformDetours::ProcessOutput @0BF27EE8 failed hr=0xC00D6D72 MF_E_TRANSFORM_NEED_MORE_INPUT
2780,1660 14:21:32.29059 CMFTopologyNodeDetours::GetUINT32 @0DA34CA0 attribute not found guidKey = MF_TOPONODE_D3DAWARE
2780,1660 14:21:32.29062 CMFTransformDetours::ProcessOutput @0A883DAC failed hr=0xC00D6D72 MF_E_TRANSFORM_NEED_MORE_INPUT
2780,1660 14:21:32.29063 CMFMediaStreamDetours::RequestSample @097ADDC0 Token @00000000
2780,1660 14:21:32.29064 CMFStreamSinkDetours::EndGetEvent @0BF20610 Met=305 MEStreamSinkRequestSample, value (empty),
2780,1660 14:21:32.29064 CMFStreamSinkDetours::EndGetEvent @0BF20610 Met=305 MEStreamSinkRequestSample, value (empty),
2780,1660 14:21:32.29065 CMFStreamSinkDetours::EndGetEvent @0BF20610 Met=305 MEStreamSinkRequestSample, value (empty),
2780,1660 14:21:32.29066 CMFStreamSinkDetours::EndGetEvent @0BF20610 Met=301 MEStreamSinkStarted, value (empty),
2780,1660 14:21:32.29069 CMFStreamSinkDetours::EndGetEvent @0BF20610 Met=306 MEStreamSinkMarker, value 2 0 0 0 48 ef f4 b 1 0 0 0 ,
2780,1730 14:21:32.29073 CMFMediaSessionDetours::EndGetEvent @0BF1F120 Met=110 MESessionCapabilitiesChanged, value (empty), MF_EVENT_SESSIONCAPS=5;MF_EVENT_SESSIONCAPS_DELTA=4
2780,1730 14:21:32.29091 CMFMediaSessionDetours::EndGetEvent @0BF1F120 Met=103 MESessionStarted, value (empty), MF_EVENT_PRESENTATION_TIME_OFFSET=0 (0,0)
2780,1730 14:21:32.29092 CMFTransformDetours::Attach @0A883DAC Rate control @0A883DD8
2780,1730 14:21:32.66207 CMFTransformDetours::HandleEvent @097943D8 Met=602 METransformHaveOutput, value (empty),
2780,1730 14:21:32.66207 CMFTransformDetours::HandleEvent @097943D8 1 ProcessOutputs available
2780,1730 14:21:32.66208 CMFTransformDetours::ProcessOutput @097943D8 0 ProcessOutputs available
2780,1730 14:21:32.66213 CMFTransformDetours::ProcessOutput @097943D8 Stream ID 0, Sample @0BF28200, Time 519012596ms, Duration 33ms, Buffers 1, Size 99521B, MFSampleExtension_BottomFieldFirst=1;MFSampleExtension_CleanPoint=1;MFSampleExtension_Discontinuity=1;MFSampleExtension_Interlaced=1
2780,1730 14:21:32.66213 CMFTransformDetours::ProcessOutput @097943D8 Stream ID 1, (null)
2780,1660 14:21:32.66218 CMFMediaStreamDetours::EndGetEvent @097ADDC0 Met=213 MEMediaSample, value @0BF28200,
2780,1660 14:21:32.66221 CMFMediaStreamDetours::HandleEvent @097ADDC0 Sample @0BF28200, Time 519012596ms, Duration 33ms, Buffers 1, Size 99521B, MFSampleExtension_Token=@00000000;MFSampleExtension_DeviceTimestamp=5190125963555 (1208,1805469987);MFSampleExtension_BottomFieldFirst=1;MFSampleExtension_CleanPoint=1;MFSampleExtension_Discontinuity=1;MFSampleExtension_Interlaced=1
2780,1660 14:21:32.66226 CMFTransformDetours::ProcessInput @0A883DAC Stream ID 0, Sample @0BF28200, Time 519012596ms, Duration 33ms, Buffers 1, Size 99521B, MFSampleExtension_Token=@00000000;MFSampleExtension_DeviceTimestamp=5190125963555 (1208,1805469987);MFSampleExtension_BottomFieldFirst=1;MFSampleExtension_CleanPoint=1;MFSampleExtension_Discontinuity=1;MFSampleExtension_Interlaced=1
2780,1660 14:21:32.66896 CMFTransformDetours::ProcessOutput @0A883DAC Stream ID 0, Sample @0BCF5928, Time 519012596ms, Duration 33ms, Buffers 1, Size 1843200B, MFSampleExtension_CleanPoint=1;MFSampleExtension_Interlaced=0
2780,1660 14:21:32.66903 CMFTransformDetours::ProcessInput @0BF27EE8 Stream ID 0, Sample @0BCF5928, Time 519012596ms, Duration 33ms, Buffers 1, Size 1843200B, MFSampleExtension_Token=@00000000;MFSampleExtension_DeviceTimestamp=5190125963555 (1208,1805469987);MFSampleExtension_BottomFieldFirst=1;MFSampleExtension_CleanPoint=1;MFSampleExtension_Discontinuity=1;MFSampleExtension_Interlaced=0
2780,1660 14:21:32.66904 CMediaObjectDetours::ProcessInput @0BF27F00 MediaBuffer @0BF27D70, flags 0x0000000F, Time 519012596ms, Duration 33ms, Size 1843200B
2780,1660 14:21:32.66908 CMediaObjectDetours::ProcessOutput @0BF27F00 MediaBuffer @0BF27A98, flags 0x00000000, Time 0ms, Duration 0ms, Size 0B
2780,1660 14:21:32.66908 CMFTransformDetours::ProcessOutput @0BF27EE8 failed hr=0xC00D6D72 MF_E_TRANSFORM_NEED_MORE_INPUT
2780,1660 14:21:32.67101 CMediaObjectDetours::ProcessOutput @0BF27F00 MediaBuffer @0BF27A98, flags 0x01000000, Time 50ms, Duration 50ms, Size 2764800B
2780,1660 14:21:32.67104 CMFTransformDetours::ProcessOutput @0BF27EE8 Stream ID 0, Sample @0BCF5928, Time 0ms, Duration 0ms, Buffers 1, Size 2764800B, MFSampleExtension_CleanPoint=0
2780,1660 14:21:32.67110 CMFTransformDetours::ProcessInput @006AAF74 Stream ID 0, Sample @0BCF5928, Time 0ms, Duration 0ms, Buffers 1, Size 2764800B, MFSampleExtension_Token=@00000000;MFSampleExtension_DeviceTimestamp=5190125963555 (1208,1805469987);MFSampleExtension_BottomFieldFirst=1;MFSampleExtension_CleanPoint=0;MFSampleExtension_Discontinuity=1;MFSampleExtension_Interlaced=0
2780,1660 14:21:32.67111 CMFTransformDetours::ProcessInput @006AAF74 failed hr=0xC00D36C8 MF_E_NO_SAMPLE_TIMESTAMP
2780,2058 14:21:32.67170 CMFMediaSessionDetours::EndGetEvent @0BF1F120 Met=001 MEError, value (empty), failed HrStatus=C00D36C8 MF_E_NO_SAMPLE_TIMESTAMP,
2780,1660 14:21:32.71000 CMFTransformDetours::HandleEvent @097943D8 Met=602 METransformHaveOutput, value (empty),
2780,1660 14:21:32.71001 CMFTransformDetours::HandleEvent @097943D8 1 ProcessOutputs available
2780,1660 14:21:32.71001 CMFTransformDetours::ProcessOutput @097943D8 0 ProcessOutputs available
2780,1660 14:21:32.71006 CMFTransformDetours::ProcessOutput @097943D8 Stream ID 0, Sample @0BF283C8, Time 519012644ms, Duration 33ms, Buffers 1, Size 99320B, MFSampleExtension_BottomFieldFirst=1;MFSampleExtension_CleanPoint=1;MFSampleExtension_Discontinuity=0;MFSampleExtension_Interlaced=1
2780,1660 14:21:32.71006 CMFTransformDetours::ProcessOutput @097943D8 Stream ID 1, (null)
2780,1660 14:21:32.74202 CMFTransformDetours::HandleEvent @097943D8 Met=602 METransformHaveOutput, value (empty),
2780,1660 14:21:32.74203 CMFTransformDetours::HandleEvent @097943D8 1 ProcessOutputs available
2780,1660 14:21:32.74203 CMFTransformDetours::ProcessOutput @097943D8 0 ProcessOutputs available
2780,1660 14:21:32.74207 CMFTransformDetours::ProcessOutput @097943D8 Stream ID 0, Sample @0BF28218, Time 519012676ms, Duration 33ms, Buffers 1, Size 99615B, MFSampleExtension_BottomFieldFirst=1;MFSampleExtension_CleanPoint=1;MFSampleExtension_Discontinuity=0;MFSampleExtension_Interlaced=1
2780,1660 14:21:32.74207 CMFTransformDetours::ProcessOutput @097943D8 Stream ID 1, (null)
2780,1660 14:21:32.78998 CMFTransformDetours::HandleEvent @097943D8 Met=602 METransformHaveOutput, value (empty),
2780,1660 14:21:32.78998 CMFTransformDetours::HandleEvent @097943D8 1 ProcessOutputs available
2780,1660 14:21:32.78999 CMFTransformDetours::ProcessOutput @097943D8 0 ProcessOutputs available
2780,1660 14:21:32.79003 CMFTransformDetours::ProcessOutput @097943D8 Stream ID 0, Sample @0BF28380, Time 519012724ms, Duration 33ms, Buffers 1, Size 99607B, MFSampleExtension_BottomFieldFirst=1;MFSampleExtension_CleanPoint=1;MFSampleExtension_Discontinuity=0;MFSampleExtension_Interlaced=1
2780,1660 14:21:32.79003 CMFTransformDetours::ProcessOutput @097943D8 Stream ID 1, (null)
2780,1660 14:21:32.83794 CMFTransformDetours::HandleEvent @097943D8 Met=602 METransformHaveOutput, value (empty),
2780,1660 14:21:32.83795 CMFTransformDetours::HandleEvent @097943D8 1 ProcessOutputs available
2780,1660 14:21:32.83795 CMFTransformDetours::ProcessOutput @097943D8 0 ProcessOutputs available
2780,1660 14:21:32.83799 CMFTransformDetours::ProcessOutput @097943D8 Stream ID 0, Sample @0BF28230, Time 519012772ms, Duration 33ms, Buffers 1, Size 99051B, MFSampleExtension_BottomFieldFirst=1;MFSampleExtension_CleanPoint=1;MFSampleExtension_Discontinuity=0;MFSampleExtension_Interlaced=1
2780,1660 14:21:32.83799 CMFTransformDetours::ProcessOutput @097943D8 Stream ID 1, (null)
2780,1660 14:21:32.87000 CMFTransformDetours::HandleEvent @097943D8 Met=602 METransformHaveOutput, value (empty),
2780,1660 14:21:32.87000 CMFTransformDetours::HandleEvent @097943D8 1 ProcessOutputs available
Thanks a lot for any suggestions.
My code used to output a large .mp4 file (e.g. 4 MB) that I could successfully play in WMP12. Since the last time I tested this feature, however, the file being produced is only 677 bytes, and WMP12 complains "... might not support the file type ... or codec ...".
I'm pasting a reduced collection of statements from my code that are being executed. I've traced with breakpoints and every step is happening. The return result is S_OK every time, including the WriteSample() call. Yet it seems samples are NOT being written to the file.
I've included below the output from my own TraceAttributes() function as well. I believe I was indeed using a combination of pType2 and pType, as the code shows. I did try changing the AddStream to use pType rather than pType2, but doing so caused the SetInputMediaType to return 0xC00D5212 (... no codec ...).
This is at the limit of my understanding of Media Foundation. What might be the reason it doesn't work? And remember, it used to work, although I can't say for sure what code elements might have changed a little since then.
    IMFSinkWriter *m_pWriterProcessedImage;
    // ...
    hr = MFCreateSinkWriterFromURL( FILENAME, NULL, NULL, &m_pWriterProcessedImage );
    // ...
    hr = m_pWriterProcessedImage->AddStream(pType2, psink_stream);
    hr = m_pWriterProcessedImage->SetInputMediaType(*psink_stream, pType, NULL);
    hr = m_pWriterProcessedImage->BeginWriting();

    TraceAttributes("for AddStream()", pType2);

**BEGIN OUTPUT FROM TraceAttributes**
STREAM ATTRIBUTES: for AddStream()
Type = {73646976-0000-0010-8000-00AA00389B71}
Subtype = {34363248-0000-0010-8000-00AA00389B71}
bitrate = 2400000
size = 640 x 480
frame rate = 30.000000 (30000/1000)
pixel aspect = (1/1)
interlace = 2 (progressive)
{AD76A80B-2D5C-4E0B-B375-64E520137036}: <UNRECOGNIZED>
{3C036DE7-3AD0-4C9E-9216-EE6D6AC21CB3}: <UNRECOGNIZED>
{261E9D83-9529-4B8F-A111-8B9C950A81A9}: <UNRECOGNIZED>
{9AA7E155-B64A-4C1D-A500-455D600B6560}: <UNRECOGNIZED>
**END OUTPUT FROM TraceAttributes**

    TraceAttributes("for SetInputMediaType()", pType);

**BEGIN OUTPUT FROM TraceAttributes**
STREAM ATTRIBUTES: for SetInputMediaType()
Type = {73646976-0000-0010-8000-00AA00389B71}
Subtype = {3231564E-0000-0010-8000-00AA00389B71}
bitrate = 15373949
size = 640 x 480
frame rate = 30.000000 (30000/1000)
pixel aspect = (1/1)
interlace = 2 (progressive)
stride = 640
independent samples = 1
fixed size samples = 1
sample size = 460800
user data = 460800
**END OUTPUT FROM TraceAttributes**

    // ...
    hr = m_pWriterProcessedImage->WriteSample(0, pSample); // THIS GETS CALLED MULTIPLE TIMES
    // ...
    hr = m_pWriterProcessedImage->Finalize();
Hi,
I am working on audio encoding and struggling with the initialization of the sink writer's input and output.
My app does real-time audio recording of the system mix via loopback/event-driven buffering, and the formats I am getting with the buffer are not always a plain mono/stereo wave format. When it's a multichannel wave format I definitely want to record all the present channels, and it seems that only the Windows Media Audio Professional/Lossless and the MPEG-2 audio codecs are able to do this in Windows Media Foundation.
This is the source format i have at the moment :
Wave Format = WAVE_FORMAT_EXTENSIBLE / 65534
Audio Format = WAVE_FORMAT_IEEE_FLOAT
Audio Channels = 6
Audio Channel Mask = 1551
Audio Samples = 48000
Audio Bits Per Sample = 32
Audio Valid Bits Per Sample = 32
Audio Block Alignment = 24
After reading up on the available codecs for Media Foundation I chose WMA Lossless because it should give me exactly what I want. In my app I am capturing audio and video in real time, and because video encoding with the sink writer is already finished and runs perfectly, I just tried to add an audio stream. First I was creating an MPEG-4 file sink and set input and output for video, which worked fine; then I added an audio stream and the app crashed (I tried hundreds of codec settings and of course followed the documentation precisely). But because I never succeeded with the audio stream (not even when I built the sink myself with plain stereo WMV/WMA streams) I became curious. So I built a simple test app with just one button for recording and took the "capturing a stream" example from here: http://msdn.microsoft.com/en-us/library/windows/desktop/dd370800(v=vs.85).aspx
Here is the code for the sink writer initialization, which crashes:
    HRESULT InitializeSinkWriter(WAVEFORMATEX *pwfx)
    {
        IMFMediaType *pMediaTypeIn = NULL;
        IMFMediaType *pMediaTypeOut = NULL;

        HRESULT hr = MFCreateSinkWriterFromURL(L"C:\\Users\\XXXXX\\Videos\\Output.wma", NULL, NULL, &pSinkWriter);

        WAVEFORMATEXTENSIBLE *pwfex = reinterpret_cast<WAVEFORMATEXTENSIBLE*>(pwfx);

        if (SUCCEEDED(hr)) hr = MFCreateMediaType(&pMediaTypeOut);
        if (SUCCEEDED(hr)) hr = pMediaTypeOut->SetGUID(MF_MT_MAJOR_TYPE, MFMediaType_Audio);
        if (SUCCEEDED(hr)) hr = pMediaTypeOut->SetGUID(MF_MT_SUBTYPE, MFAudioFormat_WMAudio_Lossless);
        if (SUCCEEDED(hr)) hr = pMediaTypeOut->SetUINT32(MF_MT_AUDIO_NUM_CHANNELS, pwfex->Format.nChannels);
        if (SUCCEEDED(hr)) hr = pMediaTypeOut->SetUINT32(MF_MT_AUDIO_SAMPLES_PER_SECOND, pwfex->Format.nSamplesPerSec);
        if (SUCCEEDED(hr)) hr = pMediaTypeOut->SetUINT32(MF_MT_AUDIO_BITS_PER_SAMPLE, pwfex->Format.wBitsPerSample);
        if (SUCCEEDED(hr)) hr = pMediaTypeOut->SetUINT32(MF_MT_AUDIO_BLOCK_ALIGNMENT, pwfex->Format.nBlockAlign);
        if (SUCCEEDED(hr)) hr = pMediaTypeOut->SetUINT32(MF_MT_AUDIO_AVG_BYTES_PER_SECOND, pwfex->Format.nAvgBytesPerSec);
        if (SUCCEEDED(hr)) hr = pSinkWriter->AddStream(pMediaTypeOut, &audioStream);

        if (SUCCEEDED(hr)) hr = MFCreateMediaType(&pMediaTypeIn);
        if (SUCCEEDED(hr)) hr = pMediaTypeIn->SetGUID(MF_MT_MAJOR_TYPE, MFMediaType_Audio);
        if (SUCCEEDED(hr)) hr = pMediaTypeIn->SetGUID(MF_MT_SUBTYPE, MFAudioFormat_Float);
        if (SUCCEEDED(hr)) hr = pMediaTypeIn->SetUINT32(MF_MT_AUDIO_NUM_CHANNELS, pwfex->Format.nChannels);
        if (SUCCEEDED(hr)) hr = pMediaTypeIn->SetUINT32(MF_MT_AUDIO_CHANNEL_MASK, pwfex->dwChannelMask);
        if (SUCCEEDED(hr)) hr = pMediaTypeIn->SetUINT32(MF_MT_AUDIO_FLOAT_SAMPLES_PER_SECOND, pwfex->Format.nSamplesPerSec);
        if (SUCCEEDED(hr)) hr = pMediaTypeIn->SetUINT32(MF_MT_AUDIO_BITS_PER_SAMPLE, pwfex->Format.wBitsPerSample);
        if (SUCCEEDED(hr)) hr = pMediaTypeIn->SetUINT32(MF_MT_AUDIO_AVG_BYTES_PER_SECOND, pwfex->Format.nAvgBytesPerSec);
        if (SUCCEEDED(hr)) hr = pMediaTypeIn->SetUINT32(MF_MT_AUDIO_VALID_BITS_PER_SAMPLE, pwfex->Samples.wValidBitsPerSample);
        if (SUCCEEDED(hr)) hr = pMediaTypeIn->SetUINT32(MF_MT_AUDIO_BLOCK_ALIGNMENT, pwfex->Format.nBlockAlign);
        if (SUCCEEDED(hr)) hr = pMediaTypeIn->SetUINT32(MF_MT_ALL_SAMPLES_INDEPENDENT, TRUE);
        if (SUCCEEDED(hr)) hr = pSinkWriter->SetInputMediaType(audioStream, pMediaTypeIn, NULL);

        if (SUCCEEDED(hr)) hr = pSinkWriter->BeginWriting();

        pMediaTypeOut->Release();  // crashes here when an earlier call failed (pointer still NULL)
        pMediaTypeIn->Release();   // crashes here when an earlier call failed (pointer still NULL)
        return hr;
    }
The average bytes per second on the output type should of course be different from the input, but there's no guideline in the documentation on how high it can go (for all other codecs it's in the documentation...) or how to set the compression/quality rate (from the documentation it seems the WMA/WMV MFT has a default rate when you don't set anything). But whatever you set the average bytes per second to, it doesn't change anything crash-related. The app crashes on both the pMediaTypeOut->Release and pMediaTypeIn->Release lines, which makes it obvious that the pointers are still NULL and were never set correctly.
EDIT: If I omit these two pointer release lines, the app doesn't crash.
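(For comparison, the usual cleanup pattern guards the release so that an early failure, which leaves the pointers NULL, cannot crash; a minimal sketch:)

    // Release only if the pointer was actually set; an early HRESULT
    // failure leaves these NULL, and calling Release() through a NULL
    // pointer is an access violation.
    if (pMediaTypeOut) { pMediaTypeOut->Release(); pMediaTypeOut = NULL; }
    if (pMediaTypeIn)  { pMediaTypeIn->Release();  pMediaTypeIn  = NULL; }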
Any ideas?
regards
coOKie
Hi all,
Does anyone know how to configure the EVR's mixer when using a custom presenter to directly process the D3D surfaces that come from the DXVA decoder?
I am writing a custom EVR presenter based on the Windows SDK sample "EVRPresenter", and I have succeeded in using the custom presenter to render video. However, when I compared the custom EVR with the default EVR, I found that the custom presenter cannot work efficiently with the hardware DXVA-enabled decoder. The custom presenter does not directly process the D3D surfaces coming from the DXVA decoder: only logs for DXVA2_ProcessDeviceCreated, DXVA2_ProcessBlt and DXVA2_ProcessDeviceDestroyed can be traced using DXVA Checker. With the default presenter, however, DXVA Checker traces all of the logs, including the DXVA2 decoding API and the DXVA2 processing API.
It seems there is some more work needed to configure the EVR's mixer when using a custom presenter. Can anyone give me some advice?
Hi, all.
I want to get the PMP-PE certification, but I don't know the email format to send to Microsoft.
What information (my company name, company phone number, etc.) does Microsoft want in the email?
And should I attach the files (MediaSource, ByteStreamHandler, etc.) to the email?
Please help me, Please Please Please~
I have to set the GOP size on the 'WMVideo9 Encoder MFT' encoder, but I don't know how. I didn't find such an attribute through IMFAttributes. Also, I would have to tell the encoder to use closed GOPs. The output format should be MFVideoFormat_WVC1.
Regarding GOP size, the closest I got is MF_MT_MAX_KEYFRAME_SPACING, but it doesn't look like it will set the GOP size.
Can somebody please comment on this?
Should I call QueryInterface() on the IMFTransform in order to get some interface that provides a way for setting the GOP size?
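For illustration, the two routes I have in mind look like this (a minimal sketch; the 30-frame spacing is just an example value, and whether a GOP-size property exists on the property store is exactly what I am asking):

    // Route 1: key-frame spacing on the output media type
    // (the closest attribute I have found so far).
    pOutputType->SetUINT32( MF_MT_MAX_KEYFRAME_SPACING, 30 );

    // Route 2: query the encoder MFT for a configuration interface
    // such as IPropertyStore and look for a GOP-related property there.
    IPropertyStore* pProps = NULL;
    pEncoder->QueryInterface( IID_PPV_ARGS( &pProps ) );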
Thank you!
I am using the WMVNetWrite sample in the Windows Media Format SDK and inserting a timestamp into the stream every 10 seconds with script commands. But for multi-bitrate streams, on the user side where I embed the stream, it shows 3 videos. I think WMVNetWrite doesn't handle multi-bitrate streams well.
How do I resolve this issue?
In my application I need to play a video and an audio file simultaneously. The video is without audio, and the audio file is selected based on the language we want to play.
The audio files have the same duration as the video files (typically 60 to 120 seconds) and we have done our best effort to make the sound 'lipsync' with the video.
I need to make a player that I can call from my C# application (it may be a separate application, native Win32 or Win64) where I can pass (as command-line arguments):
- the name of the video file (i.e. "instruction.wmv")
- the name of the audio file (i.e. "instruction_EN.mp3")
I have seen so many samples in the various MS SDKs and in this community that I have come to the conclusion that I don't know enough to make the most effective choice of where/how to start.
My question: does anyone know of a sample application (C++ or C#) that plays audio and video from separate files and could be the basis for my application?

MftH264Decoder::DecoderOutputState MftH264Decoder::GetOutput() {
  CComPtr<IMFSample> output_sample;
  if (!use_dxva_) {
    // If DXVA is enabled, the decoder will allocate the sample for us.
    output_sample.Attach(CreateEmptySampleWithBuffer(out_buffer_size_));
    if (output_sample == NULL) {
      cout << "GetSample: failed to create empty output sample";
      return kNoMemory;
    }
  }
  MFT_OUTPUT_DATA_BUFFER output_data_buffer;
  HRESULT hr;
  DWORD status;
  for (;;) {
    output_data_buffer.dwStreamID = 0;
    output_data_buffer.pSample = output_sample;
    output_data_buffer.dwStatus = 0;
    output_data_buffer.pEvents = NULL;
    hr = decoder_->ProcessOutput(0,                   // No flags
                                 1,                   // # of out streams to pull from
                                 &output_data_buffer,
                                 &status);
    IMFCollection* events = output_data_buffer.pEvents;
    if (events != NULL) {
      cout << "Got events from ProcessOutput, but discarding";
      events->Release();
    }
    if (FAILED(hr)) {
      cout << "ProcessOutput failed with status " << std::hex << hr
           << ", meaning..." << ProcessOutputStatusToCString(hr);
      if (hr == MF_E_TRANSFORM_STREAM_CHANGE) {
        if (!SetDecoderOutputMediaType(output_format_)) {
          cout << "Failed to reset output type";
          return kResetOutputStreamFailed;
        } else {
          cout << "Reset output type done";
          continue;
        }
      } else if (hr == MF_E_TRANSFORM_NEED_MORE_INPUT) {
        // If we have read everything then we should've sent a drain message
        // to the MFT. If the drain message is sent but it doesn't give out
        // any more output then we know the decoder has processed everything.
        if (drain_message_sent_) {
          cout << "Drain message was already sent + no output => done";
          return kNoMoreOutput;
        } else {
          if (!ReadAndProcessInput()) {
            cout << "Failed to read/process input. Sending drain message";
            if (!SendDrainMessage()) {
              cout << "Failed to send drain message";
              return kNoMoreOutput;
            }
          }
          continue;
        }
      } else {
        return kUnspecifiedError;
      }
    }
The above function is my implementation of the H.264 decoder. The input samples are read from the H.264 dump file as an elementary stream. Initially I am trying to pass only the first frame, which is of size 1043 bytes, to the decoder.
bool MftH264Decoder::ReadAndProcessInput() {

The ProcessOutput function always returns MF_E_TRANSFORM_NEED_MORE_INPUT; as a result it keeps on reading the first frame from the buffer.
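For reference, the input side looks roughly like this (a sketch, not my literal code; CreateInputSample is a hypothetical helper that wraps the frame bytes in an IMFSample):

    // Wrap one frame of the elementary stream and hand it to the decoder.
    IMFSample* sample = CreateInputSample(frame_data, frame_size);  // hypothetical helper
    sample->SetSampleTime(next_timestamp_);      // decoders generally expect a timestamp
    sample->SetSampleDuration(frame_duration_);
    HRESULT hr = decoder_->ProcessInput(0, sample, 0);
    // MF_E_NOTACCEPTING here would mean pending output must be drained first.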
Is this the correct way of calling the ProcessInput and ProcessOutput functions?
Hi there,
in my scenario I have several media sessions running in parallel, with custom media sources and a custom decoder MFT.
These sessions (and therefore also the media sources) are created and closed in parallel.
My media source allocates a private work queue using MFAllocateWorkQueue (based on the samples from the Windows SDK - OpQueue.h) to perform certain steps asynchronously. So far everything works great. However, in rare cases (sometimes during playback, sometimes when playback is started), calling MFPutWorkItem on the work queue returns MF_E_UNEXPECTED. The work queue remains unusable from this point on.
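For reference, the work queue usage follows the standard pattern (a sketch; pCallback stands for my IMFAsyncCallback implementation):

    // Allocate a private work queue for this media source instance.
    DWORD m_workQueueId = 0;
    HRESULT hr = MFAllocateWorkQueue( &m_workQueueId );

    // Schedule asynchronous work on it.
    hr = MFPutWorkItem( m_workQueueId, pCallback, pState );

    // On shutdown, release the queue; using the ID afterwards would
    // be expected to fail.
    hr = MFUnlockWorkQueue( m_workQueueId );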
I tried getting some more information using mftrace, but even at level 16 I just get messages from my own code, nothing from the work queue machinery itself.
My target platform is Windows 7, so trying MFPutWorkItem2 isn't an option for now.
Any ideas on what might be the reason for MFPutWorkItem to return this "unexpected" error?
Thanks in advance
In a project, I use a media session + sample grabber to capture audio and video buffers from a media file. To keep A/V sync, the attribute MF_SAMPLEGRABBERSINK_IGNORE_CLOCK is set to FALSE for both audio and video. In the same process, I use IMFSourceReader to capture audio buffers from the microphone in asynchronous mode. If I add a Sleep to IMFSampleGrabberSinkCallback::OnProcessSample(), the function IMFSourceReaderCallback::OnReadSample() is also slowed down. There is no lock shared between these 2 callback functions, and judging by the current thread IDs, the 2 callbacks run on different threads.
Do IMFSourceReader and the media session share the same clock? How can I make IMFSourceReaderCallback::OnReadSample() unaffected by IMFSampleGrabberSinkCallback::OnProcessSample()?
By the way, if the MF_SAMPLEGRABBERSINK_IGNORE_CLOCK attribute is set to TRUE, I don't see the issue.
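For reference, the attribute in question is set on the grabber's activation object, roughly like this (a sketch; pGrabberType and pCallback stand for my media type and callback):

    // Create the sample grabber sink and make it honor the presentation
    // clock (MF_SAMPLEGRABBERSINK_IGNORE_CLOCK = FALSE).
    IMFActivate* pSinkActivate = NULL;
    MFCreateSampleGrabberSinkActivate( pGrabberType, pCallback, &pSinkActivate );
    pSinkActivate->SetUINT32( MF_SAMPLEGRABBERSINK_IGNORE_CLOCK, FALSE );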
Thanks!