Channel: Media Foundation Development for Windows Desktop forum
Viewing all 1079 articles

Unexpected error 0xc00d36b4 The data specified for the media type is invalid, inconsistent, or not supported by this object in Windows 10


We have a video creation module based on Microsoft's "Tutorial: Using the Sink Writer to Encode Video" at https://msdn.microsoft.com/en-us/library/windows/desktop/ff819477(v=vs.85).aspx.

This code worked great with MFVideoFormat_H264 on Windows 7, 8 and 8.1 to generate MP4 files but we get the following error on Windows 10 when invoking:

hr = pSinkWriter->SetInputMediaType(streamIndex, pMediaTypeIn, NULL);

although the Windows 10 machines have a valid H.264 codec installed.

Is this a known limitation?

Is there a workaround?
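For reference, a minimal sketch of the input media type setup from the tutorial, with comments on the two points that the Windows 10 H.264 encoder is known to be stricter about than Windows 7/8.x. VIDEO_WIDTH, VIDEO_HEIGHT, and VIDEO_FPS are the tutorial's constants; this is a sketch under those assumptions, not a confirmed fix.

```cpp
// Sketch: build the sink writer input type as in the tutorial. On Windows 10
// it is worth verifying that (a) frame dimensions are even, since the encoder
// consumes 4:2:0 data, and (b) the input subtype is one the encoder accepts
// (e.g. NV12 or RGB32) rather than an arbitrary uncompressed format.
CComPtr<IMFMediaType> pMediaTypeIn;
HRESULT hr = MFCreateMediaType(&pMediaTypeIn);
if (SUCCEEDED(hr)) hr = pMediaTypeIn->SetGUID(MF_MT_MAJOR_TYPE, MFMediaType_Video);
if (SUCCEEDED(hr)) hr = pMediaTypeIn->SetGUID(MF_MT_SUBTYPE, MFVideoFormat_NV12);
if (SUCCEEDED(hr)) hr = pMediaTypeIn->SetUINT32(MF_MT_INTERLACE_MODE, MFVideoInterlace_Progressive);
if (SUCCEEDED(hr)) hr = MFSetAttributeSize(pMediaTypeIn, MF_MT_FRAME_SIZE,
                                           VIDEO_WIDTH, VIDEO_HEIGHT); // must be even
if (SUCCEEDED(hr)) hr = MFSetAttributeRatio(pMediaTypeIn, MF_MT_FRAME_RATE, VIDEO_FPS, 1);
if (SUCCEEDED(hr)) hr = MFSetAttributeRatio(pMediaTypeIn, MF_MT_PIXEL_ASPECT_RATIO, 1, 1);
if (SUCCEEDED(hr)) hr = pSinkWriter->SetInputMediaType(streamIndex, pMediaTypeIn, NULL);
```

If SetInputMediaType still returns 0xC00D36B4 with a known-good subtype and even dimensions, the mismatch is likely between this input type and the output type set via AddStream.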


encoded mp4 video using moov before mdat


I'm following the documented example for creating an encoded video using WMF:

https://msdn.microsoft.com/en-us/library/windows/desktop/ff819477(v=vs.85).aspx

However I've noticed that the files have the mdat before moov. Is there a way to write the moov before the mdat?

I've noticed there is an attribute MF_MPEG4SINK_MOOV_BEFORE_MDAT; however, setting it does not seem to have any effect. I presume this is because I'm creating the sink writer the same way as the example, rather than via MFCreateMPEG4MediaSink.

My questions are:

1. How do I get the example to write moov before mdat?

2. If this is impossible, how do I convert the example to use MFCreateMPEG4MediaSink? I can't seem to find any examples of it. Will this allow me to use attribute MF_MPEG4SINK_MOOV_BEFORE_MDAT?
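For reference, a hedged sketch of creating the MPEG-4 media sink directly and wrapping it in a sink writer. The names (pMediaTypeOut for the H.264 output type, the output file name) are placeholders; where exactly MF_MPEG4SINK_MOOV_BEFORE_MDAT must be set is not clearly documented, so the QueryInterface step below is an assumption that the sink exposes IMFAttributes.

```cpp
// Sketch (untested): byte stream -> MPEG-4 media sink -> sink writer.
CComPtr<IMFByteStream> pByteStream;
HRESULT hr = MFCreateFile(MF_ACCESSMODE_WRITE, MF_OPENMODE_DELETE_IF_EXIST,
                          MF_FILEFLAGS_NONE, L"output.mp4", &pByteStream);

CComPtr<IMFMediaSink> pMediaSink;
if (SUCCEEDED(hr))
    hr = MFCreateMPEG4MediaSink(pByteStream, pMediaTypeOut, NULL, &pMediaSink);

// Assumption: the moov-before-mdat hint lives on the sink's attribute store.
// If this QueryInterface fails, the attribute has to be applied elsewhere.
CComPtr<IMFAttributes> pSinkAttrs;
if (SUCCEEDED(hr))
    hr = pMediaSink->QueryInterface(IID_PPV_ARGS(&pSinkAttrs));
if (SUCCEEDED(hr))
    hr = pSinkAttrs->SetUINT32(MF_MPEG4SINK_MOOV_BEFORE_MDAT, TRUE);

CComPtr<IMFSinkWriter> pSinkWriter;
if (SUCCEEDED(hr))
    hr = MFCreateSinkWriterFromMediaSink(pMediaSink, NULL, &pSinkWriter);
```

Note that writing moov first requires a seekable byte stream, since the sink must go back and fill in the moov box when finalizing.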

Thanks,

Getting ID3D11Texture2D from IMFSample produced by DXVA-enabled decoder MFT


I am trying to use a source reader initialized with an H.264 MP4 file and a Direct3D 11 device, on a Windows 10 system, in a classic desktop application.

The background of the process is given in "Supporting Direct3D 11 Video Decoding in Media Foundation," except that I am not building a topology and I don't have a session; the case is reduced to just the source reader. And even though the article describes implementing a decoder, I am not implementing one here; I am trying to use the stock H.264 decoder instead, assuming it works along the same lines.

The DXGI Device Manager is used to share the Direct3D 11 device between components. The DXGI Device Manager exposes the IMFDXGIDeviceManager interface. The pipeline sets the IMFDXGIDeviceManager pointer on the software decoder by sending the MFT_MESSAGE_SET_D3D_MANAGER message.

I already have a Direct3D 11 device; I created an IMFDXGIDeviceManager and initialized it with this device. Using MF_SOURCE_READER_D3D_MANAGER, the manager is passed to the source reader with MFCreateSourceReaderFromURL.

Things go well up to getting a sample from the source reader. I get an NV12 media sample and can see that it comes from the internally used MFT, with attributes including (among others):

  • MF_SA_D3D11_AWARE, vValue 1 (Type VT_UI4)
  • CODECAPI_AVDecVideoAcceleration_H264, vValue 1 (Type VT_UI4)

As far as I can tell, these do not confirm DXVA use; they only confirm that I requested hardware-assisted decoding. Having obtained a media sample, I still have no confirmation of whether decoding is actually hardware accelerated.

The IMFSample has one buffer, an IMFMediaBuffer implemented by MF's CMF2DMediaBuffer. It does implement IMFGetService but does not expose any services; specifically it does not expose MR_BUFFER_SERVICE, which presumably makes it impossible to reach the underlying surface/texture (if one exists at all).

As far as I can see, either I failed to initialize Direct3D-enabled H.264 decoding, or the decoder was unable to initialize the DXVA decoding context and fell back to software decoding, or everything is fine and I am simply unable to access the underlying Direct3D 11 Texture2D.

  • Is there any way to query H.264 decoder if it is actually using hardware Direct3D 11 decoder?
  • Any way to troubleshoot MFT configuration within source reader to ensure it picked up Direct3D device?
  • How am I supposed to resolve the IMFSample to a Direct3D resource? Is it still the buffer's IMFGetService::GetService(MR_BUFFER_SERVICE, __uuidof(ID3D11Texture2D), ...) call?

http://alax.info/blog/tag/directshow
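For reference, a minimal sketch of the Direct3D 11 path: on Windows 8 and later the texture behind a sample's buffer is reached through IMFDXGIBuffer rather than MR_BUFFER_SERVICE (which is the Direct3D 9 route). The variable names are placeholders.

```cpp
// Sketch: resolve an IMFSample to the underlying ID3D11Texture2D.
// If the QueryInterface for IMFDXGIBuffer fails, the buffer holds a
// system-memory copy and the pipeline did not hand out a D3D11 surface,
// which is itself a useful diagnostic.
CComPtr<IMFMediaBuffer> pBuffer;
HRESULT hr = pSample->GetBufferByIndex(0, &pBuffer);

CComPtr<IMFDXGIBuffer> pDxgiBuffer;
if (SUCCEEDED(hr))
    hr = pBuffer->QueryInterface(IID_PPV_ARGS(&pDxgiBuffer));

CComPtr<ID3D11Texture2D> pTexture;
UINT subresource = 0;
if (SUCCEEDED(hr))
    hr = pDxgiBuffer->GetResource(IID_PPV_ARGS(&pTexture));
if (SUCCEEDED(hr))
    hr = pDxgiBuffer->GetSubresourceIndex(&subresource); // may be an array slice
```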

Order of streams in source created using MFCreateAggregateSource


If I combine multiple media sources using MFCreateAggregateSource, is it safe to assume that the streams in the aggregate source will be ordered according to the collection I pass to MFCreateAggregateSource?

E.g., are the stream indices in the aggregate source guaranteed to start at 0, then at the number of streams in source 1, then at the number of streams in source 1 plus the number of streams in source 2, etc.?

If not, what would be the way to find out the source of a stream, given that I only have the aggregate source? The application scenario is multiple audio sources played simultaneously; I want to direct them to different sinks and need to know which sink receives samples from which source.
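For reference, a sketch of enumerating the aggregate source's streams and reading each stream identifier. How the aggregate source assigns identifiers is not documented, so this only shows the enumeration; mapping identifiers back to the original sources would have to be established empirically against each source's own presentation descriptor.

```cpp
// Sketch: walk the presentation descriptor of the aggregate source.
CComPtr<IMFPresentationDescriptor> pPD;
HRESULT hr = pAggregateSource->CreatePresentationDescriptor(&pPD);

DWORD cStreams = 0;
if (SUCCEEDED(hr))
    hr = pPD->GetStreamDescriptorCount(&cStreams);

for (DWORD i = 0; SUCCEEDED(hr) && i < cStreams; i++)
{
    BOOL fSelected = FALSE;
    CComPtr<IMFStreamDescriptor> pSD;
    hr = pPD->GetStreamDescriptorByIndex(i, &fSelected, &pSD);

    DWORD dwStreamId = 0;
    if (SUCCEEDED(hr))
        hr = pSD->GetStreamIdentifier(&dwStreamId);
    // dwStreamId can be compared against the identifiers the individual
    // sources report in their own presentation descriptors.
}
```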

Thanks in advance,
Christoph


How to get correct device symbolic link to audio capture device?


Hi,

I need to process the WM_DEVICECHANGE message to handle audio device loss.

In the case of a video device everything works correctly: after receiving DBT_DEVICEREMOVECOMPLETE, I see the correct video device ID in the dbcc_name member of the DEV_BROADCAST_DEVICEINTERFACE structure. In my case this is:
L"\\\\?\\USB#VID_046D&PID_082D&MI_00#7&1bfad7d5&0&0000#{65e8773d-8f56-11d0-a3b9-00a0c9223196}\\{bbefb6c7-2fc4-4139-bb8b-a58bba724083}", which shares its prefix with the string I receive from MF_DEVSOURCE_ATTRIBUTE_SOURCE_TYPE_VIDCAP_SYMBOLIC_LINK:
"\\\\?\\usb#vid_046d&pid_082d&mi_00#7&1bfad7d5&0&0000#{e5323777-f976-4f5b-9b55-b94699c46e44}\\{bbefb6c7-2fc4-4139-bb8b-a58bba724083}". The common prefix here is L"\\\\?\\usb#vid_046d&pid_082d&mi_00#7&1bfad7d5&0&0000#".

In the case of an audio device I have a problem: after receiving DBT_DEVICEREMOVECOMPLETE, the audio device ID in the dbcc_name member of DEV_BROADCAST_DEVICEINTERFACE differs from what I receive from MF_DEVSOURCE_ATTRIBUTE_SOURCE_TYPE_AUDCAP_SYMBOLIC_LINK or MF_DEVSOURCE_ATTRIBUTE_SOURCE_TYPE_AUDCAP_ENDPOINT_ID. In my case dbcc_name is:
L"\\\\?\\USB#VID_046D&PID_082D&MI_02#7&1bfad7d5&0&0002#{65e8773d-8f56-11d0-a3b9-00a0c9223196}\\GLOBAL", but I receive "\\\\?\\SWD#MMDEVAPI#{0.0.1.00000000}.{6e254647-6d40-41ab-afa0-1848c4a11bea}#{2eef81be-33fa-4800-9670-1cd474972c3f}" from MF_DEVSOURCE_ATTRIBUTE_SOURCE_TYPE_AUDCAP_SYMBOLIC_LINK. Using MF_DEVSOURCE_ATTRIBUTE_SOURCE_TYPE_AUDCAP_ENDPOINT_ID did not succeed either.

When a webcam has an embedded microphone, removing the camera from the system means I lose access to the video and the audio device at the same time, so I need to handle this case correctly.
Could you please describe how to get a WM_DEVICECHANGE-compatible device symbolic link for a removed audio capture device?
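For reference, a hedged sketch of an alternative for the audio side: WM_DEVICECHANGE reports the underlying KS interface path, whereas the MMDevice API's notifications use the same endpoint ID string that MF_DEVSOURCE_ATTRIBUTE_SOURCE_TYPE_AUDCAP_ENDPOINT_ID returns, so the two can be compared directly. The IUnknown plumbing and the remaining IMMNotificationClient methods are omitted.

```cpp
// Sketch: listen for audio endpoint removal via the MMDevice API instead
// of WM_DEVICECHANGE, so the reported ID matches the MF attribute.
class DeviceNotifier : public IMMNotificationClient
{
    // ... IUnknown implementation omitted ...
public:
    STDMETHODIMP OnDeviceStateChanged(LPCWSTR pwstrDeviceId, DWORD dwNewState)
    {
        if (dwNewState == DEVICE_STATE_NOTPRESENT ||
            dwNewState == DEVICE_STATE_UNPLUGGED)
        {
            // pwstrDeviceId is directly comparable to the string obtained
            // from MF_DEVSOURCE_ATTRIBUTE_SOURCE_TYPE_AUDCAP_ENDPOINT_ID.
        }
        return S_OK;
    }
    STDMETHODIMP OnDeviceRemoved(LPCWSTR pwstrDeviceId) { return S_OK; }
    // ... remaining IMMNotificationClient methods omitted ...
};

// Registration (assumes COM is initialized and 'notifier' outlives it):
// CComPtr<IMMDeviceEnumerator> pEnum;
// pEnum.CoCreateInstance(__uuidof(MMDeviceEnumerator));
// pEnum->RegisterEndpointNotificationCallback(&notifier);
```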

Thanks!

Media Foundation Samples


I've recently installed Visual Studio 2015 on a Windows 10 system. None of the SDK versions that were installed include the Media Foundation samples. I tried installing the Windows 7 SDK, but it won't install for some reason. Is there any other way to get the samples?

Thanks,

Chuck

How to decode super short H.264 clip properly with MFT


I am using an MFT to decode an H.264 byte stream, which works well with normal clips. My program works like this:

(1) Read compressed frame from media file;

(2) Call ProcessInput to send this frame to MFT for decoding;

(3) Call ProcessOutput in loop to retrieve decoded pictures;

(4) Drain MFT when there is no more input frame, and repeat step (3) to get all remaining pictures decoded.

However, my program cannot handle super short clips of 20 frames or fewer. For such a clip there is no decoded frame available when ProcessOutput is called, due to initial buffering in the MFT. When I then drain the MFT and call ProcessOutput, I get a stream-changed error. Following the online documentation, if I set the output media type again, the subsequent ProcessOutput calls give me a sequence of unspecified errors with empty IMFSamples.
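For reference, a hedged sketch of the drain loop. The key detail is that MF_E_TRANSFORM_STREAM_CHANGE during draining means the output type must be re-negotiated and, if the frame size changed, the output sample re-allocated before ProcessOutput is retried; the loop ends normally with MF_E_TRANSFORM_NEED_MORE_INPUT. pOutSample and pDecoder are placeholders for the program's own objects.

```cpp
// Sketch: drain the decoder and handle the mid-drain stream change.
HRESULT hr = pDecoder->ProcessMessage(MFT_MESSAGE_COMMAND_DRAIN, 0);
for (;;)
{
    MFT_OUTPUT_DATA_BUFFER output = { 0 };
    output.pSample = pOutSample;   // caller-allocated for this decoder
    DWORD status = 0;
    hr = pDecoder->ProcessOutput(0, 1, &output, &status);

    if (hr == MF_E_TRANSFORM_STREAM_CHANGE)
    {
        // Re-negotiate the output type, then retry ProcessOutput.
        CComPtr<IMFMediaType> pType;
        if (SUCCEEDED(pDecoder->GetOutputAvailableType(0, 0, &pType)))
            pDecoder->SetOutputType(0, pType, 0);
        // If MF_MT_FRAME_SIZE changed, pOutSample's buffer must be
        // re-allocated here to the new size before continuing.
        continue;
    }
    if (hr == MF_E_TRANSFORM_NEED_MORE_INPUT)
        break;                     // fully drained
    if (FAILED(hr))
        break;                     // real error
    // ... consume output.pSample (a decoded picture) ...
}
```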

Thanks for your advice on how to fix my program for such super short clip.

MPEG2 Playback


Hi,

how is it possible to enable the MPEG-2 decoder MFT on Windows 8.1 Pro? At the moment I always get the SL_E_LICENSE_FILE_NOT_INSTALLED error when I try to activate the decoder via ppActivate[0]->ActivateObject(IID_PPV_ARGS(&decoder)).

Both Windows Media Packs are no longer available for Windows 8.1 (as noted in https://support.microsoft.com/en-us/kb/3107057). I'm not sure whether the MPEG-2 decoder MFT can really be unlocked by installing a DVD player application. Does anyone have experience with this?

best regards

saoirse


MFStartup several calls


Hi,

in the Microsoft documentation you can read that you may call MFStartup several times in one program:

"An application must call this function before using Media Foundation. Before your application quits, call MFShutdown once for every previous call to MFStartup."

In which situations is this useful or recommended?
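A typical case, sketched below, is a library or plug-in that uses Media Foundation internally: because MFStartup/MFShutdown are reference counted, the component can bracket its own usage without knowing whether the hosting application has already initialized MF.

```cpp
// Sketch: a component initializes and shuts down MF independently of its host.
HRESULT hr = MFStartup(MF_VERSION);   // e.g. in a DLL's init path
// ... use Media Foundation ...
if (SUCCEEDED(hr))
    MFShutdown();                      // balances this component's MFStartup
```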

best regards

saoirse

IMFSourceReader::ReadSample - notification for receiving all bytes for a whole encoded video frame?


Hi,

using the IMFSourceReader API, how do you know that you have received a whole frame from the ReadSample method?

In my particular case I have an MPEG-2 video file. I do not use the source reader to decode the frames; I just want the encoded frames. Therefore I set MFVideoFormat_MPEG2 as the native format for the IMFSourceReader.

Microsoft says: "The contents of the media data depend on the format of the stream. For an uncompressed video stream, each media sample contains a single video frame. For an uncompressed audio stream, each media sample contains a sequence of audio frames." (https://msdn.microsoft.com/en-us/library/windows/desktop/dd389281%28v=vs.85%29.aspx)

In my case, it seems that each MPEG-2 frame needs several IMFSourceReader::ReadSample() calls to arrive completely. Is there any way to know when you have received all bytes from the IMFSourceReader for one video frame? I'm not sure whether I can use the GetSampleDuration() method to determine if I have a complete sample.
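For reference, a sketch of the synchronous ReadSample call with the timestamp/duration heuristic mentioned above. How the MPEG-2 source splits frames across samples is up to the source, so treat the comparison as a heuristic, not a guarantee.

```cpp
// Sketch: read one sample and inspect its timing. Consecutive samples with
// the same timestamp would suggest fragments of one frame; a timestamp that
// advances by roughly the previous sample's duration suggests a new frame.
DWORD dwActualStream = 0, dwFlags = 0;
LONGLONG llTimestamp = 0;
CComPtr<IMFSample> pSample;
HRESULT hr = pReader->ReadSample(MF_SOURCE_READER_FIRST_VIDEO_STREAM,
                                 0, &dwActualStream, &dwFlags,
                                 &llTimestamp, &pSample);
if (SUCCEEDED(hr) && pSample)
{
    LONGLONG duration = 0;
    if (SUCCEEDED(pSample->GetSampleDuration(&duration)))
    {
        // Compare llTimestamp against the previous sample's
        // timestamp + duration to group fragments per frame.
    }
}
```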

best regards

saoirse

Submitting raw H264 samples to SinkWriter


Hello,

I can successfully encode H.264 to an AVI file using an IMFSinkWriter by submitting YUV420 frames.

Now I would like to use the sink writer not as an encoder, but as a simple AVI writer. I encode the YUV420 frames with another encoder (CUDA NVENC), get H.264 samples, and would like to send those samples to the sink writer so that the H.264 stream is properly encapsulated in the AVI file.

But I cannot find a way to configure the input and output media types of the sink writer; none of the configurations I tried worked.

Is it possible to use the sink writer that way?

I also tried writing the H.264 samples directly to the IMFByteStream behind the sink writer, but the resulting file does not play.
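For reference, a hedged sketch of the pass-through configuration: when the stream's target type and the input type are the same compressed H.264 type, the sink writer loads no encoder and only multiplexes the submitted samples. Whether the AVI sink in a given Windows version accepts H.264 this way is an assumption to verify; width, height, fpsNum, fpsDen, and bitrate are placeholders.

```cpp
// Sketch: pass-through writing of pre-encoded H.264 samples.
CComPtr<IMFMediaType> pH264Type;
HRESULT hr = MFCreateMediaType(&pH264Type);
if (SUCCEEDED(hr)) hr = pH264Type->SetGUID(MF_MT_MAJOR_TYPE, MFMediaType_Video);
if (SUCCEEDED(hr)) hr = pH264Type->SetGUID(MF_MT_SUBTYPE, MFVideoFormat_H264);
if (SUCCEEDED(hr)) hr = MFSetAttributeSize(pH264Type, MF_MT_FRAME_SIZE, width, height);
if (SUCCEEDED(hr)) hr = MFSetAttributeRatio(pH264Type, MF_MT_FRAME_RATE, fpsNum, fpsDen);
if (SUCCEEDED(hr)) hr = pH264Type->SetUINT32(MF_MT_AVG_BITRATE, bitrate);

DWORD streamIndex = 0;
if (SUCCEEDED(hr)) hr = pSinkWriter->AddStream(pH264Type, &streamIndex);
// Passing the same compressed type as input requests pass-through (no encode).
if (SUCCEEDED(hr)) hr = pSinkWriter->SetInputMediaType(streamIndex, pH264Type, NULL);
```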

Regards

Pierre Chatelier



Drive Letter or full path of media device or MTP devices

Is there any way to find out the full path of MTP devices, or to enumerate their drive letters?

Codec / format diff between WMP12 and IMFSourceReader


Hi,

is there a difference between the format/codec support of Windows Media Player 12 and that of the IMFSourceReader API?

I have a lot of AVI video files (e.g. with DV codec ID dvsd, 720x576) which can be played back by Windows Media Player but cannot be opened by the IMFSourceReader:

IMFSourceReader* reader = nullptr;
HRESULT hr = MFCreateSourceReaderFromURL (wstring.data (), NULL, &reader);

The return value of MFCreateSourceReaderFromURL is hr = 0xC00D36C4, "The byte stream type of the given URL is unsupported".

Is there any information describing these format/codec differences?

best regards

saoirse


Requirements for CLSID_CMpeg4sDecMFT


After getting the Microsoft H.264 MFT decoder to output data, I am trying to get the MPEG-4 Part 2 decoder to do the same, but so far no luck. I am using the same framework, initializing the CLSID_CMpeg4sDecMFT decoder and passing bitstream data in. However, nothing comes back out, so clearly something is wrong with the initialization or the format of my input bitstream chunks, and the documentation is sparse. Does anyone know of special requirements for the bitstream format being passed in? (For instance, the H.264 decoder required Annex B bitstreams, not MPEG-4-style length-prefixed ones.)

I am setting the Subtype to MFVideoFormat_MP4V.

The documentation says: 

The MPEG4 Part 2 Video decoder accepts the following formats.

What does this mean? How should the IMFSamples be created to match one of these formats?
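For reference, a heavily hedged sketch of an input type for the MPEG-4 Part 2 decoder. The exact attribute set the decoder requires is not well documented, so this is only a starting point under the assumption that plain elementary-stream data (with in-band VOL/VOP start codes) is expected; width, height, fpsNum, and fpsDen are placeholders.

```cpp
// Sketch: minimal MP4V input type for CLSID_CMpeg4sDecMFT.
CComPtr<IMFMediaType> pInputType;
HRESULT hr = MFCreateMediaType(&pInputType);
if (SUCCEEDED(hr)) hr = pInputType->SetGUID(MF_MT_MAJOR_TYPE, MFMediaType_Video);
if (SUCCEEDED(hr)) hr = pInputType->SetGUID(MF_MT_SUBTYPE, MFVideoFormat_MP4V);
if (SUCCEEDED(hr)) hr = MFSetAttributeSize(pInputType, MF_MT_FRAME_SIZE, width, height);
if (SUCCEEDED(hr)) hr = MFSetAttributeRatio(pInputType, MF_MT_FRAME_RATE, fpsNum, fpsDen);
if (SUCCEEDED(hr)) hr = pDecoder->SetInputType(0, pInputType, 0);
// If SetInputType succeeds but no output appears, calling
// GetInputAvailableType in a loop shows exactly which subtypes and
// attributes the decoder itself proposes.
```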

IMFSinkWriter: Merit validation failed for MFT (Intel Quick Sync Video H.264 Encoder MFT)


Hi,

We are trying to run a simple reader --> writer (transcoder, VC-1 -> H.264) in Media Foundation.
The source data (VC-1) is captured with our own equipment, so no "premium protected content" or similar is involved; the goal is to use the Intel® Quick Sync Video H.264 Encoder MFT.

Looking at the Media Foundation trace log, we can see that a hardware MFT is enumerated and created, but it then fails:

CoCreateInstance @ Created {4BE8D3C0-0515-4A37-AD55-E4BAE19AF471} Intel® Quick Sync Video H.264 Encoder MFT (c:\Program Files\Intel\Media SDK\mfx_mft_h264ve_w7_32.dll)
MFGetMFTMerit @ Merit validation failed for MFT @06A42CA0 (hr=E_FAIL)

We provide an IDirect3DDeviceManager9 pointer to Media Foundation when creating our source reader and sink writer, according to the documentation.
It seems strange that MF wants to use a protected media path. Are we supposed to pass some encoder parameter to disable this type of behavior? Any ideas?

A standard monitor with a DVI cable is used, on Windows 7.

best regards,

Carl



Video IMFSourceReaderCallback sets MF_SOURCE_READERF_STREAMTICK on first sample, never gets called again (Windows 10)


My application reads from a webcam using a source reader in asynchronous mode. I start video capture by calling ReadSample() on my source reader:

cameraReader->ReadSample(MF_SOURCE_READER_FIRST_VIDEO_STREAM, 0, NULL, NULL, NULL, NULL);

The IMFSourceReaderCallback implementation processes the sample and calls ReadSample() to request the next one:

STDMETHODIMP OnReadSample(HRESULT hrStatus, DWORD dwStreamIndex, DWORD dwStreamFlags, LONGLONG llTimestamp, IMFSample *pSample)
{
    if (FAILED(hrStatus)) {
        videoContinue = false; /* global */
    }

    /* ...check dwStreamFlags here... doesn't do anything for MF_SOURCE_READERF_STREAMTICK */

    if (SUCCEEDED(hrStatus) && videoContinue && pSample) {
        /* Process frame */
    }

    if (SUCCEEDED(hrStatus) && videoContinue) {
        /* request next frame */
    } else {
        /* Clean up */
    }

    return S_OK;
}

The only problem I have seen is on Windows 10. Video is started and a sample is requested. On the first invocation of the callback, dwStreamFlags is set to MF_SOURCE_READERF_STREAMTICK and pSample is NULL, so I don't do any processing. At the end of the callback another sample is requested, since nothing has failed, but the callback is never invoked again; my program hangs waiting for the next sample.

How should I handle MF_SOURCE_READERF_STREAMTICK, if at all? And is there a reason why this flag would be set on the first sample of a stream?

Hardware accelerated SinkWriter Example


Does anyone know of, or have a sample of their own they are willing to share, that illustrates how to use MF to create and use a SinkWriter that hardware-encodes NV12 input to H.264?

I can get my code working in software, but I need to use hardware encoding to lower my CPU usage. I am working on a Dell Latitude E6450 with both an AMD Radeon HD 8790M and an Intel Core i7-4800MQ CPU. Both are supposed to support hardware encoding and have MFTs.

I would greatly appreciate tips/advice/code samples to correct this problem.

mftrace shows the following:

For the AMD GPU:

11308,EB0 15:22:09.34586 COle32ExportDetours::CoCreateInstance @ Failed to create {ADC9BC80-0F41-46C6-AB75-D693D793597D} AMD H.264 Hardware MFT Encoder (C:\Program Files\Common Files\ATI Technologies\Multimedia\AMDh264Enc32.dll) hr=0x80004005 E_FAIL
11308,EB0 15:22:09.34587 COle32ExportDetours::CoCreateInstance @ call to 'hr' failed (hr=0x80004005) at avcore\mf\samples\mf360\mftrace\mfdetours\otherdetours\ole32exportdetours.cpp:116
11308,EB0 15:22:09.34587 COle32ExportDetours::CoCreateInstance @ - exit (failed hr=0x80004005 E_FAIL)
11308,EB0 15:22:09.34587 CMFActivateDetours::ActivateObject @0078D1C0 call to 'FindDetouredVtbl( This->lpVtbl )->ActivateObject( This, riid, ppv )' failed (hr=0x80004005) at avcore\mf\samples\mf360\mftrace\mfdetours\interfacedetours\mfactivatedetours.cpp:440
11308,EB0 15:22:09.34587 CMFActivateDetours::ActivateObject @0078D1C0 - exit (failed hr=0x80004005 E_FAIL)

For the Intel CPU/GPU:

6332,CA4 15:24:04.14466 COle32ExportDetours::CoCreateInstance @ Created {4BE8D3C0-0515-4A37-AD55-E4BAE19AF471} Intel® Quick Sync Video H.264 Encoder MFT (C:\Program Files\Intel\Media SDK\mfx_mft_h264ve_w7_32.dll) @07AD3A00 - traced interfaces: IMFTransform @07AD3A00,
6332,CA4 15:24:04.14466 COle32ExportDetours::CoCreateInstance @ - exit
6332,CA4 15:24:04.14466 CMFActivateDetours::GetGUID @0040D218 - enter
6332,CA4 15:24:04.14466 CMFActivateDetours::GetGUID @0040D218 - exit
6332,CA4 15:24:04.14466 CMFActivateDetours::GetUnknown @0040D218 - enter
6332,CA4 15:24:04.14466 CMFActivateDetours::GetUnknown @0040D218 attribute not found guidKey = MFT_FIELDOFUSE_UNLOCK_Attribute
6332,CA4 15:24:04.14611 CMFActivateDetours::GetUnknown @0040D218 - exit (failed hr=0xC00D36E6 MF_E_ATTRIBUTENOTFOUND)
6332,CA4 15:24:04.14611 CMFActivateDetours::GetGUID @0040D218 - enter
6332,CA4 15:24:04.14611 CMFActivateDetours::GetGUID @0040D218 - exit
6332,CA4 15:24:04.14611 CMFActivateDetours::GetUnknown @0040D218 - enter
6332,CA4 15:24:04.14611 CMFActivateDetours::GetUnknown @0040D218 attribute not found guidKey = MFT_PREFERRED_ENCODER_PROFILE
6332,CA4 15:24:04.14611 CMFActivateDetours::GetUnknown @0040D218 - exit (failed hr=0xC00D36E6 MF_E_ATTRIBUTENOTFOUND)
6332,CA4 15:24:04.14611 CMFActivateDetours::GetUnknown @0040D218 - enter
6332,CA4 15:24:04.14612 CMFActivateDetours::GetUnknown @0040D218 attribute not found guidKey = MFT_PREFERRED_OUTPUTTYPE_Attribute
6332,CA4 15:24:04.14612 CMFActivateDetours::GetUnknown @0040D218 - exit (failed hr=0xC00D36E6 MF_E_ATTRIBUTENOTFOUND)
6332,CA4 15:24:04.14612 CMFActivateDetours::GetUINT32 @0040D218 - enter
6332,CA4 15:24:04.14612 CMFActivateDetours::GetUINT32 @0040D218 - exit
6332,CA4 15:24:04.14612 CMFPlatExportDetours::MFGetMFTMerit @ - enter
6332,CA4 15:24:04.19958 CMFPlatExportDetours::MFGetMFTMerit @ Merit validation failed for MFT @07AD3A00 (hr=80004005 E_FAIL)
6332,CA4 15:24:04.19958 CMFPlatExportDetours::MFGetMFTMerit @ - exit (failed hr=0x80004005 E_FAIL)
6332,CA4 15:24:04.20529 CMFActivateDetours::ActivateObject @0040D218 call to 'FindDetouredVtbl( This->lpVtbl )->ActivateObject( This, riid, ppv )' failed (hr=0x80004005) at avcore\mf\samples\mf360\mftrace\mfdetours\interfacedetours\mfactivatedetours.cpp:440
6332,CA4 15:24:04.20529 CMFActivateDetours::ActivateObject @0040D218 - exit (failed hr=0x80004005 E_FAIL)
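For reference, a hedged sketch of the attributes commonly needed before the sink writer will consider hardware encoder MFTs at all. Whether a given vendor MFT then passes merit validation (the E_FAIL seen in the traces above) is a separate, driver-side question. The output URL and pD3DManager are placeholders.

```cpp
// Sketch: opt the sink writer in to hardware MFTs and hand it the
// D3D device manager (IDirect3DDeviceManager9 on Windows 7).
CComPtr<IMFAttributes> pAttrs;
HRESULT hr = MFCreateAttributes(&pAttrs, 2);
if (SUCCEEDED(hr))
    hr = pAttrs->SetUINT32(MF_READWRITE_ENABLE_HARDWARE_TRANSFORMS, TRUE);
if (SUCCEEDED(hr))
    hr = pAttrs->SetUnknown(MF_SINK_WRITER_D3D_MANAGER, pD3DManager);

CComPtr<IMFSinkWriter> pSinkWriter;
if (SUCCEEDED(hr))
    hr = MFCreateSinkWriterFromURL(L"output.mp4", NULL, pAttrs, &pSinkWriter);
```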

difference between IMFSourceReader API and MFTEnumEx, finding decoders


Hi,

I am having trouble finding decoders for MFVideoFormat_DV25, MFVideoFormat_DV50, and MFVideoFormat_DV100, as well as MFVideoFormat_HEVC:

MFT_REGISTER_TYPE_INFO info = { MFMediaType_Video, subtype };
UINT32 unFlags = MFT_ENUM_FLAG_SYNCMFT | MFT_ENUM_FLAG_HARDWARE | MFT_ENUM_FLAG_SORTANDFILTER;

hr = MFTEnumEx(MFT_CATEGORY_VIDEO_DECODER, unFlags,
        &info, // Input type
        NULL, // Output type
        &ppActivate, &count);

"count" is always 0 for these GUIDs.

If I use the IMFSourceReader API, I can read and decode such video files for MFVideoFormat_DV25 and MFVideoFormat_DV50. The subtype (MF_MT_SUBTYPE) reported by the IMFSourceReader API is as expected, e.g. {30357664-0000-0010-8000-00AA00389B71} (MFVideoFormat_DV50).

I couldn't test the other two formats because I do not have such files at the moment. Do you know the difference between the IMFSourceReader API and MFTEnumEx? Why can the IMFSourceReader find a decoder while MFTEnumEx fails?
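One thing worth checking, sketched below: the source reader considers more transforms than a SYNCMFT|HARDWARE enumeration does (asynchronous and locally registered MFTs among them), so widening the flags may reveal the decoder the reader picked up. This is a sketch, not a confirmed explanation for the DV case.

```cpp
// Sketch: enumerate video decoders with a wider flag set.
MFT_REGISTER_TYPE_INFO info = { MFMediaType_Video, MFVideoFormat_DV50 };
UINT32 unFlags = MFT_ENUM_FLAG_SYNCMFT | MFT_ENUM_FLAG_ASYNCMFT |
                 MFT_ENUM_FLAG_HARDWARE | MFT_ENUM_FLAG_LOCALMFT |
                 MFT_ENUM_FLAG_TRANSCODE_ONLY | MFT_ENUM_FLAG_SORTANDFILTER;

IMFActivate **ppActivate = NULL;
UINT32 count = 0;
HRESULT hr = MFTEnumEx(MFT_CATEGORY_VIDEO_DECODER, unFlags,
                       &info,   // input type
                       NULL,    // output type
                       &ppActivate, &count);
// Remember to Release() each IMFActivate and CoTaskMemFree(ppActivate).
```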

best regards

saoirse



Media Foundation sample timestamps


Hi,

I am having trouble finding a way to query a sample's timestamp in UTC format.

Is there an attribute or function equivalent to MFSampleExtension_DeviceTimestamp, but one that returns a value in UTC instead of QPC?
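To my knowledge there is no UTC twin of that attribute, but a QPC-based value can be translated to UTC by capturing one (QPC, system time) pair and applying the offset, as sketched below. Whether the attribute's units are raw counter ticks or already 100-ns units should be verified against the documentation; the scaling line is the part to adjust.

```cpp
#include <windows.h>

// Pure arithmetic: map a QPC reading to UTC in 100-ns FILETIME units,
// given a previously captured (QPC, UTC) base pair and the QPC frequency.
ULONGLONG QpcToUtc(LONGLONG qpcSample, LONGLONG qpcBase,
                   LONGLONG qpcFreq, ULONGLONG utcBase100ns)
{
    LONGLONG delta = qpcSample - qpcBase;
    return utcBase100ns + (delta * 10000000LL) / qpcFreq; // ticks -> 100 ns
}

void CaptureBase(LONGLONG &qpcBase, LONGLONG &qpcFreq, ULONGLONG &utcBase100ns)
{
    LARGE_INTEGER qpc, freq;
    FILETIME ft;
    QueryPerformanceCounter(&qpc);
    GetSystemTimePreciseAsFileTime(&ft); // Windows 8+; GetSystemTimeAsFileTime otherwise
    QueryPerformanceFrequency(&freq);
    qpcBase = qpc.QuadPart;
    qpcFreq = freq.QuadPart;
    utcBase100ns = ((ULONGLONG)ft.dwHighDateTime << 32) | ft.dwLowDateTime;
}
```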

Thanks,

Irena


Intel H264 Encoder MFT Problems on Win 7


Also posted in the Intel Media SDK Forum. 

I'm trying to get the Intel Quick Sync H.264 Encoder MFT to work on Windows 7. I've discovered that it doesn't work with a SinkWriter; the error I eventually get while pushing samples to the SinkWriter is MF_E_UNEXPECTED (0xC00D36BB). I'd like to get the encoder working in isolation first and then build up to hooking it to an MPEG-4 sink to write to a file.

Anyway, I'm running into a problem with the first step: creating an instance of the encoder and verifying that the eventing mechanism works. When I call GetEvent() I get an access violation with the following call stack:


mfplat.dll!MFTRACE_MEDIA_EVENT_IMPL( unsigned long, unsigned long, struct IMFMediaEvent * ) <---- Access Violation Here
mfplat.dll!CMFMediaEventGenerator::GetEvent( unsigned long, struct IMFMediaEvent * * )
mfplat.dll!CMFMediaEventQueue::GetEvent( unsigned long, struct IMFMediaEvent * * )
mfx_mft_h264ve_w7_32.dll!0f4fb3b0()
EventCrash.exe!main() Line 92

Here is a link to the code I used to get this far. 

I verified that this code works fine on Windows 10 with no crash.
