Channel: Media Foundation Development for Windows Desktop forum

Failed to set VBR quality on WMV MFT encoder.


Hi, I'm using the Windows Media Foundation WMV encoder on Windows 10 64-bit. While it encodes correctly, I cannot set the VBR quality.

Below is the sample code:

const PROPERTYKEY MFPKEY_VBRENABLED = { { 0xe48d9459, 0x6abe, 0x4eb5, { 0x92, 0x11, 0x60, 0x8, 0xc, 0x1a, 0xb9, 0x84 } }, 0x14 };     
const PROPERTYKEY MFPKEY_DESIRED_VBRQUALITY = { { 0x6dbdf03b, 0xb05c, 0x4a03, { 0x8e, 0xc1, 0xbb, 0xe6, 0x3d, 0xb1, 0x0c, 0xb4 } }, 0x00 + 25 };	  	  

CLSID* pCLSIDs = NULL;   // Pointer to an array of CLSIDs.
UINT32 nCount = 0;
MFT_REGISTER_TYPE_INFO encoderInfo;
encoderInfo.guidMajorType = MFMediaType_Video;
encoderInfo.guidSubtype = MFVideoFormat_WMV3;
HRESULT hr = fpMFTEnum(MFT_CATEGORY_VIDEO_ENCODER, 0, NULL, &encoderInfo, NULL, &pCLSIDs, &nCount);
if (FAILED(hr) || (nCount == 0)) { /* error handling */ }

ciEncoder.CreateObject(pCLSIDs[0], IID_IMFTransform);
if (ciEncoder.IsInvalid()) { /* error handling */ }

LComInterface<IPropertyStore> ciPropertyStore; // WMV encoder codec setting interface
hr = ciEncoder->QueryInterface(IID_IPropertyStore, (void**)ciPropertyStore.GetAssignablePtrRef());
if (SUCCEEDED(hr)) {
    PROPVARIANT propVal;
    propVal.vt = VT_BOOL;
    propVal.boolVal = VARIANT_TRUE;
    hr = ciPropertyStore->SetValue(MFPKEY_VBRENABLED, propVal);

    propVal.vt = VT_UI4;
    propVal.ulVal = 90;
    hr = ciPropertyStore->SetValue(MFPKEY_DESIRED_VBRQUALITY, propVal);
}


While ciPropertyStore->SetValue(MFPKEY_VBRENABLED, propVal) returns S_OK, ciPropertyStore->SetValue(MFPKEY_DESIRED_VBRQUALITY, propVal) fails with hr = "The property ID does not match any property supported by the transform".

Thanks

== Update ==================================

If I run the same code against the WMA encoder,

 ciEncoder.CreateObject(CLSID_CWMAEncMediaObject1, IID_IMFTransform);

then there's no error. 
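A hedged diagnostic sketch (not a fix): the error suggests the video encoder does not recognize the GUID/PID pair used for MFPKEY_DESIRED_VBRQUALITY, so it may help to dump the PROPERTYKEYs that the encoder's IPropertyStore actually exposes and compare them with the hand-written definition above. This reuses ciPropertyStore from the snippet; error handling is omitted and it requires <propsys.h> and <objbase.h>.

    // Diagnostic sketch: list every property the encoder's IPropertyStore reports.
    DWORD cProps = 0;
    HRESULT hrEnum = ciPropertyStore->GetCount(&cProps);
    if (SUCCEEDED(hrEnum))
    {
        for (DWORD i = 0; i < cProps; ++i)
        {
            PROPERTYKEY key = {};
            if (SUCCEEDED(ciPropertyStore->GetAt(i, &key)))
            {
                WCHAR guidStr[64] = {};
                StringFromGUID2(key.fmtid, guidStr, ARRAYSIZE(guidStr));
                wprintf(L"%s pid=%u\n", guidStr, key.pid);
            }
        }
    }

If the fmtid/pid that the WMV encoder lists for its VBR quality property differs from the constant defined above, that would explain why the same call succeeds on the WMA encoder but fails here.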




Major bug in latest Windows 10 media players for fragmented mp4 files


My Windows 10 machine updated itself and since the new update, it cannot play fragmented mp4 files properly. 

Here is an example file:

https://drive.google.com/file/d/0Bxyb9Iftjh4DUWdfV2JvMEs1OEU/view?usp=sharing

After you start the player, you need to seek it a few times for it to play properly. It was fine before the update.

This file played fine in Windows 10 before the update. Please check this.

MPEG4 SinkWriter and MediaElement.SetSource (Metro) compatibility, and atom ordering


After much trial and error I've discovered that, for Metro apps (though I assume it's the same for everything else), the SetSource method of a MediaElement will not accept a stream of an MPEG4 file in which the mdat atom comes before the moov atom, as is the default for the MPEG4 Sink Writer (and thus Windows 8's built-in transcoding features). Annoyingly, it does accept the same file when it is referenced by a URI. Moov before mdat is also required for streaming. I assumed the solution to my problem would be to use the MF_MPEG4SINK_MOOV_BEFORE_MDAT attribute when creating the sink, but I can't seem to get it to have an effect. Are there any pointers or samples that illustrate it? For the moment I've re-written the qt-faststart code from ffmpeg to rearrange the atoms, but I'd prefer a cleaner solution.

I would post sample code, but it's largely the same as that in http://social.msdn.microsoft.com/Forums/en-US/winappswithnativecode/thread/49bffa74-4e84-4fd6-9d67-42e8385611b8 (but in c#). I'm setting the attribute as a UINT32-cast Boolean.True just before MF_READWRITE_ENABLE_HARDWARE_TRANSFORMS is set, in that example.
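For reference, a minimal native sketch of where the attribute is normally applied, assuming the sink-writer path described above (the file name is a placeholder, error handling omitted); whether the MP4 sink then honors it in a given pipeline is exactly what this question is about:

    // Sketch: request moov-before-mdat when creating a sink writer.
    IMFAttributes *pAttr = NULL;
    HRESULT hr = MFCreateAttributes(&pAttr, 2);
    if (SUCCEEDED(hr))
        hr = pAttr->SetUINT32(MF_MPEG4SINK_MOOV_BEFORE_MDAT, TRUE);
    if (SUCCEEDED(hr))
        hr = pAttr->SetUINT32(MF_READWRITE_ENABLE_HARDWARE_TRANSFORMS, TRUE);

    IMFSinkWriter *pWriter = NULL;
    if (SUCCEEDED(hr))
        hr = MFCreateSinkWriterFromURL(L"output.mp4", NULL, pAttr, &pWriter);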


Microsoft Video 1 AVI File Playback Issue when FPS < 1


Hi,

I tried to create a compressed AVI file using VFW in C++. However, the compressed AVI file cannot be played correctly in Media Player: Media Player closes before the last frame's duration ends.

For example, an AVI file generated at 0.125 FPS should display each frame for 8 seconds. If the file has 5 frames, the length should be 40 seconds, but Media Player stops at 00:31~00:32, just after the last frame is drawn.

An uncompressed file with the same FPS plays correctly, including the last frame; Media Player ends at 00:40 as expected.

I tried converting the uncompressed file with ffmpeg using the msvideo1 encoder; the output file still cannot be played correctly.

VLC cannot play this file correctly either.

Here's some code I used to prepare the stream:

    AVISTREAMINFO StreamInfo;
    ZeroMemory(&StreamInfo, sizeof(AVISTREAMINFO));

    BITMAPINFO FrameInfo;          // frame format for the 512x512 24-bit frames
    ZeroMemory(&FrameInfo, sizeof(BITMAPINFO));
    FrameInfo.bmiHeader.biPlanes = 1;
    FrameInfo.bmiHeader.biWidth = 512;
    FrameInfo.bmiHeader.biHeight = 512;
    FrameInfo.bmiHeader.biCompression = BI_RGB;
    FrameInfo.bmiHeader.biBitCount = 24;
    FrameInfo.bmiHeader.biSize = sizeof(BITMAPINFOHEADER);
    FrameInfo.bmiHeader.biSizeImage = 512 * 512 * 3;

    auto HIC = ICOpen(mmioFOURCC('V', 'I', 'D', 'C'), mmioFOURCC('M', 'S', 'V', 'C'), ICMODE_QUERY);
    auto QueryResult = ICCompressQuery(HIC, &FrameInfo, nullptr);
    wcscpy_s(StreamInfo.szName, L"TEST Compress Video Stream");

    StreamInfo.fccType = streamtypeVIDEO;
    StreamInfo.fccHandler = mmioFOURCC('M', 'S', 'V', 'C');
    StreamInfo.dwScale = 100000;
    StreamInfo.dwRate = 12500;
    StreamInfo.dwQuality = 8000;
    StreamInfo.dwSuggestedBufferSize = FrameInfo.bmiHeader.biSizeImage;
    SetRect(&StreamInfo.rcFrame, 0, 0, FrameInfo.bmiHeader.biWidth, FrameInfo.bmiHeader.biHeight);

    auto CreateStreamResult = AVIFileCreateStream(AviFile, &Stream, &StreamInfo);

    AVICOMPRESSOPTIONS CompressOption;
    ZeroMemory(&CompressOption, sizeof(AVICOMPRESSOPTIONS));
    CompressOption.fccType = streamtypeVIDEO;
    CompressOption.fccHandler = StreamInfo.fccHandler;
    //CompressOption.dwKeyFrameEvery = 1;
    CompressOption.dwQuality = 8000;
    CompressOption.dwFlags = AVICOMPRESSF_VALID;

    auto MakeCompressStreamResult = AVIMakeCompressedStream(&CompressedStream, Stream, &CompressOption, nullptr);
    auto SetFormatResult = AVIStreamSetFormat(CompressedStream, 0, &FrameInfo.bmiHeader, FrameInfo.bmiHeader.biSize);


Does anyone know why the last frame cannot be played correctly?
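One heavily hedged workaround sometimes used when a player stops at the start time of the last sample is to append a duplicate of the final frame, so the real last frame keeps its full 8-second duration. A sketch using AVIStreamWrite, where pFrameBits is a placeholder for the buffer holding the last frame's pixels and CompressedStream/FrameInfo come from the code above (error handling omitted):

    // Hedged workaround sketch, not a confirmed fix: write one extra copy of the
    // final frame after the 5 real frames (indices 0..4).
    LONG lastIndex = 5;
    LONG written = 0, bytes = 0;
    HRESULT hrPad = AVIStreamWrite(CompressedStream, lastIndex, 1, pFrameBits,
                                   FrameInfo.bmiHeader.biSizeImage, AVIIF_KEYFRAME,
                                   &written, &bytes);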

Thanks for reading.

Best Regards,

Chen

Media Foundation: Cannot change a FPS on webcam


Hi All,

I am trying to replace DirectShow ("DS") code with Media Foundation ("MF") in my app and ran into one problem: I cannot set the frame rate I need using MF on a webcam. MF only lets me set 30 fps. If I try to set 25 fps, I always get the error 0xc00d5212 from SetCurrentMediaType(). In DS I could change that parameter.

My code:

ASSERT(m_pReader); //IMFSourceReader *m_pReader;
IMFMediaType *pNativeType = NULL;
IMFMediaType *pType = NULL;
UINT32 w = 1280;
UINT32 h = 720;
UINT32 fps = 25; // or 30

DWORD dwStreamIndex = MF_SOURCE_READER_FIRST_VIDEO_STREAM;

// Find the native format of the stream.
HRESULT hr = m_pReader->GetNativeMediaType(dwStreamIndex, 0, &pNativeType);
if (FAILED(hr))
{
  //error
}

GUID majorType, subtype;

// Find the major type.
hr = pNativeType->GetGUID(MF_MT_MAJOR_TYPE, &majorType);
if (FAILED(hr))
{
  //error
}
// Define the output type.
hr = MFCreateMediaType(&pType);
if (FAILED(hr))
{
  //error
}
hr = pType->SetGUID(MF_MT_MAJOR_TYPE, majorType);
if (FAILED(hr))
{
  //error
}
// Select a subtype.
if (majorType == MFMediaType_Video)
{
    subtype= MFVideoFormat_RGB24;
}
else
{
  //error
}
hr = pType->SetGUID(MF_MT_SUBTYPE, subtype);
if (FAILED(hr))
{
  //error
}
hr = MFSetAttributeSize(pType, MF_MT_FRAME_SIZE, w, h);
if (FAILED(hr))
{
  //error
}
hr = MFSetAttributeRatio(pType, MF_MT_FRAME_RATE, fps, 1);
if (FAILED(hr))
{
  //error
}
hr = m_pReader->SetCurrentMediaType(dwStreamIndex, NULL, pType);
if (FAILED(hr))
{// hr = 0xc00d5212 (MF_E_TOPO_CODEC_NOT_FOUND)
  //!!!!!error - if fps == 25
}
return hr;

I examined my webcam (Logitech HD Webcam C310) with an MF utility (http://alax.info/blog/1579) and it reports that the webcam only offers 1 and 30 fps. But DirectShow's GraphEdit shows other options - 5, 10, 15, 20, 25 and 30 fps. Why?
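A hedged sketch of one alternative approach, assuming the 25 fps mode really is exposed as one of the camera's native media types for some format: enumerate GetNativeMediaType with an increasing index, pick the entry whose frame rate and size match, and pass that native type to SetCurrentMediaType, letting the source reader convert to RGB24 afterwards if needed. This reuses m_pReader from the code above; error handling is trimmed.

    // Walk the native media types of the first video stream and pick one that
    // already advertises 25 fps at 1280x720, instead of building a type from scratch.
    IMFMediaType *pNative = NULL;
    HRESULT hrSet = E_FAIL;
    for (DWORD i = 0; SUCCEEDED(m_pReader->GetNativeMediaType(
             MF_SOURCE_READER_FIRST_VIDEO_STREAM, i, &pNative)); ++i)
    {
        UINT32 num = 0, den = 0, width = 0, height = 0;
        MFGetAttributeRatio(pNative, MF_MT_FRAME_RATE, &num, &den);
        MFGetAttributeSize(pNative, MF_MT_FRAME_SIZE, &width, &height);

        if (den != 0 && num / den == 25 && width == 1280 && height == 720)
        {
            hrSet = m_pReader->SetCurrentMediaType(
                        MF_SOURCE_READER_FIRST_VIDEO_STREAM, NULL, pNative);
            pNative->Release();
            break;
        }
        pNative->Release();
        pNative = NULL;
    }

If no native type advertises 25 fps, the device driver simply does not expose that rate through the MF capture source, which would match what the MF utility reported.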

Thanks for any help.

Playing two video sources with aggregatesource and reader

I am attempting to take two video sources and process them via a reader's on-sample-read callback, in order to place reduced images on the top and bottom halves of the output stream. The issue I am running into is that the reader stops calling the callback after the first frame. Is there a better way of doing this?
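If the reader is being used in asynchronous mode, one detail that is easy to miss (a hedged sketch, not necessarily the cause here) is that IMFSourceReader::ReadSample must be called again from inside OnReadSample to request the next frame; the reader delivers exactly one sample per request. Assumes m_pReader is the IMFSourceReader configured with MF_SOURCE_READER_ASYNC_CALLBACK; processing details omitted.

    // Inside your IMFSourceReaderCallback implementation.
    STDMETHODIMP OnReadSample(HRESULT hrStatus, DWORD dwStreamIndex,
                              DWORD dwStreamFlags, LONGLONG llTimestamp,
                              IMFSample *pSample)
    {
        if (SUCCEEDED(hrStatus) && pSample)
        {
            // ... compose this frame into the output ...
        }

        if (!(dwStreamFlags & MF_SOURCE_READERF_ENDOFSTREAM))
        {
            // Request the next sample, otherwise the callback is never invoked again.
            m_pReader->ReadSample(dwStreamIndex, 0, NULL, NULL, NULL, NULL);
        }
        return S_OK;
    }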

Martin Autry

Media Foundation's AAC format is not recognized by FFMPEG


I've generated movies using the Media Foundation API and succeeded. The video file has two streams, an H.264 stream and an AAC stream.

But the generated movie file is not recognized by FFMPEG or YouTube; the AAC stream is reported as 'none'.

It seems Media Foundation's AAC encoder and AAC profile signaling are not the standard, widely used way.

Can I tune the IMFMediaType in a standard way?
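A hedged sketch of the AAC output-type attributes that are commonly set explicitly when writing MP4 with Media Foundation; the numeric values (44100 Hz, stereo, 16-bit, 12000 bytes per second) are placeholders for whatever the source audio actually uses. MF_MT_AAC_PAYLOAD_TYPE = 0 requests raw AAC, which is what an MP4 container expects.

    // Sketch: an explicit AAC output type for the sink writer / AAC encoder MFT.
    // Error handling omitted; numeric values are placeholders.
    IMFMediaType *pAacType = NULL;
    HRESULT hr = MFCreateMediaType(&pAacType);
    if (SUCCEEDED(hr)) hr = pAacType->SetGUID(MF_MT_MAJOR_TYPE, MFMediaType_Audio);
    if (SUCCEEDED(hr)) hr = pAacType->SetGUID(MF_MT_SUBTYPE, MFAudioFormat_AAC);
    if (SUCCEEDED(hr)) hr = pAacType->SetUINT32(MF_MT_AUDIO_SAMPLES_PER_SECOND, 44100);
    if (SUCCEEDED(hr)) hr = pAacType->SetUINT32(MF_MT_AUDIO_NUM_CHANNELS, 2);
    if (SUCCEEDED(hr)) hr = pAacType->SetUINT32(MF_MT_AUDIO_BITS_PER_SAMPLE, 16);
    if (SUCCEEDED(hr)) hr = pAacType->SetUINT32(MF_MT_AUDIO_AVG_BYTES_PER_SECOND, 12000);
    if (SUCCEEDED(hr)) hr = pAacType->SetUINT32(MF_MT_AAC_PAYLOAD_TYPE, 0);                    // raw AAC
    if (SUCCEEDED(hr)) hr = pAacType->SetUINT32(MF_MT_AAC_AUDIO_PROFILE_LEVEL_INDICATION, 0x29); // one documented AAC-LC value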

Movie Maker audio and video

When I'm editing a video, I like to have the sound from one video layered over another video. In Movie Maker I can't seem to do that. If anyone knows how to split the audio and video of a file, I would appreciate the help!

KS proxy plugin support for proprietary interfaces


Hello,

I have a video capture driver which exposes proprietary properties using automation tables. In addition to the driver, I have a usermode DLL which contains the COM interface declarations. All of this is based on the Max Paklin sample code: http://timr.probo.com/wd3/121503/KsProxyPlugin.htm

In a DirectShow application, I can call IUnknown::QueryInterface on my source filter's IBaseFilter, passing in the GUID of my interface, and receive a pointer to the interface. The code body lies in the DLL, which makes calls to the driver through the IKsPropertySet interface.

All this is well and good.

My question is, how can I obtain my interface from within a Media Foundation application?

I have an IMFMediaSource object for my source, but calling QueryInterface as above does not work (E_NOINTERFACE).

I obtained the IMFMediaSource using the msdn code https://msdn.microsoft.com/en-us/library/windows/desktop/dd940326(v=vs.85).aspx
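One avenue that may be worth checking (a hedged sketch, not a confirmed answer): the Media Foundation capture source is not built on KSProxy, so a Max Paklin-style interface handler plug-in is never loaded; however, the device source generally exposes IKsControl, so the driver's property sets from the automation tables can still be reached directly. MY_PROPSET_GUID / MY_PROP_ID below are placeholders for your proprietary property set, and pMediaSource is the IMFMediaSource obtained from the MSDN sample.

    // Sketch: reach the driver's KS property set from the MF media source.
    IKsControl *pKsControl = NULL;
    HRESULT hr = pMediaSource->QueryInterface(IID_PPV_ARGS(&pKsControl));
    if (SUCCEEDED(hr))
    {
        KSPROPERTY prop = {};
        prop.Set = MY_PROPSET_GUID;    // placeholder: your proprietary property set GUID
        prop.Id = MY_PROP_ID;          // placeholder: property within the set
        prop.Flags = KSPROPERTY_TYPE_GET;

        DWORD value = 0;
        ULONG cbReturned = 0;
        hr = pKsControl->KsProperty(&prop, sizeof(prop), &value, sizeof(value), &cbReturned);
        pKsControl->Release();
    }

The usermode DLL's COM wrapper would not be involved in this path; the calls go straight to the automation-table properties.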

Thank you in advance.

Media Foundation MPEG4FileSink creates file with wrong length and without 'Data rate' and 'Total bitrate'


Hello, I am facing some trouble capturing from a camera with Media Foundation using a topology. I have to capture data from the camera, render it, encode it to H264 and save it to a file.

It renders and saves the encoded file, but in playback there is no way to seek, and the Details tab shows the wrong video length and no data rate or total bitrate.

My topology:

    source node (capture device) => tee node => H264 encoder => mpeg4filesink
                                             => DirectShow renderer

I have a helper that copies IMFMediaType attributes to the output type:

 

HRESULT copyTypeParameters(IMFMediaType * in_media_type, IMFMediaType * out_mf_media_type) {
    UINT32 frameRate = 0;
    UINT32 frameRateDenominator;
    UINT32 aspectRatio = 0;
    UINT32 interlace = 0;
    UINT32 denominator = 0;
    UINT32 width, height, bitrate;
    HRESULT hr = S_OK;

    hr = CopyAttribute(in_media_type, out_mf_media_type, MF_MT_AVG_BITRATE);
    THROW_ON_FAIL(hr);
    hr = out_mf_media_type->SetUINT32(MF_NALU_LENGTH_SET, false);
    THROW_ON_FAIL(hr);
    hr = out_mf_media_type->GetUINT32(MF_MT_AVG_BITRATE, &bitrate);
    THROW_ON_FAIL(hr);
    hr = MFGetAttributeRatio(in_media_type, MF_MT_FRAME_SIZE, &width, &height);
    THROW_ON_FAIL(hr);
    hr = MFGetAttributeRatio(in_media_type, MF_MT_FRAME_RATE, &frameRate, &frameRateDenominator);
    THROW_ON_FAIL(hr);
    hr = MFGetAttributeRatio(in_media_type, MF_MT_PIXEL_ASPECT_RATIO, &aspectRatio, &denominator);
    THROW_ON_FAIL(hr);
    hr = MFSetAttributeRatio(out_mf_media_type, MF_MT_FRAME_SIZE, width, height);
    THROW_ON_FAIL(hr);
    hr = MFSetAttributeRatio(out_mf_media_type, MF_MT_FRAME_RATE, frameRate, frameRateDenominator);
    THROW_ON_FAIL(hr);
    hr = MFSetAttributeRatio(out_mf_media_type, MF_MT_PIXEL_ASPECT_RATIO, aspectRatio, denominator);
    THROW_ON_FAIL(hr);
    hr = CopyAttribute(in_media_type, out_mf_media_type, MF_MT_INTERLACE_MODE);
    THROW_ON_FAIL(hr);
    return hr;
}

I've tried creating a custom IMFTransform that sets the sample time of the first sample to 0 and shifts each following timestamp by an offset equal to the first sample's timestamp. It does not help :(

MFTIME currentSampleTime; 
hr = m_pSample->GetSampleTime(&currentSampleTime);
THROW_ON_FAIL(hr);
    
if (sampleCount == 0) {
     timeOffset = currentSampleTime;
}
MFTIME newValue = currentSampleTime - timeOffset;
hr = m_pSample->SetUINT32(MFSampleExtension_CleanPoint, true);
THROW_ON_FAIL(hr);
hr = m_pSample->SetSampleTime(newValue);
THROW_ON_FAIL(hr);

Maybe somebody has faced the same problem and found a solution?
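One thing worth double-checking (hedged; a missing duration and absent bitrate are typical of an MP4 whose moov/index was never written): the MPEG-4 file sink writes its index and duration information when it is finalized, and with a media-session topology that is expected to happen during a clean shutdown. A sketch of that sequence, assuming m_pSession and m_pSource as in a typical session-based recorder:

    // Sketch: clean shutdown so the MPEG-4 sink gets finalized (error handling omitted).
    // 1. When capture should stop:
    m_pSession->Stop();

    // 2. Ask the session to close; the session finalizes finalizable sinks as part of this.
    m_pSession->Close();

    // 3. In the IMFAsyncCallback event loop, wait for MESessionClosed before tearing down:
    //      case MESessionClosed:
    //          m_pSource->Shutdown();
    //          m_pSession->Shutdown();
    //          break;

If the process exits or the source/session are shut down before MESessionClosed arrives, the file can be left without a usable moov box, which matches the symptoms described.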

system requirements for asynchronous MFTs


Hey,

could you tell me the hardware requirements for asynchronous MFTs? I have a Windows 8.1 system with an Nvidia Quadro video card, and no asynchronous MFT is available for H.264.

MFT_REGISTER_TYPE_INFO info = { MFMediaType_Video, MFVideoFormat_H264 };

IMFActivate **ppActivate = NULL;  // enumeration results
UINT32 count = 0;

UINT32 unFlags = MFT_ENUM_FLAG_ASYNCMFT | MFT_ENUM_FLAG_LOCALMFT |
        MFT_ENUM_FLAG_SORTANDFILTER;

HRESULT hr = MFTEnumEx(MFT_CATEGORY_VIDEO_DECODER,
        unFlags,
        &info,      // Input type
        NULL,       // Output type
        &ppActivate,
        &count);

Are there any decoders from Microsoft that support asynchronous MFTs (https://msdn.microsoft.com/en-us/library/windows/desktop/ff819077%28v=vs.85%29.aspx)?
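For what it's worth (a hedged note): Microsoft's built-in software H.264 decoder MFT is a synchronous MFT; the asynchronous ones are typically the hardware (GPU vendor) MFTs, and MFTEnumEx only returns those when MFT_ENUM_FLAG_HARDWARE is included in the flags. A minimal variation of the call above, reusing the declarations from that snippet:

    // Sketch: include hardware MFTs in the enumeration (hardware MFTs are required
    // to be asynchronous). Error handling omitted.
    UINT32 unFlagsHw = MFT_ENUM_FLAG_ASYNCMFT | MFT_ENUM_FLAG_HARDWARE |
            MFT_ENUM_FLAG_LOCALMFT | MFT_ENUM_FLAG_SORTANDFILTER;

    hr = MFTEnumEx(MFT_CATEGORY_VIDEO_DECODER, unFlagsHw, &info, NULL, &ppActivate, &count);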

best regards

saoirse


Media Foundation and MP4 files with multiple avc1 entries in stsd.


Do Media Foundation and Windows Media Player support mp4 files with multiple avc1 entries in the stsd box? I am writing a custom muxer, and this is one scenario where a hand-written Media Foundation player and Windows Media Player fail, while all others like VLC, QuickTime and ffmpeg work fine.

I have an RTSP stream which disconnects time to time and each time the new stream has a different sps/pps (while the resolution stays the same). So I assumed I should be able to mux such a video stream in a single track using multiple avc1 entries, and grouping the video samples into chunks, where I switch chunk on each sps/pps change to use the next sample descriptor.

I hope that someone with knowledge of the internals of Media Foundation can comment on this. If I try to open this file via IMFSourceResolver::CreateObjectFromURL I get HRESULT 0xc00d36c4 (MF_E_UNSUPPORTED_BYTESTREAM_TYPE). If I mux the same video stream only up to the point where the sps/pps changes, it works fine, so the bad behaviour seems to be introduced by the switch.

Moreover, I tried to reverse-engineer this scenario and write an MP4 file using Media Foundation - setting MF_MT_MPEG4_SAMPLE_DESCRIPTION to an stsd box with two avc1 entries and writing only one sample - but the output file is again not played by Windows Media Player, while it opens fine in VLC, ffmpeg and QuickTime.

If it helps in any way I can provide the original video file and any other data.

Any help is appreciated:)

How to convert YUY2 to RGB24 by DSP ?


I'm stuck on initialization of the IMFTransform interface.

	IMFTransform *lColorConvert = NULL;
	//if (needConvert == true)
	{
		UINT32 unFlag = MFT_ENUM_FLAG_SYNCMFT |
			MFT_ENUM_FLAG_LOCALMFT |
			MFT_ENUM_FLAG_SORTANDFILTER | MFT_ENUM_FLAG_HARDWARE;
		UINT32 count = 0;
		MFT_REGISTER_TYPE_INFO in = { 0 };
		in.guidMajorType = MFMediaType_Video;
		in.guidSubtype = MFVideoFormat_YUY2;
		MFT_REGISTER_TYPE_INFO out = { 0 };
		out.guidMajorType = MFMediaType_Video;
		out.guidSubtype = MFVideoFormat_RGB24;
		IMFActivate **ppMFTActive=NULL;
		hr = MFTEnumEx(MFT_CATEGORY_VIDEO_ENCODER/*MFT_CATEGORY_VIDEO_DECODER*/,unFlag, &in,/*&out*/NULL, &ppMFTActive, &count);
		ShowHRESULT(hr, L"MFTEnumEx");
		if (count < 1)
		{
			MessageBoxA(0, "Codecs not found", "", MB_OK);
		}
		else
		{
			IMFMediaType *inType=NULL;
			hr = MFCreateMediaType(&inType);
			ShowHRESULT(hr, L"MFCreateMediaType(&inType)");
			hr = inType->SetGUID(MF_MT_MAJOR_TYPE, MFMediaType_Video);
			ShowHRESULT(hr, L"inType->SetGUID(MF_MT_MAJOR_TYPE, MFMediaType_Video)");
			hr = inType->SetGUID(MF_MT_SUBTYPE, /*MediaGuids[mediaIndex]*/MFVideoFormat_YUY2);
			ShowHRESULT(hr, L"inType->SetGUID(MF_MT_SUBTYPE, MediaGuids[mediaIndex])");
			IMFMediaType *outType=NULL;
			hr = MFCreateMediaType(&outType);
			ShowHRESULT(hr, L"MFCreateMediaType(&outType)");
			hr = outType->SetGUID(MF_MT_MAJOR_TYPE, MFMediaType_Video);
			ShowHRESULT(hr, L"outType->SetGUID(MF_MT_MAJOR_TYPE, MFMediaType_Video)");
			hr = outType->SetGUID(MF_MT_SUBTYPE, MFVideoFormat_RGB24);
			ShowHRESULT(hr, L"outType->SetGUID(MF_MT_SUBTYPE, MFVideoFormat_RGB24)");

			hr = ppMFTActive[0]->ActivateObject(__uuidof(IMFTransform), (void **)&lColorConvert);
			ShowHRESULT(hr, L"ppMFTActive[0]->ActivateObject");
			hr = lColorConvert->SetOutputType(0, outType, 0); //MF_E_INVALIDMEDIATYPE
			ShowHRESULT(hr, L"lColorConvert->SetOutputType");
			hr = lColorConvert->SetInputType(0,inType, 0);//0xC00D6D60
			ShowHRESULT(hr, L"lColorConvert->SetInputType");
		}
	}

Maybe I'm going about this the wrong way? How can I convert YUY2 to RGB24 with a DSP, or is it impossible?
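A hedged sketch of the usual route, assuming the built-in Color Converter DSP (CLSID_CColorConvertDMO from wmcodecdsp.h) is acceptable rather than enumerating encoder MFTs: create it directly and give it fully specified media types. SetInputType/SetOutputType generally reject types that are missing MF_MT_FRAME_SIZE, which may explain the MF_E_INVALIDMEDIATYPE above. The 640x480 size is a placeholder; error handling omitted.

    // Sketch: YUY2 -> RGB24 via the Color Converter DSP.
    IMFTransform *pConverter = NULL;
    HRESULT hr = CoCreateInstance(CLSID_CColorConvertDMO, NULL, CLSCTX_INPROC_SERVER,
                                  IID_PPV_ARGS(&pConverter));

    IMFMediaType *pIn = NULL, *pOut = NULL;
    MFCreateMediaType(&pIn);
    pIn->SetGUID(MF_MT_MAJOR_TYPE, MFMediaType_Video);
    pIn->SetGUID(MF_MT_SUBTYPE, MFVideoFormat_YUY2);
    MFSetAttributeSize(pIn, MF_MT_FRAME_SIZE, 640, 480);   // frame size is required
    pIn->SetUINT32(MF_MT_INTERLACE_MODE, MFVideoInterlace_Progressive);

    MFCreateMediaType(&pOut);
    pOut->SetGUID(MF_MT_MAJOR_TYPE, MFMediaType_Video);
    pOut->SetGUID(MF_MT_SUBTYPE, MFVideoFormat_RGB24);
    MFSetAttributeSize(pOut, MF_MT_FRAME_SIZE, 640, 480);
    pOut->SetUINT32(MF_MT_INTERLACE_MODE, MFVideoInterlace_Progressive);

    hr = pConverter->SetInputType(0, pIn, 0);
    hr = pConverter->SetOutputType(0, pOut, 0);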

IMFCaptureEngine Preview


I am trying to use the capture engine and am running into a couple of issues. First, the source is 1080p-capable, but I cannot seem to set the preview to that resolution; it always displays 640 x 480. Second, I cannot get audio to work in preview: I successfully add the audio stream, but I get an error event saying the start preview is not valid in the current state. The documentation describes each interface, but not how to tie them together. If anyone has gotten this to work, I would be interested in how. Thanks

       // Configure the video format for the preview sink.
        hr = pSource->GetCurrentDeviceMediaType((DWORD)MF_CAPTURE_ENGINE_PREFERRED_SOURCE_STREAM_FOR_VIDEO_PREVIEW , &pMediaType);
        if (FAILED(hr))
        {
            goto done;
        }

		UINT width;
		UINT height;
		hr = MFGetAttributeSize(pMediaType, MF_MT_FRAME_SIZE, &width, &height);
        hr = CloneVideoMediaType(pMediaType, MFVideoFormat_RGB32, &pMediaType2);
        if (FAILED(hr))
        {
            goto done;
        }

        hr = pMediaType2->SetUINT32(MF_MT_ALL_SAMPLES_INDEPENDENT, TRUE);
        if (FAILED(hr))
        {
            goto done;
        }

		hr = MFSetAttributeSize(pMediaType2, MF_MT_FRAME_SIZE, 1920, 1080);
		// Connect the video stream to the preview sink.
        DWORD dwSinkStreamIndex;
        hr = m_pPreview->AddStream((DWORD)MF_CAPTURE_ENGINE_PREFERRED_SOURCE_STREAM_FOR_VIDEO_PREVIEW,  pMediaType2, NULL, &dwSinkStreamIndex);
        if (FAILED(hr))
        {
            goto done;
        }

		hr = pSource->GetCurrentDeviceMediaType((DWORD)MF_CAPTURE_ENGINE_PREFERRED_SOURCE_STREAM_FOR_AUDIO, &pMediaType);
		hr = CloneAudioMediaType(pMediaType, MFAudioFormat_PCM, &pMediaType2);
		if (FAILED(hr))
		{
			goto done;
		}

		hr = pMediaType2->SetUINT32(MF_MT_ALL_SAMPLES_INDEPENDENT, TRUE);
		if (FAILED(hr))
		{
			goto done;
		}

		hr = m_pPreview->AddStream((DWORD)MF_CAPTURE_ENGINE_PREFERRED_SOURCE_STREAM_FOR_AUDIO, pMediaType2, NULL, &dwSinkStreamIndex);
	}

    hr = m_pEngine->StartPreview();
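For the resolution issue, a hedged sketch of a different approach: instead of overwriting MF_MT_FRAME_SIZE on the cloned type, select a native 1920x1080 format on the source itself before adding the preview stream. This assumes pSource is the IMFCaptureSource used above; error handling omitted.

		// Sketch: pick a native 1080p device media type for the preview stream.
		IMFMediaType *pCandidate = NULL;
		HRESULT hrSel = E_FAIL;
		for (DWORD i = 0; SUCCEEDED(pSource->GetAvailableDeviceMediaType(
		         (DWORD)MF_CAPTURE_ENGINE_PREFERRED_SOURCE_STREAM_FOR_VIDEO_PREVIEW,
		         i, &pCandidate)); ++i)
		{
			UINT32 w = 0, h = 0;
			MFGetAttributeSize(pCandidate, MF_MT_FRAME_SIZE, &w, &h);
			if (w == 1920 && h == 1080)
			{
				hrSel = pSource->SetCurrentDeviceMediaType(
				            (DWORD)MF_CAPTURE_ENGINE_PREFERRED_SOURCE_STREAM_FOR_VIDEO_PREVIEW,
				            pCandidate);
				pCandidate->Release();
				break;
			}
			pCandidate->Release();
			pCandidate = NULL;
		}

With the source switched to 1080p, GetCurrentDeviceMediaType then returns the 1920x1080 type and the preview sink can be fed a clone of that without resizing it by hand.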


Martin Autry

windows media player

Windows Media Player 9 says "wrong internet address". How do I get the right one and change it? Many thanks.

Grab a frame from WMV file using current best practices. (2017)


Does example code exist that can extract a single frame from a WMV file using the latest best practices with managed C# code? Some of the samples and discussions I have found go back to 2005.
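There is no official managed wrapper, so the usual current approach is the native IMFSourceReader, which third-party C# wrappers such as MediaFoundation.NET mirror almost call for call. A hedged native sketch of the frame-grab pattern, with the file path and seek position as placeholders (MFStartup/MFShutdown and error handling omitted):

    // Sketch: grab one decoded RGB32 frame from a WMV file with IMFSourceReader.
    IMFAttributes *pAttr = NULL;
    MFCreateAttributes(&pAttr, 1);
    pAttr->SetUINT32(MF_SOURCE_READER_ENABLE_VIDEO_PROCESSING, TRUE); // allow color conversion

    IMFSourceReader *pReader = NULL;
    HRESULT hr = MFCreateSourceReaderFromURL(L"video.wmv", pAttr, &pReader);

    // Ask the reader to decode/convert to RGB32 for us.
    IMFMediaType *pType = NULL;
    MFCreateMediaType(&pType);
    pType->SetGUID(MF_MT_MAJOR_TYPE, MFMediaType_Video);
    pType->SetGUID(MF_MT_SUBTYPE, MFVideoFormat_RGB32);
    hr = pReader->SetCurrentMediaType(MF_SOURCE_READER_FIRST_VIDEO_STREAM, NULL, pType);

    // Seek to 5 seconds (100-ns units); the delivered sample may start at the
    // preceding key frame, slightly before the requested position.
    PROPVARIANT varPos;
    PropVariantInit(&varPos);
    varPos.vt = VT_I8;
    varPos.hVal.QuadPart = 5 * 10000000LL;
    hr = pReader->SetCurrentPosition(GUID_NULL, varPos);

    DWORD streamIndex = 0, flags = 0;
    LONGLONG timestamp = 0;
    IMFSample *pSample = NULL;
    hr = pReader->ReadSample(MF_SOURCE_READER_FIRST_VIDEO_STREAM, 0,
                             &streamIndex, &flags, &timestamp, &pSample);
    // pSample now holds the decoded frame; copy its buffer into a bitmap as needed.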


Paul

Windows Media Player SDK for 64-bit applications


I wish to include the Windows Media Player ActiveX control in a 64-bit WinForms application. When I add the control in the Forms designer, it automatically adds references to the associated 32-bit DLLs to my project. The compiler warns about unexpected runtime behavior when mixing 32-bit DLLs in a 64-bit app. I didn't even think it was possible to do so, but the control works. However, I'm experiencing odd behavior associated with playback of short files, and I'm concerned that it may have something to do with this mix. I searched my hard drive for versions of AxInterop.WMPLib and WMPLib, but found none other than those added to my project, so the originals must be in a compressed file somewhere.

Is it OK to use these 32-bit DLLs in a 64-bit application, or is there some way to get Visual Studio 2012 to add the 64-bit DLLs?

(Please redirect me if this isn't the correct forum.)

Multiple input MFT


Hi

I'm trying to create a custom MFT with multiple inputs (audio or video) from multiple media sources (e.g. webcam capture and a video file). It works fine with one input from either source, but if I try to switch to two inputs, the ProcessInput/ProcessOutput methods are never called on my MFT.

So my questions are:

  1. Do I need to implement custom Media Session to make this setup work?
  2. Do I need to implement custom Topology Loader to make this setup work?

IMFSourceReader::ReadSample can only read 8 samples per second, while the FPS of the mediatype is 30, frames lost?


I am following the code below to study Media Foundation.

https://github.com/darkknightzh/Microsoft-Source-Reader

The code uses IMFSourceReader::ReadSample in a loop to read samples from a USB webcam and counts the actual samples read per second.

I added the code below in SetDeviceFormat:

MFSetAttributeRatio(pType, MF_MT_FRAME_RATE, 30, 1); 

hr = pHandler->SetCurrentMediaType(pType);

but I can only get 8 samples per second:

double usedtime = ((time_over.QuadPart - time_start.QuadPart) / dqFreq);
if (usedtime>1)
{
        printf(" cSamples = %d\n", cSamples);
	cSamples = 0;
	QueryPerformanceCounter(&time_start);
}

However, the llSampleDuration from pSample->GetSampleDuration(&llSampleDuration) is 33 ms, which is expected.

.....

cSamples = 8
 llSampleDuration = 33
 llSampleDuration = 33
 llSampleDuration = 33
 llSampleDuration = 33
 llSampleDuration = 33
 llSampleDuration = 33
 llSampleDuration = 33
 llSampleDuration = 33
 cSamples = 8

......

Does this mean some frames are lost? 30 samples per second are expected, but only 8 samples actually arrive.

If I change the target FPS to 15,

MFSetAttributeRatio(pType, MF_MT_FRAME_RATE, 15, 1); 

the llSampleDuration is 66 ms, which is expected, while I still only get 8 samples per second.

cSamples = 8
 llSampleDuration = 66
 llSampleDuration = 66
 llSampleDuration = 66
 llSampleDuration = 66
 llSampleDuration = 66
 llSampleDuration = 66
 llSampleDuration = 66
 llSampleDuration = 66
 cSamples = 8
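A hedged diagnostic sketch (not a fix): comparing consecutive sample timestamps distinguishes "the camera really delivers ~8 fps" from "samples arrive at 30 fps but the loop drops them". If the deltas are ~33 ms while only 8 samples are counted per wall-clock second, the bottleneck is in the reading/processing loop (for example slow per-frame processing or format conversion) rather than in the capture source. Assumes pSample is the IMFSample returned by ReadSample; error handling omitted.

    // Measure the gap between consecutive sample timestamps inside the read loop.
    static LONGLONG previousTime = -1;
    LONGLONG sampleTime = 0;                 // 100-ns units
    hr = pSample->GetSampleTime(&sampleTime);
    if (SUCCEEDED(hr))
    {
        if (previousTime >= 0)
        {
            double deltaMs = (sampleTime - previousTime) / 10000.0;
            printf(" delta between samples = %.1f ms\n", deltaMs);
        }
        previousTime = sampleTime;
    }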



Using Tee Nodes


I am trying to develop an audio/video recorder with preview. Initially, I set up a topology using MFCreateTranscodeTopology and a profile. Then I get the source node from the topology, disconnect the source output and insert a tee node. I then connect the first tee-node output to the original source downstream node and the other tee-node output to an EVR node. Everything configures successfully, but when I try to run the session, I get a "suitable decoder/encoder not found" event error. If I disconnect the EVR, it encodes fine. I think this is a media-type issue, but how do you get the types to match on the tee-node outputs?

	hr = CreateSession();

	if (SUCCEEDED(hr))
	{
		hr = CreateMediaSource(&m_pSource);
	}

	hr = CreateTranscodeProfile(&pProfile);
	if (FAILED(hr))
	{
		goto done;
	}

	hr = MFCreateTranscodeTopology(m_pSource, L"c:\\programdata\\tekvox\\tekvideovault\\output1.mp4", pProfile, &pTopology);
	if (FAILED(hr))
	{
		goto done;
	}

	if (SUCCEEDED(hr))
	{
		hr = m_pSource->CreatePresentationDescriptor(&pPD);
	}

	IMFStreamDescriptor *pSD;
	BOOL fSelected;
	pPD->GetStreamDescriptorByIndex(0, &fSelected, &pSD);

	// Get source node and disconnect it
	IMFCollection *pCollection;
	hr = pTopology->GetSourceNodeCollection(&pCollection);
	IMFTopologyNode *pSourceNode;
	hr = pCollection->GetElement(0, (IUnknown **)&pSourceNode);
	IMFTopologyNode *pDownStreamNode;
	DWORD downStreamIndex;
	pSourceNode->GetOutput(0, &pDownStreamNode, &downStreamIndex);
	pSourceNode->DisconnectOutput(0);

	// Add Tee Node
	IMFTopologyNode *pTeeNode;
	hr = MFCreateTopologyNode(MF_TOPOLOGY_TEE_NODE, &pTeeNode);
	hr = pTopology->AddNode(pTeeNode);

	// Add Sink
	IMFActivate *pSinkActivate;
	hr = CreateMediaSinkActivate(pSD, hwnd, &pSinkActivate);
	IMFTopologyNode *pOutputNode;
	hr = AddOutputNode(pTopology, pSinkActivate, 0, &pOutputNode);

	// Connect source to Tee
	hr = pSourceNode->ConnectOutput(0, pTeeNode, 0);

	// Connect Tee to Encoder
	hr = pTeeNode->ConnectOutput(0, pDownStreamNode, 0);

	// Connect Tee to Renderer
	hr = pTeeNode->ConnectOutput(1, pOutputNode, 0);

	if (SUCCEEDED(hr))
	{
		hr = m_pSession->SetTopology(0, pTopology);
	}

	if (SUCCEEDED(hr))
	{
		PROPVARIANT varStart;
		PropVariantInit(&varStart);   // VT_EMPTY => start from the current/default position
		m_pSession->Start(&GUID_NULL, &varStart);
		m_state = OpenPending;
	}
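A hedged idea to try (not a confirmed fix): the transcode topology connects the source output with its native compressed type, so the tee branch leading to the EVR has no decoder, which would match the "suitable decoder/encoder not found" event. The topology loader can be hinted that it may insert a decoder/converter on that branch through MF_TOPONODE_CONNECT_METHOD; the sketch below sets it on the EVR output node and tee node created above (error handling omitted).

	// Sketch: allow the topology loader to insert a decoder on the renderer branch.
	hr = pOutputNode->SetUINT32(MF_TOPONODE_CONNECT_METHOD, MF_CONNECT_ALLOW_DECODER);
	hr = pTeeNode->SetUINT32(MF_TOPONODE_CONNECT_METHOD, MF_CONNECT_ALLOW_DECODER);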


Martin Autry

