Channel: Media Foundation Development for Windows Desktop forum

Where to report an issue inside WMF


Hi,

There is an issue when stopping playback of a certain video file: IMFMediaSession::Shutdown() hangs.

Here is the callstack:

ntdll.dll!_RtlAcquireSRWLockExclusive@4() Unknown
mfcore.dll!CComRWLock::LockExclusive(void) Unknown
mfcore.dll!CMFMediaProcessor::Shutdown() Unknown
mfcore.dll!CMediaSession::InternalShutdown() Unknown
mfcore.dll!CMediaSession::Shutdown() Unknown
wmfengined.dll!MFPlayerSession::close() Line 134
wmfengined.dll!MFPlayerService::~MFPlayerService() Line 70
wmfengined.dll!MFPlayerService::`scalar deleting destructor'(unsigned int)
wmfengined.dll!WMFServicePlugin::release(QMediaService * service) Line 93
Qt5Multimediad.dll!QPluginServiceProvider::releaseService(QMediaService * service) Line 450
Qt5Multimediad.dll!QMediaPlayer::~QMediaPlayer() Line 677
...

Where can I report the issue with attached video file and sample to reproduce the hang?



mixerGetLineInfo


Hello!

I am unable to retrieve data using mixerGetLineInfo under Windows 7 with either of the patterns shown below (please see the documentation excerpt).

It says

MIXER_OBJECTF_WAVEOUT: The hmxobj parameter is the identifier of a waveform-audio output device, in the range of zero to one less than the number of devices returned by the waveOutGetNumDevs function.

MIXERLINE pmxl;
pmxl.cbStruct = sizeof(pmxl);
uint err;
uint DeviceId = 0; // any available WaveIn or WaveOut device id
err = mixerGetLineInfo(DeviceId, &pmxl, MIXER_OBJECTF_WAVEOUT);
// or this way
err = mixerGetLineInfo(DeviceId, &pmxl, MIXER_OBJECTF_WAVEIN);

My system does have several input microphones and speakers; I can see them in the system devices and control them.
I need this for quick access to the speakers' volume.

My question is about those two patterns: is this a well-known issue or a bug? The err value is always about 1024.
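
One thing worth checking, offered as a suggestion rather than a confirmed diagnosis: per the mixerGetLineInfo documentation, the fdwInfo parameter combines a MIXER_GETLINEINFOF_* query flag with the MIXER_OBJECTF_* object flag, and the relevant MIXERLINE fields (such as dwComponentType) must be filled in before the call. An error value of 1024 corresponds to MIXERR_INVALLINE ("invalid line"), which fits a query that does not name a valid line. A sketch of the component-type query (targeting the speaker destination is my assumption about the goal):

```cpp
#include <windows.h>
#include <mmsystem.h>
#pragma comment(lib, "winmm.lib")

// Query the destination line that feeds the speakers of a waveform-audio
// output device, combining a MIXER_GETLINEINFOF_* query flag with the
// MIXER_OBJECTF_WAVEOUT object flag.
MMRESULT GetSpeakerLine(UINT deviceId, MIXERLINE* pLine)
{
    ZeroMemory(pLine, sizeof(*pLine));
    pLine->cbStruct = sizeof(*pLine);
    pLine->dwComponentType = MIXERLINE_COMPONENTTYPE_DST_SPEAKERS;

    return mixerGetLineInfo((HMIXEROBJ)(UINT_PTR)deviceId,
                            pLine,
                            MIXER_GETLINEINFOF_COMPONENTTYPE |
                            MIXER_OBJECTF_WAVEOUT);
}
```

On success, pLine->dwLineID can then be passed to mixerGetLineControls to locate the volume control.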


"I = I + 1" .. Isn't it boolshit?


Memory leak IMFSourceReader->ReadSample method


Hello,

I experience a memory leak when using the IMFSourceReader->ReadSample method in synchronous mode, reading media samples from a file (not a device). After reading about 540,000 bytes, the method returns a "Not enough storage is available to complete this operation." error (0x8007000E).

This issue can be reproduced with the MF AudioClip sample, or with the code from the "Tutorial: Decoding Audio" article.

Note: if IMFSourceReader->ReadSample is used in asynchronous mode to get camera samples, everything works fine.

Could someone explain whether this is a bug in MfReadWrite.dll or something else?

The operating system is Windows 10, 1903, November 2019 update.

Thank you in advance, Tony.
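
For comparison, one common cause of memory growth in a synchronous ReadSample loop (an educated guess, not a confirmed diagnosis of the AudioClip sample) is that every IMFSample the call returns is owned by the caller and must be released. A minimal sketch of the loop, assuming an already-configured reader:

```cpp
#include <mfreadwrite.h>

// Synchronous read loop over the first audio stream. Each returned
// IMFSample must be released by the caller, or memory grows on every call.
HRESULT DrainAllSamples(IMFSourceReader* pReader)
{
    HRESULT hr = S_OK;
    for (;;)
    {
        DWORD streamFlags = 0;
        IMFSample* pSample = NULL;

        hr = pReader->ReadSample(
            (DWORD)MF_SOURCE_READER_FIRST_AUDIO_STREAM,
            0, NULL, &streamFlags, NULL, &pSample);
        if (FAILED(hr))
            break;

        if (pSample)
        {
            // ... consume the sample data here ...
            pSample->Release();   // omitting this leaks one sample per call
        }
        if (streamFlags & (MF_SOURCE_READERF_ERROR |
                           MF_SOURCE_READERF_ENDOFSTREAM))
            break;
    }
    return hr;
}
```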


IMFSourceReader::ReadSample or MFCaptureToFile memory leak?


I'm concerned that I may have a memory leak in the MFCaptureToFile sample, specifically related to IMFSourceReader::ReadSample.  

  • I believe this because on the task manager "Processes" tab I see the Memory for my program growing indefinitely, even as far as 25GB, which brings my computer to its knees, presumably from all the paged memory thrashing.
  • Nevertheless, when my program exits, all this memory is eventually freed (even in the 25GB case, after a couple minutes), so the memory isn't permanently lost.

I started with the MFCaptureToFile sample, so I went back to that sample to see whether it has the same memory leak.  I don't have a working capture device at the moment, so I am temporarily re-routing the input to a file.  I re-created the trivial changes needed to make this happen, and MFCaptureToFile so modified indeed has the same memory leak.

Is it my modifications somehow, or does MFCaptureToFile actually have a memory leak?

Below are my trivial modifications.  With these done fresh to the MFCaptureToFile project, I see the growing memory usage in task manager.  So the leak is there already. 

Any comments please?

Trivial mods:

1) I added a CreateMediaSource() function that I took from some other sample, and called it from MFCaptureToFile's StartCapture() function for the case where pActivate is NULL, i.e. no capture device.  Below is the excerpt from capture.cpp.  Of course, I also declare CreateMediaSource in capture.h.

//-------------------------------------------------------------------
// CreateMediaSource
//
// Create a media source from a URL.
//-------------------------------------------------------------------

HRESULT CCapture::CreateMediaSource(PCWSTR sURL, IMFMediaSource **ppSource)
{
    MF_OBJECT_TYPE ObjectType = MF_OBJECT_INVALID;

    IMFSourceResolver* pSourceResolver = NULL;
    IUnknown* pSource = NULL;

    // Create the source resolver.
    HRESULT hr = MFCreateSourceResolver(&pSourceResolver);
    if (FAILED(hr))
    {
        goto done;
    }

    // Use the source resolver to create the media source.

    // Note: For simplicity this sample uses the synchronous method to create 
    // the media source. However, creating a media source can take a noticeable
    // amount of time, especially for a network source. For a more responsive 
    // UI, use the asynchronous BeginCreateObjectFromURL method.

    hr = pSourceResolver->CreateObjectFromURL(
        sURL,                       // URL of the source.
        MF_RESOLUTION_MEDIASOURCE,  // Create a source object.
        NULL,                       // Optional property store.
        &ObjectType,                // Receives the created object type.
        &pSource                    // Receives a pointer to the media source.
        );
    if (FAILED(hr))
    {
        goto done;
    }

    // Get the IMFMediaSource interface from the media source.
    hr = pSource->QueryInterface(IID_PPV_ARGS(ppSource));

done:
    SafeRelease(&pSourceResolver);
    SafeRelease(&pSource);
    return hr;
}

//-------------------------------------------------------------------
// StartCapture
//
// Start capturing.
//-------------------------------------------------------------------

HRESULT CCapture::StartCapture(
    IMFActivate *pActivate,
    const WCHAR *pwszFileName,
    const EncodingParameters& param
    )
{
    HRESULT hr = S_OK;

    IMFMediaSource *pSource = NULL;

    EnterCriticalSection(&m_critsec);

    // Create the media source for the device.
	if (pActivate) 
	{
		// If video capture device exists
		hr = pActivate->ActivateObject(
			__uuidof(IMFMediaSource), 
			(void**)&pSource
			);

		// Get the symbolic link. This is needed to handle device-
		// loss notifications. (See CheckDeviceLost.)

		if (SUCCEEDED(hr))
		{
			hr = pActivate->GetAllocatedString(
				MF_DEVSOURCE_ATTRIBUTE_SOURCE_TYPE_VIDCAP_SYMBOLIC_LINK,
				&m_pwszSymbolicLink,
				NULL
				);
		}
	}
	else
	{
		// Otherwise HARD-CODE read from file
		//m_bFileSource = TRUE;
		//m_bEOF = FALSE;

		hr = CreateMediaSource(L"M:\\SPIIRCAM\\Doc\\Video\\Media Foundation SDK\\Play\\PLAYDATA\\MVI_2620.wmv", &pSource);

		// Since we're reading from a file and not a device, don't need to handle device-loss notifications
		//unnecessary//m_pwszSymbolicLink = NULL;
	}

    if (SUCCEEDED(hr))
    {
        hr = OpenMediaSource(pSource);
    }

    // Create the sink writer
    if (SUCCEEDED(hr))
    {
        hr = MFCreateSinkWriterFromURL(
            pwszFileName,
            NULL,
            NULL,
            &m_pWriter
            );
    }

    // Set up the encoding parameters.
    if (SUCCEEDED(hr))
    {
        hr = ConfigureCapture(param);
    }

    if (SUCCEEDED(hr))
    {
        m_bFirstSample = TRUE;
        m_llBaseTime = 0;

        // Request the first video frame.

        hr = m_pReader->ReadSample(
            (DWORD)MF_SOURCE_READER_FIRST_VIDEO_STREAM,
            0,
            NULL,
            NULL,
            NULL,
            NULL
            );
    }

    SafeRelease(&pSource);
    LeaveCriticalSection(&m_critsec);
    return hr;
}

2) I also had to enable the "Start Capture" button for this case with no capture devices.  Below is an excerpt from winmain.cpp, where my ONLY change is the one line where I pass TRUE to the EnableDialogControl() call.

//-----------------------------------------------------------------------------
// UpdateUI
//
// Updates the dialog UI for the current state.
//-----------------------------------------------------------------------------

void UpdateUI(HWND hDlg)
{
    BOOL bEnable = (g_devices.Count() > 0);     // Are there any capture devices?
    BOOL bCapturing = (g_pCapture != NULL);     // Is video capture in progress now?

    HWND hButton = GetDlgItem(hDlg, IDC_CAPTURE);

    if (bCapturing)
    {
        SetWindowText(hButton, L"Stop Capture");
    }
    else
    {
        SetWindowText(hButton, L"Start Capture");
    }

    EnableDialogControl(hDlg, IDC_CAPTURE, TRUE/*bCapturing || bEnable*/);

    EnableDialogControl(hDlg, IDC_DEVICE_LIST, !bCapturing && bEnable);

    // The following cannot be changed while capture is in progress,
    // but are OK to change when there are no capture devices.

    EnableDialogControl(hDlg, IDC_CAPTURE_MP4, !bCapturing);
    EnableDialogControl(hDlg, IDC_CAPTURE_WMV, !bCapturing);
    EnableDialogControl(hDlg, IDC_OUTPUT_FILE, !bCapturing);
}
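
One hypothesis, offered as a guess rather than a diagnosis: a real capture device delivers frames at its native rate, while a file source answers every ReadSample immediately, so the sink writer's encode queue can grow without bound once the input is re-routed to a file. The pacing arithmetic, converting Media Foundation's 100-ns sample timestamps into a wall-clock wait before requesting the next sample, can be sketched portably (ThrottleDelayMs is a hypothetical helper, not part of the sample):

```cpp
#include <cstdint>

// Given a sample timestamp and the elapsed wall-clock time since reading
// started (both in Media Foundation's 100-ns units), return how many
// milliseconds to wait before calling ReadSample again, so a file source
// is consumed at roughly real-time speed instead of as fast as the
// encoder queue allows.
int64_t ThrottleDelayMs(int64_t sampleTime100ns, int64_t elapsed100ns)
{
    int64_t ahead = sampleTime100ns - elapsed100ns; // > 0: ahead of real time
    if (ahead <= 0)
        return 0;               // behind schedule: request immediately
    return ahead / 10000;       // 100-ns units -> milliseconds
}
```

If this hypothesis is right, the capture-device path would not exhibit the growth simply because the device itself paces the samples.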


Trouble setting the H.264 Encoder's Max Key Frame Spacing.


Hi

I am currently trying to change the maximum key frame interval of the Media Foundation H.264 encoder by setting MF_MT_MAX_KEYFRAME_SPACING on the output media type of the encoder. But every video that I create seems to default to a key frame interval of 2 seconds.

My current scenario entails setting up a SinkWriter with an uncompressed YUY2 input media type and an H.264 output media type.

The video source is 25 fps, so for a 1 second key frame interval I set the MF_MT_MAX_KEYFRAME_SPACING attribute on the H.264 output media type to 25 frames. But the H.264 encoder still outputs key frames every 2 seconds (50 frames).

I also tried setting it to a larger 250-frame (10 second) interval, with the same result.

Am I missing some setting somewhere or is the max key frame interval not configurable on the media foundation H.264 encoder?

I have included two trace statements from my tests with the SinkWriter below:

1. The Media Foundation H.264 encoder being created.
7784,1CD4 12:44:33.39787 COle32ExportDetours::CoCreateInstance @ Created {6CA50344-051A-4DED-9779-A43305165E35}  (C:\Windows\SysWOW64\mfh264enc.dll) @00A5B51C - traced interfaces: IMFTransform @00A5B51C,

2. The output media format set on the encoder with the MF_MT_MAX_KEYFRAME_SPACING=25 attribute.
7784,1CD4 12:44:33.41075 CMFTransformDetours::SetOutputType @00A5B51C Succeeded MT: MF_MT_FRAME_SIZE=3092376453696 (720,576);MF_MT_AVG_BITRATE=3500000;MF_MT_MPEG_SEQUENCE_HEADER=00 00 00 01 67 42 c0 1e 95 b0 2d 04 9b 01 10 00 00 03 00 10 00 00 03 03 21 da 08 84 6e 00 00 00 01 68 ca 8f 20 ;MF_MT_MAJOR_TYPE=MEDIATYPE_Video;MF_MT_MPEG2_PROFILE=66;MF_MT_MAX_KEYFRAME_SPACING=25;MF_MT_FRAME_RATE=107374182401 (25,1);MF_MT_PIXEL_ASPECT_RATIO=4294967297 (1,1);MF_MT_INTERLACE_MODE=7;MF_MT_SUBTYPE=MEDIASUBTYPE_H264
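
In case it helps, one avenue worth trying (an assumption on my part, not a verified fix): configure the GOP size through the encoder's ICodecAPI, which the sink writer exposes per stream, instead of (or in addition to) the media-type attribute. A sketch, where SetGopSize is an illustrative helper:

```cpp
#include <windows.h>
#include <mfreadwrite.h>
#include <icodecapi.h>
#include <codecapi.h>

// Ask the sink writer for the H.264 encoder's ICodecAPI on the given
// stream and set the GOP size (in frames) directly.
HRESULT SetGopSize(IMFSinkWriter* pWriter, DWORD streamIndex, UINT32 frames)
{
    ICodecAPI* pCodecApi = NULL;
    HRESULT hr = pWriter->GetServiceForStream(
        streamIndex, GUID_NULL, IID_PPV_ARGS(&pCodecApi));
    if (SUCCEEDED(hr))
    {
        VARIANT var;
        VariantInit(&var);
        var.vt = VT_UI4;
        var.ulVal = frames;   // e.g. 25 for a 1-second GOP at 25 fps
        hr = pCodecApi->SetValue(&CODECAPI_AVEncMPVGOPSize, &var);
        pCodecApi->Release();
    }
    return hr;
}
```

Note that the call has to happen after both media types have been set on the stream, so that the encoder actually exists when GetServiceForStream is called.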

Custom Video Processor for DX11VideoRenderer

What I'm Trying To Do

I am using a media sink based on the Media Foundation DX11VideoRenderer sample to play video. DX11VideoRenderer supports two video processing modes: one based on the Video Processor MFT (XVP) and one based on the Direct3D 11 ID3D11VideoProcessor. I have three issues:

1) As far as I know, the XVP path provides no way to disable video auto-processing, such as "contrast enhancement" and "skin tone mapping." Since my application is for color grading, it is essential that I turn these features off.
  
2) There is some diversity in the way different GPU vendors interpret color space conversions. For example, one vendor's BT709 YUV to RGB conversion deviates so far from the standard that I can't use it in the context of a color grading application.

3) I would like to support some color transforms and transfer functions that are not supported by Media Foundation, such as S-Log and Canon Log, LUTs, etc.

How I Think I Can Solve It
  
I believe I can implement the color transforms and custom transfer functions in a Direct3D 11 pixel shader and replace the DX11VideoRenderer video processor modes with one that applies the pixel shader to a rectangle drawn with Direct3D. This step could also simultaneously perform the NV12 to B8G8R8A8 conversion (or similar). I know the trick of mapping NV12 to a pair of R and RG shader resource views, and I have a working prototype.

The issue I ran into is the sample textures input to DX11VideoRenderer do not have the D3D11_BIND_SHADER_RESOURCE flag set and so cannot be used as pixel shader inputs. These are some ideas I have to address this issue:
  
1) Create a second texture for each sample texture input to DX11VideoRenderer with the D3D11_BIND_SHADER_RESOURCE flag set and use ID3D11DeviceContext::CopyResource to copy into it. This second copy could be expensive and seems unnecessary.
  
2) Perhaps there is a way for DX11VideoRenderer to hook into the texture creation and create textures with D3D11_BIND_SHADER_RESOURCE? I see that DX11VideoRenderer exposes an IMFVideoSampleAllocatorEx service that could allocate input textures, but unfortunately Media Foundation does not seem to use this service.
  
3) There is a MF_SA_D3D11_BINDFLAGS attribute that can be applied to MFTs. If I could gain access to the IMFTransform for whichever node creates the textures fed to DX11VideoRenderer, I could hint that D3D11_BIND_SHADER_RESOURCE is needed and perhaps fall back on 1) if the hint is not taken and the flag is not set.
  
The solution does not need to be limited to changes to DX11VideoRenderer. I have access to the code that initializes Media Foundation, sets up the playback topology, etc.
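
For completeness, idea 1) above can be sketched roughly as follows (names are illustrative; error handling and reuse of the destination texture across frames are omitted):

```cpp
#include <d3d11.h>

// Clone an incoming sample texture into a texture created with
// D3D11_BIND_SHADER_RESOURCE so it can be bound as a pixel-shader input.
HRESULT CloneAsShaderResource(ID3D11Device* pDevice,
                              ID3D11DeviceContext* pContext,
                              ID3D11Texture2D* pSrc,
                              ID3D11Texture2D** ppDst)
{
    D3D11_TEXTURE2D_DESC desc = {};
    pSrc->GetDesc(&desc);
    desc.BindFlags = D3D11_BIND_SHADER_RESOURCE;
    desc.Usage = D3D11_USAGE_DEFAULT;
    desc.CPUAccessFlags = 0;
    desc.MiscFlags = 0;

    HRESULT hr = pDevice->CreateTexture2D(&desc, NULL, ppDst);
    if (SUCCEEDED(hr))
        pContext->CopyResource(*ppDst, pSrc);  // GPU-side copy, no CPU round trip
    return hr;
}
```

As noted, the extra copy has a cost; ideas 2) and 3) would avoid it if the allocator can be persuaded to set the bind flag up front.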

DirectX & Media Foundation [UWP|WIN32] IMFMediaBuffer/IMF2DBuffer to ID3D11Texture2D


Hi.

I decided to integrate a webcam (via IMFSourceReaderCallback) into my engine, and I put together a workaround based on MFCaptureD3D from the Windows samples. But I am a little confused about two things.

1. In my engine, everything is synchronized through CRXMaterial::OnTick. For example, when you use IMFMediaEngine to play a video (mp4/avi), you get the ID3D11Texture2D1 from the ID3D11ShaderResourceView1, pass it to TransferFrame, and your video frame is copied into your texture.

Now, with IMFSourceReaderCallback, you must draw the frame inside IMFSourceReaderCallback::OnReadSample. To respect the OnTick philosophy, my idea is to buffer the IMFMediaBuffer inside IMFSourceReaderCallback::OnReadSample and pick it up later inside CRXMaterial::OnTick to transfer it into the texture resource.

2. The problem comes with the examples that use IDirect3DSurface9 and the YUV color system. In short: how do I transfer an IMF2DBuffer into an ID3D11Texture2D?

OnReadSample

//***************************************************************************
//* Class name    : CRXWebCam
//* Output        : HRESULT
//* Function name : OnReadSample
//* Description   : 
//* Input         : HRESULT hrStatus
//*                 DWORD dwStreamIndex
//*                 DWORD dwStreamFlags
//*                 LONGLONG llTimestamp
//*                 IMFSample* pSample
//***************************************************************************
HRESULT CRXWebCam::OnReadSample(HRESULT hrStatus, DWORD dwStreamIndex, DWORD dwStreamFlags, LONGLONG llTimestamp, IMFSample* pSample)
{
    HRESULT hr = S_OK;
    IMFMediaBuffer* pBuffer = NULL;

    if (FAILED(hrStatus))
        hr = hrStatus;

    if (SUCCEEDED(hr))
    {
        if (pSample)
        {
            hr = pSample->GetBufferByIndex(0, &pBuffer);

            //Buffering the frame
            m_pBuffer = pBuffer;
            m_pBuffer->AddRef();

            // Draw the frame.
        }
    }

    if (SUCCEEDED(hr))
        hr = m_pReader->ReadSample((DWORD)MF_SOURCE_READER_FIRST_VIDEO_STREAM, 0, NULL, NULL, NULL, NULL);
    if (pBuffer) pBuffer->Release();
    pBuffer = NULL;
    return hr;
}

OnTick

//***************************************************************************
// Class name     : CRXMedia
// Function name  : Render
// Description    : Render Media Class
//***************************************************************************
void CRXMaterial::OnTick()
{
	/*IDXGIFactory4* spFactory;
	IDXGIAdapter1* spAdapter;
	IDXGIOutput* spOutput;

	ThrowIfFailed(CreateDXGIFactory2(0, __uuidof(IDXGIFactory4), (void**)&spFactory));
	ThrowIfFailed(spFactory->EnumAdapters1(0, &spAdapter));
	ThrowIfFailed(spAdapter->EnumOutputs(0, &spOutput));*/
	if (m_pMediaEngine)
	{
		if (/*SUCCEEDED(spOutput->WaitForVBlank()) && this && */!m_pMediaEngine->m_spMediaEngine->IsPaused())
		{
			ID3D11Texture2D1* Texture;
			m_pTextureView->GetResource((ID3D11Resource**)&Texture);
			DWORD SizeX;
			DWORD SizeY;
			m_pMediaEngine->m_spMediaEngine->GetNativeVideoSize(&SizeX, &SizeY);

			RECT Rect = { 0,0, (LONG)SizeX, (LONG)SizeY };
			MFVideoNormalizedRect NormRect = { 0.0f, 0.0f, 1.0f, 1.0f };
			MFARGB BackColor = { 0, 0, 0, 255 };

			m_pMediaEngine->TransferFrame(Texture, NormRect, Rect, BackColor);
			Texture->Release();
		}
	}

	if (m_pWebCamEngine)   //HERE
	{			
		if (m_pWebCamEngine->m_pBuffer)
		{
			ID3D11Texture2D1* Texture;
			m_pTextureView->GetResource((ID3D11Resource**)&Texture);

			IMF2DBuffer* m_p2DBuffer = NULL;
			//m_pWebCamEngine->m_pBuffer->QueryInterface(IID_PPV_ARGS(&m_p2DBuffer));
			m_pWebCamEngine->m_pBuffer->QueryInterface(IID_IMF2DBuffer, (void**)&m_p2DBuffer);

			BYTE* ppbScanLine0;
			LONG plStride;

			m_p2DBuffer->Lock2D(&ppbScanLine0, &plStride);

			//YUV to RGB???

			/*for (DWORD y = 0; y < m_Height; y++)
			{
				RGBQUAD* pDestPel = (RGBQUAD*)mapped.pData;
				WORD* pSrcPel = (WORD*)ppbScanLine0;

				for (DWORD x = 0; x < 640; x += 2)
				{
					// Byte order is U0 Y0 V0 Y1

					int y0 = (int)LOBYTE(pSrcPel[x]);
					int u0 = (int)HIBYTE(pSrcPel[x]);
					int y1 = (int)LOBYTE(pSrcPel[x + 1]);
					int v0 = (int)HIBYTE(pSrcPel[x + 1]);

					pDestPel[x] = ConvertYCrCbToRGB(y0, v0, u0);
					pDestPel[x + 1] = ConvertYCrCbToRGB(y1, v0, u0);
				}

				ppbScanLine0 += plStride;
				//mapped.pData += mapped.RowPitch;
			}*/

			//m_pDirect3D->GetD3DDeviceContext()->Unmap(Texture, NULL);
			m_p2DBuffer->Unlock2D();
			m_p2DBuffer->Release();

			Texture->Release();

			if (m_pWebCamEngine->m_pBuffer) m_pWebCamEngine->m_pBuffer->Release();
			m_pWebCamEngine->m_pBuffer = NULL;
		}
	}

	/*spFactory->Release();
	spAdapter->Release();
	spOutput->Release();*/
}
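
On question 2, whatever the byte packing of your capture format turns out to be (YUY2 and UYVY differ in exactly the ordering your comment describes), the per-pixel conversion in the commented-out loop can be factored into a small portable helper. This uses the common BT.601 integer approximation; whether that matches your camera's color space is an assumption:

```cpp
#include <cstdint>
#include <algorithm>

struct RGB32 { uint8_t b, g, r, a; };

// BT.601 integer approximation of YCbCr -> RGB for one pixel. The caller
// is responsible for unpacking the YUY2/UYVY byte order, which is where
// the two packings differ.
inline RGB32 ConvertYCbCrToRGB(int y, int cb, int cr)
{
    const int c = y - 16, d = cb - 128, e = cr - 128;
    auto clip = [](int v) {
        return (uint8_t)std::min(255, std::max(0, v));
    };
    RGB32 px;
    px.r = clip((298 * c + 409 * e + 128) >> 8);
    px.g = clip((298 * c - 100 * d - 208 * e + 128) >> 8);
    px.b = clip((298 * c + 516 * d + 128) >> 8);
    px.a = 255;
    return px;
}
```

The ConvertYCrCbToRGB in the commented-out loop presumably does the same job; once the pixels are RGB32, a Map/memcpy/Unmap on a D3D11_USAGE_DYNAMIC texture (or an UpdateSubresource call) gets them into the ID3D11Texture2D.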


How to get the info from URI (ms-settings: xxxx)?


In Windows 10, you can see the Settings items when you open System Settings.

1. I want to get the friendly name from a specified URI.
 For example:
 Get "Battery Saver" from the URI "ms-settings:batterysaver-settings"
 Get "Diagnostics & feedback" from "ms-settings:privacy-feedback"
 Get "Sync your settings" from "ms-settings:sync"

 Please view the names and URIs from link below
 https://www.tenforums.com/tutorials/78214-settings-pages-list-uri-shortcuts-windows-10-a.html

2. How do I find out which group a sub-page belongs to? For example,
"SettingsPageAccountsSync (ms-settings:sync)" belongs to the "Accounts" group, and
"SettingsPageBackground (ms-settings:personalization-background)" belongs to the "Personalization" group.

3. How do I get the icon of a page?

Are there APIs for these?
 
Thank you.


CreateMediaSource example issue


Hi,

I've got a question about a common Microsoft sample showing how to create a media source from a URL.

// Create a media source from a URL.
HRESULT CreateMediaSource(PCWSTR sURL, IMFMediaSource **ppSource)
{
    MF_OBJECT_TYPE ObjectType = MF_OBJECT_INVALID;

    IMFSourceResolver* pSourceResolver = NULL;
    IUnknown* pSource = NULL;

    // Create the source resolver.
    HRESULT hr = MFCreateSourceResolver(&pSourceResolver);
    if (FAILED(hr))
    {
        goto done;
    }

    // Use the source resolver to create the media source.

    // Note: For simplicity this sample uses the synchronous method to create
    // the media source. However, creating a media source can take a noticeable
    // amount of time, especially for a network source. For a more responsive
    // UI, use the asynchronous BeginCreateObjectFromURL method.

    hr = pSourceResolver->CreateObjectFromURL(
        sURL,                       // URL of the source.
        MF_RESOLUTION_MEDIASOURCE,  // Create a source object.
        NULL,                       // Optional property store.
        &ObjectType,                // Receives the created object type.
        &pSource                    // Receives a pointer to the media source.
        );
    if (FAILED(hr))
    {
        goto done;
    }

    // Get the IMFMediaSource interface from the media source.
    hr = pSource->QueryInterface(IID_PPV_ARGS(ppSource));

done:
    SafeRelease(&pSourceResolver);
    SafeRelease(&pSource);
    return hr;
}



Now the issue is: when the given URL is a video file (.avi, .mkv, etc.), this method works fine.
However, the method fails when the URL is an audio file such as an MP3:
the return value of pSource->QueryInterface is 0x80004002 (No such interface supported),
and so the method fails.
The strange thing is that if I instead copy pSource straight into *ppSource (casting it to IMFMediaSource*),
the MediaSession plays the audio file as expected.
Is this a bug in QueryInterface, or something else?

Thank you, Tony.



E_UNEXPECTED in MESessionStarted event when topology has two branches


I am working on a session topology that records a video (with audio) while showing the video stream with an EVR. I am using SampleGrabber sinks for the video and audio streams so I can easily control when I start/end recording (using a sinkwriter) and extract sample data as I need to (for capturing stills and displaying the microphone's audio level). I am using a Tee node to split the video feed between the EVR and the video samplegrabber, and a copierMFT to deliver samples to the EVR.

Each branch of the topology works by itself, but when I add both to the topology the media session fails to start, with status E_UNEXPECTED ("Catastrophic failure") set on the MESessionStarted event. That is the only error I find in the MFTrace logs. My camera never turns on, and I don't receive any samples in my sample grabbers. The topology seems to resolve correctly, so I'm not sure what the session didn't expect. The documentation for IMFMediaSession::Start doesn't mention anything about this error. My current guess is that it has something to do with syncing the presentation clock, but setting the sample grabber output nodes as rateless doesn't seem to help. After I receive the topology-ready status event, checking the session's clock state returns MFCLOCK_STATE_INVALID.

Is there any special setup required to have a live camera source and microphone source in the same topology? Is there any way to get more info on the E_UNEXPECTED error?

Any help on getting to the bottom of this is appreciated. Here is a snippet of my mftrace log which shows the ready topology and the error:

12656,14E8 19:25:29.69496 CTopologyHelpers::Trace @02A799F8 >>>>>>>>>>>>> ready topology
12656,14E8 19:25:29.69499 CTopologyHelpers::TraceNode @ Node 0 @02A7A110 ID:317000000001, 0 inputs, 1 outputs, type 1, MF_TOPONODE_MARKIN_HERE=1;MF_TOPONODE_MARKOUT_HERE=1;MF_TOPONODE_MEDIASTART=0 (0,0);MF_TOPONODE_SOURCE=@02A95200;MF_TOPONODE_PRESENTATION_DESCRIPTOR=@02A63820;MF_TOPONODE_STREAM_DESCRIPTOR=@02A662C8;{89485B85-2FFA-4547-B269-B82C79EE197C}=1;{9C86CC4E-68CE-4CFF-AA1E-9A5A40D5B4E0}=@02A85698
12656,14E8 19:25:29.69500 CMFTopologyNodeDetours::GetGUID @02A7A110 attribute not found guidKey = MF_TOPONODE_TRANSFORM_OBJECTID
12656,14E8 19:25:29.69501 CTopologyHelpers::TraceObject @ Source @02A95200 {00000000-0000-0000-0000-000000000000} (C:\WINDOWS\SYSTEM32\MFCORE.DLL), MFMEDIASOURCE_CHARACTERISTICS=0x00000005
12656,14E8 19:25:29.69507 CTopologyHelpers::TraceStream @ Output stream 0, connected to node @02A7A188 stream 0, MT: MF_MT_FRAME_SIZE=2748779069920 (640,480);MF_MT_AVG_BITRATE=147456000;MF_MT_YUV_MATRIX=2;MF_MT_MAJOR_TYPE=MEDIATYPE_Video;MF_MT_VIDEO_LIGHTING=3;MF_MT_DEFAULT_STRIDE=1280;MF_MT_VIDEO_CHROMA_SITING=6;MF_MT_AM_FORMAT_TYPE=FORMAT_VIDEOINFO2;MF_MT_FIXED_SIZE_SAMPLES=1;MF_MT_VIDEO_NOMINAL_RANGE=2;MF_MT_FRAME_RATE=128849018881 (30,1);MF_MT_PIXEL_ASPECT_RATIO=4294967297 (1,1);MF_MT_ALL_SAMPLES_INDEPENDENT=1;MF_MT_FRAME_RATE_RANGE_MIN=128849018881 (30,1);MF_MT_SAMPLE_SIZE=614400;MF_MT_VIDEO_PRIMARIES=2;MF_MT_INTERLACE_MODE=2;MF_MT_FRAME_RATE_RANGE_MAX=128849018881 (30,1);MF_MT_SUBTYPE=MFVideoFormat_YUY2
12656,14E8 19:25:29.69509 CTopologyHelpers::TraceNode @ Node 1 @02A7A188 ID:317000000002, 1 inputs, 2 outputs, type 3, MF_TOPONODE_PRIMARYOUTPUT=0;{89485B85-2FFA-4547-B269-B82C79EE197C}=1;{9C86CC4E-68CE-4CFF-AA1E-9A5A40D5B4E0}=@02A7CE00
12656,14E8 19:25:29.69509 CMFTopologyNodeDetours::GetGUID @02A7A188 attribute not found guidKey = MF_TOPONODE_TRANSFORM_OBJECTID
12656,14E8 19:25:29.69509 CTopologyHelpers::TraceObject @ Tee @00000000 {00000000-0000-0000-0000-000000000000} ((null)), (null)
12656,14E8 19:25:29.69514 CTopologyHelpers::TraceStream @ Input stream 0, connected to node @02A7A110 stream 0, MT: MF_MT_FRAME_SIZE=2748779069920 (640,480);MF_MT_AVG_BITRATE=147456000;MF_MT_YUV_MATRIX=2;MF_MT_MAJOR_TYPE=MEDIATYPE_Video;MF_MT_VIDEO_LIGHTING=3;MF_MT_DEFAULT_STRIDE=1280;MF_MT_VIDEO_CHROMA_SITING=6;MF_MT_AM_FORMAT_TYPE=FORMAT_VIDEOINFO2;MF_MT_FIXED_SIZE_SAMPLES=1;MF_MT_VIDEO_NOMINAL_RANGE=2;MF_MT_FRAME_RATE=128849018881 (30,1);MF_MT_PIXEL_ASPECT_RATIO=4294967297 (1,1);MF_MT_ALL_SAMPLES_INDEPENDENT=1;MF_MT_FRAME_RATE_RANGE_MIN=128849018881 (30,1);MF_MT_SAMPLE_SIZE=614400;MF_MT_VIDEO_PRIMARIES=2;MF_MT_INTERLACE_MODE=2;MF_MT_FRAME_RATE_RANGE_MAX=128849018881 (30,1);MF_MT_SUBTYPE=MFVideoFormat_YUY2
12656,14E8 19:25:29.69518 CTopologyHelpers::TraceStream @ Output stream 0, connected to node @02A79D68 stream 0, MT: MF_MT_FRAME_SIZE=2748779069920 (640,480);MF_MT_AVG_BITRATE=147456000;MF_MT_YUV_MATRIX=2;MF_MT_MAJOR_TYPE=MEDIATYPE_Video;MF_MT_VIDEO_LIGHTING=3;MF_MT_DEFAULT_STRIDE=1280;MF_MT_VIDEO_CHROMA_SITING=6;MF_MT_AM_FORMAT_TYPE=FORMAT_VIDEOINFO2;MF_MT_FIXED_SIZE_SAMPLES=1;MF_MT_VIDEO_NOMINAL_RANGE=2;MF_MT_FRAME_RATE=128849018881 (30,1);MF_MT_PIXEL_ASPECT_RATIO=4294967297 (1,1);MF_MT_ALL_SAMPLES_INDEPENDENT=1;MF_MT_FRAME_RATE_RANGE_MIN=128849018881 (30,1);MF_MT_SAMPLE_SIZE=614400;MF_MT_VIDEO_PRIMARIES=2;MF_MT_INTERLACE_MODE=2;MF_MT_FRAME_RATE_RANGE_MAX=128849018881 (30,1);MF_MT_SUBTYPE=MFVideoFormat_YUY2
12656,14E8 19:25:29.69522 CTopologyHelpers::TraceStream @ Output stream 1, connected to node @02A7B1F0 stream 0, MT: MF_MT_FRAME_SIZE=2748779069920 (640,480);MF_MT_AVG_BITRATE=147456000;MF_MT_YUV_MATRIX=2;MF_MT_MAJOR_TYPE=MEDIATYPE_Video;MF_MT_VIDEO_LIGHTING=3;MF_MT_DEFAULT_STRIDE=1280;MF_MT_VIDEO_CHROMA_SITING=6;MF_MT_AM_FORMAT_TYPE=FORMAT_VIDEOINFO2;MF_MT_FIXED_SIZE_SAMPLES=1;MF_MT_VIDEO_NOMINAL_RANGE=2;MF_MT_FRAME_RATE=128849018881 (30,1);MF_MT_PIXEL_ASPECT_RATIO=4294967297 (1,1);MF_MT_ALL_SAMPLES_INDEPENDENT=1;MF_MT_FRAME_RATE_RANGE_MIN=128849018881 (30,1);MF_MT_SAMPLE_SIZE=614400;MF_MT_VIDEO_PRIMARIES=2;MF_MT_INTERLACE_MODE=2;MF_MT_FRAME_RATE_RANGE_MAX=128849018881 (30,1);MF_MT_SUBTYPE=MFVideoFormat_YUY2
12656,14E8 19:25:29.69524 CTopologyHelpers::TraceNode @ Node 2 @02A79D68 ID:317000000005, 1 inputs, 1 outputs, type 2, {89485B85-2FFA-4547-B269-B82C79EE197C}=1;{9C86CC4E-68CE-4CFF-AA1E-9A5A40D5B4E0}=@02A85908
12656,14E8 19:25:29.69524 CMFTopologyNodeDetours::GetGUID @02A79D68 attribute not found guidKey = MF_TOPONODE_TRANSFORM_OBJECTID
12656,14E8 19:25:29.69526 CTopologyHelpers::TraceObject @ MFT @02A6DAE8 {00000000-0000-0000-0000-000000000000} (C:\WINDOWS\SYSTEM32\MFCORE.DLL), MFT_SUPPORT_DYNAMIC_FORMAT_CHANGE=1;{851745D5-C3D6-476D-9527-498EF2D10D18}=4
12656,14E8 19:25:29.69531 CTopologyHelpers::TraceStream @ Input stream 0, connected to node @02A7A188 stream 0, MT: MF_MT_FRAME_SIZE=2748779069920 (640,480);MF_MT_AVG_BITRATE=147456000;MF_MT_YUV_MATRIX=2;MF_MT_MAJOR_TYPE=MEDIATYPE_Video;MF_MT_VIDEO_LIGHTING=3;MF_MT_DEFAULT_STRIDE=1280;MF_MT_VIDEO_CHROMA_SITING=6;MF_MT_AM_FORMAT_TYPE=FORMAT_VIDEOINFO2;MF_MT_FIXED_SIZE_SAMPLES=1;MF_MT_VIDEO_NOMINAL_RANGE=2;MF_MT_FRAME_RATE=128849018881 (30,1);MF_MT_PIXEL_ASPECT_RATIO=4294967297 (1,1);MF_MT_ALL_SAMPLES_INDEPENDENT=1;MF_MT_FRAME_RATE_RANGE_MIN=128849018881 (30,1);MF_MT_SAMPLE_SIZE=614400;MF_MT_VIDEO_PRIMARIES=2;MF_MT_INTERLACE_MODE=2;MF_MT_FRAME_RATE_RANGE_MAX=128849018881 (30,1);MF_MT_SUBTYPE=MFVideoFormat_YUY2
12656,14E8 19:25:29.69536 CTopologyHelpers::TraceStream @ Output stream 0, connected to node @02A79EA0 stream 0, MT: MF_MT_FRAME_SIZE=2748779069920 (640,480);MF_MT_AVG_BITRATE=147456000;MF_MT_YUV_MATRIX=2;MF_MT_MAJOR_TYPE=MEDIATYPE_Video;MF_MT_VIDEO_LIGHTING=3;MF_MT_DEFAULT_STRIDE=1280;MF_MT_VIDEO_CHROMA_SITING=6;MF_MT_AM_FORMAT_TYPE=FORMAT_VIDEOINFO2;MF_MT_FIXED_SIZE_SAMPLES=1;MF_MT_VIDEO_NOMINAL_RANGE=2;MF_MT_FRAME_RATE=128849018881 (30,1);MF_MT_PIXEL_ASPECT_RATIO=4294967297 (1,1);MF_MT_ALL_SAMPLES_INDEPENDENT=1;MF_MT_FRAME_RATE_RANGE_MIN=128849018881 (30,1);MF_MT_SAMPLE_SIZE=614400;MF_MT_VIDEO_PRIMARIES=2;MF_MT_INTERLACE_MODE=2;MF_MT_FRAME_RATE_RANGE_MAX=128849018881 (30,1);MF_MT_SUBTYPE=MFVideoFormat_YUY2
12656,14E8 19:25:29.69538 CTopologyHelpers::TraceNode @ Node 3 @02A79DE0 ID:317000000003, 1 inputs, 0 outputs, type 0, MF_TOPONODE_STREAMID=0;MF_TOPONODE_DISABLE_PREROLL=1;{89485B85-2FFA-4547-B269-B82C79EE197C}=1;{9C86CC4E-68CE-4CFF-AA1E-9A5A40D5B4E0}=@02A85AE0
12656,14E8 19:25:29.69539 CMFTopologyNodeDetours::GetGUID @02A79DE0 attribute not found guidKey = MF_TOPONODE_TRANSFORM_OBJECTID
12656,14E8 19:25:29.69539 CTopologyHelpers::TraceObject @ Sink @02A6FB28 {00000000-0000-0000-0000-000000000000} (C:\WINDOWS\SYSTEM32\MFCORE.DLL), (null)
12656,14E8 19:25:29.69544 CTopologyHelpers::TraceStream @ Input stream 0, connected to node @02A7B1F0 stream 0, MT: MF_MT_FRAME_SIZE=2748779069920 (640,480);MF_MT_YUV_MATRIX=2;MF_MT_MAJOR_TYPE=MEDIATYPE_Video;MF_MT_VIDEO_LIGHTING=3;MF_MT_DEFAULT_STRIDE=2560;MF_MT_VIDEO_CHROMA_SITING=6;MF_MT_GEOMETRIC_APERTURE=00 00 00 00 00 00 00 00 80 02 00 00 e0 01 00 00 ;MF_MT_FIXED_SIZE_SAMPLES=1;MF_MT_VIDEO_NOMINAL_RANGE=2;MF_MT_FRAME_RATE=42949672960333333 (10000000,333333);MF_MT_PIXEL_ASPECT_RATIO=4294967297 (1,1);MF_MT_ALL_SAMPLES_INDEPENDENT=1;MF_MT_SAMPLE_SIZE=1228800;MF_MT_VIDEO_PRIMARIES=2;MF_MT_INTERLACE_MODE=2;MF_MT_SUBTYPE=MFVideoFormat_RGB32
12656,14E8 19:25:29.69546 CTopologyHelpers::TraceNode @ Node 4 @02A7B1F0 ID:31700000000C, 1 inputs, 1 outputs, type 2, MF_TOPONODE_TRANSFORM_OBJECTID={98230571-0087-4204-B020-3282538E57D3};{89485B85-2FFA-4547-B269-B82C79EE197C}=1;{9C86CC4E-68CE-4CFF-AA1E-9A5A40D5B4E0}=@02A86040
12656,14E8 19:25:29.69546 CTopologyHelpers::TraceObject @ MFT @02A7A488 {98230571-0087-4204-B020-3282538E57D3} (C:\Windows\SYSTEM32\colorcnv.dll), <NULL>
12656,14E8 19:25:29.69551 CTopologyHelpers::TraceStream @ Input stream 0, connected to node @02A7A188 stream 1, MT: MF_MT_FRAME_SIZE=2748779069920 (640,480);MF_MT_AVG_BITRATE=147456000;MF_MT_YUV_MATRIX=2;MF_MT_MAJOR_TYPE=MEDIATYPE_Video;MF_MT_VIDEO_LIGHTING=3;MF_MT_DEFAULT_STRIDE=1280;MF_MT_VIDEO_CHROMA_SITING=6;MF_MT_FIXED_SIZE_SAMPLES=1;MF_MT_VIDEO_NOMINAL_RANGE=2;MF_MT_FRAME_RATE=128849018881 (30,1);MF_MT_PIXEL_ASPECT_RATIO=4294967297 (1,1);MF_MT_ALL_SAMPLES_INDEPENDENT=1;MF_MT_SAMPLE_SIZE=614400;MF_MT_VIDEO_PRIMARIES=2;MF_MT_INTERLACE_MODE=2;MF_MT_SUBTYPE=MFVideoFormat_YUY2
12656,14E8 19:25:29.69556 CTopologyHelpers::TraceStream @ Output stream 0, connected to node @02A79DE0 stream 0, MT: MF_MT_FRAME_SIZE=2748779069920 (640,480);MF_MT_YUV_MATRIX=2;MF_MT_MAJOR_TYPE=MEDIATYPE_Video;MF_MT_VIDEO_LIGHTING=3;MF_MT_DEFAULT_STRIDE=2560;MF_MT_VIDEO_CHROMA_SITING=6;MF_MT_GEOMETRIC_APERTURE=00 00 00 00 00 00 00 00 80 02 00 00 e0 01 00 00 ;MF_MT_FIXED_SIZE_SAMPLES=1;MF_MT_VIDEO_NOMINAL_RANGE=2;MF_MT_FRAME_RATE=42949672960333333 (10000000,333333);MF_MT_PIXEL_ASPECT_RATIO=4294967297 (1,1);MF_MT_ALL_SAMPLES_INDEPENDENT=1;MF_MT_SAMPLE_SIZE=1228800;MF_MT_VIDEO_PRIMARIES=2;MF_MT_INTERLACE_MODE=2;MF_MT_SUBTYPE=MFVideoFormat_RGB32
12656,14E8 19:25:29.69559 CTopologyHelpers::TraceNode @ Node 5 @02A79EA0 ID:317000000004, 1 inputs, 0 outputs, type 0, MF_TOPONODE_STREAMID=0;MF_TOPONODE_NOSHUTDOWN_ON_REMOVE=0;MF_TOPONODE_DISABLE_PREROLL=1;{89485B85-2FFA-4547-B269-B82C79EE197C}=1;{9C86CC4E-68CE-4CFF-AA1E-9A5A40D5B4E0}=@02A86218;{B8AA3129-DFC9-423A-8ACD-1D82850A3D1F}=@02A8B9E0
12656,14E8 19:25:29.69560 CMFTopologyNodeDetours::GetGUID @02A79EA0 attribute not found guidKey = MF_TOPONODE_TRANSFORM_OBJECTID
12656,14E8 19:25:29.69560 CTopologyHelpers::TraceObject @ Sink @02A8E5D4 {00000000-0000-0000-0000-000000000000} (C:\WINDOWS\SYSTEM32\MF.dll), (null)
12656,14E8 19:25:29.69565 CTopologyHelpers::TraceStream @ Input stream 0, connected to node @02A79D68 stream 0, MT: MF_MT_FRAME_SIZE=2748779069920 (640,480);MF_MT_AVG_BITRATE=147456000;MF_MT_YUV_MATRIX=2;MF_MT_MAJOR_TYPE=MEDIATYPE_Video;MF_MT_VIDEO_LIGHTING=3;MF_MT_DEFAULT_STRIDE=1280;MF_MT_VIDEO_CHROMA_SITING=6;MF_MT_AM_FORMAT_TYPE=FORMAT_VIDEOINFO2;MF_MT_FIXED_SIZE_SAMPLES=1;MF_MT_VIDEO_NOMINAL_RANGE=2;MF_MT_FRAME_RATE=128849018881 (30,1);MF_MT_PIXEL_ASPECT_RATIO=4294967297 (1,1);MF_MT_ALL_SAMPLES_INDEPENDENT=1;MF_MT_FRAME_RATE_RANGE_MIN=128849018881 (30,1);MF_MT_SAMPLE_SIZE=614400;MF_MT_VIDEO_PRIMARIES=2;MF_MT_INTERLACE_MODE=2;MF_MT_FRAME_RATE_RANGE_MAX=128849018881 (30,1);MF_MT_SUBTYPE=MFVideoFormat_YUY2
12656,14E8 19:25:29.69568 CTopologyHelpers::TraceNode @ Node 6 @02A79F18 ID:317000000006, 0 inputs, 1 outputs, type 1, MF_TOPONODE_MARKIN_HERE=1;MF_TOPONODE_MARKOUT_HERE=1;MF_TOPONODE_MEDIASTART=0 (0,0);MF_TOPONODE_SOURCE=@02A65D40;MF_TOPONODE_PRESENTATION_DESCRIPTOR=@02A63CE8;MF_TOPONODE_STREAM_DESCRIPTOR=@02A63048;{89485B85-2FFA-4547-B269-B82C79EE197C}=1;{9C86CC4E-68CE-4CFF-AA1E-9A5A40D5B4E0}=@02A86370
12656,14E8 19:25:29.69568 CMFTopologyNodeDetours::GetGUID @02A79F18 attribute not found guidKey = MF_TOPONODE_TRANSFORM_OBJECTID
12656,14E8 19:25:29.69569 CTopologyHelpers::TraceObject @ Source @02A65D40 {00000000-0000-0000-0000-000000000000} (C:\WINDOWS\SYSTEM32\MFCORE.DLL), MFMEDIASOURCE_CHARACTERISTICS=0x00000005
12656,14E8 19:25:29.69571 CTopologyHelpers::TraceStream @ Output stream 0, connected to node @02A7BFA0 stream 0, MT: MF_MT_AUDIO_AVG_BYTES_PER_SECOND=384000;MF_MT_AUDIO_BLOCK_ALIGNMENT=8;MF_MT_AUDIO_NUM_CHANNELS=2;MF_MT_MAJOR_TYPE=MEDIATYPE_Audio;MF_MT_AUDIO_CHANNEL_MASK=3;MF_MT_AUDIO_SAMPLES_PER_SECOND=48000;MF_MT_ALL_SAMPLES_INDEPENDENT=1;MF_MT_AUDIO_BITS_PER_SAMPLE=32;MF_MT_SUBTYPE=MFAudioFormat_Float
12656,14E8 19:25:29.69573 CTopologyHelpers::TraceNode @ Node 7 @02A7BDC0 ID:317000000007, 1 inputs, 0 outputs, type 0, MF_TOPONODE_STREAMID=0;MF_TOPONODE_DISABLE_PREROLL=1;{89485B85-2FFA-4547-B269-B82C79EE197C}=1;{9C86CC4E-68CE-4CFF-AA1E-9A5A40D5B4E0}=@02A864E0
12656,14E8 19:25:29.69573 CMFTopologyNodeDetours::GetGUID @02A7BDC0 attribute not found guidKey = MF_TOPONODE_TRANSFORM_OBJECTID
12656,14E8 19:25:29.69574 CTopologyHelpers::TraceObject @ Sink @02A70410 {00000000-0000-0000-0000-000000000000} (C:\WINDOWS\SYSTEM32\MFCORE.DLL), (null)
12656,14E8 19:25:29.69575 CTopologyHelpers::TraceStream @ Input stream 0, connected to node @02A7BFA0 stream 0, MT: MF_MT_AUDIO_AVG_BYTES_PER_SECOND=88200;MF_MT_AUDIO_BLOCK_ALIGNMENT=2;MF_MT_AUDIO_NUM_CHANNELS=1;MF_MT_MAJOR_TYPE=MEDIATYPE_Audio;MF_MT_AUDIO_CHANNEL_MASK=4;MF_MT_AUDIO_SAMPLES_PER_SECOND=44100;MF_MT_ALL_SAMPLES_INDEPENDENT=1;MF_MT_AUDIO_BITS_PER_SAMPLE=16;MF_MT_SUBTYPE=MFAudioFormat_PCM
12656,14E8 19:25:29.69577 CTopologyHelpers::TraceNode @ Node 8 @02A7BFA0 ID:317000000010, 1 inputs, 1 outputs, type 2, MF_TOPONODE_TRANSFORM_OBJECTID={F447B69E-1884-4A7E-8055-346F74D6EDB3};{89485B85-2FFA-4547-B269-B82C79EE197C}=1;{9C86CC4E-68CE-4CFF-AA1E-9A5A40D5B4E0}=@02AAA068
12656,14E8 19:25:29.69578 CTopologyHelpers::TraceObject @ MFT @02A7C4AC {F447B69E-1884-4A7E-8055-346F74D6EDB3} (C:\Windows\SYSTEM32\resampledmo.dll), <NULL>
12656,14E8 19:25:29.69581 CTopologyHelpers::TraceStream @ Input stream 0, connected to node @02A79F18 stream 0, MT: MF_MT_AUDIO_AVG_BYTES_PER_SECOND=384000;MF_MT_AUDIO_BLOCK_ALIGNMENT=8;MF_MT_AUDIO_NUM_CHANNELS=2;MF_MT_MAJOR_TYPE=MEDIATYPE_Audio;MF_MT_AUDIO_CHANNEL_MASK=3;MF_MT_AUDIO_SAMPLES_PER_SECOND=48000;MF_MT_ALL_SAMPLES_INDEPENDENT=1;MF_MT_AUDIO_BITS_PER_SAMPLE=32;MF_MT_SUBTYPE=MFAudioFormat_Float
12656,14E8 19:25:29.69583 CTopologyHelpers::TraceStream @ Output stream 0, connected to node @02A7BDC0 stream 0, MT: MF_MT_AUDIO_AVG_BYTES_PER_SECOND=88200;MF_MT_AUDIO_BLOCK_ALIGNMENT=2;MF_MT_AUDIO_NUM_CHANNELS=1;MF_MT_MAJOR_TYPE=MEDIATYPE_Audio;MF_MT_AUDIO_SAMPLES_PER_SECOND=44100;MF_MT_AUDIO_PREFER_WAVEFORMATEX=1;MF_MT_ALL_SAMPLES_INDEPENDENT=1;MF_MT_AUDIO_BITS_PER_SAMPLE=16;MF_MT_SUBTYPE=MFAudioFormat_PCM
12656,14E8 19:25:29.69584 CTopologyHelpers::Trace @02A799F8 MF_TOPOLOGY_RESOLUTION_STATUS = 0
12656,14E8 19:25:29.69584 CTopologyHelpers::Trace @02A799F8 <<<<<<<<<<<<< ready topology
12656,14E8 19:25:29.69590 CKernel32ExportDetours::OutputDebugStringA @ TOPOLOGY READY 
12656,14E8 19:25:29.69600 CMFTopologyDetours::GetUINT32 @02A799F8 attribute not found guidKey = {9C27891A-ED7A-40E1-88E8-B22727A024EE}
12656,14E8 19:25:29.69612 CMFMediaSessionDetours::EndGetEvent @02A6DDC0 Met=103 MESessionStarted, value (empty), failed HrStatus=8000FFFF E_UNEXPECTED, 


Win7 + H.264 Encoder + IMFSinkWriter Can't use Quality VBR encoding?


I'm trying to set the encoder's rate control mode to eAVEncCommonRateControlMode_Quality via ICodecAPI.

However, the setting is ignored, as stated in the documentation, which says the property must be set before IMFTransform::SetOutputType is called.

Now here is the problem: the sink writer seems to call IMFTransform::SetOutputType when we call SetInputMediaType on the sink writer. However, if we don't call SetInputMediaType, we can't retrieve the ICodecAPI interface via sinkWriter.GetServiceForStream (it throws an exception) to change the quality setting. It seems like a catch-22. I'm hoping it's me and not a design flaw in the APIs.

Setting the quality property works on Win8, which honors it even when it is set after IMFTransform::SetOutputType has been called.
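For reference, a minimal sketch of the sequence being attempted (hypothetical helper, not a verified workaround — assumes a sink writer whose video stream index is `streamIndex`). GetServiceForStream only succeeds after SetInputMediaType has created the encoder, but by then the encoder's SetOutputType has already run, so on Win7 the SetValue calls below land too late and are ignored:

```cpp
// Hedged sketch: obtain ICodecAPI from the sink writer and request Quality VBR.
#include <mfidl.h>
#include <mfreadwrite.h>
#include <strmif.h>     // ICodecAPI
#include <codecapi.h>   // CODECAPI_* property GUIDs

HRESULT TrySetQualityVbr(IMFSinkWriter *pWriter, DWORD streamIndex)
{
    ICodecAPI *pCodecApi = NULL;
    // Only succeeds once the encoder exists, i.e. after SetInputMediaType.
    HRESULT hr = pWriter->GetServiceForStream(
        streamIndex, GUID_NULL, IID_PPV_ARGS(&pCodecApi));
    if (SUCCEEDED(hr))
    {
        VARIANT v;
        VariantInit(&v);
        v.vt = VT_UI4;
        v.ulVal = eAVEncCommonRateControlMode_Quality;
        hr = pCodecApi->SetValue(&CODECAPI_AVEncCommonRateControlMode, &v);
        if (SUCCEEDED(hr))
        {
            v.ulVal = 70; // hypothetical target quality, 0-100
            hr = pCodecApi->SetValue(&CODECAPI_AVEncCommonQuality, &v);
        }
        pCodecApi->Release();
    }
    return hr;
}
```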

Help!!

 


access denied when activating HEVC video decoder (HEVCVideoExtension)


Hello there, I'm having trouble activating the HEVC video decoder: when I call ActivateObject on an HEVC decoder enumerated via MFTEnumEx, I get an "access denied" error. I have no idea why this happens; any help would be highly appreciated. Code attached below:

HRESULT FindVideoDecoder(
	const GUID& subtype,
	IMFTransform **ppDecoder
)
{
	HRESULT hr = S_OK;
	UINT32 count = 0;

	IMFActivate **ppActivate = NULL;

	MFT_REGISTER_TYPE_INFO info = { MFMediaType_Video, subtype };

	UINT32 unFlags = MFT_ENUM_FLAG_SYNCMFT | MFT_ENUM_FLAG_LOCALMFT |
		MFT_ENUM_FLAG_SORTANDFILTER;

	if (1)
	{
		unFlags |= MFT_ENUM_FLAG_ASYNCMFT;
	}
	if (1)
	{
		unFlags |= MFT_ENUM_FLAG_HARDWARE;
	}
	if (1)
	{
		unFlags |= MFT_ENUM_FLAG_TRANSCODE_ONLY;
	}

	hr = MFTEnumEx(MFT_CATEGORY_VIDEO_DECODER,
		unFlags,
		&info,      // Input type
		NULL,       // Output type
		&ppActivate,
		&count);

	if (SUCCEEDED(hr) && count == 0)
	{
		hr = E_FAIL;
	}

	std::cout << "found " << count << " hevc decoder" << std::endl;

	if (SUCCEEDED(hr))
	{
		WCHAR MFDecName[1024];
		UINT32 nNameLength = 1024;
		ppActivate[0]->GetString(MFT_FRIENDLY_NAME_Attribute, MFDecName, 1024, &nNameLength);

		std::wcout << L"found " << MFDecName << std::endl;
	}

	// Create the first decoder in the list.
	if (SUCCEEDED(hr))
	{
		hr = ppActivate[0]->ActivateObject(IID_PPV_ARGS(ppDecoder));
		if (FAILED(hr))
			std::cout << "activate failed" << std::endl;
		else
			std::cout << "activate succeeded" << std::endl;
	}

	for (UINT32 i = 0; i < count; i++)
	{
		ppActivate[i]->Release();
	}
	CoTaskMemFree(ppActivate);

	return hr;
}

int main()
{
	MFStartup(MF_VERSION, MFSTARTUP_FULL);
	IMFTransform *pDecoder = nullptr;
	FindVideoDecoder(MFVideoFormat_HEVC, &pDecoder);
	if (pDecoder)
	{
		pDecoder->Release();
	}
	MFShutdown();
	system("pause");
	return 0;
}

 

The problem when add tee node to the preview topology


Hi everyone:

I use TopoEdit to create a topology to preview H.264 video from camera capture. The camera uses bulk transfer mode to transmit data.

I create the topology as follows:

1. camera source -> decoder -> EVR : preview succeeds

2. camera source -> Tee -> decoder -> EVR : the first preview fails, but subsequent previews succeed

Can anyone explain this result to me? Is there something special about the Tee node?



TopoEdit - Tee nodes are broken?


I just installed the latest Windows 10 SDK so I could use TopoEdit to start some Media Foundation work. When I add a Tee node, it has only an input pin and zero outputs. After connecting the Tee's input pin, there are still no output pins. That's not expected, is it?

Thank you,

Josh

Low quality H.265 encoding

Related SO question: this.

The ICodecAPI properties of the HEVC encoder do not seem to work. I'm trying to encode video with the MF H.265 encoder, and no matter what I try, the quality is always lower than the same-settings video produced by non-MF encoders, like what VideoPad uses (say, ffmpeg), at the same 4000 bitrate.

VideoPad produces this video of a swimming boy. My app produces this video. The sky in my app is clearly worse at a 6K bitrate than VideoPad's is at 1K.

https://docs.microsoft.com/en-us/windows/win32/medfound/h-265---hevc-video-encoder : does that link accurately describe the parameters of the H.265 encoder? Quality vs. speed does not work. Quality when setting eAVEncCommonRateControlMode_Quality does not work.


    if (true)
    {
        VARIANT v = {};
        v.vt = VT_BOOL;
        v.boolVal = VARIANT_FALSE;
        ca->SetValue(&CODECAPI_AVLowLatencyMode, &v);
    }
    if (true)
    {
        VARIANT v = {};
        v.vt = VT_UI4;
        v.ulVal = 100;
        hr = ca->SetValue(&CODECAPI_AVEncCommonQualityVsSpeed, &v);
    }

    if (true)
    {
        VARIANT v = {};
        v.vt = VT_UI4;
        v.ulVal = eAVEncCommonRateControlMode_Quality;
        ca->SetValue(&CODECAPI_AVEncCommonRateControlMode, &v);
        if (true)
        {
            VARIANT v = {};
            v.vt = VT_UI4;
            v.ulVal = 100;
            ca->SetValue(&CODECAPI_AVEncCommonQuality, &v);
        }
    }

I also added MF_MT_MPEG2_LEVEL and MF_MT_VIDEO_PROFILE; no luck. Same quality and same (very fast) encoding speed.

What am I missing?


Michael




Why is the playback speed of video encoded with IMFSinkWriter changing based on width?


I'm making a screen recorder (without audio) using the Sink Writer to encode a series of bitmaps into an MP4 file.

For some reason, the video playback speed increases (seemingly) proportionally with the video width.

From a post on Stack Overflow (I can't post links because I can't log in :/), I've gathered that it's most likely because I'm calculating the buffer size incorrectly. The difference there is that their playback issue was fixed once the audio buffer size was calculated correctly, but since I don't encode any audio at all, I'm not sure what to take from it.

I've also tried to read about how the buffer works from the MSDN documentation, but I'm really at a loss as to exactly how the buffer size is causing different playback speeds.

Someone also pointed out that it may have something to do with the frame index/duration, so I'll also include that in the code, just in case.

i.e.: depending on the width of the member variable `m_width` (measured in pixels), the playback speed changes. That is, the greater the width, the faster the video plays, and vice versa.

rtStart and rtDuration (frame index/duration) are defined as follows; both are private members of the MP4File class.

LONGLONG rtStart = 0;
UINT64   rtDuration;
MFFrameRateToAverageTimePerFrame(m_FPS, 1, &rtDuration);

This is where rtStart is updated, and the individual bits of the bitmap is passed to the frame writer.

void MP4File::AppendFrame(HBITMAP frame)
{
    HANDLE hHeap = HeapCreate(HEAP_NO_SERIALIZE, m_width * m_height * 4, 0);
    if (hHeap == NULL)
    {
        return;
    }

    LPVOID lpFrameBits = HeapAlloc(hHeap, HEAP_ZERO_MEMORY | HEAP_NO_SERIALIZE, m_width * m_height * 4);
    if (lpFrameBits == NULL)
    {
        HeapDestroy(hHeap); // avoid leaking the heap if the allocation fails
        return;
    }

    BITMAPINFO bmpInfo;
    bmpInfo.bmiHeader.biBitCount = 0;
    bmpInfo.bmiHeader.biSize = sizeof(BITMAPINFOHEADER);

    GetDIBits(m_hDC, frame, 0, 0, NULL, &bmpInfo, DIB_RGB_COLORS);
    bmpInfo.bmiHeader.biCompression = BI_RGB;
    GetDIBits(m_hDC, frame, 0, bmpInfo.bmiHeader.biHeight, lpFrameBits, &bmpInfo, DIB_RGB_COLORS);

    WriteFrame(lpFrameBits);

    HeapFree(hHeap, HEAP_NO_SERIALIZE, lpFrameBits);
    lpFrameBits = NULL;

    HeapDestroy(hHeap);
    hHeap = NULL;

    if (SUCCEEDED(m_writeFrameResult)) // set by MP4File::WriteFrame() below
        rtStart += rtDuration;
}

And lastly, the frame writer which actually loads the bits into the buffer, and then writes to the Sink Writer.

void MP4File::WriteFrame(LPVOID lpBits)
{
    IMFSample *pSample = NULL;
    IMFMediaBuffer *pBuffer = NULL;

    const LONG cbWidth = 4 * m_width;
    const DWORD cbBufferSize = cbWidth * m_height;

    BYTE *pData = NULL;

    HRESULT hr = MFCreateMemoryBuffer(cbBufferSize, &pBuffer);

    if (SUCCEEDED(hr))
    {
        hr = pBuffer->Lock(&pData, NULL, NULL);
    }
    if (SUCCEEDED(hr))
    {
        hr = MFCopyImage(
            pData,                      // Destination buffer.
            cbWidth,                    // Destination stride.
            (BYTE*)lpBits,              // First row in source image.
            cbWidth,                    // Source stride.
            cbWidth,                    // Image width in bytes.
            m_height                    // Image height in pixels.
        );
    }
    if (pBuffer)
    {
        pBuffer->Unlock();
    }

    if (SUCCEEDED(hr))
    {
        hr = pBuffer->SetCurrentLength(cbBufferSize);
    }

    if (SUCCEEDED(hr))
    {
        hr = MFCreateSample(&pSample);
    }
    if (SUCCEEDED(hr))
    {
        hr = pSample->AddBuffer(pBuffer);
    }

    if (SUCCEEDED(hr))
    {
        hr = pSample->SetSampleTime(rtStart);
    }
    if (SUCCEEDED(hr))
    {
        hr = pSample->SetSampleDuration(rtDuration);
    }

    if (SUCCEEDED(hr))
    {
        hr = m_pSinkWriter->WriteSample(m_streamIndex, pSample);
    }

    SafeRelease(&pSample);
    SafeRelease(&pBuffer);
    m_writeFrameResult = hr;
}

A couple details about the encoding:

  • Framerate: 30FPS
  • Bitrate: 15 000 000
  • Output encoding format: H264 (MP4)

Sample Grabber Sink not setting media type GUID

I'm using the example code from the Sample Grabber Sink reference page, except that I'm processing an MP4 file to get both audio and video samples (my sample code). To process the samples in the callback, I need to know which ones are audio and which are video. The problem is that the REFGUID guidMajorMediaType is always set to GUID_NULL.

Reading HEIF/HEIC files


Hi team,

I was wondering if I can use Media Foundation to read HEIC/HEIF files. I tried using IMFSourceReader but I get the error:

0xc00d36e5 The operation on the current offset is not permitted

when I attempt to create a source reader using MFCreateSourceReaderFromURL.

I have the HEIF and HEVC extensions installed on my system and I was able to open HEIC files using Windows photo viewer. I was able to read HEVC encoded MKV and MP4 files using IMFSourceReader on the same system.

Does it matter which Microsoft Account I used to install these extensions on my system?

Is this the right interface to use? Or do I have to use a third-party library such as libheif or Nokia's HEIF Reader/Writer to parse the HEIF structure and use IMFByteStream to supply the raw HEVC stream to create a source reader?

Any help would be appreciated.

Regards,

Dinesh

BasicPlayback Sample source code?


Hi all,

I'm looking for this source code after much reading about WMF, but I can't find it. I'm getting started with WMF. I'm not a software developer, but I've made a few programs for myself with Windows Forms and WPF. I want to learn to play back video/audio files, so I've been reading about WMF. I found the Microsoft WMF documentation, and this example seems pretty basic to start with the pipeline. I understand that most of the code is C++, which is fine (C# or C++ are both OK with me). A lot of the stuff in the documentation seems rather old (2007, 2015, etc.).

Is this sample outdated, so I should try something else? If not, where can I find the sample code?

BTW, I have Visual Studio Community 2017 and 2019 installed on my laptop (upgraded from Win7 to Win10).

Thanks

How to start using WMF C++, C#: WPF, WIN32, UWP?


Hi all,

I'm trying to learn WMF and I'm very confused. I'm not a software guru. I keep reading that WMF is mostly a mix of COM and C++. Just now I found some samples (https://github.com/Microsoft/Windows-universal-samples) that seem to be for UWP. I know nothing about UWP. I've seen the option in VS 2019 and I installed it. I can do (at my own level) C++ or C# and a bit of XML. I've never done COM. I have VS Community 2017 and 2019 installed. Could someone advise me? I'm not looking to become a pro; I'm interested in playing video files in my projects (for Windows 10).

Thanks. 
