Channel: Media Foundation Development for Windows Desktop forum

Why are MF_MT_FRAME_RATE_RANGE_MIN and MF_MT_FRAME_RATE_RANGE_MAX equal?


I'm writing a simple program to pull every frame from a webcam, but I found that the following calls

MFGetAttributeRatio(pType, MF_MT_FRAME_RATE_RANGE_MIN, &w, &h);
MFGetAttributeRatio(pType, MF_MT_FRAME_RATE_RANGE_MAX, &w, &h);
MFGetAttributeSize(pType, MF_MT_FRAME_SIZE, &w, &h);

return exactly the same value on the same IMFMediaType. How can I find out the frame rate range of a capture media type?
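A minimal sketch (MediaFoundation.Net managed wrapper, assuming MFAttributesClsid exposes the range keys) that reads the two range attributes into separate variables; the ratio is packed into a UINT64 with the numerator in the high 32 bits and the denominator in the low 32 bits.

using System;
using MediaFoundation;
using MediaFoundation.Misc;

static class FrameRateRange
{
    // Unpack a ratio attribute: numerator in the high 32 bits, denominator in the low 32 bits.
    static void GetRatio(IMFMediaType pType, Guid key, out int num, out int den)
    {
        long packed;
        int hr = pType.GetUINT64(key, out packed);
        MFError.ThrowExceptionForHR(hr);
        num = (int)(packed >> 32);
        den = (int)(packed & 0xFFFFFFFF);
    }

    public static void Dump(IMFMediaType pType)
    {
        int minNum, minDen, maxNum, maxDen;
        // MF_MT_FRAME_RATE_RANGE_MIN/_MAX are assumed to be defined on MFAttributesClsid.
        GetRatio(pType, MFAttributesClsid.MF_MT_FRAME_RATE_RANGE_MIN, out minNum, out minDen);
        GetRatio(pType, MFAttributesClsid.MF_MT_FRAME_RATE_RANGE_MAX, out maxNum, out maxDen);
        Console.WriteLine("Range: {0}/{1} to {2}/{3}", minNum, minDen, maxNum, maxDen);
    }
}

Reading MIN and MAX into distinct variables also rules out the possibility that reusing the same &w/&h pair in the native helper calls is what makes them look identical.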


DLNA in Windows 7 64 bit - newly added files not available on devices.


Dear readers,

My apologies if this is the wrong forum. I have a 64-bit Windows machine. Using resources on the internet, I was able to configure the media files (music, photos, and videos) in different folders to be playable on my DLNA-compatible Sony BD-570 Blu-ray player and Samsung 40" LCD TV.

Two weeks ago, I created a folder under that shared directory and added a few more MP3 files. The new folder and those media files are not visible downstairs on my TV or my Blu-ray player.

Is there an extra step I need to take before the newly created subfolder becomes available downstairs?

 

Thanks for your help.

Regards,

Ravi.



Need the TopoEdit source code for Windows 8 to understand DX11VideoRenderer


I'm attempting to understand how to use DX11VideoRenderer.dll and need the Windows 8 TopoEdit source code to see how that DLL is used. The source code for TopoEdit was provided in the Windows 7 SDK and Windows Server 2008 SDK, but those versions don't use DX11VideoRenderer.dll.

Thanks!



David K McKinney


CColorConvertDMO


Compilation error in a Media Foundation application:

The type or namespace name 'CColorConvertDMO' could not be found (are you missing a using directive or an assembly reference?)

at the following line of code:

hr = MFExtern.MFTRegisterLocalByCLSID(typeof(CColorConvertDMO).GUID, MFTransformCategory.MFT_CATEGORY_VIDEO_PROCESSOR, "", MFT_EnumFlag.SyncMFT, 0, null, 0, null);
Where is the 'CColorConvertDMO' CLSID located in C#?
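MediaFoundation.Net does not declare the coclass, so one option is a stub declaration whose only purpose is to carry the CLSID. This is a sketch; the GUID below is believed to be CLSID_CColorConvertDMO from wmcodecdsp.h and should be verified against that header.

using System;
using System.Runtime.InteropServices;

// Stub for the Color Converter DSP coclass (declared natively in wmcodecdsp.h/.idl).
// Verify the GUID against CLSID_CColorConvertDMO in wmcodecdsp.h before relying on it.
[ComImport, Guid("98230571-0087-4204-b020-3282538e57d3")]
class CColorConvertDMO
{
}

With this declaration in scope, typeof(CColorConvertDMO).GUID yields the CLSID and the MFTRegisterLocalByCLSID call above compiles.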

Kinect Sensor Compatibility

Does the Kinect sensor work with the MFCaptureToFile and MFCaptureD3D samples?

Supported Video Capture Devices


The Microsoft docs about video capture in Media Foundation state that:

"In Windows 7, Microsoft Media Foundation now supports audio and video capture. Video capture devices are supported through the UVC class driver and must be compatible with UVC 1.1."

1) Are there any lists containing Media Foundation compatible video capture devices, which I can choose from?

2) Are digital cameras compatible, or is support limited to webcams?

MFT H.264 encoder Windows 8 SetOutputType E_FAIL, GetLastError 127


I cannot get the MFT H.264 encoder to work on Windows 8 with the exact same setup parameters that work fine in Windows 7.

After setting the output type with all the required parameters (same as Win7), the encoder's SetOutputType returns E_FAIL, which is not documented for this function. GetLastError afterwards returns 127 ("The specified procedure could not be found"), but I don't know whether this is relevant to the E_FAIL.

Can anyone think of what might be different, or of additional requirements for the MFT H.264 encoder on Windows 8 relative to Windows 7? The documentation says the Windows 8 version supports a lot more useful functionality (High Profile, B-frames, and CABAC, to name a few), but I am not using any of that.
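A sketch of the kind of output type being described, written with MediaFoundation.Net; the resolution, frame rate, bitrate, and profile values are placeholder assumptions, not the original code, and the attribute names are assumed to be exposed on MFAttributesClsid. Error checks are collapsed for brevity (real code should call MFError.ThrowExceptionForHR after every call).

// pEncoder is assumed to be the H.264 encoder IMFTransform.
IMFMediaType pOutType;
int hr = MFExtern.MFCreateMediaType(out pOutType);

hr = pOutType.SetGUID(MFAttributesClsid.MF_MT_MAJOR_TYPE, MFMediaType.Video);
hr = pOutType.SetGUID(MFAttributesClsid.MF_MT_SUBTYPE, MFMediaType.H264);
hr = pOutType.SetUINT32(MFAttributesClsid.MF_MT_AVG_BITRATE, 2000000);
hr = pOutType.SetUINT64(MFAttributesClsid.MF_MT_FRAME_SIZE, ((long)1280 << 32) | 720);   // 1280x720
hr = pOutType.SetUINT64(MFAttributesClsid.MF_MT_FRAME_RATE, ((long)30 << 32) | 1);       // 30/1 fps
hr = pOutType.SetUINT32(MFAttributesClsid.MF_MT_INTERLACE_MODE, (int)MFVideoInterlaceMode.Progressive);
hr = pOutType.SetUINT32(MFAttributesClsid.MF_MT_MPEG2_PROFILE, 77); // eAVEncH264VProfile_Main

hr = pEncoder.SetOutputType(0, pOutType, MFTSetTypeFlags.None);     // E_FAIL on Windows 8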


Color Converter DSP


I get the following error when compiling my application:

The type or namespace name 'CColorConvertDMO' could not be found (are you missing a using directive or an assembly reference?)

which relates to the following line of code:

typeof(CColorConvertDMO).GUID,

Where is the CColorConvertDMO CLSID located and how do you access it?

CColorConvert info: http://msdn.microsoft.com/en-gb/library/windows/desktop/ff819079%28v=vs.85%29.aspx


Using MS Media Foundation for Win8 Metro or WP8 to Decode Custom H.264 Video


Hi,

I'm a novice with C++ and MS Media Foundation, but I have a task to play the following raw video samples in a Windows 8 Metro app and on Windows Phone 8.

a) 360p H264 video:http://www.msh-tools.com/ardrone/dump81.h264
image: http://www.msh-tools.com/ardrone/81.png
b) 720p H264 video:http://www.msh-tools.com/ardrone/dump83.h264
image: http://www.msh-tools.com/ardrone/83.png

These videos can't be played in a common video player; they need the ffplay software, because the header was customized.

The clue I have is that I need to write a source which takes the video frames, strips the header, and sends them downstream.
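On the desktop side of Media Foundation, that per-frame step would look roughly like the sketch below (MediaFoundation.Net, plus System.Runtime.InteropServices); WrapFrame, headerSize, and timestamp100ns are hypothetical names for this illustration, not part of any existing API.

// Turn one raw frame (custom header + H.264 payload) into an IMFSample for the downstream decoder.
static IMFSample WrapFrame(byte[] frame, int headerSize, long timestamp100ns)
{
    int payloadSize = frame.Length - headerSize;

    IMFMediaBuffer pBuffer;
    int hr = MFExtern.MFCreateMemoryBuffer(payloadSize, out pBuffer);
    MFError.ThrowExceptionForHR(hr);

    IntPtr pData;
    int maxLen, curLen;
    hr = pBuffer.Lock(out pData, out maxLen, out curLen);
    MFError.ThrowExceptionForHR(hr);
    Marshal.Copy(frame, headerSize, pData, payloadSize); // skip the custom header
    hr = pBuffer.Unlock();
    MFError.ThrowExceptionForHR(hr);
    hr = pBuffer.SetCurrentLength(payloadSize);
    MFError.ThrowExceptionForHR(hr);

    IMFSample pSample;
    hr = MFExtern.MFCreateSample(out pSample);
    MFError.ThrowExceptionForHR(hr);
    hr = pSample.AddBuffer(pBuffer);
    MFError.ThrowExceptionForHR(hr);
    hr = pSample.SetSampleTime(timestamp100ns);
    MFError.ThrowExceptionForHR(hr);

    return pSample;
}

For a Windows 8 Metro or Windows Phone 8 app the plumbing is different (a media stream source in the app model rather than a desktop IMFMediaStream), but the header-stripping idea is the same.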

I've seen the following link and its code sample, but I still have no idea how to start:
http://msdn.microsoft.com/en-us/library/windowsphone/develop/jj207074(v=vs.105).aspx

Can anybody help me, please?

Thanks

msmpeg2vdec.dll compatibility problem, older versions do not support I420 YUV


I have written a decoder using Microsoft's MFT H.264 decoder contained in msmpeg2vdec.dll. On my home and dev systems, this is version 12.0.9200.16426, dated 1/3/2013.

On a clean Windows 7 QA system, however, the decoder does not work as documented. The version there is 6.1.7140.0, dated 07/13/2009. I found the problem as well: the four-year-old decoder does not support I420 as the YUV output format, but NV12 works.

It seems virtual machines all have this problem, including my Parallels Windows 7 VM on a MacBook with all Windows Updates applied. So I am guessing the updated msmpeg2vdec.dll ships with DirectX or video driver updates, but who knows ... there is no documentation for these revisions or for how the decoder gets updated.
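A defensive workaround is to pick the output format from what the installed decoder actually offers instead of hard-coding I420. A MediaFoundation.Net sketch, assuming pDecoder is the decoder IMFTransform and that the wrapper exposes the I420 and NV12 subtype GUIDs on MFMediaType:

// Prefer I420 if the installed decoder offers it, otherwise fall back to NV12.
Guid[] preferred = { MFMediaType.I420, MFMediaType.NV12 };
bool typeSet = false;
foreach (Guid want in preferred)
{
    for (int i = 0; !typeSet; i++)
    {
        IMFMediaType pAvail;
        int hr = pDecoder.GetOutputAvailableType(0, i, out pAvail);
        if (hr < 0)
            break;   // typically MF_E_NO_MORE_TYPES once the list is exhausted

        Guid sub;
        hr = pAvail.GetGUID(MFAttributesClsid.MF_MT_SUBTYPE, out sub);
        MFError.ThrowExceptionForHR(hr);

        if (sub == want)
        {
            hr = pDecoder.SetOutputType(0, pAvail, MFTSetTypeFlags.None);
            MFError.ThrowExceptionForHR(hr);
            typeSet = true;
        }
    }
    if (typeSet)
        break;
}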



Rendering a bitmap over EVR9 video


Hi,

Is it possible to render custom bitmaps (with alpha) over the video stream output by the EVR without using IVMRMixerBitmap9::SetAlphaBitmap?
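The EVR has its own counterpart to that VMR-9 interface: IMFVideoMixerBitmap, exposed by the renderer as a service. A minimal sketch of obtaining it with MediaFoundation.Net, assuming the wrapper exposes the interface and the MR_VIDEO_MIXER_SERVICE GUID (evrObject stands for whatever hosts the EVR, such as the media session or the filter):

// Ask the EVR for its alpha-bitmap mixing service.
object o;
int hr = MFExtern.MFGetService(
    evrObject,
    MFServices.MR_VIDEO_MIXER_SERVICE,
    typeof(IMFVideoMixerBitmap).GUID,
    out o);
MFError.ThrowExceptionForHR(hr);
IMFVideoMixerBitmap pMixerBitmap = (IMFVideoMixerBitmap)o;

// pMixerBitmap.SetAlphaBitmap takes an MFVideoAlphaBitmap describing a GDI DC or a
// Direct3D surface plus alpha and color-key parameters; see the EVR documentation.

Whether this counts as avoiding SetAlphaBitmap depends on the requirement, since it is still a bitmap-mixing call, just the EVR's version rather than the VMR-9's.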

Need to support a requirement of capturing photo images from a live webcam video feed


Hi,

We have a requirement to capture photo images from a live webcam video feed. We tried DirectShow, and it works fine.

But we learned that DirectShow has been superseded by Microsoft Media Foundation, and we don't want to take the risk of using the older technology (DirectShow).

We have been trying the samples provided in the SDK, such as MFCaptureToFile and MFCaptureD3D, but we couldn't achieve our functionality with them.

Could you please answer the queries below?

1. Can we use DirectShow itself for Windows 7 and Windows 8?

2. What are the pseudo-logic steps to develop similar webcam capture in Microsoft Media Foundation? (See the sketch after this list.)

3. Is Media Foundation very roundabout for this requirement, rather than straightforward like DirectShow?
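A rough outline for query 2, as a MediaFoundation.Net sketch: enumerate the video capture devices, activate one as a media source, wrap it in a source reader, and pull frames. Attribute and helper names are assumed to match the wrapper (the VIDCAP value GUID in particular may live on a different helper class), so treat this as a sketch rather than working code.

// 1. Ask for video capture devices only.
IMFAttributes pConfig;
int hr = MFExtern.MFCreateAttributes(out pConfig, 1);
MFError.ThrowExceptionForHR(hr);
hr = pConfig.SetGUID(
    MFAttributesClsid.MF_DEVSOURCE_ATTRIBUTE_SOURCE_TYPE,
    MFAttributesClsid.MF_DEVSOURCE_ATTRIBUTE_SOURCE_TYPE_VIDCAP_GUID);
MFError.ThrowExceptionForHR(hr);

// 2. Enumerate the webcams and activate the first one as an IMFMediaSource.
IMFActivate[] ppDevices;
int count;
hr = MFExtern.MFEnumDeviceSources(pConfig, out ppDevices, out count);
MFError.ThrowExceptionForHR(hr);

object oSource;
hr = ppDevices[0].ActivateObject(typeof(IMFMediaSource).GUID, out oSource);
MFError.ThrowExceptionForHR(hr);

// 3. Wrap the source in a source reader.
IMFSourceReader pReader;
hr = MFExtern.MFCreateSourceReaderFromMediaSource((IMFMediaSource)oSource, null, out pReader);
MFError.ThrowExceptionForHR(hr);

// 4. Loop on pReader.ReadSample(MF_SOURCE_READER_FIRST_VIDEO_STREAM, ...); when the user
//    presses "take photo", keep the current IMFSample and encode its buffer (for example
//    with the WIC JPEG encoder) to produce the still image.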


Session SetTopology error


Hi,

I am new to Media Foundation and am currently building a live streaming application based on the basic playback sample.

For the application, I create a custom media source, since the video frames I receive have an additional header around them, and I also need to perform some handshaking with the camera before I start receiving the video frames.

Using Media Foundation, I managed to create my custom source, plus a source node and an output node connected to each other.

However, I get an error when setting the topology on the session.

The error is: HRESULT = 0xC00D36E6, MF_E_ATTRIBUTENOTFOUND ("The requested attribute was not found").

I am also using the managed wrapper from MediaFoundation.Net.

I have included my code below.

Player class

/****************************************************************************
While the underlying libraries are covered by LGPL, this sample is released
as public domain.  It is distributed in the hope that it will be useful, but
WITHOUT ANY WARRANTY; without even the implied warranty of MERCHANTABILITY
or FITNESS FOR A PARTICULAR PURPOSE.
*****************************************************************************/

using System;
using System.Threading;
using System.Diagnostics;
using System.Runtime.InteropServices;

using MediaFoundation;
using MediaFoundation.EVR;
using MediaFoundation.Misc;
using System.IO;
using LiveSource;

class CPlayer : COMBase, IMFAsyncCallback
{
    #region externs

    [DllImport("user32", CharSet = CharSet.Auto)]
    private extern static int PostMessage(
        IntPtr handle, int msg, IntPtr wParam, IntPtr lParam);

    #endregion

    #region Declarations

    const int WM_APP = 0x8000;
    const int WM_APP_ERROR = WM_APP + 2;
    const int WM_APP_NOTIFY = WM_APP + 1;
    const int WAIT_TIMEOUT = 258;

    const int MF_VERSION = 0x10070;

    public enum PlayerState
    {
        Ready = 0,
        OpenPending,
        Started,
        PausePending,
        Paused,
        StartPending,
    }

    #endregion

    public CPlayer(IntPtr hVideo, IntPtr hEvent)
    {
        TRACE(("CPlayer::CPlayer"));

        Debug.Assert(hVideo != IntPtr.Zero);
        Debug.Assert(hEvent != IntPtr.Zero);

        m_pSession = null;
        m_pSource = null;
        m_pVideoDisplay = null;
        m_hwndVideo = hVideo;
        m_hwndEvent = hEvent;
        m_state = PlayerState.Ready;

        m_hCloseEvent = new AutoResetEvent(false);

        int hr = MFExtern.MFStartup(0x10070, MFStartup.Full);
        MFError.ThrowExceptionForHR(hr);
    }

#if DEBUG
    // Destructor is private. Caller should call Release.
    ~CPlayer()
    {
        Debug.Assert(m_pSession == null);  // If FALSE, the app did not call Shutdown().
    }
#endif

    #region Public methods

    public int OpenURL(string sURL)
    {
        TRACE("CPlayer::OpenURL");
        TRACE("URL = " + sURL);

        // 1. Create a new media session.
        // 2. Create the media source.
        // 3. Create the topology.
        // 4. Queue the topology [asynchronous]
        // 5. Start playback [asynchronous - does not happen in this method.]

        int hr = S_Ok;
        try
        {
            IMFTopology pTopology = null;

            // Create the media session.
            CreateSession();

            // Create the media source.
            CreateMediaSource(sURL);

            // Create a partial topology.
            CreateTopologyFromSource(out pTopology);

            // Set the topology on the media session.
            hr = m_pSession.SetTopology(0, pTopology);
            MFError.ThrowExceptionForHR(hr);

            // Set our state to "open pending"
            m_state = PlayerState.OpenPending;
            NotifyState();

            SafeRelease(pTopology);

            // If SetTopology succeeded, the media session will queue an
            // MESessionTopologySet event.
        }
        catch (Exception ce)
        {
            hr = Marshal.GetHRForException(ce);
            NotifyError(hr);
            m_state = PlayerState.Ready;
        }

        return hr;
    }

    public int PlayLiveStream()
    {
        TRACE("CPlayer::OpenURL");
        //TRACE("URL = " + sURL);

        // 1. Create a new media session.
        // 2. Create the media source.
        // 3. Create the topology.
        // 4. Queue the topology [asynchronous]
        // 5. Start playback [asynchronous - does not happen in this method.]

        int hr = S_Ok;
        try
        {
            IMFTopology pTopology = null;

            // Create the media session.
            CreateSession();

            // Create the media source.
            CreateLiveMediasource();

            // Create a partial topology.
            CreateTopologyFromSource(out pTopology);

            // Set the topology on the media session.
            hr = m_pSession.SetTopology(0, pTopology);
            MFError.ThrowExceptionForHR(hr);

           // ((LiveMediaSource)m_pSource).Start();
            // Set our state to "open pending"
            m_state = PlayerState.Ready;
            NotifyState();

            SafeRelease(pTopology);

            // If SetTopology succeeded, the media session will queue an
            // MESessionTopologySet event.
        }
        catch (Exception ce)
        {
            hr = Marshal.GetHRForException(ce);
            NotifyError(hr);
            m_state = PlayerState.Ready;
        }

        return hr;
    }

    public int Play()
    {
        TRACE("CPlayer::Play");

        if (m_state != PlayerState.Paused)
        {
            return E_Fail;
        }
        if (m_pSession == null || m_pSource == null)
        {
            return E_Unexpected;
        }

        int hr = S_Ok;

        try
        {
            StartPlayback();

            m_state = PlayerState.StartPending;
            NotifyState();
        }
        catch (Exception ce)
        {
            hr = Marshal.GetHRForException(ce);
            NotifyError(hr);
        }

        return hr;
    }

    public int Pause()
    {
        TRACE("CPlayer::Pause");

        if (m_state != PlayerState.Started)
        {
            return E_Fail;
        }
        if (m_pSession == null || m_pSource == null)
        {
            return E_Unexpected;
        }

        int hr = S_Ok;

        try
        {
            hr = m_pSession.Pause();
            MFError.ThrowExceptionForHR(hr);

            m_state = PlayerState.PausePending;
            NotifyState();
        }
        catch (Exception ce)
        {
            hr = Marshal.GetHRForException(ce);
            NotifyError(hr);
        }

        return hr;
    }

    public int Shutdown()
    {
        TRACE("CPlayer::ShutDown");

        int hr = S_Ok;

        try
        {
            if (m_hCloseEvent != null)
            {
                // Close the session
                CloseSession();

                // Shutdown the Media Foundation platform
                hr = MFExtern.MFShutdown();
                MFError.ThrowExceptionForHR(hr);

                m_hCloseEvent.Close();
                m_hCloseEvent = null;
            }
        }
        catch (Exception ce)
        {
            hr = Marshal.GetHRForException(ce);
        }

        return hr;
    }

    // Video functionality
    public int Repaint()
    {
        int hr = S_Ok;

        if (m_pVideoDisplay != null)
        {
            try
            {
                hr = m_pVideoDisplay.RepaintVideo();
                MFError.ThrowExceptionForHR(hr);
            }
            catch (Exception ce)
            {
                hr = Marshal.GetHRForException(ce);
            }
        }

        return hr;
    }

    public int ResizeVideo(short width, short height)
    {
        int hr = S_Ok;
        TRACE(string.Format("ResizeVideo: {0}x{1}", width, height));

        if (m_pVideoDisplay != null)
        {
            try
            {
                MFRect rcDest = new MFRect();
                MFVideoNormalizedRect nRect = new MFVideoNormalizedRect();

                nRect.left = 0;
                nRect.right = 1;
                nRect.top = 0;
                nRect.bottom = 1;
                rcDest.left = 0;
                rcDest.top = 0;
                rcDest.right = width;
                rcDest.bottom = height;

                hr = m_pVideoDisplay.SetVideoPosition(nRect, rcDest);
                MFError.ThrowExceptionForHR(hr);
            }
            catch (Exception ce)
            {
                hr = Marshal.GetHRForException(ce);
            }
        }

        return hr;
    }

    public PlayerState GetState()
    {
        return m_state;
    }

    public bool HasVideo()
    {
        return (m_pVideoDisplay != null);
    }

    #endregion

    #region IMFAsyncCallback Members

    int IMFAsyncCallback.GetParameters(out MFASync pdwFlags, out MFAsyncCallbackQueue pdwQueue)
    {
        pdwFlags = MFASync.FastIOProcessingCallback;
        pdwQueue = MFAsyncCallbackQueue.Standard;
        //throw new COMException("IMFAsyncCallback.GetParameters not implemented in Player", E_NotImplemented);

        return S_Ok;
    }

    int IMFAsyncCallback.Invoke(IMFAsyncResult pResult)
    {
        int hr;
        IMFMediaEvent pEvent = null;
        MediaEventType meType = MediaEventType.MEUnknown;  // Event type
        int hrStatus = 0;           // Event status
        MFTopoStatus TopoStatus = MFTopoStatus.Invalid; // Used with MESessionTopologyStatus event.

        try
        {
            // Get the event from the event queue.
            hr = m_pSession.EndGetEvent(pResult, out pEvent);
            MFError.ThrowExceptionForHR(hr);

            // Get the event type.
            hr = pEvent.GetType(out meType);
            MFError.ThrowExceptionForHR(hr);

            // Get the event status. If the operation that triggered the event did
            // not succeed, the status is a failure code.
            hr = pEvent.GetStatus(out hrStatus);
            MFError.ThrowExceptionForHR(hr);

            TRACE(string.Format("Media event: " + meType.ToString()));

            // Check if the async operation succeeded.
            if (Succeeded(hrStatus))
            {
                // Switch on the event type. Update the internal state of the CPlayer as needed.
                switch (meType)
                {
                    case MediaEventType.MESessionTopologyStatus:
                        // Get the status code.
                        int i;
                        hr = pEvent.GetUINT32(MFAttributesClsid.MF_EVENT_TOPOLOGY_STATUS, out i);
                        MFError.ThrowExceptionForHR(hr);
                        TopoStatus = (MFTopoStatus)i;
                        switch (TopoStatus)
                        {
                            case MFTopoStatus.Ready:
                                OnTopologyReady(pEvent);
                                break;
                            default:
                                // Nothing to do.
                                break;
                        }
                        break;

                    case MediaEventType.MESessionStarted:
                        OnSessionStarted(pEvent);
                        break;

                    case MediaEventType.MESessionPaused:
                        OnSessionPaused(pEvent);
                        break;

                    case MediaEventType.MESessionClosed:
                        OnSessionClosed(pEvent);
                        break;

                    case MediaEventType.MEEndOfPresentation:
                        OnPresentationEnded(pEvent);
                        break;
                }
            }
            else
            {
                // The async operation failed. Notify the application
                NotifyError(hrStatus);
            }
        }
        finally
        {
            // Request another event.
            if (meType != MediaEventType.MESessionClosed)
            {
                hr = m_pSession.BeginGetEvent(this, null);
                MFError.ThrowExceptionForHR(hr);
            }

            SafeRelease(pEvent);
        }

        return S_Ok;
    }

    #endregion

    #region Protected methods

    // NotifyState: Notifies the application when the state changes.
    protected void NotifyState()
    {
        PostMessage(m_hwndEvent, WM_APP_NOTIFY, new IntPtr((int)m_state), IntPtr.Zero);
    }

    // NotifyState: Notifies the application when an error occurs.
    protected void NotifyError(int hr)
    {
        TRACE("NotifyError: 0x" + hr.ToString("X"));
        m_state = PlayerState.Ready;
        PostMessage(m_hwndEvent, WM_APP_ERROR, new IntPtr(hr), IntPtr.Zero);
    }

    protected void CreateSession()
    {
        // Close the old session, if any.
        CloseSession();

        // Create the media session.
        int hr = MFExtern.MFCreateMediaSession(null, out m_pSession);
        MFError.ThrowExceptionForHR(hr);

        // Start pulling events from the media session
        hr = m_pSession.BeginGetEvent(this, null);
        MFError.ThrowExceptionForHR(hr);
    }

    protected void CloseSession()
    {
        int hr;

        if (m_pVideoDisplay != null)
        {
            Marshal.ReleaseComObject(m_pVideoDisplay);
            m_pVideoDisplay = null;
        }

        if (m_pSession != null)
        {
            hr = m_pSession.Close();
            MFError.ThrowExceptionForHR(hr);

            // Wait for the close operation to complete
            bool res = m_hCloseEvent.WaitOne(5000, true);
            if (!res)
            {
                TRACE(("WaitForSingleObject timed out!"));
            }
        }

        // Complete shutdown operations

        // 1. Shut down the media source
        if (m_pSource != null)
        {
            hr = m_pSource.Shutdown();
            MFError.ThrowExceptionForHR(hr);
            SafeRelease(m_pSource);
            m_pSource = null;
        }

        // 2. Shut down the media session. (Synchronous operation, no events.)
        if (m_pSession != null)
        {
            hr = m_pSession.Shutdown();
            MFError.ThrowExceptionForHR(hr);
            Marshal.ReleaseComObject(m_pSession);
            m_pSession = null;
        }
    }

    protected void StartPlayback()
    {
        TRACE("CPlayer::StartPlayback");

        Debug.Assert(m_pSession != null);

        int hr = m_pSession.Start(Guid.Empty, new PropVariant());
        MFError.ThrowExceptionForHR(hr);
    }

    protected void CreateMediaSource(string sURL)
    {
        TRACE("CPlayer::CreateMediaSource");

        IMFSourceResolver pSourceResolver;
        object pSource;

        // Create the source resolver.
        int hr = MFExtern.MFCreateSourceResolver(out pSourceResolver);
        MFError.ThrowExceptionForHR(hr);

        try
        {
            // Use the source resolver to create the media source.
            MFObjectType ObjectType = MFObjectType.Invalid;

            hr = pSourceResolver.CreateObjectFromURL(
                    sURL,                       // URL of the source.
                    MFResolution.MediaSource,   // Create a source object.
                    null,                       // Optional property store.
                    out ObjectType,             // Receives the created object type.
                    out pSource                 // Receives a pointer to the media source.
                );
            MFError.ThrowExceptionForHR(hr);

            // Get the IMFMediaSource interface from the media source.
            m_pSource = (IMFMediaSource)pSource;
        }
        finally
        {
            // Clean up
            Marshal.ReleaseComObject(pSourceResolver);
        }
    }

    protected void CreateLiveMediasource()
    {
        TRACE("CPlayer::CreateMediaSource");

        try
        {
            // Get the IMFMediaSource interface from the media source.
            m_pSource = new LiveMediaSource();
        }
        finally
        {

        }
    }

    protected void CreateTopologyFromSource(out IMFTopology ppTopology)
    {
        TRACE("CPlayer::CreateTopologyFromSource");

        Debug.Assert(m_pSession != null);
        Debug.Assert(m_pSource != null);

        IMFTopology pTopology = null;
        IMFPresentationDescriptor pSourcePD = null;
        int cSourceStreams = 0;

        int hr;

        try
        {
            // Create a new topology.
            hr = MFExtern.MFCreateTopology(out pTopology);
            MFError.ThrowExceptionForHR(hr);

            // Create the presentation descriptor for the media source.
            hr = m_pSource.CreatePresentationDescriptor(out pSourcePD);
            MFError.ThrowExceptionForHR(hr);

            // Get the number of streams in the media source.
            hr = pSourcePD.GetStreamDescriptorCount(out cSourceStreams);
            MFError.ThrowExceptionForHR(hr);

            TRACE(string.Format("Stream count: {0}", cSourceStreams));

            // For each stream, create the topology nodes and add them to the topology.
            for (int i = 0; i < cSourceStreams; i++)
            {
                AddBranchToPartialTopology(pTopology, pSourcePD, i);
            }

            // Return the IMFTopology pointer to the caller.
            ppTopology = pTopology;
        }
        catch
        {
            // If we failed, release the topology
            SafeRelease(pTopology);
            throw;
        }
        finally
        {
            SafeRelease(pSourcePD);
        }
    }


    protected void AddBranchToPartialTopology(IMFTopology pTopology, IMFPresentationDescriptor pSourcePD, int iStream)
    {
        int hr;

        TRACE("CPlayer::AddBranchToPartialTopology");

        Debug.Assert(pTopology != null);

        IMFStreamDescriptor pSourceSD = null;
        IMFTopologyNode pSourceNode = null;
        IMFTopologyNode pOutputNode = null;
        bool fSelected = false;

        try
        {
            // Get the stream descriptor for this stream.
            hr = pSourcePD.GetStreamDescriptorByIndex(iStream, out fSelected, out pSourceSD);
            MFError.ThrowExceptionForHR(hr);

            // Create the topology branch only if the stream is selected.
            // Otherwise, do nothing.
            if (fSelected)
            {
                // Create a source node for this stream.
                CreateSourceStreamNode(pSourcePD, pSourceSD, out pSourceNode);

                // Create the output node for the renderer.
                CreateOutputNode(pSourceSD, out pOutputNode);

                // Add both nodes to the topology.
                hr = pTopology.AddNode(pSourceNode);
                MFError.ThrowExceptionForHR(hr);
                hr = pTopology.AddNode(pOutputNode);
                MFError.ThrowExceptionForHR(hr);

                // Connect the source node to the output node.
                hr = pSourceNode.ConnectOutput(0, pOutputNode, 0);
                MFError.ThrowExceptionForHR(hr);
            }
        }
        finally
        {
            // Clean up.
            SafeRelease(pSourceSD);
            SafeRelease(pSourceNode);
            SafeRelease(pOutputNode);
        }
    }

    protected void CreateSourceStreamNode(IMFPresentationDescriptor pSourcePD, IMFStreamDescriptor pSourceSD, out IMFTopologyNode ppNode)
    {
        Debug.Assert(m_pSource != null);

        int hr;
        IMFTopologyNode pNode = null;

        try
        {
            // Create the source-stream node.
            hr = MFExtern.MFCreateTopologyNode(MFTopologyType.SourcestreamNode, out pNode);
            MFError.ThrowExceptionForHR(hr);

            // Set attribute: Pointer to the media source.
            hr = pNode.SetUnknown(MFAttributesClsid.MF_TOPONODE_SOURCE, m_pSource);
            MFError.ThrowExceptionForHR(hr);

            // Set attribute: Pointer to the presentation descriptor.
            hr = pNode.SetUnknown(MFAttributesClsid.MF_TOPONODE_PRESENTATION_DESCRIPTOR, pSourcePD);
            MFError.ThrowExceptionForHR(hr);

            // Set attribute: Pointer to the stream descriptor.
            hr = pNode.SetUnknown(MFAttributesClsid.MF_TOPONODE_STREAM_DESCRIPTOR, pSourceSD);
            MFError.ThrowExceptionForHR(hr);

            // Return the IMFTopologyNode pointer to the caller.
            ppNode = pNode;
        }
        catch
        {
            // If we failed, release the pnode
            SafeRelease(pNode);
            throw;
        }
    }

    protected void CreateOutputNode(IMFStreamDescriptor pSourceSD, out IMFTopologyNode ppNode)
    {
        IMFTopologyNode pNode = null;
        IMFMediaTypeHandler pHandler = null;
        IMFActivate pRendererActivate = null;

        Guid guidMajorType = Guid.Empty;
        int hr = S_Ok;

        // Get the stream ID.
        int streamID = 0;

        try
        {
            try
            {
                hr = pSourceSD.GetStreamIdentifier(out streamID); // Just for debugging, ignore any failures.
                MFError.ThrowExceptionForHR(hr);
            }
            catch
            {
                TRACE("IMFStreamDescriptor::GetStreamIdentifier" + hr.ToString());
            }

            // Get the media type handler for the stream.
            hr = pSourceSD.GetMediaTypeHandler(out pHandler);
            MFError.ThrowExceptionForHR(hr);
            
            // Get the major type of the stream from the media type handler.
            hr = pHandler.GetMajorType(out guidMajorType);
            MFError.ThrowExceptionForHR(hr);

            // Create a downstream node.
            hr = MFExtern.MFCreateTopologyNode(MFTopologyType.OutputNode, out pNode);
            MFError.ThrowExceptionForHR(hr);

            // Create an IMFActivate object for the renderer, based on the media type.
            if (MFMediaType.Audio == guidMajorType)
            {
                // Create the audio renderer.
                TRACE(string.Format("Stream {0}: audio stream", streamID));
                hr = MFExtern.MFCreateAudioRendererActivate(out pRendererActivate);
                MFError.ThrowExceptionForHR(hr);
            }
            else if (MFMediaType.Video == guidMajorType)
            {
                // Create the video renderer.
                TRACE(string.Format("Stream {0}: video stream", streamID));
                hr = MFExtern.MFCreateVideoRendererActivate(m_hwndVideo, out pRendererActivate);
                MFError.ThrowExceptionForHR(hr);
            }
            else
            {
                TRACE(string.Format("Stream {0}: Unknown format", streamID));
                throw new COMException("Unknown format", E_Fail);
            }

            // Set the IActivate object on the output node.
            hr = pNode.SetObject(pRendererActivate);
            MFError.ThrowExceptionForHR(hr);

            // Return the IMFTopologyNode pointer to the caller.
            ppNode = pNode;
        }
        catch
        {
            // If we failed, release the pNode
            SafeRelease(pNode);
            throw;
        }
        finally
        {
            // Clean up.
            SafeRelease(pHandler);
            SafeRelease(pRendererActivate);
        }
    }

    // Media event handlers
    protected void OnTopologyReady(IMFMediaEvent pEvent)
    {
        int hr;
        object o;
        TRACE("CPlayer::OnTopologyReady");

        // Ask for the IMFVideoDisplayControl interface.
        // This interface is implemented by the EVR and is
        // exposed by the media session as a service.

        // Note: This call is expected to fail if the source
        // does not have video.

        try
        {
            hr = MFExtern.MFGetService(
                m_pSession,
                MFServices.MR_VIDEO_RENDER_SERVICE,
                typeof(IMFVideoDisplayControl).GUID,
                out o
                );
            MFError.ThrowExceptionForHR(hr);

            m_pVideoDisplay = o as IMFVideoDisplayControl;
        }
        catch (InvalidCastException)
        {
            m_pVideoDisplay = null;
        }

        try
        {
            StartPlayback();
        }
        catch (Exception ce)
        {
            hr = Marshal.GetHRForException(ce);
            NotifyError(hr);
        }

        // If we succeeded, the Start call is pending. Don't notify the app yet.
    }

    protected void OnSessionStarted(IMFMediaEvent pEvent)
    {
        TRACE("CPlayer::OnSessionStarted");

        m_state = PlayerState.Started;
        NotifyState();
    }

    protected void OnSessionPaused(IMFMediaEvent pEvent)
    {
        TRACE("CPlayer::OnSessionPaused");

        m_state = PlayerState.Paused;
        NotifyState();
    }

    protected void OnSessionClosed(IMFMediaEvent pEvent)
    {
        TRACE("CPlayer::OnSessionClosed");

        // The application thread is waiting on this event, inside the
        // CPlayer::CloseSession method.
        m_hCloseEvent.Set();
    }

    protected void OnPresentationEnded(IMFMediaEvent pEvent)
    {
        TRACE("CPlayer::OnPresentationEnded");

        // The session puts itself into the stopped state automatically.

        m_state = PlayerState.Ready;
        NotifyState();
    }

    #endregion

    #region Member Variables

    protected IMFMediaSession m_pSession;
    protected IMFMediaSource m_pSource;
    protected IMFVideoDisplayControl m_pVideoDisplay;
    protected IMFMediaType m_pMediaType;

    protected IntPtr m_hwndVideo;       // Video window.
    protected IntPtr m_hwndEvent;       // App window to receive events.
    protected PlayerState m_state;          // Current state of the media session.
    protected AutoResetEvent m_hCloseEvent;     // Event to wait on while closing

    #endregion
}


Help regarding OPM (Output Protection Manager): need documentation and samples or code beyond the example code


I am new to Visual C++ and cannot find anything useful related to OPM (Output Protection Manager) to use in my Visual C++ project.

Is there anyone who can help me find samples or code regarding OPM?

I am building it as a DLL, so please help me with your suggestions.

I have the following code to start with, but I don't know where to implement it or which kind of project to use it with (DirectX or the Windows SDK).

-----------------------------------------------------------------------------------------

OPM_RANDOM_NUMBER random;        // Random number from driver.
ZeroMemory(&random, sizeof(random));

BYTE *pbCertificate = NULL;      // Pointer to a buffer to hold the certificate.
ULONG cbCertificate = 0;         // Size of the certificate in bytes.
PUBLIC_KEY_VALUES *pKey = NULL;  // The driver's public key.

// Get the driver's certificate chain + random number.
hr = pVideoOutput->StartInitialization(&random, &pbCertificate, &cbCertificate);
if (FAILED(hr)) { goto done; }

// Validate the X.509 certificate. (Not shown.)
hr = ValidateX509Certificate(pbCertificate, cbCertificate);
if (FAILED(hr)) { goto done; }

// Get the public key from the certificate. (Not shown.)
hr = GetPublicKeyFromCertificate(pbCertificate, cbCertificate, &pKey);
if (FAILED(hr)) { goto done; }

// Load and initialize a CNG provider (Cryptography API: Next Generation).
BCRYPT_ALG_HANDLE hAlg = 0;
hr = BCryptOpenAlgorithmProvider(&hAlg, BCRYPT_RSA_ALGORITHM, MS_PRIMITIVE_PROVIDER, 0);
if (FAILED(hr)) { goto done; }

// Import the RSA public key into the CNG provider.
BCRYPT_KEY_HANDLE hPublicKey = 0;
hr = ImportRsaPublicKey(hAlg, pKey, &hPublicKey);
if (FAILED(hr)) { goto done; }

---------------------------------------------------------

I don't know how or where to use this code. It says a driver that supports OPM is needed for it to work.

H.264 decoder with MFT

I have a topology that renders an MP4 file. I create a source and a sink node, an H.264 decoder is inserted correctly, and everything works fine in my application and also in TopoEdit (Render File...). Once I try to insert a basic MFT into the topology, I no longer get any video. The same MFT works with WMV files. Looking at the output from MFTrace (when trying the MP4), I can see that all the nodes are connected correctly. The output subtype from the H.264 decoder is NV12, which works fine with the other file types, but the topology can't be resolved. Do I need to force the preferred input type on the MFT, or manually construct the topology?
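For reference, manually wiring the MFT in as a transform node looks roughly like the sketch below (MediaFoundation.Net); pSourceNode and pOutputNode stand for the nodes the existing branch-building code already creates, and myMft is the custom IMFTransform.

// Create a transform node for the custom MFT and splice it between source and renderer.
IMFTopologyNode pTransformNode;
int hr = MFExtern.MFCreateTopologyNode(MFTopologyType.TransformNode, out pTransformNode);
MFError.ThrowExceptionForHR(hr);

hr = pTransformNode.SetObject(myMft);
MFError.ThrowExceptionForHR(hr);

hr = pTopology.AddNode(pTransformNode);
MFError.ThrowExceptionForHR(hr);

// source -> custom MFT -> renderer; the topology loader still has to resolve the
// H.264 decoder (and any converters) in between.
hr = pSourceNode.ConnectOutput(0, pTransformNode, 0);
MFError.ThrowExceptionForHR(hr);
hr = pTransformNode.ConnectOutput(0, pOutputNode, 0);
MFError.ThrowExceptionForHR(hr);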

WriteSample takes long time to finish


I have a very interesting issue.

Basically, the sink writer's WriteSample takes a long time to finish, sometimes 3-4 seconds.

Let me explain:

I am copying one MP4 file to another, like mfcopy does. When I do a straight-up copy, it finishes fast, just as expected.

But if I read only MF_SOURCE_READER_FIRST_VIDEO_STREAM from the source reader and write those samples to the new MP4 file, the writes take a very long time. It starts off fast, but as it reads further into the file, each write takes 3-4 seconds.

Can anybody provide some clues as to why this might be happening?
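One documented behavior worth ruling out: by default the sink writer throttles WriteSample by blocking the calling thread when samples arrive faster than the media sink consumes them. A MediaFoundation.Net sketch of disabling that when creating the writer; the attribute name as exposed by the wrapper is an assumption, and this is a diagnostic step rather than a guaranteed fix.

// Create the sink writer with throttling disabled to see whether that is what blocks WriteSample.
IMFAttributes pAttr;
int hr = MFExtern.MFCreateAttributes(out pAttr, 1);
MFError.ThrowExceptionForHR(hr);

hr = pAttr.SetUINT32(MFAttributesClsid.MF_SINK_WRITER_DISABLE_THROTTLING, 1);
MFError.ThrowExceptionForHR(hr);

IMFSinkWriter pWriter;
hr = MFExtern.MFCreateSinkWriterFromURL("output.mp4", null, pAttr, out pWriter);
MFError.ThrowExceptionForHR(hr);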

Direct3D surface creation for a custom EVR presenter


My custom presenter is based on the SDK example; it uses IDirect3DDevice9Ex::CreateAdditionalSwapChain and MFCreateVideoSampleFromSurface to get samples to send to the mixer's ProcessOutput. Recently I discovered the IDirectXVideoProcessorService::CreateSurface function.

What is the difference between using IDirectXVideoProcessorService::CreateSurface and IDirect3DDevice9Ex::CreateAdditionalSwapChain?

// Carl

MFTrace bug?


I am trying to use the MFTrace tool to get an idea of what my loaded topology looks like.  I intend to use a hardware H.264 encoder available on my machine. The application runs fine in the debugger and normally from Explorer, with the hardware encoder working its magic.  However, when I run my application under MFTrace, I run into an Access Violation exception.  It appears that this happens when I try to resolve a partial topology created by MFCreateTranscodeTopology with IMFTopoLoader::Load. The part of the trace that is logged before the exception shows a call to ActivateObject and subsequently MFGetMFTMerit.

I tried isolating the exception and found that I could cause it to happen simply by enumerating the hardware encoder and trying to activate it. The trace output is the same.
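That minimal repro looks roughly like the sketch below (MediaFoundation.Net; the enum and field names are assumed to match the wrapper). Run standalone it succeeds; under mftrace it is where the access violation appears.

// Enumerate hardware H.264 encoders and activate the first one.
MFTRegisterTypeInfo outInfo = new MFTRegisterTypeInfo();
outInfo.guidMajorType = MFMediaType.Video;
outInfo.guidSubtype = MFMediaType.H264;

IMFActivate[] ppActivate;
int count;
int hr = MFExtern.MFTEnumEx(
    MFTransformCategory.MFT_CATEGORY_VIDEO_ENCODER,
    MFT_EnumFlag.Hardware,
    null,          // any input type
    outInfo,       // H.264 output
    out ppActivate,
    out count);
MFError.ThrowExceptionForHR(hr);

object oEncoder;
hr = ppActivate[0].ActivateObject(typeof(IMFTransform).GUID, out oEncoder);   // crashes under mftrace
MFError.ThrowExceptionForHR(hr);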

It seems to me that there may be a bug in how MFTrace hooks itself into all the MF libraries.  Or could there be something wrong with the hardware encoder - I am using Intel's QuickSync H.264 encoder provided with Sandy Bridge chips (on a Core i7 3rd gen processor)?

Network sink stops receiving data


I have a working transcode topology as follows:

Camera --> MJPG Decoder --> Custom YUY2 MFT --> Intel Preprocessing MFT --> Intel H264 Encoder --> MP4 File Sink

This works great, giving me 30 fps for 720p video. I am now trying to replace the file sink with a network streaming solution. I have bolted on the streaming sample from Chapter 9 of Developing Microsoft Media Foundation Applications. The topology resolves just fine, and bytes start spewing out of the custom byte stream object. However, after about 48 KB of data, the writes to the custom IMFByteStream object stop happening. I can't find any events or errors associated with this; it just stops.

I am unable to use MFTrace to debug this topology, see my other post for details:
http://social.msdn.microsoft.com/Forums/en-US/mediafoundationdevelopment/thread/d3edac2d-f1ae-456a-9c33-6494bbe0ebd5

Is there something I am missing? Could there be something I am not setting up properly, given that I am using a live source instead of a file?

Decoding and displaying MPEG-4 Part 2 frames


Hi, 

I have created a simple project that shows a live stream (an MPEG-4 stream) from an IP camera using Media Foundation.

I have created a custom media source and media stream. Everything is working correctly so far (I assume) in terms of starting the session and setting the topology, as samples are requested by the session. Yet I do not get the decoded frames that are meant to be displayed. Is there something I am missing or doing incorrectly? Any suggestions or help would be appreciated.

Thanks
