Channel: Media Foundation Development for Windows Desktop forum
Viewing all 1079 articles

JavaScript: capture user selection on "Are you sure you want to leave this page?"


As per my requirement, I need to write code to capture the IE 9 close event. I am able to capture the close event; now the requirement is to identify the user's selection on the alert window, i.e., how can we identify "Stay on this page" versus "Leave this page" using JavaScript?


Thanks


Obtaining display aspect ratio from AVI files


Hi, I am reading SD PAL-sized AVI files that have 4:3 and 16:9 display aspect ratios, but I am not able to read this from the file.

When I call MFGetAttributeRatio() with the GUID MF_MT_PIXEL_ASPECT_RATIO, it returns 1:1. The MFVideoFormat also contains 1:1, and the aspect ratio in VideoInfo2 is 5:4 for both the 4:3 and 16:9 files. I understand that the aspect ratio in VideoInfo2 is the storage aspect ratio, that I need the pixel aspect ratio, and that multiplying the two gives the display aspect ratio, but I have not been successful in reading the pixel aspect ratio. Also, the interlace field is always progressive, which is incorrect; it should be interlaced.

Analysing the output from MFTrace when attached to Windows Media Player on Windows 7, I can see that the MF_MT_PIXEL_ASPECT_RATIO value on calls to SetInputType and SetOutputType changes from 1:1 to 16:15 and then 64:45; the latter, when multiplied with 5:4, gives 16:9.

I have started using Media Foundation and have read the article http://msdn.microsoft.com/en-us/library/bb530115(v=VS.85).aspx, but it does not explain how to obtain the source display aspect ratio I am after.

Any suggestions and help are much appreciated, as I think I am not the only person struggling with this. To give a little more information: I am using Media Foundation to import AVI files into our editor software. I can read all other video information, such as size and frame rate, but I need to know the interlace mode and display aspect ratio before reading the samples into our application.

E_UNEXPECTED in MESessionStarted event when topology has two branches


I am working on a session topology that records video (with audio) while showing the video stream with an EVR. I am using SampleGrabber sinks for the video and audio streams so I can easily control when I start/end recording (using a SinkWriter) and extract sample data as I need to (for capturing stills and displaying the microphone's audio level). I am using a Tee node to split the video feed between the EVR and the video SampleGrabber, and a copier MFT to deliver samples to the EVR.

Each branch of the topology works by itself, but when I add both to the topology the media session fails to start, with status="E_UNEXPECTED Catastrophic Error" set on the MESessionStarted event. That is the only error I find in the MFTrace logs. My camera never turns on and I don't receive any samples in my SampleGrabbers. The topology seems to resolve correctly, so I'm not sure what the session didn't expect. The documentation for IMFMediaSession::Start doesn't mention this error. My current guess is that it has something to do with syncing the presentation clock, but setting the SampleGrabber output nodes as rateless doesn't seem to help. After I receive the Topology Ready status event, checking the session's clock state returns MFCLOCK_STATE_INVALID.

Is there any special setup required to have a live camera source and a microphone source in the same topology? Is there any way to get more information on the E_UNEXPECTED error?

Any help on getting to the bottom of this is appreciated. Here is a snippet of my mftrace log which shows the ready topology and the error:

12656,14E8 19:25:29.69496 CTopologyHelpers::Trace @02A799F8 >>>>>>>>>>>>> ready topology
12656,14E8 19:25:29.69499 CTopologyHelpers::TraceNode @ Node 0 @02A7A110 ID:317000000001, 0 inputs, 1 outputs, type 1, MF_TOPONODE_MARKIN_HERE=1;MF_TOPONODE_MARKOUT_HERE=1;MF_TOPONODE_MEDIASTART=0 (0,0);MF_TOPONODE_SOURCE=@02A95200;MF_TOPONODE_PRESENTATION_DESCRIPTOR=@02A63820;MF_TOPONODE_STREAM_DESCRIPTOR=@02A662C8;{89485B85-2FFA-4547-B269-B82C79EE197C}=1;{9C86CC4E-68CE-4CFF-AA1E-9A5A40D5B4E0}=@02A85698
12656,14E8 19:25:29.69500 CMFTopologyNodeDetours::GetGUID @02A7A110 attribute not found guidKey = MF_TOPONODE_TRANSFORM_OBJECTID
12656,14E8 19:25:29.69501 CTopologyHelpers::TraceObject @ Source @02A95200 {00000000-0000-0000-0000-000000000000} (C:\WINDOWS\SYSTEM32\MFCORE.DLL), MFMEDIASOURCE_CHARACTERISTICS=0x00000005
12656,14E8 19:25:29.69507 CTopologyHelpers::TraceStream @ Output stream 0, connected to node @02A7A188 stream 0, MT: MF_MT_FRAME_SIZE=2748779069920 (640,480);MF_MT_AVG_BITRATE=147456000;MF_MT_YUV_MATRIX=2;MF_MT_MAJOR_TYPE=MEDIATYPE_Video;MF_MT_VIDEO_LIGHTING=3;MF_MT_DEFAULT_STRIDE=1280;MF_MT_VIDEO_CHROMA_SITING=6;MF_MT_AM_FORMAT_TYPE=FORMAT_VIDEOINFO2;MF_MT_FIXED_SIZE_SAMPLES=1;MF_MT_VIDEO_NOMINAL_RANGE=2;MF_MT_FRAME_RATE=128849018881 (30,1);MF_MT_PIXEL_ASPECT_RATIO=4294967297 (1,1);MF_MT_ALL_SAMPLES_INDEPENDENT=1;MF_MT_FRAME_RATE_RANGE_MIN=128849018881 (30,1);MF_MT_SAMPLE_SIZE=614400;MF_MT_VIDEO_PRIMARIES=2;MF_MT_INTERLACE_MODE=2;MF_MT_FRAME_RATE_RANGE_MAX=128849018881 (30,1);MF_MT_SUBTYPE=MFVideoFormat_YUY2
12656,14E8 19:25:29.69509 CTopologyHelpers::TraceNode @ Node 1 @02A7A188 ID:317000000002, 1 inputs, 2 outputs, type 3, MF_TOPONODE_PRIMARYOUTPUT=0;{89485B85-2FFA-4547-B269-B82C79EE197C}=1;{9C86CC4E-68CE-4CFF-AA1E-9A5A40D5B4E0}=@02A7CE00
12656,14E8 19:25:29.69509 CMFTopologyNodeDetours::GetGUID @02A7A188 attribute not found guidKey = MF_TOPONODE_TRANSFORM_OBJECTID
12656,14E8 19:25:29.69509 CTopologyHelpers::TraceObject @ Tee @00000000 {00000000-0000-0000-0000-000000000000} ((null)), (null)
12656,14E8 19:25:29.69514 CTopologyHelpers::TraceStream @ Input stream 0, connected to node @02A7A110 stream 0, MT: MF_MT_FRAME_SIZE=2748779069920 (640,480);MF_MT_AVG_BITRATE=147456000;MF_MT_YUV_MATRIX=2;MF_MT_MAJOR_TYPE=MEDIATYPE_Video;MF_MT_VIDEO_LIGHTING=3;MF_MT_DEFAULT_STRIDE=1280;MF_MT_VIDEO_CHROMA_SITING=6;MF_MT_AM_FORMAT_TYPE=FORMAT_VIDEOINFO2;MF_MT_FIXED_SIZE_SAMPLES=1;MF_MT_VIDEO_NOMINAL_RANGE=2;MF_MT_FRAME_RATE=128849018881 (30,1);MF_MT_PIXEL_ASPECT_RATIO=4294967297 (1,1);MF_MT_ALL_SAMPLES_INDEPENDENT=1;MF_MT_FRAME_RATE_RANGE_MIN=128849018881 (30,1);MF_MT_SAMPLE_SIZE=614400;MF_MT_VIDEO_PRIMARIES=2;MF_MT_INTERLACE_MODE=2;MF_MT_FRAME_RATE_RANGE_MAX=128849018881 (30,1);MF_MT_SUBTYPE=MFVideoFormat_YUY2
12656,14E8 19:25:29.69518 CTopologyHelpers::TraceStream @ Output stream 0, connected to node @02A79D68 stream 0, MT: MF_MT_FRAME_SIZE=2748779069920 (640,480);MF_MT_AVG_BITRATE=147456000;MF_MT_YUV_MATRIX=2;MF_MT_MAJOR_TYPE=MEDIATYPE_Video;MF_MT_VIDEO_LIGHTING=3;MF_MT_DEFAULT_STRIDE=1280;MF_MT_VIDEO_CHROMA_SITING=6;MF_MT_AM_FORMAT_TYPE=FORMAT_VIDEOINFO2;MF_MT_FIXED_SIZE_SAMPLES=1;MF_MT_VIDEO_NOMINAL_RANGE=2;MF_MT_FRAME_RATE=128849018881 (30,1);MF_MT_PIXEL_ASPECT_RATIO=4294967297 (1,1);MF_MT_ALL_SAMPLES_INDEPENDENT=1;MF_MT_FRAME_RATE_RANGE_MIN=128849018881 (30,1);MF_MT_SAMPLE_SIZE=614400;MF_MT_VIDEO_PRIMARIES=2;MF_MT_INTERLACE_MODE=2;MF_MT_FRAME_RATE_RANGE_MAX=128849018881 (30,1);MF_MT_SUBTYPE=MFVideoFormat_YUY2
12656,14E8 19:25:29.69522 CTopologyHelpers::TraceStream @ Output stream 1, connected to node @02A7B1F0 stream 0, MT: MF_MT_FRAME_SIZE=2748779069920 (640,480);MF_MT_AVG_BITRATE=147456000;MF_MT_YUV_MATRIX=2;MF_MT_MAJOR_TYPE=MEDIATYPE_Video;MF_MT_VIDEO_LIGHTING=3;MF_MT_DEFAULT_STRIDE=1280;MF_MT_VIDEO_CHROMA_SITING=6;MF_MT_AM_FORMAT_TYPE=FORMAT_VIDEOINFO2;MF_MT_FIXED_SIZE_SAMPLES=1;MF_MT_VIDEO_NOMINAL_RANGE=2;MF_MT_FRAME_RATE=128849018881 (30,1);MF_MT_PIXEL_ASPECT_RATIO=4294967297 (1,1);MF_MT_ALL_SAMPLES_INDEPENDENT=1;MF_MT_FRAME_RATE_RANGE_MIN=128849018881 (30,1);MF_MT_SAMPLE_SIZE=614400;MF_MT_VIDEO_PRIMARIES=2;MF_MT_INTERLACE_MODE=2;MF_MT_FRAME_RATE_RANGE_MAX=128849018881 (30,1);MF_MT_SUBTYPE=MFVideoFormat_YUY2
12656,14E8 19:25:29.69524 CTopologyHelpers::TraceNode @ Node 2 @02A79D68 ID:317000000005, 1 inputs, 1 outputs, type 2, {89485B85-2FFA-4547-B269-B82C79EE197C}=1;{9C86CC4E-68CE-4CFF-AA1E-9A5A40D5B4E0}=@02A85908
12656,14E8 19:25:29.69524 CMFTopologyNodeDetours::GetGUID @02A79D68 attribute not found guidKey = MF_TOPONODE_TRANSFORM_OBJECTID
12656,14E8 19:25:29.69526 CTopologyHelpers::TraceObject @ MFT @02A6DAE8 {00000000-0000-0000-0000-000000000000} (C:\WINDOWS\SYSTEM32\MFCORE.DLL), MFT_SUPPORT_DYNAMIC_FORMAT_CHANGE=1;{851745D5-C3D6-476D-9527-498EF2D10D18}=4
12656,14E8 19:25:29.69531 CTopologyHelpers::TraceStream @ Input stream 0, connected to node @02A7A188 stream 0, MT: MF_MT_FRAME_SIZE=2748779069920 (640,480);MF_MT_AVG_BITRATE=147456000;MF_MT_YUV_MATRIX=2;MF_MT_MAJOR_TYPE=MEDIATYPE_Video;MF_MT_VIDEO_LIGHTING=3;MF_MT_DEFAULT_STRIDE=1280;MF_MT_VIDEO_CHROMA_SITING=6;MF_MT_AM_FORMAT_TYPE=FORMAT_VIDEOINFO2;MF_MT_FIXED_SIZE_SAMPLES=1;MF_MT_VIDEO_NOMINAL_RANGE=2;MF_MT_FRAME_RATE=128849018881 (30,1);MF_MT_PIXEL_ASPECT_RATIO=4294967297 (1,1);MF_MT_ALL_SAMPLES_INDEPENDENT=1;MF_MT_FRAME_RATE_RANGE_MIN=128849018881 (30,1);MF_MT_SAMPLE_SIZE=614400;MF_MT_VIDEO_PRIMARIES=2;MF_MT_INTERLACE_MODE=2;MF_MT_FRAME_RATE_RANGE_MAX=128849018881 (30,1);MF_MT_SUBTYPE=MFVideoFormat_YUY2
12656,14E8 19:25:29.69536 CTopologyHelpers::TraceStream @ Output stream 0, connected to node @02A79EA0 stream 0, MT: MF_MT_FRAME_SIZE=2748779069920 (640,480);MF_MT_AVG_BITRATE=147456000;MF_MT_YUV_MATRIX=2;MF_MT_MAJOR_TYPE=MEDIATYPE_Video;MF_MT_VIDEO_LIGHTING=3;MF_MT_DEFAULT_STRIDE=1280;MF_MT_VIDEO_CHROMA_SITING=6;MF_MT_AM_FORMAT_TYPE=FORMAT_VIDEOINFO2;MF_MT_FIXED_SIZE_SAMPLES=1;MF_MT_VIDEO_NOMINAL_RANGE=2;MF_MT_FRAME_RATE=128849018881 (30,1);MF_MT_PIXEL_ASPECT_RATIO=4294967297 (1,1);MF_MT_ALL_SAMPLES_INDEPENDENT=1;MF_MT_FRAME_RATE_RANGE_MIN=128849018881 (30,1);MF_MT_SAMPLE_SIZE=614400;MF_MT_VIDEO_PRIMARIES=2;MF_MT_INTERLACE_MODE=2;MF_MT_FRAME_RATE_RANGE_MAX=128849018881 (30,1);MF_MT_SUBTYPE=MFVideoFormat_YUY2
12656,14E8 19:25:29.69538 CTopologyHelpers::TraceNode @ Node 3 @02A79DE0 ID:317000000003, 1 inputs, 0 outputs, type 0, MF_TOPONODE_STREAMID=0;MF_TOPONODE_DISABLE_PREROLL=1;{89485B85-2FFA-4547-B269-B82C79EE197C}=1;{9C86CC4E-68CE-4CFF-AA1E-9A5A40D5B4E0}=@02A85AE0
12656,14E8 19:25:29.69539 CMFTopologyNodeDetours::GetGUID @02A79DE0 attribute not found guidKey = MF_TOPONODE_TRANSFORM_OBJECTID
12656,14E8 19:25:29.69539 CTopologyHelpers::TraceObject @ Sink @02A6FB28 {00000000-0000-0000-0000-000000000000} (C:\WINDOWS\SYSTEM32\MFCORE.DLL), (null)
12656,14E8 19:25:29.69544 CTopologyHelpers::TraceStream @ Input stream 0, connected to node @02A7B1F0 stream 0, MT: MF_MT_FRAME_SIZE=2748779069920 (640,480);MF_MT_YUV_MATRIX=2;MF_MT_MAJOR_TYPE=MEDIATYPE_Video;MF_MT_VIDEO_LIGHTING=3;MF_MT_DEFAULT_STRIDE=2560;MF_MT_VIDEO_CHROMA_SITING=6;MF_MT_GEOMETRIC_APERTURE=00 00 00 00 00 00 00 00 80 02 00 00 e0 01 00 00 ;MF_MT_FIXED_SIZE_SAMPLES=1;MF_MT_VIDEO_NOMINAL_RANGE=2;MF_MT_FRAME_RATE=42949672960333333 (10000000,333333);MF_MT_PIXEL_ASPECT_RATIO=4294967297 (1,1);MF_MT_ALL_SAMPLES_INDEPENDENT=1;MF_MT_SAMPLE_SIZE=1228800;MF_MT_VIDEO_PRIMARIES=2;MF_MT_INTERLACE_MODE=2;MF_MT_SUBTYPE=MFVideoFormat_RGB32
12656,14E8 19:25:29.69546 CTopologyHelpers::TraceNode @ Node 4 @02A7B1F0 ID:31700000000C, 1 inputs, 1 outputs, type 2, MF_TOPONODE_TRANSFORM_OBJECTID={98230571-0087-4204-B020-3282538E57D3};{89485B85-2FFA-4547-B269-B82C79EE197C}=1;{9C86CC4E-68CE-4CFF-AA1E-9A5A40D5B4E0}=@02A86040
12656,14E8 19:25:29.69546 CTopologyHelpers::TraceObject @ MFT @02A7A488 {98230571-0087-4204-B020-3282538E57D3} (C:\Windows\SYSTEM32\colorcnv.dll), <NULL>
12656,14E8 19:25:29.69551 CTopologyHelpers::TraceStream @ Input stream 0, connected to node @02A7A188 stream 1, MT: MF_MT_FRAME_SIZE=2748779069920 (640,480);MF_MT_AVG_BITRATE=147456000;MF_MT_YUV_MATRIX=2;MF_MT_MAJOR_TYPE=MEDIATYPE_Video;MF_MT_VIDEO_LIGHTING=3;MF_MT_DEFAULT_STRIDE=1280;MF_MT_VIDEO_CHROMA_SITING=6;MF_MT_FIXED_SIZE_SAMPLES=1;MF_MT_VIDEO_NOMINAL_RANGE=2;MF_MT_FRAME_RATE=128849018881 (30,1);MF_MT_PIXEL_ASPECT_RATIO=4294967297 (1,1);MF_MT_ALL_SAMPLES_INDEPENDENT=1;MF_MT_SAMPLE_SIZE=614400;MF_MT_VIDEO_PRIMARIES=2;MF_MT_INTERLACE_MODE=2;MF_MT_SUBTYPE=MFVideoFormat_YUY2
12656,14E8 19:25:29.69556 CTopologyHelpers::TraceStream @ Output stream 0, connected to node @02A79DE0 stream 0, MT: MF_MT_FRAME_SIZE=2748779069920 (640,480);MF_MT_YUV_MATRIX=2;MF_MT_MAJOR_TYPE=MEDIATYPE_Video;MF_MT_VIDEO_LIGHTING=3;MF_MT_DEFAULT_STRIDE=2560;MF_MT_VIDEO_CHROMA_SITING=6;MF_MT_GEOMETRIC_APERTURE=00 00 00 00 00 00 00 00 80 02 00 00 e0 01 00 00 ;MF_MT_FIXED_SIZE_SAMPLES=1;MF_MT_VIDEO_NOMINAL_RANGE=2;MF_MT_FRAME_RATE=42949672960333333 (10000000,333333);MF_MT_PIXEL_ASPECT_RATIO=4294967297 (1,1);MF_MT_ALL_SAMPLES_INDEPENDENT=1;MF_MT_SAMPLE_SIZE=1228800;MF_MT_VIDEO_PRIMARIES=2;MF_MT_INTERLACE_MODE=2;MF_MT_SUBTYPE=MFVideoFormat_RGB32
12656,14E8 19:25:29.69559 CTopologyHelpers::TraceNode @ Node 5 @02A79EA0 ID:317000000004, 1 inputs, 0 outputs, type 0, MF_TOPONODE_STREAMID=0;MF_TOPONODE_NOSHUTDOWN_ON_REMOVE=0;MF_TOPONODE_DISABLE_PREROLL=1;{89485B85-2FFA-4547-B269-B82C79EE197C}=1;{9C86CC4E-68CE-4CFF-AA1E-9A5A40D5B4E0}=@02A86218;{B8AA3129-DFC9-423A-8ACD-1D82850A3D1F}=@02A8B9E0
12656,14E8 19:25:29.69560 CMFTopologyNodeDetours::GetGUID @02A79EA0 attribute not found guidKey = MF_TOPONODE_TRANSFORM_OBJECTID
12656,14E8 19:25:29.69560 CTopologyHelpers::TraceObject @ Sink @02A8E5D4 {00000000-0000-0000-0000-000000000000} (C:\WINDOWS\SYSTEM32\MF.dll), (null)
12656,14E8 19:25:29.69565 CTopologyHelpers::TraceStream @ Input stream 0, connected to node @02A79D68 stream 0, MT: MF_MT_FRAME_SIZE=2748779069920 (640,480);MF_MT_AVG_BITRATE=147456000;MF_MT_YUV_MATRIX=2;MF_MT_MAJOR_TYPE=MEDIATYPE_Video;MF_MT_VIDEO_LIGHTING=3;MF_MT_DEFAULT_STRIDE=1280;MF_MT_VIDEO_CHROMA_SITING=6;MF_MT_AM_FORMAT_TYPE=FORMAT_VIDEOINFO2;MF_MT_FIXED_SIZE_SAMPLES=1;MF_MT_VIDEO_NOMINAL_RANGE=2;MF_MT_FRAME_RATE=128849018881 (30,1);MF_MT_PIXEL_ASPECT_RATIO=4294967297 (1,1);MF_MT_ALL_SAMPLES_INDEPENDENT=1;MF_MT_FRAME_RATE_RANGE_MIN=128849018881 (30,1);MF_MT_SAMPLE_SIZE=614400;MF_MT_VIDEO_PRIMARIES=2;MF_MT_INTERLACE_MODE=2;MF_MT_FRAME_RATE_RANGE_MAX=128849018881 (30,1);MF_MT_SUBTYPE=MFVideoFormat_YUY2
12656,14E8 19:25:29.69568 CTopologyHelpers::TraceNode @ Node 6 @02A79F18 ID:317000000006, 0 inputs, 1 outputs, type 1, MF_TOPONODE_MARKIN_HERE=1;MF_TOPONODE_MARKOUT_HERE=1;MF_TOPONODE_MEDIASTART=0 (0,0);MF_TOPONODE_SOURCE=@02A65D40;MF_TOPONODE_PRESENTATION_DESCRIPTOR=@02A63CE8;MF_TOPONODE_STREAM_DESCRIPTOR=@02A63048;{89485B85-2FFA-4547-B269-B82C79EE197C}=1;{9C86CC4E-68CE-4CFF-AA1E-9A5A40D5B4E0}=@02A86370
12656,14E8 19:25:29.69568 CMFTopologyNodeDetours::GetGUID @02A79F18 attribute not found guidKey = MF_TOPONODE_TRANSFORM_OBJECTID
12656,14E8 19:25:29.69569 CTopologyHelpers::TraceObject @ Source @02A65D40 {00000000-0000-0000-0000-000000000000} (C:\WINDOWS\SYSTEM32\MFCORE.DLL), MFMEDIASOURCE_CHARACTERISTICS=0x00000005
12656,14E8 19:25:29.69571 CTopologyHelpers::TraceStream @ Output stream 0, connected to node @02A7BFA0 stream 0, MT: MF_MT_AUDIO_AVG_BYTES_PER_SECOND=384000;MF_MT_AUDIO_BLOCK_ALIGNMENT=8;MF_MT_AUDIO_NUM_CHANNELS=2;MF_MT_MAJOR_TYPE=MEDIATYPE_Audio;MF_MT_AUDIO_CHANNEL_MASK=3;MF_MT_AUDIO_SAMPLES_PER_SECOND=48000;MF_MT_ALL_SAMPLES_INDEPENDENT=1;MF_MT_AUDIO_BITS_PER_SAMPLE=32;MF_MT_SUBTYPE=MFAudioFormat_Float
12656,14E8 19:25:29.69573 CTopologyHelpers::TraceNode @ Node 7 @02A7BDC0 ID:317000000007, 1 inputs, 0 outputs, type 0, MF_TOPONODE_STREAMID=0;MF_TOPONODE_DISABLE_PREROLL=1;{89485B85-2FFA-4547-B269-B82C79EE197C}=1;{9C86CC4E-68CE-4CFF-AA1E-9A5A40D5B4E0}=@02A864E0
12656,14E8 19:25:29.69573 CMFTopologyNodeDetours::GetGUID @02A7BDC0 attribute not found guidKey = MF_TOPONODE_TRANSFORM_OBJECTID
12656,14E8 19:25:29.69574 CTopologyHelpers::TraceObject @ Sink @02A70410 {00000000-0000-0000-0000-000000000000} (C:\WINDOWS\SYSTEM32\MFCORE.DLL), (null)
12656,14E8 19:25:29.69575 CTopologyHelpers::TraceStream @ Input stream 0, connected to node @02A7BFA0 stream 0, MT: MF_MT_AUDIO_AVG_BYTES_PER_SECOND=88200;MF_MT_AUDIO_BLOCK_ALIGNMENT=2;MF_MT_AUDIO_NUM_CHANNELS=1;MF_MT_MAJOR_TYPE=MEDIATYPE_Audio;MF_MT_AUDIO_CHANNEL_MASK=4;MF_MT_AUDIO_SAMPLES_PER_SECOND=44100;MF_MT_ALL_SAMPLES_INDEPENDENT=1;MF_MT_AUDIO_BITS_PER_SAMPLE=16;MF_MT_SUBTYPE=MFAudioFormat_PCM
12656,14E8 19:25:29.69577 CTopologyHelpers::TraceNode @ Node 8 @02A7BFA0 ID:317000000010, 1 inputs, 1 outputs, type 2, MF_TOPONODE_TRANSFORM_OBJECTID={F447B69E-1884-4A7E-8055-346F74D6EDB3};{89485B85-2FFA-4547-B269-B82C79EE197C}=1;{9C86CC4E-68CE-4CFF-AA1E-9A5A40D5B4E0}=@02AAA068
12656,14E8 19:25:29.69578 CTopologyHelpers::TraceObject @ MFT @02A7C4AC {F447B69E-1884-4A7E-8055-346F74D6EDB3} (C:\Windows\SYSTEM32\resampledmo.dll), <NULL>
12656,14E8 19:25:29.69581 CTopologyHelpers::TraceStream @ Input stream 0, connected to node @02A79F18 stream 0, MT: MF_MT_AUDIO_AVG_BYTES_PER_SECOND=384000;MF_MT_AUDIO_BLOCK_ALIGNMENT=8;MF_MT_AUDIO_NUM_CHANNELS=2;MF_MT_MAJOR_TYPE=MEDIATYPE_Audio;MF_MT_AUDIO_CHANNEL_MASK=3;MF_MT_AUDIO_SAMPLES_PER_SECOND=48000;MF_MT_ALL_SAMPLES_INDEPENDENT=1;MF_MT_AUDIO_BITS_PER_SAMPLE=32;MF_MT_SUBTYPE=MFAudioFormat_Float
12656,14E8 19:25:29.69583 CTopologyHelpers::TraceStream @ Output stream 0, connected to node @02A7BDC0 stream 0, MT: MF_MT_AUDIO_AVG_BYTES_PER_SECOND=88200;MF_MT_AUDIO_BLOCK_ALIGNMENT=2;MF_MT_AUDIO_NUM_CHANNELS=1;MF_MT_MAJOR_TYPE=MEDIATYPE_Audio;MF_MT_AUDIO_SAMPLES_PER_SECOND=44100;MF_MT_AUDIO_PREFER_WAVEFORMATEX=1;MF_MT_ALL_SAMPLES_INDEPENDENT=1;MF_MT_AUDIO_BITS_PER_SAMPLE=16;MF_MT_SUBTYPE=MFAudioFormat_PCM
12656,14E8 19:25:29.69584 CTopologyHelpers::Trace @02A799F8 MF_TOPOLOGY_RESOLUTION_STATUS = 0
12656,14E8 19:25:29.69584 CTopologyHelpers::Trace @02A799F8 <<<<<<<<<<<<< ready topology
12656,14E8 19:25:29.69590 CKernel32ExportDetours::OutputDebugStringA @ TOPOLOGY READY
12656,14E8 19:25:29.69600 CMFTopologyDetours::GetUINT32 @02A799F8 attribute not found guidKey = {9C27891A-ED7A-40E1-88E8-B22727A024EE}
12656,14E8 19:25:29.69612 CMFMediaSessionDetours::EndGetEvent @02A6DDC0 Met=103 MESessionStarted, value (empty), failed HrStatus=8000FFFF E_UNEXPECTED, 


Intel hardware H.264 encoder MFT fails


Hi,

I'm trying to use Intel hardware H.264 encoder on my machine which has an Intel Core i5-2400 and latest Intel driver installed. 

The code I tested is based on the example here https://msdn.microsoft.com/en-us/library/windows/desktop/ff819477%28v=vs.85%29.aspx?f=255&MSPPError=-2147217396

The change I've made is:

1. changed the way the SinkWriter is created:

IMFAttributes *pMFAttributes = NULL;
hr = MFCreateAttributes(&pMFAttributes, 100);
hr = pMFAttributes->SetGUID(MF_TRANSCODE_CONTAINERTYPE, MFTranscodeContainerType_MPEG4); // format
hr = pMFAttributes->SetUINT32(MF_READWRITE_ENABLE_HARDWARE_TRANSFORMS, true); // enable hardware Encoder
hr = MFCreateSinkWriterFromURL(L"output.mp4", NULL, pMFAttributes, &pSinkWriter);

2. changed the encoder type and input media format

const GUID   VIDEO_ENCODING_FORMAT = MFVideoFormat_H264;
const GUID   VIDEO_INPUT_FORMAT = MFVideoFormat_NV12;

3. manually create a NV12 buffer (Y=150, U=45, V=23)

// NV12 is 1.5 Byte/Pixel
const LONG cbWidth = 1.5 * VIDEO_WIDTH;
const DWORD cbBuffer = cbWidth * VIDEO_HEIGHT;
IMFMediaBuffer *pBuffer = NULL;
BYTE *pData = NULL;
// Create a new memory buffer.
HRESULT hr = MFCreateMemoryBuffer(cbBuffer, &pBuffer);
// Lock the buffer and copy the video frame to the buffer.
if (SUCCEEDED(hr)) {
   hr = pBuffer->Lock(&pData, NULL, NULL);
}
if (SUCCEEDED(hr)) {
   for (int i = 0; i < VIDEO_WIDTH * VIDEO_HEIGHT; i++) {
      *(pData++) = 150;
   }
   for (int i = 0; i < VIDEO_WIDTH * VIDEO_HEIGHT/2; i++) { 
      if (i % 2 == 0) *(pData++) = 45;
      else *(pData++) = 23;
   }
}

Result:

  • When I ran the program, pSinkWriter->WriteSample failed after about 120 frames had been written successfully (each returning S_OK). The resulting MP4 file was only 32 bytes. In the MFTrace output I saw:
COle32ExportDetours::CoCreateInstance @ New MFT @01B509F0, MF_TRANSFORM_ASYNC=1;MFT_SUPPORT_DYNAMIC_FORMAT_CHANGE=1;MFT_ENUM_HARDWARE_URL_Attribute=AA243E5D-2F73-48c7-97F7-F6FA17651651
COle32ExportDetours::CoCreateInstance @ Created {4BE8D3C0-0515-4A37-AD55-E4BAE19AF471} Intel® Quick Sync Video H.264 Encoder MFT (C:\Program Files\Intel\Media SDK\mfx_mft_h264ve_w7_32.dll) @01B509F0 - traced interfaces: IMFTransform @01B509F0, 
  • When I instead used MFCreateSinkWriterFromURL(L"output.mp4", NULL, NULL, &pSinkWriter), so that a SinkWriter with the Microsoft software encoder was created, a correct MP4 file could be generated.

API to get requested video frames


Hi,

could you tell me what I have to do to get any encoded video frame I want via Media Foundation? At the moment I'm using the SourceReader API. It is simple but has its limitations.

For example, the ReadSample method always starts from the previous sync frame. If I want a specific B-frame or P-frame, I have to read all frames from the previous sync frame up to that specific frame, which takes a lot of time. It is too slow.

In my application, I ask the Media Foundation API for a specific encoded video frame, using the specific frame number. For example I want to get frame 345. In this situation the Media Foundation API should provide this frame to me - without the need to load all frames starting from the previous syncframe. Depending on the video source, there can be more than 500 frames between the previous syncframe and the requested frame.

Is there any chance to do that? Which APIs within Media Foundation do I have to use to get only the desired encoded video frame?

best regards

saoirse

Behavior of the Media Foundation Media Session in a Windows Store app when ProcessInput returns MF_E_NOTACCEPTING

Hello,

I am developing a multimedia application as a Windows Store app. I use my own MFTs for the audio/video decoders.
I noticed that the behavior of the Windows Store app MF Media Session differs between Windows 10 and Windows 8.1.

On Windows 10, the following steps occur when an MFT returns MF_E_NOTACCEPTING from a ProcessInput call:

1. The Media Session gives an IMFSample (A) to the MFT via ProcessInput.
2. The MFT returns MF_E_NOTACCEPTING because it is not accepting a new sample.
3. The Media Session retrieves an output IMFSample via ProcessOutput.
4. The Media Session gives the same IMFSample as in step 1 (A) to ProcessInput.

On Windows 8.1 or earlier, the next IMFSample (B) is given at step 4.

I did not test whether this behavior also occurs in desktop applications, because I do not have an MF application for the desktop.

My question is which behavior is correct in this situation. The MSDN documentation for IMFTransform::ProcessInput (https://msdn.microsoft.com/en-us/library/windows/desktop/ms703131%28v=vs.85%29.aspx) does not specify the behavior after MF_E_NOTACCEPTING is returned from a ProcessInput call.

kohei sakamoto

ProcessOutput: H.264 decoder problems


Hi,

I have some problems with the H.264 decoder. The behaviour of this decoder is quite different from other Microsoft MFT decoders such as the DV decoder, which is already working for me.

First of all, it seems to me that the case where H.264 has a GOP size of 1 is not recognized, because after calling ProcessOutput I always get the return value MF_E_TRANSFORM_NEED_MORE_INPUT.

After that I try to send MFT_MESSAGE_COMMAND_DRAIN and call ProcessOutput again. Now I get MF_E_TRANSFORM_STREAM_CHANGE. At the moment I have no clue why this happens. OK, I try to set the output media type again: I check the decoder with GetOutputAvailableType and set the desired output type. After that I call ProcessOutput again and get E_FAIL. This behaviour is not related to the first video frame of the file; it doesn't matter at which frame I start. Whatever the first frame sent to the decoder is, it has these problems; the second frame and onwards can be decoded without them.

Moreover, when setting the output type again after getting MF_E_TRANSFORM_STREAM_CHANGE, it fails if I call MFSetAttributeSize with the correct width and height values. The return value of SetOutputType is "the data specified for the media type is invalid, inconsistent, or not supported by this object". Are these attributes not allowed on the output type?

I hope someone of you can shed some light on it.

best regards

determine GOP size of video


Hi,

I want to determine whether a video has a GOP size bigger than 1. Which attribute can I use for this purpose? Using the SourceReader API, I get the IMFMediaType object for the video stream of my MPEG-4 or AVI video file. I have tested an H.264 video and a DV video.

Both the IMFMediaType attributes MF_MT_ALL_SAMPLES_INDEPENDENT and MF_MT_MAX_KEYFRAME_SPACING are not available. Are these attributes only used on the encoding side?

HRESULT hr = mediaType->GetUINT32 (MF_MT_ALL_SAMPLES_INDEPENDENT, &independent);

hr = mediaType->GetUINT32 (MF_MT_SAMPLE_SIZE, &sampleSize);

hr = mediaType->GetUINT32(MF_MT_MAX_KEYFRAME_SPACING, &spacing);

best regards

saoirse



PTS & DTS timestamp information using sourceReader


Hi,

using the SourceReader, I read out the encoded video samples. In addition to the IMFSample, I also get the DTS timestamp information. In a second step I send these samples to the decoder.

For example, my video player application wants to show frame 55 (in PTS order, of course) in a long-GOP video file. How and where can I get the information about how many frames I have to load with the SourceReader to get frame 55 in PTS order?

Using SetCurrentPosition should be the key to getting to the previous sync frame. To be honest, I do not know whether SetCurrentPosition expects the DTS time information or the PTS time information. Is there a hint anywhere on the web?

After using SetCurrentPosition, I should have the sync frame, but how many frames are needed to get to the requested frame 55 (in PTS order)? ReadSample returns no information about PTS; it only returns the DTS timestamp and the IMFSample containing the encoded video data.

Moreover, DTS and PTS can have different offset values, i.e. different values for the first frame in DTS versus PTS order. At the moment, the first call of ReadSample returns the DTS offset if you SetCurrentPosition to zero.

I hope someone of you can help me get the required number of frames (in DTS order) from the SourceReader to obtain the requested video frame from the decoder.

best regards

saoirse

FLAC metadata/tags being read incorrectly in Windows Media Player on Windows 10


I recently converted my lossless music to FLAC now that Windows 10 supports it natively. But when viewing these FLAC files in Windows Media Player, the year of the songs shows up as "unknown year", despite the year being correctly labelled. I've noticed that, compared to MP3 files, when you go to the Details tab in the properties of a FLAC file there's an extra tag called "Date released" below the Year tag; filling it in allows the year to be displayed within WMP, although this is quite a tedious solution.

From what it looks like, WMP isn't able to read the Vorbis comment DATE. There's also a problem with the artist's name not showing up in the contributing-artist section, probably for the same reason. I'm not sure how well WMP is able to read the Vorbis tagging system, but the ability to do so seems poorly implemented.

On a side note, there doesn't appear to be a standard Windows Media Player FLAC icon for when WMP is the default media player; instead it uses the M4A icon.




How to access a custom method on my CustomEVRPresenter


Hi:

    I'm able to have Media Foundation use the CustomEVRPresenter by:

CComPtr<IMFMediaSink> spRendererSink;

hr = spRendererActivate->SetGUID(MF_ACTIVATE_CUSTOM_VIDEO_PRESENTER_CLSID, CLSID_My_CustomEVRPresenter);

CHECK_HR( hr = spRendererActivate->ActivateObject(IID_IMFMediaSink, (void**) &spRendererSink) );

  

This creates an instance of the CustomEVRPresenter but does not provide a pointer, so I cannot call a custom method I wrote on the CustomEVRPresenter. Is there a way to do this? If I create another instance of the CustomEVRPresenter via CreateInstance, then of course I can access the method, but that instance is in addition to the one created by the SetGUID call above and is not the one Media Foundation is using to run the video. Any ideas?

  Mags


Behavior of the Sink Writer and Microsoft's H.264 Encoder with regard to DirectX Video Acceleration


Hi,

I would like to know how the Sink Writer and Microsoft's H.264 encoder behave when passing DX surfaces which are in the default pool (device memory). Normally I use media sessions, but for a small test I set up a very simple Sink Writer app and encoded a few frames.

I am passing my device manager pointer (IMFDXGIDeviceManager) to the Sink Writer using the MF_SINK_WRITER_D3D_MANAGER attribute. The encoding works flawlessly and seems to be very fast. What I am wondering is whether the Sink Writer sets the device manager directly on Microsoft's H.264 encoder using the MFT_MESSAGE_SET_D3D_MANAGER message, or whether it inserts a processing MFT in between.

If Microsoft's H.264 encoder accepts surfaces from video memory, how does it process/encode them internally? Does it encode them entirely in hardware, then transfer the result to system memory and hand out the ProcessOutput result? Or does it copy the bytes to system memory in the first place and then encode them?

I can't really believe the latter, because that would hurt performance very much; it uses the video service of my Direct3D device, I assume?

Regards

co0Kie

How do I engage HDCP for iTunes on Windows 10?

My iTunes movies will not play HD video on Windows 10; I get a message that HD movies will not play.

one second latency in MFT running in debug mode


Hey,

is there a known performance issue when using Media Foundation in a debug build?


I set up a synchronous MFT and get the first two decoded frames. After that I do not send any message to the MFT or anything else.

The next client call to ProcessOutput happens when the video player should start to play the video.

Running in debug mode, the video player has to skip a lot of video frames (approximately one second's worth) until the MFT decoder is fast enough. After that, the video player plays the video as expected. In a release build the MFT is fast enough that no video frame has to be skipped by the video player. In my code there are no debug output messages or code paths which run only in debug mode. I have also set the CODECAPI_AVLowLatencyMode attribute.


All Microsoft MFTs, especially DV and H.264, show this behaviour, though each slightly differently. In my opinion the thread setup is already complete by then, because I have already decoded two frames.

best regards

saoirse

system requirements for asynchronous MFTs


Hey,

could you tell me the hardware requirements for asynchronous MFTs? I have a Windows 8.1 system with an Nvidia Quadro video card, and there is no asynchronous MFT available for H.264.

IMFActivate **ppActivate = NULL;
UINT32 count = 0;

MFT_REGISTER_TYPE_INFO info = { MFMediaType_Video, MFVideoFormat_H264 };

UINT32 unFlags = MFT_ENUM_FLAG_ASYNCMFT | MFT_ENUM_FLAG_LOCALMFT |
        MFT_ENUM_FLAG_SORTANDFILTER;

hr = MFTEnumEx(MFT_CATEGORY_VIDEO_DECODER,
        unFlags,
        &info,      // Input type
        NULL,       // Output type
        &ppActivate,
        &count);

Are there any decoders from Microsoft that support asynchronous MFT processing (https://msdn.microsoft.com/en-us/library/windows/desktop/ff819077%28v=vs.85%29.aspx)?

best regards

saoirse



MF_MPEG4SINK_MOOV_BEFORE_MDAT

Hi,

The documentation on the MF_MPEG4SINK_MOOV_BEFORE_MDAT attribute is a bit unclear.
To configure the MPEG-4 sink to store the moov atom before the mdat atom, is this the intended way?


CComPtr<IMFMediaSink> mediaSink;
::MFCreateMPEG4MediaSink(byteStream, settings.videoOutput, settings.audioOutput, &mediaSink);

...

CComPtr<IMFStreamSink> videoSink;
mediaSink->GetStreamSinkByIndex(videoSinkIndex, &videoSink);

CComPtr<IMFAttributes> attri;
videoSink->QueryInterface(IID_PPV_ARGS(&attri));
attri->SetUINT32(MF_MPEG4SINK_MOOV_BEFORE_MDAT, TRUE);

People seem to have problems with this attribute: http://stackoverflow.com/questions/24085362/how-to-generate-moov-before-mdat-mp4-video-files-with-media-foundation

best regards,

Carl


closed captions for iPhone 5

I am deaf and need closed captions for my device and for my HP laptop running Windows 10.

Video segment looping.


Hi,

What is the recommended way of doing video segment looping within a file?

Let's say that I have a one-minute video, but for some reason I am only interested in the segment from 15 seconds to 25 seconds, and I want that segment to be displayed over and over again in a smooth loop.

Any recommendations?

// Carl

Display of several videos at the same time with media foundation


I am trying to display several videos at the same time in a Media Foundation application. As the video files are encrypted and encoded as H.264 without any container, I have developed a custom source reading from a byte stream.

Now I am trying to display several videos on the same screen, as in a classical surveillance application. Neither the best way to do that nor the sequence of steps is clear to me.

Can anyone tell me the best approach to solve this problem? Is there any sample to take a look and have an idea of the sequence of steps?

TrueColor technology with Media Foundation on Windows 7


Hello,

I am writing an application using WMF on Windows 7 to capture video with a Microsoft LifeCam Studio web camera. The camera is capable of capturing video using TrueColor technology, which also works if I use the OpenCV VideoCapture class.

Yet when I use WMF, the resulting video is not in TrueColor. Nor is it in TrueColor mode when I use the MFCaptureD3D sample provided with the Windows SDK. However, when I run the MFCaptureD3D sample on a Windows 8 machine, the TrueColor technology works.

It seems that the TrueColor feature is not supported by WMF on Windows 7.

Is there any way to capture video using TrueColor with MF on Windows 7? Or is this feature only supported starting with Windows 8? Is it also supported on Windows 10?

Thank you in advance,

Oksana


