Why is there so much delay (latency) when displaying video captured from the Source Reader?
How can I create an MP4 container from an h.264 byte stream (Annex B)?
Basically, I have a neat H.264 byte stream in the form of I and P samples. I can play these samples using MediaStreamSource and MediaElement, and they play fine. I also need to save them as an MP4 file so that they can be played later using MediaElement or VLC. This is how I am trying to do it with Media Foundation:
I create an IMFMediaSink from MFCreateMPEG4MediaSink; this is my code:
IMFMediaType *pMediaType = NULL;
IMFByteStream *pByteStream = NULL;
HRESULT hr = S_OK;

if (SUCCEEDED(hr))
{
    hr = MFCreateMediaType(&pMediaType);
}

pSeqHdr = reinterpret_cast<UINT8 *>(mSamplesQueue.SequenceHeader());

if (SUCCEEDED(hr))
{
    hr = pMediaType->SetBlob(MF_MT_MPEG_SEQUENCE_HEADER, pSeqHdr, 35);
}

UINT32 pcbBlobSize = {0};
hr = pMediaType->GetBlobSize(MF_MT_MPEG_SEQUENCE_HEADER, &pcbBlobSize);

/*
if (SUCCEEDED(hr))
{
    hr = pMediaType->SetUINT32(MF_MPEG4SINK_SPSPPS_PASSTHROUGH, TRUE);
}
*/

if (SUCCEEDED(hr))
{
    hr = pMediaType->SetGUID(MF_MT_MAJOR_TYPE, MFMediaType_Video);
}
if (SUCCEEDED(hr))
{
    hr = pMediaType->SetGUID(MF_MT_SUBTYPE, VIDEO_INPUT_FORMAT);
}
if (SUCCEEDED(hr))
{
    hr = MFSetAttributeRatio(pMediaType, MF_MT_FRAME_RATE, VIDEO_FPS, 1);
}
if (SUCCEEDED(hr))
{
    hr = pMediaType->SetUINT32(MF_MT_AVG_BITRATE, VIDEO_BIT_RATE);
}
if (SUCCEEDED(hr))
{
    hr = pMediaType->SetUINT32(MF_MT_INTERLACE_MODE, MFVideoInterlace_Progressive);
}
if (SUCCEEDED(hr))
{
    hr = MFSetAttributeSize(pMediaType, MF_MT_FRAME_SIZE, VIDEO_WIDTH, VIDEO_HEIGHT);
}
if (SUCCEEDED(hr))
{
    // Pixel aspect ratio
    hr = MFSetAttributeRatio(pMediaType, MF_MT_PIXEL_ASPECT_RATIO, 1, 1);
}
if (SUCCEEDED(hr))
{
    hr = MFCreateFile(MF_ACCESSMODE_READWRITE, MF_OPENMODE_DELETE_IF_EXIST,
                      MF_FILEFLAGS_NONE, L"output1.mp4", &pByteStream);
}
if (SUCCEEDED(hr))
{
    hr = MFCreateMPEG4MediaSink(pByteStream, pMediaType, NULL, &pMediaSink);
}
Then I create an IMFSinkWriter from this media sink using MFCreateSinkWriterFromMediaSink; this is my code:
if (SUCCEEDED(hr))
{
    hr = MFCreateSinkWriterFromMediaSink(pMediaSink, NULL, &pSinkWriter);
}

// Tell the sink writer to start accepting data.
if (SUCCEEDED(hr))
{
    hr = pSinkWriter->BeginWriting();
}
if (SUCCEEDED(hr))
{
    pSinkWriter->AddRef();
}
And then I write every sample to the sink writer with IMFSinkWriter::WriteSample(0, IMFSample); this is my code:
IMFSample *pSample = NULL;
IMFMediaBuffer *pBuffer = NULL;
const DWORD cbBuffer = mSamplesQueue.GetNextSampleSize();
UINT32 isIDR = mSamplesQueue.GetNextSampleIsIDR();
BYTE *pData = NULL;

// Create a new memory buffer.
HRESULT hr = MFCreateMemoryBuffer(cbBuffer, &pBuffer);

// Lock the buffer and copy the video frame to the buffer.
DWORD pcbMaxLen = 0, pcbCurLen = 0;
if (SUCCEEDED(hr))
{
    hr = pBuffer->Lock(&pData, &pcbMaxLen, &pcbCurLen);
}
if (SUCCEEDED(hr))
{
    hr = mSamplesQueue.Dequeu(&pData);
}
if (pBuffer)
{
    pBuffer->Unlock();
}

// Set the data length of the buffer.
if (SUCCEEDED(hr))
{
    hr = pBuffer->SetCurrentLength(cbBuffer);
}

// Create a media sample and add the buffer to the sample.
if (SUCCEEDED(hr))
{
    hr = MFCreateSample(&pSample);
}
if (SUCCEEDED(hr))
{
    hr = pSample->AddBuffer(pBuffer);
}

// Set the time stamp and the duration.
if (SUCCEEDED(hr))
{
    hr = pSample->SetSampleTime(rtStart);
}
if (SUCCEEDED(hr))
{
    hr = pSample->SetSampleDuration(rtDuration);
}
if (SUCCEEDED(hr))
{
    hr = pSample->SetUINT32(MFSampleExtension_CleanPoint, isIDR);
}

// Send the sample to the Sink Writer.
if (SUCCEEDED(hr))
{
    hr = pSinkWriter->WriteSample(0, pSample);
}

SafeRelease(&pSample);
SafeRelease(&pBuffer);
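As an aside for anyone reproducing this: IMFSample::SetSampleTime and SetSampleDuration take values in 100-nanosecond units. A minimal sketch of deriving rtStart/rtDuration from a frame index and a constant frame rate (the helper names are illustrative, not from the code above):

```cpp
#include <cstdint>

// 100-ns units per second, the timebase used by
// IMFSample::SetSampleTime / SetSampleDuration.
constexpr int64_t kHundredNsPerSecond = 10'000'000;

// Duration of one frame at the given frame rate, in 100-ns units.
// E.g. 30 fps -> 333333 (rounded down).
constexpr int64_t FrameDuration100ns(int64_t fpsNumerator, int64_t fpsDenominator = 1)
{
    return kHundredNsPerSecond * fpsDenominator / fpsNumerator;
}

// Timestamp of the n-th frame, computed from the frame index so that
// rounding error does not accumulate across repeated additions.
constexpr int64_t FrameStart100ns(int64_t frameIndex, int64_t fpsNumerator,
                                  int64_t fpsDenominator = 1)
{
    return kHundredNsPerSecond * fpsDenominator * frameIndex / fpsNumerator;
}
```

Computing each timestamp from the frame index (rather than adding a rounded duration each iteration) keeps long captures from drifting.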
The writing of samples is iterative code that is called for every sample I have (I am testing with 1k I and P samples). Now when I call IMFSinkWriter::Finalize(), it tells me: "0xc00d4a45 : Sink could not create valid output file because required headers were not provided to the sink." It does create an MP4 file of a plausible size (4.6 MB for my 1k samples). This is the link to the trace from MFTrace.
If it is asking for MF_MT_MPEG_SEQUENCE_HEADER, I am already setting it with IMFMediaType::SetBlob(MF_MT_MPEG_SEQUENCE_HEADER, BYTE[], UINT32).
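One thing worth checking: for H.264, MF_MT_MPEG_SEQUENCE_HEADER is expected to hold the SPS and PPS NAL units (with their Annex B start codes), so a hard-coded blob length of 35 can easily truncate or over-read the real header. A sketch of pulling the SPS (NAL type 7) and PPS (NAL type 8) out of an Annex B stream; this is plain byte parsing under the assumption that the stream uses 3- or 4-byte start codes, not a Media Foundation API:

```cpp
#include <cstdint>
#include <cstddef>
#include <vector>

// Collect the SPS (NAL type 7) and PPS (NAL type 8) units, including their
// Annex B start codes, from a raw H.264 byte stream. The result is the kind
// of blob MF_MT_MPEG_SEQUENCE_HEADER expects for H.264 content.
std::vector<uint8_t> ExtractSpsPps(const uint8_t* data, size_t size)
{
    std::vector<uint8_t> out;
    size_t i = 0;
    while (i + 3 < size)
    {
        // Find the next 00 00 01 start code (possibly preceded by another 00).
        if (!(data[i] == 0 && data[i + 1] == 0 && data[i + 2] == 1))
        {
            ++i;
            continue;
        }
        size_t start = (i > 0 && data[i - 1] == 0) ? i - 1 : i; // 4-byte form
        size_t payload = i + 3;                                 // first NAL byte
        // Find the start of the next NAL unit (or the end of the stream).
        size_t j = payload;
        while (j + 3 < size && !(data[j] == 0 && data[j + 1] == 0 && data[j + 2] == 1))
            ++j;
        size_t end = (j + 3 < size) ? ((data[j - 1] == 0) ? j - 1 : j) : size;
        uint8_t nalType = data[payload] & 0x1F;
        if (nalType == 7 || nalType == 8) // SPS or PPS
            out.insert(out.end(), data + start, data + end);
        i = payload;
    }
    return out;
}
```

The returned vector's data() and size() would then be what gets passed to SetBlob instead of the fixed 35.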
I checked the file with Elecard Video Format Analyzer and the header seems incomplete.
Could I get some help finding out what I am missing or whether there is some better/other way of doing what I am trying to achieve?
Thanks,
Manish
How about a "Folder and Leaf" concept for Windows Explorer?
I was thinking of a Folder system with Leaves: a Folder containing only files (or files with similar extensions) is a Leaf, and a Leaf contains no sub-folders; it can hold any files except folders. This would help users navigate easily: a Leaf ends the user's search for a specific folder instead of digging into endless folders within folders.
One should be able to convert any Leaf into a Folder, but no Folder with Subfolders can be converted into a Leaf.
Make Leaves more customizable: apply color themes, change icons, change previews, etc.
Check if camera is front or back on windows
Is there a way, using native C++, to check whether a camera on a tablet/laptop is front- or rear-facing?
I need this to work in native C++, not managed code. For example, I know one can retrieve this via DeviceInformation.EnclosureLocation, but is there another way?
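For what it's worth, the DeviceInformation route is reachable from native C++ via C++/WinRT (no managed runtime involved): EnclosureLocation().Panel() reports which side of the enclosure the camera sits on. A sketch of the mapping logic; the Panel values below mirror the Windows::Devices::Enumeration::Panel enumeration, and the actual WinRT query is shown only as a comment so the snippet compiles anywhere:

```cpp
#include <cstdint>

// Mirrors Windows::Devices::Enumeration::Panel.
enum class Panel : int32_t
{
    Unknown = 0, Front = 1, Back = 2, Top = 3, Bottom = 4, Left = 5, Right = 6
};

bool IsFrontFacing(Panel p) { return p == Panel::Front; }
bool IsRearFacing(Panel p)  { return p == Panel::Back; }

// With C++/WinRT (still native code), the panel for each video capture
// device could be queried roughly like this:
//
//   using namespace winrt::Windows::Devices::Enumeration;
//   auto devices = DeviceInformation::FindAllAsync(DeviceClass::VideoCapture).get();
//   for (auto const& dev : devices)
//   {
//       if (auto loc = dev.EnclosureLocation())
//       {
//           auto panel = loc.Panel(); // Panel::Front, Panel::Back, ...
//       }
//   }
```

Note that EnclosureLocation() can return null when the firmware does not report a location, so a fallback (e.g. treating the device as Unknown) is needed.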
Hardware Clock for Media Foundation Source
I need to have the timestamps on IMFSample from a video capture source be assigned by the underlying hardware device. This is the default behavior for DirectShow, but I do not understand how to get the corresponding behavior for Media Foundation.
What I see as possible is to assign an IMFClock to an IMFMediaSession. It is also possible to create the corresponding IMFPresentationTimeSource as the system clock using MFCreateSystemTimeSource. However, unlike DirectShow, when the system clock is assigned as the IMFMediaSession clock, it is not presented to the underlying kernel device in the form of an IKsReferenceClock acquired by KsPinGetReferenceClockInterface.
How can a Media Session use a clock that is applicable to both user and kernel modes?
Thanks,
Harry
Win7 + H.264 Encoder + IMFSinkWriter Can't use Quality VBR encoding?
I'm trying to alter the encoder quality property eAVEncCommonRateControlMode_Quality via ICodecAPI.
However, the setting is ignored, as stated in the documentation, which says the property must be set before IMFTransform::SetOutputType is called.
Now here is the problem: the sink writer seems to call IMFTransform::SetOutputType when we call SetInputMediaType on the sink writer. However, if we don't call SetInputMediaType, we can't retrieve the ICodecAPI interface via sinkWriter.GetServiceForStream (it throws an exception) to change the quality setting. It seems like a catch-22. I'm hoping it's me and not just a design flaw in the APIs.
Setting the quality property works on Win8, as Win8 does not ignore it when it is set after IMFTransform::SetOutputType is called.
Help!!
MFCaptureToFile sample
While my last thread never really came to a close, I figured it was time to start a new one.
This time, I've found some issues in the MFCaptureToFile sample. I'm basing my comments on the W7 SDK version, since the W8 (metro or full) doesn't seem to have any samples yet.
So far I've noticed 4 things:
1) Despite the fact that the docs for IMFAttributes::GetAllocatedString say of pcchLength "This parameter must not be NULL", the sample does indeed pass it a NULL. Twice.
I will say that this seems an odd thing for the docs to say. If I had to guess, I'd wonder if the docs are auto-generated, and that this param (incorrectly) doesn't have an "optional" qualifier on it in the IDL. However, either the sample or the docs (and maybe the IDL) should get changed.
2) CCapture::EndCaptureInternal is defined and implemented, but never called.
3) CCapture::OnReadSample calls CCapture::NotifyError (aka PostMessage) if an error occurs. However, the posted message (WM_APP_PREVIEW_ERROR) is not processed in DialogProc. This means the error is never actually handled.
4) If StartCapture encounters an error, g_pCapture does not get cleared. This leads to various errors including leaks. Perhaps WinMain's NotifyError should do cleanup?
Sometimes failing to get OnReadSample callback after successful call to ReadSample
I used MFTrace to capture the reader events in the good and bad cases, but they don't seem to indicate anything useful. Here are the relevant excerpts:
Good:
1372,B64 15:50:28.05871 Microsoft-Windows-MediaFoundation-MFReadWrite Start @SourceReader_ReadSample_Begin
1372,100 15:50:28.06997 Microsoft-Windows-MediaFoundation-Performance Stop @State Change to Stop Tag=CAPS Object=0x0000000020C95BC8
1372,D4C 15:50:28.07010 Microsoft-Windows-MediaFoundation-MFReadWrite @SourceReader_MediaSourceEvent
1372,D4C 15:50:28.07012 Microsoft-Windows-MediaFoundation-MFReadWrite @SourceReader_MediaSourceEvent
1372,EC4 15:50:28.07013 Microsoft-Windows-MediaFoundation-MFReadWrite @Transform_ProcessMessage
1372,EC4 15:50:28.07013 Microsoft-Windows-MediaFoundation-MFReadWrite @Transform_ProcessMessage
1372,EC4 15:50:28.07015 Microsoft-Windows-MediaFoundation-MFReadWrite @Transform_ProcessOutputError
1372,EC4 15:50:28.07015 Microsoft-Windows-MediaFoundation-MFReadWrite @SourceReader_RequestSample
1372,EC4 15:50:28.07015 Microsoft-Windows-MediaFoundation-Performance Input @Buffer Input Tag=CAPS Object=0x000000002A250F90 Object Category=27 Stream=0x000000002A250F90 Timestamp=9223372036854775807 Clock=0x0000000000000000 Sample=0x0000000000000000 Buffer Size=0 Sample Size=0 Sample Duration=0
1372,EC4 15:50:28.07016 Microsoft-Windows-MediaFoundation-MFReadWrite @SourceReader_MediaStreamEvent
1372,EC4 15:50:28.07016 Microsoft-Windows-MediaFoundation-MFReadWrite @SourceReader_MediaStreamEvent
1372,EC4 15:50:28.07016 Microsoft-Windows-MediaFoundation-MFReadWrite @Transform_ProcessMessage
1372,EC4 15:50:28.07016 Microsoft-Windows-MediaFoundation-MFReadWrite @Transform_ProcessOutputError
1372,EC4 15:50:28.07017 Microsoft-Windows-MediaFoundation-MFReadWrite Stop @SourceReader_ReadSample_End
1372,EC4 15:50:28.07028 Microsoft-Windows-MediaFoundation-MFReadWrite Start @SourceReader_ReadSample_Begin
1372,EC4 15:50:28.07036 Microsoft-Windows-MediaFoundation-MFReadWrite @Transform_ProcessOutputError
1372,100 15:50:28.42629 Microsoft-Windows-MediaFoundation-Performance @Capture Source Start Tag=CAPS Object=0x0000000020C95950 Duration=<unknown TDH type 21>
1372,100 15:50:28.42630 Microsoft-Windows-MediaFoundation-Performance Queued @Buffer Queued Tag=CAPS Object=0x000000002A250F90 Object Category=27 Stream=0x000000002A250F90 Timestamp=8420563654 Clock=0x0000000000000000 Sample=0x0000000020BB4D20 Buffer Size=1572864 Sample Size=22472 Sample Duration=333333
1372,100 15:50:28.42630 Microsoft-Windows-MediaFoundation-Performance Output @Buffer Output Tag=CAPS Object=0x000000002A250F90 Object Category=27 Stream=0x000000002A250F90 Timestamp=8420563654 Clock=0x0000000000000000 Sample=0x0000000020BB4D20 Buffer Size=1572864 Sample Size=22472 Sample Duration=333333
1372,EC4 15:50:28.42632 Microsoft-Windows-MediaFoundation-MFReadWrite @SourceReader_MediaStreamEvent
1372,EC4 15:50:28.42633 Microsoft-Windows-MediaFoundation-MFReadWrite Start @Transform_ProcessInput
1372,EC4 15:50:28.43117 Microsoft-Windows-MediaFoundation-MFReadWrite Stop @Transform_ProcessOutput
1372,EC4 15:50:28.43120 Microsoft-Windows-MediaFoundation-MFReadWrite Stop @SourceReader_ReadSample_End
1372,EC4 15:50:28.43136 Microsoft-Windows-MediaFoundation-Performance Input @ProcessInput Tag=Resz Object=0x000000002A25F3B0 Object Category=8 Sample=0x0000000045075AD0 Buffer Size=1572864 Sample Time=0 Processing Time=4
1372,EC4 15:50:28.43230 Microsoft-Windows-MediaFoundation-Performance Output @ProcessOutput Tag=Resz Object=0x000000002A25F3B0 Object Category=8 Sample=0x0000000045C7B3E0 Buffer Size=1572864 Sample Time=0 Processing Time=933
1372,EC4 15:50:28.50238 Microsoft-Windows-MediaFoundation-MFReadWrite Start @SourceReader_ReadSample_Begin
1372,EC4 15:50:28.50242 Microsoft-Windows-MediaFoundation-MFReadWrite @Transform_ProcessOutputError
1372,EC4 15:50:28.50243 Microsoft-Windows-MediaFoundation-MFReadWrite @SourceReader_RequestSample
1372,EC4 15:50:28.50243 Microsoft-Windows-MediaFoundation-Performance Input @Buffer Input Tag=CAPS Object=0x000000002A250F90 Object Category=27 Stream=0x000000002A250F90 Timestamp=9223372036854775807 Clock=0x0000000000000000 Sample=0x0000000000000000 Buffer Size=0 Sample Size=0 Sample Duration=0
1372,100 15:50:28.63444 Microsoft-Windows-MediaFoundation-Performance @Capture Source Ready Queue Empty Tag=CAPS Object=0x000000002A250F90 Duration=<unknown TDH type 21>
1372,100 15:50:28.63444 Microsoft-Windows-MediaFoundation-Performance Queued @Buffer Queued Tag=CAPS Object=0x000000002A250F90 Object Category=27 Stream=0x000000002A250F90 Timestamp=8422644559 Clock=0x0000000000000000 Sample=0x0000000020BB4D80 Buffer Size=1572864 Sample Size=22376 Sample Duration=333333
1372,100 15:50:28.63445 Microsoft-Windows-MediaFoundation-Performance Output @Buffer Output Tag=CAPS Object=0x000000002A250F90 Object Category=27 Stream=0x000000002A250F90 Timestamp=8422644559 Clock=0x0000000000000000 Sample=0x0000000020BB4D80 Buffer Size=1572864 Sample Size=22376 Sample Duration=333333
1372,EC4 15:50:28.63452 Microsoft-Windows-MediaFoundation-MFReadWrite @SourceReader_MediaStreamEvent
1372,EC4 15:50:28.63454 Microsoft-Windows-MediaFoundation-MFReadWrite Start @Transform_ProcessInput
1372,EC4 15:50:28.63845 Microsoft-Windows-MediaFoundation-MFReadWrite Stop @Transform_ProcessOutput
1372,EC4 15:50:28.63848 Microsoft-Windows-MediaFoundation-MFReadWrite Stop @SourceReader_ReadSample_End
(pattern repeats)
Bad:
3544,728 15:51:21.41710 Microsoft-Windows-MediaFoundation-MFReadWrite Start @SourceReader_ReadSample_Begin
3544,1114 15:51:21.43114 Microsoft-Windows-MediaFoundation-Performance Stop @State Change to Stop Tag=CAPS Object=0x000000002A10B358
3544,6E8 15:51:21.43127 Microsoft-Windows-MediaFoundation-MFReadWrite @SourceReader_MediaSourceEvent
3544,6E8 15:51:21.43129 Microsoft-Windows-MediaFoundation-MFReadWrite @SourceReader_MediaSourceEvent
3544,12DC 15:51:21.43131 Microsoft-Windows-MediaFoundation-MFReadWrite @Transform_ProcessMessage
3544,12DC 15:51:21.43132 Microsoft-Windows-MediaFoundation-MFReadWrite @Transform_ProcessMessage
3544,12DC 15:51:21.43134 Microsoft-Windows-MediaFoundation-MFReadWrite @Transform_ProcessOutputError
3544,12DC 15:51:21.43134 Microsoft-Windows-MediaFoundation-MFReadWrite @SourceReader_RequestSample
3544,12DC 15:51:21.43135 Microsoft-Windows-MediaFoundation-Performance Input @Buffer Input Tag=CAPS Object=0x000000002A048660 Object Category=27 Stream=0x000000002A048660 Timestamp=9223372036854775807 Clock=0x0000000000000000 Sample=0x0000000000000000 Buffer Size=0 Sample Size=0 Sample Duration=0
3544,12DC 15:51:21.43135 Microsoft-Windows-MediaFoundation-MFReadWrite @SourceReader_MediaStreamEvent
3544,12DC 15:51:21.43136 Microsoft-Windows-MediaFoundation-MFReadWrite @SourceReader_MediaStreamEvent
3544,12DC 15:51:21.43136 Microsoft-Windows-MediaFoundation-MFReadWrite @Transform_ProcessMessage
3544,12DC 15:51:21.43136 Microsoft-Windows-MediaFoundation-MFReadWrite @Transform_ProcessOutputError
3544,12DC 15:51:21.43137 Microsoft-Windows-MediaFoundation-MFReadWrite Stop @SourceReader_ReadSample_End
3544,12DC 15:51:21.43172 Microsoft-Windows-MediaFoundation-MFReadWrite Start @SourceReader_ReadSample_Begin
3544,12DC 15:51:21.43176 Microsoft-Windows-MediaFoundation-MFReadWrite @Transform_ProcessOutputError
3544,728 15:51:21.90556 Microsoft-Windows-MediaFoundation-MFReadWrite @Transform_ProcessMessage
3544,728 15:51:21.90556 Microsoft-Windows-MediaFoundation-MFReadWrite @Transform_ProcessMessage
3544,728 15:51:21.90556 Microsoft-Windows-MediaFoundation-MFReadWrite @Transform_ProcessMessage
3544,12DC 15:51:21.94581 Microsoft-Windows-MediaFoundation-MFReadWrite @SourceReader_Error
Does MF use free-threaded or apartment-threaded COM?
Hi there,
Since I read the book "Developing MMF Applications", there's one paragraph that simply confused me:
"MF is a free-threaded system, ... Therefore, when calling CoInitializeEx(), you must initialize COM with the apartment-threaded object concurrency by passing in the COINIT_APARTMENTTHREADED parameter..."
In addition, I've seen both COINIT_APARTMENTTHREADED and COINIT_MULTITHREADED appear in the sample code.
So which thread model should be used (or recommended)? Any difference on usage?
Any advice is appreciated.
Regards,
Joyah
IMFSourceReaderCallback
Trying to implement the following callback method in C#:
http://msdn.microsoft.com/en-us/library/windows/desktop/gg583871%28v=vs.85%29.aspx
How do I implement the SourceReader pCallback pointer in C#? Should it be a pointer to the form.handle? See the following pseudocode.
I am able to successfully read the first sample but it's not automatically entering the IMFSourceReaderCallback.OnReadSample().
public partial class myForm : Form, IMFSourceReaderCallback
{
    int IMFSourceReaderCallback.OnReadSample()
    {
        // Do something
    }

    int IMFSourceReaderCallback.OnEvent()
    {
        // Do something
    }

    int IMFSourceReaderCallback.OnFlush()
    {
        // Do something
    }

    int readMediaFile()
    {
        IMFSourceReader pReader = null;
        pCallback; // What should this be and where should it point to?
        hr = createSourceReaderAsync(pURL, pReader, pCallback);
        hr = pReader.readSample(MF_SOURCE_READER_FIRST_VIDEO_STREAM, 0, null, null, null, null);
        // Do something
    }

    int createSourceReaderAsync()
    {
        int hr = S_OK;
        IMFAttributes pAttributes = null;
        hr = MFCreateAttributes(pAttributes, 1);
        hr = pAttributes.SetUnknown(MF_SOURCE_READER_ASYNC_CALLBACK, pCallback);
        hr = MFCreateSourceReaderFromURL(pURL, pAttributes, ppReader);
        return hr;
    }
}
Using Hardware accelerator with DXVA2.0
Hi
I'm trying to use H.264 hardware accelerator with VC++ 2005, DXVA2.0 and Geforce 8600 GTS on Vista.
After GetDecoderDeviceGuids() and GetDecoderRenderTargets(), it seems to succeed GetDecoderConfigurations().
But "guidConfigBitstreamEncryption", "guidConfigMBcontrolEncryption" and "guidConfigResidDiffEncryption" in all DXVA2_ConfigPictureDecode structures are always "DXVA2_NoEncrypt".
And "ConfigBitstreamRaw" is 1 or 2. There is no explanation of the case of 2 in the document "DXVA2_ConfigPictureDecode Structure".
I expected that at least one of the "guidConfig*" values would be DXVA2_ModeH264_?.
Could anyone tell me what is happening?
Or do I misunderstand how to use the hardware accelerator?
How to clone an IMFSample? - IMFSinkWriter::WriteSample is returning MF_E_NO_SAMPLE_TIMESTAMP
I am trying to recreate an IMFSample from its constituent parts. However, IMFSinkWriter::WriteSample is failing even though I am setting the timestamp via SetSampleTime.
What am I doing wrong ?
HRESULT CImageAcquisitionFileReader::OnReadSample(
    HRESULT hrStatus, DWORD dwStreamIndex, DWORD dwStreamFlags,
    LONGLONG llTimestamp, IMFSample* pSample)
{
    LOG_DEBUG("OnReadSample()");
    HRESULT hr;
    m_frame++;

    // get image
    unsigned char* image = NULL;
    IMFMediaBuffer *pBuffer = NULL;
    DWORD cbMaxLength = 0;
    DWORD cbCurrentLength = 0;
    LONGLONG sampleDuration = 0;

    hr = pSample->GetSampleDuration(&sampleDuration);
    hr = pSample->GetBufferByIndex(0, &pBuffer);
    if (SUCCEEDED(hr))
        hr = pBuffer->GetCurrentLength(&cbCurrentLength);
    hr = pBuffer->GetMaxLength(&cbMaxLength);
    hr = pBuffer->Lock(&image, NULL, NULL);
    if (SUCCEEDED(hr))
    {
        hr = pBuffer->Unlock();
    }
    SafeRelease(&pBuffer);

    // recreate IMFSample from bytes
    IMFMediaBuffer *pMediaBuffer;
    IMFSample *pSampleCopy;
    BYTE *pbBuffer;
    hr = MFCreateMemoryBuffer(cbMaxLength, &pMediaBuffer);
    hr = pMediaBuffer->Lock(&pbBuffer, &cbMaxLength, &cbCurrentLength);
    // copy array to pbBuffer (or have the frames come in to this location in the first place)
    memcpy(pbBuffer, image, sizeof(image));
    hr = pMediaBuffer->Unlock();
    hr = pMediaBuffer->SetCurrentLength(cbCurrentLength);
    hr = MFCreateSample(&pSampleCopy);
    hr = pSampleCopy->AddBuffer(pMediaBuffer);
    hr = pSampleCopy->SetSampleTime(llTimestamp);
    hr = pSampleCopy->SetSampleDuration(sampleDuration);
    hr = pSampleCopy->SetSampleFlags(dwStreamFlags);
    ...
    hr = m_spWriter->WriteSample(dwStreamIndex, pSampleCopy);
    if (hr == MF_E_NO_SAMPLE_TIMESTAMP)
Using the media type debugging code from http://msdn.microsoft.com/en-us/library/windows/desktop/ee663602(v=vs.85).aspx, I can see that pSampleCopy has no IMFAttributes set. Could this be causing my problem?
CComPtr<IMFAttributes> spSourceAttributes;
::MFCreateAttributes(&spSourceAttributes, 0);
hr = pSample->CopyAllItems(spSourceAttributes);

CComPtr<IMFAttributes> spCloneAttributes;
::MFCreateAttributes(&spCloneAttributes, 0);
hr = pSampleCopy->CopyAllItems(spCloneAttributes);

BOOL isMatch;
MediaAttributesDumper* mediaAttributesDumper = new MediaAttributesDumper();
mediaAttributesDumper->LogAttributeValueByIndex(spSourceAttributes, 0);
delete (mediaAttributesDumper);
hr = spSourceAttributes->Compare(spCloneAttributes, MF_ATTRIBUTES_MATCH_ALL_ITEMS, &isMatch);
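One thing that stands out in the OnReadSample code above, independent of the timestamp error: memcpy(pbBuffer, image, sizeof(image)) copies only sizeof(unsigned char*) bytes (4 or 8), because sizeof on a pointer yields the pointer's own size, not the buffer's. The length to use is the one GetCurrentLength reported. A standalone illustration of the pitfall (CopyFrame is a hypothetical helper, not from the original code):

```cpp
#include <cstring>
#include <cstddef>

// memcpy sized with sizeof(pointer) copies at most 8 bytes of the frame.
// The correct length is the value IMFMediaBuffer::GetCurrentLength reported
// (cbCurrentLength in the code above).
size_t CopyFrame(unsigned char* dst, const unsigned char* src, size_t cbCurrentLength)
{
    // Wrong: std::memcpy(dst, src, sizeof(src)); // sizeof(src) == sizeof(unsigned char*)
    std::memcpy(dst, src, cbCurrentLength);       // right: use the reported length
    return cbCurrentLength;
}
```

(There is also a second suspicious line: SetSampleFlags is being handed dwStreamFlags, which are source-reader stream flags, not sample flags, though whether that contributes to the WriteSample failure I can't say.)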
Miracast Detection?
I'm looking for a way to detect Miracast support on the platform. I'm aware that there are two things to be concerned with:
- A compatible Wi-Fi direct device
- The presence of a Miracast-enabled graphics driver (exposed as a separate UMDF - specifically, a DLL)
I believe I've figured out how to detect whether the Microsoft Wi-Fi Direct Virtual Adapter is installed and present (although it seems a little convoluted - ideally, I'd query the network adapter for a dev property, but it appears that the virtual Wi-Fi Direct device only appears after pairing [is this similar to Bluetooth, where the adapter only appears after pairing?]). What I'm doing now is looking for PnP devices with {5d624f94-8850-40c3-a3fa-a4fd2080baf3}\vwifimp_wfd in their HardwareIds list. Is there a better way that ensures that the WFD device supports all the capabilities required for Miracast (from a transport perspective)?
The Miracast-enabled graphics driver is a little more challenging. Apparently, it looks like the way to go is to query MediaFoundation (I'm aware that the HCK looks at traces generated by the drivers, but I don't know how to do the same thing). Is there another way (perhaps enumerating the graphics driver properties via AQS with a custom property)?
At the end of the day, this requires a leap of faith that the combination of the two means that Miracast is enabled in the OS. Is this always the case (that if WFD works and the graphics driver is installed, then the system is Miracast-capable)?
Thanks in advance,
-Andre
Create desktop video using sink writer and Desktop Duplication API
IMFSourceReaderCallback thread executes after delete.
Hey guys. I built a class which uses IMFSourceReader to get images from a webcam using the asynchronous method. The class destructor releases the IMFSourceReader, followed by IMFMediaSource->Shutdown(), and then releases the media source.
I create an instance of the class (A) and later delete it without any error. However, when I create another instance of the class (B), the callback thread tells me that my IMFSourceReader has been released and cannot get a sample, with a dereferenced pointer of 0xDDDDDDDD. When I step through the callback in the debugger, I find two callback threads working at the same time: one with a usable IMFSourceReader, which I suspect belongs to instance (B), and another with an IMFSourceReader that has already been released, from instance (A). I use EnterCriticalSection and LeaveCriticalSection in my callback.
So, is there a way to actually stop the thread before releasing IMFSourceReader?
Thanks
[D3D] Why is the texture blurred? I need it to keep the same resolution as the source image
The source image is 1920x1200 px.
Here is my render function:
VOID Render()
{
    // Clear the backbuffer and the zbuffer
    g_pd3dDevice->Clear(0, NULL, D3DCLEAR_TARGET | D3DCLEAR_ZBUFFER,
                        D3DCOLOR_XRGB(0, 0, 255), 1.0f, 0);

    // Begin the scene
    if (SUCCEEDED(g_pd3dDevice->BeginScene()))
    {
        g_pd3dDevice->SetSamplerState(0, D3DSAMP_ADDRESSU, D3DTADDRESS_BORDER);
        g_pd3dDevice->SetSamplerState(0, D3DSAMP_ADDRESSV, D3DTADDRESS_BORDER);
        g_pd3dDevice->SetSamplerState(0, D3DSAMP_BORDERCOLOR, 0x000000ff);
        g_pd3dDevice->SetTexture(0, g_pTexture);

        // Render the vertex buffer contents
        g_pd3dDevice->SetStreamSource(0, g_pVB, 0, sizeof(CUSTOMVERTEX));
        g_pd3dDevice->SetFVF(D3DFVF_CUSTOMVERTEX);
        g_pd3dDevice->DrawPrimitive(D3DPT_TRIANGLELIST, 0, 2); //2 * 50 - 2 );

        // End the scene
        g_pd3dDevice->EndScene();
    }

    // Present the backbuffer contents to the display
    // show in two windows
    g_pd3dDevice->Present(NULL, NULL, NULL, NULL);

    HWND hWnd = WindowFromDC(g_hDc);
    RECT rc;
    rc.bottom = 600, rc.left = 100, rc.right = 600, rc.top = 100;
    //RECT rc; rc.bottom = in->height, rc.left = 0, rc.right = in->width, rc.top = 0;
    g_pd3dDevice->Present(NULL, &rc, hWnd, NULL);
}
This is what gets displayed:
Why does the image the program displays look so blurred? How can I set up the texture so it looks better? The source image is 1920x1200 px!
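A likely culprit in the render function above: the second Present call scales the whole backbuffer into a 500x500 destination rect (100..600 in both axes), and downscaling 1920x1200 into 500x500 is exactly what reads as blur. For a pixel-for-pixel presentation, the destination rect must match the source dimensions; a small helper sketch (names are illustrative) for building such a rect:

```cpp
#include <cstdint>

// Plain stand-in for the Win32 RECT layout.
struct Rect { int32_t left, top, right, bottom; };

// Destination rect for a 1:1 blit of a width x height source, anchored
// at (x, y). Presenting into a rect of any other size forces the GPU to
// rescale the image, which softens (blurs) it.
constexpr Rect UnscaledDestRect(int32_t x, int32_t y, int32_t width, int32_t height)
{
    return Rect{ x, y, x + width, y + height };
}
```

With that, Present would receive a rect spanning 1920x1200 (e.g. UnscaledDestRect(0, 0, 1920, 1200)) instead of the 500x500 one, assuming the target window is large enough to show it.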
ID3D11DeviceContext::CopyResource Method Fails
ReportGetSampleProgress() wp 8.10.14147 roach motel
ReportGetSampleProgress() will let you check in but never get back out.
Normally, it goes like this
MP_AV_CurrentStateChanged ENTER current state='Opening'
SeekAsync(0)
MP_AV_CurrentStateChanged ENTER current state='Playing'
but when ReportGetSampleProgress() doesn't return, the SeekAsync() callback does not fire; in the debugger's threads view, you can see it stuck in the native SeekAsync code. All that shows, in that case, is the 'Opening' line. It doesn't happen every time ReportGetSampleProgress() is called, but it happens. The app does not exit. You can kill it from the debugger, but oddly enough it doesn't die; threads (e.g., network) still run, and trace output still shows. Someone there might want to look into this.
IMFSourceResolver::CreateObjectFromURL sometimes hangs when creating multiple media sources
I want to capture multiple videos from multiple IP cameras using RTSP.
I create one media source for each IP camera. First I tried three cameras. But sometimes, after one or two media sources are created successfully, the next media source creation hangs inside IMFSourceResolver::CreateObjectFromURL.
How can I solve this?
How to play multiple videos on one screen?
Hi everyone!
I'm implementing a video player which plays n videos on one screen at the same time. If I play just one video, there is no problem. But if I play two or more, they flicker and run slower or faster: terrible performance.
My implementation so far:
-vs2010 C#.net using MFNet (c# wrapper, http://mfnet.sourceforge.net/)
-one fullscreen windows form
-n Windows user controls; each includes its own player
-each player uses its own instance of MediaSession
-use SourceResolver and MediaSource for media
-media are in several formats that media foundation plays (different formats, resolution, bitrate, codecs)
Does anyone know how to boost the performance or if this implementation is the way to do it?
Tried to play 4 videos at the same time with TopoEdit (Windows SDK Tool) and have the same problem.
Tried to play 4 videos at the same time with GraphEdit (Windows SDK Tool) and the performance is much much better.
What are the differences between these programs? TopoEdit uses Media Foundation, but what does GraphEdit use?
Thanks in advance