public class AudioClient : ComObject
Enables a client to create and initialize an audio stream between an audio application and the audio engine (for a shared-mode stream) or the hardware buffer of an audio endpoint device (for an exclusive-mode stream). For more information, see the IAudioClient interface documentation.
IID_IAudioClient
IID of the IAudioClient interface.
public readonly Guid IID_IAudioClient
AudioClient(IntPtr ptr)
Initializes a new instance of the AudioClient class.
public AudioClient(IntPtr ptr)
Parameters
ptr
IntPtr: Native pointer.
Remarks
Use the FromMMDevice method to create a new AudioClient instance.
FromMMDevice(MMDevice device)
Returns a new instance of the AudioClient class.
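A minimal sketch of creating an AudioClient this way; MMDeviceEnumerator, GetDefaultAudioEndpoint, DataFlow and Role belong to the surrounding Core Audio wrapper and are assumed here:

using System;
using CSCore.CoreAudioAPI;

// Obtain the default render endpoint and create an AudioClient for it.
using (var enumerator = new MMDeviceEnumerator())
using (var device = enumerator.GetDefaultAudioEndpoint(DataFlow.Render, Role.Console))
using (var audioClient = AudioClient.FromMMDevice(device))
{
    // audioClient can now be initialized via Initialize(...).
}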
GetBufferSize()
Returns the size (maximum capacity) of the endpoint buffer.
GetBufferSizeNative(Int32& bufferFramesCount)
Retrieves the size (maximum capacity) of the endpoint buffer.
public Int32 GetBufferSizeNative(Int32& bufferFramesCount)
Parameters
bufferFramesCount
Int32&: Retrieves the number of audio frames that the buffer can hold.
Returns
HRESULT
Remarks
The size of one frame in bytes = (number of bits per sample / 8) * (number of channels)
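As an illustration of the frame-size formula, the following sketch computes the buffer capacity in bytes and in milliseconds; it assumes an already initialized AudioClient instance named audioClient (see the Initialize example further below):

int bufferFrames = audioClient.GetBufferSize();   // capacity in audio frames
WaveFormat format = audioClient.MixFormat;

// One frame = (bits per sample / 8) * number of channels.
int frameSize = (format.BitsPerSample / 8) * format.Channels;
int bufferBytes = bufferFrames * frameSize;
double bufferMilliseconds = bufferFrames * 1000.0 / format.SampleRate;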
GetCurrentPadding()
Retrieves the number of frames of padding in the endpoint buffer.
GetCurrentPaddingNative(Int32& numPaddingFrames)
Retrieves the number of frames of padding in the endpoint buffer.
public Int32 GetCurrentPaddingNative(Int32& numPaddingFrames)
Parameters
numPaddingFrames
Int32&: Retrieves the frame count (the number of audio frames of padding in the buffer).
Returns
HRESULT
Remarks
The size of one frame in bytes = (number of bits per sample / 8) * (number of channels)
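For a render stream, the padding is the part of the buffer that is still queued for playback, so the space that can be written in the current pass is the capacity minus the padding. A short sketch, assuming an initialized AudioClient named audioClient:

int padding = audioClient.GetCurrentPadding();                // frames still queued
int framesAvailable = audioClient.GetBufferSize() - padding;  // frames writable right now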
GetDevicePeriodNative(Int64& hnsDefaultDevicePeriod, Int64& hnsMinimumDevicePeriod)
Retrieves the length of the periodic interval separating successive processing passes by the audio engine on the data in the endpoint buffer.
public Int32 GetDevicePeriodNative(Int64& hnsDefaultDevicePeriod, Int64& hnsMinimumDevicePeriod)
Parameters
hnsDefaultDevicePeriod
Int64&: Retrieves a time value specifying the default interval between periodic processing passes by the audio engine. The time is expressed in 100-nanosecond units.
hnsMinimumDevicePeriod
Int64&: Retrieves a time value specifying the minimum interval between periodic processing passes by the audio endpoint device. The time is expressed in 100-nanosecond units.
Returns
HRESULT
Remarks
Use the DefaultDevicePeriod and MinimumDevicePeriod properties instead of the GetDevicePeriodNative method. For more information, see the IAudioClient::GetDevicePeriod documentation.
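A short sketch reading both periods through those properties; the values are in 100-nanosecond units (10,000 units = 1 ms) and audioClient is assumed to be an existing AudioClient instance:

long defaultPeriodHns = audioClient.DefaultDevicePeriod;
long minimumPeriodHns = audioClient.MinimumDevicePeriod;

double defaultPeriodMs = defaultPeriodHns / 10000.0;
double minimumPeriodMs = minimumPeriodHns / 10000.0;
// defaultPeriodHns is a reasonable hnsBufferDuration for a shared-mode Initialize call.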
GetMixFormat()
Retrieves the stream format that the audio engine uses for its internal processing of shared-mode streams.
GetMixFormatNative(WaveFormat& deviceFormat)
Retrieves the stream format that the audio engine uses for its internal processing of shared-mode streams.
public Int32 GetMixFormatNative(WaveFormat& deviceFormat)
Parameters
deviceFormat
WaveFormat&: Retrieves the mix format that the audio engine uses for its internal processing of shared-mode streams.
Returns
HRESULT
Remarks
For more information, see the IAudioClient::GetMixFormat documentation.
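A short sketch that reads the mix format; audioClient is assumed to be an existing AudioClient instance:

WaveFormat mixFormat = audioClient.GetMixFormat();
Console.WriteLine("Mix format: {0} Hz, {1} bit, {2} channel(s)",
    mixFormat.SampleRate, mixFormat.BitsPerSample, mixFormat.Channels);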
GetService(Guid riid)
Accesses additional services from the audio client object.
public IntPtr GetService(Guid riid)
Parameters
riid
Guid: The interface ID for the requested service. For a list of all available values, see the Remarks section of the IAudioClient::GetService documentation.
Returns
A pointer to an instance of the requested interface. Through this method, the caller obtains a counted reference to the interface. The caller is responsible for releasing the interface, when it is no longer needed, by calling the interface's Release method.
Remarks
For more information, see the IAudioClient::GetService documentation.
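A sketch of requesting the render service after the stream has been initialized. The GUID literal is the well-known IID of IAudioRenderClient; wrapping the returned pointer in an AudioRenderClient through an IntPtr constructor is an assumption made for illustration:

// IID of IAudioRenderClient.
Guid iidAudioRenderClient = new Guid("F294ACFC-3146-4483-A7BF-ADDCA7C260E2");
IntPtr servicePtr = audioClient.GetService(iidAudioRenderClient);

// Assumed: AudioRenderClient exposes an IntPtr constructor analogous to AudioClient(IntPtr).
var renderClient = new AudioRenderClient(servicePtr);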
GetServiceNative(Guid riid, IntPtr& ppv)
Accesses additional services from the audio client object.
public Int32 GetServiceNative(Guid riid, IntPtr& ppv)
Parameters
riid
Guid: The interface ID for the requested service. For a list of all available values, see the Remarks section of the IAudioClient::GetService documentation.
ppv
IntPtr&: A pointer variable into which the method writes the address of an instance of the requested interface. Through this method, the caller obtains a counted reference to the interface. The caller is responsible for releasing the interface, when it is no longer needed, by calling the interface's Release method. If the GetService call fails, ppv is set to IntPtr.Zero.
Returns
HRESULT
Remarks
For more information, see the IAudioClient::GetService documentation.
GetStreamLatency()
Retrieves the maximum latency of the current stream. This method can be called at any time after the stream has been initialized.
public Int64 GetStreamLatency()
Returns
A value representing the latency. The time is expressed in 100-nanosecond units.
Remarks
Rendering clients can use this latency value to compute the minimum amount of data that they can write during any single processing pass. To write less than this minimum is to risk introducing glitches into the audio stream. For more information, see the IAudioClient::GetStreamLatency documentation.
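A short sketch converting the reported latency to milliseconds; audioClient is assumed to be initialized:

long latencyHns = audioClient.GetStreamLatency();
double latencyMs = latencyHns / 10000.0;   // 10,000 100-ns units = 1 ms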
GetStreamLatencyNative(Int64& hnsLatency)
Retrieves the maximum latency of the current stream. This method can be called at any time after the stream has been initialized.
public Int32 GetStreamLatencyNative(Int64& hnsLatency)
Parameters
hnsLatency
Int64&: Retrieves a value representing the latency. The time is expressed in 100-nanosecond units.
Returns
HRESULT
Remarks
Rendering clients can use this latency value to compute the minimum amount of data that they can write during any single processing pass. To write less than this minimum is to risk introducing glitches into the audio stream. For more information, see the IAudioClient::GetStreamLatency documentation.
Initialize(AudioClientShareMode shareMode, AudioClientStreamFlags streamFlags, Int64 hnsBufferDuration, Int64 hnsPeriodicity, WaveFormat waveFormat, Guid audioSessionGuid)
Initializes the audio stream.
public void Initialize(AudioClientShareMode shareMode, AudioClientStreamFlags streamFlags, Int64 hnsBufferDuration, Int64 hnsPeriodicity, WaveFormat waveFormat, Guid audioSessionGuid)
Parameters
shareMode
AudioClientShareMode: The sharing mode for the connection. Through this parameter, the client tells the audio engine whether it wants to share the audio endpoint device with other clients.
streamFlags
AudioClientStreamFlags: Flags to control creation of the stream.
hnsBufferDuration
Int64: The buffer capacity as a time value (expressed in 100-nanosecond units). This parameter contains the buffer size that the caller requests for the buffer that the audio application will share with the audio engine (in shared mode) or with the endpoint device (in exclusive mode). If the call succeeds, the method allocates a buffer that is at least this large.
hnsPeriodicity
Int64: The device period. This parameter can be nonzero only in exclusive mode. In shared mode, always set this parameter to 0. In exclusive mode, this parameter specifies the requested scheduling period for successive buffer accesses by the audio endpoint device. If the requested device period lies outside the range that is set by the device's minimum period and the system's maximum period, then the method clamps the period to that range. If this parameter is 0, the method sets the device period to its default value. To obtain the default device period, call the GetDevicePeriodNative method. If the StreamFlagsEventCallback stream flag is set and shareMode is Exclusive, hnsPeriodicity must be nonzero and equal to hnsBufferDuration.
waveFormat
WaveFormat: The format descriptor. For more information, see the IAudioClient::Initialize documentation.
audioSessionGuid
Guid: A value that identifies the audio session that the stream belongs to. If the Guid identifies a session that has been previously opened, the method adds the stream to that session. If the GUID does not identify an existing session, the method opens a new session and adds the stream to that session. The stream remains a member of the same session for its lifetime. Use Guid.Empty to use the default session.
Remarks
For more information, see the IAudioClient::Initialize documentation.
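A minimal shared-mode initialization sketch, building on the FromMMDevice example above. AudioClientStreamFlags.None is assumed to be the "no flags" value of the enumeration:

using System;
using CSCore;
using CSCore.CoreAudioAPI;

using (var enumerator = new MMDeviceEnumerator())
using (var device = enumerator.GetDefaultAudioEndpoint(DataFlow.Render, Role.Console))
using (var audioClient = AudioClient.FromMMDevice(device))
{
    WaveFormat mixFormat = audioClient.MixFormat;

    // 100 ms buffer, expressed in 100-nanosecond units (1 ms = 10,000 units).
    long hnsBufferDuration = 100 * 10000;

    audioClient.Initialize(
        AudioClientShareMode.Shared,
        AudioClientStreamFlags.None,   // assumed "no flags" value
        hnsBufferDuration,
        0,                             // hnsPeriodicity must be 0 in shared mode
        mixFormat,
        Guid.Empty);                   // default audio session
}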
InitializeNative(AudioClientShareMode shareMode, AudioClientStreamFlags streamFlags, Int64 hnsBufferDuration, Int64 hnsPeriodicity, WaveFormat waveFormat, Guid audioSessionGuid)
Initializes the audio stream.
public Int32 InitializeNative(AudioClientShareMode shareMode, AudioClientStreamFlags streamFlags, Int64 hnsBufferDuration, Int64 hnsPeriodicity, WaveFormat waveFormat, Guid audioSessionGuid)
Parameters
shareMode
AudioClientShareMode: The sharing mode for the connection. Through this parameter, the client tells the audio engine whether it wants to share the audio endpoint device with other clients.
streamFlags
AudioClientStreamFlags: Flags to control creation of the stream.
hnsBufferDuration
Int64: The buffer capacity as a time value (expressed in 100-nanosecond units). This parameter contains the buffer size that the caller requests for the buffer that the audio application will share with the audio engine (in shared mode) or with the endpoint device (in exclusive mode). If the call succeeds, the method allocates a buffer that is at least this large.
hnsPeriodicity
Int64: The device period. This parameter can be nonzero only in exclusive mode. In shared mode, always set this parameter to 0. In exclusive mode, this parameter specifies the requested scheduling period for successive buffer accesses by the audio endpoint device. If the requested device period lies outside the range that is set by the device's minimum period and the system's maximum period, then the method clamps the period to that range. If this parameter is 0, the method sets the device period to its default value. To obtain the default device period, call the GetDevicePeriodNative method. If the StreamFlagsEventCallback stream flag is set and shareMode is Exclusive, hnsPeriodicity must be nonzero and equal to hnsBufferDuration.
waveFormat
WaveFormat: The format descriptor. For more information, see the IAudioClient::Initialize documentation.
audioSessionGuid
Guid: A value that identifies the audio session that the stream belongs to. If the Guid identifies a session that has been previously opened, the method adds the stream to that session. If the GUID does not identify an existing session, the method opens a new session and adds the stream to that session. The stream remains a member of the same session for its lifetime. Use Guid.Empty to use the default session.
Returns
HRESULT
Remarks
For more information, see the IAudioClient::Initialize documentation.
InitializeNative(AudioClientShareMode shareMode, AudioClientStreamFlags streamFlags, Int64 hnsBufferDuration, Int64 hnsPeriodicity, IntPtr waveFormat, Guid audioSessionGuid)
Initializes the audio stream.
public Int32 InitializeNative(AudioClientShareMode shareMode, AudioClientStreamFlags streamFlags, Int64 hnsBufferDuration, Int64 hnsPeriodicity, IntPtr waveFormat, Guid audioSessionGuid)
Parameters
shareMode
AudioClientShareMode: The sharing mode for the connection. Through this parameter, the client tells the audio engine whether it wants to share the audio endpoint device with other clients.
streamFlags
AudioClientStreamFlags: Flags to control creation of the stream.
hnsBufferDuration
Int64: The buffer capacity as a time value (expressed in 100-nanosecond units). This parameter contains the buffer size that the caller requests for the buffer that the audio application will share with the audio engine (in shared mode) or with the endpoint device (in exclusive mode). If the call succeeds, the method allocates a buffer that is at least this large.
hnsPeriodicity
Int64: The device period. This parameter can be nonzero only in exclusive mode. In shared mode, always set this parameter to 0. In exclusive mode, this parameter specifies the requested scheduling period for successive buffer accesses by the audio endpoint device. If the requested device period lies outside the range that is set by the device's minimum period and the system's maximum period, then the method clamps the period to that range. If this parameter is 0, the method sets the device period to its default value. To obtain the default device period, call the GetDevicePeriodNative method. If the StreamFlagsEventCallback stream flag is set and shareMode is Exclusive, hnsPeriodicity must be nonzero and equal to hnsBufferDuration.
waveFormat
IntPtr: Pointer to the native format descriptor (a WAVEFORMATEX structure). For more information, see the IAudioClient::Initialize documentation.
audioSessionGuid
Guid: A value that identifies the audio session that the stream belongs to. If the Guid identifies a session that has been previously opened, the method adds the stream to that session. If the GUID does not identify an existing session, the method opens a new session and adds the stream to that session. The stream remains a member of the same session for its lifetime. Use Guid.Empty to use the default session.
Returns
HRESULT
Remarks
For more information, see the IAudioClient::Initialize documentation.
IsFormatSupported(AudioClientShareMode shareMode, WaveFormat waveFormat)
Indicates whether the audio endpoint device supports a particular stream format.
public Boolean IsFormatSupported(AudioClientShareMode shareMode, WaveFormat waveFormat)
Parameters
shareMode
AudioClientShareMode: The sharing mode for the stream format. Through this parameter, the client indicates whether it wants to use the specified format in exclusive mode or shared mode.
waveFormat
WaveFormat: The stream format to test for support by the AudioClient.
Returns
True if the waveFormat is supported; otherwise, false.
Remarks
For more information, see the IAudioClient::IsFormatSupported documentation.
IsFormatSupported(AudioClientShareMode shareMode, WaveFormat waveFormat, WaveFormat& closestMatch)
Indicates whether the audio endpoint device supports a particular stream format.
public Boolean IsFormatSupported(AudioClientShareMode shareMode, WaveFormat waveFormat, WaveFormat& closestMatch)
Parameters
shareMode
AudioClientShareMode: The sharing mode for the stream format. Through this parameter, the client indicates whether it wants to use the specified format in exclusive mode or shared mode.
waveFormat
WaveFormat: The stream format to test for support by the AudioClient.
closestMatch
WaveFormat&: Retrieves the supported format that is closest to the format that the client specified through the waveFormat parameter. If shareMode is Shared, closestMatch will always be null.
Returns
True if the waveFormat is supported; otherwise, false.
Remarks
For more information, see the IAudioClient::IsFormatSupported documentation.
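A sketch of probing a format and falling back when it is not supported; the by-reference closestMatch parameter is assumed to map to a C# out parameter:

var desiredFormat = new WaveFormat(44100, 16, 2);   // 44.1 kHz, 16 bit, stereo

WaveFormat closestMatch;
if (!audioClient.IsFormatSupported(AudioClientShareMode.Shared, desiredFormat, out closestMatch))
{
    // Fall back to the suggested format, or to the mix format if none was suggested.
    desiredFormat = closestMatch ?? audioClient.MixFormat;
}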
IsFormatSupportedNative(AudioClientShareMode shareMode, WaveFormat waveFormat, WaveFormat& closestMatch)
Indicates whether the audio endpoint device supports a particular stream format.
public Int32 IsFormatSupportedNative(AudioClientShareMode shareMode, WaveFormat waveFormat, WaveFormat& closestMatch)
Parameters
shareMode
AudioClientShareMode: The sharing mode for the stream format. Through this parameter, the client indicates whether it wants to use the specified format in exclusive mode or shared mode.
waveFormat
WaveFormat: The stream format to test for support by the AudioClient.
closestMatch
WaveFormat&: Retrieves the supported format that is closest to the format that the client specified through the waveFormat parameter. If shareMode is Shared, closestMatch will always be null.
Returns
HRESULT code. If the method returns 0 (= S_OK), the endpoint device supports the specified waveFormat. If the method returns 1 (= S_FALSE), the method succeeded and retrieved a closestMatch to the specified waveFormat. If the method returns 0x88890008 (= AUDCLNT_E_UNSUPPORTED_FORMAT), the method succeeded but the specified format is not supported in exclusive mode. Any other return value indicates that the method failed.
Remarks
For more information, see the IAudioClient::IsFormatSupported documentation.
Reset()
Resets the audio stream.
ResetNative()
Resets the audio stream.
public Int32 ResetNative()
Returns
HRESULT
Remarks
For more information, see the IAudioClient::Reset documentation.
SetEventHandle(IntPtr handle)
Sets the event handle that the system signals when an audio buffer is ready to be processed by the client.
SetEventHandle(WaitHandle waitHandle)
Sets the event handle that the system signals when an audio buffer is ready to be processed by the client.
SetEventHandleNative(IntPtr handle)
Sets the event handle that the system signals when an audio buffer is ready to be processed by the client.
public Int32 SetEventHandleNative(IntPtr handle)
Parameters
handle
IntPtr: The event handle.
Returns
HRESULT
Remarks
For more information, see the IAudioClient::SetEventHandle documentation.
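A sketch of event-driven buffer processing. It assumes the stream was initialized with the StreamFlagsEventCallback flag and uses the WaitHandle overload of SetEventHandle:

using System.Threading;

var bufferReady = new AutoResetEvent(false);
audioClient.SetEventHandle(bufferReady);    // WaitHandle overload

audioClient.Start();
for (int pass = 0; pass < 10; pass++)       // a few passes for illustration
{
    bufferReady.WaitOne();                  // signaled when a buffer is ready
    // Fill (render) or read (capture) the endpoint buffer here.
}
audioClient.Stop();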
Start()
Starts the audio stream.
StartNative()
Starts the audio stream.
public Int32 StartNative()
Returns
HRESULT
Remarks
For more information, see the IAudioClient::Start documentation.
Stop()
Stops the audio stream.
StopNative()
Stops the audio stream.
public Int32 StopNative()
Returns
HRESULT
Remarks
For more information, see the IAudioClient::Stop documentation.
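A sketch of a simple polling render loop built from Start, GetCurrentPadding, Stop and Reset; the actual buffer-filling step is omitted and audioClient is assumed to be initialized in shared mode:

using System.Threading;

int bufferFrames = audioClient.GetBufferSize();
int sleepMs = (int)(audioClient.DefaultDevicePeriod / 10000 / 2);   // wake up twice per period

audioClient.Start();
for (int pass = 0; pass < 100; pass++)
{
    int framesFree = bufferFrames - audioClient.GetCurrentPadding();
    // ... write up to framesFree frames via the render service here ...
    Thread.Sleep(sleepMs);
}
audioClient.Stop();
audioClient.Reset();   // discard any data still pending in the stopped stream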
BufferSize
Gets the maximum capacity of the endpoint buffer.
public Int32 BufferSize { get; }
CurrentPadding
Gets the number of frames of padding in the endpoint buffer.
public Int32 CurrentPadding { get; }
DefaultDevicePeriod
Gets the default interval between periodic processing passes by the audio engine. The time is expressed in 100-nanosecond units.
public Int64 DefaultDevicePeriod { get; }
MinimumDevicePeriod
Gets the minimum interval between periodic processing passes by the audio endpoint device. The time is expressed in 100-nanosecond units.
public Int64 MinimumDevicePeriod { get; }
MixFormat
Gets the stream format that the audio engine uses for its internal processing of shared-mode streams.
public WaveFormat MixFormat { get; }
StreamLatency
Gets the maximum latency of the current stream. The property can be read at any time after the stream has been initialized.
public Int64 StreamLatency { get; }