Namespace: CSCore.CoreAudioAPI

Syntax

public class AudioClient : ComObject

Base Type

ComObject

Summary

Enables a client to create and initialize an audio stream between an audio application and the audio engine (for a shared-mode stream) or the hardware buffer of an audio endpoint device (for an exclusive-mode stream). For more information, see the IAudioClient interface documentation in the Windows Core Audio API.

Fields

IID_IAudioClient

The IID of the IAudioClient interface.

public readonly Guid IID_IAudioClient

Methods

AudioClient(IntPtr ptr)

Initializes a new instance of the AudioClient class.

public AudioClient(IntPtr ptr)

Parameters

ptr

IntPtr

Native pointer.

Remarks

Use the FromMMDevice method to create a new AudioClient instance.

FromMMDevice(MMDevice device)

Returns a new instance of the AudioClient class.

public static AudioClient FromMMDevice(MMDevice device)

Parameters

device

MMDevice

Device which should be used to create the AudioClient instance.

Returns

AudioClient instance.
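
A minimal sketch of creating an AudioClient from the default render endpoint; the MMDeviceEnumerator call is an assumption taken from the wider CSCore.CoreAudioAPI namespace and is not part of this class:

// Hedged sketch: device enumeration is assumed from CSCore.CoreAudioAPI.
using (var device = MMDeviceEnumerator.DefaultAudioEndpoint(DataFlow.Render, Role.Console))
using (var audioClient = AudioClient.FromMMDevice(device))
{
    // audioClient is ready to be initialized (see Initialize below).
}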

GetBufferSize()

Returns the size (maximum capacity) of the endpoint buffer.

public Int32 GetBufferSize()

Returns

The number of audio frames that the buffer can hold.

Remarks

The size of one frame = (number of bits per sample)/8 * (number of channels)
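
For illustration, a hedged sketch converting the frame count into bytes and milliseconds; it assumes audioClient has already been initialized and that the WaveFormat member names (BitsPerSample, Channels, SampleRate) match CSCore:

WaveFormat format = audioClient.MixFormat;
int bufferFrames = audioClient.GetBufferSize();
int frameSize = (format.BitsPerSample / 8) * format.Channels;   // bytes per frame
int bufferBytes = bufferFrames * frameSize;                     // buffer capacity in bytes
double bufferMs = 1000.0 * bufferFrames / format.SampleRate;    // buffer capacity as a duration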

GetBufferSizeNative(Int32& bufferFramesCount)

Retrieves the size (maximum capacity) of the endpoint buffer.

public Int32 GetBufferSizeNative(Int32& bufferFramesCount)

Parameters

bufferFramesCount

Int32&

Retrieves the number of audio frames that the buffer can hold.

Returns

HRESULT

Remarks

The size of one frame = (number of bits per sample)/8 * (number of channels)

GetCurrentPadding()

Retrieves the number of frames of padding in the endpoint buffer.

public Int32 GetCurrentPadding()

Returns

The frame count (the number of audio frames of padding in the buffer).

Remarks

The size of one frame = (number of bits per sample)/8 * (number of channels)
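
A common use of the padding value is computing how many frames can still be written during a render pass; a minimal sketch, assuming an initialized audioClient:

// Free space in the endpoint buffer = total capacity minus the frames still queued (padding).
int framesAvailable = audioClient.GetBufferSize() - audioClient.GetCurrentPadding();
// framesAvailable is the number of frames a render client may safely write in this pass.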

GetCurrentPaddingNative(Int32& numPaddingFrames)

Retrieves the number of frames of padding in the endpoint buffer.

public Int32 GetCurrentPaddingNative(Int32& numPaddingFrames)

Parameters

numPaddingFrames

Int32&

Retrieves the frame count (the number of audio frames of padding in the buffer).

Returns

HRESULT

Remarks

The size of one frame = (number of bits per sample)/8 * (number of channels)

GetDevicePeriodNative(Int64& hnsDefaultDevicePeriod, Int64& hnsMinimumDevicePeriod)

Retrieves the length of the periodic interval separating successive processing passes by the audio engine on the data in the endpoint buffer.

public Int32 GetDevicePeriodNative(Int64& hnsDefaultDevicePeriod, Int64& hnsMinimumDevicePeriod)

Parameters

hnsDefaultDevicePeriod

Int64&

Retrieves a time value specifying the default interval between periodic processing passes by the audio engine. The time is expressed in 100-nanosecond units.

hnsMinimumDevicePeriod

Int64&

Retrieves a time value specifying the minimum interval between periodic processing passes by the audio endpoint device. The time is expressed in 100-nanosecond units.

Returns

HRESULT

Remarks

Use the DefaultDevicePeriod and MinimumDevicePeriod properties instead of the GetDevicePeriodNative method. For more information, see the IAudioClient::GetDevicePeriod documentation.

GetMixFormat()

Retrieves the stream format that the audio engine uses for its internal processing of shared-mode streams.

public WaveFormat GetMixFormat()

Returns

The mix format that the audio engine uses for its internal processing of shared-mode streams.

Remarks

For more information, see the IAudioClient::GetMixFormat documentation.
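
A small sketch of inspecting the shared-mode mix format, assuming the CSCore WaveFormat member names used below:

WaveFormat mixFormat = audioClient.GetMixFormat();
Console.WriteLine("Mix format: {0} Hz, {1} bit, {2} channel(s)",
    mixFormat.SampleRate, mixFormat.BitsPerSample, mixFormat.Channels);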

GetMixFormatNative(WaveFormat& deviceFormat)

Retrieves the stream format that the audio engine uses for its internal processing of shared-mode streams.

public Int32 GetMixFormatNative(WaveFormat& deviceFormat)

Parameters

deviceFormat

WaveFormat&

Retrieves the mix format that the audio engine uses for its internal processing of shared-mode streams.

Returns

HRESULT

Remarks

For more information, see the IAudioClient::GetMixFormat documentation.

GetService(Guid riid)

Accesses additional services from the audio client object.

public IntPtr GetService(Guid riid)

Parameters

riid

Guid

The interface ID for the requested service. For a list of all available values, see the IAudioClient::GetService documentation.

Returns

A pointer to an instance of the requested interface. Through this method, the caller obtains a counted reference to the interface. The caller is responsible for releasing the interface when it is no longer needed by calling the interface's Release method.

Remarks

For more information, see the IAudioClient::GetService documentation.
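
GetService is typically reached indirectly through CSCore's service wrapper classes (for example AudioRenderClient.FromAudioClient, shown in the class diagram below). A hedged sketch of the raw call follows; the GUID shown is believed to be the standard IAudioRenderClient interface ID and should be verified before use:

// Raw lookup on an initialized audio client.
Guid iidAudioRenderClient = new Guid("F294ACFC-3146-4483-A7BF-ADDCA7C260E2");
IntPtr renderClientPtr = audioClient.GetService(iidAudioRenderClient);
// ... use the interface, then release the counted reference:
System.Runtime.InteropServices.Marshal.Release(renderClientPtr);

// Higher-level alternative provided by CSCore (see the class diagram):
// using (var renderClient = AudioRenderClient.FromAudioClient(audioClient)) { ... }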

GetServiceNative(Guid riid, IntPtr& ppv)

Accesses additional services from the audio client object.

public Int32 GetServiceNative(Guid riid, IntPtr& ppv)

Parameters

riid

Guid

The interface ID for the requested service. For a list of all available values, see the IAudioClient::GetService documentation.

ppv

IntPtr&

A pointer variable into which the method writes the address of an instance of the requested interface. Through this method, the caller obtains a counted reference to the interface. The caller is responsible for releasing the interface when it is no longer needed by calling the interface's Release method. If the GetService call fails, ppv is set to IntPtr.Zero.

Returns

HRESULT

Remarks

For more information, see the IAudioClient::GetService documentation.

GetStreamLatency()

Retrieves the maximum latency for the current stream. This method can be called at any time after the stream has been initialized.

public Int64 GetStreamLatency()

Returns

A value representing the latency. The time is expressed in 100-nanosecond units.

Remarks

Rendering clients can use this latency value to compute the minimum amount of data that they can write during any single processing pass. To write less than this minimum is to risk introducing glitches into the audio stream. For more information, see the IAudioClient::GetStreamLatency documentation.
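
For example, converting the 100-nanosecond latency value into milliseconds (a sketch):

long latencyHns = audioClient.GetStreamLatency(); // 100-nanosecond units
double latencyMs = latencyHns / 10000.0;          // 10,000 hns units = 1 millisecond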

GetStreamLatencyNative(Int64& hnsLatency)

Retrieves the maximum latency for the current stream. This method can be called at any time after the stream has been initialized.

public Int32 GetStreamLatencyNative(Int64& hnsLatency)

Parameters

hnsLatency

Int64&

Retrieves a value representing the latency. The time is expressed in 100-nanosecond units.

Returns

HRESULT

Remarks

Rendering clients can use this latency value to compute the minimum amount of data that they can write during any single processing pass. To write less than this minimum is to risk introducing glitches into the audio stream. For more information, see the IAudioClient::GetStreamLatency documentation.

Initialize(AudioClientShareMode shareMode, AudioClientStreamFlags streamFlags, Int64 hnsBufferDuration, Int64 hnsPeriodicity, WaveFormat waveFormat, Guid audioSessionGuid)

Initializes the audio stream.

public void Initialize(AudioClientShareMode shareMode, AudioClientStreamFlags streamFlags, Int64 hnsBufferDuration, Int64 hnsPeriodicity, WaveFormat waveFormat, Guid audioSessionGuid)

Parameters

shareMode

AudioClientShareMode

The sharing mode for the connection. Through this parameter, the client tells the audio engine whether it wants to share the audio endpoint device with other clients.

streamFlags

AudioClientStreamFlags

Flags to control creation of the stream.

hnsBufferDuration

Int64

The buffer capacity as a time value (expressed in 100-nanosecond units). This parameter contains the buffer size that the caller requests for the buffer that the audio application will share with the audio engine (in shared mode) or with the endpoint device (in exclusive mode). If the call succeeds, the method allocates a buffer that is at least this large.

hnsPeriodicity

Int64

The device period. This parameter can be nonzero only in exclusive mode. In shared mode, always set this parameter to 0. In exclusive mode, this parameter specifies the requested scheduling period for successive buffer accesses by the audio endpoint device. If the requested device period lies outside the range that is set by the device's minimum period and the system's maximum period, then the method clamps the period to that range. If this parameter is 0, the method sets the device period to its default value. To obtain the default device period, call the GetDevicePeriodNative method. If the StreamFlagsEventCallback stream flag is set and Exclusive is set as the shareMode, then hnsPeriodicity must be nonzero and equal to hnsBufferDuration.

waveFormat

WaveFormat

The format descriptor. For more information, see the WaveFormat class.

audioSessionGuid

Guid

A value that identifies the audio session that the stream belongs to. If the Guid identifies a session that has been previously opened, the method adds the stream to that session. If the GUID does not identify an existing session, the method opens a new session and adds the stream to that session. The stream remains a member of the same session for its lifetime. Use Guid.Empty to use the default session.

Remarks

For more information, see the IAudioClient::Initialize documentation.
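
A minimal shared-mode initialization sketch, assuming audioClient was obtained via FromMMDevice; the AudioClientStreamFlags.None value and the 100 ms buffer size are illustrative assumptions:

// 100 ms buffer expressed in 100-nanosecond units (1 ms = 10,000 units); value chosen for illustration.
long hnsBufferDuration = 100 * 10000;

// Shared mode: hnsPeriodicity must be 0; Guid.Empty selects the default audio session.
audioClient.Initialize(
    AudioClientShareMode.Shared,
    AudioClientStreamFlags.None,
    hnsBufferDuration,
    0,
    audioClient.GetMixFormat(),
    Guid.Empty);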

InitializeNative(AudioClientShareMode shareMode, AudioClientStreamFlags streamFlags, Int64 hnsBufferDuration, Int64 hnsPeriodicity, WaveFormat waveFormat, Guid audioSessionGuid)

Initializes the audio stream.

public Int32 InitializeNative(AudioClientShareMode shareMode, AudioClientStreamFlags streamFlags, Int64 hnsBufferDuration, Int64 hnsPeriodicity, WaveFormat waveFormat, Guid audioSessionGuid)

Parameters

shareMode

AudioClientShareMode

The sharing mode for the connection. Through this parameter, the client tells the audio engine whether it wants to share the audio endpoint device with other clients.

streamFlags

AudioClientStreamFlags

Flags to control creation of the stream.

hnsBufferDuration

Int64

The buffer capacity as a time value (expressed in 100-nanosecond units). This parameter contains the buffer size that the caller requests for the buffer that the audio application will share with the audio engine (in shared mode) or with the endpoint device (in exclusive mode). If the call succeeds, the method allocates a buffer that is at least this large.

hnsPeriodicity

Int64

The device period. This parameter can be nonzero only in exclusive mode. In shared mode, always set this parameter to 0. In exclusive mode, this parameter specifies the requested scheduling period for successive buffer accesses by the audio endpoint device. If the requested device period lies outside the range that is set by the device's minimum period and the system's maximum period, then the method clamps the period to that range. If this parameter is 0, the method sets the device period to its default value. To obtain the default device period, call the GetDevicePeriodNative method. If the StreamFlagsEventCallback stream flag is set and Exclusive is set as the shareMode, then hnsPeriodicity must be nonzero and equal to hnsBufferDuration.

waveFormat

WaveFormat

The format descriptor. For more information, see the WaveFormat class.

audioSessionGuid

Guid

A value that identifies the audio session that the stream belongs to. If the Guid identifies a session that has been previously opened, the method adds the stream to that session. If the GUID does not identify an existing session, the method opens a new session and adds the stream to that session. The stream remains a member of the same session for its lifetime. Use Guid.Empty to use the default session.

Returns

HRESULT

Remarks

For more information, see the IAudioClient::Initialize documentation.

InitializeNative(AudioClientShareMode shareMode, AudioClientStreamFlags streamFlags, Int64 hnsBufferDuration, Int64 hnsPeriodicity, IntPtr waveFormat, Guid audioSessionGuid)

Initializes the audio stream.

public Int32 InitializeNative(AudioClientShareMode shareMode, AudioClientStreamFlags streamFlags, Int64 hnsBufferDuration, Int64 hnsPeriodicity, IntPtr waveFormat, Guid audioSessionGuid)

Parameters

shareMode

AudioClientShareMode

The sharing mode for the connection. Through this parameter, the client tells the audio engine whether it wants to share the audio endpoint device with other clients.

streamFlags

AudioClientStreamFlags

Flags to control creation of the stream.

hnsBufferDuration

Int64

The buffer capacity as a time value (expressed in 100-nanosecond units). This parameter contains the buffer size that the caller requests for the buffer that the audio application will share with the audio engine (in shared mode) or with the endpoint device (in exclusive mode). If the call succeeds, the method allocates a buffer that is at least this large.

hnsPeriodicity

Int64

The device period. This parameter can be nonzero only in exclusive mode. In shared mode, always set this parameter to 0. In exclusive mode, this parameter specifies the requested scheduling period for successive buffer accesses by the audio endpoint device. If the requested device period lies outside the range that is set by the device's minimum period and the system's maximum period, then the method clamps the period to that range. If this parameter is 0, the method sets the device period to its default value. To obtain the default device period, call the GetDevicePeriodNative method. If the StreamFlagsEventCallback stream flag is set and Exclusive is set as the shareMode, then hnsPeriodicity must be nonzero and equal to hnsBufferDuration.

waveFormat

IntPtr

A pointer to the native format descriptor (a WAVEFORMATEX structure). For more information, see the IAudioClient::Initialize documentation.

audioSessionGuid

Guid

A value that identifies the audio session that the stream belongs to. If the Guid identifies a session that has been previously opened, the method adds the stream to that session. If the GUID does not identify an existing session, the method opens a new session and adds the stream to that session. The stream remains a member of the same session for its lifetime. Use Guid.Empty to use the default session.

Returns

HRESULT

Remarks

For more information, see the IAudioClient::Initialize documentation.

IsFormatSupported(AudioClientShareMode shareMode, WaveFormat waveFormat)

Indicates whether the audio endpoint device supports a particular stream format.

public Boolean IsFormatSupported(AudioClientShareMode shareMode, WaveFormat waveFormat)

Parameters

shareMode

AudioClientShareMode

The sharing mode for the stream format. Through this parameter, the client indicates whether it wants to use the specified format in exclusive mode or shared mode.

waveFormat

WaveFormat

The stream format to check for support by the AudioClient.

Returns

True if the waveFormat is supported. False if the waveFormat is not supported.

Remarks

For more information, see the IAudioClient::IsFormatSupported documentation.

IsFormatSupported(AudioClientShareMode shareMode, WaveFormat waveFormat, WaveFormat& closestMatch)

Indicates whether the audio endpoint device supports a particular stream format.

public Boolean IsFormatSupported(AudioClientShareMode shareMode, WaveFormat waveFormat, WaveFormat& closestMatch)

Parameters

shareMode

AudioClientShareMode

The sharing mode for the stream format. Through this parameter, the client indicates whether it wants to use the specified format in exclusive mode or shared mode.

waveFormat

WaveFormat

The stream format to check for support by the AudioClient.

closestMatch

WaveFormat&

Retrieves the supported format that is closest to the format that the client specified through the waveFormat parameter. If shareMode is Shared, closestMatch will always be null.

Returns

True if the waveFormat is supported. False if the waveFormat is not supported.

Remarks

For more information, see the IAudioClient::IsFormatSupported documentation.
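
A hedged sketch of probing a desired format and falling back to the engine's suggestion or the mix format; the (rate, bits, channels) constructor is assumed from CSCore's WaveFormat class:

WaveFormat desired = new WaveFormat(44100, 16, 2); // assumed constructor: sample rate, bits, channels
WaveFormat closestMatch;

WaveFormat formatToUse;
if (audioClient.IsFormatSupported(AudioClientShareMode.Shared, desired, out closestMatch))
    formatToUse = desired;
else
    formatToUse = closestMatch ?? audioClient.GetMixFormat(); // fall back if no suggestion is returned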


IsFormatSupportedNative(AudioClientShareMode shareMode, WaveFormat waveFormat, WaveFormat& closestMatch)

Indicates whether the audio endpoint device supports a particular stream format.

public Int32 IsFormatSupportedNative(AudioClientShareMode shareMode, WaveFormat waveFormat, WaveFormat& closestMatch)

Parameters

shareMode

AudioClientShareMode

The sharing mode for the stream format. Through this parameter, the client indicates whether it wants to use the specified format in exclusive mode or shared mode.

waveFormat

WaveFormat

The stream format to check for support by the AudioClient.

closestMatch

WaveFormat&

Retrieves the supported format that is closest to the format that the client specified through the waveFormat parameter. If shareMode is Shared, closestMatch will always be null.

Returns

HRESULT code. If the method returns 0 (= S_OK), the endpoint device supports the specified waveFormat. If the method returns 1 (= S_FALSE), the method succeeded with a closestMatch to the specified waveFormat. If the method returns 0x88890008 (= AUDCLNT_E_UNSUPPORTED_FORMAT), the method succeeded but the specified format is not supported in exclusive mode. If the method returns anything else, the method failed.

Remarks

For more information, see the IAudioClient::IsFormatSupported documentation.

Reset()

Resets the audio stream.

public void Reset()

Remarks

For more information, see the IAudioClient::Reset documentation.

ResetNative()

Resets the audio stream.

public Int32 ResetNative()

Returns

HRESULT

Remarks

For more information, see the IAudioClient::Reset documentation.

SetEventHandle(IntPtr handle)

Sets the event handle that the system signals when an audio buffer is ready to be processed by the client.

public void SetEventHandle(IntPtr handle)

Parameters

handle

IntPtr

The event handle.

Remarks

For more information, see the IAudioClient::SetEventHandle documentation.

SetEventHandle(WaitHandle waitHandle)

Sets the event handle that the system signals when an audio buffer is ready to be processed by the client.

public void SetEventHandle(WaitHandle waitHandle)

Parameters

waitHandle

WaitHandle

The event handle.

Remarks

For more information, see the IAudioClient::SetEventHandle documentation.
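
A sketch of event-driven buffering, assuming audioClient was obtained via FromMMDevice; the StreamFlagsEventCallback flag name is taken from the Initialize remarks above, and the buffer size is illustrative:

var bufferReady = new System.Threading.AutoResetEvent(false);

// Event-driven shared mode: periodicity stays 0, as described under Initialize.
audioClient.Initialize(
    AudioClientShareMode.Shared,
    AudioClientStreamFlags.StreamFlagsEventCallback,
    100 * 10000,   // 100 ms buffer, in 100-nanosecond units
    0,
    audioClient.GetMixFormat(),
    Guid.Empty);

audioClient.SetEventHandle(bufferReady);

// In the streaming loop: bufferReady.WaitOne() returns whenever a buffer is ready to be processed.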


SetEventHandleNative(IntPtr handle)

Sets the event handle that the system signals when an audio buffer is ready to be processed by the client.

public Int32 SetEventHandleNative(IntPtr handle)

Parameters

handle

IntPtr

The event handle.

Returns

HRESULT

Remarks

For more information, see the IAudioClient::SetEventHandle documentation.

Start()

Starts the audio stream.

public void Start()

Remarks

For more information, see the IAudioClient::Start documentation.

StartNative()

Starts the audio stream.

public Int32 StartNative()

Returns

HRESULT

Remarks

For more information, see the IAudioClient::Start documentation.

Stop()

Stops the audio stream.

public void Stop()

Remarks

For more information, see the IAudioClient::Stop documentation.

StopNative()

Stops the audio stream.

public Int32 StopNative()

Returns

HRESULT

Remarks

For more information, see the IAudioClient::Stop documentation.
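
A hedged sketch of the stream lifecycle built from the methods above, assuming an initialized audioClient; the rendering work itself is omitted:

audioClient.Start();   // begin streaming

// ... feed or read audio data each pass, sizing writes with GetCurrentPadding ...

audioClient.Stop();    // pause the stream
audioClient.Reset();   // discard pending data; call only while the stream is stopped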

Properties

BufferSize

Gets the maximum capacity of the endpoint buffer.

public Int32 BufferSize { get; }

CurrentPadding

Gets the number of frames of padding in the endpoint buffer.

public Int32 CurrentPadding { get; }

DefaultDevicePeriod

Gets the default interval between periodic processing passes by the audio engine. The time is expressed in 100-nanosecond units.

public Int64 DefaultDevicePeriod { get; }

MinimumDevicePeriod

Gets the minimum interval between periodic processing passes by the audio endpoint device. The time is expressed in 100-nanosecond units.

public Int64 MinimumDevicePeriod { get; }

MixFormat

Gets the stream format that the audio engine uses for its internal processing of shared-mode streams.

public WaveFormat MixFormat { get; }

StreamLatency

Gets the maximum latency for the current stream. This property can be read at any time after the stream has been initialized.

public Int64 StreamLatency { get; }
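
A short sketch reading the convenience properties after Initialize has been called; the 100-nanosecond-to-millisecond conversion is shown for readability:

Console.WriteLine("Buffer size:     {0} frames", audioClient.BufferSize);
Console.WriteLine("Current padding: {0} frames", audioClient.CurrentPadding);
Console.WriteLine("Default period:  {0} ms", audioClient.DefaultDevicePeriod / 10000.0);
Console.WriteLine("Minimum period:  {0} ms", audioClient.MinimumDevicePeriod / 10000.0);
Console.WriteLine("Stream latency:  {0} ms", audioClient.StreamLatency / 10000.0);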

Class Diagram

Class diagram: AudioClient derives from ComObject and exposes the fields, methods, and properties listed above. Related classes shown in the diagram include AudioCaptureClient, AudioClock, AudioRenderClient, and SimpleAudioVolume (each created from an AudioClient via FromAudioClient), as well as CoreAudioAPIException and MMDevice.
