
Core Audio
MSDOSX: Lecture 20

Overview

•What is Core Audio?

•Core Audio Programming Interfaces

•Common Tasks With Core Audio

•Core Audio Frameworks

•What’s Been Shipping Since 10.4?

•Supported Audio and Data Formats

•Resources

What is Core Audio?

•Plug-in interfaces for audio synthesis and audio digital signal processing (DSP)

•Built-in support for reading and writing a wide variety of audio file and data formats

•Plug-in interfaces for handling custom file and data formats

•A modular approach for constructing signal chains

•Scalable multichannel input and output

•Easy synchronization of audio and MIDI data during recording or playback

•A standardized interface to all built-in and external hardware devices, regardless of connection type (USB, FireWire, PCI, and so on)

Core Audio Programming Interfaces

• Audio Unit Services

• Audio Processing Graph API

• Audio File and Converter Services

• Hardware Abstraction Layer (HAL) Services

• Music Player API

• Core MIDI Services and MIDI Server Services

• Core Audio Clock API

• OpenAL (Open Audio Library)

• System Sound API

Audio Unit Services

Allows you to create and manipulate audio units.
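A minimal sketch (not from the lecture) of instantiating and starting the system's default output unit with the AudioComponent API (Mac OS X 10.6 and later); error handling is omitted and no render callback is attached, so it produces silence:

#include <AudioUnit/AudioUnit.h>

int main(void) {
    // Describe the unit we want: Apple's default output unit.
    AudioComponentDescription desc = {
        .componentType         = kAudioUnitType_Output,
        .componentSubType      = kAudioUnitSubType_DefaultOutput,
        .componentManufacturer = kAudioUnitManufacturer_Apple
    };

    // Find a matching component and make an instance of it.
    AudioComponent comp = AudioComponentFindNext(NULL, &desc);
    AudioUnit outputUnit;
    AudioComponentInstanceNew(comp, &outputUnit);

    // Normally a render callback is attached here to supply audio.
    AudioUnitInitialize(outputUnit);
    AudioOutputUnitStart(outputUnit);
    return 0;
}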

Audio Processing Graph API

Lets you assemble audio units into signal-processing chains (AUGraphs) and manage them as a group.
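A minimal sketch, assuming the system-supplied DLS synth and default output unit, of wiring two nodes together in an AUGraph; error handling is omitted:

#include <AudioToolbox/AudioToolbox.h>

int main(void) {
    AUGraph graph;
    NewAUGraph(&graph);

    AudioComponentDescription synthDesc = {
        .componentType         = kAudioUnitType_MusicDevice,
        .componentSubType      = kAudioUnitSubType_DLSSynth,
        .componentManufacturer = kAudioUnitManufacturer_Apple
    };
    AudioComponentDescription outDesc = {
        .componentType         = kAudioUnitType_Output,
        .componentSubType      = kAudioUnitSubType_DefaultOutput,
        .componentManufacturer = kAudioUnitManufacturer_Apple
    };

    AUNode synthNode, outNode;
    AUGraphAddNode(graph, &synthDesc, &synthNode);
    AUGraphAddNode(graph, &outDesc, &outNode);

    // Open the graph (instantiates the units), wire synth output 0 into
    // output input 0, then initialize and start rendering.
    AUGraphOpen(graph);
    AUGraphConnectNodeInput(graph, synthNode, 0, outNode, 0);
    AUGraphInitialize(graph);
    AUGraphStart(graph);
    return 0;
}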

Audio File and Converter Services

• Audio Converters and Codecs

• File Format Information

• Audio Metadata

• Core Audio File Format
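A minimal sketch of querying a file's data format with Extended Audio File Services; the path is a placeholder and error handling is omitted:

#include <AudioToolbox/AudioToolbox.h>
#include <stdio.h>

int main(void) {
    // Placeholder path; any supported audio file will do.
    CFURLRef url = CFURLCreateWithFileSystemPath(
        kCFAllocatorDefault, CFSTR("/tmp/example.caf"),
        kCFURLPOSIXPathStyle, false);

    ExtAudioFileRef file;
    ExtAudioFileOpenURL(url, &file);

    // Ask for the file's native data format.
    AudioStreamBasicDescription asbd;
    UInt32 size = sizeof(asbd);
    ExtAudioFileGetProperty(file, kExtAudioFileProperty_FileDataFormat,
                            &size, &asbd);
    printf("sample rate %.0f Hz, %u channel(s)\n",
           asbd.mSampleRate, (unsigned)asbd.mChannelsPerFrame);

    ExtAudioFileDispose(file);
    CFRelease(url);
    return 0;
}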

Hardware Abstraction Layer (HAL) Services

“Core Audio uses a hardware abstraction layer (HAL) to provide a consistent and predictable interface for applications to deal with hardware.”

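A minimal sketch of asking the HAL for the default output device ID; error handling is omitted:

#include <CoreAudio/CoreAudio.h>
#include <stdio.h>

int main(void) {
    AudioObjectPropertyAddress addr = {
        kAudioHardwarePropertyDefaultOutputDevice,
        kAudioObjectPropertyScopeGlobal,
        kAudioObjectPropertyElementMaster
    };

    AudioDeviceID device = kAudioObjectUnknown;
    UInt32 size = sizeof(device);
    AudioObjectGetPropertyData(kAudioObjectSystemObject, &addr,
                               0, NULL, &size, &device);

    printf("default output device ID: %u\n", (unsigned)device);
    return 0;
}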

Music Player API

“The Music Player API allows you to arrange and play a collection of music tracks.”
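A minimal sketch of loading a standard MIDI file into a MusicSequence and playing it with a MusicPlayer; the path is a placeholder and error handling is omitted:

#include <AudioToolbox/AudioToolbox.h>

int main(void) {
    // Placeholder path to a standard MIDI file.
    CFURLRef url = CFURLCreateWithFileSystemPath(
        kCFAllocatorDefault, CFSTR("/tmp/example.mid"),
        kCFURLPOSIXPathStyle, false);

    MusicSequence sequence;
    NewMusicSequence(&sequence);
    MusicSequenceFileLoad(sequence, url, kMusicSequenceFile_MIDIType, 0);

    // A MusicPlayer drives the sequence; by default it plays through
    // the built-in DLS synth.
    MusicPlayer player;
    NewMusicPlayer(&player);
    MusicPlayerSetSequence(player, sequence);
    MusicPlayerPreroll(player);
    MusicPlayerStart(player);

    CFRunLoopRunInMode(kCFRunLoopDefaultMode, 10.0, false);  // let it play
    MusicPlayerStop(player);

    DisposeMusicPlayer(player);
    DisposeMusicSequence(sequence);
    CFRelease(url);
    return 0;
}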

MIDI

Core MIDI Services and MIDI Server Services

“Core MIDI Services defines an interface that applications and audio units can use to communicate with MIDI devices. It uses a number of abstractions that allow an application to interact with a MIDI network.”

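A minimal sketch of creating a MIDI client and input port and connecting it to every available source; the read proc here only counts packets, and error handling is omitted:

#include <CoreMIDI/CoreMIDI.h>
#include <stdio.h>

// Called on Core MIDI's thread whenever packets arrive from a source.
static void MyReadProc(const MIDIPacketList *pktlist,
                       void *readProcRefCon, void *srcConnRefCon) {
    printf("received %u MIDI packet(s)\n", (unsigned)pktlist->numPackets);
}

int main(void) {
    MIDIClientRef client;
    MIDIClientCreate(CFSTR("Example Client"), NULL, NULL, &client);

    MIDIPortRef inPort;
    MIDIInputPortCreate(client, CFSTR("Input"), MyReadProc, NULL, &inPort);

    // Connect the port to every MIDI source on the system.
    ItemCount n = MIDIGetNumberOfSources();
    for (ItemCount i = 0; i < n; i++)
        MIDIPortConnectSource(inPort, MIDIGetSource(i), NULL);

    CFRunLoopRun();  // keep the process alive to receive MIDI
    return 0;
}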

Core Audio Clock API

The Core Audio Clock API provides:

• A reference clock that you can use to synchronize applications or devices.

• A standalone timing source, or...

• A synchronized timer with an external trigger, such as a MIDI beat clock or MIDI time code.

• You can start and stop the clock yourself, or...

• You can set the clock to activate or deactivate in response to certain events.

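A minimal sketch of creating a Core Audio clock, starting it, and reading the current time back in seconds; error handling is omitted:

#include <AudioToolbox/AudioToolbox.h>
#include <stdio.h>
#include <unistd.h>

int main(void) {
    CAClockRef clk;
    CAClockNew(0, &clk);
    CAClockStart(clk);

    sleep(1);  // let the clock run for a moment

    // Read the current time back in seconds.
    CAClockTime now;
    CAClockGetCurrentTime(clk, kCAClockTimeFormat_Seconds, &now);
    printf("clock time: %.3f s\n", now.time.seconds);

    CAClockStop(clk);
    CAClockDispose(clk);
    return 0;
}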

OpenAL (Open Audio Library)

“OpenAL is a cross-platform API used to position and manipulate sounds in a simulated three-dimensional space.”
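A minimal sketch of opening the default OpenAL device and positioning a source to the listener's left; no audio data is queued, so nothing is audible as written:

#include <OpenAL/al.h>
#include <OpenAL/alc.h>

int main(void) {
    // Open the default output device and make a context current.
    ALCdevice  *device  = alcOpenDevice(NULL);
    ALCcontext *context = alcCreateContext(device, NULL);
    alcMakeContextCurrent(context);

    // Put the listener at the origin and a source two units to the left.
    alListener3f(AL_POSITION, 0.0f, 0.0f, 0.0f);

    ALuint source;
    alGenSources(1, &source);
    alSource3f(source, AL_POSITION, -2.0f, 0.0f, 0.0f);

    // To hear anything, attach a buffer with alSourcei(source, AL_BUFFER, buf)
    // and call alSourcePlay(source).

    alDeleteSources(1, &source);
    alcMakeContextCurrent(NULL);
    alcDestroyContext(context);
    alcCloseDevice(device);
    return 0;
}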

System Sound API

“The System Sound API provides a simple way to play standard system sounds in your application.”
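A minimal sketch of registering a sound file as a system sound and playing it; the path is just an example, and error handling is omitted:

#include <AudioToolbox/AudioToolbox.h>

int main(void) {
    // Example path to a short sound file; substitute your own.
    CFURLRef url = CFURLCreateWithFileSystemPath(
        kCFAllocatorDefault, CFSTR("/System/Library/Sounds/Ping.aiff"),
        kCFURLPOSIXPathStyle, false);

    SystemSoundID soundID;
    AudioServicesCreateSystemSoundID(url, &soundID);
    AudioServicesPlaySystemSound(soundID);

    // Playback is asynchronous; give it a moment to finish.
    CFRunLoopRunInMode(kCFRunLoopDefaultMode, 2.0, false);

    AudioServicesDisposeSystemSoundID(soundID);
    CFRelease(url);
    return 0;
}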

Common Tasks in Core Audio

•Reading and Writing Audio Data

•Interfacing with Hardware Devices

•The AUHAL

•Using Aggregate Devices

•Creating Audio Units & Hosting Audio Units

•Handling MIDI Data

Reading and Writing Audio Data
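A minimal sketch of reading a file's samples as 32-bit floats with Extended Audio File Services, which converts from the file's native format on the fly; the path is a placeholder and error handling is omitted:

#include <AudioToolbox/AudioToolbox.h>
#include <stdio.h>

int main(void) {
    CFURLRef url = CFURLCreateWithFileSystemPath(
        kCFAllocatorDefault, CFSTR("/tmp/example.wav"),   // placeholder path
        kCFURLPOSIXPathStyle, false);

    ExtAudioFileRef file;
    ExtAudioFileOpenURL(url, &file);

    // Ask ExtAudioFile to hand us interleaved stereo 32-bit floats,
    // regardless of what the file actually contains.
    AudioStreamBasicDescription clientFormat = {
        .mSampleRate       = 44100.0,
        .mFormatID         = kAudioFormatLinearPCM,
        .mFormatFlags      = kAudioFormatFlagIsFloat | kAudioFormatFlagIsPacked,
        .mChannelsPerFrame = 2,
        .mBitsPerChannel   = 32,
        .mBytesPerFrame    = 8,
        .mFramesPerPacket  = 1,
        .mBytesPerPacket   = 8
    };
    ExtAudioFileSetProperty(file, kExtAudioFileProperty_ClientDataFormat,
                            sizeof(clientFormat), &clientFormat);

    // Read one buffer's worth of frames.
    float samples[4096 * 2];
    AudioBufferList bufList;
    bufList.mNumberBuffers = 1;
    bufList.mBuffers[0].mNumberChannels = 2;
    bufList.mBuffers[0].mDataByteSize   = sizeof(samples);
    bufList.mBuffers[0].mData           = samples;

    UInt32 frames = 4096;
    ExtAudioFileRead(file, &frames, &bufList);   // frames now holds the count read
    printf("read %u frames\n", (unsigned)frames);

    ExtAudioFileDispose(file);
    CFRelease(url);
    return 0;
}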

Interfacing with Hardware Devices

Operations must go through the hardware abstraction layer (HAL).

The AUHAL

“If you need to connect to an input device, or a hardware device other than the default output device, you need to use the AUHAL.”

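A minimal sketch of the usual AUHAL capture setup: enable input on element 1 and disable output on element 0; error handling and the remaining configuration are omitted:

#include <AudioUnit/AudioUnit.h>

int main(void) {
    AudioComponentDescription desc = {
        .componentType         = kAudioUnitType_Output,
        .componentSubType      = kAudioUnitSubType_HALOutput,
        .componentManufacturer = kAudioUnitManufacturer_Apple
    };
    AudioComponent comp = AudioComponentFindNext(NULL, &desc);
    AudioUnit auhal;
    AudioComponentInstanceNew(comp, &auhal);

    // Element 1 is the input side of the AUHAL, element 0 the output side.
    UInt32 enable = 1, disable = 0;
    AudioUnitSetProperty(auhal, kAudioOutputUnitProperty_EnableIO,
                         kAudioUnitScope_Input, 1, &enable, sizeof(enable));
    AudioUnitSetProperty(auhal, kAudioOutputUnitProperty_EnableIO,
                         kAudioUnitScope_Output, 0, &disable, sizeof(disable));

    // Next steps: select a device with kAudioOutputUnitProperty_CurrentDevice,
    // install an input callback, then AudioUnitInitialize / AudioOutputUnitStart.
    return 0;
}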

Using Aggregate Devices

“When interfacing with hardware audio devices, Core Audio allows you to add an additional level of abstraction, creating aggregate devices which combine the inputs and outputs of multiple devices to appear as a single device.”

Creating Audio Units & Hosting Audio Units

...probably read the documentation. :-)

Handling MIDI Data

•MIDI File Read

•MIDI Sequence Playback

•MIDI Device Play-through (see the sketch below)

•MIDI File Recording

•Mixing Audio Sources

•“Big Picture”
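A minimal sketch of MIDI device play-through: packets arriving at an input port are forwarded unchanged to the first MIDI destination; error handling is omitted:

#include <CoreMIDI/CoreMIDI.h>

static MIDIPortRef     gOutPort = 0;
static MIDIEndpointRef gDest    = 0;

// Forward every incoming packet list as-is to the chosen destination.
static void PassThruProc(const MIDIPacketList *pktlist,
                         void *readProcRefCon, void *srcConnRefCon) {
    if (gOutPort && gDest)
        MIDISend(gOutPort, gDest, pktlist);
}

int main(void) {
    MIDIClientRef client;
    MIDIClientCreate(CFSTR("PlayThru"), NULL, NULL, &client);

    MIDIPortRef inPort;
    MIDIInputPortCreate(client, CFSTR("In"), PassThruProc, NULL, &inPort);
    MIDIOutputPortCreate(client, CFSTR("Out"), &gOutPort);

    // Forward to the first destination, if there is one.
    if (MIDIGetNumberOfDestinations() > 0)
        gDest = MIDIGetDestination(0);

    // Listen to every source on the system.
    for (ItemCount i = 0; i < MIDIGetNumberOfSources(); i++)
        MIDIPortConnectSource(inPort, MIDIGetSource(i), NULL);

    CFRunLoopRun();
    return 0;
}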

Core Audio Frameworks

•AudioToolbox.framework

•AudioUnit.framework

•CoreAudioKit.framework

•CoreAudio.framework

•CoreMIDI.framework

•CoreMIDIServer.framework

•OpenAL.framework

System Supplied Audio Units

System-supplied effect units (kAudioUnitType_Effect)

AUHiPass

kAudioUnitSubType_HighPassFilter

A high-pass filter with an adjustable resonance peak.
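A minimal sketch (not from the lecture) of instantiating AUHiPass and setting its cutoff and resonance parameters; the values are arbitrary and error handling is omitted:

#include <AudioUnit/AudioUnit.h>
#include <AudioUnit/AudioUnitParameters.h>

int main(void) {
    AudioComponentDescription desc = {
        .componentType         = kAudioUnitType_Effect,
        .componentSubType      = kAudioUnitSubType_HighPassFilter,
        .componentManufacturer = kAudioUnitManufacturer_Apple
    };
    AudioComponent comp = AudioComponentFindNext(NULL, &desc);
    AudioUnit hipass;
    AudioComponentInstanceNew(comp, &hipass);
    AudioUnitInitialize(hipass);

    // Parameters live on the global scope, element 0.
    AudioUnitSetParameter(hipass, kHipassParam_CutoffFrequency,
                          kAudioUnitScope_Global, 0, 1000.0, 0);  // Hz
    AudioUnitSetParameter(hipass, kHipassParam_Resonance,
                          kAudioUnitScope_Global, 0, 6.0, 0);     // dB
    return 0;
}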

System Supplied Audio Units

System-supplied instrument unit (kAudioUnitType_MusicDevice)

DLSMusicDevice

kAudioUnitSubType_DLSSynth

A virtual instrument unit that lets you play MIDI data using sound banks in the SoundFont or Downloadable Sounds (DLS) format.

System Supplied Audio Units

 System-supplied mixer units (kAudioUnitType_Mixer)

AUMixer3D

kAudioUnitSubType_3DMixer

...can take several different signals and mix them so they appear to be positioned in a three-dimensional space.

System Supplied Audio Units

System-supplied converter units (kAudioUnitType_FormatConverter)

AUTimePitch

kAudioUnitSubType_TimePitch

A unit that lets you change the speed of playback without altering the pitch, or vice versa.

System Supplied Audio Units

 System-supplied output units (kAudioUnitType_Output)

AudioDeviceOutput

kAudioUnitSubType_HALOutput

A unit that interfaces with an audio device using the hardware abstraction layer. Also called the AUHAL.

System Supplied Audio Units

System-supplied generator units (kAudioUnitType_Generator)

AUNetReceive

kAudioUnitSubType_NetReceive

A unit that receives streamed audio data from a network.

Supported Audio and Data File Formats

Audio File Formats

AAC (.aac, .adts)

AC3 (.ac3)

AIFC (.aif, .aiff, .aifc)

AIFF (.aiff)

Apple Core Audio Format (.caf)

MPEG Layer 3 (.mp3)

MPEG 4 Audio (.mp4)

MPEG 4 Audio (.m4a)

NeXT/Sun Audio (.snd, .au)

Sound Designer II (.sd2)

WAVE (.wav)


Supported Audio and Data File Formats

Data File Formats

MPEG Layer 3 ('.mp3')

MACE 3:1 ('MAC3')

MACE 6:1 ('MAC6')

QDesign Music 2 ('QDM2')

QDesign ('QDMC')

Qualcomm PureVoice ('Qclp')

Qualcomm QCELP ('qclq')

AAC ('aac ')

Apple Lossless ('alac')

Apple GSM 10:1 ('agsm')

ALaw 2:1 ('alaw')

Apple DRM Audio Decoder ('drms')

AC-3

DVI 4:1 ('dvi ')

Apple IMA 4:1 ('ima4')

LPC 23:1 ('lpc ')

Microsoft ADPCM

DVI ADPCM

GSM610

AMR Narrowband ('samr')

µLaw 2:1 ('ulaw')

Resources

•Overview
http://developer.apple.com/library/mac/#documentation/MusicAudio/Conceptual/CoreAudioOverview/Introduction/Introduction.html

•Core Audio Frameworks
http://developer.apple.com/library/mac/#documentation/MusicAudio/Conceptual/CoreAudioOverview/CoreAudioFrameworks/CoreAudioFrameworks.html

•System Supplied Audio Units
http://developer.apple.com/library/mac/#documentation/MusicAudio/Conceptual/CoreAudioOverview/SystemAudioUnits/SystemAudioUnits.html

•"iPhone Core Audio Brain Dump"
http://www.subfurther.com/blog/2009/04/28/an-iphone-core-audio-brain-dump/
