Version History

Version | Date
---|---
V1.0 | 2017.12.29
Preface
Core Audio interacts with audio streams, complex buffers, and audiovisual timestamps through specialized data types. The next few posts analyze this framework in detail. If you are interested, you can refer to the posts in this series:
1. Core Audio Framework Detailed Analysis (Part 1) — Basic Overview
The Place of the Core Audio Framework
Core Audio is the foundation of digital audio processing on iOS and macOS. It is the set of software frameworks that applications use to work with audio: every iOS audio API is either provided by Core Audio directly or is a wrapper around interfaces it provides.
In one sentence: it is the foundation of every audio-processing framework on iOS or macOS.
The overall architecture can be represented by a diagram from the official documentation.
Next, let's analyze it layer by layer.
High-Level Services
The high-level services sit closest to the top of the stack; much of everyday audio development work can be completed entirely in this layer.
1. Audio Queue Services
It lives in the AudioToolbox framework.
It provides recording, playback, pause, looping, and synchronization of audio, and it automatically employs the necessary codecs to handle compressed audio formats.
To play and record audio on iOS devices, Apple recommends the AVAudioPlayer and AVAudioRecorder classes in the AVFoundation framework. They are simple to use, but they do not support streaming. This means that before playback can begin, the whole audio file must finish loading, and when recording, you cannot get at the recorded data until the recording has ended. That imposes a serious limitation on apps. To solve this problem, we use Audio Queue Services to play and record audio. If you are interested, see my earlier posts on Audio Queue Services; they cover the principles and workflow in detail, so here I only give the schematic diagrams for recording and playback.
2. AVAudioPlayer
It lives in the AVFoundation framework.
It is an audio playback class with an Objective-C interface, built for the iOS platform. It can play every audio format that iOS supports, chiefly the following:
AAC
AMR (Adaptive Multi-Rate, a speech format)
ALAC (Apple Lossless Audio Codec)
iLBC (Internet Low Bitrate Codec, another speech format)
IMA4 (IMA/ADPCM)
Linear PCM (uncompressed)
µ-law and a-law
MP3 (MPEG-1 Audio Layer 3)
It is a pure Objective-C implementation, and its main appeal is how simple it is to call. Here is a quick look at its API:
```objc
#import <AVFoundation/AVBase.h>
#import <AVFoundation/AVAudioFormat.h>
#import <Foundation/Foundation.h>
#import <AVFAudio/AVAudioSettings.h>
#if (TARGET_OS_IPHONE && __has_include(<AVFoundation/AVAudioSession.h>))
#import <AVFAudio/AVAudioSession.h>
#endif // #if TARGET_OS_EMBEDDED
#import <Availability.h>
NS_ASSUME_NONNULL_BEGIN
@class NSData, NSURL, NSError;
#if (TARGET_OS_IPHONE && __has_include(<AVFoundation/AVAudioSession.h>))
@class AVAudioSessionChannelDescription;
#endif
@protocol AVAudioPlayerDelegate;
NS_CLASS_AVAILABLE(10_7, 2_2) __WATCHOS_AVAILABLE(3_0)
@interface AVAudioPlayer : NSObject {
@private
id _impl;
}
/* For all of these init calls, if a return value of nil is given you can check outError to see what the problem was.
If not nil, then the object is usable for playing
*/
/* all data must be in the form of an audio file understood by CoreAudio */
- (nullable instancetype)initWithContentsOfURL:(NSURL *)url error:(NSError **)outError;
- (nullable instancetype)initWithData:(NSData *)data error:(NSError **)outError;
/* The file type hint is a constant defined in AVMediaFormat.h whose value is a UTI for a file format. e.g. AVFileTypeAIFF. */
/* Sometimes the type of a file cannot be determined from the data, or it is actually corrupt. The file type hint tells the parser what kind of data to look for so that files which are not self identifying or possibly even corrupt can be successfully parsed. */
- (nullable instancetype)initWithContentsOfURL:(NSURL *)url fileTypeHint:(NSString * __nullable)utiString error:(NSError **)outError NS_AVAILABLE(10_9, 7_0);
- (nullable instancetype)initWithData:(NSData *)data fileTypeHint:(NSString * __nullable)utiString error:(NSError **)outError NS_AVAILABLE(10_9, 7_0);
/* transport control */
/* methods that return BOOL return YES on success and NO on failure. */
- (BOOL)prepareToPlay; /* get ready to play the sound. happens automatically on play. */
- (BOOL)play; /* sound is played asynchronously. */
- (BOOL)playAtTime:(NSTimeInterval)time NS_AVAILABLE(10_7, 4_0); /* play a sound some time in the future. time is an absolute time based on and greater than deviceCurrentTime. */
- (void)pause; /* pauses playback, but remains ready to play. */
- (void)stop; /* stops playback. no longer ready to play. */
/* properties */
@property(readonly, getter=isPlaying) BOOL playing; /* is it playing or not? */
@property(readonly) NSUInteger numberOfChannels;
@property(readonly) NSTimeInterval duration; /* the duration of the sound. */
#if !TARGET_OS_IPHONE
/* the UID of the current audio device (as a string) */
@property(copy, nullable) NSString *currentDevice API_AVAILABLE(macos(10.13));
#endif
/* the delegate will be sent messages from the AVAudioPlayerDelegate protocol */
@property(assign, nullable) id<AVAudioPlayerDelegate> delegate;
/* one of these properties will be non-nil based on the init... method used */
@property(readonly, nullable) NSURL *url; /* returns nil if object was not created with a URL */
@property(readonly, nullable) NSData *data; /* returns nil if object was not created with a data object */
@property float pan NS_AVAILABLE(10_7, 4_0); /* set panning. -1.0 is left, 0.0 is center, 1.0 is right. */
@property float volume; /* The volume for the sound. The nominal range is from 0.0 to 1.0. */
- (void)setVolume:(float)volume fadeDuration:(NSTimeInterval)duration API_AVAILABLE(macos(10.12), ios(10.0), watchos(3.0), tvos(10.0)); /* fade to a new volume over a duration */
@property BOOL enableRate NS_AVAILABLE(10_8, 5_0); /* You must set enableRate to YES for the rate property to take effect. You must set this before calling prepareToPlay. */
@property float rate NS_AVAILABLE(10_8, 5_0); /* See enableRate. The playback rate for the sound. 1.0 is normal, 0.5 is half speed, 2.0 is double speed. */
/* If the sound is playing, currentTime is the offset into the sound of the current playback position.
If the sound is not playing, currentTime is the offset into the sound where playing would start. */
@property NSTimeInterval currentTime;
/* returns the current time associated with the output device */
@property(readonly) NSTimeInterval deviceCurrentTime NS_AVAILABLE(10_7, 4_0);
/* "numberOfLoops" is the number of times that the sound will return to the beginning upon reaching the end.
A value of zero means to play the sound just once.
A value of one will result in playing the sound twice, and so on..
Any negative number will loop indefinitely until stopped.
*/
@property NSInteger numberOfLoops;
/* settings */
@property(readonly) NSDictionary<NSString *, id> *settings NS_AVAILABLE(10_7, 4_0); /* returns a settings dictionary with keys as described in AVAudioSettings.h */
/* returns the format of the audio data */
@property(readonly) AVAudioFormat *format API_AVAILABLE(macos(10.12), ios(10.0), watchos(3.0), tvos(10.0));
/* metering */
@property(getter=isMeteringEnabled) BOOL meteringEnabled; /* turns level metering on or off. default is off. */
- (void)updateMeters; /* call to refresh meter values */
- (float)peakPowerForChannel:(NSUInteger)channelNumber; /* returns peak power in decibels for a given channel */
- (float)averagePowerForChannel:(NSUInteger)channelNumber; /* returns average power in decibels for a given channel */
#if (TARGET_OS_IPHONE && __has_include(<AVFoundation/AVAudioSession.h>))
/* The channels property lets you assign the output to play to specific channels as described by AVAudioSession's channels property */
/* This property is nil valued until set. */
/* The array must have the same number of channels as returned by the numberOfChannels property. */
@property(nonatomic, copy, nullable) NSArray<AVAudioSessionChannelDescription *> *channelAssignments NS_AVAILABLE(10_9, 7_0); /* Array of AVAudioSessionChannelDescription objects */
#endif
@end
/* A protocol for delegates of AVAudioPlayer */
__WATCHOS_AVAILABLE(3_0)
@protocol AVAudioPlayerDelegate <NSObject>
@optional
/* audioPlayerDidFinishPlaying:successfully: is called when a sound has finished playing. This method is NOT called if the player is stopped due to an interruption. */
- (void)audioPlayerDidFinishPlaying:(AVAudioPlayer *)player successfully:(BOOL)flag;
/* if an error occurs while decoding it will be reported to the delegate. */
- (void)audioPlayerDecodeErrorDidOccur:(AVAudioPlayer *)player error:(NSError * __nullable)error;
#if TARGET_OS_IPHONE
/* AVAudioPlayer INTERRUPTION NOTIFICATIONS ARE DEPRECATED - Use AVAudioSession instead. */
/* audioPlayerBeginInterruption: is called when the audio session has been interrupted while the player was playing. The player will have been paused. */
- (void)audioPlayerBeginInterruption:(AVAudioPlayer *)player NS_DEPRECATED_IOS(2_2, 8_0);
/* audioPlayerEndInterruption:withOptions: is called when the audio session interruption has ended and this player had been interrupted while playing. */
/* Currently the only flag is AVAudioSessionInterruptionFlags_ShouldResume. */
- (void)audioPlayerEndInterruption:(AVAudioPlayer *)player withOptions:(NSUInteger)flags NS_DEPRECATED_IOS(6_0, 8_0);
- (void)audioPlayerEndInterruption:(AVAudioPlayer *)player withFlags:(NSUInteger)flags NS_DEPRECATED_IOS(4_0, 6_0);
/* audioPlayerEndInterruption: is called when the preferred method, audioPlayerEndInterruption:withFlags:, is not implemented. */
- (void)audioPlayerEndInterruption:(AVAudioPlayer *)player NS_DEPRECATED_IOS(2_2, 6_0);
#endif // TARGET_OS_IPHONE
@end
NS_ASSUME_NONNULL_END
```
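A minimal usage sketch of the API above. The file name sound.mp3 is hypothetical; in a real app, keep a strong reference (for example, a property) to the player so ARC does not release it mid-playback.

```objc
#import <AVFoundation/AVFoundation.h>

// Assumes a file named sound.mp3 in the main bundle (hypothetical).
NSURL *url = [[NSBundle mainBundle] URLForResource:@"sound" withExtension:@"mp3"];
NSError *error = nil;
AVAudioPlayer *player = [[AVAudioPlayer alloc] initWithContentsOfURL:url error:&error];
if (player == nil) {
    NSLog(@"Failed to create player: %@", error);
} else {
    player.numberOfLoops = 0;   // play once
    player.volume = 0.8;
    [player prepareToPlay];     // optional; play calls it automatically
    [player play];              // playback is asynchronous
}
```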
3. Extended Audio File Services
It is a combination of Audio File Services and Audio Converter Services, and provides the ability to read and write both compressed and uncompressed audio files.
It lives in the AudioToolbox framework alongside Audio File Services, Audio File Stream Services, and Audio Queue Services. Compared with Audio File Services and Audio Converter Services, the ExtendedAudioFile API is far simpler and clearer to call, and it never requires you to handle AudioStreamPacketDescription yourself, which keeps the logic in real projects much cleaner.
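As a rough illustration of how little ceremony ExtendedAudioFile needs, here is a sketch that opens a file and reads it back as PCM. The client format values are assumptions; ExtAudioFile drives the converter internally, so no packet-description handling appears anywhere.

```objc
#include <AudioToolbox/ExtendedAudioFile.h>

// Open a file (URL supplied by the caller) and read it as 16-bit stereo PCM.
void ReadFileAsPCM(CFURLRef fileURL) {
    ExtAudioFileRef file;
    if (ExtAudioFileOpenURL(fileURL, &file) != noErr) return;

    // The "client format" we want data delivered in (assumed values).
    AudioStreamBasicDescription clientFormat = {0};
    clientFormat.mSampleRate       = 44100.0;
    clientFormat.mFormatID         = kAudioFormatLinearPCM;
    clientFormat.mFormatFlags      = kLinearPCMFormatFlagIsSignedInteger | kLinearPCMFormatFlagIsPacked;
    clientFormat.mChannelsPerFrame = 2;
    clientFormat.mBitsPerChannel   = 16;
    clientFormat.mBytesPerFrame    = 4;
    clientFormat.mFramesPerPacket  = 1;
    clientFormat.mBytesPerPacket   = 4;
    ExtAudioFileSetProperty(file, kExtAudioFileProperty_ClientDataFormat,
                            sizeof(clientFormat), &clientFormat);

    SInt16 samples[4096];
    AudioBufferList bufferList;
    bufferList.mNumberBuffers = 1;
    bufferList.mBuffers[0].mNumberChannels = 2;
    bufferList.mBuffers[0].mDataByteSize   = sizeof(samples);
    bufferList.mBuffers[0].mData           = samples;

    UInt32 frames = sizeof(samples) / clientFormat.mBytesPerFrame;
    while (ExtAudioFileRead(file, &frames, &bufferList) == noErr && frames > 0) {
        // ... process `frames` frames of PCM in `samples` ...
        frames = sizeof(samples) / clientFormat.mBytesPerFrame;
        bufferList.mBuffers[0].mDataByteSize = sizeof(samples);
    }
    ExtAudioFileDispose(file);
}
```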
4. OpenAL
It lives in the OpenAL framework.
It is Core Audio's implementation of the OpenAL standard and can render 3D mixing effects.
OpenAL's main job is encoding source objects, audio buffers, and a listener. A source object holds a pointer to a buffer, plus the sound's velocity, position, direction, and intensity. The listener object holds the listener's velocity, position, and direction, together with an overall gain applied to all sound. Buffers hold audio data as 8- or 16-bit, mono or stereo PCM, and the rendering engine performs all the necessary computation: distance attenuation, the Doppler effect, and so on.
Unlike the OpenGL specification, the OpenAL specification comprises two API branches: the core, made up of the actual OpenAL functions, and the ALC API, which manages rendering contexts and resource usage and encapsulates the platform-specific differences. There is also the ALUT library, which offers high-level "easy to use" functions; its role is comparable to OpenGL's GLUT.
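A minimal sketch of those three concepts (buffer, source, listener) in code. The pcmData buffer and the 44100 Hz rate are assumptions for illustration.

```objc
#include <OpenAL/al.h>
#include <OpenAL/alc.h>

// Minimal device/context setup plus one positioned source.
void PlayPositionedSound(const void *pcmData, ALsizei pcmBytes) {
    ALCdevice  *device  = alcOpenDevice(NULL);          // default output device
    ALCcontext *context = alcCreateContext(device, NULL);
    alcMakeContextCurrent(context);

    // Upload 16-bit mono PCM into a buffer.
    ALuint buffer;
    alGenBuffers(1, &buffer);
    alBufferData(buffer, AL_FORMAT_MONO16, pcmData, pcmBytes, 44100);

    // Create a source, position it in 3D space, attach the buffer.
    ALuint source;
    alGenSources(1, &source);
    alSourcei(source, AL_BUFFER, buffer);
    alSource3f(source, AL_POSITION, 1.0f, 0.0f, -2.0f); // to the right, ahead

    // The listener sits at the origin; the engine computes attenuation, Doppler, etc.
    alListener3f(AL_POSITION, 0.0f, 0.0f, 0.0f);

    alSourcePlay(source);
}
```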
Mid-Level Services
This layer is fairly complete in functionality, covering audio data format conversion, audio file reading and writing, audio stream parsing, plug-in support, and more.
1. Audio Converter Services
It lives in the AudioToolbox framework.
It is responsible for converting between audio data formats.
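A small sketch of creating a converter; both format descriptions are assumed to be filled in by the caller. The conversion itself would then be driven with AudioConverterFillComplexBuffer.

```objc
#import <Foundation/Foundation.h>
#include <AudioToolbox/AudioToolbox.h>

// Create a converter between two caller-supplied formats (hypothetical here).
AudioConverterRef CreateConverter(const AudioStreamBasicDescription *srcFormat,
                                  const AudioStreamBasicDescription *dstFormat) {
    AudioConverterRef converter = NULL;
    OSStatus status = AudioConverterNew(srcFormat, dstFormat, &converter);
    if (status != noErr) {
        NSLog(@"AudioConverterNew failed: %d", (int)status);
        return NULL;
    }
    return converter; // caller disposes with AudioConverterDispose()
}
```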
2. Audio File Services
It lives in the AudioToolbox framework.
It is responsible for reading and writing audio data.
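For example, a sketch that opens a file (URL supplied by the caller) and queries its data format:

```objc
#include <AudioToolbox/AudioFile.h>

// Open an audio file read-only and inspect its stream format.
void InspectAudioFile(CFURLRef fileURL) {
    AudioFileID fileID;
    if (AudioFileOpenURL(fileURL, kAudioFileReadPermission, 0, &fileID) != noErr) return;

    AudioStreamBasicDescription format;
    UInt32 size = sizeof(format);
    AudioFileGetProperty(fileID, kAudioFilePropertyDataFormat, &size, &format);
    // format.mSampleRate, format.mChannelsPerFrame, etc. are now available.

    AudioFileClose(fileID);
}
```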
3. Audio Unit Services and Audio Processing Graph Services
They live in the AudioToolbox framework.
They support digital signal processing plug-ins such as equalizers and mixers.
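A sketch of wiring such a plug-in chain with AUGraph: a multichannel mixer node feeding the Remote I/O output unit. Error handling is elided.

```objc
#include <AudioToolbox/AudioToolbox.h>

// Build a tiny processing graph: multichannel mixer -> Remote I/O output.
void StartMixerGraph(void) {
    AUGraph graph;
    NewAUGraph(&graph);

    AudioComponentDescription ioDesc = {0};
    ioDesc.componentType         = kAudioUnitType_Output;
    ioDesc.componentSubType      = kAudioUnitSubType_RemoteIO; // iOS output unit
    ioDesc.componentManufacturer = kAudioUnitManufacturer_Apple;

    AudioComponentDescription mixerDesc = {0};
    mixerDesc.componentType         = kAudioUnitType_Mixer;
    mixerDesc.componentSubType      = kAudioUnitSubType_MultiChannelMixer;
    mixerDesc.componentManufacturer = kAudioUnitManufacturer_Apple;

    AUNode ioNode, mixerNode;
    AUGraphAddNode(graph, &ioDesc, &ioNode);
    AUGraphAddNode(graph, &mixerDesc, &mixerNode);
    AUGraphOpen(graph);

    // Mixer output bus 0 -> I/O unit input element 0.
    AUGraphConnectNodeInput(graph, mixerNode, 0, ioNode, 0);

    AUGraphInitialize(graph);
    AUGraphStart(graph);
}
```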
4. Audio File Stream Services
It lives in the AudioToolbox framework.
It is responsible for parsing audio streams.
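A sketch of the parsing flow: open a parser with two callbacks, then feed it bytes as they arrive. What the callbacks do with the format and packets (typically handing them to an Audio Queue) is only hinted at in comments.

```objc
#include <AudioToolbox/AudioFileStream.h>

// Property listener: fired as the parser discovers properties such as the data format.
static void OnProperty(void *clientData, AudioFileStreamID stream,
                       AudioFileStreamPropertyID propertyID,
                       AudioFileStreamPropertyFlags *ioFlags) {
    if (propertyID == kAudioFileStreamProperty_DataFormat) {
        AudioStreamBasicDescription format;
        UInt32 size = sizeof(format);
        AudioFileStreamGetProperty(stream, propertyID, &size, &format);
        // ... remember the format for the decoder/queue ...
    }
}

// Packets callback: fired as complete audio packets are separated out.
static void OnPackets(void *clientData, UInt32 numberBytes, UInt32 numberPackets,
                      const void *inputData,
                      AudioStreamPacketDescription *packetDescriptions) {
    // ... hand the packets to an AudioQueue for decoding and playback ...
}

void ParseStreamedBytes(const void *bytes, UInt32 length) {
    static AudioFileStreamID stream = NULL;
    if (stream == NULL) {
        AudioFileStreamOpen(NULL, OnProperty, OnPackets, 0 /* type unknown */, &stream);
    }
    // Feed whatever arrived from the network; callbacks fire as data is understood.
    AudioFileStreamParseBytes(stream, length, bytes, 0);
}
```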
5. Core Audio Clock Services
It lives in the Core Audio framework.
It is responsible for audio clock synchronization.
Low-Level Services
This layer is used mainly by audio apps on macOS that demand maximum real-time performance; most audio apps never need it. Besides, iOS provides higher-level APIs with strong real-time performance that should cover your needs. OpenAL, for example, offers games real-time audio processing with direct I/O access.
1. I/O Kit
It lives in the IOKit framework and interacts with hardware drivers.
It provides user-space access to hardware devices and drivers: the I/O Kit framework implements non-kernel access to I/O Kit objects (drivers and nubs) through the device-interface mechanism.
2. Audio HAL
The audio hardware abstraction layer. It decouples API calls from the actual hardware, keeping the two independent.
3. Core MIDI
It lives in the Core MIDI framework and communicates with MIDI devices such as hardware keyboards and synthesizers.
The Core MIDI framework provides APIs for communicating with MIDI (Musical Instrument Digital Interface) devices, including hardware keyboards and synthesizers. An iOS device connects to them through the dock connector or over a network. For more information about the dock connector, see Apple's MFi program.
4. Host Time Services
Provides access to the computer's hardware clock.
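On iOS and macOS, host time is the mach absolute time clock, the same clock that Core Audio timestamps (for example AudioTimeStamp.mHostTime) are expressed in. A small sketch of reading it and converting ticks to nanoseconds:

```objc
#include <mach/mach_time.h>

// Read the hardware host clock and convert its ticks to nanoseconds.
uint64_t HostTimeNanos(void) {
    static mach_timebase_info_data_t timebase;
    if (timebase.denom == 0) {
        mach_timebase_info(&timebase); // ticks -> nanoseconds ratio
    }
    return mach_absolute_time() * timebase.numer / timebase.denom;
}
```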
Choosing an Audio Framework for Different Scenarios
1. Scenario One
If you only need to play audio, with no other requirements, AVAudioPlayer is enough. Its interface is simple and you do not have to care about the internals: typically you just hand it the URL of a playback source, control it through its play, pause, and stop methods, and observe its playback state to update the UI.
2. Scenario Two
If the app needs streamed audio playback, you need AudioFileStreamer plus Audio Queue: read the network or local stream into memory and submit it to AudioFileStreamer, which parses it and separates out the audio frames; the separated frames can then be handed to AudioQueue for decoding and playback. See the sketches in the sections above.
3. Scenario Three
If the app needs to apply effects to the audio (equalizer, reverb), then beyond reading and parsing the data you also need AudioConverter or a codec to convert the audio data into PCM, and then AudioUnit plus AUGraph for effect processing and playback. See the sketches in the sections above.
后記
未完,待續(xù)~~~