I. Generating a waveform can be broken down into the following four steps
acquisition (collecting the data)
Audio data can be collected by reading a file with ExtAudioFile, or by capturing the microphone input directly through an AudioUnit.

storage (storing the data)
The audio data can first be cached in a data structure; when the user zooms in or out on the waveform, the plot can be redrawn from the original data.

reduction (reducing the samples)
Reduce the amount of audio data used for drawing. The sampling rate may be 44,100 samples/second; to display the waveform quickly on screen we can skip some samples, for example using a 200:1 samples-to-pixels ratio, so that one second of 44.1 kHz audio needs only about 220 pixels. (A minimal downsampling sketch follows this list.)

drawing (rendering)
Either Quartz or OpenGL ES can be used for drawing. Quartz is simpler, while OpenGL ES is faster. My personal preference is to handle the audio with an AudioUnit and do the drawing with OpenGL ES.
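To make the reduction step concrete, here is a minimal downsampling sketch in C. It assumes the raw samples are already available as a float array; the function name reduceSamples and the peak-per-bin strategy are illustrative choices, not code from any particular library.

#import <Foundation/Foundation.h>
#include <math.h>

// Reduce `sampleCount` raw samples to one value per bin of `binSize` samples,
// keeping the peak absolute amplitude. At a 200:1 ratio, 44,100 samples shrink
// to roughly 220 points, one per pixel column.
static void reduceSamples(const float *samples, UInt32 sampleCount,
                          float *reduced, UInt32 binSize)
{
    UInt32 binCount = sampleCount / binSize;
    for (UInt32 bin = 0; bin < binCount; bin++) {
        float peak = 0.0f;
        for (UInt32 i = 0; i < binSize; i++) {
            float v = fabsf(samples[bin * binSize + i]);
            if (v > peak) peak = v;
        }
        reduced[bin] = peak;
    }
}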
II. Analyzing an implementation: the EZAudioPassThroughExample written by Syed Haris Ali
1. Create an AudioUnit for the microphone and register the corresponding inputCallback()
static const AudioUnitScope kEZAudioMicrophoneInputBus = 1;
static const UInt32 kEZAudioMicrophoneEnableFlag = 1;
AudioUnit microphoneInput;
TPCircularBuffer _circularBuffer;

// The input scope is disabled by default, so enable kAudioUnitScope_Input
AudioUnitSetProperty(microphoneInput,
                     kAudioOutputUnitProperty_EnableIO,
                     kAudioUnitScope_Input,
                     kEZAudioMicrophoneInputBus,
                     &kEZAudioMicrophoneEnableFlag,
                     sizeof(kEZAudioMicrophoneEnableFlag));

// Register the callback that handles the microphone input
AURenderCallbackStruct microphoneCallbackStruct;
microphoneCallbackStruct.inputProc       = inputCallback;
microphoneCallbackStruct.inputProcRefCon = (__bridge void *)self;
AudioUnitSetProperty(microphoneInput,
                     kAudioOutputUnitProperty_SetInputCallback,
                     kAudioUnitScope_Global,
                     kEZAudioMicrophoneInputBus,
                     &microphoneCallbackStruct,
                     sizeof(microphoneCallbackStruct));

// Initialize the unit only after I/O has been enabled and the callback registered
AudioUnitInitialize(microphoneInput);

// Set up the circular buffer used to store the audio data
TPCircularBufferInit(&_circularBuffer, 1024);
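The snippet above assumes that microphoneInput already refers to a valid RemoteIO audio unit. As a sketch (not taken from the EZAudio source), such an instance is typically obtained and started like this:

#import <AudioToolbox/AudioToolbox.h>

// Obtain the RemoteIO audio unit that the configuration code above operates on
AudioComponentDescription desc;
desc.componentType         = kAudioUnitType_Output;
desc.componentSubType      = kAudioUnitSubType_RemoteIO;   // the I/O unit on iOS
desc.componentManufacturer = kAudioUnitManufacturer_Apple;
desc.componentFlags        = 0;
desc.componentFlagsMask    = 0;

AudioComponent inputComponent = AudioComponentFindNext(NULL, &desc);
AudioComponentInstanceNew(inputComponent, &microphoneInput);

// ... enable input, register the callback, and call AudioUnitInitialize() as above ...

// Start pulling samples from the microphone
AudioOutputUnitStart(microphoneInput);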
2. When the microphone has captured enough samples, inputCallback() is invoked. This function performs the acquisition step described above, as follows:
static OSStatus inputCallback(void *inRefCon,
AudioUnitRenderActionFlags *ioActionFlags,
const AudioTimeStamp *inTimeStamp,
UInt32 inBusNumber,
UInt32 inNumberFrames,
AudioBufferList *ioData ) {
MyMicrophone *microphone = (__bridge MyMicrophone*)inRefCon;
OSStatus result = noErr;
// Render audio into microphoneInputBuffer
result = AudioUnitRender(microphone->microphoneInput,
ioActionFlags,
inTimeStamp,
inBusNumber,
inNumberFrames,
microphone->microphoneInputBuffer);
// ----- Notify delegate (OF-style) -----
// Audio Received (float array)
if( microphone.microphoneDelegate ){
// THIS IS NOT OCCURRING ON THE MAIN THREAD
if( [microphone.microphoneDelegate respondsToSelector:@selector(microphone:hasAudioReceived:withBufferSize:withNumberOfChannels:)] ){
// This data is used for real-time plotting
AEFloatConverterToFloat(microphone->converter,
microphone->microphoneInputBuffer,
microphone->floatBuffers,
inNumberFrames);
[microphone.microphoneDelegate microphone:microphone
hasAudioReceived:microphone->floatBuffers
withBufferSize:inNumberFrames
withNumberOfChannels:microphone->streamFormat.mChannelsPerFrame];
}
}
// Audio Received (buffer list)
if( microphone.microphoneDelegate ){
// This data is first appended to a circular buffer, and can later be written to a file or played back through the speaker
if( [microphone.microphoneDelegate respondsToSelector:@selector(microphone:hasBufferList:withBufferSize:withNumberOfChannels:)] ){
[microphone.microphoneDelegate microphone:microphone
hasBufferList:microphone->microphoneInputBuffer
withBufferSize:inNumberFrames
withNumberOfChannels:microphone->streamFormat.mChannelsPerFrame];
}
}
  return result;
}
3. Audio recording: this function performs the storage step described above (PassThroughViewController.m)
// Append the AudioBufferList from the microphone callback to a global circular buffer
-(void)microphone:(EZMicrophone *)microphone
    hasBufferList:(AudioBufferList *)bufferList
   withBufferSize:(UInt32)bufferSize
withNumberOfChannels:(UInt32)numberOfChannels {
    /** Append the audio data to a circular buffer */
    TPCircularBufferProduceBytes(&circularBuffer,
                                 bufferList->mBuffers[0].mData,
                                 bufferList->mBuffers[0].mDataByteSize);
}
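Once the samples are in the circular buffer, another part of the program can drain them at its own pace, for example to write them to a file. The consumer below is only a sketch under that assumption: drainBufferToFile is a hypothetical helper, and the ExtAudioFileRef is presumed to have been opened elsewhere (e.g. with ExtAudioFileCreateWithURL).

#import <AudioToolbox/AudioToolbox.h>
#import "TPCircularBuffer.h"

// Sketch: drain the circular buffer and append the bytes to an already-open audio file
static void drainBufferToFile(TPCircularBuffer *buffer,
                              ExtAudioFileRef file,
                              const AudioStreamBasicDescription *format)
{
    int32_t availableBytes = 0;
    void *data = TPCircularBufferTail(buffer, &availableBytes);
    if (data == NULL || availableBytes == 0) return;

    AudioBufferList bufferList;
    bufferList.mNumberBuffers              = 1;
    bufferList.mBuffers[0].mNumberChannels = format->mChannelsPerFrame;
    bufferList.mBuffers[0].mData           = data;
    bufferList.mBuffers[0].mDataByteSize   = (UInt32)availableBytes;

    UInt32 frames = (UInt32)availableBytes / format->mBytesPerFrame;
    ExtAudioFileWrite(file, frames, &bufferList);

    // Mark the bytes as consumed so the producer can reuse the space
    TPCircularBufferConsume(buffer, availableBytes);
}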
4. Screen update: this function performs the drawing step described above (PassThroughViewController.m)
-(void)microphone:(EZMicrophone *)microphone
 hasAudioReceived:(float **)buffer
   withBufferSize:(UInt32)bufferSize
withNumberOfChannels:(UInt32)numberOfChannels {
    dispatch_async(dispatch_get_main_queue(), ^{
        [self.audioPlot updateBuffer:buffer[0] withBufferSize:bufferSize];
    });
}
Note: the plotting here (audioPlot) works by calling the OpenGL ES API glDrawArrays(drawingType, 0, PlotSize); setting drawingType to GL_LINE_STRIP or GL_TRIANGLE_STRIP produces different waveform styles.
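As a rough illustration of that glDrawArrays call, the sketch below fills a vertex array from the reduced samples and submits it to OpenGL ES 2. The vertex layout (WaveformVertex) and helper names are assumptions for illustration, not the actual EZAudioPlotGL implementation.

#import <OpenGLES/ES2/gl.h>
#import <Foundation/Foundation.h>

typedef struct {
    GLfloat x;   // time axis, mapped to clip space [-1, 1]
    GLfloat y;   // amplitude
} WaveformVertex;

// Fill one vertex per reduced sample, spread evenly across the horizontal axis
static void fillWaveformVertices(const float *samples, UInt32 count,
                                 WaveformVertex *vertices)
{
    for (UInt32 i = 0; i < count; i++) {
        vertices[i].x = ((GLfloat)i / (GLfloat)(count - 1)) * 2.0f - 1.0f;
        vertices[i].y = samples[i];
    }
}

// Upload the vertices and draw; assumes a shader program with the position
// attribute at location 0 is already bound. GL_LINE_STRIP draws the waveform
// as a line; a filled GL_TRIANGLE_STRIP plot would need mirrored top/bottom
// vertex pairs (not shown here).
static void drawWaveform(GLuint vbo, const WaveformVertex *vertices,
                         UInt32 count, GLenum drawingType)
{
    glBindBuffer(GL_ARRAY_BUFFER, vbo);
    glBufferData(GL_ARRAY_BUFFER, count * sizeof(WaveformVertex),
                 vertices, GL_DYNAMIC_DRAW);
    glVertexAttribPointer(0, 2, GL_FLOAT, GL_FALSE,
                          sizeof(WaveformVertex), (void *)0);
    glEnableVertexAttribArray(0);
    glDrawArrays(drawingType, 0, (GLsizei)count);
}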
III. Sample code available online
- https://github.com/syedhali/EZAudio
- https://developer.apple.com/library/ios/samplecode/aurioTouch/Introduction/Intro.html
- http://stackoverflow.com/questions/5032775/drawing-waveform-with-avassetreader
- https://code.google.com/p/core-plot/
References: