Synchronize AudioUnit callback with NSOutputStream

90 views · Asked by Roshan Krishnan

I'm recording audio using an Audio Unit and writing that data into an NSOutputStream that is one half of a bound stream pair I'm using to POST the data over HTTP. My problem is that the Audio Unit recording callback and the NSOutputStream's hasSpaceAvailable callback are completely independent of one another, so I get buffer underruns very quickly. Is there any way to synchronize the two, or to map the audio recording callback directly onto the NSOutputStream?
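For reference, the setup described above is roughly the following. This is a minimal Objective-C sketch under assumptions not stated in the question: the upload URL, the transfer buffer size, and the SetUpBoundPairPOST function name are placeholders. The read end of the bound pair becomes the request's HTTP body stream; the write end is the NSOutputStream the recording path feeds.

```objc
#import <Foundation/Foundation.h>

// Placeholder values: the transfer buffer size and upload URL are assumptions.
static const CFIndex kPairBufferSize = 64 * 1024;

static NSInputStream  *bodyStream;    // read end, handed to the URL request
static NSOutputStream *uploadStream;  // write end, fed from the audio recording path

static void SetUpBoundPairPOST(id<NSStreamDelegate> delegate) {
    CFReadStreamRef  readStream  = NULL;
    CFWriteStreamRef writeStream = NULL;
    CFStreamCreateBoundPair(kCFAllocatorDefault, &readStream, &writeStream, kPairBufferSize);

    bodyStream   = (__bridge_transfer NSInputStream *)readStream;
    uploadStream = (__bridge_transfer NSOutputStream *)writeStream;

    uploadStream.delegate = delegate;  // receives NSStreamEventHasSpaceAvailable
    [uploadStream scheduleInRunLoop:[NSRunLoop mainRunLoop] forMode:NSDefaultRunLoopMode];
    [uploadStream open];

    NSMutableURLRequest *request =
        [NSMutableURLRequest requestWithURL:[NSURL URLWithString:@"https://example.com/upload"]];
    request.HTTPMethod = @"POST";
    request.HTTPBodyStream = bodyStream;  // the session reads whatever is written into uploadStream

    [[[NSURLSession sharedSession] dataTaskWithRequest:request] resume];
}
```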
Related Questions in IOS
- URLSession requesting JSON array from server not working
- Incorrect display of LinearGradientBrush in IOS
- Module not found when building flutter app for IOS
- How to share metadata of an audio url file to a WhatsApp conversation with friends
- Occasional crash at NSURLSessionDataTask dataTaskWithRequest:completionHandler:
- Expo Deep linking on iOS is not working (because of Google sign-in?)
- On iOS, the keyboard does not offer a 6-character SMS code
- Hi, there is an error happened when I build my flutter app, after I'm installing firebase packages occurs that error
- The copy/paste functionalities don't work only on iOS in the Flutter app
- Hide LiveActivityIntent Button from Shortcuts App
- While Running Github Actions Pipeline: No Signing Certificate "iOS Development" found: No "iOS Development" signing certificate matching team ID
- Actionable notification api call not working in background
- Accessibility : Full keyboard access with scroll view in swiftui
- There is a problem with the request entity - You are not allowed to create 'iOS' profile with App ID 'XXXX'
- I am getting "binding has not yet been initialized" error when trying to connect firebase with flutter
Related Questions in AUDIOUNIT
- Cannot connect AVAudioUnitSampler to AVAudioEngine while the engine is running
- AudioUnitRender produces empty audio buffers
- I need EXS24 File format description for my utility
- Can't set parameters in apple provided Audio Units from my application
- Running an AUv3 extension in standalone mode does not initialize the AudioUnit class
- Where is the audio unit extension app template?
- Create AUv3 audio unit supporting multiple channels
- Record and add audio effects at the same time in iOS with audio unit
- How to merge two Audio Units into AudioBufferList for AURenderCallback
- In iOS, are we able to intercept/transform microphone audio before it is passed to other apps?
- How to check if any other App is using VoiceProcessingIO AU (macOS)
- AudioUnitRender error kAudioUnitErr_CannotDoInCurrentContext on iPhone 14 only
- file unit callback shows stereo channels in buffer, but the file loaded was 6 channel
- iOS 16 RemoteIO: Input data proc returned inconsistent 2 packets
- AUGraph Record and Play
Related Questions in NSRUNLOOP
- How to exit from `RunLoop`
- How to adapt RunLoop to swift concurrency(async/await)
- NSStream (BLE L2CAP) on background thread
- Swift RunLoop: get notified on currentMode change
- Something calls a method when the app is in the background
- Failed to block main thread with runloop on iOS15 with device iPhone12?
- iOS 15 Beta 5(19A5318f) runloop run crash
- In Apple's Foundation/Swift/Objective-C, how does runLoop.run block, but still allow DispatchWorkItems to process?
- Why run loop is needed when using DispatchQueue.main.async in mac command line tool in swift?
- iOS RunLoop and DispatchQueue.main.async
- In what cases can there be such a situation when unit tests will run earlier than View is fully loaded?
- Polling GCD main queue, to avoid deadlock
- Objective C++, how to use runloop in background thread?
- dispatch_after block is not running
- Correct usage of secondary NSThread with NSRunLoop
Related Questions in NSOUTPUTSTREAM
- Swift: Read/Write & connection problems with I/O streams and URLSessionStreamTask
- How to get image from NSOutputStream in Objective-C
- Memory issue while converting big video file path to NSData. How to use InputStream/FileHandle to fix this issue?
- CHCSVWriter memory usage for writing bigger CSV files
- Deadlock with NSOutputStream and URLSessionUploadTask (__psynch_mutexwait)
- ios - NSOutputStream doesn't write data
- (NS)StreamDelegate - no error when writing to closed
- InputStream never calls hasBytesAvailable
- Not able to create a second stream to the same peer in ios using MPC
- Send and receive key/value pair using gcdasyncsocket over wifi
- Synchronize AudioUnit callback with NSOutputStream
- iOS : NSInputStream / NSOutputStream - CFNetwork SSLHandshake failed (-9806)
- NSOutputStream to know when data has been read in the other side
- Writes to NSOutputStream after performing work on background thread don't work
- What does the hasSpaceAvailable property on NSOutputStream mean?
Related Questions in AUDIOSESSION
- Creating a system-wide Panning effect on android
- Flutter Duck music when playing sound in audio_session
- How to configure audio_session to duck others with Flutter
- iOS can't get audio focus when app in background
- Audio stop when phone sleep
- Swift and CarPlay audio streaming
- Flutter: How to configure audio_session to duck others
- Disable input/output AGC from RemoteIO and VPIO on iOS
- How to programmatically change the volume on call using the Sinch SDK?
- Playing audio while record will get a low volume
- Get Audio Session Id from Google Meet
- How to stop (or pause) the device's audio in Swift?
- How can I determine which Windows process is the "audio session" governing the current process's output?
- setPreferredInput WithBlueTooth not working
- BlueTooth audio capturing using airpod microphone
Popular Questions
- How do I undo the most recent local commits in Git?
- How can I remove a specific item from an array in JavaScript?
- How do I delete a Git branch locally and remotely?
- Find all files containing a specific text (string) on Linux?
- How do I revert a Git repository to a previous commit?
- How do I create an HTML button that acts like a link?
- How do I check out a remote Git branch?
- How do I force "git pull" to overwrite local files?
- How do I list all files of a directory?
- How to check whether a string contains a substring in JavaScript?
- How do I redirect to another webpage?
- How can I iterate over rows in a Pandas DataFrame?
- How do I convert a String to an int in Java?
- Does Python have a string 'contains' substring method?
- How do I check if a string contains a specific word?
Popular Tags
Trending Questions
- UIImageView Frame Doesn't Reflect Constraints
- Is it possible to use adb commands to click on a view by finding its ID?
- How to create a new web character symbol recognizable by html/javascript?
- Why isn't my CSS3 animation smooth in Google Chrome (but very smooth on other browsers)?
- Heap Gives Page Fault
- Connect ffmpeg to Visual Studio 2008
- Both Object- and ValueAnimator jumps when Duration is set above API LvL 24
- How to avoid default initialization of objects in std::vector?
- second argument of the command line arguments in a format other than char** argv or char* argv[]
- How to improve efficiency of algorithm which generates next lexicographic permutation?
- Navigating to the another actvity app getting crash in android
- How to read the particular message format in android and store in sqlite database?
- Resetting inventory status after order is cancelled
- Efficiently compute powers of X in SSE/AVX
- Insert into an external database using ajax and php : POST 500 (Internal Server Error)
1 Answer

According to Apple DTS, you aren't supposed to do any networking, or any other synchronization, inside the Audio Unit recording callback, because it runs on a real-time thread.

But you don't have to write data at the exact moment the network stream's hasSpaceAvailable callback fires. You can note that space is available, return from the callback, and send the data later, once it actually becomes available. You can also buffer up a bit of extra audio in a circular queue (FIFO) so that some data is usually ready to send, which covers network rate variations and latency jitter.
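Here is a minimal Objective-C sketch of that arrangement, under assumptions not in the answer: 16-bit mono LPCM rendered from the remote I/O unit's input bus, a hypothetical UploadStreamDelegate class, and placeholder buffer sizes. The render callback only copies samples into a lock-free single-producer/single-consumer ring buffer; the stream delegate drains the ring into the NSOutputStream whenever NSStreamEventHasSpaceAvailable fires, so the two callbacks never have to line up.

```objc
#import <Foundation/Foundation.h>
#import <AudioToolbox/AudioToolbox.h>
#import <stdatomic.h>

// Single-producer / single-consumer ring buffer: the render callback is the only
// writer and the stream-delegate thread is the only reader, so acquire/release
// atomics are enough and neither side ever blocks.
#define kRingCapacity (256 * 1024)  // power of two; a few seconds of 16-bit mono 44.1 kHz audio

typedef struct {
    uint8_t          bytes[kRingCapacity];
    _Atomic uint32_t head;  // total bytes written (audio thread)
    _Atomic uint32_t tail;  // total bytes read (network thread)
} AudioRing;

static AudioRing gRing;

static void RingWrite(AudioRing *ring, const void *src, uint32_t length) {
    uint32_t head = atomic_load_explicit(&ring->head, memory_order_relaxed);
    uint32_t tail = atomic_load_explicit(&ring->tail, memory_order_acquire);
    if (length > kRingCapacity - (head - tail)) return;  // full: drop rather than block the audio thread
    const uint8_t *in = src;
    for (uint32_t i = 0; i < length; i++)
        ring->bytes[(head + i) % kRingCapacity] = in[i];
    atomic_store_explicit(&ring->head, head + length, memory_order_release);
}

static uint32_t RingRead(AudioRing *ring, uint8_t *dst, uint32_t maxLength) {
    uint32_t tail = atomic_load_explicit(&ring->tail, memory_order_relaxed);
    uint32_t head = atomic_load_explicit(&ring->head, memory_order_acquire);
    uint32_t n = MIN(head - tail, maxLength);
    for (uint32_t i = 0; i < n; i++)
        dst[i] = ring->bytes[(tail + i) % kRingCapacity];
    atomic_store_explicit(&ring->tail, tail + n, memory_order_release);
    return n;
}

// Real-time input callback: render the microphone data into a local buffer,
// copy it into the ring, and return. No Objective-C messaging, no locks,
// no networking on this thread.
static OSStatus RecordingCallback(void                       *inRefCon,
                                  AudioUnitRenderActionFlags *ioActionFlags,
                                  const AudioTimeStamp       *inTimeStamp,
                                  UInt32                      inBusNumber,
                                  UInt32                      inNumberFrames,
                                  AudioBufferList            *ioData) {
    AudioUnit unit = (AudioUnit)inRefCon;   // assumes the remote I/O unit was passed as inRefCon
    static int16_t samples[4096];           // scratch frames, 16-bit mono
    if (inNumberFrames > 4096) return kAudioUnitErr_TooManyFramesToProcess;

    AudioBufferList list;
    list.mNumberBuffers = 1;
    list.mBuffers[0].mNumberChannels = 1;
    list.mBuffers[0].mDataByteSize   = inNumberFrames * sizeof(int16_t);
    list.mBuffers[0].mData           = samples;

    OSStatus status = AudioUnitRender(unit, ioActionFlags, inTimeStamp,
                                      inBusNumber, inNumberFrames, &list);
    if (status == noErr)
        RingWrite(&gRing, list.mBuffers[0].mData, list.mBuffers[0].mDataByteSize);
    return status;
}

// Network side: drain the ring into the bound pair's NSOutputStream whenever
// the stream reports that it has space.
@interface UploadStreamDelegate : NSObject <NSStreamDelegate>
@end

@implementation UploadStreamDelegate
- (void)stream:(NSStream *)aStream handleEvent:(NSStreamEvent)eventCode {
    if (eventCode != NSStreamEventHasSpaceAvailable) return;

    uint8_t chunk[8192];
    uint32_t n = RingRead(&gRing, chunk, sizeof(chunk));
    if (n == 0) return;  // nothing buffered yet; the next space-available event will retry

    NSInteger written = [(NSOutputStream *)aStream write:chunk maxLength:n];
    // A production version should hold on to any unwritten tail (written < n)
    // and send it before reading more from the ring; this sketch drops it.
    (void)written;
}
@end
```

Installing RecordingCallback on the remote I/O unit (kAudioOutputUnitProperty_SetInputCallback) and scheduling the output stream are omitted here. Sizing the ring to a second or so of audio gives the network side room to absorb rate variations and latency jitter, which is the extra buffered audio the answer refers to; if the ring is empty when space becomes available, the delegate simply returns and the next space-available event (or a short timer) tries again.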