Audio Sessions Management
Manage audio ducking and audio session sharing between the StreamLayer iOS SDK and your host app. Implement SLROverlayDelegate methods for smooth audio transitions during Watch Party, messaging, and media playback.
Handling Audio Ducking
For the best user experience, certain units within the StreamLayer Element may need the main video's audio level to be lowered. For instance, a video embedded in a social feed may play with sound; if the main audio level is not reduced accordingly, the two audio streams overlap, resulting in a poor experience. When the host app implements the SDK's delegate methods for controlling the application's audio session, the SDK smoothly lowers the volume when needed.
Two SLROverlayDelegate methods handle audio ducking:
- SLROverlayDelegate.requestAudioDucking()
- SLROverlayDelegate.disableAudioDucking()
Audio Ducking Example
class SLRVideoPlayer: SLROverlayDelegate {
    ...
    private var player: AVPlayer!
    // Tracks requests for ducking
    private let volumeReduceRate: Float = 0.1
    private var playerVolumeOriginal: [Float] = []
    // Tracks requests for audio sessions
    private var kTotalSessions: Int { return kGenericSessions + kVoiceSessions }
    private var kGenericSessions: Int = 0
    private var kVoiceSessions: Int = 0
    ...
    public func requestAudioDucking() {
        lock.withLockVoid {
            if let player = player {
                // Ducking is already active if the current volume matches a reduced previous one
                let isDuckingActive = playerVolumeOriginal.last.map({ Self.streamVolume == volumeReduceRate * $0 }) ?? false
                playerVolumeOriginal.append(Self.streamVolume)
                if !isDuckingActive {
                    Self.streamVolume = volumeReduceRate * Self.streamVolume
                }
                onPlayerVolumeChange?()
                if player.timeControlStatus == .playing {
                    player.volume = Self.streamVolume
                }
            }
        }
    }

    public func disableAudioDucking() {
        lock.withLockVoid {
            if !playerVolumeOriginal.isEmpty, let player = player {
                // Restore the most recently saved volume
                Self.streamVolume = playerVolumeOriginal.popLast() ?? 1
                onPlayerVolumeChange?()
                if player.timeControlStatus == .playing {
                    player.volume = Self.streamVolume
                }
            }
        }
    }
    ...
}
Audio Session Management
The SDK plays audio, such as notification sounds, for Watch Party, messaging, and other features. iOS allows only one active audio session per application, so the host app and the SDK share the same audio session.
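A host app that plays its own media typically configures the shared audio session once at launch. A minimal sketch is shown below; the helper name, the .playback category, and the .mixWithOthers option are assumptions about a typical video-playback host, not SDK requirements:

```swift
import AVFoundation

/// Hypothetical helper, called once at launch
/// (e.g. from application(_:didFinishLaunchingWithOptions:)).
func configureSharedAudioSession() {
    let session = AVAudioSession.sharedInstance()
    do {
        // .playback keeps audio running with the silent switch on;
        // .mixWithOthers lets SDK audio (e.g. feed videos) blend with the main stream.
        try session.setCategory(.playback, mode: .moviePlayback, options: [.mixWithOthers])
        try session.setActive(true)
    } catch {
        print("[AudioSession] Failed to configure: \(error)")
    }
}
```

Adjust the category and options to match your app's playback behavior before adopting this.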
To minimize interference between the host app's and SDK's audio modes, implement the following delegate methods:
- SLROverlayDelegate.prepareAudioSession(...)
- SLROverlayDelegate.disableAudioSession(...)
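The example below also calls two host-side helpers, enableGenericAudioSession(reactivate:) and a no-argument disableAudioSession(), whose implementations are not shown. One possible sketch follows; the names mirror the example, but their bodies here are assumptions about how a host app might wrap the SDK call and AVAudioSession:

```swift
import AVFoundation

extension SLRVideoPlayer {
    /// Hypothetical wrapper: switch the shared session back to generic
    /// (non-voice) playback after the last voice session ends.
    func enableGenericAudioSession(reactivate: Bool) {
        do {
            try StreamLayer.prepareSessionForGeneralAudio(reactivate: reactivate)
        } catch {
            print("[AudioSession] Failed to enable generic session: \(error)")
        }
    }

    /// Hypothetical helper: deactivate the shared session when no SDK
    /// feature needs audio, letting other apps resume their playback.
    func disableAudioSession() {
        do {
            try AVAudioSession.sharedInstance().setActive(false, options: .notifyOthersOnDeactivation)
        } catch {
            print("[AudioSession] Failed to deactivate: \(error)")
        }
    }
}
```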
Audio Session Example
// Tracks requests for audio sessions
fileprivate var kTotalSessions: Int { return kGenericSessions + kVoiceSessions }
fileprivate var kGenericSessions: Int = 0
fileprivate var kVoiceSessions: Int = 0
...
public func disableAudioSession(for type: SLRAudioSessionType) {
    lock.lock()
    defer {
        print("[AudioSession] disable kVoiceSessions: \(kVoiceSessions), kGenericSessions: \(kGenericSessions)")
        lock.unlock()
    }
    switch type {
    case .voice: kVoiceSessions -= 1
    case .generic: kGenericSessions -= 1
    }
    // No sessions left at all: deactivate the shared session
    if kTotalSessions == 0 {
        disableAudioSession()
        return
    }
    // Voice sessions are still active: keep the voice configuration
    if kVoiceSessions > 0 {
        return
    }
    // The last voice session ended but generic sessions remain:
    // fall back to the generic configuration without reactivating
    if type == .voice {
        enableGenericAudioSession(reactivate: false)
    }
}

public func prepareAudioSession(for type: SLRAudioSessionType) {
    lock.lock()
    defer {
        print("[AudioSession] prepare kVoiceSessions: \(kVoiceSessions), kGenericSessions: \(kGenericSessions)")
        lock.unlock()
    }
    switch type {
    case .voice:
        kVoiceSessions += 1
        if kVoiceSessions == 1 {
            // No specific logic here for now (WebRTC handles this)
        }
    case .generic:
        kGenericSessions += 1
        if kGenericSessions > 0, kVoiceSessions == 0 {
            // Reactivate only for the first generic session
            let reactivate = kGenericSessions == 1
            do {
                try StreamLayer.prepareSessionForGeneralAudio(reactivate: reactivate)
            } catch let error {
                print("[RPC] Error: \(error)")
                return
            }
            return
        }
    }
}
...
}
Related
- Watch Party — Overview of the Watch Party feature
- Integration Guide — Complete SDK setup instructions
