iOS/macOS Swift SDK for LiveKit
Use this SDK to add real-time video, audio and data features to your Swift app. By connecting to a self- or cloud-hosted LiveKit server, you can quickly build applications like interactive live streaming or video calls with just a few lines of code.
Docs & Example app
Docs and guides are at https://docs.livekit.io.
The full source code of an iOS/macOS SwiftUI example app is available.
For minimal examples view this repo 👉 Swift SDK Examples
Installation
LiveKit for Swift is available as a Swift Package.
Package.swift
Add the package dependency, and also add the LiveKit product to your target's dependencies.
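For example, a minimal Package.swift sketch (the version number and target name below are illustrative — check the repository for the latest release):

```swift
// swift-tools-version:5.7
import PackageDescription

let package = Package(
    name: "MyApp",
    dependencies: [
        // Version is illustrative; pin to the latest client-sdk-swift release
        .package(url: "https://github.com/livekit/client-sdk-swift.git", from: "2.0.0"),
    ],
    targets: [
        .target(
            name: "MyApp",
            dependencies: [.product(name: "LiveKit", package: "client-sdk-swift")]
        ),
    ]
)
```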
Xcode
Go to Project Settings -> Swift Packages.
Add a new package and enter:
https://github.com/livekit/client-sdk-swift
iOS Usage
LiveKit provides a UIKit-based VideoView class that renders video tracks. Subscribed audio tracks are automatically played.
Screen Sharing
See iOS Screen Sharing instructions.
Integration Notes
Thread safety
Since VideoView is a UI component, all operations (read/write properties, etc.) must be performed from the main thread. Other core classes can be accessed from any thread.
Delegates will be called on the SDK's internal thread. Make sure any access to the UI happens on the main thread, for example by using DispatchQueue.main.async.
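For example, a sketch of hopping to the main thread inside a delegate callback (the delegate method signature shown is illustrative and varies across SDK versions — check your version's RoomDelegate):

```swift
// RoomDelegate callbacks arrive on the SDK's internal thread,
// so dispatch to the main thread before touching VideoView.
func room(_ room: Room, participant: RemoteParticipant, didSubscribe publication: RemoteTrackPublication, track: Track) {
    DispatchQueue.main.async { [weak self] in
        guard let self, let videoTrack = track as? VideoTrack else { return }
        self.videoView.track = videoTrack
    }
}
```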
Memory management
It is recommended to use weak var when storing references to objects created and managed by the SDK, such as Participant, TrackPublication, etc. These objects become invalid when the Room disconnects and will be released by the SDK; holding a strong reference to them will prevent the Room and other internal objects from being released.
The VideoView.track property does not hold a strong reference, so it is not required to set it to nil.
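A sketch of this pattern (ParticipantCell and its properties are illustrative names, not SDK requirements):

```swift
import UIKit

class ParticipantCell: UICollectionViewCell {
    static let reuseIdentifier = "ParticipantCell"
    // weak: the SDK owns this object and releases it when the Room disconnects
    weak var participant: RemoteParticipant?
    // VideoView.track is a non-owning reference, so no manual nil-ing out is needed
    let videoView = VideoView()
}
```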
AudioSession management
LiveKit will automatically manage the underlying AVAudioSession while connected. The session is set to the playback category by default; when a local stream is published, it is switched to playAndRecord. In general, the SDK picks sane defaults and does the right thing.
However, if you'd like to customize this behavior, you can override AudioManager.customConfigureAudioSessionFunc to manage the underlying session on your own. See the example here for the default behavior.
iOS Simulator limitations
Currently, VideoView will use OpenGL for the iOS Simulator.
Publishing the camera track is not supported by the iOS Simulator.
ScrollView performance
To save CPU resources, it is recommended to turn off rendering of VideoViews that have scrolled off screen and are no longer visible, by setting the isEnabled property to false and back to true when they re-appear.
UICollectionViewDelegate's willDisplay / didEndDisplaying has been reported to be unreliable for this purpose. Specifically, in some iOS versions didEndDisplaying could get invoked even when the cell is visible.
The following is an alternative method to using willDisplay / didEndDisplaying:
// 1. Define a weak-reference set for all cells
private var allCells = NSHashTable<ParticipantCell>.weakObjects()

// In UICollectionViewDataSource...
public func collectionView(_ collectionView: UICollectionView, cellForItemAt indexPath: IndexPath) -> UICollectionViewCell {
    let cell = collectionView.dequeueReusableCell(withReuseIdentifier: ParticipantCell.reuseIdentifier, for: indexPath)
    if let cell = cell as? ParticipantCell {
        // 2. Keep a weak reference to the cell
        allCells.add(cell)
        // Configure the cell etc...
    }
    return cell
}

// 3. Define a func to re-compute and update the isEnabled property for cells whose visibility changed
func reComputeVideoViewEnabled() {
    let visibleCells = collectionView.visibleCells.compactMap { $0 as? ParticipantCell }
    let offScreenCells = allCells.allObjects.filter { !visibleCells.contains($0) }
    for cell in visibleCells.filter({ !$0.videoView.isEnabled }) {
        print("enabling cell#\(cell.hashValue)")
        cell.videoView.isEnabled = true
    }
    for cell in offScreenCells.filter({ $0.videoView.isEnabled }) {
        print("disabling cell#\(cell.hashValue)")
        cell.videoView.isEnabled = false
    }
}

// 4. Set a timer to invoke the func periodically
self.timer = Timer.scheduledTimer(withTimeInterval: 0.1, repeats: true, block: { [weak self] _ in
    self?.reComputeVideoViewEnabled()
})

// Alternatively, you can call `reComputeVideoViewEnabled` whenever cell visibility changes (such as in scrollViewDidScroll(_:)),
// but it is harder to track all cases, such as cell reloads.
For the full example, see 👉 UIKit Minimal Example
Frequently asked questions
Mic privacy indicator (orange dot) remains on even after muting audio track
You will need to un-publish the LocalAudioTrack for the indicator to turn off. More discussion here: https://github.com/livekit/client-sdk-swift/issues/140
How to publish the camera at 60 FPS?
Create a LocalVideoTrack by calling LocalVideoTrack.createCameraTrack(options: CameraCaptureOptions(fps: 60)).
Publish it with LocalParticipant.publishVideoTrack(track: track, publishOptions: VideoPublishOptions(encoding: VideoEncoding(maxFps: 60))).
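Combined into a single sketch (the async/throws shape and the localParticipant access are assumptions that vary by SDK version):

```swift
// Create a 60 FPS camera track and publish it with a matching max-FPS encoding.
// Illustrative sketch; adjust to your SDK version's exact API.
func publishCamera60(room: Room) async throws {
    let track = LocalVideoTrack.createCameraTrack(options: CameraCaptureOptions(fps: 60))
    try await room.localParticipant.publishVideoTrack(
        track: track,
        publishOptions: VideoPublishOptions(encoding: VideoEncoding(maxFps: 60))
    )
}
```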
Known issues
Avoid crashes on macOS Catalina
If your app targets macOS Catalina, make sure to do the following to avoid a crash (ReplayKit not found):
Getting help / Contributing
Please join us on Slack to get help from our devs and community members. We welcome your contributions (PRs); details can be discussed there.