DSWaveformImage - iOS & macOS realtime audio waveform rendering

DSWaveformImage offers a native interface for drawing the envelope waveform of audio data on iOS, iPadOS and macOS, or via Catalyst. To do so, you can use

- WaveformImageView (UIKit) / WaveformView (SwiftUI) to render a static waveform from an audio file,
- WaveformLiveView (UIKit) / WaveformLiveCanvas (SwiftUI) to render a waveform of live audio data in realtime (e.g. from AVAudioRecorder), or
- WaveformImageDrawer to generate a waveform UIImage from an audio file.

Additionally, you can get a waveform's (normalized) [Float] samples directly by creating an instance of WaveformAnalyzer.
Example UI (included in repository)
For a practical real-world example usage of a SwiftUI live audio recording waveform rendering, see RecordingIndicatorView.
More related iOS Controls
You may also find the following iOS controls written in Swift interesting:
If you really like this library (aka Sponsoring)
I'm doing all this for fun and joy, and because I strongly believe in the power of open source. On the off-chance, though, that using my library has brought joy to you and you just feel like saying "thank you", I would smile like a 4-year-old getting a huge ice cream cone if you'd support me via one of the sponsoring buttons ☺️💕
If you feel like sending someone else a lovely gesture of appreciation, maybe check out my iOS app 💌 SoundCard to send them a real postcard with a personal audio message.

Installation
- use SPM: add https://github.com/dmrschmidt/DSWaveformImage and set "Up to Next Major" with "13.0.0"
import DSWaveformImage // for core classes to generate `UIImage` / `NSImage` directly
import DSWaveformImageViews // if you want to use the native UIKit / SwiftUI views
Deprecated or discouraged, but still possible, alternative ways for older apps:
- since it has no other dependencies, you may simply copy the Sources folder directly into your project
- use carthage: github "dmrschmidt/DSWaveformImage" ~> 7.0 (last supported version is 10)
- use cocoapods (sunset since 6.1.1): pod 'DSWaveformImage', '~> 6.1'
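If you manage dependencies in a Package.swift manifest rather than through Xcode, the SPM entry might look like the following sketch. The package name MyApp and the platform minimums are hypothetical, and the product names are assumed to match the two importable module names shown below:

```swift
// swift-tools-version:5.7
import PackageDescription

let package = Package(
    name: "MyApp", // hypothetical package name
    platforms: [.iOS(.v15), .macOS(.v12)], // illustrative minimums
    dependencies: [
        // "Up to Next Major" from 13.0.0, as described above
        .package(url: "https://github.com/dmrschmidt/DSWaveformImage", from: "13.0.0"),
    ],
    targets: [
        .target(
            name: "MyApp",
            dependencies: [
                // product names assumed to match the importable modules
                .product(name: "DSWaveformImage", package: "DSWaveformImage"),
                .product(name: "DSWaveformImageViews", package: "DSWaveformImage"),
            ]
        ),
    ]
)
```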
Usage
DSWaveformImage provides 3 kinds of tools to use: SwiftUI views, UIKit views, and the raw API for direct image generation and sample analysis.
The core renderers and processors as well as the SwiftUI views natively support iOS & macOS, using UIImage & NSImage respectively.
SwiftUI
WaveformView - renders a one-off waveform from an audio file:
@State var audioURL = Bundle.main.url(forResource: "example_sound", withExtension: "m4a")!
WaveformView(audioURL: audioURL)
WaveformLiveCanvas - renders a live waveform from (0...1) normalized samples:
@StateObject private var audioRecorder: AudioRecorder = AudioRecorder() // just an example
WaveformLiveCanvas(samples: audioRecorder.samples)
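Pulled together, a minimal SwiftUI screen using the static view from the snippet above could look like this sketch (it assumes example_sound.m4a is bundled with the app; the frame height is arbitrary):

```swift
import SwiftUI
import DSWaveformImage
import DSWaveformImageViews

// A minimal sketch: one screen rendering a static waveform for a bundled file.
struct WaveformScreen: View {
    // Force-unwrap is acceptable here only because the file is known to be bundled.
    @State var audioURL = Bundle.main.url(forResource: "example_sound", withExtension: "m4a")!

    var body: some View {
        WaveformView(audioURL: audioURL)
            .frame(height: 120) // illustrative sizing
            .padding()
    }
}
```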
UIKit
WaveformImageView - renders a one-off waveform from an audio file:
let audioURL = Bundle.main.url(forResource: "example_sound", withExtension: "m4a")!
waveformImageView = WaveformImageView(frame: CGRect(x: 0, y: 0, width: 500, height: 300))
waveformImageView.waveformAudioURL = audioURL
Find a full example in the sample project's RecordingViewController.
WaveformLiveView - renders a live waveform from (0...1) normalized samples:
let waveformView = WaveformLiveView()
// configure and start AVAudioRecorder
let recorder = AVAudioRecorder()
recorder.isMeteringEnabled = true // required to get current power levels
// after all the other recording (omitted for focus) setup, periodically (every 20ms or so):
recorder.updateMeters() // gets the current value
let currentAmplitude = 1 - pow(10, recorder.averagePower(forChannel: 0) / 20)
waveformView.add(sample: currentAmplitude)
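The "periodically" part above is typically driven by a Timer. A sketch, assuming recorder and waveformView are set up as in the snippet above (the 0.02s interval mirrors the suggested ~20ms):

```swift
import AVFoundation

// Fires every 20ms while recording; remember to invalidate it when recording stops.
let meterTimer = Timer.scheduledTimer(withTimeInterval: 0.02, repeats: true) { _ in
    recorder.updateMeters()
    // averagePower(forChannel:) returns decibels in -160...0;
    // pow(10, dB / 20) maps that to a linear 0...1 amplitude.
    let currentAmplitude = 1 - pow(10, recorder.averagePower(forChannel: 0) / 20)
    waveformView.add(sample: currentAmplitude)
}
```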
Raw API
Configuration
Note: Calculations are always performed and returned on a background thread, so make sure to return to the main thread before doing any UI work.
Check Waveform.Configuration in WaveformImageTypes for various configuration options.
WaveformImageDrawer - creates a UIImage waveform from an audio file:
let waveformImageDrawer = WaveformImageDrawer()
let audioURL = Bundle.main.url(forResource: "example_sound", withExtension: "m4a")!
waveformImageDrawer.waveformImage(fromAudioAt: audioURL, with: .init(
    size: topWaveformView.bounds.size,
    style: .filled(UIColor.black)),
    renderer: LinearWaveformRenderer()) { image in
        // need to jump back to main queue
        DispatchQueue.main.async {
            self.topWaveformView.image = image
        }
}
WaveformAnalyzer - calculates an audio file's waveform samples:
let audioURL = Bundle.main.url(forResource: "example_sound", withExtension: "m4a")!
waveformAnalyzer = WaveformAnalyzer(audioAssetURL: audioURL)
waveformAnalyzer.samples(count: 200) { samples in
    print("so many samples: \(samples)")
}
async / await Support
The public API has been updated in 9.1 to support async / await. See the example app for an illustration.
public class WaveformAnalyzer {
    func samples(count: Int, qos: DispatchQoS.QoSClass = .userInitiated) async throws -> [Float]
}

public class WaveformImageDrawer {
    public func waveformImage(
        fromAudioAt audioAssetURL: URL,
        with configuration: Waveform.Configuration,
        qos: DispatchQoS.QoSClass = .userInitiated
    ) async throws -> UIImage
}
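A sketch of calling the async variant from a Task, assuming audioURL is defined as in the earlier snippets and the analyzer is created the same way as in the completion-handler example above:

```swift
// Kick off analysis from a synchronous context, e.g. viewDidLoad.
Task {
    do {
        let waveformAnalyzer = WaveformAnalyzer(audioAssetURL: audioURL)
        let samples = try await waveformAnalyzer.samples(count: 200)
        print("got \(samples.count) samples")
    } catch {
        // samples(count:) is throwing in the async API, so handle failures here
        print("waveform analysis failed: \(error)")
    }
}
```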
Playback Progress Indication
If you’re playing back audio files and would like to indicate the playback progress to your users, you can find inspiration in the example app. UIKit and SwiftUI examples are provided.
Both approaches will result in something like the image below.
There is currently no plan to integrate this as a 1st class citizen into the library itself, as every app will have different design requirements; WaveformImageDrawer as well as WaveformAnalyzer are as simple to use as the views themselves, as you can see in the examples.
Loading remote audio files from URL
For one example way to display waveforms for audio files on remote URLs see https://github.com/dmrschmidt/DSWaveformImage/issues/22.
What it looks like
Waveforms can be rendered in 2 different ways and 5 different styles each.
By default, LinearWaveformRenderer is used, which draws a linear 2D amplitude envelope. CircularWaveformRenderer is available as an alternative and can be passed in to the WaveformView or WaveformLiveView respectively; it draws a circular 2D amplitude envelope.
You can implement your own renderer by implementing WaveformRenderer.
The following styles can be applied to either renderer:
- filled: Use solid color for the waveform.
- outlined: Draws the envelope as an outline with the provided thickness.
- gradient: Use gradient based on color for the waveform.
- gradientOutlined: Use gradient based on color for the waveform. Draws the envelope as an outline with the provided thickness.
- striped: Use striped filling based on color for the waveform.
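Combining a renderer and a style from the lists above might look like this sketch. It reuses the drawer API shown earlier; the size and color are arbitrary, imageView is a hypothetical UIImageView, and CircularWaveformRenderer is assumed to offer a parameterless initializer:

```swift
// Render the same audio circularly with a solid fill.
let drawer = WaveformImageDrawer()
drawer.waveformImage(fromAudioAt: audioURL, with: .init(
    size: CGSize(width: 320, height: 320),
    style: .filled(UIColor.systemBlue)),
    renderer: CircularWaveformRenderer()) { image in
        DispatchQueue.main.async {
            // use the resulting image, e.g. assign it to an image view
            imageView.image = image
        }
}
```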
Live waveform rendering
https://user-images.githubusercontent.com/69365/127739821-061a4345-0adc-4cc1-bfd6-f7cfbe1268c9.mov
Migration
In 13.0.0
- Any mentions of dampening & similar were corrected to damping etc. in 11460b8b. Most notably in Waveform.Configuration. See #64.
- styles .outlined & .gradientOutlined were added to Waveform.Style, see https://github.com/dmrschmidt/DSWaveformImage#what-it-looks-like
- Waveform.Position was removed. If you were using it to place the view somewhere, move this responsibility up to its parent for positioning, like with any other view.
In 12.0.0
- The rendering pipeline was split out from the analysis. You can now create your own renderers by subclassing WaveformRenderer.
- A new CircularWaveformRenderer has been added.
- position was removed from Waveform.Configuration, see 0447737.
- New Waveform.Style options have been added and need to be accounted for in switch statements etc.
In 11.0.0
the library was split into two: DSWaveformImage and DSWaveformImageViews. If you've used any of the native views before, just add the additional import DSWaveformImageViews.
The SwiftUI views have changed from taking a Binding to taking the respective plain values instead.
In 9.0.0
a few public APIs have been slightly changed to be more concise. All types have also been grouped under the Waveform enum-namespace, meaning WaveformConfiguration for instance has become Waveform.Configuration and so on.
In 7.0.0
colors have moved into associated values on the respective style enum.
Waveform and the UIImage category have been removed in 6.0.0 to simplify the API. See Usage for current usage.
See it live in action
SoundCard - postcards with sound lets you send real, physical postcards with audio messages. Right from your iOS device.
DSWaveformImage is used to draw the waveforms of the audio messages that get printed on the postcards sent by SoundCard - postcards with audio.
Download SoundCard on the App Store.