A low-level haptic rendering and data processing library based on a universal
format. Designed as a core component for integration into game engines, audio
middleware, and other systems, Haptics SDK runs in production across millions of
devices daily.
Feature overview
🎨 Platform-Agnostic Haptic Input Data
Works with normalized haptic parameters such as amplitude, frequency, and
transients, rather than raw waveforms. This approach aligns with the broader
industry shift toward standardized, device-agnostic haptic data structures (not
only via the .haptic format in this repo but also Apple’s .ahap format). By
separating haptic design intent from device playback capabilities, you can
ensure consistent experiences across diverse hardware.
⚙️ Configuration-Based Render Output
Actuator Configuration Files (ACF) define how abstract haptic data maps to
specific hardware. This makes it easy to support multiple output platforms
without modifying your code, streamlining integration and maintenance.
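To make the idea concrete, here is a minimal sketch of how normalized 0.0-1.0 input values could be mapped into a device's usable range under a per-device configuration. The types and function names below are invented for illustration; they are not the SDK's API, and real ACFs use the schema documented in resources/acf.

```cpp
#include <algorithm>

// Hypothetical per-device description. Real ACFs live in resources/acf
// and have their own schema; this struct only illustrates the concept.
struct DeviceConfig {
    float frequency_min;  // lowest usable frequency in Hz
    float frequency_max;  // highest usable frequency in Hz
    float max_gain;       // overall output scaling for this actuator
};

// Map a normalized frequency (0.0-1.0) into the device's usable band.
inline float mapFrequency(const DeviceConfig& cfg, float normalized) {
    normalized = std::clamp(normalized, 0.0f, 1.0f);
    return cfg.frequency_min +
           normalized * (cfg.frequency_max - cfg.frequency_min);
}

// Scale a normalized amplitude (0.0-1.0) by the device gain.
inline float mapAmplitude(const DeviceConfig& cfg, float normalized) {
    return std::clamp(normalized, 0.0f, 1.0f) * cfg.max_gain;
}
```

Because the mapping lives in configuration data rather than in application code, swapping the target device means swapping the config, not recompiling content pipelines.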
🎛️ Dual Rendering Modes
Supports a wide range of platform APIs and device capabilities, including both
PCM-based systems and those based on simple amplitude curves.
🔧 Integration Friendly
No external dependencies - Standalone C/C++ source code
Minimal API surface - Simple, easy-to-understand integration points
How It Works
Meta Haptics SDK is a haptic rendering library with a five-component
architecture that separates content from hardware.
Core Components
The SDK consists of five main components that work together:
.haptic Files (Specification: resources/haptic-file-spec.md): The
.haptic format is a JSON-based, device-agnostic representation of
vibrotactile effects. This repository open-sources the .haptic file
specification, but the underlying renderer is file format agnostic and
easily compatible with similar parametric formats like Apple’s AHAP. The
format is designed for interchange between design tools, game engines, and
runtime systems.
Parametric Haptic Data (core/haptic_data_parametric): A C++ library for
the file format used by Parametric Haptic Renderer (below). It takes .haptic
files as input and converts them to Parametric Haptic Data.
Actuator Configuration Files (ACF) (resources/acf): JSON5 configuration
files that define how normalized haptic data (0.0-1.0) within .haptic files
(and Parametric Haptic Data) maps to specific hardware characteristics. They
contain device-specific parameters like frequency ranges, gain curves, and
transient waveform shapes.
Parametric Renderer (core/renderer_parametric): A real-time C++ rendering
library. You initialize it with Parametric Haptic Data and ACF data and
then drive it from an update loop or callback. In turn, it has the Haptic
Renderer (see below) render slices of parametric data to the format required
by the platform haptic API.
Haptic Renderer (core/renderer_c): A lightweight C library that
processes parametric control points (amplitude and frequency ramps,
transients) and generates output samples according to the ACF at a target
sample rate. It operates in real-time, maintaining internal state for smooth
interpolation between control points.
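The core idea, interpolating control points at a target sample rate, can be sketched as follows. This is illustrative only: the actual Haptic Renderer is the C library in core/renderer_c with its own API, and the names here are invented for the sketch.

```cpp
#include <cstddef>
#include <vector>

// A single point on a normalized amplitude envelope.
struct ControlPoint {
    float time;       // seconds
    float amplitude;  // normalized 0.0-1.0
};

// Linearly interpolate the amplitude envelope between control points,
// producing one value per output sample at the given sample rate.
std::vector<float> renderEnvelope(const std::vector<ControlPoint>& points,
                                  float duration, int sampleRate) {
    std::vector<float> out(static_cast<std::size_t>(duration * sampleRate),
                           0.0f);
    for (std::size_t i = 0; i < out.size(); ++i) {
        float t = static_cast<float>(i) / sampleRate;
        // Find the segment containing t and interpolate within it.
        for (std::size_t p = 0; p + 1 < points.size(); ++p) {
            if (t >= points[p].time && t <= points[p + 1].time) {
                float span = points[p + 1].time - points[p].time;
                float frac = span > 0.0f ? (t - points[p].time) / span : 0.0f;
                out[i] = points[p].amplitude +
                         frac * (points[p + 1].amplitude - points[p].amplitude);
                break;
            }
        }
    }
    return out;
}
```

A real renderer additionally carries state across render calls so interpolation stays smooth when the envelope is processed in slices.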
Rendering Process
Meta Haptics SDK interpolates between hardware-agnostic parametric control
points, and based on the ACF, renders the optimal output for the platform API or
device hardware.
The renderer processes time-based control points (amplitude, frequency,
transients) and generates device-specific output samples.
This separation enables:
Targeting any configured hardware with the same content
Tuning haptic feel per-device via ACF configuration
Using the same content pipeline across all platforms
Design Philosophy
Meta Haptics SDK uses normalized, parametric data to distinctly separate
design-time concerns from runtime implementation. This approach provides
significant benefits for both content creators and system integrators.
A Low-Level Building Block for Haptic Systems
Meta Haptics SDK is a haptic rendering and data processing library designed
to be integrated into larger systems. It focuses on a single responsibility:
efficiently converting device-agnostic vibrotactile data into device and
platform-specific output.
What it provides:
Real-time rendering of parametric haptic data (amplitude/frequency envelopes,
transients)
Device-specific adaptation through Actuator Configuration Files (ACF)
Pure C/C++ implementation with zero external dependencies
File format agnostic renderer compatible with .haptic, AHAP, or similar
formats
What it doesn’t provide:
High-level playback APIs (play, stop, pause, seek) or playback location.
Haptic prioritization or mixing systems
Threading models or process communication
Asset management or file I/O
This design allows integration into systems that already have their own models
for playback control, threading, and resource management—such as game engines,
audio middleware, and platform runtimes.
For Haptic Designers
Hardware capabilities vary widely across platforms—from high-end console
controllers to simple gamepads. Requiring designers to create platform-specific
haptic assets is highly inefficient:
Haptics typically has the lowest priority and smallest budget in most
projects
Rework is cost-prohibitive - designers need to author once and deploy
everywhere
Design intent, not hardware specifics - normalized envelopes (0.0-1.0)
capture what designers want to express, not how specific motors should vibrate
By storing only design intent in normalized primitives, the system enables:
Graceful degradation on lower-capability hardware
High-quality output on advanced systems
Cross-platform compatibility - haptics work on new platforms without
rework
Forward and backward compatibility - content remains valid as systems
evolve
For System Integrators
Middleware and game engine creators need maintainable solutions across numerous
platforms. This SDK provides:
Configuration-based rendering - add new platforms with ACF files, not code
changes
No upstream content changes - existing .haptic files work on new
platforms automatically
Cost-effective maintenance - minimal engineering effort for platform
support
Why Not PCM?
Many systems use PCM (Pulse Code Modulation) as their haptic data format, but
PCM leaks hardware specifics into the design:
PCM waveforms encode frequency information specific to target hardware
Designers must re-render PCM files for every platform with different motor
characteristics
Changes in hardware require re-authoring all content
Parametric data solves this by describing what to feel (amplitude,
frequency, transients), not how motors should move. The renderer handles
hardware translation at runtime.
FAQ
General Questions
Q: What platforms does Meta Haptics SDK support?
A: The SDK is platform-agnostic C/C++ code. It runs on Windows, macOS,
Linux, and consoles. Integration examples focus on Meta Quest via OpenXR.
Q: Can I use this with my existing game engine?
A: Yes! The SDK integrates into Unity, Unreal, FMOD, and Wwise. You can also
integrate directly into custom engines—it’s just C/C++.
Q: Do I need Meta hardware to use this?
A: No. While the SDK powers Meta Quest haptics, it’s designed to work with
any vibrotactile actuator. Use the ACF system to tune for your target hardware.
Q: Is this only for VR?
A: No. The SDK works with any haptic-enabled device: VR controllers, PCVR,
game console controllers (PlayStation® DualSense™, Xbox), and more.
Technical Questions
Q: When should I use Synthesis vs Amplitude Curve mode?
A:
Synthesis Mode - When the platform supports PCM waveforms (Quest,
DualSense). Provides full frequency control.
Amplitude Curve Mode - When the platform only accepts amplitude values
(some Android APIs, simpler consoles).
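Conceptually, the two modes consume the same envelope data but emit different output. A minimal sketch, with hypothetical function names that are not the SDK's API:

```cpp
#include <cmath>
#include <cstddef>
#include <vector>

// Synthesis mode (illustrative): generate PCM samples, e.g. a sine
// carrier whose strength and pitch follow the amplitude/frequency values.
std::vector<float> synthesize(float amplitude, float frequencyHz,
                              float duration, int sampleRate) {
    std::vector<float> pcm(static_cast<std::size_t>(duration * sampleRate));
    for (std::size_t i = 0; i < pcm.size(); ++i) {
        float t = static_cast<float>(i) / sampleRate;
        pcm[i] = amplitude * std::sin(2.0f * 3.14159265f * frequencyHz * t);
    }
    return pcm;
}

// Amplitude curve mode (illustrative): emit only amplitude values at a
// low update rate; the platform drives the motor itself, so frequency
// information is dropped.
std::vector<float> amplitudeCurve(float amplitude, int updates) {
    return std::vector<float>(static_cast<std::size_t>(updates), amplitude);
}
```

The practical consequence matches the answer above: synthesis mode preserves the frequency dimension of the content, while amplitude curve mode degrades gracefully to what the platform accepts.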
Q: How do I tune an ACF for my hardware?
A:
Start with a template ACF (e.g., Generic-Console-PCM.acf)
Consult your actuator’s datasheet for frequency response
Set frequency_min/frequency_max to the usable range (e.g., 55-200 Hz)
Tune emphasis settings around the resonant frequency (f₀)
See the ACF specification for detailed guidance.
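For illustration only, a configuration in this spirit might look like the fragment below. Apart from frequency_min and frequency_max, which the tuning steps above mention, the field names are hypothetical; the authoritative schema lives in resources/acf/README.md.

```json5
{
  // Usable band of the actuator, taken from its datasheet
  frequency_min: 55,
  frequency_max: 200,
  // Hypothetical emphasis tuning around the resonant frequency f0
  emphasis: {
    resonant_frequency: 170,
  },
}
```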
Q: Can I create .haptic files programmatically?
A: Yes. The .haptic specification defines the data format. You can generate
files in any language and validate against the spec.
Q: What’s the difference between emphasis and transients?
A: They’re the same concept, different terminology. “Emphasis” is the term
used in the renderer API, “transient” is used in haptic design (short,
click-like events).
Compatibility Questions
Q: Can I use .haptic files created in Meta Haptics Studio?
A: Absolutely! That’s the primary design workflow. Studio exports .haptic
files that this SDK renders at runtime.
Contributing
We welcome contributions! Whether you’re fixing bugs, adding platform examples,
or improving documentation, your help is greatly appreciated. To set
expectations: the maintainers of this repository are committed to strong
stewardship of its technical direction and feature set. As a result, not every
suggestion or contribution may be accepted. However, please don’t let that
discourage you: we’re a friendly team and always happy to discuss your ideas for
improvements!
Thank you for your interest in making this project better.
Industry Adoption
Meta Haptics SDK serves as the core haptic rendering technology in major tools and platforms:
Audio Middleware
Natively integrated into professional audio middleware:
Game Engines
Design Tools
Getting Started
Prerequisites
Build Tools
Supported Platforms
Quick Start
Building
Using just (Recommended)
Using CMake
The justfile in the directory of each component contains the CMake commands.
Examples
1. Authoring a .haptic file
.haptic files contain the contents of a haptic effect.
Example .haptic File
Key Concepts
Design Tools: Create .haptic files visually using Meta Haptics Studio or author them by hand.
.haptic Specification: resources/haptic-file-spec.md
2. Authoring an Actuator Configuration File (ACF)
ACF files define device-specific characteristics, translating normalized haptic data (0.0-1.0) into hardware-appropriate parameters.
Example ACF (Generic PCM Console)
Provided ACF Templates
Generic-Console-PCM.acf - Full waveform synthesis (PlayStation® DualSense, etc.)
Generic-Console-SimpleHaptics.acf - Amplitude-only output (Xbox, etc.)
Meta-Quest.acf - Meta Quest controller tuning
Tuning Your ACF: Start with a template and tune by feel. Adjust frequency ranges based on your actuator’s datasheet and perceptual testing.
ACF Specification: resources/acf/README.md
3. Parsing and Rendering Haptic Data in C++
To parse and render haptic data you need a .haptic file and an ACF (see examples above). Then do the following:
Convert the .haptic to a ParametricHapticClip using the Parametric Haptic Data library.
View Parametric Haptic Renderer Example
4. PCM Haptics for Unity and Unreal
Complete integration examples for Unity and Unreal demonstrating how to render haptics on devices that implement OpenXR PCM haptics (Meta Quest).
View Unity Example
View Unreal Example
Documentation
Complete Documentation Set
External Resources
See our Contribution Guidelines for more information.
Code of Conduct
See our Code of Conduct.
License
Meta Haptics SDK is released under the MIT License. See LICENSE for details.
Need help?
Open an issue or check our documentation.
Ready to design haptics?
Download Meta Haptics Studio.
Built with love by Meta’s IxI Haptics team (formerly Lofelt ❤️)