The NVIDIA® DriveWorks SDK is the foundation for all autonomous vehicle (AV) software development. It provides an extensive set of fundamental capabilities, including processing modules, tools and frameworks that are required for advanced AV development.

With the DriveWorks SDK, developers can begin innovating their own AV solution, instead of spending time developing basic low-level functionality. DriveWorks is modular, open, and readily customizable. Developers can use a single module within their own software stack to achieve a specific function, or use multiple modules to accomplish a higher-level objective.

DriveWorks is suited for the following:

  • Integrate automotive sensors into your software.
  • Accelerate image and lidar data processing for AV algorithms.
  • Interface with a vehicle’s ECUs and receive their state.
  • Accelerate neural network inference for AV perception.
  • Capture and post-process data from multiple sensors.
  • Calibrate multiple sensors with precision.
  • Track and predict a vehicle’s pose.


Sensor Abstraction Layer

The Sensor Abstraction Layer (SAL) provides:

  • A unified interface to the vehicle’s sensors.
  • Synchronized timestamping of sensor data.
  • Ability to serialize the sensor data for recording.
  • Ability to replay recorded data.
  • Image signal processing acceleration via the Xavier SoC’s ISP hardware engine.

All DriveWorks modules are compatible with the SAL, reducing the need for specialized sensor data processing. The SAL supports a diverse set of automotive sensors out of the box. Additional support can be added with the robust sensor plugin architecture.
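The idea of a unified, timestamped sensor interface can be sketched as follows. This is an illustrative sketch only; `SensorEvent`, `AbstractSensor`, and `ReplaySensor` are hypothetical names, not DriveWorks APIs, and the replay class stands in for the SAL's record/replay capability.

```cpp
#include <cstdint>
#include <queue>
#include <string>
#include <utility>
#include <vector>

// One timestamped "event" shape shared by all sensor types, in the
// spirit of the SAL's unified interface (names are hypothetical).
struct SensorEvent {
    int64_t timestampUs;          // host-synchronized capture time
    std::string sensorName;       // e.g. "camera.front", "lidar.top"
    std::vector<uint8_t> payload; // raw sensor bytes
};

class AbstractSensor {
public:
    virtual ~AbstractSensor() = default;
    // Uniform read call regardless of the underlying sensor type.
    virtual bool readEvent(SensorEvent& out) = 0;
};

// A fake sensor that replays pre-recorded events, mimicking how a
// replay source can be swapped in for live hardware.
class ReplaySensor : public AbstractSensor {
public:
    explicit ReplaySensor(std::queue<SensorEvent> recorded)
        : events_(std::move(recorded)) {}
    bool readEvent(SensorEvent& out) override {
        if (events_.empty()) return false;
        out = events_.front();
        events_.pop();
        return true;
    }
private:
    std::queue<SensorEvent> events_;
};
```

Because every sensor produces the same event shape, downstream modules can consume live, recorded, or synthetic data through one code path.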

These sensor plugins provide flexible options for interfacing new AV sensors with the SAL. Features including sensor lifecycle management, timestamp synchronization and sensor data replay can be realized with minimal development effort.

  • Custom Camera Feature: DriveWorks enables external developers to add DriveWorks support for any image sensor using the GMSL interface on the DRIVE AGX Developer Kit.

  • Comprehensive Sensor Plugin Framework: This allows developers to bring new lidar, radar, IMU, GPS and CAN-based sensors into the DriveWorks SAL that are not natively supported by DriveWorks.
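A plugin framework of this kind typically boils down to a registry that maps a protocol string to a factory for the plugin's sensor type. The sketch below illustrates that pattern only; `SensorRegistry`, `CustomSensor`, and `MyLidar` are hypothetical names, not the DriveWorks plugin API.

```cpp
#include <functional>
#include <map>
#include <memory>
#include <string>
#include <utility>

// Base type every plugin-provided sensor implements (hypothetical).
struct CustomSensor {
    virtual ~CustomSensor() = default;
    virtual std::string protocol() const = 0;
};

using SensorFactory = std::function<std::unique_ptr<CustomSensor>()>;

// The framework side: plugins register a creator under a protocol
// name, and sensors are later instantiated by that name.
class SensorRegistry {
public:
    void registerPlugin(const std::string& proto, SensorFactory f) {
        factories_[proto] = std::move(f);
    }
    std::unique_ptr<CustomSensor> create(const std::string& proto) const {
        auto it = factories_.find(proto);
        return it == factories_.end() ? nullptr : it->second();
    }
private:
    std::map<std::string, SensorFactory> factories_;
};

// The plugin side: a custom lidar not natively supported.
struct MyLidar : CustomSensor {
    std::string protocol() const override { return "lidar.custom"; }
};
```

With this shape, lifecycle management and replay can live entirely in the framework, while a plugin only supplies the protocol-specific pieces.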

Image and Point Cloud Processing

DriveWorks provides a wide array of optimized low-level image and point cloud processing modules that prepare incoming sensor data for use in higher-level perception, mapping, and planning algorithms. Selected modules can be seamlessly run and accelerated on different DRIVE AGX hardware engines (such as the PVA or GPU), giving the developer options and control over their application. Image processing capabilities include feature detection and tracking, structure from motion, and rectification. Point cloud processing capabilities include lidar packet accumulation, registration, planar segmentation, and more. New and improved modules are delivered with each release.
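To make "planar segmentation" concrete, here is a toy version: classifying which points of a cloud lie near a given plane. This is not the DriveWorks module (which estimates the plane itself and can run on GPU or PVA); it is a minimal CPU sketch of the underlying idea, with all names invented for illustration.

```cpp
#include <cmath>
#include <cstddef>
#include <vector>

struct Point3f { float x, y, z; };

// Plane a*x + b*y + c*z + d = 0 with (a, b, c) a unit normal.
struct Plane { float a, b, c, d; };

// Return indices of points within `tol` meters of the plane,
// e.g. ground points for a ground plane estimate.
std::vector<std::size_t> segmentPlanar(const std::vector<Point3f>& cloud,
                                       const Plane& p, float tol) {
    std::vector<std::size_t> inliers;
    for (std::size_t i = 0; i < cloud.size(); ++i) {
        float dist = std::fabs(p.a * cloud[i].x + p.b * cloud[i].y +
                               p.c * cloud[i].z + p.d);
        if (dist <= tol) inliers.push_back(i);
    }
    return inliers;
}
```

A production pipeline would first accumulate lidar packets into a full spin, register clouds across time, and estimate the plane robustly (e.g. with RANSAC) before this kind of inlier test.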



VehicleIO

The VehicleIO module supports multiple production drive-by-wire backends, sending commands to and receiving status from the vehicle. If your drive-by-wire device is not supported out of the box, the VehicleIO plugin framework enables easy integration of custom interfaces.
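The command-down / status-up shape of such a backend can be sketched as below. All types and names here are illustrative assumptions, not the VehicleIO API; the loopback backend stands in for a custom plugin.

```cpp
// Hypothetical command and status shapes for a drive-by-wire backend.
struct VehicleCommand {
    float steeringAngleRad; // requested road-wheel angle
    float throttle;         // 0..1
    float brake;            // 0..1
};

struct VehicleStatus {
    float speedMps;
    bool driveByWireEngaged;
};

// The interface a backend (native or plugin) would implement.
class VehicleBackend {
public:
    virtual ~VehicleBackend() = default;
    virtual void sendCommand(const VehicleCommand& cmd) = 0;
    virtual VehicleStatus readStatus() = 0;
};

// A plugin-style stub, e.g. while bringing up an unsupported vehicle:
// it echoes back fabricated status derived from the last command.
class LoopbackBackend : public VehicleBackend {
public:
    void sendCommand(const VehicleCommand& cmd) override { last_ = cmd; }
    VehicleStatus readStatus() override {
        return {last_.throttle * 10.0f, true};
    }
private:
    VehicleCommand last_{0.0f, 0.0f, 0.0f};
};
```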

DNN Framework

The DriveWorks Deep Neural Network (DNN) Framework loads and runs inference on TensorRT models, whether provided with DRIVE Software or independently trained. The DNN Plugins module allows models containing layers not natively supported by TensorRT to still benefit from TensorRT's efficiency. The framework can accelerate inference on the integrated GPU (in the Xavier SoC), the discrete GPU (in DRIVE AGX Pegasus), or the integrated DLA (Deep Learning Accelerator in the Xavier SoC).
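The typical call pattern — deserialize an engine, then feed input buffers through it — can be illustrated with a stand-in engine. The `Engine` class below is not TensorRT or the DNN Framework; it is a mock (a fixed affine layer) whose only purpose is to show the load-then-infer flow.

```cpp
#include <cstddef>
#include <vector>

// Mock engine: pretends to deserialize a model, then runs inference.
// The "model" here is hard-coded to y[i] = 2*x[i] + 1 purely so the
// flow has observable behavior; real engines come from a build step.
class Engine {
public:
    static Engine load(const std::vector<char>& /*serializedModel*/) {
        return Engine();
    }
    void infer(const std::vector<float>& input,
               std::vector<float>& output) const {
        output.resize(input.size());
        for (std::size_t i = 0; i < input.size(); ++i)
            output[i] = 2.0f * input[i] + 1.0f;
    }
};
```

In the real framework the same flow applies, with the added choice of which hardware engine (GPU or DLA) executes the inference.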



Tools

To improve developer productivity, DriveWorks provides an extensive set of tools, reference applications, and documentation, including:

  • Sensor Data Recording and Post-Recording Tools: A suite of tools to record, synchronize, and play back the data captured from multiple sensors interfaced to the NVIDIA DRIVE™ AGX platform. Recorded data can be used as a high-quality, synchronized source for training and other development purposes.


Calibration

If a sensor’s position or inherent properties deviate from the nominal/assumed parameters, any downstream processing may be faulty, whether for self-driving or data collection. Calibration gives you a reliable foundation for building AV solutions, with the assurance of high-fidelity, consistent, up-to-date sensor data. Calibration supports alignment of the vehicle’s camera, lidar, radar, and Inertial Measurement Unit (IMU) sensors that are compatible with the DriveWorks Sensor Abstraction Layer.

  • Static Calibration Tools: Measure manufacturing variation for multiple AV sensors to a high degree of accuracy. Camera calibration includes both extrinsic and intrinsic calibration, while the IMU Calibration Tool calibrates orientation with respect to the vehicle’s coordinate system.

  • Self-Calibration: Provides real-time compensation for environmental changes or mechanical stress on sensors caused by events such as changes in road gradient, tire pressure, or vehicle passenger loading. It corrects the nominal calibration parameters (captured using the Static Calibration Tools) against current sensor measurements in real time. The algorithms are performant, designed for safety compliance, and optimized for the platform.
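The "nominal parameter plus continuously estimated correction" idea behind self-calibration can be shown with a deliberately simple estimator. This sketch assumes a single scalar parameter (say, camera pitch) and uses an exponential moving average; real self-calibration fuses far richer signals, and `SelfCalibrator` is an invented name.

```cpp
// Toy self-calibration: track a correction to a nominal parameter from
// a stream of noisy online estimates via an exponential moving average.
class SelfCalibrator {
public:
    SelfCalibrator(float nominal, float gain)
        : nominal_(nominal), correction_(0.0f), gain_(gain) {}

    // Feed one online measurement of the true parameter value.
    void update(float measured) {
        correction_ += gain_ * ((measured - nominal_) - correction_);
    }

    // Nominal (static) calibration plus the learned correction.
    float calibrated() const { return nominal_ + correction_; }

private:
    float nominal_, correction_, gain_;
};
```

The key property this illustrates: the static calibration is never discarded, only refined, so the system degrades gracefully when online measurements are sparse.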



Egomotion

The DriveWorks Egomotion module uses a motion model to track and predict a vehicle’s pose. DriveWorks uses two types of motion models: an odometry-only model and, if an IMU is available, a model based on IMU and odometry.
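An odometry-only motion model amounts to integrating pose from speed and yaw rate. The sketch below shows a planar unicycle-model step; it is a minimal illustration under that assumption, not the Egomotion API, and the IMU+odometry variant would additionally fuse inertial measurements.

```cpp
#include <cmath>

struct Pose2D {
    double x, y;     // position in meters
    double heading;  // yaw in radians
};

// Advance the pose by dt seconds given speed (m/s) and yaw rate (rad/s).
Pose2D predict(const Pose2D& p, double speedMps, double yawRateRps,
               double dt) {
    Pose2D out = p;
    out.heading += yawRateRps * dt;
    // Midpoint heading approximates the arc better than the endpoint.
    double h = p.heading + 0.5 * yawRateRps * dt;
    out.x += speedMps * dt * std::cos(h);
    out.y += speedMps * dt * std::sin(h);
    return out;
}
```

Calling `predict` repeatedly with wheel-odometry readings dead-reckons the pose; prediction into the near future uses the same step with the latest speed and yaw rate held constant.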