ONNX Runtime Node.js Binding

ONNX Runtime Node.js binding enables Node.js applications to run ONNX model inference.

Usage

Install the latest stable version:

npm install onnxruntime-node

Refer to ONNX Runtime JavaScript examples for samples and tutorials.
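
As a minimal sketch of the onnxruntime-node API, the snippet below loads a model and runs a single inference call. The model path `./model.onnx`, the input name `input`, and the tensor shape `[1, 4]` are placeholders and must be replaced to match your own model.

```js
// Minimal sketch: run inference with onnxruntime-node.
// Model path, input name, and tensor shape are placeholders for your model.
const ort = require('onnxruntime-node');

async function main() {
  // Load the model and create an inference session.
  const session = await ort.InferenceSession.create('./model.onnx');

  // Prepare a dummy float32 input tensor of shape [1, 4].
  const data = Float32Array.from([1, 2, 3, 4]);
  const input = new ort.Tensor('float32', data, [1, 4]);

  // Run inference; feed keys must match the model's input names.
  const results = await session.run({ input: input });
  console.log(results);
}

main().catch((err) => console.error(err));
```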

Requirements

ONNX Runtime works on Node.js v16.x+ (v18.x+ recommended) or Electron v15.x+ (v28.x+ recommended).

The following table lists the platforms and execution providers (EPs) for which pre-built binaries of the ONNX Runtime Node.js binding are provided.

| EPs / Platforms | Windows x64 | Windows arm64 | Linux x64 | Linux arm64 | MacOS x64 | MacOS arm64 |
|-----------------|-------------|---------------|-----------|-------------|-----------|-------------|
| CPU             | ✔️          | ✔️            | ✔️        | ✔️          | ✔️        | ✔️          |
| DirectML        | ✔️          | ✔️            |           |             |           |             |
| CUDA            |             |               | ✔️ [1]    |             |           |             |

- [1]: CUDA v11.8.

To use the binding on platforms without pre-built binaries, you can build the Node.js binding from source and consume it via npm install <onnxruntime_repo_root>/js/node/. See also the instructions for building the ONNX Runtime Node.js binding locally.

GPU Support

Right now, GPU support on Windows is limited to the DirectML (DML) execution provider. On Linux x64, the CUDA and TensorRT execution providers are supported.
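
A hedged sketch of selecting a GPU execution provider through the executionProviders session option is shown below; the provider names 'cuda' (Linux x64), 'dml' (Windows), and 'cpu' (fallback) reflect the platform support above, and the model path is a placeholder.

```js
const ort = require('onnxruntime-node');

// Sketch: request a GPU execution provider when creating the session.
// Use 'dml' on Windows (DirectML) or 'cuda' on Linux x64; 'cpu' serves as a fallback.
async function createGpuSession(modelPath) {
  return ort.InferenceSession.create(modelPath, {
    executionProviders: ['cuda', 'cpu'],
  });
}
```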

CUDA EP Installation

To use the CUDA EP, you need to install the CUDA EP binaries. By default, these binaries are installed automatically when you install the package. To skip this installation, pass the --onnxruntime-node-install-cuda=skip flag to the installation command.

npm install onnxruntime-node --onnxruntime-node-install-cuda=skip

You can also use this flag to specify the CUDA version (v11 or v12):

npm install onnxruntime-node --onnxruntime-node-install-cuda=v12

License

License information can be found here.
