cONNXr

Pure C ONNX runtime for small embedded devices with zero dependencies.

Established · Open Source · Low lock-in

Pricing: See website (flat rate)

Adoption: Stable

License: Open Source

Overview

What is cONNXr?

cONNXr is an ONNX runtime written in pure C99 that runs inference on machine learning models exported from any framework that supports the ONNX format. Its minimal footprint and lack of dependencies make it well suited to constrained and older hardware.
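Pure-C inference means every ONNX operator is implemented as plain C99 loops over arrays, with no BLAS or vendor libraries behind them. A minimal illustrative sketch of the kind of kernel such a runtime executes, here a dense (fully connected) layer followed by ReLU; this is not cONNXr's actual API, just a picture of the technique:

```c
#include <stddef.h>

/* Illustrative only: y = relu(W * x + b) for a rows-by-cols weight
 * matrix, written as the plain C99 loops a zero-dependency runtime
 * would use. Not cONNXr's real operator implementation. */
static void dense_relu(const float *W, const float *b,
                       const float *x, float *y,
                       size_t rows, size_t cols)
{
    for (size_t i = 0; i < rows; i++) {
        float acc = b[i];                      /* start from the bias */
        for (size_t j = 0; j < cols; j++)
            acc += W[i * cols + j] * x[j];     /* dot product, row-major */
        y[i] = acc > 0.0f ? acc : 0.0f;        /* ReLU activation */
    }
}
```

Because the kernel only needs the C standard library, it compiles with any C99 toolchain, which is the property that makes this approach viable on older or bare-metal targets.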

Key differentiator

cONNXr stands out as a pure C99 implementation with zero dependencies, making it uniquely suited for environments where resource constraints are critical.


Honest assessment

Strengths

Zero dependencies

Pure C99 implementation

Compatibility with older devices

ONNX model inference support

Minimal footprint
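The zero-dependency claim has a concrete consequence: ONNX models are serialized as Protocol Buffers, so a runtime that links no external libraries must parse that wire format itself. A minimal sketch of decoding a protobuf base-128 varint, the building block of that format, in portable C99 (illustrative only, not cONNXr's actual parser):

```c
#include <stddef.h>
#include <stdint.h>

/* Decode one protobuf varint from buf. Each byte contributes its low
 * 7 bits, least significant group first; the high bit marks "more
 * bytes follow". Returns the number of bytes consumed, or 0 if the
 * input is truncated or longer than a uint64_t allows. */
static size_t decode_varint(const uint8_t *buf, size_t len, uint64_t *out)
{
    uint64_t value = 0;
    for (size_t i = 0; i < len && i < 10; i++) {
        value |= (uint64_t)(buf[i] & 0x7F) << (7 * i);
        if ((buf[i] & 0x80) == 0) {   /* continuation bit clear: done */
            *out = value;
            return i + 1;
        }
    }
    return 0;
}
```

For example, the two bytes 0xAC 0x02 decode to 300: 0x2C plus 0x02 shifted left by 7 bits.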

Fit analysis

Who is it for?

✓ Best for

Developers needing to run machine learning inference on devices with very limited resources and no external dependencies

Projects that require a lightweight, standalone ONNX runtime for edge computing tasks

✕ Not a fit for

Applications requiring high-performance GPU acceleration

Environments where external dependencies are acceptable, since full-featured runtimes typically offer broader operator coverage and better performance

Cost structure

Pricing

Free Tier: None

Starts at: See website

Model: Flat rate

Enterprise: None


Next step

Get Started with cONNXr

Step-by-step setup guide with code examples and common gotchas.
