cONNXr
Pure C ONNX runtime for small embedded devices with zero dependencies.
Pricing: See website (flat rate)
Adoption: Stable
License: Open Source
Data freshness: —
Overview
What is cONNXr?
cONNXr is an ONNX runtime written in pure C99, designed to run inference on machine learning models exported from any framework that supports the ONNX format. Its minimal footprint, zero dependencies, and compatibility with older toolchains make it well suited to developers working with constrained hardware.
Key differentiator
“cONNXr stands out as a pure C99 implementation with zero dependencies, making it uniquely suited for environments where resource constraints are critical.”
Fit analysis
Who is it for?
✓ Best for
Developers needing to run machine learning inference on devices with very limited resources and no external dependencies
Projects that require a lightweight, standalone ONNX runtime for edge computing tasks
✕ Not a fit for
Applications requiring high-performance GPU acceleration
Environments where external libraries or frameworks are readily available, making a zero-dependency runtime unnecessary
Cost structure
Pricing
Free tier: None
Starts at: See website
Model: Flat rate
Enterprise: None
Next step
Get Started with cONNXr
Step-by-step setup guide with code examples and common gotchas.