OnnxStack.Core 0.39.0
OnnxStack.Core - Onnx Services for .NET Applications
OnnxStack.Core is a library that provides simplified wrappers for OnnxRuntime.
Getting Started
OnnxStack.Core is available via the NuGet package manager; install it with:
PM> Install-Package OnnxStack.Core
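Alternatively, the package can be installed from the dotnet CLI (standard NuGet tooling, not specific to this package):

```shell
# Add the OnnxStack.Core package to the current project
dotnet add package OnnxStack.Core
```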
Dependencies
Video processing support requires the FFMPEG and FFPROBE binaries. The files must either be present in your output folder or have their locations configured at runtime:
https://ffbinaries.com/downloads
https://github.com/ffbinaries/ffbinaries-prebuilt/releases/download/v6.1/ffmpeg-6.1-win-64.zip
https://github.com/ffbinaries/ffbinaries-prebuilt/releases/download/v6.1/ffprobe-6.1-win-64.zip
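One way to ensure the binaries land in the output folder is to copy them from your project file. This is a generic MSBuild sketch, not an OnnxStack-specific mechanism; it assumes you have placed ffmpeg.exe and ffprobe.exe in a tools folder next to the .csproj:

```xml
<ItemGroup>
  <!-- Copy the FFMPEG/FFPROBE binaries into the build output folder -->
  <None Include="tools\ffmpeg.exe" CopyToOutputDirectory="PreserveNewest" />
  <None Include="tools\ffprobe.exe" CopyToOutputDirectory="PreserveNewest" />
</ItemGroup>
```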
OnnxModelSession Example
// CLIP Tokenizer Example
//----------------------//

// Model Configuration
var config = new OnnxModelConfig
{
    DeviceId = 0,
    InterOpNumThreads = 0,
    IntraOpNumThreads = 0,
    ExecutionMode = ExecutionMode.ORT_SEQUENTIAL,
    ExecutionProvider = ExecutionProvider.DirectML,
    OnnxModelPath = "cliptokenizer.onnx"
};

// Create Model Session
var modelSession = new OnnxModelSession(config);

// Get Metadata
var modelMetadata = await modelSession.GetMetadataAsync();

// Create Input Tensor
var text = "Text To Tokenize";
var inputTensor = new DenseTensor<string>(new string[] { text }, new int[] { 1 });

// Create Inference Parameters
using (var inferenceParameters = new OnnxInferenceParameters(modelMetadata))
{
    // Set Inputs and Outputs
    inferenceParameters.AddInputTensor(inputTensor);
    inferenceParameters.AddOutputBuffer();

    // Run Inference
    using (var results = modelSession.RunInference(inferenceParameters))
    {
        // Extract Result Tokens
        var resultData = results[0].ToArray<long>();
    }
}
Product | Compatible and additional computed target framework versions |
---|---|
.NET | net8.0 is compatible. net8.0-android, net8.0-browser, net8.0-ios, net8.0-maccatalyst, net8.0-macos, net8.0-tvos, net8.0-windows, net9.0, net9.0-android, net9.0-browser, net9.0-ios, net9.0-maccatalyst, net9.0-macos, net9.0-tvos, net9.0-windows, net10.0, net10.0-android, net10.0-browser, net10.0-ios, net10.0-maccatalyst, net10.0-macos, net10.0-tvos, and net10.0-windows were computed. |
net8.0
- Microsoft.Extensions.DependencyInjection.Abstractions (>= 8.0.1)
- Microsoft.Extensions.Logging.Abstractions (>= 8.0.1)
- Microsoft.ML.OnnxRuntime.Extensions (>= 0.10.0)
- Microsoft.ML.OnnxRuntime.Managed (>= 1.18.0)
- SixLabors.ImageSharp (>= 3.1.4)
- System.Linq.Async (>= 6.0.1)
- System.Numerics.Tensors (>= 8.0.0)
NuGet packages (3)
Showing the top 3 NuGet packages that depend on OnnxStack.Core:
Package | Downloads |
---|---|
OnnxStack.StableDiffusion (Stable Diffusion Library for .NET) | 5.5K |
OnnxStack.ImageUpscaler (OnnxRuntime Image Upscale Library for .NET) | 2.5K |
OnnxStack.FeatureExtractor (OnnxRuntime Image Feature Extractor Library for .NET) | 1.2K |
GitHub repositories
This package is not used by any popular GitHub repositories.