apr-inference-server
Aprender inference server — GPU model serving with health checks
Files
- YAML recipe: examples/deployment-stacks/recipes/apr-inference-server.yaml
- Rust wrapper: examples/deployment-stacks/apr_inference_server.rs
Run the wrapper
cargo run --example apr_inference_server
cargo test --example apr_inference_server
The wrapper loads the YAML, validates required fields (recipe.name, version, description, inputs), and exits without provisioning real infrastructure.
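The required-field check described above can be sketched as follows. This is a minimal, illustrative version using only the Rust standard library: the field names mirror the list above, but the naive line-based key scan is an assumption standing in for whatever YAML parsing the actual wrapper performs.

```rust
// Sketch of the validation step: report which required top-level keys are
// absent from the recipe YAML. A real implementation would use a YAML
// parser; this naive version just looks for "key:" at the start of a line.
fn missing_fields<'a>(yaml: &str, required: &[&'a str]) -> Vec<&'a str> {
    required
        .iter()
        .filter(|field| {
            !yaml
                .lines()
                .any(|line| line.trim_start().starts_with(&format!("{field}:")))
        })
        .copied()
        .collect()
}

fn main() {
    // Toy recipe text with all four required fields present.
    let yaml = "name: apr-inference-server\n\
                version: 1.0.0\n\
                description: GPU model serving with health checks\n\
                inputs:\n  model: {}\n";
    let missing = missing_fields(yaml, &["name", "version", "description", "inputs"]);
    if missing.is_empty() {
        println!("recipe valid");
    } else {
        println!("missing fields: {missing:?}");
    }
}
```

As in the wrapper, a failed check would cause the program to exit early; no infrastructure is touched either way.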
Real deployment via forjar
forjar apply examples/deployment-stacks/recipes/apr-inference-server.yaml \
--inputs <input_name>=<value>
See the YAML for the full input schema.
Contract
This recipe is graded against contracts/recipe-iiur-config-v1.yaml.
Provenance
Migrated from sovereign-ai-cookbook/recipes/apr-inference-server.yaml by PMAT-065 (centralize-cookbooks).