List all inference runtimes

GET 

/v1/inference-runtimes

List all inference runtimes available for deploying models. Currently the supported inference runtimes are Aleph Alpha's own Luminous runtime and vLLM.

Request

Responses

OK
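As a sketch of how a client might call this endpoint: the base URL, the Bearer-token auth scheme, and the response shape shown below are assumptions for illustration, not taken from this page; consult your deployment's documentation for the actual host, credentials, and schema.

```python
import json
import urllib.request


def list_inference_runtimes(base_url: str, token: str):
    """GET /v1/inference-runtimes and return the parsed JSON body.

    The Bearer-token header is an assumed auth scheme; substitute
    whatever credentials your deployment requires.
    """
    req = urllib.request.Request(
        f"{base_url}/v1/inference-runtimes",
        headers={"Authorization": f"Bearer {token}"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())


# Hypothetical response body -- the real schema may differ.
sample = '[{"name": "luminous"}, {"name": "vllm"}]'
runtime_names = [r["name"] for r in json.loads(sample)]
```

On a 200 OK response, the body would be parsed into native Python objects; the list comprehension above shows one plausible way to extract runtime names from such a payload.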