Shared endpoints allow you to invoke ML models using our shared infrastructure. This is a cost-effective solution for users who do not require dedicated resources.
Using a shared endpoint

Invoking a model on a shared endpoint is straightforward: you can use our Playground directly or make an API call.
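A typical API call is a JSON POST request authenticated with your API key. The sketch below shows the general shape in Python using only the standard library; the URL, model name, and request fields are placeholders, so substitute the values from your own account and the API reference.

```python
import json
import os
import urllib.request

# Placeholder values -- replace with the real endpoint URL and
# model identifier from your dashboard and the API reference.
API_URL = "https://api.example.com/v1/completions"
MODEL = "example/shared-model"

payload = {
    "model": MODEL,
    "prompt": "Hello, world!",
    "max_tokens": 64,
}

request = urllib.request.Request(
    API_URL,
    data=json.dumps(payload).encode("utf-8"),
    headers={
        # Read the key from the environment rather than hard-coding it.
        "Authorization": f"Bearer {os.environ.get('API_KEY', '')}",
        "Content-Type": "application/json",
    },
    method="POST",
)

# Uncomment to send the request and print the JSON response:
# with urllib.request.urlopen(request) as response:
#     print(json.load(response))
```

Keeping the API key in an environment variable avoids committing credentials to source control.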

Need dedicated infrastructure with stronger SLAs? Check out Private Endpoints.