Description
They provide a fast LLM inference service, so native support would be a great option for users. I know it can already be used through LiteLLM, but many users don't recognize LiteLLM or its capabilities, so native support would be a big help for those who don't.
Severity
Minor
Additional Information
No response