
Exception Mapping

LiteLLM maps exceptions across all providers to their OpenAI counterparts.

  • Rate Limit Errors
  • Invalid Request Errors
  • Authentication Errors
  • Timeout Errors
  • ServiceUnavailableError
  • APIError
  • APIConnectionError

In the base case, we return an APIConnectionError.

All our exceptions inherit from OpenAI's exception types, so any error handling you already have for those should work out of the box with LiteLLM.
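To illustrate why that works, here is a minimal sketch of the inheritance relationship. The classes below are local stand-ins (not the real `openai.error.OpenAIError` or LiteLLM's mapped class), used only to show that an `except` clause written for the parent type also catches the subclass:

```python
# Stand-in classes mirroring the hierarchy: LiteLLM's exceptions
# subclass OpenAI's, so a pre-existing OpenAI handler catches them.
class OpenAIError(Exception):        # stand-in for openai.error.OpenAIError
    pass

class RateLimitError(OpenAIError):   # stand-in for LiteLLM's mapped error
    pass

try:
    raise RateLimitError("429 from provider")
except OpenAIError as e:             # handler written for the OpenAI type
    caught = str(e)

print(caught)
```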

For all cases, the exception returned inherits from the original OpenAI Exception but contains 3 additional attributes:

  • status_code - the http status code of the exception
  • message - the error message
  • llm_provider - the provider raising the exception
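A hedged sketch of branching on those three attributes. `AuthenticationError` here is a local stand-in class, not LiteLLM's real exception; the attribute names match the list above:

```python
# Stand-in exception carrying the three extra attributes LiteLLM adds.
class AuthenticationError(Exception):
    def __init__(self, message, status_code, llm_provider):
        super().__init__(message)
        self.message = message            # the error message
        self.status_code = status_code    # the http status code
        self.llm_provider = llm_provider  # the provider raising the exception

err = AuthenticationError("Invalid API key", status_code=401, llm_provider="anthropic")
if err.status_code == 401:
    print(f"{err.llm_provider} rejected our credentials: {err.message}")
```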

usage

import os

from openai.error import OpenAIError
from litellm import completion

os.environ["ANTHROPIC_API_KEY"] = "bad-key"
try:
    # some code
    completion(model="claude-instant-1", messages=[{"role": "user", "content": "Hey, how's it going?"}])
except OpenAIError as e:
    print(e)

details

To see how it's implemented, check out the code.

Create an issue or make a PR if you want to improve the exception mapping.

Note: For OpenAI and Azure, we return the original exception (since it's already of the OpenAI error type), but we add the 'llm_provider' attribute to it. See code.

custom mapping list

Base case - we return the original exception.

| LLM Provider | ContextWindowExceededError | AuthenticationError | InvalidRequestError | RateLimitError | ServiceUnavailableError |
|---|---|---|---|---|---|
| Anthropic | ✅ | ✅ | ✅ | ✅ | |
| OpenAI | ✅ | ✅ | ✅ | ✅ | ✅ |
| Replicate | ✅ | ✅ | ✅ | ✅ | ✅ |
| Cohere | ✅ | ✅ | ✅ | ✅ | ✅ |
| Huggingface | ✅ | ✅ | ✅ | ✅ | |
| Openrouter | ✅ | ✅ | ✅ | ✅ | |
| AI21 | ✅ | ✅ | ✅ | ✅ | |
| VertexAI | | | ✅ | | |
| Bedrock | | | ✅ | | |
| Sagemaker | | | ✅ | | |
| TogetherAI | ✅ | ✅ | ✅ | ✅ | |
| AlephAlpha | ✅ | ✅ | ✅ | ✅ | ✅ |

For a deeper understanding of these exceptions, you can check out this implementation for additional insights.

The ContextWindowExceededError is a subclass of InvalidRequestError. It was introduced to provide more granularity for exception-handling scenarios. Please refer to this issue to learn more.
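That subclass relationship lets you handle context-window overflows separately from other invalid requests, as long as the more specific `except` clause comes first. A minimal sketch using local stand-in classes (not the real LiteLLM imports) that mirror the hierarchy:

```python
# Stand-ins mirroring LiteLLM's hierarchy: the context-window error
# subclasses the generic invalid-request error.
class InvalidRequestError(Exception):
    pass

class ContextWindowExceededError(InvalidRequestError):
    pass

def handle(exc):
    try:
        raise exc
    except ContextWindowExceededError:
        # specific handler runs first: the prompt can be shortened and retried
        return "truncate the prompt and retry"
    except InvalidRequestError:
        # everything else that is malformed about the request
        return "fix the request"

print(handle(ContextWindowExceededError()))
print(handle(InvalidRequestError()))
```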

Contributions to improve exception mapping are welcome.