Commit Graph

2 Commits

Author SHA1 Message Date
Devin AI
0976c42c6b fix: track token usage in litellm non-streaming and async calls
This fixes GitHub issue #4170, where token usage metrics were not being
updated when using litellm with non-streaming responses, streaming responses, and async calls.

Changes:
- Add token usage tracking to _handle_non_streaming_response
- Add token usage tracking to _ahandle_non_streaming_response
- Add token usage tracking to _ahandle_streaming_response
- Fix sync streaming to track usage in both code paths
- Convert usage objects to dicts before passing to _track_token_usage_internal
- Add comprehensive tests for token usage tracking in all scenarios

Co-Authored-By: João <joao@crewai.com>
2026-01-03 19:28:08 +00:00
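
A rough sketch of the pattern this commit describes (not the actual patch): the names _handle_non_streaming_response, _ahandle_non_streaming_response, and _track_token_usage_internal come from the commit message, while the signatures, the response shape, and the _usage_to_dict helper are assumptions for illustration.

from typing import Any


def _usage_to_dict(usage: Any) -> dict:
    # Normalize a litellm usage object (pydantic model or plain dict) to a dict
    # before handing it to the tracking helper.
    if usage is None:
        return {}
    if isinstance(usage, dict):
        return usage
    if hasattr(usage, "model_dump"):  # pydantic v2 models expose model_dump()
        return usage.model_dump()
    return dict(usage)


class LLM:  # hypothetical stand-in for the litellm-backed LLM wrapper
    def _track_token_usage_internal(self, usage: dict) -> None:
        # Assumed to accumulate prompt/completion/total token counts.
        ...

    def _handle_non_streaming_response(self, response: Any) -> str:
        # Sync non-streaming path: track usage before returning the content.
        self._track_token_usage_internal(_usage_to_dict(getattr(response, "usage", None)))
        return response.choices[0].message.content

    async def _ahandle_non_streaming_response(self, response: Any) -> str:
        # Async non-streaming path: apply the same tracking here as well.
        self._track_token_usage_internal(_usage_to_dict(getattr(response, "usage", None)))
        return response.choices[0].message.content

Converting the usage object to a plain dict up front keeps _track_token_usage_internal agnostic to whether litellm returns a model object or a raw dict, which appears to be the point of the "convert usage objects to dicts" change.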
Greyson LaLonde
20704742e2 feat: async llm support
feat: introduce async contract to BaseLLM

feat: add async call support for:
- Azure provider
- Anthropic provider
- OpenAI provider
- Gemini provider
- Bedrock provider
- LiteLLM provider

chore: expand scrubbed header fields (conftest, anthropic, bedrock)

chore: update docs to cover async functionality

chore: update and harden tests to support acall; re-add uri for cassette compatibility

chore: generate missing cassette

fix: ensure acall is non-abstract and set supports_tools = true for supported Anthropic models

chore: improve Bedrock async docstring and general test robustness
2025-12-01 18:56:56 -05:00
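
A minimal sketch of what the async contract on BaseLLM might look like, based only on the notes above: BaseLLM, acall, and supports_tools are named in the commit, while the method signatures, the asyncio.to_thread fallback, and the AnthropicLLM example are assumptions.

import asyncio
from abc import ABC, abstractmethod
from typing import Any


class BaseLLM(ABC):
    supports_tools: bool = False

    @abstractmethod
    def call(self, messages: list[dict[str, str]], **kwargs: Any) -> str:
        """Synchronous completion call; every provider must implement this."""

    async def acall(self, messages: list[dict[str, str]], **kwargs: Any) -> str:
        # Deliberately non-abstract: providers without a native async client
        # inherit a fallback that runs the sync call in a worker thread.
        return await asyncio.to_thread(self.call, messages, **kwargs)


class AnthropicLLM(BaseLLM):
    supports_tools = True  # per the commit, enabled for supported Anthropic models

    def call(self, messages: list[dict[str, str]], **kwargs: Any) -> str:
        # Placeholder: the real implementation would invoke the provider SDK.
        return "sync response"

    async def acall(self, messages: list[dict[str, str]], **kwargs: Any) -> str:
        # Native async SDK call overrides the thread-pool fallback.
        return "async response"

Keeping acall non-abstract (as the fix in this commit notes) would let existing BaseLLM subclasses that only implement call continue to work unchanged.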