Product Snapshot

LiteLLM

Users are reporting multiple bugs affecting various LLM provider integrations (Bedrock, Vertex AI, Azure, Modal, Predibase, Together AI), including streaming inconsistencies, crashes, and incorrect parameter handling. Users also want improved observability through built-in latency profiling, and want security vulnerabilities in dependencies addressed. These issues collectively affect the reliability and accuracy of the proxy when serving requests across different cloud providers and model types.

Issues Analyzed: 22
Ranked: 22
Demand Clusters: 1
Updated: 2026-04-06
Top Demand

LLM Provider Integration Fixes and Performance Observability

Score: 5.9

Rising Demand

LLM Provider Integration Fixes and Performance Observability

23.0x growth

Dominant Category

Integration

LLM Gateway

Priority Map

Current Top Demands

  1. LLM Provider Integration Fixes and Performance Observability

    Integration

    Users are reporting multiple bugs affecting various LLM provider integrations (Bedrock, Vertex AI, Azure, Modal, Predibase, Together AI), including streaming inconsistencies, crashes, and incorrect parameter handling. Users also want improved observability through built-in latency profiling, and want security vulnerabilities in dependencies addressed. These issues collectively affect the reliability and accuracy of the proxy when serving requests across different cloud providers and model types.

    22 issues · Score: 5.9