LiteLLM — User Demand Report
Week: 2026-W15 | Generated: 2026-04-06 | Issues analyzed: 22 (22 included) | Demand clusters: 1
Top 10 User Demands
| Rank | Demand | Issues | Score | Category | Example Issues |
|---|---|---|---|---|---|
| 1 | LLM Provider Integration Fixes and Performance Observability | 22 | 5.9 | Integration | #25191, #25157, #25116 |
Fastest-Rising Demands
| Demand | Rise Ratio | This Week | Category |
|---|---|---|---|
| LLM Provider Integration Fixes and Performance Observability | 23.0x | 22 | Integration |
Category Distribution
- Integration: 1 cluster
All Demand Clusters
1. LLM Provider Integration Fixes and Performance Observability
Users are reporting multiple bugs across LLM provider integrations (Bedrock, Vertex AI, Azure, Modal, Predibase, Together AI), including streaming inconsistencies, crashes, and incorrect parameter handling. They also want improved observability through built-in latency profiling, and ask that security vulnerabilities in dependencies be addressed. Together, these issues affect the reliability and accuracy of the proxy when serving requests across different cloud providers and model types.
- Count: 22 issues (22 open, 0 closed)
- Demand score: 5.9
- Avg reactions: 0.1 | Avg comments: 0.5
- Example issues: #25191, #25157, #25116, #25214, #25172
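To illustrate what the requested "built-in latency profiling" could mean in practice, here is a minimal sketch of per-provider latency collection. Everything here is hypothetical (the `LatencyProfiler` class and its methods are not LiteLLM APIs), and a real implementation would hook into the proxy's request lifecycle rather than a manual context manager:

```python
import time
from contextlib import contextmanager


class LatencyProfiler:
    """Hypothetical sketch: collect per-call latencies keyed by provider name."""

    def __init__(self):
        self.samples = {}  # provider name -> list of latencies in seconds

    @contextmanager
    def track(self, provider):
        # Time the wrapped call, recording the sample even if it raises.
        start = time.perf_counter()
        try:
            yield
        finally:
            elapsed = time.perf_counter() - start
            self.samples.setdefault(provider, []).append(elapsed)

    def summary(self, provider):
        # Return simple order statistics for one provider, or None if unseen.
        s = sorted(self.samples.get(provider, []))
        if not s:
            return None
        return {"count": len(s), "p50": s[len(s) // 2], "max": s[-1]}


profiler = LatencyProfiler()
with profiler.track("bedrock"):
    time.sleep(0.01)  # stand-in for an actual provider call
print(profiler.summary("bedrock"))
```

A production version would likely export these samples to an existing metrics backend (e.g. Prometheus histograms) instead of keeping them in memory.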
This report analyzes public GitHub issues only; it reflects demand signals from public discussion, not the voice of all users.
Generated by ReadYourUsers