Issues: ModelTC/lightllm
- #479 [bug] question about fp8 version of context_flashattention_nopad.py (opened Jul 30, 2024 by changyuanzhangchina)
- #475 [bug] from lightllm_ppl_int8kv_flashdecoding_kernel import group8_int8kv_flashdecoding_stage1 (opened Jul 24, 2024 by AlvL1225)
- #462 [bug] [Feature]: Support for InternVL-Chat-V1-5 (opened Jul 10, 2024 by JingofXin)
- #408 [bug] [BUG] Ask about Qwen models with weight quantization (opened May 15, 2024 by Cesilina)
- #380 [bug] [BUG] There already is a lightllm in pypi (opened Mar 26, 2024 by rlippmann)
- #333 [bug] Qwen-14B-INT8 faces the issue: 'QwenTransformerLayerWeight' object has no attribute 'q_weight_' (opened Feb 20, 2024 by wangr0031)
- #309 [bug] Inconsistent Output between LightLLM and Transformers Inference Library (opened Jan 19, 2024 by Lvjinhong)
ProTip! Exclude everything labeled bug with -label:bug.
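The same qualifier works anywhere GitHub accepts a search string, including its search API. As a minimal sketch (assuming unauthenticated access within GitHub's rate limits; the use of the requests library is my choice, not anything from this page), the -label:bug filter from the tip above could be applied programmatically:

    import requests

    # Search query using the qualifier from the ProTip: open issues in this
    # repository, excluding anything labeled "bug".
    query = "repo:ModelTC/lightllm is:issue is:open -label:bug"

    resp = requests.get(
        "https://api.github.com/search/issues",
        params={"q": query},
        headers={"Accept": "application/vnd.github+json"},
        timeout=10,
    )
    resp.raise_for_status()

    # Print issue number and title for each match, mirroring the list above.
    for item in resp.json()["items"]:
        print(f"#{item['number']}: {item['title']}")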