Issues: huggingface/lighteval
#496 · [FT] Enable lazy model initialization · feature request · opened Jan 11, 2025 by JoelNiklaus
#494 · [BUG] Dataset 'lighteval/MATH-Hard' doesn't exist on the Hub · bug · opened Jan 10, 2025 by ChangLiu-DrPatient
#489 · [FT] Custom model to TransformersModel · feature request · opened Jan 7, 2025 by Giuseppe5
#487 · [BUG] By default pip install lighteval is installing the cpu only torch version, its killing dependencies. · bug · opened Jan 6, 2025 by kzos
#482 · [FT] Add and test multinode runs back · feature request · opened Jan 2, 2025 by clefourrier
#478 · [FT] Enhancing CorpusLevelTranslationMetric with Asian Language Support · feature request · opened Dec 27, 2024 by ryan-minato
#474 · [FT] JudgeLLM should support litellm backend · feature request · opened Dec 22, 2024 by JoelNiklaus
#467 · [FT] Rerun evaluations with new metrics based on completions saved in details file · feature request · opened Dec 19, 2024 by JoelNiklaus
#462 · [BUG] Issue with LightevalTaskConfig.stop_sequence Attribute When Unset · bug · opened Dec 19, 2024 by ryan-minato
#460 · [BUG] Issue with CACHE_DIR Default Value in Accelerate Pipeline · bug · opened Dec 19, 2024 by ryan-minato
#458 · [FT] remove openai endpoint and only use litellm · feature request · opened Dec 18, 2024 by NathanHB
#439 · [FT] Align parameter names in config files and config classes · feature request · opened Dec 12, 2024 by albertvillanova
#436 · [FT] Fail faster when passing unsupported metrics to InferenceEndpointModel · feature request · opened Dec 11, 2024 by albertvillanova
#430 · [FT] Enable the evaluation of any function · feature request · opened Dec 10, 2024 by JoelNiklaus
#417 · [FT] Adding caching for each dataset run · feature request · opened Dec 2, 2024 by JoelNiklaus
#410 · [FT] Add System Prompt field in LightevalTaskConfig that can be used by model clients · feature request · opened Nov 28, 2024 by JoelNiklaus
#405 · [FT] The word "pretrained" is required in model_args but not in model_config_path · feature request · opened Nov 25, 2024 by albertvillanova
#402 · [FT] Support llama.cpp inference · feature request · opened Nov 22, 2024 by JoelNiklaus
#396 · [FT] Is it possible to save the predictions to prevent rerunning expensive inference · feature request · opened Nov 19, 2024 by JoelNiklaus
#395 · [BUG] Can't use lighteval to evaluate the nanotron · bug · opened Nov 19, 2024 by alexchen4ai
#379 · [FT] Evaluation using a multi-document RAG based on statistical tools and LLM as judge · feature request · opened Oct 30, 2024 by louisbrulenaudet