GPTQ-for-LLaMa — 4-bit quantization of LLaMA using GPTQ

Overall score: 60.1 (risk: Low, high confidence)

Pillar breakdown (weight · score):
- Adoption: 35% · 51.3
- Maintenance: 30% · 62.9
- Friction: 20% · 98.5
- Ecosystem: 15% · 38.5

Momentum: 0.06 (falling); 7-day change: -4.66

Ranked #13 of 16 in Model Optimization (high confidence)