I got a straight answer out of QwQ-LCoT-7B-Instruct.Q4_K_M five out of five times for the Tiananmen Square question.
insane, absolutely insane
Why insane? For quality, speed, or size? I find the Coder 1.5B and 3B models light and good.
It matches R1 on the reported benchmarks. R1 has 671B params (37B activated), while this one has only 32B.
GGUF quants are already out: https://huggingface.co/bartowski/Qwen_QwQ-32B-GGUF