Can 4-bit quantization of chatglm2-6b be supported? #1381
Replies: 2 comments 2 replies
-
You can use it directly; I have run it myself.
-
I am using 4-bit myself. Nothing special is required: I just swapped the model folder for the int4 one, and everything works normally.
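A minimal sketch of what that folder swap looks like in code, assuming the standard Hugging Face `transformers` loading pattern used in the ChatGLM2 demos; the `THUDM/chatglm2-6b-int4` path and the helper name `load_chatglm2_int4` are illustrative, and you would substitute your own local int4 folder if you have one:

```python
# Point at the pre-quantized int4 checkpoint instead of the fp16 one;
# no extra quantization flags are needed for this variant.
MODEL_PATH = "THUDM/chatglm2-6b-int4"

def load_chatglm2_int4(path: str = MODEL_PATH):
    """Load the int4 ChatGLM2 checkpoint (hypothetical helper).

    The import is done lazily so the sketch can be read (and this module
    imported) without transformers installed.
    """
    from transformers import AutoModel, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(path, trust_remote_code=True)
    # .half().cuda() matches the usual ChatGLM2 GPU setup; use .float()
    # on CPU-only machines instead.
    model = AutoModel.from_pretrained(path, trust_remote_code=True).half().cuda()
    return tokenizer, model.eval()
```

The only change relative to the fp16 setup is the checkpoint path; the rest of the inference code is unchanged, which matches the reply above.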
-
Is enabling 4-bit quantization supported, or can chatglm2-6b-int4 be used directly?