Mx5 GGUF 7GB V1
by ModelsLab

This is a quantized version of my Flux model to run on lower-end graphics cards.
Thanks to ChrisGoringe (https://civitai.com/user/chrisgoringe243) for quantizing this; the quality is really good for such a small model.
Larger GGUF versions, suited to mid-range graphics cards, are available here: https://huggingface.co/ChrisGoringe/MixedQuantFlux/tree/main