
fokan/medgemma-4b-it-int8

INT8 dynamically quantized version of google/medgemma-4b-it.

  • Quantization: Dynamic INT8 on Linear layers (PyTorch)
  • Ideal for CPU inference
  • ~4× smaller than the original model
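The card does not include the conversion script, but dynamic INT8 quantization of Linear layers as described above can be sketched with PyTorch's `torch.ao.quantization.quantize_dynamic`. This is a minimal illustration on a stand-in model; for the real conversion you would load `google/medgemma-4b-it` (e.g. via `transformers`) in place of the toy module.

```python
import torch
import torch.nn as nn

# Stand-in model; in practice this would be the full medgemma-4b-it
# network loaded from Hugging Face (assumption for illustration).
model = nn.Sequential(
    nn.Linear(64, 128),
    nn.ReLU(),
    nn.Linear(128, 10),
)

# Dynamic quantization: Linear weights are stored as INT8, while
# activations are quantized on the fly at inference time. This is
# the CPU-friendly scheme the card refers to.
quantized = torch.ao.quantization.quantize_dynamic(
    model,
    {nn.Linear},          # only quantize Linear layers
    dtype=torch.qint8,    # INT8 weights
)

# Inference runs as usual, on CPU.
x = torch.randn(1, 64)
with torch.no_grad():
    out = quantized(x)
print(out.shape)
```

Because only the weights are pre-quantized, no calibration dataset is needed, which is why dynamic quantization is a common first choice for shrinking transformer checkpoints for CPU deployment.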