Improve model card: Add pipeline tag, library name, paper, code, and usage

#1
by nielsr - opened

This PR significantly enhances the model card for mblm-chatbot-instruction-prompts-igtree by:

  • Adding the pipeline_tag: text-generation to correctly categorize the model for next-token prediction tasks.
  • Adding library_name: transformers. The TimblHuggingFaceModel architecture (which inherits from transformers.PreTrainedModel), as seen in config.json and the GitHub README's "Hugging Face style" usage, is compatible with the Hugging Face transformers library, which enables the automated "How to use" widget.
  • Including a direct link to the research paper: Memory-based Language Models: An Efficient, Explainable, and Eco-friendly Approach to Large Language Modeling.
  • Adding a link to the official GitHub repository: https://github.com/antalvdb/olifant.
  • Providing a sample usage code snippet, adapted from the GitHub repository, that demonstrates how to initialize and use the model for text generation with transformers (see the sketch after this list).
  • Adding a comprehensive model description and the relevant BibTeX citation.
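For reference, a minimal sketch of the kind of usage snippet described above, assuming the Hub id is antalvdb/mblm-chatbot-instruction-prompts-igtree and that the custom TimblHuggingFaceModel code is loaded with trust_remote_code=True (both assumptions, not confirmed by this PR):

```python
# Minimal sketch, not the exact snippet added to the model card.
# Assumptions: Hub id "antalvdb/mblm-chatbot-instruction-prompts-igtree"
# and a custom architecture loaded via trust_remote_code=True.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "antalvdb/mblm-chatbot-instruction-prompts-igtree"  # assumed Hub id

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, trust_remote_code=True)

prompt = "What is a memory-based language model?"
inputs = tokenizer(prompt, return_tensors="pt")

# Standard transformers generation call for next-token prediction.
outputs = model.generate(**inputs, max_new_tokens=50)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```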

Please review and merge this PR to improve the discoverability and usability of this model on the Hub.

antalvdb changed pull request status to merged
