Yehor/kulyk-en-uk

#1183
by Yehor - opened

It would be nice to have the same model as https://huggingface.co/mradermacher/kulyk-uk-en-GGUF

We tried this model in the past, and back then it failed with this error:

kulyk-en-uk     Traceback (most recent call last):
kulyk-en-uk       File "/llmjob/llama.cpp/convert_hf_to_gguf.py", line 7641, in <module>
kulyk-en-uk         main()
kulyk-en-uk       File "/llmjob/llama.cpp/convert_hf_to_gguf.py", line 7635, in main
kulyk-en-uk         model_instance.write()
kulyk-en-uk       File "/llmjob/llama.cpp/convert_hf_to_gguf.py", line 411, in write
kulyk-en-uk         self.prepare_metadata(vocab_only=False)
kulyk-en-uk       File "/llmjob/llama.cpp/convert_hf_to_gguf.py", line 499, in prepare_metadata
kulyk-en-uk         super().prepare_metadata(vocab_only=vocab_only)
kulyk-en-uk       File "/llmjob/llama.cpp/convert_hf_to_gguf.py", line 401, in prepare_metadata
kulyk-en-uk         self.set_gguf_parameters()
kulyk-en-uk       File "/llmjob/llama.cpp/convert_hf_to_gguf.py", line 7337, in set_gguf_parameters
kulyk-en-uk         for layer_type in self.hparams["layer_types"]
kulyk-en-uk                           ~~~~~~~~~~~~^^^^^^^^^^^^^^^
kulyk-en-uk     KeyError: 'layer_types'
kulyk-en-uk     job finished, status 1
kulyk-en-uk     job-done<0 kulyk-en-uk noquant 1>
kulyk-en-uk
kulyk-en-uk     NAME: kulyk-en-uk
kulyk-en-uk     TIME: Thu Jul 17 06:36:48 2025
kulyk-en-uk     WORKER: leia

This seems like a fixable error. I requeued both of them so we get imatrix quants, and if it fails again (which it almost certainly will) I will see if I can create a fixed version of it.
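For reference, the traceback shows `set_gguf_parameters` indexing `self.hparams["layer_types"]` directly, so any checkpoint whose config.json omits `layer_types` raises `KeyError`. A minimal sketch of one possible workaround, assuming a default of full attention for every layer is acceptable for this architecture (the fallback value and helper name here are my assumptions, not the actual llama.cpp fix):

```python
# Hypothetical fallback for a missing "layer_types" entry in a model's
# hparams (config.json). The failing code does hparams["layer_types"];
# this sketch substitutes a same-length default list instead of raising.

def layer_types_with_fallback(hparams: dict) -> list[str]:
    """Return hparams["layer_types"], or a default list sized to the layer count."""
    n_layers = hparams.get("num_hidden_layers", 0)
    # Assumption: treat every layer as plain full attention when unspecified.
    return hparams.get("layer_types", ["full_attention"] * n_layers)

# A config without "layer_types" no longer raises KeyError:
cfg = {"num_hidden_layers": 4}
print(layer_types_with_fallback(cfg))
```

Alternatively, the same effect could be had without patching the converter by adding an explicit `layer_types` list to the model's config.json before conversion.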

Strange that it worked for the other model; they should be identical.
