C-BERT MLM

Exploring Software Naturalness through Neural Language Models

Overview

This model is an unofficial HuggingFace port of "C-BERT", exposing only the masked language modeling head used for pretraining. The weights come from "An Empirical Comparison of Pre-Trained Models of Source Code". Please cite the authors if you use this model in an academic setting.
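Since the checkpoint ships with a masked language modeling head, it can be queried with the standard `transformers` fill-mask workflow. The sketch below is a hypothetical usage example: the repository id `"<this-repo-id>"` is a placeholder (this card does not state the hub path), and the C snippet is illustrative only.

```python
def build_masked_snippet(snippet: str, mask_token: str) -> str:
    """Swap the literal '<mask>' placeholder for the tokenizer's actual mask token,
    since different tokenizers use different mask strings (e.g. '[MASK]', '<mask>')."""
    return snippet.replace("<mask>", mask_token)

if __name__ == "__main__":
    # Assumed setup: replace "<this-repo-id>" with this model's actual hub path.
    from transformers import AutoModelForMaskedLM, AutoTokenizer, pipeline

    repo_id = "<this-repo-id>"  # placeholder, not a confirmed repository name
    tokenizer = AutoTokenizer.from_pretrained(repo_id)
    model = AutoModelForMaskedLM.from_pretrained(repo_id)

    fill = pipeline("fill-mask", model=model, tokenizer=tokenizer)
    code = build_masked_snippet("int x = <mask> ;", tokenizer.mask_token)
    for pred in fill(code):
        print(pred["token_str"], pred["score"])
```

The model and tokenizer download on first use; subsequent calls read from the local cache.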

Model details

- Format: Safetensors
- Model size: 45.1M params
- Tensor type: F32