Good Idea
These precog models are very cool and work much better than trying to use a normal thinking/reasoning model for character interactions. Well done.
Agreed, after the completely broken Magidonia finetune, this one is a really good surprise.
It's coherent. The thinking part is short, and from the little I've tested, it actually helps fight against Mistral's worst repetitive instincts. It's also the first CoT model I've encountered where the thinking block doesn't hinder the model on creative writing or RP tasks, while keeping the base functionality working well enough. You've found something worth digging into further.
Good job.
edit: oh shit it's the 123B model, not the 24B one, my bad. Still, my comment applies.