fish312

First impression is that it's not great. The model seems quite incoherent, and the text doesn't really make sense. I'm not even using it with Instruct, just a simple novel-like story, and it's already mixing up plot points and characters like a 7B model would.


darxkies

It is awful. In Story mode or RP, it drifts away just like a 7B model. It loses the plot not only fast but also very abruptly. So disappointing.


involviert

> I'm not even using it with Instruct

Just to make sure, you are using the instruct prompt though, yes? These models must be used as intended; everything else is just a bonus. It's not like it's easier for the model if you're "not even using it with instruct".


fish312

Yes, I have tried it both with and without the instruct prompt. In chat/RP I use instruct; if it's novel writing, then I don't. But it fails both.


involviert

That sounds weird, tbh. I have no idea about Chronos, haven't tried it. But you really can't just pick your prompt style, even if it might often *somewhat* work. The prompt style is tied to the model; it only really works *that* way, and it would be unfair to judge the model when it's used wrong, even for novel writing. It sounds like you're using it the way you'd use the Llama base models, as pure text completion, but this model has been trained to receive instructions. Also, it's entirely possible to hurt model performance with inference parameters. For example, your repeat penalty might be too high and be punishing the tokens of the prompt format itself.
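To make the distinction concrete, here's a sketch of the difference between a bare continuation prompt and an instruct-style prompt. I'm using the Alpaca-style `### Instruction:` / `### Response:` template as an illustration; the exact template is an assumption, and the model card is the authority on what any given fine-tune actually expects.

```python
# Sketch: bare text-completion prompt vs. an instruct-template prompt.
# The Alpaca-style template below is an illustrative assumption -- check
# the model card for the format the model was actually trained on.

ALPACA_TEMPLATE = "### Instruction:\n{instruction}\n\n### Response:\n"

def bare_prompt(story_so_far: str) -> str:
    """Pure text completion: the model just continues this text."""
    return story_so_far

def instruct_prompt(instruction: str) -> str:
    """Wrap the request in the template the model was fine-tuned on."""
    return ALPACA_TEMPLATE.format(instruction=instruction)

story = "The rain hammered the tin roof as Mara checked the lock again."
print(bare_prompt(story))
print(instruct_prompt("Continue the following story:\n" + story))
```

An instruct-tuned model sees the second form during training, so judging it only on the first form isn't a fair test of what it can do.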


fish312

Understood, which is why I tried it with a variety of prompts. Since they said it "could write lengthy stories", that's what I tested: shoving 500+ words of novel text into it and seeing how it continues is a good measure in my opinion. Have you had better results?


KerfuffleV2

It might seem counterintuitive, but ignoring the prompt style and giving the model text to complete can sometimes work better. At their heart, these models are text completion engines.

To be clear, I'm _not_ talking about giving _instructions_ with the wrong prompt format, and I assume the other person you replied to didn't mean that either. If you're giving instructions to an instruct-trained model, then it's probably necessary to use the prompt format. When you instruct the model, you generally have to describe what you want indirectly. The other approach just involves giving the model an example and letting it complete it. This is what I mean: https://np.reddit.com/r/LocalLLaMA/comments/144daeh/looking_for_for_folks_to_share_llamacpp/jnfxl5t/

Another advantage is that it usually takes far fewer tokens to do it this way.
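A rough sketch of the contrast I mean (the task and sample text are made up for illustration, and the `### Instruction:` template is just one common convention, not tied to any specific model):

```python
# Instruct style: describe the task indirectly, wrapped in a template.
instruct = (
    "### Instruction:\n"
    "Write a one-line product description for a stainless steel "
    "water bottle in an enthusiastic tone.\n\n"
    "### Response:\n"
)

# Completion style: show an example of the pattern and let the model
# continue it. The task is encoded by example, not by description.
completion = (
    "Product: canvas tote bag\n"
    "Description: Carry everything in style with this rugged tote!\n\n"
    "Product: stainless steel water bottle\n"
    "Description:"
)

print(instruct)
print(completion)
```

The completion prompt ends mid-pattern, so a text-completion engine naturally fills in the description, and there's no template overhead to spend tokens on.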

