• Cyberflunk@lemmy.world
    7 months ago

    Hallucinations, like depression, are a multifaceted issue. Training data is only one piece of it. Quantized or overfitted models can lean on memorization at the cost of even obviously correct training data, and poorly structured inference can confuse a model too.
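To make the quantization point concrete, here's a minimal sketch (toy numbers, not any particular model) of how compressing weights to int8 introduces rounding error on top of whatever the training data taught:

```python
import numpy as np

# Toy illustration: symmetric int8 quantization of a weight vector.
# Quantization shrinks memory use but adds rounding error, one of
# several factors (alongside overfitting and prompt structure) that
# can degrade a model's outputs independently of its training data.
rng = np.random.default_rng(0)
weights = rng.normal(0.0, 0.02, size=1000).astype(np.float32)

scale = np.abs(weights).max() / 127                      # map float range onto int8
q = np.clip(np.round(weights / scale), -127, 127).astype(np.int8)
dequant = q.astype(np.float32) * scale                   # reconstructed weights

err = np.abs(weights - dequant).mean()
print(f"mean absolute rounding error: {err:.2e}")
```

The error per weight is tiny, but it's pure noise the full-precision model never had, which is the point: the degradation comes from the deployment pipeline, not the data.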

    Rest assured, this isn’t just training data.