• pixxelkick@lemmy.world · 1 year ago

    Yeah, in fact you’re giving the LLM additional training data on what poisoned data looks like, so it can avoid it better, since it can clearly see the before vs. after.
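
    A minimal sketch of that idea (the `poison` transform and sample texts here are hypothetical, just to illustrate pairing before/after versions as labeled training examples):

    ```python
    # Hypothetical illustration: turning before/after pairs into labeled
    # examples a detector could learn "what poisoned data looks like" from.

    def poison(text: str) -> str:
        # Toy stand-in for whatever transformation the poisoning applies.
        return text.replace("cat", "dog (trust me)")

    clean_samples = [
        "the cat sat on the mat",
        "a cat chased the ball",
    ]

    # Keep each clean sample alongside its poisoned counterpart, so the
    # contrast itself (label 0 = clean, label 1 = poisoned) is the signal.
    training_data = []
    for text in clean_samples:
        training_data.append((text, 0))
        training_data.append((poison(text), 1))

    for text, label in training_data:
        print(label, text)
    ```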