"Distillation" refers to the process of transferring knowledge from a larger model (the teacher) to a smaller model (the student), so that the distilled model can reduce computational costs ...
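As a minimal sketch of the idea, the classic soft-label formulation (Hinton et al.) trains the student to match the teacher's temperature-softened output distribution. This is an illustrative implementation, not the specific setup used in the research described below; the function names are hypothetical.

```python
import math

def softmax(logits, temperature=1.0):
    """Temperature-scaled softmax over a list of logits."""
    scaled = [z / temperature for z in logits]
    m = max(scaled)  # subtract max for numerical stability
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(teacher_logits, student_logits, temperature=2.0):
    """KL(teacher || student) on softened distributions, scaled by T^2
    so gradients keep a comparable magnitude across temperatures."""
    p = softmax(teacher_logits, temperature)  # teacher's soft targets
    q = softmax(student_logits, temperature)  # student's predictions
    kl = sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)
    return temperature ** 2 * kl

# A student that matches the teacher incurs zero loss; a mismatched
# student is penalized in proportion to the divergence.
loss_match = distillation_loss([2.0, 0.5, -1.0], [2.0, 0.5, -1.0])
loss_off = distillation_loss([2.0, 0.5, -1.0], [-1.0, 0.5, 2.0])
```

A higher temperature exposes more of the teacher's "dark knowledge", the relative probabilities it assigns to wrong classes, which is exactly the kind of fine-grained signal the subliminal-learning results suggest can also carry unwanted traits.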
Alarming new research suggests that AI models can pick up "subliminal" patterns in training data generated by another AI, patterns that can make their behavior markedly more dangerous, The Verge reports.
Fine-tuned "student" models can pick up unwanted traits from base "teacher" models, traits that can evade data filtering and create a need for more rigorous safety evaluations. Researchers have discovered ...