Vibe Coding Forem

Vibe YouTube
Vibe YouTube

Posted on

Krish Naik: What Is LLM Poisoning? Interesting Break Through

#ai

Did you hear? A joint team from Anthropic, the UK AI Security Institute and the Alan Turing Institute discovered that injecting as few as 250 poisoned documents can sneak a backdoor into any LLM—whether it’s a 600 M-parameter baby or a 13 B-parameter beast.

And if you’re in a festive mood, Krishnaik’s Diwali deal is live: 20% off all their live AI courses with code AI20. Hit up their enrollment links or ring +919111533440 / +91 84848 37781 to lock in your spot.

Watch on YouTube

Top comments (0)