Areel Khan 
2025
  • Jailbreaking LLMs: Competing Objectives and Mismatched Generalization
    Takeaways and thoughts after reading "Jailbroken: How Does LLM Safety Training Fail?"
  • LLMs only predict the next word in a sentence
    A hand-wavy explanation of how LLMs are autoregressive