Distillation Can Make AI Models Smaller and Cheaper
WIRED, 20.09.2025 14:00

A fundamental technique lets researchers use a big, expensive model to train another model for less.
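The core mechanism behind distillation is that a large "teacher" model produces softened probability distributions over outputs, and a smaller "student" model is trained to match them, typically by minimizing the KL divergence between the two distributions at an elevated temperature. A minimal sketch of that loss in plain Python (function names, temperature value, and logits here are illustrative, not from the article):

```python
import math

def softmax(logits, temperature=1.0):
    # Temperature-scaled softmax: higher temperature softens the
    # distribution, exposing the teacher's "dark knowledge" about
    # how similar the non-top classes are.
    scaled = [z / temperature for z in logits]
    m = max(scaled)  # subtract max for numerical stability
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(teacher_logits, student_logits, temperature=2.0):
    # KL divergence KL(p_teacher || q_student) between the two
    # softened distributions; the student is trained to drive this
    # toward zero, inheriting the teacher's behavior cheaply.
    p = softmax(teacher_logits, temperature)
    q = softmax(student_logits, temperature)
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)
```

In practice this soft-target loss is usually mixed with the ordinary hard-label loss, and the gradient is scaled by the squared temperature, but the matching of softened teacher outputs shown above is the essence of the technique.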