AI & Robotics News

Nvidia launches fully open source transcription AI model Parakeet-TDT-0.6B-V2 on Hugging Face

Nvidia has become one of the most valuable companies in the world in recent years, as the stock market has taken note of the enormous demand for its graphics processing units (GPUs) — the powerful chips used to render video game graphics and, increasingly, to train large language and diffusion AI models. But Nvidia does far more than just make hardware, of course, and the…
Read more
AI & Robotics News

Not everything needs an LLM: A framework for evaluating when AI makes sense

Question: Which products should use machine learning (ML)? Project manager answer: Yes. Jokes aside, the advent of generative AI has upended our understanding of which use cases lend themselves best to ML. Historically, we have leveraged ML for repeatable, predictive patterns in customer experiences, but now it’s possible to leverage a form of ML even without an entire training…
Read more
AI & Robotics News

The ‘era of experience’ will unleash self-learning AI agents across the web—here’s how to prepare

David Silver and Richard Sutton, two renowned AI scientists, argue in a new paper that artificial intelligence is about to enter a new phase, the “Era of Experience,” in which AI systems rely less and less on human-provided data and instead improve themselves by interacting with the world and gathering data from it. While the paper is conceptual and forward-looking, it has direct implications…
Read more
AI & Robotics News

Meta unleashes Llama API running 18x faster than OpenAI: Cerebras partnership delivers 2,600 tokens per second

Meta today announced a partnership with Cerebras Systems to power its new Llama API, offering developers inference speeds up to 18 times faster than traditional GPU-based solutions. The announcement, made at Meta’s inaugural LlamaCon developer conference in Menlo Park, positions the company to compete directly with OpenAI, Anthropic, and Google in the rapidly growing AI inference…
Read more