OpenAI has spent the past year systematically reducing its dependence on Nvidia. The company signed a massive multi-year deal with AMD in October 2025, struck a $38 billion cloud computing agreement ...
OpenAI's new Spark model codes 15x faster than GPT-5.3-Codex - but there's a catch ...
Artificial intelligence is in an arms race of scale, with bigger models, more parameters, and more compute driving competing announcements that seem to arrive daily. AI foundation model ...
Alibaba Group (BABA) has taken the lead in open-source AI models. Its Qwen model series is now the world's most downloaded and most fine-tuned. This shift marks a new direction in how companies and ...
What if the next breakthrough in AI weren't locked behind corporate paywalls but instead placed directly in the hands of developers, free to innovate and create? Enter Kimi K2 0905, a new open-source ...
Big tech has spent the last few years creating ever-larger AI models, leveraging rack after rack of expensive GPUs to provide generative AI as a cloud service. But tiny AI matters, too. Google has ...
French AI startup Mistral has weathered a rocky year of public questioning to emerge, in December 2025, with new, crowd-pleasing models for enterprise and indie ...
SUNNYVALE, Calif. & SAN FRANCISCO--(BUSINESS WIRE)--Cerebras Systems today announced inference support for gpt-oss-120B, OpenAI’s first open-weight reasoning model, now running at record-breaking ...
China-based AI startup MiniMax has launched its open-source large language model, MiniMax M2, built specifically for agent workflows and coding tasks. The model is said to deliver twice the inference ...
DeepSeek, the Chinese artificial intelligence research company that has repeatedly challenged assumptions about AI development costs, has released a new model that fundamentally reimagines how large ...
If you’re a coder or someone who follows AI benchmarks for fun (hey, I won’t judge), this model will excite you tremendously. For everyone else, prepare to be underwhelmed—or rather, prepare to wait ...