Machine Learning

Merriam-Webster’s word of the year delivers a dismissive verdict on junk AI content

Like most tools, generative AI models can be misused. And when the misuse gets bad enough that a major dictionary notices, you know it has become a cultural phenomenon. On Sunday, Merriam-Webster announced that “slop” is its 2025 Word of the Year, reflecting how the term has become shorthand for the flood of low-quality AI-generated […]

3 Questions: Using computation to study the world’s best single-celled chemists

Of an estimated 1 trillion species on Earth today, 99.999 percent are considered microbial — bacteria, archaea, viruses, and single-celled eukaryotes. For much of our planet’s history, microbes ruled the Earth, living and thriving in the most extreme environments. Only in the last few decades have researchers begun to contend

Deep-learning model predicts how fruit flies form, cell by cell

During early development, tissues and organs begin to bloom through the shifting, splitting, and growing of many thousands of cells. A team of MIT engineers has now developed a way to predict, minute by minute, how individual cells will fold, divide, and rearrange during a fruit fly’s earliest stage of growth. The new method may one

OpenAI Releases ‘circuit-sparsity’: A Set of Open Tools for Connecting Weight-Sparse Models and Dense Baselines through Activation Bridges

The OpenAI team has released its openai/circuit-sparsity models on Hugging Face and the openai/circuit_sparsity toolkit on GitHub. The release packages the models and circuits from the paper ‘Weight-sparse transformers have interpretable circuits’ (https://arxiv.org/pdf/2511.13653). What is a weight-sparse transformer? The models are GPT-2-style decoder-only transformers trained on Python code. Sparsity is not added after

OpenAI built an AI coding agent and uses it to improve the agent itself

As AI coding tools gain popularity among software developers, their adoption has begun to touch every part of the development process, including developers using the tools to improve existing AI coding tools. We’re not talking about runaway self-improvement here; just people using tools to improve the tools themselves. In interviews with Ars

Enabling small language models to solve complex reasoning tasks

As language models (LMs) improve at tasks like image generation, answering trivia questions, and simple math, you might think that human-like reasoning is around the corner. In reality, they still trail us by a wide margin on complex tasks. Try playing Sudoku with one, for instance, where you fill in numbers one through nine in such

New method improves the reliability of statistical estimations

Let’s say an environmental scientist is studying whether exposure to air pollution is associated with lower birth weights in a particular county. They might train a machine-learning model to estimate the magnitude of this association, since machine-learning methods are especially good at learning complex relationships. Standard machine-learning methods excel at making predictions and sometimes provide uncertainties, like

OpenAI releases GPT-5.2 after “code red” Google threat alert

On Thursday, OpenAI released GPT-5.2, its newest family of AI models for ChatGPT, in three versions called Instant, Thinking, and Pro. The release follows CEO Sam Altman’s internal “code red” memo earlier this month, which directed company resources toward improving ChatGPT in response to competitive pressure from Google’s Gemini 3 AI model. “We designed 5.2

What Is Sociophonetics and Why It Matters for AI

You’ve probably had this experience: a voice assistant understands your friend perfectly, but struggles with your accent, or with your parents’ way of speaking. Same language. Same request. Very different results. That gap is exactly where sociophonetics lives — and why it suddenly matters so much for AI. Sociophonetics looks at how social factors and

Interview: From CUDA to Tile-Based Programming: NVIDIA’s Stephen Jones on Building the Future of AI

As AI models grow in complexity and hardware evolves to meet the demand, the software layer connecting the two must also adapt. We recently sat down with Stephen Jones, a Distinguished Engineer at NVIDIA and one of the original architects of CUDA. Jones, whose background spans from fluid mechanics to aerospace engineering, offered deep insights
