AI digest: multimodal tools and self-evolving models
MiniMax opens up its platform with CLI tools and self-evolving models, while Meta explores neural computers that merge computation and memory.
Four stories this week show AI moving beyond chatbots: into proper development tools, new infrastructure, and even theology.
MiniMax releases MMX-CLI for developers and AI agents
MiniMax dropped MMX-CLI, a Node.js command-line interface that gives both human developers and AI agents direct access to its full stack of generative capabilities: image, video, speech, music, vision, and search, all through one terminal interface. Smart move to make the platform accessible to coding agents like Cursor and Claude, not just web users.
MiniMax M2.7 goes open source with self-evolution
The same company open-sourced MiniMax M2.7 on Hugging Face, scoring 56.22% on SWE-Pro and 57% on Terminal Bench 2. The interesting bit is that it's designed to participate in its own development cycle, making it essentially self-evolving. This could be the start of models that actually improve themselves rather than waiting for the next training run.
Meta and KAUST propose neural computers
Researchers from Meta AI and KAUST are working on neural computers that fold computation, memory, and I/O into one learned model, rather than running neural networks on top of traditional computers. It's early days, but this could reshape how we think about AI infrastructure if they can make it work at scale.
Anthropic courts Christian leaders on Claude’s morality
Anthropic invited Christian leaders to advise on Claude’s moral behaviour, including discussions about whether AI can be a “child of God”. Feels like the AI alignment conversation is expanding beyond technical safety into proper theological territory.