This week in AI updates: Local Azure DevOps MCP server, Veo 3.1 in Gemini API, and more (October 17, 2025)
Microsoft announces general availability of Azure DevOps local MCP Server

The MCP server acts as an intermediary between AI assistants and an Azure DevOps organization. It can inject context from Azure DevOps, such as work items, pull requests, and test plans, into prompts, enabling the LLM to provide more relevant answers tailored to the user's Azure DevOps project. According to Microsoft, it supports most of the main areas of Azure DevOps, including work items, wikis, repos, search, and test plans. Because it is a local rather than a remote MCP server, it runs inside the network or local development environment, ensuring that private data doesn't leave the system.

Google introduces Coral NPU, Veo 3.1 in Gemini API, and interactive commands in Gemini CLI

Google this week announced that the latest version of its video generation model, Veo, is now available in the Gemini API. Veo 3.1 can generate richer native audio, has an improved understanding of cinematic ...
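Returning to the Azure DevOps item above: MCP clients such as VS Code typically discover a local stdio server through a small configuration file. The following is a minimal sketch only; the `.vscode/mcp.json` layout, the `@azure-devops/mcp` package name, and the placeholder organization name are assumptions based on common MCP client conventions, not details confirmed in the announcement.

```json
{
  "servers": {
    "azure-devops": {
      "type": "stdio",
      "command": "npx",
      "args": ["-y", "@azure-devops/mcp", "YOUR_ORG_NAME"]
    }
  }
}
```

Because the server is launched as a local process (here via `npx`), work item and pull request data is fetched on the developer's machine and passed to the assistant as prompt context, rather than routed through a remote intermediary.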