This weekly digest covers key developments in artificial intelligence, including a new venture by the former GitHub CEO, warnings from a major chipmaker about AI data center expansion, and Google’s release of an MCP server for its developer documentation.
AI Coding Tools: Former GitHub CEO Focuses on Agent Code Management
Former GitHub CEO Thomas Dohmke launched a new company, Entire, on February 10. The startup secured $60 million in seed funding at a $300 million valuation. Felicis led the round, calling it the largest seed investment ever in a developer tools startup. Other notable investors included Y Combinator CEO Garry Tan, Datadog CEO Olivier Pomel, and Microsoft’s M12 fund.
Entire aims to address a growing challenge for developers: managing numerous AI coding agents that generate code faster than humans can review it. Its initial open-source offering, Checkpoints, records the prompts and context behind each AI-generated code modification, and supports Claude Code and Gemini CLI at launch. In the announcement, Dohmke said that existing software production systems were not designed for the AI era.
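To make the idea concrete, here is a minimal sketch of what a checkpoint-style record could capture for each AI-generated change. The field names, JSON Lines storage, and helper function are illustrative assumptions, not Entire’s actual Checkpoints format.

```python
import hashlib
import json
from dataclasses import dataclass, asdict, field
from datetime import datetime, timezone

@dataclass
class Checkpoint:
    """Illustrative record of one AI-generated code change (not Entire's real schema)."""
    agent: str                 # e.g. "claude-code" or "gemini-cli"
    prompt: str                # the instruction that produced the change
    context: str               # summary of the files/conversation the agent saw
    files_changed: list[str]   # paths touched by the modification
    diff: str                  # unified diff of the modification
    timestamp: str = field(
        default_factory=lambda: datetime.now(timezone.utc).isoformat()
    )

    def digest(self) -> str:
        # Short, stable fingerprint so reviewers can match a record to a commit.
        return hashlib.sha256(self.diff.encode("utf-8")).hexdigest()[:12]

def append_checkpoint(cp: Checkpoint, path: str = "checkpoints.jsonl") -> None:
    """Append the record to a JSON Lines log kept alongside the repository."""
    record = {**asdict(cp), "digest": cp.digest()}
    with open(path, "a", encoding="utf-8") as fh:
        fh.write(json.dumps(record) + "\n")
```

However it is stored, the point of such a record is the same one Entire is making: once agents produce most of the changes, the prompt and context become part of the audit trail, not just the resulting diff.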
Apple also updated Xcode to version 26.3 this week, adding support for Anthropic’s Claude Agent and OpenAI’s Codex directly within the IDE. Developers can now use these advanced reasoning models inside Apple’s development environment to generate code from natural language and catch errors in real time.
These introductions highlight a broader trend: AI coding tools are evolving beyond simple autocomplete into full agent workflows. The question has shifted from whether AI can write code to how the vast amount of AI-generated code is managed, and Entire and Apple each presented a different answer this week.
AI Processing: SMIC Raises Concerns About Data Center Expansion
China’s leading chipmaker, SMIC, issued a warning on February 11. Co-CEO Zhao Haijun told analysts that companies are building the equivalent of 10 years’ worth of data center capacity in just two years, and expressed concern that the exact purpose of these data centers has not been fully thought through. He also cautioned that the memory chip market is in “crisis mode,” with companies overbooking orders amid a global supply shortage.
Those concerns line up with projections from Moody’s Ratings, which expects AI infrastructure spending to exceed $3 trillion over the next five years. In 2026 alone, major tech companies including Alphabet, Amazon, Meta, and Microsoft collectively plan to invest approximately $650 billion in capital expenditures.
SMIC reported revenue of $9.3 billion for 2025, a 16.2% increase year-over-year, but wafer shipments to smartphone and consumer electronics manufacturers are being squeezed by the surge in AI chip orders. Net profit rose 39% to $685.1 million, yet both full-year and Q4 profits fell short of analyst expectations, and SMIC’s Hong Kong shares declined 2.2% following the announcement.
The scarcity of HBM (high-bandwidth memory) further exacerbates the situation. Zhao indicated that tight HBM supply will persist for several years due to the time required to build and qualify new capacity. Data centers are estimated to consume 70% of all memory chips produced in 2026, leading to shortages in other sectors.
A Deloitte outlook published this week projects global semiconductor sales of $975 billion in 2026. AI chips account for roughly half of that revenue but less than 0.2% of all chip units sold. The report flags a significant risk: the industry’s growth is heavily dependent on AI, while non-AI markets such as automotive and consumer electronics remain stagnant.
Standards and Protocols: Google Introduces Developer Knowledge MCP Server
Google launched the Developer Knowledge API and MCP server in public preview on February 4. This tool provides AI coding assistants with direct access to Google’s official developer documentation for services like Firebase, Android, and Google Cloud. Google ensures that all documentation is re-indexed within 24 hours of any service update.
The MCP server works with popular assistants and IDEs; developers connect it with a simple configuration file and an API key from Google Cloud. The underlying API exposes two main functions: SearchDocumentChunks, which locates relevant documentation snippets, and BatchGetDocuments, which retrieves full page content. Google plans to add structured content, such as code samples, before the general availability release.
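As an illustration of how a client might call those two functions over REST, the sketch below uses Python’s requests library. The endpoint URL, request payloads, and response fields are assumptions made for illustration and should be checked against Google’s official documentation.

```python
import requests

API_KEY = "YOUR_GOOGLE_CLOUD_API_KEY"
# Assumed base URL for illustration only -- consult Google's docs for the real endpoint.
BASE_URL = "https://developerknowledge.googleapis.com/v1"

def search_document_chunks(query: str, page_size: int = 5) -> list[dict]:
    """Locate documentation snippets relevant to a query (assumed request/response shape)."""
    resp = requests.post(
        f"{BASE_URL}/documents:searchChunks",
        params={"key": API_KEY},
        json={"query": query, "pageSize": page_size},
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json().get("chunks", [])

def batch_get_documents(names: list[str]) -> list[dict]:
    """Retrieve full page content for the documents behind those snippets (assumed shape)."""
    resp = requests.post(
        f"{BASE_URL}/documents:batchGet",
        params={"key": API_KEY},
        json={"names": names},
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json().get("documents", [])

if __name__ == "__main__":
    # Typical two-step flow: search for snippets, then fetch the full pages behind them.
    chunks = search_document_chunks("Firebase Authentication custom claims")
    pages = batch_get_documents([c["document"] for c in chunks if "document" in c])
    for page in pages:
        print(page.get("title", "<untitled>"))
```

In practice, most developers will not call the API directly; their coding assistant will invoke the equivalent MCP tools on their behalf once the server is added to the assistant’s configuration.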
The launch follows a stretch of significant activity in the MCP ecosystem. Google has also proposed adding a gRPC transport to MCP, which would let enterprises already running gRPC across their services adopt the protocol without rewriting those services or deploying translation proxies.
Regarding security, the Coalition for Secure AI (CoSAI) released its MCP Security white paper in late January. This paper identifies nearly 40 threats across 12 categories, treating MCP servers as critical infrastructure rather than just another API. It advocates for zero-trust principles, sandboxing, and end-to-end traceability for every agent request.
Cisco contributed Project CodeGuard to CoSAI on February 9. This open-source framework embeds security rules directly into AI coding workflows, guiding agents to produce more secure code from the outset. Meta also joined CoSAI as a Premier Sponsor on February 3, adding its support to the coalition’s growing list of members, which includes Google, IBM, Microsoft, and NVIDIA.


