Supabase, Databricks, Neon, Graphite, CodeRabbit, Hammerspace, Statsig, Eppo, Datadog, Fastino, WisdomAI
It’s been a breakout quarter across the infrastructure and developer tooling landscape, with funding momentum continuing at the intersection of AI, data platforms, and enterprise software. This issue covers a range of announcements—from Supabase’s $200 million raise to Graphite’s push to rewire code review with AI. The headlines reflect a broader trend: developer-facing companies are no longer just productivity enhancers—they’re becoming foundational layers for AI-native workflows.
Two threads run through this month’s updates. First, infrastructure built around Postgres is having a moment: Supabase and Neon are both capitalizing on Postgres’ open-source maturity, but taking divergent approaches – Supabase as an integrated platform for full-stack developers, and Neon as a serverless core built for AI agents. Second, the rise of “agentic” architectures—tools that support AI agents acting autonomously—is prompting renewed investment in data movement (Hammerspace), experimentation (Statsig, Eppo), and domain-aware assistants (WisdomAI, Graphite, CodeRabbit). Whether building, reviewing, experimenting, or scaling, the emerging theme is clear: AI-ready systems need new foundations, and investors are moving fast to back them.
Let’s dive in.
Supabase Secures $200 Million Series D as it Seeks to Solidify Position as Open Source Firebase Alternative
Supabase has raised $200 million in Series D funding, bringing its post-money valuation to $2 billion. The round was led by Accel and included participation from Coatue, Y Combinator, Craft Ventures, and Felicis. Notable angels such as OpenAI Chief Product Officer Kevin Weil, Vercel CEO Guillermo Rauch, and Laravel founder Taylor Otwell also participated.
Founded in 2020 by Paul Copplestone and Ant Wilson, Supabase offers an open-source backend-as-a-service platform built on PostgreSQL. The platform packages essential backend services—including authentication, file storage, real-time subscriptions, edge functions, and instant APIs—making it a popular choice for full-stack developers. The startup originally gained traction by positioning itself as an open-source alternative to Google’s Firebase, focusing on transparency, extensibility, and self-hosting options.
The latest round comes just seven months after Supabase’s $80 million Series C, which was led by Craft Ventures and Peak XV Partners. The rapid follow-on raise reflects both strong investor appetite and growing user adoption. Supabase reports over 2 million developers on its platform managing more than 3.5 million databases. The company says its sign-up rate has doubled in the past three months, driven by what CEO Paul Copplestone refers to as “vibe coding”: the fast-growing segment of developers building with AI-native coding tools such as Bolt, Lovable, and Cursor.
Accel’s investment came after an unusually personal courtship: Partner Gonzalo Mocorrea traveled to Wānaka, New Zealand—Copplestone’s hometown—unannounced. After several informal conversations, Accel Partner Arun Mathew flew in as well to finalize a term sheet in person. “We know what greatness looks like,” said Mathew in an interview with Fortune. “The database layer is where value gets created in every major platform shift.”
Supabase’s commitment to Postgres, rather than building a proprietary engine, has been a deliberate part of its strategy. Postgres remains one of the most trusted and flexible open-source databases in the industry. The company has also invested in AI-compatible extensions such as pgvector for embedding storage and retrieval—key infrastructure for building modern AI apps.
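To make the pgvector piece concrete, here is a minimal sketch of what embedding storage and nearest-neighbor retrieval look like on any Postgres database with the extension enabled (such as a Supabase project). The connection string, table schema, and 1536-dimension embeddings are illustrative assumptions, not Supabase-specific APIs.

```python
# Minimal sketch: storing and querying embeddings with pgvector on Postgres.
# Connection string, table, and dimensions are hypothetical.
import psycopg2

def to_vec(values):
    # Format a Python list as pgvector's text input, e.g. "[0.1,0.2,0.3]".
    return "[" + ",".join(str(v) for v in values) + "]"

conn = psycopg2.connect("postgresql://user:password@db.example.com:5432/postgres")
cur = conn.cursor()

# Enable pgvector and create a table with an embedding column.
cur.execute("CREATE EXTENSION IF NOT EXISTS vector;")
cur.execute(
    "CREATE TABLE IF NOT EXISTS documents ("
    " id bigserial PRIMARY KEY,"
    " content text,"
    " embedding vector(1536));"
)

# Store a document alongside its embedding (normally produced by an embedding model).
embedding = [0.1] * 1536  # placeholder vector
cur.execute(
    "INSERT INTO documents (content, embedding) VALUES (%s, %s::vector);",
    ("hello world", to_vec(embedding)),
)

# Retrieve the five nearest neighbors by cosine distance (pgvector's <=> operator).
query = [0.1] * 1536
cur.execute(
    "SELECT content FROM documents ORDER BY embedding <=> %s::vector LIMIT 5;",
    (to_vec(query),),
)
print(cur.fetchall())

conn.commit()
cur.close()
conn.close()
```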
The funding will support continued product development, growth of its global team, and expansion of platform capabilities to support AI-native workflows. Supabase’s vision is to become the backend of choice for developers building across a range of platforms—from early-stage prototypes to scaled AI products.
Despite its growth, Supabase faces increasing competition. The backend-as-a-service space has seen a wave of entrants focused on developer-first tools, edge deployment, and AI-native database architectures. Platforms like Firebase, Appwrite, Nhost, and Convex each offer variations on the backend stack with different philosophies around vendor lock-in, performance, and developer experience.
Supabase, however, is betting on open source, Postgres, and a community-centric approach to win long-term. “We’re default alive,” said Copplestone in a recent Reddit AMA. “The point of taking VC money is to support the growth. The economics are well-established – we get developers using the free tier, some convert to paid, and the cycle funds itself.”
Databricks Acquires Neon in $1B Deal to Power the Backend of AI Agents
On a related note, Databricks has announced the acquisition of Neon, a developer-first, serverless Postgres company, in a transaction valued at approximately $1 billion. The deal marks Databricks’ third billion-dollar acquisition in recent years and signals the company’s intent to integrate operational (OLTP) databases into its broader Data Intelligence Platform for AI and analytics.
Founded in 2020 by Nikita Shamgunov (co-founder of SingleStore), Heikki Linnakangas (a long-time Postgres committer), and Stas Kelvich (a systems engineer turned database hacker), Neon set out to re-architect Postgres for the cloud era. Its architecture separates storage from compute and introduces branching and forking as native database operations—capabilities that resonate deeply with both developers and emerging AI agent use cases.
Neon has raised a total of $131 million; its most recent round, a $26 million investment led by M12, Microsoft’s venture fund, closed in August 2024. Other backers include Abstract Ventures, General Catalyst, Menlo Ventures, and Notable Capital.
Databricks CEO Ali Ghodsi framed the acquisition as a foundational move to support the growing class of AI agents—autonomous software bots that generate, execute, and iterate code. “What Spark did for big data, we’re now doing for the AI stack,” said Ghodsi. “Four out of every five databases on Neon are spun up by code, not humans. The era of AI-native, agent-driven applications is here, and we need a database platform that operates at machine speed, not human speed.”
Neon’s developer-focused offering includes sub-second instance provisioning, usage-based pricing, and seamless branching for sandboxing, testing, and rollback. These features are already proving critical for AI agents, which demand rapid database creation, isolation, and experimentation at scale. Notably, over 80% of databases created on Neon are now provisioned by AI agents, up from 30% just a year ago.
The acquisition comes at a time when developer infrastructure is seeing a wave of consolidation and category definition. Supabase, a fellow open-source Postgres-based backend platform, recently raised a $200 million Series D round at a $2 billion valuation as noted above. Supabase and Neon have both built substantial momentum on the back of Postgres’ enduring popularity and the open-source ecosystem that surrounds it. While Supabase targets full-stack developers with an integrated backend-as-a-service offering, Neon has focused squarely on modernizing the Postgres engine itself, making it more elastic, programmatic, and cloud-native.
Neon will continue to operate as a standalone platform post-acquisition, with its team of ~140 employees joining Databricks. Databricks plans to integrate Neon more deeply into its broader stack, allowing customers to use the same underlying data infrastructure for analytics, machine learning, and now transactional workloads in AI agent systems. The acquisition also positions Databricks to compete more directly with hyperscalers and infrastructure providers offering AI development platforms.
From a technology perspective, the acquisition is a bet on the evolving nature of workloads: in AI-native applications, agents write code, spin up services, and generate state—often requiring isolated, ephemeral, and rapidly provisioned databases. Traditional database products—monolithic, slow to provision, and built for static environments—struggle in this new paradigm. Neon’s branching and time-travel features mirror how developers already work with code, enabling faster iteration, safer experimentation, and lower overhead.
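To illustrate what “machine speed” looks like in practice, the sketch below creates an ephemeral Neon branch over HTTP, the kind of step an agent might take before running a migration or test suite. The endpoint path and payload shape are assumptions based on Neon’s public v2 API, and the project ID and branch name are hypothetical; check the current documentation before relying on them.

```python
# Hedged sketch: creating a copy-on-write Neon branch via the HTTP API.
# Endpoint and payload shape are assumptions; project ID and branch name are made up.
import os
import requests

NEON_API_KEY = os.environ["NEON_API_KEY"]  # personal or org API key
PROJECT_ID = "my-project-id"               # hypothetical project identifier

resp = requests.post(
    f"https://console.neon.tech/api/v2/projects/{PROJECT_ID}/branches",
    headers={"Authorization": f"Bearer {NEON_API_KEY}"},
    json={
        "branch": {"name": "agent-sandbox-001"},  # isolated branch of the parent data
        "endpoints": [{"type": "read_write"}],    # attach a compute endpoint to it
    },
    timeout=30,
)
resp.raise_for_status()
branch = resp.json()
print("created branch:", branch["branch"]["id"])
```

Because a branch shares storage with its parent and only diverges on write, tearing it down once the agent finishes is cheap, which is what makes this pattern viable at the scale Databricks is describing.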
The move reflects a broader shift in the $100B+ database market. Legacy systems like Oracle, SQL Server, and even early cloud-native options like Aurora are increasingly mismatched with the needs of AI developers and agent frameworks. Neon, like Supabase, is part of a new wave of developer infrastructure companies building for the “text-to-software” generation—where AI composes, deploys, and evaluates code autonomously.
According to Databricks, more details on the integration roadmap will be shared at the upcoming Data + AI Summit in San Francisco this June. For now, existing Neon customers can expect continued investment in the platform’s developer experience and expanded reach through Databricks’ enterprise channels.
In joining Databricks, Neon gains the resources, distribution, and operational scale to accelerate its mission—while Databricks secures a critical building block for the emerging AI-native application stack.
Graphite Raises $52M Series B to Bring AI-Powered Code Reviews to the Developer Mainstream
Graphite, a startup building AI-augmented tooling for code review, has raised $52 million in Series B funding to expand its product and engineering teams and accelerate adoption of its developer workflow platform. The round was led by Accel and included participation from existing backers Andreessen Horowitz, The General Partnership, and XYZ Venture Capital. Shopify and Figma also joined as strategic investors, alongside Menlo Ventures’ Anthology Fund, affiliated with AI research lab Anthropic.
Founded in 2020 by former Facebook and Airbnb engineers, Graphite’s mission is to modernize the core workflows around pull requests, code review, and collaboration—especially in engineering teams scaling rapidly. Its flagship feature is an AI agent called Diamond, which reviews pull requests, highlights regressions, and proposes fixes. Unlike many code review assistants that rely solely on LLMs or rule-based linting, Diamond has full context of the repo and can be configured with organization-specific policies.
The company reports adoption by teams at Vercel, Replit, and Ramp, and claims Diamond reviews “hundreds of pull requests a day” in production environments. The platform integrates directly with GitHub and supports both individual contributors and entire teams, offering automation around pull request generation, reviews, and merges. While the AI handles routine suggestions, Graphite emphasizes human-in-the-loop design and maintains a strong commitment to developer trust and control.
In a blog post announcing the funding, Graphite CEO Merrill Lutsky framed the company’s direction as a productivity multiplier—not a replacement—for engineering teams. The company also launched a free plan for developers handling fewer than 100 pull requests per month, signaling a broader push for self-serve growth.
Graphite is not alone in targeting AI-driven improvements to the code review process. CodeRabbit, which raised a $16 million Series A in 2024 led by CRV, offers a competing platform focused on understanding developer intent and providing in-line suggestions directly within GitHub workflows. Others, such as Sweep and Snyk’s DeepCode AI, are experimenting with lightweight assistants and copilots for PR generation and triage. While approaches vary, the emergence of these tools reflects a growing consensus: AI-enhanced code review is becoming a foundational layer in modern software development.
With its new funding, Graphite plans to continue refining Diamond, invest in its core developer experience platform, and expand support for monorepos and custom pipelines—key requirements for AI-native and enterprise-scale teams.
Hammerspace Raises $100M to Power the Data Layer of AI Infrastructure
Hammerspace, a San Mateo-based data orchestration company, has secured $100 million in strategic growth capital to expand its position as a core enabler of AI workloads. The round was led by Altimeter Capital and ARK Investment Management LLC, with participation from a select group of strategic backers described as “highly participatory”—including investors who were early to Meta, Palantir, SpaceX, Tesla, and NVIDIA. The round values Hammerspace at over $500 million.
Founded by David Flynn, a technologist known for his early work in flash storage and Linux-based systems, Hammerspace has developed a Linux-native, high-performance platform that virtualizes, manages, and orchestrates unstructured data across hybrid, multi-cloud, and edge environments. The company’s architecture eliminates traditional bottlenecks by separating data access from data location—enabling near-instant access to data no matter where it’s stored.
The platform is built on the ubiquitous Linux kernel NFS client, originally developed and still maintained by Hammerspace CTO Trond Myklebust. At the core is a standards-based, parallel file and object system designed to ingest, move, and deliver data with minimal latency. Customers such as Meta, the U.S. Department of Defense, the National Science Foundation, and major life sciences organizations rely on Hammerspace to meet the high-throughput demands of training and inference in large-scale AI environments.
“AI has been the perfect storm for needing what I have built,” said Flynn. “Time-to-value is the critical metric now, and every delay is wasted potential. We built Hammerspace to eliminate that friction.”
Hammerspace’s value proposition resonates in an increasingly crowded infrastructure landscape. With the proliferation of siloed, fragmented, and unstructured data across organizations, traditional methods of data movement—copying, staging, syncing—have become too slow and operationally burdensome. By enabling data to be accessed in place, and moving only when absolutely necessary, Hammerspace offers a new standard for “just-in-time” data availability.
The company positions itself not as a storage vendor but as a “Tier 0” data platform for AI. It accelerates throughput between storage systems and GPUs, optimizes NVMe performance, and enables data to be rapidly ingested by models with minimal configuration or deployment lag. Flynn refers to this as “orchestration for outcomes”—abstracting the complexity of data movement in order to maximize infrastructure utilization and model responsiveness.
Altimeter Capital Partner Jamin Ball emphasized the strategic role of data performance in AI architecture: “You don’t have an AI strategy without a data strategy. Hammerspace understands that AI is only as powerful as the data it can reach.”
Hammerspace has raised a total of $156 million to date. Prior to this round, it raised $56 million from Prosperity7 Ventures (the VC arm of Saudi Aramco), ARK Investment Management LLC, Pier 88 Investment Partners, and other investors. The company was originally self-funded by Flynn.
The new capital will be used to expand go-to-market initiatives, scale deployments with existing enterprise and government customers, and build out partnerships with leading system integrators and cloud providers. Flynn has previously hinted at IPO ambitions, though recent statements suggest a two-year timeline based on market conditions.
Hammerspace competes in a broader category that includes players like VAST Data, WEKA, Dell Technologies and Pure Storage. While many of these firms focus on high-performance storage, Hammerspace differentiates itself by providing a software-only solution that virtualizes data across all storage types and environments. This orchestration-first approach gives it flexibility, speed, and cost efficiency that many infrastructure buyers now prioritize.
With the acceleration of AI development and growing pressure on infrastructure teams to eliminate idle time, Hammerspace is emerging as one of the few companies purpose-built to handle the data performance demands of large-scale model training, inference, and real-time AI systems.
Statsig Lands $100M Series C at $1.1B Valuation as Experimentation Stack Consolidates
Statsig, a Bellevue-based product experimentation and analytics startup, has raised $100 million in a Series C funding round led by ICONIQ Growth with participation from Sequoia Capital and Madrona Venture Group. The round brings Statsig’s valuation to $1.1 billion and marks a significant milestone for the company. Statsig was founded in 2021 by Vijaye Raji, a former engineering leader at Facebook.
Statsig provides a platform that helps product and engineering teams run controlled experiments, manage feature rollouts, and analyze user behavior. Its tools are widely used by companies looking to accelerate product development cycles, especially as AI-driven features introduce more unpredictability into user-facing applications. The company claims a broad range of end customers including Notion, Figma, Whatnot, and Rec Room.
This raise comes amid rising demand for tools that can bridge data, experimentation, and real-time analytics—especially in an era where AI products need to be constantly tested and validated post-deployment. Statsig positions itself as a modern alternative to homegrown experimentation stacks and legacy tools by unifying feature flags, metrics, and experimentation into a single, developer-friendly platform.
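For readers less familiar with this category, the sketch below shows roughly how a feature gate and an experiment parameter are read from application code with Statsig’s Python server SDK. The gate name, experiment name, and parameter are hypothetical, and the SDK surface may differ slightly from what is shown here.

```python
# Rough sketch of server-side feature gating and experimentation with
# Statsig's Python SDK. Gate, experiment, and parameter names are hypothetical.
from statsig import statsig, StatsigUser

statsig.initialize("secret-key-from-console")  # server secret key (placeholder)

user = StatsigUser("user-123")

# Feature gate: a boolean rollout / kill-switch check.
show_new_onboarding = statsig.check_gate(user, "new_onboarding_flow")

# Experiment: fetch a parameter value for whichever variant this user landed in.
experiment = statsig.get_experiment(user, "checkout_button_test")
button_color = experiment.get("button_color", "blue")

print(show_new_onboarding, button_color)

statsig.shutdown()  # flush queued exposure events before the process exits
```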
The timing of Statsig’s raise is notable given Datadog’s recent acquisition of its closest rival, Eppo. Eppo, another fast-growing experimentation platform founded in 2021, had positioned itself as a champion of the “modern data stack” and had integrated tightly with tools like dbt Labs and Snowflake. It raised $28 million in Series B financing last year from Menlo Ventures and Amplify Partners.
Datadog’s acquisition of Eppo signals a broader consolidation in the product analytics and experimentation space. With the lines between observability, product analytics, and experimentation increasingly blurring—particularly in AI-powered applications—platform providers are racing to offer end-to-end solutions. Datadog, already a leader in infrastructure monitoring and APM, is incorporating Eppo’s capabilities into its core platform, enabling customers to run product experiments directly within their existing observability workflows.
In contrast, Statsig’s strategy is to remain an independent and deeply focused provider of experimentation infrastructure. The company emphasizes technical sophistication and developer ergonomics, with a focus on automating the statistical complexity behind experimentation. Its recent release of Pulse, a real-time product analytics module, shows its intent to go beyond A/B testing and become a broader system of record for product decisions.
Other players in this space include Optimizely (acquired by Episerver, which has since rebranded under the Optimizely name), Split.io, and LaunchDarkly. Statsig’s growth, combined with the Eppo acquisition, suggests that experimentation is becoming a core part of modern product infrastructure rather than a niche analytics function.
With fresh capital, Statsig plans to expand its engineering team, enhance its integrations across AI and data platforms, and continue investing in statistical tooling to support the next generation of AI-native products. In an increasingly consolidated market, the company is betting on deep product focus and technical differentiation to remain a standalone leader.
Task-Specific Language Model Pioneer Fastino Raises $17.5M Seed Round
Fastino, a Palo Alto-based AI startup, has raised $17.5 million in seed funding led by Khosla Ventures, bringing its total funding to nearly $25 million. The company specializes in Task-Specific Language Models (TLMs), which are lightweight AI models optimized for specific enterprise tasks. Founded by Ash Lewis and George Hurn-Maloney, Fastino has built a team that includes researchers from Google DeepMind, Stanford University, Carnegie Mellon University, and Apple Intelligence.
Fastino’s TLMs are designed to perform tasks such as summarization, function calling, text-to-JSON conversion, PII redaction, text classification, profanity censoring, and information extraction. The company says the models are trained on low-end gaming GPUs for a total cost of less than $100,000, and it claims inference up to 99 times faster than traditional large language models. This approach allows deployment on CPUs and low-end GPUs, making AI more accessible and cost-effective for enterprises.
Fastino offers a flat monthly subscription model, eliminating per-token fees and providing predictable costs for developers. The TLM API includes a free tier with up to 10,000 requests per month, and enterprise customers can deploy models within their own infrastructure, including Virtual Private Clouds and on-premises data centers.
Additional investors in the seed round include Insight Partners, Valor Equity Partners, Dropbox Ventures, and notable angel investors such as Scott Johnston, former CEO of Docker, Inc., and Lukas Biewald, CEO of Weights & Biases.
Agentic Data Insights Platform WisdomAI Emerges from Stealth with $23 million in Seed Funding
WisdomAI, a San Mateo-based startup, has emerged from stealth with a $23 million seed funding round led by Coatue, with participation from Madrona, GTM Capital, The Anthology Fund, U First Capital, Latitude Capital, and over 30 angel investors.
Founded in 2023 by Soham Mazumdar, co-founder of Rubrik and former engineer at Google and Facebook, along with former Rubrik colleagues Kapil Chhabra, Sharvanath Pathak, and Guilherme Menezes, WisdomAI aims to redefine business intelligence through its Agentic Data Insights Platform.
At the core of WisdomAI’s platform is the “Knowledge Fabric,” a dynamic, interconnected map of an organization’s data ecosystem enriched with business context. This fabric is continuously refined by domain experts, ensuring that AI agents operate with a deep understanding of the company’s unique data landscape.
The company claims a key differentiator is its approach to mitigating AI hallucinations. Instead of using generative AI to produce answers, the platform employs it to generate precise queries across data systems, ensuring that responses are grounded in actual company data.
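The pattern is easiest to see in code. The sketch below is a generic illustration of “generate the query, not the answer” and is not WisdomAI’s implementation: a stand-in generate_sql() function plays the role of the model, the query runs against real tables, and the final answer is read off the query results rather than generated freely.

```python
# Generic illustration of grounding answers in real data by generating
# queries instead of prose. Not WisdomAI's implementation; generate_sql()
# is a hypothetical stand-in for an LLM call.
import sqlite3

def generate_sql(question: str, schema: str) -> str:
    """Hypothetical LLM step: turn a natural-language question plus a schema
    description into SQL. A real system would call a model here and validate
    the output (read-only statements, allowed tables, and so on)."""
    # Hard-coded so the example runs on its own.
    return "SELECT region, SUM(amount) AS revenue FROM sales GROUP BY region;"

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE sales (region TEXT, amount REAL);
    INSERT INTO sales VALUES ('EMEA', 120.0), ('AMER', 340.0), ('APAC', 90.0);
""")

schema = "sales(region TEXT, amount REAL)"
sql = generate_sql("What is revenue by region?", schema)
rows = conn.execute(sql).fetchall()

# The answer is derived from query results, so every figure traces back to
# actual rows in the underlying tables.
for region, revenue in rows:
    print(f"{region}: {revenue}")
```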
Early adopters of WisdomAI include ConocoPhillips, Cisco, and Rubrik, which are using the platform to gain actionable insights across various business functions.
With this funding, WisdomAI plans to accelerate product development, expand its engineering teams, and scale its enterprise customer base, aiming to transform how organizations derive intelligence from their data.
To continue receiving updates, please click here to subscribe to InfraRead.
My day job is advising growing companies on corporate strategy, finance and M&A. I recently advised Agnostiq on its acquisition by DataRobot.
If you are looking for an experienced partner with a track record of maximizing outcomes, let’s find a time to talk.
This article is cross-posted here.