  • SEPT. 5, 2025 / Mobile

    Google AI Edge Gallery: Now with audio and on Google Play

    Google AI Edge has expanded the Gemma 3n preview to include audio support. Users can try it on their own mobile phones with the Google AI Edge Gallery, now available in open beta on the Play Store.
  • SEPT. 4, 2025 / AI

    From Fine-Tuning to Production: A Scalable Embedding Pipeline with Dataflow

    Learn how to use Google's EmbeddingGemma, an efficient open model, with Google Cloud's Dataflow and vector databases like AlloyDB to build scalable, real-time knowledge ingestion pipelines.

  • SEPT. 4, 2025 / Gemma

    Introducing EmbeddingGemma: The Best-in-Class Open Model for On-Device Embeddings

    EmbeddingGemma is a new open embedding model from Google designed for efficient on-device AI applications. It is the highest-ranking text-only multilingual embedding model under 500M parameters on the MTEB benchmark, enabling powerful features like RAG and semantic search directly on mobile devices without an internet connection.

  • AUG. 27, 2025 / Gemini

    Beyond the terminal: Gemini CLI comes to Zed

    Google and Zed have partnered to integrate Gemini CLI directly into the Zed code editor, bringing AI capabilities to developers where they work. The integration enables faster, more focused coding: in-place code generation, instant answers, and natural chat, with a seamless review workflow for AI-generated changes.
  • AUG. 27, 2025 / Google Labs

    Stop “vibe testing” your LLMs. It's time for real evals.

    Stax, an experimental developer tool, addresses the shortcomings of "vibe testing" LLMs by streamlining the LLM evaluation lifecycle, letting users rigorously test their AI stack and make data-driven decisions through human labeling and scalable LLM-as-a-judge auto-raters.
  • AUG. 18, 2025 / Gemini

    URL context tool for Gemini API now generally available

    The Gemini API's URL Context tool is now generally available, allowing developers to ground prompts using web content instead of manual uploads. This release expands support to PDFs and images.
  • JULY 24, 2025 / Google Labs

    Introducing Opal: describe, create, and share your AI mini-apps

    Opal is a new experimental tool from Google Labs that lets you compose prompts into dynamic, multi-step mini-apps using natural language, with no code required. Users can build and deploy shareable AI apps with powerful features and seamless integration with existing Google tools.
  • JULY 24, 2025 / AI

    The agentic experience: Is MCP the right tool for your AI future?

    Apigee helps enterprises integrate large language models (LLMs) into existing API ecosystems securely and scalably, addressing challenges like authentication and authorization that the evolving Model Context Protocol (MCP) does not yet fully cover. An open-source MCP server example demonstrates how to implement enterprise-ready API security for AI agents.
  • JULY 23, 2025 / Firebase

    Unleashing new AI capabilities for popular frameworks in Firebase Studio

    New AI capabilities for popular frameworks in Firebase Studio include AI-optimized templates, streamlined integration with Firebase backend services, and the ability to fork workspaces for experimentation and collaboration, making AI-assisted app development faster and more intuitive for developers worldwide.
  • JULY 22, 2025 / Gemini

    Gemini 2.5 Flash-Lite is now stable and generally available

    Gemini 2.5 Flash-Lite, previously in preview, is now stable and generally available. This cost-efficient model is ~1.5x faster than 2.0 Flash-Lite and 2.0 Flash, offers high quality, and includes 2.5 family features like a 1 million-token context window and multimodality.