Posts by Lucia Loher

  • SEPT. 10, 2025 / AI

    Gemini Batch API now supports Embeddings and OpenAI Compatibility

    The Gemini Batch API now supports embeddings and OpenAI compatibility, enabling asynchronous processing at 50% lower cost than synchronous requests. The new Gemini embedding model can be used with the Batch API for cost-sensitive, high-volume use cases, and OpenAI SDK compatibility makes it straightforward to switch existing batch workflows to the Gemini Batch API (a hedged example follows the list below).

  • JULY 7, 2025 / Gemini

    Batch Mode in the Gemini API: Process more for less

    Batch mode in the Gemini API is built for high-throughput workloads that are not latency-critical. It simplifies large jobs by handling scheduling and processing on your behalf, making tasks such as data analysis, bulk content creation, and model evaluation more cost-effective and scalable, so developers can process large volumes of data efficiently.

    [Image: Scale your AI workloads with batch mode in the Gemini API]
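
The OpenAI compatibility mentioned in the first post can be sketched as follows. This is a minimal, hedged example of submitting a batch embeddings job through the OpenAI Python SDK pointed at Gemini's OpenAI-compatible endpoint; the base URL, the API key handling, and the exact Batch API shapes accepted by the compatibility layer are assumptions for illustration, not details confirmed by these summaries.

```python
# Hedged sketch: a batch embeddings job submitted with the OpenAI Python SDK.
# Assumptions (not confirmed by the posts above): the base_url below is
# Gemini's OpenAI-compatible endpoint, a Gemini API key is accepted as the
# api_key, and the compatibility layer supports the standard Batch API calls.
from openai import OpenAI

client = OpenAI(
    api_key="YOUR_GEMINI_API_KEY",
    base_url="https://generativelanguage.googleapis.com/v1beta/openai/",
)

# Each line of the JSONL file is one embedding request in OpenAI batch format.
batch_input = client.files.create(
    file=open("embedding_requests.jsonl", "rb"),
    purpose="batch",
)

# Create the asynchronous batch job against the embeddings endpoint.
job = client.batches.create(
    input_file_id=batch_input.id,
    endpoint="/v1/embeddings",
    completion_window="24h",
)
print(job.id, job.status)  # poll later with client.batches.retrieve(job.id)
```

Because the job runs asynchronously at the discounted batch rate, this pattern fits the cost-sensitive, high-volume embedding workloads both posts describe.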