SEPT. 10, 2025 / AI
The Gemini Batch API now supports Embeddings and OpenAI SDK compatibility, enabling asynchronous processing at 50% lower cost. The new Gemini Embedding model can be used with the Batch API for cost-sensitive use cases, and OpenAI SDK compatibility makes it simple to switch existing workloads over to the Gemini Batch API.
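Batch jobs in the OpenAI-compatible format are submitted as JSONL files, one request per line. A minimal sketch of preparing such a file for embedding requests, assuming the OpenAI-style batch line schema (`custom_id`/`method`/`url`/`body`); the model name `gemini-embedding-001` and the `/v1/embeddings` endpoint path are illustrative assumptions to verify against the current Gemini docs:

```python
import json

def build_embedding_batch(texts, model="gemini-embedding-001"):
    """Return OpenAI-style batch request lines (JSONL), one per input text."""
    lines = []
    for i, text in enumerate(texts):
        request = {
            "custom_id": f"request-{i}",   # echoed in results to match outputs to inputs
            "method": "POST",
            "url": "/v1/embeddings",       # assumed endpoint path, per OpenAI batch format
            "body": {"model": model, "input": text},
        }
        lines.append(json.dumps(request))
    return "\n".join(lines)

# Example: two embedding requests, ready to write to a .jsonl upload file.
jsonl = build_embedding_batch(["What is the Batch API?", "How do embeddings work?"])
print(jsonl)
```

Each line is independent, so results can arrive in any order and are rejoined via `custom_id`.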
JULY 7, 2025 / Gemini
The new batch mode in the Gemini API is designed for high-throughput, non-latency-critical AI workloads. It simplifies large jobs by handling scheduling and processing, making tasks like data analysis, bulk content creation, and model evaluation more cost-effective and scalable, so developers can process large volumes of data efficiently.
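Batch mode likewise takes a JSONL file of pre-built requests, which the service schedules and processes on your behalf. A minimal sketch of assembling such a file, assuming each line pairs a caller-chosen key with a standard `generateContent`-style request body; the exact line schema (`key`/`request`) is an assumption for illustration and should be checked against the current Gemini API batch documentation:

```python
import json

def build_batch_jsonl(prompts):
    """Return one batch request line per prompt, as a JSONL string."""
    lines = []
    for i, prompt in enumerate(prompts):
        line = {
            "key": f"request-{i}",  # assumed field: echoed back to match results to prompts
            "request": {"contents": [{"parts": [{"text": prompt}]}]},
        }
        lines.append(json.dumps(line))
    return "\n".join(lines)

# Example: a small bulk-content job written to a local file for upload.
prompts = [
    "Summarize the plot of Hamlet in one sentence.",
    "List three uses of text embeddings.",
]
with open("batch_input.jsonl", "w") as f:
    f.write(build_batch_jsonl(prompts))
```

Because the job runs asynchronously, the file is uploaded once and results are retrieved later, which is what makes the lower batch pricing practical for large evaluation or content-generation runs.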