  • JUNE 26, 2025 / AI

    Unlock deeper insights with the new Python client library for Data Commons

    Google has released a new Python client library for Data Commons, an open-source knowledge graph that unifies public statistical data. Developed with contributions from The ONE Campaign, the library improves how data developers work with Data Commons, offering enhanced features, support for custom instances, and easier access to a vast array of statistical variables.

  • JUNE 24, 2025 / Gemini

    Supercharge your notebooks: The new AI-first Google Colab is now available to everyone

    The new AI-first Google Colab enhances productivity with features like iterative querying for conversational coding, a next-generation Data Science Agent for autonomous workflows, and effortless code transformation. Early adopters report a dramatic productivity boost: ML projects move faster, code gets debugged more quickly, and high-quality visualizations come together with ease.

  • JUNE 24, 2025 / Kaggle

    Using KerasHub for easy end-to-end machine learning workflows with Hugging Face

    KerasHub enables users to mix and match model architectures and weights across machine learning frameworks, allowing checkpoints from sources like the Hugging Face Hub (including those created with PyTorch) to be loaded into Keras models for use with JAX, PyTorch, or TensorFlow. This flexibility means you can leverage a vast array of community fine-tuned models while keeping full control over your chosen backend framework (a minimal loading sketch appears after this list).

  • JUNE 23, 2025 / Kaggle

    Multilingual innovation in LLMs: How open models help unlock global communication

    Developers adapt LLMs like Gemma for diverse languages and cultural contexts, demonstrating AI's potential to bridge global communication gaps by addressing challenges like translating ancient texts, localizing mathematical understanding, and enhancing cultural sensitivity in lyric translation.

  • MAY 20, 2025 / Gemini

    Fully Reimagined: AI-First Google Colab

    Google Colab is launching a reimagined AI-first version at Google I/O, featuring an agentic collaborator powered by Gemini 2.5 Flash with iterative querying, an upgraded Data Science Agent, effortless code transformation, and flexible interaction methods, all aimed at significantly improving coding workflows.

  • APRIL 15, 2025 / Cloud

    Apigee announces general availability of APIM Extension Processor

    The Apigee Extension Processor extends Apigee's capabilities by letting it manage gRPC streaming and integrate with Google Cloud services through Cloud Load Balancing. This makes it possible to apply Apigee policies to a wider range of backend services and improves secure access to Google Cloud infrastructure.

  • APRIL 8, 2025 / Cloud

    Simplified Dataflow Connectors with Managed I/O

    Google Cloud Dataflow's Managed I/O simplifies the use of Apache Beam I/O connectors by automatically keeping connectors on the latest versions and exposing them through a standardized API. Connectors are optimized specifically for Dataflow, which ensures efficient performance, reduces manual configuration, and frees users to focus on pipeline logic (a short Beam sketch of the API appears after this list).

  • MARCH 25, 2025 / Gemini

    Introducing TxGemma: Open models to improve therapeutics development

    Google DeepMind releases TxGemma, built on Gemma, which predicts therapeutic properties, along with Agentic-Tx, powered by Gemini 2.0 Pro, which tackles complex research problems with advanced tools.

  • MARCH 13, 2025 / Gemini

    Unlocking bonus worlds with Gemini for the Google I/O puzzle

    The Google I/O 2025 puzzle used the Gemini API to generate dynamic riddles for bonus worlds, enhancing player engagement and scalability. Here's what our developers learned about using the Gemini API effectively, covering creativity, design, and implementation strategies (an illustrative generation sketch appears after this list).

  • JANUARY 15, 2025 / AI

    Vertex AI RAG Engine: A developer's tool

    Vertex AI RAG Engine, a managed orchestration service, streamlines retrieving relevant information and feeding it to large language models. This enables developers to build robust generative AI apps whose responses are grounded in factual data (a rough end-to-end sketch appears after this list).

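For the KerasHub entry above, here is a minimal sketch of the cross-framework loading pattern it describes: pick a backend, then pull a Hugging Face Hub checkpoint (stored as SafeTensors) into a Keras model via an `hf://` handle. The specific model handle and generation settings are illustrative assumptions, and gated models such as Gemma require a Hugging Face token in the environment.

```python
import os

# Choose the backend before Keras is imported; "jax", "tensorflow",
# and "torch" all work with KerasHub models.
os.environ["KERAS_BACKEND"] = "jax"

import keras_hub  # noqa: E402  (imported after setting the backend)

# Load a transformers-format (SafeTensors) checkpoint from the Hugging Face
# Hub directly into a Keras model. The handle below is illustrative; gated
# models need HF_TOKEN (or a prior `huggingface-cli login`).
model = keras_hub.models.GemmaCausalLM.from_preset("hf://google/gemma-2b")

# Run generation on the selected backend.
print(model.generate("Why is the sky blue?", max_length=64))
```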
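For the Managed I/O entry, a minimal Apache Beam Python sketch of the standardized API it describes: one `Read` transform plus a config dictionary instead of per-connector transforms. The table name and catalog settings are placeholders, and on Dataflow the service can substitute its own optimized, automatically updated implementation of the connector.

```python
import apache_beam as beam

# Managed I/O exposes supported connectors (e.g. Iceberg, Kafka) through a
# single Read/Write transform configured with a dict. Values are placeholders.
with beam.Pipeline() as p:
    rows = p | "ReadIceberg" >> beam.managed.Read(
        beam.managed.ICEBERG,
        config={
            "table": "db.orders",
            "catalog_name": "my_catalog",
            "catalog_properties": {"warehouse": "gs://my-bucket/warehouse"},
        },
    )
    rows | "Print" >> beam.Map(print)
```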
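For the I/O puzzle entry, a small sketch of the kind of Gemini API call that can generate a themed riddle on demand, using the google-genai Python SDK. The model choice, prompt, and theme are illustrative assumptions, not the puzzle team's actual implementation.

```python
from google import genai

# The client reads the API key from the environment (e.g. GOOGLE_API_KEY).
client = genai.Client()

def generate_riddle(theme: str) -> str:
    """Ask Gemini for a short riddle about the given theme (illustrative)."""
    response = client.models.generate_content(
        model="gemini-2.0-flash",  # illustrative model choice
        contents=(
            "Write a four-line rhyming riddle whose answer is "
            f"'{theme}'. Do not reveal the answer."
        ),
    )
    return response.text

print(generate_riddle("a compiler"))
```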
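For the Vertex AI RAG Engine entry, a rough sketch of the managed orchestration flow it describes: create a corpus, import documents, then expose the corpus as a retrieval tool so a Gemini model answers from it. This assumes the preview `rag` module of the Vertex AI Python SDK; names and signatures have shifted across SDK releases, so treat them as assumptions and check the current reference.

```python
import vertexai
from vertexai.preview import rag
from vertexai.generative_models import GenerativeModel, Tool

# Project, location, bucket path, and model name are placeholders.
vertexai.init(project="my-project", location="us-central1")

# 1. Create a corpus and import documents into it.
corpus = rag.create_corpus(display_name="product-docs")
rag.import_files(corpus.name, paths=["gs://my-bucket/docs/"])

# 2. Wrap the corpus as a retrieval tool for grounding.
rag_tool = Tool.from_retrieval(
    retrieval=rag.Retrieval(
        source=rag.VertexRagStore(rag_corpora=[corpus.name], similarity_top_k=3)
    )
)

# 3. Generate an answer grounded in the retrieved passages.
model = GenerativeModel("gemini-1.5-flash", tools=[rag_tool])
print(model.generate_content("How do I rotate an API key?").text)
```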