  • AUG. 12, 2025 / Kaggle

    Train a GPT2 model with JAX on TPU for free

    Build and train a GPT2 model from scratch using JAX on Google TPUs, with a complete Python notebook for free-tier Colab or Kaggle. Learn how to define a hardware mesh, partition model parameters and input data for data parallelism, and optimize the model training process (see the mesh-and-sharding sketch after this list).

  • JUNE 24, 2025 / Kaggle

    Using KerasHub for easy end-to-end machine learning workflows with Hugging Face

    KerasHub lets you mix and match model architectures and weights across machine learning frameworks: checkpoints from sources like the Hugging Face Hub (including those created with PyTorch) can be loaded into Keras models and run on the JAX, PyTorch, or TensorFlow backend. That flexibility lets you leverage a vast array of community fine-tuned models while keeping full control over your chosen backend framework (see the loading sketch after this list).

    How to load model weights from SafeTensors into KerasHub for multi-framework machine learning
  • MAY 13, 2025 / TensorFlow

    Build and train a recommender system in 10 minutes using Keras and JAX

    Keras Recommenders (KerasRS) is a newly announced library that helps developers build recommendation systems, with APIs that provide building blocks for ranking and retrieval. It can be installed via pip and supports the JAX, TensorFlow, and PyTorch backends (see the install sketch after this list).

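For the first result, the described workflow (define a hardware mesh, then partition model parameters and input batches for data parallelism) maps onto JAX's sharding API roughly as follows. This is a minimal sketch rather than the notebook's actual code; the single "data" mesh axis, the array shapes, and the toy train_step are illustrative assumptions.

```python
import jax
import jax.numpy as jnp
from jax.experimental import mesh_utils
from jax.sharding import Mesh, NamedSharding, PartitionSpec as P

# Arrange all local accelerators (e.g. the 8 TPU cores on a free Colab/Kaggle TPU)
# into a 1D mesh with a single "data" axis.
devices = mesh_utils.create_device_mesh((jax.device_count(),))
mesh = Mesh(devices, axis_names=("data",))

# Pure data parallelism: replicate the parameters, shard the batch dimension.
replicated = NamedSharding(mesh, P())
batch_sharded = NamedSharding(mesh, P("data"))

params = jax.device_put(jnp.ones((128, 128)), replicated)   # stand-in for model weights
batch = jax.device_put(jnp.ones((64, 128)), batch_sharded)  # stand-in for a token batch

@jax.jit
def train_step(params, batch):
    # A real GPT-2 step would run the forward pass, compute the loss, and apply
    # gradients; this toy matmul only shows sharded inputs flowing through jit.
    return jnp.mean(batch @ params)

print(train_step(params, batch))
```
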
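For the second result, loading a Hugging Face Hub checkpoint into KerasHub and picking a backend looks roughly like the sketch below. The Gemma checkpoint name and the generation call are illustrative assumptions; substitute whatever architecture and preset you actually need.

```python
import os
# Choose the backend before importing Keras; "tensorflow" or "torch" also work.
os.environ["KERAS_BACKEND"] = "jax"

import keras_hub

# An "hf://" handle tells KerasHub to fetch the SafeTensors weights from the
# Hugging Face Hub and load them into the matching Keras architecture.
causal_lm = keras_hub.models.CausalLM.from_preset("hf://google/gemma-2b")
print(causal_lm.generate("What is Keras?", max_length=64))
```
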
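For the third result, the installation and backend selection mentioned in the summary amount to a one-line pip install plus the usual KERAS_BACKEND switch. This sketch stops at imports; the specific ranking and retrieval building blocks are documented in the KerasRS API reference.

```python
# Install the library first:  pip install keras-rs

import os
os.environ["KERAS_BACKEND"] = "jax"  # or "tensorflow" / "torch"

import keras
import keras_rs  # ranking and retrieval building blocks for Keras 3

print(keras.backend.backend())  # confirms which backend KerasRS will run on
```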