  • SEP 04, 2024 / Google AI Edge

    TensorFlow Lite is now LiteRT

    TensorFlow Lite, now named LiteRT, is still the same high-performance runtime for on-device AI, but with an expanded vision to support models authored in PyTorch, JAX, and Keras.

  • AUG 13, 2024 / Mobile

    Streamlining LLM Inference at the Edge with TFLite

    XNNPack, the default TensorFlow Lite CPU inference engine, has been updated to improve performance and memory management, allow cross-process collaboration, and simplify the user-facing API.