Latent AI Solutions for the Public Sector

  • Latent AI Efficient Inference Platform (LEIP)

    LEIP expedites edge AI development with an all-in-one platform that empowers developers to build secure models quickly, with seamless updates in the field or the lab. Our platform allows developers to jump-start design with benchmarked configurations, rapidly retrain within a trusted pipeline, and deploy secure models that can be monitored in real time.

  • LEIP Design

    LEIP Design is a set of tools for designing edge AI models. It helps you choose and fine-tune the best model-hardware combination to meet your data, performance, and device requirements.

    • Bring your data to a streamlined ML design process that matches your requirements to benchmarked, ready-to-execute recipes.
    • Choose from over 1,000 recipes benchmarked against on-device performance, inference speed, and memory footprint (see the selection sketch after this list).
    • Rapidly prototype by interactively analyzing size, accuracy, and power trade-offs to meet your exact criteria.
    • Speed updates and retraining with new data using a trusted pipeline.
    • Reuse a modular pipeline to apply different models and/or hardware to the same dataset.
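
    To make recipe selection concrete, the sketch below shows constraint-based filtering over a small stand-in catalog, returning the most accurate recipes that fit a device's latency and memory budget. The `Recipe` fields, catalog entries, and `pick_recipes` helper are illustrative assumptions, not the LEIP Design API.

    ```python
    # Hypothetical sketch of recipe selection. Field names, catalog entries,
    # and thresholds are illustrative assumptions, not LEIP Design interfaces.
    from dataclasses import dataclass

    @dataclass
    class Recipe:
        name: str          # model/backbone identifier
        target: str        # hardware target the benchmark ran on
        map50: float       # accuracy proxy (mAP@0.5) from the benchmark
        latency_ms: float  # measured on-device inference latency
        memory_mb: float   # measured runtime memory footprint

    # Tiny stand-in for a benchmarked recipe catalog.
    CATALOG = [
        Recipe("detector-nano-int8",  "jetson-orin-nano", 0.46,  8.2,  95.0),
        Recipe("detector-small-int8", "jetson-orin-nano", 0.55, 14.7, 180.0),
        Recipe("detector-med-fp16",   "jetson-agx-orin",  0.63, 22.1, 640.0),
    ]

    def pick_recipes(target, max_latency_ms, max_memory_mb):
        """Return recipes for a device that fit the latency and memory
        budgets, best accuracy first."""
        matches = [r for r in CATALOG
                   if r.target == target
                   and r.latency_ms <= max_latency_ms
                   and r.memory_mb <= max_memory_mb]
        return sorted(matches, key=lambda r: r.map50, reverse=True)

    for r in pick_recipes("jetson-orin-nano", max_latency_ms=15, max_memory_mb=200):
        print(f"{r.name}: mAP50={r.map50}, {r.latency_ms} ms, {r.memory_mb} MB")
    ```
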
  • LEIP Optimize

    LEIP Optimize is a set of tools that automate rapid hardware and software optimization of ML models without requiring hardware expertise.

    • Bring your own model to a framework capable of ingesting models from various sources (see the export sketch after this list).
    • Supports computer vision models and most models with dynamic tensors, such as transformers, including models from popular frameworks (TensorFlow, PyTorch, ONNX, etc.).
    • Easily integrates into your current ML environment to maintain developer familiarity and tooling flexibility.
    • Allows users to optimize and encrypt multiple models for one or more hardware targets, enhancing workflow without requiring hardware expertise.
    • Facilitates watermarking with ease. 
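
    As a minimal "bring your own model" example, the sketch below exports a trained PyTorch model to ONNX, a common interchange format an optimizer can ingest. The pretrained model, input shape, and file name are assumptions for illustration; the hand-off into LEIP Optimize itself is not shown.

    ```python
    # Minimal sketch: export a PyTorch model to ONNX so an optimizer can ingest it.
    # The model choice, input shape, and file name are illustrative assumptions.
    import torch
    import torchvision

    # Any trained model will do; a small pretrained classifier keeps the example short.
    model = torchvision.models.mobilenet_v3_small(weights="DEFAULT").eval()

    # Trace with a representative input shape and write out an ONNX graph.
    dummy_input = torch.randn(1, 3, 224, 224)
    torch.onnx.export(
        model,
        dummy_input,
        "mobilenet_v3_small.onnx",   # artifact handed to the optimization step
        input_names=["images"],
        output_names=["logits"],
        opset_version=17,
    )
    ```
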
  • LEIP Deploy

    LEIP Deploy is a set of tools that generates and maintains a secure, standardized runtime engine for Edge AI that is frictionless to deploy and update.

    • Standardized runtime portable to multiple hardware platforms
    • Lightweight engine for all LEIP services and inference
    • Compatibility with C++ and Python
    • APIs for easy third-party and application integration
    • Deter unauthorized use or distribution of models with version tracking and digital watermarks
    • Seamlessly measure performance during deployment with real-time diagnostic metrics (see the sketch after this list)
    • Update or replace a model with no changes to your application
    • Version tracking with UUID
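
    The sketch below illustrates the application-side pattern these bullets describe: a model loaded behind a stable interface, tagged with a UUID for version tracking, and timed on every call. ONNX Runtime is used purely as a stand-in runtime; the class, file name, and metric reporting are assumptions, not the LEIP Deploy API.

    ```python
    # Application-side sketch: stable inference interface with UUID version
    # tracking and per-call latency as a diagnostic metric. ONNX Runtime is a
    # stand-in for the actual runtime engine; names here are assumptions.
    import time
    import uuid
    import numpy as np
    import onnxruntime as ort

    class MonitoredModel:
        """Wraps a runtime session so application code does not change when
        the underlying model file is swapped out."""

        def __init__(self, model_path: str):
            self.version_id = uuid.uuid4()            # version tracking with UUID
            self.session = ort.InferenceSession(model_path)
            self.input_name = self.session.get_inputs()[0].name

        def predict(self, batch: np.ndarray):
            start = time.perf_counter()
            outputs = self.session.run(None, {self.input_name: batch})
            latency_ms = (time.perf_counter() - start) * 1000.0
            # Real-time diagnostic metric reported alongside the result.
            print(f"model={self.version_id} latency={latency_ms:.1f} ms")
            return outputs

    model = MonitoredModel("mobilenet_v3_small.onnx")   # artifact from the export sketch
    model.predict(np.random.rand(1, 3, 224, 224).astype(np.float32))
    ```
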
  • Latent Ruggedized Toolkit

    The Latent Ruggedized Toolkit (RTK) bridges military-grade hardware with LEIP software, allowing users to modify, retrain, and deploy Edge AI directly in the field.

    • Supports offline operational needs
      • Models compressed by 10x in memory size are easy to transmit (see the back-of-the-envelope sketch after this list)
      • Transmit using low-bandwidth tactical radios
    • Extends/sustains operations
    • Faster retraining (minutes, not hours)
    • Usable by non-machine learning experts
    • Avoid model drift with a trusted pipeline for rapid retraining, speeding AI fielding cycles
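
    To show why 10x compression matters over a low-bandwidth link, here is a back-of-the-envelope sketch of transfer time. The 40 MB model size and 240 kbit/s radio rate are illustrative assumptions, not Latent AI figures.

    ```python
    # Back-of-the-envelope sketch of transfer time over a low-bandwidth link.
    # The 40 MB model size and 240 kbit/s radio rate are illustrative assumptions.

    def transfer_seconds(size_mb: float, link_kbps: float) -> float:
        """Time to move size_mb megabytes over a link_kbps kilobit-per-second link."""
        bits = size_mb * 8 * 1_000_000
        return bits / (link_kbps * 1_000)

    original_mb = 40.0                    # uncompressed model artifact
    compressed_mb = original_mb / 10.0    # after 10x compression
    radio_kbps = 240.0                    # example tactical-radio data rate

    print(f"uncompressed: {transfer_seconds(original_mb, radio_kbps) / 60:.1f} min")
    print(f"compressed:   {transfer_seconds(compressed_mb, radio_kbps) / 60:.1f} min")
    ```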