Laku is a command line tool to manage your AI datasets and models
Seamlessly publish, manage, and use datasets, models, and other artifacts from CI/CD pipelines, servers, HPC training infrastructure, or your local computer. Works with S3-protocol-compliant servers, whether self-hosted or in the cloud.
Install Laku CLI by copying the command below and executing it in your terminal.
Laku CLI is compatible with macOS, Linux, OpenBSD, and Windows, on both Intel and ARM (Apple Silicon, AWS Graviton, Raspberry Pi, and similar).
For further information on installing Laku CLI, see Compatibility and Install, or the Getting Started guide.
What’s included in Laku CLI
Laku CLI is a dataset and model lifecycle management command line tool that makes Artificial Intelligence dataset and model operations easy in any environment.
- Compatible with most S3 protocol servers (AWS S3, Azure Blob Storage, Google Cloud Storage, MinIO, …) with no changes needed in your storage infrastructure.
- Compatible with multiple operating systems and architectures; see Compatibility for more information.
- Docker image available with the latest CLI version
- Manage multiple dataset and model storage server locations via server remotes
- Customizable .lakuignore file (fully compatible with the standard .gitignore specification)
- Free-to-use Laku admin CLI for performing all remote storage server administration tasks
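Because .lakuignore follows the standard .gitignore specification, the usual pattern syntax (wildcards, directory suffixes, `!` negations) applies. A minimal sketch, with illustrative patterns:

```gitignore
# Exclude scratch and cache files from published artifacts
*.tmp
__pycache__/
.DS_Store

# Ignore local experiment outputs, but keep the curated exports
runs/
!runs/exports/
```

As in Git, later patterns override earlier ones, so the `!runs/exports/` negation re-includes that subfolder even though `runs/` is ignored.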
Dataset management
Some of the key dataset management features of Laku CLI are:
- Datasets can be published at whole-file and/or folder granularity
- Full datasets or specific dataset files can be consumed in AI pipelines
- Consume datasets in HPC or cloud compute resources anywhere
- Each dataset and model can be fetched from and pushed to different remote sources
- Dataset and model README.md files are saved in the remote location alongside all the related data
- Customizable key- and secret-based authentication for each remote location
Model management
With Laku, you can manage your Artificial Intelligence datasets and models easily.
- Versioning of Models with Tags
- Models can be published at whole-file and/or folder granularity
- Consume model tags in CI/CD processes in GitLab Runners, GitHub Actions, Jenkins, …
- Perform model and dataset management actions for backup or archiving purposes
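As a sketch of how a model tag could be consumed in CI/CD: the GitHub Actions job below fetches a tagged model before running evaluation. The `laku` subcommand, flags, and environment variable names shown are hypothetical illustrations, not documented Laku syntax.

```yaml
# Hypothetical GitHub Actions job; the `laku` invocation and the
# LAKU_* variable names are illustrative assumptions.
jobs:
  evaluate:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - name: Fetch tagged model from the remote
        env:
          # S3-style key/secret credentials for the configured remote
          LAKU_ACCESS_KEY: ${{ secrets.LAKU_ACCESS_KEY }}
          LAKU_SECRET_KEY: ${{ secrets.LAKU_SECRET_KEY }}
        run: laku model get my-model --tag v1.2.0   # hypothetical command
      - name: Run evaluation
        run: python evaluate.py --model ./my-model
```

Pinning a model by tag rather than "latest" keeps CI runs reproducible: the same pipeline revision always evaluates the same model version.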
Remote datalake management and administration
With Laku, you can also administer the remote datalake servers that store your datasets and models.
- Manage multiple remote dataset and model locations
- Create and manage storage buckets
- Manage user permissions and create key/token-based credentials for different usage roles (RBAC)