AIStore hosts a variety of sample transformers in the form of Docker images to be used with ETL workflows on AIStore via the init spec
functionality.
| Transformer | Language | Communication Mechanisms | Description |
| --- | --- | --- | --- |
| `echo` | `python:3.11` | `hpull`, `hpush`, `hrev` | Returns the original data, with an MD5 sum in the response headers. |
| `go_echo` | `golang:1.21` | `hpull`, `hpush`, `hrev` | Returns the original data, with an MD5 sum in the response headers. |
| `hello_world` | `python:3.11` | `hpull`, `hpush`, `hrev` | Returns the `Hello World!` string on any request. |
| `md5` | `python:3.11` | `hpull`, `hpush`, `hrev` | Returns the MD5 sum of the original data as the response. |
| `tar2tf` | `golang:1.21` | `hrev` | Returns the transformed TensorFlow-compatible data for the input TAR files. |
| `compress` | `python:3.11` | `hpull`, `hpush`, `hrev` | Returns the compressed or decompressed data using `gzip` or `bz2`. |
| `NeMo/FFmpeg` | `python:3.11` | `hpull`, `hpush`, `hrev` | Returns audio files in WAV format with control over Audio Channels (`AC`) and Audio Rate (`AR`). |
| `keras` | `python:slim` | `hpull`, `hpush`, `hrev` | Returns the transformed images using Keras pre-processing. |
| `torchvision` | `python:slim` | `hpull`, `hpush`, `hrev` | Returns the transformed images using Torchvision pre-processing. |
The following sections demonstrate initializing ETLs on AIStore using the provided sample transformers.
For detailed usage information regarding the Tar2TF, Compress, NeMo/FFmpeg, Keras, and Torchvision transformers and their optional parameters, please refer to the README documents located in their respective sub-directories.
ETLs on AIStore require the installation and use of Kubernetes.
For more information on AIStore Kubernetes deployment options, refer here.
The basic procedure is as follows:
- Change directory into the sub-directory of the desired sample transformer.
- Export the communication mechanism (and optional arguments, if any) as environment variables.
- Substitute the environment variables into the provided YAML specification file (`pod.yaml`).
- Initialize the ETL with the AIStore CLI, providing the path to the generated YAML specification file.
The following demonstrates basic usage:
```sh
# Change Directory (to Desired Sample Transformer)
cd ais-etl/transformers/md5

# Export Environment Variables for Communication Mechanism (& Any Additional Arguments)
export COMMUNICATION_TYPE="hpull://"

# Substitute Environment Variables in YAML Specification
eval "echo \"$(cat pod.yaml)\"" > md5_pod_config.yaml

# Initialize ETL on AIStore via CLI
ais etl init spec --name md5-etl --from-file "./md5_pod_config.yaml"
```
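The `eval "echo \"$(cat pod.yaml)\""` step simply expands the exported environment variables into the template. The same substitution can be sketched in Python (an illustrative stand-in, assuming the template references variables in `$VAR`/`${VAR}` form, which `os.path.expandvars` expands from the environment; the filename and template line below are hypothetical):

```python
import os

# Normally set via `export COMMUNICATION_TYPE="hpull://"` in the shell.
os.environ["COMMUNICATION_TYPE"] = "hpull://"

# Stand-in for the contents of pod.yaml in the transformer's sub-directory.
template = 'communication_type: "$COMMUNICATION_TYPE"\n'

# Expand environment variables, mirroring the shell eval/echo pattern.
rendered = os.path.expandvars(template)

with open("md5_pod_config.yaml", "w") as f:
    f.write(rendered)

print(rendered)  # → communication_type: "hpull://"
```

Tools such as `envsubst` accomplish the same substitution without the `eval` quoting caveats.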
The YAML specification files for the sample transformers are provided as templates in the Python SDK.
The basic procedure is as follows:
- Import the desired sample transformer's YAML specification template.
- Format the communication mechanism (and optional arguments, if any) into the template.
- Initialize the ETL with the AIStore SDK, providing the formatted template.
The following demonstrates basic usage:
```python
import os

from aistore.sdk.client import Client
from aistore.sdk.etl_templates import ECHO
from aistore.sdk.etl_const import ETL_COMM_HPULL

AIS_ENDPOINT = os.environ.get("AIS_ENDPOINT")
client = Client(AIS_ENDPOINT)

echo_etl_template = ECHO.format(communication_type=ETL_COMM_HPULL)
client.etl("echo-etl").init_spec(template=echo_etl_template, communication_type=ETL_COMM_HPULL)
```
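The templates in `aistore.sdk.etl_templates` are ordinary Python strings with a `{communication_type}` placeholder, so `.format(...)` fills the chosen mechanism into the Pod specification. A minimal sketch with a hypothetical stand-in template (the real `ECHO` template is a complete Kubernetes Pod YAML specification):

```python
# Stand-in for a template like those in aistore.sdk.etl_templates.
SAMPLE_TEMPLATE = """
apiVersion: v1
kind: Pod
metadata:
  annotations:
    communication_type: "{communication_type}"
"""

# Mirrors ECHO.format(communication_type=ETL_COMM_HPULL).
filled = SAMPLE_TEMPLATE.format(communication_type="hpull://")
print(filled)
```

After formatting, the annotation line reads `communication_type: "hpull://"`, which is what the cluster inspects when deploying the transformer Pod.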
The sample transformer images on DockerHub are maintained via the ais-etl GitHub repository.
To contribute, push any changes to the sample transformers to the GitHub repository. The existing GitHub workflows will build the updated sample transformers and push them to the DockerHub repository.
For more information, refer to the GitHub workflow files here.