@@ -10,6 +10,41 @@ Expect changes in the API and internals.
## Cloning

If you want to run examples, make sure you have [git-lfs](https://git-lfs.github.com) installed when you clone.

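A minimal sketch of a fresh clone with LFS enabled (the repository URL is an assumption here, inferred from the client-library links further down):

```sh
# One-time LFS setup, then clone; LFS-tracked example files are pulled during the clone.
git lfs install
git clone https://github.com/RedisAI/RedisAI.git
```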
+ ## Quickstart
+
+ 1. [Docker](#docker)
+ 2. [Build](#building)
+
+ ## Docker
+
+ To quickly try out RedisAI, launch an instance using Docker:
+
+ ```sh
+ docker run -p 6379:6379 -it --rm redisai/redisai
+ ```
+
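As a quick sanity check that the container is up and the module registered (a minimal sketch, assuming redis-cli is available on the host and port 6379 is the one mapped above):

```sh
# PING should answer PONG; MODULE LIST should include the RedisAI module.
redis-cli -h localhost -p 6379 PING
redis-cli -h localhost -p 6379 MODULE LIST
```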
+ ### Give it a try
+
+ On the client, load the model:
+ ```sh
+ redis-cli -x AI.MODELSET foo TF CPU INPUTS a b OUTPUTS c < examples/models/graph.pb
+ ```
+
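The third argument selects the device the model runs on. A variant of the same call targeting a GPU (a sketch, assuming the module was built with GPU support; the example above pins the model to CPU):

```sh
# Same graph, same input/output names, but executed on the GPU.
redis-cli -x AI.MODELSET foo TF GPU INPUTS a b OUTPUTS c < examples/models/graph.pb
```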
+ Then create the input tensors, run the computation graph and get the output tensor (see `load_model.sh`). Note the signatures:
+ * `AI.TENSORSET tensor_key data_type dim1..dimN [BLOB data | VALUES val1..valN]`
+ * `AI.MODELRUN graph_key INPUTS input_key1 ... OUTPUTS output_key1 ...`
+ ```sh
+ redis-cli
+ > AI.TENSORSET bar FLOAT 2 VALUES 2 3
+ > AI.TENSORSET baz FLOAT 2 VALUES 2 3
+ > AI.MODELRUN foo INPUTS bar baz OUTPUTS jez
+ > AI.TENSORGET jez VALUES
+ 1) FLOAT
+ 2) 1) (integer) 2
+ 3) 1) "4"
+    2) "9"
+ ```
+
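The example above uses the VALUES form of AI.TENSORSET; per the signature, the raw tensor data can also be passed as a BLOB. A sketch of that variant (the file `tensor.raw` is hypothetical and would need to hold the two float values in binary form):

```sh
# -x feeds stdin as the last argument, so the file contents become the BLOB payload.
redis-cli -x AI.TENSORSET bar FLOAT 2 BLOB < tensor.raw
```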
## Building

This will check out, build, and download the libraries for the backends
(TensorFlow and PyTorch) for your platform.
@@ -30,15 +65,7 @@
cd ..
```

- ## Docker
-
- To quickly tryout RedisAI, launch an instance using docker:
-
- ```sh
- docker run -p 6379:6379 -it --rm redisai/redisai
- ```
-
- ## Running the server
+ ### Running the server

You will need a redis-server version 4.0.9 or greater. This should be
available in most recent distributions:
@@ -54,25 +81,15 @@ To start redis with the RedisAI module loaded:
redis-server --loadmodule build/redisai.so
```

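Alternatively (a sketch assuming an absolute path to the built module), the module can be loaded from redis.conf rather than the command line:

```
# In redis.conf: load RedisAI at server startup.
loadmodule /path/to/RedisAI/build/redisai.so
```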
- On the client, load the model
- ```
- redis-cli -x AI.MODELSET foo TF CPU INPUTS a b OUTPUTS c < graph.pb
- ```
+ ## Client libraries
+
+ Some languages have client libraries that provide support for RedisAI's commands:
+
+ | Project | Language | License | Author | URL |
+ | ------- | -------- | ------- | ------ | --- |
+ | JRedisAI | Java | BSD-3 | [RedisLabs](https://redislabs.com/) | [GitHub](https://github.com/RedisAI/JRedisAI) |
+ | redisai-py | Python | BSD-2 | [RedisLabs](https://redislabs.com/) | [GitHub](https://github.com/RedisAI/redisai-py) |
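For instance (a hypothetical sketch, assuming the Python client is published on PyPI under the name `redisai`), installation is a single pip command:

```sh
# Assumption: the PyPI package name matches the redisai-py project.
pip install redisai
```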

- Then create the input tensors, run the computation graph and get the output tensor (see `load_model.sh`). Note the signatures:
- * `AI.TENSORSET tensor_key data_type dim1..dimN [BLOB data | VALUES val1..valN]`
- * `AI.MODELRUN graph_key INPUTS input_key1 ... OUTPUTS output_key1 ...`
- ```
- redis-cli
- > AI.TENSORSET bar FLOAT 2 VALUES 2 3
- > AI.TENSORSET baz FLOAT 2 VALUES 2 3
- > AI.MODELRUN foo INPUTS bar baz OUTPUTS jez
- > AI.TENSORGET jez VALUES
- 1) FLOAT
- 2) 1) (integer) 2
- 3) 1) "4"
-    2) "9"
- ```

## Backend Dependency