Commit 94fcfdd

[BitSail-269][Doc] Adjust the directory structure of documents (#276)
Adjust the directory structure of connector documents
1 parent f609ccf commit 94fcfdd

85 files changed: 507 additions & 596 deletions


website/en/documents/components/conversion/introduction.md

Lines changed: 1 addition & 1 deletion
@@ -6,7 +6,7 @@ Parent document: [bitsail-components](../README.md)
 
 ## Content
 
-When ***BitSail*** transmits data to a specified data source, it needs to convert the intermediate format (`bitsail rows`) used in the transmission process into a data type acceptable to the data source.
+When **BitSail** transmits data to a specified data source, it needs to convert the intermediate format (`bitsail rows`) used in the transmission process into a data type acceptable to the data source.
 This module provides convenient tools for converting.
 
 - In this context, `bitsail rows` means `com.bytedance.bitsail.common.column.Column` data wrapped by `org.apache.flink.types.Row`

website/en/documents/components/format/introduction.md

Lines changed: 1 addition & 1 deletion
@@ -6,7 +6,7 @@ Parent document: [bitsail-components](../README.md)
 
 ## Content
 
-When ***BitSail*** uses flink as the engine, it uses `flink rows` as intermediate format.
+When **BitSail** uses flink as the engine, it uses `flink rows` as intermediate format.
 So developers need to convert data from data source into `flink rows`.
 This module offers convenient methods to convert some kinds of data into `flink rows`.
 The specific supported formats are as follows:

website/en/documents/connectors/README.md

Lines changed: 5 additions & 3 deletions
@@ -13,15 +13,17 @@ dir:
 - [Druid connector](druid/druid.md)
 - [Elasticsearch connector](elasticsearch/elasticsearch.md)
 - [FTP/SFTP connector](ftp/ftp.md)
+- [FTP/SFTP-v1 connector](ftp/v1/ftp-v1.md)
 - [Hadoop connector](hadoop/hadoop.md)
 - [HBase connector](hbase/hbase.md)
 - [Hive connector](hive/hive.md)
 - [Hudi connector](hudi/hudi.md)
-- [Jdbc connector](Jdbc/jdbc.md)
+- [JDBC connector](jdbc/jdbc.md)
 - [Kafka connector](kafka/kafka.md)
 - [Kudu connector](kudu/kudu.md)
 - [LarkSheet connector](larksheet/larksheet.md)
 - [MongoDB connector](mongodb/mongodb.md)
-- [Redis connector](redis/redis-v1.md)
+- [Redis connector](redis/redis.md)
+- [Redis-v1 connector](redis/v1/redis-v1.md)
 - [RocketMQ connector](rocketmq/rocketmq.md)
-- [StreamingFile connector (Hdfs streaming connector)](StreamingFile/StreamingFile.md)
+- [StreamingFile(HDFS streaming) connector](streamingfile/streamingfile.md)

website/en/documents/connectors/clickhouse/clickhouse-example.md

Lines changed: 2 additions & 2 deletions
@@ -1,6 +1,6 @@
 # ClickHouse connector example
 
-Parent document: [clickhouse-connector](./clickhouse.md)
+Parent document: [ClickHouse connector](./clickhouse.md)
 
 ## ClickHouse configuration
 
@@ -11,7 +11,7 @@ Account information:
 - Username: default
 - Password: 1234567
 
-Library tables to write to:
+Target database and table:
 - Database name: default
 - Table name: test_ch_table

website/en/documents/connectors/clickhouse/clickhouse.md

Lines changed: 3 additions & 11 deletions
@@ -1,11 +1,11 @@
 # ClickHouse connector
 
-Parent document: [connectors](../README.md)
+Parent document: [Connectors](../README.md)
 
 **BitSail** ClickHouse connector can be used to read data in ClickHouse, mainly supports the following functions:
 
 - Support batch reading of ClickHouse tables
-- JDBC Driver version used: 0.3.2-patch11
+- JDBC Driver version: 0.3.2-patch11
 
 ## Maven dependency
 
@@ -72,14 +72,6 @@ Read connector parameters are configured in `job.reader`, please pay attention to
 
 #### Optional parameters
 
-| user_name | No | | Username to access ClickHouse services |
-| password | No | | The password of the above user |
-| split_field | No | | Batch query fields, only support Int8 - Int64 and UInt8 - UInt32 integer types |
-| split_config | No | | The configuration for batch query according to `split_field` field, including initial value, maximum value and query times, <p/> For example: `{"lower_bound": 0, "upper_bound": 10000, "split_num": 3}` |
-| sql_filter | No | | The filter condition of the query, such as `( id % 2 == 0 )`, will be spliced into the WHERE clause of the query SQL |
-| reader_parallelism_num | No | | Concurrency for reading ClickHouse tables |
-
-
 | Parameter name | Required | Optional value | Description |
 |:-------------------|:---------|:---------------|:---------------------------------------------------|
 | user_name | no | | Username to access ClickHouse services |
@@ -91,4 +83,4 @@ Read connector parameters are configured in `job.reader`, please pay attention to
 
 ## Related documents
 
-Configuration example: [clickhouse-connector-example](./clickhouse-example.md)
+Configuration example: [ClickHouse connector example](./clickhouse-example.md)
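
To make the optional batch-splitting parameters above concrete, here is a minimal `job.reader` sketch. Only `user_name`, `password`, `split_field`, `split_config`, `sql_filter`, and `reader_parallelism_num` come from the table above; the `class` name, `jdbc_url`, `db_name`, and `table_name` shown here are illustrative assumptions (the sample account and table reuse the values from clickhouse-example.md), so treat clickhouse.md and clickhouse-example.md as the authoritative reference.

```json
{
  "job": {
    "reader": {
      "class": "com.bytedance.bitsail.connector.clickhouse.source.ClickhouseSource",
      "jdbc_url": "jdbc:clickhouse://127.0.0.1:8123",
      "db_name": "default",
      "table_name": "test_ch_table",
      "user_name": "default",
      "password": "1234567",
      "split_field": "id",
      "split_config": "{\"lower_bound\": 0, \"upper_bound\": 10000, \"split_num\": 3}",
      "sql_filter": "( id % 2 == 0 )",
      "reader_parallelism_num": 1
    }
  }
}
```

Whether `split_config` is passed as an embedded JSON string (as sketched here) or as a nested object is also an assumption; the table above only fixes its fields (`lower_bound`, `upper_bound`, `split_num`).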

website/en/documents/connectors/doris/doris-example.md

Lines changed: 2 additions & 3 deletions
@@ -1,7 +1,6 @@
-# Doris connector examples
-
-Parent documents: [doris-connector](./doris.md)
+# Doris connector example
 
+Parent documents: [Doris connector](./doris.md)
 
 ## Doris cluster info

website/en/documents/connectors/doris/doris.md

Lines changed: 16 additions & 23 deletions
@@ -1,13 +1,12 @@
 # Doris connector
 
-Parent document: [connectors](../README.md)
+Parent document: [Connectors](../README.md)
 
-***BitSail*** Doris connector supports writing doris. The main function points are as follows:
+**BitSail** Doris connector supports writing doris. The main function points are as follows:
 
 - Use StreamLoad to write doris.
 - Support firstly create and then write partition
 
-
 ## Maven dependency
 
 ```xml
@@ -48,7 +47,6 @@ It supports common data type in doris:
 
 ### Parameters
 
-
 The following mentioned parameters should be added to `job.writer` block when using, for example:
 
 ```json
@@ -65,22 +63,22 @@ The following mentioned parameters should be added to `job.writer` block when using, for example:
 
 #### Necessary parameters
 
-| Param name | Required | Optional value | Description |
-|:-----------------------------|:---------|:---------------|:---------------------------------------------------------------------------------------------------------------|
-| class | yes | | Doris writer class name, `com.bytedance.bitsail.connector.doris.sink.DorisSink` |
-| fe_hosts | yes | | Doris FE address, multi addresses separated by comma |
-| mysql_hosts | yes | | Doris jdbc query address , multi addresses separated by comma |
-| user| yes | | Doris account user |
-| password| yes | | Doris account password, can be empty |
-| db_name| yes | | database to write |
-| table_name| yes | | table to write |
-| partitions | Yes if target table has partition | | target partition to write |
+| Param name | Required | Optional value | Description |
+|:--------------------|:---------|:---------------|:-------------------------------------------------------------------|
+| class | yes | | Doris writer class name, `com.bytedance.bitsail.connector.doris.sink.DorisSink` |
+| fe_hosts | yes | | Doris FE address, multi addresses separated by comma |
+| mysql_hosts | yes | | Doris jdbc query address , multi addresses separated by comma |
+| user | yes | | Doris account user |
+| password | yes | | Doris account password, can be empty |
+| db_name | yes | | database to write |
+| table_name | yes | | table to write |
+| partitions | Yes if target table has partition | | target partition to write |
 | table_has_partition | Yes if target table does not have partition | | True if target table does not have partition |
-| table_model | yes | UNIQUE | Table model of target table. Currently only support unique table. |
+| table_model | yes | UNIQUE | Table model of target table. Currently only support unique table. |
 
 <!--AGGREGATE<br/>DUPLICATE-->
 
-Notice, `partitions` has following requirements:
+Notice, `partitions` has the following requirements:
 1. You can determine multi partitions
 2. Each partition should contain:
    1. `name`: name of the partition
@@ -116,9 +114,6 @@ partitions example:
 }
 ```
 
-
-
-
 #### Optional parameters
 
 | Param name | Required | Optional value | Description |
@@ -134,8 +129,6 @@ partitions example:
 | csv_field_delimiter | no | | field delimiter used in csv, default "," |
 | csv_line_delimiter | no | | line delimiter used in csv, default "\n" |
 
+## Related documents
 
-## Related document
-
-
-Configuration examples: [doris-connector-example](./doris-example.md)
+Configuration examples: [Doris connector example](./doris-example.md)
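
For orientation, a minimal `job.writer` sketch assembled from the necessary-parameter table above, assuming a non-partitioned target table (so `table_has_partition` is set instead of `partitions`). The parameter names and the `class` value come from the table; the host addresses, account, database, and table names are illustrative assumptions.

```json
{
  "job": {
    "writer": {
      "class": "com.bytedance.bitsail.connector.doris.sink.DorisSink",
      "fe_hosts": "127.0.0.1:8030,127.0.0.2:8030",
      "mysql_hosts": "127.0.0.1:9030,127.0.0.2:9030",
      "user": "root",
      "password": "",
      "db_name": "test_db",
      "table_name": "test_doris_table",
      "table_has_partition": true,
      "table_model": "UNIQUE"
    }
  }
}
```

For a partitioned table, `partitions` replaces `table_has_partition`; per the notice above, each partition entry carries at least a `name`.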

website/en/documents/connectors/druid/druid-example.md

Lines changed: 2 additions & 2 deletions
@@ -1,6 +1,6 @@
-# Druid connector configuration examples
+# Druid connector example
 
-Parent document: [druid-connector](./druid.md)
+Parent document: [Druid connector](./druid.md)
 
 ## Druid writer example

website/en/documents/connectors/druid/druid.md

Lines changed: 4 additions & 4 deletions
@@ -1,8 +1,8 @@
 # Druid connector
 
-Parent document: [connectors](../README.md)
+Parent document: [Connectors](../README.md)
 
-***BitSail*** Druid connector supports writing druid data-sources.
+**BitSail** Druid connector supports writing druid data-sources.
 
 ## Maven dependency
 
@@ -62,6 +62,6 @@ The following mentioned parameters should be added to `job.writer` block when using, for example:
 
 -----
 
-## Related document
+## Related documents
 
-Configuration example: [druid-connector-example](./druid-example.md)
+Configuration example: [Druid connector example](./druid-example.md)

website/en/documents/connectors/elasticsearch/elasticsearch-example.md

Lines changed: 2 additions & 2 deletions
@@ -1,6 +1,6 @@
-# Elasticsearch connector examples
+# Elasticsearch connector example
 
-Parent document: [elasticsearch-connector](./elasticsearch.md)
+Parent document: [Elasticsearch connector](./elasticsearch.md)
 
 
 The following configuration shows how to organize parameter configuration to write the specified Elasticsearch index.
