
Commit 548459a

Feat add filtering flinksql (#1619)
* Initial commit for FlinkSQL filtering; first four sections complete
* Completed validation section
* Completed the create sql resources section
* Completed tutorial
* Update makefile for expected results file
* Add new line end of expected log
* Fixes for running integration test
* update test for missing sql file
* review comments - fix typos
1 parent 72d71ec commit 548459a

64 files changed (+1158, -1 lines changed)


.semaphore/semaphore.yml

Lines changed: 4 additions & 0 deletions

@@ -496,3 +496,7 @@ blocks:
       - name: Flink SQL test for splitting
         commands:
           - make -C _includes/tutorials/splitting/flinksql/code tutorial
+      - name: Flink SQL test for filtering
+        commands:
+          - make -C _includes/tutorials/filtering/flinksql/code tutorial
+
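Each Semaphore block drives one tutorial by pointing `make -C` at that tutorial's code directory. A minimal, self-contained sketch of the invocation pattern (the directory and recipe below are stand-ins created on the fly, not the repository's real Makefile):

```shell
# Sketch of the `make -C DIR TARGET` pattern used by the CI blocks above.
# make changes into DIR before reading its Makefile, so one CI block per
# tutorial only differs in the directory it passes to -C.
workdir=$(mktemp -d)
# Recipe lines in a Makefile must start with a tab, hence the \t.
printf 'tutorial:\n\t@echo tutorial ran\n' > "$workdir/Makefile"
# -s suppresses command echo; --no-print-directory suppresses the
# "Entering directory" chatter that -C otherwise turns on.
result=$(make -s --no-print-directory -C "$workdir" tutorial)
echo "$result"
```

The same pattern scales to any number of tutorials: each new CI block reuses the `tutorial` target and only swaps the `-C` path.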
Lines changed: 155 additions & 0 deletions

dev:
  steps:
    - title: Prerequisites
      content:
        - action: skip
          render:
            file: shared/markup/dev/docker-prerequisite.adoc

    - title: Initialize the project
      content:
        - action: execute
          file: tutorial-steps/dev/init.sh
          render:
            file: tutorials/filtering/flinksql/markup/dev/init.adoc

    - title: Get Confluent Platform
      content:
        - action: make_file
          file: docker-compose.yml
          render:
            file: tutorials/filtering/flinksql/markup/dev/make-docker-compose.adoc

        - action: execute_async
          file: tutorial-steps/dev/docker-compose-up.sh
          render:
            file: tutorials/filtering/flinksql/markup/dev/start-compose.adoc

        - action: execute
          file: tutorial-steps/dev/wait-for-containers.sh
          render:
            skip: true

    - title: Write the program interactively using the CLI
      content:
        - action: docker_flinksql_cli_session
          container: flink-sql-client
          docker_bootup_file: tutorial-steps/dev/start-cli.sh
          column_width: 20
          render:
            file: tutorials/filtering/flinksql/markup/dev/start-cli.adoc
          stdin:
            - file: tutorial-steps/dev/create-all-publications.sql
              render:
                file: tutorials/filtering/flinksql/markup/dev/create-all-publications.adoc

            - file: tutorial-steps/dev/populate-publication-events.sql
              render:
                file: tutorials/filtering/flinksql/markup/dev/populate-publication-events.adoc

            - file: tutorial-steps/dev/transient-query.sql
              render:
                file: tutorials/filtering/flinksql/markup/dev/transient-query.adoc

            - file: tutorial-steps/dev/create-publications-by-author.sql
              render:
                file: tutorials/filtering/flinksql/markup/dev/create-publications-by-author.adoc

            - file: tutorial-steps/dev/populate-publications-by-author.sql
              render:
                file: tutorials/filtering/flinksql/markup/dev/populate-publications-by-author.adoc

          stdout:
            directory: tutorial-steps/dev/outputs

    - title: Validate output
      content:
        - action: execute
          file: tutorial-steps/dev/validate-publications-by-author.sh
          stdout: tutorial-steps/dev/outputs/validate-publications-by-author.log
          render:
            file: tutorials/filtering/flinksql/markup/dev/validate-publications-by-author.adoc

test:
  steps:
    - title: Decide what testing tools to use
      content:
        - action: skip
          render:
            file: tutorials/filtering/flinksql/markup/test/test-architecture.adoc

    - title: Create the test skeleton
      content:
        - action: execute
          file: tutorial-steps/test/make-test-dirs.sh
          render:
            file: tutorials/filtering/flinksql/markup/test/make-test-dirs.adoc

        - action: make_file
          file: build.gradle
          render:
            file: tutorials/filtering/flinksql/markup/test/make-build-gradle.adoc

        - action: execute
          file: tutorial-steps/test/gradle-wrapper.sh
          render:
            file: tutorials/filtering/flinksql/markup/test/make-gradle-wrapper.adoc

    - title: Create SQL resources
      content:
        - action: make_file
          file: src/test/resources/create-all-publications.sql.template
          render:
            file: tutorials/filtering/flinksql/markup/test/create-all-publications.sql.template.adoc

        - action: make_file
          file: src/test/resources/populate-publication-events.sql
          render:
            file: tutorials/filtering/flinksql/markup/test/create-resource-populate-publication-events.sql.adoc

        - action: make_file
          file: src/test/resources/create-publications-by-author.sql.template
          render:
            file: tutorials/filtering/flinksql/markup/test/create-resource-create-publications-by-author.sql.template.adoc

        - action: make_file
          file: src/test/resources/populate-publications-by-author.sql
          render:
            file: tutorials/filtering/flinksql/markup/test/create-resource-populate-publications-by-author.sql.adoc

        - action: make_file
          file: src/test/resources/query-publications-by-author.sql
          render:
            file: tutorials/filtering/flinksql/markup/test/create-resource-query-publications-by-author.sql.adoc

        - action: make_file
          file: src/test/resources/expected-publications-by-author.txt
          render:
            file: tutorials/filtering/flinksql/markup/test/create-resource-expected-publications-by-author.txt.adoc

    - title: Write a test
      content:
        - action: make_file
          file: src/test/java/io/confluent/developer/AbstractFlinkKafkaTest.java
          render:
            file: tutorials/filtering/flinksql/markup/test/make-test-base.adoc

        - action: make_file
          file: src/test/java/io/confluent/developer/FlinkSqlFilteringTest.java
          render:
            file: tutorials/filtering/flinksql/markup/test/make-test.adoc

    - title: Invoke the test
      content:
        - action: execute
          file: tutorial-steps/test/invoke-test.sh
          render:
            file: tutorials/filtering/flinksql/markup/test/invoke-test.adoc

ccloud:
  steps:
    - title: Run your app to Confluent Cloud
      content:
        - action: skip
          render:
            file: shared/markup/ccloud/try-ccloud.adoc
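The dev flow above creates a source table, populates it, runs a transient query, then materializes the filtered rows into a second table; the SQL files it pipes into the Flink SQL CLI are not included in this diff view. Purely as a hypothetical sketch (table name, column names, and the author literal are assumptions, not the committed file contents), the kind of transient filtering query that `transient-query.sql` suggests would look like:

```shell
# Hypothetical sketch only: a filtering SELECT of the shape the harness's
# transient-query.sql step implies. Column names and the author value are
# illustrative assumptions.
sql=$(cat <<'SQL'
SELECT title, author
FROM publication_events
WHERE author = 'George R. R. Martin';
SQL
)
echo "$sql"
```

In the Flink SQL CLI such a statement runs as a continuous query over the Kafka-backed table, emitting only the rows that satisfy the `WHERE` predicate.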

_data/tutorials.yaml

Lines changed: 3 additions & 1 deletion

@@ -28,6 +28,7 @@ filtering:
   kstreams: enabled
   kafka: enabled
   confluent: enabled
+  flinksql: enabled
 splitting:
   title: How to split a stream of events into substreams
   meta-description: split a stream of events into substreams
@@ -621,7 +622,8 @@ kafka-producer-application-callback:
     the Callback interface
   canonical: confluent
   slug: /kafka-producer-callback-application
-  question: How can you use callbacks with a KafkaProducer to handle responses from the broker?
+  question: How can you use callbacks with a KafkaProducer to handle responses from
+    the broker?
   introduction: You have an application using an Apache Kafka producer, and you want
     an automatic way of handling responses after producing records. In this tutorial,
     you'll learn how to use the Callback interface to automatically handle responses
Lines changed: 7 additions & 0 deletions

tutorial-steps/dev/outputs/

# Ignore Gradle project-specific cache directory
.gradle

# Ignore Gradle build output directory
build
Lines changed: 11 additions & 0 deletions

STEPS_DIR := tutorial-steps
DEV_OUTPUTS_DIR := $(STEPS_DIR)/dev/outputs
TEMP_DIR := $(shell mktemp -d)
SEQUENCE := "dev, test, ccloud"

tutorial:
	rm -r $(DEV_OUTPUTS_DIR) || true
	mkdir $(DEV_OUTPUTS_DIR)
	harness-runner ../../../../../_data/harnesses/filtering/flinksql.yml $(TEMP_DIR) $(SEQUENCE)
	diff --strip-trailing-cr $(STEPS_DIR)/dev/expected-books-by-author.log $(DEV_OUTPUTS_DIR)/validate-publications-by-author.log
	reset
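The `tutorial` target's final check compares the captured CLI output against a checked-in expected log. A small self-contained sketch of that `diff --strip-trailing-cr` comparison (the file contents below are illustrative; the flag is what makes the check tolerant of CRLF line endings in output captured from containers):

```shell
# Sketch of the Makefile's validation step: expected output has Unix line
# endings, the captured log may have Windows (CRLF) endings, and
# --strip-trailing-cr removes the carriage returns before comparing.
expected=$(mktemp)
actual=$(mktemp)
printf 'George R. R. Martin|Fire & Blood\n'   > "$expected"  # LF only
printf 'George R. R. Martin|Fire & Blood\r\n' > "$actual"    # CRLF
if diff --strip-trailing-cr "$expected" "$actual" > /dev/null; then
  verdict=PASS
else
  verdict=FAIL
fi
echo "$verdict"
```

Without the flag, a plain `diff` of these two files would fail on the invisible carriage return, which is exactly the kind of flaky CI failure this guards against.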
Lines changed: 34 additions & 0 deletions

buildscript {
    repositories {
        mavenCentral()
    }
}

plugins {
    id "java"
    id "idea"
}

sourceCompatibility = JavaVersion.VERSION_11
targetCompatibility = JavaVersion.VERSION_11
version = "0.0.1"

repositories {
    mavenCentral()
}

dependencies {
    testImplementation "com.google.guava:guava:31.1-jre"
    testImplementation "junit:junit:4.13.2"
    testImplementation 'org.testcontainers:testcontainers:1.17.6'
    testImplementation 'org.testcontainers:kafka:1.17.6'
    testImplementation "org.apache.flink:flink-sql-connector-kafka:1.17.1"
    testImplementation "org.apache.flink:flink-sql-avro-confluent-registry:1.17.1"
    testImplementation "org.apache.flink:flink-test-utils:1.17.1"
    testImplementation "org.apache.flink:flink-test-utils-junit:1.17.1"
    testImplementation 'org.apache.flink:flink-json:1.17.1'
    testImplementation "org.apache.flink:flink-table-api-java-bridge:1.17.0"
    testImplementation "org.apache.flink:flink-table-planner_2.12:1.17.1"
    testImplementation "org.apache.flink:flink-table-planner_2.12:1.17.1:tests"
    testImplementation "org.apache.flink:flink-statebackend-rocksdb:1.17.1"
}
Lines changed: 59 additions & 0 deletions

version: '2'
services:
  broker:
    image: confluentinc/cp-kafka:7.4.1
    hostname: broker
    container_name: broker
    ports:
      - 29092:29092
    environment:
      KAFKA_BROKER_ID: 1
      KAFKA_LISTENER_SECURITY_PROTOCOL_MAP: PLAINTEXT:PLAINTEXT,PLAINTEXT_HOST:PLAINTEXT,CONTROLLER:PLAINTEXT
      KAFKA_ADVERTISED_LISTENERS: PLAINTEXT://broker:9092,PLAINTEXT_HOST://localhost:29092
      KAFKA_OFFSETS_TOPIC_REPLICATION_FACTOR: 1
      KAFKA_GROUP_INITIAL_REBALANCE_DELAY_MS: 0
      KAFKA_TRANSACTION_STATE_LOG_MIN_ISR: 1
      KAFKA_TRANSACTION_STATE_LOG_REPLICATION_FACTOR: 1
      KAFKA_PROCESS_ROLES: broker,controller
      KAFKA_NODE_ID: 1
      KAFKA_CONTROLLER_QUORUM_VOTERS: 1@broker:29093
      KAFKA_LISTENERS: PLAINTEXT://broker:9092,CONTROLLER://broker:29093,PLAINTEXT_HOST://0.0.0.0:29092
      KAFKA_INTER_BROKER_LISTENER_NAME: PLAINTEXT
      KAFKA_CONTROLLER_LISTENER_NAMES: CONTROLLER
      KAFKA_LOG_DIRS: /tmp/kraft-combined-logs
      CLUSTER_ID: MkU3OEVBNTcwNTJENDM2Qk
  flink-sql-client:
    image: cnfldemos/flink-sql-client-kafka:1.16.0-scala_2.12-java11
    hostname: flink-sql-client
    container_name: flink-sql-client
    depends_on:
      - flink-jobmanager
    environment:
      FLINK_JOBMANAGER_HOST: flink-jobmanager
    volumes:
      - ./settings/:/settings
  flink-jobmanager:
    image: cnfldemos/flink-kafka:1.16.0-scala_2.12-java11
    hostname: flink-jobmanager
    container_name: flink-jobmanager
    ports:
      - 9081:9081
    command: jobmanager
    environment:
      - |
        FLINK_PROPERTIES=
        jobmanager.rpc.address: flink-jobmanager
        rest.bind-port: 9081
  flink-taskmanager:
    image: cnfldemos/flink-kafka:1.16.0-scala_2.12-java11
    hostname: flink-taskmanager
    container_name: flink-taskmanager
    depends_on:
      - flink-jobmanager
    command: taskmanager
    scale: 1
    environment:
      - |
        FLINK_PROPERTIES=
        jobmanager.rpc.address: flink-jobmanager
        taskmanager.numberOfTaskSlots: 10
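The broker advertises two listeners so that containers on the Compose network and clients on the host each receive an address they can actually reach: `broker:9092` resolves inside the network, while port 29092 is published to the host as `localhost:29092`. A small shell sketch that pulls the two routes out of the `KAFKA_ADVERTISED_LISTENERS` value above:

```shell
# The advertised-listeners value from the docker-compose.yml above.
advertised="PLAINTEXT://broker:9092,PLAINTEXT_HOST://localhost:29092"

# Split on commas, pick each listener by name, strip the protocol prefix.
internal=$(echo "$advertised" | tr ',' '\n' | grep '^PLAINTEXT://' | sed 's|.*//||')
host=$(echo "$advertised" | tr ',' '\n' | grep '^PLAINTEXT_HOST://' | sed 's|.*//||')

echo "inside the compose network -> $internal"
echo "from the host machine      -> $host"
```

A client always connects back to whichever address the broker advertises for the listener it first contacted, which is why a single address cannot serve both network contexts.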
Binary file not shown.
Lines changed: 5 additions & 0 deletions
Original file line numberDiff line numberDiff line change
@@ -0,0 +1,5 @@
1+
distributionBase=GRADLE_USER_HOME
2+
distributionPath=wrapper/dists
3+
distributionUrl=https\://services.gradle.org/distributions/gradle-7.5.1-bin.zip
4+
zipStoreBase=GRADLE_USER_HOME
5+
zipStorePath=wrapper/dists

0 commit comments