# Apache Airflow Python Client

# Overview

To facilitate management, Apache Airflow supports a range of REST API endpoints across its
objects.
This section provides an overview of the API design, methods, and supported use cases.

Most of the endpoints accept `JSON` as input and return `JSON` responses.
This means that you must usually add the following headers to your request:

```
Content-type: application/json
Accept: application/json
```
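
For illustration, here is a minimal sketch (not part of the original description) that sends these headers with the `requests` library; the host and credentials are placeholder assumptions:

```python
import requests

# Placeholder values - substitute the URL and credentials of your own deployment.
BASE_URL = "http://localhost:8080/api/v1"
AUTH = ("admin", "admin")

# Both headers declare that we send and expect JSON payloads.
headers = {"Content-Type": "application/json", "Accept": "application/json"}

response = requests.get(f"{BASE_URL}/dags", headers=headers, auth=AUTH)
print(response.status_code, response.json())
```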

## CRUD Operations

The platform supports **Create**, **Read**, **Update**, and **Delete** operations on most resources.
You can review the standards for these operations and their standard parameters below.

Some endpoints have special behavior as exceptions.

The response usually returns a `200 OK` response code upon success, with an object containing a list
of resources' metadata in the response body.

When reading resources, some common query parameters are usually available. e.g.:

```
v1/connections?limit=25&offset=25
```
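
As an illustration (not from the original docs), the following sketch pages through connections 25 at a time with the `requests` library; the host and credentials are placeholder assumptions:

```python
import requests

BASE_URL = "http://localhost:8080/api/v1"  # assumed test deployment
AUTH = ("admin", "admin")  # assumed basic-auth credentials

offset, limit = 0, 25
while True:
    # limit and offset are the common pagination query parameters described above.
    page = requests.get(
        f"{BASE_URL}/connections",
        params={"limit": limit, "offset": offset},
        auth=AUTH,
    ).json()
    connections = page.get("connections", [])
    if not connections:
        break
    for conn in connections:
        print(conn["connection_id"])
    offset += limit
```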

### Delete

Deleting a resource requires the resource `id` and is typically executed via an HTTP `DELETE` request.
The response usually returns a `204 No Content` response code upon success.
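
For example, a hedged sketch of deleting a connection by its id with `requests` (the connection id, host, and credentials are placeholders):

```python
import requests

BASE_URL = "http://localhost:8080/api/v1"  # assumed test deployment
AUTH = ("admin", "admin")  # assumed basic-auth credentials

# DELETE /connections/{connection_id} removes a single connection by id.
response = requests.delete(f"{BASE_URL}/connections/my_connection_id", auth=AUTH)
print(response.status_code)  # 204 indicates a successful deletion
```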

## Conventions

- Names are consistent between URL parameter name and field name.
- Field names are in snake_case.

```json
{
    "name": "string",
    "slots": 0,
    "occupied_slots": 0,
    "used_slots": 0,
    "queued_slots": 0,
    "open_slots": 0
}
```
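
As a hedged illustration of these conventions, the sketch below creates a pool with a snake_case JSON payload; the host, credentials, and pool values are assumptions:

```python
import requests

BASE_URL = "http://localhost:8080/api/v1"  # assumed test deployment
AUTH = ("admin", "admin")  # assumed basic-auth credentials

# Field names in the payload are snake_case, matching the convention above.
payload = {"name": "example_pool", "slots": 5}
response = requests.post(f"{BASE_URL}/pools", json=payload, auth=AUTH)
print(response.status_code, response.json())
```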

## Update Mask

The update request ignores any fields that aren't specified in the field mask, leaving them with
their current values.

Example:

```python
import json

import requests

resource = requests.get("/resource/my-id").json()
resource["my_field"] = "new-value"
requests.patch("/resource/my-id?update_mask=my_field", data=json.dumps(resource))
```

## Versioning and Endpoint Lifecycle

# Trying the API

Note that you will need to pass credentials data.

For example, here is how to pause a DAG with [curl](https://curl.haxx.se/), when basic authorization is used:

```bash
curl -X PATCH 'https://example.com/api/v1/dags/{dag_id}?update_mask=is_paused' \
-H 'Content-Type: application/json' \
--user "username:password" \
-d '{
    "is_paused": true
}'
```
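
The same call can be made from Python; a rough `requests` equivalent (host, DAG id, and credentials are placeholder assumptions) might look like this:

```python
import requests

# Placeholders - substitute your own host, DAG id, and credentials.
response = requests.patch(
    "https://example.com/api/v1/dags/example_dag_id",
    params={"update_mask": "is_paused"},
    auth=("username", "password"),
    json={"is_paused": True},
)
print(response.status_code, response.json())
```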

Using a graphical tool such as [Postman](https://www.postman.com/) or [Insomnia](https://insomnia.rest/),
it is possible to import the API specifications directly:

1. Download the API specification by clicking the **Download** button at the top of this document.
2. Import the JSON specification in the graphical tool of your choice.

  - In *Postman*, you can click the **import** button at the top
  - With *Insomnia*, you can just drag-and-drop the file on the UI

# Authentication

If you want to check which auth backend is currently set, you can use the
`airflow config get-value api auth_backends` command as in the example below.

```bash
$ airflow config get-value api auth_backends
airflow.api.auth.backend.basic_auth
```

The default is to deny all requests.

For details on configuring the authentication, see the Airflow API authentication documentation.

This means that the server encountered an unexpected condition that prevented it from
fulfilling the request.

This Python package is automatically generated by the [OpenAPI Generator](https://openapi-generator.tech) project:

- API version: 2.9.0
- Package version: 2.9.0
- Build package: org.openapitools.codegen.languages.PythonClientCodegen

For more information, please visit [https://airflow.apache.org](https://airflow.apache.org)

## Requirements

Python >=3.8

## Installation & Usage

### pip install

You can install the client using standard Python installation tools. It is hosted
on PyPI under the `apache-airflow-client` package id, so the easiest way to get the latest
version is to run:

```bash
pip install apache-airflow-client
```

If the Python package is hosted on a repository, you can install directly using:

```bash
pip install git+https://github.com/apache/airflow-client-python.git
```

### Import check

Then import the package:

```python
import airflow_client.client
```

## Getting Started

Please follow the [installation procedure](#installation--usage) and then run the following:

```python
import time
import airflow_client.client
from pprint import pprint
from airflow_client.client.api import config_api
from airflow_client.client.model.config import Config
from airflow_client.client.model.error import Error

# Defining the host is optional and defaults to /api/v1
# See configuration.py for a list of all supported configuration parameters.
configuration = airflow_client.client.Configuration(host="/api/v1")

# The client must configure the authentication and authorization parameters
# in accordance with the API server security policy.
# Examples for each auth method are provided below, use the example that
# satisfies your auth use case.

# Configure HTTP basic authorization: Basic
configuration = airflow_client.client.Configuration(
    username="YOUR_USERNAME", password="YOUR_PASSWORD"
)

# Enter a context with an instance of the API client
with airflow_client.client.ApiClient(configuration) as api_client:
    # Create an instance of the API class
    api_instance = config_api.ConfigApi(api_client)

    try:
        # Get current configuration
        api_response = api_instance.get_config()
        pprint(api_response)
    except airflow_client.client.ApiException as e:
        print("Exception when calling ConfigApi->get_config: %s\n" % e)
```

## Documentation for API Endpoints

All URIs are relative to */api/v1*

Class | Method | HTTP request | Description
------------ | ------------- | ------------- | -------------
*ConfigApi* | [**get_config**](docs/ConfigApi.md#get_config) | **GET** /config | Get current configuration
*ConnectionApi* | [**delete_connection**](docs/ConnectionApi.md#delete_connection) | **DELETE** /connections/{connection_id} | Delete a connection
*ConnectionApi* | [**get_connection**](docs/ConnectionApi.md#get_connection) | **GET** /connections/{connection_id} | Get a connection
*ConnectionApi* | [**get_connections**](docs/ConnectionApi.md#get_connections) | **GET** /connections | List connections
*DAGRunApi* | [**get_dag_runs**](docs/DAGRunApi.md#get_dag_runs) | **GET** /dags/{dag_id}/dagRuns | List DAG runs
*DAGRunApi* | [**get_dag_runs_batch**](docs/DAGRunApi.md#get_dag_runs_batch) | **POST** /dags/~/dagRuns/list | List DAG runs (batch)
*DAGRunApi* | [**get_upstream_dataset_events**](docs/DAGRunApi.md#get_upstream_dataset_events) | **GET** /dags/{dag_id}/dagRuns/{dag_run_id}/upstreamDatasetEvents | Get dataset events for a DAG run
*DAGRunApi* | [**post_dag_run**](docs/DAGRunApi.md#post_dag_run) | **POST** /dags/{dag_id}/dagRuns | Trigger a new DAG run
*DAGRunApi* | [**set_dag_run_note**](docs/DAGRunApi.md#set_dag_run_note) | **PATCH** /dags/{dag_id}/dagRuns/{dag_run_id}/setNote | Update the DagRun note
*DAGRunApi* | [**update_dag_run_state**](docs/DAGRunApi.md#update_dag_run_state) | **PATCH** /dags/{dag_id}/dagRuns/{dag_run_id} | Modify a DAG run
*DagWarningApi* | [**get_dag_warnings**](docs/DagWarningApi.md#get_dag_warnings) | **GET** /dagWarnings | List dag warnings
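
For instance, a hedged sketch of calling one of the listed methods with the generated client (the host and admin credentials are assumptions for a local test deployment):

```python
import airflow_client.client
from airflow_client.client.api import connection_api

# Assumed host and credentials for illustration only.
configuration = airflow_client.client.Configuration(
    host="http://localhost:8080/api/v1", username="admin", password="admin"
)

with airflow_client.client.ApiClient(configuration) as api_client:
    # ConnectionApi.get_connections corresponds to GET /connections in the table above.
    connections = connection_api.ConnectionApi(api_client).get_connections(limit=10)
    for conn in connections.connections:
        print(conn.connection_id)
```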

## Documentation For Models

- [DAGRun](docs/DAGRun.md)
- [DAGRunCollection](docs/DAGRunCollection.md)
- [DAGRunCollectionAllOf](docs/DAGRunCollectionAllOf.md)
- [DagScheduleDatasetReference](docs/DagScheduleDatasetReference.md)
- [DagState](docs/DagState.md)
- [DagWarning](docs/DagWarning.md)
- [TimeDelta](docs/TimeDelta.md)
- [Trigger](docs/Trigger.md)
- [TriggerRule](docs/TriggerRule.md)
- [UpdateDagRunState](docs/UpdateDagRunState.md)
- [UpdateTaskInstance](docs/UpdateTaskInstance.md)
- [UpdateTaskInstancesState](docs/UpdateTaskInstancesState.md)
- [User](docs/User.md)
- [UserAllOf](docs/UserAllOf.md)
- [UserCollection](docs/UserCollection.md)
- [XComCollectionAllOf](docs/XComCollectionAllOf.md)
- [XComCollectionItem](docs/XComCollectionItem.md)

## Documentation For Authorization

By default the generated client supports the three authentication schemes:

* Basic
* GoogleOpenID
* Kerberos

However, you can generate the client and documentation with your own schemes by adding them in
the security section of the OpenAPI specification. You can do it with the Breeze CLI by adding the
``--security-schemes`` option to the ``breeze release-management prepare-python-client`` command.

## Basic "smoke" tests

You can run basic smoke tests to check whether the client is working properly - we have a simple test script
that uses the API to run the tests. To do that, you need to:

* install the `apache-airflow-client` package as described above
* install the ``rich`` Python package
* download the [test_python_client.py](test_python_client.py) file
* make sure you have a test Airflow installation running. Do not experiment with your production deployment
* configure your Airflow webserver to enable basic authentication.
  In the `[api]` section of your `airflow.cfg` set:

  ```ini
  [api]
  auth_backend = airflow.api.auth.backend.session,airflow.api.auth.backend.basic_auth
  ```

  You can also set it with an environment variable:
  `export AIRFLOW__API__AUTH_BACKENDS=airflow.api.auth.backend.session,airflow.api.auth.backend.basic_auth`

* configure your Airflow webserver to load the example dags.
  In the `[core]` section of your `airflow.cfg` set:

  ```ini
  [core]
  load_examples = True
  ```

  You can also set it with an environment variable: `export AIRFLOW__CORE__LOAD_EXAMPLES=True`

* optionally expose the configuration (NOTE! this is a dangerous setting). The script will happily run with
  the default setting, but if you want to see the configuration, you need to expose it.
  In the `[webserver]` section of your `airflow.cfg` set:

  ```ini
  [webserver]
  expose_config = True
  ```

  You can also set it with an environment variable: `export AIRFLOW__WEBSERVER__EXPOSE_CONFIG=True`

* configure your host/ip/user/password in the `test_python_client.py` file:

  ```python
  import airflow_client.client

  # Configure HTTP basic authorization: Basic
  configuration = airflow_client.client.Configuration(
      host="http://localhost:8080/api/v1", username="admin", password="admin"
  )
  ```

* run the scheduler (or the dag file processor, if you have set up a standalone dag file processor) for a few parsing
  loops (you can pass the --num-runs parameter to it or keep it running in the background). The script relies
  on the example DAGs being serialized to the DB, and this only
  happens when the scheduler runs with ``core/load_examples`` set to True.

* run the webserver - reachable at the host/port for the test script you want to run. Make sure it had enough
  time to initialize.

Run `python test_python_client.py` and you should see colored output showing the attempts to connect and their status.
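
If you want an even smaller sanity check than the full script, a minimal sketch along these lines (assuming the same local host and admin credentials as above, and that the example DAGs are loaded) lists the available DAGs with the generated client:

```python
import airflow_client.client
from airflow_client.client.api import dag_api

# Assumed local test deployment with basic auth enabled, as configured above.
configuration = airflow_client.client.Configuration(
    host="http://localhost:8080/api/v1", username="admin", password="admin"
)

with airflow_client.client.ApiClient(configuration) as api_client:
    # GET /dags - returns a DAGCollection with the serialized DAGs.
    dag_collection = dag_api.DAGApi(api_client).get_dags()
    for dag in dag_collection.dags:
        print(dag.dag_id)
```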

## Notes for Large OpenAPI documents

If the OpenAPI document is large, imports in client.apis and client.models may fail with a
RecursionError indicating the maximum recursion limit has been exceeded. In that case, there are a couple of solutions:

Solution 1:
Use specific imports for apis and models like:

- `from airflow_client.client.api.default_api import DefaultApi`
- `from airflow_client.client.model.pet import Pet`

Solution 2:
Before importing the package, adjust the maximum recursion limit as shown below:

```python
import sys

sys.setrecursionlimit(1500)
import airflow_client.client
from airflow_client.client.apis import *
from airflow_client.client.models import *
```

## Authors