# Update README for 2.9.0 (#112)

These were missed during the initial release.

jedcunningham authored Jun 18, 2024 (commit a006189, 1 parent: 564f532)

Showing 2 changed files with 116 additions and 55 deletions.

**README.md** (5 changes: 3 additions & 2 deletions)

This Python package is automatically generated by the [OpenAPI Generator](https://openapi-generator.tech) project:

- API version: 2.9.0
- Package version: 2.9.0
- Build package: org.openapitools.codegen.languages.PythonClientCodegen

For more information, please visit [https://airflow.apache.org](https://airflow.apache.org)

## Requirements

**airflow_client/README.md** (166 changes: 113 additions & 53 deletions)

# Apache Airflow Python Client

# Overview

To facilitate management, Apache Airflow supports a range of REST API endpoints across its
objects. This section provides an overview of the API design, methods, and supported use cases.

Most of the endpoints accept `JSON` as input and return `JSON` responses.
This means that you must usually add the following headers to your request:

```
Content-type: application/json
Accept: application/json
```

Resource names are used as part of endpoint URLs, as well as in API parameters and responses.
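
To illustrate the JSON headers above, here is a minimal sketch using the third-party `requests` library; the host and credentials are placeholders for a local test deployment:

```python
import requests

# Placeholder host and credentials for a local test deployment.
BASE_URL = "http://localhost:8080/api/v1"

headers = {
    "Content-type": "application/json",
    "Accept": "application/json",
}

# Send and expect JSON when listing connections.
response = requests.get(f"{BASE_URL}/connections", headers=headers, auth=("admin", "admin"))
print(response.json())
```
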
## CRUD Operations

The platform supports **Create**, **Read**, **Update**, and **Delete** operations on most resources.
You can review the standards for these operations and their standard parameters below.

Some endpoints have special behavior as exceptions.
The response usually returns a `200 OK` response code upon success, with an object containing a list
of resources' metadata in the response body.

When reading resources, some common query parameters are usually available, e.g.:

```
v1/connections?limit=25&offset=25
```
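
For instance, a minimal sketch with the `requests` library (host and credentials are placeholders for a local test deployment) that pages through the full connection list using `limit` and `offset`:

```python
import requests

BASE_URL = "http://localhost:8080/api/v1"  # placeholder host
AUTH = ("admin", "admin")  # placeholder credentials

# Page through the connection list, 25 records at a time.
connections, offset, limit = [], 0, 25
while True:
    resp = requests.get(
        f"{BASE_URL}/connections",
        params={"limit": limit, "offset": offset},
        auth=AUTH,
    )
    resp.raise_for_status()
    batch = resp.json()["connections"]
    connections.extend(batch)
    if len(batch) < limit:
        break
    offset += limit

print(f"Fetched {len(connections)} connections")
```
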
### Update

Updating a resource requires the resource `id`, and is typically executed via an HTTP `PATCH` request,
with the fields to modify in the request body.
The response usually returns a `200 OK` response code upon success, with information about the modified
resource in the response body.

### Delete

Deleting a resource requires the resource `id` and is typically executed via an HTTP `DELETE` request.
The response usually returns a `204 No Content` response code upon success.
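
Continuing the sketch style above (placeholder host, credentials, and connection id), a delete call looks like:

```python
import requests

BASE_URL = "http://localhost:8080/api/v1"  # placeholder host

# Delete a connection by id; success returns 204 with an empty body.
resp = requests.delete(f"{BASE_URL}/connections/my_connection_id", auth=("admin", "admin"))
print(resp.status_code)  # expect 204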

## Conventions
- Names are consistent between URL parameter name and field name.

- Field names are in snake_case.

```json
{
  "description": "string",
  "name": "string",
  "occupied_slots": 0,
  "open_slots": 0,
  "queued_slots": 0,
  "running_slots": 0,
  "scheduled_slots": 0,
  "slots": 0
}
```

The update request ignores any fields that aren't specified in the field mask, leaving them with
their current values.

Example:

```python
import json

import requests

resource = requests.get("/resource/my-id").json()
resource["my_field"] = "new-value"
requests.patch("/resource/my-id?update_mask=my_field", data=json.dumps(resource))
```

## Versioning and Endpoint Lifecycle
## Trying the API

You can use a third party client, such as curl, HTTPie, Postman or the Insomnia rest client, to test
the Apache Airflow API.
Note that you will need to pass credentials data.

For example, here is how to pause a DAG with [curl](https://curl.haxx.se/) when basic authorization is used:

```bash
curl -X PATCH 'https://example.com/api/v1/dags/{dag_id}?update_mask=is_paused' \
-H 'Content-Type: application/json' \
--user "username:password" \
-d '{
    "is_paused": true
}'
```

Using a graphical tool such as [Postman](https://www.postman.com/) or [Insomnia](https://insomnia.rest/),
it is possible to import the API specifications directly:

1. Download the API specification by clicking the **Download** button at the top of this document.
2. Import the JSON specification in the graphical tool of your choice.

- In *Postman*, you can click the **import** button at the top
- With *Insomnia*, you can just drag-and-drop the file on the UI

Expand All @@ -172,10 +179,12 @@ and it is even possible to add your own method.

If you want to check which auth backend is currently set, you can use the
`airflow config get-value api auth_backends` command, as in the example below.

```bash
$ airflow config get-value api auth_backends
airflow.api.auth.backend.basic_auth
```

The default is to deny all requests.

For details on configuring the authentication, see
[API Authorization](https://airflow.apache.org/docs/apache-airflow/stable/security/api.html).

The request could not be completed due to a conflict with the current state of the target
resource, e.g. the resource it tries to create already exists.

This means that the server encountered an unexpected condition that prevented it from
fulfilling the request.
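
When you use the generated Python client instead of raw HTTP calls, these error responses surface as `ApiException`. A minimal sketch, assuming a local test deployment with basic auth and a DAG id that does not exist:

```python
import airflow_client.client
from airflow_client.client.api import dag_api

# Placeholder host and credentials for a local test deployment.
configuration = airflow_client.client.Configuration(
    host="http://localhost:8080/api/v1", username="admin", password="admin"
)

with airflow_client.client.ApiClient(configuration) as api_client:
    try:
        dag_api.DAGApi(api_client).get_dag("does-not-exist")
    except airflow_client.client.ApiException as e:
        # e.status carries the HTTP code, e.g. 404 for a missing DAG.
        print(e.status, e.reason)
```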


This Python package is automatically generated by the [OpenAPI Generator](https://openapi-generator.tech) project:

- API version: 2.9.0
- Package version: 2.9.0
- Build package: org.openapitools.codegen.languages.PythonClientCodegen

For more information, please visit [https://airflow.apache.org](https://airflow.apache.org)

## Requirements

Python >=3.8

## Installation & Usage
### pip install

If the python package is hosted on a repository, you can install directly using:
### pip install

```sh
pip install git+https://github.com/apache/airflow-client-python.git
```
(you may need to run `pip` with root permission: `sudo pip install git+https://github.com/apache/airflow-client-python.git`)
You can install the client using standard Python installation tools. It is hosted
in PyPI with `apache-airflow-client` package id so the easiest way to get the latest
version is to run:

Then import the package:
```python
import airflow_client.client
```bash
pip install apache-airflow-client
```

### Setuptools

Install via [Setuptools](http://pypi.python.org/pypi/setuptools).
If the python package is hosted on a repository, you can install directly using:

```sh
python setup.py install --user
```bash
pip install git+https://github.com/apache/airflow-client-python.git
```
(or `sudo python setup.py install` to install the package for all users)

### Import check

Then import the package:

```python
import airflow_client.client
```
## Getting Started

Please follow the [installation procedure](#installation--usage) and then run the following:

```python
import airflow_client.client
from pprint import pprint
from airflow_client.client.api import config_api

# Defining the host is optional and defaults to /api/v1
# See configuration.py for a list of all supported configuration parameters.
configuration = airflow_client.client.Configuration(host="/api/v1")

# The client must configure the authentication and authorization parameters
# in accordance with the API server security policy.
# Examples for each auth method are provided below, use the example that
# satisfies your auth use case.

# Configure HTTP basic authorization: Basic
configuration = airflow_client.client.Configuration(
    username="YOUR_USERNAME",
    password="YOUR_PASSWORD",
)

# Enter a context with an instance of the API client
with airflow_client.client.ApiClient(configuration) as api_client:
    # Create an instance of the API class
    api_instance = config_api.ConfigApi(api_client)

    try:
        # Get current configuration
        api_response = api_instance.get_config()
        pprint(api_response)
    except airflow_client.client.ApiException as e:
        print("Exception when calling ConfigApi->get_config: %s\n" % e)
```

## Documentation for API Endpoints

All URIs are relative to */api/v1*

Class | Method | HTTP request | Description
------------ | ------------- | ------------- | -------------
*ConfigApi* | [**get_config**](docs/ConfigApi.md#get_config) | **GET** /config | Get current configuration
*ConfigApi* | [**get_value**](docs/ConfigApi.md#get_value) | **GET** /config/section/{section}/option/{option} | Get an option from configuration
*ConnectionApi* | [**delete_connection**](docs/ConnectionApi.md#delete_connection) | **DELETE** /connections/{connection_id} | Delete a connection
*ConnectionApi* | [**get_connection**](docs/ConnectionApi.md#get_connection) | **GET** /connections/{connection_id} | Get a connection
*ConnectionApi* | [**get_connections**](docs/ConnectionApi.md#get_connections) | **GET** /connections | List connections
*DAGRunApi* | [**get_dag_runs**](docs/DAGRunApi.md#get_dag_runs) | **GET** /dags/{dag_id}/dagRuns | List DAG runs
*DAGRunApi* | [**get_dag_runs_batch**](docs/DAGRunApi.md#get_dag_runs_batch) | **POST** /dags/~/dagRuns/list | List DAG runs (batch)
*DAGRunApi* | [**get_upstream_dataset_events**](docs/DAGRunApi.md#get_upstream_dataset_events) | **GET** /dags/{dag_id}/dagRuns/{dag_run_id}/upstreamDatasetEvents | Get dataset events for a DAG run
*DAGRunApi* | [**post_dag_run**](docs/DAGRunApi.md#post_dag_run) | **POST** /dags/{dag_id}/dagRuns | Trigger a new DAG run
*DAGRunApi* | [**set_dag_run_note**](docs/DAGRunApi.md#set_dag_run_note) | **PATCH** /dags/{dag_id}/dagRuns/{dag_run_id}/setNote | Update the DagRun note.
*DAGRunApi* | [**update_dag_run_state**](docs/DAGRunApi.md#update_dag_run_state) | **PATCH** /dags/{dag_id}/dagRuns/{dag_run_id} | Modify a DAG run
*DagWarningApi* | [**get_dag_warnings**](docs/DagWarningApi.md#get_dag_warnings) | **GET** /dagWarnings | List dag warnings
- [DAGRun](docs/DAGRun.md)
- [DAGRunCollection](docs/DAGRunCollection.md)
- [DAGRunCollectionAllOf](docs/DAGRunCollectionAllOf.md)
- [DagProcessorStatus](docs/DagProcessorStatus.md)
- [DagScheduleDatasetReference](docs/DagScheduleDatasetReference.md)
- [DagState](docs/DagState.md)
- [DagWarning](docs/DagWarning.md)
- [TimeDelta](docs/TimeDelta.md)
- [Trigger](docs/Trigger.md)
- [TriggerRule](docs/TriggerRule.md)
- [TriggererStatus](docs/TriggererStatus.md)
- [UpdateDagRunState](docs/UpdateDagRunState.md)
- [UpdateTaskInstance](docs/UpdateTaskInstance.md)
- [UpdateTaskInstancesState](docs/UpdateTaskInstancesState.md)
- [UpdateTaskState](docs/UpdateTaskState.md)
- [User](docs/User.md)
- [UserAllOf](docs/UserAllOf.md)
- [UserCollection](docs/UserCollection.md)
- [XComCollectionAllOf](docs/XComCollectionAllOf.md)
- [XComCollectionItem](docs/XComCollectionItem.md)


## Documentation For Authorization

By default, the generated client supports three authentication schemes:

* Basic
* GoogleOpenID
* Kerberos

However, you can generate the client and documentation with your own schemes by adding them to
the security section of the OpenAPI specification. You can do this with the Breeze CLI by adding the
``--security-schemes`` option to the ``breeze release-management prepare-python-client`` command.

## Basic "smoke" tests

You can run basic smoke tests to check that the client is working properly; we have a simple test script
that uses the API to run a few checks. To do that, you need to:

* install the `apache-airflow-client` package as described above
* install the ``rich`` Python package
* download the [test_python_client.py](test_python_client.py) file
* make sure you have a test Airflow installation running; do not experiment with your production deployment
* configure your Airflow webserver to enable basic authentication.
  In the `[api]` section of your `airflow.cfg` set:

```ini
[api]
auth_backend = airflow.api.auth.backend.session,airflow.api.auth.backend.basic_auth
```

You can also set it with an environment variable:
`export AIRFLOW__API__AUTH_BACKENDS=airflow.api.auth.backend.session,airflow.api.auth.backend.basic_auth`

* configure your Airflow webserver to load the example DAGs.
  In the `[core]` section of your `airflow.cfg` set:

```ini
[core]
load_examples = True
```

You can also set it with an environment variable: `export AIRFLOW__CORE__LOAD_EXAMPLES=True`

* optionally expose the configuration (note that this is a dangerous setting). The script will happily run
  with the default setting, but if you want to see the configuration, you need to expose it.
  In the `[webserver]` section of your `airflow.cfg` set:

```ini
[webserver]
expose_config = True
```

You can also set it with an environment variable: `export AIRFLOW__WEBSERVER__EXPOSE_CONFIG=True`

* Configure your host/ip/user/password in the `test_python_client.py` file

```python
import airflow_client.client

# Configure HTTP basic authorization: Basic
configuration = airflow_client.client.Configuration(
    host="http://localhost:8080/api/v1", username="admin", password="admin"
)
```

* Run the scheduler (or the dag file processor, if you have set up a standalone dag file processor) for a few
  parsing loops (you can pass the `--num-runs` parameter to it, or keep it running in the background). The
  script relies on the example DAGs being serialized to the DB, and this only happens when the scheduler runs
  with ``core/load_examples`` set to True.

* Run the webserver, reachable at the host/port configured in the test script. Make sure it has had enough
  time to initialize.

Run `python test_python_client.py` and you should see colored output showing the connection attempts and their status.
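
The script's checks are in the spirit of this minimal sketch (illustrative only; the real logic lives in `test_python_client.py`):

```python
import airflow_client.client
from airflow_client.client.api import dag_api

# Illustrative placeholder host and credentials; see test_python_client.py for the real checks.
configuration = airflow_client.client.Configuration(
    host="http://localhost:8080/api/v1", username="admin", password="admin"
)

with airflow_client.client.ApiClient(configuration) as api_client:
    dags = dag_api.DAGApi(api_client).get_dags()
    print(f"Found {len(dags.dags)} DAGs serialized in the DB")
```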


## Notes for Large OpenAPI documents

If the OpenAPI document is large, imports in `client.apis` and `client.models` may fail with a
`RecursionError` indicating the maximum recursion limit has been exceeded. In that case, there are a couple of solutions:

Solution 1:
Use specific imports for apis and models like:

- `from airflow_client.client.api.default_api import DefaultApi`
- `from airflow_client.client.model.pet import Pet`

Solution 2:
Before importing the package, adjust the maximum recursion limit as shown below:

```python
import sys

sys.setrecursionlimit(1500)
import airflow_client.client
from airflow_client.client.apis import *
from airflow_client.client.models import *
```

## Authors

dev@airflow.apache.org
