diff --git a/docs/docs/en/api/faststream/index.md b/docs/docs/en/api/faststream/index.md index ac2ff9a271..fa4b8658b7 100644 --- a/docs/docs/en/api/faststream/index.md +++ b/docs/docs/en/api/faststream/index.md @@ -4,4 +4,4 @@ Here's the reference or code API, the classes, functions, parameters, attributes all the FastAPI parts you can use in your applications. If you want to **learn FastStream** you are much better off reading the -[FastStream Tutorial](../../../getting-started/index.md). +[FastStream Tutorial](../../../getting-started/index.md). \ No newline at end of file diff --git a/docs/docs/en/faststream.md b/docs/docs/en/faststream.md index a77549451e..8460a2a505 100644 --- a/docs/docs/en/faststream.md +++ b/docs/docs/en/faststream.md @@ -1,7 +1,6 @@ --- hide: - navigation - - footer search: exclude: true --- @@ -98,7 +97,7 @@ That's **FastStream** in a nutshell—easy, efficient, and powerful. Whether you **FastStream** works on **Linux**, **macOS**, **Windows** and most **Unix**-style operating systems. You can install it with `pip` as usual: -{!> includes/index/1.md !} +{! includes/index/1.md !} !!! tip "" By default **FastStream** uses **PydanticV2** written in **Rust**, but you can downgrade it manually, if your platform has no **Rust** support - **FastStream** will work correctly with **PydanticV1** as well. 
@@ -107,8 +106,8 @@ You can install it with `pip` as usual: ## Writing app code -**FastStream** brokers provide convenient function decorators `#!python @broker.subscriber` -and `#!python @broker.publisher` to allow you to delegate the actual process of: +**FastStream** brokers provide convenient function decorators `#!python @broker.subscriber(...)` +and `#!python @broker.publisher(...)` to allow you to delegate the actual process of: - consuming and producing data to Event queues, and @@ -121,12 +120,12 @@ JSON-encoded data into Python objects, making it easy to work with structured da Here is an example Python app using **FastStream** that consumes data from an incoming data stream and outputs the data to another one: -{!> includes/index/2.md !} +{! includes/index/2.md !} Also, **Pydantic**’s [`BaseModel`](https://docs.pydantic.dev/usage/models/){.external-link target="_blank"} class allows you to define messages using a declarative syntax, making it easy to specify the fields and types of your messages. -{!> includes/index/3.md !} +{! includes/index/3.md !} --- @@ -138,7 +137,7 @@ The Tester will redirect your `subscriber` and `publisher` decorated functions t Using pytest, the test for our service would look like this: -{!> includes/index/4.md !} +{! includes/index/4.md !} ## Running the application @@ -146,13 +145,13 @@ The application can be started using the built-in **FastStream** CLI command. To run the service, use the **FastStream CLI** command and pass the module (in this case, the file where the app implementation is located) and the app symbol to the command. -``` shell +```shell faststream run basic:app ``` After running the command, you should see the following output: -``` {.shell .no-copy} +```{.shell .no-copy} INFO - FastStream app starting... INFO - input_data | - `HandleMsg` waiting for messages INFO - FastStream app started successfully! To exit press CTRL+C @@ -161,13 +160,13 @@ INFO - FastStream app started successfully! 
To exit press CTRL+C Also, **FastStream** provides you with a great hot reload feature to improve your Development Experience -``` shell +```shell faststream run basic:app --reload ``` And multiprocessing horizontal scaling feature as well: -``` shell +```shell faststream run basic:app --workers 3 ``` @@ -189,8 +188,8 @@ The availability of such documentation significantly simplifies the integration **FastStream** (thanks to [**FastDepends**](https://lancetnik.github.io/FastDepends/){.external-link target="_blank"}) has a dependency management system similar to `pytest fixtures` and `FastAPI Depends` at the same time. Function arguments declare which dependencies you want are needed, and a special decorator delivers them from the global Context object. -```python linenums="1" hl_lines="9-10" -{!> docs_src/index/dependencies.py [ln:1,6-14] !} +```python linenums="1" hl_lines="8-9" +{! docs_src/index/dependencies.py [ln:1,5-14] !} ``` --- @@ -228,16 +227,16 @@ As evident, **FastStream** is an incredibly user-friendly framework. However, we Save application description inside `description.txt`: ``` -{!> docs_src/index/app_description.txt !} +{! docs_src/index/app_description.txt !} ``` and run the following command to create a new **FastStream** project: -``` shell +```shell faststream_gen -i description.txt ``` -``` {.shell .no-copy} +```{.shell .no-copy} ✨ Generating a new FastStream application! ✔ Application description validated. ✔ FastStream app skeleton code generated. akes around 15 to 45 seconds)... 
diff --git a/docs/docs/en/getting-started/asyncapi/custom.md b/docs/docs/en/getting-started/asyncapi/custom.md index c9160e20d2..fd95424995 100644 --- a/docs/docs/en/getting-started/asyncapi/custom.md +++ b/docs/docs/en/getting-started/asyncapi/custom.md @@ -1,3 +1,13 @@ +--- +# 0.5 - API +# 2 - Release +# 3 - Contributing +# 5 - Template Page +# 10 - Default +search: + boost: 10 +--- + # Customizing AsyncAPI Documentation for FastStream In this guide, we will explore how to customize **AsyncAPI** documentation for your **FastStream** application. Whether you want to add custom app info, broker information, handlers, or fine-tune payload details, we'll walk you through each step. @@ -9,7 +19,7 @@ Before we dive into customization, ensure you have a basic **FastStream** applic Copy the following code in your basic.py file: ```python linenums="1" -{!> docs_src/getting_started/asyncapi/asyncapi_customization/basic.py !} +{! docs_src/getting_started/asyncapi/asyncapi_customization/basic.py !} ``` Now, when you run @@ -31,8 +41,8 @@ Let's start by customizing the app information that appears in your **AsyncAPI** Copy the following code in your basic.py file, we have highligted the additional info passed to **FastStream** app: -```python linenums="1" hl_lines="6-15" - {!> docs_src/getting_started/asyncapi/asyncapi_customization/custom_info.py !} +```python linenums="1" hl_lines="6-16" +{! docs_src/getting_started/asyncapi/asyncapi_customization/custom_info.py !} ``` Now, when you run @@ -46,7 +56,7 @@ you should see the following in your general app documentation: Now, your documentation reflects your application's identity and purpose. !!! note - The ```description``` field in the above example supports ```Markdown``` text. + The `description` field in the above example supports `Markdown` text. ## Setup Custom Broker Information @@ -61,7 +71,7 @@ The next step is to customize broker information. 
This helps users understand th Copy the following code in your basic.py file, we have highligted the additional info passed to the **FastStream** app broker: ```python linenums="1" hl_lines="5-9" - {!> docs_src/getting_started/asyncapi/asyncapi_customization/custom_broker.py !} +{! docs_src/getting_started/asyncapi/asyncapi_customization/custom_broker.py !} ``` Now, when you run @@ -89,12 +99,12 @@ Customizing handler information helps users comprehend the purpose and behavior Copy the following code in your basic.py file, we have highligted the additional info passed to the **FastStream** app handlers: ```python linenums="1" hl_lines="17-25 27-31" - {!> docs_src/getting_started/asyncapi/asyncapi_customization/custom_handler.py !} +{! docs_src/getting_started/asyncapi/asyncapi_customization/custom_handler.py !} ``` Now, when you run ```shell -{!> docs_src/getting_started/asyncapi/serve.py [ln:17] !} +{! docs_src/getting_started/asyncapi/serve.py [ln:17] !} ``` you should see the descriptions in your handlers: @@ -115,7 +125,7 @@ To describe your message payload effectively, you can use Pydantic models. Here' Copy the following code in your basic.py file, we have highligted the creation of payload info and you can see it being passed to the return type and the `msg` argument type in the `on_input_data` function: ```python linenums="1" hl_lines="7-10 19" - {!> docs_src/getting_started/asyncapi/asyncapi_customization/payload_info.py !} +{! 
docs_src/getting_started/asyncapi/asyncapi_customization/payload_info.py !} ``` Now, when you run diff --git a/docs/docs/en/getting-started/asyncapi/export.md b/docs/docs/en/getting-started/asyncapi/export.md index 91ab0506dd..f2d23cf9ad 100644 --- a/docs/docs/en/getting-started/asyncapi/export.md +++ b/docs/docs/en/getting-started/asyncapi/export.md @@ -1,3 +1,13 @@ +--- +# 0.5 - API +# 2 - Release +# 3 - Contributing +# 5 - Template Page +# 10 - Default +search: + boost: 10 +--- + # How to Generate and Serve AsyncAPI Documentation In this guide, let's explore how to generate and serve [**AsyncAPI**](https://www.asyncapi.com/){.external-link target="_blank"} documentation for our FastStream application. @@ -8,29 +18,29 @@ Here's an example Python application using **FastStream** that consumes data fro topic, increments the value, and outputs the data to another topic. Save it in a file called `basic.py`. -``` python -{!> docs_src/kafka/basic/basic.py!} +```python title="basic.py" +{! docs_src/kafka/basic/basic.py!} ``` ## Generating the AsyncAPI Specification Now that we have a FastStream application, we can proceed with generating the AsyncAPI specification using a CLI command. -``` shell -{!> docs_src/getting_started/asyncapi/serve.py[ln:9]!} +```shell +{! docs_src/getting_started/asyncapi/serve.py [ln:9] !} ``` The above command will generate the AsyncAPI specification and save it in a file called `asyncapi.json`. If you prefer `yaml` instead of `json`, please run the following command to generate `asyncapi.yaml`. -``` shell -{!> docs_src/getting_started/asyncapi/serve.py[ln:13]!} +```shell +{! docs_src/getting_started/asyncapi/serve.py [ln:13] !} ``` !!! tip To generate the documentation in yaml format, please install the necessary dependency to work with **YAML** file format at first. 
- ``` shell + ```shell pip install PyYAML ``` diff --git a/docs/docs/en/getting-started/asyncapi/hosting.md b/docs/docs/en/getting-started/asyncapi/hosting.md index 2e53685b3a..ffb4b95082 100644 --- a/docs/docs/en/getting-started/asyncapi/hosting.md +++ b/docs/docs/en/getting-started/asyncapi/hosting.md @@ -1,3 +1,13 @@ +--- +# 0.5 - API +# 2 - Release +# 3 - Contributing +# 5 - Template Page +# 10 - Default +search: + boost: 10 +--- + # Serving the AsyncAPI Documentation FastStream provides a command to serve the AsyncAPI documentation. @@ -5,25 +15,25 @@ FastStream provides a command to serve the AsyncAPI documentation. !!! note This feature requires an Internet connection to obtain the **AsyncAPI HTML** via **CDN**. -``` shell -{!> docs_src/getting_started/asyncapi/serve.py[ln:17]!} +```shell +{! docs_src/getting_started/asyncapi/serve.py [ln:17] !} ``` In the above command, we are providing the path in the format of `python_module:FastStream`. Alternatively, you can also specify `asyncapi.json` or `asyncapi.yaml` to serve the AsyncAPI documentation. === "JSON" - ``` shell - {!> docs_src/getting_started/asyncapi/serve.py[ln:21]!} + ```shell + {!> docs_src/getting_started/asyncapi/serve.py [ln:21] !} ``` === "YAML" - ``` shell - {!> docs_src/getting_started/asyncapi/serve.py[ln:25]!} + ```shell + {!> docs_src/getting_started/asyncapi/serve.py [ln:25] !} ``` After running the command, it should serve the AsyncAPI documentation on port **8000** and display the following logs in the terminal. -``` {.shell .no-copy} +```{.shell .no-copy} INFO: Started server process [2364992] INFO: Waiting for application startup. INFO: Application startup complete. 
diff --git a/docs/docs/en/getting-started/cli/index.md b/docs/docs/en/getting-started/cli/index.md index 0a8a820218..6d9cdce471 100644 --- a/docs/docs/en/getting-started/cli/index.md +++ b/docs/docs/en/getting-started/cli/index.md @@ -1,3 +1,13 @@ +--- +# 0.5 - API +# 2 - Release +# 3 - Contributing +# 5 - Template Page +# 10 - Default +search: + boost: 10 +--- + # CLI **FastStream** has its own built-in **CLI** tool for your maximum comfort as a developer. diff --git a/docs/docs/en/getting-started/config/index.md b/docs/docs/en/getting-started/config/index.md index 55d295856c..5b7830e3c9 100644 --- a/docs/docs/en/getting-started/config/index.md +++ b/docs/docs/en/getting-started/config/index.md @@ -1,3 +1,13 @@ +--- +# 0.5 - API +# 2 - Release +# 3 - Contributing +# 5 - Template Page +# 10 - Default +search: + boost: 10 +--- + # Settings and Environment Variables In many cases, your application may require external settings or configurations, such as a broker connection or database credentials. @@ -47,7 +57,7 @@ It will also convert and validate the data, so when you use that `settings` obje Now you can use the new `settings` object in your application: ```python linenums='1' hl_lines="3 9 14" title="serve.py" -{!> docs_src/getting_started/config/usage.py !} +{! docs_src/getting_started/config/usage.py !} ``` ### Running the Application @@ -91,7 +101,7 @@ QUEUE="test-queue" Then update your `config.py` as follows: ```python linenums='1' hl_lines="1 11" -{!> docs_src/getting_started/config/settings_env.py !} +{! docs_src/getting_started/config/settings_env.py !} ``` This way, you can specify different `.env` files directly from your terminal, which can be extremely helpful for various testing and production scenarios. 
diff --git a/docs/docs/en/getting-started/context/custom.md b/docs/docs/en/getting-started/context/custom.md index aac3ddacdd..c6b6d2a788 100644 --- a/docs/docs/en/getting-started/context/custom.md +++ b/docs/docs/en/getting-started/context/custom.md @@ -1,3 +1,13 @@ +--- +# 0.5 - API +# 2 - Release +# 3 - Contributing +# 5 - Template Page +# 10 - Default +search: + boost: 10 +--- + # Context Fields Declaration You can also store your own objects in the `Context`. @@ -6,11 +16,11 @@ You can also store your own objects in the `Context`. To declare an application-level context field, you need to call the `context.set_global` method with with a key to indicate where the object will be placed in the context. -{!> includes/getting_started/context/custom_global.md !} +{! includes/getting_started/context/custom_global.md !} Afterward, you can access your `secret` field in the usual way: -{!> includes/getting_started/context/custom_global_2.md !} +{! includes/getting_started/context/custom_global_2.md !} In this case, the field becomes a global context field: it does not depend on the current message handler (unlike `message`) @@ -24,8 +34,8 @@ context.reset_global("my_key") To set a local context (available only within the message processing scope), use the context manager `scope` -{!> includes/getting_started/context/custom_local.md !} +{! includes/getting_started/context/custom_local.md !} You can also set the context by yourself, and it will remain within the current call stack until you clear it. -{!> includes/getting_started/context/manual_local.md !} +{! 
includes/getting_started/context/manual_local.md !} diff --git a/docs/docs/en/getting-started/context/existed.md b/docs/docs/en/getting-started/context/existed.md index a64c462a65..5ba9ef818b 100644 --- a/docs/docs/en/getting-started/context/existed.md +++ b/docs/docs/en/getting-started/context/existed.md @@ -1,3 +1,13 @@ +--- +# 0.5 - API +# 2 - Release +# 3 - Contributing +# 5 - Template Page +# 10 - Default +search: + boost: 10 +--- + # Existing Fields **Context** already contains some global objects that you can always access: @@ -13,7 +23,7 @@ At the same time, thanks to `contextlib.ContextVar`, **message** is local for yo By default, the context searches for an object based on the argument name. -{!> includes/getting_started/context/access.md !} +{! includes/getting_started/context/access.md !} ## Annotated Aliases @@ -25,4 +35,4 @@ Also, **FastStream** has already created `Annotated` aliases to provide you with from faststream import Logger, ContextRepo ``` -{!> includes/getting_started/context/existed_annotations.md !} +{! includes/getting_started/context/existed_annotations.md !} diff --git a/docs/docs/en/getting-started/context/extra.md b/docs/docs/en/getting-started/context/extra.md index d2f455cc87..703906d269 100644 --- a/docs/docs/en/getting-started/context/extra.md +++ b/docs/docs/en/getting-started/context/extra.md @@ -1,3 +1,13 @@ +--- +# 0.5 - API +# 2 - Release +# 3 - Contributing +# 5 - Template Page +# 10 - Default +search: + boost: 10 +--- + # Context Extra Options Additionally, `Context` provides you with some extra capabilities for working with containing objects. @@ -8,14 +18,14 @@ For instance, if you attempt to access a field that doesn't exist in the global However, you can set default values if needed. -{!> includes/getting_started/context/default.md !} +{! includes/getting_started/context/default.md !} ## Cast Context Types By default, context fields are **NOT CAST** to the type specified in their annotation. 
-{!> includes/getting_started/context/not_cast.md !} +{! includes/getting_started/context/not_cast.md !} If you require this functionality, you can enable the appropriate flag. -{!> includes/getting_started/context/cast.md !} +{! includes/getting_started/context/cast.md !} diff --git a/docs/docs/en/getting-started/context/fields.md b/docs/docs/en/getting-started/context/fields.md index ef548fc131..638bbd6f25 100644 --- a/docs/docs/en/getting-started/context/fields.md +++ b/docs/docs/en/getting-started/context/fields.md @@ -1,4 +1,11 @@ --- +# 0.5 - API +# 2 - Release +# 3 - Contributing +# 5 - Template Page +# 10 - Default +search: + boost: 10 comment_1: This way you can get access to context object by its name comment_2: This way you can get access to context object specific field comment_3: Or even to a dict key diff --git a/docs/docs/en/getting-started/context/index.md b/docs/docs/en/getting-started/context/index.md index cd8cf6c824..e8ca99446b 100644 --- a/docs/docs/en/getting-started/context/index.md +++ b/docs/docs/en/getting-started/context/index.md @@ -1,14 +1,24 @@ +--- +# 0.5 - API +# 2 - Release +# 3 - Contributing +# 5 - Template Page +# 10 - Default +search: + boost: 10 +--- + # Application Context **FastStreams** has its own Dependency Injection container - **Context**, used to store application runtime objects and variables. With this container, you can access both application scope and message processing scope objects. This functionality is similar to [`Depends`](../dependencies/index.md){.internal-link} usage. -{!> includes/getting_started/context/base.md !} +{! includes/getting_started/context/base.md !} But, with the [`Annotated`](https://docs.python.org/3/library/typing.html#typing.Annotated){.external-docs target="_blank"} Python feature usage, it is much closer to `#!python @pytest.fixture`. -{!> includes/getting_started/context/annotated.md !} +{! 
includes/getting_started/context/annotated.md !} ## Usages @@ -25,8 +35,8 @@ By default, the context is available in the same place as `Depends`: To use context in other functions, use the `#!python @apply_types` decorator. In this case, the context of the called function will correspond to the context of the event handler from which it was called. -```python linenums="1" hl_lines="6 9-10" -{!> docs_src/getting_started/context/nested.py [ln:1,9-16] !} +```python linenums="1" hl_lines="5 7-8" +{! docs_src/getting_started/context/nested.py [ln:1-2,9-12,14-16] !} ``` -In the example above, we did not pass the `logger` function at calling it; it was placed outside of context. +In the example above, we did not pass the `logger` to the function when calling it; it was taken from the context. diff --git a/docs/docs/en/getting-started/contributing/.meta.yml b/docs/docs/en/getting-started/contributing/.meta.yml new file mode 100644 index 0000000000..d545d98e3a --- /dev/null +++ b/docs/docs/en/getting-started/contributing/.meta.yml @@ -0,0 +1,7 @@ +# 0.5 - API +# 2 - Release +# 3 - Contributing +# 5 - Template Page +# 10 - Default +search: + boost: 3 diff --git a/docs/docs/en/getting-started/contributing/CONTRIBUTING.md b/docs/docs/en/getting-started/contributing/CONTRIBUTING.md index e0bd0c830b..9818aea06c 100644 --- a/docs/docs/en/getting-started/contributing/CONTRIBUTING.md +++ b/docs/docs/en/getting-started/contributing/CONTRIBUTING.md @@ -100,7 +100,7 @@ pytest -m 'not rabbit and not kafka and not nats and not redis' To run tests based on RabbitMQ, Kafka, or other dependencies, the following dependencies are needed to be started as docker containers: ```yaml -{!> includes/docker-compose.yaml !} +{! 
includes/docker-compose.yaml !} ``` You can start the dependencies easily using provided script by running: diff --git a/docs/docs/en/getting-started/dependencies/index.md b/docs/docs/en/getting-started/dependencies/index.md index 75e55c4196..8d88ab81d2 100644 --- a/docs/docs/en/getting-started/dependencies/index.md +++ b/docs/docs/en/getting-started/dependencies/index.md @@ -1,4 +1,6 @@ --- +search: + boost: 10 nested: A nested dependency is called here --- @@ -119,7 +121,7 @@ from faststream import Depends, apply_types def simple_dependency(a: int, b: int = 3) -> str: return a + b # 'return' is cast to `str` for the first time -@inject +@apply_types def method(a: int, d: int = Depends(simple_dependency)): # 'd' is cast to `int` for the second time return a + d diff --git a/docs/docs/en/getting-started/index.md b/docs/docs/en/getting-started/index.md index 126eca5f23..ec740eb1a8 100644 --- a/docs/docs/en/getting-started/index.md +++ b/docs/docs/en/getting-started/index.md @@ -1,6 +1,13 @@ --- +# 0.5 - API +# 2 - Release +# 3 - Contributing +# 5 - Template Page +# 10 - Default +search: + boost: 10 hide: - - toc + - toc run_docker: To start a new project, we need a test broker container --- @@ -25,7 +32,7 @@ faststream run serve:app After running the command, you should see the following output: -``` {.shell .no-copy} +```{.shell .no-copy} INFO - FastStream app starting... INFO - test | - `BaseHandler` waiting for messages INFO - FastStream app started successfully! 
To exit, press CTRL+C diff --git a/docs/docs/en/getting-started/integrations/django/index.md b/docs/docs/en/getting-started/integrations/django/index.md index 04fdde094b..8f24234a27 100644 --- a/docs/docs/en/getting-started/integrations/django/index.md +++ b/docs/docs/en/getting-started/integrations/django/index.md @@ -1,3 +1,13 @@ +--- +# 0.5 - API +# 2 - Release +# 3 - Contributing +# 5 - Template Page +# 10 - Default +search: + boost: 10 +--- + # Using FastStream with Django [**Django**](https://www.djangoproject.com/){.external-link target="_blank"} is a high-level Python web framework that encourages rapid development and clean, pragmatic design. Built by experienced developers, it takes care of much of the hassle of web development, so you can focus on writing your app without needing to reinvent the wheel. It’s free and open source. diff --git a/docs/docs/en/getting-started/integrations/fastapi/index.md b/docs/docs/en/getting-started/integrations/fastapi/index.md index e36a6435bb..5318714c42 100644 --- a/docs/docs/en/getting-started/integrations/fastapi/index.md +++ b/docs/docs/en/getting-started/integrations/fastapi/index.md @@ -1,3 +1,13 @@ +--- +# 0.5 - API +# 2 - Release +# 3 - Contributing +# 5 - Template Page +# 10 - Default +search: + boost: 10 +--- + # **FastAPI** Plugin ## Handling messages @@ -17,7 +27,7 @@ Just import a **StreamRouter** you need and declare the message handler in the s When processing a message from a broker, the entire message body is placed simultaneously in both the `body` and `path` request parameters. You can access them in any way convenient for you. The message header is placed in `headers`. Also, this router can be fully used as an `HttpRouter` (of which it is the inheritor). So, you can -use it to declare any `get`, `post`, `put` and other HTTP methods. For example, this is done at **line 23**. +use it to declare any `get`, `post`, `put` and other HTTP methods. 
For example, this is done at [**line 23**](#__codelineno-0-23). !!! warning If your **ASGI** server does not support installing **state** inside **lifespan**, you can disable this behavior as follows: @@ -74,13 +84,13 @@ To test your **FastAPI StreamRouter**, you can still use it with the *TestClient ## Miltiple Routers -Using **FastStream** as a **FastAPI** plugin you are still able to separate messages processing logic between different routers (like with a regular HTTPRouter). But it can be confusing - how you should include multiple routers, if we have to setup `router.lifespan_context` as a **FastAPI** object lifespan. +Using **FastStream** as a **FastAPI** plugin, you can still separate message processing logic between different routers (like with a regular `HTTPRouter`). But it can be confusing how to include multiple routers if you have to set up `router.lifespan_context` as the **FastAPI** object lifespan. You can make it in a two ways, depends on you reminds. ### Routers nesting -If you want to use the **SAME CONNECTION** for all of you routers you should nested them each over and finally use only the core router to include in into **FastAPI** object. +If you want to use the **SAME CONNECTION** for all of your routers, you should nest them within each other and finally include only the core router into the **FastAPI** object. {! includes/getting_started/integrations/fastapi/multiple.md !} @@ -88,7 +98,7 @@ This way the core router collects all nested routers publishers and subscribers ### Custom lifespan -Overwise, if you want to has multiple connections to various broker instances, you should start routers independently in your custom lifespan +Otherwise, if you want to have multiple connections to different broker instances, you should start the routers independently in your custom lifespan {! 
includes/getting_started/integrations/fastapi/multiple_lifespan.md !} diff --git a/docs/docs/en/getting-started/integrations/frameworks/index.md b/docs/docs/en/getting-started/integrations/frameworks/index.md index 6f06f2106e..fcb09ce7f2 100644 --- a/docs/docs/en/getting-started/integrations/frameworks/index.md +++ b/docs/docs/en/getting-started/integrations/frameworks/index.md @@ -1,4 +1,11 @@ --- +# 0.5 - API +# 2 - Release +# 3 - Contributing +# 5 - Template Page +# 10 - Default +search: + boost: 10 # template variables fastapi_plugin: If you want to use **FastStream** in conjunction with **FastAPI**, perhaps you should use a special [plugin](../fastapi/index.md){.internal-link} no_hook: However, even if such a hook is not provided, you can do it yourself. diff --git a/docs/docs/en/getting-started/lifespan/context.md b/docs/docs/en/getting-started/lifespan/context.md index 9f54d97ebd..3dcee88b9f 100644 --- a/docs/docs/en/getting-started/lifespan/context.md +++ b/docs/docs/en/getting-started/lifespan/context.md @@ -1,10 +1,20 @@ +--- +# 0.5 - API +# 2 - Release +# 3 - Contributing +# 5 - Template Page +# 10 - Default +search: + boost: 10 +--- + # Lifespan Context Manager Also, you can define *startup* and *shutdown* logic using the `lifespan` parameter of the **FastSTream** app, and a "context manager" (I'll show you what that is in a second). Let's start with an example from [hooks page](./hooks.md#another-example){.internal-link} and refactor it using "context manager". -We create an async function `lifespan()` with `yield` like this: +We create an async function `lifespan()` with `#!python yield` like this: {! 
includes/getting_started/lifespan/ml_context.md !} diff --git a/docs/docs/en/getting-started/lifespan/hooks.md b/docs/docs/en/getting-started/lifespan/hooks.md index 994371a1f0..f6cfa3ee4d 100644 --- a/docs/docs/en/getting-started/lifespan/hooks.md +++ b/docs/docs/en/getting-started/lifespan/hooks.md @@ -1,3 +1,13 @@ +--- +# 0.5 - API +# 2 - Release +# 3 - Contributing +# 5 - Template Page +# 10 - Default +search: + boost: 10 +--- + # Lifespan Hooks ## Usage example @@ -86,7 +96,7 @@ Also, we don't want the model to finish its work incorrectly when the applicatio If you want to declare multiple lifecycle hooks, they will be used in the order they are registered: ```python linenums="1" hl_lines="6 11" -{!> docs_src/getting_started/lifespan/multiple.py !} +{! docs_src/getting_started/lifespan/multiple.py !} ``` ## Some more details diff --git a/docs/docs/en/getting-started/lifespan/index.md b/docs/docs/en/getting-started/lifespan/index.md index f74e924f87..8bb5c9cda1 100644 --- a/docs/docs/en/getting-started/lifespan/index.md +++ b/docs/docs/en/getting-started/lifespan/index.md @@ -1,3 +1,13 @@ +--- +# 0.5 - API +# 2 - Release +# 3 - Contributing +# 5 - Template Page +# 10 - Default +search: + boost: 10 +--- + # Lifespan Events Sometimes you need to define the logic that should be executed before launching the application. diff --git a/docs/docs/en/getting-started/lifespan/test.md b/docs/docs/en/getting-started/lifespan/test.md index 19c89d3bc3..339bcdcc18 100644 --- a/docs/docs/en/getting-started/lifespan/test.md +++ b/docs/docs/en/getting-started/lifespan/test.md @@ -1,3 +1,13 @@ +--- +# 0.5 - API +# 2 - Release +# 3 - Contributing +# 5 - Template Page +# 10 - Default +search: + boost: 10 +--- + # Events Testing In the most cases you are testing your subsriber/publisher functions, but sometimes you need to trigger some lifespan hooks in your tests too. 
@@ -10,9 +20,9 @@ For this reason, **FastStream** has a special **TestApp** patcher working as a r If you want to use In-Memory patched broker in your tests, it's advisable to patch the broker first (before applying the application patch). -Also, **TestApp** and **TestBroker** are calling `#!python broker.start()` both. According to the original logic, broker should be started in the `FastStream` application, but **TestBroker** applied first breaks this behavior. This reason **TestApp** prevents **TestBroker** `#!python broker.start()` call if it placed incide **TestBroker** context. +Also, both **TestApp** and **TestBroker** call `broker.start()`. According to the original logic, the broker should be started in the `FastStream` application, but **TestBroker**, applied first, breaks this behavior. For this reason, **TestApp** prevents the **TestBroker** `broker.start()` call if it is placed inside the **TestBroker** context. -This behavior is ruled by `connect_only` **TestBroker** argument. By default it has `#!python None` value, but **TestApp** can set it to `True/False` by inner logic. To prevent this "magic", just setup `connect_only` argument manually. +This behavior is controlled by the `connect_only` **TestBroker** argument. By default it has a `None` value, but **TestApp** can set it to `True`/`False` by its inner logic. To prevent this "magic", just set the `connect_only` argument manually. !!! warning With `#!python connect_only=False`, all `FastStream` hooks will be called after **broker was started**, what can breaks some `@app.on_startup` logic. 
diff --git a/docs/docs/en/getting-started/logging.md b/docs/docs/en/getting-started/logging.md index 46746d6fcb..c5eff16b96 100644 --- a/docs/docs/en/getting-started/logging.md +++ b/docs/docs/en/getting-started/logging.md @@ -1,6 +1,16 @@ +--- +# 0.5 - API +# 2 - Release +# 3 - Contributing +# 5 - Template Page +# 10 - Default +search: + boost: 10 +--- + # Application and Access Logging -**FastStream** uses two previously configured loggers: +**FastStream** uses two already configured loggers: * `faststream` - used by `FastStream` app * `faststream.access` - used by the broker @@ -41,7 +51,7 @@ If you want to completely disable the default logging of `FastStream`, you can s from faststream import FastStream from faststream.rabbit import RabbitBroker -broker = RabbitBroker(logger=None) # Disables broker logs +broker = RabbitBroker(logger=None) # Disables broker logs app = FastStream(broker, logger=None) # Disables application logs ``` diff --git a/docs/docs/en/getting-started/middlewares/index.md b/docs/docs/en/getting-started/middlewares/index.md index 45d4037218..61f3ab1296 100644 --- a/docs/docs/en/getting-started/middlewares/index.md +++ b/docs/docs/en/getting-started/middlewares/index.md @@ -1,3 +1,13 @@ +--- +# 0.5 - API +# 2 - Release +# 3 - Contributing +# 5 - Template Page +# 10 - Default +search: + boost: 10 +--- + # Middlewares **Middlewares** are a powerful mechanism that allows you to add additional logic to any stage of the message processing pipeline. @@ -17,7 +27,7 @@ Unfortunately, this powerful feature has a somewhat complex signature too. Using middlewares, you can wrap the entire message processing pipeline. In this case, you need to specify `on_receive` and `after_processed` methods: -``` python +```python from faststream import BaseMiddleware class MyMiddleware(BaseMiddleware): @@ -46,7 +56,7 @@ Also, using middlewares, you are able to wrap consumer function calls directly. 
In this case, you need to specify `on_consume` and `after_consume` methods: -``` python +```python from typing import Optional from faststream import BaseMiddleware @@ -70,7 +80,7 @@ Finally, using middlewares, you are able to patch outgoing messages too. For exa In this case, you need to specify `on_publish` and `after_publish` methods: -``` python +```python from typing import Optional from faststream import BaseMiddleware diff --git a/docs/docs/en/getting-started/publishing/broker.md b/docs/docs/en/getting-started/publishing/broker.md index 1fbe5df555..9d4d01f68c 100644 --- a/docs/docs/en/getting-started/publishing/broker.md +++ b/docs/docs/en/getting-started/publishing/broker.md @@ -1,7 +1,17 @@ +--- +# 0.5 - API +# 2 - Release +# 3 - Contributing +# 5 - Template Page +# 10 - Default +search: + boost: 10 +--- + # Broker Publishing The easiest way to publish a message is to use a Broker, which allows you to use it as a publisher client in any application. In the **FastStream** project, this call is not represented in the **AsyncAPI** scheme. You can use it to send rarely published messages, such as startup or shutdown events. -{!> includes/getting_started/publishing/broker/1.md !} +{! includes/getting_started/publishing/broker/1.md !} diff --git a/docs/docs/en/getting-started/publishing/decorator.md b/docs/docs/en/getting-started/publishing/decorator.md index b4a1beac6c..28f837d73b 100644 --- a/docs/docs/en/getting-started/publishing/decorator.md +++ b/docs/docs/en/getting-started/publishing/decorator.md @@ -1,3 +1,13 @@ +--- +# 0.5 - API +# 2 - Release +# 3 - Contributing +# 5 - Template Page +# 10 - Default +search: + boost: 10 +--- + # Publisher Decorator The second easiest way to publish messages is by using the Publisher Decorator. This method has an [**AsyncAPI**](../asyncapi/custom.md){.internal-link} representation and is suitable for quickly creating applications. However, it doesn't provide all testing features.
@@ -7,13 +17,13 @@ It creates a structured DataPipeline unit with an input and output. The order of !!! note It uses the handler function's return type annotation to cast the function's return value before sending, so be accurate with it. -{!> includes/getting_started/publishing/decorator/1.md !} +{! includes/getting_started/publishing/decorator/1.md !} ## Message Broadcasting The decorator can be used multiple times with one function to broadcast the function's return: -```python +```python hl_lines="2-3" @broker.subscriber("in") @broker.publisher("first-out") @broker.publisher("second-out") diff --git a/docs/docs/en/getting-started/publishing/direct.md b/docs/docs/en/getting-started/publishing/direct.md index d082332b04..c824f64b8a 100644 --- a/docs/docs/en/getting-started/publishing/direct.md +++ b/docs/docs/en/getting-started/publishing/direct.md @@ -1,14 +1,24 @@ +--- +# 0.5 - API +# 2 - Release +# 3 - Contributing +# 5 - Template Page +# 10 - Default +search: + boost: 10 +--- + # Publisher Direct Usage The Publisher Object provides a full-featured way to publish messages. It has an [**AsyncAPI**](../asyncapi/custom.md){.internal-link} representation and includes [testability](./test.md){.internal-link} features. This method creates a reusable Publisher object that can be used directly to publish a message: -{!> includes/getting_started/publishing/direct/1.md !} +{! includes/getting_started/publishing/direct/1.md !} It is something in the middle between [broker publish](./broker.md){.internal-link} and [object decorator](./object.md){.internal-link}. It has an **AsyncAPI** representation and *testability* features (like the **object decorator**), but allows you to send different messages to different outputs (like the **broker publish**). 
-```python +```python hl_lines="3-4" @broker.subscriber("in") async def handle(msg) -> str: await publisher1.publish("Response-1") diff --git a/docs/docs/en/getting-started/publishing/index.md b/docs/docs/en/getting-started/publishing/index.md index e2d5e75da8..a9237d5348 100644 --- a/docs/docs/en/getting-started/publishing/index.md +++ b/docs/docs/en/getting-started/publishing/index.md @@ -1,10 +1,20 @@ +--- +# 0.5 - API +# 2 - Release +# 3 - Contributing +# 5 - Template Page +# 10 - Default +search: + boost: 10 +--- + # Publishing Basics **FastStream** is broker-agnostic and easy to use, even as a client in non-**FastStream** applications. It offers several use cases for publishing messages: -* Using `#!python broker.publish(...)` +* Using `#!python broker.publish(...)` method * Using the `#!python @broker.publisher(...)` decorator * Using a publisher object decorator * Using a publisher object directly @@ -35,4 +45,4 @@ You just need to `#!python connect` your broker, and you are ready to send a mes To publish a message, simply set up the message content and a routing key: -{!> includes/getting_started/publishing/index.md !} +{! includes/getting_started/publishing/index.md !} diff --git a/docs/docs/en/getting-started/publishing/object.md b/docs/docs/en/getting-started/publishing/object.md index fe141a881e..263dc48586 100644 --- a/docs/docs/en/getting-started/publishing/object.md +++ b/docs/docs/en/getting-started/publishing/object.md @@ -1,3 +1,13 @@ +--- +# 0.5 - API +# 2 - Release +# 3 - Contributing +# 5 - Template Page +# 10 - Default +search: + boost: 10 +--- + # Publisher Object The Publisher Object provides a full-featured way to publish messages. It has an [**AsyncAPI**](../asyncapi/custom.md){.internal-link} representation and includes [testability](./test.md){.internal-link} features. This method creates a reusable Publisher object. @@ -7,13 +17,13 @@ Additionally, this object can be used as a decorator. The order of Subscriber an !!! 
note It uses the handler function's return type annotation to cast the function's return value before sending, so be accurate with it. -{!> includes/getting_started/publishing/object/1.md !} +{! includes/getting_started/publishing/object/1.md !} ## Message Broadcasting The decorator can be used multiple times with one function to broadcast the function's return: -```python +```python hl_lines="1-2" @publisher1 @publisher2 @broker.subscriber("in") diff --git a/docs/docs/en/getting-started/publishing/test.md b/docs/docs/en/getting-started/publishing/test.md index afcf61bf88..174b225922 100644 --- a/docs/docs/en/getting-started/publishing/test.md +++ b/docs/docs/en/getting-started/publishing/test.md @@ -1,3 +1,13 @@ +--- +# 0.5 - API +# 2 - Release +# 3 - Contributing +# 5 - Template Page +# 10 - Default +search: + boost: 10 +--- + # Publisher Testing If you are working with a Publisher object (either as a decorator or directly), you have several testing features available: @@ -20,7 +30,7 @@ Let's take a look at a simple application example with a publisher as a decorato To test it, you just need to patch your broker with a special *TestBroker*. -{!> includes/getting_started/publishing/testing/3.md !} +{! includes/getting_started/publishing/testing/3.md !} By default, it patches your broker to run **In-Memory**, so you can use it without any external broker. It should be extremely useful in your CI or local development environment.
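The mock-based assertions a patched broker exposes on handlers can be illustrated with the standard library alone — the `handle` object below is a plain `AsyncMock` standing in for a subscriber, not a FastStream object:

```python
import asyncio
from unittest.mock import AsyncMock

# Stand-in for a handler patched by TestBroker
handle = AsyncMock(return_value="Hi!")

async def main():
    # In-memory "publish": the handler is awaited directly
    result = await handle({"name": "John", "user_id": 1})
    # The same assertion style is available on patched handlers
    handle.assert_called_once_with({"name": "John", "user_id": 1})
    return result

assert asyncio.run(main()) == "Hi!"
```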
diff --git a/docs/docs/en/getting-started/routers/index.md b/docs/docs/en/getting-started/routers/index.md index bf01346181..8539af2e46 100644 --- a/docs/docs/en/getting-started/routers/index.md +++ b/docs/docs/en/getting-started/routers/index.md @@ -1,4 +1,11 @@ --- +# 0.5 - API +# 2 - Release +# 3 - Contributing +# 5 - Template Page +# 10 - Default +search: + boost: 10 # template variables note_decor: Now you can use the created router to register handlers and publishers as if it were a regular broker note_include: Then you can simply include all the handlers declared using the router in your broker @@ -26,13 +33,13 @@ First, you need to import the *Broker Router* from the same module from where yo {{ includes }} !!! tip - Also, when creating a *Broker Router*, you can specify [middleware](../middlewares/index.md), [dependencies](../dependencies/global.md), [parser](../serialization/parser.md) and [decoder](../serialization/decoder.md) to apply them to all subscribers declared via this router. + Also, when creating a *Broker Router*, you can specify [middleware](../middlewares/index.md), [dependencies](../dependencies/index.md#top-level-dependencies), [parser](../serialization/parser.md) and [decoder](../serialization/decoder.md) to apply them to all subscribers declared via this router. ## Delay Handler Registration If you want to separate your application's core logic from **FastStream**'s routing logic, you can write some core functions and use them as *Broker Router* `handlers` later: -{!> includes/getting_started/routers/2.md !} +{! includes/getting_started/routers/2.md !} !!! warning Be careful, this way you won't be able to test your handlers with a [`mock`](../subscription/test.md) object. 
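The include pattern described above — register handlers on a router first, merge them into a broker later — can be sketched in a few lines of plain Python. The `Router`/`Broker` classes here are illustrative, not FastStream internals:

```python
class Router:
    """Collects handlers without needing a live broker."""

    def __init__(self):
        self.handlers = {}

    def subscriber(self, subject):
        def register(func):
            self.handlers[subject] = func
            return func
        return register


class Broker(Router):
    def include_router(self, router):
        # Merge all handlers declared on the router into this broker
        self.handlers.update(router.handlers)


router = Router()

@router.subscriber("in")
def handle(msg):
    return msg.upper()

broker = Broker()
broker.include_router(router)
assert broker.handlers["in"]("hi") == "HI"
```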
diff --git a/docs/docs/en/getting-started/serialization/decoder.md b/docs/docs/en/getting-started/serialization/decoder.md index 73afc98a9e..c0e5610efa 100644 --- a/docs/docs/en/getting-started/serialization/decoder.md +++ b/docs/docs/en/getting-started/serialization/decoder.md @@ -1,3 +1,13 @@ +--- +# 0.5 - API +# 2 - Release +# 3 - Contributing +# 5 - Template Page +# 10 - Default +search: + boost: 10 +--- + # Custom Decoder At this stage, the body of a **StreamMessage** is transformed into the format that it will take when it enters your handler function. This stage is the one you will need to redefine more often. @@ -6,11 +16,11 @@ At this stage, the body of a **StreamMessage** is transformed into the format th The original decoder function has a relatively simple signature (this is a simplified version): -{!> includes/getting_started/serialization/decoder/1.md !} +{! includes/getting_started/serialization/decoder/1.md !} Alternatively, you can reuse the original decoder function with the following signature: -{!> includes/getting_started/serialization/decoder/2.md !} +{! includes/getting_started/serialization/decoder/2.md !} !!! note The original decoder is always an asynchronous function, so your custom decoder should also be asynchronous. @@ -19,4 +29,4 @@ Afterward, you can set this custom decoder at the broker or subscriber level. ## Example -You can find examples of *Protobuf* and *Msgpack* serialization in the [next article](./examples.md){.internal-link}. +You can find examples of *Protobuf*, *Msgpack* and *Avro* serialization in the [next article](./examples.md){.internal-link}. 
diff --git a/docs/docs/en/getting-started/serialization/examples.md b/docs/docs/en/getting-started/serialization/examples.md index a66a87f9db..5a19ab10a1 100644 --- a/docs/docs/en/getting-started/serialization/examples.md +++ b/docs/docs/en/getting-started/serialization/examples.md @@ -1,3 +1,13 @@ +--- +# 0.5 - API +# 2 - Release +# 3 - Contributing +# 5 - Template Page +# 10 - Default +search: + boost: 10 +--- + # Serialization examples ## Protobuf @@ -32,14 +42,14 @@ python -m grpc_tools.protoc --python_out=. --pyi_out=. -I . message.proto This generates two files: `message_pb2.py` and `message_pb2.pyi`. We can use the generated class to serialize our messages: -``` python linenums="1" hl_lines="1 10-13 16 23" -{!> docs_src/getting_started/serialization/protobuf.py !} +```python linenums="1" hl_lines="1 10-13 16 23" +{! docs_src/getting_started/serialization/protobuf.py !} ``` Note that we used the `NoCast` annotation to exclude the message from the `pydantic` representation of our handler. -``` python -{!> docs_src/getting_started/serialization/protobuf.py [ln:17] !} +```python +{! docs_src/getting_started/serialization/protobuf.py [ln:17] !} ``` ## Msgpack @@ -54,8 +64,8 @@ pip install msgpack Since there is no need for a schema, you can easily write a *Msgpack* decoder: -``` python linenums="1" hl_lines="1 10-11 14 21" -{!> docs_src/getting_started/serialization/msgpack_ex.py !} +```python linenums="1" hl_lines="1 10-11 14 21" +{! docs_src/getting_started/serialization/msgpack_ex.py !} ``` Using *Msgpack* is much simpler than using *Protobuf* schemas. Therefore, if you don't have strict message size limitations, you can use *Msgpack* serialization in most cases. @@ -75,26 +85,26 @@ pip install fastavro Next, let's define the schema for our message. You can either define it in the Python file itself as: -``` python -{!> docs_src/getting_started/serialization/avro.py [ln:10-19] !} +```python +{! 
docs_src/getting_started/serialization/avro.py [ln:10-19] !} ``` Or you can load the schema from an avsc file as: -``` python -{!> docs_src/getting_started/serialization/avro.py [ln:21] !} +```python +{! docs_src/getting_started/serialization/avro.py [ln:21] !} ``` The contents of the `person.avsc` file are: -``` avro -{!> docs_src/getting_started/serialization/person.avsc !} +```json title="person.avsc" +{! docs_src/getting_started/serialization/person.avsc !} ``` Finally, let's use Avro's `schemaless_reader` and `schemaless_writer` to decode and encode messages in the `FastStream` app. -``` python linenums="1" hl_lines="1 3 15-18 21 30-32" -{!> docs_src/getting_started/serialization/avro.py [ln:1-9,23-45] !} +```python linenums="1" hl_lines="1 3 15-18 21 30-32" +{! docs_src/getting_started/serialization/avro.py [ln:1-10,22-45] !} ``` ## Tips diff --git a/docs/docs/en/getting-started/serialization/index.md b/docs/docs/en/getting-started/serialization/index.md index d2ddd23102..81d55d5771 100644 --- a/docs/docs/en/getting-started/serialization/index.md +++ b/docs/docs/en/getting-started/serialization/index.md @@ -1,3 +1,13 @@ +--- +# 0.5 - API +# 2 - Release +# 3 - Contributing +# 5 - Template Page +# 10 - Default +search: + boost: 10 +--- + # Custom Serialization By default, **FastStream** uses the *JSON* format to send and receive messages. However, if you need to handle messages in other formats or with additional serialization steps, such as *gzip*, *lz4*, *Avro*, *Protobuf* or *Msgpack*, you can easily modify the serialization logic. 
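As a minimal illustration of such an extra serialization step, a decoder for gzip-compressed JSON payloads might look like the sketch below. The `msg.body` attribute follows the simplified decoder signature described earlier; `FakeMessage` is a test stand-in, not a FastStream class:

```python
import asyncio
import gzip
import json

async def gzip_json_decoder(msg):
    # Decompress the raw bytes first, then fall through to normal JSON decoding
    return json.loads(gzip.decompress(msg.body))

class FakeMessage:
    # Minimal stand-in exposing the raw payload as bytes
    def __init__(self, body: bytes):
        self.body = body

payload = gzip.compress(json.dumps({"name": "John"}).encode())
decoded = asyncio.run(gzip_json_decoder(FakeMessage(payload)))
assert decoded == {"name": "John"}
```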
diff --git a/docs/docs/en/getting-started/serialization/parser.md b/docs/docs/en/getting-started/serialization/parser.md index 857ddf3176..34ebf73897 100644 --- a/docs/docs/en/getting-started/serialization/parser.md +++ b/docs/docs/en/getting-started/serialization/parser.md @@ -1,3 +1,13 @@ +--- +# 0.5 - API +# 2 - Release +# 3 - Contributing +# 5 - Template Page +# 10 - Default +search: + boost: 10 +--- + # Custom Parser At this stage, **FastStream** serializes an incoming message from the broker's framework into a general format called **StreamMessage**. During this stage, the message body remains in the form of raw bytes. @@ -10,11 +20,11 @@ For example, you can specify your own header with the `message_id` semantic. Thi To create a custom message parser, you should write a regular Python function (synchronous or asynchronous) with the following signature: -{!> includes/getting_started/serialization/parser/1.md !} +{! includes/getting_started/serialization/parser/1.md !} Alternatively, you can reuse the original parser function with the following signature: -{!> includes/getting_started/serialization/parser/2.md !} +{! includes/getting_started/serialization/parser/2.md !} The argument naming doesn't matter; the parser will always be placed as the second argument. @@ -27,4 +37,4 @@ Afterward, you can set this custom parser at the broker or subscriber level. As an example, let's redefine `message_id` to a custom header: -{!> includes/getting_started/serialization/parser/3.md !} +{! 
includes/getting_started/serialization/parser/3.md !} diff --git a/docs/docs/en/getting-started/subscription/annotation.md b/docs/docs/en/getting-started/subscription/annotation.md index f40b1aa047..e7ca77912a 100644 --- a/docs/docs/en/getting-started/subscription/annotation.md +++ b/docs/docs/en/getting-started/subscription/annotation.md @@ -1,3 +1,13 @@ +--- +# 0.5 - API +# 2 - Release +# 3 - Contributing +# 5 - Template Page +# 10 - Default +search: + boost: 10 +--- + # Annotation Serialization ## Basic usage @@ -6,7 +16,7 @@ As you already know, **FastStream** serializes your incoming message body accord So, there are some valid use cases: -{!> includes/getting_started/subscription/annotation/1.md !} +{! includes/getting_started/subscription/annotation/1.md !} As with other Python primitive types as well (`#!python float`, `#!python bool`, `#!python datetime`, etc) @@ -21,10 +31,10 @@ But how can we serialize more complex message, like `#!json { "name": "John", "u For sure, we can serialize it as a simple `#!python dict` -{!> includes/getting_started/subscription/annotation/2.md !} +{! includes/getting_started/subscription/annotation/2.md !} But it doesn't look like correct message validation, does it? For this reason, **FastStream** supports per-argument message serialization: you can declare multiple arguments with various types and your message will unpack to them: -{!> includes/getting_started/subscription/annotation/3.md !} +{! 
includes/getting_started/subscription/annotation/3.md !} diff --git a/docs/docs/en/getting-started/subscription/filtering.md b/docs/docs/en/getting-started/subscription/filtering.md index c12bb46fca..715eac66d4 100644 --- a/docs/docs/en/getting-started/subscription/filtering.md +++ b/docs/docs/en/getting-started/subscription/filtering.md @@ -1,3 +1,13 @@ +--- +# 0.5 - API +# 2 - Release +# 3 - Contributing +# 5 - Template Page +# 10 - Default +search: + boost: 10 +--- + # Application-level Filtering **FastStream** also allows you to specify the message processing way using message headers, body type or something else. The `filter` feature enables you to consume various messages with different schemas within a single event stream. @@ -7,15 +17,15 @@ As an example, let's create a subscriber for both `JSON` and non-`JSON` messages: -{!> includes/getting_started/subscription/filtering/1.md !} +{! includes/getting_started/subscription/filtering/1.md !} !!! note A subscriber without a filter is a default subscriber. It consumes messages that have not been consumed yet. For now, the following message will be delivered to the `handle` function -{!> includes/getting_started/subscription/filtering/2.md !} +{! includes/getting_started/subscription/filtering/2.md !} And this one will be delivered to the `default_handler` -{!> includes/getting_started/subscription/filtering/3.md !} +{! includes/getting_started/subscription/filtering/3.md !} diff --git a/docs/docs/en/getting-started/subscription/index.md b/docs/docs/en/getting-started/subscription/index.md index 767abfc9a4..5ca412a1af 100644 --- a/docs/docs/en/getting-started/subscription/index.md +++ b/docs/docs/en/getting-started/subscription/index.md @@ -1,3 +1,13 @@ +--- +# 0.5 - API +# 2 - Release +# 3 - Contributing +# 5 - Template Page +# 10 - Default +search: + boost: 10 +--- + # Subscription Basics **FastStream** provides a Message Broker agnostic way to subscribe to event streams. 
@@ -5,7 +15,7 @@ You need not even know about topics/queues/subjects or any broker inner objects you use. The basic syntax is the same for all brokers: -{!> includes/getting_started/subscription/index/1.md !} +{! includes/getting_started/subscription/index/1.md !} !!! tip If you want to use Message Broker specific features, please visit the corresponding broker documentation section. @@ -13,13 +23,13 @@ The basic syntax is the same for all brokers: Also, synchronous functions are supported as well: -{!> includes/getting_started/subscription/index/sync.md !} +{! includes/getting_started/subscription/index/sync.md !} ## Message Body Serialization Generally, **FastStream** uses your function type annotation to serialize incoming message body with [**Pydantic**](https://docs.pydantic.dev){.external-link target="_blank"}. This is similar to how [**FastAPI**](https://fastapi.tiangolo.com){.external-link target="_blank"} works (if you are familiar with it). -{!> includes/getting_started/subscription/index/2.md !} +{! includes/getting_started/subscription/index/2.md !} You can also access some extra features through the function arguments, such as [Depends](../dependencies/index.md){.internal-link} and [Context](../context/existed.md){.internal-link} if required. @@ -27,10 +37,10 @@ However, you can easily disable Pydantic validation by creating a broker with th This way **FastStream** still consumes `#!python json.loads` result, but without pydantic validation and casting. -{!> includes/getting_started/subscription/index/3.md !} +{! includes/getting_started/subscription/index/3.md !} ## Multiple Subscriptions You can also subscribe to multiple event streams at the same time with one function. Just wrap it with multiple `#!python @broker.subscriber(...)` decorators (they have no effect on each other). -{!> includes/getting_started/subscription/index/4.md !} +{! 
includes/getting_started/subscription/index/4.md !} diff --git a/docs/docs/en/getting-started/subscription/pydantic.md b/docs/docs/en/getting-started/subscription/pydantic.md index 117e006cee..aef5dd679d 100644 --- a/docs/docs/en/getting-started/subscription/pydantic.md +++ b/docs/docs/en/getting-started/subscription/pydantic.md @@ -1,3 +1,13 @@ +--- +# 0.5 - API +# 2 - Release +# 3 - Contributing +# 5 - Template Page +# 10 - Default +search: + boost: 10 +--- + # Pydantic Serialization ## pydantic.Field @@ -8,10 +18,18 @@ You can access this information with extra details using `pydantic.Field` (such Just use `pydantic.Field` as a function default argument: -{!> includes/getting_started/subscription/pydantic/1.md !} +{! includes/getting_started/subscription/pydantic/1.md !} + + +!!! tip + Also, you can use `typing.Annotated` (Python 3.9+) or `typing_extensions.Annotated` to declare your handler fields + + ```python + {!> docs_src/getting_started/subscription/kafka/pydantic_annotated_fields.py [ln:14.5,15.5,16.5,17.5,18.5,19.5,20.5,21.5] !} + ``` ## pydantic.BaseModel To make your message schema reusable between different subscribers and publishers, you can declare it as a `pydantic.BaseModel` and use it as a single message annotation: -{!> includes/getting_started/subscription/pydantic/2.md !} +{! includes/getting_started/subscription/pydantic/2.md !} diff --git a/docs/docs/en/getting-started/subscription/test.md b/docs/docs/en/getting-started/subscription/test.md index 459ffa675c..497f307c9a 100644 --- a/docs/docs/en/getting-started/subscription/test.md +++ b/docs/docs/en/getting-started/subscription/test.md @@ -1,3 +1,13 @@ +--- +# 0.5 - API +# 2 - Release +# 3 - Contributing +# 5 - Template Page +# 10 - Default +search: + boost: 10 +--- + # Subscriber Testing Testability is a crucial part of any application, and **FastStream** provides you with the tools to test your code easily.
@@ -6,7 +16,7 @@ Testability is a crucial part of any application, and **FastStream** provides yo Let's take a look at the original application to test -{!> includes/getting_started/subscription/testing/1.md !} +{! includes/getting_started/subscription/testing/1.md !} It consumes **JSON** messages like `#!json { "name": "username", "user_id": 1 }` @@ -28,19 +38,19 @@ For this reason, **FastStream** has a special `TestClient` to make your broker w Just use it like a regular async context manager - all published messages will be routed in-memory (without any external dependencies) and consumed by the correct handler. -{!> includes/getting_started/subscription/testing/2.md !} +{! includes/getting_started/subscription/testing/2.md !} ### Catching Exceptions This way you can catch any exceptions that occur inside your handler: -{!> includes/getting_started/subscription/testing/3.md !} +{! includes/getting_started/subscription/testing/3.md !} ### Validates Input Also, your handler has a mock object to validate your input or call counts. -{!> includes/getting_started/subscription/testing/4.md !} +{! includes/getting_started/subscription/testing/4.md !} !!! note The Handler mock has a not-serialized **JSON** message body. This way you can validate the incoming message view, not python arguments. @@ -49,13 +59,13 @@ Also, your handler has a mock object to validate your input or call counts. You should be careful with this feature: all mock objects will be cleared when the context manager exits. -{!> includes/getting_started/subscription/testing/5.md !} +{! includes/getting_started/subscription/testing/5.md !} ## Real Broker Testing -If you want to test your application in a real environment, you shouldn't have to rewrite all you tests: just pass `with_real` optional parameter to your `TestClient` context manager. This way, `TestClient` supports all the testing features but uses an unpatched broker to send and consume messages. 
+If you want to test your application in a real environment, you shouldn't have to rewrite all your tests: just pass the optional `with_real` parameter to your `TestClient` context manager. This way, `TestClient` supports all the testing features but uses an unpatched broker to send and consume messages. -{!> includes/getting_started/subscription/testing/real.md !} +{! includes/getting_started/subscription/testing/real.md !} !!! tip When you're using a patched broker to test your consumers, the publish method is called synchronously with a consumer one, so you need not wait until your message is consumed. But in the real broker's case, it doesn't. @@ -68,7 +78,7 @@ If you want to test your application in a real environment, you shouldn't have t It can be very helpful to set the `with_real` flag using an environment variable. This way, you will be able to choose the testing mode right from the command line: ```bash -WITH_REAL=True/False pytest tests/ +WITH_REAL=True/False pytest ... ``` -To learn more about managing your application configiruation visit [this](../config/index.md){.internal-link} page. +To learn more about managing your application configuration, visit [this page](../config/index.md){.internal-link}. diff --git a/docs/docs/en/kafka/Publisher/batch_publisher.md b/docs/docs/en/kafka/Publisher/batch_publisher.md index 3bb19d52b7..9469ec4e90 100644 --- a/docs/docs/en/kafka/Publisher/batch_publisher.md +++ b/docs/docs/en/kafka/Publisher/batch_publisher.md @@ -1,3 +1,13 @@ +--- +# 0.5 - API +# 2 - Release +# 3 - Contributing +# 5 - Template Page +# 10 - Default +search: + boost: 10 +--- + # Publishing in Batches ## General Overview @@ -15,7 +25,7 @@ Let's delve into a detailed example illustrating how to produce messages in batc First, let's take a look at the whole app creation and then dive deep into the steps for producing in batches. Here is the application code: ```python linenums="1" -{!> docs_src/kafka/publish_batch/app.py!} +{! 
docs_src/kafka/publish_batch/app.py!} ``` Below, we have highlighted key lines of code that demonstrate the steps involved in creating and using a batch publisher: @@ -23,7 +33,7 @@ Step 1: Creation of the Publisher ```python linenums="1" -{!> docs_src/kafka/publish_batch/app.py [ln:19] !} +{! docs_src/kafka/publish_batch/app.py [ln:19] !} ``` Step 2: Publishing an Actual Batch of Messages @@ -31,13 +41,13 @@ Step 2: Publishing an Actual Batch of Messages You can publish a batch by directly calling the publisher with a batch of messages you want to publish, as shown here: ```python linenums="1" -{!> docs_src/kafka/publish_batch/app.py [ln:32-34] !} +{! docs_src/kafka/publish_batch/app.py [ln:32.5,33.5,34.5] !} ``` Or you can decorate your processing function and return a batch of messages, as shown here: ```python linenums="1" -{!> docs_src/kafka/publish_batch/app.py [ln:22-26] !} +{! docs_src/kafka/publish_batch/app.py [ln:22-26] !} ``` The application in the example implements both of these ways, so feel free to use whichever option fits your needs better. diff --git a/docs/docs/en/kafka/Publisher/index.md b/docs/docs/en/kafka/Publisher/index.md index ae97cd74f5..3bd1f276ac 100644 --- a/docs/docs/en/kafka/Publisher/index.md +++ b/docs/docs/en/kafka/Publisher/index.md @@ -1,3 +1,13 @@ +--- +# 0.5 - API +# 2 - Release +# 3 - Contributing +# 5 - Template Page +# 10 - Default +search: + boost: 10 +--- + # Publishing The **FastStream** KafkaBroker supports all regular [publishing use cases](../../getting-started/publishing/index.md){.internal-link}, and you can use them without any changes. @@ -18,10 +28,10 @@ You can specify the topic to send by its name. {!> docs_src/kafka/raw_publish/example.py [ln:8] !} ``` -1. Publish a message using the `publish` method +2. 
Publish a message using the `publish` method ```python linenums="1" - {!> docs_src/kafka/raw_publish/example.py [ln:26-32] !} + {!> docs_src/kafka/raw_publish/example.py [ln:26.9,27.7,28.9,29.9,30.9,31.9,32.9] !} ``` This is the most basic way of using the KafkaBroker to publish a message. @@ -36,16 +46,16 @@ The simplest way to use a KafkaBroker for publishing has a significant limitatio {!> docs_src/kafka/publisher_object/example.py [ln:8] !} ``` -1. Create a publisher instance +2. Create a publisher instance ```python linenums="1" {!> docs_src/kafka/publisher_object/example.py [ln:17] !} ``` -1. Publish a message using the `publish` method of the prepared publisher +3. Publish a message using the `publish` method of the prepared publisher ```python linenums="1" - {!> docs_src/kafka/publisher_object/example.py [ln:26-31] !} + {!> docs_src/kafka/publisher_object/example.py [ln:26.9,27.7,28.9,29.9,30.9,31.9] !} ``` Now, when you wrap your broker into a FastStream object, the publisher will be exported to the AsyncAPI documentation. @@ -61,7 +71,7 @@ This method relies on the return type annotation of the handler function to prop Let's start by examining the entire application that utilizes the Publisher Decorator and then proceed to walk through it step by step. ```python linenums="1" -{!> docs_src/kafka/publish_example/app.py [ln:1-26] !} +{! docs_src/kafka/publish_example/app.py [ln:1-26] !} ``` 1. **Initialize the KafkaBroker instance:** Start by initializing a KafkaBroker instance with the necessary configuration, including Kafka broker address. @@ -70,19 +80,19 @@ Let's start by examining the entire application that utilizes the Publisher Deco {!> docs_src/kafka/publish_example/app.py [ln:13] !} ``` -1. **Prepare your publisher object to use later as a decorator:** +2. **Prepare your publisher object to use later as a decorator:** ```python linenums="1" {!> docs_src/kafka/publish_example/app.py [ln:17] !} ``` -1. 
**Create your processing logic:** Write a function that will consume the incoming messages in the defined format and produce a response to the defined topic +3. **Create your processing logic:** Write a function that will consume the incoming messages in the defined format and produce a response to the defined topic ```python linenums="1" {!> docs_src/kafka/publish_example/app.py [ln:22-23] !} ``` -1. **Decorate your processing function:** To connect your processing function to the desired Kafka topics you need to decorate it with `#!python @broker.subscriber` and `#!python @broker.publisher` decorators. Now, after you start your application, your processing function will be called whenever a new message in the subscribed topic is available and produce the function return value to the topic defined in the publisher decorator. +4. **Decorate your processing function:** To connect your processing function to the desired Kafka topics you need to decorate it with `#!python @broker.subscriber(...)` and `#!python @broker.publisher(...)` decorators. Now, after you start your application, your processing function will be called whenever a new message in the subscribed topic is available and produce the function return value to the topic defined in the publisher decorator. ```python linenums="1" {!> docs_src/kafka/publish_example/app.py [ln:20-23] !} diff --git a/docs/docs/en/kafka/Publisher/using_a_key.md b/docs/docs/en/kafka/Publisher/using_a_key.md index f848bdbd9b..a8a3f0182d 100644 --- a/docs/docs/en/kafka/Publisher/using_a_key.md +++ b/docs/docs/en/kafka/Publisher/using_a_key.md @@ -1,3 +1,13 @@ +--- +# 0.5 - API +# 2 - Release +# 3 - Contributing +# 5 - Template Page +# 10 - Default +search: + boost: 10 +--- + # Using a Partition Key Partition keys are a crucial concept in Apache **Kafka**, enabling you to determine the appropriate partition for a message. 
This ensures that related messages are kept together in the same partition, which can be invaluable for maintaining order or grouping related messages for efficient processing. Additionally, **Kafka** utilizes partitioning to distribute load across multiple brokers and scale horizontally, while replicating data across brokers provides fault tolerance. @@ -13,7 +23,7 @@ To publish a message to a **Kafka** topic using a partition key, follow these st In your FastStream application, define the publisher using the `#!python @KafkaBroker.publisher(...)` decorator. This decorator allows you to configure various aspects of message publishing, including the partition key. ```python linenums="1" -{!> docs_src/kafka/publish_with_partition_key/app.py [ln:17] !} +{! docs_src/kafka/publish_with_partition_key/app.py [ln:17] !} ``` ### Step 2: Pass the Key @@ -21,7 +31,7 @@ In your FastStream application, define the publisher using the `#!python @KafkaB When you're ready to publish a message with a specific key, simply include the `key` parameter in the `publish` function call. This key parameter is used to determine the appropriate partition for the message. ```python linenums="1" -{!> docs_src/kafka/publish_with_partition_key/app.py [ln:25] !} +{! docs_src/kafka/publish_with_partition_key/app.py [ln:25.5] !} ``` ## Example Application @@ -29,7 +39,7 @@ When you're ready to publish a message with a specific key, simply include the ` Let's examine a complete application example that consumes messages from the `#!python "input_data"` topic and publishes them with a specified key to the `#!python "output_data"` topic. This example will illustrate how to incorporate partition keys into your **Kafka**-based applications: ```python linenums="1" -{!> docs_src/kafka/publish_with_partition_key/app.py [ln:1-25] !} +{! 
docs_src/kafka/publish_with_partition_key/app.py [ln:1-25] !} ``` As you can see, the primary difference from standard publishing is the inclusion of the `key` parameter in the `publish` call. This key parameter is essential for controlling how **Kafka** partitions and processes your messages. diff --git a/docs/docs/en/kafka/Subscriber/batch_subscriber.md b/docs/docs/en/kafka/Subscriber/batch_subscriber.md index 32fb960fc9..ded564ef60 100644 --- a/docs/docs/en/kafka/Subscriber/batch_subscriber.md +++ b/docs/docs/en/kafka/Subscriber/batch_subscriber.md @@ -1,3 +1,13 @@ +--- +# 0.5 - API +# 2 - Release +# 3 - Contributing +# 5 - Template Page +# 10 - Default +search: + boost: 10 +--- + # Batch Subscriber If you want to consume data in batches, the `#!python @broker.subscriber(...)` decorator makes it possible. By defining your consumed `msg` object as a list of messages and setting the `batch` parameter to `True`, the subscriber will call your consuming function with a batch of messages consumed from a single partition. Let's walk through how to achieve this. @@ -11,7 +21,7 @@ To consume messages in batches, follow these steps: In your FastStream application, define the subscriber using the `#!python @broker.subscriber(...)` decorator. Ensure that you configure the `msg` object as a list and set the `batch` parameter to `True`. This configuration tells the subscriber to handle message consumption in batches. ```python linenums="1" -{!> docs_src/kafka/batch_consuming_pydantic/app.py [ln:20] !} +{! docs_src/kafka/batch_consuming_pydantic/app.py [ln:20] !} ``` ### Step 2: Implement Your Consuming Function @@ -19,7 +29,7 @@ In your FastStream application, define the subscriber using the `#!python @broke Create a consuming function that accepts the list of messages. The `#!python @broker.subscriber(...)` decorator will take care of collecting and grouping messages into batches based on the partition. 
```python linenums="1" -{!> docs_src/kafka/batch_consuming_pydantic/app.py [ln:20-22] !} +{! docs_src/kafka/batch_consuming_pydantic/app.py [ln:20-22] !} ``` ## Example of Consuming in Batches @@ -27,7 +37,7 @@ Create a consuming function that accepts the list of messages. The `#!python @br Let's illustrate how to consume messages in batches from the `#!python "test_batch"` topic with a practical example: ```python linenums="1" -{!> docs_src/kafka/batch_consuming_pydantic/app.py!} +{! docs_src/kafka/batch_consuming_pydantic/app.py!} ``` In this example, the subscriber is configured to process messages in batches, and the consuming function is designed to handle these batches efficiently. diff --git a/docs/docs/en/kafka/Subscriber/index.md b/docs/docs/en/kafka/Subscriber/index.md index b704945f6f..f3b9df13dc 100644 --- a/docs/docs/en/kafka/Subscriber/index.md +++ b/docs/docs/en/kafka/Subscriber/index.md @@ -1,3 +1,13 @@ +--- +# 0.5 - API +# 2 - Release +# 3 - Contributing +# 5 - Template Page +# 10 - Default +search: + boost: 10 +--- + # Basic Subscriber To start consuming from a **Kafka** topic, simply decorate your consuming function with a `#!python @broker.subscriber(...)` decorator, passing a string as a topic key. @@ -7,7 +17,7 @@ In the following example, we will create a simple FastStream app that will consum The full app code looks like this: ```python linenums="1" -{!> docs_src/kafka/consumes_basics/app.py!} +{! docs_src/kafka/consumes_basics/app.py!} ``` ## Import FastStream and KafkaBroker @@ -15,7 +25,7 @@ The full app code looks like this: To use the `#!python @broker.subscriber(...)` decorator, first, we need to import the base FastStream app KafkaBroker to create our broker. ```python linenums="1" -{!> docs_src/kafka/consumes_basics/app.py [ln:3-4] !} +{!
docs_src/kafka/consumes_basics/app.py [ln:3-4] !} ``` ## Define the HelloWorld Message Structure @@ -23,7 +33,7 @@ To use the `#!python @broker.subscriber(...)` decorator, first, we need to impor Next, you need to define the structure of the messages you want to consume from the topic using Pydantic. For the guide, we’ll stick to something basic, but you are free to define any complex message structure you wish in your project. ```python linenums="1" -{!> docs_src/kafka/consumes_basics/app.py [ln:7-12] !} +{! docs_src/kafka/consumes_basics/app.py [ln:7-12] !} ``` ## Create a KafkaBroker @@ -31,7 +41,7 @@ Next, you need to define the structure of the messages you want to consume from Next, we will create a `KafkaBroker` object and wrap it into the `FastStream` object so that we can start our app using CLI later. ```python linenums="1" -{!> docs_src/kafka/consumes_basics/app.py [ln:15-16] !} +{! docs_src/kafka/consumes_basics/app.py [ln:15-16] !} ``` ## Create a Function that will Consume Messages from a Kafka hello-world Topic @@ -39,7 +49,7 @@ Next, we will create a `KafkaBroker` object and wrap it into the `FastStream` ob Let’s create a consumer function that will consume `HelloWorld` messages from `#!python "hello_world"` topic and log them. ```python linenums="1" -{!> docs_src/kafka/consumes_basics/app.py [ln:19-21] !} +{! docs_src/kafka/consumes_basics/app.py [ln:19-21] !} ``` The function decorated with the `#!python @broker.subscriber(...)` decorator will be called when a message is produced to **Kafka**. diff --git a/docs/docs/en/kafka/ack.md b/docs/docs/en/kafka/ack.md index a5bb5e0662..c3bf499c45 100644 --- a/docs/docs/en/kafka/ack.md +++ b/docs/docs/en/kafka/ack.md @@ -1,3 +1,13 @@ +--- +# 0.5 - API +# 2 - Release +# 3 - Contributing +# 5 - Template Page +# 10 - Default +search: + boost: 10 +--- + # Consuming Acknowledgements As you may know, a *Kafka* consumer should commit a topic offset when consuming a message.
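The commit-after-processing rule introduced above is what gives at-least-once delivery. A broker-free sketch of the mechanic in plain Python (a conceptual model only — `consume` and `flaky` are invented names, not the FastStream or Kafka client API):

```python
# Toy model of Kafka offset commits (illustration only, not a real consumer).
# A consumer that commits *after* processing re-reads the failed message on restart.

def consume(log, committed, process):
    """Read from the committed offset, commit only after success."""
    offset = committed["offset"]
    while offset < len(log):
        try:
            process(log[offset])
        except RuntimeError:
            break  # crash: the current offset stays uncommitted
        offset += 1
        committed["offset"] = offset  # acknowledge / commit

log = ["msg-0", "msg-1", "msg-2"]
committed = {"offset": 0}
failures = {"msg-1"}

def flaky(msg):
    if msg in failures:
        failures.discard(msg)  # fail once, then succeed on redelivery
        raise RuntimeError("crash before commit")

consume(log, committed, flaky)   # stops at msg-1, committed offset stays 1
consume(log, committed, flaky)   # msg-1 is redelivered, then msg-2
print(committed["offset"])       # 3 — every message eventually committed
```

Because the offset is committed only after the handler succeeds, a crash mid-message means that message is delivered again on the next run — which is why handlers should be idempotent.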
@@ -44,10 +54,10 @@ async def base_handler(body: str, msg: KafkaMessage): If you wish to interrupt the processing of a message at any call stack level and acknowledge the message, you can achieve that by raising the `faststream.exceptions.AckMessage`. -``` python linenums="1" hl_lines="2 18" -{!> docs_src/kafka/ack/errors.py !} +```python linenums="1" hl_lines="2 18" +{! docs_src/kafka/ack/errors.py !} ``` This way, **FastStream** interrupts the current message processing and acknowledges it immediately. Similarly, you can raise `NackMessage` as well to prevent the message from being committed. -{!> includes/en/no_ack.md !} +{! includes/en/no_ack.md !} diff --git a/docs/docs/en/kafka/index.md b/docs/docs/en/kafka/index.md index 1f40856c2c..7e5bd11e9c 100644 --- a/docs/docs/en/kafka/index.md +++ b/docs/docs/en/kafka/index.md @@ -1,3 +1,13 @@ +--- +# 0.5 - API +# 2 - Release +# 3 - Contributing +# 5 - Template Page +# 10 - Default +search: + boost: 10 +--- + # Kafka Routing ## Kafka Overview @@ -39,12 +49,12 @@ To connect to Kafka using the FastStream KafkaBroker module, follow these steps: 2. **Create your processing logic:** Write a function that will consume the incoming messages in the defined format and produce a response to the defined topic -3. **Decorate your processing function:** To connect your processing function to the desired Kafka topics you need to decorate it with `#!python @broker.subscriber` and `#!python @broker.publisher` decorators. Now, after you start your application, your processing function will be called whenever a new message in the subscribed topic is available and produce the function return value to the topic defined in the publisher decorator. +3. **Decorate your processing function:** To connect your processing function to the desired Kafka topics you need to decorate it with `#!python @broker.subscriber(...)` and `#!python @broker.publisher(...)` decorators. 
Now, after you start your application, your processing function will be called whenever a new message in the subscribed topic is available and produce the function return value to the topic defined in the publisher decorator. Here's a simplified code example demonstrating how to establish a connection to Kafka using FastStream's KafkaBroker module: ```python linenums="1" -{!> docs_src/index/kafka/basic.py!} +{! docs_src/index/kafka/basic.py!} ``` This minimal example illustrates how FastStream simplifies the process of connecting to Kafka and performing basic message processing from the **in_topic** to the **out-topic**. Depending on your specific use case and requirements, you can further customize your Kafka integration with FastStream to build robust and efficient streaming applications. diff --git a/docs/docs/en/kafka/message.md b/docs/docs/en/kafka/message.md index 0950430ec0..4d82f716d3 100644 --- a/docs/docs/en/kafka/message.md +++ b/docs/docs/en/kafka/message.md @@ -1,3 +1,13 @@ +--- +# 0.5 - API +# 2 - Release +# 3 - Contributing +# 5 - Template Page +# 10 - Default +search: + boost: 10 +--- + # Access to Message Information As you may know, **FastStream** serializes a message body and provides you access to it through function arguments. However, there are times when you need to access additional message attributes such as offsets, headers, or other metadata. @@ -52,4 +62,4 @@ async def base_handler( print(headers) ``` -{!> includes/message/headers.md !} +{! includes/message/headers.md !} diff --git a/docs/docs/en/kafka/security.md b/docs/docs/en/kafka/security.md index de36c6e751..12a3d05361 100644 --- a/docs/docs/en/kafka/security.md +++ b/docs/docs/en/kafka/security.md @@ -1,3 +1,13 @@ +--- +# 0.5 - API +# 2 - Release +# 3 - Contributing +# 5 - Template Page +# 10 - Default +search: + boost: 10 +--- + # FastStream Kafka Security This chapter discusses the security options available in **FastStream** and how to use them. 
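The **in_topic**-to-**out-topic** flow described above can be modeled with a toy in-memory broker. This is a hedged sketch of the decorator wiring only — `ToyBroker` is invented for illustration; FastStream's real brokers are asynchronous and speak the actual Kafka protocol:

```python
# A minimal in-memory model of the subscribe -> process -> publish wiring
# (illustration only; not FastStream's implementation).
from collections import defaultdict

class ToyBroker:
    def __init__(self):
        self.topics = defaultdict(list)   # topic name -> delivered messages
        self.handlers = {}                # topic name -> processing function

    def subscriber(self, topic):
        def decorator(func):
            self.handlers[topic] = func   # register the consumer
            return func
        return decorator

    def publisher(self, topic):
        def decorator(func):
            def wrapper(msg):
                result = func(msg)
                self.publish(result, topic)  # forward the return value
                return result
            return wrapper
        return decorator

    def publish(self, msg, topic):
        self.topics[topic].append(msg)
        handler = self.handlers.get(topic)
        if handler:
            handler(msg)

broker = ToyBroker()

@broker.subscriber("in_topic")
@broker.publisher("out_topic")
def handle(msg: str) -> str:
    return msg.upper()

broker.publish("hello", "in_topic")
print(broker.topics["out_topic"])  # ['HELLO']
```

The point of the sketch: the subscriber decorator wires the function to incoming messages, while the publisher decorator forwards whatever the function returns — the processing function itself stays a plain, testable callable.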
@@ -12,8 +22,8 @@ This chapter discusses the security options available in **FastStream** and how **Usage:** -```python linenums="1" -{!> docs_src/kafka/basic_security/app.py [ln:1-11] !} +```python linenums="1" hl_lines="4 7 9" +{! docs_src/kafka/security/basic.py !} ``` ### 2. SASLPlaintext Object with SSL/TLS @@ -23,7 +33,7 @@ This chapter discusses the security options available in **FastStream** and how **Usage:** ```python linenums="1" -{!> docs_src/kafka/plaintext_security/app.py [ln:1-11] !} +{! docs_src/kafka/security/plaintext.py !} ``` **Using any SASL authentication without SSL:** @@ -31,13 +41,13 @@ This chapter discusses the security options available in **FastStream** and how The following example will log a **RuntimeWarning**: ```python linenums="1" -{!> docs_src/kafka/security_without_ssl/example.py [ln:8] !} +{! docs_src/kafka/security/ssl_warning.py [ln:8.16] !} ``` If the user does not want to use SSL encryption without the warning getting logged, they must explicitly set the `use_ssl` parameter to `False` when creating a SASL object. ```python linenums="1" -{!> docs_src/kafka/security_without_ssl/example.py [ln:12] !} +{! docs_src/kafka/security/ssl_warning.py [ln:12.5] !} ``` ### 3.
SASLScram256/512 Object with SSL/TLS @@ -48,10 +58,10 @@ If the user does not want to use SSL encryption without the waringning getting l === "SCRAM256" ```python linenums="1" - {!> docs_src/kafka/sasl_scram256_security/app.py [ln:1-11] !} + {!> docs_src/kafka/security/sasl_scram256.py !} ``` === "SCRAM512" ```python linenums="1" - {!> docs_src/kafka/sasl_scram512_security/app.py [ln:1-11] !} + {!> docs_src/kafka/security/sasl_scram512.py !} ``` diff --git a/docs/docs/en/nats/examples/direct.md b/docs/docs/en/nats/examples/direct.md index 5d0d574e6a..d2a0aad621 100644 --- a/docs/docs/en/nats/examples/direct.md +++ b/docs/docs/en/nats/examples/direct.md @@ -1,3 +1,13 @@ +--- +# 0.5 - API +# 2 - Release +# 3 - Contributing +# 5 - Template Page +# 10 - Default +search: + boost: 10 +--- + # Direct The **Direct** Subject is the basic way to route messages in *NATS*. Its essence is very simple: @@ -22,15 +32,15 @@ async def handler(): Full example: ```python linenums="1" -{!> docs_src/nats/direct.py !} +{! docs_src/nats/direct.py !} ``` ### Consumer Announcement -To begin with, we have declared several consumers for two `subjects`: `test-subj-1` and `test-subj-2`: +To begin with, we have declared several consumers for two `subjects`: `#!python "test-subj-1"` and `#!python "test-subj-2"`: ```python linenums="7" hl_lines="1 5 9" -{!> docs_src/nats/direct.py [ln:7-17]!} +{! docs_src/nats/direct.py [ln:7-17] !} ``` !!! note @@ -42,15 +52,15 @@ To begin with, we have declared several consumers for two `subjects`: `test-subj Now the distribution of messages between these consumers will look like this: ```python -{!> docs_src/nats/direct.py [ln:21]!} +{! docs_src/nats/direct.py [ln:21.5] !} ``` -The message `1` will be sent to `handler1` or `handler2` because they are listening to one `subject` within one `queue group`. +The message `1` will be sent to `handler1` or `handler2` because they are listening to one `#!python "test-subj-1"` `subject` within one `queue group`. 
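The distribution rule just described — only one member of a `queue group` receives each message — can be sketched with a small in-memory model. `ToySubjects` is invented for illustration; real *NATS* picks a group member essentially at random, while this sketch rotates deterministically:

```python
# Toy model of NATS queue groups (illustration only): subscribers that share
# a queue group split the subject's traffic; each distinct group gets its own copy.
from collections import defaultdict

class ToySubjects:
    def __init__(self):
        self.groups = defaultdict(dict)  # subject -> {queue group -> members}

    def subscribe(self, subject, queue, name):
        self.groups[subject].setdefault(queue, []).append(name)

    def publish(self, subject):
        receivers = []
        for members in self.groups[subject].values():
            # only ONE member of each queue group receives the message
            members.append(members.pop(0))  # naive round-robin rotation
            receivers.append(members[-1])
        return receivers

subjects = ToySubjects()
subjects.subscribe("test-subj-1", "workers", "handler1")
subjects.subscribe("test-subj-1", "workers", "handler2")
subjects.subscribe("test-subj-2", "workers", "handler3")

print(subjects.publish("test-subj-1"))  # ['handler1'] — one worker takes it
print(subjects.publish("test-subj-1"))  # ['handler2'] — the group shares the load
print(subjects.publish("test-subj-2"))  # ['handler3'] — the only subscriber there
```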
--- ```python -{!> docs_src/nats/direct.py [ln:22]!} +{! docs_src/nats/direct.py [ln:22.5] !} ``` Message `2` will be sent similarly to message `1`. @@ -58,7 +68,7 @@ Message `2` will be sent similarly to message `1`. --- ```python -{!> docs_src/nats/direct.py [ln:23]!} +{! docs_src/nats/direct.py [ln:23.5] !} ``` -The message `3` will be sent to `handler3` because it is the only one listening to `test-subj-2`. +The message `3` will be sent to `handler3` because it is the only one listening to `#!python "test-subj-2"`. diff --git a/docs/docs/en/nats/examples/pattern.md b/docs/docs/en/nats/examples/pattern.md index 2feed327d3..d30f3446de 100644 --- a/docs/docs/en/nats/examples/pattern.md +++ b/docs/docs/en/nats/examples/pattern.md @@ -1,3 +1,13 @@ +--- +# 0.5 - API +# 2 - Release +# 3 - Contributing +# 5 - Template Page +# 10 - Default +search: + boost: 10 +--- + # Pattern [**Pattern**](https://docs.nats.io/nats-concepts/subjects#wildcards){.external-link target="_blank"} Subject is a powerful *NATS* routing engine. This type of `subject` routes messages to consumers based on the *pattern* specified when they connect to the `subject` and a message key. @@ -11,15 +21,15 @@ Thus, *NATS* can independently balance the load on queue consumers. You can incr ## Example ```python linenums="1" -{!> docs_src/nats/pattern.py !} +{! docs_src/nats/pattern.py !} ``` ### Consumer Announcement -To begin with, we have announced several consumers for two `subjects`: `*.info` and `*.error`: +To begin with, we have announced several consumers for two `subjects`: `#!python "*.info"` and `#!python "*.error"`: ```python linenums="7" hl_lines="1 5 9" -{!> docs_src/nats/pattern.py [ln:7-17]!} +{! docs_src/nats/pattern.py [ln:7-17] !} ``` At the same time, in the `subject` of our consumers, we specify the *pattern* that will be processed by these consumers. 
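The subject *pattern* matching described above follows simple token rules: subjects are dot-separated tokens, `*` matches exactly one token, and `>` matches one or more trailing tokens. A minimal matcher sketch (illustration only, not the broker's implementation):

```python
# Simplified NATS-style subject matching: `*` = exactly one token,
# `>` = one or more trailing tokens.
def matches(pattern: str, subject: str) -> bool:
    p_tokens = pattern.split(".")
    s_tokens = subject.split(".")
    for i, p in enumerate(p_tokens):
        if p == ">":
            return len(s_tokens) > i  # at least one token must remain
        if i >= len(s_tokens):
            return False
        if p != "*" and p != s_tokens[i]:
            return False
    return len(p_tokens) == len(s_tokens)

print(matches("*.info", "logs.info"))    # True
print(matches("*.info", "logs.a.info")) # False: `*` spans exactly one token
print(matches("logs.>", "logs.a.info")) # True
```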
@@ -33,7 +43,7 @@ At the same time, in the `subject` of our consumers, we specify the *pattern* th Now the distribution of messages between these consumers will look like this: ```python -{!> docs_src/nats/pattern.py [ln:21]!} +{! docs_src/nats/pattern.py [ln:21.5] !} ``` The message `1` will be sent to `handler1` or `handler2` because they listen to the same `subject` template within the same `queue group`. @@ -41,7 +51,7 @@ The message `1` will be sent to `handler1` or `handler2` because they listen to --- ```python -{!> docs_src/nats/pattern.py [ln:22]!} +{! docs_src/nats/pattern.py [ln:22.5] !} ``` Message `2` will be sent similarly to message `1`. @@ -49,7 +59,7 @@ Message `2` will be sent similarly to message `1`. --- ```python -{!> docs_src/nats/pattern.py [ln:23]!} +{! docs_src/nats/pattern.py [ln:23.5] !} ``` -The message `3` will be sent to `handler3` because it is the only one listening to the pattern `*.error*`. +The message `3` will be sent to `handler3` because it is the only one listening to the pattern `#!python "*.error"`. diff --git a/docs/docs/en/nats/index.md b/docs/docs/en/nats/index.md index 5fb8de062c..eb74eb00c6 100644 --- a/docs/docs/en/nats/index.md +++ b/docs/docs/en/nats/index.md @@ -1,3 +1,13 @@ +--- +# 0.5 - API +# 2 - Release +# 3 - Contributing +# 5 - Template Page +# 10 - Default +search: + boost: 10 +--- + # NATS !!! note "" diff --git a/docs/docs/en/nats/jetstream/ack.md b/docs/docs/en/nats/jetstream/ack.md index 836ddbf9ac..2d70633532 100644 --- a/docs/docs/en/nats/jetstream/ack.md +++ b/docs/docs/en/nats/jetstream/ack.md @@ -1,3 +1,13 @@ +--- +# 0.5 - API +# 2 - Release +# 3 - Contributing +# 5 - Template Page +# 10 - Default +search: + boost: 10 +--- + # Consuming Acknowledgements As you may know, *Nats* employs a rather extensive [Acknowledgement](https://docs.nats.io/using-nats/developer/develop_jetstream#acknowledging-messages){.external-link target="_blank"} policy. 
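At its core, such an acknowledgement policy means a delivered message stays pending until it is acked, while a negative acknowledgement sends it back for redelivery. A toy model of that cycle (`ToyAckQueue` is invented here; real *NATS* additionally tracks delivery counts, ack timeouts, and more):

```python
# Toy model of ack/nak semantics (illustration only): a message stays
# "pending" until acked; a nak puts it back for redelivery.
from collections import deque

class ToyAckQueue:
    def __init__(self, messages):
        self.queue = deque(messages)
        self.done = []

    def next(self):
        return self.queue[0]  # deliver without removing: still pending

    def ack(self):
        self.done.append(self.queue.popleft())

    def nak(self):
        self.queue.rotate(-1)  # push the head back for redelivery

q = ToyAckQueue(["a", "b"])
assert q.next() == "a"
q.nak()                 # processing failed: "a" will be redelivered
assert q.next() == "b"  # meanwhile, other messages keep flowing
q.ack()
assert q.next() == "a"  # "a" comes around again
q.ack()
print(q.done)  # ['b', 'a']
```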
@@ -51,10 +61,10 @@ async def base_handler(body: str, msg: NatsMessage): If you want to interrupt message processing at any call stack, you can raise `faststream.exceptions.AckMessage` -``` python linenums="1" hl_lines="2 16" -{!> docs_src/nats/ack/errors.py !} +```python linenums="1" hl_lines="2 16" +{! docs_src/nats/ack/errors.py !} ``` This way, **FastStream** interrupts the current message processing and acknowledges it immediately. Also, you can raise `NackMessage` and `RejectMessage` too. -{!> includes/en/no_ack.md !} +{! includes/en/no_ack.md !} diff --git a/docs/docs/en/nats/jetstream/index.md b/docs/docs/en/nats/jetstream/index.md index 542e663372..f1b4847207 100644 --- a/docs/docs/en/nats/jetstream/index.md +++ b/docs/docs/en/nats/jetstream/index.md @@ -1,3 +1,13 @@ +--- +# 0.5 - API +# 2 - Release +# 3 - Contributing +# 5 - Template Page +# 10 - Default +search: + boost: 10 +--- + # NATS JetStream The default *NATS* usage is suitable for scenarios where: @@ -26,7 +36,7 @@ Also, **NATS JetStream** has built-in `key-value` (similar to **Redis**) and `object` (similar to **S3**) storages. **FastStream** does not provide access to this functionality directly, but it is covered by the [nats-py](https://github.com/nats-io/nats.py){.external-link target="_blank"} library used. You can access the **JS** object from the application context: ```python linenums="1" hl_lines="2 7 11-12 21" -{!> docs_src/nats/js/main.py !} +{! docs_src/nats/js/main.py !} ``` !!!
tip diff --git a/docs/docs/en/nats/jetstream/key-value.md b/docs/docs/en/nats/jetstream/key-value.md index 3ef2bfaace..a5bbc0f1ef 100644 --- a/docs/docs/en/nats/jetstream/key-value.md +++ b/docs/docs/en/nats/jetstream/key-value.md @@ -1,3 +1,13 @@ +--- +# 0.5 - API +# 2 - Release +# 3 - Contributing +# 5 - Template Page +# 10 - Default +search: + boost: 10 +--- + # Key-Value Storage ## Overview @@ -18,8 +28,8 @@ This interface provides you with rich abilities to use it like a regular *KV* st First of all, you need to create a *Key-Value* storage object and pass it into the context: -```python linenums="1" hl_lines="14-15" -{!> docs_src/nats/js/key_value.py [ln:5-7,11-12,22-27] !} +```python linenums="1" hl_lines="12-13" +{! docs_src/nats/js/key_value.py [ln:5-8,11-13,22-27] !} ``` !!! tip @@ -33,20 +43,20 @@ Next, we are ready to use this object right in our handlers. Let's create an annotated object to shorten context object access: -```python linenums="1" hl_lines="5" -{!> docs_src/nats/js/key_value.py [ln:1-2,9] !} +```python linenums="1" hl_lines="4" +{! docs_src/nats/js/key_value.py [ln:1-3,9] !} ``` And just use it in a handler: -```python linenums="1" hl_lines="4 7-8" -{!> docs_src/nats/js/key_value.py [ln:4,15-19] !} +```python linenums="1" hl_lines="4 6-8" +{! docs_src/nats/js/key_value.py [ln:4,14-19] !} ``` Finally, let's test our code behavior by putting something into the KV storage and sending a message: ```python linenums="1" hl_lines="3-4" -{!> docs_src/nats/js/key_value.py [ln:30-33] !} +{! docs_src/nats/js/key_value.py [ln:30-33] !} ``` ??? 
example "Full listing" diff --git a/docs/docs/en/nats/jetstream/object.md b/docs/docs/en/nats/jetstream/object.md index 30b207a747..2e1501f53a 100644 --- a/docs/docs/en/nats/jetstream/object.md +++ b/docs/docs/en/nats/jetstream/object.md @@ -1,3 +1,13 @@ +--- +# 0.5 - API +# 2 - Release +# 3 - Contributing +# 5 - Template Page +# 10 - Default +search: + boost: 10 +--- + # Object Storage Object storage is almost identical to the [*Key-Value*](./key-value.md) storage concept, so you can reuse the guide. @@ -16,8 +26,8 @@ The main difference between *KV* and *Object* storages is that in the *Object* s First of all, you need to create an *Object* storage object and pass it into the context: -```python linenums="1" hl_lines="14-15" -{!> docs_src/nats/js/object.py [ln:7-9,13-14,24-29] !} +```python linenums="1" hl_lines="12-13" +{! docs_src/nats/js/object.py [ln:7-10,13-15,24-29] !} ``` !!! tip @@ -31,20 +41,20 @@ Next, we are ready to use this object right in our handlers. Let's create an Annotated object to shorten `Context` object access: -```python linenums="1" hl_lines="5" -{!> docs_src/nats/js/object.py [ln:3-4,11] !} +```python linenums="1" hl_lines="4" +{! docs_src/nats/js/object.py [ln:3-5,11] !} ``` And just use it in a handler: -```python linenums="1" hl_lines="8 10-11" -{!> docs_src/nats/js/object.py [ln:1,6,17-21] !} +```python linenums="1" hl_lines="6 8-9" +{! docs_src/nats/js/object.py [ln:1-2,6,16-21] !} ``` Finally, let's test our code behavior by putting something into the *Object storage* and sending a message: ```python linenums="1" hl_lines="3-4" -{!> docs_src/nats/js/object.py [ln:32-35] !} +{! docs_src/nats/js/object.py [ln:32-35] !} ``` !!!
tip diff --git a/docs/docs/en/nats/jetstream/pull.md b/docs/docs/en/nats/jetstream/pull.md index fe5e5bc6c2..ddc58b420e 100644 --- a/docs/docs/en/nats/jetstream/pull.md +++ b/docs/docs/en/nats/jetstream/pull.md @@ -1,3 +1,13 @@ +--- +# 0.5 - API +# 2 - Release +# 3 - Contributing +# 5 - Template Page +# 10 - Default +search: + boost: 10 +--- + # Pull Subscriber ## Overview @@ -19,12 +29,12 @@ So, if you want to consume a large flow of messages without strict time limitati The **Pull** consumer is just a regular *Stream* consumer, but with the `pull_sub` argument, which controls consuming messages with batch size and block interval. ```python linenums="1" hl_lines="10-11" -{!> docs_src/nats/js/pull_sub.py !} +{! docs_src/nats/js/pull_sub.py !} ``` -The batch size doesn't mean that your `msg` argument is a list of messages, but it means that you consume up to `10` messages for one request to **NATS** and call your handler for each message in an `asyncio.gather` pool. +The batch size doesn't mean that your `msg` argument is a list of messages, but it means that you consume up to `#!python 10` messages for one request to **NATS** and call your handler for each message in an `asyncio.gather` pool. !!! tip If you want to consume list of messages, just set the `batch=True` in `PullSub` class. -So, your subject will be processed much faster, without blocking for each message processing. However, if your subject has fewer than `10` messages, your request to **NATS** will be blocked for `timeout` (5 seconds by default) while trying to collect the required number of messages. Therefor, you should choose `batch_size` and `timeout` accurately to optimize your consumer efficiency. +So, your subject will be processed much faster, without blocking for each message processing. However, if your subject has fewer than `#!python 10` messages, your request to **NATS** will be blocked for `timeout` (5 seconds by default) while trying to collect the required number of messages. 
Therefore, you should choose `batch_size` and `timeout` accurately to optimize your consumer efficiency. diff --git a/docs/docs/en/nats/message.md b/docs/docs/en/nats/message.md index 6c9c481352..c44c248092 100644 --- a/docs/docs/en/nats/message.md +++ b/docs/docs/en/nats/message.md @@ -1,3 +1,13 @@ +--- +# 0.5 - API +# 2 - Release +# 3 - Contributing +# 5 - Template Page +# 10 - Default +search: + boost: 10 +--- + # Access to Message Information As you know, **FastStream** serializes a message body and provides you access to it through function arguments. But sometimes you need to access message_id, headers, or other meta-information. @@ -8,7 +18,7 @@ You can get it in a simple way: just access the message object in the [Context](. It contains the required information such as: -{!> includes/message/attrs.md !} +{! includes/message/attrs.md !} It is a **FastStream** wrapper around a native broker library message (`nats.aio.msg.Msg` in the *NATS*' case) that you can access with `raw_message`. @@ -67,13 +77,13 @@ async def base_handler( But this code is too long to reuse everywhere. In this case, you can use a Python [`Annotated`](https://docs.python.org/3/library/typing.html#typing.Annotated){.external-link target="_blank"} feature: -{!> includes/message/annotated.md !} +{! includes/message/annotated.md !} -{!> includes/message/headers.md !} +{! includes/message/headers.md !} ## Subject Pattern Access -As you know, **NATS** allows you to use a pattern like this `logs.*` to subscriber to subjects. Getting access to the real `*` value is an often-used scenario, and **FastStream** provide it to you with the `Path` object (which is a shortcut to `#!python Context("message.path.*")`). +As you know, **NATS** allows you to use a pattern like this `#!python "logs.*"` to subscribe to subjects.
Getting access to the real `*` value is an often-used scenario, and **FastStream** provides it to you with the `Path` object (which is a shortcut to `#!python Context("message.path.*")`). To use it, you just need to replace your `*` with `{variable-name}` and use `Path` as a regular `Context` object: diff --git a/docs/docs/en/nats/publishing/index.md b/docs/docs/en/nats/publishing/index.md index 898bf2b340..e6385f28bf 100644 --- a/docs/docs/en/nats/publishing/index.md +++ b/docs/docs/en/nats/publishing/index.md @@ -1,3 +1,13 @@ +--- +# 0.5 - API +# 2 - Release +# 3 - Contributing +# 5 - Template Page +# 10 - Default +search: + boost: 10 +--- + # Publishing **FastStream** `NatsBroker` supports all regular [publishing usecases](../../getting-started/publishing/index.md){.internal-link}. You can use them without any changes. @@ -8,7 +18,7 @@ However, if you wish to further customize the publishing logic, you should take `NatsBroker` also uses the unified `publish` method (from a `publisher` object) to send messages. -``` python +```python import asyncio from faststream.nats import NatsBroker diff --git a/docs/docs/en/nats/rpc.md b/docs/docs/en/nats/rpc.md index 33a9fe8f29..d75a7c5c3a 100644 --- a/docs/docs/en/nats/rpc.md +++ b/docs/docs/en/nats/rpc.md @@ -1,3 +1,13 @@ +--- +# 0.5 - API +# 2 - Release +# 3 - Contributing +# 5 - Template Page +# 10 - Default +search: + boost: 10 +--- + # RPC over NATS Because **NATS** has zero cost for creating new subjects, we can easily set up a new subject consumer just for the one response message. This way, your request message will be published to one topic, and the response message will be consumed from another one (temporary subject), which allows you to use regular **FastStream RPC** syntax in the **NATS** case too. @@ -13,7 +23,7 @@ Just send a message like a regular one and get a response synchronously.
It is very close to the common **requests** syntax: -``` python hl_lines="1 4" +```python hl_lines="1 4" msg = await broker.publish( "Hi!", subject="test", @@ -32,7 +42,7 @@ Also, if you want to create a permanent request-reply data flow, probably, you s So, if you have such one, you can specify it with the `reply_to` argument. This way, **FastStream** will send a response to this subject automatically. -``` python hl_lines="1 8" +```python hl_lines="1 8" @broker.subscriber("response-subject") async def consume_responses(msg): ... diff --git a/docs/docs/en/rabbit/ack.md b/docs/docs/en/rabbit/ack.md index dd49612916..853d84c58e 100644 --- a/docs/docs/en/rabbit/ack.md +++ b/docs/docs/en/rabbit/ack.md @@ -1,3 +1,13 @@ +--- +# 0.5 - API +# 2 - Release +# 3 - Contributing +# 5 - Template Page +# 10 - Default +search: + boost: 10 +--- + # Consuming Acknowledgements As you may know, *RabbitMQ* employs a rather extensive [Acknowledgement](https://www.rabbitmq.com/confirms.html){.external-link target="_blank"} policy. @@ -63,10 +73,10 @@ async def base_handler(body: str, msg: RabbitMessage): If you want to interrupt message processing at any call stack, you can raise `faststream.exceptions.AckMessage` -``` python linenums="1" hl_lines="2 16" -{!> docs_src/rabbit/ack/errors.py !} +```python linenums="1" hl_lines="2 16" +{! docs_src/rabbit/ack/errors.py !} ``` This way, **FastStream** interrupts the current message processing and acknowledges it immediately. Also, you can raise `NackMessage` and `RejectMessage` too. -{!> includes/en/no_ack.md !} +{!
includes/en/no_ack.md !} diff --git a/docs/docs/en/rabbit/declare.md b/docs/docs/en/rabbit/declare.md index cb8d558c13..51c03368fa 100644 --- a/docs/docs/en/rabbit/declare.md +++ b/docs/docs/en/rabbit/declare.md @@ -1,11 +1,21 @@ +--- +# 0.5 - API +# 2 - Release +# 3 - Contributing +# 5 - Template Page +# 10 - Default +search: + boost: 10 +--- + # RabbitMQ Queue/Exchange Declaration **FastStream** declares and validates all exchanges and queues using *publishers* and *subscribers* *RabbitMQ* objects, but sometimes you need to declare them manually. **RabbitBroker** provides a way to achieve this easily. -``` python linenums="1" hl_lines="15-20 22-27" -{!> docs_src/rabbit/declare.py !} +```python linenums="1" hl_lines="15-20 22-27" +{! docs_src/rabbit/declare.py !} ``` These methods require just one argument (`RabbitQueue`/`RabbitExchange`) containing information about your *RabbitMQ* required objects. They declare/validate *RabbitMQ* objects and return low-level **aio-pika** robust objects to interact with. diff --git a/docs/docs/en/rabbit/examples/direct.md b/docs/docs/en/rabbit/examples/direct.md index 48b644d1c2..90df4b1fb8 100644 --- a/docs/docs/en/rabbit/examples/direct.md +++ b/docs/docs/en/rabbit/examples/direct.md @@ -1,3 +1,13 @@ +--- +# 0.5 - API +# 2 - Release +# 3 - Contributing +# 5 - Template Page +# 10 - Default +search: + boost: 10 +--- + # Direct Exchange The **Direct** Exchange is the basic way to route messages in *RabbitMQ*. Its core is very simple: the `exchange` sends messages to those queues whose `routing_key` matches the `routing_key` of the message being sent. @@ -25,7 +35,7 @@ Thus, *RabbitMQ* can independently balance the load on queue consumers. You can The argument `auto_delete=True` in this and subsequent examples is used only to clear the state of *RabbitMQ* after example runs. ```python linenums="1" -{!> docs_src/rabbit/subscription/direct.py !} +{! 
docs_src/rabbit/subscription/direct.py !} ``` ### Consumer Announcement @@ -33,13 +43,13 @@ The argument `auto_delete=True` in this and subsequent examples is used only to First, we announce our **Direct** exchange and several queues that will listen to it: ```python linenums="7" -{!> docs_src/rabbit/subscription/direct.py [ln:7-10]!} +{! docs_src/rabbit/subscription/direct.py [ln:7-10] !} ``` Then we sign up several consumers using the advertised queues to the `exchange` we created: ```python linenums="13" hl_lines="1 6 11" -{!> docs_src/rabbit/subscription/direct.py [ln:13-25]!} +{! docs_src/rabbit/subscription/direct.py [ln:13-25] !} ``` !!! note @@ -52,23 +62,23 @@ Then we sign up several consumers using the advertised queues to the `exchange` Now, the distribution of messages between these consumers will look like this: ```python linenums="30" -{!> docs_src/rabbit/subscription/direct.py [ln:30]!} +{! docs_src/rabbit/subscription/direct.py [ln:30.5] !} ``` -Message `1` will be sent to `handler1` because it listens to the `exchange` using a queue with the routing key `test-q-1`. +Message `1` will be sent to `handler1` because it listens to the `#!python "exchange"` using a queue with the routing key `#!python "test-q-1"`. --- ```python linenums="31" -{!> docs_src/rabbit/subscription/direct.py [ln:31]!} +{! docs_src/rabbit/subscription/direct.py [ln:31.5] !} ``` -Message `2` will be sent to `handler2` because it listens to the `exchange` using the same queue, but `handler1` is busy. +Message `2` will be sent to `handler2` because it listens to the `#!python "exchange"` using the same queue, but `handler1` is busy. --- ```python linenums="32" -{!> docs_src/rabbit/subscription/direct.py [ln:32]!} +{! docs_src/rabbit/subscription/direct.py [ln:32.5] !} ``` Message `3` will be sent to `handler1` again because it is currently free. @@ -76,7 +86,7 @@ Message `3` will be sent to `handler1` again because it is currently free. 
--- ```python linenums="33" -{!> docs_src/rabbit/subscription/direct.py [ln:33]!} +{! docs_src/rabbit/subscription/direct.py [ln:33.5] !} ``` -Message `4` will be sent to `handler3` because it is the only one listening to the `exchange` using a queue with the routing key `test-q-2`. +Message `4` will be sent to `handler3` because it is the only one listening to the `#!python "exchange"` using a queue with the routing key `#!python "test-q-2"`. diff --git a/docs/docs/en/rabbit/examples/fanout.md b/docs/docs/en/rabbit/examples/fanout.md index 5d3ce45aa3..6f89f7f377 100644 --- a/docs/docs/en/rabbit/examples/fanout.md +++ b/docs/docs/en/rabbit/examples/fanout.md @@ -1,3 +1,13 @@ +--- +# 0.5 - API +# 2 - Release +# 3 - Contributing +# 5 - Template Page +# 10 - Default +search: + boost: 10 +--- + # Fanout Exchange The **Fanout** Exchange is an even simpler, but slightly less popular way of routing in *RabbitMQ*. This type of `exchange` sends messages to all queues subscribed to it, ignoring any arguments of the message. @@ -7,7 +17,7 @@ At the same time, if the queue listens to several consumers, messages will also ## Example ```python linenums="1" -{!> docs_src/rabbit/subscription/fanout.py !} +{! docs_src/rabbit/subscription/fanout.py !} ``` ### Consumer Announcement @@ -15,13 +25,13 @@ At the same time, if the queue listens to several consumers, messages will also To begin with, we announced our **Fanout** exchange and several queues that will listen to it: ```python linenums="7" hl_lines="1" -{!> docs_src/rabbit/subscription/fanout.py [ln:7-10]!} +{! docs_src/rabbit/subscription/fanout.py [ln:7-10] !} ``` Then we signed up several consumers using the advertised queues to the `exchange` we created: ```python linenums="13" hl_lines="1 6 11" -{!> docs_src/rabbit/subscription/fanout.py [ln:13-25]!} +{! docs_src/rabbit/subscription/fanout.py [ln:13-25] !} ``` !!! 
note @@ -34,7 +44,7 @@ Then we signed up several consumers using the advertised queues to the `exchange Now all messages will be sent to all subscribers because they are bound to the same **FANOUT** exchange: ```python linenums="30" -{!> docs_src/rabbit/subscription/fanout.py [ln:30-33]!} +{! docs_src/rabbit/subscription/fanout.py [ln:30.5,31.5,32.5,33.5] !} ``` --- diff --git a/docs/docs/en/rabbit/examples/headers.md b/docs/docs/en/rabbit/examples/headers.md index 8289864cc6..adddd4c2d4 100644 --- a/docs/docs/en/rabbit/examples/headers.md +++ b/docs/docs/en/rabbit/examples/headers.md @@ -1,3 +1,13 @@ +--- +# 0.5 - API +# 2 - Release +# 3 - Contributing +# 5 - Template Page +# 10 - Default +search: + boost: 10 +--- + # Header Exchange The **Header** Exchange is the most complex and flexible way to route messages in *RabbitMQ*. This `exchange` type sends messages to queues by matching the queue binding arguments with message headers. @@ -7,7 +17,7 @@ At the same time, if several consumers are subscribed to the queue, messages wil ## Example ```python linenums="1" -{!> docs_src/rabbit/subscription/header.py !} +{! docs_src/rabbit/subscription/header.py !} ``` ### Consumer Announcement @@ -15,7 +25,7 @@ At the same time, if several consumers are subscribed to the queue, messages wil First, we announce our **Header** exchange and several queues that will listen to it: ```python linenums="7" hl_lines="1 6 11 16" -{!> docs_src/rabbit/subscription/header.py [ln:7-23]!} +{! docs_src/rabbit/subscription/header.py [ln:7-23] !} ``` The `x-match` argument indicates whether the arguments should match the message headers in whole or in part. @@ -23,7 +33,7 @@ The `x-match` argument indicates whether the arguments should match the message Then we signed up several consumers using the advertised queues to the `exchange` we created: ```python linenums="26" hl_lines="1 6 11 16" -{!> docs_src/rabbit/subscription/header.py [ln:26-43]!} +{! 
docs_src/rabbit/subscription/header.py [ln:26-43] !} ``` !!! note @@ -36,15 +46,15 @@ Then we signed up several consumers using the advertised queues to the `exchange Now the distribution of messages between these consumers will look like this: ```python linenums="48" -{!> docs_src/rabbit/subscription/header.py [ln:48]!} +{! docs_src/rabbit/subscription/header.py [ln:48.5] !} ``` -Message `1` will be sent to `handler1` because it listens to a queue whose `key` header matches the `key` header of the message. +Message `1` will be sent to `handler1` because it listens to a queue whose `#!python "key"` header matches the `#!python "key"` header of the message. --- ```python linenums="49" -{!> docs_src/rabbit/subscription/header.py [ln:49]!} +{! docs_src/rabbit/subscription/header.py [ln:49.5]!} ``` Message `2` will be sent to `handler2` because it listens to `exchange` using the same queue, but `handler1` is busy. @@ -52,7 +62,7 @@ Message `2` will be sent to `handler2` because it listens to `exchange` using th --- ```python linenums="50" -{!> docs_src/rabbit/subscription/header.py [ln:50]!} +{! docs_src/rabbit/subscription/header.py [ln:50.5]!} ``` Message `3` will be sent to `handler1` again because it is currently free. @@ -60,23 +70,23 @@ Message `3` will be sent to `handler1` again because it is currently free. --- ```python linenums="51" -{!> docs_src/rabbit/subscription/header.py [ln:51]!} +{! docs_src/rabbit/subscription/header.py [ln:51.5]!} ``` -Message `4` will be sent to `handler3` because it listens to a queue whose `key` header coincided with the `key` header of the message. +Message `4` will be sent to `handler3` because it listens to a queue whose `#!python "key"` header coincided with the `#!python "key"` header of the message. --- ```python linenums="52" -{!> docs_src/rabbit/subscription/header.py [ln:52]!} +{! 
docs_src/rabbit/subscription/header.py [ln:52.5]!} ``` -Message `5` will be sent to `handler3` because it listens to a queue whose header `key2` coincided with the header `key2` of the message. +Message `5` will be sent to `handler3` because it listens to a queue whose header `#!python "key2"` coincided with the header `#!python "key2"` of the message. --- ```python linenums="53" -{!> docs_src/rabbit/subscription/header.py [ln:53-55]!} +{! docs_src/rabbit/subscription/header.py [ln:53.5,54.5,55.5]!} ``` Message `6` will be sent to `handler3` and `handler4` because the message headers completely match the queue keys. diff --git a/docs/docs/en/rabbit/examples/index.md b/docs/docs/en/rabbit/examples/index.md index 419d9f496d..27f13d79a8 100644 --- a/docs/docs/en/rabbit/examples/index.md +++ b/docs/docs/en/rabbit/examples/index.md @@ -1,3 +1,13 @@ +--- +# 0.5 - API +# 2 - Release +# 3 - Contributing +# 5 - Template Page +# 10 - Default +search: + boost: 10 +--- + # Basic Subscriber If you know nothing about *RabbitMQ* and how it works, you will still be able to use **FastStream RabbitBroker**. @@ -5,7 +15,7 @@ If you know nothing about *RabbitMQ* and how it works, you will still able to us Just use the `#!python @broker.subscriber(...)` method with a string as a routing key. ```python linenums="1" -{!> docs_src/rabbit/subscription/index.py !} +{! docs_src/rabbit/subscription/index.py !} ``` This is the principle all **FastStream** brokers work by: you don't need to learn them in-depth if you want to *just send a message*.
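The basic subscriber pattern above — a decorator plus a plain string routing key — can be modeled in a few lines of plain Python. The toy broker below is only an illustration of the decorator-based routing idea (all names here are hypothetical); it is not FastStream's real API, which additionally handles connections, serialization, and async delivery:

```python
from collections import defaultdict


class ToyBroker:
    """Toy in-memory stand-in for a broker; illustration only."""

    def __init__(self):
        self._handlers = defaultdict(list)

    def subscriber(self, queue):
        # register the decorated function as a handler for this queue name
        def register(fn):
            self._handlers[queue].append(fn)
            return fn
        return register

    def publish(self, msg, queue):
        # deliver the message to every handler bound to this queue name
        return [fn(msg) for fn in self._handlers[queue]]


broker = ToyBroker()


@broker.subscriber("test")
def handle(msg):
    return f"got: {msg}"
```

Publishing `#!python broker.publish("Hi!", "test")` then routes the message to `handle` purely by the queue-name string, which is the essence of what the real `#!python @broker.subscriber(...)` does.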
diff --git a/docs/docs/en/rabbit/examples/stream.md b/docs/docs/en/rabbit/examples/stream.md index 550aed70c0..b54c0df1a2 100644 --- a/docs/docs/en/rabbit/examples/stream.md +++ b/docs/docs/en/rabbit/examples/stream.md @@ -1,3 +1,13 @@ +--- +# 0.5 - API +# 2 - Release +# 3 - Contributing +# 5 - Template Page +# 10 - Default +search: + boost: 10 +--- + # RabbitMQ Streams *RabbitMQ* has a [Streams](https://www.rabbitmq.com/streams.html){.external-link target="_blank"} feature, which is closely related to *Kafka* topics. @@ -7,5 +17,5 @@ The main difference from regular *RabbitMQ* queues is that the messages are not And **FastStream** supports this feature as well! ```python linenums="1" hl_lines="4 10-12 17" -{!> docs_src/rabbit/subscription/stream.py !} +{! docs_src/rabbit/subscription/stream.py !} ``` diff --git a/docs/docs/en/rabbit/examples/topic.md b/docs/docs/en/rabbit/examples/topic.md index 472675e414..6c456bb4ad 100644 --- a/docs/docs/en/rabbit/examples/topic.md +++ b/docs/docs/en/rabbit/examples/topic.md @@ -1,3 +1,13 @@ +--- +# 0.5 - API +# 2 - Release +# 3 - Contributing +# 5 - Template Page +# 10 - Default +search: + boost: 10 +--- + # Topic Exchange The **Topic** Exchange is a powerful *RabbitMQ* routing tool. This type of `exchange` sends messages to the queue in accordance with the *pattern* specified when they are connected to `exchange` and the `routing_key` of the message itself. @@ -7,7 +17,7 @@ At the same time, if several consumers are subscribed to the queue, messages wil ## Example ```python linenums="1" -{!> docs_src/rabbit/subscription/topic.py !} +{! 
docs_src/rabbit/subscription/topic.py [ln:7-10]!} ``` At the same time, in the `routing_key` of our queues, we specify the *pattern* of routing keys that will be processed by this queue. @@ -23,7 +33,7 @@ At the same time, in the `routing_key` of our queues, we specify the *pattern* o Then we sign up several consumers using the advertised queues to the `exchange` we created: ```python linenums="13" hl_lines="1 6 11" -{!> docs_src/rabbit/subscription/topic.py [ln:13-25]!} +{! docs_src/rabbit/subscription/topic.py [ln:13-25]!} ``` !!! note @@ -36,23 +46,23 @@ Then we sign up several consumers using the advertised queues to the `exchange` Now the distribution of messages between these consumers will look like this: ```python linenums="30" -{!> docs_src/rabbit/subscription/topic.py [ln:30]!} +{! docs_src/rabbit/subscription/topic.py [ln:30.5]!} ``` -Message `1` will be sent to `handler1` because it listens to `exchange` using a queue with the routing key `*.info`. +Message `1` will be sent to `handler1` because it listens to `#!python "exchange"` using a queue with the routing key `#!python "*.info"`. --- ```python linenums="31" -{!> docs_src/rabbit/subscription/topic.py [ln:31]!} +{! docs_src/rabbit/subscription/topic.py [ln:31.5]!} ``` -Message `2` will be sent to `handler2` because it listens to `exchange` using the same queue, but `handler1` is busy. +Message `2` will be sent to `handler2` because it listens to `#!python "exchange"` using the same queue, but `handler1` is busy. --- ```python linenums="32" -{!> docs_src/rabbit/subscription/topic.py [ln:32]!} +{! docs_src/rabbit/subscription/topic.py [ln:32.5]!} ``` Message `3` will be sent to `handler1` again because it is currently free. @@ -60,7 +70,7 @@ Message `3` will be sent to `handler1` again because it is currently free. --- ```python linenums="33" -{!> docs_src/rabbit/subscription/topic.py [ln:33]!} +{! 
docs_src/rabbit/subscription/topic.py [ln:33.5]!} ``` -Message `4` will be sent to `handler3` because it is the only one listening to `exchange` using a queue with the routing key `*.debug`. +Message `4` will be sent to `handler3` because it is the only one listening to `#!python "exchange"` using a queue with the routing key `#!python "*.debug"`. diff --git a/docs/docs/en/rabbit/index.md b/docs/docs/en/rabbit/index.md index 92abf6f840..2c459e093c 100644 --- a/docs/docs/en/rabbit/index.md +++ b/docs/docs/en/rabbit/index.md @@ -1,3 +1,13 @@ +--- +# 0.5 - API +# 2 - Release +# 3 - Contributing +# 5 - Template Page +# 10 - Default +search: + boost: 10 +--- + # Rabbit Routing !!! note "" diff --git a/docs/docs/en/rabbit/message.md b/docs/docs/en/rabbit/message.md index 4fffd02249..2e47b132b8 100644 --- a/docs/docs/en/rabbit/message.md +++ b/docs/docs/en/rabbit/message.md @@ -1,3 +1,13 @@ +--- +# 0.5 - API +# 2 - Release +# 3 - Contributing +# 5 - Template Page +# 10 - Default +search: + boost: 10 +--- + # Access to Message Information As you know, **FastStream** serializes a message body and provides you access to it through function arguments. But sometimes you need access to a message_id, headers, or other meta-information. @@ -8,7 +18,7 @@ You can get it in a simple way: just access the message object in the [Context](. This message contains the required information such as: -{!> includes/message/attrs.md !} +{! includes/message/attrs.md !} Also, it is a **FastStream** wrapper around a native broker library message (`aio_pika.IncomingMessage` in the *RabbitMQ* case) that you can access with `raw_message`. @@ -67,13 +77,13 @@ async def base_handler( But this code is too long to be reused everywhere. In this case, you can use a Python [`Annotated`](https://docs.python.org/3/library/typing.html#typing.Annotated){.external-link target="_blank"} feature: -{!> includes/message/annotated.md !} +{! includes/message/annotated.md !} -{!> includes/message/headers.md !} +{! 
includes/message/headers.md !} ## Topic Pattern Access -As you know, **Rabbit** allows you to use a pattern like this `logs.*` with a [Topic](./examples/topic.md){.internal-link} exchange. Getting access to the real `*` value is an often-used scenario and **FastStream** provide it to you with the `Path` object (which is a shortcut to `#!python Context("message.path.*")`). +As you know, **Rabbit** allows you to use a pattern such as `#!python "logs.*"` with a [Topic](./examples/topic.md){.internal-link} exchange. Getting access to the real `*` value is an often-used scenario, and **FastStream** provides it to you with the `Path` object (which is a shortcut to `#!python Context("message.path.*")`). To use it, you just need to replace your `*` with `{variable-name}` and use `Path` as a regular `Context` object: diff --git a/docs/docs/en/rabbit/publishing.md b/docs/docs/en/rabbit/publishing.md index f7036950d9..a01af3bdf6 100644 --- a/docs/docs/en/rabbit/publishing.md +++ b/docs/docs/en/rabbit/publishing.md @@ -1,3 +1,13 @@ +--- +# 0.5 - API +# 2 - Release +# 3 - Contributing +# 5 - Template Page +# 10 - Default +search: + boost: 10 +--- + # Publishing **FastStream** `RabbitBroker` supports all regular [publishing use cases](../getting-started/publishing/index.md){.internal-link}. You can use them without any changes. @@ -12,7 +22,7 @@ However, in this case, an object of the `aio_pika.Message` class (if necessary) You can specify a queue (used as a routing_key) and an exchange (optionally) to send by their name. -``` python +```python import asyncio from faststream.rabbit import RabbitBroker @@ -31,7 +41,7 @@ If you don't specify any exchange, the message will be sent to the default one.
Also, you are able to use special **RabbitQueue** and **RabbitExchange** objects as `queue` and `exchange` arguments: -``` python +```python from faststream.rabbit import RabbitExchange, RabbitQueue await broker.publish( diff --git a/docs/docs/en/rabbit/rpc.md b/docs/docs/en/rabbit/rpc.md index 57704a385a..843d3edc3c 100644 --- a/docs/docs/en/rabbit/rpc.md +++ b/docs/docs/en/rabbit/rpc.md @@ -1,3 +1,13 @@ +--- +# 0.5 - API +# 2 - Release +# 3 - Contributing +# 5 - Template Page +# 10 - Default +search: + boost: 10 +--- + # RPC over RMQ ## Blocking Request @@ -10,7 +20,7 @@ Just send a message like a regular one and get a response synchronously. It is very close to common **requests** syntax: -``` python hl_lines="1 4" +```python hl_lines="1 4" msg = await broker.publish( "Hi!", queue="test", @@ -29,7 +39,7 @@ Also, if you want to create a permanent request-reply data flow, probably, you s So, if you have such one, you can specify it with the `reply_to` argument. This way, **FastStream** will send a response to this queue automatically. -``` python hl_lines="1 8" +```python hl_lines="1 8" @broker.subscriber("response-queue") async def consume_responses(msg): ... diff --git a/docs/docs/en/rabbit/security.md b/docs/docs/en/rabbit/security.md index a263fdf49c..ead22317b3 100644 --- a/docs/docs/en/rabbit/security.md +++ b/docs/docs/en/rabbit/security.md @@ -1,3 +1,13 @@ +--- +# 0.5 - API +# 2 - Release +# 3 - Contributing +# 5 - Template Page +# 10 - Default +search: + boost: 10 +--- + # FastStream RabbitMQ Security This chapter discusses the security options available in **FastStream** and how to use them. @@ -13,7 +23,7 @@ This chapter discusses the security options available in **FastStream** and how **Usage:** ```python linenums="1" hl_lines="6-7 9" -{!> docs_src/rabbit/security/basic.py !} +{! docs_src/rabbit/security/basic.py !} ``` ### 2. 
SASLPlaintext Object with SSL/TLS @@ -23,5 +33,5 @@ This chapter discusses the security options available in **FastStream** and how **Usage:** ```python linenums="1" hl_lines="6-11 13" -{!> docs_src/rabbit/security/plaintext.py !} +{! docs_src/rabbit/security/plaintext.py !} ``` diff --git a/docs/docs/en/redis/index.md b/docs/docs/en/redis/index.md index 87d0203e66..13b545cfb6 100644 --- a/docs/docs/en/redis/index.md +++ b/docs/docs/en/redis/index.md @@ -1,3 +1,13 @@ +--- +# 0.5 - API +# 2 - Release +# 3 - Contributing +# 5 - Template Page +# 10 - Default +search: + boost: 10 +--- + # Redis Broker ## Redis Overview @@ -43,7 +53,7 @@ To connect to **Redis** using the **FastStream** `RedisBroker` module, follow th Here's a simplified code example demonstrating how to establish a connection to **Redis** using **FastStream**'s `RedisBroker` module: ```python linenums="1" -{!> docs_src/index/redis/basic.py!} +{! docs_src/index/redis/basic.py!} ``` This minimal example illustrates how **FastStream** simplifies the process of connecting to **Redis** and performing basic message processing from the *in-channel* to the *out-channel*. Depending on your specific use case and requirements, you can further customize your **Redis** integration with **FastStream** to build efficient and responsive applications. diff --git a/docs/docs/en/redis/list/batch.md b/docs/docs/en/redis/list/batch.md index 4aa8f912ef..c920fc276a 100644 --- a/docs/docs/en/redis/list/batch.md +++ b/docs/docs/en/redis/list/batch.md @@ -1,3 +1,13 @@ +--- +# 0.5 - API +# 2 - Release +# 3 - Contributing +# 5 - Template Page +# 10 - Default +search: + boost: 10 +--- + # Redis List Batch Subscriber If you want to consume data in batches from a Redis list, the `#!python @broker.subscriber(...)` decorator makes it possible. 
By defining your consumed `msg` object as a list of messages and setting the `batch` parameter to `True` within the `ListSub` object, the subscriber will call your consuming function with a batch of messages. Let's walk through how to achieve this with the FastStream library. @@ -11,7 +21,7 @@ To consume messages in batches from a Redis list, follow these steps: In your FastStream application, define the subscriber using the `#!python @broker.subscriber(...)` decorator. Ensure that you pass a `ListSub` object with the `batch` parameter set to `True`. This configuration tells the subscriber to handle message consumption in batches from the specified Redis list. ```python linenums="1" -{!> docs_src/redis/list/sub_batch.py [ln:8] !} +{! docs_src/redis/list/sub_batch.py [ln:8] !} ``` ### Step 2: Implement Your Consuming Function @@ -19,7 +29,7 @@ In your FastStream application, define the subscriber using the `#!python @broke Create a consuming function that accepts the list of messages. The `#!python @broker.subscriber(...)` decorator will take care of collecting and grouping messages into batches. ```python linenums="1" -{!> docs_src/redis/list/sub_batch.py [ln:8-10] !} +{! docs_src/redis/list/sub_batch.py [ln:8-10] !} ``` ## Example of Consuming in Batches @@ -27,7 +37,7 @@ Create a consuming function that accepts the list of messages. The `#!python @br Let's illustrate how to consume messages in batches from the `#!python "test-list"` Redis list with a practical example: ```python linenums="1" -{!> docs_src/redis/list/sub_batch.py !} +{! docs_src/redis/list/sub_batch.py !} ``` In this example, the subscriber is configured to process messages in batches from the Redis list, and the consuming function is designed to handle these batches efficiently. 
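The batch behavior described above can be sketched in plain Python: messages accumulated in a list-like queue are handed to the consuming function in chunks rather than one at a time. This is a model of the semantics only — the function name and batch size are hypothetical, and it is not the FastStream implementation:

```python
from collections import deque


def consume_in_batches(messages, batch_size, handler):
    """Drain a list-like queue, calling `handler` with chunks of messages."""
    queue = deque(messages)
    results = []
    while queue:
        # collect up to batch_size messages before invoking the handler once
        batch = [queue.popleft() for _ in range(min(batch_size, len(queue)))]
        results.append(handler(batch))
    return results
```

With `#!python batch_size=2`, five queued messages reach the handler as three calls (`2 + 2 + 1`), mirroring how a batch subscriber receives a list of messages per invocation.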
diff --git a/docs/docs/en/redis/list/index.md b/docs/docs/en/redis/list/index.md index cf23be1814..c17045d4a7 100644 --- a/docs/docs/en/redis/list/index.md +++ b/docs/docs/en/redis/list/index.md @@ -1,3 +1,13 @@ +--- +# 0.5 - API +# 2 - Release +# 3 - Contributing +# 5 - Template Page +# 10 - Default +search: + boost: 10 +--- + # Redis Lists [Redis Lists](https://redis.io/docs/data-types/lists/){.external-link target="_blank"} are a simple and flexible data structure that function as ordered collections of strings. They are similar to lists in programming languages, and **Redis** provides commands to perform a variety of operations such as adding, retrieving, and removing elements from either end of the list. diff --git a/docs/docs/en/redis/list/publishing.md b/docs/docs/en/redis/list/publishing.md index db6275eec3..12a918b197 100644 --- a/docs/docs/en/redis/list/publishing.md +++ b/docs/docs/en/redis/list/publishing.md @@ -1,10 +1,20 @@ +--- +# 0.5 - API +# 2 - Release +# 3 - Contributing +# 5 - Template Page +# 10 - Default +search: + boost: 10 +--- + # Redis List Publishing with FastStream Utilizing the **FastStream** library, you can effectively publish data to Redis lists, which act as queues in Redis-based messaging systems. ## Understanding Redis List Publishing -Just like with Redis streams, messages can be published to Redis lists. FastStream utilizes the `@broker.publisher` decorator, along with a list's name, to push messages onto the list. +Just like with Redis streams, messages can be published to Redis lists. FastStream utilizes the `#!python @broker.publisher(...)` decorator, along with a list's name, to push messages onto the list. 1. Instantiate your RedisBroker @@ -26,20 +36,20 @@ Just like with Redis streams, messages can be published to Redis lists. FastStre 4. 
Implement a data processing function for publishing to Redis lists - Use the `@broker.publisher(list="...")` decorator alongside the `@broker.subscriber(list="...")` decorator to create a function that processes incoming messages and pushes the results to an output list in Redis. + Use the `#!python @broker.publisher(list="...")` decorator alongside the `#!python @broker.subscriber(list="...")` decorator to create a function that processes incoming messages and pushes the results to an output list in Redis. ```python linenums="1" {!> docs_src/redis/list/list_pub.py [ln:17-20] !} ``` -In this pattern, the function stands as a subscriber to the "input-list" and publishes the processed data as a new message to the "output-list." By using decorators, you establish a pipeline that reads messages from one Redis list, applies some logic, and then pushes outputs to another list. +In this pattern, the function stands as a subscriber to the `#!python "input-list"` and publishes the processed data as a new message to the `#!python "output-list"`. By using decorators, you establish a pipeline that reads messages from one Redis list, applies some logic, and then pushes outputs to another list. ## Full Example of Redis List Publishing Here's an example that demonstrates Redis list publishing in action using decorators with FastStream: ```python linenums="1" -{!> docs_src/redis/list/list_pub.py !} +{! docs_src/redis/list/list_pub.py !} ``` The provided example illustrates the ease of setting up publishing mechanisms to interact with Redis lists. In this environment, messages are dequeued from the input list, processed, and enqueued onto the output list seamlessly, empowering developers to leverage Redis lists as messaging queues. 
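The subscribe-process-publish flow above can be modeled with plain Python lists standing in for the two Redis lists. This is an illustrative sketch of the pipeline semantics under that assumption — not actual Redis or FastStream code:

```python
def run_pipeline(input_list, output_list, process):
    """Pop each message from the input list, transform it, push the result."""
    while input_list:
        msg = input_list.pop(0)            # roughly an LPOP from "input-list"
        output_list.append(process(msg))   # roughly an RPUSH to "output-list"


incoming = ["hello", "world"]
outgoing = []
run_pipeline(incoming, outgoing, str.upper)
```

After the run, the input list is drained and the output list holds the processed results, which is the pipeline the paired subscriber/publisher decorators establish.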
diff --git a/docs/docs/en/redis/list/subscription.md b/docs/docs/en/redis/list/subscription.md index 701b757af0..c3368b6377 100644 --- a/docs/docs/en/redis/list/subscription.md +++ b/docs/docs/en/redis/list/subscription.md @@ -1,3 +1,13 @@ +--- +# 0.5 - API +# 2 - Release +# 3 - Contributing +# 5 - Template Page +# 10 - Default +search: + boost: 10 +--- + # Redis List Basic Subscriber To start consuming from a **Redis** list, simply decorate your consuming function with the `#!python @broker.subscriber(...)` decorator, passing a string as the list key. @@ -7,7 +17,7 @@ In the following example, we will create a simple FastStream app that will consu The full app code looks like this: ```python linenums="1" -{!> docs_src/redis/list/list_sub.py [ln:1-10] !} +{! docs_src/redis/list/list_sub.py [ln:1-10] !} ``` ## Import FastStream and RedisBroker @@ -15,7 +25,7 @@ The full app code looks like this: To use the `#!python @broker.subscriber(...)` decorator, first, we need to import the base FastStream app and RedisBroker to create our broker. ```python linenums="1" -{!> docs_src/redis/list/list_sub.py [ln:1-2] !} +{! docs_src/redis/list/list_sub.py [ln:1-2] !} ``` ## Create a RedisBroker @@ -23,7 +33,7 @@ To use the `#!python @broker.subscriber(...)` decorator, first, we need to impor Next, we will create a `RedisBroker` object and wrap it into the `FastStream` object so that we can start our app using CLI later. ```python linenums="1" -{!> docs_src/redis/list/list_sub.py [ln:4-5] !} +{! docs_src/redis/list/list_sub.py [ln:4-5] !} ``` ## Create a Function that will Consume Messages from a Redis list @@ -31,7 +41,7 @@ Next, we will create a `RedisBroker` object and wrap it into the `FastStream` ob Let’s create a consumer function that will consume messages from `#!python "test-list"` Redis list and log them. ```python linenums="1" -{!> docs_src/redis/list/list_sub.py [ln:8-10] !} +{! 
docs_src/redis/list/list_sub.py [ln:8-10] !} ``` The function decorated with the `#!python @broker.subscriber(...)` decorator will be called when a message is pushed to the **Redis** list. diff --git a/docs/docs/en/redis/message.md b/docs/docs/en/redis/message.md index f877ab8e28..455a2ba0bd 100644 --- a/docs/docs/en/redis/message.md +++ b/docs/docs/en/redis/message.md @@ -1,3 +1,13 @@ +--- +# 0.5 - API +# 2 - Release +# 3 - Contributing +# 5 - Template Page +# 10 - Default +search: + boost: 10 +--- + # Accessing Redis Message Information with FastStream In **FastStream**, messages passed through a **Redis** broker are serialized and can be interacted with just like function parameters. However, you might occasionally need to access more than just the message content, such as metadata and other attributes. diff --git a/docs/docs/en/redis/pubsub/index.md b/docs/docs/en/redis/pubsub/index.md index b2d4d5f185..795aaf16d8 100644 --- a/docs/docs/en/redis/pubsub/index.md +++ b/docs/docs/en/redis/pubsub/index.md @@ -1,3 +1,13 @@ +--- +# 0.5 - API +# 2 - Release +# 3 - Contributing +# 5 - Template Page +# 10 - Default +search: + boost: 10 +--- + # Redis Channels [**Redis Pub/Sub Channels**](https://redis.io/docs/interact/pubsub/){.external-link target="_blank"} are a feature of **Redis** that enables messaging between clients through a publish/subscribe (pub/sub) pattern. A **Redis** channel is essentially a medium through which messages are transmitted. Different clients can subscribe to these channels to listen for messages, while other clients can publish messages to these channels. 
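The channel semantics just described — every subscriber of a channel receives each published message — can be sketched with a toy in-memory model. The channel and client names are invented for illustration; this is not the Redis client API:

```python
from collections import defaultdict

subscriptions = defaultdict(list)   # channel name -> list of client inboxes
inboxes = {"client-a": [], "client-b": []}


def subscribe(channel, client):
    subscriptions[channel].append(inboxes[client])


def publish(channel, message):
    # every subscriber of the channel receives its own copy of the message
    for inbox in subscriptions[channel]:
        inbox.append(message)


subscribe("logs", "client-a")
subscribe("logs", "client-b")
publish("logs", "service started")
```

Both subscribed clients end up with the published message in their inboxes, while a client that never subscribed receives nothing — the fan-out behavior that distinguishes channels from queues.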
diff --git a/docs/docs/en/redis/pubsub/publishing.md b/docs/docs/en/redis/pubsub/publishing.md index f89cea69dd..21c253f077 100644 --- a/docs/docs/en/redis/pubsub/publishing.md +++ b/docs/docs/en/redis/pubsub/publishing.md @@ -1,3 +1,13 @@ +--- +# 0.5 - API +# 2 - Release +# 3 - Contributing +# 5 - Template Page +# 10 - Default +search: + boost: 10 +--- + # Publishing The **FastStream** `RedisBroker` supports all standard [publishing use cases](../../getting-started/publishing/index.md){.internal-link} similar to the `KafkaBroker`, allowing you to publish messages to Redis channels with ease. @@ -19,7 +29,7 @@ To publish a message to a Redis channel, follow these steps: 2. Publish a message using the `publish` method ```python linenums="1" - {!> docs_src/redis/pub_sub/raw_publish.py [ln:28] !} + {!> docs_src/redis/pub_sub/raw_publish.py [ln:28.9] !} ``` This is the most straightforward way to use the RedisBroker to publish messages to Redis channels. @@ -43,7 +53,7 @@ For a more structured approach and to include your publishers in the AsyncAPI do 3. Publish a message using the `publish` method of the prepared publisher ```python linenums="1" - {!> docs_src/redis/pub_sub/publisher_object.py [ln:27] !} + {!> docs_src/redis/pub_sub/publisher_object.py [ln:27.9] !} ``` When you encapsulate your broker within a FastStream object, the publisher will be documented in your service's AsyncAPI documentation. @@ -52,14 +62,14 @@ When you encapsulate your broker within a FastStream object, the publisher will Decorators in FastStream provide a convenient way to define the data flow within your application. The `RedisBroker` allows you to use decorators to publish messages to Redis channels, similar to the `KafkaBroker`. -By decorating a function with both `@broker.subscriber` and `@broker.publisher`, you create a DataPipeline unit that processes incoming messages and publishes the results to another channel. 
The order of decorators does not matter, but they must be applied to a function that has already been decorated by a `@broker.subscriber`. +By decorating a function with both `#!python @broker.subscriber(...)` and `#!python @broker.publisher(...)`, you create a DataPipeline unit that processes incoming messages and publishes the results to another channel. The order of decorators does not matter, but they must be applied to a function that has already been decorated by a `#!python @broker.subscriber(...)`. The decorated function should have a return type annotation to ensure the correct interpretation of the return value before it's published. Here's an example of using decorators with RedisBroker: ```python linenums="1" -{!> docs_src/redis/pub_sub/publisher_decorator.py !} +{! docs_src/redis/pub_sub/publisher_decorator.py !} ``` 1. **Initialize the RedisBroker instance:** Start by creating a RedisBroker instance. @@ -80,7 +90,7 @@ Here's an example of using decorators with RedisBroker: {!> docs_src/redis/pub_sub/publisher_decorator.py [ln:22-23] !} ``` -4. **Decorate your processing function:** Apply the `@broker.subscriber` and `@broker.publisher` decorators to your function to define the input channel and the output channel, respectively. Once your application is running, this decorated function will be triggered whenever a new message arrives on the "input_data" channel, and it will publish the result to the "output_data" channel. +4. **Decorate your processing function:** Apply the `#!python @broker.subscriber(...)` and `#!python @broker.publisher(...)` decorators to your function to define the input channel and the output channel, respectively. Once your application is running, this decorated function will be triggered whenever a new message arrives on the `#!python "input_data"` channel, and it will publish the result to the `#!python "output_data"` channel. 
```python linenums="1" {!> docs_src/redis/pub_sub/publisher_decorator.py [ln:20-23] !} diff --git a/docs/docs/en/redis/pubsub/subscription.md b/docs/docs/en/redis/pubsub/subscription.md index 7b4d4fb611..e0c0682b60 100644 --- a/docs/docs/en/redis/pubsub/subscription.md +++ b/docs/docs/en/redis/pubsub/subscription.md @@ -1,3 +1,13 @@ +--- +# 0.5 - API +# 2 - Release +# 3 - Contributing +# 5 - Template Page +# 10 - Default +search: + boost: 10 +--- + # Channel Subscription ## Basic Channel Subscription @@ -9,7 +19,7 @@ In this example, we will build a FastStream application that listens to messages The complete application code is presented below: ```python linenums="1" -{!> docs_src/redis/pub_sub/channel_sub.py!} +{! docs_src/redis/pub_sub/channel_sub.py!} ``` ### Import FastStream and RedisBroker @@ -17,7 +27,7 @@ The complete application code is presented below: To utilize the `#!python @broker.subscriber(...)` decorator for Redis channel subscription, you must first import FastStream and RedisBroker. ```python linenums="1" -{!> docs_src/redis/pub_sub/channel_sub.py [ln:1-2]!} +{! docs_src/redis/pub_sub/channel_sub.py [ln:1-2]!} ``` ### Create a RedisBroker Instance @@ -25,7 +35,7 @@ To utilize the `#!python @broker.subscriber(...)` decorator for Redis channel su Create a `#!python RedisBroker` object and pass it to the `FastStream` object. This setup prepares the application for launch using the FastStream CLI. ```python linenums="1" -{!> docs_src/redis/pub_sub/channel_sub.py [ln:4-5]!} +{! docs_src/redis/pub_sub/channel_sub.py [ln:4-5]!} ``` ### Define the Message Handler Function @@ -33,7 +43,7 @@ Create a `#!python RedisBroker` object and pass it to the `FastStream` object. T Construct a function that will act as the consumer of messages from the `#!python "test"` channel and use the logger to output the message content. ```python linenums="1" -{!> docs_src/redis/pub_sub/channel_sub.py [ln:8-10]!} +{! 
docs_src/redis/pub_sub/channel_sub.py [ln:8-10]!} ``` When a message is published to the **Redis** channel `#!python "test"`, it will trigger the invocation of the decorated function. The message will be passed to the function's `msg` parameter, while the logger will be available for logging purposes. @@ -45,7 +55,7 @@ For subscribing to multiple Redis channels matching a pattern, use the `#!python Here's how to create a FastStream application that subscribes to all channels matching the `#!python "test.*"` pattern: ```python linenums="1" -{!> docs_src/redis/pub_sub/channel_sub_pattern.py!} +{! docs_src/redis/pub_sub/channel_sub_pattern.py!} ``` ### Use PubSub for Pattern Matching @@ -53,7 +63,7 @@ Here's how to create a FastStream application that subscribes to all channels ma Import the `PubSub` class from `faststream.redis` along with other necessary modules. ```python linenums="1" -{!> docs_src/redis/pub_sub/channel_sub_pattern.py [ln:1-2] !} +{! docs_src/redis/pub_sub/channel_sub_pattern.py [ln:1-2] !} ``` ### Specify the Pattern for Channel Subscription @@ -61,7 +71,7 @@ Import the `PubSub` class from `faststream.redis` along with other necessary mod To define the pattern subscription, create a `PubSub` object with the desired pattern (`#!python "test.*"` in this case) and indicate that it's a pattern subscription by setting `pattern=True`. ```python linenums="1" -{!> docs_src/redis/pub_sub/channel_sub_pattern.py [ln:8] !} +{! docs_src/redis/pub_sub/channel_sub_pattern.py [ln:8] !} ``` ### Create the Pattern Message Handler Function @@ -69,7 +79,7 @@ To define the pattern subscription, create a `PubSub` object with the desired pa Decide on a function that will act as the subscriber of messages from channels matching the specified pattern. Logging the messages is handled similarly as with basic channel subscription. ```python linenums="1" -{!> docs_src/redis/pub_sub/channel_sub_pattern.py [ln:8-10] !} +{! 
docs_src/redis/pub_sub/channel_sub_pattern.py [ln:8-10] !} ``` With pattern channel subscription, when a message is published to a channel that matches the specified pattern (`#!python "test.*"`), our handler function will be invoked. The message is delivered to the `msg` argument of the function, similar to how it works in basic channel subscriptions. @@ -79,5 +89,5 @@ With pattern channel subscription, when a message is published to a channel that You can also use the **Redis Pub/Sub** pattern feature to encode some data directly in the channel name. With **FastStream** you can easily access this data using the following code: ```python linenums="1" hl_lines="1 8 12" -{!> docs_src/redis/pub_sub/pattern_data.py !} +{! docs_src/redis/pub_sub/pattern_data.py !} ``` diff --git a/docs/docs/en/redis/rpc.md b/docs/docs/en/redis/rpc.md index f0aa38d24a..0921e7a6d7 100644 --- a/docs/docs/en/redis/rpc.md +++ b/docs/docs/en/redis/rpc.md @@ -1,3 +1,13 @@ +--- +# 0.5 - API +# 2 - Release +# 3 - Contributing +# 5 - Template Page +# 10 - Default +search: + boost: 10 +--- + # Redis RPC with FastStream **FastStream** `RedisBroker` provides the powerful capability to perform Remote Procedure Calls (RPC) using **Redis**. This feature enables you to send a message and await a response, effectively creating a synchronous request-response pattern over the inherently asynchronous **Redis** messaging system. Below is the guide to set up and utilize the **Redis RPC** publishing feature with **FastStream**. @@ -42,7 +52,7 @@ In this example, we assert that the `msg` sent is the same as the response recei Combining all the code snippets above, here is the complete example of how to set up **Redis RPC** with **FastStream** `RedisBroker`: ```python linenums="1" -{!> docs_src/redis/rpc/app.py !} +{! docs_src/redis/rpc/app.py !} ``` By embracing **Redis** RPC with **FastStream**, you can build sophisticated message-based architectures that require direct feedback from message processors. 
This feature is particularly suitable for cases where immediate processing is necessary or calling functions across different services is essential. diff --git a/docs/docs/en/redis/security.md b/docs/docs/en/redis/security.md index 61af21d985..0093d829cc 100644 --- a/docs/docs/en/redis/security.md +++ b/docs/docs/en/redis/security.md @@ -1,3 +1,13 @@ +--- +# 0.5 - API +# 2 - Release +# 3 - Contributing +# 5 - Template Page +# 10 - Default +search: + boost: 10 +--- + # FastStream Redis Security This chapter discusses the security options available in **FastStream** and how to use them. @@ -13,7 +23,7 @@ This chapter discusses the security options available in **FastStream** and how **Usage:** ```python linenums="1" hl_lines="6-7 9" -{!> docs_src/redis/security/basic.py !} +{! docs_src/redis/security/basic.py !} ``` ### 2. SASLPlaintext Object with SSL/TLS @@ -23,5 +33,5 @@ This chapter discusses the security options available in **FastStream** and how **Usage:** ```python linenums="1" hl_lines="6-11 13" -{!> docs_src/redis/security/plaintext.py !} +{! docs_src/redis/security/plaintext.py !} ``` diff --git a/docs/docs/en/redis/streams/ack.md b/docs/docs/en/redis/streams/ack.md index 0d5ea7fc5e..65666f0dec 100644 --- a/docs/docs/en/redis/streams/ack.md +++ b/docs/docs/en/redis/streams/ack.md @@ -1,3 +1,13 @@ +--- +# 0.5 - API +# 2 - Release +# 3 - Contributing +# 5 - Template Page +# 10 - Default +search: + boost: 10 +--- + # Stream Acknowledgement When working with *Redis* streams in the **FastStream** library, it's important to manage message acknowledgements carefully to ensure that messages are not lost and that they have been processed as intended. 
@@ -31,10 +41,10 @@ Using `ack` will mark the message as processed in the stream, while `nack` is us If the need arises to instantly interrupt message processing at any point in the call stack and acknowledge the message, you can achieve this by raising the `faststream.exceptions.AckMessage` exception: -``` python linenums="1" hl_lines="2 16" -{!> docs_src/redis/stream/ack_errors.py !} +```python linenums="1" hl_lines="2 16" +{! docs_src/redis/stream/ack_errors.py !} ``` By raising `AckMessage`, **FastStream** will halt the current message processing routine and immediately acknowledge it. Analogously, raising `NackMessage` would prevent the message from being acknowledged and could lead to its subsequent reprocessing by the same or a different consumer. -{!> includes/en/no_ack.md !} +{! includes/en/no_ack.md !} diff --git a/docs/docs/en/redis/streams/batch.md b/docs/docs/en/redis/streams/batch.md index dff5778b38..ed55aa249e 100644 --- a/docs/docs/en/redis/streams/batch.md +++ b/docs/docs/en/redis/streams/batch.md @@ -1,3 +1,13 @@ +--- +# 0.5 - API +# 2 - Release +# 3 - Contributing +# 5 - Template Page +# 10 - Default +search: + boost: 10 +--- + # Redis Stream Batch Subscriber If you want to consume data in batches from a Redis stream, the `#!python @broker.subscriber(...)` decorator makes it possible. By defining your consumed `msg` object as a list of messages and setting the `batch` parameter to `True` within the `StreamSub` object, the subscriber will call your consuming function with a batch of messages. Let's walk through how to achieve this with the FastStream library. @@ -11,7 +21,7 @@ To consume messages in batches from a Redis stream, follow these steps: In your FastStream application, define the subscriber using the `#!python @broker.subscriber(...)` decorator. Ensure that you pass a `StreamSub` object with the `batch` parameter set to `True`. 
This configuration tells the subscriber to handle message consumption in batches from the specified Redis stream. ```python linenums="1" -{!> docs_src/redis/stream/batch_sub.py [ln:8] !} +{! docs_src/redis/stream/batch_sub.py [ln:8] !} ``` ### Step 2: Implement Your Consuming Function @@ -19,7 +29,7 @@ In your FastStream application, define the subscriber using the `#!python @broke Create a consuming function that accepts the list of messages. The `#!python @broker.subscriber(...)` decorator will take care of collecting and grouping messages into batches. ```python linenums="1" -{!> docs_src/redis/stream/batch_sub.py [ln:8-10] !} +{! docs_src/redis/stream/batch_sub.py [ln:8-10] !} ``` ## Example of Consuming in Batches @@ -27,7 +37,7 @@ Create a consuming function that accepts the list of messages. The `#!python @br Let's illustrate how to consume messages in batches from the `#!python "test-stream"` Redis stream with a practical example: ```python linenums="1" -{!> docs_src/redis/stream/batch_sub.py !} +{! docs_src/redis/stream/batch_sub.py !} ``` In this example, the subscriber is configured to process messages in batches from the Redis stream, and the consuming function is designed to handle these batches efficiently. diff --git a/docs/docs/en/redis/streams/groups.md b/docs/docs/en/redis/streams/groups.md index 04def843a0..9272d405ad 100644 --- a/docs/docs/en/redis/streams/groups.md +++ b/docs/docs/en/redis/streams/groups.md @@ -1,15 +1,25 @@ +--- +# 0.5 - API +# 2 - Release +# 3 - Contributing +# 5 - Template Page +# 10 - Default +search: + boost: 10 +--- + # Redis Stream Consumer Groups Consuming messages from a **Redis** stream can be accomplished by using a Consumer Group. This allows multiple consumers to divide the workload of processing messages in a stream and provides a form of message acknowledgment, ensuring that messages are not processed repeatedly. 
-Consumer Groups in Redis enable a group of clients to cooperatively consume different portions of the same stream of messages. When using `group="..."` (which internally uses `XREADGROUP`), messages are distributed among different consumers in a group and are not delivered to any other consumer in that group again, unless they are not acknowledged (i.e., the client fails to process and does not call `msg.ack()` or `XACK`). This is in contrast to a normal consumer (also known as `XREAD`), where every consumer sees all the messages. `XREAD` is useful for broadcasting to multiple consumers, while `XREADGROUP` is better suited for workload distribution. +Consumer Groups in Redis enable a group of clients to cooperatively consume different portions of the same stream of messages. When using `#!python group="..."` (which internally uses `XREADGROUP`), messages are distributed among different consumers in a group and are not delivered to any other consumer in that group again, unless they are not acknowledged (i.e., the client fails to process and does not call `msg.ack()` or `XACK`). This is in contrast to a normal consumer (also known as `XREAD`), where every consumer sees all the messages. `XREAD` is useful for broadcasting to multiple consumers, while `XREADGROUP` is better suited for workload distribution. -In the following example, we will create a simple FastStream app that utilizes a Redis stream with a Consumer Group. It will consume messages sent to the `test-stream` as part of the `test-group` consumer group. +In the following example, we will create a simple FastStream app that utilizes a Redis stream with a Consumer Group. It will consume messages sent to the `#!python "test-stream"` as part of the `#!python "test-group"` consumer group. The full app code is as follows: ```python linenums="1" -{!> docs_src/redis/stream/group.py !} +{! 
docs_src/redis/stream/group.py !} ``` ## Import FastStream and RedisBroker @@ -17,7 +27,7 @@ The full app code is as follows: First, import the `FastStream` class and the `RedisBroker` from the `faststream.redis` module to define our broker. ```python linenums="1" -{!> docs_src/redis/stream/group.py [ln:1-2] !} +{! docs_src/redis/stream/group.py [ln:1-2] !} ``` ## Create a RedisBroker @@ -25,15 +35,15 @@ First, import the `FastStream` class and the `RedisBroker` from the `faststream. To establish a connection to Redis, instantiate a `RedisBroker` object and pass it to the `FastStream` app. ```python linenums="1" -{!> docs_src/redis/stream/group.py [ln:4-5] !} +{! docs_src/redis/stream/group.py [ln:4-5] !} ``` ## Define a Consumer Group Subscription -Define a subscription to a Redis stream with a specific Consumer Group using the `StreamSub` object and the `@broker.subscriber(...)` decorator. Then, define a function that will be triggered when new messages are sent to the `test-stream` Redis stream. This function is decorated with `@broker.subscriber(...)` and will process the messages as part of the `test-group` consumer group. +Define a subscription to a Redis stream with a specific Consumer Group using the `StreamSub` object and the `#!python @broker.subscriber(...)` decorator. Then, define a function that will be triggered when new messages are sent to the `#!python "test-stream"` Redis stream. This function is decorated with `#!python @broker.subscriber(...)` and will process the messages as part of the `#!python "test-group"` consumer group. ```python linenums="1" -{!> docs_src/redis/stream/group.py [ln:8-10] !} +{! docs_src/redis/stream/group.py [ln:8-10] !} ``` ## Publishing a message @@ -41,7 +51,7 @@ Define a subscription to a Redis stream with a specific Consumer Group using the Publishing a message is the same as what's defined on [Stream Publishing](./publishing.md). ```python linenums="1" -{!> docs_src/redis/stream/group.py [ln:15] !} +{! 
docs_src/redis/stream/group.py [ln:15.5] !} ``` By following the steps and code examples provided above, you can create a FastStream application that consumes messages from a Redis stream using a Consumer Group for distributed message processing. diff --git a/docs/docs/en/redis/streams/index.md b/docs/docs/en/redis/streams/index.md index 448e785b30..f9da536f9f 100644 --- a/docs/docs/en/redis/streams/index.md +++ b/docs/docs/en/redis/streams/index.md @@ -1,3 +1,13 @@ +--- +# 0.5 - API +# 2 - Release +# 3 - Contributing +# 5 - Template Page +# 10 - Default +search: + boost: 10 +--- + # Redis Streams [Redis Streams](https://redis.io/docs/data-types/streams/){.external-link target="_blank"} are a data structure introduced in **Redis 5.0** that offer a reliable and highly scalable way to handle streams of data. They are similar to logging systems like **Apache Kafka**, where data is stored in a log structure and can be consumed by multiple clients. **Streams** provide a sequence of ordered messages, and they are designed to handle a high volume of data by allowing partitioning and multiple consumers. diff --git a/docs/docs/en/redis/streams/publishing.md b/docs/docs/en/redis/streams/publishing.md index 6a86c5a3a1..f6ca230799 100644 --- a/docs/docs/en/redis/streams/publishing.md +++ b/docs/docs/en/redis/streams/publishing.md @@ -1,8 +1,18 @@ +--- +# 0.5 - API +# 2 - Release +# 3 - Contributing +# 5 - Template Page +# 10 - Default +search: + boost: 10 +--- + # Redis Stream Publishing with FastStream ## Publishing Data to Redis Stream -To publish messages to a Redis Stream, you implement a function that processes the incoming data and applies the `@broker.publisher` decorator along with the Redis stream name to it. The function will then publish its return value to the specified stream. 
+To publish messages to a Redis Stream, you implement a function that processes the incoming data and applies the `#!python @broker.publisher(...)` decorator along with the Redis stream name to it. The function will then publish its return value to the specified stream. 1. Create your RedisBroker instance @@ -24,16 +34,16 @@ To publish messages to a Redis Stream, you implement a function that processes t 1. Set up the function for data processing and publishing - Using the `@broker.publisher()` decorator in conjunction with the `@broker.subscriber()` decorator allows seamless message processing and republishing to a different stream. + Using the `#!python @broker.publisher(...)` decorator in conjunction with the `#!python @broker.subscriber(...)` decorator allows seamless message processing and republishing to a different stream. ```python linenums="1" {!> docs_src/redis/stream/pub.py [ln:17-20] !} ``` - By decorating a function with `@broker.publisher`, we tell FastStream to publish the function's returned data to the designated output stream. The defined function also serves as a subscriber to the `input-stream`, thereby setting up a straightforward data pipeline within Redis streams. + By decorating a function with `#!python @broker.publisher(...)`, we tell FastStream to publish the function's returned data to the designated output stream. The defined function also serves as a subscriber to the `#!python "input-stream"`, thereby setting up a straightforward data pipeline within Redis streams. Here's the complete example that showcases the use of decorators for both subscribing and publishing to Redis streams: ```python linenums="1" -{!> docs_src/redis/stream/pub.py !} +{!
docs_src/redis/stream/pub.py !} ``` diff --git a/docs/docs/en/redis/streams/subscription.md b/docs/docs/en/redis/streams/subscription.md index 0819a76eea..0f75c83ad5 100644 --- a/docs/docs/en/redis/streams/subscription.md +++ b/docs/docs/en/redis/streams/subscription.md @@ -1,3 +1,13 @@ +--- +# 0.5 - API +# 2 - Release +# 3 - Contributing +# 5 - Template Page +# 10 - Default +search: + boost: 10 +--- + # Redis Stream Basic Subscriber To start consuming from a **Redis** stream, simply decorate your consuming function with the `#!python @broker.subscriber(...)` decorator, passing a string as the stream key. @@ -7,7 +17,7 @@ In the following example, we will create a simple FastStream app that will consu The full app code looks like this: ```python linenums="1" -{!> docs_src/redis/stream/sub.py !} +{! docs_src/redis/stream/sub.py !} ``` ## Import FastStream and RedisBroker @@ -15,7 +25,7 @@ The full app code looks like this: To use the `#!python @broker.subscriber(...)` decorator, first, we need to import the base FastStream app and RedisBroker to create our broker. ```python linenums="1" -{!> docs_src/redis/stream/sub.py [ln:1-2] !} +{! docs_src/redis/stream/sub.py [ln:1-2] !} ``` ## Create a RedisBroker @@ -23,7 +33,7 @@ To use the `#!python @broker.subscriber(...)` decorator, first, we need to impor Next, we will create a `RedisBroker` object and wrap it into the `FastStream` object so that we can start our app using CLI later. ```python linenums="1" -{!> docs_src/redis/stream/sub.py [ln:4-5] !} +{! docs_src/redis/stream/sub.py [ln:4-5] !} ``` ## Create a Function that will Consume Messages from a Redis stream @@ -31,7 +41,7 @@ Next, we will create a `RedisBroker` object and wrap it into the `FastStream` ob Let’s create a consumer function that will consume messages from `#!python "test-stream"` Redis stream and log them. ```python linenums="1" -{!> docs_src/redis/stream/sub.py [ln:8-10] !} +{! 
docs_src/redis/stream/sub.py [ln:8-10] !} ``` The function decorated with the `#!python @broker.subscriber(...)` decorator will be called when a message is produced to the **Redis** stream. diff --git a/docs/docs/en/release.md b/docs/docs/en/release.md index ea0ed39d00..10a1a7a084 100644 --- a/docs/docs/en/release.md +++ b/docs/docs/en/release.md @@ -18,9 +18,9 @@ hide: A large update by [@Lancetnik](https://github.com/Lancetnik){.external-link target="_blank"} in [#1048](https://github.com/airtai/faststream/pull/1048){.external-link target="_blank"} -Provides with the ability to setup `graceful_timeout` to wait for consumed messages processed correctly before apllication shutdown - `Broker(graceful_timeout=30.0)` (waits up to 30 seconds) +Provides the ability to set up `graceful_timeout` to wait for consumed messages to be processed correctly before application shutdown - `#!python Broker(graceful_timeout=30.0)` (waits up to `#!python 30` seconds) -* allows to get acces to `context.get_local("message"` from **FastAPI** plugin +* allows access to `#!python context.get_local("message")` from the **FastAPI** plugin * docs: fix Avro custom serialization example * docs: add KafkaBroker `publish_batch` notice * docs: add RabbitMQ security page @@ -127,7 +127,7 @@ from faststream.redis import RedisBroker broker = RedisBroker() app = FastStream(broker) -[@broker](https://github.com/broker){.external-link target="_blank"}.subscriber( +@broker.subscriber( channel="test", # or # list="test", or # stream="test", @@ -147,7 +147,7 @@ async def handle(msg: str, logger: Logger): * Support `faststream run -k 1 -k 2 ...` as `k=["1", "2"]` extra options * Add subscriber, publisher and router `include_in_schema: bool` argument to disable **AsyncAPI** render * remove `watchfiles` from default distribution - * Allow create `broker.publisher` with already running broker + * Allow create `#!python broker.publisher(...)` with already running broker * **FastAPI**-like lifespan `FastStream`
application context manager * automatic `TestBroker(connect_only=...)` argument based on AST * add `NatsMessage.in_progress()` method @@ -200,7 +200,7 @@ pip install faststream==0.3.0rc0 && pip install "faststream[redis]" * Support `faststream run -k 1 -k 2 ...` as `k=["1", "2"]` extra options * Add subscriber, publisher and router `include_in_schema: bool` argument to disable **AsyncAPI** render * remove `watchfiles` from default distribution -* Allow create `broker.publisher` with already running broker +* Allow create `#!python @broker.publisher(...)` with already running broker * **FastAPI**-like lifespan `FastStream` application context manager * automatic `TestBroker(connect_only=...)` argument based on AST * add `NatsMessage.in_progress()` method diff --git a/docs/docs/en/scheduling.md b/docs/docs/en/scheduling.md index 3c24ee4979..35bac8ff8e 100644 --- a/docs/docs/en/scheduling.md +++ b/docs/docs/en/scheduling.md @@ -1,3 +1,13 @@ +--- +# 0.5 - API +# 2 - Release +# 3 - Contributing +# 5 - Template Page +# 10 - Default +search: + boost: 10 +--- + # Tasks Scheduling **FastStream** is a framework for asynchronous service development. It allows you to build distributed event-based systems in an easy way. Task scheduling is a common use case in such systems. @@ -26,7 +36,7 @@ Let's take a look at the code example. At first, we should create a regular **FastStream** application. -{!> includes/scheduling/app.md !} +{! includes/scheduling/app.md !} ### Broker Wrapper @@ -40,7 +50,7 @@ taskiq_broker = BrokerWrapper(broker) It creates a *taskiq-compatible* object that can be used to create a regular [**taskiq** scheduler](https://taskiq-python.github.io/guide/scheduling-tasks.html){.external-link target="_blank"}. -{!> includes/scheduling/taskiq_broker.md !} +{! includes/scheduling/taskiq_broker.md !} !!! note "" We patched the original `TaskiqScheduler` to support message generation callbacks, but its signature remains the same.
@@ -104,7 +114,7 @@ Also, you can integrate your **FastStream** application with any other libraries As an example, you can use [**Rocketry**](https://github.com/Miksus/rocketry){.external-link target="_blank"}: -```python +```python linenums="1" import asyncio from rocketry import Rocketry diff --git a/docs/docs_src/getting_started/asyncapi/asyncapi_customization/custom_info.py b/docs/docs_src/getting_started/asyncapi/asyncapi_customization/custom_info.py index 9ee239f904..7c284c8299 100644 --- a/docs/docs_src/getting_started/asyncapi/asyncapi_customization/custom_info.py +++ b/docs/docs_src/getting_started/asyncapi/asyncapi_customization/custom_info.py @@ -5,14 +5,15 @@ broker = KafkaBroker("localhost:9092") description="""# Title of the description This description supports **Markdown** syntax""" -app = FastStream(broker, - title="My App", - version="1.0.0", - description=description, - license=License(name="MIT", url="https://opensource.org/license/mit/"), - terms_of_service="https://my-terms.com/", - contact=Contact(name="support", url="https://help.com/"), - ) +app = FastStream( + broker, + title="My App", + version="1.0.0", + description=description, + license=License(name="MIT", url="https://opensource.org/license/mit/"), + terms_of_service="https://my-terms.com/", + contact=Contact(name="support", url="https://help.com/"), +) @broker.publisher("output_data") @broker.subscriber("input_data") diff --git a/docs/docs_src/getting_started/routers/kafka/router_delay.py b/docs/docs_src/getting_started/routers/kafka/router_delay.py index 26d9178fb9..721a013793 100644 --- a/docs/docs_src/getting_started/routers/kafka/router_delay.py +++ b/docs/docs_src/getting_started/routers/kafka/router_delay.py @@ -1,5 +1,6 @@ from faststream import FastStream -from faststream.kafka import KafkaBroker, KafkaRoute, KafkaRouter +from faststream.kafka import KafkaBroker +from faststream.kafka import KafkaRoute, KafkaRouter broker = KafkaBroker("localhost:9092") app = 
FastStream(broker) @@ -10,7 +11,11 @@ async def handle(name: str, user_id: int): assert user_id == 1 -router = KafkaRouter(handlers=(KafkaRoute(handle, "test-topic"),)) +router = KafkaRouter( + handlers=( + KafkaRoute(handle, "test-topic"), + ) +) broker.include_router(router) diff --git a/docs/docs_src/getting_started/routers/nats/router_delay.py b/docs/docs_src/getting_started/routers/nats/router_delay.py index c000c5b48d..74bf1723ac 100644 --- a/docs/docs_src/getting_started/routers/nats/router_delay.py +++ b/docs/docs_src/getting_started/routers/nats/router_delay.py @@ -1,5 +1,6 @@ from faststream import FastStream -from faststream.nats import NatsBroker, NatsRoute, NatsRouter +from faststream.nats import NatsBroker +from faststream.nats import NatsRoute, NatsRouter broker = NatsBroker("nats://localhost:4222") app = FastStream(broker) @@ -10,7 +11,11 @@ async def handle(name: str, user_id: int): assert user_id == 1 -router = NatsRouter(handlers=(NatsRoute(handle, "test-subject"),)) +router = NatsRouter( + handlers=( + NatsRoute(handle, "test-subject"), + ) +) broker.include_router(router) diff --git a/docs/docs_src/getting_started/routers/rabbit/router_delay.py b/docs/docs_src/getting_started/routers/rabbit/router_delay.py index cee597170a..d809a48681 100644 --- a/docs/docs_src/getting_started/routers/rabbit/router_delay.py +++ b/docs/docs_src/getting_started/routers/rabbit/router_delay.py @@ -1,5 +1,6 @@ from faststream import FastStream -from faststream.rabbit import RabbitBroker, RabbitRoute, RabbitRouter +from faststream.rabbit import RabbitBroker +from faststream.rabbit import RabbitRoute, RabbitRouter broker = RabbitBroker("amqp://guest:guest@localhost:5672/") app = FastStream(broker) @@ -10,7 +11,11 @@ async def handle(name: str, user_id: int): assert user_id == 1 -router = RabbitRouter(handlers=(RabbitRoute(handle, "test-queue"),)) +router = RabbitRouter( + handlers=( + RabbitRoute(handle, "test-queue"), + ) +) broker.include_router(router) diff --git 
a/docs/docs_src/getting_started/routers/redis/router_delay.py b/docs/docs_src/getting_started/routers/redis/router_delay.py
index 33779b5630..2417568694 100644
--- a/docs/docs_src/getting_started/routers/redis/router_delay.py
+++ b/docs/docs_src/getting_started/routers/redis/router_delay.py
@@ -1,5 +1,6 @@
 from faststream import FastStream
-from faststream.redis import RedisBroker, RedisRouter, RedisRoute
+from faststream.redis import RedisBroker
+from faststream.redis import RedisRouter, RedisRoute

 broker = RedisBroker("redis://localhost:6379")
 app = FastStream(broker)
@@ -10,7 +11,11 @@ async def handle(name: str, user_id: int):
     assert user_id == 1


-router = RedisRouter(handlers=(RedisRoute(handle, "test-channel"),))
+router = RedisRouter(
+    handlers=(
+        RedisRoute(handle, "test-channel"),
+    )
+)

 broker.include_router(router)
diff --git a/docs/docs_src/getting_started/subscription/kafka/annotation.py b/docs/docs_src/getting_started/subscription/kafka/annotation.py
index b95e7ffd8b..f45878c900 100644
--- a/docs/docs_src/getting_started/subscription/kafka/annotation.py
+++ b/docs/docs_src/getting_started/subscription/kafka/annotation.py
@@ -6,6 +6,9 @@


 @broker.subscriber("test-topic")
-async def handle(name: str, user_id: int):
+async def handle(
+    name: str,
+    user_id: int,
+):
     assert name == "John"
     assert user_id == 1
diff --git a/docs/docs_src/getting_started/subscription/kafka/pydantic_annotated_fields.py b/docs/docs_src/getting_started/subscription/kafka/pydantic_annotated_fields.py
new file mode 100644
index 0000000000..f4bb952268
--- /dev/null
+++ b/docs/docs_src/getting_started/subscription/kafka/pydantic_annotated_fields.py
@@ -0,0 +1,24 @@
+from typing import Annotated
+
+from pydantic import Field, NonNegativeInt
+
+from faststream import FastStream
+from faststream.kafka import KafkaBroker
+
+broker = KafkaBroker("localhost:9092")
+app = FastStream(broker)
+
+
+@broker.subscriber("test")
+async def handle(
+    name: Annotated[
+        str,
+        Field(..., examples=["John"], description="Registered user name")
+    ],
+    user_id: Annotated[
+        NonNegativeInt,
+        Field(..., examples=[1], description="Registered user id"),
+    ]
+):
+    assert name == "John"
+    assert user_id == 1
diff --git a/docs/docs_src/getting_started/subscription/kafka/pydantic_model.py b/docs/docs_src/getting_started/subscription/kafka/pydantic_model.py
index d54fd18d08..7efd395439 100644
--- a/docs/docs_src/getting_started/subscription/kafka/pydantic_model.py
+++ b/docs/docs_src/getting_started/subscription/kafka/pydantic_model.py
@@ -17,6 +17,8 @@ class UserInfo(BaseModel):


 @broker.subscriber("test-topic")
-async def handle(user: UserInfo):
+async def handle(
+    user: UserInfo,
+):
     assert user.name == "John"
     assert user.user_id == 1
diff --git a/docs/docs_src/getting_started/subscription/nats/annotation.py b/docs/docs_src/getting_started/subscription/nats/annotation.py
index d0da646d14..de3bd986f1 100644
--- a/docs/docs_src/getting_started/subscription/nats/annotation.py
+++ b/docs/docs_src/getting_started/subscription/nats/annotation.py
@@ -6,6 +6,9 @@


 @broker.subscriber("test-subject")
-async def handle(name: str, user_id: int):
+async def handle(
+    name: str,
+    user_id: int,
+):
     assert name == "John"
     assert user_id == 1
diff --git a/docs/docs_src/getting_started/subscription/nats/pydantic_annotated_fields.py b/docs/docs_src/getting_started/subscription/nats/pydantic_annotated_fields.py
new file mode 100644
index 0000000000..a4802e427c
--- /dev/null
+++ b/docs/docs_src/getting_started/subscription/nats/pydantic_annotated_fields.py
@@ -0,0 +1,24 @@
+from typing import Annotated
+
+from pydantic import Field, NonNegativeInt
+
+from faststream import FastStream
+from faststream.nats import NatsBroker
+
+broker = NatsBroker("nats://localhost:4222")
+app = FastStream(broker)
+
+
+@broker.subscriber("test")
+async def handle(
+    name: Annotated[
+        str,
+        Field(..., examples=["John"], description="Registered user name")
+    ],
+    user_id: Annotated[
+        NonNegativeInt,
+        Field(..., examples=[1], description="Registered user id"),
+    ]
+):
+    assert name == "John"
+    assert user_id == 1
diff --git a/docs/docs_src/getting_started/subscription/nats/pydantic_model.py b/docs/docs_src/getting_started/subscription/nats/pydantic_model.py
index fc915891ec..4382fc00ca 100644
--- a/docs/docs_src/getting_started/subscription/nats/pydantic_model.py
+++ b/docs/docs_src/getting_started/subscription/nats/pydantic_model.py
@@ -17,6 +17,8 @@ class UserInfo(BaseModel):


 @broker.subscriber("test-subject")
-async def handle(user: UserInfo):
+async def handle(
+    user: UserInfo,
+):
     assert user.name == "John"
     assert user.user_id == 1
diff --git a/docs/docs_src/getting_started/subscription/rabbit/annotation.py b/docs/docs_src/getting_started/subscription/rabbit/annotation.py
index 8dbf401627..261e2b23db 100644
--- a/docs/docs_src/getting_started/subscription/rabbit/annotation.py
+++ b/docs/docs_src/getting_started/subscription/rabbit/annotation.py
@@ -6,6 +6,9 @@


 @broker.subscriber("test-queue")
-async def handle(name: str, user_id: int):
+async def handle(
+    name: str,
+    user_id: int,
+):
     assert name == "John"
     assert user_id == 1
diff --git a/docs/docs_src/getting_started/subscription/rabbit/pydantic_annotated_fields.py b/docs/docs_src/getting_started/subscription/rabbit/pydantic_annotated_fields.py
new file mode 100644
index 0000000000..ab66c0329e
--- /dev/null
+++ b/docs/docs_src/getting_started/subscription/rabbit/pydantic_annotated_fields.py
@@ -0,0 +1,24 @@
+from typing import Annotated
+
+from pydantic import Field, NonNegativeInt
+
+from faststream import FastStream
+from faststream.rabbit import RabbitBroker
+
+broker = RabbitBroker("amqp://guest:guest@localhost:5672/")
+app = FastStream(broker)
+
+
+@broker.subscriber("test")
+async def handle(
+    name: Annotated[
+        str,
+        Field(..., examples=["John"], description="Registered user name")
+    ],
+    user_id: Annotated[
+        NonNegativeInt,
+        Field(..., examples=[1], description="Registered user id"),
+    ]
+):
+    assert name == "John"
+    assert user_id == 1
diff --git a/docs/docs_src/getting_started/subscription/rabbit/pydantic_model.py b/docs/docs_src/getting_started/subscription/rabbit/pydantic_model.py
index 118be562bc..f3b15dac21 100644
--- a/docs/docs_src/getting_started/subscription/rabbit/pydantic_model.py
+++ b/docs/docs_src/getting_started/subscription/rabbit/pydantic_model.py
@@ -17,6 +17,8 @@ class UserInfo(BaseModel):


 @broker.subscriber("test-queue")
-async def handle(user: UserInfo):
+async def handle(
+    user: UserInfo,
+):
     assert user.name == "John"
     assert user.user_id == 1
diff --git a/docs/docs_src/getting_started/subscription/redis/annotation.py b/docs/docs_src/getting_started/subscription/redis/annotation.py
index f54b186fff..3d93f29dc4 100644
--- a/docs/docs_src/getting_started/subscription/redis/annotation.py
+++ b/docs/docs_src/getting_started/subscription/redis/annotation.py
@@ -6,6 +6,9 @@


 @broker.subscriber("test-channel")
-async def handle(name: str, user_id: int):
+async def handle(
+    name: str,
+    user_id: int,
+):
     assert name == "John"
     assert user_id == 1
diff --git a/docs/docs_src/getting_started/subscription/redis/pydantic_annotated_fields.py b/docs/docs_src/getting_started/subscription/redis/pydantic_annotated_fields.py
new file mode 100644
index 0000000000..05acd13739
--- /dev/null
+++ b/docs/docs_src/getting_started/subscription/redis/pydantic_annotated_fields.py
@@ -0,0 +1,23 @@
+from typing import Annotated
+from pydantic import Field, NonNegativeInt
+
+from faststream import FastStream
+from faststream.redis import RedisBroker
+
+broker = RedisBroker("redis://localhost:6379")
+app = FastStream(broker)
+
+
+@broker.subscriber("test")
+async def handle(
+    name: Annotated[
+        str,
+        Field(..., examples=["John"], description="Registered user name")
+    ],
+    user_id: Annotated[
+        NonNegativeInt,
+        Field(..., examples=[1], description="Registered user id"),
+    ]
+):
+    assert name == "John"
+    assert user_id == 1
diff --git a/docs/docs_src/getting_started/subscription/redis/pydantic_model.py b/docs/docs_src/getting_started/subscription/redis/pydantic_model.py
index 0e5e27d7e8..6dce30bffc 100644
--- a/docs/docs_src/getting_started/subscription/redis/pydantic_model.py
+++ b/docs/docs_src/getting_started/subscription/redis/pydantic_model.py
@@ -17,6 +17,8 @@ class UserInfo(BaseModel):


 @broker.subscriber("test-channel")
-async def handle(user: UserInfo):
+async def handle(
+    user: UserInfo,
+):
     assert user.name == "John"
     assert user.user_id == 1
diff --git a/docs/docs_src/kafka/plaintext_security/__init__.py b/docs/docs_src/kafka/plaintext_security/__init__.py
deleted file mode 100644
index e69de29bb2..0000000000
diff --git a/docs/docs_src/kafka/plaintext_security/app.py b/docs/docs_src/kafka/plaintext_security/app.py
deleted file mode 100644
index 747759dad0..0000000000
--- a/docs/docs_src/kafka/plaintext_security/app.py
+++ /dev/null
@@ -1,17 +0,0 @@
-import ssl
-
-from faststream import FastStream
-from faststream.kafka import KafkaBroker
-from faststream.security import SASLPlaintext
-
-ssl_context = ssl.create_default_context()
-security = SASLPlaintext(ssl_context=ssl_context, username="admin", password="password")
-
-broker = KafkaBroker("localhost:9092", security=security)
-app = FastStream(broker)
-
-
-@broker.publisher("test_2")
-@broker.subscriber("test_1")
-async def test_topic(msg: str) -> str:
-    pass
diff --git a/docs/docs_src/kafka/sasl_scram256_security/__init__.py b/docs/docs_src/kafka/sasl_scram256_security/__init__.py
deleted file mode 100644
index e69de29bb2..0000000000
diff --git a/docs/docs_src/kafka/sasl_scram256_security/app.py b/docs/docs_src/kafka/sasl_scram256_security/app.py
deleted file mode 100644
index 771f17999f..0000000000
--- a/docs/docs_src/kafka/sasl_scram256_security/app.py
+++ /dev/null
@@ -1,17 +0,0 @@
-import ssl
-
-from faststream import FastStream
-from faststream.kafka import KafkaBroker
-from faststream.security import SASLScram256
-
-ssl_context = ssl.create_default_context()
-security = SASLScram256(ssl_context=ssl_context, username="admin", password="password")
-
-broker = KafkaBroker("localhost:9092", security=security)
-app = FastStream(broker)
-
-
-@broker.publisher("test_2")
-@broker.subscriber("test_1")
-async def test_topic(msg: str) -> str:
-    pass
diff --git a/docs/docs_src/kafka/sasl_scram512_security/__init__.py b/docs/docs_src/kafka/sasl_scram512_security/__init__.py
deleted file mode 100644
index e69de29bb2..0000000000
diff --git a/docs/docs_src/kafka/sasl_scram512_security/app.py b/docs/docs_src/kafka/sasl_scram512_security/app.py
deleted file mode 100644
index 3d7de26444..0000000000
--- a/docs/docs_src/kafka/sasl_scram512_security/app.py
+++ /dev/null
@@ -1,17 +0,0 @@
-import ssl
-
-from faststream import FastStream
-from faststream.kafka import KafkaBroker
-from faststream.security import SASLScram512
-
-ssl_context = ssl.create_default_context()
-security = SASLScram512(ssl_context=ssl_context, username="admin", password="password")
-
-broker = KafkaBroker("localhost:9092", security=security)
-app = FastStream(broker)
-
-
-@broker.publisher("test_2")
-@broker.subscriber("test_1")
-async def test_topic(msg: str) -> str:
-    pass
diff --git a/docs/docs_src/kafka/basic_security/__init__.py b/docs/docs_src/kafka/security/__init__.py
similarity index 100%
rename from docs/docs_src/kafka/basic_security/__init__.py
rename to docs/docs_src/kafka/security/__init__.py
diff --git a/docs/docs_src/kafka/basic_security/app.py b/docs/docs_src/kafka/security/basic.py
similarity index 60%
rename from docs/docs_src/kafka/basic_security/app.py
rename to docs/docs_src/kafka/security/basic.py
index 39f8ca50c6..6c5b1c4b10 100644
--- a/docs/docs_src/kafka/basic_security/app.py
+++ b/docs/docs_src/kafka/security/basic.py
@@ -1,6 +1,5 @@
 import ssl

-from faststream import FastStream
 from faststream.kafka import KafkaBroker
 from faststream.security import BaseSecurity

@@ -8,10 +7,3 @@
 security = BaseSecurity(ssl_context=ssl_context)

 broker = KafkaBroker("localhost:9092", security=security)
-app = FastStream(broker)
-
-
-@broker.publisher("test_2")
-@broker.subscriber("test_1")
-async def test_topic(msg: str) -> str:
-    pass
diff --git a/docs/docs_src/kafka/security/plaintext.py b/docs/docs_src/kafka/security/plaintext.py
new file mode 100644
index 0000000000..3a17eed5de
--- /dev/null
+++ b/docs/docs_src/kafka/security/plaintext.py
@@ -0,0 +1,13 @@
+import ssl
+
+from faststream.kafka import KafkaBroker
+from faststream.security import SASLPlaintext
+
+ssl_context = ssl.create_default_context()
+security = SASLPlaintext(
+    ssl_context=ssl_context,
+    username="admin",
+    password="password",
+)
+
+broker = KafkaBroker("localhost:9092", security=security)
diff --git a/docs/docs_src/kafka/security/sasl_scram256.py b/docs/docs_src/kafka/security/sasl_scram256.py
new file mode 100644
index 0000000000..04cf0278cd
--- /dev/null
+++ b/docs/docs_src/kafka/security/sasl_scram256.py
@@ -0,0 +1,13 @@
+import ssl
+
+from faststream.kafka import KafkaBroker
+from faststream.security import SASLScram256
+
+ssl_context = ssl.create_default_context()
+security = SASLScram256(
+    ssl_context=ssl_context,
+    username="admin",
+    password="password",
+)
+
+broker = KafkaBroker("localhost:9092", security=security)
diff --git a/docs/docs_src/kafka/security/sasl_scram512.py b/docs/docs_src/kafka/security/sasl_scram512.py
new file mode 100644
index 0000000000..7d1e5ac7ff
--- /dev/null
+++ b/docs/docs_src/kafka/security/sasl_scram512.py
@@ -0,0 +1,13 @@
+import ssl
+
+from faststream.kafka import KafkaBroker
+from faststream.security import SASLScram512
+
+ssl_context = ssl.create_default_context()
+security = SASLScram512(
+    ssl_context=ssl_context,
+    username="admin",
+    password="password",
+)
+
+broker = KafkaBroker("localhost:9092", security=security)
diff --git a/docs/docs_src/kafka/security_without_ssl/example.py b/docs/docs_src/kafka/security/ssl_warning.py
similarity index 91%
rename from docs/docs_src/kafka/security_without_ssl/example.py
rename to docs/docs_src/kafka/security/ssl_warning.py
index 1ef559d824..df4971a90a 100644
--- a/docs/docs_src/kafka/security_without_ssl/example.py
+++ b/docs/docs_src/kafka/security/ssl_warning.py
@@ -9,4 +9,4 @@ def test_without_ssl_warning():
     with pytest.warns(RuntimeWarning, match=ssl_not_set_error_msg):
         parse_security(security)

-    SASLPlaintext(username="admin", password="password", use_ssl=False)  # pragma: allowlist secret
+    SASLPlaintext(username="admin", password="password", use_ssl=False)
diff --git a/docs/includes/en/no_ack.md b/docs/includes/en/no_ack.md
index ce64b2ac8f..84c3f35d9d 100644
--- a/docs/includes/en/no_ack.md
+++ b/docs/includes/en/no_ack.md
@@ -1,3 +1,3 @@
 !!! tip
-    If you want to disable **FastStream** Acknowledgement logic at all, you can use 
+    If you want to disable **FastStream** Acknowledgement logic at all, you can use
     `#!python @broker.subscriber(..., no_ack=True)` option. This way you should always process a message (ack/nack/terminate/etc) by yourself.
diff --git a/docs/includes/getting_started/context/access.md b/docs/includes/getting_started/context/access.md
index b446ab2279..cebaec1690 100644
--- a/docs/includes/getting_started/context/access.md
+++ b/docs/includes/getting_started/context/access.md
@@ -1,19 +1,19 @@
 === "Kafka"
-    ```python linenums="1" hl_lines="1 12-15"
-    {!> docs_src/getting_started/context/kafka/existed_context.py [ln:1-2,10-11,14-23] !}
+    ```python linenums="1" hl_lines="1 10-13"
+    {!> docs_src/getting_started/context/kafka/existed_context.py [ln:1-2,9-12,14-23] !}
     ```

 === "RabbitMQ"
-    ```python linenums="1" hl_lines="1 12-15"
-    {!> docs_src/getting_started/context/rabbit/existed_context.py [ln:1-2,10-11,14-23] !}
+    ```python linenums="1" hl_lines="1 10-13"
+    {!> docs_src/getting_started/context/rabbit/existed_context.py [ln:1-2,9-12,14-23] !}
     ```

 === "NATS"
-    ```python linenums="1" hl_lines="1 12-15"
-    {!> docs_src/getting_started/context/nats/existed_context.py [ln:1-2,10-11,14-23] !}
+    ```python linenums="1" hl_lines="1 10-13"
+    {!> docs_src/getting_started/context/nats/existed_context.py [ln:1-2,9-12,14-23] !}
     ```

 === "Redis"
-    ```python linenums="1" hl_lines="1 12-15"
-    {!> docs_src/getting_started/context/redis/existed_context.py [ln:1-2,10-11,14-23] !}
+    ```python linenums="1" hl_lines="1 10-13"
+    {!> docs_src/getting_started/context/redis/existed_context.py [ln:1-2,9-12,14-23] !}
     ```
diff --git a/docs/includes/getting_started/context/custom_global.md b/docs/includes/getting_started/context/custom_global.md
index c4a6465183..a7040e42df 100644
--- a/docs/includes/getting_started/context/custom_global.md
+++ b/docs/includes/getting_started/context/custom_global.md
@@ -1,19 +1,19 @@
 === "Kafka"
-    ```python linenums="1" hl_lines="9-10"
-    {!> docs_src/getting_started/context/kafka/custom_global_context.py [ln:1-5,16-18] !}
+    ```python linenums="1" hl_lines="8-9"
+    {!> docs_src/getting_started/context/kafka/custom_global_context.py [ln:1-5,15-18] !}
     ```

 === "RabbitMQ"
-    ```python linenums="1" hl_lines="9-10"
-    {!> docs_src/getting_started/context/rabbit/custom_global_context.py [ln:1-5,16-18] !}
+    ```python linenums="1" hl_lines="8-9"
+    {!> docs_src/getting_started/context/rabbit/custom_global_context.py [ln:1-5,15-18] !}
     ```

 === "NATS"
-    ```python linenums="1" hl_lines="9-10"
-    {!> docs_src/getting_started/context/nats/custom_global_context.py [ln:1-5,16-18] !}
+    ```python linenums="1" hl_lines="8-9"
+    {!> docs_src/getting_started/context/nats/custom_global_context.py [ln:1-5,15-18] !}
     ```

 === "Redis"
-    ```python linenums="1" hl_lines="9-10"
-    {!> docs_src/getting_started/context/redis/custom_global_context.py [ln:1-5,16-18] !}
+    ```python linenums="1" hl_lines="8-9"
+    {!> docs_src/getting_started/context/redis/custom_global_context.py [ln:1-5,15-18] !}
     ```
diff --git a/docs/includes/getting_started/context/existed_annotations.md b/docs/includes/getting_started/context/existed_annotations.md
index 73faf9bf17..74c9fa0ce1 100644
--- a/docs/includes/getting_started/context/existed_annotations.md
+++ b/docs/includes/getting_started/context/existed_annotations.md
@@ -15,8 +15,8 @@

     To use them, simply import and use them as subscriber argument annotations.

-    ```python linenums="1" hl_lines="3-8 17-20"
-    {!> docs_src/getting_started/context/kafka/existed_context.py [ln:1-11,26-35] !}
+    ```python linenums="1" hl_lines="3-8 16-19"
+    {!> docs_src/getting_started/context/kafka/existed_context.py [ln:1-11,25-35] !}
     ```

 === "RabbitMQ"
@@ -36,8 +36,8 @@

     To use them, simply import and use them as subscriber argument annotations.

-    ```python linenums="1" hl_lines="3-8 17-20"
-    {!> docs_src/getting_started/context/rabbit/existed_context.py [ln:1-11,26-35] !}
+    ```python linenums="1" hl_lines="3-8 16-19"
+    {!> docs_src/getting_started/context/rabbit/existed_context.py [ln:1-11,25-35] !}
     ```

 === "NATS"
@@ -57,13 +57,13 @@
     ```

     To use them, simply import and use them as subscriber argument annotations.
-    ```python linenums="1" hl_lines="3-8 17-20"
-    {!> docs_src/getting_started/context/nats/existed_context.py [ln:1-11,26-35] !}
+    ```python linenums="1" hl_lines="3-8 16-19"
+    {!> docs_src/getting_started/context/nats/existed_context.py [ln:1-11,25-35] !}
     ```

 === "Redis"
     ```python
-    from faststream.rabbit.annotations import (
+    from faststream.redis.annotations import (
         Logger, ContextRepo, RedisMessage,
         RedisBroker, Redis, NoCast,
     )
@@ -77,6 +77,6 @@
     ```

     To use them, simply import and use them as subscriber argument annotations.
-    ```python linenums="1" hl_lines="3-8 17-20"
-    {!> docs_src/getting_started/context/redis/existed_context.py [ln:1-11,26-35] !}
+    ```python linenums="1" hl_lines="3-8 16-19"
+    {!> docs_src/getting_started/context/redis/existed_context.py [ln:1-11,25-35] !}
     ```
diff --git a/docs/includes/getting_started/context/fields.md b/docs/includes/getting_started/context/fields.md
index 9d522c432e..32cf4ee896 100644
--- a/docs/includes/getting_started/context/fields.md
+++ b/docs/includes/getting_started/context/fields.md
@@ -6,21 +6,7 @@
     {{ comment_1 }}

     ```python
-    {!> docs_src/getting_started/context/kafka/fields_access.py [ln:11] !}
-    ```
-
-    {{ comment_2 }}
-
-
-    ```python
-    {!> docs_src/getting_started/context/kafka/fields_access.py [ln:12] !}
-    ```
-
-    {{ comment_3 }}
-
-
-    ```python
-    {!> docs_src/getting_started/context/kafka/fields_access.py [ln:13] !}
+    {!> docs_src/getting_started/context/kafka/fields_access.py [ln:11.5] !}
     ```

 === "RabbitMQ"
@@ -31,21 +17,7 @@
     {{ comment_1 }}

     ```python
-    {!> docs_src/getting_started/context/rabbit/fields_access.py [ln:11] !}
-    ```
-
-    {{ comment_2 }}
-
-
-    ```python
-    {!> docs_src/getting_started/context/rabbit/fields_access.py [ln:12] !}
-    ```
-
-    {{ comment_3 }}
-
-
-    ```python
-    {!> docs_src/getting_started/context/rabbit/fields_access.py [ln:13] !}
+    {!> docs_src/getting_started/context/rabbit/fields_access.py [ln:11.5] !}
     ```

 === "NATS"
@@ -56,21 +28,7 @@
     {{ comment_1 }}

     ```python
-    {!> docs_src/getting_started/context/nats/fields_access.py [ln:11] !}
-    ```
-
-    {{ comment_2 }}
-
-
-    ```python
-    {!> docs_src/getting_started/context/nats/fields_access.py [ln:12] !}
-    ```
-
-    {{ comment_3 }}
-
-
-    ```python
-    {!> docs_src/getting_started/context/nats/fields_access.py [ln:13] !}
+    {!> docs_src/getting_started/context/nats/fields_access.py [ln:11.5] !}
     ```

 === "Redis"
@@ -81,19 +39,19 @@
     {{ comment_1 }}

     ```python
-    {!> docs_src/getting_started/context/redis/fields_access.py [ln:11] !}
+    {!> docs_src/getting_started/context/redis/fields_access.py [ln:11.5] !}
     ```

-    {{ comment_2 }}
+{{ comment_2 }}


-    ```python
-    {!> docs_src/getting_started/context/redis/fields_access.py [ln:12] !}
-    ```
+```python
+{! docs_src/getting_started/context/kafka/fields_access.py [ln:12.5] !}
+```

-    {{ comment_3 }}
+{{ comment_3 }}


-    ```python
-    {!> docs_src/getting_started/context/redis/fields_access.py [ln:13] !}
-    ```
+```python
+{! docs_src/getting_started/context/kafka/fields_access.py [ln:13.5] !}
+```
\ No newline at end of file
diff --git a/docs/includes/getting_started/lifespan/2.md b/docs/includes/getting_started/lifespan/2.md
index 8838459fb1..f0e7f89739 100644
--- a/docs/includes/getting_started/lifespan/2.md
+++ b/docs/includes/getting_started/lifespan/2.md
@@ -1,19 +1,3 @@
-=== "Kafka"
-    ```python linenums="14" hl_lines="1"
-    {!> docs_src/getting_started/lifespan/kafka/basic.py [ln:14-18]!}
-    ```
-
-=== "RabbitMQ"
-    ```python linenums="14" hl_lines="1"
-    {!> docs_src/getting_started/lifespan/rabbit/basic.py [ln:14-18]!}
-    ```
-
-=== "NATS"
-    ```python linenums="14" hl_lines="1"
-    {!> docs_src/getting_started/lifespan/nats/basic.py [ln:14-18]!}
-    ```
-
-=== "Redis"
-    ```python linenums="14" hl_lines="1"
-    {!> docs_src/getting_started/lifespan/redis/basic.py [ln:14-18]!}
-    ```
+```python linenums="14" hl_lines="1"
+{! docs_src/getting_started/lifespan/kafka/basic.py [ln:14-18]!}
+```
\ No newline at end of file
diff --git a/docs/includes/getting_started/lifespan/3.md b/docs/includes/getting_started/lifespan/3.md
index 66c3b1c414..3d9944e148 100644
--- a/docs/includes/getting_started/lifespan/3.md
+++ b/docs/includes/getting_started/lifespan/3.md
@@ -1,19 +1,3 @@
-=== "Kafka"
-    ```python linenums="14" hl_lines="2"
-    {!> docs_src/getting_started/lifespan/kafka/basic.py [ln:14-18]!}
-    ```
-
-=== "RabbitMQ"
-    ```python linenums="14" hl_lines="2"
-    {!> docs_src/getting_started/lifespan/rabbit/basic.py [ln:14-18]!}
-    ```
-
-=== "NATS"
-    ```python linenums="14" hl_lines="2"
-    {!> docs_src/getting_started/lifespan/nats/basic.py [ln:14-18]!}
-    ```
-
-=== "Redis"
-    ```python linenums="14" hl_lines="2"
-    {!> docs_src/getting_started/lifespan/redis/basic.py [ln:14-18]!}
-    ```
+```python linenums="14" hl_lines="2"
+{! docs_src/getting_started/lifespan/kafka/basic.py [ln:14-18]!}
+```
\ No newline at end of file
diff --git a/docs/includes/getting_started/lifespan/4.md b/docs/includes/getting_started/lifespan/4.md
index 668f90ef6e..4dcf35063d 100644
--- a/docs/includes/getting_started/lifespan/4.md
+++ b/docs/includes/getting_started/lifespan/4.md
@@ -1,19 +1,3 @@
-=== "Kafka"
-    ```python linenums="14" hl_lines="3"
-    {!> docs_src/getting_started/lifespan/kafka/basic.py [ln:14-18] !}
-    ```
-
-=== "RabbitMQ"
-    ```python linenums="14" hl_lines="3"
-    {!> docs_src/getting_started/lifespan/rabbit/basic.py [ln:14-18] !}
-    ```
-
-=== "NATS"
-    ```python linenums="14" hl_lines="3"
-    {!> docs_src/getting_started/lifespan/nats/basic.py [ln:14-18] !}
-    ```
-
-=== "Redis"
-    ```python linenums="14" hl_lines="3"
-    {!> docs_src/getting_started/lifespan/redis/basic.py [ln:14-18] !}
-    ```
+```python linenums="14" hl_lines="3"
+{! docs_src/getting_started/lifespan/kafka/basic.py [ln:14-18] !}
+```
\ No newline at end of file
diff --git a/docs/includes/getting_started/lifespan/6.md b/docs/includes/getting_started/lifespan/6.md
index 8499e7090e..11a9d53cde 100644
--- a/docs/includes/getting_started/lifespan/6.md
+++ b/docs/includes/getting_started/lifespan/6.md
@@ -1,19 +1,3 @@
-=== "Kafka"
-    ```python linenums="14" hl_lines="5"
-    {!> docs_src/getting_started/lifespan/kafka/basic.py [ln:14-18] !}
-    ```
-
-=== "RabbitMQ"
-    ```python linenums="14" hl_lines="5"
-    {!> docs_src/getting_started/lifespan/rabbit/basic.py [ln:14-18] !}
-    ```
-
-=== "NATS"
-    ```python linenums="14" hl_lines="5"
-    {!> docs_src/getting_started/lifespan/nats/basic.py [ln:14-18] !}
-    ```
-
-=== "Redis"
-    ```python linenums="14" hl_lines="5"
-    {!> docs_src/getting_started/lifespan/redis/basic.py [ln:14-18] !}
-    ```
+```python linenums="14" hl_lines="5"
+{! docs_src/getting_started/lifespan/kafka/basic.py [ln:14-18] !}
+```
\ No newline at end of file
diff --git a/docs/includes/getting_started/publishing/broker/1.md b/docs/includes/getting_started/publishing/broker/1.md
index e3922258d2..ee7a296108 100644
--- a/docs/includes/getting_started/publishing/broker/1.md
+++ b/docs/includes/getting_started/publishing/broker/1.md
@@ -1,19 +1,19 @@
 === "Kafka"
-    ```python linenums="1"
+    ```python linenums="1" hl_lines="10 20"
     {!> docs_src/getting_started/publishing/kafka/broker.py !}
     ```

 === "RabbitMQ"
-    ```python linenums="1"
+    ```python linenums="1" hl_lines="10 20"
     {!> docs_src/getting_started/publishing/rabbit/broker.py !}
     ```

 === "NATS"
-    ```python linenums="1"
+    ```python linenums="1" hl_lines="10 20"
     {!> docs_src/getting_started/publishing/nats/broker.py !}
     ```

 === "Redis"
-    ```python linenums="1"
+    ```python linenums="1" hl_lines="10 20"
     {!> docs_src/getting_started/publishing/redis/broker.py !}
     ```
diff --git a/docs/includes/getting_started/publishing/decorator/1.md b/docs/includes/getting_started/publishing/decorator/1.md
index 3689791fd2..ca7a1dda1f 100644
--- a/docs/includes/getting_started/publishing/decorator/1.md
+++ b/docs/includes/getting_started/publishing/decorator/1.md
@@ -1,19 +1,19 @@
 === "Kafka"
-    ```python linenums="1"
+    ```python linenums="1" hl_lines="9"
     {!> docs_src/getting_started/publishing/kafka/decorator.py !}
     ```

 === "RabbitMQ"
-    ```python linenums="1"
+    ```python linenums="1" hl_lines="9"
     {!> docs_src/getting_started/publishing/rabbit/decorator.py !}
     ```

 === "NATS"
-    ```python linenums="1"
+    ```python linenums="1" hl_lines="9"
     {!> docs_src/getting_started/publishing/nats/decorator.py !}
     ```

 === "Redis"
-    ```python linenums="1"
+    ```python linenums="1" hl_lines="9"
     {!> docs_src/getting_started/publishing/redis/decorator.py !}
     ```
diff --git a/docs/includes/getting_started/publishing/direct/1.md b/docs/includes/getting_started/publishing/direct/1.md
index bf8249c88d..b8f9eac5cc 100644
--- a/docs/includes/getting_started/publishing/direct/1.md
+++ b/docs/includes/getting_started/publishing/direct/1.md
@@ -1,19 +1,19 @@
 === "Kafka"
-    ```python linenums="1"
+    ```python linenums="1" hl_lines="7 11"
     {!> docs_src/getting_started/publishing/kafka/direct.py !}
     ```

 === "RabbitMQ"
-    ```python linenums="1"
+    ```python linenums="1" hl_lines="7 11"
     {!> docs_src/getting_started/publishing/rabbit/direct.py !}
     ```

 === "NATS"
-    ```python linenums="1"
+    ```python linenums="1" hl_lines="7 11"
     {!> docs_src/getting_started/publishing/nats/direct.py !}
     ```

 === "Redis"
-    ```python linenums="1"
+    ```python linenums="1" hl_lines="7 11"
     {!> docs_src/getting_started/publishing/redis/direct.py !}
     ```
diff --git a/docs/includes/getting_started/publishing/object/1.md b/docs/includes/getting_started/publishing/object/1.md
index eb9cc85c39..86100a1c1a 100644
--- a/docs/includes/getting_started/publishing/object/1.md
+++ b/docs/includes/getting_started/publishing/object/1.md
@@ -1,19 +1,19 @@
 === "Kafka"
-    ```python linenums="1"
+    ```python linenums="1" hl_lines="7 9"
     {!> docs_src/getting_started/publishing/kafka/object.py !}
     ```

 === "RabbitMQ"
-    ```python linenums="1"
+    ```python linenums="1" hl_lines="7 9"
     {!> docs_src/getting_started/publishing/rabbit/object.py !}
     ```

 === "NATS"
-    ```python linenums="1"
+    ```python linenums="1" hl_lines="7 9"
     {!> docs_src/getting_started/publishing/nats/object.py !}
     ```

 === "Redis"
-    ```python linenums="1"
+    ```python linenums="1" hl_lines="7 9"
     {!> docs_src/getting_started/publishing/redis/object.py !}
     ```
diff --git a/docs/includes/getting_started/publishing/testing/1.md b/docs/includes/getting_started/publishing/testing/1.md
index dbe41b71c9..8568ae675c 100644
--- a/docs/includes/getting_started/publishing/testing/1.md
+++ b/docs/includes/getting_started/publishing/testing/1.md
@@ -1,19 +1,19 @@
 === "Kafka"
     ```python linenums="1"
-    {!> docs_src/getting_started/publishing/kafka/object.py[ln:7-12] !}
+    {!> docs_src/getting_started/publishing/kafka/object.py [ln:7-12] !}
     ```

 === "RabbitMQ"
     ```python linenums="1"
-    {!> docs_src/getting_started/publishing/rabbit/object.py[ln:7-12] !}
+    {!> docs_src/getting_started/publishing/rabbit/object.py [ln:7-12] !}
     ```

 === "NATS"
     ```python linenums="1"
-    {!> docs_src/getting_started/publishing/nats/object.py[ln:7-12] !}
+    {!> docs_src/getting_started/publishing/nats/object.py [ln:7-12] !}
     ```

 === "Redis"
     ```python linenums="1"
-    {!> docs_src/getting_started/publishing/redis/object.py[ln:7-12] !}
+    {!> docs_src/getting_started/publishing/redis/object.py [ln:7-12] !}
     ```
diff --git a/docs/includes/getting_started/publishing/testing/2.md b/docs/includes/getting_started/publishing/testing/2.md
index f17565632f..b7b1032fe2 100644
--- a/docs/includes/getting_started/publishing/testing/2.md
+++ b/docs/includes/getting_started/publishing/testing/2.md
@@ -1,19 +1,19 @@
 === "Kafka"
     ```python linenums="1"
-    {!> docs_src/getting_started/publishing/kafka/direct.py[ln:7-11] !}
+    {!> docs_src/getting_started/publishing/kafka/direct.py [ln:7-11] !}
     ```

 === "RabbitMQ"
     ```python linenums="1"
-    {!> docs_src/getting_started/publishing/rabbit/direct.py[ln:7-11] !}
+    {!> docs_src/getting_started/publishing/rabbit/direct.py [ln:7-11] !}
    ```

 === "NATS"
     ```python linenums="1"
-    {!> docs_src/getting_started/publishing/nats/direct.py[ln:7-11] !}
+    {!> docs_src/getting_started/publishing/nats/direct.py [ln:7-11] !}
     ```

 === "Redis"
     ```python linenums="1"
-    {!> docs_src/getting_started/publishing/redis/direct.py[ln:7-11] !}
+    {!> docs_src/getting_started/publishing/redis/direct.py [ln:7-11] !}
     ```
diff --git a/docs/includes/getting_started/publishing/testing/3.md b/docs/includes/getting_started/publishing/testing/3.md
index 2c34b8ece9..203f21bc19 100644
--- a/docs/includes/getting_started/publishing/testing/3.md
+++ b/docs/includes/getting_started/publishing/testing/3.md
@@ -1,19 +1,19 @@
 === "Kafka"
-    ```python linenums="1"
-    {!> docs_src/getting_started/publishing/kafka/object_testing.py [ln:1-3,8-12] !}
+    ```python linenums="1" hl_lines="7-8"
+    {!> docs_src/getting_started/publishing/kafka/object_testing.py [ln:1-4,8-12] !}
     ```

 === "RabbitMQ"
-    ```python linenums="1"
-    {!> docs_src/getting_started/publishing/rabbit/object_testing.py [ln:1-3,8-12] !}
+    ```python linenums="1" hl_lines="7-8"
+    {!> docs_src/getting_started/publishing/rabbit/object_testing.py [ln:1-4,8-12] !}
     ```

 === "NATS"
-    ```python linenums="1"
-    {!> docs_src/getting_started/publishing/nats/object_testing.py [ln:1-3,8-12] !}
+    ```python linenums="1" hl_lines="7-8"
+    {!> docs_src/getting_started/publishing/nats/object_testing.py [ln:1-4,8-12] !}
     ```

 === "Redis"
-    ```python linenums="1"
-    {!> docs_src/getting_started/publishing/redis/object_testing.py [ln:1-3,8-12] !}
+    ```python linenums="1" hl_lines="7-8"
+    {!> docs_src/getting_started/publishing/redis/object_testing.py [ln:1-4,8-12] !}
     ```
diff --git a/docs/includes/getting_started/routers/1.md b/docs/includes/getting_started/routers/1.md
index 27c9a1e50b..93a3798a22 100644
--- a/docs/includes/getting_started/routers/1.md
+++ b/docs/includes/getting_started/routers/1.md
@@ -42,44 +42,29 @@

 {{ note_include }}

-=== "Kafka"
-    ```python hl_lines="1"
-    {!> docs_src/getting_started/routers/kafka/router.py [ln:22] !}
-    ```
-
-=== "RabbitMQ"
-    ```python hl_lines="1"
-    {!> docs_src/getting_started/routers/rabbit/router.py [ln:22] !}
-    ```
-=== "NATS"
-    ```python hl_lines="1"
-    {!> docs_src/getting_started/routers/nats/router.py [ln:22] !}
-    ```
-
-=== "NATS"
-    ```python hl_lines="1"
-    {!> docs_src/getting_started/routers/redis/router.py [ln:22] !}
-    ```
+```python
+{!> docs_src/getting_started/routers/kafka/router.py [ln:22] !}
+```

 {{ note_publish }}

 === "Kafka"
     ```python hl_lines="3"
-    {!> docs_src/getting_started/routers/kafka/router.py [ln:27-30] !}
+    {!> docs_src/getting_started/routers/kafka/router.py [ln:27.5,28.5,29.5,30.5] !}
     ```

 === "RabbitMQ"
     ```python hl_lines="3"
-    {!> docs_src/getting_started/routers/rabbit/router.py [ln:27-30] !}
+    {!> docs_src/getting_started/routers/rabbit/router.py [ln:27.5,28.5,29.5,30.5] !}
     ```

 === "NATS"
     ```python hl_lines="3"
-    {!> docs_src/getting_started/routers/nats/router.py [ln:27-30] !}
+    {!> docs_src/getting_started/routers/nats/router.py [ln:27.5,28.5,29.5,30.5] !}
     ```

 === "Redis"
     ```python hl_lines="3"
-    {!> docs_src/getting_started/routers/redis/router.py [ln:27-30] !}
+    {!> docs_src/getting_started/routers/redis/router.py [ln:27.5,28.5,29.5,30.5] !}
     ```
diff --git a/docs/includes/getting_started/routers/2.md b/docs/includes/getting_started/routers/2.md
index 06da49f27a..6aa70e4f6f 100644
--- a/docs/includes/getting_started/routers/2.md
+++ b/docs/includes/getting_started/routers/2.md
@@ -1,19 +1,19 @@
 === "Kafka"
-    ```python linenums="1" hl_lines="2 8 13 15"
-    {!> docs_src/getting_started/routers/kafka/router_delay.py [ln:1-15] !}
+    ```python linenums="1" hl_lines="8-10"
+    {!> docs_src/getting_started/routers/kafka/router_delay.py [ln:3-4,9-12,14-18] !}
     ```

 === "RabbitMQ"
-    ```python linenums="1" hl_lines="2 8 13 15"
-    {!> docs_src/getting_started/routers/rabbit/router_delay.py [ln:1-15] !}
+    ```python linenums="1" hl_lines="8-10"
+    {!> docs_src/getting_started/routers/rabbit/router_delay.py [ln:3-4,9-12,14-18] !}
     ```

 === "NATS"
-    ```python linenums="1" hl_lines="2 8 13 15"
-    {!> docs_src/getting_started/routers/nats/router_delay.py [ln:1-15] !}
+    ```python linenums="1" hl_lines="8-10"
+    {!> docs_src/getting_started/routers/nats/router_delay.py [ln:3-4,9-12,14-18] !}
     ```

 === "Redis"
-    ```python linenums="1" hl_lines="2 8 13 15"
-    {!> docs_src/getting_started/routers/redis/router_delay.py [ln:1-15] !}
+    ```python linenums="1" hl_lines="8-10"
+    {!> docs_src/getting_started/routers/redis/router_delay.py [ln:3-4,9-12,14-18] !}
     ```
diff --git a/docs/includes/getting_started/serialization/decoder/1.md b/docs/includes/getting_started/serialization/decoder/1.md
index 74149351bc..81f05145a3 100644
--- a/docs/includes/getting_started/serialization/decoder/1.md
+++ b/docs/includes/getting_started/serialization/decoder/1.md
@@ -1,5 +1,5 @@
 === "Kafka"
-    ``` python
+    ```python
     from faststream.types import DecodedMessage
     from faststream.kafka import KafkaMessage

@@ -8,7 +8,7 @@
     ```

 === "RabbitMQ"
-    ``` python
+    ```python
     from faststream.types import DecodedMessage
     from faststream.rabbit import RabbitMessage

@@ -17,7 +17,7 @@
     ```

 === "NATS"
-    ``` python
+    ```python
     from faststream.types import DecodedMessage
     from faststream.nats import NatsMessage

@@ -26,7 +26,7 @@
     ```

 === "Redis"
-    ``` python
+    ```python
     from faststream.types import DecodedMessage
     from faststream.redis import RedisMessage

diff --git a/docs/includes/getting_started/serialization/decoder/2.md b/docs/includes/getting_started/serialization/decoder/2.md
index 277ab83a55..4d9a57ca10 100644
--- a/docs/includes/getting_started/serialization/decoder/2.md
+++ b/docs/includes/getting_started/serialization/decoder/2.md
@@ -1,5 +1,5 @@
 === "Kafka"
-    ``` python
+    ```python
     from types import Callable, Awaitable
     from faststream.types import DecodedMessage
     from faststream.kafka import KafkaMessage
@@ -12,7 +12,7 @@
     ```

 === "RabbitMQ"
-    ``` python
+    ```python
     from types import Callable, Awaitable
     from faststream.types import DecodedMessage
     from faststream.rabbit import RabbitMessage
@@ -25,7 +25,7 @@
     ```

 === "NATS"
-    ``` python
+    ```python
     from types import Callable, Awaitable
     from faststream.types import DecodedMessage
     from faststream.nats import NatsMessage
@@ -38,7 +38,7 @@
     ```

 === "Redis"
-    ``` python
+    ```python
     from types import Callable, Awaitable
     from faststream.types import DecodedMessage
     from faststream.redis import RedisMessage
diff --git a/docs/includes/getting_started/serialization/parser/1.md b/docs/includes/getting_started/serialization/parser/1.md
index 3583383e8a..541381ff41 100644
--- a/docs/includes/getting_started/serialization/parser/1.md
+++ b/docs/includes/getting_started/serialization/parser/1.md
@@ -1,5 +1,5 @@
 === "Kafka"
-    ``` python
+    ```python
     from aiokafka import ConsumerRecord
     from faststream.kafka import KafkaMessage

@@ -8,7 +8,7 @@
     ```

 === "RabbitMQ"
-    ``` python
+    ```python
     from aio_pika import IncomingMessage
     from faststream.rabbit import RabbitMessage

@@ -17,7 +17,7 @@
     ```

 === "NATS"
-    ``` python
+    ```python
     from nats.aio.msg import Msg
     from faststream.nats import NatsMessage

@@ -26,7 +26,7 @@
     ```

 === "Redis"
-    ``` python
+    ```python
     from faststream.redis import RedisMessage
     from faststream.redis.message import PubSubMessage

diff --git a/docs/includes/getting_started/serialization/parser/2.md b/docs/includes/getting_started/serialization/parser/2.md
index 7275ff7053..aeb4323a6b 100644
--- a/docs/includes/getting_started/serialization/parser/2.md
+++ b/docs/includes/getting_started/serialization/parser/2.md
@@ -1,5 +1,5 @@
 === "Kafka"
-    ``` python
+    ```python
     from types import Callable, Awaitable
     from aiokafka import ConsumerRecord
     from faststream.kafka import KafkaMessage
@@ -12,7 +12,7 @@
     ```

 === "RabbitMQ"
-    ``` python
+    ```python
     from types import Callable, Awaitable
     from aio_pika import IncomingMessage
     from faststream.rabbit import RabbitMessage
@@ -25,7 +25,7 @@
     ```

 === "NATS"
-    ``` python
+    ```python
     from types import Callable, Awaitable
     from nats.aio.msg import Msg
     from faststream.nats import NatsMessage
@@ -38,7 +38,7 @@
     ```

 === "Redis"
-    ``` python
+    ```python
     from types import Callable, Awaitable
     from faststream.redis import RedisMessage
     from faststream.redis.message import PubSubMessage
diff --git a/docs/includes/getting_started/serialization/parser/3.md b/docs/includes/getting_started/serialization/parser/3.md
index 2dd970cdd1..a85356d9f7 100644
--- a/docs/includes/getting_started/serialization/parser/3.md
+++ b/docs/includes/getting_started/serialization/parser/3.md
@@ -1,20 +1,20 @@
 === "Kafka"
-    ``` python linenums="1" hl_lines="9-15 18 29"
+    ```python linenums="1" hl_lines="9-15 18 29"
     {!> docs_src/getting_started/serialization/parser_kafka.py !}
     ```

 === "RabbitMQ"
-    ``` python linenums="1" hl_lines="9-15 18 29"
+    ```python linenums="1" hl_lines="9-15 18 29"
     {!> docs_src/getting_started/serialization/parser_rabbit.py !}
     ```

 === "NATS"
-    ``` python linenums="1" hl_lines="9-15 18 29"
+    ```python linenums="1" hl_lines="9-15 18 29"
     {!> docs_src/getting_started/serialization/parser_nats.py !}
     ```

 === "Redis"
-    ``` python linenums="1" hl_lines="8-14 17 28"
+    ```python linenums="1" hl_lines="8-14 17 28"
     {!> docs_src/getting_started/serialization/parser_redis.py !}
     ```
diff --git a/docs/includes/getting_started/subscription/annotation/1.md b/docs/includes/getting_started/subscription/annotation/1.md
index aa2c1fb736..30a0fe0661 100644
--- a/docs/includes/getting_started/subscription/annotation/1.md
+++ b/docs/includes/getting_started/subscription/annotation/1.md
@@ -1,13 +1,19 @@
-```python
+```python hl_lines="3 9 15"
 @broker.subscriber("test")
-async def handle(msg: str):
+async def handle(
+    msg: str,
+):
     ...


 @broker.subscriber("test")
-async def handle(msg: bytes):
+async def handle(
+    msg: bytes,
+):
     ...
@broker.subscriber("test") -async def handle(msg: int): +async def handle( + msg: int, +): ... ``` diff --git a/docs/includes/getting_started/subscription/annotation/2.md b/docs/includes/getting_started/subscription/annotation/2.md index c1cb232880..e9f0d36a33 100644 --- a/docs/includes/getting_started/subscription/annotation/2.md +++ b/docs/includes/getting_started/subscription/annotation/2.md @@ -1,7 +1,9 @@ -```python +```python hl_lines="5" from typing import Dict, Any @broker.subscriber("test") -async def handle(msg: dict[str, Any]): +async def handle( + msg: dict[str, Any], +): ... ``` diff --git a/docs/includes/getting_started/subscription/annotation/3.md b/docs/includes/getting_started/subscription/annotation/3.md index 38f2c1d713..d7bb3ec7d0 100644 --- a/docs/includes/getting_started/subscription/annotation/3.md +++ b/docs/includes/getting_started/subscription/annotation/3.md @@ -1,19 +1,19 @@ === "Kafka" - ```python - {!> docs_src/getting_started/subscription/kafka/annotation.py [ln:8-11] !} + ```python hl_lines="3-4" + {!> docs_src/getting_started/subscription/kafka/annotation.py [ln:8-14] !} ``` === "RabbitMQ" - ```python - {!> docs_src/getting_started/subscription/rabbit/annotation.py [ln:8-11] !} + ```python hl_lines="3-4" + {!> docs_src/getting_started/subscription/rabbit/annotation.py [ln:8-14] !} ``` === "NATS" - ```python - {!> docs_src/getting_started/subscription/nats/annotation.py [ln:8-11] !} + ```python hl_lines="3-4" + {!> docs_src/getting_started/subscription/nats/annotation.py [ln:8-14] !} ``` === "Redis" - ```python - {!> docs_src/getting_started/subscription/redis/annotation.py [ln:8-11] !} + ```python hl_lines="3-4" + {!> docs_src/getting_started/subscription/redis/annotation.py [ln:8-14] !} ``` diff --git a/docs/includes/getting_started/subscription/filtering/2.md b/docs/includes/getting_started/subscription/filtering/2.md index 6664f4fc4a..c22e5fac3a 100644 --- a/docs/includes/getting_started/subscription/filtering/2.md +++ 
b/docs/includes/getting_started/subscription/filtering/2.md @@ -1,19 +1,19 @@ === "Kafka" ```python hl_lines="2" - {!> docs_src/getting_started/subscription/kafka/filter.py [ln:24-27] !} + {!> docs_src/getting_started/subscription/kafka/filter.py [ln:24.5,25.5,26.5,27.5] !} ``` === "RabbitMQ" ```python hl_lines="2" - {!> docs_src/getting_started/subscription/rabbit/filter.py [ln:24-27] !} + {!> docs_src/getting_started/subscription/rabbit/filter.py [ln:24.5,25.5,26.5,27.5] !} ``` === "NATS" ```python hl_lines="2" - {!> docs_src/getting_started/subscription/nats/filter.py [ln:24-27] !} + {!> docs_src/getting_started/subscription/nats/filter.py [ln:24.5,25.5,26.5,27.5] !} ``` === "Redis" ```python hl_lines="2" - {!> docs_src/getting_started/subscription/redis/filter.py [ln:24-27] !} + {!> docs_src/getting_started/subscription/redis/filter.py [ln:24.5,25.5,26.5,27.5] !} ``` diff --git a/docs/includes/getting_started/subscription/filtering/3.md b/docs/includes/getting_started/subscription/filtering/3.md index 11664e8700..4fd735baed 100644 --- a/docs/includes/getting_started/subscription/filtering/3.md +++ b/docs/includes/getting_started/subscription/filtering/3.md @@ -1,19 +1,19 @@ === "Kafka" ```python hl_lines="2" - {!> docs_src/getting_started/subscription/kafka/filter.py [ln:29-32] !} + {!> docs_src/getting_started/subscription/kafka/filter.py [ln:29.5,30.5,31.5,32.5] !} ``` === "RabbitMQ" ```python hl_lines="2" - {!> docs_src/getting_started/subscription/rabbit/filter.py [ln:29-32] !} + {!> docs_src/getting_started/subscription/rabbit/filter.py [ln:29.5,30.5,31.5,32.5] !} ``` === "NATS" ```python hl_lines="2" - {!> docs_src/getting_started/subscription/nats/filter.py [ln:29-32] !} + {!> docs_src/getting_started/subscription/nats/filter.py [ln:29.5,30.5,31.5,32.5] !} ``` === "Redis" ```python hl_lines="2" - {!> docs_src/getting_started/subscription/redis/filter.py [ln:29-32] !} + {!> docs_src/getting_started/subscription/redis/filter.py [ln:29.5,30.5,31.5,32.5] !} 
``` diff --git a/docs/includes/getting_started/subscription/index/2.md b/docs/includes/getting_started/subscription/index/2.md index caed3e3113..9fff1c5518 100644 --- a/docs/includes/getting_started/subscription/index/2.md +++ b/docs/includes/getting_started/subscription/index/2.md @@ -1,5 +1,7 @@ -```python +```python hl_lines="3" @broker.subscriber("test") -async def handle_str(msg_body: str): +async def handle_str( + msg_body: str, +): ... ``` diff --git a/docs/includes/getting_started/subscription/index/3.md b/docs/includes/getting_started/subscription/index/3.md index c32f78dd25..a764d8b18f 100644 --- a/docs/includes/getting_started/subscription/index/3.md +++ b/docs/includes/getting_started/subscription/index/3.md @@ -1,5 +1,5 @@ === "Kafka" - ```python + ```python hl_lines="3" from faststream.kafka import KafkaBroker broker = KafkaBroker(apply_types=False) @@ -10,7 +10,7 @@ ``` === "RabbitMQ" - ```python + ```python hl_lines="3" from faststream.rabbit import RabbitBroker broker = RabbitBroker(apply_types=False) @@ -21,7 +21,7 @@ ``` === "NATS" - ```python + ```python hl_lines="3" from faststream.nats import NatsBroker broker = NatsBroker(apply_types=False) @@ -32,7 +32,7 @@ ``` === "Redis" - ```python + ```python hl_lines="3" from faststream.redis import RedisBroker broker = RedisBroker(apply_types=False) diff --git a/docs/includes/getting_started/subscription/index/4.md b/docs/includes/getting_started/subscription/index/4.md index e0063d6465..3ca69fa3e8 100644 --- a/docs/includes/getting_started/subscription/index/4.md +++ b/docs/includes/getting_started/subscription/index/4.md @@ -1,4 +1,4 @@ -```python +```python hl_lines="1-2" @broker.subscriber("first_sub") @broker.subscriber("second_sub") async def handler(msg): diff --git a/docs/includes/getting_started/subscription/pydantic/2.md b/docs/includes/getting_started/subscription/pydantic/2.md index 4a59a975e7..170798022a 100644 --- a/docs/includes/getting_started/subscription/pydantic/2.md +++ 
b/docs/includes/getting_started/subscription/pydantic/2.md @@ -1,19 +1,19 @@ === "Kafka" - ```python linenums="1" hl_lines="1 10 20" + ```python linenums="1" hl_lines="1 10 21" {!> docs_src/getting_started/subscription/kafka/pydantic_model.py !} ``` === "RabbitMQ" - ```python linenums="1" hl_lines="1 10 20" + ```python linenums="1" hl_lines="1 10 21" {!> docs_src/getting_started/subscription/rabbit/pydantic_model.py !} ``` === "NATS" - ```python linenums="1" hl_lines="1 10 20" + ```python linenums="1" hl_lines="1 10 21" {!> docs_src/getting_started/subscription/nats/pydantic_model.py !} ``` === "Redis" - ```python linenums="1" hl_lines="1 10 20" + ```python linenums="1" hl_lines="1 10 21" {!> docs_src/getting_started/subscription/redis/pydantic_model.py !} ``` diff --git a/docs/includes/getting_started/subscription/testing/2.md b/docs/includes/getting_started/subscription/testing/2.md index 5a6c2a4bc9..0005186bbc 100644 --- a/docs/includes/getting_started/subscription/testing/2.md +++ b/docs/includes/getting_started/subscription/testing/2.md @@ -1,19 +1,19 @@ === "Kafka" - ```python linenums="1" hl_lines="4 11-12" - {!> docs_src/getting_started/subscription/kafka/testing.py [ln:1-12] !} + ```python linenums="1" hl_lines="4 8-9" + {!> docs_src/getting_started/subscription/kafka/testing.py [ln:1-4,8-12] !} ``` === "RabbitMQ" - ```python linenums="1" hl_lines="4 11-12" - {!> docs_src/getting_started/subscription/rabbit/testing.py [ln:1-12] !} + ```python linenums="1" hl_lines="4 8-9" + {!> docs_src/getting_started/subscription/rabbit/testing.py [ln:1-4,8-12] !} ``` === "NATS" - ```python linenums="1" hl_lines="4 11-12" - {!> docs_src/getting_started/subscription/nats/testing.py [ln:1-12] !} + ```python linenums="1" hl_lines="4 8-9" + {!> docs_src/getting_started/subscription/nats/testing.py [ln:1-4,8-12] !} ``` === "Redis" - ```python linenums="1" hl_lines="4 11-12" - {!> docs_src/getting_started/subscription/redis/testing.py [ln:1-12] !} + ```python linenums="1" 
hl_lines="4 8-9" + {!> docs_src/getting_started/subscription/redis/testing.py [ln:1-4,8-12] !} ``` diff --git a/docs/includes/getting_started/subscription/testing/real.md b/docs/includes/getting_started/subscription/testing/real.md index b5774069ea..c17ffb7ec7 100644 --- a/docs/includes/getting_started/subscription/testing/real.md +++ b/docs/includes/getting_started/subscription/testing/real.md @@ -1,19 +1,19 @@ === "Kafka" - ```python linenums="1" hl_lines="4 11 13 20 23" - {!> docs_src/getting_started/subscription/kafka/real_testing.py !} + ```python linenums="1" hl_lines="4 8 10 17 20" + {!> docs_src/getting_started/subscription/kafka/real_testing.py [ln:1-5,9-25] !} ``` === "RabbitMQ" - ```python linenums="1" hl_lines="4 11 13 20 23" - {!> docs_src/getting_started/subscription/rabbit/real_testing.py !} + ```python linenums="1" hl_lines="4 8 10 17 20" + {!> docs_src/getting_started/subscription/rabbit/real_testing.py [ln:1-5,9-25] !} ``` === "NATS" - ```python linenums="1" hl_lines="4 11 13 20 23" - {!> docs_src/getting_started/subscription/nats/real_testing.py !} + ```python linenums="1" hl_lines="4 8 10 17 20" + {!> docs_src/getting_started/subscription/nats/real_testing.py [ln:1-5,9-25] !} ``` === "Redis" - ```python linenums="1" hl_lines="4 11 13 20 23" - {!> docs_src/getting_started/subscription/redis/real_testing.py !} + ```python linenums="1" hl_lines="4 8 10 17 20" + {!> docs_src/getting_started/subscription/redis/real_testing.py [ln:1-5,9-25] !} ``` diff --git a/docs/includes/index/integrations.md b/docs/includes/index/integrations.md index 5834c1d4e9..cbf14e65bf 100644 --- a/docs/includes/index/integrations.md +++ b/docs/includes/index/integrations.md @@ -1 +1 @@ -{!> includes/getting_started/integrations/http/1.md [ln:9-37] !} +{! 
includes/getting_started/integrations/http/1.md [ln:9-37] !} diff --git a/docs/mkdocs.yml b/docs/mkdocs.yml index e278b06ce3..9860a27f82 100644 --- a/docs/mkdocs.yml +++ b/docs/mkdocs.yml @@ -134,6 +134,7 @@ markdown_extensions: toc_depth: 3 - mdx_include: base_path: . + line_slice_separator: [] - extra - admonition # !!! note blocks support - pymdownx.details # admonition collapsible diff --git a/faststream/__about__.py b/faststream/__about__.py index 2e79d78edb..c1ed9f3fec 100644 --- a/faststream/__about__.py +++ b/faststream/__about__.py @@ -1,5 +1,5 @@ """Simple and fast framework to create message brokers based microservices""" -__version__ = "0.3.5" +__version__ = "0.3.6" INSTALL_YAML = """ diff --git a/faststream/_compat.py b/faststream/_compat.py index b625011bf3..68725e38f5 100644 --- a/faststream/_compat.py +++ b/faststream/_compat.py @@ -9,7 +9,6 @@ PYDANTIC_VERSION as PYDANTIC_VERSION, ) from fast_depends._compat import FieldInfo -from packaging.version import parse from pydantic import BaseModel if sys.version_info < (3, 12): @@ -54,9 +53,6 @@ IS_OPTIMIZED = os.getenv("PYTHONOPTIMIZE", False) -ANYIO_VERSION = parse(get_version("anyio")) -ANYIO_V3 = ANYIO_VERSION.major == 3 - def is_test_env() -> bool: return bool(os.getenv("PYTEST_CURRENT_TEST")) @@ -67,7 +63,7 @@ def is_test_env() -> bool: HAS_FASTAPI = True - major, minor, _ = map(int, FASTAPI_VERSION.split(".")) + major, minor, *_ = map(int, FASTAPI_VERSION.split(".")) FASTAPI_V2 = major > 0 or minor > 100 if FASTAPI_V2: @@ -191,11 +187,15 @@ def with_info_plain_validator_function( # type: ignore[misc] return {} +anyio_major, *_ = map(int, get_version("anyio").split(".")) +ANYIO_V3 = anyio_major == 3 + + if ANYIO_V3: from anyio import ExceptionGroup as ExceptionGroup # type: ignore[attr-defined] else: if sys.version_info < (3, 11): - from exceptiongroup import ( # type: ignore[assignment] + from exceptiongroup import ( # type: ignore[assignment,no-redef] ExceptionGroup as ExceptionGroup, ) else: 
diff --git a/faststream/broker/fastapi/route.py b/faststream/broker/fastapi/route.py index 3e8c9c35b2..efbd45c103 100644 --- a/faststream/broker/fastapi/route.py +++ b/faststream/broker/fastapi/route.py @@ -9,6 +9,7 @@ Callable, Coroutine, Generic, + List, Optional, Sequence, Union, @@ -146,15 +147,16 @@ class StreamMessage(Request): def __init__( self, - body: Optional[AnyDict] = None, - headers: Optional[AnyDict] = None, - path: Optional[AnyDict] = None, + *, + body: Union[AnyDict, List[Any]], + headers: AnyDict, + path: AnyDict, ) -> None: """Initialize a class instance. Args: - body: The body of the request as a dictionary. Default is None. - headers: The headers of the request as a dictionary. Default is None. + body: The body of the request as a dictionary. + headers: The headers of the request as a dictionary. Attributes: scope: A dictionary to store the scope of the request. @@ -164,11 +166,12 @@ def __init__( _query_params: A dictionary to store the query parameters of the request. 
""" - self.scope = {"path_params": path or {}} + self._headers = headers + self._body = body + self._query_params = path + + self.scope = {"path_params": self._query_params} self._cookies = {} - self._headers = headers or {} - self._body = body or {} - self._query_params = {**self._body, **(path or {})} @classmethod def get_session( @@ -222,15 +225,29 @@ async def app(message: NativeMessage[Any]) -> SendableMessage: The above docstring is autogenerated by docstring-gen library (https://docstring-gen.airt.ai) """ body = message.decoded_body + + fastapi_body: Union[AnyDict, List[Any]] if first_arg is not None: - if not isinstance(body, dict) and not isinstance(body, list): - fastapi_body: Any = {first_arg: body} + if isinstance(body, dict): + path = fastapi_body = body or {} + elif isinstance(body, list): + fastapi_body, path = body, {} else: - fastapi_body = body + path = fastapi_body = {first_arg: body} + + session = cls( + body=fastapi_body, + headers=message.headers, + path={**path, **message.path}, + ) - session = cls(fastapi_body, message.headers, message.path) else: - session = cls() + session = cls( + body={}, + headers={}, + path={}, + ) + return await func(session) return app diff --git a/faststream/cli/utils/parser.py b/faststream/cli/utils/parser.py index 15f05ede71..423df81203 100644 --- a/faststream/cli/utils/parser.py +++ b/faststream/cli/utils/parser.py @@ -12,7 +12,6 @@ def parse_cli_args(*args: str) -> Tuple[str, Dict[str, SettingField]]: Returns: A tuple containing the application name and a dictionary of additional keyword arguments. - """ extra_kwargs: Dict[str, SettingField] = {} @@ -70,7 +69,6 @@ def remove_prefix(text: str, prefix: str) -> str: Returns: str: The text with the prefix removed. If the text does not start with the prefix, the original text is returned. 
- """ if text.startswith(prefix): return text[len(prefix) :] diff --git a/faststream/kafka/message.py b/faststream/kafka/message.py index 4d727be675..569102f12f 100644 --- a/faststream/kafka/message.py +++ b/faststream/kafka/message.py @@ -44,8 +44,6 @@ async def ack(self, **kwargs: Any) -> None: Returns: None: This method does not return a value. """ - print("try to ack") if self.is_manual and not self.commited: - print("acked") await self.consumer.commit() await super().ack() diff --git a/faststream/redis/handler.py b/faststream/redis/handler.py index 2b8acb33ec..dacd24b79a 100644 --- a/faststream/redis/handler.py +++ b/faststream/redis/handler.py @@ -201,8 +201,8 @@ async def _consume( for i in msgs: await self.consume(i) - finally: - await anyio.sleep(sleep) + else: + await anyio.sleep(sleep) async def _consume_stream_msg( self, diff --git a/faststream/utils/context/types.py b/faststream/utils/context/types.py index a038fb55bf..adc5fe3ea5 100644 --- a/faststream/utils/context/types.py +++ b/faststream/utils/context/types.py @@ -16,7 +16,6 @@ class Context(CustomField): Methods: __init__ : constructor method use : method to use the context - """ param_name: str @@ -38,7 +37,6 @@ def __init__( Raises: TypeError: If the default value is not provided. - """ self.name = real_name self.default = default @@ -48,7 +46,7 @@ def __init__( required=(default is _empty), ) - def use(self, **kwargs: Any) -> AnyDict: + def use(self, /, **kwargs: Any) -> AnyDict: """Use the given keyword arguments. Args: @@ -60,8 +58,6 @@ def use(self, **kwargs: Any) -> AnyDict: Raises: KeyError: If the parameter name is not found in the keyword arguments AttributeError: If the parameter name is not a valid attribute - - """ name = f"{self.prefix}{self.name or self.param_name}" @@ -85,7 +81,6 @@ def resolve_context(argument: str) -> Any: Raises: AttributeError: If the attribute does not exist in the context. 
- """ keys = argument.split(".") diff --git a/pyproject.toml b/pyproject.toml index fad5c4caec..41daf36bc6 100644 --- a/pyproject.toml +++ b/pyproject.toml @@ -189,13 +189,14 @@ fix = true line-length = 88 target-version = "py38" select = [ - "E", # pycodestyle errors - "W", # pycodestyle warnings - "F", # pyflakes - "I", # isort - "C", # flake8-comprehensions - "B", # flake8-bugbear - "Q", # flake8-quotes + "E", # pycodestyle errors + "W", # pycodestyle warnings + "F", # pyflakes + "I", # isort + "C", # flake8-comprehensions + "B", # flake8-bugbear + "Q", # flake8-quotes + "T20", # flake8-print ] ignore = [ "E501", # line too long, handled by black diff --git a/scripts/lint.sh b/scripts/lint.sh index da884656ef..cdd8c7fdde 100755 --- a/scripts/lint.sh +++ b/scripts/lint.sh @@ -4,7 +4,7 @@ echo "Running pyup_dirs..." pyup_dirs --py38-plus --recursive faststream examples tests echo "Running ruff..." -ruff faststream examples tests --fix +ruff faststream tests --fix echo "Running black..." 
-black faststream examples tests +black faststream tests diff --git a/tests/docs/getting_started/subscription/test_annotated.py b/tests/docs/getting_started/subscription/test_annotated.py new file mode 100644 index 0000000000..db267b671e --- /dev/null +++ b/tests/docs/getting_started/subscription/test_annotated.py @@ -0,0 +1,84 @@ +import pytest +from pydantic import ValidationError + +from faststream.kafka import TestKafkaBroker +from faststream.nats import TestNatsBroker +from faststream.rabbit import TestRabbitBroker +from faststream.redis import TestRedisBroker +from tests.marks import python39 + + +@pytest.mark.asyncio +@python39 +class BaseCase: + async def test_handle(self, setup): + broker, handle = setup + + async with self.test_class(broker) as br: + await br.publish({"name": "John", "user_id": 1}, "test") + await handle.wait_call(timeout=3) + handle.mock.assert_called_once_with({"name": "John", "user_id": 1}) + + assert handle.mock is None + + async def test_validation_error(self, setup): + broker, handle = setup + + async with self.test_class(broker) as br: + with pytest.raises(ValidationError): + await br.publish("wrong message", "test") + await handle.wait_call(timeout=3) + + handle.mock.assert_called_once_with("wrong message") + + +class TestKafka(BaseCase): + test_class = TestKafkaBroker + + @pytest.fixture(scope="class") + def setup(self): + from docs.docs_src.getting_started.subscription.kafka.pydantic_annotated_fields import ( + broker, + handle, + ) + + return (broker, handle) + + +class TestRabbit(BaseCase): + test_class = TestRabbitBroker + + @pytest.fixture(scope="class") + def setup(self): + from docs.docs_src.getting_started.subscription.rabbit.pydantic_annotated_fields import ( + broker, + handle, + ) + + return (broker, handle) + + +class TestNats(BaseCase): + test_class = TestNatsBroker + + @pytest.fixture(scope="class") + def setup(self): + from docs.docs_src.getting_started.subscription.nats.pydantic_annotated_fields import ( + broker, 
+ handle, + ) + + return (broker, handle) + + +class TestRedis(BaseCase): + test_class = TestRedisBroker + + @pytest.fixture(scope="class") + def setup(self): + from docs.docs_src.getting_started.subscription.redis.pydantic_annotated_fields import ( + broker, + handle, + ) + + return (broker, handle) diff --git a/tests/docs/kafka/test_security.py b/tests/docs/kafka/test_security.py index 1a71ca29bd..42d421ac3a 100644 --- a/tests/docs/kafka/test_security.py +++ b/tests/docs/kafka/test_security.py @@ -5,7 +5,7 @@ import pytest -from docs.docs_src.kafka.security_without_ssl.example import test_without_ssl_warning +from docs.docs_src.kafka.security.ssl_warning import test_without_ssl_warning __all__ = ["test_without_ssl_warning"] @@ -27,7 +27,11 @@ def patch_aio_consumer_and_producer() -> Tuple[MagicMock, MagicMock]: @pytest.mark.kafka async def test_base_security(): with patch_aio_consumer_and_producer() as (consumer, producer): - from docs.docs_src.kafka.basic_security.app import broker as basic_broker + from docs.docs_src.kafka.security.basic import broker as basic_broker + + @basic_broker.subscriber("test") + async def handler(): + ... async with basic_broker: await basic_broker.start() @@ -49,10 +53,14 @@ async def test_base_security(): @pytest.mark.kafka async def test_scram256(): with patch_aio_consumer_and_producer() as (consumer, producer): - from docs.docs_src.kafka.sasl_scram256_security.app import ( + from docs.docs_src.kafka.security.sasl_scram256 import ( broker as scram256_broker, ) + @scram256_broker.subscriber("test") + async def handler(): + ... 
+ async with scram256_broker: await scram256_broker.start() @@ -76,10 +84,14 @@ async def test_scram256(): @pytest.mark.kafka async def test_scram512(): with patch_aio_consumer_and_producer() as (consumer, producer): - from docs.docs_src.kafka.sasl_scram512_security.app import ( + from docs.docs_src.kafka.security.sasl_scram512 import ( broker as scram512_broker, ) + @scram512_broker.subscriber("test") + async def handler(): + ... + async with scram512_broker: await scram512_broker.start() @@ -103,10 +115,14 @@ async def test_scram512(): @pytest.mark.kafka async def test_plaintext(): with patch_aio_consumer_and_producer() as (consumer, producer): - from docs.docs_src.kafka.plaintext_security.app import ( + from docs.docs_src.kafka.security.plaintext import ( broker as plaintext_broker, ) + @plaintext_broker.subscriber("test") + async def handler(): + ... + async with plaintext_broker: await plaintext_broker.start()
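One change in this patch worth a note: the `faststream/_compat.py` hunks drop the `packaging.version.parse` dependency and derive major versions by splitting the version string, using a starred target (`major, *_`) so strings with more than the expected number of dot-separated parts still unpack cleanly. A minimal sketch of that pattern (the `major_of` helper name is hypothetical, for illustration only; like the patched code, it assumes numeric components, so a pre-release tag such as `"4.0.0rc1"` would raise `ValueError`):

```python
def major_of(version: str) -> int:
    """Return the leading numeric component of a dotted version string."""
    # The starred target absorbs any remaining components, so both
    # "3.7" and "3.7.1.0" unpack without error.
    major, *_ = map(int, version.split("."))
    return major

assert major_of("3.7.1") == 3
assert major_of("4.2") == 4
```

The same idea appears in the FastAPI check, where `major, minor, _ = ...` (exactly three parts) was loosened to `major, minor, *_ = ...` to tolerate longer version strings.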