Full Code of airtai/fastkafka for AI

Repository: airtai/fastkafka
Branch: main
Commit: 249d485f219a
Files: 352
Total size: 4.0 MB

Directory structure:
gitextract_q28pcl6l/

├── .github/
│   └── workflows/
│       ├── codeql.yml
│       ├── dependency-review.yml
│       ├── deploy.yaml
│       ├── index-docs-for-fastkafka-chat.yaml
│       └── test.yaml
├── .gitignore
├── .pre-commit-config.yaml
├── .semgrepignore
├── CHANGELOG.md
├── CNAME
├── CONTRIBUTING.md
├── LICENSE
├── MANIFEST.in
├── README.md
├── docker/
│   ├── .semgrepignore
│   └── dev.yml
├── docusaurus/
│   ├── babel.config.js
│   ├── docusaurus.config.js
│   ├── package.json
│   ├── scripts/
│   │   ├── build_docusaurus_docs.sh
│   │   ├── install_docusaurus_deps.sh
│   │   ├── serve_docusaurus_docs.sh
│   │   └── update_readme.sh
│   ├── sidebars.js
│   ├── src/
│   │   ├── components/
│   │   │   ├── BrowserWindow/
│   │   │   │   ├── index.js
│   │   │   │   └── styles.module.css
│   │   │   ├── HomepageCommunity/
│   │   │   │   ├── index.js
│   │   │   │   └── styles.module.css
│   │   │   ├── HomepageFAQ/
│   │   │   │   ├── index.js
│   │   │   │   └── styles.module.css
│   │   │   ├── HomepageFastkafkaChat/
│   │   │   │   ├── index.js
│   │   │   │   └── styles.module.css
│   │   │   ├── HomepageFeatures/
│   │   │   │   ├── index.js
│   │   │   │   └── styles.module.css
│   │   │   ├── HomepageWhatYouGet/
│   │   │   │   ├── index.js
│   │   │   │   └── styles.module.css
│   │   │   └── RobotFooterIcon/
│   │   │       ├── index.js
│   │   │       └── styles.module.css
│   │   ├── css/
│   │   │   └── custom.css
│   │   ├── pages/
│   │   │   ├── demo/
│   │   │   │   ├── index.js
│   │   │   │   └── styles.module.css
│   │   │   ├── index.js
│   │   │   └── index.module.css
│   │   └── utils/
│   │       ├── prismDark.mjs
│   │       └── prismLight.mjs
│   ├── static/
│   │   ├── .nojekyll
│   │   └── CNAME
│   ├── versioned_docs/
│   │   ├── version-0.5.0/
│   │   │   ├── CHANGELOG.md
│   │   │   ├── CNAME
│   │   │   ├── api/
│   │   │   │   └── fastkafka/
│   │   │   │       ├── FastKafka.md
│   │   │   │       ├── KafkaEvent.md
│   │   │   │       ├── encoder/
│   │   │   │       │   └── avsc_to_pydantic.md
│   │   │   │       └── testing/
│   │   │   │           ├── ApacheKafkaBroker.md
│   │   │   │           ├── LocalRedpandaBroker.md
│   │   │   │           └── Tester.md
│   │   │   ├── cli/
│   │   │   │   ├── fastkafka.md
│   │   │   │   └── run_fastkafka_server_process.md
│   │   │   ├── guides/
│   │   │   │   ├── Guide_00_FastKafka_Demo.md
│   │   │   │   ├── Guide_01_Intro.md
│   │   │   │   ├── Guide_02_First_Steps.md
│   │   │   │   ├── Guide_03_Authentication.md
│   │   │   │   ├── Guide_04_Github_Actions_Workflow.md
│   │   │   │   ├── Guide_05_Lifespan_Handler.md
│   │   │   │   ├── Guide_06_Benchmarking_FastKafka.md
│   │   │   │   ├── Guide_07_Encoding_and_Decoding_Messages_with_FastKafka.md
│   │   │   │   ├── Guide_11_Consumes_Basics.md
│   │   │   │   ├── Guide_21_Produces_Basics.md
│   │   │   │   ├── Guide_22_Partition_Keys.md
│   │   │   │   ├── Guide_30_Using_docker_to_deploy_fastkafka.md
│   │   │   │   └── Guide_31_Using_redpanda_to_test_fastkafka.md
│   │   │   ├── index.md
│   │   │   └── overrides/
│   │   │       ├── css/
│   │   │       │   └── extra.css
│   │   │       └── js/
│   │   │           ├── extra.js
│   │   │           ├── math.js
│   │   │           └── mathjax.js
│   │   ├── version-0.6.0/
│   │   │   ├── CHANGELOG.md
│   │   │   ├── CNAME
│   │   │   ├── CONTRIBUTING.md
│   │   │   ├── LICENSE.md
│   │   │   ├── api/
│   │   │   │   └── fastkafka/
│   │   │   │       ├── EventMetadata.md
│   │   │   │       ├── FastKafka.md
│   │   │   │       ├── KafkaEvent.md
│   │   │   │       ├── encoder/
│   │   │   │       │   ├── AvroBase.md
│   │   │   │       │   ├── avro_decoder.md
│   │   │   │       │   ├── avro_encoder.md
│   │   │   │       │   ├── avsc_to_pydantic.md
│   │   │   │       │   ├── json_decoder.md
│   │   │   │       │   └── json_encoder.md
│   │   │   │       ├── executors/
│   │   │   │       │   ├── DynamicTaskExecutor.md
│   │   │   │       │   └── SequentialExecutor.md
│   │   │   │       └── testing/
│   │   │   │           ├── ApacheKafkaBroker.md
│   │   │   │           ├── LocalRedpandaBroker.md
│   │   │   │           └── Tester.md
│   │   │   ├── cli/
│   │   │   │   ├── fastkafka.md
│   │   │   │   └── run_fastkafka_server_process.md
│   │   │   ├── guides/
│   │   │   │   ├── Guide_00_FastKafka_Demo.md
│   │   │   │   ├── Guide_01_Intro.md
│   │   │   │   ├── Guide_02_First_Steps.md
│   │   │   │   ├── Guide_03_Authentication.md
│   │   │   │   ├── Guide_04_Github_Actions_Workflow.md
│   │   │   │   ├── Guide_05_Lifespan_Handler.md
│   │   │   │   ├── Guide_06_Benchmarking_FastKafka.md
│   │   │   │   ├── Guide_07_Encoding_and_Decoding_Messages_with_FastKafka.md
│   │   │   │   ├── Guide_11_Consumes_Basics.md
│   │   │   │   ├── Guide_21_Produces_Basics.md
│   │   │   │   ├── Guide_22_Partition_Keys.md
│   │   │   │   ├── Guide_23_Batch_Producing.md
│   │   │   │   ├── Guide_30_Using_docker_to_deploy_fastkafka.md
│   │   │   │   └── Guide_31_Using_redpanda_to_test_fastkafka.md
│   │   │   ├── index.md
│   │   │   └── overrides/
│   │   │       ├── css/
│   │   │       │   └── extra.css
│   │   │       └── js/
│   │   │           ├── extra.js
│   │   │           ├── math.js
│   │   │           └── mathjax.js
│   │   ├── version-0.7.0/
│   │   │   ├── CHANGELOG.md
│   │   │   ├── CNAME
│   │   │   ├── CONTRIBUTING.md
│   │   │   ├── LICENSE.md
│   │   │   ├── api/
│   │   │   │   └── fastkafka/
│   │   │   │       ├── EventMetadata.md
│   │   │   │       ├── FastKafka.md
│   │   │   │       ├── KafkaEvent.md
│   │   │   │       ├── encoder/
│   │   │   │       │   ├── AvroBase.md
│   │   │   │       │   ├── avro_decoder.md
│   │   │   │       │   ├── avro_encoder.md
│   │   │   │       │   ├── avsc_to_pydantic.md
│   │   │   │       │   ├── json_decoder.md
│   │   │   │       │   └── json_encoder.md
│   │   │   │       ├── executors/
│   │   │   │       │   ├── DynamicTaskExecutor.md
│   │   │   │       │   └── SequentialExecutor.md
│   │   │   │       └── testing/
│   │   │   │           ├── ApacheKafkaBroker.md
│   │   │   │           ├── LocalRedpandaBroker.md
│   │   │   │           └── Tester.md
│   │   │   ├── cli/
│   │   │   │   ├── fastkafka.md
│   │   │   │   └── run_fastkafka_server_process.md
│   │   │   ├── guides/
│   │   │   │   ├── Guide_00_FastKafka_Demo.md
│   │   │   │   ├── Guide_01_Intro.md
│   │   │   │   ├── Guide_02_First_Steps.md
│   │   │   │   ├── Guide_03_Authentication.md
│   │   │   │   ├── Guide_04_Github_Actions_Workflow.md
│   │   │   │   ├── Guide_05_Lifespan_Handler.md
│   │   │   │   ├── Guide_06_Benchmarking_FastKafka.md
│   │   │   │   ├── Guide_07_Encoding_and_Decoding_Messages_with_FastKafka.md
│   │   │   │   ├── Guide_11_Consumes_Basics.md
│   │   │   │   ├── Guide_12_Batch_Consuming.md
│   │   │   │   ├── Guide_21_Produces_Basics.md
│   │   │   │   ├── Guide_22_Partition_Keys.md
│   │   │   │   ├── Guide_23_Batch_Producing.md
│   │   │   │   ├── Guide_24_Using_Multiple_Kafka_Clusters.md
│   │   │   │   ├── Guide_30_Using_docker_to_deploy_fastkafka.md
│   │   │   │   ├── Guide_31_Using_redpanda_to_test_fastkafka.md
│   │   │   │   └── Guide_32_Using_fastapi_to_run_fastkafka_application.md
│   │   │   ├── index.md
│   │   │   └── overrides/
│   │   │       ├── css/
│   │   │       │   └── extra.css
│   │   │       └── js/
│   │   │           ├── extra.js
│   │   │           ├── math.js
│   │   │           └── mathjax.js
│   │   ├── version-0.7.1/
│   │   │   ├── CHANGELOG.md
│   │   │   ├── CNAME
│   │   │   ├── CONTRIBUTING.md
│   │   │   ├── LICENSE.md
│   │   │   ├── api/
│   │   │   │   └── fastkafka/
│   │   │   │       ├── EventMetadata.md
│   │   │   │       ├── FastKafka.md
│   │   │   │       ├── KafkaEvent.md
│   │   │   │       ├── encoder/
│   │   │   │       │   ├── AvroBase.md
│   │   │   │       │   ├── avro_decoder.md
│   │   │   │       │   ├── avro_encoder.md
│   │   │   │       │   ├── avsc_to_pydantic.md
│   │   │   │       │   ├── json_decoder.md
│   │   │   │       │   └── json_encoder.md
│   │   │   │       ├── executors/
│   │   │   │       │   ├── DynamicTaskExecutor.md
│   │   │   │       │   └── SequentialExecutor.md
│   │   │   │       └── testing/
│   │   │   │           ├── ApacheKafkaBroker.md
│   │   │   │           ├── LocalRedpandaBroker.md
│   │   │   │           └── Tester.md
│   │   │   ├── cli/
│   │   │   │   ├── fastkafka.md
│   │   │   │   └── run_fastkafka_server_process.md
│   │   │   ├── guides/
│   │   │   │   ├── Guide_00_FastKafka_Demo.md
│   │   │   │   ├── Guide_01_Intro.md
│   │   │   │   ├── Guide_02_First_Steps.md
│   │   │   │   ├── Guide_03_Authentication.md
│   │   │   │   ├── Guide_04_Github_Actions_Workflow.md
│   │   │   │   ├── Guide_05_Lifespan_Handler.md
│   │   │   │   ├── Guide_06_Benchmarking_FastKafka.md
│   │   │   │   ├── Guide_07_Encoding_and_Decoding_Messages_with_FastKafka.md
│   │   │   │   ├── Guide_11_Consumes_Basics.md
│   │   │   │   ├── Guide_12_Batch_Consuming.md
│   │   │   │   ├── Guide_21_Produces_Basics.md
│   │   │   │   ├── Guide_22_Partition_Keys.md
│   │   │   │   ├── Guide_23_Batch_Producing.md
│   │   │   │   ├── Guide_24_Using_Multiple_Kafka_Clusters.md
│   │   │   │   ├── Guide_30_Using_docker_to_deploy_fastkafka.md
│   │   │   │   ├── Guide_31_Using_redpanda_to_test_fastkafka.md
│   │   │   │   └── Guide_32_Using_fastapi_to_run_fastkafka_application.md
│   │   │   ├── index.md
│   │   │   └── overrides/
│   │   │       ├── css/
│   │   │       │   └── extra.css
│   │   │       └── js/
│   │   │           ├── extra.js
│   │   │           ├── math.js
│   │   │           └── mathjax.js
│   │   └── version-0.8.0/
│   │       ├── CHANGELOG.md
│   │       ├── CNAME
│   │       ├── CONTRIBUTING.md
│   │       ├── LICENSE.md
│   │       ├── api/
│   │       │   └── fastkafka/
│   │       │       ├── EventMetadata.md
│   │       │       ├── FastKafka.md
│   │       │       ├── KafkaEvent.md
│   │       │       ├── encoder/
│   │       │       │   ├── AvroBase.md
│   │       │       │   ├── avro_decoder.md
│   │       │       │   ├── avro_encoder.md
│   │       │       │   ├── avsc_to_pydantic.md
│   │       │       │   ├── json_decoder.md
│   │       │       │   └── json_encoder.md
│   │       │       ├── executors/
│   │       │       │   ├── DynamicTaskExecutor.md
│   │       │       │   └── SequentialExecutor.md
│   │       │       └── testing/
│   │       │           ├── ApacheKafkaBroker.md
│   │       │           ├── LocalRedpandaBroker.md
│   │       │           └── Tester.md
│   │       ├── cli/
│   │       │   ├── fastkafka.md
│   │       │   └── run_fastkafka_server_process.md
│   │       ├── guides/
│   │       │   ├── Guide_00_FastKafka_Demo.md
│   │       │   ├── Guide_01_Intro.md
│   │       │   ├── Guide_02_First_Steps.md
│   │       │   ├── Guide_03_Authentication.md
│   │       │   ├── Guide_04_Github_Actions_Workflow.md
│   │       │   ├── Guide_05_Lifespan_Handler.md
│   │       │   ├── Guide_06_Benchmarking_FastKafka.md
│   │       │   ├── Guide_07_Encoding_and_Decoding_Messages_with_FastKafka.md
│   │       │   ├── Guide_11_Consumes_Basics.md
│   │       │   ├── Guide_12_Batch_Consuming.md
│   │       │   ├── Guide_21_Produces_Basics.md
│   │       │   ├── Guide_22_Partition_Keys.md
│   │       │   ├── Guide_23_Batch_Producing.md
│   │       │   ├── Guide_24_Using_Multiple_Kafka_Clusters.md
│   │       │   ├── Guide_30_Using_docker_to_deploy_fastkafka.md
│   │       │   ├── Guide_31_Using_redpanda_to_test_fastkafka.md
│   │       │   └── Guide_32_Using_fastapi_to_run_fastkafka_application.md
│   │       ├── index.md
│   │       └── overrides/
│   │           ├── css/
│   │           │   └── extra.css
│   │           └── js/
│   │               ├── extra.js
│   │               ├── math.js
│   │               └── mathjax.js
│   ├── versioned_sidebars/
│   │   ├── version-0.5.0-sidebars.json
│   │   ├── version-0.6.0-sidebars.json
│   │   ├── version-0.7.0-sidebars.json
│   │   ├── version-0.7.1-sidebars.json
│   │   └── version-0.8.0-sidebars.json
│   └── versions.json
├── fastkafka/
│   ├── __init__.py
│   ├── _aiokafka_imports.py
│   ├── _application/
│   │   ├── __init__.py
│   │   ├── app.py
│   │   └── tester.py
│   ├── _cli.py
│   ├── _cli_docs.py
│   ├── _cli_testing.py
│   ├── _components/
│   │   ├── __init__.py
│   │   ├── _subprocess.py
│   │   ├── aiokafka_consumer_loop.py
│   │   ├── asyncapi.py
│   │   ├── benchmarking.py
│   │   ├── docs_dependencies.py
│   │   ├── encoder/
│   │   │   ├── __init__.py
│   │   │   ├── avro.py
│   │   │   └── json.py
│   │   ├── helpers.py
│   │   ├── logger.py
│   │   ├── meta.py
│   │   ├── producer_decorator.py
│   │   ├── task_streaming.py
│   │   └── test_dependencies.py
│   ├── _docusaurus_helper.py
│   ├── _helpers.py
│   ├── _modidx.py
│   ├── _server.py
│   ├── _testing/
│   │   ├── __init__.py
│   │   ├── apache_kafka_broker.py
│   │   ├── in_memory_broker.py
│   │   ├── local_redpanda_broker.py
│   │   └── test_utils.py
│   ├── encoder.py
│   ├── executors.py
│   └── testing.py
├── mkdocs/
│   ├── docs_overrides/
│   │   ├── css/
│   │   │   └── extra.css
│   │   └── js/
│   │       ├── extra.js
│   │       ├── math.js
│   │       └── mathjax.js
│   ├── mkdocs.yml
│   ├── overrides/
│   │   └── main.html
│   ├── site_overrides/
│   │   ├── main.html
│   │   └── partials/
│   │       └── copyright.html
│   └── summary_template.txt
├── mypy.ini
├── nbs/
│   ├── .gitignore
│   ├── 000_AIOKafkaImports.ipynb
│   ├── 000_Testing_export.ipynb
│   ├── 001_InMemoryBroker.ipynb
│   ├── 002_ApacheKafkaBroker.ipynb
│   ├── 003_LocalRedpandaBroker.ipynb
│   ├── 004_Test_Utils.ipynb
│   ├── 005_Application_executors_export.ipynb
│   ├── 006_TaskStreaming.ipynb
│   ├── 010_Application_export.ipynb
│   ├── 011_ConsumerLoop.ipynb
│   ├── 013_ProducerDecorator.ipynb
│   ├── 014_AsyncAPI.ipynb
│   ├── 015_FastKafka.ipynb
│   ├── 016_Tester.ipynb
│   ├── 017_Benchmarking.ipynb
│   ├── 018_Avro_Encode_Decoder.ipynb
│   ├── 019_Json_Encode_Decoder.ipynb
│   ├── 020_Encoder_Export.ipynb
│   ├── 021_FastKafkaServer.ipynb
│   ├── 022_Subprocess.ipynb
│   ├── 023_CLI.ipynb
│   ├── 024_CLI_Docs.ipynb
│   ├── 025_CLI_Testing.ipynb
│   ├── 096_Docusaurus_Helper.ipynb
│   ├── 096_Meta.ipynb
│   ├── 097_Docs_Dependencies.ipynb
│   ├── 098_Test_Dependencies.ipynb
│   ├── 099_Test_Service.ipynb
│   ├── 998_Internal_Helpers.ipynb
│   ├── 999_Helpers.ipynb
│   ├── Logger.ipynb
│   ├── _quarto.yml
│   ├── guides/
│   │   ├── .gitignore
│   │   ├── Guide_00_FastKafka_Demo.ipynb
│   │   ├── Guide_01_Intro.ipynb
│   │   ├── Guide_02_First_Steps.ipynb
│   │   ├── Guide_03_Authentication.ipynb
│   │   ├── Guide_04_Github_Actions_Workflow.ipynb
│   │   ├── Guide_05_Lifespan_Handler.ipynb
│   │   ├── Guide_06_Benchmarking_FastKafka.ipynb
│   │   ├── Guide_07_Encoding_and_Decoding_Messages_with_FastKafka.ipynb
│   │   ├── Guide_11_Consumes_Basics.ipynb
│   │   ├── Guide_12_Batch_Consuming.ipynb
│   │   ├── Guide_21_Produces_Basics.ipynb
│   │   ├── Guide_22_Partition_Keys.ipynb
│   │   ├── Guide_23_Batch_Producing.ipynb
│   │   ├── Guide_24_Using_Multiple_Kafka_Clusters.ipynb
│   │   ├── Guide_30_Using_docker_to_deploy_fastkafka.ipynb
│   │   ├── Guide_31_Using_redpanda_to_test_fastkafka.ipynb
│   │   ├── Guide_32_Using_fastapi_to_run_fastkafka_application.ipynb
│   │   └── Guide_33_Using_Tester_class_to_test_fastkafka.ipynb
│   ├── index.ipynb
│   ├── nbdev.yml
│   ├── sidebar.yml
│   └── styles.css
├── run_jupyter.sh
├── set_variables.sh
├── settings.ini
├── setup.py
└── stop_jupyter.sh

================================================
FILE CONTENTS
================================================

================================================
FILE: .github/workflows/codeql.yml
================================================
# For most projects, this workflow file will not need changing; you simply need
# to commit it to your repository.
#
# You may wish to alter this file to override the set of languages analyzed,
# or to provide custom queries or build logic.
#
# ******** NOTE ********
# We have attempted to detect the languages in your repository. Please check
# the `language` matrix defined below to confirm you have the correct set of
# supported CodeQL languages.
#
name: "CodeQL"

on:
  push:
    branches: [ "main" ]
  pull_request:
    # The branches below must be a subset of the branches above
    branches: [ "main" ]
  schedule:
    - cron: '34 11 * * 4'

jobs:
  analyze:
    name: Analyze
    runs-on: ubuntu-latest
    permissions:
      actions: read
      contents: read
      security-events: write

    strategy:
      fail-fast: false
      matrix:
        language: [ 'javascript', 'python' ]
        # CodeQL supports [ 'cpp', 'csharp', 'go', 'java', 'javascript', 'python', 'ruby' ]
        # Use only 'java' to analyze code written in Java, Kotlin or both
        # Use only 'javascript' to analyze code written in JavaScript, TypeScript or both
        # Learn more about CodeQL language support at https://aka.ms/codeql-docs/language-support

    steps:
    - name: Checkout repository
      uses: actions/checkout@v3

    # Initializes the CodeQL tools for scanning.
    - name: Initialize CodeQL
      uses: github/codeql-action/init@v2 # nosemgrep: yaml.github-actions.security.third-party-action-not-pinned-to-commit-sha.third-party-action-not-pinned-to-commit-sha
      with:
        languages: ${{ matrix.language }}
        # If you wish to specify custom queries, you can do so here or in a config file.
        # By default, queries listed here will override any specified in a config file.
        # Prefix the list here with "+" to use these queries and those in the config file.

        # Details on CodeQL's query packs refer to : https://docs.github.com/en/code-security/code-scanning/automatically-scanning-your-code-for-vulnerabilities-and-errors/configuring-code-scanning#using-queries-in-ql-packs
        # queries: security-extended,security-and-quality


    # Autobuild attempts to build any compiled languages  (C/C++, C#, Go, or Java).
    # If this step fails, then you should remove it and run the build manually (see below)
    - name: Autobuild
      uses: github/codeql-action/autobuild@v2 # nosemgrep: yaml.github-actions.security.third-party-action-not-pinned-to-commit-sha.third-party-action-not-pinned-to-commit-sha

    # ℹ️ Command-line programs to run using the OS shell.
    # 📚 See https://docs.github.com/en/actions/using-workflows/workflow-syntax-for-github-actions#jobsjob_idstepsrun

    #   If the Autobuild fails above, remove it and uncomment the following three lines.
    #   and modify them (or add more) to build your code if your project uses a compiled language.

    # - run: |
    #   echo "Run, Build Application using script"
    #   ./location_of_script_within_repo/buildscript.sh

    - name: Perform CodeQL Analysis
      uses: github/codeql-action/analyze@v2 # nosemgrep: yaml.github-actions.security.third-party-action-not-pinned-to-commit-sha.third-party-action-not-pinned-to-commit-sha
      with:
        category: "/language:${{matrix.language}}"


================================================
FILE: .github/workflows/dependency-review.yml
================================================
# Dependency Review Action
#
# This Action will scan dependency manifest files that change as part of a Pull Request, surfacing known-vulnerable versions of the packages declared or updated in the PR. Once installed, if the workflow run is marked as required, PRs introducing known-vulnerable packages will be blocked from merging.
#
# Source repository: https://github.com/actions/dependency-review-action
# Public documentation: https://docs.github.com/en/code-security/supply-chain-security/understanding-your-software-supply-chain/about-dependency-review#dependency-review-enforcement
name: 'Dependency Review'
on: [pull_request]

permissions:
  contents: read

jobs:
  dependency-review:
    runs-on: ubuntu-latest
    steps:
      - name: 'Checkout Repository'
        uses: actions/checkout@v3
      - name: 'Dependency Review'
        uses: actions/dependency-review-action@v2


================================================
FILE: .github/workflows/deploy.yaml
================================================
name: Deploy FastKafka documentation to the GitHub Pages

on:
  push:
    branches: [ "main", "master"]
  workflow_dispatch:
jobs:
  deploy:
    runs-on: ubuntu-latest
    steps:
      - uses: airtai/workflows/fastkafka-docusaurus-ghp@main # nosemgrep: yaml.github-actions.security.third-party-action-not-pinned-to-commit-sha.third-party-action-not-pinned-to-commit-sha


================================================
FILE: .github/workflows/index-docs-for-fastkafka-chat.yaml
================================================
name: Index docs for fastkafka chat application

on:
  workflow_run:
    workflows: ["pages-build-deployment"]
    types: [completed]

env:
  OPENAI_API_KEY: ${{ secrets.OPENAI_API_KEY }}
  PERSONAL_ACCESS_TOKEN: ${{ secrets.PERSONAL_ACCESS_TOKEN }}

jobs:
  on-success:
    name: Index docs for fastkafka chat application
    runs-on: ubuntu-latest
    permissions:
      contents: write
    if: ${{ github.event.workflow_run.conclusion == 'success' }}
    steps:
      - name: Checkout airtai/fastkafkachat repo
        uses: actions/checkout@v3
        with:
          token: ${{ secrets.PERSONAL_ACCESS_TOKEN }}
          ref: ${{ github.head_ref }}
          repository: airtai/fastkafkachat

      - name: Setup Python
        uses: actions/setup-python@v4
        with:
          python-version: "3.9"
          cache: "pip"
          cache-dependency-path: settings.ini

      - name: Install Dependencies
        shell: bash
        run: |
          set -ux
          python -m pip install --upgrade pip
          test -f setup.py && pip install -e ".[dev]"

      - name: Index the fastkafka docs
        shell: bash
        run: |
          index_website_data

      - name: Push updated index to airtai/fastkafkachat repo
        uses: stefanzweifel/git-auto-commit-action@v4 # nosemgrep: yaml.github-actions.security.third-party-action-not-pinned-to-commit-sha.third-party-action-not-pinned-to-commit-sha
        with:
          commit_message: "Update fastkafka docs index file"
          file_pattern: "data/website_index.zip"


================================================
FILE: .github/workflows/test.yaml
================================================
name: CI
on:  [workflow_dispatch, push]

jobs:
  mypy_static_analysis:
    runs-on: ubuntu-latest
    steps:
      - uses: airtai/workflows/airt-mypy-check@main # nosemgrep: yaml.github-actions.security.third-party-action-not-pinned-to-commit-sha.third-party-action-not-pinned-to-commit-sha
  bandit_static_analysis:
    runs-on: ubuntu-latest
    steps:
      - uses: airtai/workflows/airt-bandit-check@main # nosemgrep: yaml.github-actions.security.third-party-action-not-pinned-to-commit-sha.third-party-action-not-pinned-to-commit-sha
  semgrep_static_analysis:
    runs-on: ubuntu-latest
    steps:
      - uses: airtai/workflows/airt-semgrep-check@main # nosemgrep: yaml.github-actions.security.third-party-action-not-pinned-to-commit-sha.third-party-action-not-pinned-to-commit-sha
  test:
    timeout-minutes: 60
    strategy:
      fail-fast: false
      matrix:
        os:  [ubuntu, windows]
        version: ["3.8", "3.9", "3.10", "3.11"]
    runs-on: ${{ matrix.os }}-latest
    defaults:
      run:
        shell: bash
    steps:
      - name: Configure Pagefile
        if: matrix.os == 'windows'
        uses: al-cheb/configure-pagefile-action@v1.2 # nosemgrep: yaml.github-actions.security.third-party-action-not-pinned-to-commit-sha.third-party-action-not-pinned-to-commit-sha
        with:
          minimum-size: 8GB
          maximum-size: 8GB
          disk-root: "C:"
      - name: Install quarto
        uses: quarto-dev/quarto-actions/setup@v2 # nosemgrep: yaml.github-actions.security.third-party-action-not-pinned-to-commit-sha.third-party-action-not-pinned-to-commit-sha
      - name: Prepare nbdev env
        uses: fastai/workflows/nbdev-ci@master # nosemgrep: yaml.github-actions.security.third-party-action-not-pinned-to-commit-sha.third-party-action-not-pinned-to-commit-sha
        with:
          version: ${{ matrix.version }}
          skip_test: true
      - name: List pip deps
        run: |
          pip list
      - name: Install testing deps
        run: |
          fastkafka docs install_deps
          fastkafka testing install_deps
      - name: Run nbdev tests
        run: |
          nbdev_test --timing --do_print --n_workers 1 --file_glob "*_CLI*" # Run CLI tests first one by one because of npm installation clashes with other tests
          nbdev_test --timing --do_print --skip_file_glob "*_CLI*"
      - name: Test building docs with nbdev-mkdocs
        if: matrix.os != 'windows'
        run: |
          nbdev_mkdocs docs
          if [ -f "mkdocs/site/index.html" ]; then
            echo "docs built successfully."
          else
            echo "index page not found in rendered docs."
            ls -la
            ls -la mkdocs/site/
            exit 1
          fi

  # https://github.com/marketplace/actions/alls-green#why
  check: # This job does nothing and is only used for the branch protection
    if: always()

    needs:
      - test
      - mypy_static_analysis
      - bandit_static_analysis
      - semgrep_static_analysis

    runs-on: ubuntu-latest

    steps:
      - name: Decide whether the needed jobs succeeded or failed
        uses: re-actors/alls-green@release/v1 # nosemgrep
        with:
          jobs: ${{ toJSON(needs) }}



================================================
FILE: .gitignore
================================================
# Byte-compiled / optimized / DLL files
__pycache__/
*.py[cod]
*$py.class

# C extensions
*.so

# Distribution / packaging
.Python
build/
develop-eggs/
dist/
downloads/
eggs/
.eggs/
lib/
lib64/
parts/
sdist/
var/
wheels/
pip-wheel-metadata/
share/python-wheels/
*.egg-info/
.installed.cfg
*.egg
MANIFEST

# PyInstaller
#  Usually these files are written by a python script from a template
#  before PyInstaller builds the exe, so as to inject date/other infos into it.
*.manifest
*.spec

# Installer logs
pip-log.txt
pip-delete-this-directory.txt

# Unit test / coverage reports
htmlcov/
.tox/
.nox/
.coverage
.coverage.*
.cache
nosetests.xml
coverage.xml
*.cover
*.py,cover
.hypothesis/
.pytest_cache/

# Translations
*.mo
*.pot

# Django stuff:
*.log
local_settings.py
db.sqlite3
db.sqlite3-journal

# Flask stuff:
instance/
.webassets-cache

# Scrapy stuff:
.scrapy

# Sphinx documentation
docs/_build/

# PyBuilder
target/

# Jupyter Notebook
.ipynb_checkpoints

# IPython
profile_default/
ipython_config.py

# pyenv
.python-version

# pipenv
#   According to pypa/pipenv#598, it is recommended to include Pipfile.lock in version control.
#   However, in case of collaboration, if having platform-specific dependencies or dependencies
#   having no cross-platform support, pipenv may install dependencies that don't work, or not
#   install all needed dependencies.
#Pipfile.lock

# PEP 582; used by e.g. github.com/David-OConnor/pyflow
__pypackages__/

# Celery stuff
celerybeat-schedule
celerybeat.pid

# SageMath parsed files
*.sage.py

# Environments
.env
.venv
env/
venv/
ENV/
env.bak/
venv.bak/

# Spyder project settings
.spyderproject
.spyproject

# Rope project settings
.ropeproject

# mkdocs documentation
/site

# docusaurus documentation
docusaurus/node_modules
docusaurus/docs
docusaurus/build

docusaurus/.docusaurus
docusaurus/.cache-loader

docusaurus/.DS_Store
docusaurus/.env.local
docusaurus/.env.development.local
docusaurus/.env.test.local
docusaurus/.env.production.local

docusaurus/npm-debug.log*
docusaurus/yarn-debug.log*
docusaurus/yarn-error.log*

# mypy
.mypy_cache/
.dmypy.json
dmypy.json

# Pyre type checker
.pyre/

# PyCharm
.idea

# nbdev related stuff
.gitattributes
.gitconfig
_proc
_docs

nbs/asyncapi
nbs/guides/asyncapi
nbs/.last_checked
nbs/_*.ipynb
token
*.bak

# nbdev_mkdocs
mkdocs/docs/
mkdocs/site/

# Ignore trashbins
.Trash*

.vscode


================================================
FILE: .pre-commit-config.yaml
================================================
# See https://pre-commit.com for more information
# See https://pre-commit.com/hooks.html for more hooks

repos:
-   repo: https://github.com/pre-commit/pre-commit-hooks
    rev: "v4.4.0"
    hooks:
      #    -   id: trailing-whitespace
      #    -   id: end-of-file-fixer
      #    -   id: check-yaml
    -   id: check-added-large-files

- repo: https://github.com/PyCQA/bandit
  rev: '1.7.5'
  hooks:
  - id: bandit

    #- repo: https://github.com/returntocorp/semgrep
    #  rev: "v1.14.0"
    #  hooks:
    #    - id: semgrep
    #      name: Semgrep 
    #      args: ["--config", "auto", "--error"]
    #      exclude: ^docker/



================================================
FILE: .semgrepignore
================================================
docker/


================================================
FILE: CHANGELOG.md
================================================
# Release notes

<!-- do not remove -->

## 0.8.0

### New Features

- Add support for Pydantic v2 ([#408](https://github.com/airtai/fastkafka/issues/408)), thanks to [@kumaranvpl](https://github.com/kumaranvpl)
  - FastKafka now uses Pydantic v2 for serialization and deserialization of messages
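The bullet above concerns how message objects are turned into bytes on the wire and back. As a stdlib-only analogy of that JSON encode/decode round-trip (the library itself uses Pydantic models and its `json_encoder`/`json_decoder` helpers; the names below are illustrative, not FastKafka's implementation):

```python
import dataclasses
import json


@dataclasses.dataclass
class Greeting:
    """Stand-in for a Pydantic message model."""
    msg: str


def encode(value) -> bytes:
    # Serialize the message object to JSON bytes for the wire
    return json.dumps(dataclasses.asdict(value)).encode("utf-8")


def decode(raw: bytes, cls):
    # Rebuild the message object from JSON bytes received from Kafka
    return cls(**json.loads(raw.decode("utf-8")))
```

A produced message goes through `encode` before hitting the broker, and a consumed one through `decode` before reaching the handler.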
 
- Enable nbdev_test on windows and run CI tests on windows ([#356](https://github.com/airtai/fastkafka/pull/356)), thanks to [@kumaranvpl](https://github.com/kumaranvpl)

### Bugs Squashed

- Fix `fastkafka testing install_deps` failing ([#385](https://github.com/airtai/fastkafka/pull/385)), thanks to [@Sternakt](https://github.com/Sternakt)

- Create asyncapi docs directory only while building asyncapi docs ([#368](https://github.com/airtai/fastkafka/pull/368)), thanks to [@kumaranvpl](https://github.com/kumaranvpl)

- Add retries to producer in case of raised KafkaTimeoutError exception ([#423](https://github.com/airtai/fastkafka/pull/423)), thanks to [@Sternakt](https://github.com/Sternakt)

## 0.7.1

### Bugs Squashed
 
- Limit pydantic version to <2.0 ([#427](https://github.com/airtai/fastkafka/issues/427))

- Fix Kafka broker version installation issues ([#427](https://github.com/airtai/fastkafka/issues/427))

- Fix ApacheKafkaBroker startup issues ([#427](https://github.com/airtai/fastkafka/issues/427))

## 0.7.0

### New Features

- Optional description argument to consumes and produces decorator implemented ([#338](https://github.com/airtai/fastkafka/pull/338)), thanks to [@Sternakt](https://github.com/Sternakt)
  - Consumes and produces decorators now have an optional `description` argument that, when specified, is used instead of the function docstring in AsyncAPI doc generation

- FastKafka Windows OS support enabled ([#326](https://github.com/airtai/fastkafka/pull/326)), thanks to [@kumaranvpl](https://github.com/kumaranvpl)
  - FastKafka can now run on Windows

- FastKafka and FastAPI integration implemented ([#304](https://github.com/airtai/fastkafka/pull/304)), thanks to [@kumaranvpl](https://github.com/kumaranvpl)
  - FastKafka can now be run alongside FastAPI

- Batch consuming option to consumers implemented ([#298](https://github.com/airtai/fastkafka/pull/298)), thanks to [@Sternakt](https://github.com/Sternakt)
  - Consumers can consume events in batches by annotating the consuming function's message parameter as `List[YourMsgType]`
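One way a framework can detect batch handlers from such a `List[...]` annotation is stdlib `typing` introspection; this is an illustrative sketch under that assumption, not FastKafka's actual code:

```python
from typing import List, get_origin, get_type_hints


class Order:
    """Hypothetical message type."""


async def on_order(msg: Order) -> None:
    ...  # single-message handler


async def on_orders(msgs: List[Order]) -> None:
    ...  # batch handler


def is_batch_handler(fn) -> bool:
    # Batch mode is signalled by annotating the message parameter as List[...]
    hints = get_type_hints(fn)
    hints.pop("return", None)
    first_param = next(iter(hints.values()))
    return get_origin(first_param) is list
```

A consumer loop could call `is_batch_handler` at registration time to decide whether to deliver messages one by one or as lists.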

- Removed support for synchronous produce functions ([#295](https://github.com/airtai/fastkafka/pull/295)), thanks to [@kumaranvpl](https://github.com/kumaranvpl)

- Added default broker values and update docs ([#292](https://github.com/airtai/fastkafka/pull/292)), thanks to [@Sternakt](https://github.com/Sternakt)

### Bugs Squashed

- Fix index.ipynb to be runnable in colab ([#342](https://github.com/airtai/fastkafka/issues/342))

- Use CLI option `root_path` in the docs generate and serve CLI commands ([#341](https://github.com/airtai/fastkafka/pull/341)), thanks to [@kumaranvpl](https://github.com/kumaranvpl)

- Fix incorrect asyncapi docs path on fastkafka docs serve command ([#335](https://github.com/airtai/fastkafka/pull/335)), thanks to [@Sternakt](https://github.com/Sternakt)
  - Serve docs now takes app `root_path` argument into consideration when specified in app

- Fix typo (supress_timestamps->suppress_timestamps) and remove fix for enabling timestamps ([#315](https://github.com/airtai/fastkafka/issues/315))

- Fix logs printing timestamps ([#308](https://github.com/airtai/fastkafka/issues/308))

- Fix topics with dots causing failure of tester instantiation ([#306](https://github.com/airtai/fastkafka/pull/306)), thanks to [@Sternakt](https://github.com/Sternakt)
  - Specified topics can now have "." in their names

## 0.6.0

### New Features

- Timestamps added to CLI commands ([#283](https://github.com/airtai/fastkafka/pull/283)), thanks to [@davorrunje](https://github.com/davorrunje)

- Added option to process messages concurrently ([#278](https://github.com/airtai/fastkafka/pull/278)), thanks to [@Sternakt](https://github.com/Sternakt)
  - A new `executor` option is added that supports either sequential processing for tasks with small latencies or concurrent processing for tasks with larger latencies.

- Add consumes and produces functions to app ([#274](https://github.com/airtai/fastkafka/pull/274)), thanks to [@Sternakt](https://github.com/Sternakt)


- Add batching for producers ([#273](https://github.com/airtai/fastkafka/pull/273)), thanks to [@Sternakt](https://github.com/Sternakt)
  - Responds to a user request for batch support ([Discord request](https://discord.com/channels/1085457301214855171/1090956337938182266/1098592795557630063))

- Fix broken links in guides ([#272](https://github.com/airtai/fastkafka/pull/272)), thanks to [@harishmohanraj](https://github.com/harishmohanraj)

- Generate the docusaurus sidebar dynamically by parsing summary.md ([#270](https://github.com/airtai/fastkafka/pull/270)), thanks to [@harishmohanraj](https://github.com/harishmohanraj)

- Metadata passed to consumer ([#269](https://github.com/airtai/fastkafka/pull/269)), thanks to [@Sternakt](https://github.com/Sternakt)
  - Responds to user requests to read the message key and header values in consumers, e.g. for CDC with Debezium, where header values differentiate between the CRUD operations ([Discord request](https://discord.com/channels/1085457301214855171/1090956337938182266/1098592795557630063))

- Contribution with instructions how to build and test added ([#255](https://github.com/airtai/fastkafka/pull/255)), thanks to [@Sternakt](https://github.com/Sternakt)


- Export encoders, decoders from fastkafka.encoder ([#246](https://github.com/airtai/fastkafka/pull/246)), thanks to [@kumaranvpl](https://github.com/kumaranvpl)


- Create a Github action file to automatically index the website and commit it to the FastKafkachat repository. ([#239](https://github.com/airtai/fastkafka/issues/239))


- UI Improvement: Post screenshots with links to the actual messages in testimonials section ([#228](https://github.com/airtai/fastkafka/issues/228))

### Bugs Squashed

- Batch testing fix ([#280](https://github.com/airtai/fastkafka/pull/280)), thanks to [@Sternakt](https://github.com/Sternakt)

- Tester breaks when using Batching or KafkaEvent producers ([#279](https://github.com/airtai/fastkafka/issues/279))

- Consumer loop callbacks are not executing in parallel ([#276](https://github.com/airtai/fastkafka/issues/276))


## 0.5.0

### New Features

- Significant speedup of Kafka producer ([#236](https://github.com/airtai/fastkafka/pull/236)), thanks to [@Sternakt](https://github.com/Sternakt)
 

- Added support for AVRO encoding/decoding ([#231](https://github.com/airtai/fastkafka/pull/231)), thanks to [@kumaranvpl](https://github.com/kumaranvpl)


### Bugs Squashed

- Fixed sidebar to include guides in docusaurus documentation ([#238](https://github.com/airtai/fastkafka/pull/238)), thanks to [@Sternakt](https://github.com/Sternakt)

- Fixed link to symbols in docusaurus docs ([#227](https://github.com/airtai/fastkafka/pull/227)), thanks to [@harishmohanraj](https://github.com/harishmohanraj)

- Removed bootstrap servers from constructor ([#220](https://github.com/airtai/fastkafka/pull/220)), thanks to [@kumaranvpl](https://github.com/kumaranvpl)


## 0.4.0

### New Features

- Integrate FastKafka chat ([#208](https://github.com/airtai/fastkafka/pull/208)), thanks to [@harishmohanraj](https://github.com/harishmohanraj)

- Add benchmarking ([#206](https://github.com/airtai/fastkafka/pull/206)), thanks to [@kumaranvpl](https://github.com/kumaranvpl)

- Enable fast testing without running kafka locally ([#198](https://github.com/airtai/fastkafka/pull/198)), thanks to [@Sternakt](https://github.com/Sternakt)

- Generate docs using Docusaurus ([#194](https://github.com/airtai/fastkafka/pull/194)), thanks to [@harishmohanraj](https://github.com/harishmohanraj)

- Add test cases for LocalRedpandaBroker ([#189](https://github.com/airtai/fastkafka/pull/189)), thanks to [@kumaranvpl](https://github.com/kumaranvpl)

- Reimplement patch and delegates from fastcore ([#188](https://github.com/airtai/fastkafka/pull/188)), thanks to [@Sternakt](https://github.com/Sternakt)

- Rename existing functions into start and stop and add lifespan handler ([#117](https://github.com/airtai/fastkafka/issues/117))
  - https://www.linkedin.com/posts/tiangolo_fastapi-activity-7038907638331404288-Oar3/?utm_source=share&utm_medium=member_ios


## 0.3.1

-  README.md file updated


## 0.3.0

### New Features

- Guide for FastKafka produces using partition key ([#172](https://github.com/airtai/fastkafka/pull/172)), thanks to [@Sternakt](https://github.com/Sternakt)
  - Closes #161

- Add support for Redpanda for testing and deployment ([#181](https://github.com/airtai/fastkafka/pull/181)), thanks to [@kumaranvpl](https://github.com/kumaranvpl)

- Remove `bootstrap_servers` from `__init__` and use the name of the broker as an option when running/testing ([#134](https://github.com/airtai/fastkafka/issues/134))

- Add a GH action file to check for broken links in the docs ([#163](https://github.com/airtai/fastkafka/issues/163))

- Optimize requirements for testing and docs ([#151](https://github.com/airtai/fastkafka/issues/151))

- Break requirements into base and optional for testing and dev ([#124](https://github.com/airtai/fastkafka/issues/124))
  - Minimize base requirements needed just for running the service.

- Add link to example git repo into guide for building docs using actions ([#81](https://github.com/airtai/fastkafka/issues/81))

- Add logging for run_in_background ([#46](https://github.com/airtai/fastkafka/issues/46))

- Implement partition Key mechanism for producers ([#16](https://github.com/airtai/fastkafka/issues/16))

### Bugs Squashed

- Implement checks for npm installation and version ([#176](https://github.com/airtai/fastkafka/pull/176)), thanks to [@Sternakt](https://github.com/Sternakt)
  - Closes #158 by checking whether npx is installed and adding more verbose error handling

- Fix the helper.py link in CHANGELOG.md ([#165](https://github.com/airtai/fastkafka/issues/165))

- fastkafka docs install_deps fails ([#157](https://github.com/airtai/fastkafka/issues/157))
  - Unexpected internal error: [Errno 2] No such file or directory: 'npx'

- Broken links in docs ([#141](https://github.com/airtai/fastkafka/issues/141))

- fastkafka run is not showing up in CLI docs ([#132](https://github.com/airtai/fastkafka/issues/132))


## 0.2.3

- Fixed broken links on PyPi index page


## 0.2.2

### New Features

- Extract JDK and Kafka installation out of LocalKafkaBroker ([#131](https://github.com/airtai/fastkafka/issues/131))

- PyYAML version relaxed ([#119](https://github.com/airtai/fastkafka/pull/119)), thanks to [@davorrunje](https://github.com/davorrunje)

- Replace docker based kafka with local ([#68](https://github.com/airtai/fastkafka/issues/68))
  - [x] replace docker compose with a simple docker run (standard run_jupyter.sh should do)
  - [x] replace all tests to use LocalKafkaBroker
  - [x] update documentation

### Bugs Squashed

- Fix broken link for FastKafka docs in index notebook ([#145](https://github.com/airtai/fastkafka/issues/145))

- Fix encoding issues when loading setup.py on windows OS ([#135](https://github.com/airtai/fastkafka/issues/135))


## 0.2.0

### New Features

- Replace kafka container with LocalKafkaBroker ([#112](https://github.com/airtai/fastkafka/issues/112))
  - [x] Replace kafka container with LocalKafkaBroker in tests
  - [x] Remove kafka container from tests environment
  - [x] Fix failing tests

### Bugs Squashed

- Fix random failing in CI ([#109](https://github.com/airtai/fastkafka/issues/109))


## 0.1.3

- Version update in `__init__.py`


## 0.1.2

### New Features


- Git workflow action for publishing Kafka docs ([#78](https://github.com/airtai/fastkafka/issues/78))


### Bugs Squashed

- Include missing requirement ([#110](https://github.com/airtai/fastkafka/issues/110))
  - [x] Typer is imported in this [file](https://github.com/airtai/fastkafka/blob/main/fastkafka/_components/helpers.py) but it is not included in [settings.ini](https://github.com/airtai/fastkafka/blob/main/settings.ini)
  - [x] Add aiohttp which is imported in this [file](https://github.com/airtai/fastkafka/blob/main/fastkafka/_helpers.py)
  - [x] Add nbformat which is imported in _components/helpers.py
  - [x] Add nbconvert which is imported in _components/helpers.py


## 0.1.1


### Bugs Squashed

- JDK install fails on Python 3.8 ([#106](https://github.com/airtai/fastkafka/issues/106))



## 0.1.0

Initial release


================================================
FILE: CNAME
================================================
fastkafka.airt.ai


================================================
FILE: CONTRIBUTING.md
================================================
# Contributing to FastKafka

First off, thanks for taking the time to contribute! ❤️

All types of contributions are encouraged and valued. See the [Table of Contents](#table-of-contents) for different ways to help and details about how this project handles them. Please make sure to read the relevant section before making your contribution. It will make it a lot easier for us maintainers and smooth out the experience for all involved. The community looks forward to your contributions. 🎉

> And if you like the project, but just don't have time to contribute, that's fine. There are other easy ways to support the project and show your appreciation, which we would also be very happy about:
> - Star the project
> - Tweet about it
> - Refer this project in your project's readme
> - Mention the project at local meetups and tell your friends/colleagues

## Table of Contents

- [I Have a Question](#i-have-a-question)
- [I Want To Contribute](#i-want-to-contribute)
  - [Reporting Bugs](#reporting-bugs)
  - [Suggesting Enhancements](#suggesting-enhancements)
  - [Your First Code Contribution](#your-first-code-contribution)
- [Development](#development)
    - [Prepare the dev environment](#prepare-the-dev-environment)
    - [Way of working](#way-of-working)
    - [Before a PR](#before-a-pr)



## I Have a Question

> If you want to ask a question, we assume that you have read the available [Documentation](https://fastkafka.airt.ai/docs).

Before you ask a question, it is best to search for existing [Issues](https://github.com/airtai/fastkafka/issues) that might help you. In case you have found a suitable issue and still need clarification, you can write your question in this issue.

If you then still feel the need to ask a question and need clarification, we recommend the following:

- Contact us on [Discord](https://discord.com/invite/CJWmYpyFbc)
- Open an [Issue](https://github.com/airtai/fastkafka/issues/new)
    - Provide as much context as you can about what you're running into

We will then take care of the issue as soon as possible.

## I Want To Contribute

> ### Legal Notice 
> When contributing to this project, you must agree that you have authored 100% of the content, that you have the necessary rights to the content and that the content you contribute may be provided under the project license.

### Reporting Bugs

#### Before Submitting a Bug Report

A good bug report shouldn't leave others needing to chase you up for more information. Therefore, we ask you to investigate carefully, collect information and describe the issue in detail in your report. Please complete the following steps in advance to help us fix any potential bug as fast as possible.

- Make sure that you are using the latest version.
- Determine if your bug is really a bug and not an error on your side e.g. using incompatible environment components/versions (Make sure that you have read the [documentation](https://fastkafka.airt.ai/docs). If you are looking for support, you might want to check [this section](#i-have-a-question)).
- To see if other users have experienced (and potentially already solved) the same issue you are having, check whether a bug report for your bug or error already exists in the [bug tracker](https://github.com/airtai/fastkafka/issues?q=label%3Abug).
- Also make sure to search the internet (including Stack Overflow) to see if users outside of the GitHub community have discussed the issue.
- Collect information about the bug:
  - Stack trace (Traceback)
  - OS, Platform and Version (Windows, Linux, macOS, x86, ARM)
  - Python version
  - Possibly your input and the output
  - Can you reliably reproduce the issue? And can you also reproduce it with older versions?

#### How Do I Submit a Good Bug Report?

We use GitHub issues to track bugs and errors. If you run into an issue with the project:

- Open an [Issue](https://github.com/airtai/fastkafka/issues/new). (Since we can't be sure at this point whether it is a bug or not, we ask you not to talk about a bug yet and not to label the issue.)
- Explain the behavior you would expect and the actual behavior.
- Please provide as much context as possible and describe the *reproduction steps* that someone else can follow to recreate the issue on their own. This usually includes your code. For good bug reports you should isolate the problem and create a reduced test case.
- Provide the information you collected in the previous section.

Once it's filed:

- The project team will label the issue accordingly.
- A team member will try to reproduce the issue with your provided steps. If there are no reproduction steps or no obvious way to reproduce the issue, the team will ask you for those steps and mark the issue as `needs-repro`. Bugs with the `needs-repro` tag will not be addressed until they are reproduced.
- If the team is able to reproduce the issue, it will be marked `needs-fix`, as well as possibly other tags (such as `critical`), and the issue will be left to be implemented.

### Suggesting Enhancements

This section guides you through submitting an enhancement suggestion for FastKafka, **including completely new features and minor improvements to existing functionality**. Following these guidelines will help maintainers and the community to understand your suggestion and find related suggestions.

#### Before Submitting an Enhancement

- Make sure that you are using the latest version.
- Read the [documentation](https://fastkafka.airt.ai/docs) carefully and find out if the functionality is already covered, maybe by an individual configuration.
- Perform a [search](https://github.com/airtai/fastkafka/issues) to see if the enhancement has already been suggested. If it has, add a comment to the existing issue instead of opening a new one.
- Find out whether your idea fits with the scope and aims of the project. It's up to you to make a strong case to convince the project's developers of the merits of this feature. Keep in mind that we want features that will be useful to the majority of our users and not just a small subset. If you're just targeting a minority of users, consider writing an add-on/plugin library.
- If you are not sure or would like to discuss the enhancement with us directly, you can always contact us on [Discord](https://discord.com/invite/CJWmYpyFbc)

#### How Do I Submit a Good Enhancement Suggestion?

Enhancement suggestions are tracked as [GitHub issues](https://github.com/airtai/fastkafka/issues).

- Use a **clear and descriptive title** for the issue to identify the suggestion.
- Provide a **step-by-step description of the suggested enhancement** in as many details as possible.
- **Describe the current behavior** and **explain which behavior you expected to see instead** and why. At this point you can also tell which alternatives do not work for you.
- **Explain why this enhancement would be useful** to most FastKafka users. You may also want to point out the other projects that solved it better and which could serve as inspiration.

### Your First Code Contribution

A great way to start contributing to FastKafka would be by solving an issue tagged with "good first issue". To find a list of issues that are tagged as "good first issue" and are suitable for newcomers, please visit the following link: [Good first issues](https://github.com/airtai/fastkafka/labels/good%20first%20issue)

These issues are beginner-friendly and provide a great opportunity to get started with contributing to FastKafka. Choose an issue that interests you, follow the contribution process mentioned in [Way of working](#way-of-working) and [Before a PR](#before-a-pr), and help us make FastKafka even better!

If you have any questions or need further assistance, feel free to reach out to us. Happy coding!

## Development

### Prepare the dev environment

To start contributing to FastKafka, you first have to prepare the development environment.

#### Clone the FastKafka repository

To clone the repository, run the following command in the CLI:

```shell
git clone https://github.com/airtai/fastkafka.git
```

#### Optional: create a virtual python environment

To prevent library version clashes with your other projects, it is recommended that you create a virtual python environment for your FastKafka project by running:

```shell
python3 -m venv fastkafka-env
```

And to activate your virtual environment run:

```shell
source fastkafka-env/bin/activate
```

To learn more about virtual environments, please have a look at the [official python documentation](https://docs.python.org/3/library/venv.html#:~:text=A%20virtual%20environment%20is%20created,the%20virtual%20environment%20are%20available.)

#### Install FastKafka

To install FastKafka, navigate to the root directory of the cloned FastKafka project and run:

```shell
pip install -e ".[dev]"
```

#### Install JRE and Kafka toolkit

To be able to run tests and use all the functionalities of FastKafka, you have to have JRE and Kafka toolkit installed on your machine. To do this, you have two options:

1. Use our `fastkafka testing install-deps` CLI command which will install JRE and Kafka toolkit for you in your .local folder
OR
2. Install JRE and Kafka manually.
   To do this, please refer to [JDK and JRE installation guide](https://docs.oracle.com/javase/9/install/toc.htm) and [Apache Kafka quickstart](https://kafka.apache.org/quickstart)
   
#### Install npm

To be able to run the tests, you must have npm installed because documentation generation depends on it. To do this, you have two options:

1. Use our `fastkafka docs install_deps` CLI command which will install npm for you in your .local folder
OR
2. Install npm manually.
   To do this, please refer to [NPM installation guide](https://docs.npmjs.com/downloading-and-installing-node-js-and-npm)
   
#### Install docusaurus

To generate the documentation, you need Docusaurus. To install it, run `docusaurus/scripts/install_docusaurus_deps.sh` in the root of the FastKafka project.

#### Check if everything works

After installing FastKafka and all the necessary dependencies, run `nbdev_test` in the root of the FastKafka project. This will take a couple of minutes as it runs all the tests on the FastKafka project. If everything is set up correctly, you will get a "Success." message in your terminal; otherwise, please refer to the previous steps.

### Way of working

The development of FastKafka is done in Jupyter notebooks. Inside the `nbs` directory you will find all the source code of FastKafka, this is where you will implement your changes.

The testing, cleanup and exporting of the code is being handled by `nbdev`, please, before starting the work on FastKafka, get familiar with it by reading [nbdev documentation](https://nbdev.fast.ai/getting_started.html).

The general philosophy you should follow when writing code for FastKafka is:

- A function should implement one atomic piece of functionality, short and concise
   - Good rule of thumb: your function should usually be 5-10 lines long
- If there are more than 2 params, enforce keyword-only arguments using *
   - E.g.: `def function(param1, *, param2, param3): ...`
- Define type annotations for arguments and the return value
   - Otherwise, mypy tests will fail and a lot of easily avoidable bugs will go undetected
- After the function cell, write test cells using the assert keyword
   - Whenever you implement something, you should test that functionality immediately in the cells below
- Add Google style python docstrings when the function is implemented and tested
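The guidelines above can be sketched in one short example (the function name and its parameters are hypothetical, chosen only to illustrate the style, not taken from the FastKafka codebase):

```python
from typing import List


def average_latency(latencies_ms: List[float], *, precision: int = 2) -> float:
    """Compute the average of the given latencies.

    Args:
        latencies_ms: Measured latencies in milliseconds.
        precision: Number of decimal places to round the result to.

    Returns:
        The rounded average latency in milliseconds.
    """
    # Short, atomic, fully typed; params beyond the first are keyword-only
    return round(sum(latencies_ms) / len(latencies_ms), precision)


# Test the functionality immediately, as you would in the notebook cells below
assert average_latency([1.0, 2.0, 3.0]) == 2.0
assert average_latency([1.5, 2.5], precision=1) == 2.0
```

Note how `precision` must be passed by keyword; calling `average_latency([1.0], 2)` would raise a `TypeError`, which keeps call sites explicit.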

### Before a PR

After you have implemented your changes, you will want to open a pull request to merge them into our main branch. To make this as smooth as possible for both you and us, please do the following before opening the request (all the commands are to be run in the root of the FastKafka project):

1. Format your notebooks: `nbqa black nbs`
2. Close, shutdown, and clean the metadata from your notebooks: `nbdev_clean`
3. Export your code: `nbdev_export`
4. Run the tests: `nbdev_test`
5. Test code typing: `mypy fastkafka`
6. Test code safety with bandit: `bandit -r fastkafka`
7. Test code safety with semgrep: `semgrep --config auto -r fastkafka`

When you have done this and all the tests are passing, your code should be ready for a merge. Please commit and push your code, open a pull request, and assign it to one of the core developers. We will then review your changes and, if everything is in order, approve your merge.

## Attribution
This guide is based on the **contributing-gen**. [Make your own](https://github.com/bttger/contributing-gen)!

================================================
FILE: LICENSE
================================================
                                 Apache License
                           Version 2.0, January 2004
                        http://www.apache.org/licenses/

   TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION

   1. Definitions.

      "License" shall mean the terms and conditions for use, reproduction,
      and distribution as defined by Sections 1 through 9 of this document.

      "Licensor" shall mean the copyright owner or entity authorized by
      the copyright owner that is granting the License.

      "Legal Entity" shall mean the union of the acting entity and all
      other entities that control, are controlled by, or are under common
      control with that entity. For the purposes of this definition,
      "control" means (i) the power, direct or indirect, to cause the
      direction or management of such entity, whether by contract or
      otherwise, or (ii) ownership of fifty percent (50%) or more of the
      outstanding shares, or (iii) beneficial ownership of such entity.

      "You" (or "Your") shall mean an individual or Legal Entity
      exercising permissions granted by this License.

      "Source" form shall mean the preferred form for making modifications,
      including but not limited to software source code, documentation
      source, and configuration files.

      "Object" form shall mean any form resulting from mechanical
      transformation or translation of a Source form, including but
      not limited to compiled object code, generated documentation,
      and conversions to other media types.

      "Work" shall mean the work of authorship, whether in Source or
      Object form, made available under the License, as indicated by a
      copyright notice that is included in or attached to the work
      (an example is provided in the Appendix below).

      "Derivative Works" shall mean any work, whether in Source or Object
      form, that is based on (or derived from) the Work and for which the
      editorial revisions, annotations, elaborations, or other modifications
      represent, as a whole, an original work of authorship. For the purposes
      of this License, Derivative Works shall not include works that remain
      separable from, or merely link (or bind by name) to the interfaces of,
      the Work and Derivative Works thereof.

      "Contribution" shall mean any work of authorship, including
      the original version of the Work and any modifications or additions
      to that Work or Derivative Works thereof, that is intentionally
      submitted to Licensor for inclusion in the Work by the copyright owner
      or by an individual or Legal Entity authorized to submit on behalf of
      the copyright owner. For the purposes of this definition, "submitted"
      means any form of electronic, verbal, or written communication sent
      to the Licensor or its representatives, including but not limited to
      communication on electronic mailing lists, source code control systems,
      and issue tracking systems that are managed by, or on behalf of, the
      Licensor for the purpose of discussing and improving the Work, but
      excluding communication that is conspicuously marked or otherwise
      designated in writing by the copyright owner as "Not a Contribution."

      "Contributor" shall mean Licensor and any individual or Legal Entity
      on behalf of whom a Contribution has been received by Licensor and
      subsequently incorporated within the Work.

   2. Grant of Copyright License. Subject to the terms and conditions of
      this License, each Contributor hereby grants to You a perpetual,
      worldwide, non-exclusive, no-charge, royalty-free, irrevocable
      copyright license to reproduce, prepare Derivative Works of,
      publicly display, publicly perform, sublicense, and distribute the
      Work and such Derivative Works in Source or Object form.

   3. Grant of Patent License. Subject to the terms and conditions of
      this License, each Contributor hereby grants to You a perpetual,
      worldwide, non-exclusive, no-charge, royalty-free, irrevocable
      (except as stated in this section) patent license to make, have made,
      use, offer to sell, sell, import, and otherwise transfer the Work,
      where such license applies only to those patent claims licensable
      by such Contributor that are necessarily infringed by their
      Contribution(s) alone or by combination of their Contribution(s)
      with the Work to which such Contribution(s) was submitted. If You
      institute patent litigation against any entity (including a
      cross-claim or counterclaim in a lawsuit) alleging that the Work
      or a Contribution incorporated within the Work constitutes direct
      or contributory patent infringement, then any patent licenses
      granted to You under this License for that Work shall terminate
      as of the date such litigation is filed.

   4. Redistribution. You may reproduce and distribute copies of the
      Work or Derivative Works thereof in any medium, with or without
      modifications, and in Source or Object form, provided that You
      meet the following conditions:

      (a) You must give any other recipients of the Work or
          Derivative Works a copy of this License; and

      (b) You must cause any modified files to carry prominent notices
          stating that You changed the files; and

      (c) You must retain, in the Source form of any Derivative Works
          that You distribute, all copyright, patent, trademark, and
          attribution notices from the Source form of the Work,
          excluding those notices that do not pertain to any part of
          the Derivative Works; and

      (d) If the Work includes a "NOTICE" text file as part of its
          distribution, then any Derivative Works that You distribute must
          include a readable copy of the attribution notices contained
          within such NOTICE file, excluding those notices that do not
          pertain to any part of the Derivative Works, in at least one
          of the following places: within a NOTICE text file distributed
          as part of the Derivative Works; within the Source form or
          documentation, if provided along with the Derivative Works; or,
          within a display generated by the Derivative Works, if and
          wherever such third-party notices normally appear. The contents
          of the NOTICE file are for informational purposes only and
          do not modify the License. You may add Your own attribution
          notices within Derivative Works that You distribute, alongside
          or as an addendum to the NOTICE text from the Work, provided
          that such additional attribution notices cannot be construed
          as modifying the License.

      You may add Your own copyright statement to Your modifications and
      may provide additional or different license terms and conditions
      for use, reproduction, or distribution of Your modifications, or
      for any such Derivative Works as a whole, provided Your use,
      reproduction, and distribution of the Work otherwise complies with
      the conditions stated in this License.

   5. Submission of Contributions. Unless You explicitly state otherwise,
      any Contribution intentionally submitted for inclusion in the Work
      by You to the Licensor shall be under the terms and conditions of
      this License, without any additional terms or conditions.
      Notwithstanding the above, nothing herein shall supersede or modify
      the terms of any separate license agreement you may have executed
      with Licensor regarding such Contributions.

   6. Trademarks. This License does not grant permission to use the trade
      names, trademarks, service marks, or product names of the Licensor,
      except as required for reasonable and customary use in describing the
      origin of the Work and reproducing the content of the NOTICE file.

   7. Disclaimer of Warranty. Unless required by applicable law or
      agreed to in writing, Licensor provides the Work (and each
      Contributor provides its Contributions) on an "AS IS" BASIS,
      WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
      implied, including, without limitation, any warranties or conditions
      of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A
      PARTICULAR PURPOSE. You are solely responsible for determining the
      appropriateness of using or redistributing the Work and assume any
      risks associated with Your exercise of permissions under this License.

   8. Limitation of Liability. In no event and under no legal theory,
      whether in tort (including negligence), contract, or otherwise,
      unless required by applicable law (such as deliberate and grossly
      negligent acts) or agreed to in writing, shall any Contributor be
      liable to You for damages, including any direct, indirect, special,
      incidental, or consequential damages of any character arising as a
      result of this License or out of the use or inability to use the
      Work (including but not limited to damages for loss of goodwill,
      work stoppage, computer failure or malfunction, or any and all
      other commercial damages or losses), even if such Contributor
      has been advised of the possibility of such damages.

   9. Accepting Warranty or Additional Liability. While redistributing
      the Work or Derivative Works thereof, You may choose to offer,
      and charge a fee for, acceptance of support, warranty, indemnity,
      or other liability obligations and/or rights consistent with this
      License. However, in accepting such obligations, You may act only
      on Your own behalf and on Your sole responsibility, not on behalf
      of any other Contributor, and only if You agree to indemnify,
      defend, and hold each Contributor harmless for any liability
      incurred by, or claims asserted against, such Contributor by reason
      of your accepting any such warranty or additional liability.

   END OF TERMS AND CONDITIONS

   APPENDIX: How to apply the Apache License to your work.

      To apply the Apache License to your work, attach the following
      boilerplate notice, with the fields enclosed by brackets "[]"
      replaced with your own identifying information. (Don't include
      the brackets!)  The text should be enclosed in the appropriate
      comment syntax for the file format. We also recommend that a
      file or class name and description of purpose be included on the
      same "printed page" as the copyright notice for easier
      identification within third-party archives.

   Copyright [yyyy] [name of copyright owner]

   Licensed under the Apache License, Version 2.0 (the "License");
   you may not use this file except in compliance with the License.
   You may obtain a copy of the License at

       http://www.apache.org/licenses/LICENSE-2.0

   Unless required by applicable law or agreed to in writing, software
   distributed under the License is distributed on an "AS IS" BASIS,
   WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
   See the License for the specific language governing permissions and
   limitations under the License.


================================================
FILE: MANIFEST.in
================================================
include settings.ini
include LICENSE
include CONTRIBUTING.md
include README.md
recursive-exclude * __pycache__


================================================
FILE: README.md
================================================
# FastKafka

<!-- WARNING: THIS FILE WAS AUTOGENERATED! DO NOT EDIT! -->

<b>Effortless Kafka integration for your web services</b>

## Deprecation notice

This project is superseded by
[FastStream](https://github.com/airtai/faststream).

FastStream is a new package based on the ideas and experiences gained
from [FastKafka](https://github.com/airtai/fastkafka) and
[Propan](https://github.com/lancetnik/propan). By joining forces, we
took the best from both packages and created a unified way to
write services capable of processing streamed data regardless of the
underlying protocol.

We’ll continue to maintain the FastKafka package, but new development
will happen in [FastStream](https://github.com/airtai/faststream). If
you are starting a new service,
[FastStream](https://github.com/airtai/faststream) is the recommended
way to do it.

------------------------------------------------------------------------

![PyPI](https://img.shields.io/pypi/v/fastkafka.png) ![PyPI -
Downloads](https://img.shields.io/pypi/dm/fastkafka.png) ![PyPI - Python
Version](https://img.shields.io/pypi/pyversions/fastkafka.png)

![GitHub Workflow
Status](https://img.shields.io/github/actions/workflow/status/airtai/fastkafka/test.yaml)
![CodeQL](https://github.com/airtai/fastkafka//actions/workflows/codeql.yml/badge.svg)
![Dependency
Review](https://github.com/airtai/fastkafka//actions/workflows/dependency-review.yml/badge.svg)

![GitHub](https://img.shields.io/github/license/airtai/fastkafka.png)

------------------------------------------------------------------------

[FastKafka](https://fastkafka.airt.ai/) is a powerful and easy-to-use
Python library for building asynchronous services that interact with
Kafka topics. Built on top of [Pydantic](https://docs.pydantic.dev/),
[AIOKafka](https://github.com/aio-libs/aiokafka) and
[AsyncAPI](https://www.asyncapi.com/), FastKafka simplifies the process
of writing producers and consumers for Kafka topics, handling all the
parsing, networking, task scheduling and data generation automatically.
With FastKafka, you can quickly prototype and develop high-performance
Kafka-based services with minimal code, making it an ideal choice for
developers looking to streamline their workflow and accelerate their
projects.

------------------------------------------------------------------------

#### ⭐⭐⭐ Stay in touch ⭐⭐⭐

Please show your support and stay in touch by:

- giving our [GitHub repository](https://github.com/airtai/fastkafka/) a
  star, and

- joining our [Discord server](https://discord.gg/CJWmYpyFbc).

Your support helps us to stay in touch with you and encourages us to
continue developing and improving the library. Thank you for your
support!

------------------------------------------------------------------------

#### 🐝🐝🐝 We were busy lately 🐝🐝🐝

![Activity](https://repobeats.axiom.co/api/embed/21f36049093d5eb8e5fdad18c3c5d8df5428ca30.svg "Repobeats analytics image")

## Install

FastKafka works on Windows, macOS, Linux, and most Unix-style operating
systems. You can install the base version of FastKafka with `pip` as usual:

``` sh
pip install fastkafka
```

To install FastKafka with testing features, use:

``` sh
pip install fastkafka[test]
```

To install FastKafka with AsyncAPI docs support, use:

``` sh
pip install fastkafka[docs]
```

To install FastKafka with all the features, use:

``` sh
pip install fastkafka[test,docs]
```

## Tutorial

You can start an interactive tutorial in Google Colab by clicking the
button below:

<a href="https://colab.research.google.com/github/airtai/fastkafka/blob/main/nbs/index.ipynb" target="_blank">
<img src="https://colab.research.google.com/assets/colab-badge.svg" alt="Open in Colab" />
</a>

## Writing server code

To demonstrate the simplicity of using FastKafka’s `@produces` and
`@consumes` decorators, we will focus on a simple app.

The app will consume JSON messages containing positive floats from one topic, log
them, and then produce incremented values to another topic.

### Messages

FastKafka uses [Pydantic](https://docs.pydantic.dev/) to parse input
JSON-encoded data into Python objects, making it easy to work with
structured data in your Kafka-based applications. Pydantic’s
[`BaseModel`](https://docs.pydantic.dev/usage/models/) class allows you
to define messages using a declarative syntax, making it easy to specify
the fields and types of your messages.

This example defines one `Data` message class. This class models the
consumed and produced data in our demo app; it contains one
`NonNegativeFloat` field, `data`, that will be logged and “processed”
before being produced to another topic.

This message class will be used to parse and validate incoming data in
Kafka consumers and producers.

``` python
from pydantic import BaseModel, Field, NonNegativeFloat


class Data(BaseModel):
    data: NonNegativeFloat = Field(
        ..., example=0.5, description="Float data example"
    )
```
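To see this validation outside of Kafka, the snippet below is a minimal sketch (assuming only that Pydantic is installed) of the parsing step FastKafka performs on raw message bytes; the JSON payloads are illustrative and the `Field` metadata is omitted for brevity:

``` python
from pydantic import BaseModel, NonNegativeFloat, ValidationError


class Data(BaseModel):
    data: NonNegativeFloat  # simplified: Field metadata omitted


# Valid JSON bytes parse into a typed Data instance.
msg = Data.parse_raw(b'{"data": 0.5}')
assert msg.data == 0.5

# A negative value violates NonNegativeFloat and raises ValidationError.
try:
    Data.parse_raw(b'{"data": -1.0}')
except ValidationError:
    print("negative value rejected")
```

This mirrors what happens automatically inside a consumer decorated with `@kafka_app.consumes`.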

### Application

This example shows how to initialize a FastKafka application.

It starts by defining a dictionary called `kafka_brokers`, which
contains two entries, `"localhost"` and `"production"`, specifying the
local development and production Kafka brokers. Each entry specifies the
URL, port, and other details of a Kafka broker. This dictionary is used
both for generating the documentation and later for running the actual
server against one of the given Kafka brokers.

Next, an object of the
[`FastKafka`](https://airtai.github.io/fastkafka/docs/api/fastkafka#fastkafka.FastKafka)
class is initialized with the minimum set of arguments:

- `kafka_brokers`: a dictionary used for generation of documentation

We will also import and create a logger so that we can log the incoming
data in our consuming function.

``` python
from logging import getLogger
from fastkafka import FastKafka

logger = getLogger("Demo Kafka app")

kafka_brokers = {
    "localhost": {
        "url": "localhost",
        "description": "local development kafka broker",
        "port": 9092,
    },
    "production": {
        "url": "kafka.airt.ai",
        "description": "production kafka broker",
        "port": 9092,
        "protocol": "kafka-secure",
        "security": {"type": "plain"},
    },
}

kafka_app = FastKafka(
    title="Demo Kafka app",
    kafka_brokers=kafka_brokers,
)
```
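For intuition, each broker entry boils down to a single `host:port` address that the app uses as its `bootstrap_servers` value when that broker is selected at runtime. The helper below is hypothetical (it is not part of the FastKafka API) and only illustrates the mapping:

``` python
# Hypothetical helper (not part of the FastKafka API): shows how a
# kafka_brokers entry reduces to the bootstrap_servers address seen in logs.
def bootstrap_servers(brokers: dict, broker_name: str) -> str:
    broker = brokers[broker_name]
    return f"{broker['url']}:{broker['port']}"


kafka_brokers = {
    "localhost": {"url": "localhost", "port": 9092},
    "production": {"url": "kafka.airt.ai", "port": 9092},
}

print(bootstrap_servers(kafka_brokers, "localhost"))  # localhost:9092
```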

### Function decorators

FastKafka provides convenient function decorators `@kafka_app.consumes`
and `@kafka_app.produces` to allow you to delegate the actual process of

- consuming and producing data to Kafka, and

- decoding and encoding JSON messages

from user-defined functions to the framework. The FastKafka framework
delegates these jobs to the AIOKafka and Pydantic libraries.

These decorators make it easy to specify the processing logic for your
Kafka consumers and producers, allowing you to focus on the core
business logic of your application without worrying about the underlying
Kafka integration.

The following example shows how to use the `@kafka_app.consumes` and
`@kafka_app.produces` decorators in a FastKafka application:

- The `@kafka_app.consumes` decorator is applied to the `on_input_data`
  function, which specifies that this function should be called whenever
  a message is received on the “input_data” Kafka topic. The
  `on_input_data` function takes a single argument, which is expected to
  be an instance of the `Data` message class. Specifying the type of the
  single argument instructs Pydantic to use `Data.parse_raw()`
  on the consumed message before passing it to the user-defined function
  `on_input_data`.

- The `@produces` decorator is applied to the `to_output_data` function,
  which specifies that this function should produce a message to the
  “output_data” Kafka topic whenever it is called. The `to_output_data`
  function takes a single float argument, `data`, increments it, and
  returns it wrapped in a `Data` object. The framework calls
  `Data.json().encode("utf-8")` on the returned value and produces
  the result to the specified topic.

``` python
@kafka_app.consumes(topic="input_data", auto_offset_reset="latest")
async def on_input_data(msg: Data):
    logger.info(f"Got data: {msg.data}")
    await to_output_data(msg.data)


@kafka_app.produces(topic="output_data")
async def to_output_data(data: float) -> Data:
    processed_data = Data(data=data+1.0)
    return processed_data
```

## Testing the service

The service can be tested using a
[`Tester`](https://airtai.github.io/fastkafka/docs/api/fastkafka/testing/Tester#fastkafka.testing.Tester)
instance, which internally starts an in-memory implementation of a Kafka
broker.

The Tester redirects your `@consumes`- and `@produces`-decorated
functions to the in-memory Kafka broker so that you can quickly test
your app without needing a running Kafka broker and all its dependencies.

``` python
from fastkafka.testing import Tester

msg = Data(
    data=0.1,
)

# Start Tester app and create InMemory Kafka broker for testing
async with Tester(kafka_app) as tester:
    # Send Data message to input_data topic
    await tester.to_input_data(msg)

    # Assert that the kafka_app responded with incremented data in output_data topic
    await tester.awaited_mocks.on_output_data.assert_awaited_with(
        Data(data=1.1), timeout=2
    )
```

    [INFO] fastkafka._testing.in_memory_broker: InMemoryBroker._start() called
    [INFO] fastkafka._testing.in_memory_broker: InMemoryBroker._patch_consumers_and_producers(): Patching consumers and producers!
    [INFO] fastkafka._testing.in_memory_broker: InMemoryBroker starting
    [INFO] fastkafka._application.app: _create_producer() : created producer using the config: '{'bootstrap_servers': 'localhost:9092'}'
    [INFO] fastkafka._testing.in_memory_broker: AIOKafkaProducer patched start() called()
    [INFO] fastkafka._application.app: _create_producer() : created producer using the config: '{'bootstrap_servers': 'localhost:9092'}'
    [INFO] fastkafka._testing.in_memory_broker: AIOKafkaProducer patched start() called()
    [INFO] fastkafka._components.aiokafka_consumer_loop: aiokafka_consumer_loop() starting...
    [INFO] fastkafka._components.aiokafka_consumer_loop: aiokafka_consumer_loop(): Consumer created using the following parameters: {'bootstrap_servers': 'localhost:9092', 'auto_offset_reset': 'latest', 'max_poll_records': 100}
    [INFO] fastkafka._testing.in_memory_broker: AIOKafkaConsumer patched start() called()
    [INFO] fastkafka._components.aiokafka_consumer_loop: aiokafka_consumer_loop(): Consumer started.
    [INFO] fastkafka._testing.in_memory_broker: AIOKafkaConsumer patched subscribe() called
    [INFO] fastkafka._testing.in_memory_broker: AIOKafkaConsumer.subscribe(), subscribing to: ['input_data']
    [INFO] fastkafka._components.aiokafka_consumer_loop: aiokafka_consumer_loop(): Consumer subscribed.
    [INFO] fastkafka._components.aiokafka_consumer_loop: aiokafka_consumer_loop() starting...
    [INFO] fastkafka._components.aiokafka_consumer_loop: aiokafka_consumer_loop(): Consumer created using the following parameters: {'bootstrap_servers': 'localhost:9092', 'auto_offset_reset': 'earliest', 'max_poll_records': 100}
    [INFO] fastkafka._testing.in_memory_broker: AIOKafkaConsumer patched start() called()
    [INFO] fastkafka._components.aiokafka_consumer_loop: aiokafka_consumer_loop(): Consumer started.
    [INFO] fastkafka._testing.in_memory_broker: AIOKafkaConsumer patched subscribe() called
    [INFO] fastkafka._testing.in_memory_broker: AIOKafkaConsumer.subscribe(), subscribing to: ['output_data']
    [INFO] fastkafka._components.aiokafka_consumer_loop: aiokafka_consumer_loop(): Consumer subscribed.
    [INFO] Demo Kafka app: Got data: 0.1
    [INFO] fastkafka._testing.in_memory_broker: AIOKafkaConsumer patched stop() called
    [INFO] fastkafka._components.aiokafka_consumer_loop: aiokafka_consumer_loop(): Consumer stopped.
    [INFO] fastkafka._components.aiokafka_consumer_loop: aiokafka_consumer_loop() finished.
    [INFO] fastkafka._testing.in_memory_broker: AIOKafkaProducer patched stop() called
    [INFO] fastkafka._testing.in_memory_broker: AIOKafkaConsumer patched stop() called
    [INFO] fastkafka._components.aiokafka_consumer_loop: aiokafka_consumer_loop(): Consumer stopped.
    [INFO] fastkafka._components.aiokafka_consumer_loop: aiokafka_consumer_loop() finished.
    [INFO] fastkafka._testing.in_memory_broker: AIOKafkaProducer patched stop() called
    [INFO] fastkafka._testing.in_memory_broker: InMemoryBroker._stop() called
    [INFO] fastkafka._testing.in_memory_broker: InMemoryBroker stopping

### Recap

We have created a simple FastKafka application. The app consumes `Data`
messages from the `input_data` topic, logs them, and produces the
incremented data to the `output_data` topic.

To test the app we have:

1.  Created the app

2.  Started our Tester class, which mirrors the developed app’s topics
    for testing purposes

3.  Sent a `Data` message to the `input_data` topic

4.  Asserted that the developed service reacted to the `Data`
    message

## Running the service

The service can be started using the built-in `fastkafka run` CLI
command. Before we can do that, we will concatenate the code snippets
from above and save them in a file named `application.py`:

``` python
# content of the "application.py" file

from pydantic import BaseModel, Field, NonNegativeFloat

from fastkafka import FastKafka
from fastkafka._components.logger import get_logger

logger = get_logger(__name__)

class Data(BaseModel):
    data: NonNegativeFloat = Field(
        ..., example=0.5, description="Float data example"
    )

kafka_brokers = {
    "localhost": {
        "url": "localhost",
        "description": "local development kafka broker",
        "port": 9092,
    },
    "production": {
        "url": "kafka.airt.ai",
        "description": "production kafka broker",
        "port": 9092,
        "protocol": "kafka-secure",
        "security": {"type": "plain"},
    },
}

kafka_app = FastKafka(
    title="Demo Kafka app",
    kafka_brokers=kafka_brokers,
)

@kafka_app.consumes(topic="input_data", auto_offset_reset="latest")
async def on_input_data(msg: Data):
    logger.info(f"Got data: {msg.data}")
    await to_output_data(msg.data)


@kafka_app.produces(topic="output_data")
async def to_output_data(data: float) -> Data:
    processed_data = Data(data=data+1.0)
    return processed_data
```

To run the service, use the FastKafka CLI command and pass it the module
(in this case, the file where the app implementation is located) and the
app symbol.

``` sh
fastkafka run --num-workers=1 --kafka-broker localhost application:kafka_app
```

After running the command, you should see the following output in your
command line:

    [1504]: 23-05-31 11:36:45.874 [INFO] fastkafka._application.app: set_kafka_broker() : Setting bootstrap_servers value to 'localhost:9092'
    [1504]: 23-05-31 11:36:45.875 [INFO] fastkafka._application.app: _create_producer() : created producer using the config: '{'bootstrap_servers': 'localhost:9092'}'
    [1504]: 23-05-31 11:36:45.937 [INFO] fastkafka._components.aiokafka_consumer_loop: aiokafka_consumer_loop() starting...
    [1504]: 23-05-31 11:36:45.937 [INFO] fastkafka._components.aiokafka_consumer_loop: aiokafka_consumer_loop(): Consumer created using the following parameters: {'bootstrap_servers': 'localhost:9092', 'auto_offset_reset': 'latest', 'max_poll_records': 100}
    [1504]: 23-05-31 11:36:45.956 [INFO] fastkafka._components.aiokafka_consumer_loop: aiokafka_consumer_loop(): Consumer started.
    [1504]: 23-05-31 11:36:45.956 [INFO] aiokafka.consumer.subscription_state: Updating subscribed topics to: frozenset({'input_data'})
    [1504]: 23-05-31 11:36:45.956 [INFO] aiokafka.consumer.consumer: Subscribed to topic(s): {'input_data'}
    [1504]: 23-05-31 11:36:45.956 [INFO] fastkafka._components.aiokafka_consumer_loop: aiokafka_consumer_loop(): Consumer subscribed.
    [1506]: 23-05-31 11:36:45.993 [INFO] fastkafka._application.app: set_kafka_broker() : Setting bootstrap_servers value to 'localhost:9092'
    [1506]: 23-05-31 11:36:45.994 [INFO] fastkafka._application.app: _create_producer() : created producer using the config: '{'bootstrap_servers': 'localhost:9092'}'
    [1506]: 23-05-31 11:36:46.014 [INFO] fastkafka._components.aiokafka_consumer_loop: aiokafka_consumer_loop() starting...
    [1506]: 23-05-31 11:36:46.015 [INFO] fastkafka._components.aiokafka_consumer_loop: aiokafka_consumer_loop(): Consumer created using the following parameters: {'bootstrap_servers': 'localhost:9092', 'auto_offset_reset': 'latest', 'max_poll_records': 100}
    [1506]: 23-05-31 11:36:46.040 [INFO] fastkafka._components.aiokafka_consumer_loop: aiokafka_consumer_loop(): Consumer started.
    [1506]: 23-05-31 11:36:46.042 [INFO] aiokafka.consumer.subscription_state: Updating subscribed topics to: frozenset({'input_data'})
    [1506]: 23-05-31 11:36:46.043 [INFO] aiokafka.consumer.consumer: Subscribed to topic(s): {'input_data'}
    [1506]: 23-05-31 11:36:46.043 [INFO] fastkafka._components.aiokafka_consumer_loop: aiokafka_consumer_loop(): Consumer subscribed.
    [1506]: 23-05-31 11:36:46.068 [ERROR] aiokafka.cluster: Topic input_data not found in cluster metadata
    [1506]: 23-05-31 11:36:46.070 [INFO] aiokafka.consumer.group_coordinator: Metadata for topic has changed from {} to {'input_data': 0}. 
    [1504]: 23-05-31 11:36:46.131 [WARNING] aiokafka.cluster: Topic input_data is not available during auto-create initialization
    [1504]: 23-05-31 11:36:46.132 [INFO] aiokafka.consumer.group_coordinator: Metadata for topic has changed from {} to {'input_data': 0}. 
    [1506]: 23-05-31 11:37:00.237 [ERROR] aiokafka: Unable connect to node with id 0: [Errno 111] Connect call failed ('172.28.0.12', 9092)
    [1506]: 23-05-31 11:37:00.237 [ERROR] aiokafka: Unable to update metadata from [0]
    [1504]: 23-05-31 11:37:00.238 [ERROR] aiokafka: Unable connect to node with id 0: [Errno 111] Connect call failed ('172.28.0.12', 9092)
    [1504]: 23-05-31 11:37:00.238 [ERROR] aiokafka: Unable to update metadata from [0]
    [1506]: 23-05-31 11:37:00.294 [INFO] fastkafka._components.aiokafka_consumer_loop: aiokafka_consumer_loop(): Consumer stopped.
    [1506]: 23-05-31 11:37:00.294 [INFO] fastkafka._components.aiokafka_consumer_loop: aiokafka_consumer_loop() finished.
    Starting process cleanup, this may take a few seconds...
    23-05-31 11:37:00.345 [INFO] fastkafka._server: terminate_asyncio_process(): Terminating the process 1504...
    23-05-31 11:37:00.345 [INFO] fastkafka._server: terminate_asyncio_process(): Terminating the process 1506...
    [1504]: 23-05-31 11:37:00.347 [INFO] fastkafka._components.aiokafka_consumer_loop: aiokafka_consumer_loop(): Consumer stopped.
    [1504]: 23-05-31 11:37:00.347 [INFO] fastkafka._components.aiokafka_consumer_loop: aiokafka_consumer_loop() finished.
    23-05-31 11:37:00.607 [INFO] fastkafka._server: terminate_asyncio_process(): Process 1506 was already terminated.
    23-05-31 11:37:00.822 [INFO] fastkafka._server: terminate_asyncio_process(): Process 1504 was already terminated.

## Documentation

The Kafka app comes with built-in documentation generation using the
[AsyncAPI HTML generator](https://www.asyncapi.com/tools/generator).

AsyncAPI requires Node.js to be installed, and we provide the following
convenience command to install the dependencies:

``` sh
fastkafka docs install_deps
```

    23-05-31 11:38:24.128 [INFO] fastkafka._components.docs_dependencies: AsyncAPI generator installed

To generate the documentation, you just need to run the
following command:

``` sh
fastkafka docs generate application:kafka_app
```

    23-05-31 11:38:25.113 [INFO] fastkafka._components.asyncapi: Old async specifications at '/content/asyncapi/spec/asyncapi.yml' does not exist.
    23-05-31 11:38:25.118 [INFO] fastkafka._components.asyncapi: New async specifications generated at: '/content/asyncapi/spec/asyncapi.yml'
    23-05-31 11:38:43.455 [INFO] fastkafka._components.asyncapi: Async docs generated at 'asyncapi/docs'
    23-05-31 11:38:43.455 [INFO] fastkafka._components.asyncapi: Output of '$ npx -y -p @asyncapi/generator ag asyncapi/spec/asyncapi.yml @asyncapi/html-template -o asyncapi/docs --force-write'

    Done! ✨
    Check out your shiny new generated files at /content/asyncapi/docs.

This will generate an *asyncapi* folder at the relative path, where all
your documentation will be saved. You can check its contents with:

``` sh
ls -l asyncapi
```

    total 8
    drwxr-xr-x 4 root root 4096 May 31 11:38 docs
    drwxr-xr-x 2 root root 4096 May 31 11:38 spec

In the *docs* folder, you will find the servable static HTML files of
your documentation. They can also be served using our
`fastkafka docs serve` CLI command (more on that in our guides).

In the *spec* folder, you will find an *asyncapi.yml* file containing
the AsyncAPI specification of your application.

We can locally preview the generated documentation by running the
following command:

``` sh
fastkafka docs serve application:kafka_app
```

    23-05-31 11:38:45.250 [INFO] fastkafka._components.asyncapi: New async specifications generated at: '/content/asyncapi/spec/asyncapi.yml'
    23-05-31 11:39:04.410 [INFO] fastkafka._components.asyncapi: Async docs generated at 'asyncapi/docs'
    23-05-31 11:39:04.411 [INFO] fastkafka._components.asyncapi: Output of '$ npx -y -p @asyncapi/generator ag asyncapi/spec/asyncapi.yml @asyncapi/html-template -o asyncapi/docs --force-write'

    Done! ✨
    Check out your shiny new generated files at /content/asyncapi/docs.


    Serving documentation on http://127.0.0.1:8000
    127.0.0.1 - - [31/May/2023 11:39:14] "GET / HTTP/1.1" 200 -
    127.0.0.1 - - [31/May/2023 11:39:14] "GET /css/global.min.css HTTP/1.1" 200 -
    127.0.0.1 - - [31/May/2023 11:39:14] "GET /js/asyncapi-ui.min.js HTTP/1.1" 200 -
    127.0.0.1 - - [31/May/2023 11:39:14] "GET /css/asyncapi.min.css HTTP/1.1" 200 -
    Interupting serving of documentation and cleaning up...

From the parameters passed to the application constructor, we get the
documentation below:

``` python
from fastkafka import FastKafka

kafka_brokers = {
    "localhost": {
        "url": "localhost",
        "description": "local development kafka broker",
        "port": 9092,
    },
    "production": {
        "url": "kafka.airt.ai",
        "description": "production kafka broker",
        "port": 9092,
        "protocol": "kafka-secure",
        "security": {"type": "plain"},
    },
}

kafka_app = FastKafka(
    title="Demo Kafka app",
    kafka_brokers=kafka_brokers,
)
```

![Kafka_servers](https://raw.githubusercontent.com/airtai/fastkafka/main/nbs/images/screenshot-kafka-servers.png)

The following documentation snippet is for the consumer specified in
the code above:

![Kafka_consumer](https://raw.githubusercontent.com/airtai/fastkafka/main/nbs/images/screenshot-kafka-consumer.png)

The following documentation snippet is for the producer specified in
the code above:

![Kafka_producer](https://raw.githubusercontent.com/airtai/fastkafka/main/nbs/images/screenshot-kafka-producer.png)

Finally, all messages defined as subclasses of *BaseModel* are
documented as well:

![Kafka_messages](https://raw.githubusercontent.com/airtai/fastkafka/main/nbs/images/screenshot-kafka-messages.png)

## License

FastKafka is licensed under the Apache License 2.0.

A permissive license whose main conditions require preservation of
copyright and license notices. Contributors provide an express grant of
patent rights. Licensed works, modifications, and larger works may be
distributed under different terms and without source code.

The full text of the license can be found
[here](https://raw.githubusercontent.com/airtai/fastkafka/main/LICENSE).


================================================
FILE: docker/.semgrepignore
================================================
dev.yml



================================================
FILE: docker/dev.yml
================================================
version: "3"
services:
    fastkafka-devel:  #nosemgrep
        image: ghcr.io/airtai/nbdev-mkdocs
        hostname: $DOCKER_COMPOSE_PROJECT-devel
        container_name: $DOCKER_COMPOSE_PROJECT-devel
        ports:
            - "${PORT_PREFIX}8888:8888"
            - "${PORT_PREFIX}4000:4000"
            - "${PORT_PREFIX}6006:6006"
        volumes:
            - $AIRT_PROJECT:/work/fastkafka
            - /etc/passwd:/etc/passwd
            - /etc/group:/etc/group
            - /etc/shadow:/etc/shadow
            - $HOME/.ssh:$HOME/.ssh
            - $HOME/.gitconfig:/root/.gitconfig
        environment:
            USER: $USER
            USERNAME: $USERNAME
            PRESERVE_ENVS: $PRESERVE_ENVS
            OPENAI_API_KEY: $OPENAI_API_KEY


================================================
FILE: docusaurus/babel.config.js
================================================
module.exports = {
  presets: [require.resolve('@docusaurus/core/lib/babel/preset')],
};


================================================
FILE: docusaurus/docusaurus.config.js
================================================
// @ts-check
// Note: type annotations allow type checking and IDEs autocompletion

const lightCodeTheme = require('prism-react-renderer/themes/github');
const darkCodeTheme = require('prism-react-renderer/themes/dracula');

module.exports = async function configCreatorAsync() {
  /** @type {import('@docusaurus/types').Config} */
  const config = {
    title: 'FastKafka',
    tagline: 'Effortless Kafka integration for web services',
    customFields: {
      description:
        'Powerful and easy-to-use open-source framework for building asynchronous web services that interact with Kafka.',
    },
    favicon: 'img/AIRT_icon_blue.svg',

    // Set the production url of your site here
    url: 'https://fastkafka.airt.ai/',
    // Set the /<baseUrl>/ pathname under which your site is served
    // For GitHub pages deployment, it is often '/<projectName>/'
    baseUrl: '/',

    // GitHub pages deployment config.
    // If you aren't using GitHub pages, you don't need these.
    organizationName: 'airt', // Usually your GitHub org/user name.
    projectName: 'fastkafka', // Usually your repo name.
    trailingSlash: true,
    onBrokenLinks: 'warn',
    onBrokenMarkdownLinks: 'warn',

    // Even if you don't use internalization, you can use this field to set useful
    // metadata like html lang. For example, if your site is Chinese, you may want
    // to replace "en" with "zh-Hans".
    i18n: {
      defaultLocale: 'en',
      locales: ['en'],
    },

    presets: [
      [
        'classic',
        /** @type {import('@docusaurus/preset-classic').Options} */
        ({
          docs: {
            sidebarPath: require.resolve('./sidebars.js'),
            // Please change this to your repo.
            // Remove this to remove the "edit this page" links.
  //           editUrl:
  //             'https://github.com/facebook/docusaurus/tree/main/packages/create-docusaurus/templates/shared/',
            exclude: [
              // '**/_*.{js,jsx,ts,tsx,md,mdx}',
              // '**/_*/**',
              '**/*.test.{js,jsx,ts,tsx}',
              '**/__tests__/**',
            ],
            versions: {
              current: {
                label: `dev 🚧`,
              },
            },
          },
          blog: {
            showReadingTime: true,
            // Please change this to your repo.
            // Remove this to remove the "edit this page" links.
  //           editUrl:
  //             'https://github.com/facebook/docusaurus/tree/main/packages/create-docusaurus/templates/shared/',
          },
          theme: {
            customCss: require.resolve('./src/css/custom.css'),
          },
          gtag: {
            trackingID: 'G-WLMWPELHMB',
          },
        }),
      ],
    ],

    themeConfig:
      /** @type {import('@docusaurus/preset-classic').ThemeConfig} */
      ({
        algolia: {
          appId: 'EHYNSIUGMY',
          // Public API key: it is safe to commit it
          // nosemgrep
          apiKey: '2680cd13947844a00a5a657b959e6211',
          indexName: 'fastkafka-airt',
        },
        // Replace with your project's social card
        image: 'https://opengraph.githubassets.com/1671805243.560327/airtai/fastkafka',
        // colorMode: {
        //   disableSwitch: true,
        // },
        navbar: {
          title: 'airt',
          logo: {
            alt: 'airt logo',
            src: 'img/AIRT_icon_blue.svg',
            href: 'https://airt.ai',
            target: '_blank'
          },
          items: [
            {to: '/', html: '<div><img src="/img/home-icon.svg"><p>FastKafka</p></div>', position: 'right', className: 'fastkafka-home'},
            {
              type: 'docsVersionDropdown',
              position: 'right',
              dropdownActiveClassDisabled: true,
              // dropdownItemsAfter: [{to: '/versions', label: 'All versions'}],
            },
            {
              type: 'docSidebar',
              sidebarId: 'tutorialSidebar',
              position: 'right',
              label: 'Docs',
            },
  //           {to: '/blog', label: 'Blog', position: 'left'},
            {
              type: 'html',
              position: 'right',
              className: 'github-stars',
              value: '<iframe src="https://ghbtns.com/github-btn.html?user=airtai&repo=fastkafka&type=star&count=true&size=large" frameborder="0" scrolling="0" width="170" height="30" title="GitHub"></iframe>',
            },
            {
              href: 'https://discord.gg/CJWmYpyFbc',
              position: 'right',
              className: "header-discord-link",
              "aria-label": "Discord Link",
            },
            {to: '/', html: '<div><img src="/img/home-icon.svg"></div>', position: 'right', className: 'fastkafka-home-mobile'},
          ],
        },
        footer: {
          style: 'dark',
          links: [
            {
              title: 'COMMUNITY',
              items: [
                {
                  html: `
                      <a class="footer-discord-link" href="https://discord.gg/CJWmYpyFbc" target="_blank" rel="noreferrer noopener" aria-label="Discord link"></a>
                    `,
                },
                {
                  html: `
                      <a class="footer-github-link" href="https://github.com/airtai" target="_blank" rel="noreferrer noopener" aria-label="Github link"></a>
                    `,
                },
                {
                  html: `
                      <a class="footer-twitter-link" href="https://twitter.com/airt_AI" target="_blank" rel="noreferrer noopener" aria-label="Twitter link"></a>
                    `,
                },
                {
                  html: `
                      <a class="footer-facebook-link" href="https://www.facebook.com/airt.ai.api/" target="_blank" rel="noreferrer noopener" aria-label="Facebook link"></a>
                    `,
                },
                {
                  html: `
                      <a class="footer-linkedin-link" href="https://www.linkedin.com/company/airt-ai/" target="_blank" rel="noreferrer noopener" aria-label="LinkedIn link"></a>
                    `,
                },
              ],
            },
            {
              title: 'EXPLORE DOCS',
              items: [
                {
                  label: 'Get Started',
                  to: '/docs',
                },
              ],
            },
            {
              title: 'EXPLORE MORE',
              items: [
                {
                  label: 'News',
                  to: 'https://airt.ai/news',
                },
                {
                  label: 'About Us',
                  to: 'https://airt.ai/about-us',
                },
                {
                  label: 'Company information',
                  to: 'https://airt.ai/company-information',
                },
                // {
                //   label: 'Contact',
                //   to: 'contact',
                // },
                
              ],
            },
          ],
          copyright: `© 2023 airt. All rights reserved.`,
        },
        // prism: {
        //   theme: lightCodeTheme,
        //   darkTheme: darkCodeTheme,
        // },
        prism: {
          theme: ( await import('./src/utils/prismLight.mjs')).default,
          darkTheme: ( await import('./src/utils/prismDark.mjs')).default,
        },
      }),
  };
  return config
};


================================================
FILE: docusaurus/package.json
================================================
{
  "name": "fastkafka",
  "version": "0.0.0",
  "private": true,
  "scripts": {
    "docusaurus": "docusaurus",
    "start": "docusaurus start --host 0.0.0.0 --port 4000",
    "build": "docusaurus build",
    "swizzle": "docusaurus swizzle",
    "deploy": "docusaurus deploy",
    "clear": "docusaurus clear",
    "serve": "docusaurus serve  --host 0.0.0.0 --port 4000",
    "write-translations": "docusaurus write-translations",
    "write-heading-ids": "docusaurus write-heading-ids"
  },
  "dependencies": {
    "@docusaurus/core": "2.4.0",
    "@docusaurus/preset-classic": "2.4.0",
    "@mdx-js/react": "^1.6.22",
    "clsx": "^1.2.1",
    "prism-react-renderer": "^1.3.5",
    "react": "^17.0.2",
    "react-accessible-accordion": "^5.0.0",
    "react-dom": "^17.0.2",
    "react-iframe": "^1.8.5",
    "react-youtube": "^10.1.0"
  },
  "devDependencies": {
    "@docusaurus/module-type-aliases": "2.4.0"
  },
  "browserslist": {
    "production": [
      ">0.5%",
      "not dead",
      "not op_mini all"
    ],
    "development": [
      "last 1 chrome version",
      "last 1 firefox version",
      "last 1 safari version"
    ]
  },
  "engines": {
    "node": ">=16.14"
  }
}


================================================
FILE: docusaurus/scripts/build_docusaurus_docs.sh
================================================
#!/bin/bash

# exit when any command fails
set -e

echo "Cleanup existing build artifacts"
rm -rf docusaurus/docs

echo "Running nbdev_mkdocs docs"
mkdir -p mkdocs/docs
cp LICENSE mkdocs/docs/LICENSE.md
cp CONTRIBUTING.md mkdocs/docs
nbdev_mkdocs docs

echo "Copying newly generated markdown files to docusaurus directory"
cp -r mkdocs/docs docusaurus/

echo "Generating sidebars.js"
python3 -c "from fastkafka._docusaurus_helper import generate_sidebar; generate_sidebar('./docusaurus/docs/SUMMARY.md', './docusaurus/sidebars.js')"

echo "Deleting the markdown files from the docs directory that are not present in the sidebar."
python3 -c "from fastkafka._docusaurus_helper import delete_unused_markdown_files_from_sidebar; delete_unused_markdown_files_from_sidebar('./docusaurus/docs', './docusaurus/sidebars.js')"

echo "Generating API docs"
python3 -c "from fastkafka._docusaurus_helper import fix_invalid_syntax_in_markdown, generate_markdown_docs; fix_invalid_syntax_in_markdown('./docusaurus/docs'); generate_markdown_docs('fastkafka', './docusaurus/docs')"

echo "Running docusaurus build"
cd docusaurus && npm run build

echo "Checking and creating new document version..."
settings_file="../settings.ini"
docs_versioning_flag=$( { grep '^docs_versioning[[:space:]]*=' "$settings_file" || [[ $? == 1 ]]; } | awk -F = '{print $2}' | xargs)

if [ "$docs_versioning_flag" == "minor" ]; then
    echo "Error: minor versioning is not supported when using Docusaurus static site generator. Use patch to create new document version or None to disable document versioning." >&2
    exit 1
fi

if [ -z "$docs_versioning_flag" ]; then
    docs_versioning_flag="None"
fi

if [ "$docs_versioning_flag" != "patch" ] && [ "$docs_versioning_flag" != "None" ]; then
    echo "Error: Invalid value set for 'docs_versioning' in settings.ini file: $docs_versioning_flag. Allowed values are patch or None." >&2
    exit 1
fi

docs_version_file="versions.json"
if [ "$docs_versioning_flag" == "patch" ]; then
    echo "Document versioning is enabled."
    lib_version=$(grep '^version[[:space:]]*=' "$settings_file" | awk -F = '{print $2}' | xargs)
    pat="^[0-9]+([.][0-9]+)*$"
    if [[ $lib_version =~ $pat ]]; then
        if [ -f "$docs_version_file" ]; then
            if grep -q "\"$lib_version\"" "$docs_version_file"; then
                echo "Document version already exists: '$lib_version'"
            else
                npm run docusaurus docs:version $lib_version
            fi
        else
            npm run docusaurus docs:version $lib_version
        fi
    else
        echo "Canary document version updated: '$lib_version'"
    fi
elif [ "$docs_versioning_flag" == "None" ]; then
    echo "Document versioning is disabled."
    if [ -f "$docs_version_file" ]; then
        echo "Deleting previously created document versions."
        rm -rf versioned_docs versioned_sidebars versions.json
        echo "Successfully deleted all previous document versions."
    fi
fi

echo -e "\e[36;1m[INFO]\e[0m Creating a compressed archive of the generated Markdown files. This file is essential for implementing semantic search in the FastKafka-Gen library."
cd ../ && mkdir -p .fastkafka_gen
find "./docusaurus/docs/" -type f -name "*.md" | tar -czvf ".fastkafka_gen/site_md_archive.tar.gz" -T -
echo -e "\e[36;1m[INFO]\e[0m Markdown files have been successfully compressed and saved in: .fastkafka_gen/site_md_archive.tar.gz"


================================================
FILE: docusaurus/scripts/install_docusaurus_deps.sh
================================================
#!/bin/bash

echo "Install docusaurus dependencies"
cd docusaurus && npm install


================================================
FILE: docusaurus/scripts/serve_docusaurus_docs.sh
================================================
#!/bin/bash

echo "Serve docusaurus documentation"
cd docusaurus && npm run start



================================================
FILE: docusaurus/scripts/update_readme.sh
================================================
#!/bin/bash

# exit when any command fails
set -e

echo "Run nbdev_readme and fix symbol links"
python3 -c "from fastkafka._docusaurus_helper import update_readme; update_readme()"


================================================
FILE: docusaurus/sidebars.js
================================================
module.exports = {
tutorialSidebar: [
    'index',
    {
      'Guides': [
        {
          'Writing services': [
            'guides/Guide_11_Consumes_Basics',
            'guides/Guide_12_Batch_Consuming',
            'guides/Guide_21_Produces_Basics',
            'guides/Guide_22_Partition_Keys',
            'guides/Guide_23_Batch_Producing',
            'guides/Guide_05_Lifespan_Handler',
            'guides/Guide_07_Encoding_and_Decoding_Messages_with_FastKafka',
            'guides/Guide_24_Using_Multiple_Kafka_Clusters',
          ],
        },
        {
          'Testing': [
            'guides/Guide_33_Using_Tester_class_to_test_fastkafka',
            'guides/Guide_31_Using_redpanda_to_test_fastkafka',
          ],
        },
        {'Documentation generation': ['guides/Guide_04_Github_Actions_Workflow']},
        {
          'Deployment': [
            'guides/Guide_30_Using_docker_to_deploy_fastkafka',
            'guides/Guide_32_Using_fastapi_to_run_fastkafka_application',
          ],
        },
        {'Benchmarking': ['guides/Guide_06_Benchmarking_FastKafka']},
      ],
    },
    {
      'API': [
        'api/fastkafka/EventMetadata',
        'api/fastkafka/FastKafka',
        'api/fastkafka/KafkaEvent',
        {
          'encoder': [
            'api/fastkafka/encoder/AvroBase',
            'api/fastkafka/encoder/avro_decoder',
            'api/fastkafka/encoder/avro_encoder',
            'api/fastkafka/encoder/avsc_to_pydantic',
            'api/fastkafka/encoder/json_decoder',
            'api/fastkafka/encoder/json_encoder',
          ],
        },
        {
          'executors': [
            'api/fastkafka/executors/DynamicTaskExecutor',
            'api/fastkafka/executors/SequentialExecutor',
          ],
        },
        {
          'testing': [
            'api/fastkafka/testing/ApacheKafkaBroker',
            'api/fastkafka/testing/LocalRedpandaBroker',
            'api/fastkafka/testing/Tester',
          ],
        },
      ],
    },
    {'CLI': ['cli/fastkafka', 'cli/run_fastkafka_server_process']},
    "LICENSE",
    "CONTRIBUTING",
    "CHANGELOG",
],
};

================================================
FILE: docusaurus/src/components/BrowserWindow/index.js
================================================
/**
 * Copyright (c) Facebook, Inc. and its affiliates.
 *
 * This source code is licensed under the MIT license found in the
 * LICENSE file in the root directory of this source tree.
 */

import React from 'react';
import clsx from 'clsx';

import styles from './styles.module.css';

export default function BrowserWindow({
  children,
  minHeight,
  url = '',
  style,
  bodyStyle,
}) {
  return (
    <div className={styles.browserWindow} style={{...style, minHeight}}>
      <div className={styles.browserWindowHeader}>
        <div className={styles.buttons}>
          <span className={styles.dot} style={{background: '#f25f58'}} />
          <span className={styles.dot} style={{background: '#fbbe3c'}} />
          <span className={styles.dot} style={{background: '#58cb42'}} />
        </div>
        <div className={clsx(styles.browserWindowAddressBar, 'text--truncate')}>
          {url}
        </div>
        <div className={styles.browserWindowMenuIcon}>
          <div>
            <span className={styles.bar} />
            <span className={styles.bar} />
            <span className={styles.bar} />
          </div>
        </div>
      </div>

      <div className={styles.browserWindowBody} style={bodyStyle}>
        {children}
      </div>
    </div>
  );
}

// Quick and dirty component, to improve later if needed
export function IframeWindow({url}) {
  return (
    <div style={{padding: 10}}>
      <BrowserWindow
        url={url}
        style={{minWidth: '40vw', maxWidth: 400}}
        bodyStyle={{padding: 0}}>
        <iframe src={url} title={url} style={{width: '100%', height: 300}} />
      </BrowserWindow>
    </div>
  );
}

================================================
FILE: docusaurus/src/components/BrowserWindow/styles.module.css
================================================
/**
 * Copyright (c) Facebook, Inc. and its affiliates.
 *
 * This source code is licensed under the MIT license found in the
 * LICENSE file in the root directory of this source tree.
 */

.browserWindow {
  border: 1px solid #fff;
  border-radius: var(--ifm-global-radius);
  box-shadow: rgba(0, 0, 0, 0.35) 0px 5px 15px;
  margin-bottom: var(--ifm-leading);
}

.browserWindowHeader {
  align-items: center;
  background: #ebedf0;
  display: flex;
  padding: 0.5rem 1rem;
}

.row::after {
  content: "";
  display: table;
  clear: both;
}

.buttons {
  white-space: nowrap;
}

.right {
  align-self: center;
  width: 10%;
}

[data-theme="light"] {
  --ifm-background-color: #fff;
}

.browserWindowAddressBar {
  flex: 1 0;
  margin: 0 1rem 0 0.5rem;
  border-radius: 12.5px;
  background-color: #fff;
  color: var(--ifm-color-gray-800);
  padding: 5px 15px;
  font: 400 13px Arial, sans-serif;
  user-select: none;
  height: 20px;
}

[data-theme="dark"] .browserWindowAddressBar {
  color: var(--ifm-color-gray-300);
}

.dot {
  margin-right: 6px;
  margin-top: 4px;
  height: 12px;
  width: 12px;
  background-color: #bbb;
  border-radius: 50%;
  display: inline-block;
}

.browserWindowMenuIcon {
  margin-left: auto;
}

.bar {
  width: 17px;
  height: 3px;
  background-color: #aaa;
  margin: 3px 0;
  display: block;
}

.browserWindowBody {
  background-color: var(--ifm-background-color);
  border-bottom-left-radius: inherit;
  border-bottom-right-radius: inherit;
  padding: 0;
}

.browserWindowBody > *:last-child {
  margin-bottom: -8px;
}


================================================
FILE: docusaurus/src/components/HomepageCommunity/index.js
================================================
import React, { useState, useEffect } from 'react';
import clsx from 'clsx';
import styles from './styles.module.css';

function Testimonial({ testimonialLimitToShow, allTestimonials }) {
  return (
    <div className={`${clsx('col col--4')} ${styles.testimonialWrapper}`}>
      {Object.entries(allTestimonials).map(([key, value]) => {
        if (key.split("_")[1] <= testimonialLimitToShow) {
          return (
            <a
              key={key}
              href={value.source.link}
              target="_blank"
              rel="noopener noreferrer"
              className={styles.testimonialAnchor}
            >
              <div className={styles.testimonialContainer}>
                <div className={styles.testimonialHeader}>
                  <div className={styles.testimonialUserInfo}>
                    <img src={value.user.profilePic} className={styles.testimonialProfilePic} />
                    <div>
                      <h6>{value.user.fullName}</h6>
                      <p>{value.user.userName}</p>
                    </div>
                  </div>
                  <div>
                    <img className={styles.testimonialSourceIcon} src={value.source.icon} alt="" />
                  </div>
                </div>
                <div className="text--center padding-horiz--md">
                  <p className={styles.testimonialDescription}>{value.description}</p>
                </div>
              </div>
            </a>
          );
        }
        return null;
      })}
    </div>
  );
}


const redditUserProfiles = ["deadwisdom", "benbenbang", "Berouald", "baggiponte", "No-Application5593", "code_mc", "teajunky", "SteamingBeer", "BestBottle4517"];
const maxTestimonialSectionToShow = "4"

export default function HomepageCommunity() {
  const [testimonialLimitToShow, setTestimonialLimitToShow] = useState("2");
  const [profiles, setProfiles] = useState(redditUserProfiles.reduce(
    (result, username) => ({
      ...result,
      [username]: {
        icon_img: "https://www.redditstatic.com/avatars/defaults/v2/avatar_default_1.png",
        subreddit: {
          display_name_prefixed: `u/${username}`,
        },
      },
    }),
    {}
  ));
  const testimonials = [
    {
      container_1: {
        source: {
          icon: "img/reddit-logo.png",
          link: "https://www.reddit.com/r/Python/comments/13i0eaz/comment/jk90bwz/?utm_source=share&utm_medium=web2x&context=3",
        },
        user: {
          profilePic: profiles["deadwisdom"]["icon_img"],
          userName: profiles["deadwisdom"]["subreddit"]["display_name_prefixed"],
          fullName: "deadwisdom",
        },
        description: (
          <>
            Well well well, if it isn't the library I was already making but better. Very nice.

            What is your long-term vision for supporting this as a company?

            And are you using this now to support real customers or are you expecting this might help you establish a niche?
          </>
        ),
      },
      container_2: {
        source: {
          icon: "img/twitter-logo.svg",
          link: "https://twitter.com/emaxerrno/status/1635005087721611264?s=20",
        },
        user: {
          profilePic: "img/a-alphabet-round-icon.png",
          userName: "Alexander Gallego",
          fullName: "Alexander Gallego",
        },
        description: (
          <>
            this is cool. let me know if you want to share it w/ the @redpandadata community.
          </>
        ),
      },
      container_3: {
        source: {
          icon: "img/reddit-logo.png",
          link: "https://www.reddit.com/r/Python/comments/11paz9u/comment/jbxbbxp/?utm_source=share&utm_medium=web2x&context=3",
        },
        user: {
          profilePic: profiles["BestBottle4517"]["icon_img"].replace(/&amp;/g, '&'),
          userName: profiles["BestBottle4517"]["subreddit"]["display_name_prefixed"],
          fullName: "BestBottle4517",
        },
        description: (
          <>
            Very cool indeed. Currently at work we're using RabbitMQ for messaging so this doesn't apply to us (for now), but this type and style of implementation is exactly what I would expect when searching for libs like this. Great job!
          </>
        ),
      },
      container_4: {
        source: {
          icon: "img/reddit-logo.png",
          link: "https://www.reddit.com/r/programming/comments/11sjtgm/comment/jceqgml/?utm_source=share&utm_medium=web2x&context=3",
        },
        user: {
          profilePic: profiles["teajunky"]["icon_img"],
          userName: profiles["teajunky"]["subreddit"]["display_name_prefixed"],
          fullName: "teajunky",
        },
        description: (
          <>
            Wow, the code in the package is auto-generated from Jupyter-Notebooks
          </>
        ),
      },
      
    },
    {
      container_1: {
        source: {
          icon: "img/reddit-logo.png",
          link: "https://www.reddit.com/r/FastAPI/comments/124v5di/comment/jfhg2t2/?utm_source=share&utm_medium=web2x&context=3",
        },
        user: {
          profilePic: profiles["benbenbang"]["icon_img"].replace(/&amp;/g, '&'),
          userName: profiles["benbenbang"]["subreddit"]["display_name_prefixed"],
          fullName: "benbenbang",
        },
        description: (
          <>
            Nice 👍🏻 I’ve promoted this project in the team! Also, would like to contribute if there’s some kind of roadmap
          </>
        ),
      },
      container_2: {
        source: {
          icon: "img/reddit-logo.png",
          link: "https://www.reddit.com/r/Python/comments/11paz9u/comment/jbxf1v8/?utm_source=share&utm_medium=web2x&context=3",
        },
        user: {
          profilePic: profiles["code_mc"]["icon_img"],
          userName: profiles["code_mc"]["subreddit"]["display_name_prefixed"],
          fullName: "code_mc",
        },
        description: (
          <>
            I really like the idea of this, as the biggest gripe I have with most pub/sub solutions is all of the tedious boiler plate code needed to correctly subscribe and publish and manage message leases etc. While you often just want to grab a message, do some processing and put it on a different queue.
          </>
        ),
      },
      container_3: {
        source: {
          icon: "img/reddit-logo.png",
          link: "https://www.reddit.com/r/FastAPI/comments/11oq09r/comment/jc4dwit/?utm_source=share&utm_medium=web2x&context=3",
        },
        user: {
          profilePic: profiles["No-Application5593"]["icon_img"],
          userName: profiles["No-Application5593"]["subreddit"]["display_name_prefixed"],
          fullName: "No-Application5593",
        },
        description: (
          <>
            Wow! This is really great, thank you for your efforts guys. This is what I really need for one of my future projects.
          </>
        ),
      },
      container_4: {
        source: {
          icon: "img/reddit-logo.png",
          link: "https://www.reddit.com/r/FastAPI/comments/11oq09r/comment/jbx4dfn/?utm_source=share&utm_medium=web2x&context=3",
        },
        user: {
          profilePic: profiles["SteamingBeer"]["icon_img"].replace(/&amp;/g, '&'),
          userName: profiles["SteamingBeer"]["subreddit"]["display_name_prefixed"],
          fullName: "SteamingBeer",
        },
        description: (
          <>
            Thank you for your efforts. I see me pitching this library to my team in the near future!
          </>
        ),
      },
    },
    {
      container_1: {
        source: {
          icon: "img/reddit-logo.png",
          link: "https://www.reddit.com/r/FastAPI/comments/124v5di/comment/jee9vm9/?utm_source=share&utm_medium=web2x&context=3",
        },
        user: {
          profilePic: profiles["Berouald"]["icon_img"],
          userName: profiles["Berouald"]["subreddit"]["display_name_prefixed"],
          fullName: "Berouald",
        },
        description: (
          <>
            This is great! I've been thinking about making a similar tool for quite some time, nice job sir! I guess it's to fit your use case, by why stop at Kafka? A paradigm like this would be awesome in the form of a microframework. Like a general message consumer framework with pluggable interfaces for Kafka, Rabbitmq, ActiveMQ or even the Redis message broker.
          </>
        ),
      },
      container_2: {
        source: {
          icon: "img/reddit-logo.png",
          link: "https://www.reddit.com/r/Python/comments/120mt5k/comment/jdpwycr/?utm_source=share&utm_medium=web2x&context=3",
        },
        user: {
          profilePic: profiles["baggiponte"]["icon_img"],
          userName: profiles["baggiponte"]["subreddit"]["display_name_prefixed"],
          fullName: "baggiponte",
        },
        description: (
          <>
            Really hope this project becomes as popular as the OG FastAPI!
          </>
        ),
      },
      
      container_3: {
        source: {
          icon: "img/twitter-logo.svg",
          link: "https://twitter.com/perbu/status/1635014207656849408?s=20",
        },
        user: {
          profilePic: "img/p-alphabet-round-icon.png",
          userName: "Per Buer",
          fullName: "Per Buer",
        },
        description: (
          <>
            I really like how we're getting these more specialized ways to leverage streaming databases, instead of the somewhat intimidating access libraries.
          </>
        ),
      },
      container_4: {
        source: {
          icon: "img/Y_Combinator_Logo.png",
          link: "https://news.ycombinator.com/item?id=35086594",
        },
        user: {
          profilePic: "img/I.svg",
          userName: "iknownothow",
          fullName: "iknownothow",
        },
        description: (
          <>
            It looks incredible and I truly hope your project takes off for my sake since I have to work with Kafka from time to time!
          </>
        ),
      },
    },
  ];

  const handleLoadMore = () => {
    setTestimonialLimitToShow(testimonialLimitToShow === "2" ? "3" : Object.keys(testimonials[0]).length);
  };

  useEffect(() => {
    async function fetchData() {
      try {
        let profilesData = {};
        for (const profile of redditUserProfiles) {
          const response = await fetch(`https://www.reddit.com/user/${profile}/about.json`);
          let data = await response.json();
          data.data.icon_img = data.data.icon_img.split("?")[0]
          profilesData[profile] = data.data;
        }
        setProfiles(profilesData);
      } catch (error) {
        console.error(error);
      }
    }
    fetchData();
  }, []);
  return (
    <section className={`${styles.features}  hero hero--primary`}>
      <div className="container">
        <div className={clsx('col col--12')}>
          <h2 className={styles.title}>The community has spoken!</h2>
        </div>
        <div className="row">
          {testimonials.map((props, idx) => (
            <Testimonial key={idx} testimonialLimitToShow={testimonialLimitToShow} allTestimonials = {props}  />
          ))}
        </div>
        {testimonialLimitToShow < Object.keys(testimonials[0]).length && (
          <div className={styles.buttons}>
            <button className={clsx("button button--lg", styles.heroButton)} onClick={handleLoadMore}>
                Load More
            </button>
          </div>
        )}
      </div>
    </section>
  );
}


================================================
FILE: docusaurus/src/components/HomepageCommunity/styles.module.css
================================================
.features {
  display: flex;
  align-items: center;
  padding: 3rem 0;
  width: 100%;
  background-color: #60bee4;
}

.featureSvg {
  height: 200px;
  width: 200px;
}
.title {
  font-size: 3rem;
  text-align: center;
  padding-bottom: 2rem;
  color: #fff;
}
.subTitle {
  font-size: 1.5rem;
  text-align: left;
}
.description {
  font-size: 1rem;
  text-align: center;
  margin-top: 3rem;
}
.buttons {
  display: flex;
  align-items: center;
  justify-content: center;
  margin-top: 55px;
}
.heroButton {
  color: #fff;
  background: var(--ifm-navbar-background-color);
  border-radius: 25px;
  padding: 0.7rem 2.5rem 0.7rem 2.5rem;
  margin-top: -1.5rem;
  font-size: 1rem;
}
.heroButton:hover {
  background: #3e99c5;
}

/* .withExtraMargin*/
/* .testimonialWrapper {
  margin-top: 60px;
} */

/* .withExtraMargin a */
/* .testimonialWrapper a {
  color: var(--ifm-hero-text-color);
  display: flex;
  flex-direction: row;
  row-gap: 50px;
} */

.testimonialAnchor {
  background-color: #fff;
  padding: 1.6rem;
  color: #1c1e21; /*var(--ifm-font-color-base); */
  border-radius: 5px;
  margin-left: 1rem;
  margin-top: 1rem;
  display: block;
}

.testimonialAnchor:hover {
  text-decoration: none;
  color: #1c1e21;
}

.testimonialWrapper {
  padding: 0px 0px 3rem 0px;
  /* margin-top: 10px; */
}

.testimonialDescription {
  font-size: 1rem;
}

.testimonialHeader {
  display: flex;
  justify-content: space-between;
  height: 45px;
  margin-bottom: 30px;
}
.testimonialUserInfo {
  display: flex;
}

.testimonialUserInfo h6,
.testimonialUserInfo p {
  margin-bottom: 0px;
  margin-left: 10px;
}

.testimonialProfilePic {
  width: 45px;
  height: auto;
  border-radius: 50%;
}

.testimonialSourceIcon {
  width: 20px;
  height: auto;
}

/** Mobile view */
@media screen and (max-width: 996px) {
  .testimonialAnchor {
    margin: 2rem 1rem 0.5rem 1rem;
  }
  .testimonialWrapper {
    padding: 0 var(--ifm-spacing-horizontal);
  }
  .title {
    font-size: 2rem;
  }
}


================================================
FILE: docusaurus/src/components/HomepageFAQ/index.js
================================================
import React from 'react';
import clsx from 'clsx';
import {
  Accordion,
  AccordionItem,
  AccordionItemHeading,
  AccordionItemButton,
  AccordionItemPanel,
} from 'react-accessible-accordion';

import styles from './styles.module.css';
import 'react-accessible-accordion/dist/fancy-example.css';

const items = [
  {
    "heading": "How much does FastKafka cost?",
    "content": "FastKafka is under Apache 2.0 license and free to use."
  },
  {
    "heading": "How can I contribute or request features?",
    "content": "We love and welcome community contributions! Here is a <a href='https://github.com/airtai/fastkafka/blob/main/CONTRIBUTING.md' target='_blank'>doc</a> to get you started. To request features, add a “Feature request” using the New issue button in GitHub from <a href='https://github.com/airtai/fastkafka/issues' target='_blank'>this link</a>, or join our feature-request <a href='https://discord.gg/CJWmYpyFbc' target='_blank'>Discord channel</a>."
  },
  {
    "heading": "Do you support any streaming platforms other than Kafka?",
    "content": "Slowly, but surely. We built the initial version for the Kafka service and for our own needs, but we reached out to the wider community to find out what to do next. We added support for Redpanda, and also got requests for RabbitMQ and Pulsar that went into our backlog; we’ll support them in future releases."
  },
  {
    "heading": "Does FastKafka integrate with AsyncAPI in the way that FastAPI integrates with OpenAPI?",
    "content": "Very much the same, but with a small difference due to the dependencies of AsyncAPI. You write your code using decorators and the AsyncAPI specification is generated automatically as a YAML file. You can convert that file to a static HTML file either via a Python API call, the CLI, or a GitHub Action. AsyncAPI requires Node.js, and you don’t necessarily want that in production."
  },
  {
    "heading": "Does it assume that Kafka messages are in JSON format? What if we want to use protobuf, for example?",
    "content": "The first implementation we just released uses JSON-encoded messages, but we can easily add additional formats/protocols. We’ve created an issue on GitHub and will try to prioritize it for one of the next releases."
  },
]

export default function HomepageFAQ() {
  return (
    <section className={styles.features}>
      <div className="container">
      <div className={clsx('col col--12')}>
          <h2 className={styles.title}>FAQs</h2>
          <p>For anything not covered here, join <a className={styles.href} href="https://discord.gg/CJWmYpyFbc" target="_blank" rel="noopener noreferrer">our Discord</a></p>
        </div>
        <div className={clsx('col col--12 text--left padding-horiz--md')}>
        <Accordion allowZeroExpanded>
          {items.map((item, idx) => (
              <AccordionItem key={idx}>
                  <AccordionItemHeading>
                      <AccordionItemButton>
                          {item.heading}
                      </AccordionItemButton>
                  </AccordionItemHeading>
                  <AccordionItemPanel>
                  <p className={styles.faqAnswer} dangerouslySetInnerHTML={{__html: item.content}} />
                  </AccordionItemPanel>
              </AccordionItem>
          ))}
      </Accordion>
        </div>
      </div>
    </section>
  );
}


================================================
FILE: docusaurus/src/components/HomepageFAQ/styles.module.css
================================================
.features {
  display: flex;
  align-items: center;
  padding: 2rem 0 8rem 0;
  width: 100%;
  background-color: #076d9e;
}

.featureSvg {
  height: 200px;
  width: 200px;
}
.title {
  font-size: 3rem;
  text-align: center;
  color: #fff;
}

.title + p {
  font-size: 1.5rem;
  font-style: italic;
  text-align: center;
  margin-bottom: 5rem;
  color: #fff;
}
.subTitle {
  font-size: 1.5rem;
  text-align: center;
  color: #fff;
}
.description {
  font-size: 1rem;
  text-align: center;
  margin-top: 3rem;
  color: #fff;
}
.rowWitExtraMargin {
  margin-top: 80px;
}
.link {
  color: var(--ifm-hero-text-color);
  text-decoration: underline;
  transition: color var(--ifm-transition-fast)
    var(--ifm-transition-timing-default);
}

.wrapper {
  position: relative;
}

.verticalAndHorizontalCenter {
  margin: 0;
  position: absolute;
  top: 50%;
  left: 50%;
  -ms-transform: translate(-50%, -50%);
  transform: translate(-50%, -50%);
}

.href {
  text-decoration: underline;
}
.faqAnswer a {
  text-decoration: underline;
}

/** Mobile view */
@media screen and (max-width: 996px) {
  .title {
    font-size: 2rem;
  }
  .title + p {
    margin-bottom: 1rem;
  }
}


================================================
FILE: docusaurus/src/components/HomepageFastkafkaChat/index.js
================================================
import React from 'react';
import clsx from 'clsx';

import styles from './styles.module.css';



// const FeatureList = [
//   {
//     title: 'WRITE',
//     Svg: require('@site/static/img/programming-monitor-svgrepo-com.svg').default,
//     description: (
//       <>
//         producers & consumers for Kafka topics in a simplified way
//       </>
//     ),
//   },
//   {
//     title: 'PROTOTYPE',
//     Svg: require('@site/static/img/rocket-svgrepo-com.svg').default,
//     description: (
//       <>
//         quickly & develop high-performance Kafka-based services
//       </>
//     ),
//   },
//   {
//     title: 'STREAMLINE',
//     Svg: require('@site/static/img/hierarchy-order-svgrepo-com.svg').default,
//     description: (
//       <>
//         your workflow & accelerate your progress
//       </>
//     ),
//   },
// ];

function Feature({Svg, title, description}) {
  return (
    <div className={clsx('col col--4')}>
      <div className="text--center">
        <Svg className={styles.featureSvg} role="img" />
      </div>
      <div className="text--center padding-horiz--md">
        <h3>{title}</h3>
        <p>{description}</p>
      </div>
    </div>
  );
}


================================================
FILE: docusaurus/src/components/HomepageFastkafkaChat/styles.module.css
================================================
.features {
  display: flex;
  align-items: center;
  padding: 5rem 0;
  width: 100%;
  background: rgb(82, 175, 216);
  background: linear-gradient(
    180deg,
    rgba(82, 175, 216, 1) 0%,
    rgba(96, 190, 228, 1) 100%
  );
  border-top: 1px solid #8bcae5;
  border-bottom: 1px solid #8bcae5;
}

.featureSvg {
  height: 160px;
  width: 160px;
  margin: 30px 0px;
}
.title {
  font-size: 3rem;
  text-align: center;
  color: #fff;
}
.fastkafkaDescription {
  font-size: 1.5rem;
  font-style: italic;
  text-align: center;
  margin-bottom: 5rem;
  color: #fff;
}
.subTitle {
  font-size: 1.5rem;
  text-align: left;
  color: #fff;
}
.description {
  font-size: 1rem;
  text-align: center;
  margin-top: 3rem;
  color: #fff;
}
.rowWitExtraMargin {
  margin-top: 80px;
}

.fastkafkaChatIframe {
  width: 100%;
  height: 600px;
  display: inline-block;
  position: relative;
}
.fastkafkaChatHeader {
  font-size: 1.8rem;
  text-align: center;
}

/** Mobile view */
@media screen and (max-width: 996px) {
  .title {
    font-size: 2rem;
  }
}

/* .slantedDiv {
  position: relative;
  padding: 200px 0;
  background: #fff;
  overflow: visible;
  z-index: 1;
}

.slantedDiv:before,
.slantedDiv:after {
  content: "";
  width: 100%;
  height: 100%;
  position: absolute;
  background: inherit;
  z-index: -1;
  top: 0;
  transform-origin: left top;
  transform: skewY(-2deg);
}

.slantedDiv:after {
  bottom: 0;
  transform-origin: left bottom;
  transform: skewY(3deg);
} */

/* displays the content inside, as these settings in the parent breaks the effect */
/* .slantedDiv div {
  text-align: center;
  font-size: 1.5em;
  line-height: 1.5;
} */


================================================
FILE: docusaurus/src/components/HomepageFeatures/index.js
================================================
import React from 'react';
import clsx from 'clsx';

import styles from './styles.module.css';



const FeatureList = [
  {
    title: 'WRITE',
    src: "img/write.svg",
    description: (
      <>
        producers & consumers for Kafka topics in a simplified way
      </>
    ),
  },
  {
    title: 'PROTOTYPE',
    src: "img/prototype.svg",
    description: (
      <>
        quickly & develop high-performance Kafka-based services
      </>
    ),
  },
  {
    title: 'STREAMLINE',
    src: "img/streamline.svg",
    description: (
      <>
        your workflow & accelerate your progress
      </>
    ),
  },
];

function Feature({src, title, description}) {
  return (
    <div className={clsx('col col--4')}>
      <div className="text--center">
        <img className={styles.featureSvg} src={src} alt="" />
      </div>
      <div className={clsx('text--center padding-horiz--md', styles.textContainer)}>
        <h3>{title}</h3>
        <p>{description}</p>
      </div>
    </div>
  );
}

export default function HomepageFeatures() {
  return (
    <section className={styles.features}>
      <div className="container">
      <div className={clsx('col col--12')}>
          <h2 className={styles.title}>Swim with the stream…ing services</h2>
        </div>
        <div className="row">
          {FeatureList.map((props, idx) => (
            <Feature key={idx} {...props} />
          ))}
        </div>
      </div>
    </section>
  );
}


================================================
FILE: docusaurus/src/components/HomepageFeatures/styles.module.css
================================================
.features {
  display: flex;
  align-items: center;
  padding: 1rem 0 4rem 0;
  width: 100%;
  background: rgb(20, 116, 166);
  background: linear-gradient(
    180deg,
    rgba(20, 116, 166, 1) 50%,
    rgba(82, 175, 216, 1) 100%
  );
}

.featureSvg {
  height: 250px;
  width: 330px;
  margin: 30px 0px;
}
.title {
  font-size: 3rem;
  text-align: center;
  padding-top: 2rem;
  padding-bottom: 3rem;
  color: #fff;
}
.textContainer {
  color: #fff;
  text-align: center;
  padding: 1rem 2.2rem;
}
.textContainer h3 {
  font-size: 1.5rem;
}
.textContainer p {
  font-size: 1rem;
}
.subTitle {
  font-size: 1.4rem;
  text-align: left;
  color: #fff;
}
.description {
  font-size: 1rem;
  text-align: center;
  margin-top: 3rem;
  color: #fff;
}
.rowWitExtraMargin {
  margin-top: 80px;
}

/** Mobile view */
@media screen and (max-width: 996px) {
  .title {
    font-size: 2rem;
  }
}


================================================
FILE: docusaurus/src/components/HomepageWhatYouGet/index.js
================================================
import React from 'react';
import clsx from 'clsx';
import Link from '@docusaurus/Link';

import styles from './styles.module.css';

export default function HomepageWhatYouGet() {
  return (
    <section className={styles.features}>
      <div className="container">
      <div className={clsx('col col--12')}>
          <h2 className={styles.title}>You get what you expect</h2>
        </div>
        <div className={`row ${styles.childrenWithExtraPadding}`}>
          <div className={clsx('col col--6 text--center padding-horiz--md')}>
            <p>Function decorators with type hints specifying Pydantic classes for JSON encoding/decoding, automatic message routing and documentation generation.</p>
          </div>
          <div className={clsx('col col--6 text--center padding-horiz--md')}>
            <p>Built on top of <a className={styles.link} href="https://docs.pydantic.dev/" target="_blank" rel="noopener noreferrer">Pydantic</a>, <a className={styles.link} href="https://github.com/aio-libs/aiokafka/" target="_blank" rel="noopener noreferrer">AIOKafka</a> and <a className={styles.link} href="https://www.asyncapi.com/" target="_blank" rel="noopener noreferrer">AsyncAPI</a>, FastKafka simplifies the process of writing producers and consumers for Kafka topics, handling all the parsing, networking, task scheduling and data generation automatically.</p>
          </div>
        </div>
        {/* <div className={`${styles.rowWitExtraMargin} row`}>
          <div className={clsx('col col--6', styles.wrapper)}>
            <div className={`text--center padding-horiz--md ${styles.verticalAndHorizontalCenter}`}>
            <Link
              className="btn-github-link button button--secondary button--lg"
              to="https://github.com/airtai/fastkafka">
                Check it out
            </Link>
            </div>
          </div>
          <div className={clsx('col col--6')}>
            <div className="text--center padding-horiz--md">
              <img src="img/docusaurus-plushie-banner.jpeg" />
            </div>
          </div>
        </div> */}
      </div>
    </section>
  );
}


================================================
FILE: docusaurus/src/components/HomepageWhatYouGet/styles.module.css
================================================
.features {
  display: flex;
  align-items: center;
  padding: 4rem 0 0 0;
  width: 100%;
  background-color: #60bee4;
  border-bottom: 1px solid #8bcae5;
}

.features p {
  text-align: left;
  font-size: 1rem;
}

.featureSvg {
  height: 200px;
  width: 200px;
}
.title {
  font-size: 3rem;
  text-align: center;
  color: #fff;
}
.subTitle {
  font-size: 1.5rem;
  text-align: left;
  color: #fff;
}

.rowWitExtraMargin {
  margin-top: 80px;
}
.link {
  color: #fff;
  text-decoration: underline;
  transition: color var(--ifm-transition-fast)
    var(--ifm-transition-timing-default);
}
.link:hover {
  color: var(--ifm-hero-text-color);
}

.wrapper {
  position: relative;
}

.verticalAndHorizontalCenter {
  margin: 0;
  position: absolute;
  top: 50%;
  left: 50%;
  -ms-transform: translate(-50%, -50%);
  transform: translate(-50%, -50%);
}
.childrenWithExtraPadding {
  margin: 4rem 0px;
  /* padding: 0px 30px; */
  font-size: 1.2rem;
  color: #fff;
}
/** Mobile view */
@media screen and (max-width: 996px) {
  .title {
    font-size: 2rem;
  }
}


================================================
FILE: docusaurus/src/components/RobotFooterIcon/index.js
================================================
import React from 'react';
import clsx from 'clsx';

import styles from './styles.module.css';

export default function RobotFooterIcon() {
  return (
    <section>
      <div className={clsx("container", styles.robotFooterContainer)}>
       <img className={styles.robotFooterIcon} src="img/robot-footer.svg" alt="" />
      </div>
    </section>
  );
}


================================================
FILE: docusaurus/src/components/RobotFooterIcon/styles.module.css
================================================
.robotFooterContainer {
  text-align: center;
  position: relative;
}

.robotFooterIcon {
  width: 7rem;
  height: auto;
  position: absolute;
  margin-top: -4rem;
  margin-left: -3.5rem;
}


================================================
FILE: docusaurus/src/css/custom.css
================================================
/**
 * Any CSS included here will be global. The classic template
 * bundles Infima by default. Infima is a CSS framework designed to
 * work well for content-centric websites.
 */

/* You can override the default Infima variables here. */

@font-face {
  font-family: "Panton-SemiBold";
  src: url("/static/font/Panton-SemiBold.woff") format("woff");
}

@font-face {
  font-family: "Rubik-Medium";
  src: url("/static/font/Rubik-Medium.ttf") format("truetype");
}

@font-face {
  font-family: "RobotoMono-Regular";
  src: url("/static/font/RobotoMono-Regular.ttf") format("truetype");
}

@font-face {
  font-family: "Roboto-Light";
  src: url("/static/font/Roboto-Light.ttf") format("truetype");
}

@font-face {
  font-family: "Roboto-Regular";
  src: url("/static/font/Roboto-Regular.ttf") format("truetype");
}

:root {
  font-family: "Roboto-Regular";
  --ifm-font-family-monospace: "RobotoMono-Regular";
  --ifm-font-family-base: "Roboto-Regular";
  --ifm-heading-font-family: "Rubik-Medium";
  --ifm-color-primary: #56b7e1; /*#2e8555; */
  --ifm-color-primary-dark: #3cacdc;
  --ifm-color-primary-darker: #2ea6da;
  --ifm-color-primary-darkest: #218bb9;
  --ifm-color-primary-light: #70c2e6;
  --ifm-color-primary-lighter: #7ec8e8;
  --ifm-color-primary-lightest: #a5d9ef;
  --ifm-code-font-size: 95%;
  --docusaurus-highlighted-code-line-bg: rgba(0, 0, 0, 0.1);
  --ifm-navbar-background-color: #003257;
  --ifm-dropdown-background-color: #003257;
  --ifm-navbar-height: 4.69rem;
  /* --ifm-font-color-base: #fff; */
}

/* For readability concerns, you should choose a lighter palette in dark mode. */
[data-theme="dark"] {
  --ifm-color-primary: #56b7e1;
  --ifm-color-primary-dark: #3cacdc;
  --ifm-color-primary-darker: #2ea6da;
  --ifm-color-primary-darkest: #218bb9;
  --ifm-color-primary-light: #70c2e6;
  --ifm-color-primary-lighter: #7ec8e8;
  --ifm-color-primary-lightest: #a5d9ef;
  --docusaurus-highlighted-code-line-bg: rgba(0, 0, 0, 0.3);
  /* --ifm-navbar-background-color: #242526; */
}

html[data-theme="dark"] .DocSearch-Button {
  background: #ebedf0;
  color: #969faf;
}

html[data-theme="dark"] .DocSearch-Button:hover {
  background: #fff;
  box-shadow: inset 0 0 0 2px var(--docsearch-primary-color);
  color: #1c1e21;
}

html[data-theme="dark"] .DocSearch-Button .DocSearch-Search-Icon {
  color: #1c1e21;
}

html[data-theme="dark"] .DocSearch-Button .DocSearch-Button-Key {
  background: linear-gradient(-225deg, #d5dbe4, #f8f8f8);
  color: #969faf;
  box-shadow: inset 0 -2px 0 0 #cdcde6, inset 0 0 1px 1px #fff,
    0 1px 2px 1px rgba(30, 35, 90, 0.4);
}

/* default settings for both tablet and desktop */
.navbar.navbar--fixed-top
  .navbar__items
  > a.fastkafka-home-mobile
  + div
  > button,
.navbar-sidebar .navbar-sidebar__brand div > button {
  color: #fff;
}
.navbar.navbar--fixed-top
  .navbar__items
  > a.fastkafka-home-mobile
  + div
  > button:hover,
.navbar-sidebar .navbar-sidebar__brand div > button:hover {
  background: #8c9fae;
}

[data-theme="dark"]
  .navbar.navbar--fixed-top
  .navbar__items
  > a.fastkafka-home-mobile
  + div
  > button:hover,
.navbar-sidebar .navbar-sidebar__brand div > button:hover {
  background: #444950;
}

.navbar-sidebar .navbar-sidebar__items .navbar-sidebar__item > button,
.navbar-sidebar .navbar-sidebar__items .navbar-sidebar__item > ul > li a {
  color: #fff;
}

.menu__list-item--collapsed .menu__link--sublist-caret:after {
  background: url("/static/img/icon-arrow-right-blue.svg") 50% / 2rem 2rem;
  min-width: 1rem;
  width: 1rem;
  height: 1rem;
  transform: rotateZ(0deg);
  filter: none;
}
.menu__link--sublist-caret:after {
  background: url("/static/img/icon-arrow-right-blue.svg") 50% / 2rem 2rem;
  transform: rotate(90deg);
  min-width: 1rem;
  width: 1rem;
  height: 1rem;
  filter: none;
}
.navbar-sidebar__back,
.menu__link--active:not(.menu__link--sublist) {
  background: rgba(255, 255, 255, 0.05);
}
/* + div[class^="toggle_"] { */
html.plugin-pages .navbar__items.navbar__items--right > a + div > button {
  display: none;
}

.navbar.navbar-sidebar--show .navbar-sidebar .navbar-sidebar__brand > div {
  margin-left: auto;
  margin-right: 1rem !important;
}

.navbar.navbar-sidebar--show
  .navbar-sidebar
  .navbar-sidebar__brand
  .navbar-sidebar__close {
  margin-left: unset;
}

html.plugin-pages
  .navbar.navbar-sidebar--show
  .navbar-sidebar
  .navbar-sidebar__brand
  > div {
  display: none;
}

html.plugin-pages
  .navbar.navbar-sidebar--show
  .navbar-sidebar
  .navbar-sidebar__brand
  .navbar-sidebar__close {
  margin-left: auto;
}

.navbar--fixed-top {
  padding-top: 0px;
  padding-bottom: 0px;
}
.navbar__title {
  color: #fff;
  font-size: 3.5rem;
  font-family: var(--ifm-heading-font-family);
  font-weight: 100;
  line-height: var(--ifm-heading-line-height);
  margin-left: -0.4rem;
}

.navbar__brand:hover {
  color: var(--ifm-navbar-link-color);
}

.navbar__items.navbar__items--right .navbar__link {
  color: #fff;
}
.navbar__items.navbar__items--right .navbar__item.dropdown .dropdown__link {
  color: #fff;
}
.navbar__items.navbar__items--right
  .navbar__item.dropdown
  .dropdown__link:hover {
  color: var(--ifm-link-hover-color);
}
.navbar__items.navbar__items--right .navbar__item.dropdown,
.navbar__items.navbar__items--right .navbar__item.navbar__link {
  border-right: 1px solid #214c6c;
  padding: 23px 18px;
}
.navbar__items.navbar__items--right
  .navbar__item.navbar__link.header-discord-link {
  border-right: none;
}

.dropdown > .navbar__link:after {
  margin-left: 0.5em;
  top: 1px;
  font-size: 0.8rem;
}

.navbar__logo {
  width: 2.156rem;
  height: auto;
  margin-top: 1.2rem;
}

.navbar__brand {
  margin-left: 0;
  margin-right: 0;
}

html.docs-doc-page main article .markdown > h2.anchor {
  font-weight: 400;
}
html.docs-doc-page main article .markdown > h3.anchor {
  font-weight: 500;
  font-family: var(--ifm-font-family-monospace);
}
html.docs-doc-page main article .markdown > h3.anchor > strong,
html.docs-doc-page main ul.table-of-contents > li > ul a > strong {
  font-weight: 500;
}

html.docs-doc-page main article .markdown table {
  text-align: left;
  font-size: 0.8rem;
}

html.docs-doc-page main article .markdown table code {
  border: 1px solid rgba(0, 0, 0, 0.1);
}

nav .navbar__inner .navbar__items .fastkafka-home > div {
  margin-top: 1px;
}
nav .navbar__inner .navbar__items .fastkafka-home > div > p {
  display: inline-block;
  margin: 0;
  color: #fff;
  font-size: 1.2rem;
  font-family: var(--ifm-heading-font-family);
  font-weight: 100;
  line-height: var(--ifm-heading-line-height);
}
nav .navbar__inner .navbar__items .fastkafka-home > div > img {
  width: 15px;
  height: auto;
  margin-right: 5px;
}
.fastkafka-home-mobile {
  display: none;
}

.navbar.navbar--fixed-top .navbar__items--right {
  justify-content: end;
}
@media screen and (max-width: 996px) {
  .navbar-sidebar .navbar-sidebar__items .menu__link.fastkafka-home {
    display: none;
  }
}

.navbar__link {
  font-size: 1rem;
}
.navbar__toggle {
  color: #fff;
}
@media screen and (max-width: 1024px) {
  .navbar__toggle {
    margin-top: 0.4rem;
  }
}

.header-discord-link:before,
.footer-discord-link:before {
  content: "";
  display: flex;
  height: 24px;
  width: 24px;
  background-color: #8c9fae;
  -webkit-mask-image: url("/img/icon-discord.svg");
  mask-image: url("/img/icon-discord.svg");
}

.header-discord-link:hover,
.footer-discord-link:hover {
  opacity: 0.6;
}

.footer-github-link:before {
  content: "";
  display: flex;
  height: 24px;
  width: 24px;
  background-color: #8c9fae;
  -webkit-mask-image: url("/img/icon-github.svg");
  mask-image: url("/img/icon-github.svg");
}

.footer-github-link:hover {
  opacity: 0.6;
}

.footer-facebook-link:hover {
  opacity: 0.6;
}

.footer-facebook-link:before {
  content: "";
  display: flex;
  height: 24px;
  width: 24px;
  background-color: #8c9fae;
  -webkit-mask-image: url("/img/icon-facebook.svg");
  mask-image: url("/img/icon-facebook.svg");
}

.footer-twitter-link:hover {
  opacity: 0.6;
}

.footer-twitter-link:before {
  content: "";
  display: flex;
  height: 24px;
  width: 24px;
  background-color: #8c9fae;
  -webkit-mask-image: url("/img/icon-twitter.svg");
  mask-image: url("/img/icon-twitter.svg");
}

.footer-linkedin-link:hover {
  opacity: 0.6;
}

.footer-linkedin-link:before {
  content: "";
  display: flex;
  height: 24px;
  width: 24px;
  background-color: #8c9fae;
  -webkit-mask-image: url("/img/icon-linkedin.svg");
  mask-image: url("/img/icon-linkedin.svg");
}

.github-stars {
  display: flex;
  height: 40px;
  width: 150px;
  margin-left: 12px;
}

h3.anchor > code {
  font-size: 1rem;
}

.footer.footer--dark {
  background-color: #003257;
  /* height: 18.75rem; */
}
.footer.footer--dark .container.container-fluid .footer__bottom {
  position: absolute;
  padding: 2rem;
  width: 100%;
  background-color: #003a60;
  left: 0;
}
.footer.footer--dark .footer__copyright {
  opacity: 0.5;
  font-size: 0.85rem;
  letter-spacing: 0.025rem;
}
.footer.footer--dark .footer__col {
  margin: 2.3rem auto 5rem;
  height: 10rem;
  border-left: 2px solid rgb(33, 76, 108);
  padding-left: 1.5rem;
}
.footer.footer--dark .footer__col .footer__title {
  letter-spacing: 0.025rem;
  font-size: 1rem;
}
.footer.footer--dark .footer__col:first-child .footer__item {
  display: inline-block;
  padding: 0.3rem 0.3rem 0.3rem 0.3rem;
}
.footer.footer--dark .footer__col:first-child .footer__item:first-child {
  padding-left: 0;
}
.footer.footer--dark .footer__col .footer__link-item {
  text-decoration: underline;
  font-size: 0.9rem;
}

a.link-to-source {
  margin: 0 0 1rem 0;
  display: inline-block;
}

a.link-to-source::after {
    content: "";
  display: flex;
  height: 24px;
  width: 24px;
  background-color: var(--ifm-link-color);
  -webkit-mask-image: url("data:image/svg+xml,%3Csvg width='24' height='24' viewBox='0 0 24 24' stroke-width='1.5' fill='none' xmlns='http://www.w3.org/2000/svg'%3E%3Cpath d='M21 3L15 3M21 3L12 12M21 3V9' stroke='currentColor' stroke-linecap='round' stroke-linejoin='round'/%3E%3Cpath d='M21 13V19C21 20.1046 20.1046 21 19 21H5C3.89543 21 3 20.1046 3 19V5C3 3.89543 3.89543 3 5 3H11' stroke='currentColor' stroke-linecap='round'/%3E%3C/svg%3E%0A");
  mask-image: url("data:image/svg+xml,%3Csvg width='24' height='24' viewBox='0 0 24 24' stroke-width='1.5' fill='none' xmlns='http://www.w3.org/2000/svg'%3E%3Cpath d='M21 3L15 3M21 3L12 12M21 3V9' stroke='currentColor' stroke-linecap='round' stroke-linejoin='round'/%3E%3Cpath d='M21 13V19C21 20.1046 20.1046 21 19 21H5C3.89543 21 3 20.1046 3 19V5C3 3.89543 3.89543 3 5 3H11' stroke='currentColor' stroke-linecap='round'/%3E%3C/svg%3E%0A");
  display: inline-block;
  position: absolute;
    margin-left: 0rem;
    margin-top: 0.02rem;
    transform: scale(0.67);
}

.footer.footer--dark .footer__col .footer__link-item::after {
  content: "";
  display: flex;
  height: 35px;
  width: 42px;
  background-color: #8c9fae;
  -webkit-mask-image: url(data:image/svg+xml;base64,PHN2ZyB4bWxucz0iaHR0cDovL3d3dy53My5vcmcvMjAwMC9zdmciIGhlaWdodD0iNDgiIHZpZXdCb3g9IjAgOTYgOTYwIDk2MCIgd2lkdGg9IjQ4Ij48cGF0aCBkPSJNNTQwIDc5M3EtOS05LTktMjEuNXQ4LTIwLjVsMTQ3LTE0N0gxOTBxLTEzIDAtMjEuNS04LjVUMTYwIDU3NHEwLTEzIDguNS0yMS41VDE5MCA1NDRoNDk2TDUzOCAzOTZxLTktOS04LjUtMjF0OS41LTIxcTktOCAyMS41LTh0MjAuNSA4bDE5OSAxOTlxNSA1IDcgMTB0MiAxMXEwIDYtMiAxMXQtNyAxMEw1ODIgNzkzcS05IDktMjEgOXQtMjEtOVoiLz48L3N2Zz4=);
  mask-image: url(data:image/svg+xml;base64,PHN2ZyB4bWxucz0iaHR0cDovL3d3dy53My5vcmcvMjAwMC9zdmciIGhlaWdodD0iNDgiIHZpZXdCb3g9IjAgOTYgOTYwIDk2MCIgd2lkdGg9IjQ4Ij48cGF0aCBkPSJNNTQwIDc5M3EtOS05LTktMjEuNXQ4LTIwLjVsMTQ3LTE0N0gxOTBxLTEzIDAtMjEuNS04LjVUMTYwIDU3NHEwLTEzIDguNS0yMS41VDE5MCA1NDRoNDk2TDUzOCAzOTZxLTktOS04LjUtMjF0OS41LTIxcTktOCAyMS41LTh0MjAuNSA4bDE5OSAxOTlxNSA1IDcgMTB0MiAxMXEwIDYtMiAxMXQtNyAxMEw1ODIgNzkzcS05IDktMjEgOXQtMjEtOVoiLz48L3N2Zz4=);
  display: inline-block;
  transform: scale(0.4);
  position: absolute;
  margin-left: -0.5rem;
  margin-top: -0.4rem;
}
.footer.footer--dark .footer__col:last-child .footer__link-item::after {
  display: none;
}
.footer.footer--dark .footer__col .footer__link-item > svg {
  display: none;
}

/**
* ----------------------------------------------
* FAQ accordion styles starts
* ----------------------------------------------
**/
.container .accordion {
  border: 1px solid rgba(139, 202, 229, 0.1);
  border-radius: 2px;
}

.accordion__item + .accordion__item {
  border-top: 1px solid rgba(139, 202, 229, 0.1);
}

.accordion__item .accordion__button {
  background-color: #076d9e;
  /* color: var(--ifm-font-color-base); */
  color: #fff;
  cursor: pointer;
  padding: 2rem;
  width: 100%;
  text-align: left;
  border: 1px solid #8bcae5;
  font-size: 1rem;
  margin: 0.5rem 0;
}

.accordion__item .accordion__button:hover {
  background-color: #60bee4;
}

.accordion__button:before {
  display: inline-block;
  content: "";
  height: 10px;
  width: 10px;
  margin-right: 12px;
  border-bottom: 2px solid currentColor;
  border-right: 2px solid currentColor;
  transform: rotate(-45deg);
}

.accordion__button[aria-expanded="true"]::before,
.accordion__button[aria-selected="true"]::before {
  transform: rotate(45deg);
}

[hidden] {
  display: none;
}

.accordion__item .accordion__panel {
  padding: 2rem 2rem 1rem 2rem;
  animation: fadein 0.35s ease-in;
  color: #fff;
  font-size: 1rem;
  /* border: 1px solid #8bcae5; */
  /* border-top: none; */
}

/* -------------------------------------------------- */
/* ---------------- Animation part ------------------ */
/* -------------------------------------------------- */

@keyframes fadein {
  0% {
    opacity: 0;
  }

  100% {
    opacity: 1;
  }
}

/** Mobile view */
@media screen and (max-width: 996px) {
  .accordion__item .accordion__button {
    font-size: 1.1rem;
    padding: 1rem;
  }
  .accordion__item .accordion__panel {
    font-size: 1.1rem;
    padding: 1rem 1rem 0.3rem 1rem;
  }
  .footer.footer--dark .footer__col {
    margin: 1rem auto 1rem;
    height: auto;
    border: none;
  }
}

/**
* ----------------------------------------------
* FAQ accordion ends
* ----------------------------------------------
**/

.prism-code.language-py code .token.decorator {
  color: #c5221f !important;
}

html.docs-doc-page[data-theme="dark"]
  .prism-code.language-py
  code
  .token.decorator {
  color: #fbc02d !important;
}

/** Tablet view */
@media screen and (max-width: 1290px) {
  .navbar__items.navbar__items--right .navbar__item.dropdown,
  .navbar__items.navbar__items--right .navbar__item.navbar__link {
    padding: 23px 18px;
  }
  .navbar__title {
    font-size: 2.8rem;
    /* margin-left: 1.8rem; */
    margin-top: -0.3rem;
    padding: 8px 1rem 8px 0px;
  }
  .navbar__logo {
    margin-top: 0.5rem;
    width: 1.7rem;
  }
  .navbar__brand {
    margin-left: 0;
    margin-top: 0.4rem;
  }
}

/** Tablet view */
@media screen and (max-width: 996px) {
  .navbar__item.github-stars {
    display: none;
  }
  ul.menu__list li {
    margin-bottom: 10px;
  }
  .navbar__items.navbar__items--right
    .navbar__item.navbar__link.fastkafka-home-mobile {
    display: block;
    padding: 0px;
    margin-right: 11rem;
    border-right: none;
    margin-top: 7px;
  }
  .navbar__items.navbar__items--right
    .navbar__item.navbar__link.fastkafka-home-mobile
    img {
    width: 30px;
    height: auto;
  }
}

/** Mobile view */
@media screen and (max-width: 768px) {
  .navbar__items.navbar__items--right
    .navbar__item.navbar__link.fastkafka-home-mobile {
    margin-right: 3.5rem;
  }
  .navbar__items.navbar__items--right
    .navbar__item.navbar__link.fastkafka-home-mobile
    img {
    width: 33px;
  }
}


================================================
FILE: docusaurus/src/pages/demo/index.js
================================================
import React from 'react';
import clsx from 'clsx';
import Layout from '@theme/Layout';
import YouTube from 'react-youtube';

import styles from './styles.module.css';

const opts = {
  height: '720',
  width: '1280',
};

export default function Hello() {
  return (
    <Layout title="Demo" description="Demo">
      <section className={`hero hero--primary ${styles.containerWithMinHeight}`}>
      <div className="container">
        <div className="row">
          <div className="col col--12">
            <YouTube videoId="dQw4w9WgXcQ" opts={opts}/>
          </div>
        </div>
      </div>
    </section>
    </Layout>
  );
}

================================================
FILE: docusaurus/src/pages/demo/styles.module.css
================================================
.features {
  display: flex;
  align-items: center;
  padding: 2rem 0;
  width: 100%;
}

.header {
  font-size: 4rem;
  text-align: center;
}

.description {
  font-size: 1.2rem;
  margin-top: 1rem;
}

.containerWithMinHeight {
  min-height: 500px;
}

================================================
FILE: docusaurus/src/pages/index.js
================================================
import React from 'react';
import clsx from 'clsx';
import Link from '@docusaurus/Link';
import useDocusaurusContext from '@docusaurus/useDocusaurusContext';
import Layout from '@theme/Layout';
import HomepageFeatures from '@site/src/components/HomepageFeatures';
import HomepageWhatYouGet from '@site/src/components/HomepageWhatYouGet';
import HomepageCommunity from '@site/src/components/HomepageCommunity';
import HomepageFAQ from '@site/src/components/HomepageFAQ';
import RobotFooterIcon from '@site/src/components/RobotFooterIcon';

import styles from './index.module.css';

function HomepageHeader() {
  return (
    <header className={clsx('hero hero--primary', styles.heroBanner)}>
      <div className="container">
        <img className={styles.heroRobot} src="img/robot-hero.svg" />
        <p className={styles.description}>Open-source framework for building asynchronous web </p>
        <p className={styles.description}>services that interact with Kafka</p>
        <p className={styles.descriptionMobile}>Open-source framework for building asynchronous web services that interact with Kafka</p>
        <div className={styles.buttons}>
          <Link
            className={clsx("button button--lg", styles.heroButton)}
            to="/docs">
              Get Started
          </Link>
        </div>
      </div>
    </header>
  );
}

export default function Home() {
  const {siteConfig} = useDocusaurusContext();
  return (
    <Layout
      title={siteConfig.tagline}
      description={siteConfig.customFields.description}>
      <HomepageHeader />
      <main>
        <HomepageFeatures />
        <HomepageWhatYouGet />
        <HomepageCommunity />
        <HomepageFAQ />
        <RobotFooterIcon />
      </main>
    </Layout>
  );
}


================================================
FILE: docusaurus/src/pages/index.module.css
================================================
/**
 * CSS files with the .module.css suffix will be treated as CSS modules
 * and scoped locally.
 */

.heroBanner {
  padding: 4rem 0;
  text-align: center;
  position: relative;
  overflow: hidden;
  background: rgb(96, 190, 228);
  background: linear-gradient(
    180deg,
    rgba(96, 190, 228, 1) 0%,
    rgba(17, 115, 164, 1) 100%
  );
}
.heroRobot {
  width: 870px;
  margin-top: 1rem;
  margin-bottom: 2rem;
}

.buttons {
  display: flex;
  align-items: center;
  justify-content: center;
  margin-top: 55px;
}

.title {
  font-size: 3rem;
  margin-bottom: 60px;
}

.description {
  font-size: 1.5rem;
  line-height: 0.8rem;
  font-style: italic;
  color: #fff;
}
.heroButton {
  color: #fff;
  background: var(--ifm-navbar-background-color);
  border-radius: 25px;
  padding: 0.7rem 2.5rem 0.7rem 2.5rem;
  margin-top: -1.5rem;
  font-size: 1rem;
}
.heroButton:hover {
  background: #3e99c5;
}
.descriptionMobile {
  display: none;
}

/** Tablet view */
@media screen and (max-width: 1290px) {
}

/** Mobile view */
@media screen and (max-width: 996px) {
  .heroBanner {
    padding: 2rem;
  }
  .description {
    display: none;
  }
  .descriptionMobile {
    font-size: 1.3rem;
    font-style: italic;
    line-height: 1.8rem;
    margin-bottom: 0;
    display: block;
  }
}


================================================
FILE: docusaurus/src/utils/prismDark.mjs
================================================
/**
 * Copyright (c) Facebook, Inc. and its affiliates.
 *
 * This source code is licensed under the MIT license found in the
 * LICENSE file in the root directory of this source tree.
 */

import darkTheme from 'prism-react-renderer/themes/vsDark/index.cjs.js';

export default {
  plain: {
    color: '#D4D4D4',
    backgroundColor: '#212121',
  },
  styles: [
    ...darkTheme.styles,
    {
      types: ['title'],
      style: {
        color: '#569CD6',
        fontWeight: 'bold',
      },
    },
    {
      types: ['property', 'parameter'],
      style: {
        color: '#9CDCFE',
      },
    },
    {
      types: ['script'],
      style: {
        color: '#D4D4D4',
      },
    },
    {
      types: ['boolean', 'arrow', 'atrule', 'tag'],
      style: {
        color: '#569CD6',
      },
    },
    {
      types: ['number', 'color', 'unit'],
      style: {
        color: '#B5CEA8',
      },
    },
    {
      types: ['font-matter'],
      style: {
        color: '#CE9178',
      },
    },
    {
      types: ['keyword', 'rule'],
      style: {
        color: '#C586C0',
      },
    },
    {
      types: ['regex'],
      style: {
        color: '#D16969',
      },
    },
    {
      types: ['maybe-class-name'],
      style: {
        color: '#4EC9B0',
      },
    },
    {
      types: ['constant'],
      style: {
        color: '#4FC1FF',
      },
    },
  ],
};

================================================
FILE: docusaurus/src/utils/prismLight.mjs
================================================
/**
 * Copyright (c) Facebook, Inc. and its affiliates.
 *
 * This source code is licensed under the MIT license found in the
 * LICENSE file in the root directory of this source tree.
 */

import lightTheme from 'prism-react-renderer/themes/github/index.cjs.js';

export default {
  ...lightTheme,
  styles: [
    ...lightTheme.styles,
    {
      types: ['title'],
      style: {
        color: '#0550AE',
        fontWeight: 'bold',
      },
    },
    {
      types: ['parameter'],
      style: {
        color: '#953800',
      },
    },
    {
      types: ['boolean', 'rule', 'color', 'number', 'constant', 'property'],
      style: {
        color: '#005CC5',
      },
    },
    {
      types: ['atrule', 'tag'],
      style: {
        color: '#22863A',
      },
    },
    {
      types: ['script'],
      style: {
        color: '#24292E',
      },
    },
    {
      types: ['operator', 'unit', 'rule'],
      style: {
        color: '#D73A49',
      },
    },
    {
      types: ['font-matter', 'string', 'attr-value'],
      style: {
        color: '#C6105F',
      },
    },
    {
      types: ['class-name'],
      style: {
        color: '#116329',
      },
    },
    {
      types: ['attr-name'],
      style: {
        color: '#0550AE',
      },
    },
    {
      types: ['keyword'],
      style: {
        color: '#CF222E',
      },
    },
    {
      types: ['function'],
      style: {
        color: '#8250DF',
      },
    },
    {
      types: ['selector'],
      style: {
        color: '#6F42C1',
      },
    },
    {
      types: ['variable'],
      style: {
        color: '#E36209',
      },
    },
    {
      types: ['comment'],
      style: {
        color: '#6B6B6B',
      },
    },
    {
      types: ['builtin'],
      style: {
        color: '#005CC5',
      },
    },
  ],
};

================================================
FILE: docusaurus/static/.nojekyll
================================================


================================================
FILE: docusaurus/static/CNAME
================================================
fastkafka.airt.ai


================================================
FILE: docusaurus/versioned_docs/version-0.5.0/CHANGELOG.md
================================================
# Release notes

<!-- do not remove -->

## 0.5.0

### New Features

- Significant speedup of Kafka producer ([#236](https://github.com/airtai/fastkafka/pull/236)), thanks to [@Sternakt](https://github.com/Sternakt)
 

- Added support for AVRO encoding/decoding ([#231](https://github.com/airtai/fastkafka/pull/231)), thanks to [@kumaranvpl](https://github.com/kumaranvpl)


### Bugs Squashed

- Fixed sidebar to include guides in docusaurus documentation ([#238](https://github.com/airtai/fastkafka/pull/238)), thanks to [@Sternakt](https://github.com/Sternakt)

- Fixed link to symbols in docusaurus docs ([#227](https://github.com/airtai/fastkafka/pull/227)), thanks to [@harishmohanraj](https://github.com/harishmohanraj)

- Removed bootstrap servers from constructor ([#220](https://github.com/airtai/fastkafka/pull/220)), thanks to [@kumaranvpl](https://github.com/kumaranvpl)


## 0.4.0

### New Features

- Integrate fastkafka chat ([#208](https://github.com/airtai/fastkafka/pull/208)), thanks to [@harishmohanraj](https://github.com/harishmohanraj)

- Add benchmarking ([#206](https://github.com/airtai/fastkafka/pull/206)), thanks to [@kumaranvpl](https://github.com/kumaranvpl)

- Enable fast testing without running kafka locally ([#198](https://github.com/airtai/fastkafka/pull/198)), thanks to [@Sternakt](https://github.com/Sternakt)

- Generate docs using Docusaurus ([#194](https://github.com/airtai/fastkafka/pull/194)), thanks to [@harishmohanraj](https://github.com/harishmohanraj)

- Add test cases for LocalRedpandaBroker ([#189](https://github.com/airtai/fastkafka/pull/189)), thanks to [@kumaranvpl](https://github.com/kumaranvpl)

- Reimplement patch and delegates from fastcore ([#188](https://github.com/airtai/fastkafka/pull/188)), thanks to [@Sternakt](https://github.com/Sternakt)

- Rename existing functions into start and stop and add lifespan handler ([#117](https://github.com/airtai/fastkafka/issues/117))
  - https://www.linkedin.com/posts/tiangolo_fastapi-activity-7038907638331404288-Oar3/?utm_source=share&utm_medium=member_ios


## 0.3.1

-  README.md file updated


## 0.3.0

### New Features

- Guide for fastkafka produces using partition key ([#172](https://github.com/airtai/fastkafka/pull/172)), thanks to [@Sternakt](https://github.com/Sternakt)
  - Closes #161

- Add support for Redpanda for testing and deployment ([#181](https://github.com/airtai/fastkafka/pull/181)), thanks to [@kumaranvpl](https://github.com/kumaranvpl)

- Remove bootstrap_servers from `__init__` and use the name of broker as an option when running/testing ([#134](https://github.com/airtai/fastkafka/issues/134))

- Add a GH action file to check for broken links in the docs ([#163](https://github.com/airtai/fastkafka/issues/163))

- Optimize requirements for testing and docs ([#151](https://github.com/airtai/fastkafka/issues/151))

- Break requirements into base and optional for testing and dev ([#124](https://github.com/airtai/fastkafka/issues/124))
  - Minimize base requirements needed just for running the service.

- Add link to example git repo into guide for building docs using actions ([#81](https://github.com/airtai/fastkafka/issues/81))

- Add logging for run_in_background ([#46](https://github.com/airtai/fastkafka/issues/46))

- Implement partition Key mechanism for producers ([#16](https://github.com/airtai/fastkafka/issues/16))

### Bugs Squashed

- Implement checks for npm installation and version ([#176](https://github.com/airtai/fastkafka/pull/176)), thanks to [@Sternakt](https://github.com/Sternakt)
  - Closes #158 by checking if the npx is installed and more verbose error handling

- Fix the helper.py link in CHANGELOG.md ([#165](https://github.com/airtai/fastkafka/issues/165))

- fastkafka docs install_deps fails ([#157](https://github.com/airtai/fastkafka/issues/157))
  - Unexpected internal error: [Errno 2] No such file or directory: 'npx'

- Broken links in docs ([#141](https://github.com/airtai/fastkafka/issues/141))

- fastkafka run is not showing up in CLI docs ([#132](https://github.com/airtai/fastkafka/issues/132))


## 0.2.3

- Fixed broken links on PyPi index page


## 0.2.2

### New Features

- Extract JDK and Kafka installation out of LocalKafkaBroker ([#131](https://github.com/airtai/fastkafka/issues/131))

- PyYAML version relaxed ([#119](https://github.com/airtai/fastkafka/pull/119)), thanks to [@davorrunje](https://github.com/davorrunje)

- Replace docker based kafka with local ([#68](https://github.com/airtai/fastkafka/issues/68))
  - [x] replace docker compose with a simple docker run (standard run_jupyter.sh should do)
  - [x] replace all tests to use LocalKafkaBroker
  - [x] update documentation

### Bugs Squashed

- Fix broken link for FastKafka docs in index notebook ([#145](https://github.com/airtai/fastkafka/issues/145))

- Fix encoding issues when loading setup.py on windows OS ([#135](https://github.com/airtai/fastkafka/issues/135))


## 0.2.0

### New Features

- Replace kafka container with LocalKafkaBroker ([#112](https://github.com/airtai/fastkafka/issues/112))
  - [x] Replace kafka container with LocalKafkaBroker in tests
  - [x] Remove kafka container from tests environment
  - [x] Fix failing tests

### Bugs Squashed

- Fix random failing in CI ([#109](https://github.com/airtai/fastkafka/issues/109))


## 0.1.3

- version update in `__init__.py`


## 0.1.2

### New Features


- Git workflow action for publishing Kafka docs ([#78](https://github.com/airtai/fastkafka/issues/78))


### Bugs Squashed

- Include missing requirement ([#110](https://github.com/airtai/fastkafka/issues/110))
  - [x] Typer is imported in this [file](https://github.com/airtai/fastkafka/blob/main/fastkafka/_components/helpers.py) but it is not included in [settings.ini](https://github.com/airtai/fastkafka/blob/main/settings.ini)
  - [x] Add aiohttp which is imported in this [file](https://github.com/airtai/fastkafka/blob/main/fastkafka/_helpers.py)
  - [x] Add nbformat which is imported in _components/helpers.py
  - [x] Add nbconvert which is imported in _components/helpers.py


## 0.1.1


### Bugs Squashed

- JDK install fails on Python 3.8 ([#106](https://github.com/airtai/fastkafka/issues/106))



## 0.1.0

Initial release


================================================
FILE: docusaurus/versioned_docs/version-0.5.0/CNAME
================================================
fastkafka.airt.ai


================================================
FILE: docusaurus/versioned_docs/version-0.5.0/api/fastkafka/FastKafka.md
================================================
## `fastkafka.FastKafka` {#fastkafka.FastKafka}

### `__init__` {#init}

`def __init__(self, title: Optional[str] = None, description: Optional[str] = None, version: Optional[str] = None, contact: Optional[Dict[str, str]] = None, kafka_brokers: Dict[str, Any], root_path: Optional[Union[pathlib.Path, str]] = None, lifespan: Optional[Callable[[ForwardRef('FastKafka')], AbstractAsyncContextManager[NoneType]]] = None, loop=None, client_id=None, metadata_max_age_ms=300000, request_timeout_ms=40000, api_version='auto', acks=<sentinel>, key_serializer=None, value_serializer=None, compression_type=None, max_batch_size=16384, partitioner=<DefaultPartitioner>, max_request_size=1048576, linger_ms=0, send_backoff_ms=100, retry_backoff_ms=100, security_protocol='PLAINTEXT', ssl_context=None, connections_max_idle_ms=540000, enable_idempotence=False, transactional_id=None, transaction_timeout_ms=60000, sasl_mechanism='PLAIN', sasl_plain_password=None, sasl_plain_username=None, sasl_kerberos_service_name='kafka', sasl_kerberos_domain_name=None, sasl_oauth_token_provider=None, group_id=None, key_deserializer=None, value_deserializer=None, fetch_max_wait_ms=500, fetch_max_bytes=52428800, fetch_min_bytes=1, max_partition_fetch_bytes=1048576, auto_offset_reset='latest', enable_auto_commit=True, auto_commit_interval_ms=5000, check_crcs=True, partition_assignment_strategy=(<class 'kafka.coordinator.assignors.roundrobin.RoundRobinPartitionAssignor'>,), max_poll_interval_ms=300000, rebalance_timeout_ms=None, session_timeout_ms=10000, heartbeat_interval_ms=3000, consumer_timeout_ms=200, max_poll_records=None, exclude_internal_topics=True, isolation_level='read_uncommitted') -> None`

Creates FastKafka application

**Parameters**:
- `title`: optional title for the documentation. If None,
the title will be set to empty string
- `description`: optional description for the documentation. If
None, the description will be set to empty string
- `version`: optional version for the documentation. If None,
the version will be set to empty string
- `contact`: optional contact for the documentation. If None, the
contact will be set to placeholder values:
name='Author' url=HttpUrl('https://www.google.com') email='noreply@gmail.com'
- `kafka_brokers`: dictionary describing kafka brokers used for
generating documentation
- `root_path`: path to where documentation will be created
- `lifespan`: asynccontextmanager that is used for setting lifespan hooks.
__aenter__ is called before app start and __aexit__ after app stop.
The lifespan is called when the application is started as an async context
manager, e.g.: `async with kafka_app...`
- `client_id`: a name for this client. This string is passed in
each request to servers and can be used to identify specific
server-side log entries that correspond to this client.
Default: ``aiokafka-producer-#`` (appended with a unique number
per instance)
- `key_serializer`: used to convert user-supplied keys to bytes.
If not :data:`None`, called as ``f(key)``, should return
:class:`bytes`.
Default: :data:`None`.
- `value_serializer`: used to convert user-supplied message
values to :class:`bytes`. If not :data:`None`, called as
``f(value)``, should return :class:`bytes`.
Default: :data:`None`.
- `acks`: one of ``0``, ``1``, ``all``. The number of acknowledgments
the producer requires the leader to have received before considering a
request complete. This controls the durability of records that are
sent. The following settings are common:

* ``0``: Producer will not wait for any acknowledgment from the server
  at all. The message will immediately be added to the socket
  buffer and considered sent. No guarantee can be made that the
  server has received the record in this case, and the retries
  configuration will not take effect (as the client won't
  generally know of any failures). The offset given back for each
  record will always be set to -1.
* ``1``: The broker leader will write the record to its local log but
  will respond without awaiting full acknowledgement from all
  followers. In this case should the leader fail immediately
  after acknowledging the record but before the followers have
  replicated it then the record will be lost.
* ``all``: The broker leader will wait for the full set of in-sync
  replicas to acknowledge the record. This guarantees that the
  record will not be lost as long as at least one in-sync replica
  remains alive. This is the strongest available guarantee.

If unset, defaults to ``acks=1``. If `enable_idempotence` is
:data:`True`, defaults to ``acks=all``
- `compression_type`: The compression type for all data generated by
the producer. Valid values are ``gzip``, ``snappy``, ``lz4``, ``zstd``
or :data:`None`.
Compression is of full batches of data, so the efficacy of batching
will also impact the compression ratio (more batching means better
compression). Default: :data:`None`.
- `max_batch_size`: Maximum size of buffered data per partition.
After this amount :meth:`send` coroutine will block until batch is
drained.
Default: 16384
- `linger_ms`: The producer groups together any records that arrive
in between request transmissions into a single batched request.
Normally this occurs only under load when records arrive faster
than they can be sent out. However in some circumstances the client
may want to reduce the number of requests even under moderate load.
This setting accomplishes this by adding a small amount of
artificial delay; that is, if the first request is processed faster
than `linger_ms`, the producer will wait ``linger_ms - process_time``.
Default: 0 (i.e. no delay).
- `partitioner`: Callable used to determine which partition
each message is assigned to. Called (after key serialization):
``partitioner(key_bytes, all_partitions, available_partitions)``.
The default partitioner implementation hashes each non-None key
using the same murmur2 algorithm as the Java client so that
messages with the same key are assigned to the same partition.
When a key is :data:`None`, the message is delivered to a random partition
(filtered to partitions with available leaders only, if possible).
- `max_request_size`: The maximum size of a request. This is also
effectively a cap on the maximum record size. Note that the server
has its own cap on record size which may be different from this.
This setting will limit the number of record batches the producer
will send in a single request to avoid sending huge requests.
Default: 1048576.
- `metadata_max_age_ms`: The period of time in milliseconds after
which we force a refresh of metadata even if we haven't seen any
partition leadership changes to proactively discover any new
brokers or partitions. Default: 300000
- `request_timeout_ms`: Produce request timeout in milliseconds.
As it's sent as part of
:class:`~kafka.protocol.produce.ProduceRequest` (it's a blocking
call), maximum waiting time can be up to ``2 *
request_timeout_ms``.
Default: 40000.
- `retry_backoff_ms`: Milliseconds to backoff when retrying on
errors. Default: 100.
- `api_version`: specify which kafka API version to use.
If set to ``auto``, will attempt to infer the broker version by
probing various APIs. Default: ``auto``
- `security_protocol`: Protocol used to communicate with brokers.
Valid values are: ``PLAINTEXT``, ``SSL``. Default: ``PLAINTEXT``.
- `ssl_context`: pre-configured :class:`~ssl.SSLContext`
for wrapping socket connections. Directly passed into asyncio's
:meth:`~asyncio.loop.create_connection`. For more
information see :ref:`ssl_auth`.
Default: :data:`None`
- `connections_max_idle_ms`: Close idle connections after the number
of milliseconds specified by this config. Specifying :data:`None` will
disable idle checks. Default: 540000 (9 minutes).
- `enable_idempotence`: When set to :data:`True`, the producer will
ensure that exactly one copy of each message is written in the
stream. If :data:`False`, producer retries due to broker failures,
etc., may write duplicates of the retried message in the stream.
Note that enabling idempotence requires ``acks`` to be set to
``all``. If it is not explicitly set by the user it will be chosen
automatically. If incompatible values are set, a :exc:`ValueError` will be thrown.
New in version 0.5.0.
- `sasl_mechanism`: Authentication mechanism when security_protocol
is configured for ``SASL_PLAINTEXT`` or ``SASL_SSL``. Valid values
are: ``PLAIN``, ``GSSAPI``, ``SCRAM-SHA-256``, ``SCRAM-SHA-512``,
``OAUTHBEARER``.
Default: ``PLAIN``
- `sasl_plain_username`: username for SASL ``PLAIN`` authentication.
Default: :data:`None`
- `sasl_plain_password`: password for SASL ``PLAIN`` authentication.
Default: :data:`None`
- `sasl_oauth_token_provider`: OAuthBearer token provider instance
(:class:`~aiokafka.abc.AbstractTokenProvider`). See
:mod:`kafka.oauth.abstract`.
Default: :data:`None`
- `*topics`: optional list of topics to subscribe to. If not set,
call :meth:`.subscribe` or :meth:`.assign` before consuming records.
Passing topics directly is same as calling :meth:`.subscribe` API.
- `group_id`: name of the consumer group to join for dynamic
partition assignment (if enabled), and to use for fetching and
committing offsets. If None, auto-partition assignment (via
group coordinator) and offset commits are disabled.
Default: None
- `key_deserializer`: Any callable that takes a
raw message key and returns a deserialized key.
- `value_deserializer`: Any callable that takes a
raw message value and returns a deserialized value.
- `fetch_min_bytes`: Minimum amount of data the server should
return for a fetch request, otherwise wait up to
`fetch_max_wait_ms` for more data to accumulate. Default: 1.
- `fetch_max_bytes`: The maximum amount of data the server should
return for a fetch request. This is not an absolute maximum, if
the first message in the first non-empty partition of the fetch
is larger than this value, the message will still be returned
to ensure that the consumer can make progress. NOTE: consumer
performs fetches to multiple brokers in parallel so memory
usage will depend on the number of brokers containing
partitions for the topic.
Supported Kafka version >= 0.10.1.0. Default: 52428800 (50 Mb).
- `fetch_max_wait_ms`: The maximum amount of time in milliseconds
the server will block before answering the fetch request if
there isn't sufficient data to immediately satisfy the
requirement given by fetch_min_bytes. Default: 500.
- `max_partition_fetch_bytes`: The maximum amount of data
per-partition the server will return. The maximum total memory
used for a request ``= #partitions * max_partition_fetch_bytes``.
This size must be at least as large as the maximum message size
the server allows or else it is possible for the producer to
send messages larger than the consumer can fetch. If that
happens, the consumer can get stuck trying to fetch a large
message on a certain partition. Default: 1048576.
- `max_poll_records`: The maximum number of records returned in a
single call to :meth:`.getmany`. Defaults to ``None``, no limit.
- `auto_offset_reset`: A policy for resetting offsets on
:exc:`.OffsetOutOfRangeError` errors: ``earliest`` will move to the oldest
available message, ``latest`` will move to the most recent, and
``none`` will raise an exception so you can handle this case.
Default: ``latest``.
- `enable_auto_commit`: If true the consumer's offset will be
periodically committed in the background. Default: True.
- `auto_commit_interval_ms`: milliseconds between automatic
offset commits, if enable_auto_commit is True. Default: 5000.
- `check_crcs`: Automatically check the CRC32 of the records
consumed. This ensures no on-the-wire or on-disk corruption to
the messages occurred. This check adds some overhead, so it may
be disabled in cases seeking extreme performance. Default: True
- `partition_assignment_strategy`: List of objects to use to
distribute partition ownership amongst consumer instances when
group management is used. This preference is implicit in the order
of the strategies in the list. When assignment strategy changes:
to support a change to the assignment strategy, new versions must
enable support both for the old assignment strategy and the new
one. The coordinator will choose the old assignment strategy until
all members have been updated. Then it will choose the new
strategy. Default: [:class:`.RoundRobinPartitionAssignor`]
- `max_poll_interval_ms`: Maximum allowed time between calls to
consume messages (e.g., :meth:`.getmany`). If this interval
is exceeded the consumer is considered failed and the group will
rebalance in order to reassign the partitions to another consumer
group member. If API methods block waiting for messages, that time
does not count against this timeout. See `KIP-62`_ for more
information. Default 300000
- `rebalance_timeout_ms`: The maximum time server will wait for this
consumer to rejoin the group in a case of rebalance. In Java client
this behaviour is bound to `max.poll.interval.ms` configuration,
but as ``aiokafka`` will rejoin the group in the background, we
decouple this setting to allow finer tuning by users that use
:class:`.ConsumerRebalanceListener` to delay rebalancing. Defaults
to ``session_timeout_ms``
- `session_timeout_ms`: Client group session and failure detection
timeout. The consumer sends periodic heartbeats
(`heartbeat.interval.ms`) to indicate its liveness to the broker.
If no heartbeats are received by the broker for a group member within
the session timeout, the broker will remove the consumer from the
group and trigger a rebalance. The allowed range is configured with
the **broker** configuration properties
`group.min.session.timeout.ms` and `group.max.session.timeout.ms`.
Default: 10000
- `heartbeat_interval_ms`: The expected time in milliseconds
between heartbeats to the consumer coordinator when using
Kafka's group management feature. Heartbeats are used to ensure
that the consumer's session stays active and to facilitate
rebalancing when new consumers join or leave the group. The
value must be set lower than `session_timeout_ms`, but typically
should be set no higher than 1/3 of that value. It can be
adjusted even lower to control the expected time for normal
rebalances. Default: 3000
- `consumer_timeout_ms`: maximum wait timeout for background fetching
routine. Mostly defines how fast the system will see rebalance and
request new data for new partitions. Default: 200
- `exclude_internal_topics`: Whether records from internal topics
(such as offsets) should be exposed to the consumer. If set to True
the only way to receive records from an internal topic is
subscribing to it. Requires 0.10+ Default: True
- `isolation_level`: Controls how to read messages written
transactionally.

If set to ``read_committed``, :meth:`.getmany` will only return
transactional messages which have been committed.
If set to ``read_uncommitted`` (the default), :meth:`.getmany` will
return all messages, even transactional messages which have been
aborted.

Non-transactional messages will be returned unconditionally in
either mode.

Messages will always be returned in offset order. Hence, in
`read_committed` mode, :meth:`.getmany` will only return
messages up to the last stable offset (LSO), which is the one less
than the offset of the first open transaction. In particular any
messages appearing after messages belonging to ongoing transactions
will be withheld until the relevant transaction has been completed.
As a result, `read_committed` consumers will not be able to read up
to the high watermark when there are in flight transactions.
Further, when in `read_committed` the seek_to_end method will
return the LSO. See method docs below. Default: ``read_uncommitted``

### `benchmark` {#benchmark}

`def benchmark(self: fastkafka.FastKafka, interval: Union[int, datetime.timedelta] = 1, sliding_window_size: Optional[int] = None) -> typing.Callable[[typing.Callable[[~I], typing.Union[~O, NoneType]]], typing.Callable[[~I], typing.Union[~O, NoneType]]]`

Decorator to benchmark produces/consumes functions

**Parameters**:
- `interval`: Period to use to calculate throughput. If value is of type int,
then it will be used as seconds. If value is of type timedelta,
then it will be used as it is. default: 1 - one second
- `sliding_window_size`: The size of the sliding window to use to calculate
average throughput. default: None - By default average throughput is
not calculated

### `consumes` {#consumes}

`def consumes(self: fastkafka.FastKafka, topic: Optional[str] = None, decoder: Union[str, Callable[[bytes, pydantic.main.ModelMetaclass], Any]] = 'json', prefix: str = 'on_', loop=None, bootstrap_servers='localhost', client_id='aiokafka-0.8.0', group_id=None, key_deserializer=None, value_deserializer=None, fetch_max_wait_ms=500, fetch_max_bytes=52428800, fetch_min_bytes=1, max_partition_fetch_bytes=1048576, request_timeout_ms=40000, retry_backoff_ms=100, auto_offset_reset='latest', enable_auto_commit=True, auto_commit_interval_ms=5000, check_crcs=True, metadata_max_age_ms=300000, partition_assignment_strategy=(<class 'kafka.coordinator.assignors.roundrobin.RoundRobinPartitionAssignor'>,), max_poll_interval_ms=300000, rebalance_timeout_ms=None, session_timeout_ms=10000, heartbeat_interval_ms=3000, consumer_timeout_ms=200, max_poll_records=None, ssl_context=None, security_protocol='PLAINTEXT', api_version='auto', exclude_internal_topics=True, connections_max_idle_ms=540000, isolation_level='read_uncommitted', sasl_mechanism='PLAIN', sasl_plain_password=None, sasl_plain_username=None, sasl_kerberos_service_name='kafka', sasl_kerberos_domain_name=None, sasl_oauth_token_provider=None) -> typing.Callable[[typing.Callable[[pydantic.main.BaseModel], typing.Union[NoneType, typing.Awaitable[NoneType]]]], typing.Callable[[pydantic.main.BaseModel], typing.Union[NoneType, typing.Awaitable[NoneType]]]]`

Decorator registering the callback called when a message is received in a topic.

This function decorator is also responsible for registering topics for AsyncAPI specification and documentation.

**Parameters**:
- `topic`: Kafka topic that the consumer will subscribe to and execute the
decorated function when it receives a message from the topic,
default: None. If the topic is not specified, the topic name will be
inferred from the decorated function name by stripping the defined prefix
- `decoder`: Decoder to use to decode messages consumed from the topic,
default: json - By default, it uses the json decoder to decode
bytes to a json string and then creates an instance of the pydantic
BaseModel. It also accepts a custom decoder function.
- `prefix`: Prefix stripped from the decorated function name to define a topic name
if the topic argument is not passed, default: "on_". If the decorated
function name is not prefixed with the defined prefix and the topic argument
is not passed, then this method will throw a ValueError
- `*topics`: optional list of topics to subscribe to. If not set,
call :meth:`.subscribe` or :meth:`.assign` before consuming records.
Passing topics directly is same as calling :meth:`.subscribe` API.
- `bootstrap_servers`: a ``host[:port]`` string (or list of
``host[:port]`` strings) that the consumer should contact to bootstrap
initial cluster metadata.

This does not have to be the full node list.
It just needs to have at least one broker that will respond to a
Metadata API Request. Default port is 9092. If no servers are
specified, will default to ``localhost:9092``.
- `client_id`: a name for this client. This string is passed in
each request to servers and can be used to identify specific
server-side log entries that correspond to this client. Also
submitted to :class:`~.consumer.group_coordinator.GroupCoordinator`
for logging with respect to consumer group administration. Default:
``aiokafka-{version}``
- `group_id`: name of the consumer group to join for dynamic
partition assignment (if enabled), and to use for fetching and
committing offsets. If None, auto-partition assignment (via
group coordinator) and offset commits are disabled.
Default: None
- `key_deserializer`: Any callable that takes a
raw message key and returns a deserialized key.
- `value_deserializer`: Any callable that takes a
raw message value and returns a deserialized value.
- `fetch_min_bytes`: Minimum amount of data the server should
return for a fetch request, otherwise wait up to
`fetch_max_wait_ms` for more data to accumulate. Default: 1.
- `fetch_max_bytes`: The maximum amount of data the server should
return for a fetch request. This is not an absolute maximum, if
the first message in the first non-empty partition of the fetch
is larger than this value, the message will still be returned
to ensure that the consumer can make progress. NOTE: consumer
performs fetches to multiple brokers in parallel so memory
usage will depend on the number of brokers containing
partitions for the topic.
Supported Kafka version >= 0.10.1.0. Default: 52428800 (50 Mb).
- `fetch_max_wait_ms`: The maximum amount of time in milliseconds
the server will block before answering the fetch request if
there isn't sufficient data to immediately satisfy the
requirement given by fetch_min_bytes. Default: 500.
- `max_partition_fetch_bytes`: The maximum amount of data
per-partition the server will return. The maximum total memory
used for a request ``= #partitions * max_partition_fetch_bytes``.
This size must be at least as large as the maximum message size
the server allows or else it is possible for the producer to
send messages larger than the consumer can fetch. If that
happens, the consumer can get stuck trying to fetch a large
message on a certain partition. Default: 1048576.
- `max_poll_records`: The maximum number of records returned in a
single call to :meth:`.getmany`. Defaults to ``None``, no limit.
- `request_timeout_ms`: Client request timeout in milliseconds.
Default: 40000.
- `retry_backoff_ms`: Milliseconds to backoff when retrying on
errors. Default: 100.
- `auto_offset_reset`: A policy for resetting offsets on
:exc:`.OffsetOutOfRangeError` errors: ``earliest`` will move to the oldest
available message, ``latest`` will move to the most recent, and
``none`` will raise an exception so you can handle this case.
Default: ``latest``.
- `enable_auto_commit`: If true the consumer's offset will be
periodically committed in the background. Default: True.
- `auto_commit_interval_ms`: milliseconds between automatic
offset commits, if enable_auto_commit is True. Default: 5000.
- `check_crcs`: Automatically check the CRC32 of the records
consumed. This ensures no on-the-wire or on-disk corruption to
the messages occurred. This check adds some overhead, so it may
be disabled in cases seeking extreme performance. Default: True
- `metadata_max_age_ms`: The period of time in milliseconds after
which we force a refresh of metadata even if we haven't seen any
partition leadership changes to proactively discover any new
brokers or partitions. Default: 300000
- `partition_assignment_strategy`: List of objects to use to
distribute partition ownership amongst consumer instances when
group management is used. This preference is implicit in the order
of the strategies in the list. To support a change to the
assignment strategy, new versions must support both the old
assignment strategy and the new one. The coordinator will choose
the old assignment strategy until all members have been updated,
then it will choose the new strategy.
Default: [:class:`.RoundRobinPartitionAssignor`]
- `max_poll_interval_ms`: Maximum allowed time between calls to
consume messages (e.g., :meth:`.getmany`). If this interval
is exceeded the consumer is considered failed and the group will
rebalance in order to reassign the partitions to another consumer
group member. If API methods block waiting for messages, that time
does not count against this timeout. See `KIP-62`_ for more
information. Default 300000
- `rebalance_timeout_ms`: The maximum time the server will wait for this
consumer to rejoin the group in case of a rebalance. In the Java client
this behaviour is bound to the `max.poll.interval.ms` configuration,
but as ``aiokafka`` will rejoin the group in the background, we
decouple this setting to allow finer tuning by users that use
:class:`.ConsumerRebalanceListener` to delay rebalancing. Defaults
to ``session_timeout_ms``
- `session_timeout_ms`: Client group session and failure detection
timeout. The consumer sends periodic heartbeats
(`heartbeat.interval.ms`) to indicate its liveness to the broker.
If no heartbeats are received by the broker for a group member within
the session timeout, the broker will remove the consumer from the
group and trigger a rebalance. The allowed range is configured with
the **broker** configuration properties
`group.min.session.timeout.ms` and `group.max.session.timeout.ms`.
Default: 10000
- `heartbeat_interval_ms`: The expected time in milliseconds
between heartbeats to the consumer coordinator when using
Kafka's group management feature. Heartbeats are used to ensure
that the consumer's session stays active and to facilitate
rebalancing when new consumers join or leave the group. The
value must be set lower than `session_timeout_ms`, but typically
should be set no higher than 1/3 of that value. It can be
adjusted even lower to control the expected time for normal
rebalances. Default: 3000
- `consumer_timeout_ms`: maximum wait timeout for background fetching
routine. Mostly defines how fast the system will see rebalance and
request new data for new partitions. Default: 200
- `api_version`: specify which kafka API version to use.
:class:`AIOKafkaConsumer` supports Kafka API versions >=0.9 only.
If set to ``auto``, will attempt to infer the broker version by
probing various APIs. Default: ``auto``
- `security_protocol`: Protocol used to communicate with brokers.
Valid values are: ``PLAINTEXT``, ``SSL``. Default: ``PLAINTEXT``.
- `ssl_context`: pre-configured :class:`~ssl.SSLContext`
for wrapping socket connections. Directly passed into asyncio's
:meth:`~asyncio.loop.create_connection`. For more information see
:ref:`ssl_auth`. Default: None.
- `exclude_internal_topics`: Whether records from internal topics
(such as offsets) should be exposed to the consumer. If set to True,
the only way to receive records from an internal topic is by
subscribing to it. Requires 0.10+. Default: True
- `connections_max_idle_ms`: Close idle connections after the number
of milliseconds specified by this config. Specifying `None` will
disable idle checks. Default: 540000 (9 minutes).
- `isolation_level`: Controls how to read messages written
transactionally.

If set to ``read_committed``, :meth:`.getmany` will only return
transactional messages which have been committed.
If set to ``read_uncommitted`` (the default), :meth:`.getmany` will
return all messages, even transactional messages which have been
aborted.

Non-transactional messages will be returned unconditionally in
either mode.

Messages will always be returned in offset order. Hence, in
`read_committed` mode, :meth:`.getmany` will only return
messages up to the last stable offset (LSO), which is the one less
than the offset of the first open transaction. In particular any
messages appearing after messages belonging to ongoing transactions
will be withheld until the relevant transaction has been completed.
As a result, `read_committed` consumers will not be able to read up
to the high watermark when there are in flight transactions.
Further, when in `read_committed` the seek_to_end method will
return the LSO. See method docs below. Default: ``read_uncommitted``
- `sasl_mechanism`: Authentication mechanism when security_protocol
is configured for ``SASL_PLAINTEXT`` or ``SASL_SSL``. Valid values are:
``PLAIN``, ``GSSAPI``, ``SCRAM-SHA-256``, ``SCRAM-SHA-512``,
``OAUTHBEARER``.
Default: ``PLAIN``
- `sasl_plain_username`: username for SASL ``PLAIN`` authentication.
Default: None
- `sasl_plain_password`: password for SASL ``PLAIN`` authentication.
Default: None
- `sasl_oauth_token_provider`: OAuthBearer token provider instance. (See :mod:`kafka.oauth.abstract`).
Default: None

**Returns**:
- A function returning the same function

### `create_mocks` {#create_mocks}

`def create_mocks(self: fastkafka.FastKafka) -> None`

Creates self.mocks, a named tuple that, for each decorated function, pairs a new function wrapping the original one with a mock so calls can be inspected in tests

### `produces` {#produces}

`def produces(self: fastkafka.FastKafka, topic: Optional[str] = None, encoder: Union[str, Callable[[pydantic.main.BaseModel], bytes]] = 'json', prefix: str = 'to_', loop=None, bootstrap_servers='localhost', client_id=None, metadata_max_age_ms=300000, request_timeout_ms=40000, api_version='auto', acks=<object object at 0x101ca6040>, key_serializer=None, value_serializer=None, compression_type=None, max_batch_size=16384, partitioner=<kafka.partitioner.default.DefaultPartitioner object at 0x101c80310>, max_request_size=1048576, linger_ms=0, send_backoff_ms=100, retry_backoff_ms=100, security_protocol='PLAINTEXT', ssl_context=None, connections_max_idle_ms=540000, enable_idempotence=False, transactional_id=None, transaction_timeout_ms=60000, sasl_mechanism='PLAIN', sasl_plain_password=None, sasl_plain_username=None, sasl_kerberos_service_name='kafka', sasl_kerberos_domain_name=None, sasl_oauth_token_provider=None) -> typing.Callable[[typing.Union[typing.Callable[..., typing.Union[pydantic.main.BaseModel, fastkafka.KafkaEvent[pydantic.main.BaseModel]]], typing.Callable[..., typing.Awaitable[typing.Union[pydantic.main.BaseModel, fastkafka.KafkaEvent[pydantic.main.BaseModel]]]]]], typing.Union[typing.Callable[..., typing.Union[pydantic.main.BaseModel, fastkafka.KafkaEvent[pydantic.main.BaseModel]]], typing.Callable[..., typing.Awaitable[typing.Union[pydantic.main.BaseModel, fastkafka.KafkaEvent[pydantic.main.BaseModel]]]]]]`

Decorator registering the callback called when a delivery report for a produced message is received

This function decorator is also responsible for registering topics for AsyncAPI specification and documentation.

**Parameters**:
- `topic`: Kafka topic that the producer will send the values returned from
the decorated function to, default: None - If the topic is not
specified, the topic name will be inferred from the decorated function
name by stripping the defined prefix.
- `encoder`: Encoder to use to encode messages before sending them to the topic,
default: json - By default, the json encoder converts a
pydantic basemodel to a json string and then encodes the string to bytes
using 'utf-8' encoding. It also accepts a custom encoder function.
- `prefix`: Prefix stripped from the decorated function name to define a topic
name if the topic argument is not passed, default: "to_". If the
decorated function name is not prefixed with the defined prefix
and the topic argument is not passed, then this method will throw ValueError
- `bootstrap_servers`: a ``host[:port]`` string or list of
``host[:port]`` strings that the producer should contact to
bootstrap initial cluster metadata. This does not have to be the
full node list.  It just needs to have at least one broker that will
respond to a Metadata API Request. Default port is 9092. If no
servers are specified, will default to ``localhost:9092``.
- `client_id`: a name for this client. This string is passed in
each request to servers and can be used to identify specific
server-side log entries that correspond to this client.
Default: ``aiokafka-producer-#`` (appended with a unique number
per instance)
- `key_serializer`: used to convert user-supplied keys to bytes
If not :data:`None`, called as ``f(key),`` should return
:class:`bytes`.
Default: :data:`None`.
- `value_serializer`: used to convert user-supplied message
values to :class:`bytes`. If not :data:`None`, called as
``f(value)``, should return :class:`bytes`.
Default: :data:`None`.
- `acks`: one of ``0``, ``1``, ``all``. The number of acknowledgments
the producer requires the leader to have received before considering a
request complete. This controls the durability of records that are
sent. The following settings are common:

* ``0``: Producer will not wait for any acknowledgment from the server
  at all. The message will immediately be added to the socket
  buffer and considered sent. No guarantee can be made that the
  server has received the record in this case, and the retries
  configuration will not take effect (as the client won't
  generally know of any failures). The offset given back for each
  record will always be set to -1.
* ``1``: The broker leader will write the record to its local log but
  will respond without awaiting full acknowledgement from all
  followers. In this case should the leader fail immediately
  after acknowledging the record but before the followers have
  replicated it then the record will be lost.
* ``all``: The broker leader will wait for the full set of in-sync
  replicas to acknowledge the record. This guarantees that the
  record will not be lost as long as at least one in-sync replica
  remains alive. This is the strongest available guarantee.

If unset, defaults to ``acks=1``. If `enable_idempotence` is
:data:`True` defaults to ``acks=all``
- `compression_type`: The compression type for all data generated by
the producer. Valid values are ``gzip``, ``snappy``, ``lz4``, ``zstd``
or :data:`None`.
Compression is of full batches of data, so the efficacy of batching
will also impact the compression ratio (more batching means better
compression). Default: :data:`None`.
- `max_batch_size`: Maximum size of buffered data per partition.
After this amount :meth:`send` coroutine will block until batch is
drained.
Default: 16384
- `linger_ms`: The producer groups together any records that arrive
in between request transmissions into a single batched request.
Normally this occurs only under load when records arrive faster
than they can be sent out. However in some circumstances the client
may want to reduce the number of requests even under moderate load.
This setting accomplishes this by adding a small amount of
artificial delay; that is, if the first request is processed faster
than `linger_ms`, the producer will wait ``linger_ms - process_time``.
Default: 0 (i.e. no delay).
- `partitioner`: Callable used to determine which partition
each message is assigned to. Called (after key serialization):
``partitioner(key_bytes, all_partitions, available_partitions)``.
The default partitioner implementation hashes each non-None key
using the same murmur2 algorithm as the Java client so that
messages with the same key are assigned to the same partition.
When a key is :data:`None`, the message is delivered to a random partition
(filtered to partitions with available leaders only, if possible).
- `max_request_size`: The maximum size of a request. This is also
effectively a cap on the maximum record size. Note that the server
has its own cap on record size which may be different from this.
This setting will limit the number of record batches the producer
will send in a single request to avoid sending huge requests.
Default: 1048576.
- `metadata_max_age_ms`: The period of time in milliseconds after
which we force a refresh of metadata even if we haven't seen any
partition leadership changes to proactively discover any new
brokers or partitions. Default: 300000
- `request_timeout_ms`: Produce request timeout in milliseconds.
As it's sent as part of
:class:`~kafka.protocol.produce.ProduceRequest` (it's a blocking
call), maximum waiting time can be up to ``2 *
request_timeout_ms``.
Default: 40000.
- `retry_backoff_ms`: Milliseconds to backoff when retrying on
errors. Default: 100.
- `api_version`: specify which kafka API version to use.
If set to ``auto``, will attempt to infer the broker version by
probing various APIs. Default: ``auto``
- `security_protocol`: Protocol used to communicate with brokers.
Valid values are: ``PLAINTEXT``, ``SSL``. Default: ``PLAINTEXT``.
- `ssl_context`: pre-configured :class:`~ssl.SSLContext`
for wrapping socket connections. Directly passed into asyncio's
:meth:`~asyncio.loop.create_connection`. For more
information see :ref:`ssl_auth`.
Default: :data:`None`
- `connections_max_idle_ms`: Close idle connections after the number
of milliseconds specified by this config. Specifying :data:`None` will
disable idle checks. Default: 540000 (9 minutes).
- `enable_idempotence`: When set to :data:`True`, the producer will
ensure that exactly one copy of each message is written in the
stream. If :data:`False`, producer retries due to broker failures,
etc., may write duplicates of the retried message in the stream.
Note that enabling idempotence requires ``acks`` to be set to
``all``. If it is not explicitly set by the user, it will be chosen
automatically. If incompatible values are set, a :exc:`ValueError`
will be thrown.
New in version 0.5.0.
- `sasl_mechanism`: Authentication mechanism when security_protocol
is configured for ``SASL_PLAINTEXT`` or ``SASL_SSL``. Valid values
are: ``PLAIN``, ``GSSAPI``, ``SCRAM-SHA-256``, ``SCRAM-SHA-512``,
``OAUTHBEARER``.
Default: ``PLAIN``
- `sasl_plain_username`: username for SASL ``PLAIN`` authentication.
Default: :data:`None`
- `sasl_plain_password`: password for SASL ``PLAIN`` authentication.
Default: :data:`None`
- `sasl_oauth_token_provider`: OAuthBearer token provider instance
(:class:`~aiokafka.abc.AbstractTokenProvider`; see
:mod:`kafka.oauth.abstract`).
Default: :data:`None`

**Returns**:
- A function returning the same function

**Exceptions**:
- `ValueError`: when the decorated function name is not prefixed with the defined prefix and no topic argument is passed

### `run_in_background` {#run_in_background}

`def run_in_background(self: fastkafka.FastKafka) -> typing.Callable[[typing.Callable[..., typing.Coroutine[typing.Any, typing.Any, typing.Any]]], typing.Callable[..., typing.Coroutine[typing.Any, typing.Any, typing.Any]]]`

Decorator to schedule a task to be run in the background.

This decorator is used to schedule a task to be run in the background when the app's `_on_startup` event is triggered.

**Returns**:
- A decorator function that takes a background task as an input and stores it to be run in the background.



================================================
FILE: docusaurus/versioned_docs/version-0.5.0/api/fastkafka/KafkaEvent.md
================================================
## `fastkafka.KafkaEvent` {#fastkafka.KafkaEvent}


A generic class for representing Kafka events. Based on BaseSubmodel, bound to pydantic.BaseModel

**Parameters**:
- `message`: The message contained in the Kafka event, can be of type pydantic.BaseModel.
- `key`: The optional key used to identify the Kafka event.



================================================
FILE: docusaurus/versioned_docs/version-0.5.0/api/fastkafka/encoder/avsc_to_pydantic.md
================================================
## `fastkafka.encoder.avsc_to_pydantic` {#fastkafka.encoder.avsc_to_pydantic}

### `avsc_to_pydantic` {#avsc_to_pydantic}

`def avsc_to_pydantic(schema: Dict[str, Any]) -> ModelMetaclass`

Generate pydantic model from given Avro Schema

**Parameters**:
- `schema`: Avro schema in dictionary format

**Returns**:
- Pydantic model class built from given avro schema



================================================
FILE: docusaurus/versioned_docs/version-0.5.0/api/fastkafka/testing/ApacheKafkaBroker.md
================================================
## `fastkafka.testing.ApacheKafkaBroker` {#fastkafka.testing.ApacheKafkaBroker}


ApacheKafkaBroker class, used for running unique Kafka brokers in tests to prevent topic clashing.

### `__init__` {#init}

`def __init__(self, topics: Iterable[str] = [], retries: int = 3, apply_nest_asyncio: bool = False, zookeeper_port: int = 2181, listener_port: int = 9092) -> None`

Initialises the ApacheKafkaBroker object

**Parameters**:
- `data_dir`: Path to the directory where the ZooKeeper instance will save data
- `zookeeper_port`: Port for clients (Kafka brokers) to connect to
- `listener_port`: Port on which the clients (producers and consumers) can connect

### `start` {#start}

`def start(self: fastkafka.testing.ApacheKafkaBroker) -> str`

Starts a local Kafka broker and ZooKeeper instance synchronously

**Returns**:
- Kafka broker bootstrap server address in string format: addr:port

### `stop` {#stop}

`def stop(self: fastkafka.testing.ApacheKafkaBroker) -> None`

Stops a local Kafka broker and ZooKeeper instance synchronously

**Returns**:
- None



================================================
FILE: docusaurus/versioned_docs/version-0.5.0/api/fastkafka/testing/LocalRedpandaBroker.md
================================================
## `fastkafka.testing.LocalRedpandaBroker` {#fastkafka.testing.LocalRedpandaBroker}


LocalRedpandaBroker class, used for running unique redpanda brokers in tests to prevent topic clashing.

### `__init__` {#init}

`def __init__(self, topics: Iterable[str] = [], retries: int = 3, apply_nest_asyncio: bool = False, listener_port: int = 9092, tag: str = 'v23.1.2', seastar_core: int = 1, memory: str = '1G', mode: str = 'dev-container', default_log_level: str = 'debug', **kwargs: Dict[str, Any]) -> None`

Initialises the LocalRedpandaBroker object

**Parameters**:
- `listener_port`: Port on which the clients (producers and consumers) can connect
- `tag`: Tag of Redpanda image to use to start container
- `seastar_core`: Core(s) to use by Seastar (the framework Redpanda uses under the hood)
- `memory`: The amount of memory to make available to Redpanda
- `mode`: Mode to use to load configuration properties in container
- `default_log_level`: Log levels to use for Redpanda

### `get_service_config_string` {#get_service_config_string}

`def get_service_config_string(self, service: str, data_dir: pathlib.Path) -> str`

Generates a configuration for a service

**Parameters**:
- `data_dir`: Path to the directory where the service instance will save data
- `service`: "redpanda", defines which service to get config string for

### `start` {#start}

`def start(self: fastkafka.testing.LocalRedpandaBroker) -> str`

Starts a local redpanda broker instance synchronously

**Returns**:
- Redpanda broker bootstrap server address in string format: addr:port

### `stop` {#stop}

`def stop(self: fastkafka.testing.LocalRedpandaBroker) -> None`

Stops a local redpanda broker instance synchronously

**Returns**:
- None



================================================
FILE: docusaurus/versioned_docs/version-0.5.0/api/fastkafka/testing/Tester.md
================================================
## `fastkafka.testing.Tester` {#fastkafka.testing.Tester}

### `__init__` {#init}

`def __init__(self, app: Union[fastkafka.FastKafka, List[fastkafka.FastKafka]], broker: Optional[Union[fastkafka.testing.ApacheKafkaBroker, fastkafka.testing.LocalRedpandaBroker, fastkafka._testing.in_memory_broker.InMemoryBroker]] = None, topics: Iterable[str] = [], retries: int = 3, apply_nest_asyncio: bool = False, zookeeper_port: int = 2181, listener_port: int = 9092) -> None`

Mirror-like object for testing a FastKafka application

Can be used as a context manager

**Parameters**:
- `data_dir`: Path to the directory where the ZooKeeper instance will save data
- `zookeeper_port`: Port for clients (Kafka brokers) to connect to
- `listener_port`: Port on which the clients (producers and consumers) can connect

### `benchmark` {#benchmark}

`def benchmark(self: fastkafka.FastKafka, interval: Union[int, datetime.timedelta] = 1, sliding_window_size: Optional[int] = None) -> typing.Callable[[typing.Callable[[~I], typing.Union[~O, NoneType]]], typing.Callable[[~I], typing.Union[~O, NoneType]]]`

Decorator to benchmark produces/consumes functions

**Parameters**:
- `interval`: Period to use to calculate throughput. If value is of type int,
then it will be used as seconds. If value is of type timedelta,
then it will be used as it is. default: 1 - one second
- `sliding_window_size`: The size of the sliding window to use to calculate
average throughput. default: None - By default average throughput is
not calculated

### `consumes` {#consumes}

`def consumes(self: fastkafka.FastKafka, topic: Optional[str] = None, decoder: Union[str, Callable[[bytes, pydantic.main.ModelMetaclass], Any]] = 'json', prefix: str = 'on_', loop=None, bootstrap_servers='localhost', client_id='aiokafka-0.8.0', group_id=None, key_deserializer=None, value_deserializer=None, fetch_max_wait_ms=500, fetch_max_bytes=52428800, fetch_min_bytes=1, max_partition_fetch_bytes=1048576, request_timeout_ms=40000, retry_backoff_ms=100, auto_offset_reset='latest', enable_auto_commit=True, auto_commit_interval_ms=5000, check_crcs=True, metadata_max_age_ms=300000, partition_assignment_strategy=(<class 'kafka.coordinator.assignors.roundrobin.RoundRobinPartitionAssignor'>,), max_poll_interval_ms=300000, rebalance_timeout_ms=None, session_timeout_ms=10000, heartbeat_interval_ms=3000, consumer_timeout_ms=200, max_poll_records=None, ssl_context=None, security_protocol='PLAINTEXT', api_version='auto', exclude_internal_topics=True, connections_max_idle_ms=540000, isolation_level='read_uncommitted', sasl_mechanism='PLAIN', sasl_plain_password=None, sasl_plain_username=None, sasl_kerberos_service_name='kafka', sasl_kerberos_domain_name=None, sasl_oauth_token_provider=None) -> typing.Callable[[typing.Callable[[pydantic.main.BaseModel], typing.Union[NoneType, typing.Awaitable[NoneType]]]], typing.Callable[[pydantic.main.BaseModel], typing.Union[NoneType, typing.Awaitable[NoneType]]]]`

Decorator registering the callback called when a message is received in a topic.

This function decorator is also responsible for registering topics for AsyncAPI specification and documentation.
│   │       │       │   ├── avro_decoder.md
│   │       │       │   ├── avro_encoder.md
│   │       │       │   ├── avsc_to_pydantic.md
│   │       │       │   ├── json_decoder.md
│   │       │       │   └── json_encoder.md
│   │       │       ├── executors/
│   │       │       │   ├── DynamicTaskExecutor.md
│   │       │       │   └── SequentialExecutor.md
│   │       │       └── testing/
│   │       │           ├── ApacheKafkaBroker.md
│   │       │           ├── LocalRedpandaBroker.md
│   │       │           └── Tester.md
│   │       ├── cli/
│   │       │   ├── fastkafka.md
│   │       │   └── run_fastkafka_server_process.md
│   │       ├── guides/
│   │       │   ├── Guide_00_FastKafka_Demo.md
│   │       │   ├── Guide_01_Intro.md
│   │       │   ├── Guide_02_First_Steps.md
│   │       │   ├── Guide_03_Authentication.md
│   │       │   ├── Guide_04_Github_Actions_Workflow.md
│   │       │   ├── Guide_05_Lifespan_Handler.md
│   │       │   ├── Guide_06_Benchmarking_FastKafka.md
│   │       │   ├── Guide_07_Encoding_and_Decoding_Messages_with_FastKafka.md
│   │       │   ├── Guide_11_Consumes_Basics.md
│   │       │   ├── Guide_12_Batch_Consuming.md
│   │       │   ├── Guide_21_Produces_Basics.md
│   │       │   ├── Guide_22_Partition_Keys.md
│   │       │   ├── Guide_23_Batch_Producing.md
│   │       │   ├── Guide_24_Using_Multiple_Kafka_Clusters.md
│   │       │   ├── Guide_30_Using_docker_to_deploy_fastkafka.md
│   │       │   ├── Guide_31_Using_redpanda_to_test_fastkafka.md
│   │       │   └── Guide_32_Using_fastapi_to_run_fastkafka_application.md
│   │       ├── index.md
│   │       └── overrides/
│   │           ├── css/
│   │           │   └── extra.css
│   │           └── js/
│   │               ├── extra.js
│   │               ├── math.js
│   │               └── mathjax.js
│   ├── versioned_sidebars/
│   │   ├── version-0.5.0-sidebars.json
│   │   ├── version-0.6.0-sidebars.json
│   │   ├── version-0.7.0-sidebars.json
│   │   ├── version-0.7.1-sidebars.json
│   │   └── version-0.8.0-sidebars.json
│   └── versions.json
├── fastkafka/
│   ├── __init__.py
│   ├── _aiokafka_imports.py
│   ├── _application/
│   │   ├── __init__.py
│   │   ├── app.py
│   │   └── tester.py
│   ├── _cli.py
│   ├── _cli_docs.py
│   ├── _cli_testing.py
│   ├── _components/
│   │   ├── __init__.py
│   │   ├── _subprocess.py
│   │   ├── aiokafka_consumer_loop.py
│   │   ├── asyncapi.py
│   │   ├── benchmarking.py
│   │   ├── docs_dependencies.py
│   │   ├── encoder/
│   │   │   ├── __init__.py
│   │   │   ├── avro.py
│   │   │   └── json.py
│   │   ├── helpers.py
│   │   ├── logger.py
│   │   ├── meta.py
│   │   ├── producer_decorator.py
│   │   ├── task_streaming.py
│   │   └── test_dependencies.py
│   ├── _docusaurus_helper.py
│   ├── _helpers.py
│   ├── _modidx.py
│   ├── _server.py
│   ├── _testing/
│   │   ├── __init__.py
│   │   ├── apache_kafka_broker.py
│   │   ├── in_memory_broker.py
│   │   ├── local_redpanda_broker.py
│   │   └── test_utils.py
│   ├── encoder.py
│   ├── executors.py
│   └── testing.py
├── mkdocs/
│   ├── docs_overrides/
│   │   ├── css/
│   │   │   └── extra.css
│   │   └── js/
│   │       ├── extra.js
│   │       ├── math.js
│   │       └── mathjax.js
│   ├── mkdocs.yml
│   ├── overrides/
│   │   └── main.html
│   ├── site_overrides/
│   │   ├── main.html
│   │   └── partials/
│   │       └── copyright.html
│   └── summary_template.txt
├── mypy.ini
├── nbs/
│   ├── .gitignore
│   ├── 000_AIOKafkaImports.ipynb
│   ├── 000_Testing_export.ipynb
│   ├── 001_InMemoryBroker.ipynb
│   ├── 002_ApacheKafkaBroker.ipynb
│   ├── 003_LocalRedpandaBroker.ipynb
│   ├── 004_Test_Utils.ipynb
│   ├── 005_Application_executors_export.ipynb
│   ├── 006_TaskStreaming.ipynb
│   ├── 010_Application_export.ipynb
│   ├── 011_ConsumerLoop.ipynb
│   ├── 013_ProducerDecorator.ipynb
│   ├── 014_AsyncAPI.ipynb
│   ├── 015_FastKafka.ipynb
│   ├── 016_Tester.ipynb
│   ├── 017_Benchmarking.ipynb
│   ├── 018_Avro_Encode_Decoder.ipynb
│   ├── 019_Json_Encode_Decoder.ipynb
│   ├── 020_Encoder_Export.ipynb
│   ├── 021_FastKafkaServer.ipynb
│   ├── 022_Subprocess.ipynb
│   ├── 023_CLI.ipynb
│   ├── 024_CLI_Docs.ipynb
│   ├── 025_CLI_Testing.ipynb
│   ├── 096_Docusaurus_Helper.ipynb
│   ├── 096_Meta.ipynb
│   ├── 097_Docs_Dependencies.ipynb
│   ├── 098_Test_Dependencies.ipynb
│   ├── 099_Test_Service.ipynb
│   ├── 998_Internal_Helpers.ipynb
│   ├── 999_Helpers.ipynb
│   ├── Logger.ipynb
│   ├── _quarto.yml
│   ├── guides/
│   │   ├── .gitignore
│   │   ├── Guide_00_FastKafka_Demo.ipynb
│   │   ├── Guide_01_Intro.ipynb
│   │   ├── Guide_02_First_Steps.ipynb
│   │   ├── Guide_03_Authentication.ipynb
│   │   ├── Guide_04_Github_Actions_Workflow.ipynb
│   │   ├── Guide_05_Lifespan_Handler.ipynb
│   │   ├── Guide_06_Benchmarking_FastKafka.ipynb
│   │   ├── Guide_07_Encoding_and_Decoding_Messages_with_FastKafka.ipynb
│   │   ├── Guide_11_Consumes_Basics.ipynb
│   │   ├── Guide_12_Batch_Consuming.ipynb
│   │   ├── Guide_21_Produces_Basics.ipynb
│   │   ├── Guide_22_Partition_Keys.ipynb
│   │   ├── Guide_23_Batch_Producing.ipynb
│   │   ├── Guide_24_Using_Multiple_Kafka_Clusters.ipynb
│   │   ├── Guide_30_Using_docker_to_deploy_fastkafka.ipynb
│   │   ├── Guide_31_Using_redpanda_to_test_fastkafka.ipynb
│   │   ├── Guide_32_Using_fastapi_to_run_fastkafka_application.ipynb
│   │   └── Guide_33_Using_Tester_class_to_test_fastkafka.ipynb
│   ├── index.ipynb
│   ├── nbdev.yml
│   ├── sidebar.yml
│   └── styles.css
├── run_jupyter.sh
├── set_variables.sh
├── settings.ini
├── setup.py
└── stop_jupyter.sh
SYMBOL INDEX (422 symbols across 39 files)

FILE: docusaurus/src/components/BrowserWindow/index.js
  function BrowserWindow (line 13) | function BrowserWindow({
  function IframeWindow (line 48) | function IframeWindow({url}) {

FILE: docusaurus/src/components/HomepageCommunity/index.js
  function Testimonial (line 5) | function Testimonial({ testimonialLimitToShow, allTestimonials }) {
  function HomepageCommunity (line 48) | function HomepageCommunity() {

FILE: docusaurus/src/components/HomepageFAQ/index.js
  function HomepageFAQ (line 37) | function HomepageFAQ() {

FILE: docusaurus/src/components/HomepageFastkafkaChat/index.js
  function Feature (line 38) | function Feature({Svg, title, description}) {

FILE: docusaurus/src/components/HomepageFeatures/index.js
  function Feature (line 38) | function Feature({src, title, description}) {
  function HomepageFeatures (line 52) | function HomepageFeatures() {

FILE: docusaurus/src/components/HomepageWhatYouGet/index.js
  function HomepageWhatYouGet (line 7) | function HomepageWhatYouGet() {

FILE: docusaurus/src/components/RobotFooterIcon/index.js
  function RobotFooterIcon (line 6) | function RobotFooterIcon() {

FILE: docusaurus/src/pages/demo/index.js
  function Hello (line 13) | function Hello() {

FILE: docusaurus/src/pages/index.js
  function HomepageHeader (line 14) | function HomepageHeader() {
  function Home (line 34) | function Home() {

FILE: fastkafka/__init__.py
  function dummy (line 17) | def dummy() -> None:

FILE: fastkafka/_aiokafka_imports.py
  function dummy (line 18) | def dummy() -> None:

FILE: fastkafka/_application/app.py
  function _get_kafka_config (line 60) | def _get_kafka_config(
  function _get_kafka_brokers (line 86) | def _get_kafka_brokers(
  function _get_broker_addr_list (line 135) | def _get_broker_addr_list(
  function _get_topic_name (line 144) | def _get_topic_name(
  function _get_contact_info (line 163) | def _get_contact_info(
  class FastKafka (line 178) | class FastKafka:
    method __init__ (line 180) | def __init__(
    method is_started (line 309) | def is_started(self) -> bool:
    method set_kafka_broker (line 322) | def set_kafka_broker(self, kafka_broker_name: str) -> None:
    method __aenter__ (line 340) | async def __aenter__(self) -> "FastKafka":
    method __aexit__ (line 347) | async def __aexit__(
    method _start (line 357) | async def _start(self) -> None:
    method _stop (line 360) | async def _stop(self) -> None:
    method consumes (line 363) | def consumes(
    method produces (line 375) | def produces(
    method benchmark (line 387) | def benchmark(
    method run_in_background (line 395) | def run_in_background(
    method _populate_consumers (line 400) | def _populate_consumers(
    method get_topics (line 406) | def get_topics(self) -> Iterable[str]:
    method _populate_producers (line 409) | async def _populate_producers(self) -> None:
    method _populate_bg_tasks (line 412) | async def _populate_bg_tasks(self) -> None:
    method create_docs (line 415) | def create_docs(self) -> None:
    method create_mocks (line 418) | def create_mocks(self) -> None:
    method _shutdown_consumers (line 421) | async def _shutdown_consumers(self) -> None:
    method _shutdown_producers (line 424) | async def _shutdown_producers(self) -> None:
    method _shutdown_bg_tasks (line 427) | async def _shutdown_bg_tasks(self) -> None:
  function _get_decoder_fn (line 431) | def _get_decoder_fn(decoder: str) -> Callable[[bytes, Type[BaseModel]], ...
  function _prepare_and_check_brokers (line 451) | def _prepare_and_check_brokers(
  function _resolve_key (line 464) | def _resolve_key(key: str, dictionary: Dict[str, Any]) -> str:
  function consumes (line 475) | def consumes(
  function _get_encoder_fn (line 563) | def _get_encoder_fn(encoder: str) -> Callable[[BaseModel], bytes]:
  function produces (line 585) | def produces(
  function get_topics (line 668) | def get_topics(self: FastKafka) -> Iterable[str]:
  function run_in_background (line 681) | def run_in_background(
  function _populate_consumers (line 718) | def _populate_consumers(
  function _shutdown_consumers (line 761) | async def _shutdown_consumers(
  function _create_producer (line 769) | async def _create_producer(  # type: ignore
  function _populate_producers (line 814) | async def _populate_producers(self: FastKafka) -> None:
  function _shutdown_producers (line 859) | async def _shutdown_producers(self: FastKafka) -> None:
  function _populate_bg_tasks (line 882) | async def _populate_bg_tasks(
  function _shutdown_bg_tasks (line 895) | async def _shutdown_bg_tasks(
  function _start (line 918) | async def _start(self: FastKafka) -> None:
  function _stop (line 931) | async def _stop(self: FastKafka) -> None:
  function create_docs (line 943) | def create_docs(self: FastKafka) -> None:
  class AwaitedMock (line 972) | class AwaitedMock:
    method _await_for (line 981) | def _await_for(f: Callable[..., Any]) -> Callable[..., Any]:
    method __init__ (line 1014) | def __init__(self, o: Any):
  function create_mocks (line 1031) | def create_mocks(self: FastKafka) -> None:
  function benchmark (line 1113) | def benchmark(
  function fastapi_lifespan (line 1168) | def fastapi_lifespan(

FILE: fastkafka/_application/tester.py
  function _get_broker_spec (line 32) | def _get_broker_spec(bootstrap_server: str) -> KafkaBroker:
  class Tester (line 48) | class Tester(FastKafka):
    method __init__ (line 51) | def __init__(
    method _start_tester (line 79) | async def _start_tester(self) -> None:
    method _stop_tester (line 88) | async def _stop_tester(self) -> None:
    method _create_mirrors (line 94) | def _create_mirrors(self) -> None:
    method _arrange_mirrors (line 97) | def _arrange_mirrors(self) -> None:
    method _set_arguments_and_return_old (line 100) | def _set_arguments_and_return_old(
    method _restore_initial_arguments (line 121) | def _restore_initial_arguments(self, initial_arguments: Dict[Any, Any]...
    method using_external_broker (line 131) | async def using_external_broker(
    method using_inmemory_broker (line 154) | async def using_inmemory_broker(
    method _create_ctx (line 177) | async def _create_ctx(self) -> AsyncGenerator["Tester", None]:
    method __aenter__ (line 192) | async def __aenter__(self) -> "Tester":
    method __aexit__ (line 196) | async def __aexit__(self, *args: Any) -> None:
  function mirror_producer (line 200) | def mirror_producer(
  function mirror_consumer (line 245) | def mirror_consumer(
  function _create_mirrors (line 284) | def _create_mirrors(self: Tester) -> None:
  class AmbiguousWarning (line 331) | class AmbiguousWarning:
    method __init__ (line 340) | def __init__(self, topic: str, functions: List[str]):
    method __getattribute__ (line 344) | def __getattribute__(self, attr: str) -> Any:
    method __call__ (line 349) | def __call__(self, *args: Any, **kwargs: Any) -> Any:
  function set_sugar (line 355) | def set_sugar(
  function _arrange_mirrors (line 394) | def _arrange_mirrors(self: Tester) -> None:

FILE: fastkafka/_cli.py
  function run (line 27) | def run(

FILE: fastkafka/_cli_docs.py
  function docs_install_deps (line 37) | def docs_install_deps() -> None:
  function generate_docs (line 63) | def generate_docs(
  function serve_docs (line 103) | def serve_docs(

FILE: fastkafka/_cli_testing.py
  function testing_install_deps (line 25) | def testing_install_deps() -> None:

FILE: fastkafka/_components/_subprocess.py
  function terminate_asyncio_process (line 22) | async def terminate_asyncio_process(p: asyncio.subprocess.Process) -> None:
  function run_async_subprocesses (line 72) | async def run_async_subprocesses(

FILE: fastkafka/_components/aiokafka_consumer_loop.py
  class EventMetadata (line 27) | class EventMetadata:
    method create_event_metadata (line 56) | def create_event_metadata(record: ConsumerRecord) -> "EventMetadata": ...
  function _callback_parameters_wrapper (line 93) | def _callback_parameters_wrapper(
  function _prepare_callback (line 123) | def _prepare_callback(callback: ConsumeCallable) -> AsyncConsumeMeta:
  function _get_single_msg_handlers (line 141) | def _get_single_msg_handlers(  # type: ignore
  function _get_batch_msg_handlers (line 198) | def _get_batch_msg_handlers(  # type: ignore
  function _aiokafka_consumer_loop (line 256) | async def _aiokafka_consumer_loop(  # type: ignore
  function sanitize_kafka_config (line 308) | def sanitize_kafka_config(**kwargs: Any) -> Dict[str, Any]:
  function aiokafka_consumer_loop (line 315) | async def aiokafka_consumer_loop(

FILE: fastkafka/_components/asyncapi.py
  class KafkaMessage (line 34) | class KafkaMessage(BaseModel):
  class SecurityType (line 39) | class SecurityType(str, Enum):
  class APIKeyLocation (line 55) | class APIKeyLocation(str, Enum):
  class SecuritySchema (line 66) | class SecuritySchema(BaseModel):
    method __init__ (line 76) | def __init__(self, **kwargs: Any):
    method model_dump (line 82) | def model_dump(self, *args: Any, **kwargs: Any) -> Dict[str, Any]:
    method model_dump_json (line 94) | def model_dump_json(self, *args: Any, **kwargs: Any) -> str:
  class KafkaBroker (line 99) | class KafkaBroker(BaseModel):
    method model_dump (line 108) | def model_dump(self, *args: Any, **kwargs: Any) -> Dict[str, Any]:
    method model_dump_json (line 120) | def model_dump_json(self, *args: Any, **kwargs: Any) -> str:
  class ContactInfo (line 125) | class ContactInfo(BaseModel):
  class KafkaServiceInfo (line 131) | class KafkaServiceInfo(BaseModel):
  class KafkaBrokers (line 140) | class KafkaBrokers(BaseModel):
    method model_dump (line 143) | def model_dump(self, *args: Any, **kwargs: Any) -> Dict[str, Any]:
    method model_dump_json (line 163) | def model_dump_json(self, *args: Any, **kwargs: Any) -> str:
  function _get_msg_cls_for_producer (line 171) | def _get_msg_cls_for_producer(f: ProduceCallable) -> Type[Any]:
  function _get_msg_cls_for_consumer (line 188) | def _get_msg_cls_for_consumer(f: ConsumeCallable) -> Type[Any]:
  function _get_topic_dict (line 216) | def _get_topic_dict(
  function _get_channels_schema (line 240) | def _get_channels_schema(
  function _get_kafka_msg_classes (line 251) | def _get_kafka_msg_classes(
  function _get_kafka_msg_definitions (line 260) | def _get_kafka_msg_definitions(
  function _get_example (line 271) | def _get_example(cls: Type[BaseModel]) -> BaseModel:
  function _add_example_to_msg_definitions (line 283) | def _add_example_to_msg_definitions(
  function _get_msg_definitions_with_examples (line 294) | def _get_msg_definitions_with_examples(
  function _get_security_schemes (line 314) | def _get_security_schemes(kafka_brokers: KafkaBrokers) -> Dict[str, Any]:
  function _get_components_schema (line 329) | def _get_components_schema(
  function _get_servers_schema (line 362) | def _get_servers_schema(kafka_brokers: KafkaBrokers) -> Dict[str, Any]:
  function _get_asyncapi_schema (line 371) | def _get_asyncapi_schema(
  function yaml_file_cmp (line 392) | def yaml_file_cmp(file_1: Union[Path, str], file_2: Union[Path, str]) ->...
  function _generate_async_spec (line 417) | def _generate_async_spec(
  function _generate_async_docs (line 461) | def _generate_async_docs(
  function export_async_spec (line 497) | def export_async_spec(

FILE: fastkafka/_components/benchmarking.py
  function _benchmark (line 18) | def _benchmark(

FILE: fastkafka/_components/docs_dependencies.py
  function _check_npm (line 33) | def _check_npm(required_major_version: int = npm_required_major_version)...
  function _check_npm_with_local (line 76) | def _check_npm_with_local(node_path: Path = node_path) -> None:
  function _install_node (line 103) | def _install_node(
  function _install_docs_npm_deps (line 158) | async def _install_docs_npm_deps() -> None:

FILE: fastkafka/_components/encoder/avro.py
  class AvroBase (line 22) | class AvroBase(BaseModel):
    method avro_schema_for_pydantic_object (line 26) | def avro_schema_for_pydantic_object(
    method avro_schema_for_pydantic_class (line 53) | def avro_schema_for_pydantic_class(
    method avro_schema (line 80) | def avro_schema(
    method _avro_schema (line 102) | def _avro_schema(schema: Dict[str, Any], namespace: str) -> Dict[str, ...
  function avro_encoder (line 239) | def avro_encoder(msg: BaseModel) -> bytes:
  function avro_decoder (line 263) | def avro_decoder(raw_msg: bytes, cls: Type[BaseModel]) -> Any:
  function avsc_to_pydantic (line 283) | def avsc_to_pydantic(schema: Dict[str, Any]) -> Type[BaseModel]:

FILE: fastkafka/_components/encoder/json.py
  function _to_json_utf8 (line 19) | def _to_json_utf8(o: Any) -> bytes:
  function json_encoder (line 28) | def json_encoder(msg: BaseModel) -> bytes:
  function json_decoder (line 42) | def json_decoder(raw_msg: bytes, cls: Type[BaseModel]) -> Any:
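The `json_encoder`/`json_decoder` pair above serializes a model to UTF-8 JSON bytes and back. A minimal stand-in using dataclasses instead of pydantic, so the round trip is visible without extra dependencies (the dataclass variant is an assumption for illustration; the real functions operate on `BaseModel`):

```python
import json
from dataclasses import asdict, dataclass
from typing import Any, Type


def json_encoder(msg: Any) -> bytes:
    # Dump the model's fields to JSON and encode as UTF-8 bytes,
    # in the spirit of _to_json_utf8.
    return json.dumps(asdict(msg)).encode("utf-8")


def json_decoder(raw_msg: bytes, cls: Type) -> Any:
    # Parse the bytes and rebuild an instance of the given class.
    return cls(**json.loads(raw_msg.decode("utf-8")))


@dataclass
class HelloWorld:
    msg: str
```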

FILE: fastkafka/_components/helpers.py
  function in_notebook (line 7) | def in_notebook() -> bool:
  function change_dir (line 38) | def change_dir(d: str) -> Generator[None, None, None]:
  class ImportFromStringError (line 56) | class ImportFromStringError(Exception):
  function _import_from_string (line 60) | def _import_from_string(import_str: str) -> Any:
  function true_after (line 105) | def true_after(seconds: Union[int, float]) -> Callable[[], bool]:
  function unwrap_list_type (line 115) | def unwrap_list_type(var_type: Union[Type, Parameter]) -> Union[Type, Pa...
  function remove_suffix (line 137) | def remove_suffix(topic: str) -> str:
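`true_after` above returns a deadline predicate, useful for polling loops with a timeout. Its signature suggests an implementation along these lines (a sketch, not the library's source):

```python
import time
from typing import Callable, Union


def true_after(seconds: Union[int, float]) -> Callable[[], bool]:
    # Capture the deadline once; the returned predicate flips to True
    # when the monotonic clock passes it.
    deadline = time.monotonic() + seconds
    return lambda: time.monotonic() >= deadline
```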

FILE: fastkafka/_components/logger.py
  function suppress_timestamps (line 26) | def suppress_timestamps(flag: bool = True) -> None:
  function get_default_logger_configuration (line 37) | def get_default_logger_configuration(level: int = logging.INFO) -> Dict[...
  function get_logger (line 80) | def get_logger(
  function set_level (line 120) | def set_level(level: int) -> None:
  function cached_log (line 138) | def cached_log(

FILE: fastkafka/_components/meta.py
  function test_eq (line 21) | def test_eq(a: Any, b: Any) -> None:
  function copy_func (line 30) | def copy_func(f: Union[F, FunctionType]) -> Union[F, FunctionType]:
  function patch_to (line 45) | def patch_to(
  function eval_type (line 72) | def eval_type(
  function union2tuple (line 86) | def union2tuple(t) -> Tuple[Any, ...]:  # type: ignore
  function get_annotations_ex (line 97) | def get_annotations_ex(
  function patch (line 159) | def patch(  # type: ignore
  function _delegates_without_docs (line 172) | def _delegates_without_docs(
  function _format_args (line 219) | def _format_args(xs: List[docstring_parser.DocstringParam]) -> str:
  function combine_params (line 225) | def combine_params(
  function delegates (line 263) | def delegates(
  function use_parameters_of (line 300) | def use_parameters_of(
  function filter_using_signature (line 317) | def filter_using_signature(f: Callable, **kwargs: Dict[str, Any]) -> Dic...
  function export (line 326) | def export(module_name: str) -> Callable[[TorF], TorF]:
  function classcontextmanager (line 347) | def classcontextmanager(name: str = "lifecycle") -> Callable[[Type[T]], ...
  function _get_default_kwargs_from_sig (line 379) | def _get_default_kwargs_from_sig(f: F, **kwargs: Any) -> Dict[str, Any]:

FILE: fastkafka/_components/producer_decorator.py
  class KafkaEvent (line 38) | class KafkaEvent(Generic[BaseSubmodel]):
  function unwrap_from_kafka_event (line 51) | def unwrap_from_kafka_event(var_type: Union[Type, Parameter]) -> Union[T...
  function _wrap_in_event (line 82) | def _wrap_in_event(
  function release_callback (line 88) | def release_callback(
  function produce_single (line 102) | async def produce_single(  # type: ignore
  function send_batch (line 133) | async def send_batch(  # type: ignore
  function produce_batch (line 164) | async def produce_batch(  # type: ignore
  function producer_decorator (line 202) | def producer_decorator(

FILE: fastkafka/_components/task_streaming.py
  class TaskPool (line 26) | class TaskPool:
    method __init__ (line 27) | def __init__(
    method add (line 47) | async def add(self, item: Task) -> None:
    method discard (line 62) | def discard(self, task: Task) -> None:
    method __len__ (line 83) | def __len__(self) -> int:
    method __aenter__ (line 92) | async def __aenter__(self) -> "TaskPool":
    method __aexit__ (line 96) | async def __aexit__(self, *args: Any, **kwargs: Any) -> None:
    method log_error (line 102) | def log_error(logger: Logger) -> Callable[[Exception], None]:
  class ExceptionMonitor (line 119) | class ExceptionMonitor:
    method __init__ (line 120) | def __init__(self) -> None:
    method on_error (line 130) | def on_error(self, e: Exception) -> None:
    method _monitor_step (line 143) | def _monitor_step(self) -> None:
    method __aenter__ (line 154) | async def __aenter__(self) -> "ExceptionMonitor":
    method __aexit__ (line 157) | async def __aexit__(self, *args: Any, **kwargs: Any) -> None:
  class StreamExecutor (line 163) | class StreamExecutor(ABC):
    method run (line 165) | async def run(  # type: ignore
  function _process_items_task (line 183) | def _process_items_task(  # type: ignore
  class DynamicTaskExecutor (line 207) | class DynamicTaskExecutor(StreamExecutor):
    method __init__ (line 214) | def __init__(
    method run (line 239) | async def run(  # type: ignore
  function _process_items_coro (line 275) | def _process_items_coro(  # type: ignore
  class SequentialExecutor (line 305) | class SequentialExecutor(StreamExecutor):
    method __init__ (line 312) | def __init__(
    method run (line 328) | async def run(  # type: ignore
  function get_executor (line 359) | def get_executor(executor: Union[str, StreamExecutor, None] = None) -> S...
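`TaskPool.add`/`discard` above point at a bounded pool of in-flight tasks: `add` waits while the pool is full, and finished tasks remove themselves via a done callback. A compact sketch of that pattern (`MiniTaskPool` is a hypothetical simplification, not the library class):

```python
import asyncio
from typing import Set


class MiniTaskPool:
    """Bounded pool of running asyncio tasks, echoing TaskPool's shape."""

    def __init__(self, size: int = 2) -> None:
        self.size = size
        self.pool: Set["asyncio.Task"] = set()

    async def add(self, coro) -> None:
        # Yield to the loop until there is room, then schedule the
        # coroutine; the done callback discards the finished task.
        while len(self.pool) >= self.size:
            await asyncio.sleep(0)
        task = asyncio.ensure_future(coro)
        self.pool.add(task)
        task.add_done_callback(self.pool.discard)

    async def join(self) -> None:
        # Busy-yield until every scheduled task has completed.
        while self.pool:
            await asyncio.sleep(0)
```

Bounding the pool size is what distinguishes the `DynamicTaskExecutor` style (many concurrent handler tasks, capped) from `SequentialExecutor` (effectively size one).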

FILE: fastkafka/_components/test_dependencies.py
  function check_java (line 34) | def check_java(*, potential_jdk_path: Optional[List[Path]] = None) -> bool:
  function _install_java (line 59) | def _install_java() -> None:
  class VersionParser (line 87) | class VersionParser(HTMLParser):
    method __init__ (line 99) | def __init__(self) -> None:
    method handle_data (line 109) | def handle_data(self, data: str) -> None:
  function get_kafka_version (line 130) | def get_kafka_version(kafka_repo_url: str = kafka_repo_url) -> str:
  function check_kafka (line 169) | def check_kafka(local_path: Path = local_path) -> bool:
  function _install_kafka (line 201) | def _install_kafka(
  function _install_testing_deps (line 265) | def _install_testing_deps() -> None:
  function generate_app_src (line 275) | def generate_app_src(out_path: Union[Path, str]) -> None:
  function generate_app_in_tmp (line 303) | def generate_app_in_tmp() -> Generator[str, None, None]:

FILE: fastkafka/_docusaurus_helper.py
  function _get_return_annotation (line 49) | def _get_return_annotation(s: Signature) -> str:
  function _get_param_annotation (line 71) | def _get_param_annotation(param: Parameter) -> str:
  function _get_default_value (line 94) | def _get_default_value(param: Parameter) -> str:
  function _get_params_annotation (line 114) | def _get_params_annotation(s: Signature) -> Dict[str, Dict[str, str]]:
  function _generate_parameters_table (line 132) | def _generate_parameters_table(
  function _generate_return_and_raises_table (line 164) | def _generate_return_and_raises_table(
  function _format_docstring_section_items (line 192) | def _format_docstring_section_items(
  function _get_annotation (line 215) | def _get_annotation(symbol: Type) -> Dict[str, Union[Dict[str, Dict[str,...
  function _format_docstring_sections (line 231) | def _format_docstring_sections(symbol: Type, parsed_docstring: Docstring...
  function _format_free_links (line 258) | def _format_free_links(s: str) -> str:
  function _docstring_to_markdown (line 274) | def _docstring_to_markdown(symbol: Type) -> str:
  function _get_submodules (line 299) | def _get_submodules(module_name: str) -> List[str]:
  function _load_submodules (line 316) | def _load_submodules(
  function _get_parameters (line 340) | def _get_parameters(_signature: Signature) -> List[str]:
  function _format_symbol_definition (line 361) | def _format_symbol_definition(symbol: Type, params_list: List[str]) -> str:
  function _get_exps (line 381) | def _get_exps(mod: str) -> Dict[str, str]:
  function _lineno (line 403) | def _lineno(sym: str, fname: str) -> Optional[str]:
  class CustomNbdevLookup (line 408) | class CustomNbdevLookup(NbdevLookup.__wrapped__):  # type: ignore
    method __init__ (line 409) | def __init__(
    method code (line 417) | def code(self, sym: str) -> Optional[str]:
  function _get_symbol_source_link (line 427) | def _get_symbol_source_link(symbol: Type, lib_version: str) -> str:
  function _get_method_type (line 451) | def _get_method_type(symbol: Type) -> str:
  function _get_symbol_definition (line 466) | def _get_symbol_definition(symbol: Type, header_level: int, lib_version:...
  function _is_method (line 496) | def _is_method(symbol: Type) -> bool:
  function _get_formatted_docstring_for_symbol (line 508) | def _get_formatted_docstring_for_symbol(
  function _convert_html_style_attribute_to_jsx (line 553) | def _convert_html_style_attribute_to_jsx(contents: str) -> str:
  function _get_all_markdown_files_path (line 585) | def _get_all_markdown_files_path(docs_path: Path) -> List[Path]:
  function _fix_special_symbols_in_html (line 598) | def _fix_special_symbols_in_html(contents: str) -> str:
  function _add_file_extension_to_link (line 603) | def _add_file_extension_to_link(url: str) -> str:
  function _generate_production_url (line 616) | def _generate_production_url(url: str) -> str:
  function _fix_symbol_links (line 632) | def _fix_symbol_links(
  function _get_relative_url_prefix (line 669) | def _get_relative_url_prefix(docs_path: Path, sub_path: Path) -> str:
  function fix_invalid_syntax_in_markdown (line 692) | def fix_invalid_syntax_in_markdown(docs_path: str) -> None:
  function generate_markdown_docs (line 715) | def generate_markdown_docs(module_name: str, docs_path: str) -> None:
  function _parse_lines (line 735) | def _parse_lines(lines: List[str]) -> Tuple[List[str], int]:
  function _parse_section (line 752) | def _parse_section(text: str, ignore_first_line: bool = False) -> List[A...
  function _get_section_from_markdown (line 781) | def _get_section_from_markdown(
  function generate_sidebar (line 799) | def generate_sidebar(
  function _get_markdown_filenames_from_sidebar (line 855) | def _get_markdown_filenames_from_sidebar(sidebar_file_path: str) -> List...
  function _delete_files (line 876) | def _delete_files(files: List[Path]) -> None:
  function delete_unused_markdown_files_from_sidebar (line 895) | def delete_unused_markdown_files_from_sidebar(
  function update_readme (line 918) | def update_readme() -> None:

FILE: fastkafka/_helpers.py
  function aiokafka2confluent (line 37) | def aiokafka2confluent(**kwargs: Dict[str, Any]) -> Dict[str, Any]:
  function confluent2aiokafka (line 182) | def confluent2aiokafka(confluent_config: Dict[str, Any]) -> Dict[str, Any]:
  function produce_messages (line 207) | async def produce_messages(  # type: ignore
  function consumes_messages (line 367) | async def consumes_messages(
  function produce_and_consume_messages (line 559) | async def produce_and_consume_messages(
  function get_collapsible_admonition (line 834) | def get_collapsible_admonition(
  function source2markdown (line 857) | def source2markdown(o: Union[str, Callable[..., Any]]) -> Markdown:
  function wait_for_get_url (line 873) | async def wait_for_get_url(

FILE: fastkafka/_server.py
  class ServerProcess (line 27) | class ServerProcess:
    method __init__ (line 28) | def __init__(self, app: str, kafka_broker_name: str):
    method run (line 40) | def run(self) -> None:
    method _serve (line 46) | async def _serve(self) -> None:
    method _install_signal_handlers (line 58) | def _install_signal_handlers(self) -> None:
    method _main_loop (line 86) | async def _main_loop(self) -> None:
  function run_fastkafka_server_process (line 98) | def run_fastkafka_server_process(
  function run_fastkafka_server (line 111) | async def run_fastkafka_server(num_workers: int, app: str, kafka_broker:...
  function run_in_process (line 205) | def run_in_process(

FILE: fastkafka/_testing/apache_kafka_broker.py
  function get_zookeeper_config_string (line 39) | def get_zookeeper_config_string(
  function get_kafka_config_string (line 65) | def get_kafka_config_string(
  class ApacheKafkaBroker (line 168) | class ApacheKafkaBroker:
    method __init__ (line 173) | def __init__(
    method is_started (line 212) | def is_started(self) -> bool:
    method _check_deps (line 225) | def _check_deps(cls) -> None:
    method _start (line 232) | async def _start(self) -> str:
    method start (line 239) | def start(self) -> str:
    method stop (line 246) | def stop(self) -> None:
    method _stop (line 250) | async def _stop(self) -> None:
    method get_service_config_string (line 257) | def get_service_config_string(self, service: str, *, data_dir: Path) -...
    method _start_service (line 265) | async def _start_service(self, service: str = "kafka") -> None:
    method _start_zookeeper (line 272) | async def _start_zookeeper(self) -> None:
    method _start_kafka (line 279) | async def _start_kafka(self) -> None:
    method _create_topics (line 286) | async def _create_topics(self) -> None:
    method __enter__ (line 293) | def __enter__(self) -> str:
    method __exit__ (line 297) | def __exit__(self, *args: Any, **kwargs: Any) -> None:
    method __aenter__ (line 300) | async def __aenter__(self) -> str:
    method __aexit__ (line 304) | async def __aexit__(self, *args: Any, **kwargs: Any) -> None:
  function _check_deps (line 309) | def _check_deps(cls: ApacheKafkaBroker) -> None:
  function run_and_match (line 325) | async def run_and_match(
  function is_port_in_use (line 407) | def is_port_in_use(port: Union[int, str]) -> bool:
  function get_free_port (line 421) | def get_free_port() -> str:
  function write_config_and_run (line 434) | async def write_config_and_run(
  function get_service_config_string (line 459) | def get_service_config_string(
  function _start_service (line 479) | async def _start_service(self: ApacheKafkaBroker, service: str = "kafka"...
  function _start_kafka (line 546) | async def _start_kafka(self: ApacheKafkaBroker) -> None:
  function _start_zookeeper (line 552) | async def _start_zookeeper(self: ApacheKafkaBroker) -> None:
  function _create_topics (line 558) | async def _create_topics(self: ApacheKafkaBroker) -> None:
  function _start (line 589) | async def _start(self: ApacheKafkaBroker) -> str:
  function _stop (line 615) | async def _stop(self: ApacheKafkaBroker) -> None:
  function start (line 624) | def start(self: ApacheKafkaBroker) -> str:
  function stop (line 668) | def stop(self: ApacheKafkaBroker) -> None:
  function _start_broker (line 683) | async def _start_broker(broker: Any) -> Union[Any, Exception]:
  function _stop_broker (line 691) | async def _stop_broker(broker: Any) -> Union[Any, Exception]:
  function _get_unique_local_brokers_to_start (line 699) | async def _get_unique_local_brokers_to_start(
  function _start_and_stop_brokers (line 739) | async def _start_and_stop_brokers(brokers: List[T]) -> AsyncIterator[None]:
  function start_apache_kafka_brokers (line 761) | async def start_apache_kafka_brokers(
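
The `is_port_in_use`/`get_free_port` helpers listed above have a standard socket-based shape: bind to port 0 so the OS picks an unused port, and attempt a connect to test occupancy. The sketch below is one plausible reading of those signatures, not FastKafka's code.

```python
# Port helpers: OS-assigned free port and a connect-based occupancy check.
import socket
from typing import Union

def is_port_in_use(port: Union[int, str]) -> bool:
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
        # connect_ex returns 0 only when something is listening on the port.
        return s.connect_ex(("127.0.0.1", int(port))) == 0

def get_free_port() -> str:
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
        s.bind(("", 0))  # port 0 -> kernel assigns an unused port
        return str(s.getsockname()[1])
```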

FILE: fastkafka/_testing/in_memory_broker.py
  class KafkaRecord (line 36) | class KafkaRecord:
  class KafkaPartition (line 50) | class KafkaPartition:
    method __init__ (line 51) | def __init__(self, *, partition: int, topic: str):
    method write (line 63) | def write(self, value: bytes, key: Optional[bytes] = None) -> RecordMe...
    method read (line 93) | def read(self, offset: int) -> Tuple[List[KafkaRecord], int]:
    method latest_offset (line 105) | def latest_offset(self) -> int:
  class KafkaTopic (line 115) | class KafkaTopic:
    method __init__ (line 116) | def __init__(self, topic: str, num_partitions: int = 1):
    method read (line 131) | def read(  # type: ignore
    method write_with_partition (line 148) | def write_with_partition(  # type: ignore
    method write_with_key (line 165) | def write_with_key(self, value: bytes, key: bytes) -> RecordMetadata: ...
    method write (line 179) | def write(  # type: ignore
    method latest_offset (line 206) | def latest_offset(self, partition: int) -> int:
  function split_list (line 219) | def split_list(list_to_split: List[Any], split_size: int) -> List[List[A...
  class GroupMetadata (line 236) | class GroupMetadata:
    method __init__ (line 237) | def __init__(self, num_partitions: int):
    method subscribe (line 249) | def subscribe(self, consumer_id: uuid.UUID) -> None:
    method unsubscribe (line 259) | def unsubscribe(self, consumer_id: uuid.UUID) -> None:
    method rebalance (line 269) | def rebalance(self) -> None:
    method assign_partitions (line 281) | def assign_partitions(self, partitions_per_actor: int) -> None:
    method get_partitions (line 290) | def get_partitions(
    method set_offset (line 309) | def set_offset(self, partition: int, offset: int) -> None:
  class InMemoryBroker (line 321) | class InMemoryBroker:
    method __init__ (line 322) | def __init__(
    method connect (line 331) | def connect(self) -> uuid.UUID:
    method dissconnect (line 334) | def dissconnect(self, consumer_id: uuid.UUID) -> None:
    method subscribe (line 343) | def subscribe(
    method unsubscribe (line 348) | def unsubscribe(
    method read (line 353) | def read(  # type: ignore
    method write (line 364) | def write(  # type: ignore
    method lifecycle (line 376) | def lifecycle(self) -> Iterator["InMemoryBroker"]:
    method _start (line 385) | async def _start(self) -> str:
    method _stop (line 396) | async def _stop(self) -> None:
  function subscribe (line 405) | def subscribe(
  function unsubscribe (line 434) | def unsubscribe(
  function write (line 454) | def write(  # type: ignore
  function read (line 487) | def read(  # type: ignore
  class InMemoryConsumer (line 538) | class InMemoryConsumer:
    method __init__ (line 539) | def __init__(
    method __call__ (line 551) | def __call__(self, **kwargs: Any) -> "InMemoryConsumer":
    method start (line 569) | async def start(self, **kwargs: Any) -> None:
    method stop (line 573) | async def stop(self, **kwargs: Any) -> None:
    method subscribe (line 577) | def subscribe(self, topics: List[str], **kwargs: Any) -> None:
    method getmany (line 581) | async def getmany(  # type: ignore
  function start (line 589) | async def start(self: InMemoryConsumer, **kwargs: Any) -> None:
  function subscribe (line 606) | def subscribe(self: InMemoryConsumer, topics: List[str], **kwargs: Any) ...
  function stop (line 632) | async def stop(self: InMemoryConsumer, **kwargs: Any) -> None:
  function getmany (line 653) | async def getmany(  # type: ignore
  class InMemoryProducer (line 676) | class InMemoryProducer:
    method __init__ (line 677) | def __init__(self, broker: InMemoryBroker, **kwargs: Any) -> None:
    method __call__ (line 683) | def __call__(self, **kwargs: Any) -> "InMemoryProducer":
    method start (line 694) | async def start(self, **kwargs: Any) -> None:
    method stop (line 698) | async def stop(self, **kwargs: Any) -> None:
    method send (line 702) | async def send(  # type: ignore
    method partitions_for (line 712) | async def partitions_for(self, topic: str) -> List[int]:
    method _partition (line 716) | def _partition(
    method create_batch (line 722) | def create_batch(self) -> "MockBatch":
    method send_batch (line 726) | async def send_batch(self, batch: "MockBatch", topic: str, partition: ...
  function start (line 732) | async def start(self: InMemoryProducer, **kwargs: Any) -> None:
  function stop (line 749) | async def stop(self: InMemoryProducer, **kwargs: Any) -> None:
  function send (line 763) | async def send(  # type: ignore
  function partitions_for (line 806) | async def partitions_for(self: InMemoryProducer, topic: str) -> List[int]:
  function _partition (line 821) | def _partition(
  class MockBatch (line 843) | class MockBatch:
    method __init__ (line 844) | def __init__(self) -> None:
    method append (line 850) | def append(  # type: ignore
  function create_batch (line 878) | def create_batch(self: InMemoryProducer) -> "MockBatch":
  function send_batch (line 890) | async def send_batch(
  function lifecycle (line 913) | def lifecycle(self: InMemoryBroker) -> Iterator[InMemoryBroker]:
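
The `KafkaPartition` listing above (`write`, `read(offset)`, `latest_offset`) implies an append-only log where offsets are list indices. A minimal sketch of those semantics; the record fields and return shapes are assumptions based on the signatures, not FastKafka's implementation:

```python
# Append-only in-memory partition log: write() appends and returns metadata,
# read(offset) returns records at or past the offset plus the next offset.
from dataclasses import dataclass
from typing import List, Optional, Tuple

@dataclass
class Record:
    topic: str
    partition: int
    offset: int
    value: bytes
    key: Optional[bytes] = None

class PartitionSketch:
    def __init__(self, *, partition: int, topic: str) -> None:
        self.partition = partition
        self.topic = topic
        self._log: List[Record] = []

    def write(self, value: bytes, key: Optional[bytes] = None) -> Record:
        record = Record(self.topic, self.partition, len(self._log), value, key)
        self._log.append(record)
        return record

    def read(self, offset: int) -> Tuple[List[Record], int]:
        # Everything at or past `offset`, plus the offset to poll from next.
        return self._log[offset:], self.latest_offset()

    def latest_offset(self) -> int:
        return len(self._log)
```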

FILE: fastkafka/_testing/local_redpanda_broker.py
  function get_redpanda_docker_cmd (line 37) | def get_redpanda_docker_cmd(
  class LocalRedpandaBroker (line 84) | class LocalRedpandaBroker:
    method __init__ (line 88) | def __init__(
    method is_started (line 123) | def is_started(self) -> bool:
    method _check_deps (line 136) | async def _check_deps(cls) -> None:
    method _start (line 143) | async def _start(self) -> str:
    method start (line 150) | def start(self) -> str:
    method stop (line 157) | def stop(self) -> None:
    method _stop (line 161) | async def _stop(self) -> None:
    method get_service_config_string (line 168) | def get_service_config_string(self, service: str, *, data_dir: Path) -...
    method _start_redpanda (line 176) | async def _start_redpanda(self) -> None:
    method _create_topics (line 183) | async def _create_topics(self) -> None:
    method __enter__ (line 190) | def __enter__(self) -> str:
    method __exit__ (line 193) | def __exit__(self, *args: Any, **kwargs: Any) -> None:
    method __aenter__ (line 196) | async def __aenter__(self) -> str:
    method __aexit__ (line 199) | async def __aexit__(self, *args: Any, **kwargs: Any) -> None:
  function check_docker (line 203) | async def check_docker(tag: str = "v23.1.2") -> bool:
  function _check_deps (line 227) | async def _check_deps(cls: LocalRedpandaBroker) -> None:
  function _start_redpanda (line 235) | async def _start_redpanda(self: LocalRedpandaBroker, service: str = "red...
  function _create_topics (line 273) | async def _create_topics(self: LocalRedpandaBroker) -> None:
  function _start (line 303) | async def _start(self: LocalRedpandaBroker) -> str:
  function _stop (line 324) | async def _stop(self: LocalRedpandaBroker) -> None:
  function start (line 333) | def start(self: LocalRedpandaBroker) -> str:
  function stop (line 376) | def stop(self: LocalRedpandaBroker) -> None:
  function start_redpanda_brokers (line 392) | async def start_redpanda_brokers(
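
The `get_redpanda_docker_cmd` helper above presumably builds the `docker run` argv for a single-node Redpanda container. A sketch of that shape follows; the image path and `redpanda start` flags are illustrative assumptions, not FastKafka's exact command line.

```python
# Build a docker argv for a dev Redpanda node. Flags are assumptions.
from typing import List

def get_redpanda_docker_cmd_sketch(
    *, listener_port: int = 9092, tag: str = "v23.1.2"
) -> List[str]:
    return [
        "docker", "run", "--rm",
        "-p", f"{listener_port}:{listener_port}",
        f"docker.redpanda.com/vectorized/redpanda:{tag}",
        "redpanda", "start",
        "--overprovisioned",  # dev-friendly resource settings
        "--smp", "1",
        "--memory", "1G",
    ]
```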

FILE: fastkafka/_testing/test_utils.py
  function nb_safe_seed (line 32) | def nb_safe_seed(s: str) -> Callable[[int], int]:
  function mock_AIOKafkaProducer_send (line 50) | def mock_AIOKafkaProducer_send() -> Generator[unittest.mock.Mock, None, ...
  function run_script_and_cancel (line 62) | async def run_script_and_cancel(
  function display_docs (line 134) | async def display_docs(docs_path: str, port: int = 4000) -> None:
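
The `nb_safe_seed(s: str) -> Callable[[int], int]` signature above suggests deriving a stable base seed from a string (e.g. a notebook name) and returning a function that maps an offset to a concrete seed, so tests stay reproducible across processes. The implementation details in this sketch are assumptions:

```python
# Stable string-derived seeds: hashlib (unlike builtin hash()) is
# deterministic across interpreter runs.
import hashlib
from typing import Callable

def nb_safe_seed_sketch(s: str) -> Callable[[int], int]:
    base = int(hashlib.sha256(s.encode()).hexdigest(), 16) % (2**31)

    def seed(offset: int) -> int:
        return (base + offset) % (2**31)

    return seed
```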

FILE: fastkafka/encoder.py
  function dummy (line 27) | def dummy() -> None:

FILE: fastkafka/executors.py
  function dummy (line 14) | def dummy() -> None:

FILE: fastkafka/testing.py
  function dummy (line 33) | def dummy() -> None:

Condensed preview — 352 files, each showing path, character count, and a content snippet (4,572K chars of structured content in total).
[
  {
    "path": ".github/workflows/codeql.yml",
    "chars": 3339,
    "preview": "# For most projects, this workflow file will not need changing; you simply need\n# to commit it to your repository.\n#\n# Y"
  },
  {
    "path": ".github/workflows/dependency-review.yml",
    "chars": 885,
    "preview": "# Dependency Review Action\n#\n# This Action will scan dependency manifest files that change as part of a Pull Request, su"
  },
  {
    "path": ".github/workflows/deploy.yaml",
    "chars": 370,
    "preview": "name: Deploy FastKafka documentation to the GitHub Pages\n\non:\n  push:\n    branches: [ \"main\", \"master\"]\n  workflow_dispa"
  },
  {
    "path": ".github/workflows/index-docs-for-fastkafka-chat.yaml",
    "chars": 1542,
    "preview": "name: Index docs for fastkafka chat application\n\non:\n  workflow_run:\n    workflows: [\"pages-build-deployment\"]\n    types"
  },
  {
    "path": ".github/workflows/test.yaml",
    "chars": 3218,
    "preview": "name: CI\non:  [workflow_dispatch, push]\n\njobs:\n  mypy_static_analysis:\n    runs-on: ubuntu-latest\n    steps:\n      - use"
  },
  {
    "path": ".gitignore",
    "chars": 2386,
    "preview": "# Byte-compiled / optimized / DLL files\n__pycache__/\n*.py[cod]\n*$py.class\n\n# C extensions\n*.so\n\n# Distribution / packagi"
  },
  {
    "path": ".pre-commit-config.yaml",
    "chars": 639,
    "preview": "# See https://pre-commit.com for more information\n# See https://pre-commit.com/hooks.html for more hooks\n\nrepos:\n-   rep"
  },
  {
    "path": ".semgrepignore",
    "chars": 8,
    "preview": "docker/\n"
  },
  {
    "path": "CHANGELOG.md",
    "chars": 12788,
    "preview": "# Release notes\n\n<!-- do not remove -->\n\n## 0.8.0\n\n### New Features\n\n- Add support for Pydantic v2 ([#408](https://githu"
  },
  {
    "path": "CNAME",
    "chars": 18,
    "preview": "fastkafka.airt.ai\n"
  },
  {
    "path": "CONTRIBUTING.md",
    "chars": 12565,
    "preview": "# Contributing to FastKafka\n\nFirst off, thanks for taking the time to contribute! ❤️\n\nAll types of contributions are enc"
  },
  {
    "path": "LICENSE",
    "chars": 11357,
    "preview": "                                 Apache License\n                           Version 2.0, January 2004\n                   "
  },
  {
    "path": "MANIFEST.in",
    "chars": 111,
    "preview": "include settings.ini\ninclude LICENSE\ninclude CONTRIBUTING.md\ninclude README.md\nrecursive-exclude * __pycache__\n"
  },
  {
    "path": "README.md",
    "chars": 23947,
    "preview": "# FastKafka\n\n<!-- WARNING: THIS FILE WAS AUTOGENERATED! DO NOT EDIT! -->\n\n<b>Effortless Kafka integration for your web s"
  },
  {
    "path": "docker/.semgrepignore",
    "chars": 9,
    "preview": "dev.yml\n\n"
  },
  {
    "path": "docker/dev.yml",
    "chars": 756,
    "preview": "version: \"3\"\nservices:\n    fastkafka-devel:  #nosemgrep\n        image: ghcr.io/airtai/nbdev-mkdocs\n        hostname: $DO"
  },
  {
    "path": "docusaurus/babel.config.js",
    "chars": 89,
    "preview": "module.exports = {\n  presets: [require.resolve('@docusaurus/core/lib/babel/preset')],\n};\n"
  },
  {
    "path": "docusaurus/docusaurus.config.js",
    "chars": 7499,
    "preview": "// @ts-check\n// Note: type annotations allow type checking and IDEs autocompletion\n\nconst lightCodeTheme = require('pris"
  },
  {
    "path": "docusaurus/package.json",
    "chars": 1189,
    "preview": "{\n  \"name\": \"fastkafka\",\n  \"version\": \"0.0.0\",\n  \"private\": true,\n  \"scripts\": {\n    \"docusaurus\": \"docusaurus\",\n    \"st"
  },
  {
    "path": "docusaurus/scripts/build_docusaurus_docs.sh",
    "chars": 3431,
    "preview": "#!/bin/bash\n\n# exit when any command fails\nset -e\n\necho \"Cleanup existing build artifacts\"\nrm -rf docusaurus/docs\n\necho "
  },
  {
    "path": "docusaurus/scripts/install_docusaurus_deps.sh",
    "chars": 81,
    "preview": "#!/bin/bash\n\necho \"Install docusaurus dependencies\"\ncd docusaurus && npm install\n"
  },
  {
    "path": "docusaurus/scripts/serve_docusaurus_docs.sh",
    "chars": 83,
    "preview": "#!/bin/bash\n\necho \"Serve docusaurus documentation\"\ncd docusaurus && npm run start\n\n"
  },
  {
    "path": "docusaurus/scripts/update_readme.sh",
    "chars": 181,
    "preview": "#!/bin/bash\n\n# exit when any command fails\nset -e\n\necho \"Run nbdev_readme and fix symbol links\"\npython3 -c \"from fastkaf"
  },
  {
    "path": "docusaurus/sidebars.js",
    "chars": 1514,
    "preview": "module.exports = {\ntutorialSidebar: [\n    'index', {'Guides': \n    [{'Writing services': ['guides/Guide_11_Consumes_Basi"
  },
  {
    "path": "docusaurus/src/components/BrowserWindow/index.js",
    "chars": 1661,
    "preview": "/**\n * Copyright (c) Facebook, Inc. and its affiliates.\n *\n * This source code is licensed under the MIT license found i"
  },
  {
    "path": "docusaurus/src/components/BrowserWindow/styles.module.css",
    "chars": 1551,
    "preview": "/**\n * Copyright (c) Facebook, Inc. and its affiliates.\n *\n * This source code is licensed under the MIT license found i"
  },
  {
    "path": "docusaurus/src/components/HomepageCommunity/index.js",
    "chars": 11540,
    "preview": "import React, { useState, useEffect } from 'react';\nimport clsx from 'clsx';\nimport styles from './styles.module.css';\n\n"
  },
  {
    "path": "docusaurus/src/components/HomepageCommunity/styles.module.css",
    "chars": 1974,
    "preview": ".features {\n  display: flex;\n  align-items: center;\n  padding: 3rem 0;\n  width: 100%;\n  background-color: #60bee4;\n}\n\n.f"
  },
  {
    "path": "docusaurus/src/components/HomepageFAQ/index.js",
    "chars": 3336,
    "preview": "import React from 'react';\nimport clsx from 'clsx';\nimport {\n  Accordion,\n  AccordionItem,\n  AccordionItemHeading,\n  Acc"
  },
  {
    "path": "docusaurus/src/components/HomepageFAQ/styles.module.css",
    "chars": 1169,
    "preview": ".features {\n  display: flex;\n  align-items: center;\n  padding: 2rem 0 8rem 0;\n  width: 100%;\n  background-color: #076d9e"
  },
  {
    "path": "docusaurus/src/components/HomepageFastkafkaChat/index.js",
    "chars": 1196,
    "preview": "import React from 'react';\nimport clsx from 'clsx';\n\nimport styles from './styles.module.css';\n\n\n\n// const FeatureList ="
  },
  {
    "path": "docusaurus/src/components/HomepageFastkafkaChat/styles.module.css",
    "chars": 1646,
    "preview": ".features {\n  display: flex;\n  align-items: center;\n  padding: 5rem 0;\n  width: 100%;\n  background: rgb(82, 175, 216);\n "
  },
  {
    "path": "docusaurus/src/components/HomepageFeatures/index.js",
    "chars": 1449,
    "preview": "import React from 'react';\nimport clsx from 'clsx';\n\nimport styles from './styles.module.css';\n\n\n\nconst FeatureList = [\n"
  },
  {
    "path": "docusaurus/src/components/HomepageFeatures/styles.module.css",
    "chars": 886,
    "preview": ".features {\n  display: flex;\n  align-items: center;\n  padding: 1rem 0 4rem 0;\n  width: 100%;\n  background: rgb(20, 116, "
  },
  {
    "path": "docusaurus/src/components/HomepageWhatYouGet/index.js",
    "chars": 2056,
    "preview": "import React from 'react';\nimport clsx from 'clsx';\nimport Link from '@docusaurus/Link';\n\nimport styles from './styles.m"
  },
  {
    "path": "docusaurus/src/components/HomepageWhatYouGet/styles.module.css",
    "chars": 1056,
    "preview": ".features {\n  display: flex;\n  align-items: center;\n  padding: 4rem 0 0 0;\n  width: 100%;\n  background-color: #60bee4;\n "
  },
  {
    "path": "docusaurus/src/components/RobotFooterIcon/index.js",
    "chars": 348,
    "preview": "import React from 'react';\nimport clsx from 'clsx';\n\nimport styles from './styles.module.css';\n\nexport default function "
  },
  {
    "path": "docusaurus/src/components/RobotFooterIcon/styles.module.css",
    "chars": 190,
    "preview": ".robotFooterContainer {\n  text-align: center;\n  position: relative;\n}\n\n.robotFooterIcon {\n  width: 7rem;\n  height: auto;"
  },
  {
    "path": "docusaurus/src/css/custom.css",
    "chars": 15666,
    "preview": "/**\n * Any CSS included here will be global. The classic template\n * bundles Infima by default. Infima is a CSS framewor"
  },
  {
    "path": "docusaurus/src/pages/demo/index.js",
    "chars": 647,
    "preview": "import React from 'react';\nimport clsx from 'clsx';\nimport Layout from '@theme/Layout';\nimport YouTube from 'react-youtu"
  },
  {
    "path": "docusaurus/src/pages/demo/styles.module.css",
    "chars": 250,
    "preview": ".features {\n  display: flex;\n  align-items: center;\n  padding: 2rem 0;\n  width: 100%;\n}\n\n.header {\n  font-size: 4rem;\n  "
  },
  {
    "path": "docusaurus/src/pages/index.js",
    "chars": 1764,
    "preview": "import React from 'react';\nimport clsx from 'clsx';\nimport Link from '@docusaurus/Link';\nimport useDocusaurusContext fro"
  },
  {
    "path": "docusaurus/src/pages/index.module.css",
    "chars": 1287,
    "preview": "/**\n * CSS files with the .module.css suffix will be treated as CSS modules\n * and scoped locally.\n */\n\n.heroBanner {\n  "
  },
  {
    "path": "docusaurus/src/utils/prismDark.mjs",
    "chars": 1386,
    "preview": "/**\n * Copyright (c) Facebook, Inc. and its affiliates.\n *\n * This source code is licensed under the MIT license found i"
  },
  {
    "path": "docusaurus/src/utils/prismLight.mjs",
    "chars": 1816,
    "preview": "/**\n * Copyright (c) Facebook, Inc. and its affiliates.\n *\n * This source code is licensed under the MIT license found i"
  },
  {
    "path": "docusaurus/static/.nojekyll",
    "chars": 0,
    "preview": ""
  },
  {
    "path": "docusaurus/static/CNAME",
    "chars": 18,
    "preview": "fastkafka.airt.ai\n"
  },
  {
    "path": "docusaurus/versioned_docs/version-0.5.0/CHANGELOG.md",
    "chars": 6237,
    "preview": "# Release notes\n\n<!-- do not remove -->\n\n## 0.5.0\n\n### New Features\n\n- Significant speedup of Kafka producer ([#236](htt"
  },
  {
    "path": "docusaurus/versioned_docs/version-0.5.0/CNAME",
    "chars": 18,
    "preview": "fastkafka.airt.ai\n"
  },
  {
    "path": "docusaurus/versioned_docs/version-0.5.0/api/fastkafka/FastKafka.md",
    "chars": 38234,
    "preview": "## `fastkafka.FastKafka` {#fastkafka.FastKafka}\n\n### `__init__` {#init}\n\n`def __init__(self, title: Optional[str] = None"
  },
  {
    "path": "docusaurus/versioned_docs/version-0.5.0/api/fastkafka/KafkaEvent.md",
    "chars": 318,
    "preview": "## `fastkafka.KafkaEvent` {#fastkafka.KafkaEvent}\n\n\nA generic class for representing Kafka events. Based on BaseSubmodel"
  },
  {
    "path": "docusaurus/versioned_docs/version-0.5.0/api/fastkafka/encoder/avsc_to_pydantic.md",
    "chars": 365,
    "preview": "## `fastkafka.encoder.avsc_to_pydantic` {#fastkafka.encoder.avsc_to_pydantic}\n\n### `avsc_to_pydantic` {#avsc_to_pydantic"
  },
  {
    "path": "docusaurus/versioned_docs/version-0.5.0/api/fastkafka/testing/ApacheKafkaBroker.md",
    "chars": 1058,
    "preview": "## `fastkafka.testing.ApacheKafkaBroker` {#fastkafka.testing.ApacheKafkaBroker}\n\n\nApacheKafkaBroker class, used for runn"
  },
  {
    "path": "docusaurus/versioned_docs/version-0.5.0/api/fastkafka/testing/LocalRedpandaBroker.md",
    "chars": 1725,
    "preview": "## `fastkafka.testing.LocalRedpandaBroker` {#fastkafka.testing.LocalRedpandaBroker}\n\n\nLocalRedpandaBroker class, used fo"
  },
  {
    "path": "docusaurus/versioned_docs/version-0.5.0/api/fastkafka/testing/Tester.md",
    "chars": 24646,
    "preview": "## `fastkafka.testing.Tester` {#fastkafka.testing.Tester}\n\n### `__init__` {#init}\n\n`def __init__(self, app: Union[fastka"
  },
  {
    "path": "docusaurus/versioned_docs/version-0.5.0/cli/fastkafka.md",
    "chars": 3164,
    "preview": "# `fastkafka`\n\n**Usage**:\n\n```console\n$ fastkafka [OPTIONS] COMMAND [ARGS]...\n```\n\n**Options**:\n\n* `--install-completion"
  },
  {
    "path": "docusaurus/versioned_docs/version-0.5.0/cli/run_fastkafka_server_process.md",
    "chars": 643,
    "preview": "# `run_fastkafka_server_process`\n\n**Usage**:\n\n```console\n$ run_fastkafka_server_process [OPTIONS] APP\n```\n\n**Arguments**"
  },
  {
    "path": "docusaurus/versioned_docs/version-0.5.0/guides/Guide_00_FastKafka_Demo.md",
    "chars": 26543,
    "preview": "FastKafka tutorial\n================\n\n<!-- WARNING: THIS FILE WAS AUTOGENERATED! DO NOT EDIT! -->\n\n[FastKafka](https://fa"
  },
  {
    "path": "docusaurus/versioned_docs/version-0.5.0/guides/Guide_01_Intro.md",
    "chars": 3870,
    "preview": "Intro\n================\n\n<!-- WARNING: THIS FILE WAS AUTOGENERATED! DO NOT EDIT! -->\n\nThis tutorial will show you how to "
  },
  {
    "path": "docusaurus/versioned_docs/version-0.5.0/guides/Guide_02_First_Steps.md",
    "chars": 11067,
    "preview": "First Steps\n================\n\n<!-- WARNING: THIS FILE WAS AUTOGENERATED! DO NOT EDIT! -->\n\n## Creating a simple Kafka co"
  },
  {
    "path": "docusaurus/versioned_docs/version-0.5.0/guides/Guide_03_Authentication.md",
    "chars": 623,
    "preview": "Authentication\n================\n\n<!-- WARNING: THIS FILE WAS AUTOGENERATED! DO NOT EDIT! -->\n\n## TLS Authentication\n\nsas"
  },
  {
    "path": "docusaurus/versioned_docs/version-0.5.0/guides/Guide_04_Github_Actions_Workflow.md",
    "chars": 1493,
    "preview": "Deploy FastKafka docs to GitHub Pages\n================\n\n<!-- WARNING: THIS FILE WAS AUTOGENERATED! DO NOT EDIT! -->\n\n## "
  },
  {
    "path": "docusaurus/versioned_docs/version-0.5.0/guides/Guide_05_Lifespan_Handler.md",
    "chars": 11988,
    "preview": "Lifespan Events\n================\n\n<!-- WARNING: THIS FILE WAS AUTOGENERATED! DO NOT EDIT! -->\n\nDid you know that you can"
  },
  {
    "path": "docusaurus/versioned_docs/version-0.5.0/guides/Guide_06_Benchmarking_FastKafka.md",
    "chars": 15327,
    "preview": "Benchmarking FastKafka app\n================\n\n<!-- WARNING: THIS FILE WAS AUTOGENERATED! DO NOT EDIT! -->\n\n## Prerequisit"
  },
  {
    "path": "docusaurus/versioned_docs/version-0.5.0/guides/Guide_07_Encoding_and_Decoding_Messages_with_FastKafka.md",
    "chars": 22300,
    "preview": "Encoding and Decoding Kafka Messages with FastKafka\n================\n\n<!-- WARNING: THIS FILE WAS AUTOGENERATED! DO NOT "
  },
  {
    "path": "docusaurus/versioned_docs/version-0.5.0/guides/Guide_11_Consumes_Basics.md",
    "chars": 6865,
    "preview": "@consumes basics\n================\n\n<!-- WARNING: THIS FILE WAS AUTOGENERATED! DO NOT EDIT! -->\n\nYou can use `@consumes` "
  },
  {
    "path": "docusaurus/versioned_docs/version-0.5.0/guides/Guide_21_Produces_Basics.md",
    "chars": 7312,
    "preview": "@produces basics\n================\n\n<!-- WARNING: THIS FILE WAS AUTOGENERATED! DO NOT EDIT! -->\n\nYou can use `@produces` "
  },
  {
    "path": "docusaurus/versioned_docs/version-0.5.0/guides/Guide_22_Partition_Keys.md",
    "chars": 5003,
    "preview": "Defining a partition key\n================\n\n<!-- WARNING: THIS FILE WAS AUTOGENERATED! DO NOT EDIT! -->\n\nPartition keys a"
  },
  {
    "path": "docusaurus/versioned_docs/version-0.5.0/guides/Guide_30_Using_docker_to_deploy_fastkafka.md",
    "chars": 7377,
    "preview": "Deploying FastKafka using Docker\n================\n\n<!-- WARNING: THIS FILE WAS AUTOGENERATED! DO NOT EDIT! -->\n\n## Build"
  },
  {
    "path": "docusaurus/versioned_docs/version-0.5.0/guides/Guide_31_Using_redpanda_to_test_fastkafka.md",
    "chars": 14540,
    "preview": "Using Redpanda to test FastKafka\n================\n\n<!-- WARNING: THIS FILE WAS AUTOGENERATED! DO NOT EDIT! -->\n\n## What "
  },
  {
    "path": "docusaurus/versioned_docs/version-0.5.0/index.md",
    "chars": 28440,
    "preview": "FastKafka\n================\n\n<!-- WARNING: THIS FILE WAS AUTOGENERATED! DO NOT EDIT! -->\n\n<b>Effortless Kafka integration"
  },
  {
    "path": "docusaurus/versioned_docs/version-0.5.0/overrides/css/extra.css",
    "chars": 0,
    "preview": ""
  },
  {
    "path": "docusaurus/versioned_docs/version-0.5.0/overrides/js/extra.js",
    "chars": 0,
    "preview": ""
  },
  {
    "path": "docusaurus/versioned_docs/version-0.5.0/overrides/js/math.js",
    "chars": 300,
    "preview": "window.MathJax = {\n  tex: {\n    inlineMath: [[\"\\\\(\", \"\\\\)\"]],\n    displayMath: [[\"\\\\[\", \"\\\\]\"]],\n    processEscapes: tru"
  },
  {
    "path": "docusaurus/versioned_docs/version-0.5.0/overrides/js/mathjax.js",
    "chars": 300,
    "preview": "window.MathJax = {\n  tex: {\n    inlineMath: [[\"\\\\(\", \"\\\\)\"]],\n    displayMath: [[\"\\\\[\", \"\\\\]\"]],\n    processEscapes: tru"
  },
  {
    "path": "docusaurus/versioned_docs/version-0.6.0/CHANGELOG.md",
    "chars": 9121,
    "preview": "# Release notes\n\n<!-- do not remove -->\n\n## 0.6.0\n\n### New Features\n\n- Timestamps added to CLI commands ([#283](https://"
  },
  {
    "path": "docusaurus/versioned_docs/version-0.6.0/CNAME",
    "chars": 18,
    "preview": "fastkafka.airt.ai\n"
  },
  {
    "path": "docusaurus/versioned_docs/version-0.6.0/CONTRIBUTING.md",
    "chars": 11742,
    "preview": "# Contributing to fastkafka\n\nFirst off, thanks for taking the time to contribute! ❤️\n\nAll types of contributions are enc"
  },
  {
    "path": "docusaurus/versioned_docs/version-0.6.0/LICENSE.md",
    "chars": 11357,
    "preview": "                                 Apache License\n                           Version 2.0, January 2004\n                   "
  },
  {
    "path": "docusaurus/versioned_docs/version-0.6.0/api/fastkafka/EventMetadata.md",
    "chars": 674,
    "preview": "## `fastkafka.EventMetadata` {#fastkafka.EventMetadata}\n\n\nA class for encapsulating Kafka record metadata.\n\n**Parameters"
  },
  {
    "path": "docusaurus/versioned_docs/version-0.6.0/api/fastkafka/FastKafka.md",
    "chars": 39729,
    "preview": "## `fastkafka.FastKafka` {#fastkafka.FastKafka}\n\n### `__init__` {#init}\n\n`def __init__(self, title: Optional[str] = None"
  },
  {
    "path": "docusaurus/versioned_docs/version-0.6.0/api/fastkafka/KafkaEvent.md",
    "chars": 318,
    "preview": "## `fastkafka.KafkaEvent` {#fastkafka.KafkaEvent}\n\n\nA generic class for representing Kafka events. Based on BaseSubmodel"
  },
  {
    "path": "docusaurus/versioned_docs/version-0.6.0/api/fastkafka/encoder/AvroBase.md",
    "chars": 120,
    "preview": "## `fastkafka.encoder.AvroBase` {#fastkafka.encoder.AvroBase}\n\n\nThis is base pydantic class that will add some methods\n\n"
  },
  {
    "path": "docusaurus/versioned_docs/version-0.6.0/api/fastkafka/encoder/avro_decoder.md",
    "chars": 482,
    "preview": "## `fastkafka.encoder.avro_decoder` {#fastkafka.encoder.avro_decoder}\n\n### `avro_decoder` {#avro_decoder}\n\n`def avro_dec"
  },
  {
    "path": "docusaurus/versioned_docs/version-0.6.0/api/fastkafka/encoder/avro_encoder.md",
    "chars": 353,
    "preview": "## `fastkafka.encoder.avro_encoder` {#fastkafka.encoder.avro_encoder}\n\n### `avro_encoder` {#avro_encoder}\n\n`def avro_enc"
  },
  {
    "path": "docusaurus/versioned_docs/version-0.6.0/api/fastkafka/encoder/avsc_to_pydantic.md",
    "chars": 365,
    "preview": "## `fastkafka.encoder.avsc_to_pydantic` {#fastkafka.encoder.avsc_to_pydantic}\n\n### `avsc_to_pydantic` {#avsc_to_pydantic"
  },
  {
    "path": "docusaurus/versioned_docs/version-0.6.0/api/fastkafka/encoder/json_decoder.md",
    "chars": 468,
    "preview": "## `fastkafka.encoder.json_decoder` {#fastkafka.encoder.json_decoder}\n\n### `json_decoder` {#json_decoder}\n\n`def json_dec"
  },
  {
    "path": "docusaurus/versioned_docs/version-0.6.0/api/fastkafka/encoder/json_encoder.md",
    "chars": 357,
    "preview": "## `fastkafka.encoder.json_encoder` {#fastkafka.encoder.json_encoder}\n\n### `json_encoder` {#json_encoder}\n\n`def json_enc"
  },
  {
    "path": "docusaurus/versioned_docs/version-0.6.0/api/fastkafka/executors/DynamicTaskExecutor.md",
    "chars": 1272,
    "preview": "## `fastkafka.executors.DynamicTaskExecutor` {#fastkafka.executors.DynamicTaskExecutor}\n\n\nA class that implements a dyna"
  },
  {
    "path": "docusaurus/versioned_docs/version-0.6.0/api/fastkafka/executors/SequentialExecutor.md",
    "chars": 1209,
    "preview": "## `fastkafka.executors.SequentialExecutor` {#fastkafka.executors.SequentialExecutor}\n\n\nA class that implements a sequen"
  },
  {
    "path": "docusaurus/versioned_docs/version-0.6.0/api/fastkafka/testing/ApacheKafkaBroker.md",
    "chars": 1058,
    "preview": "## `fastkafka.testing.ApacheKafkaBroker` {#fastkafka.testing.ApacheKafkaBroker}\n\n\nApacheKafkaBroker class, used for runn"
  },
  {
    "path": "docusaurus/versioned_docs/version-0.6.0/api/fastkafka/testing/LocalRedpandaBroker.md",
    "chars": 1725,
    "preview": "## `fastkafka.testing.LocalRedpandaBroker` {#fastkafka.testing.LocalRedpandaBroker}\n\n\nLocalRedpandaBroker class, used fo"
  },
  {
    "path": "docusaurus/versioned_docs/version-0.6.0/api/fastkafka/testing/Tester.md",
    "chars": 26143,
    "preview": "## `fastkafka.testing.Tester` {#fastkafka.testing.Tester}\n\n### `__init__` {#init}\n\n`def __init__(self, app: Union[fastka"
  },
  {
    "path": "docusaurus/versioned_docs/version-0.6.0/cli/fastkafka.md",
    "chars": 3165,
    "preview": "# `fastkafka`\n\n**Usage**:\n\n```console\n$ fastkafka [OPTIONS] COMMAND [ARGS]...\n```\n\n**Options**:\n\n* `--install-completion"
  },
  {
    "path": "docusaurus/versioned_docs/version-0.6.0/cli/run_fastkafka_server_process.md",
    "chars": 643,
    "preview": "# `run_fastkafka_server_process`\n\n**Usage**:\n\n```console\n$ run_fastkafka_server_process [OPTIONS] APP\n```\n\n**Arguments**"
  },
  {
    "path": "docusaurus/versioned_docs/version-0.6.0/guides/Guide_00_FastKafka_Demo.md",
    "chars": 26543,
    "preview": "FastKafka tutorial\n================\n\n<!-- WARNING: THIS FILE WAS AUTOGENERATED! DO NOT EDIT! -->\n\n[FastKafka](https://fa"
  },
  {
    "path": "docusaurus/versioned_docs/version-0.6.0/guides/Guide_01_Intro.md",
    "chars": 3870,
    "preview": "Intro\n================\n\n<!-- WARNING: THIS FILE WAS AUTOGENERATED! DO NOT EDIT! -->\n\nThis tutorial will show you how to "
  },
  {
    "path": "docusaurus/versioned_docs/version-0.6.0/guides/Guide_02_First_Steps.md",
    "chars": 11067,
    "preview": "First Steps\n================\n\n<!-- WARNING: THIS FILE WAS AUTOGENERATED! DO NOT EDIT! -->\n\n## Creating a simple Kafka co"
  },
  {
    "path": "docusaurus/versioned_docs/version-0.6.0/guides/Guide_03_Authentication.md",
    "chars": 623,
    "preview": "Authentication\n================\n\n<!-- WARNING: THIS FILE WAS AUTOGENERATED! DO NOT EDIT! -->\n\n## TLS Authentication\n\nsas"
  },
  {
    "path": "docusaurus/versioned_docs/version-0.6.0/guides/Guide_04_Github_Actions_Workflow.md",
    "chars": 1493,
    "preview": "Deploy FastKafka docs to GitHub Pages\n================\n\n<!-- WARNING: THIS FILE WAS AUTOGENERATED! DO NOT EDIT! -->\n\n## "
  },
  {
    "path": "docusaurus/versioned_docs/version-0.6.0/guides/Guide_05_Lifespan_Handler.md",
    "chars": 11988,
    "preview": "Lifespan Events\n================\n\n<!-- WARNING: THIS FILE WAS AUTOGENERATED! DO NOT EDIT! -->\n\nDid you know that you can"
  },
  {
    "path": "docusaurus/versioned_docs/version-0.6.0/guides/Guide_06_Benchmarking_FastKafka.md",
    "chars": 15327,
    "preview": "Benchmarking FastKafka app\n================\n\n<!-- WARNING: THIS FILE WAS AUTOGENERATED! DO NOT EDIT! -->\n\n## Prerequisit"
  },
  {
    "path": "docusaurus/versioned_docs/version-0.6.0/guides/Guide_07_Encoding_and_Decoding_Messages_with_FastKafka.md",
    "chars": 22379,
    "preview": "Encoding and Decoding Kafka Messages with FastKafka\n================\n\n<!-- WARNING: THIS FILE WAS AUTOGENERATED! DO NOT "
  },
  {
    "path": "docusaurus/versioned_docs/version-0.6.0/guides/Guide_11_Consumes_Basics.md",
    "chars": 11129,
    "preview": "@consumes basics\n================\n\n<!-- WARNING: THIS FILE WAS AUTOGENERATED! DO NOT EDIT! -->\n\nYou can use `@consumes` "
  },
  {
    "path": "docusaurus/versioned_docs/version-0.6.0/guides/Guide_21_Produces_Basics.md",
    "chars": 7312,
    "preview": "@produces basics\n================\n\n<!-- WARNING: THIS FILE WAS AUTOGENERATED! DO NOT EDIT! -->\n\nYou can use `@produces` "
  },
  {
    "path": "docusaurus/versioned_docs/version-0.6.0/guides/Guide_22_Partition_Keys.md",
    "chars": 5003,
    "preview": "Defining a partition key\n================\n\n<!-- WARNING: THIS FILE WAS AUTOGENERATED! DO NOT EDIT! -->\n\nPartition keys a"
  },
  {
    "path": "docusaurus/versioned_docs/version-0.6.0/guides/Guide_23_Batch_Producing.md",
    "chars": 6290,
    "preview": "Batch producing\n================\n\n<!-- WARNING: THIS FILE WAS AUTOGENERATED! DO NOT EDIT! -->\n\nIf you want to send your "
  },
  {
    "path": "docusaurus/versioned_docs/version-0.6.0/guides/Guide_30_Using_docker_to_deploy_fastkafka.md",
    "chars": 7377,
    "preview": "Deploying FastKafka using Docker\n================\n\n<!-- WARNING: THIS FILE WAS AUTOGENERATED! DO NOT EDIT! -->\n\n## Build"
  },
  {
    "path": "docusaurus/versioned_docs/version-0.6.0/guides/Guide_31_Using_redpanda_to_test_fastkafka.md",
    "chars": 14540,
    "preview": "Using Redpanda to test FastKafka\n================\n\n<!-- WARNING: THIS FILE WAS AUTOGENERATED! DO NOT EDIT! -->\n\n## What "
  },
  {
    "path": "docusaurus/versioned_docs/version-0.6.0/index.md",
    "chars": 28440,
    "preview": "FastKafka\n================\n\n<!-- WARNING: THIS FILE WAS AUTOGENERATED! DO NOT EDIT! -->\n\n<b>Effortless Kafka integration"
  },
  {
    "path": "docusaurus/versioned_docs/version-0.6.0/overrides/css/extra.css",
    "chars": 0,
    "preview": ""
  },
  {
    "path": "docusaurus/versioned_docs/version-0.6.0/overrides/js/extra.js",
    "chars": 0,
    "preview": ""
  },
  {
    "path": "docusaurus/versioned_docs/version-0.6.0/overrides/js/math.js",
    "chars": 300,
    "preview": "window.MathJax = {\n  tex: {\n    inlineMath: [[\"\\\\(\", \"\\\\)\"]],\n    displayMath: [[\"\\\\[\", \"\\\\]\"]],\n    processEscapes: tru"
  },
  {
    "path": "docusaurus/versioned_docs/version-0.6.0/overrides/js/mathjax.js",
    "chars": 300,
    "preview": "window.MathJax = {\n  tex: {\n    inlineMath: [[\"\\\\(\", \"\\\\)\"]],\n    displayMath: [[\"\\\\[\", \"\\\\]\"]],\n    processEscapes: tru"
  },
  {
    "path": "docusaurus/versioned_docs/version-0.7.0/CHANGELOG.md",
    "chars": 11504,
    "preview": "# Release notes\n\n<!-- do not remove -->\n\n## 0.7.0\n\n### New Features\n\n- Optional description argument to consumes and pro"
  },
  {
    "path": "docusaurus/versioned_docs/version-0.7.0/CNAME",
    "chars": 18,
    "preview": "fastkafka.airt.ai\n"
  },
  {
    "path": "docusaurus/versioned_docs/version-0.7.0/CONTRIBUTING.md",
    "chars": 12565,
    "preview": "# Contributing to fastkafka\n\nFirst off, thanks for taking the time to contribute! ❤️\n\nAll types of contributions are enc"
  },
  {
    "path": "docusaurus/versioned_docs/version-0.7.0/LICENSE.md",
    "chars": 11357,
    "preview": "                                 Apache License\n                           Version 2.0, January 2004\n                   "
  },
  {
    "path": "docusaurus/versioned_docs/version-0.7.0/api/fastkafka/EventMetadata.md",
    "chars": 983,
    "preview": "## `fastkafka.EventMetadata` {#fastkafka.EventMetadata}\n\n\nA class for encapsulating Kafka record metadata.\n\n**Parameters"
  },
  {
    "path": "docusaurus/versioned_docs/version-0.7.0/api/fastkafka/FastKafka.md",
    "chars": 43397,
    "preview": "## `fastkafka.FastKafka` {#fastkafka.FastKafka}\n\n### `__init__` {#init}\n\n`def __init__(self, title: Optional[str] = None"
  },
  {
    "path": "docusaurus/versioned_docs/version-0.7.0/api/fastkafka/KafkaEvent.md",
    "chars": 318,
    "preview": "## `fastkafka.KafkaEvent` {#fastkafka.KafkaEvent}\n\n\nA generic class for representing Kafka events. Based on BaseSubmodel"
  },
  {
    "path": "docusaurus/versioned_docs/version-0.7.0/api/fastkafka/encoder/AvroBase.md",
    "chars": 120,
    "preview": "## `fastkafka.encoder.AvroBase` {#fastkafka.encoder.AvroBase}\n\n\nThis is base pydantic class that will add some methods\n\n"
  },
  {
    "path": "docusaurus/versioned_docs/version-0.7.0/api/fastkafka/encoder/avro_decoder.md",
    "chars": 482,
    "preview": "## `fastkafka.encoder.avro_decoder` {#fastkafka.encoder.avro_decoder}\n\n### `avro_decoder` {#avro_decoder}\n\n`def avro_dec"
  },
  {
    "path": "docusaurus/versioned_docs/version-0.7.0/api/fastkafka/encoder/avro_encoder.md",
    "chars": 353,
    "preview": "## `fastkafka.encoder.avro_encoder` {#fastkafka.encoder.avro_encoder}\n\n### `avro_encoder` {#avro_encoder}\n\n`def avro_enc"
  },
  {
    "path": "docusaurus/versioned_docs/version-0.7.0/api/fastkafka/encoder/avsc_to_pydantic.md",
    "chars": 365,
    "preview": "## `fastkafka.encoder.avsc_to_pydantic` {#fastkafka.encoder.avsc_to_pydantic}\n\n### `avsc_to_pydantic` {#avsc_to_pydantic"
  },
  {
    "path": "docusaurus/versioned_docs/version-0.7.0/api/fastkafka/encoder/json_decoder.md",
    "chars": 468,
    "preview": "## `fastkafka.encoder.json_decoder` {#fastkafka.encoder.json_decoder}\n\n### `json_decoder` {#json_decoder}\n\n`def json_dec"
  },
  {
    "path": "docusaurus/versioned_docs/version-0.7.0/api/fastkafka/encoder/json_encoder.md",
    "chars": 357,
    "preview": "## `fastkafka.encoder.json_encoder` {#fastkafka.encoder.json_encoder}\n\n### `json_encoder` {#json_encoder}\n\n`def json_enc"
  },
  {
    "path": "docusaurus/versioned_docs/version-0.7.0/api/fastkafka/executors/DynamicTaskExecutor.md",
    "chars": 1272,
    "preview": "## `fastkafka.executors.DynamicTaskExecutor` {#fastkafka.executors.DynamicTaskExecutor}\n\n\nA class that implements a dyna"
  },
  {
    "path": "docusaurus/versioned_docs/version-0.7.0/api/fastkafka/executors/SequentialExecutor.md",
    "chars": 1209,
    "preview": "## `fastkafka.executors.SequentialExecutor` {#fastkafka.executors.SequentialExecutor}\n\n\nA class that implements a sequen"
  },
  {
    "path": "docusaurus/versioned_docs/version-0.7.0/api/fastkafka/testing/ApacheKafkaBroker.md",
    "chars": 1490,
    "preview": "## `fastkafka.testing.ApacheKafkaBroker` {#fastkafka.testing.ApacheKafkaBroker}\n\n\nApacheKafkaBroker class, used for runn"
  },
  {
    "path": "docusaurus/versioned_docs/version-0.7.0/api/fastkafka/testing/LocalRedpandaBroker.md",
    "chars": 1725,
    "preview": "## `fastkafka.testing.LocalRedpandaBroker` {#fastkafka.testing.LocalRedpandaBroker}\n\n\nLocalRedpandaBroker class, used fo"
  },
  {
    "path": "docusaurus/versioned_docs/version-0.7.0/api/fastkafka/testing/Tester.md",
    "chars": 29267,
    "preview": "## `fastkafka.testing.Tester` {#fastkafka.testing.Tester}\n\n### `__init__` {#init}\n\n`def __init__(self, app: Union[fastka"
  },
  {
    "path": "docusaurus/versioned_docs/version-0.7.0/cli/fastkafka.md",
    "chars": 3206,
    "preview": "# `fastkafka`\n\n**Usage**:\n\n```console\n$ fastkafka [OPTIONS] COMMAND [ARGS]...\n```\n\n**Options**:\n\n* `--install-completion"
  },
  {
    "path": "docusaurus/versioned_docs/version-0.7.0/cli/run_fastkafka_server_process.md",
    "chars": 642,
    "preview": "# `run_fastkafka_server_process`\n\n**Usage**:\n\n```console\n$ run_fastkafka_server_process [OPTIONS] APP\n```\n\n**Arguments**"
  },
  {
    "path": "docusaurus/versioned_docs/version-0.7.0/guides/Guide_00_FastKafka_Demo.md",
    "chars": 26272,
    "preview": "# FastKafka tutorial\n\n<!-- WARNING: THIS FILE WAS AUTOGENERATED! DO NOT EDIT! -->\n\n[FastKafka](https://fastkafka.airt.ai"
  },
  {
    "path": "docusaurus/versioned_docs/version-0.7.0/guides/Guide_01_Intro.md",
    "chars": 3855,
    "preview": "# Intro\n\n<!-- WARNING: THIS FILE WAS AUTOGENERATED! DO NOT EDIT! -->\n\nThis tutorial will show you how to use <b>FastKafk"
  },
  {
    "path": "docusaurus/versioned_docs/version-0.7.0/guides/Guide_02_First_Steps.md",
    "chars": 11052,
    "preview": "# First Steps\n\n<!-- WARNING: THIS FILE WAS AUTOGENERATED! DO NOT EDIT! -->\n\n## Creating a simple Kafka consumer app\n\nFor"
  },
  {
    "path": "docusaurus/versioned_docs/version-0.7.0/guides/Guide_03_Authentication.md",
    "chars": 608,
    "preview": "# Authentication\n\n<!-- WARNING: THIS FILE WAS AUTOGENERATED! DO NOT EDIT! -->\n\n## TLS Authentication\n\nsasl_mechanism (st"
  },
  {
    "path": "docusaurus/versioned_docs/version-0.7.0/guides/Guide_04_Github_Actions_Workflow.md",
    "chars": 1316,
    "preview": "# Deploy FastKafka docs to GitHub Pages\n\n<!-- WARNING: THIS FILE WAS AUTOGENERATED! DO NOT EDIT! -->\n\n## Getting started"
  },
  {
    "path": "docusaurus/versioned_docs/version-0.7.0/guides/Guide_05_Lifespan_Handler.md",
    "chars": 11973,
    "preview": "# Lifespan Events\n\n<!-- WARNING: THIS FILE WAS AUTOGENERATED! DO NOT EDIT! -->\n\nDid you know that you can define some sp"
  },
  {
    "path": "docusaurus/versioned_docs/version-0.7.0/guides/Guide_06_Benchmarking_FastKafka.md",
    "chars": 14768,
    "preview": "# Benchmarking FastKafka app\n\n<!-- WARNING: THIS FILE WAS AUTOGENERATED! DO NOT EDIT! -->\n\n## Prerequisites\n\nTo benchmar"
  },
  {
    "path": "docusaurus/versioned_docs/version-0.7.0/guides/Guide_07_Encoding_and_Decoding_Messages_with_FastKafka.md",
    "chars": 21610,
    "preview": "# Encoding and Decoding Kafka Messages with FastKafka\n\n<!-- WARNING: THIS FILE WAS AUTOGENERATED! DO NOT EDIT! -->\n\n## P"
  },
  {
    "path": "docusaurus/versioned_docs/version-0.7.0/guides/Guide_11_Consumes_Basics.md",
    "chars": 9156,
    "preview": "# @consumes basics\n\n<!-- WARNING: THIS FILE WAS AUTOGENERATED! DO NOT EDIT! -->\n\nYou can use `@consumes` decorator to co"
  },
  {
    "path": "docusaurus/versioned_docs/version-0.7.0/guides/Guide_12_Batch_Consuming.md",
    "chars": 2576,
    "preview": "# Batch consuming\n\n<!-- WARNING: THIS FILE WAS AUTOGENERATED! DO NOT EDIT! -->\n\nIf you want to consume data in batches `"
  },
  {
    "path": "docusaurus/versioned_docs/version-0.7.0/guides/Guide_21_Produces_Basics.md",
    "chars": 7243,
    "preview": "# @produces basics\n\n<!-- WARNING: THIS FILE WAS AUTOGENERATED! DO NOT EDIT! -->\n\nYou can use `@produces` decorator to pr"
  },
  {
    "path": "docusaurus/versioned_docs/version-0.7.0/guides/Guide_22_Partition_Keys.md",
    "chars": 4876,
    "preview": "# Defining a partition key\n\n<!-- WARNING: THIS FILE WAS AUTOGENERATED! DO NOT EDIT! -->\n\nPartition keys are used in Apac"
  },
  {
    "path": "docusaurus/versioned_docs/version-0.7.0/guides/Guide_23_Batch_Producing.md",
    "chars": 6163,
    "preview": "# Batch producing\n\n<!-- WARNING: THIS FILE WAS AUTOGENERATED! DO NOT EDIT! -->\n\nIf you want to send your data in batches"
  },
  {
    "path": "docusaurus/versioned_docs/version-0.7.0/guides/Guide_24_Using_Multiple_Kafka_Clusters.md",
    "chars": 42716,
    "preview": "# Using multiple Kafka clusters\n\n<!-- WARNING: THIS FILE WAS AUTOGENERATED! DO NOT EDIT! -->\n\nReady to take your FastKaf"
  },
  {
    "path": "docusaurus/versioned_docs/version-0.7.0/guides/Guide_30_Using_docker_to_deploy_fastkafka.md",
    "chars": 7254,
    "preview": "# Deploying FastKafka using Docker\n\n<!-- WARNING: THIS FILE WAS AUTOGENERATED! DO NOT EDIT! -->\n\n## Building a Docker Im"
  },
  {
    "path": "docusaurus/versioned_docs/version-0.7.0/guides/Guide_31_Using_redpanda_to_test_fastkafka.md",
    "chars": 14073,
    "preview": "# Using Redpanda to test FastKafka\n\n<!-- WARNING: THIS FILE WAS AUTOGENERATED! DO NOT EDIT! -->\n\n## What is FastKafka?\n\n"
  },
  {
    "path": "docusaurus/versioned_docs/version-0.7.0/guides/Guide_32_Using_fastapi_to_run_fastkafka_application.md",
    "chars": 5368,
    "preview": "# Using FastAPI to Run FastKafka Application\n\n<!-- WARNING: THIS FILE WAS AUTOGENERATED! DO NOT EDIT! -->\n\nWhen deployin"
  },
  {
    "path": "docusaurus/versioned_docs/version-0.7.0/index.md",
    "chars": 23045,
    "preview": "# FastKafka\n\n<!-- WARNING: THIS FILE WAS AUTOGENERATED! DO NOT EDIT! -->\n\n<b>Effortless Kafka integration for your web s"
  },
  {
    "path": "docusaurus/versioned_docs/version-0.7.0/overrides/css/extra.css",
    "chars": 0,
    "preview": ""
  },
  {
    "path": "docusaurus/versioned_docs/version-0.7.0/overrides/js/extra.js",
    "chars": 0,
    "preview": ""
  },
  {
    "path": "docusaurus/versioned_docs/version-0.7.0/overrides/js/math.js",
    "chars": 300,
    "preview": "window.MathJax = {\n  tex: {\n    inlineMath: [[\"\\\\(\", \"\\\\)\"]],\n    displayMath: [[\"\\\\[\", \"\\\\]\"]],\n    processEscapes: tru"
  },
  {
    "path": "docusaurus/versioned_docs/version-0.7.0/overrides/js/mathjax.js",
    "chars": 300,
    "preview": "window.MathJax = {\n  tex: {\n    inlineMath: [[\"\\\\(\", \"\\\\)\"]],\n    displayMath: [[\"\\\\[\", \"\\\\]\"]],\n    processEscapes: tru"
  },
  {
    "path": "docusaurus/versioned_docs/version-0.7.1/CHANGELOG.md",
    "chars": 11504,
    "preview": "# Release notes\n\n<!-- do not remove -->\n\n## 0.7.0\n\n### New Features\n\n- Optional description argument to consumes and pro"
  },
  {
    "path": "docusaurus/versioned_docs/version-0.7.1/CNAME",
    "chars": 18,
    "preview": "fastkafka.airt.ai\n"
  },
  {
    "path": "docusaurus/versioned_docs/version-0.7.1/CONTRIBUTING.md",
    "chars": 12614,
    "preview": "# Contributing to fastkafka\n\nFirst off, thanks for taking the time to contribute! ❤️\n\nAll types of contributions are enc"
  },
  {
    "path": "docusaurus/versioned_docs/version-0.7.1/LICENSE.md",
    "chars": 11357,
    "preview": "                                 Apache License\n                           Version 2.0, January 2004\n                   "
  },
  {
    "path": "docusaurus/versioned_docs/version-0.7.1/api/fastkafka/EventMetadata.md",
    "chars": 983,
    "preview": "## `fastkafka.EventMetadata` {#fastkafka.EventMetadata}\n\n\nA class for encapsulating Kafka record metadata.\n\n**Parameters"
  },
  {
    "path": "docusaurus/versioned_docs/version-0.7.1/api/fastkafka/FastKafka.md",
    "chars": 43451,
    "preview": "## `fastkafka.FastKafka` {#fastkafka.FastKafka}\n\n### `__init__` {#init}\n\n`def __init__(self, title: Optional[str] = None"
  },
  {
    "path": "docusaurus/versioned_docs/version-0.7.1/api/fastkafka/KafkaEvent.md",
    "chars": 318,
    "preview": "## `fastkafka.KafkaEvent` {#fastkafka.KafkaEvent}\n\n\nA generic class for representing Kafka events. Based on BaseSubmodel"
  },
  {
    "path": "docusaurus/versioned_docs/version-0.7.1/api/fastkafka/encoder/AvroBase.md",
    "chars": 120,
    "preview": "## `fastkafka.encoder.AvroBase` {#fastkafka.encoder.AvroBase}\n\n\nThis is base pydantic class that will add some methods\n\n"
  },
  {
    "path": "docusaurus/versioned_docs/version-0.7.1/api/fastkafka/encoder/avro_decoder.md",
    "chars": 482,
    "preview": "## `fastkafka.encoder.avro_decoder` {#fastkafka.encoder.avro_decoder}\n\n### `avro_decoder` {#avro_decoder}\n\n`def avro_dec"
  },
  {
    "path": "docusaurus/versioned_docs/version-0.7.1/api/fastkafka/encoder/avro_encoder.md",
    "chars": 353,
    "preview": "## `fastkafka.encoder.avro_encoder` {#fastkafka.encoder.avro_encoder}\n\n### `avro_encoder` {#avro_encoder}\n\n`def avro_enc"
  },
  {
    "path": "docusaurus/versioned_docs/version-0.7.1/api/fastkafka/encoder/avsc_to_pydantic.md",
    "chars": 365,
    "preview": "## `fastkafka.encoder.avsc_to_pydantic` {#fastkafka.encoder.avsc_to_pydantic}\n\n### `avsc_to_pydantic` {#avsc_to_pydantic"
  },
  {
    "path": "docusaurus/versioned_docs/version-0.7.1/api/fastkafka/encoder/json_decoder.md",
    "chars": 468,
    "preview": "## `fastkafka.encoder.json_decoder` {#fastkafka.encoder.json_decoder}\n\n### `json_decoder` {#json_decoder}\n\n`def json_dec"
  },
  {
    "path": "docusaurus/versioned_docs/version-0.7.1/api/fastkafka/encoder/json_encoder.md",
    "chars": 357,
    "preview": "## `fastkafka.encoder.json_encoder` {#fastkafka.encoder.json_encoder}\n\n### `json_encoder` {#json_encoder}\n\n`def json_enc"
  },
  {
    "path": "docusaurus/versioned_docs/version-0.7.1/api/fastkafka/executors/DynamicTaskExecutor.md",
    "chars": 1272,
    "preview": "## `fastkafka.executors.DynamicTaskExecutor` {#fastkafka.executors.DynamicTaskExecutor}\n\n\nA class that implements a dyna"
  },
  {
    "path": "docusaurus/versioned_docs/version-0.7.1/api/fastkafka/executors/SequentialExecutor.md",
    "chars": 1209,
    "preview": "## `fastkafka.executors.SequentialExecutor` {#fastkafka.executors.SequentialExecutor}\n\n\nA class that implements a sequen"
  },
  {
    "path": "docusaurus/versioned_docs/version-0.7.1/api/fastkafka/testing/ApacheKafkaBroker.md",
    "chars": 1490,
    "preview": "## `fastkafka.testing.ApacheKafkaBroker` {#fastkafka.testing.ApacheKafkaBroker}\n\n\nApacheKafkaBroker class, used for runn"
  },
  {
    "path": "docusaurus/versioned_docs/version-0.7.1/api/fastkafka/testing/LocalRedpandaBroker.md",
    "chars": 1725,
    "preview": "## `fastkafka.testing.LocalRedpandaBroker` {#fastkafka.testing.LocalRedpandaBroker}\n\n\nLocalRedpandaBroker class, used fo"
  },
  {
    "path": "docusaurus/versioned_docs/version-0.7.1/api/fastkafka/testing/Tester.md",
    "chars": 29311,
    "preview": "## `fastkafka.testing.Tester` {#fastkafka.testing.Tester}\n\n### `__init__` {#init}\n\n`def __init__(self, app: Union[fastka"
  },
  {
    "path": "docusaurus/versioned_docs/version-0.7.1/cli/fastkafka.md",
    "chars": 3207,
    "preview": "# `fastkafka`\n\n**Usage**:\n\n```console\n$ fastkafka [OPTIONS] COMMAND [ARGS]...\n```\n\n**Options**:\n\n* `--install-completion"
  },
  {
    "path": "docusaurus/versioned_docs/version-0.7.1/cli/run_fastkafka_server_process.md",
    "chars": 642,
    "preview": "# `run_fastkafka_server_process`\n\n**Usage**:\n\n```console\n$ run_fastkafka_server_process [OPTIONS] APP\n```\n\n**Arguments**"
  },
  {
    "path": "docusaurus/versioned_docs/version-0.7.1/guides/Guide_00_FastKafka_Demo.md",
    "chars": 26540,
    "preview": "# FastKafka tutorial\n\n<!-- WARNING: THIS FILE WAS AUTOGENERATED! DO NOT EDIT! -->\n\n[FastKafka](https://fastkafka.airt.ai"
  },
  {
    "path": "docusaurus/versioned_docs/version-0.7.1/guides/Guide_01_Intro.md",
    "chars": 3855,
    "preview": "# Intro\n\n<!-- WARNING: THIS FILE WAS AUTOGENERATED! DO NOT EDIT! -->\n\nThis tutorial will show you how to use <b>FastKafk"
  },
  {
    "path": "docusaurus/versioned_docs/version-0.7.1/guides/Guide_02_First_Steps.md",
    "chars": 11052,
    "preview": "# First Steps\n\n<!-- WARNING: THIS FILE WAS AUTOGENERATED! DO NOT EDIT! -->\n\n## Creating a simple Kafka consumer app\n\nFor"
  },
  {
    "path": "docusaurus/versioned_docs/version-0.7.1/guides/Guide_03_Authentication.md",
    "chars": 608,
    "preview": "# Authentication\n\n<!-- WARNING: THIS FILE WAS AUTOGENERATED! DO NOT EDIT! -->\n\n## TLS Authentication\n\nsasl_mechanism (st"
  },
  {
    "path": "docusaurus/versioned_docs/version-0.7.1/guides/Guide_04_Github_Actions_Workflow.md",
    "chars": 1478,
    "preview": "# Deploy FastKafka docs to GitHub Pages\n\n<!-- WARNING: THIS FILE WAS AUTOGENERATED! DO NOT EDIT! -->\n\n## Getting started"
  },
  {
    "path": "docusaurus/versioned_docs/version-0.7.1/guides/Guide_05_Lifespan_Handler.md",
    "chars": 11973,
    "preview": "# Lifespan Events\n\n<!-- WARNING: THIS FILE WAS AUTOGENERATED! DO NOT EDIT! -->\n\nDid you know that you can define some sp"
  },
  {
    "path": "docusaurus/versioned_docs/version-0.7.1/guides/Guide_06_Benchmarking_FastKafka.md",
    "chars": 15312,
    "preview": "# Benchmarking FastKafka app\n\n<!-- WARNING: THIS FILE WAS AUTOGENERATED! DO NOT EDIT! -->\n\n## Prerequisites\n\nTo benchmar"
  },
  {
    "path": "docusaurus/versioned_docs/version-0.7.1/guides/Guide_07_Encoding_and_Decoding_Messages_with_FastKafka.md",
    "chars": 22364,
    "preview": "# Encoding and Decoding Kafka Messages with FastKafka\n\n<!-- WARNING: THIS FILE WAS AUTOGENERATED! DO NOT EDIT! -->\n\n## P"
  },
  {
    "path": "docusaurus/versioned_docs/version-0.7.1/guides/Guide_11_Consumes_Basics.md",
    "chars": 9304,
    "preview": "# @consumes basics\n\n<!-- WARNING: THIS FILE WAS AUTOGENERATED! DO NOT EDIT! -->\n\nYou can use `@consumes` decorator to co"
  },
  {
    "path": "docusaurus/versioned_docs/version-0.7.1/guides/Guide_12_Batch_Consuming.md",
    "chars": 2576,
    "preview": "# Batch consuming\n\n<!-- WARNING: THIS FILE WAS AUTOGENERATED! DO NOT EDIT! -->\n\nIf you want to consume data in batches `"
  },
  {
    "path": "docusaurus/versioned_docs/version-0.7.1/guides/Guide_21_Produces_Basics.md",
    "chars": 7297,
    "preview": "# @produces basics\n\n<!-- WARNING: THIS FILE WAS AUTOGENERATED! DO NOT EDIT! -->\n\nYou can use `@produces` decorator to pr"
  },
  {
    "path": "docusaurus/versioned_docs/version-0.7.1/guides/Guide_22_Partition_Keys.md",
    "chars": 4988,
    "preview": "# Defining a partition key\n\n<!-- WARNING: THIS FILE WAS AUTOGENERATED! DO NOT EDIT! -->\n\nPartition keys are used in Apac"
  },
  {
    "path": "docusaurus/versioned_docs/version-0.7.1/guides/Guide_23_Batch_Producing.md",
    "chars": 6275,
    "preview": "# Batch producing\n\n<!-- WARNING: THIS FILE WAS AUTOGENERATED! DO NOT EDIT! -->\n\nIf you want to send your data in batches"
  },
  {
    "path": "docusaurus/versioned_docs/version-0.7.1/guides/Guide_24_Using_Multiple_Kafka_Clusters.md",
    "chars": 42716,
    "preview": "# Using multiple Kafka clusters\n\n<!-- WARNING: THIS FILE WAS AUTOGENERATED! DO NOT EDIT! -->\n\nReady to take your FastKaf"
  },
  {
    "path": "docusaurus/versioned_docs/version-0.7.1/guides/Guide_30_Using_docker_to_deploy_fastkafka.md",
    "chars": 7362,
    "preview": "# Deploying FastKafka using Docker\n\n<!-- WARNING: THIS FILE WAS AUTOGENERATED! DO NOT EDIT! -->\n\n## Building a Docker Im"
  },
  {
    "path": "docusaurus/versioned_docs/version-0.7.1/guides/Guide_31_Using_redpanda_to_test_fastkafka.md",
    "chars": 14506,
    "preview": "# Using Redpanda to test FastKafka\n\n<!-- WARNING: THIS FILE WAS AUTOGENERATED! DO NOT EDIT! -->\n\n## What is FastKafka?\n\n"
  },
  {
    "path": "docusaurus/versioned_docs/version-0.7.1/guides/Guide_32_Using_fastapi_to_run_fastkafka_application.md",
    "chars": 6090,
    "preview": "# Using FastAPI to Run FastKafka Application\n\n<!-- WARNING: THIS FILE WAS AUTOGENERATED! DO NOT EDIT! -->\n\nWhen deployin"
  },
  {
    "path": "docusaurus/versioned_docs/version-0.7.1/index.md",
    "chars": 23161,
    "preview": "# FastKafka\n\n<!-- WARNING: THIS FILE WAS AUTOGENERATED! DO NOT EDIT! -->\n\n<b>Effortless Kafka integration for your web s"
  },
  {
    "path": "docusaurus/versioned_docs/version-0.7.1/overrides/css/extra.css",
    "chars": 0,
    "preview": ""
  },
  {
    "path": "docusaurus/versioned_docs/version-0.7.1/overrides/js/extra.js",
    "chars": 0,
    "preview": ""
  },
  {
    "path": "docusaurus/versioned_docs/version-0.7.1/overrides/js/math.js",
    "chars": 300,
    "preview": "window.MathJax = {\n  tex: {\n    inlineMath: [[\"\\\\(\", \"\\\\)\"]],\n    displayMath: [[\"\\\\[\", \"\\\\]\"]],\n    processEscapes: tru"
  },
  {
    "path": "docusaurus/versioned_docs/version-0.7.1/overrides/js/mathjax.js",
    "chars": 300,
    "preview": "window.MathJax = {\n  tex: {\n    inlineMath: [[\"\\\\(\", \"\\\\)\"]],\n    displayMath: [[\"\\\\[\", \"\\\\]\"]],\n    processEscapes: tru"
  },
  {
    "path": "docusaurus/versioned_docs/version-0.8.0/CHANGELOG.md",
    "chars": 11504,
    "preview": "# Release notes\n\n<!-- do not remove -->\n\n## 0.7.0\n\n### New Features\n\n- Optional description argument to consumes and pro"
  },
  {
    "path": "docusaurus/versioned_docs/version-0.8.0/CNAME",
    "chars": 18,
    "preview": "fastkafka.airt.ai\n"
  }
]
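
Every entry in the array above has the same three-field shape (`path`, `chars`, `preview`), so the index is easy to post-process programmatically. A minimal sketch, using two abbreviated sample entries copied from the listing rather than the full array:

```python
import json

# Two entries in the same shape as the index above: each records a file's
# repo-relative path, its size in characters, and a truncated preview.
index = json.loads("""
[
  {
    "path": "docusaurus/versioned_docs/version-0.7.0/guides/Guide_11_Consumes_Basics.md",
    "chars": 9156,
    "preview": "# @consumes basics"
  },
  {
    "path": "docusaurus/versioned_docs/version-0.7.0/api/fastkafka/FastKafka.md",
    "chars": 43397,
    "preview": "## `fastkafka.FastKafka` {#fastkafka.FastKafka}"
  }
]
""")

# Filter to guide pages and total up the character counts.
guides = [e for e in index if "/guides/" in e["path"]]
total_chars = sum(e["chars"] for e in index)
print(len(guides), total_chars)
```

The same filter run over the full array would, for example, let an agent pull only the versioned guide pages out of the 352-file extraction before feeding them to a model.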

// ... and 152 more files (download for full content)

About this extraction

This page contains the full source code of the airtai/fastkafka GitHub repository, extracted and formatted as plain text for AI agents and large language models (LLMs). The extraction includes 352 files (4.0 MB), approximately 1.1M tokens, and a symbol index with 422 extracted functions, classes, methods, constants, and types. Use this with OpenClaw, Claude, ChatGPT, Cursor, Windsurf, or any other AI tool that accepts text input. You can copy the full output to your clipboard or download it as a .txt file.

Extracted by GitExtract — free GitHub repo to text converter for AI. Built by Nikandr Surkov.
