diff --git a/CONTRIBUTOR.md b/CONTRIBUTOR.md new file mode 100644 index 000000000..e9a2f925a --- /dev/null +++ b/CONTRIBUTOR.md @@ -0,0 +1,219 @@ +# Contributing to confluent-kafka-python + +Thank you for your interest in contributing to confluent-kafka-python! This document provides guidelines and best practices for contributing to this project. + +## Table of Contents + +- [Getting Started](#getting-started) +- [Development Environment Setup](#development-environment-setup) +- [Code Style and Standards](#code-style-and-standards) +- [Testing](#testing) +- [Submitting Changes](#submitting-changes) +- [Reporting Issues](#reporting-issues) +- [Community Guidelines](#community-guidelines) + +## Getting Started + +### Ways to Contribute + +- **Bug Reports**: Report bugs and issues you encounter +- **Feature Requests**: Suggest new features or improvements +- **Code Contributions**: Fix bugs, implement features, or improve documentation +- **Documentation**: Improve existing docs or add new documentation +- **Testing**: Help improve test coverage and quality + +### Before You Start + +1. Check existing [issues](../../issues) to see if your bug/feature has already been reported +2. For major changes, open an issue first to discuss the proposed changes +3. Fork the repository and create a feature branch for your work + +## Development Environment Setup + +For complete development environment setup instructions, including prerequisites, virtual environment creation, and dependency installation, see the [Development Environment Setup section in DEVELOPER.md](DEVELOPER.md#development-environment-setup). 
+ +## Code Style and Standards + +### Python Code Style + +- **PEP 8**: Follow [PEP 8](https://pep8.org/) style guidelines by default; project-specific exceptions (reflecting more recent recommendations) are captured in the flake8 rules in `tox.ini` +- **Docstrings**: Use Google-style docstrings for all public functions and classes + +### Code Formatting + +We use automated tools to maintain consistent code style: + +```bash +# Install formatting tools +pip install flake8 + +# Check style +flake8 src/ tests/ +``` + +### Naming Conventions + +- **Functions and Variables**: `snake_case` +- **Classes**: `PascalCase` +- **Constants**: `UPPER_SNAKE_CASE` +- **Private Methods/Objects**: Prefix with a single underscore `_private_method` + +### Documentation + +- All public APIs must have docstrings +- Include examples in docstrings when helpful +- Keep docstrings concise but complete +- Update relevant documentation files when making changes + +## Testing + +### Running Tests + +See [tests/README.md](tests/README.md) for comprehensive testing instructions. + +### Test Requirements + +- **Unit Tests**: All new functionality must include unit tests +- **Integration Tests**: Add integration tests for complex features +- **Test Coverage**: Maintain or improve existing test coverage +- **Test Naming**: Use descriptive test names that explain what is being tested + +### Test Structure + +```python +def test_feature_should_behave_correctly_when_condition(): + # Arrange + setup_data = create_test_data() + + # Act + result = function_under_test(setup_data) + + # Assert + assert result.expected_property == expected_value +``` + +## Submitting Changes + +### Pull Request Process + +1. **Create Feature Branch** + ```bash + git checkout -b feature/your-feature-name + # or + git checkout -b fix/issue-number-description + ``` + +2.
**Make Your Changes** + - Write clean, well-documented code + - Add appropriate tests + - Update documentation if needed + - Add an entry to the CHANGELOG.md file for the proposed change + +3. **Test Your Changes** + Refer to [tests/README.md](tests/README.md) + +4. **Commit Your Changes** + ```bash + git add . + git commit -m "Clear, descriptive commit message" + ``` + + **Commit Message Guidelines:** + - Use present tense ("Add feature" not "Added feature") + - Keep first line under 50 characters + - Reference issue numbers when applicable (#123) + - Include breaking change notes if applicable + +5. **Push and Create Pull Request** + ```bash + git push origin feature/your-feature-name + ``` + + Then create a pull request through GitHub's interface. + +### Pull Request Guidelines + +- **Title**: Clear and descriptive +- **Description**: Explain what changes you made and why +- **Linked Issues**: Reference related issues using "Fixes #123" or "Relates to #123" +- **Labels**: Review available issue/PR labels and apply relevant ones to help with categorization and triage +- **Documentation**: Update documentation for user-facing changes +- **Tests**: Include appropriate tests +- **Breaking Changes**: Clearly mark any breaking changes + +### Code Review Process + +- All pull requests require review before merging +- Address reviewer feedback promptly +- Keep discussions respectful and constructive +- Be open to suggestions and alternative approaches + +## Reporting Issues + +### Using Labels + +When creating issues or pull requests, please review the available labels and apply those that are relevant to your submission. This helps maintainers categorize and prioritize work effectively. 
Common label categories include (see the repository's available labels and existing issues for options): + +- **Type**: bug, enhancement, documentation, question +- **Priority**: high, medium, low +- **Component**: producer, consumer, admin, schema-registry, etc. +- **Status**: needs-investigation, help-wanted, good-first-issue, etc. + +### Bug Reports + +When reporting bugs, please include: + +- **Clear Title**: Describe the issue concisely +- **Environment**: Python version, OS, library versions +- **Steps to Reproduce**: Detailed steps to reproduce the issue +- **Expected Behavior**: What you expected to happen +- **Actual Behavior**: What actually happened +- **Code Sample**: Minimal code that demonstrates the issue +- **Error Messages**: Full error messages and stack traces +- **Client Configuration**: Specify how the client was configured and set up +- **Logs**: Client logs when possible +- **Labels**: Apply relevant labels such as "bug" and component-specific labels + +### Feature Requests + +For feature requests, please include: + +- **Use Case**: Describe the problem you're trying to solve +- **Proposed Solution**: Your idea for how to address it +- **Alternatives**: Other solutions you've considered +- **Additional Context**: Any other relevant information +- **Labels**: Apply relevant labels such as "enhancement" and component-specific labels + +## Community Guidelines + +### Code of Conduct + +This project follows the [Contributor Covenant Code of Conduct](https://www.contributor-covenant.org/). By participating, you agree to uphold this code.
+ +### Communication + +- **Be Respectful**: Treat all community members with respect +- **Be Constructive**: Provide helpful feedback and suggestions +- **Be Patient**: Remember that maintainers and contributors volunteer their time +- **Be Clear**: Communicate clearly and provide sufficient context + +### Getting Help + +- **Issues**: Use GitHub issues for bug reports and feature requests +- **Discussions**: Use GitHub Discussions for questions and general discussion +- **Documentation**: Check existing documentation before asking questions + +## Recognition + +Contributors are recognized in the following ways: + +- Contributors are listed in the project's contributor history +- Significant contributions may be mentioned in release notes + +## License + +By contributing to this project, you agree that your contributions will be licensed under the same license as the project (see LICENSE file). + +--- + +Thank you for contributing to confluent-kafka-python! Your contributions help make this project better for everyone. \ No newline at end of file diff --git a/DEVELOPER.md b/DEVELOPER.md index 48f7e6b53..8f241865e 100644 --- a/DEVELOPER.md +++ b/DEVELOPER.md @@ -2,41 +2,87 @@ This document provides information useful to developers working on confluent-kafka-python. +## Development Environment Setup -## Build +### Prerequisites - $ python -m build +- Python 3.7 or higher +- Git +- librdkafka (for Kafka functionality) -If librdkafka is installed in a non-standard location provide the include and library directories with: +### Setup Steps + +1. **Fork and Clone** + ```bash + git clone https://github.com/your-username/confluent-kafka-python.git + cd confluent-kafka-python + ``` - $ C_INCLUDE_PATH=/path/to/include LIBRARY_PATH=/path/to/lib python -m build +2. 
**Create Virtual Environment** + ```bash + python3 -m venv venv + source venv/bin/activate # On Windows: venv\Scripts\activate + ``` **Note**: On Windows the variables for Visual Studio are named INCLUDE and LIB +3. **Install librdkafka** (if not already installed) +See the main README.md for platform-specific installation instructions. + +If librdkafka is installed in a non-standard location, provide the include and library directories with: + +```bash +C_INCLUDE_PATH=/path/to/include LIBRARY_PATH=/path/to/lib python -m build +``` + +4. **Install confluent-kafka-python with optional dependencies** + ```bash + pip3 install -e .[dev,tests,docs] + ``` + + This will also build the wheel by default. Alternatively, you can build the bundle independently with: + + ```bash + python3 -m build + ``` + +5. **Verify Setup** + ```bash + python3 -c "import confluent_kafka; print('Setup successful!')" + ``` + ## Generate Documentation Install docs dependencies: - $ pip install .[docs] +```bash +pip3 install .[docs] +``` Build HTML docs: - $ make docs +```bash +make docs +``` Documentation will be generated in `docs/_build/`. or: - $ python setup.py build_sphinx +```bash +python3 setup.py build_sphinx +``` Documentation will be generated in `build/sphinx/html`. ## Unasync -- maintaining sync versions of async code - $ python tools/unasync.py +```bash +python3 tools/unasync.py - # Run the script with the --check flag to ensure the sync code is up to date - $ python tools/unasync.py --check +python3 tools/unasync.py --check +``` If you make any changes to the async code (in `src/confluent_kafka/schema_registry/_async` and `tests/integration/schema_registry/_async`), you **must** run this script to generate the sync counterparts (in `src/confluent_kafka/schema_registry/_sync` and `tests/integration/schema_registry/_sync`). Otherwise, this script will be run in CI with the --check flag and fail the build.
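To build intuition for what the unasync step does, here is a minimal, hypothetical sketch of the kind of token substitution such a tool performs. The substitution table below is illustrative only; the real `tools/unasync.py` defines its own rules and walks whole directory trees rather than single strings.

```python
import re

# Hypothetical subset of async -> sync rewrites; the real tools/unasync.py
# maintains its own, more complete substitution table.
SUBSTITUTIONS = [
    (r"\basync def\b", "def"),
    (r"\bawait\s+", ""),
    (r"\basync with\b", "with"),
    (r"\basync for\b", "for"),
]

def unasync_source(source: str) -> str:
    """Rewrite async source text into its sync counterpart."""
    for pattern, replacement in SUBSTITUTIONS:
        source = re.sub(pattern, replacement, source)
    return source
```

For example, `unasync_source("async def fetch(c):\n    return await c.get()\n")` produces the plain `def fetch(c)` version with the `await` removed; in this project the generated files are then committed under the `_sync` directories so CI's `--check` pass stays clean.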
diff --git a/README.md b/README.md index 4eb0d0515..afa323ba1 100644 --- a/README.md +++ b/README.md @@ -130,7 +130,9 @@ The `Producer`, `Consumer` and `AdminClient` are all thread safe. **Install self-contained binary wheels** - $ pip install confluent-kafka +```bash +pip install confluent-kafka +``` **NOTE:** The pre-built Linux wheels do NOT contain SASL Kerberos/GSSAPI support. If you need SASL Kerberos/GSSAPI support you must install librdkafka and @@ -140,19 +142,27 @@ The `Producer`, `Consumer` and `AdminClient` are all thread safe. To use Schema Registry with the Avro serializer/deserializer: - $ pip install "confluent-kafka[avro,schemaregistry]" +```bash +pip install "confluent-kafka[avro,schemaregistry]" +``` To use Schema Registry with the JSON serializer/deserializer: - $ pip install "confluent-kafka[json,schemaregistry]" +```bash +pip install "confluent-kafka[json,schemaregistry]" +``` To use Schema Registry with the Protobuf serializer/deserializer: - $ pip install "confluent-kafka[protobuf,schemaregistry]" +```bash +pip install "confluent-kafka[protobuf,schemaregistry]" +``` When using Data Contract rules (including CSFLE) add the `rules` extra, e.g.: - $ pip install "confluent-kafka[avro,schemaregistry,rules]" +```bash +pip install "confluent-kafka[avro,schemaregistry,rules]" +``` **Install from source** diff --git a/examples/README.md b/examples/README.md index fde034dab..1df0bc0bd 100644 --- a/examples/README.md +++ b/examples/README.md @@ -27,11 +27,11 @@ conflicts between projects.
To set up a venv with the latest release version of confluent-kafka and dependencies of all examples installed: -``` -$ python3 -m venv venv_examples -$ source venv_examples/bin/activate -$ pip install confluent_kafka -$ pip install -r requirements/requirements-examples.txt +```bash +python3 -m venv venv_examples +source venv_examples/bin/activate +python3 -m pip install confluent_kafka +python3 -m pip install -r requirements/requirements-examples.txt ``` To set up a venv that uses the current source tree version of confluent_kafka, you @@ -39,14 +39,14 @@ need to have a C compiler and librdkafka installed ([from a package](https://github.com/edenhill/librdkafka#installing-prebuilt-packages), or [from source](https://github.com/edenhill/librdkafka#build-from-source)). Then: -``` -$ python3 -m venv venv_examples -$ source venv_examples/bin/activate -$ pip install .[examples] +```bash +python3 -m venv venv_examples +source venv_examples/bin/activate +python3 -m pip install .[examples] ``` When you're finished with the venv: -``` -$ deactivate +```bash +deactivate ``` diff --git a/examples/docker/README.md b/examples/docker/README.md index a02ffb062..5d5730464 100644 --- a/examples/docker/README.md +++ b/examples/docker/README.md @@ -7,5 +7,7 @@ See the header of each Dockerfile in this directory for what is included. From the confluent-kafka-python source top directory: - $ docker build -f examples/docker/Dockerfile.alpine . +```bash +docker build -f examples/docker/Dockerfile.alpine . +``` diff --git a/tests/README.md b/tests/README.md index 7502d2f6b..20396d295 100644 --- a/tests/README.md +++ b/tests/README.md @@ -14,16 +14,20 @@ Test summary: **Note:** Unless otherwise stated, all command, file and directory references are relative to the *repo's root* directory.
-A python3 env suitable for running tests: +A python env suitable for running tests: - $ python3 -m venv venv_test - $ source venv_test/bin/activate - $ python3 -m pip install -r requirements/requirements-tests-install.txt - $ python3 -m pip install . +```bash +python3 -m venv venv_test +source venv_test/bin/activate +python3 -m pip install -r requirements/requirements-tests-install.txt +python3 -m pip install . +``` When you're finished with it: - $ deactivate +```bash +deactivate +``` ## Unit tests @@ -32,17 +36,23 @@ not require an active Kafka cluster. You can run them selectively like so: - $ pytest -s -v tests/test_Producer.py +```bash +pytest -s -v tests/test_Producer.py +``` Or run them all with: - $ pytest -s -v tests/test_*.py +```bash +pytest -s -v tests/test_*.py +``` Note that the -v flag enables verbose output and the -s flag disables capture of stderr and stdout (so that you see it on the console). You can also use ./tests/run.sh to run the unit tests: - $ ./tests/run.sh unit +```bash +./tests/run.sh unit +``` ## Integration tests @@ -65,11 +75,11 @@ which sets environment variables referenced by `./tests/integration/testconf.jso You can then run the tests as follows: - python ./tests/integration/integration_test.py ./tests/integration/testconf.json + python3 ./tests/integration/integration_test.py ./tests/integration/testconf.json Or selectively by specifying one or more options ("modes"). You can see all of these via: - python ./tests/integration/integration_test.py --help + python3 ./tests/integration/integration_test.py --help ### The New Way @@ -117,7 +127,7 @@ Tox can be used to test against various supported Python versions (py27, py36, p 2. Uncomment the following line in [tox.ini](../tox.ini) - ```#python tests/integration/integration_test.py``` + ```#python3 tests/integration/integration_test.py``` 3.
From top-level directory run: diff --git a/tox.ini b/tox.ini index c646479aa..2f7b43348 100644 --- a/tox.ini +++ b/tox.ini @@ -10,11 +10,11 @@ commands = pip install -r requirements/requirements-tests-install.txt pip install . # Early verification that module is loadable - python -c 'import confluent_kafka ; print(confluent_kafka.version())' + python3 -c 'import confluent_kafka ; print(confluent_kafka.version())' # Run tests (large timeout to allow docker image downloads) - python -m pytest --timeout 600 --ignore=tmp-build {posargs} + python3 -m pytest --timeout 600 --ignore=tmp-build {posargs} # See tests/README.md for additional notes on testing - #python tests/integration/integration_test.py + #python3 tests/integration/integration_test.py [testenv:flake8] deps = flake8
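The tox.ini change above keeps an "early verification that module is loadable" step before the test run. As a generic illustration of that fail-fast pattern (not project code: `verify_importable` is a hypothetical helper, and `json` is a stand-in for `confluent_kafka`, which may not be installed locally):

```python
import importlib
import sys

def verify_importable(module_name: str) -> bool:
    """Return True if the named module can be imported, else False.

    Mirrors the tox early-verification step: fail fast on a broken
    install before spending time on the full pytest run.
    """
    try:
        importlib.import_module(module_name)
        return True
    except ImportError as exc:  # ModuleNotFoundError is a subclass
        print(f"import of {module_name} failed: {exc}", file=sys.stderr)
        return False
```

In tox itself the equivalent one-liner is the `python3 -c 'import confluent_kafka ; print(confluent_kafka.version())'` command shown in the diff.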