

Contributing to code-to-module

Fork the repo, create a branch, and install in editable mode with dev dependencies:

git clone https://github.com/svigneau/code-to-module
cd code-to-module
pip install -e ".[dev]"

Before opening a PR, run the fast unit suite and the linter:

pytest -x -q -m "not network and not llm"
ruff check src/ tests/

Read docs/architecture.md before contributing. It explains the module map, the import boundary between the conversion pipeline and the validation suite, the fix-is-general principle (no special-case patches for failing tests), and how to add new checks or templates. Violating the import boundary will cause pytest tests/test_import_boundaries.py to fail.

Key rules to keep in mind:

  • validate_cli.py must not import from ingest, discover, assess, infer, container, or generate. This boundary enables future extraction of the validation suite into a standalone package.
  • Every fix proposed by fix.py must be general — if a rule fires on a test case, it must also fire on all structurally identical real modules. No special-case patches.
  • All nf-core conventions (valid process labels, container registry URLs, required meta.yml fields, EDAM term mappings) must live in src/code_to_module/standards/data/nf_core_standards.json and be read via get_standards(). Never hardcode them in Python or Jinja2 templates.
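The import boundary can be checked mechanically. The sketch below is illustrative only — it is not the project's actual tests/test_import_boundaries.py — but it shows one way such a check can work: parse a module's source with the standard ast module and flag any import whose dotted path touches a forbidden package name.

```python
import ast

# Forbidden package names from the rule above; the helper name and its
# exact matching logic are illustrative assumptions, not project code.
FORBIDDEN = {"ingest", "discover", "assess", "infer", "container", "generate"}

def forbidden_imports(source: str) -> set[str]:
    """Return the forbidden module names imported anywhere in `source`."""
    hits: set[str] = set()
    for node in ast.walk(ast.parse(source)):
        if isinstance(node, ast.Import):
            names = [alias.name for alias in node.names]
        elif isinstance(node, ast.ImportFrom) and node.module:
            names = [node.module]
        else:
            continue
        for name in names:
            # Check every dotted component, so both "ingest.loader" and
            # "code_to_module.ingest" are caught.
            hits |= FORBIDDEN & set(name.split("."))
    return hits

print(forbidden_imports("from ingest.loader import load\nimport json"))  # {'ingest'}
```

A test in this style keeps the boundary enforced automatically, so an accidental import fails CI instead of surviving until the validation suite is extracted.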
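The standards rule amounts to a single data file plus one cached accessor. A minimal sketch of that pattern follows — the function name matches get_standards() mentioned above, but its signature and the file contents here are stand-ins, not the project's actual API:

```python
import json
from functools import lru_cache
from pathlib import Path
from tempfile import mkdtemp

# Illustrative sketch only: the real loader lives under
# src/code_to_module/standards/ and its exact signature may differ.

@lru_cache(maxsize=None)
def get_standards(path: Path) -> dict:
    """Load a standards JSON file once and cache the parsed result."""
    return json.loads(path.read_text())

# Demo with a stand-in file (the real one is nf_core_standards.json):
demo = Path(mkdtemp()) / "nf_core_standards.json"
demo.write_text(json.dumps({"process_labels": ["process_single", "process_low"]}))

labels = get_standards(demo)["process_labels"]
print(labels)  # ['process_single', 'process_low']
```

The point of the design: when nf-core changes a convention, only the JSON file is edited, and every check and template picks up the new values through the one accessor.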

AI assistance policy

This project was built with significant Claude assistance and welcomes contributions that use AI tools. When contributing:

  • Do not add Co-authored-by: Claude or similar AI co-author trailers to commit messages — this creates legal ambiguity and pollutes the contributor graph.
  • If a contribution was substantially AI-assisted, note it in the PR description in plain language (e.g. "Implementation generated by Claude Code from the spec in docs/development-tutorial.md Prompt X"). This is informative without creating attribution ambiguity.
  • The project-level acknowledgement in README.md covers all AI assistance across the project. Individual commit attribution is not required or expected.

Contributing modules generated by this tool

code-to-module generates a reviewable draft. Getting it merged into nf-core requires following the nf-core modules contributing guide, which covers the full PR checklist, test data requirements, and review process.

If derive_test_data.sh was written alongside the module, run it and PR the derived test data file to nf-core/test-datasets first. The module PR references those files by path, so the test-datasets PR must be merged (or at least open and reviewable) before the module PR can be reviewed.

For review questions, the nf-core Slack #modules channel is the right place to ask — module reviewers are active there and can clarify expectations before you iterate on a PR.