TL;DR: We’re sharpening the CNTi test suite so the Open Source telecom community can trust it in day-to-day CNF delivery—operators get reliable validation, vendors gain a fair benchmark, and the community benefits from a stronger shared tool. We need your help to (1) clean up and harden the code and tests, (2) stand up an open-source reference CNF (starting with Free5GC) that exercises every test, (3) stabilize CI/CD, (4) evolve and add best practices (e.g., logging to stdout/stderr), and (5) implement concrete new tests and features (e.g., CNFs deployed by operators). Read on for specific “pick-up-and-do” tasks, success criteria, and how to get involved.
Why now?
CNF packaging, operations and best practices are moving fast—beyond plain Helm toward Operators, from basic security rules to advanced policies for resilience, observability, and compliance. To keep pace, the CNTi test suite must be rock solid, easy to extend, and representative of real CNFs. That’s the point of this push.
What we’re asking you to help with
1) Code cleanup, refactoring & hardening
Goal: Predictable, fast, and maintainable runs; fewer flakes; clearer failures.
Concrete tasks:
- Untangle helpers/fixtures; remove dead code; standardize naming and directory layout.
- Add/strengthen idempotency, timeouts, retries, and cleanup hooks so each test leaves the cluster clean.
- Improve error messages and structured logs from the test runner.
- Raise unit/integration coverage; add lint/type checks; document local dev workflows.
- Introduce a deterministic seed for randomized tests; ensure parallel runs don’t collide (namespaces, labels).
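To make the last item concrete, here is a minimal Python sketch (illustrative only; the suite itself is not written in Python) of a seeded, namespace-isolated test run: all randomness derives from one logged seed so failures can be replayed, and every parallel run gets its own labeled namespace with a cleanup hook that always fires. The RUN_SEED variable and the cnti-test- prefix are hypothetical names.

```python
import os
import random
import subprocess
import uuid

# Deterministic seed: honor an externally supplied RUN_SEED (hypothetical name)
# so a failing run can be replayed exactly; otherwise generate and log one.
seed = int(os.environ.get("RUN_SEED", random.randrange(2**32)))
random.seed(seed)
print(f"randomized tests seeded with RUN_SEED={seed}")

# Collision-free parallelism: each run works in its own uniquely named,
# labeled namespace, and cleanup always runs so the cluster is left clean.
namespace = f"cnti-test-{uuid.uuid4().hex[:8]}"  # hypothetical prefix
subprocess.run(["kubectl", "create", "namespace", namespace], check=True)
subprocess.run(["kubectl", "label", "namespace", namespace,
                f"cnti.test/run-id={namespace}"], check=True)
try:
    ...  # deploy fixtures and run the tests scoped to `namespace`
finally:
    # Cleanup hook: remove the namespace and everything in it, even on failure.
    subprocess.run(["kubectl", "delete", "namespace", namespace,
                    "--wait=false"], check=False)
```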
Success criteria:
- “Main” branch stable and green for one week; flake rate <1%.
- Repeatable runs across a minimal K8s version matrix.
2) Establish a reference CNF (open source): Start with Free5GC
Goal: A known-good CNF to validate every test and demonstrate remediation.
Concrete tasks:
- Collaborate with Free5GC maintainers to define a testable profile (minimal yet realistic).
- Package alignment: start with Free5GC Helm charts.
- Create fixtures: sample configs, manifests, minimal data; document prerequisites clearly.
- Write a “golden” runbook: which tests are expected to pass/fail and how to fix failures.
- Add CI job that deploys Free5GC and runs the full test suite on PRs affecting relevant areas.
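As a rough outline of what that CI job could look like, the sketch below deploys Free5GC with Helm and then runs the suite against it. The chart repository URL is a placeholder, and the test-suite subcommand names (setup, cnf_setup, all) are assumptions that may differ between releases; check the repo docs before wiring this into CI.

```python
"""Rough outline of a CI step: deploy Free5GC via Helm, then run the suite.
The chart repo URL is a placeholder, and the cnf-testsuite subcommand names
are assumptions that may differ between test-suite releases."""
import subprocess

def run(*cmd: str) -> None:
    print("+", " ".join(cmd))
    subprocess.run(cmd, check=True)

# Deploy the reference CNF from a community Helm chart (placeholder repo URL).
run("helm", "repo", "add", "free5gc-charts", "https://example.org/free5gc-helm-repo")
run("helm", "repo", "update")
run("helm", "install", "free5gc", "free5gc-charts/free5gc",
    "--namespace", "free5gc", "--create-namespace", "--wait")

# Point the suite at the deployed CNF and run every test
# (subcommand names are illustrative).
run("cnf-testsuite", "setup")
run("cnf-testsuite", "cnf_setup", "cnf-config=./cnf-testsuite.yml")
run("cnf-testsuite", "all")
```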
Success criteria:
- “Run the suite against Free5GC” completes green on CI; docs show end-to-end reproduction locally.
- Each test case mapped to at least one Free5GC component/behavior.
Want to jump in? Help integrate Free5GC and open a “Reference CNF: Free5GC” issue/PR to coordinate workstreams.
3) Stabilize CI/CD
Goal: Fast feedback, trusted signals, and clear contributor UX.
Concrete tasks:
- Analyze and identify the root causes of currently flaky CI jobs.
- Address the identified issues to minimize false CI failures.
- Speed up pipelines (layered caching, matrix sharding, artifact reuse).
- Nightly main-branch “burn-in” runs; auto-file issues on flake detection (see the sketch after this list).
- Document contributor flow: “how to run locally” + “how to debug a red CI”.
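For the flake-detection item, one possible approach is to compare test outcomes across the last few burn-in runs and flag any test that both passed and failed. A hedged sketch, assuming JUnit-style XML artifacts stored under a hypothetical burn-in-artifacts/run-*/ layout:

```python
"""Sketch of nightly flake detection: compare test outcomes across recent
burn-in runs (stored as JUnit XML artifacts) and flag tests that both passed
and failed. Paths and report format are assumptions about the CI setup."""
import glob
import xml.etree.ElementTree as ET
from collections import defaultdict

outcomes = defaultdict(set)  # test name -> set of observed outcomes

for report in glob.glob("burn-in-artifacts/run-*/junit.xml"):  # hypothetical layout
    for case in ET.parse(report).getroot().iter("testcase"):
        name = f'{case.get("classname")}::{case.get("name")}'
        failed = case.find("failure") is not None or case.find("error") is not None
        outcomes[name].add("fail" if failed else "pass")

flaky = sorted(name for name, seen in outcomes.items() if seen == {"pass", "fail"})
for name in flaky:
    print(f"FLAKY: {name}")  # a follow-up step could auto-file a GitHub issue
```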
Success criteria:
- Median PR validation < 60 minutes; low variance.
- End-to-end job green ≥ 5 consecutive runs.
4) Evolve best practices & identify industry trends
Goal: Curate and evolve best practices while tracking industry trends, turning them into actionable tests that keep the suite relevant and forward-looking.
Concrete tasks:
- Evolve current practices – e.g., refine “Logging to stdout/stderr” into concrete criteria (format, redaction, rate limits); see the sketch after this list.
🔗 Discussion: https://github.com/lfn-cnti/testsuite/discussions/2323
- Bring new practices – contribute ideas for additional best practices.
🔗 Examples: https://github.com/lfn-cnti/testsuite/discussions/1943
- Track trends – propose short RFCs for production-ready shifts (e.g., GitOps loops, policy-as-code, supply-chain hardening, IPv6-only clusters, SR-IOV/DPDK, energy efficiency, multi-arch). Accepted RFCs → roadmap items and test specs.
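As an illustration of how the refined logging criteria could become a check, the sketch below asserts that every container in the CNF’s namespace produces output retrievable via kubectl logs, i.e. it writes to stdout/stderr rather than only to files. The namespace name and the pass criterion are assumptions, and a real test would also cover the format, redaction, and rate-limit criteria named above.

```python
"""Illustrative check for the "logging to stdout/stderr" practice: every
container in the CNF namespace should emit something retrievable via
`kubectl logs`. Namespace name and pass criterion are assumptions."""
import json
import subprocess

NAMESPACE = "cnf-under-test"  # hypothetical namespace

pods = json.loads(subprocess.run(
    ["kubectl", "get", "pods", "-n", NAMESPACE, "-o", "json"],
    capture_output=True, text=True, check=True).stdout)

for pod in pods["items"]:
    pod_name = pod["metadata"]["name"]
    for container in pod["spec"]["containers"]:
        logs = subprocess.run(
            ["kubectl", "logs", pod_name, "-n", NAMESPACE,
             "-c", container["name"], "--tail", "10"],
            capture_output=True, text=True).stdout
        status = "PASS" if logs.strip() else "FAIL (no stdout/stderr output)"
        print(f"{pod_name}/{container['name']}: {status}")
```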
Success criteria:
- Active community discussions on evolving and new practices.
- At least 3-5 new practices described and formalized.
- RFC submissions on industry trends accepted by the community and feeding the roadmap.
5) Implement new tests or features
Goal: Turn best practices into enforceable, actionable tests, and expand the test suite to support modern deployment patterns beyond Helm.
Concrete tasks:
- Build new tests, for example:
- BOM/SBOM presence → validate existence, format (SPDX/CycloneDX), discoverability (OCI annotations/labels), and artifact-to-image correlation; see the format-detection sketch after this list.
- Each test must include: purpose, rationale, remediation steps, and CNF example (pass & fail).
- Expand deployment support, for example:
- Operator-based deployment → install via OLM or direct CRDs; manage lifecycle (install/upgrade/rollback) inside the suite harness.
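To show the shape of the format-validation piece of the SBOM test, here is a sketch that classifies an already-retrieved SBOM JSON document as SPDX or CycloneDX; retrieval via registry referrers/OCI annotations is out of scope here, and the file name is a placeholder.

```python
"""Sketch of the format-validation part of an SBOM presence test: given an
SBOM JSON document already retrieved for an image, classify it as SPDX or
CycloneDX. The file name is a placeholder."""
import json

def sbom_format(path: str) -> str:
    with open(path) as f:
        doc = json.load(f)
    if "spdxVersion" in doc:                 # SPDX JSON marker, e.g. "SPDX-2.3"
        return f"SPDX ({doc['spdxVersion']})"
    if doc.get("bomFormat") == "CycloneDX":  # CycloneDX JSON marker
        return f"CycloneDX (spec {doc.get('specVersion', 'unknown')})"
    return "unknown format"

print(sbom_format("sbom.json"))  # placeholder file name
```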
Success criteria:
- New tests (e.g., SBOM) ship with full metadata (purpose, rationale, remediation, CNF pass/fail example).
- The suite harness supports at least one non-Helm deployment path (Operator or other) end-to-end.
- The suite validates both the correctness of CNFs and the packaging/delivery method used.
- Reference CNFs are updated or introduced to demonstrate the new tests/features.
How you can contribute this week
- Join weekly CNTi calls
- Community calls are the best place to get in touch with us; we are more than happy to introduce you to CNTi.
- Pick an area or issue and comment “I’m in.”
- Good first ones: flaky test cleanup, structured error messages, SBOM presence test scaffolding.
- Submit your results as a pull request and ask the community for feedback.
- The CNTi community will be pleased to review your contributions and help make them actionable, high-quality, and aligned with the evolving test suite.
Definition of Done (for this initiative)
- ✅ Clean, green CI on main; nightly burn-in green for a week.
- ✅ Free5GC reference CNF running the entire suite in CI with docs.
- ✅ At least one new non-Helm deployment path (Operator) supported.
- ✅ SBOM/BOM test implemented with remediation guide.
- ✅ Logging best practice clarified and enforced by tests.
- ✅ Contributor docs updated (local run, CI expectations, profiles/waivers).
Recognition & cadence
We’ll highlight top contributors in the release notes. Regular CNTi Testsuite releases ship every four weeks, and community calls are held every week.
Final nudge
If you care about shipping reliable CNFs—and proving it—this is your moment to shape the standard. Select an issue, propose a test, set up Free5GC as our reference CNF, or implement operator support. We’ll meet you where you are and help you land your first PR.
Links to start from:
- Logging best practice discussion: https://github.com/lfn-cnti/testsuite/discussions/2323
- SBOM/BOM presence test idea: https://github.com/lfn-cnti/testsuite/issues/1905#issue-2149864288
Let’s make the CNTi test suite the fastest path to trustworthy, production-grade CNFs.
Links and references
- CNTi landing page: Cloud Native Telecom Initiative – LF Networking
- CNTi Test Suite repository: https://github.com/lfn-cnti/testsuite
- CNTi Confluence: Cloud Native Telecom Initiative (CNTi) – Confluence