
Merging Coverage from Jest and Cypress

Posted on: September 19, 2023

Introduction

At our workplace, one of the business requirements is to provide a unified coverage report that merges results from all our test sources: in our case, unit/integration tests (using Jest) and end-to-end tests (with Cypress). We even have a KPI: our total code coverage needs to be above 85%¹. While combining these metrics in a single view can be handy, it’s important to recognise that unit and E2E tests serve different purposes.

Why merge coverage?

Pros:

Why it might not be the best idea:

Despite these concerns, and given the business requirements, we needed a way to find a combined coverage metric and engineer our pipeline to meet that need.

Current State of the System

Before diving into our solution, it’s important to understand the current landscape:

Challenges

We need to find a way to collect the coverage generated by the parallel runs of Cypress and merge them along with the coverage generated by Jest.

Another challenge arises from the tooling: Jest uses SWC as its transpiler, but Cypress requires Babel for instrumentation. This creates inconsistencies in the coverage data between unit and E2E tests.

After a few attempts, we realised that the numbers didn’t add up. We tried running our unit tests with both SWC and Babel as transpilers, obtaining the following results.

Coverage with SWC:

=============================== Coverage summary ===============================
Statements   : 85.66% ( 7812/9119 )
Branches     : 65.4% ( 2091/3197 )
Functions    : 79.63% ( 1756/2205 )
Lines        : 85.54% ( 7077/8273 )
================================================================================

Coverage with Babel:

=============================== Coverage summary ===============================
Statements   : 79.57% ( 5030/6321 )
Branches     : 63.48% ( 2191/3451 )
Functions    : 79.01% ( 1747/2211 )
Lines        : 79.64% ( 4874/6120 )
================================================================================

As the numbers above show, Babel reports lower overall coverage than SWC, particularly for statements and lines. One key reason is that SWC counts import statements as covered statements, inflating the overall metrics.

Moreover, while there is an open issue in the Cypress coverage repository regarding SWC support (issue #583), it has been stale for a long time. There’s also an effort to support Istanbul for SWC (swc-plugin-coverage-instrument), but it is currently not working as expected.

This divergence means we must carefully manage configurations to use SWC for performance in Next.js while switching to Babel when generating coverage reports for Cypress.

Overview of the Workflow

To ensure accurate test coverage reports across both Jest and Cypress, we decided to stick to Babel for the code coverage, but keep SWC for everything else. We use a temporary Babel configuration along with a set of npm scripts to apply and clean up the necessary changes.

Key Components

Separate Babel Config for Coverage

We use coverage.babel.config.js specifically for test coverage. This prevents unwanted instrumentation from affecting the main development and production build. This config extends next/babel and enables the istanbul plugin only when BABEL_ENV=component is set.

The file looks like this:

module.exports = {
  presets: ["next/babel"],
  env: {
    component: {
      plugins: ["istanbul"],
    },
  },
};

Updated jest.config.js

Here’s how you can modify your jest.config.js to dynamically switch between babel-jest (when coverage is enabled) and @swc/jest (for faster test execution when coverage is not needed):

const useBabel = process.env.BABEL_ENV === "true";

module.exports = {
  transform: {
    "\\.[jt]sx?$": useBabel
      ? "babel-jest"
      : [
          "@swc/jest",
          {
            jsc: {
              transform: {
                react: {
                  runtime: "automatic",
                },
              },
            },
          },
        ],
  },
  // ... rest of the configuration
};

How It Works

Running npm run test:coverage sets BABEL_ENV=true, so Jest picks babel-jest and Istanbul instruments the code, producing accurate coverage:

"test:coverage": "cross-env BABEL_ENV=true jest --coverage"

When running normal tests (jest without BABEL_ENV=true), Jest falls back to @swc/jest for improved performance. This ensures that the coverage reports are accurate when needed and that tests run much faster when coverage is not required.
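The switch can be exercised from a shell. Note that pick_transform below is only a stand-in for the ternary in jest.config.js, not a real Jest API:

```shell
# Stand-in for the ternary in jest.config.js: echoes which transformer
# Jest would pick for the current BABEL_ENV.
pick_transform() {
  if [ "$BABEL_ENV" = "true" ]; then
    echo "babel-jest"   # coverage run: accurate Istanbul instrumentation
  else
    echo "@swc/jest"    # normal run: faster transpilation
  fi
}

BABEL_ENV=true pick_transform   # prints: babel-jest
BABEL_ENV="" pick_transform     # prints: @swc/jest
```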

NPM Scripts to Automate the Process

Before running tests or building for coverage, we swap the Babel config using a temporary setup:

The following scripts ensure proper execution:

"pretest:coverage": "npm run coverage:setup",
"test:coverage": "cross-env BABEL_ENV=true jest --coverage",
"posttest:coverage": "npm run coverage:teardown",
"prebuild:coverage": "npm run coverage:setup",
"build:coverage": "cross-env BABEL_ENV=component next build",
"postbuild:coverage": "npm run coverage:teardown",
"coverage:setup": "cp coverage.babel.config.js babel.config.js",
"coverage:teardown": "rm babel.config.js"

The pre/post hooks guarantee that no babel.config.js is present outside of coverage testing.
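A minimal simulation of that hook lifecycle, using a stub config file rather than a real Babel setup:

```shell
# Run in a scratch directory so no real project files are touched.
workdir=$(mktemp -d) && cd "$workdir" || exit 1

# Stub standing in for the real coverage.babel.config.js.
echo 'module.exports = { presets: ["next/babel"] };' > coverage.babel.config.js

cp coverage.babel.config.js babel.config.js   # coverage:setup (pre hook)
test -f babel.config.js && echo "config in place"

# ... jest --coverage or next build would run here ...

rm babel.config.js                            # coverage:teardown (post hook)
test ! -f babel.config.js && echo "config removed"
```

Because the config only exists between the pre and post hooks, Next.js and Jest keep using SWC in every other invocation.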

Merging Coverage from Jest and Cypress

To get a complete test coverage report, we need to combine results from Jest and Cypress.

How It Works

  1. Jest Coverage

    • Running npm run test:coverage executes Jest with BABEL_ENV=true, ensuring that Istanbul (enabled via Babel) collects coverage.
    • Jest generates a coverage report in coverage-jest/.
  2. Cypress Coverage

    • Cypress requires the istanbul plugin in Babel to instrument the code during execution.
    • Since coverage.babel.config.js includes istanbul under the component environment, running npm run build:coverage ensures that the built application includes coverage tracking.
    • Cypress tests then generate coverage reports in coverage/.
  3. Consolidating Cypress coverage:

    Each parallel Cypress job generates a coverage/coverage-final.json file, which is renamed to coverage-<job_number>.json and uploaded as an artifact.

  4. Merging Reports

    • We download the coverage artifacts and merge them using nyc.
    • We generate reports based on the merged coverages.
    • We sanitise the file paths contained in the coverage using sed (otherwise they would point at the CI workspace, e.g. /home/runner/actions-runner/_work/...).

By following this approach, we ensure that all parts of the application are properly covered, and reports are accurate and consistent across both unit and end-to-end tests.
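The sanitisation step can be reproduced locally on a hand-written sample (my-repo is a placeholder for the real repository directory, and real summary files contain far more fields per entry):

```shell
# Minimal stand-in for coverage/coverage-summary.json.
cat > coverage-summary.json <<'EOF'
{"/home/runner/actions-runner/_work/my-repo/src/index.ts":{"lines":{"pct":85.5}}}
EOF

# Strip the CI workspace prefix so paths become repo-relative
# (GNU sed syntax, as used on the ubuntu-latest runners).
sed -i -e 's/\/home\/runner\/actions-runner\/_work\/my-repo\///g' coverage-summary.json

cat coverage-summary.json   # keys now start at src/...
```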

Jest Testing Workflow

The sanity-jest job is responsible for running unit tests using Jest:

sanity-jest:
  name: Jest
  runs-on: ubuntu-latest
  steps:
    - uses: actions/checkout@v4

    - uses: actions/setup-node@v4
      with:
        node-version-file: ".nvmrc"
        cache: "npm"

    - name: Install dependencies
      run: npm install
      env:
        CI: true

    - run: npm run test:coverage -- --silent --colors

    - uses: actions/upload-artifact@v4
      with:
        name: coverage-jest
        path: "coverage-jest/coverage-final.json"

Key Enhancements:

Parallelised Cypress E2E Testing

The sanity-e2e job is designed to distribute Cypress E2E tests across multiple containers:

sanity-e2e:
  name: End to End Tests
  runs-on: ubuntu-latest
  strategy:
    fail-fast: false
    matrix:
      containers: [1, 2, 3, 4, 5, 6, 7, 8, 9, 10]
  env:
    BASE_PATH: ""
  steps:
    - uses: actions/checkout@v4

    - uses: actions/setup-node@v4
      with:
        node-version-file: ".nvmrc"
        cache: "npm"

    - name: Cypress run
      uses: cypress-io/github-action@v6
      with:
        build: npm run build:coverage
        start: npm run start:ci
        wait-on: "http://localhost:3000/"
        parallel: true
        record: true
      env:
        CYPRESS_PROJECT_ID: ${{ secrets.CYPRESS_PROJECT_ID }}
        CYPRESS_RECORD_KEY: ${{ secrets.CYPRESS_RECORD_KEY }}
        GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}

    - name: Check file existence
      id: check_files
      uses: andstor/file-existence-action@v3
      with:
        files: "coverage/coverage-final.json"

    - name: Copy coverage
      if: steps.check_files.outputs.files_exists == 'true'
      run: |
        mv coverage/coverage-final.json coverage/coverage-${{ matrix.containers }}.json

    - name: Upload coverage
      uses: actions/upload-artifact@v4
      if: steps.check_files.outputs.files_exists == 'true'
      with:
        name: coverage-${{ matrix.containers }}
        path: "coverage/coverage-${{ matrix.containers }}.json"

Key Enhancements:

Merging and Validating Coverage Reports

Finally, the coverage job consolidates all coverage reports:

coverage:
  name: Coverage
  runs-on: ubuntu-latest
  needs: [sanity-e2e, sanity-jest]
  steps:
    - uses: actions/checkout@v4

    - name: Download and move artifacts
      uses: actions/download-artifact@v4
      with:
        merge-multiple: true
        path: coverage
        pattern: coverage-*

    - name: Merge coverage
      run: |
        mkdir .nyc_output
        npx nyc merge coverage
        mv coverage.json .nyc_output/out.json
        npx nyc report --reporter text --reporter json-summary --report-dir coverage --exclude-after-remap false
        sed -i -e 's/\/home\/runner\/actions-runner\/_work\/{:repo_path}\///g' coverage/coverage-summary.json

    - name: Coverage Diff
      uses: greatwizard/coverage-diff-action@v1
      with:
        github-token: ${{ secrets.GITHUB_TOKEN }}
        allowed-to-fail: "true"

Key Enhancements:

Gotchas

Conclusion

The challenge of using SWC in Next.js versus Babel for Cypress adds another layer of complexity. Our solution strikes a balance by leveraging SWC for performance in Next.js while using Babel for accurate coverage instrumentation.

In summary, despite some inherent challenges, this approach provides a robust CI pipeline that meets business requirements and delivers unified coverage insights, ensuring developers receive faster feedback and maintain high test reliability.


Footnotes

  1. I will never tire of saying that a high code-coverage percentage is not necessarily synonymous with good code or a quality test strategy, but that is a topic that deserves its own full post.