nvim-neotest / neotest-jest

MIT License

Neotest shows that test failed even when it doesn't #94

Open Demianeen opened 8 months ago

Demianeen commented 8 months ago

When using LazyVim with Neotest, I'm encountering a problem where Neotest incorrectly shows tests as failed even though they pass, particularly when the Jest config is not in the root directory.

https://github.com/nvim-neotest/neotest-jest/assets/51330172/3ce518d2-f356-4db8-ad6f-ba7f61bf1238

I think the issue is in how neotest-jest interprets the Jest config. If my Jest config is at the root of the repo, I almost never see this behaviour: CleanShot 2023-12-07 at 16 46 23@2x

My config:

local getcwd = function()
    local file = vim.fn.expand("%:p")
    -- run inside the package root if the file lives under packages/<name>/src
    if string.find(file, "/packages/") then
        local package_root = string.match(file, "(.-/[^/]+/)src")
        if package_root then
            return package_root
        end
    end
    return vim.fn.getcwd()
end
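For reference, the package-root pattern in getcwd can be sanity-checked outside Neovim. This is a rough Node translation of the Lua match (my approximation, not plugin code):

```javascript
// Rough Node equivalent of the Lua pattern "(.-/[^/]+/)src" used above:
// capture everything up to and including the package directory that
// contains src/. The regex translation is an approximation, not plugin code.
const packageRoot = (file) => {
  const m = file.match(/^(.*?\/[^/]+\/)src\//);
  return m ? m[1] : null;
};

console.log(packageRoot('/repo/packages/counter/src/Counter.test.tsx'));
// "/repo/packages/counter/"
console.log(packageRoot('/repo/lib/util.ts'));
// null
```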

return {
    "nvim-neotest/neotest",
    dependencies = {
        "haydenmeade/neotest-jest",
    },
    opts = function(_, opts)
        table.insert(
            opts.adapters,
            require("neotest-jest")({
                jestCommand = "npm test --",
                -- gets config path from `npm run test` `--config` value or returns package root jest.config.ts
                jestConfigFile = function()
                    local cwd = vim.fn.getcwd()
                    local package_json_path = cwd .. "/package.json"
                    -- readfile() errors if the file does not exist, so
                    -- wrap it in pcall
                    local ok, package_json_content =
                        pcall(vim.fn.readfile, package_json_path)

                    -- Check that the file was read successfully
                    if not ok or next(package_json_content) == nil then
                        print("package.json is empty or does not exist")
                        return nil
                    end

                    package_json_content =
                        table.concat(package_json_content, "")
                    local decoded_json =
                        vim.fn.json_decode(package_json_content)

                    -- Check if scripts exists and specifically test script
                    if
                        decoded_json
                        and decoded_json.scripts
                        and decoded_json.scripts.test
                    then
                        local test_script = decoded_json.scripts.test

                        -- Pattern to match the --config argument
                        local config_arg_pattern = "%-%-config%s([%w%./_-]+)"
                        local config_path =
                            test_script:match(config_arg_pattern)

                        -- match() returns nil when the script has no
                        -- --config flag; concatenating nil would error
                        if config_path then
                            print("Config found: " .. config_path)
                        end

                        return config_path
                    else
                        print("No test script found in package.json")
                        -- note the "/" separator, otherwise this builds
                        -- e.g. "/repo/pkgjest.config.ts"
                        return cwd .. "/jest.config.ts"
                    end
                end,
                env = { CI = true },
                cwd = getcwd,
            })
        )
    end,
}
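For what it's worth, the `--config` extraction in jestConfigFile can be checked outside Neovim. A rough Node equivalent of the Lua pattern (my translation, not plugin code):

```javascript
// Rough Node equivalent of the Lua pattern "%-%-config%s([%w%./_-]+)" used
// in jestConfigFile above: capture the value passed to --config in a
// package.json test script. The translation is an approximation; note that
// neither pattern matches the "--config=path" form.
const extractConfigPath = (testScript) => {
  const m = testScript.match(/--config\s+([\w./_-]+)/);
  return m ? m[1] : null;
};

console.log(extractConfigPath('jest --config ./config/jest/jest.config.ts'));
// "./config/jest/jest.config.ts"
console.log(extractConfigPath('jest --watch'));
// null
```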

The only pattern I found while debugging is that this happens with a custom config directory like ./config/jest/jest.config.ts.

My jest config:

import path from 'path' // needed for path.resolve in moduleNameMapper below

export default {
  // A set of global variables that need to be available in all test environments
  globals: {
    __IS_DEV__: true,
    __API__: '',
    __PROJECT__: 'jest',
  },

  // Automatically clear mock calls, instances and results before every test
  clearMocks: true,

  // The test environment that will be used for testing
  testEnvironment: 'jsdom',

  // An array of regexp pattern strings used to skip coverage collection
  coveragePathIgnorePatterns: ['/node_modules/'],

  // An array of directory names to be searched recursively up from the requiring module's location
  moduleDirectories: ['node_modules'],
  modulePaths: ['<rootDir>src'],

  // For a jest dom
  setupFilesAfterEnv: ['<rootDir>/config/jest/setupTests.ts'],

  // An array of file extensions your modules use
  moduleFileExtensions: ['js', 'jsx', 'ts', 'tsx', 'json', 'node'],

  // The root directory that Jest should scan for mocks and modules within
  rootDir: '.',

  // The glob patterns Jest uses to detect test files
  testMatch: [
    '**/__tests__/**/*.[jt]s?(x)',
    '**/?(*.)+(spec|test).[tj]s?(x)',
  ],

  // A map from regular expressions to module names or to arrays of module names that allow to stub out resources with a single module
  moduleNameMapper: {
    // enables css modules
    '\\.s?css$': 'identity-obj-proxy',
    // enable svg
    '\\.(jpg|ico|jpeg|png|gif|eot|otf|webp|svg|ttf|woff|woff2|mp4|webm|wav|mp3|m4a|aac|oga)$':
      path.resolve(
        __dirname,
        'config',
        'jest',
        'jestEmptyComponent.tsx',
      ),
    // enables absolute imports
    '^@/(.*)$': '<rootDir>src/$1',
  },
}
davidfmatheson commented 8 months ago

I am in a similar boat, although I do not see any green checkmarks in the status window, only a top-level x. The output tells me the tests passed, though. I'm somewhat new to Neovim; can you point me to where it might be logging the actual error? This is the same situation: I have a monorepo (Nx in my case) where I have set jestConfigFile and cwd to functions that figure out where the jest.config.ts file and the working directory are.

davidfmatheson commented 8 months ago

I figured this out, at least for my setup. We had -- at the tail end of our jestCommand. With that, Jest will not write an output JSON file, yet the plugin assumes one exists. I'm not sure how to get npm test to pass things through correctly (I'm using npx jest as my jestCommand), but you can test locally at the command line with --json --outputFile=foo.json until you get that part working.

Demianeen commented 7 months ago

Hadn't thought about that, David. I checked it, and the JSON is correct. For example, for the test that shows as failed:

{
  "numFailedTestSuites": 0, // <--
  "numFailedTests": 0,
  "numPassedTestSuites": 1,
  "numPassedTests": 4, // <--
  "numPendingTestSuites": 0,
  "numPendingTests": 0,
  "numRuntimeErrorTestSuites": 0,
  "numTodoTests": 0,
  "numTotalTestSuites": 1,
  "numTotalTests": 4,
  "openHandles": [],
  "snapshot": {
    "added": 0,
    "didUpdate": false,
    "failure": false,
    "filesAdded": 0,
    "filesRemoved": 0,
    "filesRemovedList": [],
    "filesUnmatched": 0,
    "filesUpdated": 0,
    "matched": 0,
    "total": 0,
    "unchecked": 0,
    "uncheckedKeysByFile": [],
    "unmatched": 0,
    "updated": 0
  },
  "startTime": 1704434251739,
  "success": true,
  "testResults": [
    {
      "assertionResults": [
        {
          "ancestorTitles": ["Counter"],
          "duration": 16,
          "failureDetails": [],
          "failureMessages": [],
          "fullName": "Counter should render",
          "invocations": 1,
          "location": null,
          "numPassingAsserts": 1,
          "retryReasons": [],
          "status": "passed",
          "title": "should render"
        },
        {
          "ancestorTitles": ["Counter"],
          "duration": 3,
          "failureDetails": [],
          "failureMessages": [],
          "fullName": "Counter should render with default value",
          "invocations": 1,
          "location": null,
          "numPassingAsserts": 1,
          "retryReasons": [],
          "status": "passed",
          "title": "should render with default value"
        },
        {
          "ancestorTitles": ["Counter"],
          "duration": 30,
          "failureDetails": [],
          "failureMessages": [],
          "fullName": "Counter should increment value",
          "invocations": 1,
          "location": null,
          "numPassingAsserts": 1,
          "retryReasons": [],
          "status": "passed",
          "title": "should increment value"
        },
        {
          "ancestorTitles": ["Counter"],
          "duration": 12,
          "failureDetails": [],
          "failureMessages": [],
          "fullName": "Counter should decrement value",
          "invocations": 1,
          "location": null,
          "numPassingAsserts": 1,
          "retryReasons": [],
          "status": "passed",
          "title": "should decrement value"
        }
      ],
      "endTime": 1704434252954,
      "message": "",
      "name": "/Users/demian/Desktop/program-learning/ulbitv/react-production/react-production/src/entities/Counter/ui/Counter.test.tsx",
      "startTime": 1704434252071,
      "status": "passed",
      "summary": ""
    }
  ],
  "wasInterrupted": false
}

CleanShot 2024-01-05 at 06 41 45@2x

Do you have any idea why this might happen?

Phil-Barber commented 7 months ago

I'm unsure if this is related, but I've got the same symptom: tests pass and are shown as passing in the output_panel logs, but the status in neotest is failed.

The output JSON is created and parsed by neotest, but I can see that it adds a whole bunch of failed "missing" nodes when parsing in the results callback. In my case, the missing nodes have all dropped the outer describe block from my test suite.

e.g. I end up with two elements in the results parsed by neotest. The correct one:

  ["/<path>::POST /report/:id/submit::should return 200"] = {
    location = {
      column = 3,
      line = 118
    },
    output = "/var/folders/q8/ngfv3nsd115cn855g6nwkmy00000gp/T/nvim.philip.barber/1KgRcm/2",
    short = "should return 200 and submit the report and snapshots the funds: passed",
    status = "passed"
  },

and one missing the describe:

  ["/path::should return 200 and submit the report and snapshots the funds"] = {
    errors = {},
    output = "/var/folders/q8/ngfv3nsd115cn855g6nwkmy00000gp/T/nvim.philip.barber/1KgRcm/2",
    status = "failed"
  },

I've already put too much time into this for now, so unfortunately I can't go further. My guess is that the tree is being built incorrectly; I can't tell whether that's done in the adapter or in neotest itself.
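For illustration, this is roughly how result keys of that shape are derived from Jest's JSON output (a sketch of the idea, not neotest-jest's actual code): each assertion's key joins the file path, its ancestorTitles (the describe blocks), and its title, so if the describe level gets dropped the key no longer lines up with neotest's position tree.

```javascript
// Sketch of the idea, not neotest-jest's actual code: build neotest-style
// position keys ("file::describe::title") from Jest's --json output.
const jestOutput = {
  testResults: [{
    name: '/path/report.test.ts',
    assertionResults: [{
      ancestorTitles: ['POST /report/:id/submit'],
      title: 'should return 200',
      status: 'passed',
    }],
  }],
};

const statuses = {};
for (const suite of jestOutput.testResults) {
  for (const t of suite.assertionResults) {
    // Dropping ancestorTitles here would produce the mismatched
    // "/path::should return 200..." style key seen above.
    const key = [suite.name, ...t.ancestorTitles, t.title].join('::');
    statuses[key] = t.status;
  }
}
console.log(statuses);
// { '/path/report.test.ts::POST /report/:id/submit::should return 200': 'passed' }
```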

In case it's helpful here's my config, my project is in a monorepo:

  {
    "nvim-neotest/neotest",
    dependencies = {
      "nvim-lua/plenary.nvim",
      "antoinemadec/FixCursorHold.nvim",
      "nvim-treesitter/nvim-treesitter",
      "nvim-neotest/neotest-jest",
    },
    config = function()
      require('neotest').setup({
        discovery = {
          enabled = false,
        },
        adapters = {
          require('neotest-jest')({
            jestCommand = "npm run test:integration -- --passWithNoTests",
            env = { CI = true },
            cwd = function(path)
              return vim.fn.getcwd()
            end,
            jest_test_discovery = false,
            jestConfigFile = '',
          }),
        },
      })
    end
  },
GitMurf commented 5 months ago

I have the same issue. Did anyone come up with a solution for this?

davidfmatheson commented 5 months ago

I have the same issue. Did anyone come up with a solution for this?

Just what I mentioned above about my jestCommand setup. I also noticed that with oddly named tests, the plugin sometimes has trouble matching things up from the output file. Try a name that is simply [a-zA-Z]* plus spaces.
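One plausible reason (an illustration, not a confirmed diagnosis of the plugin) is that a test title containing pattern-special characters does not match literally if it is interpolated into a pattern unescaped:

```javascript
// Illustration only, not plugin code: why titles with special characters
// can fail to match. "(fast)" becomes a regex group, not literal text.
const title = 'renders (fast)';

const naive = new RegExp(title);
console.log(naive.test('renders (fast)')); // false: parentheses were treated as a group

// Escaping regex metacharacters first makes the match literal.
const escaped = title.replace(/[.*+?^${}()|[\]\\]/g, '\\$&');
console.log(new RegExp(escaped).test('renders (fast)')); // true
```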

kirylvarykau commented 4 months ago

I have the same issue, and no weird symbols in my test files :( Any ideas what it could be?

mattd-tg commented 3 months ago

Just a possible hint for this: I realized that in our top-level describe we weren't passing anything with weird symbols; HOWEVER, we were passing describe(Component), and that was breaking the neotest summary integration, causing all of the tests to report as failed.

Changing the test to pass a string instead of the component definition directly fixes the issue.

towry commented 3 weeks ago

This seems related to the Jest version, or to how pnpm/npm runs the commands. In my previous nvim neotest setup, I used the following config for neotest-jest:

        require('neotest-jest')({
          jestCommand = 'pnpm test --',
          env = { CI = true },
          cwd = function(path)
            -- return the current project's root
          end,
        }),

You can see I was using pnpm test -- to let extra arguments pass through to pnpm's test script, but it was not working in my new nvim setup, so I changed it to pnpm test --bail --ci and now it is working well.

sukhiboi commented 1 week ago

This seems related to the Jest version, or to how pnpm/npm runs the commands. In my previous nvim neotest setup, I used the following config for neotest-jest:

        require('neotest-jest')({
          jestCommand = 'pnpm test --',
          env = { CI = true },
          cwd = function(path)
            -- return the current project's root
          end,
        }),

You can see I was using pnpm test -- to let extras arguments pass to the pnpm's test script. but it is not working in my new nvim setup, so i changed it to pnpm test --bail --ci and now it is working well.

Removing the -- helped me.

This is the jest command I'm using in my project

jestCommand = "node --expose-gc --no-compilation-cache ./node_modules/jest/bin/jest.js --logHeapUsage --colors --silent",

A few things I observed:

  1. If the jestCommand is "npm test --" (without a space at the end), the correct set of tests runs, but even when the tests pass, the summary shows them as failed.
  2. If the jestCommand is "npm test " (with an extra space at the end), all the tests in the project start running.
  3. If the jestCommand is "npm test" (no trailing space and no "--"), the correct set of tests is picked up, and when they run successfully the correct status is shown.

So based on this, writing jestCommand the third way works for me.