This guide explains how to execute tests for the Wazuh Dashboard Plugins. Tests ensure code quality, prevent regressions, and validate functionality.

Test Environment

Tests must be executed inside the Docker development environment. Running tests directly on the host machine will fail due to missing dependencies and OpenSearch Dashboards infrastructure.

Prerequisites

Before running tests:
  1. Development environment configured (see Setup)
  2. Docker development environment running (see Building)
  3. Plugin dependencies installed

Starting the Test Environment

Start the Docker development environment:
cd docker/osd-dev
./dev.sh up
Wait for the containers to initialize. Verify containers are running:
docker ps | grep osd
You should see output similar to:
osd-dev-330-osd-1    running    0.0.0.0:5601->5601/tcp

Accessing the Test Container

Attach a shell to the OpenSearch Dashboards container:
# Using container ID
docker exec -it <CONTAINER_ID> bash

# Using container name pattern
docker exec -it osd-dev-330-osd-1 bash
Find your container ID or name:
docker ps | grep osd

Running Unit Tests (Jest)

Jest is the primary testing framework for unit tests. Each plugin contains its own test suite.

Main Plugin Tests

From inside the container:
cd plugins/main
yarn test:jest

Wazuh Core Tests

cd plugins/wazuh-core
yarn test:jest

Wazuh Check Updates Tests

cd plugins/wazuh-check-updates
yarn test:jest

Test Output

Jest executes all .test.ts and .test.tsx files in the plugin directory and displays:
  • Test results (pass/fail)
  • Code coverage information
  • Execution time
  • Failed assertion details
Example output:
PASS  public/components/example.test.tsx
  Example Component
    ✓ renders correctly (45 ms)
    ✓ handles user interaction (23 ms)

Test Suites: 1 passed, 1 total
Tests:       2 passed, 2 total
Snapshots:   0 total
Time:        3.456 s

Running Specific Tests

Run tests matching a pattern:
yarn test:jest --testPathPattern=components/security
Run a single test file:
yarn test:jest public/components/example.test.tsx

Watch Mode

Run tests in watch mode for development:
yarn test:jest --watch
This automatically re-runs tests when files change.

Updating Snapshots

When component output changes intentionally, update snapshots:
yarn test:jest -u
Only update snapshots after verifying the changes are intentional. Review snapshot diffs carefully before committing.

Test Coverage

Generate code coverage reports:
yarn test:jest --coverage
Coverage reports show:
  • Statements: Percentage of executed statements
  • Branches: Percentage of executed conditional branches
  • Functions: Percentage of called functions
  • Lines: Percentage of executed lines
Coverage reports are written to the coverage/ directory. Open coverage/lcov-report/index.html in a browser for a detailed visualization.
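Jest can also fail the run when coverage drops below a minimum, via its coverageThreshold option. A sketch of what that might look like in a Jest config file (the numbers are illustrative, not the project's actual settings):

```typescript
// jest.config.ts — illustrative coverage thresholds, not the project's real config.
export default {
  collectCoverage: true,
  coverageDirectory: 'coverage',
  coverageThreshold: {
    // Applies to the whole run; Jest exits non-zero if any metric falls below.
    global: {
      statements: 80,
      branches: 70,
      functions: 80,
      lines: 80,
    },
  },
};
```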

Test Types

Unit Tests

Test individual functions, components, and modules in isolation.
Location: *.test.ts and *.test.tsx files throughout the codebase
Framework: Jest
Examples:
  • Component rendering tests
  • Utility function tests
  • Redux reducer tests
  • API client tests
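As an illustration of the reducer case above, a minimal Redux-style reducer and the shape of a Jest assertion against it might look like this (the state, action, and reducer names are hypothetical, not taken from the Wazuh codebase):

```typescript
// Hypothetical reducer under test — illustrative names only.
interface AgentsState {
  ids: string[];
}

type AgentsAction =
  | { type: 'ADD_AGENT'; id: string }
  | { type: 'RESET' };

const initialState: AgentsState = { ids: [] };

function agentsReducer(
  state: AgentsState = initialState,
  action: AgentsAction,
): AgentsState {
  switch (action.type) {
    case 'ADD_AGENT':
      // Return a new state object; reducers must not mutate their input.
      return { ids: [...state.ids, action.id] };
    case 'RESET':
      return initialState;
    default:
      return state;
  }
}

// In a Jest suite this becomes:
// it('should append the id when ADD_AGENT is dispatched', () => {
//   const next = agentsReducer(undefined, { type: 'ADD_AGENT', id: '001' });
//   expect(next.ids).toEqual(['001']);
// });
```

Because the reducer is a pure function, the test needs no mocking or rendering infrastructure, which is what makes reducer tests fast and reliable.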

Integration Tests

Test interactions between multiple components or modules.
Location: Same as unit tests, but testing component integration
Framework: Jest with additional mocking

End-to-End Tests

Full application tests using real browser automation.
Location: plugins/main/test/cypress/
Framework: Cypress
Cypress tests require additional setup and infrastructure not included in the standard Docker development environment. These tests are typically run in CI/CD pipelines.

Writing Tests

Test File Structure

Follow this structure for test files:
import { Component } from './component';

describe('Component Name', () => {
  beforeEach(() => {
    // Setup before each test
  });

  afterEach(() => {
    // Cleanup after each test
  });

  it('should do something specific', () => {
    // Arrange
    const input = 'test';
    
    // Act
    const result = Component(input);
    
    // Assert
    expect(result).toBe('expected');
  });
});

Component Testing

Test React components using React Testing Library:
import { render, screen } from '@testing-library/react';
import { ExampleComponent } from './example-component';

describe('ExampleComponent', () => {
  it('renders with correct text', () => {
    render(<ExampleComponent text="Hello" />);
    expect(screen.getByText('Hello')).toBeInTheDocument();
  });
});

Mocking

Mock external dependencies:
jest.mock('../../services/api-client', () => ({
  getAgents: jest.fn(() => Promise.resolve([]))
}));
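For context on what the mock above provides: jest.fn() returns a function that records every call it receives, which is what assertions such as expect(mock).toHaveBeenCalledWith(...) inspect via mockFn.mock.calls. A hand-rolled sketch of that idea (illustrative only; in real tests use jest.fn() and jest.mock()):

```typescript
// Minimal spy: a callable that forwards to an implementation and records
// every call's arguments, analogous to what jest.fn() does under the hood.
function makeSpy<R>(impl: (...args: unknown[]) => R) {
  const calls: unknown[][] = [];
  const fn = (...args: unknown[]): R => {
    calls.push(args);
    return impl(...args);
  };
  // Attach the call log so assertions can inspect it.
  return Object.assign(fn, { calls });
}

const getAgents = makeSpy(() => Promise.resolve([] as string[]));
getAgents('active');
// getAgents.calls now holds [['active']] — the same information Jest
// exposes as mockFn.mock.calls.
```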

Snapshot Testing

Capture component output:
import { render } from '@testing-library/react';
import { Component } from './component';

it('matches snapshot', () => {
  const { container } = render(<Component />);
  expect(container).toMatchSnapshot();
});

Test Scripts Reference

Available test scripts in package.json:
Script            Description
test:jest         Run Jest unit tests
test:jest:runner  Run tests with custom runner
test:server       Run server-side tests
test:browser      Run browser-based tests
test:ui:runner    Run UI functional tests
Scripts test:server, test:browser, and test:ui:runner require additional OpenSearch Dashboards infrastructure not available in the Docker development environment. Use these in production-like environments or CI/CD pipelines.

Continuous Integration

The repository includes GitHub Actions workflows for automated testing:
  • Pull Request Checks: Run on every pull request
  • Branch Protection: Tests must pass before merging
  • Coverage Reports: Uploaded to coverage tracking services
View CI/CD configurations in .github/workflows/ directory.

Common Test Warnings

Some warnings during test execution are expected and do not indicate failures:
  • Browserslist warnings: “caniuse-lite is outdated” - informational only
  • Prop validation warnings: May occur in test environments
  • Console messages: Some tests intentionally trigger console output

Troubleshooting

Tests Fail on Host Machine

Problem: Tests fail when run directly on the host machine.
Solution: Tests must run inside the Docker container. Access the container first:
docker exec -it <CONTAINER_ID> bash
cd plugins/main
yarn test:jest

Missing setup_node_env

Problem: Error about missing setup_node_env.
Solution: This script is provided by OpenSearch Dashboards. Run tests inside the Docker container, where OpenSearch Dashboards is installed.

Out of Memory Errors

Problem: Tests fail with heap out of memory errors.
Solution: Increase the Node.js memory limit:
NODE_OPTIONS="--max-old-space-size=4096" yarn test:jest

Snapshot Mismatches After Updates

Problem: Snapshot tests fail after a dependency update.
Solution: Review the changes, then update snapshots if they are intentional:
yarn test:jest -u

Port Already in Use

Problem: Cannot start the test environment due to a port conflict.
Solution: Stop other containers or processes using port 5601:
docker ps
docker stop <CONTAINER_ID>

Best Practices

Test Naming

  • Use descriptive test names that explain what is being tested
  • Follow pattern: “should [expected behavior] when [condition]”
  • Group related tests using describe blocks

Test Independence

  • Each test should be independent and isolated
  • Avoid test order dependencies
  • Clean up after tests in afterEach hooks

Mocking Strategy

  • Mock external dependencies (API calls, file system, etc.)
  • Keep mocks simple and focused
  • Place shared mocks in __mocks__/ directories
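A shared manual mock is an ordinary module that mirrors the real module's exports and lives in a __mocks__/ directory beside it. A minimal sketch, with a hypothetical path and export names:

```typescript
// services/__mocks__/api-client.ts — hypothetical shared manual mock.
// It exports the same names as the real api-client, with canned behavior.
export const getAgents = async (): Promise<string[]> => [];
export const getClusterStatus = async (): Promise<{ enabled: boolean }> => ({
  enabled: false,
});
```

Jest substitutes this file for the real module when a test calls jest.mock() with the corresponding module path, so every suite shares one consistent fake instead of redefining it inline.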

Coverage Goals

  • Aim for high coverage on critical paths
  • Focus on business logic and user interactions
  • Don’t chase 100% coverage at the expense of test quality

Next Steps

  • Review contributing guidelines
  • Explore test files in the repository for examples
  • Set up CI/CD integration for your fork