# UnitTesting

This is a unittest framework for Sublime Text. It runs unittest test cases on local machines and via GitHub Actions. It also supports testing syntax_test files for the sublime-syntax format and sublime-color-scheme files.
## Sublime Text 4

Sublime Text 4 is now supported, and testing works for Python 3.8 packages.
## Preparation

- Install UnitTesting via Package Control.
- Your package!
- Test cases should be placed in `test*.py` files under the directory `tests` (configurable, see below). The test cases are then loaded by `TestLoader.discover`.

Here are some small examples.
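For instance, a minimal package layout could look like this (the package and file names are purely illustrative):

```
MyPackage/
├── my_commands.py
├── unittesting.json
└── tests/
    └── test_my_commands.py
```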
## Running Tests Locally

### Command Palette

- Open the Command Palette using ctrl+shift+p or the menu item `Tools → Command Palette...`.
- Choose a `UnitTesting: ...` command to run and hit Enter.
To test any package:

- run `UnitTesting: Test Package`
- enter the package name in the input panel and hit Enter.

An output panel pops up, displaying the progress and results of running the tests.

To run only tests in particular files, enter `<Package name>:<filename>`. `<filename>` should be a Unix shell wildcard to match the file names; `<Package name>:test*.py` is used by default.
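For example, to run only the tests in files whose names start with `test_view` in a hypothetical package `MyPackage`, you would enter:

```
MyPackage:test_view*.py
```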
The command `UnitTesting: Test Current Package` runs all tests of the package the active view's file is part of. The package is reloaded to pick up any code changes, and then the tests are executed.

The command `UnitTesting: Test Current Package with Coverage` runs the tests for the current package and generates a coverage report via coverage. The `.coveragerc` file is used to control the coverage configuration. If it is missing, UnitTesting will ignore the `tests` directory.
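As a sketch, a `.coveragerc` that reproduces the default behavior of ignoring the `tests` directory could look like this:

```ini
[run]
omit =
    tests/*
```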
> [!NOTE]
> As of UnitTesting 1.8.0 the following commands have been replaced to enable more flexible usage and integration in build systems:
>
> - `unit_testing_current_package` → `{ "command": "unit_testing", "package": "$package_name" }`
> - `unit_testing_current_file` → `{ "command": "unit_testing", "package": "$package_name", "pattern": "$file_name" }`
Build System
To run tests via build system specify unit_testing
build system "target"
.
{
"target": "unit_testing"
}
#### Project specific Test Current Package build command

It is recommended to add the following to the `.sublime-project` file so that ctrl+b invokes the testing action:

```json
"build_systems":
[
    {
        "name": "Test Current Package",
        "target": "unit_testing",
        "package": "$package_name",
        "failfast": true
    }
]
```
#### Project specific Test Current File build command

It is recommended to add the following to the `.sublime-project` file so that ctrl+b invokes the testing action:

```json
"build_systems":
[
    {
        "name": "Test Current File",
        "target": "unit_testing",
        "package": "$package_name",
        "pattern": "$file_name",
        "failfast": true
    }
]
```
## GitHub Actions

UnitTesting provides the following GitHub Actions, which can be combined in a workflow to design package tests.

- SublimeText/UnitTesting/actions/setup

  Sets up Sublime Text to run tests within. This must always be the first step after checking out the package to test.

- SublimeText/UnitTesting/actions/run-color-scheme-tests

  Tests color schemes using ColorSchemeUnit.

- SublimeText/UnitTesting/actions/run-syntax-tests

  Tests sublime-syntax definitions using the built-in syntax test functionality of the already running Sublime Text environment. It is an alternative to SublimeText/syntax-test-action or sublimehq's online syntax_test_runner.

- SublimeText/UnitTesting/actions/run-tests

  Runs the `unit_testing` command to perform Python unit tests.

> [!NOTE]
> Actions are released in the branch `v1`. Minor changes will be pushed to the same branch unless there are breaking changes.
### Color Scheme Tests

To integrate color scheme tests via ColorSchemeUnit, add the following snippet to a workflow file (e.g. `.github/workflows/color-scheme-tests.yml`):

```yaml
name: ci-color-scheme-tests
on: [push, pull_request]
jobs:
  run-color-scheme-tests:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: SublimeText/UnitTesting/actions/setup@v1
      - uses: SublimeText/UnitTesting/actions/run-color-scheme-tests@v1
```
### Syntax Tests

To run only syntax tests, add the following snippet to a workflow file (e.g. `.github/workflows/syntax-tests.yml`):

```yaml
name: ci-syntax-tests
on: [push, pull_request]
jobs:
  run-syntax-tests:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: SublimeText/UnitTesting/actions/setup@v1
      - uses: SublimeText/UnitTesting/actions/run-syntax-tests@v1
```
> [!NOTE]
> If you are looking for syntax tests only, you may also check out SublimeText/syntax-test-action. Using UnitTesting's action makes most sense when it can re-use an already set-up Sublime Text test environment.
### Unit Tests

To run only Python unit tests on all platforms and versions of Sublime Text, add the following snippet to a workflow file (e.g. `.github/workflows/unit-tests.yml`):

```yaml
name: ci-unit-tests
on: [push, pull_request]
jobs:
  run-tests:
    strategy:
      fail-fast: false
      matrix:
        st-version: [3, 4]
        os: ["ubuntu-latest", "macOS-latest", "windows-latest"]
    runs-on: ${{ matrix.os }}
    steps:
      - uses: actions/checkout@v4
      - uses: SublimeText/UnitTesting/actions/setup@v1
        with:
          package-name: Package Name  # if differs from repo name
          sublime-text-version: ${{ matrix.st-version }}
      - uses: SublimeText/UnitTesting/actions/run-tests@v1
        with:
          coverage: true
          package-name: Package Name  # if differs from repo name
      - uses: codecov/codecov-action@v4
```
### Run All Tests

```yaml
name: ci-tests
on: [push, pull_request]
jobs:
  run-tests:
    strategy:
      fail-fast: false
      matrix:
        st-version: [3, 4]
        os: ["ubuntu-latest", "macOS-latest", "windows-latest"]
    runs-on: ${{ matrix.os }}
    steps:
      # checkout package to test
      - uses: actions/checkout@v4
      # setup test environment
      - uses: SublimeText/UnitTesting/actions/setup@v1
        with:
          sublime-text-version: ${{ matrix.st-version }}
      # run color scheme tests (only on Linux)
      - if: ${{ matrix.os == 'ubuntu-latest' }}
        uses: SublimeText/UnitTesting/actions/run-color-scheme-tests@v1
      # run syntax tests and check compatibility with the new syntax engine (only on Linux)
      - if: ${{ matrix.os == 'ubuntu-latest' }}
        uses: SublimeText/UnitTesting/actions/run-syntax-tests@v1
        with:
          compatibility: true
      # run unit tests with coverage upload
      - uses: SublimeText/UnitTesting/actions/run-tests@v1
        with:
          coverage: true
          extra-packages: |
            A File Icon:SublimeText/AFileIcon
      - uses: codecov/codecov-action@v4
```

Check this for further examples.
## Options

### Package Configuration

UnitTesting is primarily configured via a `unittesting.json` file in the package root directory:

```json
{
    "verbosity": 1,
    "coverage": true
}
```
### Build System Configuration

Options provided via the build system configuration override `unittesting.json`:

```json
{
    "target": "unit_testing",
    "package": "$package_name",
    "verbosity": 2,
    "coverage": true
}
```
### Command Arguments

Options passed as arguments to the `unit_testing` command override `unittesting.json`:

```python
window.run_command("unit_testing", {"package": "$package_name", "coverage": False})
```
Available Options
name | description | default value |
---|---|---|
tests_dir | the name of the directory containing the tests | “tests” |
pattern | the pattern to discover tests | “test*.py” |
deferred | whether to use deferred test runner | true |
condition_timeout | default timeout in ms for callables invoked via yield |
4000 |
failfast | stop early if a test fails | false |
output | name of the test output instead of showing in the panel |
null |
verbosity | verbosity level | 2 |
warnings | The warnings filter controls python warnings treatment. | “default” |
capture_console | capture stdout and stderr in the test output | false |
reload_package_on_testing | reloading package will increase coverage rate | true |
coverage | track test case coverage | false |
coverage_on_worker_thread | (experimental) | false |
generate_html_report | generate HTML report for coverage | false |
generate_xml_report | generate XML report for coverage | false |
Valid `warnings` values are:

Value | Disposition |
---|---|
"default" | print the first occurrence of matching warnings for each location (module + line number) where the warning is issued |
"error" | turn matching warnings into exceptions |
"ignore" | never print matching warnings |
"always" | always print matching warnings |
"module" | print the first occurrence of matching warnings for each module where the warning is issued (regardless of line number) |
"once" | print only the first occurrence of matching warnings, regardless of location |

See also: https://docs.python.org/3/library/warnings.html#warning-filter
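For example, to turn matching warnings into exceptions while testing (an illustrative choice, not a recommendation), a `unittesting.json` could contain:

```json
{
    "warnings": "error"
}
```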
## Writing Unittests

UnitTesting is based on Python's unittest library. Any valid unittest test case is allowed.

Example: `tests/test_myunit.py`

```python
from unittest import TestCase


class MyTestCase(TestCase):
    def test_something(self):
        self.assertTrue(True)
```
### Deferred testing

Tests can be written using deferrable test cases to test the results of asynchronous or long-lasting Sublime Text commands, which requires yielding control to the Sublime Text runtime and resuming test execution at a later point. It is a kind of cooperative multithreading, such as provided by asyncio, but with a home-grown DeferringTextTestRunner acting as the event loop. The idea was inspired by Plugin UnitTest Harness.

DeferrableTestCase is used to write the test cases. They are executed by the DeferringTextTestRunner, and the runner accepts not only regular test functions but also generators. If the test function is a generator, the runner does the following:

- If the yielded object is a callable, the runner evaluates the callable and checks its return value. If the result is truthy, the runner continues the generator; otherwise, it waits until the condition is met, with a default timeout of 4 s. The result of the callable can also be retrieved from the `yield` statement. The yielded object can also be a dictionary of the following form, to specify various overrides such as the poll interval or timeout in ms:

  ```
  {
      # required condition callable
      "condition": callable,
      # system timestamp when to start condition checks (default: `time.time()`)
      "start_time": timestamp,
      # optional interval in ms at which to invoke `condition()` (default: 17)
      "period": milliseconds,
      # optional timeout in ms to wait for the condition to be met
      # (default: value from unittesting.json or 4000)
      "timeout": milliseconds,
      # optional message to print if the condition is not met within the timeout
      "timeout_message": "Condition not fulfilled"
  }
  ```

- If the yielded object is an integer, say `x`, the runner continues the generator after `x` ms.
- `yield AWAIT_WORKER` yields to a task on the worker thread.
- Otherwise, a plain `yield` yields to a task on the main thread.
Example:

```python
import sublime

from unittesting import DeferrableTestCase


class TestCondition(DeferrableTestCase):
    def test_condition1(self):
        x = []

        def append():
            x.append(1)

        def condition():
            return len(x) == 1

        sublime.set_timeout(append, 100)
        # wait until `condition()` is true
        yield condition
        self.assertEqual(x[0], 1)

    def test_condition2(self):
        x = []

        def append():
            x.append(1)

        def condition():
            return len(x) == 1

        sublime.set_timeout(append, 100)
        # wait until `condition()` is true
        yield {
            "condition": condition,
            "period": 200,
            "timeout": 5000,
            "timeout_message": "Not enough items added to x"
        }
        self.assertEqual(x[0], 1)
```
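The remaining yield forms can be sketched as follows (a minimal illustration; it assumes `AWAIT_WORKER` is importable from `unittesting`, as used in tests/test_defer.py):

```python
from unittesting import AWAIT_WORKER, DeferrableTestCase


class TestYieldForms(DeferrableTestCase):
    def test_yield_forms(self):
        # continue the generator after 100 ms
        yield 100
        # yield to a task on the worker thread
        yield AWAIT_WORKER
        # yield to a task on the main thread
        yield
```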
See also tests/test_defer.py.
### Helper TestCases

UnitTesting provides some helper test case classes, which perform common tasks such as overriding preferences, setting up views, etc.

- DeferrableViewTestCase
- OverridePreferencesTestCase
- TempDirectoryTestCase
- ViewTestCase

Usage notes and some examples are available via the docstrings, which are displayed as hover popups by LSP with e.g. LSP-pyright.
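As a rough illustration, a test built on ViewTestCase might look like the following sketch. It assumes ViewTestCase provides a `self.view` set up for each test; consult the class docstring for the actual contract.

```python
import sublime

from unittesting import ViewTestCase


class InsertTextTest(ViewTestCase):
    def test_insert(self):
        # assumption: ViewTestCase sets up a scratch `self.view` per test;
        # verify against the class docstring
        self.view.run_command("insert", {"characters": "Hello"})
        self.assertEqual(
            self.view.substr(sublime.Region(0, self.view.size())),
            "Hello",
        )
```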
## Credits

Thanks to guillermooo and philippotto for their early efforts on AppVeyor and Travis CI macOS support (though these services are no longer supported).