Testing with Portals
At the Linux App Summit (LAS) in Albania three months ago, I gave a talk about testing in the xdg-desktop-portal project. There is a recording of the presentation, and the slides are available as well.
To give a quick summary of the work I did:
- Revamped the CI
- Reworked and improved the pytest-based integration test harness (see the first sketch after this list)
- Added integration tests for new portals
- Ported over all the existing GLib/C-based integration tests
- Added ASAN support for detecting memory leaks in the tests
- Made tests pretend to be either a host, Flatpak, or Snap app (see the second sketch after this list)
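
To give an idea of what the harness looks like, here is a minimal sketch of a pytest-based portal test that talks to the portal frontend over the session bus. The fixture and the interface it pokes are illustrative assumptions, not the project’s actual test fixtures:

```python
# A minimal sketch of a pytest-based portal integration test.
# The fixture below is an illustrative assumption; the real
# harness in xdg-desktop-portal has its own fixtures that spawn
# the portal daemon and the mock backends.
import dbus
import pytest


@pytest.fixture
def portal():
    """Get the portal frontend object on the session bus."""
    bus = dbus.SessionBus()
    return bus.get_object(
        "org.freedesktop.portal.Desktop",
        "/org/freedesktop/portal/desktop",
    )


def test_settings_portal_has_version(portal):
    # Portal interfaces expose a "version" property; reading it is
    # a cheap smoke test that the interface is actually reachable.
    props = dbus.Interface(portal, "org.freedesktop.DBus.Properties")
    version = props.Get("org.freedesktop.portal.Settings", "version")
    assert version >= 1
```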
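
As for pretending to be a Flatpak app: the portal figures out what kind of app is calling it by, among other things, looking for the .flatpak-info keyfile in the caller’s mount namespace. Purely to illustrate the metadata involved (the real harness has its own mechanism for putting such a file in place), a minimal fake could look like this:

```python
# Illustrative only: a minimal .flatpak-info keyfile for a pretend
# Flatpak app. How the test harness actually makes the portal see
# such a file differs from this sketch.
from pathlib import Path


def write_fake_flatpak_info(path: Path, app_id: str = "org.example.Test") -> None:
    """Write a minimal Flatpak metadata keyfile for a pretend app."""
    path.write_text(f"[Application]\nname={app_id}\n")
```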
My hope was that this would result in:
- Fewer regressions
- Tests for new features and bug fixes
- More confidence in refactoring
- More activity in the project
While it’s hard to get definitive data on those points, at least some of it seems to have become reality. I have seen an increase in activity (there are certainly other factors at play), and a lot of PRs now come with tests without my even having to ask. Canonical is involved again, taking care of the Snap side of things. So far it seems like we didn’t introduce any new regressions, but that usually only becomes apparent after a new release. Refactoring portals has also become a much better experience, because passing tests provide a baseline level of confidence, and issues can easily be bisected. Overall, I’m already quite happy with the results.
This week, Georges merged the last piece of what I talked about in the LAS presentation, so we’re finally testing the code paths that are specific to host, Flatpak, and Snap applications! I also continued improving the tests: they can now be run under Valgrind. That is far too slow for CI, which is why we’re not doing it there, but it tends to find memory leaks that ASAN misses; with the existing tests, it already found 9 small memory leaks.
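
Since the leaks of interest live in the C daemon rather than in the Python test process, Valgrind has to wrap the daemon itself. Here is a rough sketch of how a harness could do that; the binary path and the Valgrind options are assumptions, not the actual harness code:

```python
# Rough sketch: spawn the portal daemon under Valgrind so that
# leaks in the C code are reported when the daemon exits. The
# binary path and the Valgrind options here are assumptions.
import subprocess


def spawn_portal(use_valgrind: bool = False) -> subprocess.Popen:
    argv = ["/usr/libexec/xdg-desktop-portal"]
    if use_valgrind:
        # --error-exitcode makes the run fail if Valgrind finds errors.
        argv = ["valgrind", "--leak-check=full", "--error-exitcode=1"] + argv
    return subprocess.Popen(argv)
```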
If you want to improve the Flatpak story, come and contribute to xdg-desktop-portal. It’s now easier than ever!