path: root/tests
Commit message (author, age, lines changed)
* backend/lint: Do not shadow certain builtins (Wolfgang Müller, 12 days ago, -17/+17)
  This commit enables ruff's flake8-builtins linter, which emits warnings when builtin functions are shadowed. This is useful for builtins like "dict", "list", or "str", which we use often. Given the nature of this program, we historically rely a lot on "id", "hash", and "filter" as variable names, which also shadow Python builtins. For now let's ignore those: we have not used any of these builtins in our code, and the impact to the codebase would be considerable. This might be revisited in the future.
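  As a sketch, enabling these checks might look like the following pyproject.toml fragment. The rule selection and ignore list mirror the commit message, but the project's actual configuration layout is an assumption:

```toml
# Sketch of a pyproject.toml fragment; option names follow ruff's
# documented configuration, the rest is assumed.
[tool.ruff.lint]
# "A" is the prefix for ruff's flake8-builtins rules.
extend-select = ["A"]

[tool.ruff.lint.flake8-builtins]
# Historically shadowed names, exempted for now.
builtins-ignorelist = ["id", "hash", "filter"]
```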
* backend/plugins: Throw error if E-Hentai response is missing 'gmetadata' (Wolfgang Müller, 12 days ago, -0/+16)
  It might be that we get a valid (possibly empty) response from the API, in which case we do not want to simply crash because we expect the 'gmetadata' field in the response. Instead, raise a proper ScrapeError for it.
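  A minimal sketch of this kind of defensive parsing, assuming a hypothetical ScrapeError exception and parse_gmetadata helper (the real names and signatures in the codebase may differ):

```python
class ScrapeError(Exception):
    """Raised when a scraper receives an unusable API response."""


def parse_gmetadata(response: dict) -> list[dict]:
    # The API can return a valid but empty JSON object; crashing with a
    # bare KeyError would be unhelpful, so raise a descriptive error.
    try:
        return response["gmetadata"]
    except KeyError:
        raise ScrapeError("E-Hentai response is missing 'gmetadata'") from None
```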
* backend/tests: Add tests for the E-Hentai API (Wolfgang Müller, 12 days ago, -0/+140)
* backend/tests: Add tests for the anchira scraper (Wolfgang Müller, 12 days ago, -0/+107)
* backend/tests: Remove unneeded parameter in test_gallery_dl (Wolfgang Müller, 12 days ago, -1/+1)
* backend/plugins: Have exhentai assume no censorship for non-h (Wolfgang Müller, 12 days ago, -2/+5)
  Non-H usually has nothing to censor, so this should be a safe default. We have not come across anything where this would have been a false positive.
* backend/tests: Add tests for gallery_dl scrapers (Wolfgang Müller, 12 days ago, -0/+646)
* backend/scraper: Have collect() ignore None results (Wolfgang Müller, 12 days ago, -0/+15)
  If a parser function returned None, we yielded it regardless, even though it has no impact further down the line. Instead, clean up the collect() stream as early as possible.
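  A sketch of the described behaviour, assuming collect() takes an iterable of parser callables (the real signature is not shown in the log):

```python
from collections.abc import Callable, Iterable, Iterator
from typing import Any


def collect(parsers: Iterable[Callable[[], Any]]) -> Iterator[Any]:
    # Drop None results as early as possible so downstream consumers
    # never have to filter them out themselves.
    for parse in parsers:
        result = parse()
        if result is not None:
            yield result
```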
* backend/tests: Add test for open_archive_file (Wolfgang Müller, 12 days ago, -1/+40)
* backend/scraper: Add parser methods for Language (Wolfgang Müller, 12 days ago, -0/+33)
  We can expect a number of scraper sources to give languages either as ISO 639-3 codes or as their English names, so it makes sense to implement a simple parser method on our side.
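  As an illustration, such parser methods might look like the following; the Language enum, its members, and the method names are all assumptions made for this sketch:

```python
import enum


class Language(enum.Enum):
    # A small subset for illustration; a real enum would cover far more.
    ENGLISH = "eng"
    JAPANESE = "jpn"
    GERMAN = "deu"

    @classmethod
    def from_iso_639_3(cls, code: str) -> "Language | None":
        # Look the member up by its ISO 639-3 value, e.g. "jpn".
        try:
            return cls(code.strip().lower())
        except ValueError:
            return None

    @classmethod
    def from_name(cls, name: str) -> "Language | None":
        # Look the member up by its English name, e.g. "Japanese".
        return cls.__members__.get(name.strip().upper())
```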
* backend/lint: Format overlong line (Wolfgang Müller, 12 days ago, -1/+3)
* backend/lint: Implement flake8-simplify suggestions (Wolfgang Müller, 13 days ago, -13/+12)
* backend/lint: Fix import formatting (Wolfgang Müller, 13 days ago, -16/+35)
* Initial commit (tag: 0.1.0) (Wolfgang Müller, 2024-03-05, -0/+6576)