path: root/tests
Commit message · Author · Age · Lines
* backend/lint: Do not shadow certain builtins (Wolfgang Müller, 8 days ago, -17/+17)

    This commit enables ruff's flake8-builtins linter, which emits warnings
    when builtin functions are shadowed. This is useful for builtins like
    "dict", "list", or "str", which we use often. Given the nature of this
    program we historically rely a lot on "id", "hash", and "filter" as
    variable names, which also shadow Python builtins. For now let's ignore
    those: we never call those builtins in our code, and fixing the shadowing
    would have a considerable impact on the codebase. This might be revisited
    in the future.
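    A minimal sketch of the kind of bug this lint guards against; the
    function and its names are hypothetical, not from this codebase:

    ```python
    # Shadowing a builtin makes it unreachable under its usual name for the
    # rest of the scope. flake8-builtins flags this as rule A002.
    def count_unique(list):
        # Inside this function, "list" is the argument, so calling the
        # builtin list() here would raise TypeError instead.
        seen = set(list)
        return len(seen)

    print(count_unique([1, 1, 2, 3]))  # → 3
    ```

    The lint catches the shadowing at review time, before it turns into a
    confusing TypeError at runtime.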
* backend/plugins: Throw error if E-Hentai response is missing 'gmetadata' (Wolfgang Müller, 8 days ago, -0/+16)

    We might get a valid (possibly empty) response from the API that lacks
    the expected 'gmetadata' field; in that case we do not want to simply
    crash. Instead, throw a proper ScrapeError.
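    A hedged sketch of the guard described above; ScrapeError exists in the
    project, but this exact function and its signature are assumptions:

    ```python
    class ScrapeError(Exception):
        """Raised when a scraper receives an unusable API response."""

    def parse_api_response(data: dict) -> list:
        # A syntactically valid (even empty) JSON response may still lack
        # the field we need; fail with a descriptive error instead of
        # letting a KeyError surface deep inside the parsing code.
        if "gmetadata" not in data:
            raise ScrapeError("E-Hentai API response is missing 'gmetadata'")
        return data["gmetadata"]
    ```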
* backend/tests: Add tests for the E-Hentai API (Wolfgang Müller, 9 days ago, -0/+140)
* backend/tests: Add tests for the anchira scraper (Wolfgang Müller, 9 days ago, -0/+107)
* backend/tests: Remove unneeded parameter in test_gallery_dl (Wolfgang Müller, 9 days ago, -1/+1)
* backend/plugins: Have exhentai assume no censorship for non-h (Wolfgang Müller, 9 days ago, -2/+5)

    Non-H usually has nothing to censor, so this should be a safe default.
    We have not come across anything where this would have been a false
    positive.
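    The default described above might look roughly like this; the function,
    category labels, and tag names are all illustrative assumptions, not the
    plugin's actual logic:

    ```python
    def is_censored(category: str, tags: set) -> bool:
        # Explicit tags always win over the category-based default.
        if "uncensored" in tags:
            return False
        # Non-H usually has nothing to censor, so assume uncensored
        # unless a tag says otherwise.
        if category == "Non-H":
            return False
        return "censored" in tags
    ```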
* backend/tests: Add tests for gallery_dl scrapers (Wolfgang Müller, 9 days ago, -0/+646)
* backend/scraper: Have collect() ignore None results (Wolfgang Müller, 9 days ago, -0/+15)

    If a parser function returns None we currently yield it regardless,
    even though it has no effect further down the line. Instead, clean up
    the collect() stream as early as possible.
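    A sketch of the idea, filtering None out of the generator at the source;
    the signature is an assumption, not the project's actual collect():

    ```python
    from typing import Callable, Iterable, Iterator

    def collect(parsers: Iterable[Callable[[], object]]) -> Iterator[object]:
        # Drop None results as early as possible so downstream consumers
        # never have to filter them out themselves.
        for parse in parsers:
            result = parse()
            if result is not None:
                yield result

    results = list(collect([lambda: "title", lambda: None, lambda: 42]))
    print(results)  # → ['title', 42]
    ```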
* backend/tests: Add test for open_archive_file (Wolfgang Müller, 9 days ago, -1/+40)
* backend/scraper: Add parser methods for Language (Wolfgang Müller, 9 days ago, -0/+33)

    We can expect a number of scraper sources to give languages either as
    ISO 639-3 codes or as their English names, so it makes sense to
    implement a simple parser method on our side.
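    One way such a parser could look; the tiny mapping and function name are
    stand-ins for illustration, not the project's actual Language type:

    ```python
    # Minimal stand-in table: ISO 639-3 code → English name.
    _LANGUAGES = {
        "eng": "English",
        "jpn": "Japanese",
        "deu": "German",
    }
    _BY_NAME = {name.lower(): name for name in _LANGUAGES.values()}

    def parse_language(value: str):
        """Accept either an ISO 639-3 code or an English name."""
        value = value.strip().lower()
        if value in _LANGUAGES:       # e.g. "jpn"
            return _LANGUAGES[value]
        return _BY_NAME.get(value)    # e.g. "Japanese"; None if unknown

    print(parse_language("jpn"))      # → Japanese
    print(parse_language("English"))  # → English
    ```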
* backend/lint: Format overlong line (Wolfgang Müller, 9 days ago, -1/+3)
* backend/lint: Implement flake8-simplify suggestions (Wolfgang Müller, 9 days ago, -13/+12)
* backend/lint: Fix import formatting (Wolfgang Müller, 9 days ago, -16/+35)
* Initial commit (tag: 0.1.0) (Wolfgang Müller, 2024-03-05, -0/+6576)