Age | Commit message | Author | Lines
---|---|---|---
3 days | backend/plugins: Throw error if E-Hentai response is missing 'gmetadata' | Wolfgang Müller | -0/+16
We might receive a valid (possibly empty) response from the API, in which case we do not want to crash simply because the 'gmetadata' field we expect is absent. Instead, throw a proper ScrapeError.
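The guard described above can be sketched as follows. This is an illustrative sketch only: `ScrapeError` and `parse_gmetadata` are assumed names, not necessarily the project's actual identifiers.

```python
# Hypothetical sketch of the missing-key guard; names are assumptions.
class ScrapeError(Exception):
    """Raised when a scraper receives an unusable API response."""

def parse_gmetadata(response: dict) -> list:
    # A syntactically valid (even empty) JSON reply may still lack the
    # expected key; raise a descriptive error instead of a bare KeyError.
    if "gmetadata" not in response:
        raise ScrapeError("E-Hentai response is missing 'gmetadata'")
    return response["gmetadata"]
```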
3 days | backend/tests: Add tests for the E-Hentai API | Wolfgang Müller | -0/+140
3 days | backend/tests: Add tests for the anchira scraper | Wolfgang Müller | -0/+107
3 days | backend/tests: Remove unneeded parameter in test_gallery_dl | Wolfgang Müller | -1/+1
3 days | backend/plugins: Have exhentai assume no censorship for non-h | Wolfgang Müller | -2/+5
Non-H usually has nothing to censor, so this should be a safe default. We have not come across anything where this would have been a false positive.
4 days | backend/tests: Add tests for gallery_dl scrapers | Wolfgang Müller | -0/+646
4 days | backend/scraper: Have collect() ignore None results | Wolfgang Müller | -0/+15
If a parser function returns None, we currently yield it regardless, even though it has no effect further down the line. Instead, clean up the collect() stream as early as possible.
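The behaviour described above amounts to filtering None out of a generator pipeline as early as possible. A minimal sketch, assuming collect() takes a set of parser callables and a raw input (the real signature may differ):

```python
from typing import Any, Callable, Iterable, Iterator, Optional

def collect(parsers: Iterable[Callable[[Any], Optional[Any]]],
            raw: Any) -> Iterator[Any]:
    # Run every parser over the raw input and drop None results
    # immediately, so downstream consumers never have to re-check.
    for parse in parsers:
        result = parse(raw)
        if result is not None:
            yield result
```

A parser that finds nothing simply disappears from the stream: with parsers for "title" and "artist" but input containing only a title, the second parser's None is filtered out.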
4 days | backend/tests: Add test for open_archive_file | Wolfgang Müller | -1/+40
4 days | backend/scraper: Add parser methods for Language | Wolfgang Müller | -0/+33
We can expect a number of scraper sources to give languages either as ISO 639-3 codes or as their English names, so it makes sense to implement a simple parser method on our side.
4 days | backend/lint: Format overlong line | Wolfgang Müller | -1/+3
4 days | backend/lint: Implement flake8-simplify suggestions | Wolfgang Müller | -13/+12
4 days | backend/lint: Fix import formatting | Wolfgang Müller | -16/+35