path: root/src
Age | Commit message | Author | Lines
3 days | backend/plugins: Have E-Hentai raise an error from status_code early | Wolfgang Müller | -19/+19
This makes the code clearer and saves a whole indentation level.
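A minimal sketch of the early-raise pattern this describes, assuming the plugin talks to the API with the requests library; ScrapeError stands in for the project's own error type:

    import requests

    class ScrapeError(Exception):
        """Stand-in for the project's scrape error type."""

    def fetch_gallery_metadata(url: str, payload: dict) -> dict:
        response = requests.post(url, json=payload, timeout=10)
        # Raising on a bad status code up front keeps the happy path at the
        # top indentation level instead of nesting it inside an 'if' block.
        if response.status_code != 200:
            raise ScrapeError(f"E-Hentai API returned HTTP {response.status_code}")
        return response.json()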
3 days | backend/plugins: Catch E-Hentai errors only for relevant lines | Wolfgang Müller | -11/+11
3 days | backend/plugins: Throw error if E-Hentai response is missing 'gmetadata' | Wolfgang Müller | -0/+2
We might get a valid (possibly empty) response from the API that is nevertheless missing the 'gmetadata' field we expect, and in that case we do not want to simply crash. Instead, throw a proper ScrapeError for it.
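A sketch of the guard this commit adds, under the same assumptions as above (ScrapeError is a stand-in; the exact response shape is assumed):

    class ScrapeError(Exception):
        """Stand-in for the project's scrape error type."""

    def extract_gmetadata(data: dict) -> list:
        # A valid (possibly empty) API response may still lack the
        # 'gmetadata' list we rely on; raise a descriptive error instead of
        # crashing with a KeyError further down the line.
        if "gmetadata" not in data:
            raise ScrapeError("E-Hentai response is missing 'gmetadata'")
        return data["gmetadata"]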
3 days | backend/plugins: Remove stray apostrophe | Wolfgang Müller | -1/+1
3 days | backend/plugins: Have exhentai assume no censorship for non-h | Wolfgang Müller | -0/+3
Non-H usually has nothing to censor, so this should be a safe default. We have not come across anything where this would have been a false positive.
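A rough illustration of that default; the Censorship enum, tag names, and category label below are assumptions, not the project's actual identifiers:

    from __future__ import annotations

    from enum import Enum

    class Censorship(Enum):
        NONE = "none"
        BAR = "bar"
        MOSAIC = "mosaic"

    def guess_censorship(category: str, tags: list[str]) -> Censorship | None:
        # Explicit censorship tags still win; otherwise Non-H usually has
        # nothing to censor, so 'none' is a safe default for that category.
        if "bar censorship" in tags:
            return Censorship.BAR
        if "mosaic censorship" in tags:
            return Censorship.MOSAIC
        if category.lower() == "non-h":
            return Censorship.NONE
        return None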
4 days | backend/plugins: Use language parser from scraper utils | Wolfgang Müller | -32/+5
Now that we have this in our utility suite, we can make use of it in the built-in scraper plugins. This increases coverage and removes a lot of duplicate code.
4 days | backend/scraper: Have collect() ignore None results | Wolfgang Müller | -3/+6
If a parser function returns None, we yield it regardless, even though it has no impact further down the line. Instead, clean up the collect() stream as early as possible.
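A simplified sketch of a collect() with the None filter applied at the source; the real signature is an assumption:

    from collections.abc import Callable, Iterable, Iterator
    from typing import Any

    def collect(parsers: Iterable[Callable[[dict], Any]], data: dict) -> Iterator[Any]:
        # Drop None right here so nothing downstream has to special-case
        # parsers that found no value.
        for parser in parsers:
            result = parser(data)
            if result is not None:
                yield result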
4 days | backend/tests: Add test for open_archive_file | Wolfgang Müller | -1/+1
4 days | backend/plugins: Use "no cover" pragma for consistency | Wolfgang Müller | -1/+1
4 days | backend/plugins: Fix MangaDex scraper title formatting | Wolfgang Müller | -1/+1
4 days | backend/scraper: Add parser methods for Language | Wolfgang Müller | -0/+32
We can expect a number of scraper sources to give languages either as ISO 639-3 codes or as their English names, so it makes sense to implement a simple parser method on our side.
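A minimal sketch of such a parser, accepting either an ISO 639-3 code or the English name; the Language enum and its members are illustrative assumptions:

    from __future__ import annotations

    from enum import Enum

    class Language(Enum):
        ENGLISH = "eng"
        JAPANESE = "jpn"
        GERMAN = "deu"

    # Look up case-insensitively by ISO 639-3 code or by English name.
    _BY_CODE = {lang.value: lang for lang in Language}
    _BY_NAME = {lang.name.lower(): lang for lang in Language}

    def parse_language(value: str) -> Language | None:
        needle = value.strip().lower()
        return _BY_CODE.get(needle) or _BY_NAME.get(needle)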
4 days | backend/lint: Ignore B027 in api/inputs.py | Wolfgang Müller | -1/+1
Even though our base class here is abstract, this method is not, so we can ignore B027 [1].
[1] https://docs.astral.sh/ruff/rules/empty-method-without-abstract-decorator/
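For context, B027 flags an empty method that lives in an abstract base class without being marked abstract. Here the empty body is an intentional optional hook, so the rule is silenced; the class below is a hypothetical sketch, not the actual contents of api/inputs.py:

    from abc import ABC, abstractmethod

    class Input(ABC):
        @abstractmethod
        def parse(self, raw: str) -> object:
            ...

        def validate(self) -> None:  # noqa: B027
            """Optional hook; subclasses may override to add validation."""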
4 days | backend/lint: Properly chain exceptions | Wolfgang Müller | -6/+6
This fixes flake8-bugbear's B904 [1].
[1] https://docs.astral.sh/ruff/rules/raise-without-from-inside-except/
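The usual shape of a B904 fix, as a generic sketch (the concrete exception types in this codebase are assumptions):

    class ScrapeError(Exception):
        """Stand-in for the project's scrape error type."""

    def parse_page_count(raw: str) -> int:
        try:
            return int(raw)
        except ValueError as exc:
            # 'raise ... from exc' keeps the original traceback attached,
            # which is exactly what B904 asks for.
            raise ScrapeError(f"invalid page count: {raw!r}") from exc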
4 days | backend/plugins: Have anchira scraper use parse_dict from scraper utils | Wolfgang Müller | -11/+2
This cuts down on code duplication and also fixes B023 [1].
[1] https://docs.astral.sh/ruff/rules/function-uses-loop-variable/#function-uses-loop-variable-b023
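A rough sketch of what a parse_dict-style helper can look like; the name comes from the commit, but the signature and field names here are assumptions. Driving the parsing from a static mapping means no closure is ever defined inside a loop, which is the pattern B023 warns about:

    from collections.abc import Callable, Iterator
    from typing import Any

    def parse_dict(parsers: dict[str, Callable[[Any], Any]], data: dict) -> Iterator[Any]:
        # Each field is handled by a parser looked up from the mapping, so no
        # lambda has to capture a loop variable.
        for key, parser in parsers.items():
            if key in data:
                yield parser(data[key])

    parsers = {"title": str.strip, "pages": int}
    results = list(parse_dict(parsers, {"title": " Example ", "pages": "24"}))
    # -> ['Example', 24]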
4 days | backend/lint: Stop using mutable objects as function argument defaults | Wolfgang Müller | -3/+12
See https://docs.astral.sh/ruff/rules/mutable-argument-default/
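The usual before/after shape of this fix, shown generically:

    from __future__ import annotations

    # Before: the single default list is shared across every call.
    def add_tag_shared(tag: str, tags: list[str] = []) -> list[str]:
        tags.append(tag)
        return tags

    # After: default to None and create a fresh list per call.
    def add_tag(tag: str, tags: list[str] | None = None) -> list[str]:
        if tags is None:
            tags = []
        tags.append(tag)
        return tags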
4 days | backend/lint: Implement pyupgrade suggestions | Wolfgang Müller | -1/+1
4 days | backend/lint: Implement flake8-simplify suggestions | Wolfgang Müller | -19/+17
4 days | backend/scraper: Bind loop variables correctly | Wolfgang Müller | -2/+2
This was uncovered by bugbear, but did not seem to have tripped our test. Fix it anyway.
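The classic form of this bug and its fix, sketched generically (the scraper's actual callables are assumptions):

    # Buggy: every lambda closes over 'field' and only ever sees its final value.
    handlers_buggy = [lambda data: data[field] for field in ("artist", "circle")]

    # Fixed: bind the current value as a default argument at definition time.
    handlers = [lambda data, field=field: data[field] for field in ("artist", "circle")]

    sample = {"artist": "A", "circle": "C"}
    print([h(sample) for h in handlers])  # ['A', 'C']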
4 days | backend/lint: Fix import formatting | Wolfgang Müller | -1/+2
2024-03-25 | backend: Report Archive size as float | Wolfgang Müller | -1/+1
GraphQL integers are 32-bit as per spec [1] [2]. Implementations may therefore error on large numbers. Since an archive's size can reasonably exceed this value, make sure to report it as a float instead.
[1] https://graphql.org/learn/schema/
[2] https://github.com/graphql/graphql-js/issues/292#issuecomment-186702763
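GraphQL's Int caps out at 2^31 - 1 (2,147,483,647, i.e. about 2 GiB when counting bytes), so a large archive overflows it. A sketch of reporting the size as a Float instead, assuming a Strawberry-style schema; the GraphQL library and type layout here are assumptions:

    import os

    import strawberry

    @strawberry.type
    class Archive:
        path: str

        @strawberry.field
        def size(self) -> float:
            # float maps to GraphQL Float, which comfortably holds multi-GiB
            # byte counts that would overflow a 32-bit Int.
            return float(os.path.getsize(self.path))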