
Memory leak? #118

Open
vonProteus opened this issue Aug 24, 2024 · 10 comments
Labels
bug Something isn't working

Comments

@vonProteus
Contributor

vonProteus commented Aug 24, 2024

What happened?

While running the app, it consumes ~8.4 GB of RAM after ~15 minutes.

What did you expect to happen?

I think the RAM usage is a little excessive, and I didn't notice any reduction in RAM usage, only increases.

OS version

docker

App version

mansuf/mangadex-downloader@sha256:33312a64791de9434e6f2fcd2f423862e667a75fa3d30f47fa912ba0417cdac2

Installation origin

Other

Installation origin (other sources)

docker mansuf/mangadex-downloader

Reproducible command

args: --language en --save-as cbz --delay-requests "1.5" --force-http --progress-bar-layout=none /tmp/mangadex-urls.txt

Additional context

mangadex-urls.txt contains ~50 links, and I already have ~4000 cbz files (~50 GB).
This is a subsequent run (when I check for new parts); in this particular run I didn't get any new parts.

vonProteus added the bug label Aug 24, 2024
@vonProteus
Contributor Author

Is this normal?

@mansuf
Owner

mansuf commented Aug 24, 2024

Did the memory increase slowly, or did it jump to 8 GB immediately? And you said this was only for checking new chapters, right? So I assume there was no download involved, only checking files.

@vonProteus
Contributor Author

It increased slowly, and there were no new chapters after the cron job completed.

@mansuf
Owner

mansuf commented Aug 24, 2024

Can you give me your mangadex-urls.txt, so I can replicate this on my device?

@vonProteus
Contributor Author

vonProteus commented Aug 24, 2024

I'd rather not.
I don't want to share all my links.

Sorry.

@mansuf
Owner

mansuf commented Aug 24, 2024

No problem, keep the good stuff in there ( ͡° ͜ʖ ͡°)

Anyway, I don't know what the exact problem is, but there are some possibilities.

The first is that when verifying files, some file pointers may not be closed properly.
The second is the cache, usually filled when fetching user, author, groups, and cover_art. This cache exists to prevent repeated requests to the MangaDex API. (A rough sketch of both ideas follows.)
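
To illustrate what I mean (this is only a generic sketch with made-up names, not the actual mangadex-downloader code):

import hashlib
from functools import lru_cache

# Suspect 1: a file handle opened for verification but never closed.
# A "with" block guarantees the handle is released even if hashing fails.
def file_sha256(path: str) -> str:
    with open(path, "rb") as fp:  # closed automatically when the block exits
        return hashlib.sha256(fp.read()).hexdigest()

# Suspect 2: an unbounded in-memory cache of API objects (user, author,
# groups, cover_art). A plain dict grows for the whole run; a bounded cache
# such as lru_cache keeps memory flat at the cost of occasional re-fetches.
@lru_cache(maxsize=1024)
def fetch_author(author_id: str) -> dict:
    return {"id": author_id}  # placeholder for a MangaDex API request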

I can test downloading dozens of popular manga in cbz format and see if this problem can be replicated. But downloading those takes time, sadly, so it will take a while to reproduce the problem. While waiting, you can contribute too by running a memory profiler against the application; if you find the problem, you can report it here and I will fix it 👍.

@vonProteus
Contributor Author

How do I run a memory profiler?

@mansuf
Owner

mansuf commented Aug 24, 2024

You can use a tool like guppy3 to run a memory profiler.
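
A minimal session looks roughly like this (assuming pip install guppy3; place the calls around the code you want to measure):

from guppy import hpy  # provided by the guppy3 package

hp = hpy()
hp.setrelheap()   # measure growth relative to this point

# ... run the suspected code, e.g. the chapter-checking loop ...

print(hp.heap())  # prints live objects grouped by type and size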

@idontwanttosayaword

This comment was marked as resolved.

@mansuf
Owner

mansuf commented Sep 9, 2024

I cannot reproduce memory usage increasing to 1 GB or more on my device. But I did find an issue that causes memory to increase by 50-100 MB, not 1 GB or more: SQL migrations keep running indefinitely. Maybe installing this patch (61ec2d7) will fix the issue? Let me know if this patch doesn't fix it.

pip uninstall mangadex-downloader

pip install git+https://github.com/mansuf/mangadex-downloader.git@61ec2d7d04899c59e97a84044d1098ad8078c17e
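
For context, the general idea of the fix (just a generic SQLite illustration, not the actual code in 61ec2d7) is that migrations should run once per schema version instead of on every startup:

import sqlite3

SCHEMA_VERSION = 2  # hypothetical target version for this sketch

def migrate(conn: sqlite3.Connection) -> None:
    current = conn.execute("PRAGMA user_version").fetchone()[0]
    if current >= SCHEMA_VERSION:
        return  # already up to date, so nothing runs again
    if current < 1:
        conn.execute("CREATE TABLE IF NOT EXISTS files (path TEXT PRIMARY KEY)")
    if current < 2:
        conn.execute("ALTER TABLE files ADD COLUMN hash TEXT")
    conn.execute(f"PRAGMA user_version = {SCHEMA_VERSION}")
    conn.commit()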

I'm using memory-profiler to watch memory usage. The results can be seen below.

Command used:

python -m mangadex_downloader "https://mangadex.org/title/32d76d19-8a05-4db0-9fc2-e0b0648fe9d0"

Before patch

[memory-profiler graph: memory usage over time]

After patch

[memory-profiler graph: memory usage over time]
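
If anyone wants to produce the same kind of graph, the usual memory-profiler workflow looks roughly like this (mprof plot needs matplotlib):

pip install memory-profiler matplotlib

mprof run python -m mangadex_downloader "https://mangadex.org/title/32d76d19-8a05-4db0-9fc2-e0b0648fe9d0"

mprof plot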
