Update to include streaming support for larger JSON files #29
Comments
This is sorely needed. There needs to be an officially sanctioned, maintained backup-restoration solution. Our database just went past the 250 MB threshold, and the ~30 seconds/MB speed of the streaming import tool is extremely painful. I can't imagine the pain of the OP dealing with a 6 GB database. @mikelehen Is there any action happening inside Firebase to update these tools?
Edit: It appears the current master branch has streaming support implemented. It just needs a version published to npm and an update to the README.
Yeah, sorry, neither of these tools is great for handling large amounts of data. If you need a full backup restore or similar, your best bet may be to reach out to Firebase Support. That said, PRs to this project to improve its performance / usefulness are welcome.
Even just publishing the current master as an npm release would be helpful.
Done. Sorry for the delay on that!
@mm-gmbd The Python firebase-streaming-import package that the README here points to actually seems to be abandoned, and it has memory leaks. I'll open an issue asking for it to be removed from the docs here. I solved my problem by splitting one big JSON file into multiple smaller files and then looping over each smaller one with bash; a sketch of the idea follows below. The challenge then is to design a JSON structure and import process that doesn't overwrite itself after each import step. See my question/answer on Stack Overflow about how to load multiple files without overwriting the previous ones.
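For illustration, here is a rough sketch of that split-then-loop idea. It is not the commenter's actual script: the file names, chunk size, and the assumption that the export's top-level keys can simply be partitioned are all illustrative.

```js
// split-export.js -- split a large JSON export into smaller chunk files, each
// holding a disjoint subset of the top-level keys, so each chunk can be
// imported in its own pass. Note: this still parses the whole file once; for a
// truly huge export the split step itself would need a streaming parser.
const fs = require('fs');

const INPUT = 'big-export.json';   // illustrative file name
const KEYS_PER_CHUNK = 500;        // illustrative chunk size

const data = JSON.parse(fs.readFileSync(INPUT, 'utf8'));
const keys = Object.keys(data);

for (let i = 0; i < keys.length; i += KEYS_PER_CHUNK) {
  const chunk = {};
  for (const key of keys.slice(i, i + KEYS_PER_CHUNK)) {
    chunk[key] = data[key];
  }
  const out = `chunk-${Math.floor(i / KEYS_PER_CHUNK)}.json`;
  fs.writeFileSync(out, JSON.stringify(chunk));
  console.log(`wrote ${out} (${Object.keys(chunk).length} keys)`);
}
```

Each chunk file can then be imported in a shell loop. Because the chunks hold disjoint top-level keys, each pass only needs to write under its own children, but whether a given import tool merges or replaces data at the target path is exactly the overwrite question the Stack Overflow discussion deals with, so check the tool's behavior before relying on this.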
@mikelehen - support is still directing people to the broken firebase-streaming-import. What's the point of having backups if you can't restore from them?
Happy to look at introducing a parallel / multi-threaded approach to tackle this. How many users would this solve a problem for? (It doesn't look like many people face the issue, or at least not many report it.)
Parallelism doesn't really help. You hit artificial rate limits pretty quickly.
If it's a problem, we can try to find a proper, working solution. What's the alternative?
@siege-nnn Firebase support followed up with me after I posted in their Google Group, and they attached a form I needed to fill out for them to do the restore.
The README suggests using firebase-streaming-import for files larger than 250 MB. The database I'd like to import is well over that threshold (6 GB+), and at ~30 seconds/MB the import works out to somewhere around 50+ hours (and it has to be run twice). Also, firebase-streaming-import hasn't been updated in almost three years.
I'm sure this could be improved, and streaming support should be included as part of this package using Node streams (I'd be surprised if I were the only 250 MB+ user...).
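As a minimal sketch of the kind of Node-streams approach suggested here, the snippet below leans on the third-party stream-json and stream-chain npm packages (an assumption; they are not part of this project), and writeChild is a hypothetical callback standing in for whatever actually writes each top-level child to the database.

```js
// stream-import-sketch.js -- walk a huge JSON export one top-level child at a
// time so the whole file never has to be held in memory.
const fs = require('fs');
const { chain } = require('stream-chain');            // npm: stream-chain
const { parser } = require('stream-json');            // npm: stream-json
const { streamObject } = require('stream-json/streamers/StreamObject');

// writeChild(key, value) is a hypothetical function supplied by the caller;
// a real implementation would also need back-pressure and rate limiting,
// given the rate limits mentioned earlier in this thread.
function importLargeJson(file, writeChild) {
  return new Promise((resolve, reject) => {
    const pipeline = chain([
      fs.createReadStream(file),
      parser(),
      streamObject(),   // emits { key, value } for each top-level child
    ]);

    pipeline.on('data', ({ key, value }) => writeChild(key, value));
    pipeline.on('end', resolve);
    pipeline.on('error', reject);
  });
}

// Example usage (the database write itself is intentionally left out):
// importLargeJson('big-export.json', (key, value) => console.log(key));
```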