Current behavior
Node has an internal string length limit, meaning that any data larger than 2GB will fail with a string length error when we run JSON.parse here: https://github.com/LokiJS-Forge/LokiDB/blob/master/packages/loki/src/loki.ts#L687

Expected behavior
Any size of data will eventually be serialized without error.

What is the motivation / use case for changing the behavior?
Allow batch inserts of large quantities of data (greater than 2GB).

Environment
LokiDB version: 2.1.0
Node: v14.17.0
Others:
- We could add an explicit batch length, like knex does in its `batchInsert` method: https://knexjs.org/
- We could batch inserts internally with an arbitrary value that can be overridden in the DB config, e.g. batches of 10,000. (Rough sketches of both ideas follow below.)
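To make the "batch larger serialisations" idea concrete, here is a minimal, hypothetical sketch (this is not LokiDB's actual persistence code; the function and parameter names are made up): the data is written out in fixed-size chunks of documents, so no single JSON.stringify call has to produce a string for the entire multi-gigabyte data set, and loading can JSON.parse one chunk per line.

```typescript
import { createWriteStream } from "fs";

// Illustrative only: write documents in chunks so no single JSON.stringify
// (or later JSON.parse) has to handle the whole >2GB payload at once.
function saveInChunks(docs: object[], path: string, chunkSize = 10_000): Promise<void> {
  return new Promise((resolve, reject) => {
    const out = createWriteStream(path);
    out.on("error", reject);
    for (let i = 0; i < docs.length; i += chunkSize) {
      // One JSON document per line, each covering at most `chunkSize` records.
      out.write(JSON.stringify(docs.slice(i, i + chunkSize)) + "\n");
    }
    out.end(() => resolve());
  });
}
```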
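And a sketch of the second suggestion, assuming a collection object whose insert method accepts an array (the helper and its chunkSize default are illustrative, not part of LokiDB's API):

```typescript
// Illustrative helper: split a large insert into fixed-size batches, with the
// batch size overridable (e.g. from DB config). Not LokiDB's actual API.
function insertInBatches<T extends object>(
  collection: { insert(docs: T[]): unknown },
  docs: T[],
  chunkSize = 10_000
): void {
  for (let i = 0; i < docs.length; i += chunkSize) {
    collection.insert(docs.slice(i, i + chunkSize));
  }
}
```

For comparison, knex exposes the chunk size directly to the caller as `knex.batchInsert(tableName, rows, chunkSize)`.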
elmarti changed the title from "Batch larger FS writes" to "Batch larger serialisations" on Sep 22, 2021.
@lustremedia I attempted it, but it's quite fundamental to how Loki works, so it would be a pretty significant change. Also note that this project is somewhat unmaintained (see #190 (comment)). I can recommend the successor that I wrote, which doesn't have this issue: https://github.com/elmarti/camadb. The main caveat is the lack of indexing at this point (planned for the near future), although it performs pretty well without it.