
How to process requests evenly with RateLimiterQueue? #112

Closed
sam-lord opened this issue May 6, 2021 · 11 comments
Labels
question Further information is requested

Comments

sam-lord commented May 6, 2021

I'm sending requests to a heavily rate-limited service but want them all to get through eventually, so I have a RateLimiterQueue with the default maximum size, wrapping a RateLimiterMySQL which is using the Sequelize backend for MySQL.

The issue is that whilst we have configured the service to have 10 points per hour, only 5 requests are being executed per hour.

I have the limiter set with the following config (as well as database specific config):

{
  "duration": 3600,
  "points": 10,
  "execEvenly": true
}

And the queue wrapping the limiter has no additional configuration. Is this a known issue?

EDIT: use the example from the Consume points evenly article. At the time of this edit, the execEvenly option doesn't work with RateLimiterQueue.

sam-lord (Author) commented May 6, 2021

Also - this possibly should be a separate issue, but if I add items to the queue very quickly, it waits the correct amount of time before releasing the first item, but then immediately releases all of the other items afterwards, when each should wait until the previous item is released plus at least the minimum delay after that.

sam-lord (Author) commented May 6, 2021

I've tried execEvenly with RateLimiterMemory and also encountered the issue where it executes all requests after the first one simultaneously.

animir added the bug (Something isn't working) label May 6, 2021
animir (Owner) commented May 6, 2021

Hi @sam-lord, thanks for reporting this issue.
It's an interesting case; I have never tried these two in conjunction.

First of all, you can avoid the execEvenly option if you change the limiter options slightly:

{
  "duration": 360,
  "points": 1
}

If you use the above config with RateLimiterQueue, it should work as you expect: 1 request is processed every 6 minutes and all other requests are queued. It is better than the execEvenly option, as it gives you a predictable traffic pattern.
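The "1 point per short duration" idea above can be sketched without the library as pure timestamp math. This is a minimal illustration (the function name `releaseTimes` is hypothetical, not part of rate-limiter-flexible): with `{ points: 1, duration: 360 }`, queued requests are released one fixed interval apart.

```javascript
// Sketch: when would each queued request be released under a
// { points, duration } config where the queue drains one point at a time?
function releaseTimes(requestCount, durationSec, points) {
  // One request is allowed per this many milliseconds
  const intervalMs = Math.ceil(durationSec / points) * 1000;
  const times = [];
  for (let i = 0; i < requestCount; i++) {
    times.push(i * intervalMs); // ms offset from the first request
  }
  return times;
}

console.log(releaseTimes(3, 360, 1)); // → [0, 360000, 720000] (every 6 minutes)
```

Note that `{ duration: 3600, points: 10 }` yields the same 360-second spacing only if something enforces the per-point interval, which is exactly what the single-point config guarantees.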

Secondly, execEvenly is a tricky thing; you can read more about it here. I think there is a bug in the result returned from a limiter. When any limiter is created with execEvenly and a request is delayed, it is delayed with setTimeout, but msBeforeNext is not updated after the timeout. RateLimiterQueue relies on msBeforeNext to decide how a request should be processed or queued. Note: this is just an idea, I have not tested it.

sam-lord (Author) commented May 6, 2021

Thanks for the response. Unfortunately we need to use points > 1 -- my use-case is a certificate management system which issues new certificates with Let's Encrypt, but there is a renewal process happening outside my control. When renewals happen, I use the penalty method to remove a point from the rate limiter.

I have been looking at the way execEvenly is calculated, and I believe it could be fixed with something like this:

// Delay grows with the number of points already used in the window
const usedPoints = this.points - res.remainingPoints;
const delay = Math.max(1, usedPoints) * this.execEvenlyMinDelayMs;
setTimeout(resolve, delay, res);

As the number of used points increases, the delay before resolving increases, lining up with the correct interval. I'm not sure I have fully understood the current code, though; the existing algorithm has gone over my head a little.
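To make the proposed formula concrete, here is a standalone sketch of just the delay arithmetic (the wrapper function `proposedDelay` is illustrative, not library code). Assuming 10 points per hour and a minimum delay of 3600000 / 10 = 360000 ms, the delay scales with how many points have been consumed:

```javascript
// Sketch of the delay formula proposed above, extracted as a pure function.
// totalPoints:      configured points per window (e.g. 10)
// remainingPoints:  res.remainingPoints from the limiter result
// execEvenlyMinDelayMs: minimum spacing between requests (e.g. 360000)
function proposedDelay(totalPoints, remainingPoints, execEvenlyMinDelayMs) {
  const usedPoints = totalPoints - remainingPoints;
  return Math.max(1, usedPoints) * execEvenlyMinDelayMs;
}

console.log(proposedDelay(10, 9, 360000)); // → 360000  (1 point used: wait one interval)
console.log(proposedDelay(10, 5, 360000)); // → 1800000 (5 points used: wait five intervals)
```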

sam-lord (Author) commented May 6, 2021

@animir I've got a fix that works for me - you might be interested. It's a new rate limiter type that wraps RateLimiterUnion, providing a full RateLimiterAbstract subclass that uses two limiters under the hood: a main one, which uses the exact config provided, and an "interval" one, which uses a single point with a duration of Math.ceil(duration / points) - and that handles the even spreading for us.

const {RateLimiterUnion, RateLimiterMySQL, RateLimiterMemory} = require('rate-limiter-flexible');
const RateLimiterAbstract = require('rate-limiter-flexible/lib/RateLimiterAbstract');

module.exports = class RateLimiterCustom extends RateLimiterAbstract {
    constructor(options = {}, callback) {
        super(options);

        // Fall back to the in-memory limiter when no MySQL client is provided
        const limiterType = (options.storeClient)
            ? RateLimiterMySQL
            : RateLimiterMemory;

        // Main limiter: enforces the overall quota (e.g. 10 points per hour)
        const mainLimiter = new limiterType(Object.assign({}, options, {
            keyPrefix: 'main'
        }), callback);

        // Interval limiter: 1 point per (duration / points) seconds,
        // which spreads consumption evenly across the window
        const intervalLimiter = new limiterType(Object.assign({}, options, {
            keyPrefix: 'interval',
            points: 1,
            duration: Math.ceil(this.duration / this.points)
        }));

        this._unionLimiter = new RateLimiterUnion(mainLimiter, intervalLimiter);
        this._mainLimiter = mainLimiter;
    }

    async consume(key, points = 1, options = {}) {
        // Consume from the union limiter and error on a single failure - return result from main limiter if possible
        try {
            const {main, interval} = await this._unionLimiter.consume(key, points, options);
            main.msBeforeNext = Math.max(main.msBeforeNext, interval.msBeforeNext);
            return main;
        } catch (result) {
            if (result instanceof Error) {
                throw result;
            }
            const {main, interval} = result;
            if (main && interval) {
                main.msBeforeNext = Math.max(main.msBeforeNext, interval.msBeforeNext);
                throw main;
            } else if (main) {
                throw main;
            } else {
                throw interval;
            }
        }
    }

    penalty(key, points = 1) {
        return this._mainLimiter.penalty(key, points);
    }

    reward(key, points = 1) {
        return this._mainLimiter.reward(key, points);
    }

    get(key) {
        return this._mainLimiter.get(key);
    }

    set(key, points, secDuration) {
        return this._mainLimiter.set(key, points, secDuration);
    }

    block(key, secDuration) {
        return this._mainLimiter.block(key, secDuration);
    }

    delete(key) {
        return this._mainLimiter.delete(key);
    }
};

animir (Owner) commented May 7, 2021

@sam-lord It's an interesting solution 👍 Note that the Union limiter consumes from all limiters in the union on every request. So while the interval limiter is rejecting, the Union will still be consuming points from the main limiter.

I'll describe how I understand your case; please correct me if I get it wrong.

There are 10 points per hour. A point can be consumed every 6 minutes; if somebody tries to consume more frequently, that attempt must be rejected. Points are sometimes penalized as well, so not every hour is able to process 10 requests.
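The union pitfall noted above can be sketched without the library. This is a simplified model (no window reset, hypothetical function names) showing that if both buckets are charged unconditionally, attempts rejected by the interval check still drain the main quota:

```javascript
// Sketch of the "union consumes from all limiters" pitfall:
// the main bucket is charged on every attempt, even when the
// interval check rejects the request.
function makeUnionStyleLimiter(points, durationSec) {
  const intervalSec = Math.ceil(durationSec / points);
  let mainUsed = 0;
  let lastIntervalStart = -Infinity;
  return function tryConsume(nowSec) {
    mainUsed++; // charged unconditionally, like consuming in parallel
    const intervalOk = nowSec - lastIntervalStart >= intervalSec;
    if (intervalOk) lastIntervalStart = nowSec;
    return intervalOk && mainUsed <= points;
  };
}

const union = makeUnionStyleLimiter(10, 3600);
union(0);                               // passes, but...
for (let t = 1; t <= 9; t++) union(t);  // 9 rapid rejected attempts still charge main
console.log(union(360));                // → false: main quota drained by the rejects
```

Consuming sequentially (interval first, main only on success) avoids this, which is what the final version below the thread does.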

sam-lord (Author) commented May 7, 2021

Ahh - thanks for mentioning that. I might reward limiters that didn’t reject in my catch block to fix the number of points.

I think you’ve described the use case perfectly, yes.

animir (Owner) commented May 7, 2021

@sam-lord Ok, you can try removing Union from your custom limiter. Consume points in sequence, not in parallel: if the interval limiter has a point to consume, only then try to consume 1 point from the main limiter. I hope this helps.

Do you mind if I rename this issue? The current title may not reflect the nature of the execEvenly with RateLimiterQueue bug.

sam-lord (Author) commented May 7, 2021

Do you mind if I rename this issue?

Not at all, please go ahead

Consume points in sequence, not in parallel. If the interval limiter has a point to consume, only then try to consume 1 point from the main limiter. I hope this helps.

Thanks very much for the advice, I'll drop another note if I get all this working. You have been an amazing help.

sam-lord (Author) commented May 7, 2021

This is all working perfectly. The code is really simple now:

const {RateLimiterMySQL, RateLimiterMemory} = require('rate-limiter-flexible');
const RateLimiterAbstract = require('rate-limiter-flexible/lib/RateLimiterAbstract');

module.exports = class RateLimiterCustom extends RateLimiterAbstract {
    constructor(options = {}, callback) {
        super(options);

        const limiterType = (options.storeClient)
            ? RateLimiterMySQL
            : RateLimiterMemory;

        const mainLimiter = new limiterType(Object.assign({}, options, {
            keyPrefix: 'main'
        }), callback);

        // Interval limiter: 1 point per (duration / points) seconds,
        // spreading consumption evenly across the main window
        const intervalLimiter = new limiterType(Object.assign({}, options, {
            keyPrefix: 'interval',
            points: 1,
            duration: Math.ceil(this.duration / this.points)
        }));

        this._mainLimiter = mainLimiter;
        this._intervalLimiter = intervalLimiter;
    }

    async consume(key, points = 1, options = {}) {
        // Consume sequentially: only charge the main limiter once the
        // interval limiter confirms enough time has passed
        await this._intervalLimiter.consume(key, points, options);
        return await this._mainLimiter.consume(key, points, options);
    }

    penalty(key, points = 1) {
        return this._mainLimiter.penalty(key, points);
    }

    reward(key, points = 1) {
        return this._mainLimiter.reward(key, points);
    }

    get(key) {
        return this._mainLimiter.get(key);
    }

    set(key, points, secDuration) {
        return this._mainLimiter.set(key, points, secDuration);
    }

    block(key, secDuration) {
        return this._mainLimiter.block(key, secDuration);
    }

    delete(key) {
        return this._mainLimiter.delete(key);
    }
};

Thanks once again for the help with this - hopefully it ends up being useful to others with this type of use-case.

animir (Owner) commented May 8, 2021

@sam-lord You're welcome. It looks nice.

There is a separate issue for execEvenly option bug: #113

And I created a knowledge base article with slightly changed example: Consume points evenly

animir closed this as completed May 8, 2021
animir changed the title from "Performance issue with RateLimiterQueue and execEvenly" to "How to process requests evenly with RateLimiterQueue?" May 8, 2021
animir added the question (Further information is requested) label and removed the bug (Something isn't working) label May 8, 2021