Maximum # network calls

Hi dev,

In my app, the server will at times run several loops that are heavy on network.request calls, on behalf of each user.

I used to do this on the clients, but the server is faster and more secure.

My concern, and my question, is this: if many users trigger these calls at practically the same time, is there a rate limit on network.request from the server side out onto the web, like there is on the client?

If there is, I may have to rethink my server operations.

anaqim

Hi,

There is no limit on requests server-side; Nginx will handle them as fast as possible.

In your loop I assume you are hitting the same endpoint?

-dev

Yes I am, but I am not getting any error from it. Instead, when I push a lot of calls through, I get some failure that I cannot pinpoint the reason for, only that it happens when I “overload” the system.

Perhaps this makes sense to you?

2018/02/10 17:31:33 [error] 1944#0: *4265 lua entry thread aborted: runtime error: /home/coronium/projects/tracker/main.lua:355: attempt to get length of field 'items' (a nil value)
stack traceback:
coroutine 0:
/home/coronium/projects/tracker/main.lua: in function 'requestmore'
/home/coronium/projects/tracker/main.lua:387: in function 'request'
/home/coronium/projects/tracker/main.lua:398: in function 'request'
/home/coronium/projects/tracker/main.lua:398: in function 'request'
/home/coronium/projects/tracker/main.lua:398: in function 'request'
/home/coronium/projects/tracker/main.lua:398: in function 'request'
/home/coronium/projects/tracker/main.lua:398: in function 'request'
/home/coronium/projects/tracker/main.lua:398: in function 'request'
/home/coronium/projects/tracker/main.lua:398: in function 'request'
/home/coronium/projects/tracker/main.lua:398: in function 'request'
/home/coronium/projects/tracker/main.lua:402: in function
[string "coronium.input"]: in function 'request'
content_by_lua(coronium.locations.conf:119):2: in function <content_by_lua(coronium.locations.conf:119):1>, client: X, server: , request: "POST / HTTP/1.1", host: "X:10001"

The code that triggers the error above looks like this; it runs fine when I am not overloading the calls:

local resp,err = core.network.getJson(url,headers)
if not response then
  core.log(err)
  return err
end
for j=1,#resp.items do --< this is the line that triggers the error

OMG, I see it now: resp vs response. The failed request never gets caught by that check, which means I am probably hitting the remote API rate limit.
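
For my own notes, the fixed version should look roughly like this. Just a sketch: I am assuming getJson returns nil plus an error message on failure, and that a rate-limited reply can come back without an items field.

local resp, err = core.network.getJson(url, headers)
if not resp then -- check the variable that was actually assigned (resp, not response)
  core.log(err)
  return err
end
if type(resp.items) ~= "table" then -- assumption: a rate-limited reply may have no items field
  core.log("response has no items field")
  return "response has no items field"
end
for j = 1, #resp.items do
  -- process resp.items[j] as before
end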

Which makes my previous post about a user-definable delay even more useful.

I infrequently need to run some rather large loops against an endpoint, but I don't want to make the requests too fast because that will trigger the remote API rate limit, which makes it unreliable when many users are on at the same time.
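
Roughly what I have in mind is something like the sketch below. The names delay, page_count and build_url are placeholders, not real Coronium API, and I am assuming ngx.sleep (OpenResty's non-blocking sleep) is usable in this context.

local delay = 0.5 -- user-definable pause (seconds) between calls, tuned to the remote key's rate limit

for page = 1, page_count do
  local resp, err = core.network.getJson(build_url(page), headers) -- build_url is a placeholder helper
  if not resp then
    core.log(err)
    break
  end
  -- ...handle resp.items here...
  ngx.sleep(delay) -- assumption: non-blocking sleep is available here
end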

Hi,

Does the API you are using have a batch mode or something? I guess I am not understanding why it needs to be hit so many times in a single call. :huh:

-dev

Hi,

Just an API rate limit per key, but I need to loop through a bunch of pages, and if that happens too fast, dang!

I'll wait for the coming update while I check my logic and see if I can't find a better approach.
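
One direction I am considering, sketched only: the traceback shows 'request' calling itself recursively for every page, so a flat loop that follows the paging field and backs off briefly when a page comes back without items might be more predictable. Here next_page and the retry limit are placeholders for whatever the remote API really uses, and ngx.sleep is again assumed to be available.

local function fetch_all_items(first_url, headers)
  local items, url, retries = {}, first_url, 0
  while url do
    local resp, err = core.network.getJson(url, headers)
    if resp and resp.items then
      for j = 1, #resp.items do
        items[#items + 1] = resp.items[j]
      end
      url = resp.next_page -- placeholder paging field; nil ends the loop
      retries = 0
    elseif retries < 3 then
      retries = retries + 1
      ngx.sleep(1) -- assumption: back off a moment before retrying the same page
    else
      core.log(err or "giving up after repeated failures")
      break
    end
  end
  return items
end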

anaqim
