coronium:run() max input table size

Hi there,

I’m getting the following error when sending a table over a certain size as the input to a coronium:run() call:

runtime error: /usr/local/coronium/lib/coronium.lua:143: bad argument #1 to ‘decode’ (string expected, got nil)

stack traceback:
coroutine 0:
    [C]: in function ‘decode’
    /usr/local/coronium/lib/coronium.lua:143: in function ‘input’

Just wondering if this is expected behaviour, and whether there is anything I can do to work around it?

(running coronium 1.93.1 on AWS)

Many thanks,

Andrzej

Hi,

Do you happen to know how big the payload is, size-wise? You could just save the POST body to a file and give me the file size.

Cheers.

Hi Chris,

Thanks for your swift reply :slight_smile:

40,103 bytes succeeds

41,439 bytes fails

Hi,

Sounds about right; I believe the default max is around 32k.

What version are you running? I should be able to help you increase it.

Cheers.

I’m on the latest(?) Coronium, 1.93.1 for AWS.

Hi,

You can log in and try this on the command line:

sudo sed -i 's/client_body_buffer_size 32k/client_body_buffer_size 64k/' /usr/local/openresty/nginx/conf/nginx.conf

If you get a blank response, then that’s good. You’ll also need to reload Coronium:

sudo service coronium reload

You can of course change the 64k above to whatever you want, but it should be just large enough to cover your largest use case (in other words, the smaller, the better).
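If you want to sanity-check the substitution before touching the live config, here's a sketch against a throwaway copy (the `/tmp` path and the one-line stand-in conf are made up for illustration; the real file is /usr/local/openresty/nginx/conf/nginx.conf):

```shell
# Stand-in for the real nginx.conf, containing just the directive we care about.
printf 'client_body_buffer_size 32k;\n' > /tmp/nginx.conf.test

# Same sed edit as above, pointed at the test copy (no sudo needed in /tmp).
sed -i 's/client_body_buffer_size 32k/client_body_buffer_size 64k/' /tmp/nginx.conf.test

# Confirm the new value took.
grep client_body_buffer_size /tmp/nginx.conf.test
```

Once the grep shows `client_body_buffer_size 64k;`, you can run the same sed (with sudo) against the real file and then reload Coronium.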

Cheers.

That worked!

Thank you :slight_smile:

I’m doing a backup/sync of save data which could end up being >1MB of JSON, which I imagine wouldn’t make a sensible buffer size. With this knowledge I can now split up the data and upload it in pieces, to be sure of not hitting the limit.
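For what it's worth, that splitting idea can be sketched at the byte level with `split` (illustration only: the fake payload and the `/tmp` paths are made up, and a real client would cut at record boundaries so each piece stays valid JSON on its own):

```shell
# Fake ~100 KB payload standing in for the real save-data JSON.
head -c 100000 /dev/zero | tr '\0' 'x' > /tmp/save_data.json

# Cut it into numbered pieces, each safely under the 32k body-buffer default.
split -b 30000 -d /tmp/save_data.json /tmp/save_piece_

# Each piece would then be sent as its own coronium:run() input.
wc -c /tmp/save_piece_*
```

With a 100 KB payload this produces four pieces (three of 30,000 bytes plus a 10,000-byte remainder), each comfortably inside the buffer.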

Awesome! Have fun!

Cheers.
