API rate limit - "Too many operations"

Hello everyone,

I am currently testing an on-premises SeaTable server and I am running into issues when my script makes many API calls. I have tried optimizing my code to reduce the number of calls, but without much success so far.

Since I am working with a small setup and am not particularly concerned about CPU time, I would like to increase the configured API limits. However, when I call base.update_row() in Python, I get the following error message:

ConnectionError: [Errno 400] {"error_type":"operation_limit","error_message":"Too many operations"}

I have attempted to increase the limits in dtable_server_config.json and dtable_web_settings.py, but without any luck. I believe the error message appears after making about 700 requests.

Can someone tell me whether there is another configuration file I need to modify to increase or disable this limit? Any help would be appreciated.

Thank you.

I think you missed the limits in dtable-db.conf: dtable-db.conf - SeaTable Admin Manual

Check, for example, row_update_limit.
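For illustration, the change could look like the snippet below. The key name row_update_limit comes from the manual page above; the value shown and where exactly the line belongs inside dtable-db.conf are assumptions, so please check the linked Admin Manual page for the correct section and default.

```
# dtable-db.conf -- raise the row update limit
# (10000 is only an illustrative value; see the Admin Manual for the
#  correct section and the actual default)
row_update_limit = 10000
```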

Here is another hint. If you update so many rows at the same time, you might want to use the “batch update row” endpoints. Here is the python call for that:
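Since the original snippet did not come through here, the following is a minimal sketch based on the seatable-api Python package. The batch_update_rows call and its payload shape should be verified against the current SeaTable API reference, and the server URL, token, table name and row IDs are placeholders:

```python
# Minimal sketch using the seatable-api package (pip install seatable-api).
# Server URL, API token, table name, row IDs and column values below are
# placeholders -- adapt them to your base.
from seatable_api import Base

SERVER_URL = "https://your-seatable-server.example.com"  # placeholder
API_TOKEN = "your-base-api-token"                         # placeholder

base = Base(API_TOKEN, SERVER_URL)
base.auth()

# Collect all changes first, then send them in one request instead of
# calling base.update_row() once per row.
rows_data = [
    {"row_id": "row-id-1", "row": {"Name": "updated value 1"}},
    {"row_id": "row-id-2", "row": {"Name": "updated value 2"}},
]
base.batch_update_rows("Table1", rows_data)
```

Updating rows in batches like this means one API operation covers many rows, so you stay well below the operation limit instead of burning one request per row with update_row().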
