Increasing the file upload limit

My Setup:
SeaTable is running via Coolify with docker-compose on a VPS (Intel/AMD, Ubuntu 24.04.2 LTS).

Describe the Problem/Error/Question:

It works perfectly fine, but I cannot upload large files into the tables. I've tried adding settings in dtable_web_settings.py and in dtable_server_settings.json according to similar questions I found in this forum, but it doesn't seem to help. I also tried the general limitations page: System Limitations - SeaTable Admin Manual

dtable_web_settings.py: (screenshot)

dtable_server_settings.json:

Coolify .env file for docker-compose: (screenshot)

I basically tried every place that looked like it might work, but none of the changes had any effect.

Here's a screen recording of me trying it again, showing a smaller file upload vs. a larger .mp4 file: https://zipline.kaiyerlab.com/u/hvbFXT.webm

When I try to upload a smaller file (e.g. a GIF) it uploads fine. But a larger video file doesn't upload, and no error or notice is shown either.

Error Messages:

No actual error messages show, as you can see in the screen recording linked above. The console shows no errors either; it seems to just ignore the .mp4 upload entirely.

Update on this issue: smaller videos will upload.

Larger videos (2 GB+) will not.

Edit: testing to see where the cutoff is. I'll try a 1 GB video next.

Edit 2: I'm also looking into whether I can connect this to S3, perhaps to solve this issue and save server disk space.
Edit 2 continued … S3:
So I found this:

I am currently trying to get S3 working like this. I have all the details except for the "HOST" variable value; what am I supposed to use for that?
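For reference, this is roughly the kind of block I'm experimenting with in seafile.conf (bucket, keys and endpoint below are placeholders, and I'm assuming SeaTable's file server uses the Seafile-style S3 backend config). My guess is that host is the S3 endpoint, e.g. s3.<region>.amazonaws.com for AWS or the host:port of an S3-compatible service like MinIO, but I'm not sure:

[commit_object_backend]
name = s3
# Placeholders - replace with your own bucket and credentials.
bucket = my-seatable-commit-objects
key_id = <access-key-id>
key = <secret-access-key>
# My assumption: the S3 endpoint goes here, e.g. s3.eu-central-1.amazonaws.com
# or minio.example.com:9000 for an S3-compatible store.
host = s3.eu-central-1.amazonaws.com
use_https = true
# Presumably the same pattern applies to [fs_object_backend] and [block_backend].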

Please try the variable

[fileserver]
#Set maximum upload file size to 200M.
#If not configured, there is no file size limit for uploading.
max_upload_size=200

in seafile.conf.

I will try it, but that looks like it would do the opposite of my goal, right? Isn't that setting a limit, since the comment says "If not configured, there is no file size limit for uploading"?

My goal is to either have no file size limit or a very large one like 5 GB. Hence my confusion over why I would set a file size limit, given the comments in your example.

Results: I tested this with

max_upload_size=5000
(for 5GB)
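In other words, the [fileserver] section of my seafile.conf now looks like this:

[fileserver]
# Allow uploads up to about 5 GB (the value is in MB).
max_upload_size=5000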

I then tried to upload a small video and it worked fine, but larger videos (2 GB) still cannot be uploaded and I get no error message. It just doesn't do anything.

Edit: I am still using S3, but I could only get it to work for the table backups, not for file uploads. I have not yet switched back to filesystem storage. I would like to get S3 working, but if allowing a larger file size means I need to use filesystem storage instead, I'm happy to do that. Right now, though, large uploads don't work with either storage method.

max_upload_size=200

does not fix your issue. I simply pasted that part of the configuration and forgot to modify it. Sorry, my bad. Setting the value to 5000 is correct.

The setting is independent of the storage backend in use.

Did you do

cd /opt/seatable-compose
docker compose down
docker compose up -d

after modifying the config file?

You don’t find anything in the browser console or log files?

Can you check your nginx.conf for the client_max_body_size directive?
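For example, something like this should show whether the directive is set anywhere (a sketch, assuming the compose service is called seatable and the nginx config lives under /etc/nginx inside that container; adjust to your setup):

cd /opt/seatable-compose
# Assumption: service name "seatable"; the nginx config path may differ in your image.
docker compose exec seatable grep -Rn "client_max_body_size" /etc/nginx/
# If present, a permissive value would look like:
#   client_max_body_size 5000m;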

Yes, I restarted Docker afterwards. I even tried the /seatable.sh restart method; nothing worked to allow large file uploads.

I tried changing all the client_max_body_size directives in nginx.conf to 5000M and that didn't help either.

Still only smaller files upload, including smaller videos. The large video doesn't, and there is no error in the console; it literally behaves as if nothing was uploaded.

No error messages in the server logs?

No, but I tried it in a different browser (I went from Brave to Chrome) and I now see this in the console even when smaller video uploads fail:

"test/?tid=0000&vid=0000:72 django-pwa: ServiceWorker registration successful with scope: https://REDACTED.REDACTED.com/
index.iife.js:1 content script loaded
commons.3968bdd9.js:2

       POST https://REDACTED.REDACTED.com/seafhttp/upload-api/3444abe5-0adf-4747-bd2f-2ea29073e28c?ret-json=1 502 (Bad Gateway)

(anonymous) @ commons.3968bdd9.js:2
xhr @ commons.3968bdd9.js:2
rt @ commons.3968bdd9.js:2
_request @ commons.3968bdd9.js:2
request @ commons.3968bdd9.js:2
(anonymous) @ commons.3968bdd9.js:2
uploadImage @ viewDataGrid.9f85873d.js:1
uploadFile @ viewDataGrid.9f85873d.js:1
(anonymous) @ viewDataGrid.9f85873d.js:1
Promise.then
(anonymous) @ viewDataGrid.9f85873d.js:1
createPromise @ viewDataGrid.9f85873d.js:1
uploadFilesInBatch @ viewDataGrid.9f85873d.js:1
p @ viewDataGrid.9f85873d.js:1
(anonymous) @ viewDataGrid.9f85873d.js:1
FileReader
handleFilesChange @ viewDataGrid.9f85873d.js:1
uploadFilesChange @ viewDataGrid.9f85873d.js:1
je @ commons.3968bdd9.js:2
Xe @ commons.3968bdd9.js:2
(anonymous) @ commons.3968bdd9.js:2
Sn @ commons.3968bdd9.js:2
Rn @ commons.3968bdd9.js:2
(anonymous) @ commons.3968bdd9.js:2
Be @ commons.3968bdd9.js:2
(anonymous) @ commons.3968bdd9.js:2
Pn @ commons.3968bdd9.js:2
qt @ commons.3968bdd9.js:2
zt @ commons.3968bdd9.js:2
t.unstable_runWithPriority @ commons.3968bdd9.js:2
Ko @ commons.3968bdd9.js:2
Me @ commons.3968bdd9.js:2
Zt @ commons.3968bdd9.js:2
viewDataGrid.9f85873d.js:1 Uncaught (in promise) TypeError: Cannot read properties of undefined (reading ‘name’)
at viewDataGrid.9f85873d.js:1:2775540
at Array.findIndex ()
at u.findUploadFileIndex (viewDataGrid.9f85873d.js:1:2775504)
at onFileUploadFailed (viewDataGrid.9f85873d.js:1:2775355)
at viewDataGrid.9f85873d.js:1:1215936
(anonymous) @ viewDataGrid.9f85873d.js:1
findUploadFileIndex @ viewDataGrid.9f85873d.js:1
onFileUploadFailed @ viewDataGrid.9f85873d.js:1
(anonymous) @ viewDataGrid.9f85873d.js:1
Promise.catch
(anonymous) @ viewDataGrid.9f85873d.js:1
Promise.then
(anonymous) @ viewDataGrid.9f85873d.js:1
createPromise @ viewDataGrid.9f85873d.js:1
uploadFilesInBatch @ viewDataGrid.9f85873d.js:1
p @ viewDataGrid.9f85873d.js:1
(anonymous) @ viewDataGrid.9f85873d.js:1
FileReader
handleFilesChange @ viewDataGrid.9f85873d.js:1
uploadFilesChange @ viewDataGrid.9f85873d.js:1
je @ commons.3968bdd9.js:2
Xe @ commons.3968bdd9.js:2
(anonymous) @ commons.3968bdd9.js:2
Sn @ commons.3968bdd9.js:2
Rn @ commons.3968bdd9.js:2
(anonymous) @ commons.3968bdd9.js:2
Be @ commons.3968bdd9.js:2
(anonymous) @ commons.3968bdd9.js:2
Pn @ commons.3968bdd9.js:2
qt @ commons.3968bdd9.js:2
zt @ commons.3968bdd9.js:2
t.unstable_runWithPriority @ commons.3968bdd9.js:2
Ko @ commons.3968bdd9.js:2
Me @ commons.3968bdd9.js:2
Zt @ commons.3968bdd9.js:2
"

I tried this also and it didn’t work:

        client_max_body_size 5000m;
        proxy_request_buffering off;

Actually, which log files in the logs directory do you recommend checking for this specific issue?

On my server: I've since stopped using S3 and gone back to filesystem storage, to reduce the moving parts while tracking down the file upload issue before adding S3 back into the mix.

Locally: I am also experimenting with a fresh install on localhost, and no luck.

Do you allow bounties on this forum? I'd consider paying someone to fix this once and for all. It's literally the only problem stopping me from sticking with SeaTable for my business, but I cannot continue to dedicate so much time to trying to get large file attachments to work.

Hey,
I will have a look at this topic today.
Best regards
Christoph

The Analysis

In the default setup/configuration of SeaTable 5.2.7, there is no limit on the upload file size imposed by nginx, Caddy, or any SeaTable component.

The problem is the upload window that SeaTable provides. It seems that this window is not capable of dealing with files bigger than about 1.5 GB.


The Workaround

There is a workaround, though. Instead of using the upload window of the file column, you can upload the file via the "file mgmt" area.

After the file is uploaded, you can assign the file to the file column.

I successfully tested this approach with a 2.1 GB and a 5 GB file.

Alternative

In general, I would say that SeaTable is not optimized for dealing with big files. SeaTable is good for structured data. Just one example: as soon as your base has assets bigger than 100 MB, exporting your base becomes difficult…

I would recommend that you upload the files somewhere else and just add the URL to the file in your base.
