We have a base containing one table with an unusually large number of active rows. (Over time, most of these rows will be moved to Big Data, but until we work through them, this is the situation.)
We were able to change the config to allow the necessary number of rows, and our server and clients are powerful enough to handle the table. But as we started populating these rows with more data, we quickly hit the hard 200MB file size limit, at which point changes stop syncing with the server.
Is there any way we can raise that 200MB cap? I understand all the reasons why doing so may not be advisable. We would still like to try it, in case it gets us over the finish line to the point where we can archive all this data.