I recently experienced a "Catastrophic Failure" error on a large (>165 MB) .xlsm file. It was so badly corrupted that I could not view or edit the VBA, nor copy any worksheet to a new workbook.
I manually recreated it. First, I rebuilt all the Data Connections to establish the sheets that the "working" worksheets depend on. Then I rebuilt each working sheet by hand: I copied over the headers and table names (creating each table with one header row and one data row), then copied the formulas cell by cell into the tables (hurrah for tables automatically applying formulas to entire columns). I had the foresight to periodically export my VBA modules, so I was able to import them back in. Rebuilding the workbook (30 worksheets) took 2 days.
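For anyone wanting to keep the same kind of safety net, here is a minimal sketch of a macro that exports every VBA component to text files. It assumes "Trust access to the VBA project object model" is enabled in the Trust Center, and the backup folder name (`vba_backup`) is just an example:

```vba
' Exports all VBA components of this workbook to a backup folder,
' so they can be re-imported if the workbook is ever corrupted.
' Requires: File > Options > Trust Center > Macro Settings >
'           "Trust access to the VBA project object model"
Sub ExportAllModules()
    Dim comp As Object          ' VBIDE.VBComponent (late bound)
    Dim backupPath As String
    backupPath = ThisWorkbook.Path & "\vba_backup\"
    If Dir(backupPath, vbDirectory) = "" Then MkDir backupPath

    For Each comp In ThisWorkbook.VBProject.VBComponents
        Select Case comp.Type
            Case 1:   comp.Export backupPath & comp.Name & ".bas"  ' standard module
            Case 2:   comp.Export backupPath & comp.Name & ".cls"  ' class module
            Case 3:   comp.Export backupPath & comp.Name & ".frm"  ' userform
            Case 100: comp.Export backupPath & comp.Name & ".cls"  ' sheet/workbook module
        End Select
    Next comp
End Sub
```

Running this on a schedule (or from `Workbook_BeforeSave`) means the code survives even if the workbook itself does not; the exported `.bas`/`.cls`/`.frm` files can be dragged back into the VBA editor or re-imported with `VBComponents.Import`.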
I refreshed all the data connections and populated each working sheet with at least 30 rows of key values to test the formulas and the resulting exports. Everything works, and much faster too. When done, my new workbook was only 16 MB!
That is an incredible level of bloat. Can anyone explain how and why it got so bad? Can anyone offer suggestions on how to avoid this kind of trouble in the future?