At the moment a new catalog is generated every time, so there are a lot of files to upload.
We have been thinking of ways to minimise the overhead. The first simple optimisation we tested was to assume that if a file's size was unchanged it did not need copying. This worked for the vast majority of files, but not all, and the ones that went wrong were often important ones. A shame, as it really made the copy phase run a lot faster.
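As a rough sketch of the size-only check described above (function and path names are illustrative, not from the actual tool), the logic amounts to something like this, including the flaw that two different files of the same length look unchanged:

```python
import os


def needs_upload(new_path, old_path):
    """Naive change check: assume a file is unchanged if its size matches.

    This is the optimisation described above. Its weakness is that a file
    whose contents changed without its length changing is wrongly skipped.
    """
    if not os.path.exists(old_path):
        return True  # nothing uploaded yet, so it must be copied
    return os.path.getsize(new_path) != os.path.getsize(old_path)
```

A file edited in place to the same byte count slips straight through this check, which is exactly the failure mode seen in practice.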
One possible option is a generate-time check: compare each newly generated file against the old one and, if the contents are identical, leave the old one in place. This would add a significant cost at generation time, but in a scenario like this it might well be more than won back by the reduced upload time. Definitely something to think about.
These sorts of optimisations can easily be ignored on small libraries but become more valuable as the libraries get larger.