Those who know C++ may know what I mean by 'unity build':
- the *.cpp files of a project are all effectively #included into a single supermassive source file, following the #include directives specified in the *.cpp and *.h files
- this source file is fed into the compiler
- finish! You get the output binary!
Doing things this way means there are fewer intermediate files (*.o), fewer file reads and less disk IO overhead, and fewer invocations of the compiler, leading to better build performance.
My question is: is this possible for LaTeX at all? I want it because there is a slow post-processing pass that I would like to run over my .tex files before building the final .pdf with pdflatex. Currently it takes around 7 seconds to process my growing list of .tex files, and I believe that running this pass over a single file would be significantly faster. This motivates my question!
To summarize, I want to:
- 'merge' all the .tex files into a supermassive .tex source file by following the \input{} and \include{} macros in each .tex file
- feed the supermassive .tex source file into the slow post-processing pass (actually the Ott tex-filter, fyi)
- pipe the output straight into pdflatex
- finish! I get the output PDF file!
The first step is the problem here. Any ideas welcome. It's best if I don't need to write my own script to do this step!
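For reference, here is a minimal sketch of what I imagine such a script would look like (purely hypothetical: it assumes \input{...} and \include{...} always take a plain file name, never appear inside comments, and it ignores \include's implicit \clearpage):

    #!/usr/bin/env python3
    # Hypothetical flattening script: recursively inlines \input{...} and
    # \include{...} directives into a single output stream.
    import re
    import sys
    from pathlib import Path

    INPUT_RE = re.compile(r'\\(?:input|include)\{([^}]+)\}')

    def flatten(path, out, seen=None):
        seen = set() if seen is None else seen
        path = Path(path)
        if path in seen:        # guard against accidental \input cycles
            return
        seen.add(path)
        text = path.read_text()
        pos = 0
        for m in INPUT_RE.finditer(text):
            out.write(text[pos:m.start()])
            child = path.parent / m.group(1)
            if child.suffix != '.tex':
                child = child.with_suffix('.tex')
            flatten(child, out, seen)   # splice the child file in place
            pos = m.end()
        out.write(text[pos:])

    if __name__ == '__main__':
        # e.g.  python3 flatten.py main.tex > supermassive.tex
        flatten(sys.argv[1], sys.stdout)

(Again, I'd rather not have to maintain something like this myself, hence the question.)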
Many thanks!
A good tool that can handle this is rubber, with the help of its combine module. It will gather all dependencies and produce a single file ready for consumption.
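If I remember correctly, the module can be requested on the command line, roughly like this (I'm quoting from memory, so check the rubber manual or rubber --help for the exact option name; main.tex is just a placeholder for your top-level file):

    rubber --module combine main.tex

The resulting single .tex file can then be fed through the Ott filter and on to pdflatex, as described in the question.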