Long compile times in Visual C++ 2010 with large static arrays

We have a C++ project containing several large static data tables (arrays of structs) that are generated by a preprocessing tool and compiled into the project. We have been using VC++ 2008 up to now, but are preparing to move to 2010, and these data tables have suddenly started taking a very long time to compile.

As an example, one such table has about 3,000 entries, each of which is a struct containing several ints and pointers, all initialized statically. This one file took ~15 seconds to compile in VC++ 2008, but is taking 30 minutes in VC++ 2010!

As an experiment, I tried splitting this table evenly into 8 tables, each in its own .cpp file, and they compile in 20-30 seconds each. This makes me think that something inside the compiler is O(n^2) in the length of these tables.
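
For illustration, the split looked roughly like this (all names here are changed; the real declarations aren't shown):

// optinfo_part0.cpp -- first ~375 entries (hypothetical names).
// 'extern' on the definition gives the const array external linkage.
extern const COptionInfo g_aOptInfoPart0[] = {
    // entries 0..374, same initializers as before
};

// optinfo_part1.cpp .. optinfo_part7.cpp hold the remaining entries.

// optinfo.cpp -- stitches the pieces back into one logical table
extern const COptionInfo g_aOptInfoPart0[];
extern const COptionInfo g_aOptInfoPart1[];
// ...
const COptionInfo* const g_apOptInfoParts[] = {
    g_aOptInfoPart0,
    g_aOptInfoPart1,
    // ...
};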

Memory usage for cl.exe plateaus at around 400 MB (my machine has 12 GB of RAM), and I do not see any I/O activity once it plateaus, so I believe this is not a disk caching issue.

Does anyone have an idea what could be going on here? Is there some compiler feature I can turn off to get back to sane compile times?

Here is a sample of the data in the table:

//  cid (0 = 0x0)
{
    OID_cid,
    OTYP_Cid,
    0 | FOPTI_GetFn,
    NULL,
    0,
    NULL,
    (PFNGET_VOID) static_cast<PFNGET_CID>(&CBasic::Cid),
    NULL,
    CID_Basic,
    "cid",
    OID_Identity,
    0,
    NULL,
},

//  IS_DERIVED_FROM (1 = 0x1)
{
    OID_IS_DERIVED_FROM,
    OTYP_Bool,
    0 | FOPTI_Fn,
    COptThunkMgr::ThunkOptBasicIS_DERIVED_FROM,
    false,
    NULL,
    NULL,
    NULL,
    CID_Basic,
    "IS_DERIVED_FROM",
    OID_Nil,
    0,
    &COptionInfoMgr::s_aFnsig[0],
},

//  FIRE_TRIGGER_EVENT (2 = 0x2)
{
    OID_FIRE_TRIGGER_EVENT,
    OTYP_Void,
    0 | FOPTI_Fn,
    COptThunkMgr::ThunkOptBasicFIRE_TRIGGER_EVENT,
    false,
    NULL,
    NULL,
    NULL,
    CID_Basic,
    "FIRE_TRIGGER_EVENT",
    OID_Nil,
    0,
    NULL,
},

//  FIRE_UNTRIGGER_EVENT (3 = 0x3)
{
    OID_FIRE_UNTRIGGER_EVENT,
    OTYP_Void,
    0 | FOPTI_Fn,
    COptThunkMgr::ThunkOptBasicFIRE_UNTRIGGER_EVENT,
    false,
    NULL,
    NULL,
    NULL,
    CID_Basic,
    "FIRE_UNTRIGGER_EVENT",
    OID_Nil,
    0,
    NULL,
},

As you can see, it includes various ints and enums as well as a few literal strings, function pointers and pointers into other static data tables.
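
For reference, a struct shape consistent with these initializers might look roughly like the following; every field name and type here is an illustrative guess, not the real definition:

// Illustrative stand-ins only -- the project's real types are not shown.
enum OID  { OID_Nil, OID_Identity, OID_cid /* , ... */ };
enum OTYP { OTYP_Cid, OTYP_Bool, OTYP_Void /* , ... */ };
enum CID  { CID_Basic /* , ... */ };
typedef void (*PFNTHUNK)();
typedef void (*PFNGET_VOID)();
struct FNSIG;

struct COptionInfo
{
    OID          oid;        // option identifier (OID_*)
    OTYP         otyp;       // type of the option's value (OTYP_*)
    unsigned int fFlags;     // FOPTI_* flags
    PFNTHUNK     pfnThunk;   // thunk function, or NULL
    int          nDefault;   // default value (0 / false)
    void*        pvUnused1;
    PFNGET_VOID  pfnGet;     // getter function, or NULL
    void*        pvUnused2;
    CID          cid;        // owning class (CID_*)
    const char*  pszName;    // option name literal
    OID          oidParent;  // e.g. OID_Identity or OID_Nil
    int          nReserved;
    const FNSIG* pFnsig;     // pointer into another static table, or NULL
};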

There are 5 answers

Alan Stokes (best answer)

It might be worth turning off all optimisation for this file (it's not going to buy you anything on a static data table anyway), in case it's the optimiser that is going O(N^2).
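
As a sketch, one in-source way to try this (the table name is made up): MSVC's optimize pragma disables optimisation for whatever follows it in the file. Note that it formally governs code generation for functions, so setting Optimization to Disabled (/Od) on just this .cpp in its file properties is the more direct route for a data-only file.

// Turn the optimiser off for the rest of this translation unit.
// The empty string selects all optimisation options.
#pragma optimize("", off)

const COptionInfo s_aOptionInfo[] = {
    // ... ~3,000 statically initialized entries ...
};

#pragma optimize("", on)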

Ben Voigt

I've seen (can't remember where) a technique for converting large static data directly into object files. Your C++ code then declares the array as extern, and the linker matches the two together. That way the array data never undergoes a compilation step at all.
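
A sketch of the consuming side, with invented symbol names (the actual symbols would be defined in the object file generated by a bin2coff-style tool):

// The data is defined in an object file produced offline; the compiler
// only ever sees these declarations, so it never parses the initializers.
// This suits plain byte data -- entries containing pointers need
// relocations, which only compiling real initializers provides.
extern "C" const unsigned char table_data[];
extern "C" const unsigned int  table_data_size;

const unsigned char* GetTableBytes(unsigned int* pcbSize)
{
    *pcbSize = table_data_size;
    return table_data;
}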

The Microsoft C/C++ tool CVTRES.exe worked on a similar principle, but instead of generating symbols it produced a separate resource section that needed special APIs to access (FindResource, LoadResource, LockResource).

Ah, here's one of the tools I remembered finding: bin2coff. The author has a whole bunch of related tools.


Alternatively, you can try to reduce the dependencies, so that the particular source file never needs recompilation. Then minimal rebuild will automatically use the existing .obj file. Maybe even check that .obj file into source control.

Mark Latham

You can try turning off Pure MSIL CLR support (/clr:pure) in the C/C++ settings. That worked for me.

Janosch

Try making your array static const; in a similar case I have seen, this reduced the compile time (though not the file size) to almost nothing.

Marco

I had this very same problem. There was a const array of data with approximately 40,000 elements. The compile time was about 15 seconds. When I changed from "const uint8_t pData[] = { ... }" to "static const uint8_t pData[] = { ... }", the compile time went down to less than 1 second.
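
A minimal sketch of that change, with one plausible explanation (assuming the array sat at function scope: without static, the compiler must emit code that rebuilds the array on the stack at every call, whereas static const data is emitted once into the read-only image):

#include <stddef.h>
#include <stdint.h>

void Consume(const uint8_t* p, size_t n);

void Before()
{
    // Non-static local const array: constructed on the stack per call.
    const uint8_t pData[] = { 0x01, 0x02, 0x03 /* ... ~40,000 entries */ };
    Consume(pData, sizeof pData);
}

void After()
{
    // static const: stored once in read-only data; per this answer and the
    // one above, this also compiled dramatically faster in VC++.
    static const uint8_t pData[] = { 0x01, 0x02, 0x03 /* ... */ };
    Consume(pData, sizeof pData);
}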