Translation Limits on Enum Constants


I have a very specific question about the translation limits of C (as defined in the ANSI/ISO 9899:X standards family) regarding enumeration constants.

I have a few thousand individually identifiable data sources that I'd like to enumerate. I also want to respect the minimum translation limits of the C standard, as the actual limits are implementation-defined and exceeding them is undefined behavior (see Is it undefined behavior to exceed translation limits and are there checker tools to find it?).

I know that there are translation limits on the number of enumeration constants within a single enum (C90: 127), on the number of identifiers with block scope declared in one block (C90: 127), and on the number of external identifiers within one translation unit (C90: 511).

As far as I know, enumeration constants have no linkage (please correct me if I'm wrong), and I can certainly place them outside of block scope. So: does any translation limit constrain the following pattern (besides the limits of the integer types on the target platform and, of course, the number of constants within one single enum), and if so, why?

 MyEnumeration.h
 ---------------

 enum e1
 {
    VAL11 = 0,
    VAL12,
    /* ... */
    VAL_1N,
    END1 = VAL_1N
 };

 enum e2
 {
    VAL21 = END1,
    VAL22,
    /* ... */
    VAL_2N,
    END2 = VAL_2N
 };


 /* ... */

 enum eN
 {
    VALN1 = ENDN_1,
    VALN2,
    /* ... */
    VAL_NN,
    ENDN = VAL_NN
 };

 #define NUM_ENUM ENDN
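
A minimal compilable version of this pattern, with just two small enums, looks like this (the last listed name of each enum acts as an exclusive end marker that seeds the next one):

 #include <stdio.h>

 /* Two-enum instance of the pattern above (hypothetical names).
    VAL13 and VAL23 are exclusive end markers: each is one past the
    last real value, so the real values stay contiguous across the
    enum boundary. */
 enum e1 { VAL11 = 0, VAL12, VAL13, END1 = VAL13 };    /* real values 0..1 */
 enum e2 { VAL21 = END1, VAL22, VAL23, END2 = VAL23 }; /* real values 2..3 */

 #define NUM_ENUM END2 /* total number of real values: 4 */

 int main(void)
 {
     printf("VAL12=%d VAL21=%d NUM_ENUM=%d\n", VAL12, VAL21, NUM_ENUM);
     return 0;
 }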

Note: Switching to #define won't help, as there are also translation limits on simultaneously defined macro identifiers (C90: 1024). I would be forced to #undef in a complicated way, maybe with a complex #include pattern, as sketched below.
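
For example, the macro route would force something like this (names made up):

 /* C90 only guarantees 1024 macro identifiers defined simultaneously,
    so each batch has to be #undef'd before the next one is defined. */
 #define SRC_0    0
 #define SRC_1    1
 /* ... up to SRC_1023 ... */

 /* ... use the first batch here ... */

 #undef SRC_0   /* free the macro slots again */
 #undef SRC_1
 /* ... up to SRC_1023 ... */

 #define SRC_1024 1024
 /* ... and so on, batch by batch ... */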

1 Answer

Answer by supercat (accepted):

There is no requirement that a compiler allow a programmer to define 511 different enums, each with 127 different value names, each name 31 characters long. Even if the names are stored in the absolutely optimal format, a compiler would still need about 1.5 megabytes to store all of them, which is not exactly likely on a compiler that runs on a machine with 64K of total RAM and two 360K floppy drives (the source file defining all those names might be a lot less than 64K if the names are generated using macro expansions). Note that while such a machine would have been on the small side in 1989, C was commonly used on machines that were even smaller, and the authors of the Standard did not want to forbid such implementations.
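
For a rough sense of scale: 511 × 127 = 64,897 identifiers, and at the full 31 significant characters each that comes to 64,897 × 31 ≈ 2 million characters of name text before any packing.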

A good compiler will allow a certain amount of storage for identifiers and will abort compilation if a program would exceed that limit. (On systems which don't limit the memory usage of individual programs, the compiler should set the limit high enough that no realistic program will hit it, but low enough that a maliciously written source file can't crash the entire system.) If the compiler is designed for systems with many megabytes or gigabytes of RAM, the limits suggested by the Standard shouldn't be a factor. There will still be some limit, but there's probably no way to know what it is unless one hits it.
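
One practical way to find it is to generate ever-larger translation units until the compiler objects. For example (the probe size is arbitrary), this program writes a probe.c full of enums, each within C90's 127-constants-per-enum minimum:

 #include <stdio.h>

 #define NUM_ENUMS 1000UL /* arbitrary probe size; raise until the compiler objects */

 int main(void)
 {
     unsigned long e, i;
     FILE *f = fopen("probe.c", "w");
     if (f == NULL)
         return 1;
     for (e = 0; e < NUM_ENUMS; e++) {
         fprintf(f, "enum e%lu {\n", e);
         for (i = 0; i < 127; i++) /* stay within the per-enum minimum */
             fprintf(f, "    E%lu_K%lu,\n", e, i);
         fputs("};\n", f);
     }
     fputs("int main(void) { return 0; }\n", f);
     fclose(f);
     return 0;
 }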