BSS vs COMMON

michaeln32 wrote on Tuesday, January 02, 2018:

Hi
In the map file I get a BSS section and a COMMON section.

Can anybody please tell me the difference between those two?

Thank you

Michael

richard_damon wrote on Tuesday, January 02, 2018:

This is really a tools question, and not a FreeRTOS question (so telling us what tool you are using could be helpful). In general, the BSS section is used for variables which are not initialized, so the whole area can just be zeroed at startup, while many tools, for historical reasons, put uninitialized variables that are defined in more than one file into a 'COMMON' section, which the linker merges and which is zeroed in the same way.

davidbrown wrote on Tuesday, January 02, 2018:

As Richard says, this is a tools question, but I can give some background that might be of interest to people here. It has nothing to do with FreeRTOS, but we all use C!

It comes from the way objects (variables) are defined and declared in different C files. (The explanation below describes common practice - tools can, in theory, do something different.)

The standard way to define global (file scope, non-static) variables is to have a single definition ("int x = 1;" or "int x;") in one C file, and to have as many external declarations ("extern int x;") as you want - usually by putting the declaration in a header.
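
As a minimal sketch of that convention (the file names are just for illustration):

    /* x.h - declaration only; no storage is allocated here */
    extern int x;

    /* x.c - exactly one translation unit provides the definition */
    #include "x.h"
    int x = 1;

    /* main.c - any other file just includes the header to use x */
    #include <stdio.h>
    #include "x.h"

    int main(void)
    {
        printf("x = %d\n", x);
        return 0;
    }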

When you have an initialised definition ("int x = 1;"), the variable gets allocated in the ".data" section. If you have an initialised definition of the same variable in two C files, you will get a linker error. So far, so good.
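
For example (hypothetical file names), two initialised definitions of the same name clash at link time:

    /* a.c */
    int x = 1;

    /* b.c */
    int x = 2;

    /* Both objects carry an initialised definition of x in .data, so linking
       a.o and b.o together fails with a "multiple definition" error. */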

When you use an uninitialised definition ("int x;"), it is called a "tentative definition": you can repeat "int x;" in the same file, and have at most one initialised definition alongside it. If there is no initialised definition, this is treated much like "int x = 0;", except that the compiler will put the allocation in the ".bss" section to make the "initialisation to 0" more efficient.
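
A small sketch of how that looks inside one file:

    int x;      /* tentative definition */
    int x;      /* repeating it in the same translation unit is allowed */

    /* With no initialiser anywhere in this file, x ends up behaving like
       "int x = 0;", but it is placed in .bss (or in "common", see below)
       rather than .data, so the startup code can zero-fill it in bulk
       instead of copying an explicit 0 from the image. */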

Having uninitialised definitions ("int x;") of the same variable in more than one C file is not allowed in standard C. However, it used to be common in older C code, and compilers accepted it, handling it the same way as Fortran (where it is allowed, AFAIK). On seeing "int x;" without an initialiser, the compiler puts "x" in a ".common" section. The definitions of "x" from the different translation units are merged, and the C startup code clears the result to 0.
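
For example (hypothetical file names), this is the situation being described; the comments note what a GNU toolchain does with it:

    /* a.c */
    int x;      /* tentative definition, no initialiser - goes to "common" */

    /* b.c */
    int x;      /* the same name, tentatively defined again in another file */

    /* With common symbols enabled (gcc option -fcommon, which was the default
       before gcc 10), both objects mark x as a common symbol - nm shows it
       with type 'C' - and the linker merges them into one zero-filled int. */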

As long as you are consistent, this works out okay. A single “int x;”
in one C file is fine. Having “int x;” in other files will work too, as
long as none of them have an initialiser. The problem comes if you are
inconsistent - like having a 4-byte “int x;” in one file and an 8-byte
“double x;” in another. The linker will allocate both “x” at the same
address in “.common”, and it will be either 4 bytes or 8 bytes (there is
no general way to specify which - a toolchain may pick the first size it
meets, the last size, the maximum, or whatever). Clearly the code has a
bug, and clearly it may go seriously wrong.
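
A hypothetical example of that inconsistent case, which a toolchain with "common" support will happily link:

    /* a.c */
    int x;                           /* 4 bytes */
    int get_x(void) { return x; }

    /* b.c */
    double x;                        /* 8 bytes - same name, different type */
    void set_x(void) { x = 3.14; }   /* if only 4 bytes were reserved for x,
                                        this write spills into whatever the
                                        linker placed next to it */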

For that reason, you should disable "common" support if your toolchain lets you - for gcc, the flag is "-fno-common". That gives you better error checking, because the tools will complain if the same identifier is defined in more than one file in the program. If you can't disable "common" support in your toolchain, then still be careful to avoid duplicate definitions. The rule of thumb is that every file-scope definition should be either "static" or have a matching "extern" declaration in the header - /never/ put something like "int x;" in a header.
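
As a final sketch (hypothetical names) of the header rule above:

    /* config.h - don't do this: every file that includes the header gets its
       own tentative definition, which "common" support silently merges and
       which -fno-common correctly rejects as a multiple definition. */
    int debug_level;

    /* config.h - do this instead: declare in the header... */
    extern int debug_level;

    /* config.c - ...and define it in exactly one C file. */
    int debug_level = 0;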