I usually put them at the top. If it’s still confusing, the function is probably too big / complex and could be split up. The most important things for me when reading code are: a) where did that variable come from? and b) what changes this variable? Narrow scope and smaller functions make both of those things easier to determine
So, it’s a timing thing then, right? Isn’t that what heisenbugs generally are?
Not necessarily timing, could also be concurrency or just global state being mutated in an unexpected way.
I hear you… but then I don’t want to derail this thread
Yep, as @kisielk says it could be other things, I’m not sure what the internals of the AVR32 are, but there may be interrupts generated on the serial and USB ports, and thus we might be altering the sequence of events.
In any case, the Atmel USB framework expects you to wait until it’s told you via callback that the USB stick is available. We’re not doing that now, so the fix is really to rewrite the code to do it that way, which will have the added benefit of also allowing saving and loading at any time and not just at boot.
a) I started programming before C allowed variables to be declared anywhere but the top of the function (or block), so by old habits I put everything at the top. This may not seem like much of a reason, but your code will be portable to extremely old compilers, which can be common in the embedded world.
b) A great rule of thumb is to make functions small enough that they fit on the page in your editor and can be grokked in a single viewing. In this sense, you might as well put them at the top because there isn’t going to be much of a “middle” in the function anyway.
Elder, yea. But the modern way (4.2BSD, POSIX.1) is source first.
void bcopy(const void *s1, void *s2, size_t n);
The bcopy() function copies n bytes from string s1 to string s2. The two
strings may overlap. If n is zero, no bytes are copied.
Ah, I think we’d have to disagree on this point. Smallest scope possible unless you have a good reason. IMO there is a reason why it was added to the standard.
I can’t see myself wanting to work with proprietary C compilers anyway, as I’m not paid to do this.
Just need to put this out there.
Screw you C
Had some weird bugs with the teletype beta. If I add -fno-common to the compiler flags I get a few errors in libavr32; will discuss them in the 2.0 beta thread once I get some time to write it up.
Anyway once more with feeling.
Screw you C
Teletype 2.0 beta (release candidate 2 released 13th July 2017)
Really, really awesome tip! Seems that multiple definitions can arise when you forget to include the extern keyword on global variable declarations inside a header, when that header is #include-ed by several .c files.
Kind of makes me realise I still don’t fully understand all the implications/ramifications/pitfalls of variable declaration in C:
- inside header, inside .c file or both?
- what’s the ‘default’ behaviour when you don’t declare extern/static?
- program compiled via .o files, vs all .c files compiled to executable in one go?
- rule of thumb for when to use the volatile keyword?
- anything else that’s gone under the radar? (e.g this multiple definition thing)
Some bedtime reading/revision:
There are a number of murky and convoluted backwaters left unexplored on grounds of sympathy and compassion for the sufferer, and some without any better home. This chapter gathers them together—it’s the toxic waste dump for the nasty bits of C.
That I knew about.
This is what happened to me… I had 2 global variables with the same name and compatible types, in .c files only!
bool dirty in one
uint8_t dirty in another
Somehow it turns out that these refer to the same location in memory and are the same variable. This is the ‘common model’, which can be disabled with
-fno-common. The same would apply if they were both the exact same type (and I had quite a few of those too).
We think of header files as somehow being special, when they aren’t really part of the compiler as such, but rather part of the preprocessor. The compiler/linker sees 2 variables with the same name and type but no extern qualifier and has to guess what to do; by the time you’re at the linking stage the preprocessing stage is long gone.
Nonetheless, I can’t think of any good reason why a programmer would deliberately choose to declare extern variables in this way; rather, it’s that they’ve forgotten the static qualifier. IMO when the compiler is faced with such an ambiguity it should print a warning.
For the life of me I can’t remember the thought process that led me to start Googling “same variable name in different c files”, especially as there is no debugger available. I do remember having to persevere and try multiple permutations of the question before I got some results.
Afaik if you have no definition, declarations are extern by default. That’s because there’s no definition there, so it must assume it’s defined elsewhere unless told otherwise.
Going by the excellent cppreference.com (from the C language section), there appears to be a ‘tentative’ variable definition, so:
int i = 3 is an actual definition
int i is assumed to be ‘tentative’
Quoting from the same:
A tentative definition is a declaration that may or may not act as a definition. If an actual external definition is found earlier or later in the same translation unit, then the tentative definition just acts as a declaration.
It’s definitely a bit confusing. Or at least I find it so.
The GCC docs for -fno-common are also useful.
In C code, this option controls the placement of global variables defined without an initializer, known as tentative definitions in the C standard. Tentative definitions are distinct from declarations of a variable with the extern keyword, which do not allocate storage.
Unix C compilers have traditionally allocated storage for uninitialized global variables in a common block. This allows the linker to resolve all tentative definitions of the same variable in different compilation units to the same object, or to a non-tentative definition. This is the behavior specified by -fcommon, and is the default for GCC on most targets. On the other hand, this behavior is not required by ISO C, and on some targets may carry a speed or code size penalty on variable references.
The -fno-common option specifies that the compiler should instead place uninitialized global variables in the data section of the object file. This inhibits the merging of tentative definitions by the linker so you get a multiple-definition error if the same variable is defined in more than one compilation unit. Compiling with -fno-common is useful on targets for which it provides better performance, or if you wish to verify that the program will work on other systems that always treat uninitialized variable definitions this way.
There is more information out there if you Google “-fno-common”; there are some links that discuss what the linker does and the
Personally I think I will enable -fno-common on all my C projects (not that I have any).
(FYI, C++ doesn’t have this issue.)
Yet another reason to use C++ rather than some antiquated language
as i am trying to understand aleph code (ops to begin with, and specifically the arrow (->) operator), i’ve found this book by Jens Gustedt: Modern C. i thought it belongs here.
i’ll be curious to know your opinions about it.
(this link was on a site dedicated to “open source baremetal coding resources for ARM Cortex-M”, here: http://asm.thi.ng/)
dsp advice, not c specific:
This can’t be overstated! The other factor is that a second set of eyes will see more than the first because they’re exploring where the first was trailblazing. It’s hard not to look at a new, functional piece of code and miss the forest for the trees.
I read that just the other day! Started writing a heap allocator until I saw a very basic one (but still far better than I could do) on hacker news. Been planning on adding it inside an #ifdef block where if the custom one isn’t defined, it just aliases to malloc & free.
Although not C, I did a JS port of Gen (http://charlie-roberts.com/genish) and had to write my own allocator for it. All the existing JS memory management libraries performed safety checks before letting you write to a given block… which was just too slow in an audio callback.