Post by Marco van de Voort
Post by Martin Brown
Post by Marco van de Voort
Post by Martin Brown
and seen it find obscure faults in seldom travelled paths (usually but
not always the error handling for never previously encountered errors).
Free Pascal has a "poison variables" option that initializes local variables
to random values. As a concept it is very powerful, especially in combination
with range checks (the randomized variables fail almost immediately on
range checks when used as an array index or when they are enumerations).
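A minimal, untested sketch of the sort of failure this produces
(assuming the trash-locals switch is -gt, used together with range
checks):

program TrashDemo;
{$R+}   { range checking on }

var
  data: array[0..7] of Integer;

procedure Broken;
var
  i: Integer;   { never initialised }
begin
  { A leftover zero on the stack would quietly hit data[0]; a
    trashed i is far outside 0..7, so the range check traps at
    once on the very first run. }
  data[i] := 42;
end;

begin
  Broken;
end.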
I am quite hardline on this: I favour executing code in an environment
where a memory fetch from a location that has not been declared as a
hardware input, or previously written to, is an immediate trap.
Interesting view, of course. The interesting question is how you implement it.
I know a nice test: allocate a 1GB array, and then access every other item
:-)
There would obviously be an overhead in any software implementation.
On hardware assisted parity memory it is conceptually easy - set all of
the memory to an invalid parity state before you start execution.
Any fetch from an uninitialised location generates a parity error trap. So
on the right hardware it could be made to work at full speed and would
catch a lot of very common errors immediately.
Floating point is also easy: you store a pattern that is an immediate
fault. On some architectures you can store a pattern that is both an
invalid FP value and an invalid pointer - a variant on your poison variables.
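A rough sketch of such a fill (the constant is purely illustrative,
assuming IEEE 754 doubles and x86-64 style canonical addresses; the
FP side only traps if the invalid-operation exception is unmasked):

program PoisonFillDemo;

const
  { Sign bit set, exponent field all ones, top mantissa bit clear:
    a signalling NaN as a double and a non-canonical address as a
    pointer, so either kind of use faults. }
  Poison: QWord = $FFF4A5A5A5A5A5A5;

procedure PoisonFill(p: Pointer; bytes: SizeInt);
var
  q: PQWord;
  i: SizeInt;
begin
  q := PQWord(p);
  for i := 1 to bytes div SizeOf(QWord) do
  begin
    q^ := Poison;   { stamp every 64-bit word with the poison value }
    Inc(q);         { typed pointer, advances by 8 bytes }
  end;
end;

var
  buf: array[0..63] of Double;
begin
  PoisonFill(@buf, SizeOf(buf));
  { Any arithmetic on buf[n] now signals invalid operation (if
    unmasked); any use of the pattern as a pointer faults. }
end.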
The other way, which is more painful, is to have a virtual machine that
keeps a bit array of what has been written to and checks it on every fetch.
Major performance hit, but you get aggressive error detection.
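A bare-bones sketch of that bookkeeping (the names and layout are
mine; a real interpreter would bury this in its load/store
primitives rather than call functions):

program ShadowDemo;
{$mode objfpc}
uses
  SysUtils;

type
  TTrackedMem = record
    data: array of Byte;      { the "guest" memory }
    written: array of Byte;   { one shadow bit per data byte }
  end;

procedure InitMem(var m: TTrackedMem; bytes: SizeInt);
begin
  SetLength(m.data, bytes);
  SetLength(m.written, (bytes + 7) div 8);   { dynamic arrays start zeroed }
end;

procedure Store8(var m: TTrackedMem; ofs: SizeInt; v: Byte);
begin
  m.data[ofs] := v;
  m.written[ofs shr 3] := m.written[ofs shr 3] or (1 shl (ofs and 7));
end;

function Load8(var m: TTrackedMem; ofs: SizeInt): Byte;
begin
  if (m.written[ofs shr 3] and (1 shl (ofs and 7))) = 0 then
    raise Exception.CreateFmt('read of uninitialised byte at offset %d', [ofs]);
  Result := m.data[ofs];
end;

var
  m: TTrackedMem;
begin
  InitMem(m, 16);
  Store8(m, 0, 42);
  WriteLn(Load8(m, 0));   { fine: offset 0 has been written }
  WriteLn(Load8(m, 1));   { traps: offset 1 was never written }
end.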
I much prefer static analysis on a cost-benefit basis.
Post by Marco van de Voort
Post by Martin Brown
Post by Marco van de Voort
Of course it is a runtime solution though, so it suffers from the codepaths
never travelled problem.
That is the huge advantage of static analysis.
Yes. But poisoning locals is orders of magnitude simpler. Same for the other
I disagree. Most compilers now have a dataflow model internally but
fail to pass comment when variables appear to be on shaky ground. This
makes no sense at all when CPU cycles are so cheap and skilled human
ones so expensive. The tools should be watching out for errors, not
just compiling what they are given into wrong machine code.
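The textbook case is a local that is only set on some paths. The
dataflow needed to spot it is exactly what the optimiser already
computes, yet at best you get an easily ignored warning to the effect
that the variable "does not seem to be initialized", rather than a
hard failure:

program WarnDemo;

function Classify(x: Integer): Integer;
var
  code: Integer;
begin
  if x > 0 then
    code := 1
  else if x < 0 then
    code := -1;
  { x = 0 falls through with code unset; the compiler can see
    this path at compile time }
  Classify := code;
end;

begin
  WriteLn(Classify(0));
end.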
Post by Marco van de Voort
Post by Martin Brown
The compiler has to do most of the work anyway, so if in the process it
can spot any logically distinct path where a variable remains
This is still about Modula-2, isn't it? The module system might mean that
the complete program is never in memory in a compiler-controlled
representation (only some modules plus the definitions of the others).
IOW, this reduction of what the compiler sees is a feature at the global level.
OTOH, if the modules are proven to be internally consistent and always
deterministic on their stated inputs, then this is an advantage.
Post by Marco van de Voort
Static analysis is not perfect anyway, since the uncertainty of external
input propagates through the system, making more and more of the system
impossible to analyse.
It will thus only uncover a subset of problems.
All tools can only find a subset of problems. But it is madness that in
this day and age static analysis is not more widely used.
Post by Marco van de Voort
Post by Martin Brown
uninitialised, then it is far better to hard fail at that stage than to
leave a potential landmine waiting for someone to step on it.
Like always, the best is a combination of techniques, since all have
downsides.
Agreed.
--
Regards,
Martin Brown