import std.stdio;

class Class {
    void set(int a, double b, void delegate(int revents) callback) {
        writefln("(callback !is null) %s", (callback !is null));
    }
}

void main() {
    auto c = new Class;
    c.set(1, 1, null); // prints: (callback !is null) true
    c.set(1, 1, null); // prints: (callback !is null) false
}
-----
Very strange. Whether it happens depends on the number and types of the preceding arguments. For example, if a and b both have type int, it works correctly.

DMD64 D Compiler v2.066.1
Linux desktop 3.13.0-45-generic #74-Ubuntu SMP Tue Jan 13 19:36:28 UTC 2015 x86_64 x86_64 x86_64 GNU/Linux
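For comparison, here is a minimal sketch of the int/int variant mentioned above, which (per the report) behaves correctly; the only change from the failing case is the type of b:

import std.stdio;

class Class {
    // Same as the failing case, but b is int instead of double.
    void set(int a, int b, void delegate(int revents) callback) {
        writefln("(callback !is null) %s", (callback !is null));
    }
}

void main() {
    auto c = new Class;
    c.set(1, 1, null); // reported to print: (callback !is null) false
    c.set(1, 1, null); // reported to print: (callback !is null) false
}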
Starting program: /work/research/test
[Thread debugging using libthread_db enabled]
Using host libthread_db library "/lib/x86_64-linux-gnu/libthread_db.so.1".

Breakpoint 1, test.Class.set() (this=0x7ffff7ed8ff0, callback=..., b=1, a=1) at /work/research/test.d:7
7           writefln("(callback !is null) %s", (callback !is null));
(gdb) print callback
$1 = {ctxptr = 0x0, funcptr = 0x3ff0000000000000}
(gdb)
-------
This is the debug output I captured. Note that funcptr holds 0x3ff0000000000000, which is the IEEE-754 bit pattern of the double 1.0 (the value passed as b), suggesting the delegate's register was left holding a stale argument value.
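The same two words can be inspected from D itself, without a debugger, via the delegate's built-in .ptr (context pointer) and .funcptr properties. A minimal sketch, reusing the set signature from the report:

import std.stdio;

class Class {
    void set(int a, double b, void delegate(int revents) callback) {
        // .ptr is the context pointer, .funcptr the function pointer;
        // a genuinely null delegate should show both as null.
        writefln("ptr=%s funcptr=%s", callback.ptr, cast(void*) callback.funcptr);
    }
}

void main() {
    auto c = new Class;
    c.set(1, 1, null); // on the affected compiler, funcptr shows garbage here
    c.set(1, 1, null);
}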
This seems to be a heisenbug in the backend: it disappears when optimizations are enabled. Looking at the assembly, the compiler doesn't clear RDX for the first call to test.Class.set but does for the second, leaving garbage in RDX on the first call. The bug still occurs with a recent git version.
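Given that the bug disappears with optimizations enabled, building the affected module with -O may work as a stopgap on affected compiler versions (untested beyond the observation above):

dmd -O -run test.d

This is only a workaround, of course; the underlying codegen issue in the backend still needs fixing.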
I can't reproduce this. I suspect it was a duplicate of one of the many 64-bit ABI bugs fixed in the last couple of years.