import std.stdio;

string s(string t1, string t2)
{
    return "writeln(\"" ~ t1 ~ " " ~ t2 ~ " \" ~ typeof(true ? cast(" ~ t1 ~ ") 1 : cast(" ~ t2 ~ ") 1).stringof);\n";
}

void main()
{
    mixin(s("ifloat", "double"));
}

The result of the above code doesn't make sense to me. If something like

    auto a = true ? 1i : 1;

is allowed, it seems the result type should be complex, not imaginary and not real. I'm also curious why combining a char with a wchar (or some such) results in a uint.
char and wchar both implicitly convert to int, so that is expected behavior. The common type of float and ifloat (and the other real/imaginary pairs) should of course be the corresponding complex type, e.g. cfloat. https://github.com/D-Programming-Language/dmd/pull/198
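A minimal sketch of the char/wchar promotion described above, matching the uint result the reporter observed (this assumes a standard D2 compiler; the imaginary-type half of the report is not exercised here since those types are being removed):

```d
import std.stdio;

void main()
{
    char  c = 'a';
    wchar w = 'b';

    // Both operands undergo integral promotion before the common
    // type of the ?: expression is determined, so mixing a char
    // with a wchar yields uint rather than a character type.
    static assert(is(typeof(true ? c : w) == uint));

    writeln(typeof(true ? c : w).stringof);
}
```

This is why the "char combined with wchar gives uint" observation in the report is expected behavior rather than a bug.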
Imaginary and complex numbers are going away.
Closing as per Don's comment on Daniel's pull request ( https://github.com/dlang/dmd/pull/198#issuecomment-1492958 ), and because the built-in complex number types are going away ( http://dlang.org/deprecate#Imaginary%20and%20complex%20types ).