Given a template T(uint N), int arguments for N are implicitly converted to uint wherever N is used within the template body. However, the instantiation of T!(n) precedes the coercion of n to uint, so the template instances T!(1) and T!(1u), and any types defined within them, are distinct. I believe this should not be the case.

Reproduced with DMD 1.046 under Linux and LDC r1522 under Solaris. Test case and compiler output below:

---
module Test;

struct Test(uint N)
{
    int x;

    Test!(N) f(Test!(N) rhs)
    {
        return Test!(N)(x+rhs.x);
    }
}

void main()
{
    Test!(1) x;
    x.f(x);
}
---

Test.d(16): Error: function Test.Test!(1).Test.f (Test!(1u)) does not match parameter types (Test!(1))
Test.d(16): Error: cannot implicitly convert expression (b) of type Test!(1) to Test!(1u)
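A possible workaround, a minimal sketch assuming the instantiation site can be changed, is to spell the template argument as an explicitly typed uint literal, so the outer instance is Test!(1u) from the start and matches the Test!(N) instances created inside the template body:

---
module Workaround;

struct Test(uint N)
{
    int x;

    Test!(N) f(Test!(N) rhs)
    {
        return Test!(N)(x+rhs.x);
    }
}

void main()
{
    // Explicitly typed uint literal: the outer instantiation is now
    // Test!(1u), which is the same instance as the Test!(N) used in
    // the parameter and return types of f, so the call compiles.
    Test!(1u) a;
    a.f(a);
}
---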
*** This issue has been marked as a duplicate of issue 3467 ***