I've been playing with some of the new features in C#, one of which is the var keyword. This is not dynamic typing, it is just type inference: the compiler assigns the type based on the value you initialise the variable with when declaring it. That also means you have to initialise it with an actual value when using var.
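To illustrate (a quick sketch of my own, not from the sample below): each of these variables is fully statically typed, the compiler just works the type out from the initialiser.

```csharp
var text = "hello";   // inferred as string
var count = 7;        // inferred as int
var price = 9.99;     // inferred as double

// var later;         // won't compile: an implicitly-typed variable must be initialised
```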
After playing a bit, I discovered this little problem, which is probably just an incorrect assumption on my side, but I'm sure I'm not the only one who will make it. Look at this code sample:
var value1 = 5;
var value2 = 42;
var result = value2 / value1;
Console.WriteLine("The value of result is " + result);
Console.WriteLine("value1 is of type " + value1.GetType().Name);
Console.WriteLine("value2 is of type " + value2.GetType().Name);
Console.WriteLine("result is of type " + result.GetType().Name);
value1 is initialised with an int/Int32 value, and so is value2, so both are inferred as Int32 when queried. The result variable, in my mind, should be a double; after all, 42 divided by 5 gives you 8.4. But because you're dividing an int by an int, integer division is performed: result is inferred as an int, and the fractional part is discarded, leaving 8.
To make sure that you get the correct answer, you either have to initialise value2 with a double value as in
var value2 = 42.0;
or, you have to cast one of the operands to double, as in
var result = (double)value2 / value1;
(Note that simply declaring result as double, as in double result = value2 / value1;, does not help: the division is still done on two ints, and you get 8.0.) Either of the two fixes will give you a value of 8.4, and a type for result of Double.
So, being lazy could bite you if you're not careful.
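For completeness, here is the whole experiment as a small console program you can run; the variable names mirror the snippet above.

```csharp
using System;

class VarDivision
{
    static void Main()
    {
        var value1 = 5;
        var value2 = 42;

        var intResult = value2 / value1;            // int / int: truncated to 8
        var castResult = (double)value2 / value1;   // cast an operand: 8.4
        var doubleInit = 42.0 / value1;             // double literal: 8.4

        Console.WriteLine("intResult = " + intResult);
        Console.WriteLine("castResult = " + castResult);
        Console.WriteLine("intResult is of type " + intResult.GetType().Name);   // Int32
        Console.WriteLine("castResult is of type " + castResult.GetType().Name); // Double
        Console.WriteLine("doubleInit is of type " + doubleInit.GetType().Name); // Double
    }
}
```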