Type inference, don’t get bitten by var

I've been playing with some of the new features in C#, one of which is the var keyword.  This is not dynamic typing, it is just type inference: the compiler assigns the type based on the value you initialise the variable with when you declare it.  That also means you have to initialise the variable with an actual value when using var.
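To illustrate (a quick sketch of my own, not part of the sample further down): the compiler fixes the type at the point of declaration, and you cannot leave the initialiser out.

    var number = 5;          // inferred as int (Int32)
    var text = "hello";      // inferred as string
    var pi = 3.14;           // inferred as double

    // var nothing;          // won't compile: an implicitly typed variable must be initialised
    // number = "oops";      // won't compile either: number is still strongly typed as int

Once the type has been inferred, the variable behaves exactly as if you had declared it explicitly.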

After playing a bit, I discovered this little problem.  It is probably just an incorrect assumption on my side, but I'm sure I'm not the only one who will make it.  Have a look at this code sample:

            var value1 = 5;
            var value2 = 42;
            var result = value2 / value1;

            Console.WriteLine("The value of result is " + result);
            Console.WriteLine("value1 is of type " + value1.GetType().Name);
            Console.WriteLine("value2 is of type " + value2.GetType().Name);
            Console.WriteLine("result is of type " + result.GetType().Name);

value1 and value2 are both initialised with int/Int32 values, so both are inferred as Int32 when queried.  Now, in my mind, the result variable should be a double; after all, 42 divided by 5 gives you 8.4.  But because you're dividing an int by an int, result is inferred as an int as well, and the answer is truncated to 8.
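For reference, running that snippet prints something along these lines (Int32 being the framework name for int):

    The value of result is 8
    value1 is of type Int32
    value2 is of type Int32
    result is of type Int32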

To make sure you get the correct answer, you either have to initialise value2 with a double value, as in

var value2 = 42.0;

or you have to cast one of the operands to a double when you compute result, as in

double result = (double)value2 / value1;

Either way you get a value of 8.4, and result comes out as a double.  Note that just declaring result as a double while leaving both operands as ints is not enough; the division is still done in integer arithmetic, and only the truncated result gets converted.
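Putting the first option together, the same sample with value2 initialised from a double literal would come out like this (just a sketch of the adjusted code):

    var value1 = 5;
    var value2 = 42.0;               // double literal, so value2 is inferred as double
    var result = value2 / value1;    // double / int is performed as double division

    Console.WriteLine("The value of result is " + result);            // 8.4
    Console.WriteLine("result is of type " + result.GetType().Name);  // Double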

So, being lazy could bite you if you're not careful.

3 thoughts on “Type inference, don’t get bitten by var”

  1. The problem is that you are doing integer division and the result of integer division is an integer. The problem is not with the keyword.

    You will get the same problem if you do the following:
    int x = 5;
    int y = 42;
    double result = y / x;

    var x = 5;
    var y = 42;
    double result = y / x;

    The way to solve this is to cast one of the variables to a double
    int x = 5;
    int y = 42;
    var result = y / (double)x;

    The result of an int/double division is a double.

    I hope this helps 🙂

  2. @Pieter, I figured that one out easily; I'm complaining that it's not natural. When you do the division in your head, you naturally know that the result will not be an integer…

  3. Quoting "SAMS- ASP.NET 3.5 Unleashed" (page 906): in many circumstances when using LINQ to SQL, you won’t actually know the name of the type of a variable, so you have to let the compiler infer the type.
    I’m just surprised that a strongly typed language such as C# starts using these "lazy" features.
