Swift is a type-safe language, but it also supports type inference.
This means that when you don't explicitly specify the type of a variable, the compiler infers one from context. Sometimes the inferred type isn't the one you wanted — not because the compiler is wrong, but because it doesn't have enough context to determine what you really meant.
In the example:

```swift
print(1 + 3.0)
```

the compiler assumes that `1` is an `Int`. But then it encounters `3.0`, which is a `Double` rather than an `Int`. So it backtracks, assigns the type `Double` to `1`, and your `print` statement prints the result of adding two `Double`s.
Here:

```swift
let x = 1
let z = 3.0 + x
print(z)
```

the compiler decides that `x` is an `Int` and then moves on to the next statement. But then it sees `z = 3.0 + x`, where `3.0` is a `Double` literal, and realizes you are trying to add a `Double` and an `Int`, which you cannot do, so it chokes. The compiler cannot figure out that `x` should be a `Double` because it has already decided `x` is an `Int` and continued processing your code.
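To make that second example compile, you have to supply the context yourself, either by annotating `x` as a `Double` up front or by converting it explicitly at the point of the mixed arithmetic. A minimal sketch of both options:

```swift
// Option 1: annotate x as Double, so the integer literal 1
// is inferred as a Double from the start.
let x: Double = 1
let z = 3.0 + x
print(z)  // 4.0

// Option 2: keep x as an Int and convert explicitly
// where the mixed-type arithmetic happens.
let y = 1
let w = 3.0 + Double(y)
print(w)  // 4.0
```

Either way, by the time the compiler reaches the addition, both operands are already `Double`s, so there is nothing left to infer.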
With:

```swift
let x = 3.0
let z = x + 1
print(z)
```

`x` is declared as a `Double` (because you assigned the literal `3.0`). `z` then becomes `Double` plus `Int`, except that, as in the first example, the compiler is able to figure out that `1` should be a `Double` instead of an `Int`, so `z` is assigned the result of adding two `Double`s.
The difference between the second and third examples is that the compiler has contextual clues that `x` should be a `Double` in the latter but lacks those clues in the former.
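If you want to see what the compiler actually inferred in each case, `type(of:)` reports the resolved type; a quick check:

```swift
// The literal 1 on its own defaults to Int.
let a = 1
print(type(of: a))  // Int

// In a mixed expression, both literals are inferred as Double.
let b = 1 + 3.0
print(type(of: b))  // Double
```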