Playground code:
import simd
let test = int4(1,2,3,4) // this works
let x = 1
let test2 = int4(x,2,3,4) // doesn't work (nor does let x: Int = 1)
let y: Int32 = 1
let test3 = int4(y,2,3,4) // works
It's clear that int4 expects Int32 values. In the first case the compiler seems able to figure that out without the type being specified explicitly, but in the second case (when the integer is first stored in a separate variable) it doesn't.
Is this expected behaviour in Swift?
At first glance, this looks like expected behavior.
When you specify the integer literals inline, they are inferred as Int32 because the initializer's parameters supply that contextual type. When you write let x = 1 instead, x defaults to the type Int. As a safety measure, Swift doesn't perform implicit conversion between numeric types (integers and floating-point values alike), so you need to explicitly convert x to an Int32 to get this to work:
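let x = 1
let test2 = int4(Int32(x), 2, 3, 4) // Int32(x) converts the Int explicitly, so the call compiles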
The same explicit typing is why your third example works: rather than relying on type inference to decide what y is, you give it the Int32 type up front, and the types align.
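You can see the same literal-defaulting rule without simd at all; here is a minimal sketch using only the standard library (the names a, b, and c are just for illustration):

let a: Int32 = 42   // literal inferred as Int32 from the annotation
let b = 42          // no contextual type, so the literal defaults to Int
let c = Int32(b)    // explicit conversion is required to go from Int to Int32
// let d: Int32 = b // error: cannot convert value of type 'Int' to specified type 'Int32'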