To obtain a float result from the division of two ints, static_cast one of the operands to float, like so:
int a = 2;
int b = 3;
float c = static_cast<float>(a) / b; // c = 0.666666
float d = a / static_cast<float>(b); // d = 0.666666
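For contrast, with no cast at all the division is performed entirely in int and the fractional part is lost before the assignment; a minimal sketch (example values only):

#include <iostream>

int main() {
    int a = 2;
    int b = 3;
    float no_cast   = a / b;                      // integer division: 2 / 3 == 0, so no_cast is 0.0f
    float with_cast = static_cast<float>(a) / b;  // b is converted to float, result is about 0.666667
    std::cout << no_cast << ' ' << with_cast << '\n';
}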
With the cast, it shouldn't matter which operand is static_cast-ed. However, suppose one of the operands is a compile-time constant while the other is not, like so:
int a = foo();                // value not available at compile-time
const int b = some_constant;  // compile-time constant
Does compiler optimization make a difference between the two static_casts, as described below?
float c = static_cast<float>(a) / b;
In this case, the compiler can replace b with its value, but since a isn't known, the cast can only happen at runtime.
float d = a / static_cast<float>(b);
However, in this case, the compiler knows b, so it can do the casting at compile-time and replace b directly with its float value.
In both cases, after the casting, an integer/float (or float/integer) division happens at runtime.
Is this intuition correct, or can compilers be smart enough to optimize equally in both cases? Are there other factors I have overlooked?
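One way to check this intuition rather than guess is to put the two variants into separate functions and compare the generated assembly (for example with g++ -O2 -S, or on Compiler Explorer); the function names and the constexpr some_constant below are just for illustration:

constexpr int some_constant = 3;

// cast the operand that is only known at runtime
float cast_runtime_operand(int a) {
    return static_cast<float>(a) / some_constant;
}

// cast the compile-time constant
float cast_constant_operand(int a) {
    return a / static_cast<float>(some_constant);
}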
No int/float or float/int divisions happen at runtime. Ever.
Since one operand is being cast - explicitly converted - to float, the other is implicitly converted to float for the division.
Both cases are equivalent to:
static_cast<float>(a) / static_cast<float>(b);
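A minimal, self-contained way to see this (with placeholder values for a and b) is to check the type of each expression; by the usual arithmetic conversions both have type float, and both perform the same float division:

#include <type_traits>

int main() {
    int a = 2;
    const int b = 3;
    // decltype does not evaluate its operand; both expressions have type float
    static_assert(std::is_same_v<decltype(static_cast<float>(a) / b), float>, "int operand is promoted to float");
    static_assert(std::is_same_v<decltype(a / static_cast<float>(b)), float>, "int operand is promoted to float");
    return static_cast<float>(a) / b == a / static_cast<float>(b) ? 0 : 1;  // same float division either way
}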