So, lately I've been running into some problems because of the different integer types used in several shims inside miri. The most prominent one has been clock_gettime, because the size of time_t changes with the processor architecture, and miri crashes when running the tests in CI for 32-bit Linux.
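For future readers, a minimal self-contained illustration of that mismatch: on 64-bit Linux libc's time_t is typically 8 bytes, while on 32-bit Linux it is typically 4. The `TimeT` alias here is a hypothetical stand-in, not the real libc definition.

```rust
// Hypothetical stand-in for libc's time_t: its width follows the
// target architecture, which is why a value computed on a 64-bit
// host may not fit a 32-bit target.
#[cfg(target_pointer_width = "64")]
type TimeT = i64;
#[cfg(target_pointer_width = "32")]
type TimeT = i32;

fn main() {
    // On both widths the stand-in matches the pointer width.
    assert_eq!(std::mem::size_of::<TimeT>(), std::mem::size_of::<usize>());
    println!("time_t stand-in is {} bytes here", std::mem::size_of::<TimeT>());
}
```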
Is there any way to know what the target platform is when interpreting a program?
you can ask for the type from libc I think
basically like we resolve constants
what do you mean by asking for the type?
are you telling me that if I do ecx.resolve_path(&["libc", "long"])?.ty(tcx), that would give me the actual type?
oh sorry, I got lost in my tabs and forgot about this one
that's what we use in clippy
Ohhh ok, I'll check it out
What happens with Scalar::from_int(t, Size::from_bits(n)) when the size of the type of t is larger/smaller than n?
For future generations: it truncates t, removing the most significant bytes.
actually it should ICE if t is too big for the given size
But it does not
but we have an assert right here:
@Christian Poveda what exactly did you test? I have seen that ICE several times, and was glad for it each time as it caught critical bugs.
oh wait @Christian Poveda what exactly do you mean by "the size of the type of t"?
the type of t does not matter
is that the size of t given by its type, or just how many bits/bytes t uses?
it is always cast to i128
it will never truncate, but if you have an i64 that can be losslessly turned into an i32 you are good
yeah that's the confusion
here we think of t as an abstract integer, not a machine integer with some bitwidth
yes got it
and we check if that abstract integer fits into the given size
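For future readers, a minimal sketch of that fit check, treating t as an abstract i128 and asking whether it fits losslessly in a signed integer of the given bit width. `fits_in_bits` is a hypothetical helper, not miri's actual code.

```rust
// Hypothetical helper mirroring the check discussed above: does the
// abstract integer `t` fit losslessly in `size_bits` signed bits?
fn fits_in_bits(t: i128, size_bits: u32) -> bool {
    assert!((1..=128).contains(&size_bits));
    if size_bits == 128 {
        return true; // an i128 always fits in itself
    }
    // Range of an n-bit two's-complement signed integer.
    let min = -(1i128 << (size_bits - 1));
    let max = (1i128 << (size_bits - 1)) - 1;
    (min..=max).contains(&t)
}

fn main() {
    assert!(fits_in_bits(42, 32));
    assert!(fits_in_bits(i128::from(i32::MAX), 32));
    // Too big for 32 bits: this is the case the assert/ICE catches.
    assert!(!fits_in_bits(i128::from(i32::MAX) + 1, 32));
    println!("ok");
}
```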
it seems like a code smell though if you have a "too big" type there
as it might mean that ICE is reachable, which it never should be
yes, so for the particular case I'm using it in, that would mean this might blow up if the host platform returns integers that are too large for the target platform
so do we want an ICE happening there? because the other option would be to let those integers overflow, which might produce undesired results without erroring anywhere
throw_unsup sounds like the best reaction for that case
Ok then, I added the same check that Scalar::from_int has, but it errors instead.
I wanted to get the target type's max and min values and compare, but I think it's not worth the trouble
In particular because that would need to cast the values to i128 again to compare in the first place
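For completeness, a sketch of the min/max comparison using std's TryFrom, which encodes the bounds check without manual constants. `to_target_time_t` is hypothetical and assumes a 32-bit target time_t.

```rust
// Hypothetical sketch: convert a host i64 seconds value to a 32-bit
// target time_t, erroring instead of silently overflowing.
// i32::try_from performs the min/max bounds check internally.
fn to_target_time_t(secs: i64) -> Result<i32, &'static str> {
    i32::try_from(secs).map_err(|_| "value does not fit in the target's time_t")
}

fn main() {
    assert_eq!(to_target_time_t(1_000_000), Ok(1_000_000));
    // Just past i32::MAX: the conversion reports an error.
    assert!(to_target_time_t(i64::from(i32::MAX) + 1).is_err());
    println!("ok");
}
```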