Fix Int/BigInt pivot
We've been wrongly representing large ints as BigInt, causing them to behave differently in the VM through builtins like 'serialise_data'. Indeed, we expect anything that fits in 8 bytes to be encoded as major type 0 or 1. But we were switching to the tagged major type 6 encoding (PosBigInt, NegBigInt) for much smaller values: anything outside the i64 range [-2^63, 2^63 - 1] was treated as a big int (positive or negative).

Why? Because we checked whether a value would fit in an i64 and, if it didn't, treated it as a big int. But major types 0 and 1 actually cover the wider range [-2^64, 2^64 - 1]. Fortunately, Rust has i128, and the minicbor library implements TryFrom, which enforces exactly that range, so the fix is straightforward.
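Sketch of the new pivot (illustrative Rust only, not the actual crate code: the Pivot enum and pivot function are made-up names, and it assumes num-bigint and minicbor as dependencies):

use minicbor::data::Int;
use num_bigint::BigInt;
use num_traits::ToPrimitive;

// Hypothetical helper type, for illustration only.
enum Pivot {
    Small(Int),  // serialised inline as CBOR major type 0 or 1
    Big(BigInt), // serialised as a tagged big int (major type 6)
}

fn pivot(i: &BigInt) -> Pivot {
    // Go through i128 and let minicbor's `Int::try_from` enforce the
    // CBOR bound of [-2^64, 2^64 - 1], instead of cutting off at i64.
    match i.to_i128() {
        Some(n) => match Int::try_from(n) {
            Ok(small) => Pivot::Small(small),
            Err(_) => Pivot::Big(i.clone()),
        },
        None => Pivot::Big(i.clone()),
    }
}

With this check, 0xdeadbeefdeadbeef stays on the small path and serialises as major type 0 with an 8-byte argument (prefix 0x1b), and its negation as major type 1 (prefix 0x3b, which encodes -1 - n, hence the trailing ...beee), which is what the acceptance test below asserts.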
examples/acceptance_tests/094/aiken.lock (new file)
@@ -0,0 +1,7 @@
# This file was generated by Aiken
# You typically do not need to edit this file

requirements = []
packages = []

[etags]
examples/acceptance_tests/094/aiken.toml (new file)
@@ -0,0 +1,3 @@
name = "aiken-lang/acceptance_test_094"
version = "0.0.0"
description = ""
examples/acceptance_tests/094/lib/foo.ak (new file)
@@ -0,0 +1,9 @@
use aiken/builtin

test u32_boundary_down() {
  builtin.serialise_data(0xdeadbeefdeadbeef) == #"1bdeadbeefdeadbeef"
}

test u32_boundary_up() {
  builtin.serialise_data(-0xdeadbeefdeadbeef) == #"3bdeadbeefdeadbeee"
}