Compare commits

...

227 Commits

Author SHA1 Message Date
waalge 15a43d1a87 fix rust version 2023-09-27 22:25:55 +00:00
waalge cf7020c528 shortRev 2023-09-27 21:27:55 +00:00
waalge 873b39169e rev 2023-09-27 21:03:47 +00:00
waalge c925d09792 try patch 2023-09-27 20:54:04 +00:00
waalge a2359aeaa5 insert into rust 2023-09-27 20:32:25 +00:00
waalge 5f1d437b29 add rev 2023-09-27 19:13:21 +00:00
microproofs 1bcc9e8524 fix: expect on tuples from data now checks for no more items after the last 2023-09-26 12:49:50 -04:00
microproofs 8e75007a5f changelog: remove accidental duplicated Fixed 2023-09-25 21:16:19 -04:00
microproofs 38d15c677f Update changelog 2023-09-25 21:16:19 -04:00
microproofs 1ca3499128 chore: rename type 2023-09-25 21:16:19 -04:00
microproofs eb0b4dd6d8 update lock files 2023-09-25 21:16:19 -04:00
microproofs b8737a1021 add one more test for unbound generics 2023-09-25 21:16:19 -04:00
microproofs 534eb62a07 fix: There was a stack overflow due to passing unbound types to a function 2023-09-25 21:16:19 -04:00
microproofs 1cab479b81 fix: dependency hoisting for cyclic functions
Add more tests
2023-09-25 21:16:19 -04:00
microproofs 2f80d07132 fix: minor stuff including fixing the var name used in modify_cyclic_calls
and carefully controlling the functions we add to sorted dependencies
2023-09-25 21:16:19 -04:00
microproofs f4310bcf33 feat: finished up mutual recursion
Now we "handle" vars that call the cyclic function.
That includes vars in the cyclic function as well as in other functions
"handle" meaning we modify the var to be a call that takes in more arguments.
2023-09-25 21:16:19 -04:00
microproofs ae3053522e feat: Update cyclic functions to be aware of being in a cycle.
Finish the creation of cyclic functions
The last part is to update vars that call into a function in the cycle
2023-09-25 21:16:19 -04:00
microproofs 794fc93084 remove unused structs 2023-09-25 21:16:19 -04:00
microproofs 0b38855ce4 add new enum for hoistablefunctions 2023-09-25 21:16:19 -04:00
microproofs ced818c455 checkpoint commit 2023-09-25 21:16:19 -04:00
microproofs 0fb9837ddf chore: change UserFunction to HoistableFunction to prepare for mutual recursion 2023-09-25 21:16:19 -04:00
KtorZ 984237075a Add new acceptance test scenario: 066
Mutual recursion.
2023-09-25 21:16:19 -04:00
microproofs 74b8ab62b2 chore: add comments 2023-09-25 21:16:19 -04:00
microproofs a4aa51ed2d WIP: first part of mutual recursion is done.
This involves creating the function definition and detecting cycles.
The remaining part is to "fix" the call sites
of the mutually recursive functions
2023-09-25 21:16:19 -04:00
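Detecting which functions form a cycle is a strongly-connected-components problem over the call graph. A minimal sketch of the idea using petgraph (which this change set adds to aiken-lang's dependencies, per the Cargo.toml diff below) — illustrative only, not the compiler's actual implementation:

```rust
use petgraph::{algo::tarjan_scc, graph::DiGraph};

fn main() {
    // Toy call graph: is_even -> is_odd -> is_even form a cycle; main -> is_even.
    let mut calls = DiGraph::<&str, ()>::new();
    let even = calls.add_node("is_even");
    let odd = calls.add_node("is_odd");
    let entry = calls.add_node("main");
    calls.extend_with_edges([(even, odd), (odd, even), (entry, even)]);

    // A strongly connected component with more than one node is a group of
    // mutually recursive functions (a single node with a self-edge would be
    // plain recursion).
    for scc in tarjan_scc(&calls) {
        if scc.len() > 1 {
            let group: Vec<_> = scc.iter().map(|&ix| calls[ix]).collect();
            println!("mutually recursive group: {group:?}");
        }
    }
}
```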
Chris Gianelloni ecc5769c64 fix: restore static binary builds
Signed-off-by: Chris Gianelloni <cgianelloni@applause.com>
2023-09-20 16:25:45 -04:00
microproofs 5b018b7c07 test: add a test around a tuple of constructors when with many conditions 2023-09-20 16:20:42 -04:00
rvcas 4ca8681ca0 chore: commit example lock files 2023-09-20 13:26:49 -04:00
rvcas 1ecdf38842 fix: release 2023-09-20 13:03:38 -04:00
KtorZ ee4001d2c8 chore: Release 2023-09-20 18:03:46 +02:00
KtorZ 91d4cb9b12 Fix a date in the CHANGELOG for 0.17.0 2023-09-20 18:02:39 +02:00
microproofs 4650c64f6b update changelog 2023-09-20 11:51:01 -04:00
KtorZ f379039efc Fix record shorthand causing parsing ambiguity in if/else expressions.
Fixes #735.
2023-09-15 09:41:00 +02:00
rvcas 1dea348a2e chore: rust rover error 2023-09-13 21:29:05 -04:00
rvcas 7b915b7dcf chore: allow clippy::arc_with_non_send_sync in tests 2023-09-13 19:07:45 -04:00
rvcas d808197507 chore: clippy fix 2023-09-13 18:17:59 -04:00
rvcas bc0824f4eb chore: new aiken.lock files for examples 2023-09-13 18:17:40 -04:00
rvcas 9a4f181a0f chore: clippy fix 2023-09-13 17:19:31 -04:00
KtorZ 06347c3efa Add CHANGELOG entry. 2023-09-13 17:17:32 -04:00
KtorZ c711a97e69 Throttle calls to package registry for version resolution
The 'HEAD' call that is done to resolve package revisions from
  unpinned versions is already quite cheap, but it would still be better
  to avoid overloading Github with such calls; especially for users of a
  language-server that would compile on-the-fly very often. Upstream
  packages don't change often so there's no need to constantly check the
  etag.

  So we now keep a local version of etags that we fetched, as well as a
  timestamp from the last time we fetched them so that we only re-fetch
  them if more than an hour has elapsed. This should be fairly resilient
  while still massively improving the UX for people showing up after a
  day and trying to use latest 'main' features.

  This means that we now effectively have two caching levels:

  - In the manifest, we store previously fetched etags.
  - In the filesystem, we have a cache of already downloaded zip archives.

  The first cache is basically invalidated every hour, while the second
  cache is only invalidated when an etag changes. For pinned versions,
  nothing is invalidated as they are considered immutable.
2023-09-13 17:17:32 -04:00
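The hour-based throttle is simple to picture: the manifest keeps, per package, the last ETag seen and when it was fetched, and a new HEAD request is only issued once that timestamp is stale. A minimal sketch under assumed names (CachedEtag and its fields are illustrative, not the real manifest layout):

```rust
use std::time::{Duration, SystemTime};

const STALE_AFTER: Duration = Duration::from_secs(60 * 60); // one hour

// Illustrative manifest entry; names are assumptions, not the real layout.
struct CachedEtag {
    etag: String,
    fetched_at: SystemTime,
}

impl CachedEtag {
    /// Returns the cached etag while it is still fresh; `None` means the
    /// caller should issue a new HEAD request and record the result.
    fn if_fresh(&self) -> Option<&str> {
        match self.fetched_at.elapsed() {
            Ok(age) if age < STALE_AFTER => Some(&self.etag),
            _ => None,
        }
    }
}
```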
KtorZ 15efeb3069 Remove unused code & data-type 'UseManifest'
If it's unused, it shall be gone. It obfuscates what functions are
  doing and requires managing extra complexity for no reason.
2023-09-13 17:17:32 -04:00
KtorZ 5381762e50 Rework logs around dependency fetching. 2023-09-13 17:17:32 -04:00
KtorZ 76ff09ba0e Ensure that version resolution works offline
And so, even for unpinned packages. In this case, we can't do a HEAD request. So we fall back to looking at what's available in the cache and use the most recently downloaded version. This is only a best effort, as the most recently downloaded one may not be the actual latest. But come on: this is a case where (a) someone didn't pin any version, and (b) is trying to build in an offline setup. We could possibly make that edge case better but let's see if anyone ever complains about it first.
2023-09-13 17:17:32 -04:00
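The offline fallback amounts to scanning the cache for already-downloaded archives and taking the newest one by modification time. A rough sketch — the flat directory layout here is an assumption for illustration, not the real cache structure:

```rust
use std::{
    fs, io,
    path::{Path, PathBuf},
    time::SystemTime,
};

/// Best-effort offline fallback: among archives already sitting in the
/// cache directory, pick the most recently downloaded one.
fn newest_cached(cache_dir: &Path) -> io::Result<Option<PathBuf>> {
    let mut newest: Option<(SystemTime, PathBuf)> = None;
    for entry in fs::read_dir(cache_dir)? {
        let entry = entry?;
        let modified = entry.metadata()?.modified()?;
        if newest.as_ref().map_or(true, |(t, _)| modified > *t) {
            newest = Some((modified, entry.path()));
        }
    }
    Ok(newest.map(|(_, path)| path))
}
```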
KtorZ 87087a1811 Always check package status when version is not pinned
When the version isn't a git sha or a tag, we always check that we got
  the last version of a particular dependency before building. This is
  to avoid those awkward moments where someone tries to use something from
  the stdlib that is brand new, and despite using 'main' they get a
  strange build failure regarding how it's not available.

  An important note is that we don't actually re-download the package
  when the case occurs; we merely check an HTTP ETag from a (cheap) 'HEAD'
  request on the package registry. If the tag hasn't changed then that
  means the local version is correct.

  The behavior is completely bypassed if the version is specified using
  a git sha or a tag, as here, we can assume that fetching it once is
  enough (and that it can't change). If a package maintainer force-pushed
  a tag however, there may be a discrepancy and the only way around that
  is to `rm -r ./build`.
2023-09-13 17:17:32 -04:00
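The decision boils down to comparing the ETag stored in the manifest against the one returned by the HEAD request. A tiny sketch of that check (function and parameter names are illustrative):

```rust
/// Sketch of the freshness check described above; `stored` is the ETag kept
/// in the manifest, `head` the one returned by the registry's HEAD response.
fn needs_refetch(stored: Option<&str>, head: &str) -> bool {
    match stored {
        // Same ETag on the registry: the local copy is already current.
        Some(etag) => etag != head,
        // Nothing recorded yet: download it.
        None => true,
    }
}
```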
KtorZ 3c3a7f2423 Define 'is_git_sha_or_version' to assert whether a version is 'immutable'
Best-effort to assert whether a version refers to a git sha digest or a tag. When it does, we
avoid re-downloading it if it's already fetched. But when it doesn't, and thus refers to a branch,
we always re-download it. Note however that the download might be short-circuited by the
system-wide package cache, so a download doesn't actually mean a network request.

The package cache is however smart enough to assert whether a package in the cache must be
re-downloaded (using the HTTP ETag). So this is mostly about delegating the re-downloading logic to
the global packages cache.
2023-09-13 17:17:32 -04:00
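A hypothetical reconstruction of such a heuristic (not the actual implementation): treat 40 hex characters as a sha, and a dotted-numeric prefix (optionally 'v'-prefixed, possibly with a suffix like `-alpha`) as a tag; anything else is assumed to be a movable branch name like `main`:

```rust
fn is_git_sha_or_version(version: &str) -> bool {
    // A full git sha is 40 hexadecimal characters.
    let is_sha = version.len() == 40 && version.chars().all(|c| c.is_ascii_hexdigit());

    // A tag like "v1.0.16-alpha" or "1.0.16": dotted numbers before any '-'.
    let tag = version.trim_start_matches('v');
    let numeric = tag.split('-').next().unwrap_or(tag);
    let is_version = !numeric.is_empty()
        && numeric
            .split('.')
            .all(|part| !part.is_empty() && part.chars().all(|c| c.is_ascii_digit()));

    is_sha || is_version
}
```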
KtorZ 65fb3a640a Remove dead-code. 2023-09-13 17:17:32 -04:00
KtorZ a72628a4dc Auto-derive 'Debug' trait instance for types in deps
Actually useful to debug / troubleshoot things.
2023-09-13 17:17:32 -04:00
microproofs a45001376d fix: is_record was used incorrectly in code gen;
the real solution was to look up the datatype and check the constructors' length
2023-09-13 00:33:02 -04:00
microproofs d042d55d42 fix clippy warnings in code gen 2023-09-12 21:25:05 -04:00
KtorZ 9782c094b7 Fix clippy suggestions. 2023-09-08 16:21:07 +02:00
KtorZ 8ba5946c32 Preserve escape sequences after formatting
Bumped into this randomly. We do correctly parse escape sequences, but
  the formatter would simply put the unescaped string back on save. Now it
  properly re-escapes strings before flushing them back. I also removed
  the escape sequences for 'backspace' and 'new page' (form feed) as I
  don't see any use case for those in an Aiken program really...
2023-09-08 12:12:15 +02:00
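The fix boils down to an escape pass before printing, turning real control characters back into the sequences a user would have written; the format.rs diff at the bottom of this page shows the formatter calling such an `escape` helper. A minimal sketch of the idea (the exact set of sequences handled may differ):

```rust
/// Turn control characters back into source-level escape sequences so a
/// formatted string round-trips instead of being flushed unescaped.
fn escape(s: &str) -> String {
    s.chars()
        .flat_map(|c| match c {
            '\n' => vec!['\\', 'n'],
            '\r' => vec!['\\', 'r'],
            '\t' => vec!['\\', 't'],
            '"' => vec!['\\', '"'],
            '\\' => vec!['\\', '\\'],
            c => vec![c],
        })
        .collect()
}
```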
KtorZ 5cfc3de7bf Add CODEOWNERS 2023-09-08 10:21:33 +02:00
rvcas 6b70292dfb chore: cargo fmt 2023-09-06 21:10:50 -04:00
rvcas 1de7b2866a feat(cli): add --deny to build, check, and docs
This is useful for CI, where people who have
a stricter workflow may want to force CI to fail if any warnings
are detected.
2023-09-06 17:19:44 -04:00
microproofs 819a0a20e6 add tests for case and constr
Fix a minor issue with decoding order
2023-09-03 11:52:49 -04:00
microproofs c9b01ab365 chore: fill in cost model
test: Add case and constr eval tests
2023-09-03 11:52:49 -04:00
microproofs 85901dc141 chore: update cost model with placeholders for new terms to pass tests 2023-09-03 11:52:49 -04:00
microproofs 40e1d39f8b Add placeholders for cost model 2023-09-03 11:52:49 -04:00
microproofs 33d6d3049e add compute for the new terms constr and case 2023-09-03 11:52:49 -04:00
microproofs e566c4e1de feat(uplc): add Case and Constr terms
- parsing
- interning
- flat encoding and decoding
- pretty printing
- debruijn conversion

Co-authored-by: Lucas Rosa <x@rvcas.dev>
2023-09-03 11:52:49 -04:00
rvcas dfe433ea46 fix: trim whitespace when loading hex strings from files closes #720 2023-08-31 18:22:09 -04:00
rvcas 097d1fa893 chore: update changelog 2023-08-31 18:01:52 -04:00
rvcas 437a95bfe8 fix: behave like rust with hyphens closes #722 closes #690 2023-08-31 18:00:21 -04:00
rvcas a87a8a7b35 chore: update changelog 2023-08-31 17:41:36 -04:00
rvcas dca09811c1 fix: empty records crashing code gen closes #728 2023-08-31 17:39:38 -04:00
rvcas fb967d4c7b fix: uplc formatter of Data closes #716 2023-08-31 17:20:48 -04:00
microproofs 51c44c6a30 fix: add an assert for better error messages when doing empty types 2023-08-30 13:50:37 -04:00
waalge cd3a02416f chore: rm unused pub function 2023-08-29 22:30:06 -04:00
waalge 756e16c14b fix: rename assert to expect 2023-08-29 22:30:06 -04:00
microproofs baa6917af5 Fix: Change type map length assert to check for greater than or equal to the argument length instead of equal 2023-08-29 21:59:15 -04:00
Matthias Benkort d01766d735 Merge pull request #721 from waalge/waalge/rm-mut
rm unnecessary mut
2023-08-29 21:00:05 +02:00
Chris Gianelloni 67986d9416 chore: build static binaries for Linux/Windows
Signed-off-by: Chris Gianelloni <cgianelloni@applause.com>
2023-08-29 00:46:50 -04:00
waalge d4b9f22ac3 rm unnecessary mut 2023-08-26 16:30:44 +00:00
rvcas 1715496d5b chore: update resolver in virtual workspace 2023-08-24 15:51:39 -06:00
rvcas 0e7f1597bf chore: add release instructions in contributing.md 2023-08-24 15:43:26 -06:00
rvcas b075d85b40 chore: Release 2023-08-24 15:05:12 -06:00
rvcas b3494a7f63 chore: fix versions 2023-08-24 15:04:52 -06:00
rvcas a7062ccb88 chore: fix versions 2023-08-24 15:04:19 -06:00
rvcas 747e057d05 fix: tags 2023-08-24 15:00:09 -06:00
KtorZ 379368c530 Fix clippy. 2023-08-22 13:30:30 +02:00
KtorZ 2f0211a7b1 Bump all versions manually because cargo workspaces didn't work. 2023-08-22 13:27:10 +02:00
KtorZ 780a61e3e8 Release 1.0.16-alpha
aiken@1.0.16-alpha

Generated by cargo-workspaces
2023-08-22 13:18:48 +02:00
KtorZ d3fe241ccd Wrap-up CHANGELOG 2023-08-22 13:14:35 +02:00
KtorZ 7883aff5f7 revert 619b73d03e
There's really no scenario where we want to generate boilerplate that
  always ends up being removed. In particular, the boilerplate breaks the
  tutorial as it generates conflicting validators in the blueprint.

  The only argument in favor of the boilerplate is to serve as example
  and show people some syntax reminder. However, this is better done in
  the README or on the user manual directly.
2023-08-22 12:59:36 +02:00
microproofs 89c55a23fa chore: Release 2023-08-19 20:17:00 -04:00
microproofs 0eec4c188a update changelog for v1.0.15 2023-08-19 20:11:24 -04:00
microproofs 084b900b2a change: traverse_with_tree now has a boolean to determine when with is called
fix: Opaque types are now properly handled in code gen (i.e. code gen functions, in datums/redeemers, in from data casts)
chore: add specific nested opaque type tests to code gen
2023-08-19 20:07:37 -04:00
KtorZ c6f764d2db Refresh Cargo.lock & fill-in CHANGELOG. 2023-08-19 13:39:39 -04:00
KtorZ 139226cdab Support interactive blueprint parameter application. 2023-08-19 13:39:39 -04:00
KtorZ c1b8040ae2 Add helper for splitting a long line into multiple lines. 2023-08-19 13:39:39 -04:00
KtorZ 961e323c36 Enable iterating over validator's parameters with a callback
This is how we'll construct parameters interactively. We need to look up the definition, and provide a data representation for it.
2023-08-19 13:39:39 -04:00
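The shape of "iterating with a callback" here: the caller hands over a closure that receives each parameter's looked-up definition in turn and can prompt for a value before the next one is requested. A sketch with hypothetical types — not the blueprint module's real API:

```rust
// Hypothetical types for illustration; not the blueprint module's real API.
struct Parameter {
    title: String,
    schema: String, // stand-in for the looked-up type definition
}

struct Validator {
    parameters: Vec<Parameter>,
}

impl Validator {
    /// Invoke `ask` once per parameter, in order; the callback can prompt
    /// the user and build a data representation from the schema, and any
    /// error it returns aborts the iteration.
    fn for_each_parameter<E>(
        &self,
        mut ask: impl FnMut(&Parameter) -> Result<(), E>,
    ) -> Result<(), E> {
        for param in &self.parameters {
            ask(param)?;
        }
        Ok(())
    }
}
```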
KtorZ 051e9a6851 Add some utility functions for displaying Term/PlutusData
This is useful with the blueprint stuff, where Term are often just plain PlutusData.
2023-08-19 13:39:39 -04:00
rvcas 690e41261e chore: Release 2023-08-16 23:30:21 -04:00
rvcas be20426329 chore: update changelog 2023-08-16 23:16:56 -04:00
rvcas f5a49c4df4 fix: aliased import of single type throws compiler error closes #707 2023-08-16 23:15:51 -04:00
rvcas 6d90c27587 chore: update changelog 2023-08-16 23:06:53 -04:00
rvcas 2600937447 chore: cargo fmt 2023-08-16 22:56:22 -04:00
rvcas b138cb0ccd chore: update changelog 2023-08-16 22:55:59 -04:00
logicalmechanism 649039c993 tx simulate still needs fee work 2023-08-16 22:37:00 -04:00
logicalmechanism 050c41c8dc tx simulate returns a vector of exbudgets now 2023-08-16 22:37:00 -04:00
Ariady Putra 8cf92ce8ed `aiken new`: Try to get the latest tag of stdlib 2023-08-16 22:30:15 -04:00
microproofs c95f43ae07 add one more test 2023-08-16 21:59:25 -04:00
microproofs 20aa54b5ca fix: last test fixed 2023-08-16 21:59:25 -04:00
microproofs a45e04fd9b fix: using the wrong var for pattern matching 2023-08-16 21:59:25 -04:00
microproofs 2456801b17 fix list clauses with guards and add more tests 2023-08-16 21:59:25 -04:00
rvcas f4d0f231d7 test: fix acceptance tests 2023-08-16 14:52:06 -04:00
rvcas 80e4a5c6a2 chore: remove build folder 2023-08-16 14:17:33 -04:00
rvcas ae216bd932 test(ci): run examples too 2023-08-16 13:33:54 -04:00
rvcas 6ecb3f08b0 chore: default stdlib on new is 1.5.0 2023-08-16 13:24:10 -04:00
rvcas 0ff64e3bac test: check and format tests for logical op chain 2023-08-15 09:58:35 -04:00
rvcas e14d51600f feat(format): logical op chain 2023-08-15 09:58:35 -04:00
rvcas 2c2f3c90fb feat: new snapshots 2023-08-15 09:58:35 -04:00
rvcas 05eb281f40 chore: can safely remove this at this point 2023-08-15 09:58:35 -04:00
rvcas e4ef386c44 feat(tipo): inference for and/or chains 2023-08-15 09:58:35 -04:00
rvcas ab3a418b9c feat(parser): add support for and/or chaining 2023-08-15 09:58:35 -04:00
microproofs 4a1ae9f412 set eval to public 2023-08-12 16:42:23 -04:00
microproofs ca4a9fcd3d chore: make eval function and SlotConfig public 2023-08-11 20:33:08 -04:00
microproofs 8af253e1df chore: make slot_to_begin_posix_time a public function 2023-08-11 20:05:22 -04:00
microproofs 2f7784f31e chore: update changelog
expecting a type on List<Data> from data now only checks that type is a list and not each element
2023-08-10 23:01:46 -04:00
microproofs eda388fb29 test(aiken-lang): add a new test for list edge case in when clause patterns 2023-08-08 20:47:35 -04:00
microproofs 252f68de17 fix clippy 2023-08-07 19:08:18 -04:00
microproofs 367dabafb5 fix: update last 2 tests for new recursion optimization 2023-08-07 19:06:00 -04:00
Pi Lanningham f464eb3702 Cargo fmt + clippy, with latest rust 2023-08-07 19:00:39 -04:00
Pi Lanningham 0d99afe5e2 Cargo fmt 2023-08-07 19:00:39 -04:00
microproofs 90c7753201 update tests for new recursion optimization 2023-08-07 19:00:39 -04:00
microproofs 65984ed15b fix: move where we call the with in traverse_tree_with 2023-08-07 19:00:39 -04:00
Pi Lanningham dba0e11ba7 Add other shadowing cases 2023-08-07 19:00:39 -04:00
Pi Lanningham fc948f0029 Add the same optimization to dependent functions
I originally didn't add this because I thought these were mutually
recursive functions, and I couldn't picture how that would work;

I refactored all this logic into modify_self_calls, which maybe needs a
better name now.

Perf gain on some stdlib tests (line concat tests) is 93%!!
2023-08-07 19:00:39 -04:00
Pi Lanningham c45caaefc8 Rudimentary implementation
Adds an identify_recursive_static_params; doesn't handle all shadowing cases yet
2023-08-07 19:00:39 -04:00
Pi Lanningham 09f889b121 Add codegen for recursive statics case
We also flip the recursive_statics fields to recursive_nonstatics; this makes the codegen a little easier. It also has a hacky way to hard-code some recursive statics for testing
2023-08-07 19:00:39 -04:00
Pi Lanningham 586a2d7972 Add recursive_static_params to AIR
Any arguments to a recursive function that are unchanged and forwarded
don't need to be applied each time we recurse; instead, you can
define a containing lambda, reducing the number of applications
dramatically when recursing
2023-08-07 19:00:39 -04:00
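The effect is easiest to see outside UPLC: bind the parameters that never change once in an enclosing scope, so recursive calls only re-apply the arguments that actually shrink. A rough Rust analogy of the transformation (the real change operates on lambda applications in the generated UPLC):

```rust
// Naive shape: `sep` is re-passed on every recursive call even though it
// never changes.
fn join_naive(items: &[&str], sep: &str) -> String {
    match items {
        [] => String::new(),
        [last] => last.to_string(),
        [head, rest @ ..] => format!("{head}{sep}{}", join_naive(rest, sep)),
    }
}

// Optimized shape: the static parameter is captured once by the enclosing
// value; recursion only threads the shrinking list.
struct Joiner<'a> {
    sep: &'a str,
}

impl Joiner<'_> {
    fn join(&self, items: &[&str]) -> String {
        match items {
            [] => String::new(),
            [last] => last.to_string(),
            [head, rest @ ..] => format!("{head}{}{}", self.sep, self.join(rest)),
        }
    }
}
```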
microproofs 71a941e0b0 Update changelog 2023-08-07 12:11:39 -04:00
microproofs 6254eeb2ed add acceptance test 88 2023-08-07 12:02:44 -04:00
microproofs f7d278a472 fix: 2 acceptance tests were throwing errors due to exhaustiveness checker 2023-08-07 12:02:44 -04:00
microproofs 1d9878c5ee fix: code gen tests now up to date using trace
fix: Formatter should take ErrorTerm and return "fail"
fix: fail with no reason should just return ErrorTerm
2023-08-07 12:02:44 -04:00
microproofs 624fdee9ea keep traces in tests 2023-08-07 12:02:44 -04:00
microproofs 36c80f36c1 fix tests 2023-08-07 12:02:44 -04:00
microproofs 29599879b2 one minor tail fix in clause pattern 2023-08-07 12:02:44 -04:00
microproofs 6a1b2db698 use retain instead of position 2023-08-07 12:02:44 -04:00
microproofs 7bf22fa58b fix multivalidator issue 2023-08-07 12:02:44 -04:00
microproofs 281a8363c0 fixes to tuples and usage of discard.
Also a fix to tail and its type in when list pattern matches
2023-08-07 12:02:44 -04:00
microproofs bfa4cc2efc fix: some function dependency tree path was not being updated in order
fix: revert to old implicit way of casting to data for now
2023-08-07 12:02:44 -04:00
microproofs a45ff692a6 last changes for today 2023-08-07 12:02:44 -04:00
microproofs db79468435 remove old stuff 2023-08-07 12:02:44 -04:00
microproofs aca79bd728 remove warning 2023-08-07 12:02:44 -04:00
microproofs 3189a60bdb fixes to how we sort dependencies.
Also update the dependency path based on a function's path.
2023-08-07 12:02:44 -04:00
microproofs 80b950b8aa fix edge case with assign and pattern matching 2023-08-07 12:02:44 -04:00
microproofs 186e1235fd checkpoint 2023-08-07 12:02:44 -04:00
microproofs 1ee7492f1f fix tuple clause 2023-08-07 12:02:44 -04:00
microproofs 49a0a91103 fix tests again 2023-08-07 12:02:44 -04:00
microproofs f5c7d222aa optimization fix 2023-08-07 12:02:44 -04:00
microproofs 5aecb96668 constants are back. I had broken them when switching how data casting works 2023-08-07 12:02:44 -04:00
microproofs 0b8266dfd1 some type conversion fixes 2023-08-07 12:02:44 -04:00
microproofs 02948616cd some more fixes involving clauses 2023-08-07 12:02:44 -04:00
microproofs a689b8748f fix: working on minor edge cases
fix: zero arg function dependencies should not count as hoisted
fix: tuple index was receiving the wrong type
2023-08-07 12:02:44 -04:00
microproofs 4e3ced5b75 fix and clean up tests and handle
one edge case on rearrange clauses
2023-08-07 12:02:44 -04:00
microproofs 018453f6b1 fix expect on tuple type using the wrong internal type 2023-08-07 12:02:44 -04:00
microproofs f03ed41e03 fix some unnecessary lambdas in assign 2023-08-07 12:02:44 -04:00
microproofs 03dd13dc7d fixing list condition edge cases and clean up rearrange list clauses 2023-08-07 12:02:44 -04:00
microproofs e8fa8f5423 fixing list clause issues 2023-08-07 12:02:44 -04:00
microproofs c6f90a999b checkpoint 2023-08-07 12:02:44 -04:00
microproofs 389699f485 fix to subject type for assignment on single clause 2023-08-07 12:02:44 -04:00
microproofs dcb3a9b45b some type and expect fixes 2023-08-07 12:02:44 -04:00
microproofs 3545bad3c4 fix typing to constr 2023-08-07 12:02:44 -04:00
microproofs 52ebc9b6c1 some more fixes 2023-08-07 12:02:44 -04:00
microproofs 58b327e5b3 fixing bugs and edge cases 2023-08-07 12:02:44 -04:00
microproofs 2f4319f162 fix: tuples and list patterns had a few issues 2023-08-07 12:02:44 -04:00
microproofs 960a15c4ec checkpoint - fixing tests and stuff 2023-08-07 12:02:44 -04:00
microproofs 72b6f0f847 all but six tests passing 2023-08-07 12:02:44 -04:00
microproofs 6eeb282dee Now code gen is finished and we just have testing 2023-08-07 12:02:44 -04:00
microproofs 518bea5be4 feat: fixed up generate and generate test
last step is checking on uplc code gen
2023-08-07 12:02:44 -04:00
microproofs 18ea44adb0 chore: rename unwrapData and wrapData
add validator cast function for extra validator params
2023-08-07 12:02:44 -04:00
microproofs 55dd1a1a56 out with the old code and in with the air tree 2023-08-07 12:02:44 -04:00
microproofs 02ce3761ae final checkpoint 2023-08-07 12:02:44 -04:00
microproofs 8641c305f4 feat: airtree now hoists function. Now all that is left is finishing uplc gen 2023-08-07 12:02:44 -04:00
microproofs 5ad8b520fd checkpoint 2023-08-07 12:02:44 -04:00
microproofs 5a51764cff remove some warnings 2023-08-07 12:02:44 -04:00
microproofs a099c01734 feat: add support for hoisting code gen functions
fix: code gen vars should be module functions
fix: missed a recursive call in do_find_air_tree_node under binop
2023-08-07 12:02:44 -04:00
microproofs c0f09856d3 feat: Here's a first, we hoisted some user functions onto the validator 2023-08-07 12:02:44 -04:00
microproofs 62660e04b5 checkpoint;
Remaining work is on function hoisting. Functions have been defined and monomorphized
2023-08-07 12:02:44 -04:00
microproofs ae9de11e77 big checkpoint:
feat: add monomorphize and other useful tree function abstractions
feat: started testing function hoisting results so far
2023-08-07 12:02:44 -04:00
microproofs 947c118175 checkpoint 2023-08-07 12:02:44 -04:00
microproofs 9704cafefe a checkpoint for function hoisting start and type fix 2023-08-07 12:02:44 -04:00
microproofs 55ae708e3e checkpoint: start on function hoisting 2023-08-07 12:02:44 -04:00
microproofs 2b7e7ead1c feat: add support for validator arguments
feat: finish expect type on data constr
fix: tuple clause was exposing all items regardless of discard
fix: tuple clause was not receiving complex_clause flag
fix: condition for assert where constructor had 0 args was tripping assert
fix: had to rearrange var and discard assignment to ensure correct val is returned
fix: binop had the wrong type
2023-08-07 12:02:44 -04:00
microproofs 7d4e136467 checkpoint 2023-08-07 12:02:44 -04:00
microproofs fd83c9a739 feat: fix up generic type functions to work with the new air tree functions
chore: remove commented code
2023-08-07 12:02:44 -04:00
microproofs b3714ca9d0 fix: list clause guard for final clause needs to use list accessor 2023-08-07 12:02:44 -04:00
microproofs 2c61ecd4bb feat: finish up nested clauses 2023-08-07 12:02:44 -04:00
microproofs a3afb62861 chore: fixing nested clauses to match aiken stack air 2023-08-07 12:02:44 -04:00
microproofs 95af421f95 feat: finish tuple conditions 2023-08-07 12:02:44 -04:00
microproofs 05b6b2a97d chore: rename some functions 2023-08-07 12:02:44 -04:00
microproofs c025073056 fix: List clauses were destructuring the next element unnecessarily
feat: finish nested constructor clauses
2023-08-07 12:02:44 -04:00
microproofs f6e163d16d feat: start on nested clauses
chore: remove then field from list clause guard and clause guard
2023-08-07 12:02:44 -04:00
microproofs 5bcc425f0f feat: changed air expressions clause guard
and list clause guard to air statements
2023-08-07 12:02:44 -04:00
microproofs 023be88bf6 chore: another checkpoint
fix: guard clause to properly check condition
2023-08-07 12:02:44 -04:00
microproofs f94c8213b6 checkpoint 2023-08-07 12:02:44 -04:00
microproofs 0854d71836 chore: another checkpoint and renamed ClauseProperties fields 2023-08-07 12:02:44 -04:00
microproofs d731757123 feat: start on clauses in when conditions
**checkpoint**
2023-08-07 12:02:44 -04:00
microproofs 96959011e9 feat: finish up build. just have helper methods
feat: Create an air and AirTree iterator.
This allows us to iterate forwards or backwards over the tree as a vec.
chore: moved around some functions
2023-08-07 12:02:44 -04:00
microproofs ba3265054c chore: move tree to gen_uplc2 and create a duplicate air file without scope 2023-08-07 12:02:44 -04:00
microproofs 7cee9a4d15 chore: move assignment_air_tree and expect_type to gen_uplc
feat: add is_primitive check to types
2023-08-07 12:02:44 -04:00
microproofs cd726b561e feat: add removal of discard lets
chore: Name change for AirTree sequence
feat: finish up assignment constructor and list for airtree builder
2023-08-07 12:02:44 -04:00
microproofs 59362e3d8c feat: almost done assignment have tuple and constr left
feat: modified the AirTree structure to have statements, sequences, and expressions
feat: changed the hoist_over function to be universal
2023-08-07 12:02:44 -04:00
microproofs 65bb7e48e2 feat: start on build assignment
feat: implement assignment hoisting
2023-08-07 12:02:44 -04:00
microproofs c359bd35d7 feat: update tree to allow for let hoisting
feat: start on build for when expressions
feat: add builder methods for AirTree
2023-08-07 12:02:44 -04:00
microproofs 83ade9335f feat: implement most of airtree build 2023-08-07 12:02:44 -04:00
microproofs 5e097d42ba feat: add AirTree types and builder functions 2023-08-07 12:02:44 -04:00
Olof Blomqvist d25bb9ae60 format 2023-08-04 14:56:16 -04:00
Olof Blomqvist 4e4a477ff1 meh 2023-08-04 14:56:16 -04:00
Olof Blomqvist 17eef195a9 fix diagnostics and formatting on windows vscode 2023-08-04 14:56:16 -04:00
rvcas 266b6bbb7d fix(exhaustiveness): for constructor use correct name because of import aliases 2023-08-03 16:28:47 -04:00
rvcas 60ac8ab591 fix(exhaustiveness): adjust helper method to get constructors properly 2023-08-03 16:14:42 -04:00
KtorZ 675b737898 Check exhaustiveness behavior on pattern guards. 2023-08-02 10:40:59 +02:00
KtorZ 4f7f39292d Fix subtle bug in pattern rendering
When rendering missing or redundant patterns, linked-lists would
  wrongly suggest the last nil constructor as a pattern on non-empty
  lists.

  For example, before this commit, the exhaustiveness checker would yield:

  ```
  [(_, True), []]
  ```

  as a suggestion, since it is the rendering of a list pattern with a
  single element, `(_, True) :: Nil`. Blindly following the
  compiler suggestion here would cause a type unification error (since
  `[]` doesn't unify with a 2-tuple).

  Indeed, we mustn't render the Nil constructor when rendering non-empty
  lists! So the correct suggestion should be:

  ```
  [(_, True)]
  ```
2023-08-02 10:31:35 +02:00
KtorZ 00b255e960 Remove now-dead code. 2023-08-02 09:22:21 +02:00
rvcas f3cab94ae1 test(check): a bunch of tests for the new exhaustiveness stuff 2023-08-01 21:13:50 -04:00
rvcas 75e18d485d fix: redundant might be wildcard which doesn't match technically 2023-08-01 21:13:50 -04:00
rvcas a6b230aad4 fix: exhaustiveness on types from other modules 2023-08-01 21:13:50 -04:00
rvcas 7e531d0da1 fix: wrong var for name in UnknownModule error 2023-08-01 21:13:50 -04:00
rvcas b6ac39f322 feat(exhaustiveness): show both clauses in redundant error 2023-08-01 21:13:50 -04:00
rvcas ef2fc57ca9 feat(exhaustiveness): check tuple patterns 2023-08-01 21:13:50 -04:00
rvcas f1100e901d feat(exhaustiveness): pretty print missing patterns 2023-08-01 21:13:50 -04:00
rvcas de2791fe82 feat(tipo): add new error for redundant clauses 2023-08-01 21:13:50 -04:00
rvcas 0061bcf78d feat: add support for list patterns 2023-08-01 21:13:50 -04:00
rvcas e8a71cd63b chore: rename usefulness module 2023-08-01 21:13:50 -04:00
rvcas 03efb46e6f feat(exhaustiveness): algorithm U borrowed from elm 2023-08-01 21:13:50 -04:00
microproofs 55887d3a45 fix: decode should always print to textual 2023-08-01 00:47:29 -04:00
233 changed files with 13774 additions and 8396 deletions

.github/CODEOWNERS (new file)

@ -0,0 +1 @@
* @aiken-lang/maintainers


@ -90,12 +90,23 @@ jobs:
echo "SDKROOT=$(xcrun -sdk macosx --show-sdk-path)" >> $GITHUB_ENV
echo "MACOSX_DEPLOYMENT_TARGET=$(xcrun -sdk macosx --show-sdk-platform-version)" >> $GITHUB_ENV
- name: Linux AMD setup
if: ${{ matrix.job.target == 'x86_64-unknown-linux-gnu' }}
run: |
echo "RUSTFLAGS=-C target-feature=+crt-static" >> $GITHUB_ENV
- name: Linux ARM setup
if: ${{ matrix.job.target == 'aarch64-unknown-linux-gnu' }}
run: |
sudo apt-get update -y
sudo apt-get install -y gcc-aarch64-linux-gnu libssl-dev:armhf
echo "CARGO_TARGET_AARCH64_UNKNOWN_LINUX_GNU_LINKER=aarch64-linux-gnu-gcc" >> $GITHUB_ENV
echo "RUSTFLAGS=-C target-feature=+crt-static" >> $GITHUB_ENV
- name: Windows setup
if: ${{ matrix.job.os == 'windows-latest' }}
run: |
echo "RUSTFLAGS=-C target-feature=+crt-static" >> $GITHUB_ENV
- name: Build binaries
uses: actions-rs/cargo@v1


@ -15,9 +15,15 @@ jobs:
steps:
- uses: actions/checkout@v3
- name: Build
run: cargo build --verbose --workspace
run: cargo build --verbose --workspace --target x86_64-unknown-linux-gnu
env:
RUSTFLAGS: -C target-feature=+crt-static
- name: Run unit tests
run: cargo test --verbose --workspace
- name: Run examples
run: |
cargo run -- check examples/hello_world
cargo run -- check examples/gift_card
- name: Run acceptance tests
working-directory: examples/acceptance_tests
run: |


@ -1,5 +1,115 @@
# Changelog
## v1.0.18-alpha - 2023-MM-DD
### Added
- **aiken-lang**: Code gen now allows for mutual recursion
### Fixed
- **aiken-lang**: fixed stack overflow with unbound types being passed into a
function with inferred types
### Changed
- **aiken-lang**: (Code Gen): Rename some of the types to use aliases
## v1.0.17-alpha - 2023-09-20
### Added
- **aiken**: add ability to force warnings to cause a failing exit code on
check, build, and docs
- **aiken-lang**: automatically resolve and fetch latest revision of a package
on build when a branch is specified as version
- **uplc**: Add Case and Constr Terms; This includes their flat serialization
and evaluation
### Fixed
- **uplc**: trim whitespace when loading files with hex strings to avoid
confusing errors #720
- **uplc**: uplc `Constant::Data` formatting
- **aiken-lang**: code gen fixes including nested constr when matches and expect
on None
- **aiken-lang**: empty records properly parse as record sugar
- **aiken-lang**: escape sequences are now properly preserved after formatting
- **aiken-lang**: fixed parser ambiguity when using record constructor in if
conditions followed by single-line var expressions #735
- **aiken-project**: when a module name has a hyphen we should behave like rust
and force an underscore
## v1.0.16-alpha - 2023-08-24
### Removed
- **aiken**: `aiken new` no longer accepts an `--empty` flag. Projects are
generated empty by default.
## v1.0.15-alpha - 2023-08-19
### Added
- **aiken**: Parameters for `blueprint apply` can now be built interactively.
### Changed
- **aiken-lang**: Opaque types are now properly handled in code gen (i.e. in
code gen functions, in datums/redeemers, in from data casts).
- **aiken**: `blueprint apply` now expects only one OPTIONAL argument. When not
provided, the parameter will be prompted interactively.
- **aiken-lang**: New tests for code gen around opaque types.
- **aiken-lang**: `traverse_with_tree` now has a boolean parameter to determine
when `with` is called.
### Removed
- **aiken**: `blueprint apply` no longer accepts a target directory. The command
has to be executed within the same directory as the `aiken.toml`.
## v1.0.14-alpha - 2023-08-16
### Added
- **aiken**: `new` command now fetches latest stdlib version
- **aiken-lang**: new `and` and `or` chaining
```
and {
1 == 1,
or {
2 == 2,
3 != 2,
True,
},
}
```
### Changed
- **aiken-lang**: significantly improved pattern matching exhaustiveness
checking
- **aiken-lang**: Tx Simulate now returns a list of execution budgets for each
redeemer instead of calculating the total units required to run all the
scripts.
- **aiken-lang**: Now code gen uses a tree abstraction to build the Aiken
Intermediary Representation. This now fixes quite a number of minor issues
while making the code more maintainable. This is a large leap towards a stable
version and future updates will be on further simplifying and stabilizing the
tree abstraction.
- **aiken-lang**: Zero argument anonymous functions are now implemented as a
delayed function body and calling them simply does force
- **aiken-lang**: Matching on int in expect and when cases is now implemented
- **aiken-lang**: Using assign in nested pattern matches is now implemented
- **aiken-lang**: Using List<Data> as validator params only checks the type is
a list and does not attempt to check each item
- **aiken-lang**: Recursion optimization to prevent static parameters from being
passed through every recursion
- **aiken-lang**: aliased import of single type throws compiler error
- **aiken-lsp**: fix diagnostics and formatting on windows vscode
- **aiken**: decode should always print to textual
- **uplc**: pair type formatting
## v1.0.13-alpha - 2023-07-15
### Added
@ -8,24 +118,28 @@
### Fixed
- **aiken-lang**: fail, todo, and trace had issues with sequences and expressions
- **aiken-lang**: fail, todo, and trace had issues with sequences and
expressions
## v1.0.12-alpha - 2023-07-14
### Added
- **aiken**: added a `blueprint policy` command to compute the policy ID of a minting script
- **aiken**: added a `blueprint policy` command to compute the policy ID of a
minting script
- **uplc**: parsing and pretty printing for PlutusData
### Fixed
- **aiken-lang**: Prevent mutual recursion caused by conflicting function names for generic expect type
- **aiken-lang**: Prevent mutual recursion caused by conflicting function names
for generic expect type
- **aiken-lang**: UPLC evaluation of large integers literals (> max u64)
- **aiken-lang**: Parsing of error / todo keywords in when cases
- **aiken-lang**: Parsing of negative integer patterns and constants
- **aiken-lang**: automatically infer unused validator args as `Data`
- **aiken-lang**: test crashing when referencing validators
- **aiken**: mem and cpu values were not showing in white terminals, switched to cyan
- **aiken**: mem and cpu values were not showing in white terminals, switched to
cyan
### Changed
@ -52,8 +166,10 @@
### Fixed
- **aiken-lang**: Explain discards and expect a bit better in the unused var warning
- **aiken-lang**: Fix expect \_ = ... not including the cast from data logic if the type is data and right hand has a type annotation
- **aiken-lang**: Explain discards and expect a bit better in the unused var
warning
- **aiken-lang**: Fix expect \_ = ... not including the cast from data logic if
the type is data and right hand has a type annotation
- **aiken-lang**: Fix for the final clause of a when expecting another clause
afterwards in nested list cases.
- **aiken-lang**: Fix for all elements were being destructured in tuple clauses
@ -61,7 +177,8 @@
- **aiken-lang**: Fix for tuple clause not consuming the next case causing
incomplete contracts. Now tuple clause will always consume the next case
unless it is the final clause
- **aiken-lang**: Fix for builtins using the incorrect data to type conversion when used as a function param.
- **aiken-lang**: Fix for builtins using the incorrect data to type conversion
when used as a function param.
## v1.0.10-alpha - 2023-06-13


@ -71,14 +71,36 @@
Want to give some financial support? Have a look at the ways to sponsor below for more details.
- [rvcas](https://github.com/sponsors/rvcas/)
- [microproofs](https://github.com/sponsors/microproofs/)
- [rvcas](https://github.com/sponsors/rvcas)
- [microproofs](https://github.com/sponsors/microproofs)
- [ktorz](https://github.com/sponsors/KtorZ)
Want to support with crypto?
- Our Ada address is `addr1q83nlzwu4zjeu927m8t24xa68upgmwgt5w29ww5ka695hc5rez2r4q7gcvj7z0ma6d88w3j220szsqk05sn43ghcsn4szvuklq`
- Our Ada handle is `$aiken_lang`
## Releasing
To be able to create a release you need to be on the [maintainers](https://github.com/orgs/aiken-lang/teams/maintainers) team.
This means that only core team members can create releases. We have a
[github action](https://github.com/aiken-lang/aiken/blob/main/.github/workflows/release.yml) for creating the binaries and a github release.
The process follows these steps:
1. `cargo release --execute`
2. After a release is created by the github action, fill in the release notes. Try to tag contributors so that they show up in the release.
3. Screenshot the result of `aikup` and post on twitter saying "We've just released vx.x.x". [example](https://twitter.com/aiken_eng/status/1693084816931987930?s=20)
> `cargo release` takes arguments and flags to tell it how to bump the version number. Examples include `cargo release 1.0.16-alpha` and `cargo release major`.
>
> The root `Cargo.toml` in the repo contains this configuration for `cargo release`:
>
> ```toml
> [workspace.metadata.release]
> shared-version = true
> tag-name = "v{{version}}"
> ```
## About Issues
### :bug: How To Report A Bug

Cargo.lock

@ -51,7 +51,7 @@ dependencies = [
[[package]]
name = "aiken"
version = "1.0.13-alpha"
version = "1.0.17-alpha"
dependencies = [
"aiken-lang",
"aiken-lsp",
@ -63,7 +63,10 @@ dependencies = [
"hex",
"ignore",
"indoc",
"inquire",
"miette",
"num-bigint",
"ordinal",
"owo-colors",
"pallas-addresses",
"pallas-codec",
@ -78,7 +81,7 @@ dependencies = [
[[package]]
name = "aiken-lang"
version = "1.0.13-alpha"
version = "1.0.17-alpha"
dependencies = [
"chumsky",
"hex",
@ -90,6 +93,7 @@ dependencies = [
"num-bigint",
"ordinal",
"owo-colors",
"petgraph",
"pretty_assertions",
"strum",
"thiserror",
@ -99,7 +103,7 @@ dependencies = [
[[package]]
name = "aiken-lsp"
version = "1.0.13-alpha"
version = "1.0.17-alpha"
dependencies = [
"aiken-lang",
"aiken-project",
@ -120,7 +124,7 @@ dependencies = [
[[package]]
name = "aiken-project"
version = "1.0.13-alpha"
version = "1.0.17-alpha"
dependencies = [
"aiken-lang",
"askama",
@ -637,6 +641,31 @@ dependencies = [
"cfg-if",
]
[[package]]
name = "crossterm"
version = "0.25.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "e64e6c0fbe2c17357405f7c758c1ef960fce08bdfb2c03d88d2a18d7e09c4b67"
dependencies = [
"bitflags",
"crossterm_winapi",
"libc",
"mio",
"parking_lot",
"signal-hook",
"signal-hook-mio",
"winapi",
]
[[package]]
name = "crossterm_winapi"
version = "0.9.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "acdd7c62a3665c7f6830a51635d9ac9b23ed385797f70a83bb8bafe9c572ab2b"
dependencies = [
"winapi",
]
[[package]]
name = "crypto-bigint"
version = "0.5.1"
@ -723,6 +752,12 @@ dependencies = [
"winapi",
]
[[package]]
name = "dyn-clone"
version = "1.0.13"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "bbfc4744c1b8f2a09adc0e55242f60b1af195d88596bd8700be74418c056c555"
[[package]]
name = "ecdsa"
version = "0.16.4"
@ -824,7 +859,7 @@ checksum = "0ce7134b9999ecaf8bcd65542e436736ef32ddca1b3e06094cb6ec5755203b80"
[[package]]
name = "flat-rs"
version = "1.0.13-alpha"
version = "1.0.17-alpha"
dependencies = [
"anyhow",
"proptest",
@ -1237,6 +1272,22 @@ version = "2.0.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "9f2cb48b81b1dc9f39676bf99f5499babfec7cd8fe14307f7b3d747208fb5690"
[[package]]
name = "inquire"
version = "0.6.2"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "c33e7c1ddeb15c9abcbfef6029d8e29f69b52b6d6c891031b88ed91b5065803b"
dependencies = [
"bitflags",
"crossterm",
"dyn-clone",
"lazy_static",
"newline-converter",
"thiserror",
"unicode-segmentation",
"unicode-width",
]
[[package]]
name = "insta"
version = "1.30.0"
@ -1569,6 +1620,15 @@ dependencies = [
"tempfile",
]
[[package]]
name = "newline-converter"
version = "0.2.2"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "1f71d09d5c87634207f894c6b31b6a2b2c64ea3bdcf71bd5599fdbbe1600c00f"
dependencies = [
"unicode-segmentation",
]
[[package]]
name = "nom"
version = "7.1.3"
@ -1983,16 +2043,15 @@ dependencies = [
[[package]]
name = "proptest"
version = "1.1.0"
version = "1.2.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "29f1b898011ce9595050a68e60f90bad083ff2987a695a42357134c8381fba70"
checksum = "4e35c06b98bf36aba164cc17cb25f7e232f5c4aeea73baa14b8a9f0d92dbfa65"
dependencies = [
"bit-set",
"bitflags",
"byteorder",
"lazy_static",
"num-traits",
"quick-error 2.0.1",
"rand",
"rand_chacha",
"rand_xorshift",
@ -2028,12 +2087,6 @@ version = "1.2.3"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "a1d01941d82fa2ab50be1e79e6714289dd7cde78eba4c074bc5a4374f650dfe0"
[[package]]
name = "quick-error"
version = "2.0.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "a993555f31e5a609f617c12db6250dedcac1b0a85076912c436e6fc9b2c8e6a3"
[[package]]
name = "quote"
version = "1.0.26"
@ -2230,7 +2283,7 @@ source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "cb3dcc6e454c328bb824492db107ab7c0ae8fcffe4ad210136ef014458c1bc4f"
dependencies = [
"fnv",
"quick-error 1.2.3",
"quick-error",
"tempfile",
"wait-timeout",
]
@ -2415,6 +2468,27 @@ dependencies = [
"digest",
]
[[package]]
name = "signal-hook"
version = "0.3.17"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "8621587d4798caf8eb44879d42e56b9a93ea5dcd315a6487c357130095b62801"
dependencies = [
"libc",
"signal-hook-registry",
]
[[package]]
name = "signal-hook-mio"
version = "0.2.3"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "29ad2e15f37ec9a6cc544097b78a1ec90001e9f71b81338ca39f430adaca99af"
dependencies = [
"libc",
"mio",
"signal-hook",
]
[[package]]
name = "signal-hook-registry"
version = "1.4.1"
@ -2912,7 +2986,7 @@ checksum = "c0edd1e5b14653f783770bce4a4dabb4a5108a5370a5f5d8cfe8710c361f6c8b"
[[package]]
name = "uplc"
version = "1.0.13-alpha"
version = "1.0.17-alpha"
dependencies = [
"anyhow",
"cryptoxide",


@ -1,5 +1,6 @@
[workspace]
members = ["crates/*"]
resolver = "2"
[profile.release]
strip = true


@ -1,7 +1,7 @@
[package]
name = "aiken-lang"
description = "The Aiken compiler"
version = "1.0.13-alpha"
version = "1.0.17-alpha"
edition = "2021"
repository = "https://github.com/aiken-lang/aiken"
homepage = "https://github.com/aiken-lang/aiken"
@ -24,8 +24,9 @@ owo-colors = { version = "3.5.0", features = ["supports-colors"] }
strum = "0.24.1"
thiserror = "1.0.39"
vec1 = "1.10.1"
uplc = { path = '../uplc', version = "1.0.13-alpha" }
uplc = { path = '../uplc', version = "1.0.17-alpha" }
num-bigint = "0.4.3"
petgraph = "0.6.3"
[target.'cfg(not(target_family="wasm"))'.dependencies]
chumsky = "0.9.2"


@ -6,7 +6,11 @@ use crate::{
};
use miette::Diagnostic;
use owo_colors::{OwoColorize, Stream::Stdout};
use std::{fmt, ops::Range, sync::Arc};
use std::{
fmt::{self, Display},
ops::Range,
rc::Rc,
};
use vec1::Vec1;
pub const ASSERT_VARIABLE: &str = "_try";
@ -99,7 +103,6 @@ impl TypedModule {
fn str_to_keyword(word: &str) -> Option<Token> {
// Alphabetical keywords:
match word {
"assert" => Some(Token::Expect),
"expect" => Some(Token::Expect),
"else" => Some(Token::Else),
"is" => Some(Token::Is),
@ -116,13 +119,17 @@ fn str_to_keyword(word: &str) -> Option<Token> {
"type" => Some(Token::Type),
"trace" => Some(Token::Trace),
"test" => Some(Token::Test),
// TODO: remove this in a future release
"error" => Some(Token::Fail),
"fail" => Some(Token::Fail),
"and" => Some(Token::And),
"or" => Some(Token::Or),
"validator" => Some(Token::Validator),
_ => None,
}
}
pub type TypedFunction = Function<Arc<Type>, TypedExpr>;
pub type TypedFunction = Function<Rc<Type>, TypedExpr>;
pub type UntypedFunction = Function<(), UntypedExpr>;
#[derive(Debug, Clone, PartialEq)]
@ -139,7 +146,7 @@ pub struct Function<T, Expr> {
pub can_error: bool,
}
pub type TypedTypeAlias = TypeAlias<Arc<Type>>;
pub type TypedTypeAlias = TypeAlias<Rc<Type>>;
pub type UntypedTypeAlias = TypeAlias<()>;
impl TypedFunction {
@ -199,7 +206,7 @@ pub struct TypeAlias<T> {
pub tipo: T,
}
pub type TypedDataType = DataType<Arc<Type>>;
pub type TypedDataType = DataType<Rc<Type>>;
impl TypedDataType {
pub fn ordering() -> Self {
@ -237,7 +244,7 @@ impl TypedDataType {
}
}
pub fn option(tipo: Arc<Type>) -> Self {
pub fn option(tipo: Rc<Type>) -> Self {
DataType {
constructors: vec![
RecordConstructor {
@ -301,7 +308,7 @@ pub struct Use<PackageName> {
pub unqualified: Vec<UnqualifiedImport>,
}
pub type TypedModuleConstant = ModuleConstant<Arc<Type>>;
pub type TypedModuleConstant = ModuleConstant<Rc<Type>>;
pub type UntypedModuleConstant = ModuleConstant<()>;
#[derive(Debug, Clone, PartialEq)]
@ -315,7 +322,7 @@ pub struct ModuleConstant<T> {
pub tipo: T,
}
pub type TypedValidator = Validator<Arc<Type>, TypedExpr>;
pub type TypedValidator = Validator<Rc<Type>, TypedExpr>;
pub type UntypedValidator = Validator<(), UntypedExpr>;
#[derive(Debug, Clone, PartialEq)]
@ -328,7 +335,7 @@ pub struct Validator<T, Expr> {
pub params: Vec<Arg<T>>,
}
pub type TypedDefinition = Definition<Arc<Type>, TypedExpr, String>;
pub type TypedDefinition = Definition<Rc<Type>, TypedExpr, String>;
pub type UntypedDefinition = Definition<(), UntypedExpr, ()>;
#[derive(Debug, Clone, PartialEq)]
@ -457,7 +464,7 @@ pub enum Constant {
}
impl Constant {
pub fn tipo(&self) -> Arc<Type> {
pub fn tipo(&self) -> Rc<Type> {
match self {
Constant::Int { .. } => builtins::int(),
Constant::String { .. } => builtins::string(),
@ -530,7 +537,7 @@ impl<T: PartialEq> RecordConstructorArg<T> {
}
}
pub type TypedArg = Arg<Arc<Type>>;
pub type TypedArg = Arg<Rc<Type>>;
pub type UntypedArg = Arg<()>;
#[derive(Debug, Clone, PartialEq)]
@ -779,6 +786,15 @@ pub enum BinOp {
ModInt,
}
impl From<LogicalOpChainKind> for BinOp {
fn from(value: LogicalOpChainKind) -> Self {
match value {
LogicalOpChainKind::And => BinOp::And,
LogicalOpChainKind::Or => BinOp::Or,
}
}
}
#[derive(Debug, Clone, Copy, PartialEq, Eq)]
pub enum UnOp {
// !
@ -808,7 +824,7 @@ impl BinOp {
}
pub type UntypedPattern = Pattern<(), ()>;
pub type TypedPattern = Pattern<PatternConstructor, Arc<Type>>;
pub type TypedPattern = Pattern<PatternConstructor, Rc<Type>>;
#[derive(Debug, Clone, PartialEq)]
pub enum Pattern<Constructor, Type> {
@ -931,7 +947,7 @@ impl AssignmentKind {
pub type MultiPattern<PatternConstructor, Type> = Vec<Pattern<PatternConstructor, Type>>;
pub type UntypedMultiPattern = MultiPattern<(), ()>;
pub type TypedMultiPattern = MultiPattern<PatternConstructor, Arc<Type>>;
pub type TypedMultiPattern = MultiPattern<PatternConstructor, Rc<Type>>;
#[derive(Debug, Clone, PartialEq)]
pub struct UntypedClause {
@ -944,8 +960,8 @@ pub struct UntypedClause {
#[derive(Debug, Clone, PartialEq)]
pub struct TypedClause {
pub location: Span,
pub pattern: Pattern<PatternConstructor, Arc<Type>>,
pub guard: Option<ClauseGuard<Arc<Type>>>,
pub pattern: Pattern<PatternConstructor, Rc<Type>>,
pub guard: Option<ClauseGuard<Rc<Type>>>,
pub then: TypedExpr,
}
@ -963,7 +979,7 @@ impl TypedClause {
}
pub type UntypedClauseGuard = ClauseGuard<()>;
pub type TypedClauseGuard = ClauseGuard<Arc<Type>>;
pub type TypedClauseGuard = ClauseGuard<Rc<Type>>;
#[derive(Debug, Clone, PartialEq)]
pub enum ClauseGuard<Type> {
@ -1063,7 +1079,7 @@ impl<A> ClauseGuard<A> {
}
impl TypedClauseGuard {
pub fn tipo(&self) -> Arc<Type> {
pub fn tipo(&self) -> Rc<Type> {
match self {
ClauseGuard::Var { tipo, .. } => tipo.clone(),
ClauseGuard::Constant(constant) => constant.tipo(),
@ -1225,6 +1241,21 @@ impl chumsky::Span for Span {
}
}
#[derive(Debug, Clone, PartialEq, Eq)]
pub enum LogicalOpChainKind {
And,
Or,
}
impl Display for LogicalOpChainKind {
fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
match self {
LogicalOpChainKind::And => write!(f, "and"),
LogicalOpChainKind::Or => write!(f, "or"),
}
}
}
#[derive(Debug, thiserror::Error, Diagnostic)]
pub enum Error {
#[error(


@ -9,7 +9,7 @@ use crate::{
IdGenerator,
};
use indexmap::IndexMap;
use std::{cell::RefCell, collections::HashMap, sync::Arc};
use std::{cell::RefCell, collections::HashMap, rc::Rc};
use strum::IntoEnumIterator;
use uplc::builtins::DefaultFunction;
@ -290,6 +290,10 @@ pub fn prelude(id_gen: &IdGenerator) -> TypeInfo {
);
// Void
prelude
.types_constructors
.insert(VOID.to_string(), vec![VOID.to_string()]);
prelude.values.insert(
VOID.to_string(),
ValueConstructor::public(
@ -667,7 +671,6 @@ pub fn prelude_functions(id_gen: &IdGenerator) -> IndexMap<FunctionAccessKey, Ty
FunctionAccessKey {
module_name: "".to_string(),
function_name: "not".to_string(),
variant_name: "".to_string(),
},
Function {
arguments: vec![Arg {
@ -718,7 +721,6 @@ pub fn prelude_functions(id_gen: &IdGenerator) -> IndexMap<FunctionAccessKey, Ty
FunctionAccessKey {
module_name: "".to_string(),
function_name: "identity".to_string(),
variant_name: "".to_string(),
},
Function {
arguments: vec![Arg {
@ -765,7 +767,6 @@ pub fn prelude_functions(id_gen: &IdGenerator) -> IndexMap<FunctionAccessKey, Ty
FunctionAccessKey {
module_name: "".to_string(),
function_name: "always".to_string(),
variant_name: "".to_string(),
},
Function {
can_error: false,
@ -828,7 +829,6 @@ pub fn prelude_functions(id_gen: &IdGenerator) -> IndexMap<FunctionAccessKey, Ty
FunctionAccessKey {
module_name: "".to_string(),
function_name: "flip".to_string(),
variant_name: "".to_string(),
},
Function {
can_error: false,
@ -959,8 +959,8 @@ pub fn prelude_data_types(id_gen: &IdGenerator) -> IndexMap<DataTypeKey, TypedDa
data_types
}
pub fn int() -> Arc<Type> {
Arc::new(Type::App {
pub fn int() -> Rc<Type> {
Rc::new(Type::App {
public: true,
name: INT.to_string(),
module: "".to_string(),
@ -968,8 +968,8 @@ pub fn int() -> Arc<Type> {
})
}
pub fn data() -> Arc<Type> {
Arc::new(Type::App {
pub fn data() -> Rc<Type> {
Rc::new(Type::App {
public: true,
name: DATA.to_string(),
module: "".to_string(),
@ -977,8 +977,8 @@ pub fn data() -> Arc<Type> {
})
}
pub fn byte_array() -> Arc<Type> {
Arc::new(Type::App {
pub fn byte_array() -> Rc<Type> {
Rc::new(Type::App {
args: vec![],
public: true,
name: BYTE_ARRAY.to_string(),
@ -986,12 +986,12 @@ pub fn byte_array() -> Arc<Type> {
})
}
pub fn tuple(elems: Vec<Arc<Type>>) -> Arc<Type> {
Arc::new(Type::Tuple { elems })
pub fn tuple(elems: Vec<Rc<Type>>) -> Rc<Type> {
Rc::new(Type::Tuple { elems })
}
pub fn bool() -> Arc<Type> {
Arc::new(Type::App {
pub fn bool() -> Rc<Type> {
Rc::new(Type::App {
args: vec![],
public: true,
name: BOOL.to_string(),
@ -999,8 +999,8 @@ pub fn bool() -> Arc<Type> {
})
}
pub fn list(t: Arc<Type>) -> Arc<Type> {
Arc::new(Type::App {
pub fn list(t: Rc<Type>) -> Rc<Type> {
Rc::new(Type::App {
public: true,
name: LIST.to_string(),
module: "".to_string(),
@ -1008,8 +1008,8 @@ pub fn list(t: Arc<Type>) -> Arc<Type> {
})
}
pub fn string() -> Arc<Type> {
Arc::new(Type::App {
pub fn string() -> Rc<Type> {
Rc::new(Type::App {
args: vec![],
public: true,
name: STRING.to_string(),
@ -1017,8 +1017,8 @@ pub fn string() -> Arc<Type> {
})
}
pub fn void() -> Arc<Type> {
Arc::new(Type::App {
pub fn void() -> Rc<Type> {
Rc::new(Type::App {
args: vec![],
public: true,
name: VOID.to_string(),
@ -1026,8 +1026,8 @@ pub fn void() -> Arc<Type> {
})
}
pub fn result(a: Arc<Type>, e: Arc<Type>) -> Arc<Type> {
Arc::new(Type::App {
pub fn result(a: Rc<Type>, e: Rc<Type>) -> Rc<Type> {
Rc::new(Type::App {
public: true,
name: RESULT.to_string(),
module: "".to_string(),
@ -1035,8 +1035,8 @@ pub fn result(a: Arc<Type>, e: Arc<Type>) -> Arc<Type> {
})
}
pub fn option(a: Arc<Type>) -> Arc<Type> {
Arc::new(Type::App {
pub fn option(a: Rc<Type>) -> Rc<Type> {
Rc::new(Type::App {
public: true,
name: OPTION.to_string(),
module: "".to_string(),
@ -1044,8 +1044,8 @@ pub fn option(a: Arc<Type>) -> Arc<Type> {
})
}
pub fn ordering() -> Arc<Type> {
Arc::new(Type::App {
pub fn ordering() -> Rc<Type> {
Rc::new(Type::App {
public: true,
name: ORDERING.to_string(),
module: "".to_string(),
@ -1053,24 +1053,24 @@ pub fn ordering() -> Arc<Type> {
})
}
pub fn function(args: Vec<Arc<Type>>, ret: Arc<Type>) -> Arc<Type> {
Arc::new(Type::Fn { ret, args })
pub fn function(args: Vec<Rc<Type>>, ret: Rc<Type>) -> Rc<Type> {
Rc::new(Type::Fn { ret, args })
}
pub fn generic_var(id: u64) -> Arc<Type> {
let tipo = Arc::new(RefCell::new(TypeVar::Generic { id }));
pub fn generic_var(id: u64) -> Rc<Type> {
let tipo = Rc::new(RefCell::new(TypeVar::Generic { id }));
Arc::new(Type::Var { tipo })
Rc::new(Type::Var { tipo })
}
pub fn unbound_var(id: u64) -> Arc<Type> {
let tipo = Arc::new(RefCell::new(TypeVar::Unbound { id }));
pub fn unbound_var(id: u64) -> Rc<Type> {
let tipo = Rc::new(RefCell::new(TypeVar::Unbound { id }));
Arc::new(Type::Var { tipo })
Rc::new(Type::Var { tipo })
}
pub fn wrapped_redeemer(redeemer: Arc<Type>) -> Arc<Type> {
Arc::new(Type::App {
pub fn wrapped_redeemer(redeemer: Rc<Type>) -> Rc<Type> {
Rc::new(Type::App {
public: true,
module: "".to_string(),
name: REDEEMER_WRAPPER.to_string(),


@ -1,12 +1,13 @@
use std::sync::Arc;
use std::rc::Rc;
use vec1::Vec1;
use crate::{
ast::{
self, Annotation, Arg, AssignmentKind, BinOp, ByteArrayFormatPreference, CallArg,
DefinitionLocation, IfBranch, ParsedCallArg, Pattern, RecordUpdateSpread, Span, TraceKind,
TypedClause, TypedRecordUpdateArg, UnOp, UntypedClause, UntypedRecordUpdateArg,
DefinitionLocation, IfBranch, LogicalOpChainKind, ParsedCallArg, Pattern,
RecordUpdateSpread, Span, TraceKind, TypedClause, TypedRecordUpdateArg, UnOp,
UntypedClause, UntypedRecordUpdateArg,
},
builtins::void,
parser::token::Base,
@ -17,19 +18,19 @@ use crate::{
pub enum TypedExpr {
UInt {
location: Span,
tipo: Arc<Type>,
tipo: Rc<Type>,
value: String,
},
String {
location: Span,
tipo: Arc<Type>,
tipo: Rc<Type>,
value: String,
},
ByteArray {
location: Span,
tipo: Arc<Type>,
tipo: Rc<Type>,
bytes: Vec<u8>,
},
@ -56,30 +57,30 @@ pub enum TypedExpr {
Fn {
location: Span,
tipo: Arc<Type>,
tipo: Rc<Type>,
is_capture: bool,
args: Vec<Arg<Arc<Type>>>,
args: Vec<Arg<Rc<Type>>>,
body: Box<Self>,
return_annotation: Option<Annotation>,
},
List {
location: Span,
tipo: Arc<Type>,
tipo: Rc<Type>,
elements: Vec<Self>,
tail: Option<Box<Self>>,
},
Call {
location: Span,
tipo: Arc<Type>,
tipo: Rc<Type>,
fun: Box<Self>,
args: Vec<CallArg<Self>>,
},
BinOp {
location: Span,
tipo: Arc<Type>,
tipo: Rc<Type>,
name: BinOp,
left: Box<Self>,
right: Box<Self>,
@ -87,22 +88,22 @@ pub enum TypedExpr {
Assignment {
location: Span,
tipo: Arc<Type>,
tipo: Rc<Type>,
value: Box<Self>,
pattern: Pattern<PatternConstructor, Arc<Type>>,
pattern: Pattern<PatternConstructor, Rc<Type>>,
kind: AssignmentKind,
},
Trace {
location: Span,
tipo: Arc<Type>,
tipo: Rc<Type>,
then: Box<Self>,
text: Box<Self>,
},
When {
location: Span,
tipo: Arc<Type>,
tipo: Rc<Type>,
subject: Box<Self>,
clauses: Vec<TypedClause>,
},
@ -111,12 +112,12 @@ pub enum TypedExpr {
location: Span,
branches: Vec1<IfBranch<Self>>,
final_else: Box<Self>,
tipo: Arc<Type>,
tipo: Rc<Type>,
},
RecordAccess {
location: Span,
tipo: Arc<Type>,
tipo: Rc<Type>,
label: String,
index: u64,
record: Box<Self>,
@ -124,7 +125,7 @@ pub enum TypedExpr {
ModuleSelect {
location: Span,
tipo: Arc<Type>,
tipo: Rc<Type>,
label: String,
module_name: String,
module_alias: String,
@ -133,25 +134,25 @@ pub enum TypedExpr {
Tuple {
location: Span,
tipo: Arc<Type>,
tipo: Rc<Type>,
elems: Vec<Self>,
},
TupleIndex {
location: Span,
tipo: Arc<Type>,
tipo: Rc<Type>,
index: usize,
tuple: Box<Self>,
},
ErrorTerm {
location: Span,
tipo: Arc<Type>,
tipo: Rc<Type>,
},
RecordUpdate {
location: Span,
tipo: Arc<Type>,
tipo: Rc<Type>,
spread: Box<Self>,
args: Vec<TypedRecordUpdateArg>,
},
@ -159,13 +160,13 @@ pub enum TypedExpr {
UnOp {
location: Span,
value: Box<Self>,
tipo: Arc<Type>,
tipo: Rc<Type>,
op: UnOp,
},
}
impl TypedExpr {
pub fn tipo(&self) -> Arc<Type> {
pub fn tipo(&self) -> Rc<Type> {
match self {
Self::Var { constructor, .. } => constructor.tipo.clone(),
Self::Trace { then, .. } => then.tipo(),
@ -533,6 +534,12 @@ pub enum UntypedExpr {
location: Span,
value: Box<Self>,
},
LogicalOpChain {
kind: LogicalOpChainKind,
expressions: Vec<Self>,
location: Span,
},
}
pub const DEFAULT_TODO_STR: &str = "aiken::todo";
@ -553,14 +560,15 @@ impl UntypedExpr {
}
pub fn fail(reason: Option<Self>, location: Span) -> Self {
        UntypedExpr::Trace {
            location,
            kind: TraceKind::Error,
            then: Box::new(UntypedExpr::ErrorTerm { location }),
            text: Box::new(reason.unwrap_or_else(|| UntypedExpr::String {
                location,
                value: DEFAULT_ERROR_STR.to_string(),
            })),
        }
        if let Some(reason) = reason {
            UntypedExpr::Trace {
                location,
                kind: TraceKind::Error,
                then: Box::new(UntypedExpr::ErrorTerm { location }),
                text: Box::new(reason),
            }
        } else {
            UntypedExpr::ErrorTerm { location }
        }
}
@ -715,6 +723,7 @@ impl UntypedExpr {
| Self::FieldAccess { location, .. }
| Self::RecordUpdate { location, .. }
| Self::UnOp { location, .. }
| Self::LogicalOpChain { location, .. }
| Self::If { location, .. } => *location,
Self::Sequence {
location,


@ -1,11 +1,11 @@
use crate::{
ast::{
Annotation, Arg, ArgName, AssignmentKind, BinOp, ByteArrayFormatPreference, CallArg,
ClauseGuard, Constant, DataType, Definition, Function, IfBranch, ModuleConstant, Pattern,
RecordConstructor, RecordConstructorArg, RecordUpdateSpread, Span, TraceKind, TypeAlias,
TypedArg, UnOp, UnqualifiedImport, UntypedArg, UntypedClause, UntypedClauseGuard,
UntypedDefinition, UntypedFunction, UntypedModule, UntypedPattern, UntypedRecordUpdateArg,
Use, Validator, CAPTURE_VARIABLE,
ClauseGuard, Constant, DataType, Definition, Function, IfBranch, LogicalOpChainKind,
ModuleConstant, Pattern, RecordConstructor, RecordConstructorArg, RecordUpdateSpread, Span,
TraceKind, TypeAlias, TypedArg, UnOp, UnqualifiedImport, UntypedArg, UntypedClause,
UntypedClauseGuard, UntypedDefinition, UntypedFunction, UntypedModule, UntypedPattern,
UntypedRecordUpdateArg, Use, Validator, CAPTURE_VARIABLE,
},
docvec,
expr::{FnStyle, UntypedExpr, DEFAULT_ERROR_STR, DEFAULT_TODO_STR},
@ -21,7 +21,7 @@ use crate::{
use itertools::Itertools;
use num_bigint::BigInt;
use ordinal::Ordinal;
use std::sync::Arc;
use std::rc::Rc;
use vec1::Vec1;
const INDENT: isize = 2;
@ -171,7 +171,7 @@ impl<'comments> Formatter<'comments> {
line(),
);
let declarations = join(declarations.into_iter(), lines(2));
let declarations = join(declarations, lines(2));
let sep = if has_imports && has_declarations {
lines(2)
@ -712,7 +712,9 @@ impl<'comments> Formatter<'comments> {
.group(),
ByteArrayFormatPreference::Utf8String => nil()
.append("\"")
.append(Document::String(String::from_utf8(bytes.to_vec()).unwrap()))
.append(Document::String(escape(
&String::from_utf8(bytes.to_vec()).unwrap(),
)))
.append("\""),
}
}
@ -773,6 +775,10 @@ impl<'comments> Formatter<'comments> {
..
} => self.if_expr(branches, final_else),
UntypedExpr::LogicalOpChain {
kind, expressions, ..
} => self.logical_op_chain(kind, expressions),
UntypedExpr::PipeLine {
expressions,
one_liner,
@ -860,7 +866,7 @@ impl<'comments> Formatter<'comments> {
.append(suffix)
}
UntypedExpr::ErrorTerm { .. } => "error".to_doc(),
UntypedExpr::ErrorTerm { .. } => "fail".to_doc(),
UntypedExpr::TraceIfFalse { value, .. } => self.trace_if_false(value),
};
@ -868,8 +874,10 @@ impl<'comments> Formatter<'comments> {
commented(document, comments)
}
fn string<'a>(&self, string: &'a String) -> Document<'a> {
let doc = "@".to_doc().append(string.to_doc().surround("\"", "\""));
fn string<'a>(&self, string: &'a str) -> Document<'a> {
let doc = "@"
.to_doc()
.append(Document::String(escape(string)).surround("\"", "\""));
if string.contains('\n') {
doc.force_break()
} else {
@ -942,6 +950,7 @@ impl<'comments> Formatter<'comments> {
if args.is_empty() && with_spread {
if is_record {
name.append(" { .. }")
// TODO: not possible
} else {
name.append("(..)")
}
@ -1109,6 +1118,27 @@ impl<'comments> Formatter<'comments> {
}
}
fn logical_op_chain<'a>(
&mut self,
kind: &'a LogicalOpChainKind,
expressions: &'a [UntypedExpr],
) -> Document<'a> {
kind.to_doc()
.append(" {")
.append(
line()
.append(join(
expressions.iter().map(|expression| self.expr(expression)),
",".to_doc().append(line()),
))
.nest(INDENT)
.group(),
)
.append(",")
.append(line())
.append("}")
}
fn pipeline<'a>(
&mut self,
expressions: &'a Vec1<UntypedExpr>,
@ -1430,7 +1460,7 @@ impl<'comments> Formatter<'comments> {
name: &'a str,
args: &'a [TypedArg],
return_annotation: &'a Option<Annotation>,
return_type: Arc<Type>,
return_type: Rc<Type>,
) -> Document<'a> {
let head = name.to_doc().append(self.docs_fn_args(args)).append(" -> ");
@ -1459,7 +1489,7 @@ impl<'comments> Formatter<'comments> {
wrap_args(args.iter().map(|e| (self.docs_fn_arg(e), false)))
}
fn docs_fn_arg<'a>(&mut self, arg: &'a Arg<Arc<Type>>) -> Document<'a> {
fn docs_fn_arg<'a>(&mut self, arg: &'a TypedArg) -> Document<'a> {
self.docs_fn_arg_name(&arg.arg_name)
.append(self.type_or_annotation(&arg.annotation, &arg.tipo))
.group()
@ -1476,7 +1506,7 @@ impl<'comments> Formatter<'comments> {
fn type_or_annotation<'a>(
&mut self,
annotation: &'a Option<Annotation>,
type_info: &Arc<Type>,
type_info: &Rc<Type>,
) -> Document<'a> {
match annotation {
Some(a) => self.annotation(a),
@ -1745,6 +1775,16 @@ impl<'a> Documentable<'a> for &'a UnqualifiedImport {
}
}
impl<'a> Documentable<'a> for &'a LogicalOpChainKind {
fn to_doc(self) -> Document<'a> {
match self {
LogicalOpChainKind::And => "and",
LogicalOpChainKind::Or => "or",
}
.to_doc()
}
}
impl<'a> Documentable<'a> for &'a BinOp {
fn to_doc(self) -> Document<'a> {
match self {
@ -2007,3 +2047,17 @@ fn is_breakable_expr(expr: &UntypedExpr) -> bool {
| UntypedExpr::If { .. }
)
}
fn escape(string: &str) -> String {
string
.chars()
.flat_map(|c| match c {
'\n' => vec!['\\', 'n'],
'\r' => vec!['\\', 'r'],
'\t' => vec!['\\', 't'],
'"' => vec!['\\', c],
'\\' => vec!['\\', c],
_ => vec![c],
})
.collect::<String>()
}
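For a quick sanity check of the `escape` helper above, here is a standalone copy with a few assertions on the output it produces:

```rust
// Standalone copy of `escape` from the diff above, plus checks.
fn escape(string: &str) -> String {
    string
        .chars()
        .flat_map(|c| match c {
            '\n' => vec!['\\', 'n'],
            '\r' => vec!['\\', 'r'],
            '\t' => vec!['\\', 't'],
            '"' => vec!['\\', c],
            '\\' => vec!['\\', c],
            _ => vec![c],
        })
        .collect::<String>()
}

fn main() {
    assert_eq!(escape("say \"hi\""), "say \\\"hi\\\""); // quotes are escaped
    assert_eq!(escape("a\nb\tc"), "a\\nb\\tc");         // control chars become \n, \t
    assert_eq!(escape("1/2"), "1/2");                   // forward slashes pass through
}
```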

File diff suppressed because it is too large


@ -1,5 +1,5 @@
use indexmap::IndexSet;
use std::sync::Arc;
use std::rc::Rc;
use uplc::builtins::DefaultFunction;
use crate::{
@ -7,424 +7,184 @@ use crate::{
tipo::{Type, ValueConstructor},
};
use super::scope::Scope;
#[derive(Debug, Clone, PartialEq)]
pub enum Air {
// Primitives
Int {
scope: Scope,
value: String,
},
String {
scope: Scope,
value: String,
},
ByteArray {
scope: Scope,
bytes: Vec<u8>,
},
Bool {
scope: Scope,
value: bool,
},
List {
scope: Scope,
count: usize,
tipo: Arc<Type>,
tipo: Rc<Type>,
tail: bool,
},
Tuple {
scope: Scope,
tipo: Arc<Type>,
tipo: Rc<Type>,
count: usize,
},
Void {
scope: Scope,
},
Void,
Var {
scope: Scope,
constructor: ValueConstructor,
name: String,
variant_name: String,
},
// Functions
Call {
scope: Scope,
count: usize,
tipo: Arc<Type>,
tipo: Rc<Type>,
},
DefineFunc {
scope: Scope,
func_name: String,
module_name: String,
params: Vec<String>,
recursive: bool,
recursive_nonstatic_params: Vec<String>,
variant_name: String,
},
DefineCyclicFuncs {
func_name: String,
module_name: String,
variant_name: String,
// just the params
contained_functions: Vec<Vec<String>>,
},
Fn {
scope: Scope,
params: Vec<String>,
},
Builtin {
scope: Scope,
count: usize,
func: DefaultFunction,
tipo: Arc<Type>,
tipo: Rc<Type>,
},
// Operators
BinOp {
scope: Scope,
name: BinOp,
tipo: Arc<Type>,
tipo: Rc<Type>,
argument_tipo: Rc<Type>,
},
UnOp {
scope: Scope,
op: UnOp,
},
// Assignment
Let {
scope: Scope,
name: String,
},
UnWrapData {
scope: Scope,
tipo: Arc<Type>,
CastFromData {
tipo: Rc<Type>,
},
WrapData {
scope: Scope,
tipo: Arc<Type>,
CastToData {
tipo: Rc<Type>,
},
AssertConstr {
scope: Scope,
constr_index: usize,
},
AssertBool {
scope: Scope,
is_true: bool,
},
// When
When {
scope: Scope,
tipo: Arc<Type>,
tipo: Rc<Type>,
subject_name: String,
subject_tipo: Rc<Type>,
},
Clause {
scope: Scope,
tipo: Arc<Type>,
subject_tipo: Rc<Type>,
subject_name: String,
complex_clause: bool,
},
ListClause {
scope: Scope,
tipo: Arc<Type>,
subject_tipo: Rc<Type>,
tail_name: String,
next_tail_name: Option<String>,
next_tail_name: Option<(String, String)>,
complex_clause: bool,
},
WrapClause {
scope: Scope,
},
WrapClause,
TupleClause {
scope: Scope,
tipo: Arc<Type>,
subject_tipo: Rc<Type>,
indices: IndexSet<(usize, String)>,
predefined_indices: IndexSet<(usize, String)>,
subject_name: String,
count: usize,
complex_clause: bool,
},
ClauseGuard {
scope: Scope,
subject_name: String,
tipo: Arc<Type>,
subject_tipo: Rc<Type>,
},
ListClauseGuard {
scope: Scope,
tipo: Arc<Type>,
subject_tipo: Rc<Type>,
tail_name: String,
next_tail_name: Option<String>,
inverse: bool,
},
Finally {
scope: Scope,
TupleGuard {
subject_tipo: Rc<Type>,
indices: IndexSet<(usize, String)>,
subject_name: String,
},
Finally,
// If
If {
scope: Scope,
tipo: Arc<Type>,
tipo: Rc<Type>,
},
// Record Creation
Record {
scope: Scope,
Constr {
tag: usize,
tipo: Arc<Type>,
tipo: Rc<Type>,
count: usize,
},
RecordUpdate {
scope: Scope,
highest_index: usize,
indices: Vec<(usize, Arc<Type>)>,
tipo: Arc<Type>,
indices: Vec<(usize, Rc<Type>)>,
tipo: Rc<Type>,
},
// Field Access
RecordAccess {
scope: Scope,
record_index: u64,
tipo: Arc<Type>,
tipo: Rc<Type>,
},
FieldsExpose {
scope: Scope,
indices: Vec<(usize, String, Arc<Type>)>,
indices: Vec<(usize, String, Rc<Type>)>,
check_last_item: bool,
},
// ListAccess
ListAccessor {
scope: Scope,
tipo: Arc<Type>,
tipo: Rc<Type>,
names: Vec<String>,
tail: bool,
check_last_item: bool,
},
ListExpose {
scope: Scope,
tipo: Arc<Type>,
tipo: Rc<Type>,
tail_head_names: Vec<(String, String)>,
tail: Option<(String, String)>,
},
// Tuple Access
TupleAccessor {
scope: Scope,
names: Vec<String>,
tipo: Arc<Type>,
tipo: Rc<Type>,
check_last_item: bool,
},
TupleIndex {
scope: Scope,
tipo: Arc<Type>,
tipo: Rc<Type>,
tuple_index: usize,
},
// Misc.
ErrorTerm {
scope: Scope,
tipo: Arc<Type>,
tipo: Rc<Type>,
},
Trace {
scope: Scope,
tipo: Arc<Type>,
},
NoOp {
scope: Scope,
},
FieldsEmpty {
scope: Scope,
},
ListEmpty {
scope: Scope,
tipo: Rc<Type>,
},
}
impl Air {
pub fn scope(&self) -> Scope {
match self {
Air::Int { scope, .. }
| Air::String { scope, .. }
| Air::ByteArray { scope, .. }
| Air::Bool { scope, .. }
| Air::List { scope, .. }
| Air::Tuple { scope, .. }
| Air::Void { scope }
| Air::Var { scope, .. }
| Air::Call { scope, .. }
| Air::DefineFunc { scope, .. }
| Air::Fn { scope, .. }
| Air::Builtin { scope, .. }
| Air::BinOp { scope, .. }
| Air::UnOp { scope, .. }
| Air::Let { scope, .. }
| Air::UnWrapData { scope, .. }
| Air::WrapData { scope, .. }
| Air::AssertConstr { scope, .. }
| Air::AssertBool { scope, .. }
| Air::When { scope, .. }
| Air::Clause { scope, .. }
| Air::ListClause { scope, .. }
| Air::WrapClause { scope }
| Air::TupleClause { scope, .. }
| Air::ClauseGuard { scope, .. }
| Air::ListClauseGuard { scope, .. }
| Air::Finally { scope }
| Air::If { scope, .. }
| Air::Record { scope, .. }
| Air::RecordUpdate { scope, .. }
| Air::RecordAccess { scope, .. }
| Air::FieldsExpose { scope, .. }
| Air::FieldsEmpty { scope }
| Air::ListEmpty { scope }
| Air::ListAccessor { scope, .. }
| Air::ListExpose { scope, .. }
| Air::TupleAccessor { scope, .. }
| Air::TupleIndex { scope, .. }
| Air::ErrorTerm { scope, .. }
| Air::Trace { scope, .. }
| Air::NoOp { scope, .. } => scope.clone(),
}
}
pub fn scope_mut(&mut self) -> &mut Scope {
match self {
Air::Int { scope, .. }
| Air::String { scope, .. }
| Air::ByteArray { scope, .. }
| Air::Bool { scope, .. }
| Air::List { scope, .. }
| Air::Tuple { scope, .. }
| Air::Void { scope }
| Air::Var { scope, .. }
| Air::Call { scope, .. }
| Air::DefineFunc { scope, .. }
| Air::Fn { scope, .. }
| Air::Builtin { scope, .. }
| Air::BinOp { scope, .. }
| Air::UnOp { scope, .. }
| Air::Let { scope, .. }
| Air::UnWrapData { scope, .. }
| Air::WrapData { scope, .. }
| Air::AssertConstr { scope, .. }
| Air::AssertBool { scope, .. }
| Air::When { scope, .. }
| Air::Clause { scope, .. }
| Air::ListClause { scope, .. }
| Air::WrapClause { scope }
| Air::TupleClause { scope, .. }
| Air::ClauseGuard { scope, .. }
| Air::ListClauseGuard { scope, .. }
| Air::Finally { scope }
| Air::If { scope, .. }
| Air::Record { scope, .. }
| Air::RecordUpdate { scope, .. }
| Air::RecordAccess { scope, .. }
| Air::FieldsExpose { scope, .. }
| Air::FieldsEmpty { scope }
| Air::ListEmpty { scope }
| Air::ListAccessor { scope, .. }
| Air::ListExpose { scope, .. }
| Air::TupleAccessor { scope, .. }
| Air::TupleIndex { scope, .. }
| Air::ErrorTerm { scope, .. }
| Air::Trace { scope, .. }
| Air::NoOp { scope, .. } => scope,
}
}
pub fn tipo(&self) -> Option<Arc<Type>> {
match self {
Air::Int { .. } => Some(
Type::App {
public: true,
module: String::new(),
name: "Int".to_string(),
args: vec![],
}
.into(),
),
Air::String { .. } => Some(
Type::App {
public: true,
module: String::new(),
name: "String".to_string(),
args: vec![],
}
.into(),
),
Air::ByteArray { .. } => Some(
Type::App {
public: true,
module: String::new(),
name: "ByteArray".to_string(),
args: vec![],
}
.into(),
),
Air::Bool { .. } => Some(
Type::App {
public: true,
module: String::new(),
name: "Bool".to_string(),
args: vec![],
}
.into(),
),
Air::Void { .. } => Some(
Type::App {
public: true,
module: String::new(),
name: "Void".to_string(),
args: vec![],
}
.into(),
),
Air::WrapData { .. } => Some(
Type::App {
public: true,
module: String::new(),
name: "Data".to_string(),
args: vec![],
}
.into(),
),
Air::Var { constructor, .. } => Some(constructor.tipo.clone()),
Air::List { tipo, .. }
| Air::Tuple { tipo, .. }
| Air::Call { tipo, .. }
| Air::Builtin { tipo, .. }
| Air::BinOp { tipo, .. }
| Air::UnWrapData { tipo, .. }
| Air::When { tipo, .. }
| Air::Clause { tipo, .. }
| Air::ListClause { tipo, .. }
| Air::TupleClause { tipo, .. }
| Air::ClauseGuard { tipo, .. }
| Air::If { tipo, .. }
| Air::ListClauseGuard { tipo, .. }
| Air::Record { tipo, .. }
| Air::RecordUpdate { tipo, .. }
| Air::RecordAccess { tipo, .. }
| Air::ListAccessor { tipo, .. }
| Air::ListExpose { tipo, .. }
| Air::TupleAccessor { tipo, .. }
| Air::TupleIndex { tipo, .. }
| Air::ErrorTerm { tipo, .. }
| Air::Trace { tipo, .. } => Some(tipo.clone()),
Air::DefineFunc { .. }
| Air::Fn { .. }
| Air::Let { .. }
| Air::WrapClause { .. }
| Air::AssertConstr { .. }
| Air::AssertBool { .. }
| Air::Finally { .. }
| Air::FieldsExpose { .. }
| Air::FieldsEmpty { .. }
| Air::ListEmpty { .. }
| Air::NoOp { .. } => None,
Air::UnOp { op, .. } => match op {
UnOp::Not => Some(
Type::App {
public: true,
module: String::new(),
name: "Bool".to_string(),
args: vec![],
}
.into(),
),
UnOp::Negate => Some(
Type::App {
public: true,
module: String::new(),
name: "Int".to_string(),
args: vec![],
}
.into(),
),
},
}
}
NoOp,
FieldsEmpty,
ListEmpty,
}
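The bulk of this file's change is mechanical: every `Air` variant drops its `scope` field, so the two long `scope`/`scope_mut` projections above disappear and field-less variants (`Void`, `WrapClause`, `Finally`, `NoOp`, ...) collapse into unit variants. A tiny before/after sketch of that refactor, using hypothetical `MiniAir*` types:

```rust
#[derive(Debug, Default)]
struct Scope(Vec<u64>);

// Before: every variant carries the field, so a projection method needs an
// arm (or an or-pattern) per variant, and field-less cases still need braces.
#[allow(dead_code)]
enum MiniAirBefore {
    Int { scope: Scope, value: String },
    Void { scope: Scope },
}

impl MiniAirBefore {
    fn scope(&self) -> &Scope {
        match self {
            MiniAirBefore::Int { scope, .. } | MiniAirBefore::Void { scope } => scope,
        }
    }
}

// After: dropping the field deletes the projection entirely and lets
// field-less variants become plain unit variants.
#[allow(dead_code)]
#[derive(Debug)]
enum MiniAirAfter {
    Int { value: String },
    Void,
}

fn main() {
    let before = MiniAirBefore::Int { scope: Scope::default(), value: "42".into() };
    println!("{:?}", before.scope());
    println!("{:?}", MiniAirAfter::Void);
}
```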

File diff suppressed because it is too large


@ -1,147 +0,0 @@
#[derive(Debug, Clone, Default, Eq, PartialEq)]
pub struct Scope(pub(self) Vec<u64>);
impl From<Vec<u64>> for Scope {
fn from(value: Vec<u64>) -> Self {
Self(value)
}
}
impl Scope {
pub fn push(&mut self, value: u64) {
self.0.push(value);
}
pub fn is_empty(&self) -> bool {
self.0.is_empty()
}
/// Find the common ancestor with the replacement,
/// remove it from `self`, and then prepend the
/// `replacement` to `self`.
pub fn replace(&mut self, mut replacement: Scope) {
let common = self.common_ancestor(&replacement);
// we know that common will always be in the front of the
// scope Vec so we can always drain `0..common.len()`.
self.0.drain(0..common.0.len());
replacement.0.extend(self.0.iter());
self.0 = replacement.0;
}
pub fn common_ancestor(&self, other: &Self) -> Scope {
let longest_length = self.0.len().max(other.0.len());
if *self.0 == *other.0 {
return self.clone();
}
for index in 0..longest_length {
if self.0.get(index).is_none() {
return self.clone();
} else if other.0.get(index).is_none() {
return other.clone();
} else if self.0[index] != other.0[index] {
return Scope(self.0[0..index].to_vec());
}
}
Scope::default()
}
}
#[cfg(test)]
mod test {
use pretty_assertions::assert_eq;
use super::Scope;
#[test]
fn common_ancestor_equal_vecs() {
let ancestor = Scope(vec![1, 2, 3, 4, 5, 6]);
let descendant = Scope(vec![1, 2, 3, 4, 5, 6]);
let result = ancestor.common_ancestor(&descendant);
assert_eq!(result, Scope(vec![1, 2, 3, 4, 5, 6]))
}
#[test]
fn common_ancestor_equal_ancestor() {
let ancestor = Scope(vec![1, 2, 3, 4]);
let descendant = Scope(vec![1, 2, 3, 4, 5, 6]);
let result = ancestor.common_ancestor(&descendant);
assert_eq!(result, Scope(vec![1, 2, 3, 4]));
}
#[test]
fn common_ancestor_not_subset() {
let ancestor = Scope(vec![1, 2, 3, 4, 5]);
let descendant = Scope(vec![1, 2, 3, 7, 8]);
let result = ancestor.common_ancestor(&descendant);
assert_eq!(result, Scope(vec![1, 2, 3]));
}
#[test]
fn common_ancestor_not_found() {
let ancestor = Scope(vec![1, 2, 3, 4, 5, 6]);
let descendant = Scope(vec![4, 5, 6]);
let result = ancestor.common_ancestor(&descendant);
assert_eq!(result, Scope::default());
}
#[test]
fn common_ancestor_no_shared_values() {
let ancestor = Scope(vec![1, 2, 3]);
let descendant = Scope(vec![4, 5, 6]);
let result = ancestor.common_ancestor(&descendant);
assert_eq!(result, Scope::default());
}
#[test]
fn replace_same_value() {
let mut value = Scope(vec![1, 2, 3, 4, 5, 6]);
let replacement = Scope(vec![1, 2, 3, 4, 5, 6]);
value.replace(replacement);
assert_eq!(value, Scope(vec![1, 2, 3, 4, 5, 6]));
}
#[test]
fn replace_with_pattern() {
let mut value = Scope(vec![1, 2, 3, 4, 5]);
let replacement = Scope(vec![1, 2, 8, 9]);
value.replace(replacement);
assert_eq!(value, Scope(vec![1, 2, 8, 9, 3, 4, 5]));
}
#[test]
fn replace_with_no_pattern() {
let mut value = Scope(vec![1, 2, 3, 4, 5]);
let replacement = Scope(vec![8, 9]);
value.replace(replacement);
assert_eq!(value, Scope(vec![8, 9, 1, 2, 3, 4, 5]));
}
}

File diff suppressed because it is too large

File diff suppressed because it is too large


@ -54,16 +54,26 @@ pub fn parser() -> impl Parser<Token, ast::UntypedDefinition, Error = ParseError
|(((public, opaque), (name, parameters)), constructors), span| {
ast::UntypedDefinition::DataType(ast::DataType {
location: span,
            constructors: constructors
                .into_iter()
                .map(|mut constructor| {
                    if constructor.sugar {
                        constructor.name = name.clone();
                    }
                    constructor
                })
                .collect(),
            constructors: if constructors.is_empty() {
                vec![ast::RecordConstructor {
                    location: span,
                    arguments: vec![],
                    doc: None,
                    name: name.clone(),
                    sugar: true,
                }]
            } else {
                constructors
                    .into_iter()
                    .map(|mut constructor| {
                        if constructor.sugar {
                            constructor.name = name.clone();
                        }
                        constructor
                    })
                    .collect()
            },
doc: None,
name,
opaque,
@ -119,4 +129,25 @@ mod tests {
"#
);
}
#[test]
fn record_sugar() {
assert_definition!(
r#"
pub type Foo {
wow: Int,
}
"#
);
}
#[test]
fn empty_record_sugar() {
assert_definition!(
r#"
pub type Foo {
}
"#
);
}
}
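The fallback above means an empty data-type body now desugars to a single record-sugar constructor named after the type, so `pub type Foo {}` parses like `pub type Foo { Foo }` (see the snapshots below). A hypothetical mirror of that logic, with a stand-in struct rather than the real AST:

```rust
#[derive(Debug)]
struct Constructor {
    name: String,
    sugar: bool,
}

// Hypothetical mirror of the parser fallback: an empty body synthesizes one
// sugar constructor named after the type itself.
fn constructors_for(type_name: &str, parsed: Vec<Constructor>) -> Vec<Constructor> {
    if parsed.is_empty() {
        vec![Constructor { name: type_name.to_string(), sugar: true }]
    } else {
        parsed
    }
}

fn main() {
    let out = constructors_for("Foo", vec![]);
    println!("{out:?}"); // one implicit constructor named "Foo"
}
```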


@ -0,0 +1,24 @@
---
source: crates/aiken-lang/src/parser/definition/data_type.rs
description: "Code:\n\npub type Foo {\n}\n"
---
DataType(
DataType {
constructors: [
RecordConstructor {
location: 0..16,
name: "Foo",
arguments: [],
doc: None,
sugar: true,
},
],
doc: None,
location: 0..16,
name: "Foo",
opaque: false,
parameters: [],
public: true,
typed_parameters: [],
},
)


@ -0,0 +1,39 @@
---
source: crates/aiken-lang/src/parser/definition/data_type.rs
description: "Code:\n\npub type Foo {\n wow: Int,\n}\n"
---
DataType(
DataType {
constructors: [
RecordConstructor {
location: 13..28,
name: "Foo",
arguments: [
RecordConstructorArg {
label: Some(
"wow",
),
annotation: Constructor {
location: 22..25,
module: None,
name: "Int",
arguments: [],
},
location: 17..25,
tipo: (),
doc: None,
},
],
doc: None,
sugar: true,
},
],
doc: None,
location: 0..28,
name: "Foo",
opaque: false,
parameters: [],
public: true,
typed_parameters: [],
},
)


@ -0,0 +1,72 @@
use chumsky::prelude::*;
use crate::{
ast::LogicalOpChainKind,
expr::UntypedExpr,
parser::{error::ParseError, token::Token},
};
pub fn parser(
expression: Recursive<'_, Token, UntypedExpr, ParseError>,
) -> impl Parser<Token, UntypedExpr, Error = ParseError> + '_ {
choice((
just(Token::And).to(LogicalOpChainKind::And),
just(Token::Or).to(LogicalOpChainKind::Or),
))
.then(
expression
.separated_by(just(Token::Comma))
.allow_trailing()
.delimited_by(just(Token::LeftBrace), just(Token::RightBrace)),
)
.map_with_span(|(kind, exprs), span| UntypedExpr::LogicalOpChain {
kind,
expressions: exprs,
location: span,
})
}
#[cfg(test)]
mod tests {
use crate::assert_expr;
#[test]
fn and_chain() {
assert_expr!(
r#"
and {
1 == 2,
something,
}
"#
);
}
#[test]
fn or_chain() {
assert_expr!(
r#"
or {
1 == 2,
something,
}
"#
);
}
#[test]
fn and_or_chain() {
assert_expr!(
r#"
or {
1 == 2,
something,
and {
1 == 2,
something,
},
}
"#
);
}
}


@ -24,15 +24,24 @@ mod tests {
use crate::assert_expr;
#[test]
fn block_basic() {
fn block_let() {
assert_expr!(
r#"
let b = {
let x = 4
x + 5
}
"#
);
}
#[test]
fn block_single() {
assert_expr!(
r#"{
foo
}
"#
);
}
}


@ -28,4 +28,9 @@ mod tests {
fn bytearray_utf8_encoded() {
assert_expr!("\"aiken\"");
}
#[test]
fn bytearray_utf8_escaped() {
assert_expr!("\"\\\"aiken\\\"\"");
}
}


@ -1,6 +1,5 @@
use chumsky::prelude::*;
use super::anonymous_binop::parser as anonymous_binop;
use super::anonymous_function::parser as anonymous_function;
use super::assignment;
use super::block::parser as block;
@ -14,6 +13,7 @@ use super::string::parser as string;
use super::tuple::parser as tuple;
use super::var::parser as var;
use super::when::parser as when;
use super::{and_or_chain, anonymous_binop::parser as anonymous_binop};
use crate::{
expr::UntypedExpr,
@ -33,6 +33,7 @@ pub fn parser<'a>(
field_access::parser(),
call(expression.clone()),
));
chain_start(sequence, expression)
.then(chain.repeated())
.foldl(|expr, chain| match chain {
@ -60,6 +61,7 @@ pub fn chain_start<'a>(
record_update(expression.clone()),
record(expression.clone()),
field_access::constructor(),
and_or_chain(expression.clone()),
var(),
tuple(expression.clone()),
bytearray(),


@ -71,4 +71,19 @@ mod tests {
"#
);
}
#[test]
fn if_else_ambiguous_record() {
assert_expr!(
r#"
if ec1 == Infinity {
ec2
} else if ec1 == Foo { foo } {
ec1
} else {
Infinity
}
"#
);
}
}


@ -1,6 +1,7 @@
use chumsky::prelude::*;
use vec1::Vec1;
mod and_or_chain;
mod anonymous_binop;
pub mod anonymous_function;
pub mod assignment;
@ -19,6 +20,7 @@ mod tuple;
mod var;
pub mod when;
pub use and_or_chain::parser as and_or_chain;
pub use anonymous_function::parser as anonymous_function;
pub use block::parser as block;
pub use bytearray::parser as bytearray;


@ -72,6 +72,38 @@ pub fn parser(
},
),
))
// NOTE: There's an ambiguity when the record shorthand syntax is used
// from within an if-else statement, in the case of a single-variable if-branch.
//
// For example, imagine the following:
//
// ```
// if season == Summer {
// foo
// } else {
// bar
// }
// ```
//
// Without that next odd parser combinator, the parser would parse:
//
// ```
// if season == Summer { foo }
// else {
// bar
// }
// ```
//
// And then immediately choke on the trailing `else`, because the if-branch body
// has already been consumed and interpreted as a record definition. So the next
// combinator ensures that we give priority back to the if-else statement rather
// than to the record definition.
.then_ignore(
just(Token::RightBrace)
.ignore_then(just(Token::Else))
.not()
.rewind(),
)
.map(|(value, name)| ast::CallArg {
location: value.location(),
value,
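The `.not().rewind()` pair above is a negative lookahead: it asserts that the upcoming tokens are not `} else` without consuming them. A minimal sketch of the same trick over characters, assuming the same chumsky combinator API used throughout this parser (`.not()` and `.rewind()` as above); the toy grammar is invented:

```rust
use chumsky::prelude::*;

// Negative lookahead sketch: accept `}` only when it is NOT followed by
// ` else`. `.not()` succeeds where the inner parser fails, and `.rewind()`
// puts the cursor back so the check consumes nothing.
fn brace_not_closing_if() -> impl Parser<char, char, Error = Simple<char>> {
    just('}').then_ignore(just(" else").not().rewind())
}

fn main() {
    assert!(brace_not_closing_if().parse("} + 1").is_ok());
    assert!(brace_not_closing_if().parse("} else").is_err());
}
```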


@ -0,0 +1,32 @@
---
source: crates/aiken-lang/src/parser/expr/and_or_chain.rs
description: "Code:\n\nand {\n 1 == 2,\n something,\n}\n"
---
LogicalOpChain {
kind: And,
expressions: [
BinOp {
location: 8..14,
name: Eq,
left: UInt {
location: 8..9,
value: "1",
base: Decimal {
numeric_underscore: false,
},
},
right: UInt {
location: 13..14,
value: "2",
base: Decimal {
numeric_underscore: false,
},
},
},
Var {
location: 18..27,
name: "something",
},
],
location: 0..30,
}


@ -0,0 +1,60 @@
---
source: crates/aiken-lang/src/parser/expr/and_or_chain.rs
description: "Code:\n\nor {\n 1 == 2,\n something,\n and {\n 1 == 2,\n something,\n },\n}\n"
---
LogicalOpChain {
kind: Or,
expressions: [
BinOp {
location: 7..13,
name: Eq,
left: UInt {
location: 7..8,
value: "1",
base: Decimal {
numeric_underscore: false,
},
},
right: UInt {
location: 12..13,
value: "2",
base: Decimal {
numeric_underscore: false,
},
},
},
Var {
location: 17..26,
name: "something",
},
LogicalOpChain {
kind: And,
expressions: [
BinOp {
location: 40..46,
name: Eq,
left: UInt {
location: 40..41,
value: "1",
base: Decimal {
numeric_underscore: false,
},
},
right: UInt {
location: 45..46,
value: "2",
base: Decimal {
numeric_underscore: false,
},
},
},
Var {
location: 52..61,
name: "something",
},
],
location: 30..66,
},
],
location: 0..69,
}


@ -0,0 +1,49 @@
---
source: crates/aiken-lang/src/parser/expr/block.rs
description: "Code:\n\nlet b = {\n let x = 4\n x + 5\n}\n"
---
Assignment {
location: 0..31,
value: Sequence {
location: 12..29,
expressions: [
Assignment {
location: 12..21,
value: UInt {
location: 20..21,
value: "4",
base: Decimal {
numeric_underscore: false,
},
},
pattern: Var {
location: 16..17,
name: "x",
},
kind: Let,
annotation: None,
},
BinOp {
location: 24..29,
name: AddInt,
left: Var {
location: 24..25,
name: "x",
},
right: UInt {
location: 28..29,
value: "5",
base: Decimal {
numeric_underscore: false,
},
},
},
],
},
pattern: Var {
location: 4..5,
name: "b",
},
kind: Let,
annotation: None,
}


@ -0,0 +1,8 @@
---
source: crates/aiken-lang/src/parser/expr/block.rs
description: "Code:\n\n{\nfoo\n}\n"
---
Var {
location: 2..5,
name: "foo",
}


@ -0,0 +1,17 @@
---
source: crates/aiken-lang/src/parser/expr/bytearray.rs
description: "Code:\n\n\"\\\"aiken\\\"\""
---
ByteArray {
location: 0..11,
bytes: [
34,
97,
105,
107,
101,
110,
34,
],
preferred_format: Utf8String,
}


@ -2,14 +2,6 @@
source: crates/aiken-lang/src/parser/expr/fail_todo_trace.rs
description: "Code:\n\nfail\n"
---
Trace {
    kind: Error,
    location: 0..4,
    then: ErrorTerm {
        location: 0..4,
    },
    text: String {
        location: 0..4,
        value: "aiken::error",
    },
}
ErrorTerm {
    location: 0..4,
}


@ -0,0 +1,66 @@
---
source: crates/aiken-lang/src/parser/expr/if_else.rs
description: "Code:\n\nif ec1 == Infinity {\n ec2\n} else if ec1 == Foo { foo } {\n ec1\n} else {\n Infinity\n}\n"
---
If {
location: 0..85,
branches: [
IfBranch {
condition: BinOp {
location: 3..18,
name: Eq,
left: Var {
location: 3..6,
name: "ec1",
},
right: Var {
location: 10..18,
name: "Infinity",
},
},
body: Var {
location: 23..26,
name: "ec2",
},
location: 3..28,
},
IfBranch {
condition: BinOp {
location: 37..55,
name: Eq,
left: Var {
location: 37..40,
name: "ec1",
},
right: Call {
arguments: [
CallArg {
label: Some(
"foo",
),
location: 50..53,
value: Var {
location: 50..53,
name: "foo",
},
},
],
fun: Var {
location: 44..47,
name: "Foo",
},
location: 44..55,
},
},
body: Var {
location: 60..63,
name: "ec1",
},
location: 37..65,
},
],
final_else: Var {
location: 75..83,
name: "Infinity",
},
}


@ -0,0 +1,32 @@
---
source: crates/aiken-lang/src/parser/expr/and_or_chain.rs
description: "Code:\n\nor {\n 1 == 2,\n something,\n}\n"
---
LogicalOpChain {
kind: Or,
expressions: [
BinOp {
location: 7..13,
name: Eq,
left: UInt {
location: 7..8,
value: "1",
base: Decimal {
numeric_underscore: false,
},
},
right: UInt {
location: 12..13,
value: "2",
base: Decimal {
numeric_underscore: false,
},
},
},
Var {
location: 17..26,
name: "something",
},
],
location: 0..29,
}


@ -24,16 +24,8 @@ When {
},
],
guard: None,
then: Trace {
    kind: Error,
    location: 28..32,
    then: ErrorTerm {
        location: 28..32,
    },
    text: String {
        location: 28..32,
        value: "aiken::error",
    },
},
then: ErrorTerm {
    location: 28..32,
},
},
],


@ -196,10 +196,7 @@ pub fn lexer() -> impl Parser<char, Vec<(Token, Span)>, Error = ParseError> {
let escape = just('\\').ignore_then(
just('\\')
.or(just('/'))
.or(just('"'))
.or(just('b').to('\x08'))
.or(just('f').to('\x0C'))
.or(just('n').to('\n'))
.or(just('r').to('\r'))
.or(just('t').to('\t')),
@ -222,11 +219,12 @@ pub fn lexer() -> impl Parser<char, Vec<(Token, Span)>, Error = ParseError> {
let keyword = text::ident().map(|s: String| match s.as_str() {
"trace" => Token::Trace,
"fail" => Token::Fail,
// TODO: delete this eventually
// TODO: remove this in a future release
"error" => Token::Fail,
"fail" => Token::Fail,
"as" => Token::As,
"assert" => Token::Expect,
"and" => Token::And,
"or" => Token::Or,
"expect" => Token::Expect,
"const" => Token::Const,
"fn" => Token::Fn,


@ -39,9 +39,9 @@ pub fn array_of_bytes(
.delimited_by(just(Token::LeftSquare), just(Token::RightSquare)),
)
.validate(|bytes, span, emit| {
let base = bytes.iter().fold(Ok(None), |acc, (_, base)| match acc {
Ok(None) => Ok(Some(base)),
Ok(Some(previous_base)) if previous_base == base => Ok(Some(base)),
let base = bytes.iter().try_fold(None, |acc, (_, base)| match acc {
None => Ok(Some(base)),
Some(previous_base) if previous_base == base => Ok(Some(base)),
_ => Err(()),
});
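The rewrite above trades a `fold` over `Result<Option<_>, ()>` for `try_fold`, which threads the accumulator and short-circuits on the first mismatch. A standalone sketch of the same base-consistency check, with plain `u8` values standing in for the lexer's `Base`:

```rust
// try_fold keeps the running "base seen so far" and bails out with Err(())
// as soon as two elements disagree, like the validate closure above.
fn common_base(bases: &[u8]) -> Result<Option<u8>, ()> {
    bases.iter().try_fold(None, |acc, &base| match acc {
        None => Ok(Some(base)),
        Some(previous_base) if previous_base == base => Ok(Some(base)),
        _ => Err(()),
    })
}

fn main() {
    assert_eq!(common_base(&[16, 16, 16]), Ok(Some(16)));
    assert_eq!(common_base(&[]), Ok(None));
    assert!(common_base(&[16, 10]).is_err());
}
```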


@ -56,6 +56,8 @@ pub enum Token {
Vbar, // '|'
VbarVbar, // '||'
AmperAmper, // '&&'
And, // and
Or, // or
NewLinePipe, // '↳|>'
Pipe, // '|>'
Dot, // '.'
@ -143,6 +145,8 @@ impl fmt::Display for Token {
Token::Vbar => "|",
Token::VbarVbar => "||",
Token::AmperAmper => "&&",
Token::And => "and",
Token::Or => "or",
Token::NewLinePipe => "↳|>",
Token::Pipe => "|>",
Token::Dot => ".",


@ -154,6 +154,486 @@ fn multi_validator_warning() {
))
}
#[test]
fn exhaustiveness_simple() {
let source_code = r#"
type Foo {
Bar
Baz
}
fn foo() {
let thing = Bar
when thing is {
Bar -> True
}
}
"#;
assert!(matches!(
check(parse(source_code)),
Err((
_,
Error::NotExhaustivePatternMatch {
unmatched,
..
}
)) if unmatched[0] == "Baz"
))
}
#[test]
fn exhaustiveness_missing_empty_list() {
let source_code = r#"
fn foo() {
let thing = [1, 2]
when thing is {
[a, ..] -> True
}
}
"#;
assert!(matches!(
check(parse(source_code)),
Err((
_,
Error::NotExhaustivePatternMatch {
unmatched,
..
}
)) if unmatched[0] == "[]"
))
}
#[test]
fn exhaustiveness_missing_list_wildcards() {
let source_code = r#"
fn foo() {
let thing = [1, 2]
when thing is {
[] -> True
}
}
"#;
assert!(matches!(
check(parse(source_code)),
Err((
_,
Error::NotExhaustivePatternMatch {
unmatched,
..
}
)) if unmatched[0] == "[_, ..]"
))
}
#[test]
fn exhaustiveness_missing_list_wildcards_2() {
let source_code = r#"
fn foo() {
let thing = [1, 2]
when thing is {
[] -> True
[a] -> True
}
}
"#;
assert!(matches!(
check(parse(source_code)),
Err((
_,
Error::NotExhaustivePatternMatch {
unmatched,
..
}
)) if unmatched[0] == "[_, _, ..]"
))
}
#[test]
fn exhaustiveness_int() {
let source_code = r#"
fn foo() {
let thing = 1
when thing is {
1 -> True
}
}
"#;
assert!(matches!(
check(parse(source_code)),
Err((
_,
Error::NotExhaustivePatternMatch {
unmatched,
..
}
)) if unmatched[0] == "_"
))
}
#[test]
fn exhaustiveness_int_redundant() {
let source_code = r#"
fn foo() {
let thing = 1
when thing is {
1 -> True
1 -> True
_ -> True
}
}
"#;
assert!(matches!(
check(parse(source_code)),
Err((
_,
Error::RedundantMatchClause {
original: Some(_),
..
}
))
))
}
#[test]
fn exhaustiveness_let_binding() {
let source_code = r#"
fn foo() {
let Some(x) = None
True
}
"#;
assert!(matches!(
check(parse(source_code)),
Err((
_,
Error::NotExhaustivePatternMatch {
is_let,
unmatched,
..
}
)) if unmatched[0] == "None" && is_let
))
}
#[test]
fn exhaustiveness_expect() {
let source_code = r#"
fn foo() {
expect Some(x) = None
True
}
"#;
assert!(check(parse(source_code)).is_ok())
}
#[test]
fn exhaustiveness_expect_no_warning() {
let source_code = r#"
pub type A {
int: Int,
b: B,
}
pub type B {
B0(Int)
B1(Int)
}
pub fn bad_let(x: A, _: A) {
expect A { b: B0(int), .. } = x
int > 0
}
"#;
let (warnings, _) = check(parse(source_code)).unwrap();
assert_eq!(warnings.len(), 0)
}
#[test]
fn exhaustiveness_expect_warning() {
let source_code = r#"
pub type A {
int: Int,
b: Int,
}
pub fn thing(x: A, _: A) {
expect A { b, .. } = x
b > 0
}
"#;
let (warnings, _) = check(parse(source_code)).unwrap();
assert!(matches!(
warnings[0],
Warning::SingleConstructorExpect { .. }
))
}
#[test]
fn exhaustiveness_missing_constr_with_args() {
let source_code = r#"
type Foo {
Bar
Why(Int)
Baz { other: Int }
}
fn foo() {
let thing = Bar
when thing is {
Bar -> True
}
}
"#;
assert!(matches!(
check(parse(source_code)),
Err((
_,
Error::NotExhaustivePatternMatch {
unmatched,
..
}
)) if unmatched[0] == "Why(_)" && unmatched[1] == "Baz { other }"
))
}
#[test]
fn exhaustiveness_redundant_pattern() {
let source_code = r#"
type Foo {
A
B
}
fn foo(a: Foo) {
when a is {
A -> todo
B -> todo
_ -> todo
}
}
"#;
assert!(matches!(
check(parse(source_code)),
Err((_, Error::RedundantMatchClause { original: None, .. }))
))
}
#[test]
fn exhaustiveness_redundant_pattern_2() {
let source_code = r#"
type Foo {
A
B
}
fn foo(a: Foo) {
when a is {
A -> todo
B -> todo
A -> todo
}
}
"#;
assert!(matches!(
check(parse(source_code)),
Err((
_,
Error::RedundantMatchClause {
original: Some(_),
..
}
))
))
}
#[test]
fn exhaustiveness_complex() {
let source_code = r#"
type Foo {
Bar
Why(Int)
Baz { other: Int }
}
type Hello {
Yes
No { idk: Int, thing: Foo }
}
fn foo() {
let thing = ((Yes, 1), (Yes, [1, 2]))
when thing is {
((Yes, _), (Yes, [])) -> True
((Yes, _), (No { .. }, _)) -> True
((No { .. }, _), (No { .. }, _)) -> True
}
}
"#;
assert!(matches!(
check(parse(source_code)),
Err((
_,
Error::NotExhaustivePatternMatch {
unmatched,
..
}
)) if unmatched[0] == "((Yes, _), (Yes, [_, ..]))" && unmatched[1] == "((No { idk, thing }, _), (Yes, _))"
))
}
#[test]
fn exhaustiveness_tuple() {
let source_code = r#"
fn foo() {
when (14, True) is {
(14, True) -> Void
}
}
"#;
assert!(matches!(
check(parse(source_code)),
Err((
_,
Error::NotExhaustivePatternMatch {
unmatched,
..
}
)) if unmatched[0] == "(_, _)"
))
}
#[test]
fn exhaustiveness_nested_list_and_tuples() {
fn assert_step(step: &str, expected: &str) {
let result = check(parse(step));
assert!(matches!(
result,
Err((
_,
Error::NotExhaustivePatternMatch {
unmatched,
..
}
)) if unmatched[0] == expected
));
}
assert_step(
r#"
fn foo() {
let xs : List<(List<(Int, Bool)>, Int)> = [([(14, True)], 42)]
when xs is {
[ ] -> Void
[([(14, True)], 42), ..] -> Void
}
}
"#,
"[([], _), ..]",
);
assert_step(
r#"
fn foo() {
let xs : List<(List<(Int, Bool)>, Int)> = [([(14, True)], 42)]
when xs is {
[ ] -> Void
[([(_, True)], 42), ..] -> Void
[([ ], _), ..] -> Void
}
}
"#,
"[([(_, False), ..], _), ..]",
);
assert_step(
r#"
fn foo() {
let xs : List<(List<(Int, Bool)>, Int)> = [([(14, True)], 42)]
when xs is {
[ ] -> Void
[([(_, True ) ], 42), ..] -> Void
[([ ], _), ..] -> Void
[([(_, False), ..], _), ..] -> Void
}
}
"#,
"[([(_, True), _, ..], _), ..]",
);
assert_step(
r#"
fn foo() {
let xs : List<(List<(Int, Bool)>, Int)> = [([(14, True)], 42)]
when xs is {
[ ] -> Void
[([(_, True ) ], 42), ..] -> Void
[([ ], _), ..] -> Void
[([(_, False) , ..], _), ..] -> Void
[([(_, True ), _, ..], _), ..] -> Void
}
}
"#,
"[([(_, True)], _), ..]",
);
let source_code = r#"
fn foo() {
let xs : List<(List<(Int, Bool)>, Int)> = [([(14, True)], 42)]
when xs is {
[ ] -> Void
[([(_, True ) ], 42), ..] -> Void
[([ ], _), ..] -> Void
[([(_, False) , ..], _), ..] -> Void
[([(_, True ), _, ..], _), ..] -> Void
[([(_, True ) ], _), ..] -> Void
}
}
"#;
assert!(check(parse(source_code)).is_ok())
}
#[test]
fn exhaustiveness_guard() {
let source_code = r#"
fn foo() {
when [(True, 42)] is {
[(True, x), ..] if x == 42 -> Void
[(False, x), ..] -> Void
[] -> Void
}
}
"#;
assert!(matches!(
check(parse(source_code)),
Err((
_,
Error::NotExhaustivePatternMatch {
unmatched,
..
}
)) if unmatched[0] == "[(True, _), ..]"
));
}
#[test]
fn expect_sugar_correct_type() {
let source_code = r#"
@ -163,7 +643,7 @@ fn expect_sugar_correct_type() {
}
"#;
assert!(matches!(check(parse(source_code)), Ok(_)))
assert!(check(parse(source_code)).is_ok())
}
#[test]
@ -176,7 +656,28 @@ fn expect_sugar_incorrect_type() {
"#;
assert!(matches!(
dbg!(check(parse(source_code))),
check(parse(source_code)),
Err((_, Error::CouldNotUnify { .. }))
))
}
#[test]
fn logical_op_chain_expressions_should_be_bool() {
let source_code = r#"
fn foo() {
and {
1 == 1,
False,
or {
2 == 3,
1
}
}
}
"#;
assert!(matches!(
check(parse(source_code)),
Err((_, Error::CouldNotUnify { .. }))
))
}
@ -465,12 +966,12 @@ fn trace_non_strings() {
#[test]
fn trace_if_false_ok() {
let source_code = r#"
fn or(a: Bool, b: Bool) {
fn or_func(a: Bool, b: Bool) {
(a || b)?
}
test foo() {
or(True, False)?
or_func(True, False)?
}
test bar() {


@ -20,6 +20,15 @@ fn format_simple_module() {
);
}
#[test]
fn format_logical_op_chain() {
assert_format!(
r#"
fn smth() { and { foo, bar, or { bar, foo }} }
"#
);
}
#[test]
fn format_if() {
assert_format!(
@ -250,7 +259,7 @@ fn format_nested_function_calls() {
_ -> fail "expected inline datum"
},
]
|> list.and
|> list.and_func
}
"#
);
@ -384,6 +393,20 @@ fn format_bytearray_literals() {
);
}
#[test]
fn escaped_utf8() {
assert_format!(
r#"
const escaped_1 = "\"my_string\""
const escaped_2 = "foo\nbar"
const escaped_3 = "foo\rbar"
const escaped_4 = "foo\tbar"
const escaped_5 = "1/2"
const escaped_6 = "1//2"
"#
);
}
#[test]
fn format_string_literal() {
assert_format!(


@ -0,0 +1,16 @@
---
source: crates/aiken-lang/src/tests/format.rs
description: "Code:\n\nconst escaped_1 = \"\\\"my_string\\\"\"\nconst escaped_2 = \"foo\\nbar\"\nconst escaped_3 = \"foo\\rbar\"\nconst escaped_4 = \"foo\\tbar\"\nconst escaped_5 = \"1/2\"\nconst escaped_6 = \"1//2\"\n"
---
const escaped_1 = "\"my_string\""
const escaped_2 = "foo\nbar"
const escaped_3 = "foo\rbar"
const escaped_4 = "foo\tbar"
const escaped_5 = "1/2"
const escaped_6 = "1//2"


@ -0,0 +1,15 @@
---
source: crates/aiken-lang/src/tests/format.rs
description: "Code:\n\nfn smth() { and { foo, bar, or { bar, foo }} }\n"
---
fn smth() {
and {
foo,
bar,
or {
bar,
foo,
},
}
}


@ -1,6 +1,6 @@
---
source: crates/aiken-lang/src/tests/format.rs
description: "Code:\n\nfn foo(output) {\n [\n output.address.stake_credential == Some(\n Inline(\n VerificationKeyCredential(\n #\"66666666666666666666666666666666666666666666666666666666\",\n ))\n )\n ,\n when output.datum is {\n InlineDatum(_) -> True\n _ -> fail \"expected inline datum\"\n },\n ]\n |> list.and\n}\n"
description: "Code:\n\nfn foo(output) {\n [\n output.address.stake_credential == Some(\n Inline(\n VerificationKeyCredential(\n #\"66666666666666666666666666666666666666666666666666666666\",\n ))\n )\n ,\n when output.datum is {\n InlineDatum(_) -> True\n _ -> fail \"expected inline datum\"\n },\n ]\n |> list.and_func\n}\n"
---
fn foo(output) {
[
@ -16,6 +16,6 @@ fn foo(output) {
_ -> fail @"expected inline datum"
},
]
|> list.and
|> list.and_func
}


@ -3,11 +3,12 @@ use crate::{
ast::{Constant, DefinitionLocation, ModuleKind, Span},
tipo::fields::FieldMap,
};
use std::{cell::RefCell, collections::HashMap, ops::Deref, sync::Arc};
use std::{cell::RefCell, collections::HashMap, ops::Deref, rc::Rc};
use uplc::{ast::Type as UplcType, builtins::DefaultFunction};
mod environment;
pub mod error;
mod exhaustive;
mod expr;
pub mod fields;
mod hydrator;
@ -30,27 +31,27 @@ pub enum Type {
public: bool,
module: String,
name: String,
args: Vec<Arc<Type>>,
args: Vec<Rc<Type>>,
},
/// The type of a function. It takes arguments and returns a value.
///
Fn {
args: Vec<Arc<Type>>,
ret: Arc<Type>,
args: Vec<Rc<Type>>,
ret: Rc<Type>,
},
/// A type variable. See the contained `TypeVar` enum for more information.
///
Var {
tipo: Arc<RefCell<TypeVar>>,
tipo: Rc<RefCell<TypeVar>>,
},
// /// A tuple is an ordered collection of 0 or more values, each of which
// /// can have a different type, so the `tuple` type is the sum of all the
// /// contained types.
// ///
Tuple {
elems: Vec<Arc<Type>>,
elems: Vec<Rc<Type>>,
},
}
@ -74,20 +75,29 @@ impl Type {
matches!(self, Self::Fn { .. })
}
pub fn return_type(&self) -> Option<Arc<Self>> {
pub fn return_type(&self) -> Option<Rc<Self>> {
match self {
Self::Fn { ret, .. } => Some(ret.clone()),
_ => None,
}
}
pub fn function_types(&self) -> Option<(Vec<Arc<Self>>, Arc<Self>)> {
pub fn function_types(&self) -> Option<(Vec<Rc<Self>>, Rc<Self>)> {
match self {
Self::Fn { args, ret, .. } => Some((args.clone(), ret.clone())),
_ => None,
}
}
pub fn is_primitive(&self) -> bool {
self.is_bool()
|| self.is_bytearray()
|| self.is_int()
|| self.is_string()
|| self.is_void()
|| self.is_data()
}
pub fn is_void(&self) -> bool {
match self {
Self::App { module, name, .. } if "Void" == name && module.is_empty() => true,
@ -170,6 +180,14 @@ impl Type {
}
}
pub fn is_2_tuple(&self) -> bool {
match self {
Type::Var { tipo } => tipo.borrow().is_2_tuple(),
Type::Tuple { elems } => elems.len() == 2,
_ => false,
}
}
pub fn is_data(&self) -> bool {
match self {
Self::App { module, name, .. } => "Data" == name && module.is_empty(),
@ -206,7 +224,7 @@ impl Type {
}
}
pub fn arg_types(&self) -> Option<Vec<Arc<Self>>> {
pub fn arg_types(&self) -> Option<Vec<Rc<Self>>> {
match self {
Self::Fn { args, .. } => Some(args.clone()),
Self::App { args, .. } => Some(args.clone()),
@ -222,7 +240,7 @@ impl Type {
}
}
pub fn get_inner_types(&self) -> Vec<Arc<Type>> {
pub fn get_inner_types(&self) -> Vec<Rc<Type>> {
if self.is_list() {
match self {
Self::App { args, .. } => args.clone(),
@ -292,7 +310,7 @@ impl Type {
name: &str,
arity: usize,
environment: &mut Environment<'_>,
) -> Option<Vec<Arc<Self>>> {
) -> Option<Vec<Rc<Self>>> {
match self {
Self::App {
module: m,
@ -323,7 +341,7 @@ impl Type {
// We are an unbound type variable! So convert us to a type link
// to the desired type.
*tipo.borrow_mut() = TypeVar::Link {
tipo: Arc::new(Self::App {
tipo: Rc::new(Self::App {
name: name.to_string(),
module: module.to_owned(),
args: args.clone(),
@ -389,7 +407,7 @@ pub enum TypeVar {
/// Link is type variable where it was an unbound variable but we worked out
/// that it is some other type and now we point to that one.
///
Link { tipo: Arc<Type> },
Link { tipo: Rc<Type> },
/// A Generic variable stands in for any possible type and cannot be
/// specialised to any one type
///
@ -473,6 +491,13 @@ impl TypeVar {
}
}
pub fn is_2_tuple(&self) -> bool {
match self {
Self::Link { tipo } => tipo.is_2_tuple(),
_ => false,
}
}
pub fn is_data(&self) -> bool {
match self {
Self::Link { tipo } => tipo.is_data(),
@ -496,16 +521,17 @@ impl TypeVar {
}
}
pub fn arg_types(&self) -> Option<Vec<Arc<Type>>> {
pub fn arg_types(&self) -> Option<Vec<Rc<Type>>> {
match self {
Self::Link { tipo } => tipo.arg_types(),
_ => None,
}
}
pub fn get_inner_types(&self) -> Vec<Arc<Type>> {
pub fn get_inner_types(&self) -> Vec<Rc<Type>> {
match self {
Self::Link { tipo } => tipo.get_inner_types(),
Self::Unbound { .. } => vec![],
var => {
vec![Type::Var {
tipo: RefCell::new(var.clone()).into(),
@ -527,11 +553,11 @@ impl TypeVar {
pub struct ValueConstructor {
pub public: bool,
pub variant: ValueConstructorVariant,
pub tipo: Arc<Type>,
pub tipo: Rc<Type>,
}
impl ValueConstructor {
pub fn public(tipo: Arc<Type>, variant: ValueConstructorVariant) -> ValueConstructor {
pub fn public(tipo: Rc<Type>, variant: ValueConstructorVariant) -> ValueConstructor {
ValueConstructor {
public: true,
variant,
@ -608,7 +634,7 @@ pub enum ValueConstructorVariant {
impl ValueConstructorVariant {
fn to_module_value_constructor(
&self,
tipo: Arc<Type>,
tipo: Rc<Type>,
module_name: &str,
function_name: &str,
) -> ModuleValueConstructor {
@ -685,14 +711,14 @@ pub struct TypeConstructor {
pub public: bool,
pub location: Span,
pub module: String,
pub parameters: Vec<Arc<Type>>,
pub tipo: Arc<Type>,
pub parameters: Vec<Rc<Type>>,
pub tipo: Rc<Type>,
}
#[derive(Debug, Clone)]
pub struct AccessorsMap {
pub public: bool,
pub tipo: Arc<Type>,
pub tipo: Rc<Type>,
pub accessors: HashMap<String, RecordAccessor>,
}
@ -701,7 +727,7 @@ pub struct RecordAccessor {
// TODO: smaller int. Doesn't need to be this big
pub index: u64,
pub label: String,
pub tipo: Arc<Type>,
pub tipo: Rc<Type>,
}
#[derive(Debug, Clone, PartialEq, Eq)]
@ -717,7 +743,7 @@ pub enum ModuleValueConstructor {
Record {
name: String,
arity: usize,
tipo: Arc<Type>,
tipo: Rc<Type>,
field_map: Option<FieldMap>,
location: Span,
},


@ -1,15 +1,13 @@
use std::{
collections::{HashMap, HashSet},
ops::Deref,
sync::Arc,
rc::Rc,
};
use itertools::Itertools;
use crate::{
ast::{
Annotation, CallArg, DataType, Definition, Function, ModuleConstant, ModuleKind, Pattern,
RecordConstructor, RecordConstructorArg, Span, TypeAlias, TypedDefinition,
Annotation, CallArg, DataType, Definition, Function, ModuleConstant, ModuleKind,
RecordConstructor, RecordConstructorArg, Span, TypeAlias, TypedDefinition, TypedPattern,
UnqualifiedImport, UntypedArg, UntypedDefinition, Use, Validator, PIPE_VARIABLE,
},
builtins::{self, function, generic_var, tuple, unbound_var},
@ -19,9 +17,10 @@ use crate::{
use super::{
error::{Error, Snippet, Warning},
exhaustive::{simplify, Matrix, PatternStack},
hydrator::Hydrator,
AccessorsMap, PatternConstructor, RecordAccessor, Type, TypeConstructor, TypeInfo, TypeVar,
ValueConstructor, ValueConstructorVariant,
AccessorsMap, RecordAccessor, Type, TypeConstructor, TypeInfo, TypeVar, ValueConstructor,
ValueConstructorVariant,
};
#[derive(Debug)]
@ -105,11 +104,11 @@ impl<'a> Environment<'a> {
pub fn match_fun_type(
&mut self,
tipo: Arc<Type>,
tipo: Rc<Type>,
arity: usize,
fn_location: Span,
call_location: Span,
) -> Result<(Vec<Arc<Type>>, Arc<Type>), Error> {
) -> Result<(Vec<Rc<Type>>, Rc<Type>), Error> {
if let Type::Var { tipo } = tipo.deref() {
let new_value = match tipo.borrow().deref() {
TypeVar::Link { tipo, .. } => {
@ -321,7 +320,7 @@ impl<'a> Environment<'a> {
self.imported_modules
.get(m)
.ok_or_else(|| Error::UnknownModule {
name: name.to_string(),
name: m.to_string(),
imported_modules: self
.importable_modules
.keys()
@ -493,7 +492,7 @@ impl<'a> Environment<'a> {
&mut self,
name: String,
variant: ValueConstructorVariant,
tipo: Arc<Type>,
tipo: Rc<Type>,
) {
self.scope.insert(
name,
@ -508,10 +507,10 @@ impl<'a> Environment<'a> {
/// Instantiate converts generic variables into unbound ones.
pub fn instantiate(
&mut self,
t: Arc<Type>,
ids: &mut HashMap<u64, Arc<Type>>,
t: Rc<Type>,
ids: &mut HashMap<u64, Rc<Type>>,
hydrator: &Hydrator,
) -> Arc<Type> {
) -> Rc<Type> {
match t.deref() {
Type::App {
public,
@ -523,7 +522,7 @@ impl<'a> Environment<'a> {
.iter()
.map(|t| self.instantiate(t.clone(), ids, hydrator))
.collect();
Arc::new(Type::App {
Rc::new(Type::App {
public: *public,
name: name.clone(),
module: module.clone(),
@ -535,7 +534,7 @@ impl<'a> Environment<'a> {
match tipo.borrow().deref() {
TypeVar::Link { tipo } => return self.instantiate(tipo.clone(), ids, hydrator),
TypeVar::Unbound { .. } => return Arc::new(Type::Var { tipo: tipo.clone() }),
TypeVar::Unbound { .. } => return Rc::new(Type::Var { tipo: tipo.clone() }),
TypeVar::Generic { id } => match ids.get(id) {
Some(t) => return t.clone(),
@ -551,7 +550,7 @@ impl<'a> Environment<'a> {
}
},
}
Arc::new(Type::Var { tipo: tipo.clone() })
Rc::new(Type::Var { tipo: tipo.clone() })
}
Type::Fn { args, ret, .. } => function(
@ -583,7 +582,7 @@ impl<'a> Environment<'a> {
args: &[String],
location: &Span,
hydrator: &mut Hydrator,
) -> Result<Vec<Arc<Type>>, Error> {
) -> Result<Vec<Rc<Type>>, Error> {
let mut type_vars = Vec::new();
for arg in args {
@ -631,13 +630,13 @@ impl<'a> Environment<'a> {
}
/// Create a new generic type that can stand in for any type.
pub fn new_generic_var(&mut self) -> Arc<Type> {
pub fn new_generic_var(&mut self) -> Rc<Type> {
generic_var(self.next_uid())
}
/// Create a new unbound type that is a specific type, we just don't
/// know which one yet.
pub fn new_unbound_var(&mut self) -> Arc<Type> {
pub fn new_unbound_var(&mut self) -> Rc<Type> {
unbound_var(self.next_uid())
}
@ -932,7 +931,7 @@ impl<'a> Environment<'a> {
let parameters = self.make_type_vars(parameters, location, &mut hydrator)?;
let tipo = Arc::new(Type::App {
let tipo = Rc::new(Type::App {
public: *public,
module: module.to_owned(),
name: name.clone(),
@ -1277,8 +1276,8 @@ impl<'a> Environment<'a> {
#[allow(clippy::only_used_in_recursion)]
pub fn unify(
&mut self,
t1: Arc<Type>,
t2: Arc<Type>,
t1: Rc<Type>,
t2: Rc<Type>,
location: Span,
allow_cast: bool,
) -> Result<(), Error> {
@ -1306,7 +1305,7 @@ impl<'a> Environment<'a> {
if let Type::Var { tipo } = t1.deref() {
enum Action {
Unify(Arc<Type>),
Unify(Rc<Type>),
CouldNotUnify,
Link,
}
@ -1435,173 +1434,106 @@ impl<'a> Environment<'a> {
}
/// Checks that the given patterns are exhaustive for given type.
/// Currently only performs exhaustiveness checking for custom types,
/// only at the top level (without recursing into constructor arguments).
pub fn check_exhaustiveness(
    &mut self,
    patterns: Vec<Pattern<PatternConstructor, Arc<Type>>>,
    value_typ: Arc<Type>,
    location: Span,
) -> Result<(), Vec<String>> {
    match &*value_typ {
        Type::App {
            name: type_name,
            module,
            ..
        } => {
            let m = if module.is_empty() || module == self.current_module {
                None
            } else {
                Some(module.clone())
            };
            if type_name == "List" && module.is_empty() {
                return self.check_list_pattern_exhaustiveness(patterns);
            }
            if let Ok(constructors) = self.get_constructors_for_type(&m, type_name, location) {
                let mut unmatched_constructors: HashSet<String> =
                    constructors.iter().cloned().collect();
                for p in &patterns {
                    // ignore Assign patterns
                    let mut pattern = p;
                    while let Pattern::Assign {
                        pattern: assign_pattern,
                        ..
                    } = pattern
                    {
                        pattern = assign_pattern;
                    }
                    match pattern {
                        // If the pattern is a Discard or Var, all constructors are covered by it
                        Pattern::Discard { .. } => return Ok(()),
                        Pattern::Var { .. } => return Ok(()),
                        // If the pattern is a constructor, remove it from unmatched patterns
                        Pattern::Constructor {
                            constructor: PatternConstructor::Record { name, .. },
                            ..
                        } => {
                            unmatched_constructors.remove(name);
                        }
                        _ => return Ok(()),
                    }
                }
                if !unmatched_constructors.is_empty() {
                    return Err(unmatched_constructors.into_iter().sorted().collect());
                }
            }
            Ok(())
        }
        _ => Ok(()),
    }
}
pub fn check_list_pattern_exhaustiveness(
    &mut self,
    patterns: Vec<Pattern<PatternConstructor, Arc<Type>>>,
) -> Result<(), Vec<String>> {
    let mut cover_empty = false;
    let mut cover_tail = false;
    let patterns = patterns.iter().map(|p| match p {
        Pattern::Assign { pattern, .. } => pattern,
        _ => p,
    });
    // TODO: We could also warn on redundant patterns. As soon as we've matched the entire
    // list, any new pattern is redundant. For example:
    //
    // when xs is {
    //   [] => ...
    //   [x, ..] => ...
    //   [y] => ...
    // }
    //
    // That last pattern is actually redundant / unreachable.
    for p in patterns {
        match p {
            Pattern::Var { .. } => {
                cover_empty = true;
                cover_tail = true;
            }
            Pattern::Discard { .. } => {
                cover_empty = true;
                cover_tail = true;
            }
            Pattern::List { elements, tail, .. } => {
                if elements.is_empty() {
                    cover_empty = true;
                }
                match tail {
                    None => {}
                    Some(p) => match **p {
                        Pattern::Discard { .. } => {
                            cover_tail = true;
                        }
                        Pattern::Var { .. } => {
                            cover_tail = true;
                        }
                        _ => {
                            unreachable!()
                        }
                    },
                }
            }
            _ => {}
        }
    }
    if cover_empty && cover_tail {
        Ok(())
    } else {
        let mut missing = vec![];
        if !cover_empty {
            missing.push("[]".to_owned());
        }
        if !cover_tail {
            missing.push("[_, ..]".to_owned());
        }
        Err(missing)
    }
}
/// Checks that the given patterns are exhaustive for given type.
/// https://github.com/elm/compiler/blob/047d5026fe6547c842db65f7196fed3f0b4743ee/compiler/src/Nitpick/PatternMatches.hs#L397-L475
/// http://moscova.inria.fr/~maranget/papers/warn/index.html
pub fn check_exhaustiveness(
    &mut self,
    unchecked_patterns: &[&TypedPattern],
    location: Span,
    is_let: bool,
) -> Result<(), Error> {
    let mut matrix = Matrix::new();
    for unchecked_pattern in unchecked_patterns {
        let pattern = simplify(self, unchecked_pattern)?;
        let pattern_stack = PatternStack::from(pattern);
        if matrix.is_useful(&pattern_stack) {
            matrix.push(pattern_stack);
        } else {
            let original = matrix
                .flatten()
                .into_iter()
                .enumerate()
                .find(|(_, p)| p == pattern_stack.head())
                .and_then(|(index, _)| unchecked_patterns.get(index))
                .map(|typed_pattern| typed_pattern.location());
            return Err(Error::RedundantMatchClause {
                original,
                redundant: unchecked_pattern.location(),
            });
        }
    }
    let missing_patterns = matrix.collect_missing_patterns(1).flatten();
    if !missing_patterns.is_empty() {
        let unmatched = missing_patterns
            .into_iter()
            .map(|pattern| pattern.pretty())
            .collect();
        return Err(Error::NotExhaustivePatternMatch {
            location,
            unmatched,
            is_let,
        });
    }
    Ok(())
}
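The new checker follows Maranget's usefulness framework (second link above): a clause is redundant when it is not useful against the matrix of earlier clauses, and a match is exhaustive when a final wildcard row would no longer be useful. A toy single-column version over a Bool-like type, just to show the shape; the real `Matrix`/`PatternStack` machinery generalizes this to nested constructors:

```rust
#[derive(PartialEq)]
enum Pat {
    Wild,        // `_`: matches anything
    Ctor(bool),  // `True` / `False`
}

// A row is useful w.r.t. the matrix if some value matches it but no earlier
// row. With a single two-constructor column this collapses to a few checks.
fn is_useful(matrix: &[Pat], row: &Pat) -> bool {
    let covers = |b: bool| {
        matrix
            .iter()
            .any(|p| matches!(p, Pat::Wild) || *p == Pat::Ctor(b))
    };
    match row {
        Pat::Wild => !(covers(true) && covers(false)),
        Pat::Ctor(b) => !covers(*b),
    }
}

fn main() {
    let mut matrix = vec![];
    for (i, row) in [Pat::Ctor(true), Pat::Ctor(true), Pat::Wild].into_iter().enumerate() {
        if is_useful(&matrix, &row) {
            matrix.push(row);
        } else {
            println!("clause {i} is redundant"); // the duplicate `True` arm
        }
    }
    // Exhaustive iff adding a final wildcard would be useless:
    println!("exhaustive: {}", !is_useful(&matrix, &Pat::Wild));
}
```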
/// Lookup constructors for type in the current scope.
///
pub fn get_constructors_for_type(
    &mut self,
    full_module_name: &Option<String>,
    name: &str,
    location: Span,
) -> Result<&Vec<String>, Error> {
    match full_module_name {
        None => self
            .module_types_constructors
            .get(name)
            .ok_or_else(|| Error::UnknownType {
                name: name.to_string(),
                types: self.module_types.keys().map(|t| t.to_string()).collect(),
                location,
            }),
        Some(m) => {
            let module =
                self.importable_modules
                    .get(m)
                    .ok_or_else(|| Error::UnknownModule {
                        location,
                        name: name.to_string(),
                        imported_modules: self
                            .importable_modules
                            .keys()
                            .map(|t| t.to_string())
                            .collect(),
                    })?;
            self.unused_modules.remove(m);
            let constructors =
                module
                    .types_constructors
                    .get(name)
pub fn get_constructors_for_type(
    &mut self,
    full_module_name: &String,
    name: &str,
    location: Span,
) -> Result<Vec<ValueConstructor>, Error> {
    if full_module_name.is_empty() || full_module_name == self.current_module {
        self.module_types_constructors
            .get(name)
            .ok_or_else(|| Error::UnknownType {
                name: name.to_string(),
                types: self.module_types.keys().map(|t| t.to_string()).collect(),
                location,
            })?
            .iter()
            .map(|constructor| {
                self.scope
                    .get(constructor)
                    .cloned()
                    .ok_or_else(|| Error::UnknownModuleValue {
                        name: name.to_string(),
                        module_name: self.current_module.clone(),
                        value_constructors: self
                            .module_values
                            .keys()
                            .map(|t| t.to_string())
                            .collect(),
                        location,
                    })
            })
            .collect()
    } else {
        let module = self
            .importable_modules
            .get(full_module_name)
            .ok_or_else(|| Error::UnknownModule {
                location,
                name: name.to_string(),
                imported_modules: self
                    .importable_modules
                    .keys()
                    .map(|t| t.to_string())
                    .collect(),
            })?;
        self.unused_modules.remove(full_module_name);
        let constructors =
            module
                .types_constructors
                .get(name)
@ -1610,8 +1542,25 @@ impl<'a> Environment<'a> {
name: name.to_string(),
module_name: module.name.clone(),
type_constructors: module.types.keys().map(|t| t.to_string()).collect(),
})?;
constructors
.iter()
.map(|constructor| {
module.values.get(constructor).cloned().ok_or_else(|| {
Error::UnknownModuleValue {
name: name.to_string(),
module_name: module.name.clone(),
value_constructors: module
.values
.keys()
.map(|t| t.to_string())
.collect(),
location,
}
})
}
})
.collect()
}
}
}
@ -1636,7 +1585,7 @@ pub enum EntityKind {
/// prevents the algorithm from inferring recursive types, which
/// could cause naively-implemented type checking to diverge.
/// While traversing the type tree.
fn unify_unbound_type(tipo: Arc<Type>, own_id: u64, location: Span) -> Result<(), Error> {
fn unify_unbound_type(tipo: Rc<Type>, own_id: u64, location: Span) -> Result<(), Error> {
if let Type::Var { tipo } = tipo.deref() {
let new_value = match tipo.borrow().deref() {
TypeVar::Link { tipo, .. } => {
@ -1689,11 +1638,7 @@ fn unify_unbound_type(tipo: Arc<Type>, own_id: u64, location: Span) -> Result<()
}
}
fn unify_enclosed_type(
e1: Arc<Type>,
e2: Arc<Type>,
result: Result<(), Error>,
) -> Result<(), Error> {
fn unify_enclosed_type(e1: Rc<Type>, e2: Rc<Type>, result: Result<(), Error>) -> Result<(), Error> {
// If types cannot unify, show the type error with the enclosing types, e1 and e2.
match result {
Err(Error::CouldNotUnify {
@ -1767,7 +1712,7 @@ pub(super) fn assert_no_labeled_arguments<A>(args: &[CallArg<A>]) -> Option<(Spa
None
}
pub(super) fn collapse_links(t: Arc<Type>) -> Arc<Type> {
pub(super) fn collapse_links(t: Rc<Type>) -> Rc<Type> {
if let Type::Var { tipo } = t.deref() {
if let TypeVar::Link { tipo } = tipo.borrow().deref() {
return tipo.clone();
@ -1808,12 +1753,12 @@ fn get_compatible_record_fields<A>(
/// Takes a level and a type and turns all type variables within the type that have
/// level higher than the input level into generalized (polymorphic) type variables.
#[allow(clippy::only_used_in_recursion)]
pub(crate) fn generalise(t: Arc<Type>, ctx_level: usize) -> Arc<Type> {
pub(crate) fn generalise(t: Rc<Type>, ctx_level: usize) -> Rc<Type> {
match t.deref() {
Type::Var { tipo } => match tipo.borrow().deref() {
TypeVar::Unbound { id } => generic_var(*id),
TypeVar::Link { tipo } => generalise(tipo.clone(), ctx_level),
TypeVar::Generic { .. } => Arc::new(Type::Var { tipo: tipo.clone() }),
TypeVar::Generic { .. } => Rc::new(Type::Var { tipo: tipo.clone() }),
},
Type::App {
@ -1827,7 +1772,7 @@ pub(crate) fn generalise(t: Arc<Type>, ctx_level: usize) -> Arc<Type> {
.map(|t| generalise(t.clone(), ctx_level))
.collect();
Arc::new(Type::App {
Rc::new(Type::App {
public: *public,
module: module.clone(),
name: name.clone(),


@ -1,6 +1,6 @@
use super::Type;
use crate::{
ast::{Annotation, BinOp, CallArg, Span, UntypedPattern},
ast::{Annotation, BinOp, CallArg, LogicalOpChainKind, Span, UntypedPattern},
expr::{self, UntypedExpr},
format::Formatter,
levenshtein,
@ -13,7 +13,7 @@ use owo_colors::{
OwoColorize,
Stream::{Stderr, Stdout},
};
use std::{collections::HashMap, fmt::Display, sync::Arc};
use std::{collections::HashMap, fmt::Display, rc::Rc};
#[derive(Debug, thiserror::Error, Diagnostic, Clone)]
#[error("Something is possibly wrong here...")]
@ -55,7 +55,22 @@ impl Diagnostic for UnknownLabels {
#[derive(Debug, thiserror::Error, Diagnostic, Clone)]
pub enum Error {
#[error("I discovered a type cast from Data without an annotation")]
#[error("I discovered an {} chain with less than 2 expressions.", op.if_supports_color(Stdout, |s| s.purple()))]
#[diagnostic(code("illegal::logical_op_chain"))]
#[diagnostic(help(
"Logical {}/{} chains require at least 2 expressions. You are missing {}.",
"and".if_supports_color(Stdout, |s| s.purple()),
"or".if_supports_color(Stdout, |s| s.purple()),
missing
))]
LogicalOpChainMissingExpr {
op: LogicalOpChainKind,
#[label]
location: Span,
missing: u8,
},
#[error("I discovered a type cast from Data without an annotation.")]
#[diagnostic(code("illegal::type_cast"))]
#[diagnostic(help("Try adding an annotation...\n\n{}", format_suggestion(value)))]
CastDataNoAnn {
@ -74,8 +89,8 @@ pub enum Error {
expected.to_pretty_with_names(rigid_type_names.clone(), 0),
)]
location: Span,
expected: Arc<Type>,
given: Arc<Type>,
expected: Rc<Type>,
given: Rc<Type>,
situation: Option<UnifyErrorSituation>,
rigid_type_names: HashMap<u64, String>,
},
@ -459,7 +474,7 @@ If you really meant to return that last expression, try to replace it with the f
NotATuple {
#[label]
location: Span,
tipo: Arc<Type>,
tipo: Rc<Type>,
},
#[error("{}\n", if *is_let {
@ -506,7 +521,7 @@ In this particular instance, the following cases are unmatched:
NotFn {
#[label]
location: Span,
tipo: Arc<Type>,
tipo: Rc<Type>,
},
#[error("I discovered a positional argument after a label argument.\n")]
@ -540,6 +555,24 @@ Maybe you meant to turn it public using the '{keyword_pub}' keyword?"#
leaked: Type,
},
#[error(
"{}\n",
format!(
"I discovered a '{keyword_when}/{keyword_is}' expression with a redundant pattern.",
keyword_is = "is".if_supports_color(Stdout, |s| s.purple()),
keyword_when = "when".if_supports_color(Stdout, |s| s.purple())
)
)]
#[diagnostic(url("https://aiken-lang.org/language-tour/control-flow#matching"))]
#[diagnostic(code("redundant_pattern_match"))]
#[diagnostic(help("Double check these patterns and then remove one of the clauses."))]
RedundantMatchClause {
#[label("first found here")]
original: Option<Span>,
#[label("redundant")]
redundant: Span,
},
#[error("I couldn't figure out the type of a record you're trying to access.\n")]
#[diagnostic(url(
"https://aiken-lang.org/language-tour/variables-and-constants#type-annotations"
@ -736,7 +769,7 @@ Perhaps, try the following:
UnknownRecordField {
#[label]
location: Span,
typ: Arc<Type>,
typ: Rc<Type>,
label: String,
fields: Vec<String>,
situation: Option<UnknownRecordFieldSituation>,
@ -847,7 +880,7 @@ The best thing to do from here is to remove it."#))]
ValidatorMustReturnBool {
#[label("invalid return type")]
location: Span,
return_type: Arc<Type>,
return_type: Rc<Type>,
},
#[error("Validators require at least 2 arguments and at most 3 arguments.\n")]
@ -1039,8 +1072,8 @@ fn suggest_constructor_pattern(
}
fn suggest_unify(
expected: &Arc<Type>,
given: &Arc<Type>,
expected: &Rc<Type>,
given: &Rc<Type>,
situation: &Option<UnifyErrorSituation>,
rigid_type_names: &HashMap<u64, String>,
) -> String {
@ -1323,7 +1356,7 @@ pub enum Warning {
Todo {
#[label("An expression of type {} is expected here.", tipo.to_pretty(0))]
location: Span,
tipo: Arc<Type>,
tipo: Rc<Type>,
},
#[error("I found a type hole in an annotation.\n")]
@ -1331,7 +1364,7 @@ pub enum Warning {
UnexpectedTypeHole {
#[label("{}", tipo.to_pretty(0))]
location: Span,
tipo: Arc<Type>,
tipo: Rc<Type>,
},
#[error(


@ -0,0 +1,668 @@
use std::{collections::BTreeMap, iter, ops::Deref};
use itertools::Itertools;
use crate::{
ast, builtins,
tipo::{self, environment::Environment, error::Error},
};
const NIL_NAME: &str = "[]";
const CONS_NAME: &str = "::";
const TUPLE_NAME: &str = "__Tuple";
#[derive(Debug, Clone)]
pub(crate) struct PatternStack(Vec<Pattern>);
impl From<Pattern> for PatternStack {
fn from(value: Pattern) -> Self {
Self(vec![value])
}
}
impl From<Vec<Pattern>> for PatternStack {
fn from(value: Vec<Pattern>) -> Self {
Self(value)
}
}
impl From<PatternStack> for Vec<Pattern> {
fn from(value: PatternStack) -> Self {
value.0
}
}
impl PatternStack {
fn is_empty(&self) -> bool {
self.0.is_empty()
}
fn insert(&mut self, index: usize, element: Pattern) {
self.0.insert(index, element);
}
pub(super) fn head(&self) -> &Pattern {
&self.0[0]
}
fn tail(&self) -> PatternStack {
self.0
.iter()
.skip(1)
.cloned()
.collect::<Vec<Pattern>>()
.into()
}
fn iter(&self) -> impl Iterator<Item = &Pattern> {
self.0.iter()
}
fn chain_tail_to_iter<'a>(&'a self, front: impl Iterator<Item = &'a Pattern>) -> PatternStack {
front
.chain(self.iter().skip(1))
.cloned()
.collect::<Vec<Pattern>>()
.into()
}
fn chain_tail_into_iter(&self, front: impl Iterator<Item = Pattern>) -> PatternStack {
front
.chain(self.iter().skip(1).cloned())
.collect::<Vec<Pattern>>()
.into()
}
// INVARIANT: (length row == N) ==> (length result == arity + N - 1)
fn specialize_row_by_ctor(&self, name: &String, arity: usize) -> Option<PatternStack> {
match self.head() {
Pattern::Constructor(p_name, _, p_args) => {
if p_name == name && p_args.len() == arity {
Some(self.chain_tail_to_iter(p_args.iter()))
} else {
None
}
}
Pattern::Wildcard => {
Some(self.chain_tail_into_iter(vec![Pattern::Wildcard; arity].into_iter()))
}
Pattern::Literal(_) => unreachable!(
"constructors and literals should never align in pattern match exhaustiveness checks."
),
}
}
// INVARIANT: (length row == N) ==> (length result == N-1)
fn specialize_row_by_wildcard(&self) -> Option<PatternStack> {
if self.is_empty() {
return None;
}
match self.head() {
Pattern::Constructor(_, _, _) => None,
Pattern::Literal(_) => None,
Pattern::Wildcard => Some(self.tail()),
}
}
// INVARIANT: (length row == N) ==> (length result == N-1)
fn specialize_row_by_literal(&self, literal: &Literal) -> Option<PatternStack> {
match self.head() {
Pattern::Literal(p_literal) => {
if p_literal == literal {
Some(self.tail())
} else {
None
}
}
Pattern::Wildcard => Some(self.tail()),
Pattern::Constructor(_, _, _) => unreachable!(
"constructors and literals should never align in pattern match exhaustiveness checks."
),
}
}
fn split_at(self, arity: usize) -> (PatternStack, PatternStack) {
let mut rest = self.0;
let mut args = rest.split_off(arity);
std::mem::swap(&mut rest, &mut args);
(args.into(), rest.into())
}
}
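// Editor's sketch (illustration, not part of the diff): specialization
// preserves the invariants stated above. A wildcard head expands into
// `arity` fresh wildcards, so for the binary list constructor "::" (arity 2):
//
//     let row: PatternStack = vec![Pattern::Wildcard, Pattern::Wildcard].into();
//     let specialized = row.specialize_row_by_ctor(&"::".to_string(), 2).unwrap();
//     // length went from N == 2 to arity + N - 1 == 2 + 2 - 1 == 3
//     assert_eq!(Vec::<Pattern>::from(specialized).len(), 3);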
#[derive(Debug)]
pub(super) struct Matrix(Vec<PatternStack>);
impl Matrix {
pub(super) fn new() -> Self {
Matrix(vec![])
}
pub(crate) fn is_empty(&self) -> bool {
self.0.is_empty()
}
pub(crate) fn push(&mut self, pattern_stack: PatternStack) {
self.0.push(pattern_stack);
}
/// Iterate over the rows of the matrix
pub(super) fn iter(&self) -> impl Iterator<Item = &PatternStack> {
self.0.iter()
}
/// Iterate over the rows of the matrix, consuming it
pub(super) fn into_iter(self) -> impl Iterator<Item = PatternStack> {
self.0.into_iter()
}
pub(super) fn concat(self, other: Matrix) -> Matrix {
let mut patterns = self.0;
patterns.extend(other.0);
Matrix(patterns)
}
pub(crate) fn is_complete(&self) -> Complete {
let ctors = self.collect_ctors();
let num_seen = ctors.len();
if num_seen == 0 {
Complete::No
} else {
let (_, alts) = ctors.first_key_value().unwrap();
if num_seen == alts.len() {
Complete::Yes(alts.to_vec())
} else {
Complete::No
}
}
}
pub(crate) fn collect_ctors(&self) -> BTreeMap<String, Vec<tipo::ValueConstructor>> {
let mut ctors = BTreeMap::new();
for pattern_stack in self.iter() {
match pattern_stack.head() {
Pattern::Constructor(name, alts, _) => {
ctors.insert(name.clone(), alts.clone());
}
Pattern::Wildcard | Pattern::Literal(_) => {}
}
}
ctors
}
fn specialize_rows_by_ctor(&self, name: &String, arity: usize) -> Matrix {
self.iter()
.filter_map(|p_stack| p_stack.specialize_row_by_ctor(name, arity))
.collect()
}
fn specialize_rows_by_wildcard(&self) -> Matrix {
self.iter()
.filter_map(|p_stack| p_stack.specialize_row_by_wildcard())
.collect()
}
fn specialize_rows_by_literal(&self, literal: &Literal) -> Matrix {
self.iter()
.filter_map(|p_stack| p_stack.specialize_row_by_literal(literal))
.collect()
}
pub(super) fn is_useful(&self, vector: &PatternStack) -> bool {
// No rows are the same as the new vector! The vector is useful!
if self.is_empty() {
return true;
}
// There is nothing left in the new vector, but we still have
// rows that match the same things. This is not a useful vector!
if vector.is_empty() {
return false;
}
let first_pattern = vector.head();
match first_pattern {
Pattern::Constructor(name, _, args) => {
let arity = args.len();
let new_matrix = self.specialize_rows_by_ctor(name, arity);
let new_vector: PatternStack = vector.chain_tail_to_iter(args.iter());
new_matrix.is_useful(&new_vector)
}
Pattern::Wildcard => {
// check if all alts appear in matrix
match self.is_complete() {
Complete::No => {
// This Wildcard is useful because some Ctors are missing.
// But what if a previous row has a Wildcard?
// If so, this one is not useful.
let new_matrix = self.specialize_rows_by_wildcard();
let new_vector = vector.tail();
new_matrix.is_useful(&new_vector)
}
Complete::Yes(alts) => alts.into_iter().any(|alt| {
let tipo::ValueConstructor { variant, .. } = alt;
let (name, arity) = match variant {
tipo::ValueConstructorVariant::Record { name, arity, .. } => {
(name, arity)
}
_ => unreachable!("variant should be a ValueConstructorVariant"),
};
let new_matrix = self.specialize_rows_by_ctor(&name, arity);
let new_vector =
vector.chain_tail_into_iter(vec![Pattern::Wildcard; arity].into_iter());
new_matrix.is_useful(&new_vector)
}),
}
}
Pattern::Literal(literal) => {
let new_matrix: Matrix = self.specialize_rows_by_literal(literal);
let new_vector = vector.tail();
new_matrix.is_useful(&new_vector)
}
}
}
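// Editor's sketch (illustration, not part of the diff): usefulness at its
// simplest. Against an empty matrix every vector is useful; once a wildcard
// row is present, a new wildcard vector is fully shadowed:
//
//     let mut matrix = Matrix::new();
//     let row: PatternStack = Pattern::Wildcard.into();
//     assert!(matrix.is_useful(&row));   // nothing covers it yet
//     matrix.push(Pattern::Wildcard.into());
//     assert!(!matrix.is_useful(&row));  // now redundant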
pub(super) fn flatten(self) -> Vec<Pattern> {
self.into_iter().fold(vec![], |mut acc, p_stack| {
acc.extend(p_stack.0);
acc
})
}
// INVARIANTS:
//
// The initial rows "matrix" are all of length 1
// The initial count of items per row "n" is also 1
// The resulting rows are examples of missing patterns
//
pub(super) fn collect_missing_patterns(self, n: usize) -> Matrix {
if self.is_empty() {
return Matrix(vec![vec![Pattern::Wildcard; n].into()]);
}
if n == 0 {
return Matrix::new();
}
let ctors = self.collect_ctors();
let num_seen = ctors.len();
if num_seen == 0 {
let new_matrix = self.specialize_rows_by_wildcard();
let new_matrix = new_matrix.collect_missing_patterns(n - 1);
let new_matrix = new_matrix
.iter()
.map(|p_stack| {
let mut new_p_stack = p_stack.clone();
new_p_stack.insert(0, Pattern::Wildcard);
new_p_stack
})
.collect::<Matrix>();
return new_matrix;
}
let (_, alts) = ctors.first_key_value().unwrap();
if num_seen < alts.len() {
let new_matrix = self.specialize_rows_by_wildcard();
let new_matrix = new_matrix.collect_missing_patterns(n - 1);
let prefix = alts.iter().filter_map(|alt| is_missing(alts, &ctors, alt));
let mut m = Matrix::new();
for p_stack in new_matrix.into_iter() {
for p in prefix.clone() {
let mut p_stack = p_stack.clone();
p_stack.insert(0, p);
m.push(p_stack);
}
}
return m;
}
alts.iter()
.map(|ctor| {
let tipo::ValueConstructor { variant, .. } = ctor;
let (name, arity) = match variant {
tipo::ValueConstructorVariant::Record { name, arity, .. } => (name, arity),
_ => unreachable!("variant should be a ValueConstructorVariant"),
};
let new_matrix = self.specialize_rows_by_ctor(name, *arity);
let new_matrix = new_matrix.collect_missing_patterns(*arity + n - 1);
new_matrix
.into_iter()
.map(|p_stack| recover_ctor(alts.clone(), name, *arity, p_stack))
.collect()
})
.fold(Matrix::new(), |acc, m| acc.concat(m))
}
}
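// Editor's sketch (illustration, not part of the diff): with no clauses at
// all, the single missing row is `n` wildcards, which is exactly the base
// case of `collect_missing_patterns`:
//
//     let missing = Matrix::new().collect_missing_patterns(1);
//     assert_eq!(missing.flatten(), vec![Pattern::Wildcard]);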
#[derive(Debug)]
pub(crate) enum Complete {
Yes(Vec<tipo::ValueConstructor>),
No,
}
#[derive(Debug, Clone, PartialEq)]
pub(crate) enum Pattern {
Wildcard,
Literal(Literal),
Constructor(String, Vec<tipo::ValueConstructor>, Vec<Pattern>),
}
#[derive(Debug, Clone, PartialEq)]
pub(crate) enum Literal {
Int(String),
}
impl Pattern {
pub(super) fn pretty(self) -> String {
match self {
Pattern::Wildcard => "_".to_string(),
Pattern::Literal(_) => unreachable!("maybe never happens?"),
Pattern::Constructor(name, _alts, args) if name.contains(TUPLE_NAME) => {
let mut pretty_pattern = "(".to_string();
pretty_pattern.push_str(&args.into_iter().map(Pattern::pretty).join(", "));
pretty_pattern.push(')');
pretty_pattern
}
Pattern::Constructor(name, _alts, args) if name == CONS_NAME => {
let mut pretty_pattern = "[".to_string();
let args = args
.into_iter()
.enumerate()
.filter_map(|(index, p)| {
if index == 1 {
let tail = pretty_tail(p);
if tail == "[]" {
None
} else {
Some(tail)
}
} else {
Some(p.pretty())
}
})
.join(", ");
pretty_pattern.push_str(&args);
pretty_pattern.push(']');
pretty_pattern
}
Pattern::Constructor(mut name, alts, args) => {
let field_map = alts.into_iter().find_map(|alt| {
let tipo::ValueConstructor { variant, .. } = alt;
match variant {
tipo::ValueConstructorVariant::Record {
name: r_name,
field_map,
..
} if r_name == name => field_map,
_ => None,
}
});
if let Some(field_map) = field_map {
name.push_str(" { ");
let labels = field_map
.fields
.into_iter()
.sorted_by(|(_, (index_a, _)), (_, (index_b, _))| index_a.cmp(index_b))
.map(|(label, _)| label)
.zip(args)
.map(|(label, arg)| match arg {
Pattern::Wildcard => label,
rest => format!("{label}: {}", rest.pretty()),
})
.join(", ");
name.push_str(&labels);
name.push_str(" }");
name
} else {
if !args.is_empty() {
name.push('(');
name.push_str(&args.into_iter().map(Pattern::pretty).join(", "));
name.push(')');
}
name
}
}
}
}
}
fn pretty_tail(tail: Pattern) -> String {
match tail {
Pattern::Constructor(name, _alts, args) if name == CONS_NAME => {
let mut pretty_pattern = "".to_string();
let args = args
.into_iter()
.enumerate()
.map(|(index, p)| {
if index == 1 {
pretty_tail(p)
} else {
p.pretty()
}
})
.join(", ");
pretty_pattern.push_str(&args);
pretty_pattern
}
Pattern::Wildcard => "..".to_string(),
rest => rest.pretty(),
}
}
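// Editor's sketch (illustration, not part of the diff; `alts` stands for the
// relevant Vec<tipo::ValueConstructor>): a missing cons cell of wildcards
// prints as an open-ended list, and a constructor whose alts carry a field
// map prints in record shorthand, where wildcard fields keep just their label:
//
//     Pattern::Constructor("::".to_string(), alts, vec![Pattern::Wildcard, Pattern::Wildcard])
//         .pretty()   // => "[_, ..]"
//     Pattern::Constructor("Point".to_string(), alts, vec![Pattern::Wildcard, Pattern::Wildcard])
//         .pretty()   // => "Point { x, y }", given a field map with fields `x` and `y`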
fn list_constructors() -> Vec<tipo::ValueConstructor> {
let list_parameter = builtins::generic_var(0);
let list_type = builtins::list(list_parameter);
vec![
tipo::ValueConstructor {
public: true,
tipo: list_type.clone(),
variant: tipo::ValueConstructorVariant::Record {
name: CONS_NAME.to_string(),
arity: 2,
field_map: None,
location: ast::Span::empty(),
module: "".to_string(),
constructors_count: 2,
},
},
tipo::ValueConstructor {
public: true,
tipo: list_type,
variant: tipo::ValueConstructorVariant::Record {
name: NIL_NAME.to_string(),
arity: 0,
field_map: None,
location: ast::Span::empty(),
module: "".to_string(),
constructors_count: 2,
},
},
]
}
pub(super) fn simplify(
environment: &mut Environment,
value: &ast::TypedPattern,
) -> Result<Pattern, Error> {
match value {
ast::Pattern::Int { value, .. } => Ok(Pattern::Literal(Literal::Int(value.clone()))),
ast::Pattern::Assign { pattern, .. } => simplify(environment, pattern.as_ref()),
ast::Pattern::List { elements, tail, .. } => {
let mut p = if let Some(t) = tail {
simplify(environment, t)?
} else {
Pattern::Constructor(NIL_NAME.to_string(), list_constructors(), vec![])
};
for hd in elements.iter().rev() {
p = Pattern::Constructor(
CONS_NAME.to_string(),
list_constructors(),
vec![simplify(environment, hd)?, p],
);
}
Ok(p)
}
ast::Pattern::Constructor {
arguments,
location,
tipo,
with_spread,
constructor: super::PatternConstructor::Record { name, .. },
..
} => {
let (module, type_name, arity) = match tipo.deref() {
tipo::Type::App {
name: type_name,
module,
..
} => (module, type_name, 0),
tipo::Type::Fn { ret, args, .. } => match ret.deref() {
tipo::Type::App {
name: type_name,
module,
..
} => (module, type_name, args.len()),
_ => {
unreachable!("ret should be a Type::App")
}
},
_ => unreachable!("tipo should be a Type::App"),
};
let alts = environment.get_constructors_for_type(module, type_name, *location)?;
let mut args = Vec::new();
for argument in arguments {
args.push(simplify(environment, &argument.value)?);
}
if *with_spread {
for _ in 0..(arity - arguments.len()) {
args.push(Pattern::Wildcard)
}
}
Ok(Pattern::Constructor(name.to_string(), alts, args))
}
ast::Pattern::Tuple { elems, .. } => {
let mut args = vec![];
for elem in elems {
args.push(simplify(environment, elem)?);
}
Ok(Pattern::Constructor(
TUPLE_NAME.to_string(),
vec![tipo::ValueConstructor {
tipo: tipo::Type::Tuple { elems: vec![] }.into(),
public: true,
variant: tipo::ValueConstructorVariant::Record {
name: TUPLE_NAME.to_string(),
arity: elems.len(),
field_map: None,
location: ast::Span::empty(),
module: "".to_string(),
constructors_count: 1,
},
}],
args,
))
}
ast::Pattern::Var { .. } | ast::Pattern::Discard { .. } => Ok(Pattern::Wildcard),
}
}
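// Editor's sketch (illustration, not part of the diff): list patterns are
// rebuilt as cons cells from the tail outwards, and variables erase to
// wildcards, so the Aiken pattern `[x, ..rest]` simplifies to
//
//     Pattern::Constructor("::".to_string(), list_constructors(), vec![
//         Pattern::Wildcard, // `x`
//         Pattern::Wildcard, // `..rest`
//     ])
//
// while the empty list `[]` becomes Constructor("[]", list_constructors(), vec![]).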
impl iter::FromIterator<PatternStack> for Matrix {
fn from_iter<T: IntoIterator<Item = PatternStack>>(iter: T) -> Self {
Matrix(iter.into_iter().collect())
}
}
fn recover_ctor(
alts: Vec<tipo::ValueConstructor>,
name: &str,
arity: usize,
patterns: PatternStack,
) -> PatternStack {
let (args, mut rest) = patterns.split_at(arity);
rest.insert(0, Pattern::Constructor(name.to_string(), alts, args.into()));
rest
}
fn is_missing(
alts: &[tipo::ValueConstructor],
ctors: &BTreeMap<String, Vec<tipo::ValueConstructor>>,
ctor: &tipo::ValueConstructor,
) -> Option<Pattern> {
let tipo::ValueConstructor { variant, .. } = ctor;
let (name, arity) = match variant {
tipo::ValueConstructorVariant::Record { name, arity, .. } => (name, arity),
_ => unreachable!("variant should be a ValueConstructorVariant"),
};
if ctors.contains_key(name) {
None
} else {
Some(Pattern::Constructor(
name.clone(),
alts.to_vec(),
vec![Pattern::Wildcard; *arity],
))
}
}


@ -1,15 +1,13 @@
use crate::ast::TypedPattern;
use std::{cmp::Ordering, collections::HashMap, sync::Arc};
use std::{cmp::Ordering, collections::HashMap, rc::Rc};
use vec1::Vec1;
use crate::{
ast::{
Annotation, Arg, ArgName, AssignmentKind, BinOp, ByteArrayFormatPreference, CallArg,
ClauseGuard, Constant, IfBranch, RecordUpdateSpread, Span, TraceKind, Tracing, TypedArg,
TypedCallArg, TypedClause, TypedClauseGuard, TypedIfBranch, TypedRecordUpdateArg, UnOp,
UntypedArg, UntypedClause, UntypedClauseGuard, UntypedIfBranch, UntypedPattern,
UntypedRecordUpdateArg,
ClauseGuard, Constant, IfBranch, LogicalOpChainKind, RecordUpdateSpread, Span, TraceKind,
Tracing, TypedArg, TypedCallArg, TypedClause, TypedClauseGuard, TypedIfBranch,
TypedPattern, TypedRecordUpdateArg, UnOp, UntypedArg, UntypedClause, UntypedClauseGuard,
UntypedIfBranch, UntypedPattern, UntypedRecordUpdateArg,
},
builtins::{bool, byte_array, function, int, list, string, tuple},
expr::{FnStyle, TypedExpr, UntypedExpr},
@ -46,12 +44,9 @@ pub(crate) struct ExprTyper<'a, 'b> {
impl<'a, 'b> ExprTyper<'a, 'b> {
fn check_when_exhaustiveness(
&mut self,
subject: &Type,
typed_clauses: &[TypedClause],
location: Span,
) -> Result<(), Vec<String>> {
let value_typ = collapse_links(Arc::new(subject.clone()));
) -> Result<(), Error> {
// Currently, guards in exhaustiveness checking are assumed to be fallible,
// so we go through all clauses and pluck out only the patterns
// of clauses that don't have guards.
@ -63,12 +58,14 @@ impl<'a, 'b> ExprTyper<'a, 'b> {
..
} = clause
{
patterns.push(pattern.clone())
patterns.push(pattern)
}
}
self.environment
.check_exhaustiveness(patterns, value_typ, location)
.check_exhaustiveness(&patterns, location, false)?;
Ok(())
}
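// Editor's note (not part of the diff): because guards are treated as
// fallible, a guarded clause contributes no pattern here. For instance, in
//
//     when x is {
//         Some(_) if p -> ...   // skipped: guarded
//         None -> ...           // only this pattern is checked
//     }
//
// the checker would still report `Some(_)` as unmatched.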
pub fn do_infer_call(
@ -76,7 +73,7 @@ impl<'a, 'b> ExprTyper<'a, 'b> {
fun: UntypedExpr,
args: Vec<CallArg<UntypedExpr>>,
location: Span,
) -> Result<(TypedExpr, Vec<TypedCallArg>, Arc<Type>), Error> {
) -> Result<(TypedExpr, Vec<TypedCallArg>, Rc<Type>), Error> {
let fun = self.infer(fun)?;
let (fun, args, typ) = self.do_infer_call_with_known_fun(fun, args, location)?;
@ -89,7 +86,7 @@ impl<'a, 'b> ExprTyper<'a, 'b> {
fun: TypedExpr,
mut args: Vec<CallArg<UntypedExpr>>,
location: Span,
) -> Result<(TypedExpr, Vec<TypedCallArg>, Arc<Type>), Error> {
) -> Result<(TypedExpr, Vec<TypedCallArg>, Rc<Type>), Error> {
// Check to see if the function accepts labelled arguments
match self.get_field_map(&fun, location)? {
// The fun has a field map so labelled arguments may be present and need to be reordered.
@ -130,7 +127,7 @@ impl<'a, 'b> ExprTyper<'a, 'b> {
pub fn do_infer_fn(
&mut self,
args: Vec<UntypedArg>,
expected_args: &[Arc<Type>],
expected_args: &[Rc<Type>],
body: UntypedExpr,
return_annotation: &Option<Annotation>,
) -> Result<(Vec<TypedArg>, TypedExpr), Error> {
@ -216,6 +213,12 @@ impl<'a, 'b> ExprTyper<'a, 'b> {
location, value, ..
} => Ok(self.infer_string(value, location)),
UntypedExpr::LogicalOpChain {
kind,
expressions,
location,
} => self.infer_logical_op_chain(kind, expressions, location),
UntypedExpr::PipeLine { expressions, .. } => self.infer_pipeline(expressions),
UntypedExpr::Fn {
@ -709,19 +712,19 @@ impl<'a, 'b> ExprTyper<'a, 'b> {
let constructor = match &constructor.variant {
variant @ ValueConstructorVariant::ModuleFn { name, module, .. } => {
variant.to_module_value_constructor(Arc::clone(&tipo), module, name)
variant.to_module_value_constructor(Rc::clone(&tipo), module, name)
}
variant @ (ValueConstructorVariant::LocalVariable { .. }
| ValueConstructorVariant::ModuleConstant { .. }
| ValueConstructorVariant::Record { .. }) => {
variant.to_module_value_constructor(Arc::clone(&tipo), &module_name, &label)
variant.to_module_value_constructor(Rc::clone(&tipo), &module_name, &label)
}
};
Ok(TypedExpr::ModuleSelect {
label,
tipo: Arc::clone(&tipo),
tipo: Rc::clone(&tipo),
location: select_location,
module_name,
module_alias: module_alias.to_string(),
@ -822,7 +825,7 @@ impl<'a, 'b> ExprTyper<'a, 'b> {
fn infer_param(
&mut self,
arg: UntypedArg,
expected: Option<Arc<Type>>,
expected: Option<Rc<Type>>,
) -> Result<TypedArg, Error> {
let Arg {
arg_name,
@ -916,35 +919,20 @@ impl<'a, 'b> ExprTyper<'a, 'b> {
)?
};
// We currently only do limited exhaustiveness checking of custom types
// at the top level of patterns.
// Do not perform exhaustiveness checking if user explicitly used `assert`.
match kind {
AssignmentKind::Let => {
if let Err(unmatched) = self.environment.check_exhaustiveness(
vec![pattern.clone()],
collapse_links(value_typ.clone()),
location,
) {
return Err(Error::NotExhaustivePatternMatch {
location,
unmatched,
is_let: true,
});
}
self.environment
.check_exhaustiveness(&[&pattern], location, true)?
}
AssignmentKind::Expect => {
let is_exhaustive_pattern = self
.environment
.check_exhaustiveness(
vec![pattern.clone()],
collapse_links(value_typ.clone()),
location,
)
.check_exhaustiveness(&[&pattern], location, false)
.is_ok();
if !value_is_data && !value_typ.is_list() && is_exhaustive_pattern {
if !value_is_data && is_exhaustive_pattern {
self.environment
.warnings
.push(Warning::SingleConstructorExpect {
@ -996,7 +984,7 @@ impl<'a, 'b> ExprTyper<'a, 'b> {
fn infer_call_argument(
&mut self,
value: UntypedExpr,
tipo: Arc<Type>,
tipo: Rc<Type>,
) -> Result<TypedExpr, Error> {
let tipo = collapse_links(tipo);
@ -1431,7 +1419,7 @@ impl<'a, 'b> ExprTyper<'a, 'b> {
fn infer_fn(
&mut self,
args: Vec<UntypedArg>,
expected_args: &[Arc<Type>],
expected_args: &[Rc<Type>],
body: UntypedExpr,
is_capture: bool,
return_annotation: Option<Annotation>,
@ -1457,7 +1445,7 @@ impl<'a, 'b> ExprTyper<'a, 'b> {
&mut self,
args: Vec<TypedArg>,
body: UntypedExpr,
return_type: Option<Arc<Type>>,
return_type: Option<Rc<Type>>,
) -> Result<(Vec<TypedArg>, TypedExpr), Error> {
assert_no_assignment(&body)?;
@ -1589,6 +1577,52 @@ impl<'a, 'b> ExprTyper<'a, 'b> {
}
}
fn infer_logical_op_chain(
&mut self,
kind: LogicalOpChainKind,
expressions: Vec<UntypedExpr>,
location: Span,
) -> Result<TypedExpr, Error> {
let mut typed_expressions = vec![];
for expression in expressions {
let typed_expression = self.infer(expression)?;
self.unify(
bool(),
typed_expression.tipo(),
typed_expression.location(),
false,
)?;
typed_expressions.push(typed_expression);
}
if typed_expressions.len() < 2 {
return Err(Error::LogicalOpChainMissingExpr {
op: kind,
location,
missing: 2 - typed_expressions.len() as u8,
});
}
let name: BinOp = kind.into();
let chain = typed_expressions
.into_iter()
.rev()
.reduce(|acc, typed_expression| TypedExpr::BinOp {
location: Span::empty(),
tipo: bool(),
name,
left: typed_expression.into(),
right: acc.into(),
})
.expect("should have at least two");
Ok(chain)
}
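// Editor's note (not part of the diff): the reversed fold above
// right-associates the chain. Assuming operands `a`, `b` and `c` of an
// `and { a, b, c }` block, each is first unified with Bool, then rebuilt as
//
//     BinOp { name: And, left: a, right: BinOp { name: And, left: b, right: c } }
//
// i.e. `a && (b && c)`.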
fn infer_pipeline(&mut self, expressions: Vec1<UntypedExpr>) -> Result<TypedExpr, Error> {
PipeTyper::infer(self, expressions)
}
@ -1878,15 +1912,7 @@ impl<'a, 'b> ExprTyper<'a, 'b> {
}
}
if let Err(unmatched) =
self.check_when_exhaustiveness(&subject_type, &typed_clauses, location)
{
return Err(Error::NotExhaustivePatternMatch {
location,
unmatched,
is_let: false,
});
}
self.check_when_exhaustiveness(&typed_clauses, location)?;
Ok(TypedExpr::When {
location,
@ -1896,7 +1922,7 @@ impl<'a, 'b> ExprTyper<'a, 'b> {
})
}
fn instantiate(&mut self, t: Arc<Type>, ids: &mut HashMap<u64, Arc<Type>>) -> Arc<Type> {
fn instantiate(&mut self, t: Rc<Type>, ids: &mut HashMap<u64, Rc<Type>>) -> Rc<Type> {
self.environment.instantiate(t, ids, &self.hydrator)
}
@ -1909,19 +1935,19 @@ impl<'a, 'b> ExprTyper<'a, 'b> {
}
}
pub fn new_unbound_var(&mut self) -> Arc<Type> {
pub fn new_unbound_var(&mut self) -> Rc<Type> {
self.environment.new_unbound_var()
}
pub fn type_from_annotation(&mut self, annotation: &Annotation) -> Result<Arc<Type>, Error> {
pub fn type_from_annotation(&mut self, annotation: &Annotation) -> Result<Rc<Type>, Error> {
self.hydrator
.type_from_annotation(annotation, self.environment)
}
fn unify(
&mut self,
t1: Arc<Type>,
t2: Arc<Type>,
t1: Rc<Type>,
t2: Rc<Type>,
location: Span,
allow_cast: bool,
) -> Result<(), Error> {
@ -1953,6 +1979,7 @@ fn assert_no_assignment(expr: &UntypedExpr) -> Result<(), Error> {
| UntypedExpr::TupleIndex { .. }
| UntypedExpr::UnOp { .. }
| UntypedExpr::Var { .. }
| UntypedExpr::LogicalOpChain { .. }
| UntypedExpr::TraceIfFalse { .. }
| UntypedExpr::When { .. } => Ok(()),
}


@ -1,4 +1,4 @@
use std::{collections::HashMap, sync::Arc};
use std::{collections::HashMap, rc::Rc};
use crate::{
ast::Annotation,
@ -26,7 +26,7 @@ use super::{
///
#[derive(Debug)]
pub struct Hydrator {
created_type_variables: HashMap<String, Arc<Type>>,
created_type_variables: HashMap<String, Rc<Type>>,
/// A rigid type is a generic type that was specified as being generic in
/// an annotation. As such it should never be instantiated into an unbound
/// variable. This type_id => name map is used for reporting the original
@ -37,7 +37,7 @@ pub struct Hydrator {
#[derive(Debug)]
pub struct ScopeResetData {
created_type_variables: HashMap<String, Arc<Type>>,
created_type_variables: HashMap<String, Rc<Type>>,
rigid_type_names: HashMap<u64, String>,
}
@ -90,7 +90,7 @@ impl Hydrator {
&mut self,
ast: &Option<Annotation>,
environment: &mut Environment,
) -> Result<Arc<Type>, Error> {
) -> Result<Rc<Type>, Error> {
match ast {
Some(ast) => self.type_from_annotation(ast, environment),
None => Ok(environment.new_unbound_var()),
@ -101,7 +101,7 @@ impl Hydrator {
&mut self,
annotation: &Annotation,
environment: &mut Environment,
) -> Result<Arc<Type>, Error> {
) -> Result<Rc<Type>, Error> {
let mut unbounds = vec![];
let tipo = self.do_type_from_annotation(annotation, environment, &mut unbounds)?;
@ -122,7 +122,7 @@ impl Hydrator {
annotation: &'a Annotation,
environment: &mut Environment,
unbounds: &mut Vec<&'a Span>,
) -> Result<Arc<Type>, Error> {
) -> Result<Rc<Type>, Error> {
match annotation {
Annotation::Constructor {
location,


@ -269,7 +269,6 @@ fn infer_definition(
.get_variable(&fun.name)
.expect("Could not find preregistered type for function");
let preregistered_type = preregistered_fn.tipo.clone();
let (args_types, _return_type) = preregistered_type
@ -308,8 +307,11 @@ fn infer_definition(
environment,
tracing,
kind,
)? else {
unreachable!("validator definition inferred as something other than a function?")
)?
else {
unreachable!(
"validator definition inferred as something other than a function?"
)
};
if !typed_fun.return_type.is_bool() {
@ -319,7 +321,8 @@ fn infer_definition(
});
}
let typed_params = typed_fun.arguments
let typed_params = typed_fun
.arguments
.drain(0..params_length)
.map(|mut arg| {
if arg.tipo.is_unbound() {
@ -330,7 +333,6 @@ fn infer_definition(
})
.collect();
if typed_fun.arguments.len() < 2 || typed_fun.arguments.len() > 3 {
return Err(Error::IncorrectValidatorArity {
count: typed_fun.arguments.len() as u32,
@ -356,7 +358,8 @@ fn infer_definition(
environment,
tracing,
kind,
)? else {
)?
else {
unreachable!(
"validator definition inferred as something other than a function?"
)
@ -398,8 +401,6 @@ fn infer_definition(
})
.transpose();
Ok(Definition::Validator(Validator {
doc,
end_position,


@ -3,7 +3,7 @@
use std::{
collections::{HashMap, HashSet},
ops::Deref,
sync::Arc,
rc::Rc,
};
use itertools::Itertools;
@ -44,7 +44,7 @@ impl<'a, 'b> PatternTyper<'a, 'b> {
fn insert_variable(
&mut self,
name: &str,
typ: Arc<Type>,
typ: Rc<Type>,
location: Span,
err_location: Span,
) -> Result<(), Error> {
@ -132,7 +132,7 @@ impl<'a, 'b> PatternTyper<'a, 'b> {
pattern: UntypedPattern,
subject: &Type,
) -> Result<TypedPattern, Error> {
self.unify(pattern, Arc::new(subject.clone()), None, false)
self.unify(pattern, Rc::new(subject.clone()), None, false)
}
/// When we have an assignment or a case expression we unify the pattern with the
@ -141,8 +141,8 @@ impl<'a, 'b> PatternTyper<'a, 'b> {
pub fn unify(
&mut self,
pattern: UntypedPattern,
tipo: Arc<Type>,
ann_type: Option<Arc<Type>>,
tipo: Rc<Type>,
ann_type: Option<Rc<Type>>,
is_assignment: bool,
) -> Result<TypedPattern, Error> {
match pattern {


@ -1,4 +1,4 @@
use std::{ops::Deref, sync::Arc};
use std::{ops::Deref, rc::Rc};
use vec1::Vec1;
@ -17,7 +17,7 @@ use super::{
#[derive(Debug)]
pub(crate) struct PipeTyper<'a, 'b, 'c> {
size: usize,
argument_type: Arc<Type>,
argument_type: Rc<Type>,
argument_location: Span,
location: Span,
expressions: Vec<TypedExpr>,


@ -1,4 +1,4 @@
use std::{collections::HashMap, sync::Arc};
use std::{collections::HashMap, rc::Rc};
use itertools::Itertools;
@ -45,7 +45,7 @@ impl Printer {
}
// TODO: have this function return a Document that borrows from the Type.
// Is this possible? The lifetime would have to go through the Arc<Refcell<Type>>
// Is this possible? The lifetime would have to go through the Rc<RefCell<Type>>
// for TypeVar::Link'd types.
pub fn print<'a>(&mut self, typ: &Type) -> Document<'a> {
match typ {
@ -141,7 +141,7 @@ impl Printer {
chars.into_iter().rev().collect()
}
fn args_to_aiken_doc<'a>(&mut self, args: &[Arc<Type>]) -> Document<'a> {
fn args_to_aiken_doc<'a>(&mut self, args: &[Rc<Type>]) -> Document<'a> {
if args.is_empty() {
return nil();
}
@ -284,13 +284,13 @@ mod tests {
name: "Pair".to_string(),
public: true,
args: vec![
Arc::new(Type::App {
Rc::new(Type::App {
module: "whatever".to_string(),
name: "Int".to_string(),
public: true,
args: vec![],
}),
Arc::new(Type::App {
Rc::new(Type::App {
module: "whatever".to_string(),
name: "Bool".to_string(),
public: true,
@ -303,20 +303,20 @@ mod tests {
assert_string!(
Type::Fn {
args: vec![
Arc::new(Type::App {
Rc::new(Type::App {
args: vec![],
module: "whatever".to_string(),
name: "Int".to_string(),
public: true,
}),
Arc::new(Type::App {
Rc::new(Type::App {
args: vec![],
module: "whatever".to_string(),
name: "Bool".to_string(),
public: true,
}),
],
ret: Arc::new(Type::App {
ret: Rc::new(Type::App {
args: vec![],
module: "whatever".to_string(),
name: "Bool".to_string(),
@ -327,8 +327,8 @@ mod tests {
);
assert_string!(
Type::Var {
tipo: Arc::new(RefCell::new(TypeVar::Link {
tipo: Arc::new(Type::App {
tipo: Rc::new(RefCell::new(TypeVar::Link {
tipo: Rc::new(Type::App {
args: vec![],
module: "whatever".to_string(),
name: "Int".to_string(),
@ -340,28 +340,28 @@ mod tests {
);
assert_string!(
Type::Var {
tipo: Arc::new(RefCell::new(TypeVar::Unbound { id: 2231 })),
tipo: Rc::new(RefCell::new(TypeVar::Unbound { id: 2231 })),
},
"a",
);
assert_string!(
function(
vec![Arc::new(Type::Var {
tipo: Arc::new(RefCell::new(TypeVar::Unbound { id: 78 })),
vec![Rc::new(Type::Var {
tipo: Rc::new(RefCell::new(TypeVar::Unbound { id: 78 })),
})],
Arc::new(Type::Var {
tipo: Arc::new(RefCell::new(TypeVar::Unbound { id: 2 })),
Rc::new(Type::Var {
tipo: Rc::new(RefCell::new(TypeVar::Unbound { id: 2 })),
}),
),
"fn(a) -> b",
);
assert_string!(
function(
vec![Arc::new(Type::Var {
tipo: Arc::new(RefCell::new(TypeVar::Generic { id: 78 })),
vec![Rc::new(Type::Var {
tipo: Rc::new(RefCell::new(TypeVar::Generic { id: 78 })),
})],
Arc::new(Type::Var {
tipo: Arc::new(RefCell::new(TypeVar::Generic { id: 2 })),
Rc::new(Type::Var {
tipo: Rc::new(RefCell::new(TypeVar::Generic { id: 2 })),
}),
),
"fn(a) -> b",
@ -378,7 +378,7 @@ mod tests {
);
}
fn pretty_print(typ: Arc<Type>) -> String {
fn pretty_print(typ: Rc<Type>) -> String {
Printer::new().pretty_print(&typ, 0)
}
}


@ -1,6 +1,6 @@
[package]
name = "aiken-lsp"
version = "1.0.13-alpha"
version = "1.0.17-alpha"
edition = "2021"
description = "Cardano smart contract language and toolchain"
repository = "https://github.com/aiken-lang/aiken"
@ -24,5 +24,5 @@ tracing = "0.1.37"
url = "2.3.1"
urlencoding = "2.1.2"
aiken-lang = { path = '../aiken-lang', version = "1.0.13-alpha" }
aiken-project = { path = '../aiken-project', version = "1.0.13-alpha" }
aiken-lang = { path = '../aiken-lang', version = "1.0.17-alpha" }
aiken-project = { path = '../aiken-project', version = "1.0.17-alpha" }


@ -163,7 +163,21 @@ impl Server {
aiken_lang::format::pretty(&mut new_text, module, extra, src);
}
None => {
let src = fs::read_to_string(path).map_err(ProjectError::from)?;
let src = {
#[cfg(not(target_os = "windows"))]
{
fs::read_to_string(path).map_err(ProjectError::from)?
}
#[cfg(target_os = "windows")]
{
let temp = match urlencoding::decode(path) {
Ok(decoded) => decoded.to_string(),
Err(_) => path.to_owned(),
};
fs::read_to_string(temp.trim_start_matches("/"))
.map_err(ProjectError::from)?
}
};
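// Editor's note (not part of the diff): on Windows the client sends
// percent-encoded URI paths with a leading slash, e.g. "/c%3A/dir/mod.ak",
// which decodes to "/c:/dir/mod.ak"; trimming the leading "/" leaves a
// path the filesystem accepts, "c:/dir/mod.ak".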
let (module, extra) = parser::module(&src, ModuleKind::Lib).map_err(|errs| {
aiken_project::error::Error::from_parse_errors(errs, Path::new(path), &src)
@ -620,6 +634,7 @@ impl Server {
data: None,
};
#[cfg(not(target_os = "windows"))]
let path = path.canonicalize()?;
self.push_diagnostic(path.clone(), lsp_diagnostic.clone());


@ -1,15 +1,15 @@
[package]
name = "aiken-project"
description = "Aiken project utilities"
version = "1.0.13-alpha"
version = "1.0.17-alpha"
edition = "2021"
repository = "https://github.com/aiken-lang/aiken/crates/project"
homepage = "https://github.com/aiken-lang/aiken"
license = "Apache-2.0"
authors = [
"Lucas Rosa <x@rvcas.dev>",
"Kasey White <kwhitemsg@gmail.com>",
"KtorZ <matthias.benkort@gmail.com>",
"Lucas Rosa <x@rvcas.dev>",
"Kasey White <kwhitemsg@gmail.com>",
"KtorZ <matthias.benkort@gmail.com>",
]
rust-version = "1.66.1"
@ -32,7 +32,7 @@ petgraph = "0.6.3"
pulldown-cmark = { version = "0.9.2", default-features = false }
rayon = "1.7.0"
regex = "1.7.1"
reqwest = "0.11.14"
reqwest = { version = "0.11.14", features = ["blocking", "json"] }
serde = { version = "1.0.152", features = ["derive"] }
serde_json = { version = "1.0.94", features = ["preserve_order"] }
strip-ansi-escapes = "0.1.1"
@ -42,10 +42,9 @@ toml = "0.7.2"
walkdir = "2.3.2"
zip = "0.6.4"
aiken-lang = { path = "../aiken-lang", version = "1.0.13-alpha" }
uplc = { path = '../uplc', version = "1.0.13-alpha" }
aiken-lang = { path = "../aiken-lang", version = "1.0.17-alpha" }
uplc = { path = '../uplc', version = "1.0.17-alpha" }
[dev-dependencies]
proptest = "1.1.0"
proptest = "1.2.0"
pretty_assertions = "1.3.0"


@ -8,7 +8,7 @@ use std::{
collections::{BTreeMap, HashMap},
fmt::{self, Display},
ops::Deref,
sync::Arc,
rc::Rc,
};
// ---------- Definitions
@ -68,7 +68,7 @@ impl<T> Definitions<T> {
pub fn register<F, E>(
&mut self,
type_info: &Type,
type_parameters: &HashMap<u64, Arc<Type>>,
type_parameters: &HashMap<u64, Rc<Type>>,
build_schema: F,
) -> Result<Reference, E>
where
@ -124,7 +124,7 @@ impl Display for Reference {
}
impl Reference {
pub fn from_type(type_info: &Type, type_parameters: &HashMap<u64, Arc<Type>>) -> Self {
pub fn from_type(type_info: &Type, type_parameters: &HashMap<u64, Rc<Type>>) -> Self {
match type_info {
Type::App {
module, name, args, ..
@ -168,7 +168,7 @@ impl Reference {
}
}
fn from_types(args: &Vec<Arc<Type>>, type_parameters: &HashMap<u64, Arc<Type>>) -> Self {
fn from_types(args: &Vec<Rc<Type>>, type_parameters: &HashMap<u64, Rc<Type>>) -> Self {
if args.is_empty() {
Reference::new("")
} else {


@ -1,7 +1,7 @@
use crate::blueprint::definitions::{Definitions, Reference};
use crate::CheckedModule;
use aiken_lang::{
ast::{DataType, Definition, TypedDefinition},
ast::{Definition, TypedDataType, TypedDefinition},
builtins::wrapped_redeemer,
tipo::{pretty, Type, TypeVar},
};
@ -12,7 +12,8 @@ use serde::{
ser::{Serialize, SerializeStruct, Serializer},
};
use serde_json as json;
use std::{collections::HashMap, fmt, ops::Deref, sync::Arc};
use std::rc::Rc;
use std::{collections::HashMap, fmt, ops::Deref};
// NOTE: Can be anything BUT 0
pub const REDEEMER_DISCRIMINANT: usize = 1;
@ -124,7 +125,7 @@ impl Annotated<Schema> {
pub fn as_wrapped_redeemer(
definitions: &mut Definitions<Annotated<Schema>>,
schema: Reference,
type_info: Arc<Type>,
type_info: Rc<Type>,
) -> Reference {
definitions
.register(
@ -156,7 +157,7 @@ impl Annotated<Schema> {
fn do_from_type(
type_info: &Type,
modules: &HashMap<String, CheckedModule>,
type_parameters: &mut HashMap<u64, Arc<Type>>,
type_parameters: &mut HashMap<u64, Rc<Type>>,
definitions: &mut Definitions<Self>,
) -> Result<Reference, Error> {
match type_info {
@ -403,9 +404,9 @@ impl Annotated<Schema> {
impl Data {
fn from_data_type(
data_type: &DataType<Arc<Type>>,
data_type: &TypedDataType,
modules: &HashMap<String, CheckedModule>,
type_parameters: &mut HashMap<u64, Arc<Type>>,
type_parameters: &mut HashMap<u64, Rc<Type>>,
definitions: &mut Definitions<Annotated<Schema>>,
) -> Result<Self, Error> {
let mut variants = vec![];
@ -451,9 +452,9 @@ impl Data {
}
fn collect_type_parameters<'a>(
type_parameters: &'a mut HashMap<u64, Arc<Type>>,
generics: &'a [Arc<Type>],
applications: &'a [Arc<Type>],
type_parameters: &'a mut HashMap<u64, Rc<Type>>,
generics: &'a [Rc<Type>],
applications: &'a [Rc<Type>],
) {
for (index, generic) in generics.iter().enumerate() {
match &**generic {
@ -474,7 +475,7 @@ fn collect_type_parameters<'a>(
}
}
fn find_data_type(name: &str, definitions: &[TypedDefinition]) -> Option<DataType<Arc<Type>>> {
fn find_data_type(name: &str, definitions: &[TypedDefinition]) -> Option<TypedDataType> {
for def in definitions {
match def {
Definition::DataType(data_type) if name == data_type.name => {
@ -1335,6 +1336,7 @@ pub mod tests {
)
}
#[allow(clippy::arc_with_non_send_sync)]
fn arbitrary_data() -> impl Strategy<Value = Data> {
let leaf = prop_oneof![Just(Data::Opaque), Just(Data::Bytes), Just(Data::Integer)];
@ -1361,6 +1363,7 @@ pub mod tests {
})
}
#[allow(clippy::arc_with_non_send_sync)]
fn arbitrary_schema() -> impl Strategy<Value = Schema> {
prop_compose! {
fn data_strategy()(data in arbitrary_data()) -> Schema {


@ -5,6 +5,7 @@ use super::{
schema::{Annotated, Schema},
};
use crate::module::{CheckedModule, CheckedModules};
use std::rc::Rc;
use aiken_lang::{
ast::{TypedArg, TypedFunction, TypedValidator},
@ -12,7 +13,10 @@ use aiken_lang::{
};
use miette::NamedSource;
use serde;
use uplc::ast::{DeBruijn, Program, Term};
use uplc::{
ast::{Constant, DeBruijn, Program, Term},
PlutusData,
};
#[derive(Debug, PartialEq, Clone, serde::Serialize, serde::Deserialize)]
pub struct Validator {
@ -174,6 +178,39 @@ impl Validator {
}
}
}
pub fn ask_next_parameter<F>(
&self,
definitions: &Definitions<Annotated<Schema>>,
ask: F,
) -> Result<Term<DeBruijn>, Error>
where
F: Fn(&Annotated<Schema>, &Definitions<Annotated<Schema>>) -> Result<PlutusData, Error>,
{
match self.parameters.split_first() {
None => Err(Error::NoParametersToApply),
Some((head, _)) => {
let schema = definitions
.lookup(&head.schema)
.map(|s| {
Ok(Annotated {
title: s.title.clone().or_else(|| head.title.clone()),
description: s.description.clone(),
annotated: s.annotated.clone(),
})
})
.unwrap_or_else(|| {
Err(Error::UnresolvedSchemaReference {
reference: head.schema.clone(),
})
})?;
let data = ask(&schema, definitions)?;
Ok(Term::Constant(Rc::new(Constant::Data(data.clone()))))
}
}
}
}
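// Editor's sketch (illustration, not part of the diff; `validator` and
// `definitions` are assumed bindings, `prompt_for` a hypothetical callback
// that turns a schema into PlutusData):
//
//     let term = validator.ask_next_parameter(&definitions, |schema, defs| {
//         prompt_for(schema, defs)
//     })?;
//     // `term` is the parameter wrapped as Term::Constant(Constant::Data(..)),
//     // ready to be applied to the validator.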
#[cfg(test)]
@ -418,8 +455,8 @@ mod tests {
"$ref": "#/definitions/test_module~1Input"
}
},
"compiledCode": "5902a201000032323232323232323232322223232533300a4a22930b1900299919119299980699b87480000044c8c8c8c8c8c94ccc05cc0640084c8c9263300a004232498dd700099299980a19b87480000044c8c94ccc068c07000852615330174901334c6973742f5475706c652f436f6e73747220636f6e7461696e73206d6f7265206974656d73207468616e2065787065637465640016375a6034002602400c2a6602a9212b436f6e73747220696e64657820646964206e6f74206d6174636820616e7920747970652076617269616e740016301200515330144901334c6973742f5475706c652f436f6e73747220636f6e7461696e73206d6f7265206974656d73207468616e2065787065637465640016375c602e002602e0046eb0c054004c054008c04c004c02c00854cc0392412b436f6e73747220696e64657820646964206e6f74206d6174636820616e7920747970652076617269616e740016300b0013001001222533301000214984c8ccc010010c04c00c008c004c044008010c800cc94ccc024cdc3a40000022a66601a600e0062930a9980524811d4578706563746564206e6f206669656c647320666f7220436f6e7374720016153330093370e90010008a99980698038018a4c2a6601492011d4578706563746564206e6f206669656c647320666f7220436f6e7374720016153330093370e90020008a99980698038018a4c2a6601492011d4578706563746564206e6f206669656c647320666f7220436f6e7374720016153300a4912b436f6e73747220696e64657820646964206e6f74206d6174636820616e7920747970652076617269616e740016300700233001001480008888cccc01ccdc38008018061199980280299b8000448008c0380040080088c018dd5000918021baa0015734ae7155ceaab9e5573eae855d11",
"hash": "b57f3d9e610afae77ef6d2662d945b3bb1e1c8698ff55fe8e9287b00",
"compiledCode": "59029101000032323232323232323232322223232533300a4a22930b19299980519b874800000454ccc038c020010526153300b4911d4578706563746564206e6f206669656c647320666f7220436f6e73747200161533300a3370e90010008a99980718040020a4c2a6601692011d4578706563746564206e6f206669656c647320666f7220436f6e73747200161533300a3370e90020008a99980718040020a4c2a6601692011d4578706563746564206e6f206669656c647320666f7220436f6e7374720016153300b4912b436f6e73747220696e64657820646964206e6f74206d6174636820616e7920747970652076617269616e74001630080033253330093370e900000089919191919192999809980a8010991924c646600200200a44a66602c00229309919801801980c801191bae00130170013253330103370e900000089919299980b180c0010a4c2a660269201334c6973742f5475706c652f436f6e73747220636f6e7461696e73206d6f7265206974656d73207468616e2065787065637465640016375a602c002601c00c2a660229212b436f6e73747220696e64657820646964206e6f74206d6174636820616e7920747970652076617269616e740016300e00515330104901334c6973742f5475706c652f436f6e73747220636f6e7461696e73206d6f7265206974656d73207468616e2065787065637465640016375c602600260260046eb0c044004c044008c03c004c01c01054cc0292412b436f6e73747220696e64657820646964206e6f74206d6174636820616e7920747970652076617269616e740016300700333001001480008888cccc01ccdc38008018061199980280299b8000448008c0380040080088c018dd5000918021baa0015734ae7155ceaab9e5573eae855d11",
"hash": "401a6c4bac4f3554a9bbe260aa12d2eec8c97bf903d23cd6ad426d1e",
"definitions": {
"ByteArray": {
"dataType": "bytes"
@ -530,8 +567,8 @@ mod tests {
"$ref": "#/definitions/Tuple$Int_Int_Int"
}
},
"compiledCode": "58ab01000032323232323232222323253330064a22930b1919190019bae300a002375a6010002646466ec0c030008c030004c030004dd60021919191919192999807180800108030a99805a481334c6973742f5475706c652f436f6e73747220636f6e7461696e73206d6f7265206974656d73207468616e2065787065637465640016375a601c002601c0046eb4c030004c030008dd698050009bac0025734ae7155ceaab9e5573eae855d101",
"hash": "f8258ac5409f8c0a921f99f4427e3f9362e0ed0146ff71914f80fc4e",
"compiledCode": "58a8010000323232323232322223232323253330084a22930b1919191919299980818090010a4c2a6601a921334c6973742f5475706c652f436f6e73747220636f6e7461696e73206d6f7265206974656d73207468616e2065787065637465640016375a602000260200046eb4c038004c038008dd698060009bac00432375c60120046eb4c01c004c8c8cdd81805801180580098058009bac0035734ae7155ceaab9e5573eae855d101",
"hash": "d21ff2a6ebd64fb9c3bbfe555b7db490a878566185be79241fc22b1e",
"definitions": {
"ByteArray": {
"dataType": "bytes"
@ -599,8 +636,8 @@ mod tests {
"$ref": "#/definitions/test_module~1Either$ByteArray_test_module~1Interval$Int"
}
},
"compiledCode": "59020c0100003232323232323232323232223253330084a22930b1900199299980419b87480000044c8c94ccc038c040008526153300b491334c6973742f5475706c652f436f6e73747220636f6e7461696e73206d6f7265206974656d73207468616e2065787065637465640016375c601c002600c0062a66601066e1d200200113232533300e3010002132498c94ccc02ccdc3a400000226464a66602260260042930a99807249334c6973742f5475706c652f436f6e73747220636f6e7461696e73206d6f7265206974656d73207468616e2065787065637465640016375a602200260120042a66601666e1d20020011533300f3009002149854cc03124011d4578706563746564206e6f206669656c647320666f7220436f6e7374720016153300c4912b436f6e73747220696e64657820646964206e6f74206d6174636820616e7920747970652076617269616e7400163009001153300b4901334c6973742f5475706c652f436f6e73747220636f6e7461696e73206d6f7265206974656d73207468616e2065787065637465640016300e001300600315330094912b436f6e73747220696e64657820646964206e6f74206d6174636820616e7920747970652076617269616e740016300600233001001480008888cccc01ccdc38008018061199980280299b8000448008c0380040080088c018dd5000918021baa0015734ae7155ceaab9e5573eae855d11",
"hash": "80d2bf8e5785ac1fd753a00c28cc808e1c9f0dac08e42bdb0d2a3142",
"compiledCode": "59020a0100003232323232323232323232223253330084a22930b19299980419b87480000044c8c94ccc038c040008526153300b4901334c6973742f5475706c652f436f6e73747220636f6e7461696e73206d6f7265206974656d73207468616e2065787065637465640016375c601c002600c0062a66601066e1d200200113232533300e3010002132498c94ccc02ccdc3a400000226464a66602260260042930a99807249334c6973742f5475706c652f436f6e73747220636f6e7461696e73206d6f7265206974656d73207468616e2065787065637465640016375a602200260120042a66601666e1d20020011533300f3009002149854cc03124011d4578706563746564206e6f206669656c647320666f7220436f6e7374720016153300c4912b436f6e73747220696e64657820646964206e6f74206d6174636820616e7920747970652076617269616e7400163009001153300b4901334c6973742f5475706c652f436f6e73747220636f6e7461696e73206d6f7265206974656d73207468616e2065787065637465640016300e001300600315330094912b436f6e73747220696e64657820646964206e6f74206d6174636820616e7920747970652076617269616e740016300600233001001480008888cccc01ccdc38008018061199980280299b8000448008c0380040080088c018dd5000918021baa0015734ae7155ceaab9e5573eae855d11",
"hash": "8439b07179746c195c7631777b49e48c2931887547e3258f5f4a59f0",
"definitions": {
"ByteArray": {
"dataType": "bytes"
@ -683,8 +720,8 @@ mod tests {
"$ref": "#/definitions/test_module~1Dict$test_module~1UUID_Int"
}
},
"compiledCode": "590115010000323232323232323232223253330064a22930b1900199919119299980499b87480000044c8c94ccc03cc0440084c9263300500123232498dd698080011bae300e001153300c4901334c6973742f5475706c652f436f6e73747220636f6e7461696e73206d6f7265206974656d73207468616e20657870656374656400163756601e00260186ea800854cc0292412b436f6e73747220696e64657820646964206e6f74206d6174636820616e7920747970652076617269616e740016300a37540026002002444a66601800429309919980200218078018011800980680100119800800a40004444666600a66e1c00400c0288cccc014014cdc0002240046018002004004ae695ce2ab9d5573caae7d5d0aba21",
"hash": "780668561f5650bba680ecc5a1ccee2829df0bbe27d29f9c5c456bbc",
"compiledCode": "590106010000323232323232323232223253330064a22930b19299980319b87480000044c8c94ccc030c0380084c926323300100100222533300e00114984c8cc00c00cc044008c8c8dd698078011bae300d001300f0011533009491334c6973742f5475706c652f436f6e73747220636f6e7461696e73206d6f7265206974656d73207468616e20657870656374656400163756601800260126ea800c54cc01d2412b436f6e73747220696e64657820646964206e6f74206d6174636820616e7920747970652076617269616e740016300737540046600200290001111199980299b8700100300a2333300500533700008900118060008010012b9a5738aae7555cf2ab9f5742ae89",
"hash": "683885e262c8857f80788a1626c1a327267d85cb49e08382288933b2",
"definitions": {
"ByteArray": {
"dataType": "bytes"
@ -746,8 +783,8 @@ mod tests {
"$ref": "#/definitions/test_module~1Dict$test_module~1UUID_Int"
}
},
"compiledCode": "5855010000323232323232223253330044a22930b19190011999180080091129998050010a4c264666008008601a0060046002601600400246464931bad3008002375c600c0026eac0095cd2ab9d5573caae7d5d0aba21",
"hash": "857336762a5637afaacef8f0b536f23763fa05006e5f4f2401e3c7d9",
"compiledCode": "584c01000032323232323222323253330054a22930b19198008008011129998048008a4c26466006006601800464646eb4c028008dd7180400098050009bab0025734aae7555cf2ab9f5742ae881",
"hash": "6ab85c61be6a417c860621155f9c9c91cbaff382efbe7d532173b7ea",
"definitions": {
"ByteArray": {
"dataType": "bytes"
@ -798,8 +835,8 @@ mod tests {
"$ref": "#/definitions/Int"
}
},
"compiledCode": "58e4010000323232323232323232222323253330084a22930b1900299299980419b87480000044c8c94ccc038c040008526153300b4901334c6973742f5475706c652f436f6e73747220636f6e7461696e73206d6f7265206974656d73207468616e2065787065637465640016300e001300b375400a2a660129212b436f6e73747220696e64657820646964206e6f74206d6174636820616e7920747970652076617269616e740016300937540086eb4008cc0040052000222233330053370e0020060144666600a00a66e000112002300c0010020025734ae7155ceaab9e5573eae855d101",
"hash": "b8bce36b335ed232d2204ac8de888fc3bf28bf0bc2b4c4c8116d409f",
"compiledCode": "58e1010000323232323232323232222323253330084a22930b1bad0033253330073370e900000089919299980698078010a4c2a660149201334c6973742f5475706c652f436f6e73747220636f6e7461696e73206d6f7265206974656d73207468616e2065787065637465640016300d001300a37540082a660109212b436f6e73747220696e64657820646964206e6f74206d6174636820616e7920747970652076617269616e740016300837540066600200290001111199980299b8700100300a2333300500533700008900118060008010012b9a5738aae7555cf2ab9f5742ae89",
"hash": "4adc0e010fd62343583ca163c1b82e2085fcb221fafd68955685bb2e",
"definitions": {
"Data": {
"title": "Data",
@ -853,8 +890,8 @@ mod tests {
"$ref": "#/definitions/test_module~1Expr"
}
},
"compiledCode": "5901c901000032323232323232323232223253330074a22930b1900199918008009119299980499b87480000044c8c94ccc03cc044008526153300c491334c6973742f5475706c652f436f6e73747220636f6e7461696e73206d6f7265206974656d73207468616e2065787065637465640016375a601e00260100042a66601266e1d20020011323232325333011301300213232498cc020020008cc01c01c00c54cc0392401334c6973742f5475706c652f436f6e73747220636f6e7461696e73206d6f7265206974656d73207468616e206578706563746564001630110013011002300f0013008002153330093370e9002000899191919299980898098010991924c660100100046600e00e0062a6601c9201334c6973742f5475706c652f436f6e73747220636f6e7461696e73206d6f7265206974656d73207468616e206578706563746564001630110013011002300f0013008002153300a4912b436f6e73747220696e64657820646964206e6f74206d6174636820616e7920747970652076617269616e740016300a37540020046600200290001111199980319b8700100300b233330050053370000890011806800801001118029baa0015734ae7155ceaab9e5573eae855d101",
"hash": "d89d1c0bdde26ab12979ff50a140f3f1f7a47d50ccf4cf633b3ef3d3",
"compiledCode": "5901c701000032323232323232323232223253330074a22930b19918008009119299980499b87480000044c8c94ccc03cc044008526153300c4901334c6973742f5475706c652f436f6e73747220636f6e7461696e73206d6f7265206974656d73207468616e2065787065637465640016375a601e00260100042a66601266e1d20020011323232325333011301300213232498cc020020008cc01c01c00c54cc0392401334c6973742f5475706c652f436f6e73747220636f6e7461696e73206d6f7265206974656d73207468616e206578706563746564001630110013011002300f0013008002153330093370e9002000899191919299980898098010991924c660100100046600e00e0062a6601c9201334c6973742f5475706c652f436f6e73747220636f6e7461696e73206d6f7265206974656d73207468616e206578706563746564001630110013011002300f0013008002153300a4912b436f6e73747220696e64657820646964206e6f74206d6174636820616e7920747970652076617269616e740016300a37540020046600200290001111199980319b8700100300b233330050053370000890011806800801001118029baa0015734ae7155ceaab9e5573eae855d101",
"hash": "e3d30c1599b2c29686f1053f6596f85116ee65556d1c2bcd4e354fcc",
"definitions": {
"Int": {
"dataType": "integer"
@ -944,8 +981,8 @@ mod tests {
"$ref": "#/definitions/test_module~1LinkedList$Int"
}
},
"compiledCode": "590366010000323232323232323232323222232323232533300c4a22930b19003999191919119299980899b87480000044c8c94ccc05cc0640084c92630070011533014491334c6973742f5475706c652f436f6e73747220636f6e7461696e73206d6f7265206974656d73207468616e20657870656374656400163017001300f002153330113370e9001000899191919299980c980d801099191924c660120024649318078009bac3019002375c602e0022a6602c9201334c6973742f5475706c652f436f6e73747220636f6e7461696e73206d6f7265206974656d73207468616e20657870656374656400163232337606036004603600260360026eb0c064004c064008dd6980b80098078010a9980924812b436f6e73747220696e64657820646964206e6f74206d6174636820616e7920747970652076617269616e740016300f0013001001222533301400214984c8ccc010010c05c00c008c004c054008c00400488c94ccc038cdc3a4000002264646464a66602c603000426493198038038008a99809a481334c6973742f5475706c652f436f6e73747220636f6e7461696e73206d6f7265206974656d73207468616e2065787065637465640016301600130160023370e900118089baa3014001300c0021533300e3370e90010008a99980918060010a4c2a6601e9211d4578706563746564206e6f206669656c647320666f7220436f6e7374720016153300f4912b436f6e73747220696e64657820646964206e6f74206d6174636820616e7920747970652076617269616e740016300c00100632005300100430010012232533300b3370e90000008991919192999809980a80109924c6600e00e0022a66020921334c6973742f5475706c652f436f6e73747220636f6e7461696e73206d6f7265206974656d73207468616e206578706563746564001630130013013002375a602200260120042a66601666e1d20020011533300f3009002149854cc03124011d4578706563746564206e6f206669656c647320666f7220436f6e7374720016153300c4912b436f6e73747220696e64657820646964206e6f74206d6174636820616e7920747970652076617269616e740016300900133001001480008888cccc01ccdc38008018061199980280299b8000448008c0380040080088c018dd5000918021baa0015734ae7155ceaab9e5573eae855d11",
"hash": "b5da17417d29be9264832a3550a73b4fddd4f82a74a488c91d861262",
"compiledCode": "590358010000323232323232323232323222232323232533300c4a22930b180100299919119299980719b87480000044c8c94ccc050c0580084c92630050011533011491334c6973742f5475706c652f436f6e73747220636f6e7461696e73206d6f7265206974656d73207468616e20657870656374656400163014001300c0021533300e3370e9001000899191919299980b180c00109924c6464646600200200444a66603400229309919801801980e801191807000980d8009bac3016002375c60280022a660269201334c6973742f5475706c652f436f6e73747220636f6e7461696e73206d6f7265206974656d73207468616e20657870656374656400163232337606030004603000260300026eb0c058004c058008dd6980a00098060010a99807a4812b436f6e73747220696e64657820646964206e6f74206d6174636820616e7920747970652076617269616e740016300c00130010012232533300d3370e9000000899191919299980a980b80109924c6600e00e0022a660249201334c6973742f5475706c652f436f6e73747220636f6e7461696e73206d6f7265206974656d73207468616e2065787065637465640016301500130150023370e900118081baa3013001300b0021533300d3370e90010008a99980898058010a4c2a6601c9211d4578706563746564206e6f206669656c647320666f7220436f6e7374720016153300e4912b436f6e73747220696e64657820646964206e6f74206d6174636820616e7920747970652076617269616e740016300b00100530010012232533300b3370e90000008991919192999809980a80109924c6600e00e0022a66020921334c6973742f5475706c652f436f6e73747220636f6e7461696e73206d6f7265206974656d73207468616e206578706563746564001630130013013002375a602200260120042a66601666e1d20020011533300f3009002149854cc03124011d4578706563746564206e6f206669656c647320666f7220436f6e7374720016153300c4912b436f6e73747220696e64657820646964206e6f74206d6174636820616e7920747970652076617269616e740016300900133001001480008888cccc01ccdc38008018061199980280299b8000448008c0380040080088c018dd5000918021baa0015734ae7155ceaab9e5573eae855d11",
"hash": "2250642962915ebe2fc08ad9cd0377f2a4b8c281d94f8bae6782fd63",
"definitions": {
"Bool": {
"title": "Bool",


@ -1,4 +1,4 @@
use crate::{package_name::PackageName, paths, Error};
use crate::{github::repo::LatestRelease, package_name::PackageName, paths, Error};
use aiken_lang::ast::Span;
use miette::NamedSource;
use serde::{Deserialize, Serialize};
@ -16,14 +16,14 @@ pub struct Config {
pub dependencies: Vec<Dependency>,
}
#[derive(Deserialize, Serialize, Clone)]
#[derive(Deserialize, Serialize, Clone, Debug)]
pub struct Repository {
pub user: String,
pub project: String,
pub platform: Platform,
}
#[derive(Deserialize, Serialize, PartialEq, Eq, Clone, Copy)]
#[derive(Deserialize, Serialize, PartialEq, Eq, Clone, Copy, Debug)]
#[serde(rename_all = "lowercase")]
pub enum Platform {
Github,
@ -31,7 +31,7 @@ pub enum Platform {
Bitbucket,
}
#[derive(Deserialize, Serialize, PartialEq, Eq, Clone)]
#[derive(Deserialize, Serialize, PartialEq, Eq, Clone, Debug)]
pub struct Dependency {
pub name: PackageName,
pub version: String,
@ -65,7 +65,10 @@ impl Config {
owner: "aiken-lang".to_string(),
repo: "stdlib".to_string(),
},
version: "1.3.0".to_string(),
version: match LatestRelease::of("aiken-lang/stdlib") {
Ok(stdlib) => stdlib.tag_name,
_ => "1.5.0".to_string(),
},
source: Platform::Github,
}],
}

View File

@ -10,7 +10,7 @@ use crate::{
error::Error,
package_name::PackageName,
paths,
telemetry::{Event, EventListener},
telemetry::{DownloadSource, Event, EventListener},
};
use self::{
@ -26,7 +26,7 @@ pub enum UseManifest {
No,
}
#[derive(Deserialize, Serialize)]
#[derive(Deserialize, Serialize, Debug)]
pub struct LocalPackages {
packages: Vec<Dependency>,
}
@ -100,17 +100,16 @@ impl LocalPackages {
pub fn missing_local_packages<'a>(
&self,
manifest: &'a Manifest,
packages: &'a [Package],
root: &PackageName,
) -> Vec<&'a Package> {
manifest
.packages
packages
.iter()
.filter(|p| {
&p.name != root
&& !matches!(
self.packages.iter().find(|p2| p2.name == p.name),
Some(Dependency { version, .. }) if &p.version == version,
Some(Dependency { version, .. }) if paths::is_git_sha_or_tag(version) && &p.version == version,
)
})
.collect()
@ -133,12 +132,7 @@ impl From<&Manifest> for LocalPackages {
}
}
pub fn download<T>(
event_listener: &T,
use_manifest: UseManifest,
root_path: &Path,
config: &Config,
) -> Result<Manifest, Error>
pub fn download<T>(event_listener: &T, root_path: &Path, config: &Config) -> Result<Manifest, Error>
where
T: EventListener,
{
@ -164,20 +158,14 @@ where
let runtime = tokio::runtime::Runtime::new().expect("Unable to start Tokio");
let (manifest, changed) = Manifest::load(
runtime.handle().clone(),
event_listener,
config,
use_manifest,
root_path,
)?;
let (mut manifest, changed) = Manifest::load(event_listener, config, root_path)?;
let local = LocalPackages::load(root_path)?;
local.remove_extra_packages(&manifest, root_path)?;
runtime.block_on(fetch_missing_packages(
&manifest,
&mut manifest,
&local,
project_name,
root_path,
@ -194,7 +182,7 @@ where
}
async fn fetch_missing_packages<T>(
manifest: &Manifest,
manifest: &mut Manifest,
local: &LocalPackages,
project_name: PackageName,
root_path: &Path,
@ -203,30 +191,50 @@ async fn fetch_missing_packages<T>(
where
T: EventListener,
{
let mut count = 0;
let packages = manifest.packages.to_owned();
let mut missing = local
.missing_local_packages(manifest, &project_name)
.missing_local_packages(&packages, &project_name)
.into_iter()
.map(|package| {
count += 1;
package
})
.peekable();
if missing.peek().is_some() {
let start = Instant::now();
event_listener.handle_event(Event::DownloadingPackage {
name: "packages".to_string(),
event_listener.handle_event(Event::ResolvingPackages {
name: format!("{project_name}"),
});
let downloader = Downloader::new(root_path);
downloader.download_packages(missing, &project_name).await?;
let statuses = downloader
.download_packages(event_listener, missing, &project_name, manifest)
.await?;
event_listener.handle_event(Event::PackagesDownloaded { start, count });
let downloaded_from_network = statuses
.iter()
.filter(|(_, downloaded)| *downloaded)
.count();
if downloaded_from_network > 0 {
event_listener.handle_event(Event::PackagesDownloaded {
start,
count: downloaded_from_network,
source: DownloadSource::Network,
});
}
let downloaded_from_cache = statuses
.iter()
.filter(|(_, downloaded)| !downloaded)
.count();
if downloaded_from_cache > 0 {
event_listener.handle_event(Event::PackagesDownloaded {
start,
count: downloaded_from_cache,
source: DownloadSource::Cache,
});
}
}
Ok(())
manifest.save(root_path)
}
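
The boolean in each (PackageName, bool) status marks whether that package actually came over the network, and the two filters above split the statuses so telemetry can report network downloads and cache hits separately. A minimal sketch of that partition, with hypothetical package names:

    // Sketch (not part of the diff): splitting download statuses.
    let stdlib = PackageName { owner: "aiken-lang".into(), repo: "stdlib".into() };
    let fuzz = PackageName { owner: "aiken-lang".into(), repo: "fuzz".into() };
    let statuses: Vec<(PackageName, bool)> = vec![
        (stdlib, true),  // fetched over the network
        (fuzz, false),   // served from the local cache
    ];
    let from_network = statuses.iter().filter(|(_, downloaded)| *downloaded).count();
    let from_cache = statuses.len() - from_network;
    assert_eq!((from_network, from_cache), (1, 1));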

View File

@ -8,9 +8,11 @@ use reqwest::Client;
use zip::result::ZipError;
use crate::{
deps::manifest::Manifest,
error::Error,
package_name::PackageName,
paths::{self, CacheKey},
telemetry::EventListener,
};
use super::manifest::Package;
@ -28,31 +30,41 @@ impl<'a> Downloader<'a> {
}
}
pub async fn download_packages<T>(
pub async fn download_packages<I, T>(
&self,
packages: T,
event_listener: &T,
packages: I,
project_name: &PackageName,
) -> Result<(), Error>
manifest: &mut Manifest,
) -> Result<Vec<(PackageName, bool)>, Error>
where
T: Iterator<Item = &'a Package>,
T: EventListener,
I: Iterator<Item = &'a Package>,
{
let tasks = packages
.filter(|package| project_name != &package.name)
.map(|package| self.ensure_package_in_build_directory(package));
let mut tasks = vec![];
let _results = future::try_join_all(tasks).await?;
for package in packages.filter(|package| project_name != &package.name) {
let cache_key =
paths::CacheKey::new(&self.http, event_listener, package, manifest).await?;
let task = self.ensure_package_in_build_directory(package, cache_key);
tasks.push(task);
}
Ok(())
future::try_join_all(tasks).await
}
pub async fn ensure_package_in_build_directory(
&self,
package: &Package,
) -> Result<bool, Error> {
let cache_key = paths::CacheKey::new(&self.http, package).await?;
self.ensure_package_downloaded(package, &cache_key).await?;
self.extract_package_from_cache(&package.name, &cache_key)
cache_key: CacheKey,
) -> Result<(PackageName, bool), Error> {
let downloaded = self
.ensure_package_downloaded(package, &cache_key)
.await
.map(|downloaded| (package.name.clone(), downloaded))?;
self.extract_package_from_cache(&package.name, &cache_key)
.await?;
Ok(downloaded)
}
pub async fn ensure_package_downloaded(
@ -92,8 +104,6 @@ impl<'a> Downloader<'a> {
let bytes = response.bytes().await?;
// let PackageSource::Github { url } = &package.source;
tokio::fs::write(&zipball_path, bytes).await?;
Ok(true)
@ -103,14 +113,9 @@ impl<'a> Downloader<'a> {
&self,
name: &PackageName,
cache_key: &CacheKey,
) -> Result<bool, Error> {
) -> Result<(), Error> {
let destination = self.root_path.join(paths::build_deps_package(name));
// If the directory already exists then there's nothing for us to do
if destination.is_dir() {
return Ok(false);
}
tokio::fs::create_dir_all(&destination).await?;
let zipball_path = self.root_path.join(paths::package_cache_zipball(cache_key));
@ -135,7 +140,7 @@ impl<'a> Downloader<'a> {
result?;
Ok(true)
Ok(())
}
}

View File

@ -1,8 +1,12 @@
use std::{fs, path::Path};
use aiken_lang::ast::Span;
use miette::NamedSource;
use serde::{Deserialize, Serialize};
use std::{
collections::BTreeMap,
fs,
path::Path,
time::{Duration, SystemTime},
};
use crate::{
config::{Config, Dependency, Platform},
@ -12,20 +16,18 @@ use crate::{
telemetry::{Event, EventListener},
};
use super::UseManifest;
#[derive(Deserialize, Serialize)]
#[derive(Deserialize, Serialize, Debug)]
pub struct Manifest {
pub requirements: Vec<Dependency>,
pub packages: Vec<Package>,
#[serde(default)]
pub etags: BTreeMap<String, (SystemTime, String)>,
}
impl Manifest {
pub fn load<T>(
runtime: tokio::runtime::Handle,
event_listener: &T,
config: &Config,
use_manifest: UseManifest,
root_path: &Path,
) -> Result<(Self, bool), Error>
where
@ -35,15 +37,10 @@ impl Manifest {
// If there's no manifest (or we have been asked not to use it) then resolve
// the versions anew
let should_resolve = match use_manifest {
_ if !manifest_path.exists() => true,
UseManifest::No => true,
UseManifest::Yes => false,
};
let should_resolve = !manifest_path.exists();
if should_resolve {
let manifest = resolve_versions(runtime, config, None, event_listener)?;
let manifest = resolve_versions(config, event_listener)?;
return Ok((manifest, true));
}
@ -61,13 +58,12 @@ impl Manifest {
help: e.to_string(),
})?;
// If the config has unchanged since the manifest was written then it is up
// If the config is unchanged since the manifest was written then it is up
// to date so we can return it unmodified.
if manifest.requirements == config.dependencies {
Ok((manifest, false))
} else {
let manifest = resolve_versions(runtime, config, Some(&manifest), event_listener)?;
let manifest = resolve_versions(config, event_listener)?;
Ok((manifest, true))
}
}
@ -86,9 +82,37 @@ impl Manifest {
Ok(())
}
pub fn lookup_etag(&self, package: &Package) -> Option<String> {
match self.etags.get(&etag_key(package)) {
None => None,
Some((last_fetched, etag)) => {
let elapsed = SystemTime::now().duration_since(*last_fetched).unwrap();
// Discard any etag older than an hour, so that we throttle calls to the package
// registry while still keeping local packages reasonably well synchronized.
if elapsed > Duration::from_secs(3600) {
None
} else {
Some(etag.clone())
}
}
}
}
pub fn insert_etag(&mut self, package: &Package, etag: String) {
self.etags
.insert(etag_key(package), (SystemTime::now(), etag));
}
}
#[derive(Deserialize, Serialize, Clone)]
fn etag_key(package: &Package) -> String {
format!(
"{}/{}@{}",
package.name.owner, package.name.repo, package.version
)
}
#[derive(Deserialize, Serialize, Clone, Debug)]
pub struct Package {
pub name: PackageName,
pub version: String,
@ -96,30 +120,12 @@ pub struct Package {
pub source: Platform,
}
fn resolve_versions<T>(
_runtime: tokio::runtime::Handle,
config: &Config,
_manifest: Option<&Manifest>,
event_listener: &T,
) -> Result<Manifest, Error>
fn resolve_versions<T>(config: &Config, event_listener: &T) -> Result<Manifest, Error>
where
T: EventListener,
{
event_listener.handle_event(Event::ResolvingVersions);
// let resolved = hex::resolve_versions(
// PackageFetcher::boxed(runtime.clone()),
// mode,
// config,
// manifest,
// )?;
// let packages = runtime.block_on(future::try_join_all(
// resolved
// .into_iter()
// .map(|(name, version)| lookup_package(name, version)),
// ))?;
let manifest = Manifest {
packages: config
.dependencies
@ -132,6 +138,7 @@ where
})
.collect(),
requirements: config.dependencies.clone(),
etags: BTreeMap::new(),
};
Ok(manifest)
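
Taken together, lookup_etag and insert_etag give each package a one-hour freshness window: within the hour, the stored ETag short-circuits the HEAD request to GitHub; after it, a fresh one is made. A rough sketch of that behaviour, assuming the Manifest and Package shapes from this diff and some package: Package already in scope:

    // Sketch (not from the diff): the one-hour ETag window.
    let mut manifest = Manifest {
        requirements: vec![],
        packages: vec![],
        etags: BTreeMap::new(),
    };
    manifest.insert_etag(&package, "abc123".to_string());
    // Immediately after insertion, the ETag is still fresh:
    assert_eq!(manifest.lookup_etag(&package), Some("abc123".to_string()));
    // An hour later, lookup_etag returns None and CacheKey::new falls back
    // to a network HEAD request (or, when offline, to the local cache).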

View File

@ -14,7 +14,7 @@ use serde::Serialize;
use serde_json as json;
use std::{
path::{Path, PathBuf},
sync::Arc,
rc::Rc,
time::{Duration, SystemTime},
};
@ -524,7 +524,7 @@ struct DocTypeConstructor {
}
impl DocTypeConstructor {
fn from_record_constructor(constructor: &RecordConstructor<Arc<Type>>) -> Self {
fn from_record_constructor(constructor: &RecordConstructor<Rc<Type>>) -> Self {
DocTypeConstructor {
definition: format::Formatter::new()
.docs_record_constructor(constructor)

View File

@ -107,6 +107,13 @@ pub enum Error {
)]
UnknownPackageVersion { package: Package },
#[error(
"I need to resolve a package {}/{}, but couldn't find it.",
package.name.owner,
package.name.repo,
)]
UnableToResolvePackage { package: Package },
#[error("I couldn't parse the provided stake address.")]
MalformedStakeAddress {
error: Option<pallas::ledger::addresses::Error>,
@ -188,6 +195,7 @@ impl GetSource for Error {
Error::ZipExtract(_) => None,
Error::JoinError(_) => None,
Error::UnknownPackageVersion { .. } => None,
Error::UnableToResolvePackage { .. } => None,
Error::Json { .. } => None,
Error::MalformedStakeAddress { .. } => None,
Error::NoValidatorNotFound { .. } => None,
@ -213,6 +221,7 @@ impl GetSource for Error {
Error::ZipExtract(_) => None,
Error::JoinError(_) => None,
Error::UnknownPackageVersion { .. } => None,
Error::UnableToResolvePackage { .. } => None,
Error::Json { .. } => None,
Error::MalformedStakeAddress { .. } => None,
Error::NoValidatorNotFound { .. } => None,
@ -247,6 +256,7 @@ impl Diagnostic for Error {
Error::ZipExtract(_) => None,
Error::JoinError(_) => None,
Error::UnknownPackageVersion { .. } => Some(Box::new("aiken::packages::resolve")),
Error::UnableToResolvePackage { .. } => Some(Box::new("aiken::package::download")),
Error::Json { .. } => None,
Error::MalformedStakeAddress { .. } => None,
Error::NoValidatorNotFound { .. } => None,
@ -306,6 +316,7 @@ impl Diagnostic for Error {
Error::ZipExtract(_) => None,
Error::JoinError(_) => None,
Error::UnknownPackageVersion{..} => Some(Box::new("Perhaps, double-check the package repository and version?")),
Error::UnableToResolvePackage{..} => Some(Box::new("The network is unavailable and the package isn't in the local cache either. Try connecting to the Internet so I can look it up?")),
Error::Json(error) => Some(Box::new(format!("{error}"))),
Error::MalformedStakeAddress { error } => Some(Box::new(format!("A stake address must be provided either as a base16-encoded string, or as a bech32-encoded string with the 'stake' or 'stake_test' prefix.{hint}", hint = match error {
Some(error) => format!("\n\nHere's the error I encountered: {error}"),
@ -366,6 +377,7 @@ impl Diagnostic for Error {
Error::ZipExtract(_) => None,
Error::JoinError(_) => None,
Error::UnknownPackageVersion { .. } => None,
Error::UnableToResolvePackage { .. } => None,
Error::Json { .. } => None,
Error::MalformedStakeAddress { .. } => None,
Error::NoValidatorNotFound { .. } => None,
@ -391,6 +403,7 @@ impl Diagnostic for Error {
Error::ZipExtract(_) => None,
Error::JoinError(_) => None,
Error::UnknownPackageVersion { .. } => None,
Error::UnableToResolvePackage { .. } => None,
Error::Json { .. } => None,
Error::MalformedStakeAddress { .. } => None,
Error::NoValidatorNotFound { .. } => None,
@ -416,6 +429,7 @@ impl Diagnostic for Error {
Error::ZipExtract { .. } => None,
Error::JoinError { .. } => None,
Error::UnknownPackageVersion { .. } => None,
Error::UnableToResolvePackage { .. } => None,
Error::Json { .. } => None,
Error::MalformedStakeAddress { .. } => None,
Error::NoValidatorNotFound { .. } => None,
@ -441,6 +455,7 @@ impl Diagnostic for Error {
Error::ZipExtract { .. } => None,
Error::JoinError { .. } => None,
Error::UnknownPackageVersion { .. } => None,
Error::UnableToResolvePackage { .. } => None,
Error::Json { .. } => None,
Error::MalformedStakeAddress { .. } => None,
Error::NoValidatorNotFound { .. } => None,

View File

@ -0,0 +1 @@
pub mod repo;

View File

@ -0,0 +1,22 @@
use reqwest::{blocking::Client, header::USER_AGENT, Error};
use serde::Deserialize;
#[derive(Deserialize)]
pub struct LatestRelease {
pub tag_name: String,
}
impl LatestRelease {
pub fn of<Repo: AsRef<str>>(repo: Repo) -> Result<Self, Error> {
Ok({
Client::new()
.get(format!(
"https://api.github.com/repos/{}/releases/latest",
repo.as_ref()
))
.header(USER_AGENT, "aiken")
.send()?
.json::<Self>()?
})
}
}
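
This mirrors how Config::default, earlier in this diff, resolves the stdlib dependency: ask the GitHub releases API for the latest tag and fall back to a pinned version when the network is unavailable. Roughly:

    // Sketch: resolving the newest stdlib tag, with an offline fallback.
    let version = match LatestRelease::of("aiken-lang/stdlib") {
        Ok(release) => release.tag_name,
        Err(_) => "1.5.0".to_string(), // pinned fallback, as in Config::default
    };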

View File

@ -4,6 +4,7 @@ pub mod deps;
pub mod docs;
pub mod error;
pub mod format;
pub mod github;
pub mod module;
pub mod options;
pub mod package_name;
@ -14,7 +15,11 @@ pub mod telemetry;
#[cfg(test)]
mod tests;
use crate::blueprint::Blueprint;
use crate::blueprint::{
definitions::Definitions,
schema::{Annotated, Schema},
Blueprint,
};
use aiken_lang::{
ast::{Definition, Function, ModuleKind, Tracing, TypedDataType, TypedFunction, Validator},
builtins,
@ -22,7 +27,6 @@ use aiken_lang::{
tipo::TypeInfo,
IdGenerator,
};
use deps::UseManifest;
use indexmap::IndexMap;
use miette::NamedSource;
use options::{CodeGenMode, Options};
@ -43,6 +47,7 @@ use telemetry::EventListener;
use uplc::{
ast::{DeBruijn, Name, Program, Term},
machine::cost_model::ExBudget,
PlutusData,
};
use crate::{
@ -415,6 +420,36 @@ where
})
}
pub fn construct_parameter_incrementally<F>(
&self,
title: Option<&String>,
ask: F,
) -> Result<Term<DeBruijn>, Error>
where
F: Fn(
&Annotated<Schema>,
&Definitions<Annotated<Schema>>,
) -> Result<PlutusData, blueprint::error::Error>,
{
// Read blueprint
let blueprint = File::open(self.blueprint_path())
.map_err(|_| blueprint::error::Error::InvalidOrMissingFile)?;
let blueprint: Blueprint = serde_json::from_reader(BufReader::new(blueprint))?;
// Construct parameter
let when_too_many =
|known_validators| Error::MoreThanOneValidatorFound { known_validators };
let when_missing = |known_validators| Error::NoValidatorNotFound { known_validators };
let term = blueprint.with_validator(title, when_too_many, when_missing, |validator| {
validator
.ask_next_parameter(&blueprint.definitions, &ask)
.map_err(|e| e.into())
})?;
Ok(term)
}
pub fn apply_parameter(
&self,
title: Option<&String>,
@ -455,12 +490,7 @@ where
}
fn compile_deps(&mut self) -> Result<(), Vec<Error>> {
let manifest = deps::download(
&self.event_listener,
UseManifest::Yes,
&self.root,
&self.config,
)?;
let manifest = deps::download(&self.event_listener, &self.root, &self.config)?;
for package in manifest.packages {
let lib = self.root.join(paths::build_deps_package(&package.name));
@ -880,7 +910,7 @@ where
.to_string();
// normalise windows paths
name.replace('\\', "/")
name.replace('\\', "/").replace('-', "_")
}
}
@ -889,7 +919,7 @@ fn is_aiken_path(path: &Path, dir: impl AsRef<Path>) -> bool {
let re = Regex::new(&format!(
"^({module}{slash})*{module}\\.ak$",
module = "[a-z][_a-z0-9]*",
module = "[a-z][-_a-z0-9]*",
slash = "(/|\\\\)",
))
.expect("is_aiken_path() RE regex");

View File

@ -317,7 +317,6 @@ impl CheckedModules {
FunctionAccessKey {
module_name: module.name.clone(),
function_name: func.name.clone(),
variant_name: String::new(),
},
func,
);

View File

@ -6,7 +6,7 @@ use std::{
};
use thiserror::Error;
#[derive(PartialEq, Eq, Hash, Clone)]
#[derive(PartialEq, Eq, Hash, Clone, Debug)]
pub struct PackageName {
pub owner: String,
pub repo: String,

View File

@ -1,7 +1,13 @@
use crate::deps::manifest::Package;
use crate::{error::Error, package_name::PackageName};
use crate::{
deps::manifest::Manifest,
error::Error,
package_name::PackageName,
telemetry::{Event, EventListener},
};
use regex::Regex;
use reqwest::Client;
use std::path::PathBuf;
use std::{fs, path::PathBuf};
pub fn project_config() -> PathBuf {
PathBuf::from("aiken.toml")
@ -28,7 +34,7 @@ pub fn build_deps_package(package_name: &PackageName) -> PathBuf {
}
pub fn package_cache_zipball(cache_key: &CacheKey) -> PathBuf {
packages_cache().join(cache_key.get_key())
packages_cache().join(format!("{}.zip", cache_key.get_key()))
}
pub fn packages_cache() -> PathBuf {
@ -47,40 +53,148 @@ pub struct CacheKey {
}
impl CacheKey {
pub async fn new(http: &Client, package: &Package) -> Result<CacheKey, Error> {
let version = match hex::decode(&package.version) {
Ok(..) => Ok(package.version.to_string()),
Err(..) => {
let url = format!(
"https://api.github.com/repos/{}/{}/zipball/{}",
package.name.owner, package.name.repo, package.version
);
let response = http
.head(url)
.header("User-Agent", "aiken-lang")
.send()
.await?;
let etag = response
.headers()
.get("etag")
.ok_or(Error::UnknownPackageVersion {
package: package.clone(),
})?
.to_str()
.unwrap()
.replace('"', "");
Ok(format!("main@{etag}"))
}
};
version.map(|version| CacheKey {
key: format!(
"{}-{}-{}.zip",
package.name.owner, package.name.repo, version
),
})
pub async fn new<T>(
http: &Client,
event_listener: &T,
package: &Package,
manifest: &mut Manifest,
) -> Result<CacheKey, Error>
where
T: EventListener,
{
Ok(CacheKey::from_package(
package,
if is_git_sha_or_tag(&package.version) {
Ok(package.version.to_string())
} else {
match manifest.lookup_etag(package) {
None => match new_etag_from_network(http, package).await {
Err(_) => {
event_listener.handle_event(Event::PackageResolveFallback {
name: format!("{}", package.name),
});
new_cache_key_from_cache(package)
}
Ok(etag) => {
manifest.insert_etag(package, etag.clone());
Ok(format!(
"{version}@{etag}",
version = package.version.replace('/', "_")
))
}
},
Some(etag) => Ok(format!(
"{version}@{etag}",
version = package.version.replace('/', "_")
)),
}
}?,
))
}
fn from_package(package: &Package, version: String) -> CacheKey {
CacheKey {
key: format!("{}-{}-{}", package.name.owner, package.name.repo, version),
}
}
pub fn get_key(&self) -> &str {
self.key.as_ref()
}
}
async fn new_etag_from_network(http: &Client, package: &Package) -> Result<String, Error> {
let url = format!(
"https://api.github.com/repos/{}/{}/zipball/{}",
package.name.owner, package.name.repo, package.version
);
let response = http
.head(url)
.header("User-Agent", "aiken-lang")
.send()
.await?;
let etag = response
.headers()
.get("etag")
.ok_or(Error::UnknownPackageVersion {
package: package.clone(),
})?;
Ok(etag.to_str().unwrap().replace('"', ""))
}
fn new_cache_key_from_cache(target: &Package) -> Result<String, Error> {
let packages = fs::read_dir(packages_cache())?;
let prefix = CacheKey::from_package(target, target.version.replace('/', "_"))
.get_key()
.to_string();
let mut most_recently_modified_date = None;
let mut most_recently_modified = None;
for pkg in packages {
let entry = pkg.unwrap();
let filename = entry
.file_name()
.into_string()
.expect("cache filename are valid utf8 strings");
if filename.starts_with(&prefix) {
let last_modified = entry.metadata()?.modified()?;
if Some(last_modified) > most_recently_modified_date {
most_recently_modified_date = Some(last_modified);
most_recently_modified = Some(filename);
}
}
}
match most_recently_modified {
None => Err(Error::UnableToResolvePackage {
package: target.clone(),
}),
Some(pkg) => Ok(format!(
"{version}{etag}",
version = target.version,
etag = pkg
.strip_prefix(&prefix)
.expect("cache filename starts with a valid version prefix")
.strip_suffix(".zip")
.expect("cache files are all zip archives")
)),
}
}
// Best-effort to assert whether a version refers to a git sha digest or a tag. When it does, we
// avoid re-downloading it if it's already fetched. But when it doesn't, and thus refers to a
// branch, we always re-download it. Note however that the download might be short-circuited by
// the system-wide package cache, so a download doesn't actually mean a network request.
//
// The package cache is however smart enough to assert whether a package in the cache must be
// re-downloaded (using HTTP ETags). So this is mostly about delegating the re-downloading logic
// to the global packages cache.
pub fn is_git_sha_or_tag(version: &str) -> bool {
let r_sha = Regex::new("^[0-9a-f]{7,10}$|^[0-9a-f]{40}$").unwrap();
let r_version = Regex::new("^v?[0-9]+\\.[0-9]+(\\.[0-9]+)?([-+].+)?$").unwrap();
r_sha.is_match(version) || r_version.is_match(version)
}
#[cfg(test)]
mod tests {
use super::*;
#[test]
fn test_is_git_sha_or_tag() {
assert!(
is_git_sha_or_tag("8ba5946c32a7dc99ae199e0e7b9948f9f361aaee"),
"sha full"
);
assert!(is_git_sha_or_tag("8ba5946"), "sha short");
assert!(is_git_sha_or_tag("1.1.0"), "semver");
assert!(is_git_sha_or_tag("1.1.0-rc1"), "semver rc");
assert!(is_git_sha_or_tag("1.1.0+foo"), "semver patch");
assert!(is_git_sha_or_tag("v1.6"), "major/minor + prefix");
assert!(!is_git_sha_or_tag("release/2.0.0"), "release branch");
assert!(!is_git_sha_or_tag("main"), "main branch");
assert!(!is_git_sha_or_tag("8ba594659468ba"), "not sha");
}
}
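
In practice the keys come in two shapes: versions recognised by is_git_sha_or_tag are content-addressed by the version alone, while branch names get the zipball's current ETag appended. Illustratively (the ETag value here is made up):

    // Pinned tag or sha: the version alone identifies the archive.
    assert_eq!(
        format!("{}-{}-{}", "aiken-lang", "stdlib", "1.5.0"),
        "aiken-lang-stdlib-1.5.0"
    );
    // Branch: the version is disambiguated with the current HTTP ETag.
    assert_eq!(
        format!("{}-{}-{}@{}", "aiken-lang", "stdlib", "main", "d41d8cd9"),
        "aiken-lang-stdlib-main@d41d8cd9"
    );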

View File

@ -1,3 +1,5 @@
use std::cmp;
pub fn ansi_len(s: &str) -> usize {
String::from_utf8(strip_ansi_escapes::strip(s).unwrap())
.unwrap()
@ -120,3 +122,22 @@ pub fn style_if(styled: bool, s: String, apply_style: fn(String) -> String) -> S
s
}
}
pub fn multiline(max_len: usize, s: String) -> Vec<String> {
let mut xs = Vec::new();
let mut i = 0;
let len = s.len();
loop {
let lo = i * max_len;
if lo >= len {
break;
}
// Clamp the end to the string length; the slice below is end-exclusive.
let hi = cmp::min(len, lo + max_len);
let chunk = &s[lo..hi];
xs.push(chunk.to_string());
i += 1;
}
xs
}
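
The apply command further down uses this helper to wrap long CBOR hex dumps, joining the chunks with an indented newline. For example:

    // Sketch: 100 hex characters wrapped at 48 columns -> chunks of 48, 48, 4.
    let chunks = multiline(48, "a".repeat(100));
    assert_eq!(
        chunks.iter().map(String::len).collect::<Vec<_>>(),
        vec![48, 48, 4]
    );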

View File

@ -1,5 +1,5 @@
use crate::script::EvalInfo;
use std::path::PathBuf;
use std::{fmt::Display, path::PathBuf};
pub trait EventListener {
fn handle_event(&self, _event: Event) {}
@ -37,12 +37,30 @@ pub enum Event {
tests: Vec<EvalInfo>,
},
WaitingForBuildDirLock,
DownloadingPackage {
ResolvingPackages {
name: String,
},
PackageResolveFallback {
name: String,
},
PackagesDownloaded {
start: tokio::time::Instant,
count: usize,
source: DownloadSource,
},
ResolvingVersions,
}
pub enum DownloadSource {
Network,
Cache,
}
impl Display for DownloadSource {
fn fmt(&self, f: &mut std::fmt::Formatter<'_>) -> std::fmt::Result {
match self {
DownloadSource::Network => write!(f, "network"),
DownloadSource::Cache => write!(f, "cache"),
}
}
}
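
The Display impl feeds straight into the "Downloaded/Fetched N packages from ..." telemetry line rendered by the terminal event handler later in this diff:

    // Sketch: how DownloadSource renders in telemetry messages.
    assert_eq!(format!("from {}", DownloadSource::Network), "from network");
    assert_eq!(format!("from {}", DownloadSource::Cache), "from cache");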

File diff suppressed because it is too large

View File

@ -81,7 +81,7 @@ impl TestProject {
module.kind,
&self.package.to_string(),
&self.module_types,
Tracing::NoTraces,
Tracing::KeepTraces,
&mut warnings,
)
.expect("Failed to type-check module");

View File

@ -1,18 +1,27 @@
[package]
name = "aiken"
description = "Cardano smart contract language and toolchain"
version = "1.0.13-alpha"
version = "1.0.17-alpha"
edition = "2021"
repository = "https://github.com/aiken-lang/aiken"
homepage = "https://github.com/aiken-lang/aiken"
license = "Apache-2.0"
authors = ["Lucas Rosa <x@rvcas.dev>", "Kasey White <kwhitemsg@gmail.com>", "KtorZ <matthias.benkort@gmail.com>"]
authors = [
"Lucas Rosa <x@rvcas.dev>",
"Kasey White <kwhitemsg@gmail.com>",
"KtorZ <matthias.benkort@gmail.com>",
]
rust-version = "1.66.1"
build = "build.rs"
[dependencies]
anyhow = "1.0.69"
clap = { version = "4.1.8", features = ["derive", "wrap_help", "unicode", "string"] }
clap = { version = "4.1.8", features = [
"derive",
"wrap_help",
"unicode",
"string",
] }
hex = "0.4.3"
ignore = "0.4.20"
indoc = "2.0"
@ -27,11 +36,14 @@ regex = "1.7.1"
serde_json = "1.0.94"
thiserror = "1.0.39"
aiken-lang = { path = "../aiken-lang", version = "1.0.13-alpha" }
aiken-lsp = { path = "../aiken-lsp", version = "1.0.13-alpha" }
aiken-project = { path = '../aiken-project', version = "1.0.13-alpha" }
uplc = { path = '../uplc', version = "1.0.13-alpha" }
aiken-lang = { path = "../aiken-lang", version = "1.0.17-alpha" }
aiken-lsp = { path = "../aiken-lsp", version = "1.0.17-alpha" }
aiken-project = { path = '../aiken-project', version = "1.0.17-alpha" }
uplc = { path = '../uplc', version = "1.0.17-alpha" }
clap_complete = "4.3.2"
inquire = "0.6.2"
num-bigint = "0.4.3"
ordinal = "0.3.2"
[build-dependencies]
built = { version = "0.6.0", features = ["git2"] }

View File

@ -34,7 +34,7 @@ pub fn exec(
rebuild,
}: Args,
) -> miette::Result<()> {
with_project(directory, |p| {
with_project(directory, false, |p| {
if rebuild {
p.build(false, Tracing::NoTraces)?;
}

View File

@ -1,8 +1,21 @@
use crate::with_project;
use aiken_project::{blueprint, error::Error};
use aiken_project::{
blueprint::{
self,
definitions::Definitions,
schema::{Annotated, Constructor, Data, Declaration, Items, Schema},
},
error::Error,
pretty::multiline,
};
use inquire;
use num_bigint::BigInt;
use ordinal::Ordinal;
use owo_colors::{OwoColorize, Stream::Stderr};
use pallas_primitives::alonzo::PlutusData;
use std::str::FromStr;
use std::{fs, path::PathBuf, process, rc::Rc};
use uplc::ast::{Constant, DeBruijn, Term};
use uplc::ast::{Constant, Data as UplcData, DeBruijn, Term};
/// Apply a parameter to a parameterized validator.
#[derive(clap::Args)]
@ -12,10 +25,7 @@ pub struct Args {
/// For example, `182A` designates an integer of value 42. If you're unsure about the shape of
/// the parameter, look at the schema specified in the project's blueprint (i.e.
/// `plutus.json`), or use the `cbor.serialise` function from the Aiken standard library.
parameter: String,
/// Path to project
directory: Option<PathBuf>,
parameter: Option<String>,
/// Output file. Optional, print on stdout when omitted.
#[clap(short, long)]
@ -33,55 +43,12 @@ pub struct Args {
pub fn exec(
Args {
parameter,
directory,
out,
module,
validator,
}: Args,
) -> miette::Result<()> {
eprintln!(
"{} inputs",
" Parsing"
.if_supports_color(Stderr, |s| s.purple())
.if_supports_color(Stderr, |s| s.bold()),
);
let bytes = hex::decode(parameter)
.map_err::<Error, _>(|e| {
blueprint::error::Error::MalformedParameter {
hint: format!("Invalid hex-encoded string: {e}"),
}
.into()
})
.unwrap_or_else(|e| {
println!();
e.report();
process::exit(1)
});
let data = uplc::plutus_data(&bytes)
.map_err::<Error, _>(|e| {
blueprint::error::Error::MalformedParameter {
hint: format!("Invalid Plutus data; malformed CBOR encoding: {e}"),
}
.into()
})
.unwrap_or_else(|e| {
println!();
e.report();
process::exit(1)
});
let term: Term<DeBruijn> = Term::Constant(Rc::new(Constant::Data(data)));
eprintln!(
"{} blueprint",
" Analyzing"
.if_supports_color(Stderr, |s| s.purple())
.if_supports_color(Stderr, |s| s.bold()),
);
with_project(directory, |p| {
with_project(None, false, |p| {
let title = module.as_ref().map(|m| {
format!(
"{m}{}",
@ -95,10 +62,65 @@ pub fn exec(
let title = title.as_ref().or(validator.as_ref());
eprintln!(
"{} parameter",
"{} blueprint",
" Analyzing"
.if_supports_color(Stderr, |s| s.purple())
.if_supports_color(Stderr, |s| s.bold()),
);
let term: Term<DeBruijn> = match &parameter {
Some(param) => {
eprintln!(
"{} inputs",
" Parsing"
.if_supports_color(Stderr, |s| s.purple())
.if_supports_color(Stderr, |s| s.bold()),
);
let bytes = hex::decode(param)
.map_err::<Error, _>(|e| {
blueprint::error::Error::MalformedParameter {
hint: format!("Invalid hex-encoded string: {e}"),
}
.into()
})
.unwrap_or_else(|e| {
println!();
e.report();
process::exit(1)
});
let data = uplc::plutus_data(&bytes)
.map_err::<Error, _>(|e| {
blueprint::error::Error::MalformedParameter {
hint: format!("Invalid Plutus data; malformed CBOR encoding: {e}"),
}
.into()
})
.unwrap_or_else(|e| {
println!();
e.report();
process::exit(1)
});
Term::Constant(Rc::new(Constant::Data(data)))
}
None => p.construct_parameter_incrementally(title, ask_schema)?,
};
eprintln!(
"{} {}",
" Applying"
.if_supports_color(Stderr, |s| s.purple())
.if_supports_color(Stderr, |s| s.bold()),
match TryInto::<PlutusData>::try_into(term.clone()) {
Ok(data) => {
let padding = "\n ";
multiline(48, UplcData::to_hex(data)).join(padding)
}
Err(_) => term.to_pretty(),
}
);
let blueprint = p.apply_parameter(title, &term)?;
@ -126,3 +148,262 @@ pub fn exec(
Ok(())
})
}
fn ask_schema(
schema: &Annotated<Schema>,
definitions: &Definitions<Annotated<Schema>>,
) -> Result<PlutusData, blueprint::error::Error> {
match schema.annotated {
Schema::Data(Data::Integer) => {
let input = prompt_primitive("an integer", schema)?;
let n = BigInt::from_str(input.as_str()).map_err(|e| {
blueprint::error::Error::MalformedParameter {
hint: format!("Unable to convert input to integer: {e}"),
}
})?;
Ok(UplcData::integer(n))
}
Schema::Data(Data::Bytes) => {
let input = prompt_primitive("a byte-array", schema)?;
let bytes =
hex::decode(input).map_err(|e| blueprint::error::Error::MalformedParameter {
hint: format!("Invalid hex-encoded string: {e}"),
})?;
Ok(UplcData::bytestring(bytes))
}
Schema::Data(Data::List(Items::Many(ref decls))) => {
eprintln!(" {}", asking(schema, "Found", &format!("a {}-tuple", decls.len())));
let mut elems = vec![];
for (ix, decl) in decls.iter().enumerate() {
eprintln!(
" {} Tuple's {}{} element",
"Asking".if_supports_color(Stderr, |s| s.purple()).if_supports_color(Stderr, |s| s.bold()),
ix+1,
Ordinal::<usize>(ix+1).suffix()
);
let inner_schema = lookup_declaration(&decl.clone().into(), definitions);
elems.push(ask_schema(&inner_schema, definitions)?);
}
Ok(UplcData::list(elems))
}
Schema::Data(Data::List(Items::One(ref decl))) => {
eprintln!(" {}", asking(schema, "Found", "a list"));
let inner_schema = lookup_declaration(&decl.clone().into(), definitions);
let mut elems = vec![];
while prompt_iterable(schema, "item")? {
elems.push(ask_schema(&inner_schema, definitions)?);
}
Ok(UplcData::list(elems))
}
Schema::Data(Data::Map(ref key_decl, ref value_decl)) => {
eprintln!(" {}", asking(schema, "Found", "an associative map"));
let key_schema = lookup_declaration(&key_decl.clone().into(), definitions);
let value_schema = lookup_declaration(&value_decl.clone().into(), definitions);
let mut elems = vec![];
while prompt_iterable(schema, "key/value entry")? {
elems.push((
ask_schema(&key_schema, definitions)?,
ask_schema(&value_schema, definitions)?,
));
}
Ok(UplcData::map(elems))
}
Schema::Data(Data::AnyOf(ref constructors)) => {
eprintln!(
" {}",
asking(
schema,
"Found",
if constructors.len() == 1 {
"a record"
} else {
"a data-type"
}
)
);
let ix = prompt_constructor(constructors, schema)?;
let mut fields = Vec::new();
for field in &constructors[ix].annotated.fields {
let inner_schema = lookup_declaration(field, definitions);
fields.push(ask_schema(&inner_schema, definitions)?);
}
Ok(UplcData::constr(ix.try_into().unwrap(), fields))
}
_ => unimplemented!("Hey! You've found a case that we haven't implemented yet. Yes, we've been a bit lazy on that one... If that use-case is important to you, please let us know on Discord or on Github."),
}
}
fn lookup_declaration(
decl: &Annotated<Declaration<Data>>,
definitions: &Definitions<Annotated<Schema>>,
) -> Annotated<Schema> {
match decl.annotated {
Declaration::Inline(ref data) => Annotated {
title: decl.title.clone(),
description: decl.description.clone(),
annotated: Schema::Data(*(*data).clone()),
},
Declaration::Referenced(ref reference) => {
let schema = definitions
.lookup(reference)
.expect("reference to unknown type in blueprint?");
Annotated {
title: decl.title.clone().or_else(|| schema.title.clone()),
description: decl
.description
.clone()
.or_else(|| schema.description.clone()),
annotated: schema.annotated.clone(),
}
}
}
}
fn asking(schema: &Annotated<Schema>, verb: &str, type_name: &str) -> String {
let subject = get_subject(schema, type_name);
format!(
"{} {subject}",
verb.if_supports_color(Stderr, |s| s.purple())
.if_supports_color(Stderr, |s| s.bold()),
)
}
fn prompt_primitive(
type_name: &str,
schema: &Annotated<Schema>,
) -> Result<String, blueprint::error::Error> {
inquire::Text::new(&format!(" {}:", asking(schema, "Asking", type_name)))
.with_description(schema.description.as_ref())
.prompt()
.map_err(|e| blueprint::error::Error::MalformedParameter {
hint: format!("Invalid input received from prompt: {e}"),
})
}
fn prompt_iterable(
schema: &Annotated<Schema>,
elem_name: &str,
) -> Result<bool, blueprint::error::Error> {
inquire::Confirm::new(&format!(
" {} one more {elem_name}?",
"Adding"
.if_supports_color(Stderr, |s| s.purple())
.if_supports_color(Stderr, |s| s.bold())
))
.with_description(schema.description.as_ref())
.with_default(true)
.prompt()
.map_err(|e| blueprint::error::Error::MalformedParameter {
hint: format!("Invalid input received from prompt: {e}"),
})
}
fn prompt_constructor(
constructors: &[Annotated<Constructor>],
schema: &Annotated<Schema>,
) -> Result<usize, blueprint::error::Error> {
let mut choices = Vec::new();
for c in constructors {
let name = c
.title
.as_ref()
.cloned()
.unwrap_or_else(|| format!("{}", c.annotated.index));
choices.push(name);
}
let mut choice = choices
.first()
.expect("Data-type with no constructor?")
.to_string();
if choices.len() > 1 {
choice = inquire::Select::new(
&format!(
" {} constructor",
"Selecting"
.if_supports_color(Stderr, |s| s.purple())
.if_supports_color(Stderr, |s| s.bold())
),
choices.clone(),
)
.with_description(schema.description.as_ref())
.prompt()
.map_err(|e| blueprint::error::Error::MalformedParameter {
hint: format!("Invalid input received from prompt: {e}"),
})?;
}
Ok(choices.into_iter().position(|c| c == choice).unwrap())
}
fn get_subject<T>(schema: &Annotated<T>, type_name: &str) -> String {
schema
.title
.as_ref()
.map(|title| format!("{title} ({type_name})"))
.unwrap_or_else(|| type_name.to_string())
}
trait WithDescription<'a> {
fn with_description(self, opt: Option<&'a String>) -> Self;
}
impl<'a> WithDescription<'a> for inquire::Confirm<'a> {
fn with_description(
self: inquire::Confirm<'a>,
opt: Option<&'a String>,
) -> inquire::Confirm<'a> {
match opt {
Some(description) => self.with_help_message(description),
None => self,
}
}
}
impl<'a> WithDescription<'a> for inquire::Text<'a> {
fn with_description(self: inquire::Text<'a>, opt: Option<&'a String>) -> inquire::Text<'a> {
match opt {
Some(description) => self.with_help_message(description),
None => self,
}
}
}
impl<'a, T> WithDescription<'a> for inquire::Select<'a, T>
where
T: std::fmt::Display,
{
fn with_description(
self: inquire::Select<'a, T>,
opt: Option<&'a String>,
) -> inquire::Select<'a, T> {
match opt {
Some(description) => self.with_help_message(description),
None => self,
}
}
}

View File

@ -29,7 +29,7 @@ pub fn exec(
rebuild,
}: Args,
) -> miette::Result<()> {
with_project(directory, |p| {
with_project(directory, false, |p| {
if rebuild {
p.build(false, Tracing::NoTraces)?;
}

View File

@ -29,7 +29,7 @@ pub fn exec(
rebuild,
}: Args,
) -> miette::Result<()> {
with_project(directory, |p| {
with_project(directory, false, |p| {
if rebuild {
p.build(false, Tracing::NoTraces)?;
}

View File

@ -6,6 +6,10 @@ pub struct Args {
/// Path to project
directory: Option<PathBuf>,
/// Deny warnings; warnings will be treated as errors
#[clap(short = 'D', long)]
deny: bool,
/// Also dump textual uplc
#[clap(short, long)]
uplc: bool,
@ -18,9 +22,10 @@ pub struct Args {
pub fn exec(
Args {
directory,
deny,
uplc,
keep_traces,
}: Args,
) -> miette::Result<()> {
crate::with_project(directory, |p| p.build(uplc, keep_traces.into()))
crate::with_project(directory, deny, |p| p.build(uplc, keep_traces.into()))
}

View File

@ -6,6 +6,10 @@ pub struct Args {
/// Path to project
directory: Option<PathBuf>,
/// Deny warnings; warnings will be treated as errors
#[clap(short = 'D', long)]
deny: bool,
/// Skip tests; run only the type-checker
#[clap(short, long)]
skip_tests: bool,
@ -33,6 +37,7 @@ pub struct Args {
pub fn exec(
Args {
directory,
deny,
skip_tests,
debug,
match_tests,
@ -40,7 +45,7 @@ pub fn exec(
no_traces,
}: Args,
) -> miette::Result<()> {
crate::with_project(directory, |p| {
crate::with_project(directory, deny, |p| {
p.check(
skip_tests,
match_tests.clone(),

View File

@ -6,6 +6,10 @@ pub struct Args {
/// Path to project
directory: Option<PathBuf>,
/// Deny warnings; warnings will be treated as errors
#[clap(short = 'D', long)]
deny: bool,
/// Output directory for the documentation
#[clap(short = 'o', long)]
destination: Option<PathBuf>,
@ -14,8 +18,9 @@ pub struct Args {
pub fn exec(
Args {
directory,
deny,
destination,
}: Args,
) -> miette::Result<()> {
crate::with_project(directory, |p| p.docs(destination.clone()))
crate::with_project(directory, deny, |p| p.docs(destination.clone()))
}

View File

@ -21,9 +21,6 @@ pub struct Args {
/// Library only
#[clap(long, short)]
lib: bool,
/// Empty project
#[clap(long, short)]
empty: bool,
}
pub fn exec(args: Args) -> miette::Result<()> {
@ -42,10 +39,10 @@ fn create_project(args: Args, package_name: &PackageName) -> miette::Result<()>
})?;
}
create_lib(&root, package_name, args.empty)?;
create_lib(&root, package_name)?;
if !args.lib {
create_validators(&root, package_name, args.empty)?;
create_validators(&root)?;
}
readme(&root, &package_name.repo)?;
@ -99,60 +96,14 @@ fn print_success_message(package_name: &PackageName) {
)
}
fn create_lib(root: &Path, package_name: &PackageName, empty: bool) -> miette::Result<()> {
fn create_lib(root: &Path, package_name: &PackageName) -> miette::Result<()> {
let lib = root.join("lib").join(&package_name.repo);
fs::create_dir_all(&lib).into_diagnostic()?;
if empty {
return Ok(());
}
fs::write(
lib.join("types.ak"),
formatdoc! {
r#"
/// Custom type
pub type Datum {{
/// A utf8 encoded message
message: ByteArray,
}}
test sum() {{
1 + 1 == 2
}}
"#,
},
)
.into_diagnostic()
fs::create_dir_all(lib).into_diagnostic()
}
fn create_validators(root: &Path, package_name: &PackageName, empty: bool) -> miette::Result<()> {
fn create_validators(root: &Path) -> miette::Result<()> {
let validators = root.join("validators");
fs::create_dir_all(&validators).into_diagnostic()?;
if empty {
return Ok(());
}
fs::write(
validators.join("hello.ak"),
formatdoc! {
r#"
use aiken/transaction.{{ScriptContext}}
use {name}/types
validator {{
fn spend(datum: types.Datum, _redeemer: Data, _ctx: ScriptContext) -> Bool {{
datum.message == "Hello World!"
}}
}}
"#,
name = package_name.repo
},
)
.into_diagnostic()
fs::create_dir_all(validators).into_diagnostic()
}
fn readme(root: &Path, project_name: &str) -> miette::Result<()> {
@ -248,7 +199,7 @@ fn create_github_action(root: &Path) -> miette::Result<()> {
version: v{version}
- run: aiken fmt --check
- run: aiken check
- run: aiken check -D
- run: aiken build
"#,
version = built_info::PKG_VERSION,

View File

@ -133,13 +133,14 @@ pub fn exec(
match result {
Ok(redeemers) => {
let total_budget_used =
redeemers
.iter()
.fold(ExBudget { mem: 0, cpu: 0 }, |accum, curr| ExBudget {
mem: accum.mem + curr.ex_units.mem as i64,
cpu: accum.cpu + curr.ex_units.steps as i64,
});
// this should allow N scripts to be
let total_budget_used: Vec<ExBudget> = redeemers
.iter()
.map(|curr| ExBudget {
mem: curr.ex_units.mem as i64,
cpu: curr.ex_units.steps as i64,
})
.collect();
eprintln!("\n");
println!(

View File

@ -34,7 +34,7 @@ pub fn exec(
let bytes = if hex {
let hex_bytes = std::fs::read_to_string(&input).into_diagnostic()?;
hex::decode(hex_bytes).into_diagnostic()?
hex::decode(hex_bytes.trim()).into_diagnostic()?
} else {
std::fs::read(&input).into_diagnostic()?
};
@ -58,6 +58,8 @@ pub fn exec(
Program::from_flat(&bytes).into_diagnostic()?
};
let program: Program<Name> = program.try_into().unwrap();
program.to_pretty()
}
Format::Debruijn => {
@ -68,6 +70,8 @@ pub fn exec(
Program::from_flat(&bytes).into_diagnostic()?
};
let program: Program<Name> = program.try_into().unwrap();
program.to_pretty()
}
};

View File

@ -18,7 +18,7 @@ pub struct Args {
#[clap(short, long)]
cbor: bool,
/// Arguments to pass to the uplc program
/// Arguments to pass to the UPLC program
args: Vec<String>,
}
@ -33,7 +33,7 @@ pub fn exec(
let mut program = if cbor {
let cbor_hex = std::fs::read_to_string(&script).into_diagnostic()?;
let raw_cbor = hex::decode(cbor_hex).into_diagnostic()?;
let raw_cbor = hex::decode(cbor_hex.trim()).into_diagnostic()?;
let prog = Program::<FakeNamedDeBruijn>::from_cbor(&raw_cbor, &mut Vec::new())
.into_diagnostic()?;
@ -54,10 +54,9 @@ pub fn exec(
};
for arg in args {
let term: Term<NamedDeBruijn> = parser::term(&arg)
.into_diagnostic()?
.try_into()
.into_diagnostic()?;
let term = parser::term(&arg).into_diagnostic()?;
let term = Term::<NamedDeBruijn>::try_from(term).into_diagnostic()?;
program = program.apply_term(&term);
}
@ -71,7 +70,7 @@ pub fn exec(
match eval_result.result() {
Ok(term) => {
let term: Term<Name> = term.try_into().into_diagnostic()?;
let term = Term::<Name>::try_from(term).into_diagnostic()?;
let output = json!({
"result": term.to_pretty(),

View File

@ -1,4 +1,9 @@
use aiken_project::{pretty, script::EvalInfo, telemetry, Project};
use aiken_project::{
pretty,
script::EvalInfo,
telemetry::{self, DownloadSource},
Project,
};
use miette::IntoDiagnostic;
use owo_colors::{
OwoColorize,
@ -13,7 +18,7 @@ pub mod built_info {
include!(concat!(env!("OUT_DIR"), "/built.rs"));
}
pub fn with_project<A>(directory: Option<PathBuf>, mut action: A) -> miette::Result<()>
pub fn with_project<A>(directory: Option<PathBuf>, deny: bool, mut action: A) -> miette::Result<()>
where
A: FnMut(&mut Project<Terminal>) -> Result<(), Vec<aiken_project::error::Error>>,
{
@ -37,7 +42,7 @@ where
let warning_count = warnings.len();
for warning in warnings {
for warning in &warnings {
warning.report()
}
@ -85,6 +90,11 @@ where
warning_text.if_supports_color(Stderr, |s| s.yellow()),
);
}
if warning_count > 0 && deny {
process::exit(1);
}
Ok(())
}
@ -232,16 +242,30 @@ impl telemetry::EventListener for Terminal {
);
}
}
telemetry::Event::DownloadingPackage { name } => {
telemetry::Event::ResolvingPackages { name } => {
eprintln!(
"{} {}",
" Downloading"
" Resolving"
.if_supports_color(Stderr, |s| s.bold())
.if_supports_color(Stderr, |s| s.purple()),
name.if_supports_color(Stderr, |s| s.bold())
)
}
telemetry::Event::PackagesDownloaded { start, count } => {
telemetry::Event::PackageResolveFallback { name } => {
eprintln!(
"{} {}\n ↳ You're seeing this message because the package version is unpinned and the network is not accessible.",
" Using"
.if_supports_color(Stderr, |s| s.bold())
.if_supports_color(Stderr, |s| s.yellow()),
format!("uncertain local version for {name}")
.if_supports_color(Stderr, |s| s.yellow())
)
}
telemetry::Event::PackagesDownloaded {
start,
count,
source,
} => {
let elapsed = format!("{:.2}s", start.elapsed().as_millis() as f32 / 1000.);
let msg = match count {
@ -250,17 +274,20 @@ impl telemetry::EventListener for Terminal {
};
eprintln!(
"{} {}",
" Downloaded"
.if_supports_color(Stderr, |s| s.bold())
.if_supports_color(Stderr, |s| s.purple()),
"{} {} from {source}",
match source {
DownloadSource::Network => " Downloaded",
DownloadSource::Cache => " Fetched",
}
.if_supports_color(Stderr, |s| s.bold())
.if_supports_color(Stderr, |s| s.purple()),
msg.if_supports_color(Stderr, |s| s.bold())
)
}
telemetry::Event::ResolvingVersions => {
eprintln!(
"{}",
" Resolving versions"
" Resolving dependencies"
.if_supports_color(Stderr, |s| s.bold())
.if_supports_color(Stderr, |s| s.purple()),
)

View File

@ -1,7 +1,7 @@
[package]
name = "flat-rs"
description = "Flat codec"
version = "1.0.13-alpha"
version = "1.0.17-alpha"
edition = "2021"
repository = "https://github.com/aiken-lang/aiken/crates/flat"
homepage = "https://github.com/aiken-lang/aiken"

View File

@ -171,7 +171,7 @@ impl<'b> Decoder<'b> {
/// Otherwise we decode an item in the list with the decoder function passed in.
/// Then decode the next bit in the buffer and repeat above.
/// Returns a list of items decoded with the decoder function.
pub fn decode_list_with<T: Decode<'b>, F>(&mut self, decoder_func: F) -> Result<Vec<T>, Error>
pub fn decode_list_with<T, F>(&mut self, decoder_func: F) -> Result<Vec<T>, Error>
where
F: Copy + FnOnce(&mut Decoder) -> Result<T, Error>,
{
@ -182,6 +182,21 @@ impl<'b> Decoder<'b> {
Ok(vec_array)
}
pub fn decode_list_with_debug<T, F>(
&mut self,
decoder_func: F,
state_log: &mut Vec<String>,
) -> Result<Vec<T>, Error>
where
F: Copy + FnOnce(&mut Decoder, &mut Vec<String>) -> Result<T, Error>,
{
let mut vec_array: Vec<T> = Vec::new();
while self.bit()? {
vec_array.push(decoder_func(self, state_log)?)
}
Ok(vec_array)
}
/// Decode the next bit in the buffer.
/// If the bit was 0 then return true.
/// Otherwise return false.

View File

@ -196,10 +196,7 @@ impl Encoder {
&mut self,
list: &[T],
encoder_func: for<'r> fn(&T, &'r mut Encoder) -> Result<(), Error>,
) -> Result<&mut Self, Error>
where
T: Encode,
{
) -> Result<&mut Self, Error> {
for item in list {
self.one();
encoder_func(item, self)?;

View File

@ -1,7 +1,7 @@
[package]
name = "uplc"
description = "Utilities for working with Untyped Plutus Core"
version = "1.0.13-alpha"
version = "1.0.17-alpha"
edition = "2021"
repository = "https://github.com/aiken-lang/aiken/crates/uplc"
homepage = "https://github.com/aiken-lang/aiken"
@ -33,7 +33,7 @@ serde_json = "1.0.94"
strum = "0.24.1"
strum_macros = "0.24.3"
thiserror = "1.0.39"
flat-rs = { path = "../flat-rs", version = "1.0.13-alpha" }
flat-rs = { path = "../flat-rs", version = "1.0.17-alpha" }
[target.'cfg(not(target_family="wasm"))'.dependencies]
secp256k1 = { version = "0.26.0" }

View File

@ -196,6 +196,14 @@ pub enum Term<T> {
Error,
// tag: 7
Builtin(DefaultFunction),
Constr {
tag: usize,
fields: Vec<Term<T>>,
},
Case {
constr: Rc<Term<T>>,
branches: Vec<Term<T>>,
},
}
impl<T> Term<T> {
@ -208,6 +216,20 @@ impl<T> Term<T> {
}
}
impl<T> TryInto<PlutusData> for Term<T> {
type Error = String;
fn try_into(self) -> Result<PlutusData, String> {
match self {
Term::Constant(rc) => match &*rc {
Constant::Data(data) => Ok(data.to_owned()),
_ => Err("not a data".to_string()),
},
_ => Err("not a data".to_string()),
}
}
}
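
Only a Term::Constant wrapping Constant::Data converts; any other term is rejected. This is what lets the apply command print an applied parameter as CBOR hex when possible and fall back to pretty-printed UPLC otherwise. A small sketch:

    // Sketch: a constant Data term converts, anything else does not.
    let term: Term<Name> = Term::Constant(Constant::Data(Data::integer(42.into())).into());
    assert!(TryInto::<PlutusData>::try_into(term).is_ok());
    assert!(TryInto::<PlutusData>::try_into(Term::<Name>::Error).is_err());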
impl<'a, T> Display for Term<T>
where
T: Binder<'a>,
@ -245,6 +267,13 @@ pub struct Data {}
// TODO: See about moving these builders upstream to Pallas?
impl Data {
pub fn to_hex(data: PlutusData) -> String {
let mut bytes = Vec::new();
pallas_codec::minicbor::Encoder::new(&mut bytes)
.encode(data)
.expect("failed to encode Plutus Data as cbor?");
hex::encode(bytes)
}
pub fn integer(i: BigInt) -> PlutusData {
match i.to_i64() {
Some(i) => PlutusData::BigInt(pallas::BigInt::Int(i.into())),

View File

@ -378,27 +378,4 @@ impl Term<Name> {
.lambda(CONSTR_GET_FIELD),
)
}
pub fn assert_on_list(self) -> Self {
self.lambda(EXPECT_ON_LIST)
.apply(Term::var(EXPECT_ON_LIST).apply(Term::var(EXPECT_ON_LIST)))
.lambda(EXPECT_ON_LIST)
.apply(
Term::var("__list_to_check")
.delayed_choose_list(
Term::unit(),
Term::var("__check_with")
.apply(Term::head_list().apply(Term::var("__list_to_check")))
.choose_unit(
Term::var(EXPECT_ON_LIST)
.apply(Term::var(EXPECT_ON_LIST))
.apply(Term::tail_list().apply(Term::var("__list_to_check")))
.apply(Term::var("__check_with")),
),
)
.lambda("__check_with")
.lambda("__list_to_check")
.lambda(EXPECT_ON_LIST),
)
}
}

View File

@ -79,6 +79,20 @@ impl Converter {
Term::Force(term) => Term::Force(Rc::new(self.name_to_named_debruijn(term)?)),
Term::Error => Term::Error,
Term::Builtin(builtin) => Term::Builtin(*builtin),
Term::Constr { tag, fields } => Term::Constr {
tag: *tag,
fields: fields
.iter()
.map(|field| self.name_to_named_debruijn(field))
.collect::<Result<_, _>>()?,
},
Term::Case { constr, branches } => Term::Case {
constr: Rc::new(self.name_to_named_debruijn(constr)?),
branches: branches
.iter()
.map(|branch| self.name_to_named_debruijn(branch))
.collect::<Result<_, _>>()?,
},
};
Ok(converted_term)
@ -117,6 +131,20 @@ impl Converter {
Term::Force(term) => Term::Force(Rc::new(self.name_to_debruijn(term)?)),
Term::Error => Term::Error,
Term::Builtin(builtin) => Term::Builtin(*builtin),
Term::Constr { tag, fields } => Term::Constr {
tag: *tag,
fields: fields
.iter()
.map(|field| self.name_to_debruijn(field))
.collect::<Result<_, _>>()?,
},
Term::Case { constr, branches } => Term::Case {
constr: Rc::new(self.name_to_debruijn(constr)?),
branches: branches
.iter()
.map(|branch| self.name_to_debruijn(branch))
.collect::<Result<_, _>>()?,
},
};
Ok(converted_term)
@ -167,6 +195,20 @@ impl Converter {
Term::Force(term) => Term::Force(Rc::new(self.named_debruijn_to_name(term)?)),
Term::Error => Term::Error,
Term::Builtin(builtin) => Term::Builtin(*builtin),
Term::Constr { tag, fields } => Term::Constr {
tag: *tag,
fields: fields
.iter()
.map(|field| self.named_debruijn_to_name(field))
.collect::<Result<_, _>>()?,
},
Term::Case { constr, branches } => Term::Case {
constr: Rc::new(self.named_debruijn_to_name(constr)?),
branches: branches
.iter()
.map(|branch| self.named_debruijn_to_name(branch))
.collect::<Result<_, _>>()?,
},
};
Ok(converted_term)
@ -218,6 +260,20 @@ impl Converter {
Term::Force(term) => Term::Force(Rc::new(self.debruijn_to_name(term)?)),
Term::Error => Term::Error,
Term::Builtin(builtin) => Term::Builtin(*builtin),
Term::Constr { tag, fields } => Term::Constr {
tag: *tag,
fields: fields
.iter()
.map(|field| self.debruijn_to_name(field))
.collect::<Result<_, _>>()?,
},
Term::Case { constr, branches } => Term::Case {
constr: Rc::new(self.debruijn_to_name(constr)?),
branches: branches
.iter()
.map(|branch| self.debruijn_to_name(branch))
.collect::<Result<_, _>>()?,
},
};
Ok(converted_term)
@ -243,6 +299,20 @@ impl Converter {
Term::Force(term) => Term::Force(Rc::new(self.named_debruijn_to_debruijn(term))),
Term::Error => Term::Error,
Term::Builtin(builtin) => Term::Builtin(*builtin),
Term::Constr { tag, fields } => Term::Constr {
tag: *tag,
fields: fields
.iter()
.map(|field| self.named_debruijn_to_debruijn(field))
.collect(),
},
Term::Case { constr, branches } => Term::Case {
constr: Rc::new(self.named_debruijn_to_debruijn(constr)),
branches: branches
.iter()
.map(|branch| self.named_debruijn_to_debruijn(branch))
.collect(),
},
}
}
@ -272,6 +342,20 @@ impl Converter {
Term::Force(term) => Term::Force(Rc::new(self.debruijn_to_named_debruijn(term))),
Term::Error => Term::Error,
Term::Builtin(builtin) => Term::Builtin(*builtin),
Term::Constr { tag, fields } => Term::Constr {
tag: *tag,
fields: fields
.iter()
.map(|field| self.debruijn_to_named_debruijn(field))
.collect(),
},
Term::Case { constr, branches } => Term::Case {
constr: Rc::new(self.debruijn_to_named_debruijn(constr)),
branches: branches
.iter()
.map(|branch| self.debruijn_to_named_debruijn(branch))
.collect(),
},
}
}
@ -302,6 +386,20 @@ impl Converter {
}
Term::Error => Term::Error,
Term::Builtin(builtin) => Term::Builtin(*builtin),
Term::Constr { tag, fields } => Term::Constr {
tag: *tag,
fields: fields
.iter()
.map(|field| self.fake_named_debruijn_to_named_debruijn(field))
.collect(),
},
Term::Case { constr, branches } => Term::Case {
constr: Rc::new(self.fake_named_debruijn_to_named_debruijn(constr)),
branches: branches
.iter()
.map(|branch| self.fake_named_debruijn_to_named_debruijn(branch))
.collect(),
},
}
}
@ -332,6 +430,20 @@ impl Converter {
}
Term::Error => Term::Error,
Term::Builtin(builtin) => Term::Builtin(*builtin),
Term::Constr { tag, fields } => Term::Constr {
tag: *tag,
fields: fields
.iter()
.map(|field| self.named_debruijn_to_fake_named_debruijn(field))
.collect(),
},
Term::Case { constr, branches } => Term::Case {
constr: Rc::new(self.named_debruijn_to_fake_named_debruijn(constr)),
branches: branches
.iter()
.map(|branch| self.named_debruijn_to_fake_named_debruijn(branch))
.collect(),
},
}
}

View File

@ -165,6 +165,20 @@ where
builtin.encode(e)?;
}
Term::Constr { tag, fields } => {
encode_term_tag(8, e)?;
tag.encode(e)?;
e.encode_list_with(fields, |term, e| (*term).encode(e))?;
}
Term::Case { constr, branches } => {
encode_term_tag(9, e)?;
constr.encode(e)?;
e.encode_list_with(branches, |term, e| (*term).encode(e))?;
}
}
Ok(())
@ -192,6 +206,19 @@ where
5 => Ok(Term::Force(Rc::new(Term::decode(d)?))),
6 => Ok(Term::Error),
7 => Ok(Term::Builtin(DefaultFunction::decode(d)?)),
8 => {
let tag = usize::decode(d)?;
let fields = d.decode_list_with(|d| Term::<T>::decode(d))?;
Ok(Term::Constr { tag, fields })
}
9 => {
let constr = (Term::<T>::decode(d)?).into();
let branches = d.decode_list_with(|d| Term::<T>::decode(d))?;
Ok(Term::Case { constr, branches })
}
x => {
let buffer_slice: Vec<u8> = d
.buffer
@ -354,6 +381,29 @@ where
}
}
}
8 => {
state_log.push("(constr ".to_string());
let tag = usize::decode(d)?;
let fields = d.decode_list_with_debug(
|d, state_log| Term::<T>::decode_debug(d, state_log),
state_log,
)?;
Ok(Term::Constr { tag, fields })
}
9 => {
state_log.push("(case ".to_string());
let constr = Term::<T>::decode_debug(d, state_log)?.into();
let branches = d.decode_list_with_debug(
|d, state_log| Term::<T>::decode_debug(d, state_log),
state_log,
)?;
Ok(Term::Case { constr, branches })
}
x => {
state_log.push("parse error".to_string());

View File

@ -27,9 +27,18 @@ enum MachineState {
#[derive(Clone)]
enum Context {
FrameApplyFun(Value, Box<Context>),
FrameApplyArg(Env, Term<NamedDeBruijn>, Box<Context>),
FrameAwaitArg(Value, Box<Context>),
FrameAwaitFunTerm(Env, Term<NamedDeBruijn>, Box<Context>),
FrameAwaitFunValue(Value, Box<Context>),
FrameForce(Box<Context>),
FrameConstr(
Env,
usize,
Vec<Term<NamedDeBruijn>>,
Vec<Value>,
Box<Context>,
),
FrameCases(Env, Vec<Term<NamedDeBruijn>>, Box<Context>),
NoFrame,
}
@ -37,7 +46,7 @@ pub struct Machine {
costs: CostModel,
pub ex_budget: ExBudget,
slippage: u32,
unbudgeted_steps: [u32; 8],
unbudgeted_steps: [u32; 10],
pub logs: Vec<String>,
version: Language,
}
@ -53,7 +62,7 @@ impl Machine {
costs,
ex_budget: initial_budget,
slippage,
unbudgeted_steps: [0; 8],
unbudgeted_steps: [0; 10],
logs: vec![],
version,
}
@ -117,7 +126,11 @@ impl Machine {
self.step_and_maybe_spend(StepKind::Apply)?;
Ok(MachineState::Compute(
Context::FrameApplyArg(env.clone(), argument.as_ref().clone(), context.into()),
Context::FrameAwaitFunTerm(
env.clone(),
argument.as_ref().clone(),
context.into(),
),
env,
function.as_ref().clone(),
))
@ -147,20 +160,43 @@ impl Machine {
Value::Builtin { fun, runtime },
))
}
Term::Constr { tag, mut fields } => {
self.step_and_maybe_spend(StepKind::Constr)?;
if !fields.is_empty() {
let popped_field = fields.remove(0);
Ok(MachineState::Compute(
Context::FrameConstr(env.clone(), tag, fields, vec![], context.into()),
env,
popped_field,
))
} else {
Ok(MachineState::Return(
context,
Value::Constr {
tag,
fields: vec![],
},
))
}
}
Term::Case { constr, branches } => {
self.step_and_maybe_spend(StepKind::Case)?;
Ok(MachineState::Compute(
Context::FrameCases(env.clone(), branches, context.into()),
env,
constr.as_ref().clone(),
))
}
}
}
fn return_compute(&mut self, context: Context, value: Value) -> Result<MachineState, Error> {
match context {
Context::FrameApplyFun(function, ctx) => self.apply_evaluate(*ctx, function, value),
Context::FrameApplyArg(arg_var_env, arg, ctx) => Ok(MachineState::Compute(
Context::FrameApplyFun(value, ctx),
arg_var_env,
arg,
)),
Context::FrameForce(ctx) => self.force_evaluate(*ctx, value),
Context::NoFrame => {
if self.unbudgeted_steps[7] > 0 {
if self.unbudgeted_steps[9] > 0 {
self.spend_unbudgeted_steps()?;
}
@ -168,6 +204,46 @@ impl Machine {
Ok(MachineState::Done(term))
}
Context::FrameForce(ctx) => self.force_evaluate(*ctx, value),
Context::FrameAwaitFunTerm(arg_env, arg, ctx) => Ok(MachineState::Compute(
Context::FrameAwaitArg(value, ctx),
arg_env,
arg,
)),
Context::FrameAwaitArg(fun, ctx) => self.apply_evaluate(*ctx, fun, value),
Context::FrameAwaitFunValue(arg, ctx) => self.apply_evaluate(*ctx, value, arg),
Context::FrameConstr(env, tag, mut fields, mut resolved_fields, ctx) => {
resolved_fields.insert(0, value);
if !fields.is_empty() {
let popped_field = fields.remove(0);
Ok(MachineState::Compute(
Context::FrameConstr(env.clone(), tag, fields, resolved_fields, ctx),
env,
popped_field,
))
} else {
Ok(MachineState::Return(
*ctx,
Value::Constr {
tag,
fields: resolved_fields,
},
))
}
}
Context::FrameCases(env, branches, ctx) => match value {
Value::Constr { tag, fields } => match branches.get(tag) {
Some(t) => Ok(MachineState::Compute(
transfer_arg_stack(fields, *ctx),
env,
t.clone(),
)),
None => todo!(),
},
_ => todo!(),
},
}
}
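Taken together, compute and return_compute implement the new semantics: a constr evaluates its fields left to right, accumulating results in a FrameConstr, while a case evaluates the scrutinee under a FrameCases and then applies the branch selected by the tag (note the two todo!() arms, which for now leave "no such branch" and "case on a non-constr value" as panics rather than machine errors). Below is a self-contained toy model of just that semantics — no budgets, environments, or builtins, and branches that ignore the fields, mirroring the case_constr_case_1 test further down; illustrative only:

#[derive(Clone, Debug, PartialEq)]
enum Term {
    Int(i64),
    Constr { tag: usize, fields: Vec<Term> },
    Case { constr: Box<Term>, branches: Vec<Term> },
}

#[derive(Clone, Debug, PartialEq)]
enum Value {
    Int(i64),
    Constr { tag: usize, fields: Vec<Value> },
}

fn eval(term: &Term) -> Value {
    match term {
        Term::Int(i) => Value::Int(*i),
        // constr: evaluate the fields left to right, keep the tag
        Term::Constr { tag, fields } => Value::Constr {
            tag: *tag,
            fields: fields.iter().map(eval).collect(),
        },
        // case: evaluate the scrutinee, then pick the branch by tag
        // (the real machine additionally applies the branch to each field)
        Term::Case { constr, branches } => match eval(constr) {
            Value::Constr { tag, .. } => eval(&branches[tag]),
            _ => panic!("case on a non-constr value"),
        },
    }
}

fn main() {
    let term = Term::Case {
        constr: Box::new(Term::Constr { tag: 1, fields: vec![] }),
        branches: vec![Term::Int(5), Term::Int(10), Term::Int(15)],
    };
    assert_eq!(eval(&term), Value::Int(10));
}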
@@ -258,9 +334,9 @@ impl Machine {
fn step_and_maybe_spend(&mut self, step: StepKind) -> Result<(), Error> {
let index = step as u8;
self.unbudgeted_steps[index as usize] += 1;
self.unbudgeted_steps[7] += 1;
self.unbudgeted_steps[9] += 1;
if self.unbudgeted_steps[7] >= self.slippage {
if self.unbudgeted_steps[9] >= self.slippage {
self.spend_unbudgeted_steps()?;
}
@@ -279,7 +355,7 @@ impl Machine {
self.unbudgeted_steps[i] = 0;
}
self.unbudgeted_steps[7] = 0;
self.unbudgeted_steps[9] = 0;
Ok(())
}
@@ -296,6 +372,16 @@ impl Machine {
}
}
fn transfer_arg_stack(mut args: Vec<Value>, ctx: Context) -> Context {
if args.is_empty() {
ctx
} else {
let popped_field = args.remove(0);
transfer_arg_stack(args, Context::FrameAwaitFunValue(popped_field, ctx.into()))
}
}
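Two reversals cancel out here: return_compute accumulates finished fields with resolved_fields.insert(0, value), so the vector ends up last-field-first, and transfer_arg_stack then pops from the front, wrapping the context as it goes — the frame holding the first field ends up outermost, so the branch is applied to the fields left to right. A standalone check of that bookkeeping:

fn main() {
    let fields = vec!["f1", "f2", "f3"];
    // return_compute: resolved_fields.insert(0, value) as each field finishes
    let mut resolved = Vec::new();
    for value in &fields {
        resolved.insert(0, *value); // ends as ["f3", "f2", "f1"]
    }
    // transfer_arg_stack: each pop wraps the previous context, so the
    // last-popped argument ("f1") becomes the outermost frame -- the one
    // the branch value meets first when it starts returning.
    let mut outermost_first = Vec::new();
    while !resolved.is_empty() {
        outermost_first.insert(0, resolved.remove(0));
    }
    assert_eq!(outermost_first, fields); // applied left to right
}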
impl From<&Constant> for Type {
fn from(constant: &Constant) -> Self {
match constant {
@@ -394,4 +480,67 @@ mod tests {
);
}
}
#[test]
fn case_constr_case_0() {
let make_program =
|fun: DefaultFunction, tag: usize, n: i32, m: i32| Program::<NamedDeBruijn> {
version: (0, 0, 0),
term: Term::Case {
constr: Term::Constr {
tag,
fields: vec![
Term::Constant(Constant::Integer(n.into()).into()),
Term::Constant(Constant::Integer(m.into()).into()),
],
}
.into(),
branches: vec![Term::Builtin(fun), Term::sub_integer()],
},
};
let test_data = vec![
(DefaultFunction::AddInteger, 0, 8, 3, 11),
(DefaultFunction::AddInteger, 1, 8, 3, 5),
];
for (fun, tag, n, m, result) in test_data {
let eval_result = make_program(fun, tag, n, m).eval(ExBudget::max());
assert_eq!(
eval_result.result().unwrap(),
Term::Constant(Constant::Integer(result.into()).into())
);
}
}
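The second row is the order-sensitive one: tag 1 selects Term::sub_integer(), and the expected result 5 is 8 - 3, confirming that the FrameConstr/transfer_arg_stack bookkeeping above really does apply fields left to right.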
#[test]
fn case_constr_case_1() {
let make_program = |tag: usize| Program::<NamedDeBruijn> {
version: (0, 0, 0),
term: Term::Case {
constr: Term::Constr {
tag,
fields: vec![],
}
.into(),
branches: vec![
Term::integer(5.into()),
Term::integer(10.into()),
Term::integer(15.into()),
],
},
};
let test_data = vec![(0, 5), (1, 10), (2, 15)];
for (tag, result) in test_data {
let eval_result = make_program(tag).eval(ExBudget::max());
assert_eq!(
eval_result.result().unwrap(),
Term::Constant(Constant::Integer(result.into()).into())
);
}
}
}


@@ -36,6 +36,13 @@ impl ExBudget {
cpu: 10000000000,
}
}
pub fn max() -> Self {
ExBudget {
mem: 14000000000000,
cpu: 10000000000000,
}
}
}
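A plausible reading of why ExBudget::max was added: the placeholder constr/case step costs introduced below (30000000000 mem/cpu each) already exceed the 10000000000 cpu of the smaller budget shown above, so the new tests evaluate under this effectively unlimited budget instead.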
impl Default for ExBudget {
@@ -84,6 +91,8 @@ pub struct MachineCosts {
delay: ExBudget,
force: ExBudget,
apply: ExBudget,
constr: ExBudget,
case: ExBudget,
/// Just the cost of evaluating a Builtin node, not the builtin itself.
builtin: ExBudget,
}
@@ -99,6 +108,8 @@ impl MachineCosts {
StepKind::Delay => self.delay,
StepKind::Force => self.force,
StepKind::Builtin => self.builtin,
StepKind::Constr => self.constr,
StepKind::Case => self.case,
StepKind::StartUp => self.startup,
}
}
@@ -134,6 +145,15 @@ impl MachineCosts {
mem: 100,
cpu: 23000,
},
// Placeholder values
constr: ExBudget {
mem: 30000000000,
cpu: 30000000000,
},
case: ExBudget {
mem: 30000000000,
cpu: 30000000000,
},
}
}
}
@@ -171,6 +191,15 @@ impl Default for MachineCosts {
mem: 100,
cpu: 23000,
},
// Placeholder values
constr: ExBudget {
mem: 30000000000,
cpu: 30000000000,
},
case: ExBudget {
mem: 30000000000,
cpu: 30000000000,
},
}
}
}
@@ -2223,6 +2252,22 @@ pub fn initialize_cost_model(version: &Language, costs: &[i64]) -> CostModel {
.get("cek_builtin_cost-exBudgetCPU")
.unwrap_or(&30000000000),
},
constr: ExBudget {
mem: *cost_map
.get("cek_constr_cost-exBudgetmem")
.unwrap_or(&30000000000),
cpu: *cost_map
.get("cek_constr_cost-exBudgetCPU")
.unwrap_or(&30000000000),
},
case: ExBudget {
mem: *cost_map
.get("cek_case_cost-exBudgetmem")
.unwrap_or(&30000000000),
cpu: *cost_map
.get("cek_case_cost-exBudgetCPU")
.unwrap_or(&30000000000),
},
},
builtin_costs: BuiltinCosts {
add_integer: CostingFun {
@@ -3191,8 +3236,10 @@ pub enum StepKind {
Delay = 4,
Force = 5,
Builtin = 6,
Constr = 7,
Case = 8,
// DO NOT USE THIS IN `step_and_maybe_spend`
StartUp = 7,
StartUp = 9,
}
impl TryFrom<u8> for StepKind {
@@ -3207,6 +3254,8 @@ impl TryFrom<u8> for StepKind {
4 => Ok(StepKind::Delay),
5 => Ok(StepKind::Force),
6 => Ok(StepKind::Builtin),
7 => Ok(StepKind::Constr),
8 => Ok(StepKind::Case),
v => Err(super::error::Error::InvalidStepKind(v)),
}
}
