This has been removed from the CIP-0057 specification since validators
are often re-used for multiple purposes (especially validators with
arity 2). It's misleading to assign a validator a purpose since the
purpose distinction actually happens _within_ the validator itself.
This has been bothering me, and the more I thought about it, the more I
disliked the idea of a warning. The rationale is that in this very
context, there's absolutely no ambiguity. So it is only frustrating
that the parser can make the exact suggestion of what should be fixed,
yet still fails.
I can imagine it is going to be very common for people to type:
```
trace "foo"
```
...yet terribly frustrating if they have to remember each time that
this should actually be a string. Because of the `trace`, `todo` and
`error` keywords, we know exactly the surrounding context and what to
expect here, so we can handle it nicely.
However, the formatter will re-format it to:
```
trace @"foo"
```
This is just for the sake of remaining consistent with the type-system.
This way, we still only manipulate `String` in the AST, but we
conveniently parse a double-quoted UTF-8 literal when it is coupled
with one of these specific keywords.
I believe that's the best of both worlds.
This will probably save people minutes/hours of puzzled debugging. This is only a warning because there may be cases where one does actually want to specify a hex-encoded bytearray. In which case, they can get rid of the warning by using the plain bytearray syntax (i.e. as an array of bytes).
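For illustration, that escape hatch might look as follows (a sketch; the `#[..]` byte-list notation and the byte values are only illustrative):
```
// These four raw bytes are genuinely wanted as-is; spelling them
// out as a list of bytes makes the intent clear and avoids the warning.
let bytes = #[41, 210, 34, 206]
```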
The core observation is that **in the context of Aiken** (i.e. on-chain logic)
people do not generally want to use String. Instead, they want
bytearrays.
So, it should be easy to produce bytearrays when needed and it should
be the default. Before this commit, `"foo"` would parse as a `String`.
Now, it parses as a `ByteArray` whose bytes are the UTF-8 encoding of
"foo".
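For example (a sketch; 102, 111, 111 are the UTF-8 bytes of `f`, `o`, `o`):
```
let bytes = "foo"   // a ByteArray, equivalent to #[102, 111, 111]
let text = @"foo"   // a String, as used for tracing
```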
To make this change really "fool-proof", we now want to:
- [ ] Emit a parse error if we parse a UTF-8 bytearray literal in
place where we would expect a `String`. For example, `trace`,
`error` and `todo` can only be followed by a `String`.
So when we see something like:
```
trace "foo"
```
we know it's a mistake and we can suggest users to use:
```
trace @"foo"
```
instead.
- [ ] Emit a warning if we ever see a UTF-8 bytearray literal which
is either 56 or 64 characters long and is a valid hexadecimal string.
For example:
```
let policy_id = "29d222ce763455e3d7a09a665ce554f00ac89d2e99a1a83d267170c6"
```
This is _most certainly_ a mistake, as this generates a ByteArray of
56 bytes (the UTF-8 encoding of the text itself) instead of the 28
bytes obtained by decoding the string from hexadecimal.
In this scenario, we want to warn the user and inform them that they probably meant to use:
```
let policy_id = #"29d222ce763455e3d7a09a665ce554f00ac89d2e99a1a83d267170c6"
```
These are not supported by the code generation, so it's a bit of a lie
to have them in the language in the first place. There's arguably not
even any use for constant records, lists and tuples to begin with. So
this cleans this up everywhere for the sake of moving forward with the
alpha release.
This now reduces constants to:
- Integer
- ByteArray
- String
Anything else can be declared via a function anyway. We can revisit
this choice later.... or not.
Tracing is now turned OFF by default when:
- building project
- building documentation
- building dependencies
It can be turned ON only when building the project, using
`--keep-traces`. That means it's not possible to build dependencies
with traces. The `--rebuild` flag of the `address` command will also
rebuild without traces.
Tracing is however turned ON by default when:
- checking the project (and running tests).
In this scenario, tracing can be disabled using `--no-traces` (if, for
example, one wants to analyze the execution units of specific functions
without having to manually remove traces from the code).
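Concretely, the flags described above are used along the lines of (a sketch of the invocations):
```
aiken build                # traces stripped (default when building)
aiken build --keep-traces  # keep traces in the build
aiken check                # traces kept (default when checking/testing)
aiken check --no-traces    # strip traces, e.g. to measure execution units
```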
This caused me some trouble. In my first approach, I ended up with
multiple traces because nested values would be evaluated twice: once
as the condition, and once as part of the continuation.
To prevent this, we can simply evaluate the condition once, and return
a plain True / False boolean as the outcome. So this effectively
transforms any expression:
```
expr
```
into
```
if expr { True } else { trace("...", False) }
```
Interestingly enough, chumsky seems to fail when given a 'choice' with
more than 25 elements. That's why this commit groups together some of
the choices as another nested 'choice'.
The goal is to handle this without bothering the code generation down the line. That is, we can handle it when transforming from the untyped AST to the typed one. That's why there's no 'TraceIfFalse' constructor in the typed AST. It has disappeared during type-check.
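For context, here is a sketch of the expansion behind the trace-if-false operator (assuming the postfix `?` syntax; the trace message is elided as in the example above):
```
must_be_signed?
// ...is handled during type-check as:
if must_be_signed { True } else { trace("...", False) }
```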
We want the lookup to yield a result when there's only a single
validator and no title is provided, so that users can simply run
'aiken address' in their project when it's unambiguous. The validator's
name is only required to disambiguate between multiple validators.
I also noticed that the order of arguments in with_validator was
wrong. Somehow.
Todo is fundamentally just a trace and an error. The only reason we kept it as a separate element in the AST is for the formatter to work out whether it should format something back to a todo or something else.
However, this introduces redundancy in the code internally and makes the AIR more complicated than it needs to be. Both todo and error can actually be represented as a trace plus an error, and we only need to record their preferred shape when parsing so that we can format them back to what's expected.
We now parse errors as a combination of a trace plus an error term. This is a baby step towards simplifying the code generation down the line and the internal representation of todo / errors.
This however enforces that the argument unifies to a `String`. So this
is more flexible than the previous form, but does fundamentally the
same thing.
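For example, a sketch of the desugaring (the message is made up):
```
todo @"work out the maths"
// ...is now represented as a trace followed by an error:
trace @"work out the maths"
error
```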
Fixes #378.
Not sure what this special case was trying to achieve, but it's not right. There's no need to handle a function call with a single argument differently from the others.
List clause patterns now handle the var case.
Fixed an issue with tuple clauses where the last clause wasn't a tuple.
Redid how zero-arg functions and dependencies are handled. Tough one lol
There's arguably no use case ever for that in the context of on-chain
Plutus. Strings are really just meant to be used for tracing. They
aren't meant to be manipulated as heavily as in classic programming
languages.
Before that commit, the type-checker would allow unsafe list patterns
such as:
```
let [x] = xs
when xs is {
[x] -> ...
[x, ..] -> ...
}
```
This is quite unsafe and can lead to confusing situations. Now at
least the compiler warns about this. It isn't perfect though,
especially in the presence of clause guards. But that's a start.
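A sketch of the exhaustive form that the warning nudges towards (`fallback` is a made-up function):
```
when xs is {
  [x] -> x
  _ -> fallback()
}
```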
Whoopsie... || and && were treated with the same precedence, causing very surprising behavior down the line.
I noticed this because the auto-formatter was adding parentheses where it really shouldn't. The problem actually came from the parser and how it constructed the AST.
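For instance (a sketch of the mis-parse):
```
a || b && c
// buggy parse (equal precedence):   { a || b } && c
// correct parse (&& binds tighter): a || { b && c }
```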
fix conversion from inner opaque type for when and assignment
This fixes Clause being used in cases where ListClause or TupleClause should be used
Reset defined and zero arg functions between each code gen
Fixes for optimizations when encountering shadowed variables
* fix assert on pattern Var
* fix tuple index unwrapping, closes #334
* allow wrapping when casting with let
* allow wrapping when casting via function call
I decided to invert how I'm doing it. I'm passing a new argument
called allow_cast: bool to unify in the environment, and essentially
at various unification sites I can control whether or not I want to
allow casting to even occur. So we can assume it's false by default
always, and then turn it on in a few places, vs. just opening the
flood gates and locking it down at various sites
as they come up.
* you cannot cast FROM Data with a `let`
* you cannot cast FROM Data by passing Data to a
non-Data argument when calling a function
* you MUST use `assert` to cast from Data
* you can cast INTO Data with a `let`
* you can cast INTO Data by passing non-Data
to a Data argument when calling a function
* you cannot assert-cast Data without an
annotation
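A sketch of these rules in code (the names are made up, and the exact `assert` syntax is inferred from the rules above):
```
let as_data: Data = my_datum      // OK: cast INTO Data with a let
assert datum: MyDatum = as_data   // OK: cast FROM Data via assert + annotation
let bad: MyDatum = as_data        // error: cannot cast FROM Data with a let
assert nope = as_data             // error: assert-cast of Data needs an annotation
```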
Weirdly enough, we got the parsing wrong for byte literals in expressions (but did okay in constants), and got the formatting wrong in constants (yet did okay when formatting expressions). I've factored out the code in both cases to avoid the duplication that led to this in the first place, plus added test coverage to make sure this doesn't happen in the future.
In an ideal world, I should have handled that directly at the conflicting commit in the rebase, but this would have bubbled up through all commits... which I wasn't really keen on going through. So here's an extra ugly commit that comes and 'fixes the rebase'.
This is quite something, because now we have a testing pipeline that
can also be used for testing other compiler-related stuff such as the
type-checker or the code generator.
The blueprint is generated at the root of the repository and is
intended to be versioned with the rest. It acts as a business card
that carries a lot of practical information, and there's a variety of
tools we can then build on top of it for open-source contracts. And, quite
importantly, the blueprint is language-agnostic; it isn't specific to
Aiken. So it is really meant as an interop format within the
ecosystem.
With pretty parse errors on failures. The type-checker already
supported those, so it now only requires some work in the code
generation.
Fixes #297.
This is a bit annoying as we are forced to use #[related] here, which isn't quite what we want.
Ideally, this would use #[diagnostic_source], but there's a bug upstream. See: zkat/miette#172.
In a similar spirit to what we did for sequences. Yet, we need to handle the case of the body being just an assignment -- or a trace of an assignment, which is basically the same thing.
Far less verbose than defining classes by hand; plus, it allows everything about a single error to be co-located. And finally, it allows using 'related', 'label' and so on more easily.
While Gleam originally allowed various kinds of expressions to be discarded in a sequence, we simply do not allow expressions to be discarded implicitly. So any non-final expression in a sequence must be a let-binding. This prevents silly mistakes.
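For example, a sketch of a sequence that is now rejected (the function is made up):
```
fn validate() {
  1 + 1      // error: non-final expression, implicitly discarded
  let x = 2
  x
}
```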
- Display function's signature next to the function name
(instead of being repeated below the function documentation).
- Same for module constants
- Display record constructors in a more concise manner, with
constructors fields next to constructors.
- Display generic parameters, if any, next to the type
- Plus some minor color and icon rework.
Somehow missed it when reworking tuples. We need to allow the new
'NewLineLeftParen' token in this situation as well, especially because
this is what the formatter outputs.
Due to how PlutusData works, it doesn't make sense to allow
user-defined types to contain functions.
```
type Foo {
bar: fn(Int) -> Int
}
```
The above definition will now return an error.
This possibly breaks many Aiken programs out there, but it's for the
best. We haven't released the alpha yet, so we still have a bit of
freedom when it comes to breaking changes.
Plus, the migration path is easy, simply run:
```
find . -name "*.ak" | xargs sed -i "s/#(/(/g"
```
(or `-i ''` on MacOS).
This change allows using parentheses `(` `)` to encapsulate
expressions, in addition to the braces `{` `}` used to define blocks.
The main use-case is arithmetic and boolean expressions, for which
developers are used to parentheses. For example:
```
{ 14 + 42 } * 1337
```
can now be written as:
```
( 14 + 42 ) * 1337
```
This may sound straightforward at first, but it wasn't necessarily
trivial in Aiken given that (a) everything is an expression,
(b) whitespace does not generally matter and (c) there's no symbol
indicating the end of a 'statement' (because there are no statements).
Thus, we have to properly disambiguate between:
```
let foo = bar(14 + 42)
```
and
```
let foo = bar
(14 + 42)
```
Before this commit, the latter would be interpreted as a function call
and would lead to a somewhat puzzling error. Now, the newline serves
as a delimiting symbol. The trade-off being that, for a function call,
the left parenthesis has to be on the same line as the function name
identifier -- which is a fair trade-off. So this is still allowed:
```
let foo = bar(
14 + 42
)
```
As there's very little ambiguity about it.
This fixes #236 and would seemingly allow us to get rid of the leading
`#` in front of tuples.
* add unary op
* parse, typecheck, and code gen it
* express boolean not as unary op as well, previously called negate
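A quick sketch of both unary operators (values are made up):
```
let negative = -5    // arithmetic negation
let falsy = !True    // boolean not, previously 'negate'
```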
Co-authored-by: rvcas <x@rvcas.dev>
## Before
```
× Type-checking
╰─▶ Unknown module field 'ValidityRaneg' in module 'aiken/transaction'
```
## After
```
× Type-checking
╰─▶ Unknown import 'ValidityRaneg' from module 'aiken/transaction'
╭─[../stdlib/validators/tmp.ak:2:1]
2 │ use aiken/interval.{Interval, IntervalBound, IntervalBoundType}
3 │ use aiken/transaction.{ScriptContext, ValidityRaneg}
· ─────────────
4 │
╰────
help: Did you mean to import 'ValidityRange'?
```
## Before
```
× Checking
╰─▶ Unexpected labeled argument
t
╭─[/Users/mati/Devel/OpenSource/time_lock_aiken/validators/time_lock.ak:13:1]
13 │ let now = when context.transaction.validity_range.lower_bound.bound_type is {
14 │ Finite { t } -> t
· ─
15 │ NegativeInfinity -> 0
╰────
```
## After
```
× Type-checking
╰─▶ Unexpected labeled argument 't'
╭─[../stdlib/validators/tmp.ak:10:1]
10 │ let now = when context.transaction.validity_range.lower_bound.bound_type is {
11 │ interval.Finite { t } -> t
· ─
12 │ interval.NegativeInfinity -> 0
╰────
help: The constructor 'Finite' does not have any labeled field. Its fields
must therefore be matched only by position.
Perhaps, try the following:
╰─▶ interval.Finite(t)
```
## Before
```
× Checking
╰─▶ Unknown variable
Finite
╭─[../stdlib/validators/tmp.ak:10:1]
10 │ let now = when context.transaction.validity_range.lower_bound.bound_type is {
11 │ Finite { t } -> t
· ────────────
12 │ NegativeInfinity -> 0
╰────
```
## After
```
× Type-checking
╰─▶ Unknown data-type constructor 'Finite'
╭─[../stdlib/validators/tmp.ak:10:1]
10 │ let now = when context.transaction.validity_range.lower_bound.bound_type is {
11 │ Finite { t } -> t
· ────────────
12 │ NegativeInfinity -> 0
╰────
help: Did you forget to import it?
Data-type constructors are not automatically imported, even if their type is
imported. So, if a module `aiken/pet` defines the following type:
┍━ aiken/pet.ak ━━━━━━━━
│ pub type Pet {
│ Cat
│ Dog
│ }
You must import its constructors explicitly to use them, or prefix them
with the module's name.
┍━ foo.ak ━━━━━━━━
│ use aiken/pet.{Pet, Dog}
│
│ fn foo(pet : Pet) {
│ when pet is {
│ pet.Cat -> // ...
│ Dog -> // ...
│ }
│ }
```