Dale Weiler | a934e0fe4b | Happy new year! | 2014-01-01 06:24:16 -05:00
Wolfgang Bumiller | b1016c7f48 | fold_binary now used instead of ast_binary_new, which calls fold_superfluous | 2013-10-25 13:40:31 +02:00
Dale Weiler | 8699053887 | Fix handling of intrinsic folding; this closes #118. | 2013-10-17 00:14:42 -04:00
Dale Weiler | 50d165e173 | Some intrinsic code cleanup. The args check is handled in the parser anyway. A generated array is kept alongside to avoid generating the same intrinsic multiple times, instead of using static storage. Various other cleanups as well. | 2013-10-16 00:04:39 -04:00
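
A minimal sketch of the idea behind tracking already-generated intrinsics in an array kept with the compiler context rather than in static storage; the names and data structures below are illustrative assumptions, not gmqcc's actual code:

    #include <string.h>
    #include <stddef.h>

    /* Records which intrinsics have already been emitted for this compilation,
     * kept alongside the intrinsic table instead of per-function statics. */
    typedef struct intrin_ctx {
        const char *names[64];   /* intrinsics already generated */
        size_t      count;
    } intrin_ctx;

    static int intrin_generated(const intrin_ctx *ctx, const char *name) {
        size_t i;
        for (i = 0; i != ctx->count; ++i)
            if (!strcmp(ctx->names[i], name))
                return 1;
        return 0;
    }

    static void intrin_mark(intrin_ctx *ctx, const char *name) {
        if (ctx->count < sizeof(ctx->names) / sizeof(ctx->names[0]))
            ctx->names[ctx->count++] = name;
    }
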
Dale Weiler | cc69370575 | Another peephole optimization which removes superfluous expressions such as (A + 0), (A - 0), (A * 1) and (A / 1). | 2013-10-04 06:46:54 -04:00
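
A self-contained sketch of this kind of peephole on a toy expression tree; the types and names here are illustrative assumptions, not gmqcc's ast_* structures or its fold_superfluous implementation:

    #include <stdio.h>

    typedef enum { EX_CONST, EX_VAR, EX_BINARY } ex_kind;

    typedef struct expr {
        ex_kind      kind;
        float        value;   /* EX_CONST: the literal's value        */
        char         op;      /* EX_BINARY: one of '+', '-', '*', '/' */
        struct expr *lhs;
        struct expr *rhs;
    } expr;

    static int is_const(const expr *e, float v) {
        return e && e->kind == EX_CONST && e->value == v;
    }

    /* Drops the no-op half of the expression and returns the surviving node. */
    static expr *fold_superfluous(expr *e) {
        if (!e || e->kind != EX_BINARY)
            return e;
        e->lhs = fold_superfluous(e->lhs);
        e->rhs = fold_superfluous(e->rhs);
        if ((e->op == '+' || e->op == '-') && is_const(e->rhs, 0.0f))
            return e->lhs;                       /* (A + 0), (A - 0) -> A */
        if ((e->op == '*' || e->op == '/') && is_const(e->rhs, 1.0f))
            return e->lhs;                       /* (A * 1), (A / 1) -> A */
        return e;
    }

    int main(void) {
        expr a    = { EX_VAR,    0.0f, 0,   NULL, NULL };
        expr zero = { EX_CONST,  0.0f, 0,   NULL, NULL };
        expr add  = { EX_BINARY, 0.0f, '+', &a, &zero };   /* A + 0 */
        printf("%s\n", fold_superfluous(&add) == &a ? "folded to A" : "unchanged");
        return 0;
    }
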
Dale Weiler | 15b0555546 | Implement constant folding on ternary operations via fold_cond. | 2013-09-26 06:51:49 -04:00
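
A rough sketch of the folding rule for a ternary whose condition is a compile-time constant; the node type and the NULL-means-unfoldable convention are assumptions for illustration, not gmqcc's real fold_cond interface:

    #include <stddef.h>

    /* A toy AST node: either a known compile-time constant or something opaque. */
    typedef struct node {
        int   is_const;
        float value;     /* meaningful only when is_const is set */
    } node;

    /* Returns the branch the constant condition selects, or NULL to signal that
     * the ternary cannot be folded and must be generated as usual. */
    static node *fold_cond(const node *cond, node *on_true, node *on_false) {
        if (!cond->is_const)
            return NULL;
        return cond->value != 0.0f ? on_true : on_false;
    }
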
Dale Weiler | a50635bcd7 | Intrinsic folding cleanups (and improvements). | 2013-08-30 07:12:16 -04:00
Dale Weiler | 3b4a5667ea | Constant fold intrinsics if their arguments are constant. TODO: reference count intrinsics so that they're not generated unless they're used; currently, when an intrinsic can be folded away, it's still marked for generation and makes it into the final output binary even though it isn't used. | 2013-08-28 12:46:22 -04:00
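
A hypothetical example of the rule this commit describes, folding a pow-style intrinsic when every argument is a known constant; the names and signature are illustrative assumptions, not gmqcc's intrinsic machinery:

    #include <math.h>

    typedef struct arg {
        int   is_const;
        float value;     /* meaningful only when is_const is set */
    } arg;

    /* Returns 1 and writes the folded result when both arguments are constants;
     * returns 0 so the caller falls back to emitting the intrinsic call. */
    static int fold_intrinsic_pow(const arg *base, const arg *exponent, float *out) {
        if (!base->is_const || !exponent->is_const)
            return 0;
        *out = powf(base->value, exponent->value);
        return 1;
    }
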
Dale Weiler | 73d9aa29c4 | Made intrinsics separate from the parser. | 2013-08-14 06:02:15 +00:00
Dale Weiler | 10b75fd8b9 | Move const-branch-elision into fold.c | 2013-07-31 19:34:38 +00:00
Dale Weiler | d0ee56f25f | more fixes | 2013-07-31 16:31:45 +00:00
Dale Weiler | 920dbaf1e0 | Work in progress constant-folding rewrite. | 2013-07-31 09:04:19 +00:00