Term Rewriting
- Abstract Rewrite Systems
- Completion
- Ground Rewriting
- Term Orderings
- Termination
- Implementation
- Narrowing
- AC
- Higher order rewriting
- Egraph
- String rewriting systems
- Graph Rewriting
- Systems
- Other Systems
- TRaaT Rust
- 2020 Term rewriting notes
- Misc
https://www.stephendiehl.com/posts/exotic02.html
Term Rewriting and All That - Baader and Nipkow. Term Rewriting Systems - Terese.
Abstract Rewrite Systems
https://en.wikipedia.org/wiki/Abstract_rewriting_system This is sort of a study of relations and transitivity, symmetry.
Confluence, Church-Rosser property.
If the reduction relation is a subset of a well-founded relation, it is terminating. This is more or less the definition of termination.
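To make this concrete, a tiny sketch (a toy relation of my own) of an ARS as a finite successor relation, with a brute-force normal form computation:
# A toy ARS as a successor relation; terminating because it is acyclic.
ars = {"a": {"b", "c"}, "b": {"d"}, "c": {"d"}, "d": set()}
def normal_forms(x):
    succs = ars[x]
    if not succs:
        return {x}
    return set().union(*(normal_forms(y) for y in succs))
print(normal_forms("a"))  # {'d'}: both reducts of "a" rejoin, so "a" has a unique normal form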
Completion
Knuth bendix
Naive completion
# Fig 7.1 in TRAAT: naive completion, with rewrites as tuples (lhs, rhs).
# Could use Maude to actually perform the reduction; here a naive replace loop.
def reduce(s, R):
    old = None
    while s != old:
        old = s
        for (lhs, rhs) in R:
            s = s.replace(lhs, rhs)
    return s

R = set()
Rnew = {("a", "b")}
while not Rnew.issubset(R):
    R = R.union(Rnew)
    Rnew = set()
    for (lhs, rhs) in R:
        lhs1 = reduce(lhs, R)
        rhs1 = reduce(rhs, R)
        if lhs1 == rhs1:
            continue  # trivial pair, drop it
        if lhs1 < rhs1:
            Rnew.add((rhs1, lhs1))  # orient the larger side to the smaller
        else:
            Rnew.add((lhs1, rhs1))
# Completion as inference rules on (E, R) pairs, TRAAT/Huet style.
def init_eqs(E):
    R = set()
    return (E, R)

def deduce(ER, u, s, t):
    (E, R) = ER
    # requires u -> s and u -> t, i.e. (s, t) is a critical pair
    return (E.union({(s, t)}), R)

def delete(ER, s):
    (E, R) = ER
    assert (s, s) in E
    return (E.difference({(s, s)}), R)

# remaining rules (orient, simplify, compose, collapse) not written out
Newman’s lemma: weak (local) confluence + termination => confluence. Termination matters: in the four-element system c <- a <-> b -> d, every peak is joinable but c and d are distinct normal forms of a.
Critical pairs for terms: one left-hand side unifies with a non-variable subterm of the other.
Similarity to Gröbner bases (Buchberger's algorithm), similarity to Gaussian elimination.
Knuth-Bendix asks a lot though. The more obvious approach is to heuristically apply rewrites to your actual problem rather than try to completely solve all possible problems up front. Why not do this for Gröbner bases? It sounds stupid for Gaussian elimination though. Maybe. Just sort of greedily try to write your query polynomial in terms of the ones you have? This might be the analog of some iterative scheme like Gauss-Jacobi.
Finite state machines are rewrite systems with each state being a single simple constant
https://github.com/bytekid/mkbtt - does Knuth-Bendix completion. There is a web interface. https://github.com/bytekid/maedmax ? http://cime.lri.fr/ CiME
unfailing completion
Ground Rewriting
A ground rewriting system is an egraph in a loose sense. Both are piles of ground equations. Completion is canonicalization.
# Terms as nested lists ["f", arg1, arg2, ...]; a constant is ["a"].
def size(t):
    return 1 + sum(map(size, t[1:]))

def term_order(t1, t2):
    # compare by size, then head symbol, then lexicographically on arguments
    s1 = size(t1)
    s2 = size(t2)
    if s1 < s2:
        return True
    elif s1 == s2:
        if t1[0] < t2[0]:
            return True
        elif t1[0] == t2[0]:
            for a, b in zip(t1[1:], t2[1:]):
                if a == b:
                    continue  # lexicographic: skip equal arguments
                return term_order(a, b)
    return False

def groundrewrite(lhs, rhs, t):
    # replace every occurrence of the ground term lhs by rhs
    if lhs == t:
        return rhs
    return [t[0]] + [groundrewrite(lhs, rhs, s) for s in t[1:]]
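A quick usage check of the above, with terms as nested lists:
t = ["f", ["a"], ["g", ["a"]]]
print(groundrewrite(["a"], ["b"], t))   # ['f', ['b'], ['g', ['b']]]
print(term_order(["a"], ["f", ["a"]]))  # True: smaller size comes first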
Hmm. Interesting. Term orderings on ground terms are a good thing to consider first. A bit less complicated than terms with variables.
def str_order(t, s):
    # shortlex: compare by length first, then lexicographically
    if len(t) < len(s):
        return True
    elif len(t) == len(s):
        return t < s
    else:
        return False
No, this is a bad ordering. It doesn’t obey the subterm property?
Term ordering mod E. Term ordering mod beta.
The egraph implements a fast, on-the-fly-defined term ordering. But we can probably define an easy-to-compute ahead-of-time ground ordering. Memoize sizes, fingerprints, etc.
As compared to string rewriting, I don’t want to (or don’t have to) consider non-complete overlap pairs? That’s what really makes this process terminating, perhaps.
Term Orderings
Things to Know when Implementing KBO. Things to Know when Implementing LPO. Rewrite ordering, reduction ordering.
Interpretations: f(x) -> a + x, where a counts occurrences of f. Basically f(f(f(x))) = 3a + x.
Simplification orderings, homomorphisms
KBO, RPO, LPO
Stable under substitution, stable under context, terminating.
Weighted path order (WPO)
KBO maps all variables to a number that is less than the weight of any actual symbol. Then upon substitution, the cost can only increase. First check that the variable counts are OK; then complicated tie-breaking. https://www.cs.miami.edu/home/geoff/Courses/TPTPSYS/FirstOrder/SyntacticRefinements.shtml
Empirical Properties of Term Orderings for Superposition - Schulz
Basically term size is a good idea for a termination ordering
But also, definitional systems are obviously terminating: systems that do not have recursive definitions. However, these unfolding systems clearly grow term sizes, so we need an intuition for them. The lexicographic order works; the ordering on symbols is the definition order.
No let rec:
let f x = x
let g x y = f (f (f (f (f (f (f x))))))
let h x = g (f x) (f x)
Any program in this system is obviously terminating. Non-recursive macro expansions.
But maybe you need some mix of “called on smaller arguments” + definition unpacking
Hmm. But weighted size is more or less KBO. And if we made the weight very high for the lower definitions, that might work too.
You can remove extra cases from some algorithmic description of an ordering and it stays a partial ordering, just a weaker one.
REST: Integrating Term Rewriting with Program Verification. So I think they are doing ordered rewriting, but they maintain an abstraction over which term order they are using. And then they have a search procedure that branches when they can’t make progress, to try and specialize the order.
KBO
- s > var_X if var_X occurs in s. Base case for variables.
- The number of occurrences of each variable must be equal or reduced.
- The weight must be reduced.
- To tie-break weight ties: a. check the ordering on head symbols, b. recurse into the subterms.
Weighting function
- the weight of all variables must be the same
- zero weights are fishy (KBO admits weight zero only for a single unary symbol that is maximal in the precedence)
Otherwise it’s the obvious thing: add up the weighted number of occurrences of symbols.
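A minimal Python sketch of just the weight and variable-count conditions (the necessary core of KBO; tie-breaking omitted, representation my own):
from collections import Counter
# terms as nested tuples ("f", arg, ...); variables as bare strings "X"
def is_var(t):
    return isinstance(t, str)
def weight(t, w, wvar=1):
    # wvar must be <= every symbol weight for KBO admissibility
    if is_var(t):
        return wvar
    return w[t[0]] + sum(weight(a, w, wvar) for a in t[1:])
def var_count(t):
    if is_var(t):
        return Counter([t])
    return sum((var_count(a) for a in t[1:]), Counter())
def kbo_weight_gt(s, t, w):
    # no variable may occur more often on the right, and weight must strictly drop
    vs, vt = var_count(s), var_count(t)
    return all(vs[x] >= n for x, n in vt.items()) and weight(s, w) > weight(t, w)
print(kbo_weight_gt(("f", ("f", "X")), ("f", "X"), {"f": 1}))  # True: f(f(X)) > f(X)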
Going ground is smart for understanding
#use "topfind";;
#require "ppx_jane sexplib";;
open Sexplib.Std
type term =
| Const of string
| App of term * term [@@deriving sexp]
(* Apply term * term ?? *)
let app (f : term) (args : term list) : term = List.fold_left (fun f x -> App (f,x)) f args
let rec size = function
| Const _ -> 1
| App (f,x) -> size f + size x
(* List.fold_left (fun acc a -> acc + size a) (size a) args *)
let rec ground_kbo t1 t2 =
  let s1 = size t1 in (* obviously silly to be recomputing this a bunch *)
  let s2 = size t2 in
  if Int.equal s1 s2 then
    match t1, t2 with
    | Const x1, Const x2 -> String.compare x1 x2 (* could compare weights first *)
    | Const _, App (_, _) -> -1
    | App (_, _), Const _ -> 1
    | App (f1, args1), App (f2, args2) ->
      let cf = ground_kbo f1 f2 in
      if Int.equal cf 0 then ground_kbo args1 args2 else cf
  else Int.compare s1 s2
let f = fun x -> App ((Const "f"), x)
let a = Const "a"
let () = print_int (ground_kbo (f (f a)) (f a)) (* 1: bigger term *)
let () = print_int (ground_kbo (f a) (f a))     (* 0: equal *)
let () = print_int (ground_kbo (app (Const "f") [a; a]) (f a)) (* 1: bigger *)
(*
type memo_term =
  | Const of string
  | Apply of {size : int; head : memo_term; args : memo_term array}
and data =
  {size : int; id : int; hash : int; feature : int array; sym : memo_term}
*)
A Unified Ordering for Termination Proving - WPO, the weighted path order. Also an interesting survey.
Dependency pairs, matrix interpretations, POLO.
Termination
AProVE is pretty dominant: C, LLVM, Haskell, Java, Prolog, TRS, some integer transition stuff.
https://aprove.informatik.rwth-aachen.de/download
AProVE http://aprove.informatik.rwth-aachen.de/index.asp?subform=home.html Aachen. https://github.com/aprove-developers/aprove-releases/releases releases.
Eclipse plugin. Huh.
echo "
[x,y]
plus(0,y) -> y
plus(s(x),y) -> s(plus(x,y))
" > /tmp/plus.tes
java -ea -jar ~/Downloads/aprove.jar -m wst /tmp/plus.tes
echo "
(VAR x y)
(RULES
plus(0,y) -> y
plus(s(x),y) -> s(plus(x,y))
)
" > /tmp/plus.trs
java -ea -jar ~/Downloads/aprove.jar -p plain -m wst /tmp/plus.trs
# help for prolog https://aprove.informatik.rwth-aachen.de/help_new/logic.html
echo "
%query: append(b,f,f)
append([],L,L).
append([X|Xs],Ys,[X|Zs]) :- append(Xs,Ys,Zs).
" > /tmp/append.pl
java -ea -jar ~/Downloads/aprove.jar -p plain -m wst /tmp/append.pl
#java -ea -cp ~/Downloads/aprove.jar aprove.CommandLineInterface.PrologFrontendMain
echo "
{-# htermination (mysum :: (List a) -> (List a)) #-}
import qualified Prelude
data MyBool = MyTrue | MyFalse
data List a = Cons a (List a) | Nil
mysum Nil = Nil
mysum (Cons x xs) = mysum xs
"
> /tmp/mysum.hs
java -ea -jar ~/Downloads/aprove.jar -m wst /tmp/mysum.hs
echo "
{-# htermination (foldr1 :: (a -> a -> a) -> (List a) -> a) #-}
import qualified Prelude
-- import List
data MyBool = MyTrue | MyFalse
data List a = Cons a (List a) | Nil
foldr1 :: (a -> a -> a) -> (List a) -> a
foldr1 f (Cons x Nil) = x
foldr1 f (Cons x xs) = f x (foldr1 f xs)
" > /tmp/fold.hs
java -ea -jar ~/Downloads/aprove.jar -m wst /tmp/fold.hs
Installed Yices 1 so it would stop complaining.
startTerm? -q QUERY is used to specify a query, e.g., to analyze arbitrary methods or to provide information about the method’s arguments. The flag -b
#java -ea -cp ~/Downloads/aprove.jar aprove.CommandLineInterface.HaskellFrontendMain
"
usage: java -cp aprove.jar aprove.CommandLineInterface.HaskellFrontendMain [OPTION] HSFILE
Haskell to QDP/Graph dump export from AProVE 2014.
-h, --help print this help
-o, --outputDir DIR directory in which TRSs will be dumped (default '.')
-t, --timeout SECONDS timeout, in seconds (default 60)
-p, --proof print proof for steps from input to TRSs
-g, --graph yes|no export to Graph (default no)
-q, --query QUERY a query which tells AProVE what to analyze
-s, --startTerm TERM analyze termination starting with term TERM
-j, --json yes|no export JSON (default no), conflicts with other output
"
TTT, Innsbruck http://eptcs.web.cse.unsw.edu.au/paper.cgi?ThEdu19.4
http://cl-informatik.uibk.ac.at/software/ttt2/
ttt2 /tmp/plus.trs
http://cl-informatik.uibk.ac.at/software/ceta/ ceta. Certified termination
http://zenon.dsic.upv.es/muterm/ muterm
ceta
#usage: ceta [(parameters) certificate | --version]
#
# A "certificate" is an XML file in the certification problem format (CPF 3.x).
# (manually setting a parameter overwrites information in CPF):
# --allow-assumptions Allow (axiomatic) assumptions in the certificate.
# --inputf fname Read input from separate file.
# --propertyf fname Read property from separate file.
# --property p Read property from string p.
# --answerf fname Read answer from separate file.
# --answer a Read answer from string a.
# --version Print the version number (+ mercurial id).
KoAT, LoAT, CoFloCo
Nontermination checking. Online termination checking.
https://link.springer.com/chapter/10.1007/978-3-642-32347-8_17 stop when you’re almost full. Adventures in constructive termination
https://github.com/TermCOMP/TPDB termination problem database
https://termination-portal.org/wiki/Termination_Portal Termcomp 2022 https://termination-portal.org/wiki/Termination_Competition_2022 https://termcomp.github.io/Y2023/ 2023 results
Dumping into a constraint solver. For a parametrized family, conceivably one can build a constraint problem over the parameters that requires that a rewrite system follow a term order. For an equational system, one could include the “or” of the two orientations as part of the constraints.
The dumbest version of this is a MIP over weights of symbols that uses a < inequality between the weights of the two sides. It’s the tie-breaking that is a real pain.
2021-02
Coq and termination. Accessibility relations, Sam’s paper. The Bove-Capretta method - a defunctionalization of the Acc method? Adding an accessibility parameter delays the requirement of proof to when it receives this parameter.
2020-12-07
Polynomial interpretations - each function symbol goes to some polynomial; variables stay variables. f(g(x,y)) -> yada yada. If you bound your coefficients to small integers, the nonlinearity of the coefficients from f and g isn’t per se a problem.
RPO: recursive path orderings.
String rewriting termination. The simpler model. abc -> cdf: assign each symbol a nonnegative integer and require a + b + c > c + d + f.
This one is actually an ILP.
This is polynomial interpretation where the concat/cons symbol has interpretation plus and each constant symbol has interpretation a number.
http://www.cs.tau.ac.il/~nachumd/papers/termination.pdf dershowitz termination review 1987
The obviously terminating stuff always decreases. This one doesn’t always decrease by symbol count, but clearly we lose 3 a's to make a c and only gain 2 a's from a c; we lose a net a every time we take a step: c -> aa, aaa -> c.
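Finding such weights is itself a small constraint problem. A minimal sketch, assuming the z3-solver Python package (an ILP solver would do just as well), on the two rules above:
from z3 import Ints, Solver, sat
rules = [("c", "aa"), ("aaa", "c")]
wa, wc = Ints("wa wc")
weight = {"a": wa, "c": wc}
s = Solver()
s.add(wa >= 1, wc >= 1)  # strictly positive symbol weights
for lhs, rhs in rules:
    # each rule must strictly decrease the total weight
    s.add(sum(weight[ch] for ch in lhs) > sum(weight[ch] for ch in rhs))
if s.check() == sat:
    print(s.model())  # e.g. wa = 2, wc = 5: then 5 > 4 and 6 > 5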
Strategy: build a string model of the finite-depth term rewriting system. See if it works. Build ground-term instantiations of the term rewrite system. Constrain somehow. Possibly try to encode the “good” thing in the objective function. Find a failure. Iterate. This is a CEGIS.
2020-03-09
A harmonic oscillator. Prove that it is stable. Build a Lyapunov function. Can do with SDP: find $latex V \succeq 0$ such that $latex \frac{d}{dt} x^T V x \le 0$. Or better yet $latex \frac{d}{dt} x^T V x \le -\epsilon$ or $latex \frac{d}{dt} x^T V x \le -\epsilon (x^T V x)$.
Also could consider breaking up the space into polyhedral regions (quadrants, for example) and finding a piecewise linear Lyapunov function.
Another interesting problem is to prove escape from a region. Prove that the oscillator eventually escapes x >= 1. No prob. We also get a bound on the time it takes. $latex V(x) = c x$, $latex \dot{V} = c \dot{x} = c A x \le -\epsilon$ for all x >= 1.
min eps, forall x. c A x - eps <= lam (x - 1) … this is not possible. lam needs to become a positive polynomial. No, it is possible if lam = cA xhat and eps <= lam.
An interesting discrete analog of this system would be a swapper. x0 >= 1 implies x0’ = x1, x1’ = x2, x2’ = x3, x3’ = x0-1. The minus 1 gives us a small decay term.
For a string rewriting system, the simplest thing is to just look at some kind of count of the symbols, maybe with some weighting. It may be that you are always decreasing. If plain counts don’t work, you can try 2-grams or other features/combos. This feels something like Sherali-Adams.
For term rewriting, we could try to ignore the parse structure and use counts as in string rewriting above. Polynomial interpretation wants to interpret an applied term as a polynomial. This seems like it would create difficult nonlinear constraints for both verification and synthesis, although if you constrain each polynomial to bounded integers, it may make sense to pound them into a SAT solver. If each interpretation is only linear, then the composition is still linear for verification.
Integer and polynomial programs. This means something rather different from integer programming. The idea is that all variables are integers, but you still have a notion of time. Guarded transitions. It seems likely I could compile this into an integer program. Reify inequality conditions.
CEGAR loop. Can run the program to get a bunch of traces. Use the traces and find a decreasing function on all of them.
https://link.springer.com/chapter/10.1007/978-3-540-72788-0_33 SAT solving for termination checking. It does appear they slam nonlinear arithmetic problems into SAT solvers.
Dependency pairs? What are these? People seem to think they are very important.
The control community has similar questions and similar yet different approaches. They often want continuous state.
https://github.com/blegat/SwitchOnSafety.jl
https://github.com/blegat/HybridSystems.jl
SpaceEx. Platzer’s stuff. JuliaReach
Barrier functions. I think the idea is that you put a function that diverges at the region you’re worried about. If you can guarantee that this function does not diverge, you never reach the region.
Sum of Squares certificates. Describing sets as sublevel sets.
The S-procedure. I think this is that you can relax inequalities using your assumptions: to show $latex f(x) \ge 0 \implies g(x) \ge 0$, it suffices to find $latex \lambda \ge 0$ with $latex g(x) \ge \lambda f(x)$. Sometimes it is necessary too, I think.
Hybrid Systems. Piecewise affine systems.
Reachability. We want to figure out how to rewrite one equation into another. Building an A*-style lower-bound distance could be quite helpful: a lower bound on the cost to get to some position, in terms of a control objective function S(x,x’), V(x,x’). In a small finite graph this could be computed. But suppose we didn’t have enough flexibility: finite graph, linear function of the features.
Ok. Some things. My concern about nonlinearity for multiply composed functions: 1. you might interpret some things differently (as a fixed polynomial interpretation). Makes sense for constructors and data structures, where we have some reasonable suggestions. Looking at HEADs seems to matter a lot / give important over-approximations of the behavior of the system. Integer transition systems: we can pattern match on fixed integers, f(x, 1) = yada, f(x,y) -> f(y+1,x). This we can do with guards: f(x) | x > 7 -> g(x + 7). Take f ~ 1 + a x + b x^2 + … and so for g. Then f(x) >= g(x) + lam * guard is the Lyapunov condition. We want strict inequality; that is the point of the integer nature of the system. f(x) | x < 7 -> f(x**2 - ) |
sum(n, acc) -> sum(n - 1, acc+n)
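For instance, with the guard n > 0 (my assumption; unguarded this doesn't terminate over the integers), V(n, acc) = n is a ranking function for this rule. A sketch checking it with z3 by asking for a counterexample:
from z3 import Ints, Solver, And, Not, unsat
n, acc = Ints("n acc")
V = lambda n, acc: n  # candidate ranking function
s = Solver()
# a counterexample: a guarded step where V is negative or fails to decrease
s.add(n > 0, Not(And(V(n, acc) >= 0, V(n, acc) > V(n - 1, acc + n))))
print("ranking function valid:", s.check() == unsat)  # True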
Implementation
Terms for Efficient Proof Checking and Parsing: two different types for terms that are expected to live different amounts of time, differing only in the Rust pointer type; short-lived terms have a borrowed reference to long-lived terms. Kontroli.
Narrowing
https://maude.lcc.uma.es/manual271/maude-manualch16.html Narrowing is like Prolog resolution. Instead of rewriting ground terms, we can rewrite terms with variables: nondeterministically we unify against a rule and thereby “narrow” (instantiate) the variables. A minimal one-step sketch is below.
Functional logic programming in Maude.
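The sketch (representation and names my own: terms as nested tuples, variables as bare strings; occurs check omitted):
def is_var(t):
    return isinstance(t, str)
def walk(t, s):
    while is_var(t) and t in s:
        t = s[t]
    return t
def unify(a, b, s):
    a, b = walk(a, s), walk(b, s)
    if a == b:
        return s
    if is_var(a):
        return {**s, a: b}
    if is_var(b):
        return {**s, b: a}
    if a[0] != b[0] or len(a) != len(b):
        return None
    for x, y in zip(a[1:], b[1:]):
        s = unify(x, y, s)
        if s is None:
            return None
    return s
def subst(t, s):
    t = walk(t, s)
    if is_var(t):
        return t
    return (t[0],) + tuple(subst(a, s) for a in t[1:])
# narrowing plus(x, 0) with the rule plus(s(X), Y) -> s(plus(X, Y))
# instantiates x to s(X) and steps to s(plus(X, 0))
rule = (("plus", ("s", "X"), "Y"), ("s", ("plus", "X", "Y")))
s = unify(("plus", "x", ("0",)), rule[0], {})
print(subst(rule[1], s))  # ('s', ('plus', 'X', ('0',)))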
AC
- flat (associative) and orderless (commutative) patterns in Mathematica
- Associative-Commutative Rewriting on Large Terms - Eker. AC slides. Interesting: they use terms in indexing to try and prune the explosion faster, fingerprints / etc. They need AC if they want to represent Maps equationally in Maude with the right asymptotic properties. S. M. Eker, Associative-commutative matching via bipartite graph matching. Oh snap, some AC patterns can be fast.
- Non-linear Associative-Commutative Many-to-One Pattern Matching with Sequence Variables
- Compilation of Pattern Matching with Associative-Commutative Functions
- matchpy docs
- Commutative unification
- Equational Unification and Matching, and Symbolic Reachability Analysis in Maude 3.2 (System Description)
- associative commutative rules Symbolics.jl
- Variadic Equational Matching in Associative and Commutative Theories - comparing with Mathematica’s closed-source solution. The systems-of-equations perspective, as in unification, is an interesting angle: matching as the solution of a system of equations.
- associative unification in Maude
Even if you have a confluent terminating system, E-matching (in this other sense?) requires search. Interesting. So that’s probably what the variant stuff is about. If you do narrowing on the patterns (only the interesting non-critical-pair overlaps), eventually the pattern is so instantiated it can’t possibly match the term (?). That could help. This is back to the idea of treating AC as a macro on patterns. This is limiting in many ways. But maybe limiting is what you want. Pre-resolution on Datalog rules.
Higher order rewriting
Egraph
See
- egraphs
- egglog
A form of nondestructive term rewriting
String rewriting systems
Semi-Thue systems. Word problem. Monoid presentations.
Converting to a term rewriting system: fff -> f(f(f(X)))
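A tiny sketch of that embedding, with tuple terms and a bare-string variable (my own representation):
def string_to_term(s, var="X"):
    t = var
    for c in reversed(s):
        t = (c, t)
    return t
print(string_to_term("fff"))  # ('f', ('f', ('f', 'X')))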
Term Rewriting using linux command line tools
The string search and manipulation tools are very powerful and efficient. They compile queries, like grep does, into simple machines and such, I think.
There’s a big difference between ground and non-ground terms. They appear subtly different when latexed, but they are way different beasts. Ground-term equation proving can be done efficiently through the egraph.
Ground term rewriting seems to be identical to string rewriting: just lay out a traversal of the term serially.
The implicit prefix and suffix are the context of the term
rules = [
    ("aa", "b"),
    ("b", "c")
]
def run_rules(s, rules):
    old_s = None
    while s != old_s:
        old_s = s
        for lhs, rhs in rules:
            s = s.replace(lhs, rhs)
    return s
print(run_rules("ababaaaaccaaa", rules))
def naive_completion(rules):
    old_rules = None
    while rules != old_rules:
        old_rules = rules.copy()
        for (lhs, rhs) in old_rules:
            a = run_rules(lhs, rules)
            b = run_rules(rhs, rules)
            if a == b:
                continue  # joinable: nothing to add
            # orient by shortlex: length first, then lex
            if (len(a), a) < (len(b), b):
                rules.add((b, a))
            else:
                rules.add((a, b))
    return rules
# An incomplete reduction routine?
# Triangular rewrite rules in some sense.
# Is this right at all? This is like a chunk of Huet's. I think just moving R -> E might be ok even if not one of the listed rules. No, if I could do that, this move + simplify would give me a more powerful R-simplify.
# I might be weakening my rules. That's not so bad imo.
def reduce_rules(E):
    # worklist style
    R = set()
    # Sort E so the smallest pairs come last: most reduction power.
    E = sorted(E, key=lambda k: len(k[0]), reverse=True)
    while len(E) > 0:
        (a, b) = E.pop()
        a = run_rules(a, R)  # simplify
        b = run_rules(b, R)
        print(a, b)
        if a == b:  # delete
            continue
        if (len(b), b) > (len(a), a):  # length, then lex ordering
            R.add((b, a))
        if (len(a), a) > (len(b), b):
            R.add((a, b))
    return R
rules = {
    ("aaa", "a"),
    ("aaa", "c")
}
print(naive_completion(rules))
rules = [
    ("ffa", "a"),
    ("fa", "a")
]
print(run_rules("ffffffffffffa", rules))
print(reduce_rules(rules))
rules = [
    ("12+", "21+"),    # an application of comm
    ("23+1+", "2+31+") # an application of assoc
]
# I am really going to want a notion of indirection or compression here.
# Intern strings
class RPN():
    def __init__(self, s):
        self.s = str(s)
    def __add__(self, b):
        return RPN(self.s + b.s + "+")
    def __repr__(self):
        return self.s

b0 = RPN(0)
b1 = RPN(1)
b2 = RPN(2)
b3 = RPN(3)
E = [
    (b1 + b0, b0 + b1),
    (b0 + b1, b1),
    (b1 + b2, b3),
    (b2, b1 + b1)
]
E = [
    ("10+", "01+"),
    ("01+", "1"),
    ("12+", "3"),
    ("2", "11+"),
    ("00+", "0"),
    ("00+1+1+1+", "3"),
]
print(reduce_rules(E))
E = reduce_rules(E)
print(run_rules("00+1+0+1+0+2+",E))
E = [( str(i) + str(j) + "+" , str(i + j)) for i in range(4) for j in range(10) if i + j <= 9]
print(E)
print(reduce_rules(E))
print(run_rules("00+1+0+1+0+2+",E))
Ropes
We can of course compile a rule set to do better than this. In some sense every string represents a possibly infinite class of strings, obtainable by running the rules in reverse.
String rewriting systems are a bit easier to think about, and it is easier to find good stock library functionality for them.
String rewriting is unary term rewriting. A variable string pattern “aaaXbbbb” is a curious object from that perspective: while simple, it is a higher-order pattern, a(a(a(X(b(b(b(Y))))))). You can also finitize a bit: foo(a) can be made an atomic character. Or you can partially normalize a term rewriting system to string rewriting form.
String orderings: lexicographic comparison; length; shortlex - first length, then lex; symbol counts.
def critical_pairs(a, b):
    assert len(a) > 0 and len(b) > 0
    if len(b) <= len(a):
        a, b = b, a  # make a the shorter string
    cps = []
    if b.find(a) != -1:  # b contains a
        cps.append(b)
    for n in range(1, len(a)):  # check if an overlapping concatenation is possible
        if b[-n:] == a[:n]:
            cps.append(b + a[n:])
        if b[:n] == a[-n:]:
            cps.append(a + b[n:])
    return cps
print(critical_pairs("aba", "ac"))
print(critical_pairs("aba", "ca"))
print(critical_pairs("abab", "ba"))
'''
def reduce_rules(rules):  # a very simple reduction: reduce rhs, remove duplicate rules
    rules1 = set()
    for (lhs, rhs) in rules:
        rhs = run_rules(rhs, rules)
        rules1.add((lhs, rhs))
    return rules1
'''
#run_rules
#reduce_rules
Building a suffix tree might be a good way to find critical pairs.
Lempel-Ziv / LZW is the analog of hash consing? Some kind of string compression is. That’s fun.
http://haskellformaths.blogspot.com/2010/05/string-rewriting-and-knuth-bendix.html
It seems like named-pattern string rewriting and variable term rewriting might be close enough.
You could imagine ((x + 0) + 0) + 0 laid out as + + + x 0 0 0, and the found ground rewrite rule + x 0 -> x being applied iteratively.
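Reusing run_rules from above, this works out (the layout and rule strings are my own encoding):
print(run_rules("+++x000", [("+x0", "x")]))  # x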
Labelling shifts: f(g(a), b) -> f +0 g +1 a -1 b -0
the pattern
f(?x) -> ?x
becoming
f +0
And if we number from the bottom?
f +2
and
(S) -> S
(K) -> K
(I) -> I
IK -> K
IS -> S
II -> I
... all concrete
Sxyz = xz(yz)
Building a Turing machine out of a string rewrite system: have special characters represent the state, and have the patterns include the surrounding context. Enumerate all pairs of state and tape characters; a runnable toy version follows the example rules below.
aSb -> acS
aSb -> Scb
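A minimal sketch reusing run_rules from above: a one-state machine that walks right overwriting a with b, with head marker S and end marker $ (my own conventions):
tm_rules = [
    ("Sa", "bS"),  # in state S reading a: write b, move right
    ("S$", "$"),   # fall off the right end: halt by erasing the head
]
print(run_rules("Saaa$", tm_rules))  # bbb$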
Computational group theory is a thing.
Finite categories, which have both finitely many morphisms and finitely many objects, are approachable. It is clear that most questions one might ask about a finite category are approachable by brute force, or maybe by encoding to graph problems or a SAT problem.
Finitely presented categories are the next step up the chain in complexity. In this case we take a free category generated by some morphisms, plus some equations identifying certain composition patterns. It is less clear whether natural questions are decidable or not.
What questions do we care about:
- Are two morphism expressions equal?
- Produce a morphism from object A to B
- Confirm some mapping is a Functor
- Confirm some functor mapping is a natural transformation.
It is my belief that some questions about these can be attacked with rewriting techniques.
The next level of category one might want to talk about is one for which you have guaranteed constructions, such as cartesian, closed, dagger, monoidal, etc. I'm not sure which of these qualifiers are compatible with being finite. [https://arxiv.org/abs/0908.3347](https://arxiv.org/abs/0908.3347) To me, this feels analogous to being able to work with terms rather than just strings.
# Misc
[Gershom Bazerman on "Homological Computations for Term Rewriting Systems"](https://www.youtube.com/watch?v=WdawrT-6Qzk&ab_channel=PapersWeLove)
[rewriting modulo SMT and open system analysis](https://shemesh.larc.nasa.gov/fm/papers/wrla2014-draft.pdf)
[tools in term rewriting for education](https://arxiv.org/pdf/2002.12554.pdf)
<http://www.cs.tau.ac.il/~nachum/papers/klop.pdf> - Klop review on term Rewriting. Cody's RG pick
[A Taste of Rewrite Systems](https://homepage.divms.uiowa.edu/~fleck/181content/taste-fixed.pdf)
<http://cl-informatik.uibk.ac.at/teaching/ws20/trs/content.php> term rewriting course, Middeldorp
[code generation via higher order rewrite systems](https://link.springer.com/chapter/10.1007/978-3-642-12251-4_9)