The EVM and Bitcoin Script are equally expressive if speaking strictly of which pure functions they can compute. The important difference between the two programming environments is unrelated to Turing completeness, which has outlived its usefulness as a layman’s catch-all for “programmable”.
To start, the EVM is not Turing complete if there is a block gas limit. That is the very function of the gas limit: to bound computation time and guarantee that every algorithm terminates. It is even a principle of EVM contract design that functions should, whenever possible, be guaranteed to terminate within a constant worst-case amount of gas.
Conversely, Bitcoin Script can reasonably be called Turing complete if you treat transaction size as a cost resource metric like gas. In this formulation, it becomes clearer that truly arbitrary computation is not a feature of either environment.
So what about the environment really matters?
The EVM has global state and the ability for distinct objects to interact with each other atomically, without pre-transaction coordination.
You can fill-or-kill arbitrary interactions across distinct applications which are completely unaware of each other.
This is the key new financial primitive.
Bitcoin Script has no OP_BALANCE to atomically query an object outside of the output, let alone an atomic OP_CALL in a synchronous global runtime.
This is the concrete advantage the EVM has when it comes to implementing defi “gadgets”: A and B can synchronously and atomically call each other and cause native state changes in either direction, even if one is deployed with no knowledge of the other.
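To make the fill-or-kill property concrete, here is a minimal sketch in Python (with entirely hypothetical names; real EVM semantics are far richer) of a runtime where a transaction either commits every state change across the objects it touches, or reverts them all:

```python
# Toy sketch of fill-or-kill atomicity: a transaction either commits
# every state change it caused, or rolls back to the pre-transaction
# snapshot. Names (World, transact, swap) are illustrative only.

import copy

class World:
    """Global key-value state shared by all objects in the runtime."""
    def __init__(self):
        self.balances = {}

    def transact(self, fn, *args):
        """Run fn atomically: on any exception, roll back all changes."""
        snapshot = copy.deepcopy(self.balances)
        try:
            return fn(self, *args)
        except Exception:
            self.balances = snapshot  # kill: revert every side effect
            raise

def swap(world, a, b, amount):
    # a and b need no prior knowledge of each other; the runtime
    # lets them mutate shared state in one atomic step.
    world.balances[a] = world.balances.get(a, 0) - amount
    if world.balances[a] < 0:
        raise ValueError("insufficient balance")
    world.balances[b] = world.balances.get(b, 0) + amount

w = World()
w.balances = {"A": 5, "B": 0}
w.transact(swap, "A", "B", 3)       # fills: A=2, B=3
try:
    w.transact(swap, "A", "B", 10)  # kills: reverts, state unchanged
except ValueError:
    pass
```

Note how the second call mutates state before failing, yet the snapshot-and-restore discipline leaves no partial effect behind; that is the property Bitcoin Script’s purely local evaluation cannot offer across outputs.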
Bitcoin Script has no global state, and all secondary protocols are overlays that must be aware of each other to interact. Bitcoin Script is a local pure function of the data given with the transaction (except for a few block header variables like the timestamp, which take constant space and remain constant while evaluating a single block).
On the other hand, Bitcoin Script has an advantage that even its advocates rarely highlight, preferring instead to argue for the expressive power of the script itself, an argument that fails to persuade because it misses the point of EVM-likes.
What UTXO/Script advocates should be highlighting is that UTXO is maximally parallel by design. An algorithm exists to validate a UTXO block (and even the entire chain) in just O(log n) depth. UTXO prohibits cross-object interaction: the bitcoin “clears” completely from each of the inputs, which have no bearing on the form of the outputs. This allows UTXO to scale horizontally. Bitcoin Script is maximally sharded by default.
Evaluating a block in the UTXO model (evaluation and then merkleization) requires O(n log n) work, which can be spread over O(n) processors to a depth of just O(log n). This is massively parallel.
The key to achieving O(log n) depth is to utilize an algorithm that can evaluate any subsequence of transactions and return a partial result consisting not just of “unspent outputs” but also of “needed inputs”. The reader is encouraged to study the various parallel prefix sum algorithms for a hint.
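As a hint of what such a partial result looks like, here is a sketch (names and structure are my own, not a production validator) of a merge over (needed inputs, unspent outputs) summaries. Because the operator is associative, summaries of adjacent transaction runs can be combined in a balanced tree of O(log n) depth:

```python
# Sketch of an associative "partial validation" operator. A summary of
# any contiguous run of transactions is (needed, unspent): inputs the
# run consumes from outside itself, and outputs it creates that the
# run itself does not spend. (Double-spend detection is omitted.)

from functools import reduce

def tx_summary(tx):
    """A single transaction needs its inputs and creates its outputs."""
    inputs, outputs = tx
    return (frozenset(inputs), frozenset(outputs))

def merge(left, right):
    """Combine summaries of adjacent runs. Associative, so any
    balanced tree of merges gives the same answer."""
    l_need, l_unspent = left
    r_need, r_unspent = right
    needed = l_need | (r_need - l_unspent)      # still-unmatched inputs
    unspent = (l_unspent - r_need) | r_unspent  # surviving outputs
    return (needed, unspent)

txs = [(["coinbase0"], ["a", "b"]),
       (["a"], ["c"]),
       (["b", "c"], ["d"])]

summaries = [tx_summary(t) for t in txs]
# Sequential fold shown here; a real validator reduces in a balanced tree.
needed, unspent = reduce(merge, summaries)
# Associativity check: left and right groupings agree.
assert merge(merge(summaries[0], summaries[1]), summaries[2]) == \
       merge(summaries[0], merge(summaries[1], summaries[2]))
```

For this toy block, the whole-block summary ends up needing only the external input "coinbase0" and leaving "d" unspent, regardless of how the reduction tree is shaped.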
Evaluating an EVM block also requires O(n log n) work, but while the merkleization step, even for the state trie, can be spread over O(n) processors yielding O(log n) depth, the actual evaluation of transaction logic must be done in order and cannot in general be parallelized, forcing O(n) depth. This falls out of the very feature that EVM objects can call each other arbitrarily and manipulate state depending on the result.
In fact, sharding the EVM only speeds up validation by a factor proportional to the number of shards k. The critical path runtime is still O(n/k); sharding does not give a log-depth algorithm for validating an arbitrary sequence of transactions.
The O(log n) depth validation algorithm for a UTXO chain cannot be applied to EVM-like arbitrary state transitions in general. By the same token, there is no point to a UTXO system with a small block size: parallel validation at scale is its whole advantage.
But Bitcoin Script cannot capitalize on such a strategy to thread state through transactions, even if one were to exist for that application.
The process of validating logically dependent sequential transactions can be parallelized when the right associative binary operator can be defined.
Usually this requires engineering the right auxiliary data structure…
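As a generic illustration of that closing point (my own example, not Bitcoin-specific): affine state updates x → a·x + b compose associatively, so a sequential chain of such updates can be combined in any grouping, and hence in a parallel tree, even though each step logically depends on the previous one:

```python
# Sequentially dependent state updates become parallelizable once each
# step is summarized by an element of a monoid. Affine updates
# x -> a*x + b compose associatively, so a balanced reduction works.

from functools import reduce

def compose(f, g):
    """Apply (a1,b1) then (a2,b2): x -> a2*(a1*x + b1) + b2. Associative."""
    a1, b1 = f
    a2, b2 = g
    return (a2 * a1, a2 * b1 + b2)

steps = [(2, 1), (3, 0), (1, 5)]  # x -> 2x+1, then 3x, then x+5

# Any grouping agrees, so the steps can be combined in a tree of
# depth O(log n) instead of being applied strictly in order.
a, b = reduce(compose, steps)
x0 = 4
assert a * x0 + b == ((2 * x0 + 1) * 3) * 1 + 5
```

The engineering challenge the text alludes to is finding such a summary representation for the state transitions of interest; when one exists, a parallel prefix scan recovers every intermediate state at O(log n) depth.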