"Using matrix multiplication's associativity and non-commutativity properties provides a natural definition of a cryptographic hash / digest / summary of an ordered list of elements. Due to the non-commutativity property, lists that only differ in element order result in a different summary. Due to the associativity property, arbitrarily divided adjacent sub-lists can be summarized independently and combined to quickly find the summary of their concatenation. This definition provides exactly the properties needed to define a list, and does not impose any unnecessary structure that could cause two equivalent lists to produce different summaries. The name *Merklist* is intended to be reminicent of other hash-based data structures like [Merkle Tree](https://en.wikipedia.org/wiki/Merkle_tree) and [Merklix Tree](https://www.deadalnix.me/2016/09/24/introducing-merklix-tree-as-an-unordered-merkle-tree-on-steroid/)."
"This definition of a hash of a list of elements is pretty simple:\n",
"\n",
"* A **list element** is an arbitrary buffer of bytes. Any length, any content. Just bytes.\n",
"* A **list**, then, is a sequence of such elements.\n",
"* The **hash of a list element** is the cryptographic hash of its bytes, formatted into a square matrix with byte elements. (More details later.)\n",
"* The **hash of a list** is reduction by matrix multiplication of the hashes of all the list elements in the same order as they appear in the list.\n",
"* The **hash of a list with 0 elements** is the identity matrix.\n",
"\n",
"This construction has a couple notable concequences:\n",
"\n",
"* The hash of a list with only one item is just the hash of the item itself.\n",
"* You can calculate the hash of any list concatenated with itself by matrix multiplication of the the hash with itself. This works for single elements as well as arbitrarily long lists.\n",
"* A list can have multiple copies of the same list item, and swapping them does not affect the list hash. Consider how swapping the first two elements in `[1, 1, 2]` doesn't change it.\n",
"* Concatenating two lists is accomplished by matrix multiplication of their hashes, in the correct order.\n",
"* Appending or prepending lists of 0 elements yields the same hash, as expected.\n",
"\n",
"Lets explore this definition in more detail with a simple implementation in python+numpy."
"The function `hash_m/1` takes a buffer of bytes as its first argument, and returns the sha512 hash of the bytes formatted as an 8×8 2-d array of 8-bit unsigned integers with wrapping overflow. **This is the hash of a list element consisting of those bytes.** Based on a shallow wikipedia dive, someone familiar with linear algebra might say it's a [matrix ring](https://en.wikipedia.org/wiki/Matrix_ring), $R_{256}^{8×8}$. Not coincidentally, sha512 outputs 512 bits = 64 bytes = 8 * 8 array of bytes, how convenient. (In fact, that might even be the primary reason why I chose sha512!)"
"Ok so we've got our element hashes, how do we combine them to construct the hash of a list? We defined the hash of the list to be reduction by matrix multiplication of the hash of each element:"
"* [Associativity](#Associativity) - Associativity enables you to reduce a computation using any partitioning because all partitionings yield the same result. Addition is associative $(1+2)+3 = 1+(2+3)$, subtraction is not $(5-3)-2\\neq5-(3-2)$. ([Associative property](https://en.wikipedia.org/wiki/Associative_property))\n",
"* [Non-Commutativity](#Non-Commutativity) - Commutativity allows you to swap elements without affecting the result. Addition is commutative $1+2 = 2+1$, but division is not $1\\div2 \\neq2\\div1$. And neither is matrix multiplication. ([Commutative property](https://en.wikipedia.org/wiki/Commutative_property))\n",
"Upon consideration, these are the exact properties that one would want in order to define the hash of a list of items. Non-commutativity enables the order of elements in the list to be well-defined, since swapping different elements produces a different hash. Associativity enables caching the summary of an arbitrary sublist; I expect that doing this heirarchally on a huge list enables an algorithm to calculate the hash of any sublist at the cost of `O(log(N))` time and space.\n",
"\n",
"Lets sanity-check that these properties can hold for the construction described above."
"This appears to me to be a reasonable way to define the hash of a list. The mathematical definition of a list aligns very nicely with the properties offered by matrix multiplication. But is it appropriate to use for the same things that a Merkle Tree would be? The big questions are related to the valuable properties of hash functions:\n",
"\n",
"* Given a merklist summary or sublist summaries of it, can you derive the hashes of elements or their order? (Elements themselves are protected by the preimage resistance of the underlying hash function.)\n",
" * If yes, when is that a problem?\n",
"* Given a merklist summary but not the elements, is it possible to produce a different list of elements that hash to the same summary? (~preimage resistance)\n",
"* Is it possible to predictably alter the merklist summary by concatenating it with some other sublist of real elements?\n",
"* Are there other desirable security properties that would be valuable for a list hash?\n",
"* Is there a better choice of hash function as a primitive than sha512?\n",
"* Is there a better choice of reduction function that still retains associativity+non-commutativity than simple matmul?\n",
"* Is there a more appropriate size than an 8x8 matrix / 64 bytes to represent merklist summaries?\n",
"\n",
"Matrixes are well-studied objects, perhaps such information is already known. If *you* know something about deriving the preimage of the multiplication of a [matrix ring](https://en.wikipedia.org/wiki/Matrix_ring), $R_{256}^{8×8}$, I would be very interested to know."
"***If** this construction has the appropriate security properties*, it seems to be a better merkle tree in all respects. Any use of a merkle tree could be replaced with this, and it could enable use-cases where merkle trees aren't useful. Some examples of what I think might be possible:\n",
"* Using a Merklist with a sublist summary tree structure enables creating a $O(1)$-sized 'Merklist Proof' that can verify the addition and subtraction of any number of elements at any single point in the list using only $O(log(N))$ time and $O(log(N))$ static space. As a bonus the proof generator and verifier can have totally different tree structures and can still communicate the proof successfully.\n",
"* Using a Merklist summary tree you can create a consistent hash of any ordered key-value store (like a btree) that can be maintained incrementally inline with regular node updates, e.g. as part of a [LSM-tree](https://en.wikipedia.org/wiki/Log-structured_merge-tree). This could facilitate verification and sync between database replicas.\n",
"* The sublist summary tree structure can be as dense or sparse as desired. You could summarize down to pairs of elements akin to a merkle tree, but you could also summarize a compressed sublist of hundreds or even millions of elements with a single hash. Of course, calculating or verifying a proof of changes to the middle of that sublist would require rehashing the whole sublist, but this turns it from a fixed structure into a tuneable parameter.\n",
"* If all possible elements had an easily calculatable inverse, that would enable \"subtracting\" an element by inserting its inverse in front of it. That would basically extend the group from a ring into a field, and might have interesting implications.\n",
" * For example you could define a cryptographically-secure rolling hash where advancing either end can be calculated in `O(1)` time.\n",