HiveBrain v1.2.0

Error correcting permutation code

Submitted by: @import:stackexchange-cs

Problem

Let's say you have $n$ symbols. You can encode a $\log_2(n!)$-bit message by permuting the symbols. I will call this a permutation code (if you have seen this concept before, I would love to see a reference).
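For concreteness, one standard way to realize such a permutation code is the factorial number system (Lehmer code). A minimal sketch, assuming the symbols are the integers $0, \dots, n-1$ and the message is an integer below $n!$ (function names are mine):

```python
from math import factorial

def message_to_permutation(m, n):
    """Encode integer m (0 <= m < n!) as a permutation of range(n)
    using the factorial number system (Lehmer code)."""
    symbols = list(range(n))
    perm = []
    for i in range(n - 1, -1, -1):
        # The i-th factorial digit of m selects the next symbol.
        idx, m = divmod(m, factorial(i))
        perm.append(symbols.pop(idx))
    return perm

def permutation_to_message(perm):
    """Inverse mapping: recover the integer from the permutation."""
    n = len(perm)
    symbols = sorted(perm)
    m = 0
    for i, p in enumerate(perm):
        idx = symbols.index(p)
        m += idx * factorial(n - 1 - i)
        symbols.pop(idx)
    return m
```

For $n = 3$ this maps the messages $0, \dots, 5$ bijectively onto the six permutations of $\{0, 1, 2\}$.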

Let's say we are encoding a message with $k < \log_2(n!)$ bits. It is possible to add error correction. One way is simply to apply any old error correction scheme to the bit string before applying the permutation code. I'm wondering whether there is a way that is optimal for permutation codes.
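A sketch of that layered approach, assuming a simple repetition code stands in for the generic error correction scheme and a Lehmer-code mapping for the permutation step (all function names are mine; note the mismatch the question is pointing at, since a single adjacent transposition of symbols can flip many of the protected bits at once):

```python
from math import factorial

def lehmer_encode(m, n):
    """Map integer m (0 <= m < n!) to a permutation of range(n)."""
    symbols = list(range(n))
    perm = []
    for i in range(n - 1, -1, -1):
        idx, m = divmod(m, factorial(i))
        perm.append(symbols.pop(idx))
    return perm

def layered_encode(bits, n, rep=3):
    """Layered scheme: repeat each message bit rep times, read the
    protected bit string as an integer, then permutation-encode it.
    Requires rep * len(bits) <= log2(n!)."""
    protected = [b for b in bits for _ in range(rep)]
    m = int("".join(map(str, protected)), 2)
    assert m < factorial(n), "protected message does not fit in n! codewords"
    return lehmer_encode(m, n)
```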

In particular, is there some way of doing permutation coding that is resistant to transpositions (in terms of error detection and correction)? I want something that will tolerate up to $t$ transpositions of adjacent symbols.
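The minimum number of adjacent transpositions needed to turn one ordering into another can be computed as an inversion count. A quadratic-time sketch (function name is mine):

```python
def kendall_tau(p, q):
    """Minimum number of adjacent-symbol transpositions turning
    permutation p into permutation q (both must be orderings of the
    same symbols). Computed as the inversion count of q re-expressed
    in p's coordinates; O(n^2) for clarity, O(n log n) is possible."""
    pos = {s: i for i, s in enumerate(p)}
    r = [pos[s] for s in q]
    n = len(r)
    return sum(1 for i in range(n) for j in range(i + 1, n) if r[i] > r[j])
```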

Solution

The first reference seems to be "Rank permutation group codes based on Kendall's correlation statistic" by Chadwick and Kurz; your notion of distance is known as Kendall's tau distance. A more modern reference is "Codes in Permutations and Error Correction for Rank Modulation" by Alexander Barg and Arya Mazumdar.
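As with Hamming-distance codes, a set of permutations whose pairwise Kendall tau distance is at least $2t + 1$ can correct up to $t$ adjacent transpositions via nearest-codeword decoding. A brute-force sketch for tiny $n$ (the greedy construction and all names are mine, not taken from either paper):

```python
from itertools import permutations

def kendall_tau(p, q):
    """Kendall tau distance between two orderings of the same symbols."""
    pos = {s: i for i, s in enumerate(p)}
    r = [pos[s] for s in q]
    n = len(r)
    return sum(1 for i in range(n) for j in range(i + 1, n) if r[i] > r[j])

def greedy_code(n, d):
    """Toy brute-force construction (tiny n only): keep permutations of
    range(n) whose pairwise Kendall tau distance is at least d, giving a
    code that corrects t = (d - 1) // 2 adjacent transpositions."""
    code = []
    for p in permutations(range(n)):
        p = list(p)
        if all(kendall_tau(p, c) >= d for c in code):
            code.append(p)
    return code

def decode(received, code):
    """Minimum-Kendall-tau-distance decoding."""
    return min(code, key=lambda c: kendall_tau(received, c))
```

For example, with $n = 4$ and $d = 3$ the identity codeword survives one adjacent transposition: the received word is at distance 1 from it and, by the triangle inequality, at distance at least 2 from every other codeword.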

Context

StackExchange Computer Science Q#63119, answer score: 2
