Analysis of the paper 'Equidistant Letter Sequences in the Book of Genesis' by Doron Witztum, Eliyahu Rips, and Yoav Rosenberg

It is the observation of the authors that certain arbitrarily chosen, significant and related words can be found in the books of Moses, via a particular set of simple decryption algorithms, at a frequency higher than expected. Furthermore, to validate this finding, the same process was applied to the novel 'War and Peace' and did not yield a frequency higher than the statistical average.

I imagine the implicit premise of this study is that 1) the books of Moses have a cosmic quality not found in general text, and/or 2) the existence of a simple decryption scheme is proof of a divine scheme.

I do not doubt that there is a divine scheme, nor that this study may form a portion of a proof of it. However, this particular proof is both incomplete and flawed.

The test used to show proof of meaning is the proximity of the names of personalities to dates related to them (in English this would be impractical data to relate, since names and dates generally use separate character sets). The test is analogous to finding two related words, such as 'hammer' and 'anvil', in close proximity, which merely amounts to the appearance of one of several character strings, i.e.:

'hammeranvil', 'hammer*anvil', 'hammer**anvil', 'hammer***anvil', 'hammer****anvil' etc. where '*' represents any character.
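As a rough sketch of this equivalence (the sample text, the window of up to four intervening wildcard characters, and the use of Python here are my own choices for illustration):

    import re

    # Two 'related' words in close proximity amount to one of several fixed
    # strings: 'hammeranvil', 'hammer*anvil', 'hammer**anvil', ... A bounded
    # wildcard covers them all in a single pattern.
    pattern = re.compile(r"hammer.{0,4}anvil")

    text = "theblacksmithshammerxxanvilrangallday"
    match = pattern.search(text)
    print(match.group() if match else "no match")   # -> 'hammerxxanvil'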

The test did not incorporate letter frequency. A validity test of the entire project, incorporating letter frequency and using simpler arithmetic, could be constructed with the following formula:

P = (p1 * p2 * … * pL) * (int(T/D) - (L - 1))

where P = the expected number of occurrences of the word (approximately its probability when small), p1, p2, etc. = the letter frequencies of the characters in the word, T = the number of characters in the text to be decrypted, D = the distance between characters chosen for the decryption scheme, and L = the length of the sought word.

Example: sought word: 'oz'; text: 'The quick brown fox jumped over the lazy dog' (spaces removed, so T = 36); D = 3; L = 2; p1 = 4/36 (frequency of 'o'); p2 = 1/36 (frequency of 'z').

P = (4/36 * 1/36) * (int(36/3) - (2 - 1)) = (4/1296) * 11 = 0.03395

Decrypted string: 'eibwouevtlyg'. 'oz' is not found here even once, in line with the low expected count of 0.034.
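A minimal sketch of this arithmetic, assuming the text is lowercased with spaces removed (the variable names are my own):

    # Expected occurrences of the sought word in one decrypted string.
    text = "The quick brown fox jumped over the lazy dog".replace(" ", "").lower()
    word = "oz"
    T, L, D = len(text), 2, 3

    p1 = text.count("o") / T            # 4/36
    p2 = text.count("z") / T            # 1/36
    P = (p1 * p2) * (T // D - (L - 1))
    print(round(P, 5))                  # -> 0.03395

    # The decryption itself: every D-th character, starting at the D-th.
    decrypted = text[D - 1::D]
    print(decrypted)                    # -> 'eibwouevtlyg'
    print(word in decrypted)            # -> False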

The three decryption schemes that will find 'oz' are Start: 23, Distance: 9; Start: 15, Distance: 17; and Start: 11, Distance: 21, out of 630 possible (Start, Distance) schemes: Distance 1 allows 35 Starts, Distance 2 allows 34, Distance 3 allows 33, and so on, for a total of 35 + 34 + … + 1 = 630.

To avoid 630 separate calculations, one might be tempted to reuse the figure of 0.03395 per scheme: times 630, that would predict about 21 occurrences of 'oz'. But that figure already counts the 11 positions within a single decrypted string; the probability of 'oz' at one specific (Start, Distance) placement is just p1 * p2 = 4/1296 ≈ 0.0031, and 0.0031 times 630 gives about 1.9 expected occurrences. The 3 actual occurrences are therefore in line with chance.
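In fact the 630 calculations are cheap enough to do outright; a brute-force sketch (again with my own variable names) confirms both the scheme count and the three hits:

    text = "The quick brown fox jumped over the lazy dog".replace(" ", "").lower()
    word = "oz"
    T, L = len(text), len(word)

    hits, schemes = [], 0
    for dist in range(1, T):                             # Distance 1..35
        for start in range(1, T - dist * (L - 1) + 1):   # 1-based Start
            schemes += 1
            if all(text[start - 1 + i * dist] == word[i] for i in range(L)):
                hits.append((start, dist))

    print(schemes)                # -> 630
    print(hits)                   # -> [(23, 9), (15, 17), (11, 21)]

    # Expected occurrences across all 630 placements:
    p = (text.count("o") / T) * (text.count("z") / T)
    print(round(p * schemes, 2))  # -> 1.94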

It should be pointed out that for any body of text and any search string, an unlimited number of decryption schemes can be fabricated to maximize occurrences of the search string. For instance, a cipher can be constructed which, when parsing the books of Moses, yields several instances of 'Betty Boop rules supreme.' while not even mentioning any saints.
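To make this concrete, here is a sketch of one such fabricated 'cipher': simply record one index into the text per character of the desired message (the pangram stands in for a real corpus, and the index-list construction is my own illustration, not anything from the paper):

    def fabricate_scheme(text, message):
        # One index into `text` per character of `message`: a 'decryption
        # scheme' guaranteed to reveal the message, letters reused freely.
        # Raises ValueError if a needed letter is absent from the text.
        return [text.index(ch) for ch in message]

    text = "The quick brown fox jumped over the lazy dog".replace(" ", "").lower()
    scheme = fabricate_scheme(text, "bettyboop")
    print(scheme)                             # -> [8, 2, 0, 0, 32, 8, 10, 10, 19]
    print("".join(text[i] for i in scheme))   # -> 'bettyboop'

Any message whose letters occur anywhere in the text can be 'found' this way, which is why a sufficiently liberal family of schemes proves nothing.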

Equidistant letter sequences, though simple and popular, are but a minute portion of the possible decryption schemes. The most common decryption algorithm is the letter-substitution cipher. For every decryption scheme that yields above-average results, there should be one that produces below-average results.
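For contrast, a letter-substitution decode is just a fixed one-to-one mapping over the alphabet; a sketch, using an arbitrary Caesar shift of 3 as the mapping (my choice for illustration, not anything from the paper):

    import string

    shift = 3
    lower = string.ascii_lowercase
    encode = str.maketrans(lower, lower[shift:] + lower[:shift])
    decode = str.maketrans(lower[shift:] + lower[:shift], lower)

    cipher = "hammer and anvil".translate(encode)
    print(cipher)                    # -> 'kdpphu dqg dqylo'
    print(cipher.translate(decode))  # -> 'hammer and anvil'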

To validate the implied premise of the authors would require either 1) that divine inspiration specified that this particular decryption scheme would yield good results (and if you believe that, you do not need to perform the test anyhow), or 2) that the application of all decryption schemes yielded a high average of results. The second test is of course impossible, since there can be infinitely many decoding strategies. It would be promising if, as one progressed through ranges of tests, a high average were maintained; but then again, if one rolled dice long enough, a surprisingly long run of snake eyes would not be unlikely, perhaps even from the onset. And if one extends the premise to say that these ongoing decoding results were meant to be a proof of a divine plan, then it might also be fair to say that the high-yielding tests occurring first are part of that plan, and who could say whether further testing would decrease the average or not.

If the premise is that the results are meant to be found, then at least all of the simple algorithms should be applied, to see whether this yields a high average of successes.

 
