Some 20 years ago I was chatting with a friend of mine (Andrew Lipson) and he said something like the following:
(This isn't an exact quote, but it was something like this.) Consider the sequence $k(n)$ with $k(0)=3,$ $k(1)=0,$ $k(2)=2,$ and $k(n)=k(n-2)+k(n-3)$ thereafter. Then $n$ divides $k(n)$ if and only if $n$ is prime.
So I started on that. I figured Andrew had already done the basic work of looking for a small counter-example, but I also figured that I would need to get warmed up and start to get a feel for the problem.
Before that, I need to put this into some historical context. This was when, if you wanted to do arithmetic on more than 32-bit numbers, you had to write your own arbitrary-precision arithmetic package. It was before the WWW as we know it, although not pre-internet per se. So some of what follows will seem unnecessary. If you think that, I invite you to find what the first 1000 counter-examples have in common (all except a small number of them), and to tell me which ones do not follow the pattern.
And so ...
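Here's a minimal Python sketch (just an illustration; the primality check is naive trial division) that generates the first four columns of the table below:

```python
def is_prime(n):
    # Naive trial division, plenty fast for the small n in the table.
    if n < 2:
        return False
    d = 2
    while d * d <= n:
        if n % d == 0:
            return False
        d += 1
    return True

def k(n):
    # k(0)=3, k(1)=0, k(2)=2, and k(n) = k(n-2) + k(n-3) thereafter.
    a, b, c = 3, 0, 2
    for _ in range(n):
        a, b, c = b, c, a + b
    return a

for n in range(2, 26):
    print(n, k(n), is_prime(n), k(n) % n == 0)
```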
  n   k(n)   Prime   Divides   Confirmed
  2      2   True    True      Yes
  3      3   True    True      Yes
  4      2   False   False     Yes
  5      5   True    True      Yes
  6      5   False   False     Yes
  7      7   True    True      Yes
  8     10   False   False     Yes
  9     12   False   False     Yes
 10     17   False   False     Yes
 11     22   True    True      Yes
 12     29   False   False     Yes
 13     39   True    True      Yes
 14     51   False   False     Yes
 15     68   False   False     Yes
 16     90   False   False     Yes
 17    119   True    True      Yes
 18    158   False   False     Yes
 19    209   True    True      Yes
 20    277   False   False     Yes
 21    367   False   False     Yes
 22    486   False   False     Yes
 23    644   True    True      Yes
 24    853   False   False     Yes
 25   1130   False   False     Yes

and so on ...

OK, so working it by hand we can see that it's true up to 25, but we can also see that k(n) is starting to grow quite rapidly. Some messing about by hand shows that it's going to be exponential, just as the Fibonacci sequence is exponential. Not a surprise, really, since this is very Fibonacci-like, with each term defined by adding two previous terms.

We can work out the ratio between successive terms and see that it starts to converge quite rapidly to about 1.3247... In fact we can show that:
$$k(n)=\alpha^n+\beta^n+\gamma^n$$
where $\alpha,$ $\beta,$ and $\gamma$ are solutions to the equation:
$$x^3=x+1$$
(Usually we need constants in front of the powers of the roots, but in this case the constants all turn out to be 1).
No? Can't see that? Well, the definition is:
$$k(n)=a\cdot k(n-1)+b\cdot k(n-2)+c\cdot k(n-3)$$

where $a=0,$ $b=1,$ and $c=1.$ Then the characteristic equation is $x^3=a\cdot x^2+b\cdot x+c,$ which with these values is just $x^3=x+1.$
Anyway ...
The roots of that cubic are, approximately:

$$\alpha\approx 1.324718,\qquad \beta,\gamma\approx -0.662359\pm 0.562280\,i$$

The point to notice is that $\alpha$ is real and bigger than 1, while $\beta$ and $\gamma$ are a complex conjugate pair of modulus less than 1, so for large $n$ the value of $k(n)$ is dominated by $\alpha^n.$
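As a quick numerical sanity check (a sketch relying on numpy's root-finder, an arbitrary choice; any polynomial solver would do), we can confirm that the sum of the $n$-th powers of these roots really does reproduce the sequence:

```python
import numpy as np

# Roots of x^3 = x + 1, i.e. of x^3 + 0*x^2 - x - 1 = 0.
roots = np.roots([1, 0, -1, -1])

def k_closed(n):
    # alpha^n + beta^n + gamma^n; the imaginary parts cancel.
    return sum(r**n for r in roots).real

def k_recurrence(n):
    a, b, c = 3, 0, 2              # k(0), k(1), k(2)
    for _ in range(n):
        a, b, c = b, c, a + b      # k(m) = k(m-2) + k(m-3)
    return a

for n in range(26):
    assert round(k_closed(n)) == k_recurrence(n)
print("closed form agrees with the recurrence for n = 0..25")
```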
Now the number of digits in a number m (base 10) is roughly log(m) (base 10), so that means the number of digits in k(n) is roughly log(k(n)), which is roughly $\log(\alpha^n),$ which in turn is $n\cdot\log(\alpha).$
Now $\log(\alpha)$ is about 0.122..., so by the time n is 100, k(n) will have 12 digits (actually 13, because we should round up), and it will only get worse.
So we can't go far with this by hand. But that's OK, because what we want to know isn't k(n) itself, just whether n divides k(n).

If we compute k(n) modulo n then n divides k(n) exactly when the answer is 0, and because we're working modulo n the numbers in the intermediate calculation stay small. The only problem is that after each n we have to start all over again from the beginning of the sequence for the next one. So the work for a single n grows linearly with n, and the total time taken grows quadratically, and that will kill us fairly quickly - we won't get that far.

So here we have some Python code to compute k(n) mod n for as long as allowed by some time limit, and then print how far we got. We're not actually doing any of the searching for a counter-example in this code, since we're not checking whether the numbers are predicted to be prime, nor whether they actually are prime. This is just showing that it's going to be a slow process if we use this method.

```python
#!/usr/bin/python
from time import clock

# For each time limit (1, 4, 16, 64, 256 seconds), see how far we get.
for limit_power in [0, 1, 2, 3, 4]:
    n = 2
    limit = 4**limit_power
    t_start = clock()
    while clock()-t_start < limit:
        a, b, c = 3, 0, 2              # k(0), k(1), k(2)
        for i in xrange(n):
            a, b, c = b, c, (a+b) % n  # advance the sequence, working mod n
        n += 1
    print limit, n, float(n)/2**limit_power
```
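A practical aside: the script above is Python 2 of its era (xrange, print statements, and time.clock, which has since been removed from Python). For anyone wanting to repeat the timing experiment today, a roughly equivalent Python 3 sketch might be:

```python
#!/usr/bin/env python3
from time import perf_counter

# Same experiment: for each time limit, see how far we get.
for limit_power in [0, 1, 2, 3, 4]:
    n = 2
    limit = 4 ** limit_power
    t_start = perf_counter()
    while perf_counter() - t_start < limit:
        a, b, c = 3, 0, 2              # k(0), k(1), k(2)
        for _ in range(n):
            a, b, c = b, c, (a + b) % n
        n += 1
    print(limit, n, n / 2 ** limit_power)
```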
The code multiplies the time limit by 4 each time, and we can see that each time we get twice as far. That means that to get twice as far takes four times as long, so, as expected, the time taken grows quadratically. The timings on my machine are shown below.

  Secs       n   n/sqrt(t)
     1    3133      3133.0
     4    6229      3114.5
    16   12562      3140.5
    64   25179      3147.4
   256   50642      3165.1

As you can see, it takes about 4 minutes to get to 50 thousand, so that's not getting much done. Since the last column is roughly constant, we can derive a formula for how far we get in a given time:

$$n\;\approx\;3150\sqrt{t}$$

So here we have the code to start the testing process. We have a simple test for primality, first checking against small primes, then using Fermat's Little Theorem with bases 2, 3, and 5, and finally performing trial division up to the square root of the number being tested.

```python
#!/usr/bin/python
from time import clock
from sys import argv

def is_prime_q(n):
    # Quick answers for small n.
    if n in [2, 3, 5, 7]: return True
    if n < 11           : return False
    # Cheap rejections: divisibility and Fermat tests to bases 2, 3, 5.
    if n % 2 == 0       : return False
    if pow(2,n-1,n) != 1: return False
    if n % 3 == 0       : return False
    if pow(3,n-1,n) != 1: return False
    if n % 5 == 0       : return False
    if pow(5,n-1,n) != 1: return False
    # Anything that survives gets trial division up to sqrt(n).
    d = 7
    while d*d <= n:
        if (n % d) == 0: return False
        d += 2
    return True

def is_perrin_q(n):
    # Does n divide k(n)?  Work modulo n so the numbers stay small.
    if n in [2, 3]: return True
    a0, a1, a2 = 3, 0, 2              # k(0), k(1), k(2)
    i = n
    while i > 0:
        a0, a1, a2 = a1, a2, (a0+a1) % n
        i -= 1
    return a0 == 0                    # a0 is now k(n) mod n

# ======================================

n = 2
t = 1
while clock() < int(argv[1]):         # run for the given number of seconds
    is_prime  = is_prime_q(n)
    is_perrin = is_perrin_q(n)
    if is_prime != is_perrin:         # mismatch => counter-example
        print n, is_prime, is_perrin
    if t < clock():                   # progress report once a second
        print 'Tested to %d in %d secs' % (n,t)
        t += 1
    n += 1
```
Then we have the routine to test the Perrin condition. We start with the first few terms in the Perrin sequence, then advance up the sequence, working modulo n to avoid having the numbers get stupidly large. In the end, we return True if the relevant term in the sequence is zero, indicating that n divides k(n).
The main loop starts at n=2 and then tests to see if the number is prime, and whether it passes the Perrin test. If these answers don't match, then the number is printed, and is a counter-example.
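To run it, save the script under some name (say perrin_search.py, a name chosen here purely for illustration) and give the time budget in seconds on the command line, for example "python perrin_search.py 240" to run for about four minutes. It prints a progress line each second, plus a line for any counter-example it encounters.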
Specific measurements give:
$$n\;\approx\;4575\sqrt{t}$$
This is all very rough and ready, and will fall apart as we go significantly further, but it's enough to get an idea of what's going on. We can see that getting to a million will take about 13 hours, so we really, really want to speed this up.
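To see where the 13 hours comes from, just turn the formula around:

$$t\;\approx\;\left(\frac{n}{4575}\right)^2\;=\;\left(\frac{10^6}{4575}\right)^2\;\approx\;4.8\times10^4\text{ seconds},$$

which is a little over 13 hours.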
So what do we do? How can we get further, faster?
In Part 2 we'll look at algorithmic improvements.
Comments

I've decided no longer to include comments directly via the Disqus (or any other) system. Instead, I'd be more than delighted to get emails from people who wish to make comments or engage in discussion. Comments will then be integrated into the page as and when they are appropriate.
If the number of emails/comments gets too large to handle then I might return to a semi-automated system. That's looking increasingly unlikely.