9^2 + 9^2 + 9^2 = 243, so eventually, wherever you start, you end up with a number less than 243 (and 243 --> 4+16+9 = 29, so it gets even smaller). What would be the last number that produces a number larger than itself? Ok, simple stuff: if you start with one, you get one, and that's dull, so let's go on. If you begin with two you produce the sequence 2--> 4--> 16--> 37--> 58--> 89--> 145--> 42--> 20--> 4, which then repeats the cycle of eight numbers forever. Now the big conjecture... start with ANY integer (oooohhhh, he said you could pick ANY integer, how bold) and eventually, either it gets to one, or it jumps into this chain. Now I wonder: is there a way to prove (short of hacking them all out, which I have done) that there is no "other" chain that some numbers might drop into?
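The check is easy to hack out by machine. Here's a quick sketch (mine, not from the post) that iterates the sum of squared digits until it hits 1 or re-enters the eight-number loop; `CYCLE` and the function names are my own labels:

```python
CYCLE = {4, 16, 37, 58, 89, 145, 42, 20}  # the eight-number loop starting from 2

def step(n):
    """Sum of the squares of the decimal digits of n."""
    return sum(int(d) ** 2 for d in str(n))

def absorbing_state(n):
    """Follow the orbit of n; return 1 or 4 depending on which state absorbs it."""
    while n != 1 and n not in CYCLE:
        n = step(n)
    return 1 if n == 1 else 4

# Every number up to 1000 lands in one of the two absorbing states:
assert {absorbing_state(n) for n in range(1, 1001)} == {1, 4}
```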

John Cook did a computer search of all the numbers less than 1000 and they all went to one of these two absorbing states.

I also have a feeling that as the numbers go off to infinity, the proportion of numbers that go off to one has some non-zero limit; in fact, I suspect it might be around 1/7 or just a tiny bit more, but don't have a clue how to prove that. Any takers? All conjectural rationale will be considered.
I think the same kind of questions could be asked about forming the sum of the cube of the digits... would all roads lead back to one or two then?
A quick answer trying only a few numbers... NO. Follow the orbits of 2, 3, or 7 and they each go to a separate self-replicating number (a one-cycle). 4 goes into a triplet: 55 --> 250 --> 133 --> 55. So I guess my new question about cubics is: how many of these finite cycles are there?
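That cycle hunt can be automated too. A sketch (all names mine) that follows each orbit until a value repeats, then reports the cycle rotated to start at its smallest member so identical cycles compare equal:

```python
def cube_step(n):
    """Sum of the cubes of the decimal digits of n."""
    return sum(int(d) ** 3 for d in str(n))

def find_cycle(n):
    """Iterate cube_step from n until a value repeats; return that cycle,
    rotated to begin at its minimum element."""
    seen = []
    while n not in seen:
        seen.append(n)
        n = cube_step(n)
    cyc = seen[seen.index(n):]
    i = cyc.index(min(cyc))
    return tuple(cyc[i:] + cyc[:i])

# Collect every distinct cycle reached from seeds below 10,000; the
# one-cycles 1, 153, 370, 371, 407 and a few short loops show up.
cycles = {find_cycle(n) for n in range(1, 10000)}
print(sorted(cycles, key=len))
```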

## 8 comments:

Pat, this is great! It reminds me of one the kids at my son's freeschool loved: start with any number, if even cut in half, if odd triple and add 1.

It has a name, which I've forgotten. Ok, looked it up. Googling 4-2-1 gave me mostly stuff on China's one-child policy, but also this. The numbers are called hailstone numbers, which I knew, and the unsolved problem of whether you always get to 4-2-1 is called the Collatz Problem (knew it was unsolved, didn't know that name). I think the kids will like your problem too.

Sue, from my Mathwords page, http://www.pballew.net/arithm15.html#syracuse

the following on "Collatz Problem": aka the Syracuse Algorithm

The Syracuse algorithm is an iteration problem that deals with the following algorithm:

If a number n is odd, then f(n) = 3n + 1;

if n is even, then f(n) = n/2.

Each answer then becomes the new value to input into the function. The problem, or should I say problems, revolve around what happens to the sequence of outcomes when we keep putting the answer back into the function. For example, if we begin with 15 we get the following sequence, also called the orbit of the number:

15, 46, 23, 70, 35, 106, 53, 160, 80, 40, 20, 10, 5, 16, 8, 4, 2, 1...

One of the unproven conjectures is that for any number n, the sequence will always end in the number 1. This has been shown to be true for all numbers up to just beyond 10^16. A second interesting question is how long it takes for a number to return to the value of 1. For the example above, the number 15 took 17 steps to get back to the unit value. Other questions ask which three-digit (or n-digit) number has the longest orbit. There are many variations of the problem, but if you are interested in a good introduction, check this link from Simon Fraser University:

http://www.cecm.sfu.ca/organics/papers/lagarias/paper/html/node2.html
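The quoted algorithm is only a few lines in code. A minimal sketch (the function name `collatz_orbit` is mine) that reproduces the orbit of 15 above:

```python
def collatz_orbit(n):
    """Orbit of n under the Syracuse map (3n+1 if odd, n/2 if even), stopping at 1."""
    orbit = [n]
    while n != 1:
        n = 3 * n + 1 if n % 2 else n // 2
        orbit.append(n)
    return orbit

orbit = collatz_orbit(15)
print(orbit)             # 15, 46, 23, 70, 35, ... , 4, 2, 1
print(len(orbit) - 1)    # 17 steps, as in the example above
```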

The Syracuse Algorithm is often also called Collatz's Problem, Hasse's problem, Thwaite's problem, and Ulam's problem, after people who have worked and written on the problem. It is unclear where the problem originated, as it seems to have had a long history of being passed by word of mouth before it was ever written down. It is often attributed to Lothar Collatz from the University of Hamburg, who wrote about the problem as early as 1932. The name "Syracuse Problem" was applied after H. Hasse, an associate of Collatz, visited and discussed the problem at Syracuse University in the 1950s. During the 1960s Stan Ulam circulated the problem at Los Alamos laboratory. One famous quote about the problem is from Paul Erdős, who stated, "mathematics is not yet ready for such problems".

Cool (I like the powers of two sequence too: 4, 2, 1).

This loosely reminds me of Lychrel numbers (after all, I've had nothing but palindromes on my mind lately). See http://en.wikipedia.org/wiki/Lychrel_number.

(P.S. There's a typo in your link to me.)

Doctor _base2, I tried the link and it went to your site... tell me what I have wrong and I'll change it... you have a great blog and more of my students should see it..

pat

Thanks for the compliment.

The link works, you just spelled it "Expoloring Binary"

ahhh..fixed...thanks

This turns out to be less interesting in binary... squares, cubes, even higher powers...

In trinary, 11 --> 2 --> 11 shows up quickly, or else you get stuck at 1. We can also get stuck at 12, which I think is a nice twist. But those seem to be the three endings.

Cubes in trinary include 1-cycles for at least 1 and 122, and one longer cycle: 2 --> 22 --> 121 --> 101 --> 2. Three endings again.
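Since the digit-power step makes sense in any base, one small helper checks these trinary claims (a sketch of mine; the function name and keyword arguments are my own, and values are held in decimal with only the digits read in base 3):

```python
def digit_power_step(n, base=3, power=2):
    """Sum of `power`-th powers of the base-`base` digits of n."""
    total = 0
    while n:
        n, d = divmod(n, base)
        total += d ** power
    return total

# Squares in trinary: 11 (decimal 4) --> 2 --> 11 --> ...
assert digit_power_step(4, base=3, power=2) == 2
assert digit_power_step(2, base=3, power=2) == 4
# ...and 12 (decimal 5) is stuck: 1^2 + 2^2 = 5
assert digit_power_step(5, base=3, power=2) == 5

# Cubes in trinary: 2 --> 22 --> 121 --> 101 --> 2 (decimal: 2, 8, 16, 10)
assert digit_power_step(2, base=3, power=3) == 8
assert digit_power_step(8, base=3, power=3) == 16
```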

I have no rationale as yet, but my computer search seems to tell me that the proportion is more like 21% than 1/7.

Whatever the result, there should certainly be a limit, because a "really big" number has a sum of squares that is much smaller than itself and effectively "random": "most" big numbers have plenty of 1s in them, so we can easily add 3 to or subtract 1 from the sum of squares by changing a digit here or there, which is what establishes the "randomness".

So we're "randomly" picking a much smaller number, so whatever the density among the smaller numbers, it'll be the same density forever.

But hmmm, on further numeric investigation this seems to be a very slowly decreasing percentage of "bad" ones that don't eventually hit 4. By the time I reach 100 million, it seems to be more like 20% bad instead of 21%.

It's like there's some kind of slow oscillation superimposed on this behavior.

Hm, it's 20.0% at around 100 million, and by the time we get to 10 billion it's around 19.8%. So maybe there's no oscillation after all, just a slow decrease. I think I've about reached the limit of what can be done with straight stupid brute force. Next I guess I'd have to look at all the possible results for an n-digit number and see what kind of formulas might come out...
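The brute-force count in these comments might look like the sketch below (names mine; the real searches above went to 10 billion, the sketch stays small). "Bad" here follows the thread's usage: numbers whose orbit reaches 1 rather than the cycle through 4.

```python
def reaches_one(n):
    """True when the sum-of-squared-digits orbit of n hits 1 rather than 4
    (4 lies on the eight-number cycle, so every non-1 orbit passes through it)."""
    while n != 1 and n != 4:
        n = sum(int(d) ** 2 for d in str(n))
    return n == 1

N = 10**5  # small stand-in for the 10**10 search described above
proportion = sum(reaches_one(n) for n in range(1, N + 1)) / N
print(f"proportion reaching 1 up to {N}: {proportion:.3f}")
```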

Post a Comment