So I am playing around a bit with programming in my spare time, and I decided to try writing a program that breaks numbers down according to the rules of the Collatz conjecture.
Long story short:
if the number is even, divide it by two.
if the number is odd, multiply it by 3 and add 1.
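On its own, one step of that rule is tiny. Just as a minimal sketch (the function name collatzStep is mine, and I am using a fixed 64-bit type here only so the example stands alone):

    #include <cstdint>
    #include <iostream>

    // One Collatz step: halve an even number, map an odd n to 3n + 1.
    std::uint64_t collatzStep(std::uint64_t n) {
        return (n % 2 == 0) ? n / 2 : 3 * n + 1;
    }

    int main() {
        std::cout << collatzStep(6) << " " << collatzStep(3) << "\n"; // prints 3 10
        return 0;
    }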
It works for all smaller numbers, but it breaks once the starting value is big enough (5 000 000 000 triggers it, 1 000 000 000 doesn't; I am not sure what the smallest number that triggers the bug is).
When I enter a number that is too big, the program goes crazy: it spits out lots of numbers (some of them even look negative?) and it never terminates, it just gets stuck in an infinite loop.
I am really curious why it does that and where this behaviour comes from.
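I don't know yet whether this matters, but here is a small check I can run to see how wide the type I am using actually is, and what 5 000 000 000 turns into when it is stored in it (everything below is standard C++, nothing from my program):

    #include <iostream>
    #include <limits>

    int main() {
        // How wide is the type my program uses, and how large a value can it hold?
        std::cout << "sizeof(long unsigned int): " << sizeof(long unsigned int) << " bytes\n";
        std::cout << "largest value it can hold: "
                  << std::numeric_limits<long unsigned int>::max() << "\n";

        // Store the literal that triggers the bug and print what actually ends up in the variable.
        long unsigned int big = 5000000000ULL;
        std::cout << "5 000 000 000 stored in it: " << big << "\n";
        return 0;
    }

If that type turns out to be only 4 bytes on my setup (I believe that is the case with MSVC on Windows), the last line prints 705032704 rather than 5000000000, because the stored value is reduced modulo 2^32.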
Keep in mind this is just me being curious; I don't actually need this for my job or anything, I am doing it purely for fun, so if you are busy, please don't waste your time on it :).
#include <iostream>
#include <cstdlib> // for system()

using namespace std;

void collatz(long unsigned int value);
int main() {
    long unsigned int input;
    while (true) {
        cout << "Please enter a starting number(0 to terminate): ";
        cin >> input;
        if (input <= 0) {
            break;
        }
        collatz(input);
        cout << endl;
    }
    system("Pause");
    return 0;
}
void collatz(long unsigned int value) {
    int steps = 0;
    long unsigned int l1 = 0, l2 = 0, l3 = 0;
    while (true) {
        steps++;
        if (l1 == 4 && l2 == 2 && l3 == 1)
        {
            cout << "Steps: " << steps << endl;
            break;
        }
        if (value % 2 == 0)
        {
            l1 = l2;
            l2 = l3;
            l3 = value;
            value = value / 2;
            cout << value << " - ";
        }
        else
        {
            l1 = l2;
            l2 = l3;
            l3 = value;
            value = value * 3;
            value = value + 1;
            cout << value << " - ";
        }
    }
}
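For reference, a run with a small starting number looks like this for me (the trailing 4 and the step count of 10 come from the way the loop detects the 4, 2, 1 pattern):

    Please enter a starting number(0 to terminate): 6
    3 - 10 - 5 - 16 - 8 - 4 - 2 - 1 - 4 - Steps: 10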