
Is anything Random?

Matski

SO hot right now
Aug 8, 2001
1,737
0
0
Computers can produce strings that will appear very random, such as GUIDs - globally unique identifiers (alpha-numeric strings). The system is not Robbo-Random approved, but the possible combinations number 2^122, or 5,316,911,983,139,663,491,615,228,241,121,378,304 (I looked that up) - far more, in fact, than the estimated number of stars in the observable universe. It's a frickin' beast of a number if you take it in.

Because of the way in which GUIDs are calculated, you would likely need to produce this many before reaching a point of duplication - even if all the computers in the world were firing them out at the same time (which many will be right now). Software such as Microsoft's .NET Framework can produce one of these combinations at any point, and this can then be used in a selection process. So even an understanding of the system would be of no use to us, because our brains are too small to deal with the numbers, and we will not have the computing power for thousands of years to accurately determine which GUID will be calculated. The numbers and the variables are (quite deliberately) just too vast. It's not totally random, because variables go into a pot and a string comes out, but understanding how the system works will only leave you knowing how many combinations there are, based on those variables, and what they might be... one of many (for practical purposes anyway). It's still not truly random, but if you can guess a GUID here and now, you're a god. It also means that computers can produce things that are incredibly random (to us ants and our machines at least).
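If you want to see one come out of the pot, here's a minimal sketch in Python rather than .NET (only because it's shorter to show), using the standard uuid module; same idea, a version-4 GUID with 122 randomly chosen bits:

```python
import uuid

# A version-4 GUID/UUID is 128 bits, of which 122 are chosen (pseudo)randomly;
# the remaining 6 bits just mark the version and variant.
guid = uuid.uuid4()
print(guid)        # e.g. 9f1c2d3e-... (different every run)
print(2 ** 122)    # how many distinct values those 122 random bits allow
```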
 

Rabies

Trogdor!
Jul 1, 2002
1,344
8
63
London, UK
I've been away for a few weeks and now I'm back, so I have a few loose ends, or rather threads, to tie up.
I'll start with this one. Rabies, with no disrespect intended mate, your post smacks of being copied, or at the very least 'heavily quoted', shall we say... it makes no difference I suppose, and I apologise if I interpreted it wrong; your post just came across as rather formal, that's all.
Not copied or lifted from anywhere except the deranged workings of my brain. I'm almost flattered (I think?) that you'd think it came from some more authoritative source than me ;) I was bored for an hour or so, and I have a nasty habit of switching to "thesis" mode when I get going, I really need to work on that.

Anyways, I wanna ask you a question in true Socratic form. We have this notion of randomness; do you not think there is cause for concern about taking a system we do not fully understand and then seemingly deriving an absolute from it - an absolute which, in this case, is random behaviour?
Cause for concern? Scares the willies out of me. My understanding of quantum physics is strictly limited to what a layman can be expected to understand, but from what I have read I get the impression that current theory more or less "proves" that some things cannot be predicted, thus are -- for our purposes -- random. To obtain a way to calculate the incalculable would require throwing out large portions of the theoretical basis of today's physics. And that may well happen -- Newtonian physics seemed to be as right as right could be for centuries, until strange things started to be observed at the very small or very large scales, and this realisation was so momentous that it took many of the world's finest minds decades just to accept that some error hadn't been made.

Any serious debate concerning random behaviours will almost certainly default to Heisenberg's work seemingly dotting the i's and crossing the t's of any doubting folk and in so doing condemning Einstein's Godly observation to the error basket.
...But for the life of me Rabies, I just can't seem to accept the absolute idea of 'randomness'.
Einstein took his doubts to the grave, so you're in the best company; all I can do is trust what far smarter people than me say is the case. The idea doesn't trouble me too much -- I figure that if God really is there playing with the world, this is how he'd do it. Not big things like healing people or destroying cities with earthquakes, but the tiny things: move this electron just over here, and let the consequences unfold. Which would mean that what appears to be random isn't truly so, but it is and will remain beyond the comprehension of man.

Once again mate, apologies if I read the tone of your post wrong, it may well be I have been away too long.
No apology needed, I thought this forum was specifically to argue the unsolvable, where opinions will always differ and there is no right answer.

I have some real issues with what you have said here. Your first paragraph talks about cryptography and its use of "randomness" or "unpredictability". Cryptography is not about predictability, it's about probability.
Predictability could then be termed as there being a high probability (such as 100%) of a certain outcome. Cryptography is, in essence, about transforming data in a way that has a very low probability of being accessible by a party not holding all the required information. If some or all of the secret information can be predicted -- that is, it can be determined independently -- then the method loses its cryptographic value.

Many well-known, and widely used, pseudorandom number generators are cryptographically unsafe, because their future output can be determined given enough past output. Other methods are unsafe because they can be primed with input which will produce known output, i.e. output that has a high probability of matching a particular pattern.
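As a rough illustration of what "determined given enough past output" means, here's a toy linear congruential generator in Python (my own throwaway example, not any particular library's generator): if the algorithm and modulus are known, three observed outputs are enough to recover the hidden constants and predict everything that follows.

```python
# Toy LCG: x_{n+1} = (a * x_n + c) mod m.  Needs Python 3.8+ for pow(x, -1, m).
m = 2 ** 31 - 1                      # assume the modulus is public (it usually is)
_a, _c = 48271, 12345                # the "secret" internal constants

def lcg(x):
    return (_a * x + _c) % m

# An observer sees three consecutive outputs, but not the constants or the seed.
x1 = lcg(987654321)
x2 = lcg(x1)
x3 = lcg(x2)

# Recover the constants with a little modular arithmetic (m is prime here).
a = (x3 - x2) * pow((x2 - x1) % m, -1, m) % m
c = (x2 - a * x1) % m

print(a == _a and c == _c)           # True: the generator's innards are exposed
print((a * x3 + c) % m == lcg(x3))   # True: the next output, predicted correctly
```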

Secondly, entropy is a measure of the "randomness" of a closed system given a set of starting criteria. You do not add entropy to a system, you do not generate entropy and any notion of controlling entropy is wholly misguided and misunderstood.
The term "entropy", as used in cryptography, has a slightly different (and arguably incorrect) meaning to the true definition you give above. Entropy in a cryptosystem is equatable with randomness, injected from an external source. The entropy of a piece of data is a measure of how random it is, or appears to be. It's a misused word in the same way that the word "bandwidth" has been hijacked and misused in computer circles, but that's what it means in this particular field so we just get on with it.

OK, I've read your take on it but it does seem to be getting involved with semantics... I can understand why though; we need to be perfectly clear about what we are talking about, and it's true, some people fail to distinguish between probability and predictability, thus affecting any conclusions they arrive at.
Probability is orthogonal to randomness. I could write a program that outputs the sequence 1-0-1-0-1-0-1-0-1-0 ad infinitum. The probability of any given digit being a 1 is 50%, but the randomness is zero. A pseudorandom stream of digits may not exhibit exactly the same probability of a 1 coming out (though most PRNG algorithms are designed to produce approximately 50% distribution over long periods), but the randomness is higher.
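A quick sketch of that point in Python, if it helps:

```python
import random

# Stream A: 1-0-1-0-... forever.  Half the digits are 1, but given any prefix
# the next digit is known with certainty: 50% probability, zero randomness.
stream_a = [1 - (i % 2) for i in range(10000)]

# Stream B: a pseudorandom stream.  Also roughly half ones over the long run,
# but there is no trivial rule giving the next digit from the ones before it.
stream_b = [random.randint(0, 1) for _ in range(10000)]

print(sum(stream_a) / len(stream_a))   # 0.5 exactly
print(sum(stream_b) / len(stream_b))   # ~0.5
# Guessing the next digit of A succeeds 100% of the time; for B you would need
# to know (or reconstruct) the generator's internal state.
```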

Computers can produce strings that will appear very random, such as GUIDs - globally unique identifiers (alpha-numeric strings). The system is not Robbo-Random approved, but the possible combinations number 2^122, or 5,316,911,983,139,663,491,615,228,241,121,378,304 (I looked that up) - far more, in fact, than the estimated number of stars in the observable universe. It's a frickin' beast of a number if you take it in.

Because of the way in which GUIDs are calculated, you would likely need to produce this many before reaching a point of duplication - even if all the computers in the world were firing them out at the same time (which many will be right now). Software such as Microsoft's .NET Framework can produce one of these combinations at any point, and this can then be used in a selection process. So even an understanding of the system would be of no use to us, because our brains are too small to deal with the numbers, and we will not have the computing power for thousands of years to accurately determine which GUID will be calculated. The numbers and the variables are (quite deliberately) just too vast. It's not totally random, because variables go into a pot and a string comes out, but understanding how the system works will only leave you knowing how many combinations there are, based on those variables, and what they might be... one of many (for practical purposes anyway). It's still not truly random, but if you can guess a GUID here and now, you're a god. It also means that computers can produce things that are incredibly random (to us ants and our machines at least).
A poorly designed pseudorandom number generator could indeed produce a sequence that only repeats itself once in every 2^122 iterations, but is predictable after many fewer iterations. An example is a linear feedback shift register, which can be designed with a very long cycle, but is deterministic -- a relatively small sample of its output can be used to construct an equivalent LFSR and thus predict all future output. Many a cryptosystem has been defeated in the past because the designer has confused complexity with randomness.
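Here's that sketched out in Python, assuming a toy 16-bit Fibonacci LFSR with a publicly known tap set: sixteen observed output bits hand an eavesdropper the entire internal state, after which every future bit is predictable, however long the cycle is.

```python
TAPS = (16, 14, 13, 11)   # a standard 16-bit tap set; the taps are public knowledge

def lfsr_run(state, n):
    """Emit n output bits (the low bit each step) and return the final state."""
    out = []
    for _ in range(n):
        out.append(state & 1)
        fb = 0
        for t in TAPS:
            fb ^= (state >> (t - 1)) & 1
        state = (state >> 1) | (fb << 15)
    return out, state

secret_seed = 0xACE1                         # unknown to the observer
observed, _ = lfsr_run(secret_seed, 16)      # sixteen leaked output bits

# In this construction the first 16 output bits are simply the seed, LSB first,
# so the whole internal state can be read straight off the observed output.
recovered = sum(bit << i for i, bit in enumerate(observed))
print(recovered == secret_seed)              # True

# From here every future bit is predictable, long cycle or not.
_, state_now = lfsr_run(recovered, 16)
predicted, _ = lfsr_run(state_now, 32)
actual, _ = lfsr_run(secret_seed, 48)
print(predicted == actual[16:])              # True
```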
 

Robbo

Owner of this website
Jul 5, 2001
13,114
2,157
448
London
www.p8ntballer.com
Good response Rabies, thanks for that, I'm quite impressed mate; I think maybe this discussion (on this particular point anyway) is doomed in terms of distilling a definitive answer out of it but it was pretty good shnizzle getting to that point.
 

Bedlam

Gone crazy, back soon...
Predictability could then be termed as there being a high probability (such as 100%) of a certain outcome. Cryptography is, in essence, about transforming data in a way that has a very low probability of being accessible by a party not holding all the required information. If some or all of the secret information can be predicted -- that is, it can be determined independently -- then the method loses its cryptographic value.
I don't want to get into semantics over this, but predictability is not in any way synonymous with probability. Cryptography is about enciphering data and ensuring it is secure. Whatever method is used (from a Caesar shift to a Vigenère cipher or PGP), it is about making it more improbable that the data can be deciphered by someone other than the intended recipient. By increasing the complexity of the cipher, you increase the probability that it cannot be cracked. Interestingly enough though, there is an area of cryptography that could be predicted - codes. A cipher is neither random nor predictable by nature.
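To put that in the crudest possible terms, here's a Caesar shift sketched in Python (a toy, obviously, not something anyone would actually use): with the method and the key in hand, the output is completely determined, and nothing in the cipher itself is random.

```python
def caesar(text: str, shift: int) -> str:
    """Rotate each letter by a fixed amount; everything else passes through."""
    out = []
    for ch in text.upper():
        if ch.isalpha():
            out.append(chr((ord(ch) - ord("A") + shift) % 26 + ord("A")))
        else:
            out.append(ch)
    return "".join(out)

ciphertext = caesar("ATTACK AT DAWN", 3)
print(ciphertext)               # DWWDFN DW GDZQ
print(caesar(ciphertext, -3))   # ATTACK AT DAWN -- fully reversible with the key
```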

The term "entropy", as used in cryptography, has a slightly different (and arguably incorrect) meaning to the true definition you give above. Entropy in a cryptosystem is equatable with randomness, injected from an external source.
Ok, this one is a given. The word entropy within cryptography is different to its classical use in thermodynamics. Both uses have a basis in randomness, but I think you do yourself an injustice when you say it is arguably incorrect. Shannon's information theory is what you are referring to, and yes, entropy plays a key role within that framework. Is it random? I don't think it is, but I only know of the theory, not about it.

But arguing the point as to whether or not anything is truly random, I still have to stand by my earlier post.
:cool:
 

Rabies

Trogdor!
Jul 1, 2002
1,344
8
63
London, UK
I don't want to get into semantics over this, but predictability is not in any way synonymous with probability. Cryptography is about enciphering data and ensuring it is secure. Whatever method is used (from a Caesar shift to a Vigenère cipher or PGP), it is about making it more improbable that the data can be deciphered by someone other than the intended recipient. By increasing the complexity of the cipher, you increase the probability that it cannot be cracked. Interestingly enough though, there is an area of cryptography that could be predicted - codes. A cipher is neither random nor predictable by nature.
I think that's what I said ;) A cipher cannot be random because it must be possible to recover the original information, otherwise it is useless. On the other hand, a cipher is entirely predictable if you know all the information. Ciphers depend on some part of the information, namely the key(s), remaining secret. Random numbers are most commonly used to choose session keys, which form part of the secret information. If an external observer can guess the session key, or fool the correspondents into using a chosen key, then the encryption can be broken. So it is essential that the session key is hard to predict, which is where randomness comes in -- and if the key is unpredictable, then the cipher is also unpredictable.
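A small Python sketch of why the source of the key matters (standard library only, purely for illustration): one key is drawn from an OS source intended for cryptographic use, the other from a general-purpose PRNG seeded with the clock, which an attacker can simply guess.

```python
import random
import secrets
import time

# Good: a session key from the OS's cryptographic RNG -- infeasible to guess.
session_key = secrets.token_bytes(16)

# Bad: a "random" key from a general-purpose PRNG seeded with the clock.
# Anyone who can narrow down the timestamp can regenerate the identical key.
seed = int(time.time())
weak_key = random.Random(seed).randbytes(16)          # randbytes: Python 3.9+

attacker_guess = random.Random(seed).randbytes(16)    # attacker tries the same seed
print(attacker_guess == weak_key)                     # True: the key was predictable
print(len(session_key), session_key.hex())            # 16 bytes nobody can reproduce
```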
But arguing the point as to whether or not anything is truly random, I still have to stand by my earlier post.
:cool:
Your earlier post shows the important difference between randomness and complexity. "Chaotic" systems like the weather are not random, just very very complex, with too many factors affecting the outcome to calculate the result to a high confidence (= probability of getting it right.) Quantum physics shows us examples which appear to be quite simple, but (theoretically) completely random, and thus unpredictable for a different reason. The former case offers us a chance of improving the probability of correct predictions by better understanding of the system, more computational power and more accurate measurement of the inputs. The latter will remain entirely unpredictable to us until major aspects of quantum theory are shown to be incorrect.
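The distinction shows up even in something as small as the logistic map -- a quick Python sketch, just to make the "complex but not random" point concrete: the rule is completely deterministic, yet a tiny error in measuring the starting state swamps any prediction within a few dozen steps.

```python
def logistic(x: float, steps: int, r: float = 4.0) -> float:
    """Iterate the logistic map x -> r*x*(1-x): fully deterministic, no dice anywhere."""
    for _ in range(steps):
        x = r * x * (1.0 - x)
    return x

exact = 0.123456789012
measured = exact + 1e-12            # a tiny error in measuring the starting state

for steps in (10, 30, 50):
    print(steps, abs(logistic(exact, steps) - logistic(measured, steps)))
# The gap grows from negligible to order-one: predictable in principle (just
# re-run the rule), hopeless in practice without perfectly exact inputs.
```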