The random function in most programming languages produces a (pseudo-)random number with an equal chance of landing anywhere in the range 0.0 to 1.0. What I'm after instead is a number drawn from a normal (Gaussian) distribution with a given mean and standard deviation. For example, if the mean were 0.0 and the standard deviation were 1.0, the number could in principle be anything, but it would be more likely to lie between 0 and 1 than between 1 and 2, and a value greater than 5 would be extremely unlikely. The value could also be negative (with anything below -5 being just as unlikely).
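To illustrate the contrast I mean, here is a minimal sketch in Python, assuming the standard random module (random.gauss is only there to show the kind of behaviour I'm describing):

```python
import random

# Uniform: every value in the range [0.0, 1.0) is equally likely.
u = random.random()

# Normal/Gaussian with mean 0.0 and standard deviation 1.0:
# any real value is possible, but values near 0 are the most likely,
# anything beyond +/-5 is extremely unlikely, and negative values
# occur as often as positive ones.
g = random.gauss(0.0, 1.0)

print(u, g)
```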
Thanks in advance.