unofficial mirror of guile-user@gnu.org 
* Generating "independent" random numbers
@ 2023-10-03 15:08 Zelphir Kaltstahl
  2023-10-03 16:04 ` tomas
  2023-10-03 18:25 ` Maxime Devos
  0 siblings, 2 replies; 12+ messages in thread
From: Zelphir Kaltstahl @ 2023-10-03 15:08 UTC (permalink / raw)
  To: guile-user

Hello Guile Users,

today I want to verify some understanding I have about generating random numbers 
using Guile.

So there is SRFI-27 and I am using it like this:

~~~~
(define make-random-integer-generator
     (lambda* (#:key (seed #f))
       "Get a procedure for generating uniformly distributed
random integers from 0 up to a not included bound, which is
seeded by the keyword argument seed, which must be a
positive integer."
       (cond
        [seed
         (let ([rand-src (make-random-source)])
           ;; Set the given seed to guarantee the same results
           ;; for the same invocations.
           (random-source-pseudo-randomize! rand-src 0 seed)
           ;; Obtain a procedure, which gives uniformly
           ;; distributed integers.
           (random-source-make-integers rand-src))]
        [else
         (let ([rand-src (make-random-source)])
           ;; Try to make the random source truly random. How
           ;; this works depends on the specific implementation
           ;; of SRFI-27.
           (random-source-randomize! rand-src)
           (random-source-make-integers rand-src))])))
~~~~

That gives me a simple procedure that I can call with an exclusive upper bound to 
get the next random integer from the RNG. It allows me to seed the RNG, which I 
definitely want in order to be able to reproduce results.
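
For example, I use it like this (the seed value is arbitrary):

~~~~
(use-modules (srfi srfi-27))

(define next-random (make-random-integer-generator #:seed 12345))
(next-random 100)  ; => an integer in [0, 100), reproducible for this seed
(next-random 100)  ; => the next integer from the same seeded stream
~~~~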

Some time ago, I wanted to generate uniformly distributed floats though. There 
seems to be no facility in Guile to do that. So I took a look at Wikipedia: 
https://en.wikipedia.org/wiki/Normal_distribution#Computational_methods:

 > An easy-to-program approximate approach that relies on the central limit 
theorem is as follows: generate 12 uniform U(0,1) deviates, add them all up, and 
subtract 6 – the resulting random variable will have approximately standard 
normal distribution. In truth, the distribution will be Irwin–Hall, which is a 
12-section eleventh-order polynomial approximation to the normal distribution. 
This random deviate will have a limited range of (−6, 6).[55] Note that in a 
true normal distribution, only 0.00034% of all samples will fall outside ±6σ.

OK, for most purposes this seems good enough?
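
In Guile, a rough sketch of that recipe might look like this (just a sketch; it 
uses Guile's built-in random:uniform for the U(0,1) deviates, so it is not 
seeded via SRFI-27 like my generator above):

~~~~
;; Sum 12 U(0,1) deviates and subtract 6 to get an approximately
;; standard normal deviate (the Irwin-Hall approximation).
(define (approx-standard-normal)
  (let loop ([k 0] [sum 0.0])
    (if (= k 12)
        (- sum 6)
        (loop (+ k 1) (+ sum (random:uniform))))))
~~~~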

But there is one caveat: 
https://en.wikipedia.org/wiki/Irwin%E2%80%93Hall_distribution

 > In probability and statistics, the Irwin–Hall distribution, named after 
Joseph Oscar Irwin and Philip Hall, is a probability distribution for a random 
variable defined as the sum of a number of independent random variables, each 
having a uniform distribution.[1] For this reason it is also known as the 
uniform sum distribution.

Aha! They need to be "independent" too! Not only uniformly distributed! My code 
might be giving wrong results if I do not pay attention to statistical traps.

Checking SRFI-27: 
https://www.gnu.org/software/guile/manual/html_node/SRFI_002d27-Random-Sources.html#index-random_002dsource_002dpseudo_002drandomize_0021

 > Function: random-source-pseudo-randomize! source i j

 > Changes the state of the random source s into the initial state of the (i, 
j)-th independent random source, where i and j are non-negative integers. This 
procedure provides a mechanism to obtain a large number of independent random 
sources (usually all derived from the same backbone generator), indexed by two 
integers. In contrast to random-source-randomize!, this procedure is entirely 
deterministic.

The wording here, "the (i, j)-th independent random source", makes me think that 
if i and j are set to 0 and the seed, as in my procedure, the generated random 
integers are _not_ independent.

Is this understanding correct?

Am I correct in assuming that I will need to make 12 separate RNGs with 
different values for the tuple (i, j) to generate 12 independent, uniformly 
distributed random integers?
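
For illustration, something like the following is what I have in mind (just a 
sketch, assuming my reading of the docs is right):

~~~~
;; 12 sources, each pseudo-randomized with a distinct (i, j) tuple
;; derived from the same seed.
(define (make-12-generators seed)
  (map (lambda (i)
         (let ([rand-src (make-random-source)])
           (random-source-pseudo-randomize! rand-src i seed)
           (random-source-make-integers rand-src)))
       (iota 12)))
~~~~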

Best regards,
Zelphir

-- 
repositories: https://notabug.org/ZelphirKaltstahl


^ permalink raw reply	[flat|nested] 12+ messages in thread

* Re: Generating "independent" random numbers
  2023-10-03 15:08 Generating "independent" random numbers Zelphir Kaltstahl
@ 2023-10-03 16:04 ` tomas
  2023-10-03 16:16   ` tomas
  2023-10-03 22:16   ` Zelphir Kaltstahl
  2023-10-03 18:25 ` Maxime Devos
  1 sibling, 2 replies; 12+ messages in thread
From: tomas @ 2023-10-03 16:04 UTC (permalink / raw)
  To: Zelphir Kaltstahl; +Cc: guile-user

[-- Attachment #1: Type: text/plain, Size: 1413 bytes --]

On Tue, Oct 03, 2023 at 03:08:03PM +0000, Zelphir Kaltstahl wrote:
> Hello Guile Users,
> 
> today I want to verify some understanding I have about generating random
> numbers using Guile.

[...]

> Some time ago, I wanted to generate uniformly distributed floats though.

Wait a minute: you want uniformly distributed floats (in an interval,
I assume)?

> There seems to be no facility in Guile to do that. So I took a look at
> Wikipedia:
> https://en.wikipedia.org/wiki/Normal_distribution#Computational_methods:
> 
> > An easy-to-program approximate approach that relies on the central limit
> theorem is as follows [...]

What you get here is (an approximation to) a *normal* distribution, which
is something totally different (you actually use N (== 12 in your example)
*uniform* deviates to build that).

[...]

> Aha! They need to be "independent" too! Not only uniformly distributed!
> My code might be giving wrong results if I do not pay attention to
> statistical traps.

If your random number generator is "good enough", consecutive values are
expected (heh) to be (statistically [1]) independent. So I wouldn't bother
too much about it. On the contrary -- perhaps this mode is better tested
than using twelve "parallel" PRNGs as you plan.
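
For instance, drawing twelve values in a row from the same generator would be
something like this, where rand stands for whatever your seeded SRFI-27
integer generator is called (illustrative only):

  (map (lambda (_) (rand 100)) (iota 12))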

Cheers

[1] Yeah, deep philosophical rabbit holes everywhere. I mean it in some
   robust, practical way.
-- 
t

[-- Attachment #2: signature.asc --]
[-- Type: application/pgp-signature, Size: 195 bytes --]

^ permalink raw reply	[flat|nested] 12+ messages in thread

* Re: Generating "independent" random numbers
  2023-10-03 16:04 ` tomas
@ 2023-10-03 16:16   ` tomas
  2023-10-03 22:17     ` Zelphir Kaltstahl
  2023-10-03 22:16   ` Zelphir Kaltstahl
  1 sibling, 1 reply; 12+ messages in thread
From: tomas @ 2023-10-03 16:16 UTC (permalink / raw)
  To: Zelphir Kaltstahl; +Cc: guile-user

[-- Attachment #1: Type: text/plain, Size: 964 bytes --]

On Tue, Oct 03, 2023 at 06:04:45PM +0200, tomas@tuxteam.de wrote:
> On Tue, Oct 03, 2023 at 03:08:03PM +0000, Zelphir Kaltstahl wrote:
> > Hello Guile Users,
> > 
> > today I want to verify some understanding I have about generating random
> > numbers using Guile.
> 
> [...]
> 
> > Some time ago, I wanted to generate uniformly distributed floats though.
> 
> Wait a minute: you want uniformly distributed floats (in an interval,
> I assume)?

Perhaps something graphical (I hope y'all have fixed pitch fonts ;-)

Uniform probability density (usually across some fixed interval, say [0,1))

    |
    |...........
    |
    |
    +--------------------
    0          1

Normal (say, with expectation value 0.5)

    |
    |    ...
    |  .    .
    |..      ..
    +--------------------
    0          1

(er... well, more or less). The normal one never gets to zero,
but asymptotically approaches the X axis.

Cheers
-- 
t

[-- Attachment #2: signature.asc --]
[-- Type: application/pgp-signature, Size: 195 bytes --]

^ permalink raw reply	[flat|nested] 12+ messages in thread

* Re: Generating "independent" random numbers
  2023-10-03 15:08 Generating "independent" random numbers Zelphir Kaltstahl
  2023-10-03 16:04 ` tomas
@ 2023-10-03 18:25 ` Maxime Devos
  2023-10-03 22:22   ` Zelphir Kaltstahl
  1 sibling, 1 reply; 12+ messages in thread
From: Maxime Devos @ 2023-10-03 18:25 UTC (permalink / raw)
  To: Zelphir Kaltstahl, guile-user


[-- Attachment #1.1.1: Type: text/plain, Size: 2317 bytes --]

Op 03-10-2023 om 17:08 schreef Zelphir Kaltstahl:
> Some time ago, I wanted to generate uniformly distributed floats though. 
> There seems to be no facility in Guile to do that. So I took a look at 
> Wikipedia: 
> https://en.wikipedia.org/wiki/Normal_distribution#Computational_methods:
> 
>  > An easy-to-program approximate approach that relies on the central 
> limit theorem is as follows: generate 12 uniform U(0,1) deviates, add 
> them all up, and subtract 6 – the resulting random variable will have 
> approximately standard normal distribution. In truth, the distribution 
> will be Irwin–Hall, which is a 12-section eleventh-order polynomial 
> approximation to the normal distribution. This random deviate will have 
> a limited range of (−6, 6).[55] Note that in a 
> true normal distribution, only 0.00034% of all samples will fall outside ±6σ.
> 
> OK, for most purposes this seems good enough?
That's _normal_, not uniform, like tomas wrote.  Though, if you really 
want to (inefficient), you could apply the cdf of the normal 
distribution to the normal random variable to get a uniformly distributed one.

First, I'll say that there is no such thing as a uniformly distributed 
float (*), because the real line has length infinity.

As such, you need to pick a bounded range from which you want to sample 
uniformly.  For example, let's say [0,1), which can be rescaled as 
desired to any finite half-closed interval.

The uniform distribution on [0,1) is supported on infinitely many points, 
which makes things difficult for computers, but it can be approximated by a 
distribution with finite support, let's say

    mu_N = sum(i=0..(N-1)) dirac(i/N)/N

where dirac(i/N) is a Dirac pulse situated at i/N and N is large.

Up to scaling, this is simply the uniform discrete measure on {0,1,..,N-1}!

So, to generate an (approximately) uniform random number on [0,1), you 
can simply do

(define (random-real)
   (exact->inexact (/ (random N) N)))

for a suitably large choice of integer N>0.
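
For instance (one illustrative choice of N; any sufficiently large integer
will do):

(define N (expt 2 53))  ; doubles represent integers up to 2^53 exactly
(random-real)           ; => an inexact real in [0, 1)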

(*) Ignoring for a moment that there are technically only finitely many floats, 
but the uniform distribution on the discrete set of floats is likely not 
what you are interested in.

(I probably didn't have to go in so much detail but whatever ...)

Best regards,
Maxime Devos.

[-- Attachment #1.1.2: OpenPGP public key --]
[-- Type: application/pgp-keys, Size: 929 bytes --]

[-- Attachment #2: OpenPGP digital signature --]
[-- Type: application/pgp-signature, Size: 236 bytes --]

^ permalink raw reply	[flat|nested] 12+ messages in thread

* Re: Generating "independent" random numbers
  2023-10-03 16:04 ` tomas
  2023-10-03 16:16   ` tomas
@ 2023-10-03 22:16   ` Zelphir Kaltstahl
  1 sibling, 0 replies; 12+ messages in thread
From: Zelphir Kaltstahl @ 2023-10-03 22:16 UTC (permalink / raw)
  To: tomas; +Cc: guile-user


On 10/3/23 18:04, tomas@tuxteam.de wrote:
> On Tue, Oct 03, 2023 at 03:08:03PM +0000, Zelphir Kaltstahl wrote:
>> Hello Guile Users,
>>
>> today I want to verify some understanding I have about generating random
>> numbers using Guile.
> [...]
>
>> Some time ago, I wanted to generate uniformly distributed floats though.
> Wait a minute: you want uniformly distributed floats (in an interval,
> I assume)?
Ah sorry, I meant to write normally distributed.
>> Aha! They need to be "independent" too! Not only uniformly distributed!
>> My code might be giving wrong results if I do not pay attention to
>> statistical traps.
> If your random number generator is "good enough", consecutive values are
> expected (heh) to be (statistically [1]) independent. So I wouldn't bother
> too much about it. On the contrary -- perhaps this mode is better tested
> than using twelve "parallel" PRNGs as you plan.

Hm. Good point. On the other hand, the docs at least say that when the tuple 
(i, j) is different, the random numbers should be independent. At least that is 
how I read it.

Thanks!

Zelphir

-- 
repositories: https://notabug.org/ZelphirKaltstahl


^ permalink raw reply	[flat|nested] 12+ messages in thread

* Re: Generating "independent" random numbers
  2023-10-03 16:16   ` tomas
@ 2023-10-03 22:17     ` Zelphir Kaltstahl
  0 siblings, 0 replies; 12+ messages in thread
From: Zelphir Kaltstahl @ 2023-10-03 22:17 UTC (permalink / raw)
  To: tomas; +Cc: guile-user

On 10/3/23 18:16, tomas@tuxteam.de wrote:
> On Tue, Oct 03, 2023 at 06:04:45PM +0200, tomas@tuxteam.de wrote:
[…]
> Perhaps something graphical (I hope y'all have fixed pitch fonts ;-)
>
> Uniform probability density (usually across some fixed interval, say [0,1))
>
>      |
>      |...........
>      |
>      |
>      +--------------------
>      0          1
>
> Normal (say, with expectation value 0.5)
>
>      |
>      |    ...
>      |  .    .
>      |..      ..
>      +--------------------
>      0          1
>
> (er... well, more or less). The normal one never gets to zero,
> but asymptotically approaches the X axis.
>
> Cheers
Ha, thanks for the neat diagrams. Sorry for the typo mixing up uniform and normal.

-- 
repositories: https://notabug.org/ZelphirKaltstahl




^ permalink raw reply	[flat|nested] 12+ messages in thread

* Re: Generating "independent" random numbers
  2023-10-03 18:25 ` Maxime Devos
@ 2023-10-03 22:22   ` Zelphir Kaltstahl
  2023-10-04 16:14     ` Keith Wright
  2023-10-18 19:08     ` Mikael Djurfeldt
  0 siblings, 2 replies; 12+ messages in thread
From: Zelphir Kaltstahl @ 2023-10-03 22:22 UTC (permalink / raw)
  To: Maxime Devos; +Cc: Guile User

On 10/3/23 20:25, Maxime Devos wrote:
> Op 03-10-2023 om 17:08 schreef Zelphir Kaltstahl:
>> Some time ago, I wanted to generate uniformly distributed floats though. 
>> There seems to be no facility in Guile to do that. So I took a look at 
>> Wikipedia: 
>> https://en.wikipedia.org/wiki/Normal_distribution#Computational_methods:
>>
>>  > An easy-to-program approximate approach that relies on the central limit 
>> theorem is as follows: generate 12 uniform U(0,1) deviates, add them all up, 
>> and subtract 6 – the resulting random variable will have approximately 
>> standard normal distribution. In truth, the distribution will be Irwin–Hall, 
>> which is a 12-section eleventh-order polynomial approximation to the normal 
>> distribution. This random deviate will have a limited range of (−6, 6).[55] 
>> Note that in a 
>> true normal distribution, only 0.00034% of all samples will fall outside ±6σ.
>>
>> OK, for most purposes this seems good enough?
> That's _normal_, not uniform, like tomas wrote. 

Aye, sorry for that typo. Yes, my goal is normally distributed floats (leaving 
aside the finite nature of the computer and floats).

> Though, if you really want to (inefficient), you could apply the cdf of the 
> normal distribution to the normal random variable to get a uniformly 
> distributed one.
>
> First, I'll say that there is no such thing as a uniformly distributed float 
> (*), because the real line has length infinity.
>
> As such, you need to pick a bounded range from which you want to sample 
> uniformly.  For example, let's say [0,1), which can be rescaled as desired to 
> any finite half-closed interval.
>
> The uniform distribution on [0,1) is supported on infinitely many points, which 
> makes things difficult for computers, but it can be approximated by a 
> distribution with finite support, let's say
>
>    mu_N = sum(i=0..(N-1)) dirac(i/N)/N
>
> where dirac(i/N) is a Dirac pulse situated at i/N and N is large.
>
> Up to scaling, this is simply the uniform discrete measure on {0,1,..,N-1}!
>
> So, to generate an (approximately) uniform random number on [0,1), you can 
> simply do
>
> (define (random-real)
>   (exact->inexact (/ (random N) N)))
>
> for a suitably large choice of integer N>0.
>
> (*) Ignoring for a moment that there are technically only finitely many floats, 
> but the uniform distribution on the discrete set of floats is likely not what 
> you are interested in.
>
> (I probably didn't have to go in so much detail but whatever ...)

That's what I get for writing the wrong word :D

OK that's some math to unpack, thank you!

Best regards,
Zelphir

-- 
repositories: https://notabug.org/ZelphirKaltstahl




^ permalink raw reply	[flat|nested] 12+ messages in thread

* Re: Generating "independent" random numbers
  2023-10-03 22:22   ` Zelphir Kaltstahl
@ 2023-10-04 16:14     ` Keith Wright
  2023-10-10 22:03       ` Maxime Devos
  2023-10-18 19:08     ` Mikael Djurfeldt
  1 sibling, 1 reply; 12+ messages in thread
From: Keith Wright @ 2023-10-04 16:14 UTC (permalink / raw)
  To: Zelphir Kaltstahl; +Cc: maximedevos, guile-user

From: Zelphir Kaltstahl <zelphirkaltstahl@posteo.de>

> my goal is normally distributed floats (leaving aside the finite
> nature of the computer and floats).

Warning: I am out of my depth, I can't even spell statisticion, but...

The following is either provably correct up to round-off,
or totally stupid.

First define the cumulative distribution for the normal distribution:

$$f(x)= \frac{1}{\sqrt{\pi}} \int_{-\infty}^{x} e^{-x^2/2} dx $$

Now to get a normal random variable, let $u$ be a uniform random number
on [0,1), then $f^{-1}(u)$ is a standard normal random variable.

Computing the inverse of the cumulative normal distribution is left as
an exercise, because I don't know how, but it seems possible.

From: Maxime Devos <maximedevos@telenet.be>

> So, to generate an (approximately) uniform random number on [0,1), you 
> can simply do
>
> (define (random-real)
>     (exact->inexact (/ (random N) N)))
>
> for a suitably large choice of integer N>0.

A choice that makes this nice (on a 32 bit machine) is $N = 2^{32}$.

   -- Keith



^ permalink raw reply	[flat|nested] 12+ messages in thread

* Re: Generating "independent" random numbers
  2023-10-04 16:14     ` Keith Wright
@ 2023-10-10 22:03       ` Maxime Devos
  2023-10-10 23:04         ` Keith Wright
  0 siblings, 1 reply; 12+ messages in thread
From: Maxime Devos @ 2023-10-10 22:03 UTC (permalink / raw)
  To: Keith Wright, Zelphir Kaltstahl; +Cc: guile-user


[-- Attachment #1.1.1: Type: text/plain, Size: 1827 bytes --]

Op 04-10-2023 om 18:14 schreef Keith Wright:
> From: Zelphir Kaltstahl <zelphirkaltstahl@posteo.de>
> 
>> my goal is normally distributed floats (leaving aside the finite
>> nature of the computer and floats).
> 
> Warning: I am out of my depth, I can't even spell statisticion, but...
> 
> The following is either provably correct up to round-off,
> or totally stupid.

It's not stupid at all, but neither is it correct (the basic approach is 
correct, and holds more generally for other probability distributions as 
well, but some important details are off ...).

> First define the cumulative distribution for the normal distribution:
> 
> $$f(x)= \frac{1}{\sqrt{\pi}} \int_{-\infty}^{x} e^{-x^2/2} dx $$

Instead of sqrt(pi) you need sqrt(2pi).
Also, this is the pdf, not the cdf. For the cdf, you need to integrate this 
expression from -infinity to x.

> Now to get a normal random variable, let $u$ be a uniform random number
> on [0,1), then $f^{-1}(u)$ is a standard normal random variable.
> 
> Computing the inverse of the cumulative normal distribution is left as
> an exercise, because I don't know how, but it seems possible.

While the pdf is easy to invert (multiply by constant factor, take log, 
multiply by constant factor), resulting in an expression only involving 
constants, multiplication, logarithms (and, depending on simplification, 
subtraction), the cdf isn't.

See https://en.wikipedia.org/wiki/Normal_distribution, in particular 
‘Quantile function’. Unfortunately, erf^-1 is not straightforward to 
approximate (though methods definitely exist -- it's the theory of the 
approximation method that is difficult, not the implementation; 
transcribing some implementation from Fortran to Scheme is tedious but 
straightforward).
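
For reference, the quantile function of the standard normal is
$\Phi^{-1}(p) = \sqrt{2}\,\mathrm{erf}^{-1}(2p - 1)$.  As a crude but
self-contained illustration (nothing like the proper approximations: just a
Simpson-rule CDF plus bisection, with purely illustrative names):

(define pi (* 4 (atan 1)))

(define (normal-pdf x)
  (/ (exp (/ (* x x) -2)) (sqrt (* 2 pi))))

;; Crude CDF: composite Simpson rule from -8 (where the CDF is ~0) to x.
(define (normal-cdf x)
  (let* ((a -8.0) (n 200) (h (/ (- x a) n)))
    (let loop ((i 1) (acc (+ (normal-pdf a) (normal-pdf x))))
      (if (= i n)
          (* (/ h 3) acc)
          (loop (+ i 1)
                (+ acc (* (if (odd? i) 4 2)
                          (normal-pdf (+ a (* i h))))))))))

;; Crude quantile: bisection works because the CDF is increasing.
(define (normal-quantile u)
  (let loop ((lo -8.0) (hi 8.0) (steps 60))
    (let ((mid (/ (+ lo hi) 2)))
      (cond ((zero? steps) mid)
            ((< (normal-cdf mid) u) (loop mid hi (- steps 1)))
            (else (loop lo mid (- steps 1)))))))

;; (normal-quantile 0.975)            ; => about 1.96
;; (normal-quantile (random:uniform)) ; => approximately standard normal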

Best regards,
Maxime Devos

[-- Attachment #1.1.2: OpenPGP public key --]
[-- Type: application/pgp-keys, Size: 929 bytes --]

[-- Attachment #2: OpenPGP digital signature --]
[-- Type: application/pgp-signature, Size: 236 bytes --]

^ permalink raw reply	[flat|nested] 12+ messages in thread

* Re: Generating "independent" random numbers
  2023-10-10 22:03       ` Maxime Devos
@ 2023-10-10 23:04         ` Keith Wright
  2023-10-11 11:31           ` Maxime Devos
  0 siblings, 1 reply; 12+ messages in thread
From: Keith Wright @ 2023-10-10 23:04 UTC (permalink / raw)
  To: Maxime Devos; +Cc: zelphirkaltstahl, guile-user

Maxime Devos <maximedevos@telenet.be> writes:

> Op 04-10-2023 om 18:14 schreef Keith Wright:
>> From: Zelphir Kaltstahl <zelphirkaltstahl@posteo.de>
>> 
>>> my goal is normally distributed floats (leaving aside the finite
>>> nature of the computer and floats).

>> The following is either provably correct up to round-off,
>> or totally stupid.

> It's not stupid at all, but neither is it correct (the basic approach is 
> correct, and holds more generally for other probability distributions as 
> well, but some important details are off ...).

Not surprising, I just made it up on the fly...

>> First define the cumulative distribution for the normal distribution:
>> 
>> $$f(x)= \frac{1}{\sqrt{\pi}} \int_{-\infty}^{x} e^{-x^2/2} dx $$
>
> Instead of sqrt(pi) you need sqrt(2pi).

Seems plausible.

> Also, this is the pdf, not the cdf. For the cdf, you need to integrate this 
> expression from -infinity to x.

I was using TeX notation.  That's exactly what $\int_{-\infty}^{x}$
means.

>> Computing the inverse of the cumulative normal distribution is left as
>> an exercise, because I don't know how, but it seems possible.

> While the pdf is easy to invert (multiply by constant factor, take log, 
> multiply by constant factor), resulting in an expression only involving 
> constants, multiplication, logarithms (and, depending on simplification, 
> subtraction), the cdf isn't.

I was thinking of numerical integration, but too busy to write the code.

I was feeling guilty about being so sloppy, and thinking of writing
it more carefully, but...

> (the basic approach is correct, and holds more generally for other
> probability distributions as well)

I just saw some pictures in my head and thought it must be a general
theorem, but it's so simple and general that it must be "well known".
Somebody sometime must have already written it better than I could;
but I don't know who.  It is Theorem (X?) on page (Y?) of book (Z?).

-- Keith
   

PS: While we are cleaning up details, substitute "modulo" for "/" in:

From: Maxime Devos <maximedevos@telenet.be>

> So, to generate an (approximately) uniform random number on [0,1), you 
> can simply do
>
> (define (random-real)
>     (exact->inexact (/ (random N) N)))
>
> for a suitably large choice of integer N>0.

A choice that makes this nice (on a 32 bit machine) is $N = 2^{32}$.



^ permalink raw reply	[flat|nested] 12+ messages in thread

* Re: Generating "independent" random numbers
  2023-10-10 23:04         ` Keith Wright
@ 2023-10-11 11:31           ` Maxime Devos
  0 siblings, 0 replies; 12+ messages in thread
From: Maxime Devos @ 2023-10-11 11:31 UTC (permalink / raw)
  To: Keith Wright; +Cc: zelphirkaltstahl, guile-user


[-- Attachment #1.1.1: Type: text/plain, Size: 2825 bytes --]



Op 11-10-2023 om 01:04 schreef Keith Wright:
> Maxime Devos <maximedevos@telenet.be> writes:
> 
>> Op 04-10-2023 om 18:14 schreef Keith Wright:
>>> From: Zelphir Kaltstahl <zelphirkaltstahl@posteo.de>
>>>
>>>> my goal is normally distributed floats (leaving aside the finite
>>>> nature of the computer and floats).
> 
>>> The following is either provably correct up to round-off,
>>> or totally stupid.
> 
>> It's not stupid at all, but neither is it correct (the basic approach is
>> correct, and holds more generally for other probability distributions as
>> well, but some important details are off ...).
> 
> Not surprising, I just made it up on the fly...
> 
>>> First define the cumulative distribution for the normal distribution:
>>>
>>> $$f(x)= \frac{1}{\sqrt{\pi}} \int_{-\infty}^{x} e^{-x^2/2} dx $$
>>
>> Instead of sqrt(pi) you need sqrt(2pi).
> 
> Seems plausible.
> 
>> Also, this is the pdf, not the cdf. For the cdf, you need to integrate this
>> expression from -infinity to x.
> 
> I was using TeX notation.  That's exactly what $\int_{-\infty}^{x}$
> means.

Ok, didn't notice the \int.

>>> Computing the inverse of the cumulative normal distribution is left as
>>> an exercise, because I don't know how, but it seems possible.
> 
>> While the pdf is easy to invert (multiply by constant factor, take log,
>> multiply by constant factor), resulting in an expression only involving
>> constants, multiplication, logarithms (and, depending on simplification,
>> subtraction), the cdf isn't.
> 
> I was thinking of numerical integration, but too busy to write the code.
>
> I was feeling guilty about being so sloppy, and thinking of writing
> it more carefully, but...
> 
>> (the basic approach is correct, and holds more generally for other
>> probability distributions as well)
> 
> I just saw some pictures in my head and thought it must be a general
> theorem, but it's so simple and general that it must be "well known".
> Somebody sometime must have already written it better than I could;
> but I don't know who.  It is Theorem (X?) on page (Y?) of book (Z?).

You can probably find it in most course texts on probability and random 
number generation.  I could provide you a reference, but I don't think 
my source is publicly available.

> -- Keith
>     
> 
> PS: While we are cleaning up details, substitute "modulo" for "/" in:
> 
> From: Maxime Devos <maximedevos@telenet.be>
> 
>> So, to generate an (approximately) uniform random number on [0,1), you
>> can simply do
>>
>> (define (random-real)
>>      (exact->inexact (/ (random N) N)))
>>
>> for a suitably large choice of integer N>0.

I'm pretty sure '/' is correct here.  (random N) produces integers in 
[0,N), dividing by N yields something in [0,1).
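
A quick check at the REPL makes the difference concrete (the N here is just an
illustrative choice):

(define N (expt 2 32))
(/ (random N) N)                   ; => an exact rational in [0, 1)
(exact->inexact (/ (random N) N))  ; => an inexact real in [0, 1)
(modulo (random N) N)              ; => still just an integer in [0, N)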

Best regards,
Maxime Devos.

[-- Attachment #1.1.2: OpenPGP public key --]
[-- Type: application/pgp-keys, Size: 929 bytes --]

[-- Attachment #2: OpenPGP digital signature --]
[-- Type: application/pgp-signature, Size: 236 bytes --]

^ permalink raw reply	[flat|nested] 12+ messages in thread

* Re: Generating "independent" random numbers
  2023-10-03 22:22   ` Zelphir Kaltstahl
  2023-10-04 16:14     ` Keith Wright
@ 2023-10-18 19:08     ` Mikael Djurfeldt
  1 sibling, 0 replies; 12+ messages in thread
From: Mikael Djurfeldt @ 2023-10-18 19:08 UTC (permalink / raw)
  To: Zelphir Kaltstahl; +Cc: Maxime Devos, Guile User, Mikael Djurfeldt

On Wed, Oct 4, 2023 at 12:22 AM Zelphir Kaltstahl <
zelphirkaltstahl@posteo.de> wrote:

> Aye, sorry for that typo. Yes, my goal is normally distributed floats (leaving
> aside the finite nature of the computer and floats).
>

Maybe I'm missing something from just skimming this long discussion, but
what about

  (random:normal)

(at the guile prompt)?

That's a distribution centered at 0 with sigma 1. I assume you know how to
shift it and scale it. If you're curious how to get it from a uniform
distribution, have a look at libguile/random.c.
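
For example (the usual mu + sigma * Z shift and scale, with illustrative
numbers):

  (random:normal)                  ; standard normal: mean 0, sigma 1
  (+ 5.0 (* 2.0 (random:normal)))  ; normal with mean 5 and sigma 2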

Best regards,
Mikael


^ permalink raw reply	[flat|nested] 12+ messages in thread

end of thread, other threads:[~2023-10-18 19:08 UTC | newest]

Thread overview: 12+ messages (download: mbox.gz / follow: Atom feed)
-- links below jump to the message on this page --
2023-10-03 15:08 Generating "independent" random numbers Zelphir Kaltstahl
2023-10-03 16:04 ` tomas
2023-10-03 16:16   ` tomas
2023-10-03 22:17     ` Zelphir Kaltstahl
2023-10-03 22:16   ` Zelphir Kaltstahl
2023-10-03 18:25 ` Maxime Devos
2023-10-03 22:22   ` Zelphir Kaltstahl
2023-10-04 16:14     ` Keith Wright
2023-10-10 22:03       ` Maxime Devos
2023-10-10 23:04         ` Keith Wright
2023-10-11 11:31           ` Maxime Devos
2023-10-18 19:08     ` Mikael Djurfeldt
