Is continuously seeding a random number generator performance intensive?

Jeroen Bollen jbinero at gmail.com
Fri Jan 3 12:19:46 PST 2014


On Friday, 3 January 2014 at 18:23:23 UTC, monarch_dodra wrote:
> On Friday, 3 January 2014 at 17:41:48 UTC, Jeroen Bollen wrote:
>> On Friday, 3 January 2014 at 13:42:19 UTC, monarch_dodra wrote:
>>> On Friday, 3 January 2014 at 13:30:09 UTC, Jeroen Bollen 
>>> wrote:
>>>> I already considered this, but the problem is, I need to 
>>>> smooth the noise, and to do that I need all the surrounding 
>>>> 'checkpoints' too. This means it'll have to load in 5 at a 
>>>> time.
>>>
>>> I don't see that as a problem. Just because you can load 
>>> sub-regions at once, doesn't mean you are limited to only 
>>> loading one at a time.
>>
>> That still means that out of 5 million pixels loaded, only 
>> about 1,004,000 will be used. I guess I can recycle them, 
>> though.
>
> You could also do a "subdivision" approach:
>
> First, build a table that contains seeds for 1Kx1K blocks. 
> However, each of those seeds is in turn designed to seed 10,000 
> new seeds, one for each 10x10 sub-block (for example).
>
> This way, you can first load your big 1Kx1K block, and then 400 
> of the 10x10 sub-blocks. Seems reasonable to me.
>
> I don't know exactly how big your data is, so your mileage may 
> vary. Depending on your algorithm, you may have to adapt the 
> numbers, or even the number of subdivision levels.
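
For what it's worth, here is a minimal D sketch of that 
subdivision idea (the sizes, coordinates, and the prime-multiply 
seed mixing below are placeholders picked for illustration, not 
anything prescribed above):

import std.random : Random, uniform;

// Derive a child seed from a parent seed and 2D coordinates.
// Any deterministic mix works; this one is purely illustrative.
uint childSeed(uint parent, uint x, uint y)
{
    return parent ^ (x * 73_856_093u + y * 19_349_663u);
}

void main()
{
    enum masterSeed = 12_345u;
    enum subSide = 10; // 10x10 sub-blocks, 100x100 = 10,000 per 1Kx1K block

    // Seed for the top-level block at block coordinates (3, 7)...
    immutable bSeed = childSeed(masterSeed, 3, 7);

    // ...which seeds the 10x10 sub-block at (42, 17) inside it,
    // without ever generating any other sub-block.
    auto rng = Random(childSeed(bSeed, 42, 17));

    ubyte[subSide][subSide] sub;
    foreach (ref row; sub)
        foreach (ref cell; row)
            cell = cast(ubyte) uniform(0, 256, rng);
}

The point is that any sub-block's generator can be reconstructed 
from the master seed alone, so nothing outside the region you 
actually need ever has to be generated.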

What generator would be the best fit for this? The documentation 
says MersenneTwisterEngine is a good all-round choice. I decided 
to go with blocks of 32x32, each of which has to be filled with 
unsigned byte values.
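
For reference, a sketch of what filling such a block could look 
like with Phobos' Mt19937, reusing one generator and reseeding it 
per block (the seed values and block count are placeholders):

import std.random : Mt19937, uniform;

// Fill one 32x32 block of unsigned bytes from a per-block seed.
void fillBlock(ref Mt19937 rng, ref ubyte[32][32] block, uint seed)
{
    rng.seed(seed); // reseed the generator for this block
    foreach (ref row; block)
        foreach (ref cell; row)
            cell = cast(ubyte) uniform(0, 256, rng);
}

void main()
{
    auto rng = Mt19937(1u);
    ubyte[32][32] block;
    foreach (uint i; 0 .. 4) // e.g. four blocks with distinct seeds
        fillBlock(rng, block, i + 1);
}

One caveat: each reseed of Mt19937 initializes its full 624-word 
(roughly 2.5 KB) internal state, so if per-block reseeding ever 
shows up as a bottleneck, a generator with a smaller state, such 
as one of the LinearCongruentialEngine variants (e.g. MinstdRand), 
may be cheaper to reseed.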

