I'm trying to use std::bernoulli_distribution to simulate the number of detected particles, taking the detector's dead time into account. The trigger probability is 30%, i.e. from N particles only 0.3N are expected to be initially recorded. Of those triggered particles, only every ninth can be further processed by the detector (because of the dead time after data processing), and I want to know the number of surviving particles after data processing at the end. I used the following code, which I modified from http://www.cplusplus.com/reference/random/bernoulli_distribution/ (parameter names were not changed):
#include <iostream>
#include <random>

int main() {
  std::default_random_engine generator;
  std::bernoulli_distribution distribution(0.3); // 30% trigger probability
  const int nrolls = 526;
  int count = 0; // count number of trues
  for (int i = 1; i < nrolls; ++i)
    if (count % 9 == 0 && distribution(generator)) // count%9 counts each ninth
      ++count;
  std::cout << "bernoulli_distribution (0.3) x 526:" << std::endl;
  std::cout << "true: " << count << std::endl;
  std::cout << "false: " << nrolls - count << std::endl;
}
The result for "true" is about 20 here (I don't have my work PC to check the exact number at the moment; I can post it later, sorry). But both the experiment and the dead-time formula give a number of about 50! However, if I raise the trigger probability from 30% to 70% in the constructor of the distribution, the simulated result agrees with the other two results. Certainly I've tested this with other numbers as well, but the output always seems to agree with a trigger probability of 70%.
I can't find any documentation about "std::bernoulli_distribution distribution()" either. Is it really correct here to pass 0.7 instead of 0.3? Did I overlook or misunderstand something?
Thank you in advance.