Example of a Simple Simulation

The impact of extra lessons on students’ grades, a statistical power analysis.

Date

September 11, 2024

Imagine we want to measure the impact of receiving extra lessons on students’ grades. We will implement a simple randomized experiment in a set of classes and schools. We want to make sure that our design will enable us to detect the effect if there is one: how should we design our study to do so?

Setting

Let’s assume that we implement a very simple RCT:

  • We consider $n$ students indexed by $i$,
  • We randomly allocate a proportion $p_{treat}$ of the students to the treatment. These students will receive additional lessons,
  • We give a test to all the students and estimate the effect of the treatment on grades.
Note

There are actually two approaches to estimate the effect of the treatment:

  • By computing the difference between the average of the grades of the treatment group (students who took additional classes) and of the grades of the control group (students who did not), or

  • With a simple linear regression of the grades on the treatment status.
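
These two approaches give exactly the same point estimate when the treatment is a single binary variable. A quick sketch with toy values (illustrative numbers, not the DGP used below):

```r
# Sketch: with one binary treatment, the OLS coefficient equals
# the difference in group means. Toy values, for illustration only.
set.seed(1)
treat <- rep(c(0, 1), each = 50)
grade <- 10 + 1 * treat + rnorm(100, 0, 2)

diff_means <- mean(grade[treat == 1]) - mean(grade[treat == 0])
beta_hat <- unname(coef(lm(grade ~ treat))["treat"])

all.equal(diff_means, beta_hat)  # TRUE
```

The regression approach additionally returns a standard error for the estimate directly, which is why we will rely on it below.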

Before implementing the experiment, we want to make sure that we will have a high probability of actually detecting an effect if there is one. That is to say, we want to be able to reject the null of no effect when it is actually false: we want to have adequate statistical power. A conventional (but somewhat arbitrary) target threshold for adequate statistical power is 80%¹.

We, as researchers, can choose the value of two parameters: the sample size and the proportion of treated observations. To pick them such that our design will have enough power, we could do a simple power calculation by hand. However, we will see that simulations can actually be much simpler when the data generating process is complex. This analysis will also help us identify the parameters driving power and play around with the values given to the parameters.
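
For a design this simple, the by-hand calculation is available in base R through power.t.test(). As a benchmark for the simulations that follow, here is a sketch using illustrative values matching the ones we will pick below (100 students, half treated, a true effect of 1, a grade standard deviation of 2):

```r
# Analytical power benchmark for a simple two-group comparison.
# Values are assumptions mirroring the parameters chosen later:
# 50 students per group, true effect 1, residual sd 2.
power.t.test(
  n = 50,          # per group
  delta = 1,       # true treatment effect
  sd = 2,          # standard deviation of the error term
  sig.level = 0.05
)$power
# around 0.70
```

We will be able to compare this analytical value to the power obtained by simulation.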

Objective of the simulation

Choose how many students to include in the sample and the proportion of treated students to have adequate statistical power.

We will first generate one data set, estimate the parameter of interest (the average treatment effect) and then repeat this analysis. This will give us a set of estimates and we will in particular be able to compute the proportion that are statistically significant. Then, we will be able to vary the values of our design parameters of interest (sample size and proportion of treated observations) and determine which values to pick to detect the effect.

Generating a Data Set

Code to load the R packages used in this doc
library(tidyverse)
library(knitr) 
library(kableExtra)
library(broom)
library(beepr)
library(here)

set.seed(35) #fix the starting point of the random number generator

# The code below is to use my own package for ggplot themes
# You can remove it. Otherwise you will need to install the package via GitHub
# using the following line of code
# devtools::install_github("vincentbagilet/mediocrethemes")
library(mediocrethemes)
set_mediocre_all(pal = "portal") #set theme for all the subsequent plots

# For exercises
library(webexercises)

We first assume that the Data Generating Process (DGP) for student $i$ is very simple and as follows:

$$Grade_i = \alpha_0 + \beta_0 Treat_i + u_i$$

Warning

There are some implicit assumptions behind this DGP. For instance, linearity and homogeneity of the treatment effect.

Distribution of the variables

The goal is to generate a data set. To do so, we need to generate $u$, $Treat$ and $Grade$ (vectors of length $n$). $Grade$ will be given by $Grade = \alpha_0 + \beta_0 Treat + u$. For the other two variables, we need to assume their distribution.

  • Since the treatment (0 or 1) is allocated at random, we can assume that $Treat \sim \mathcal{B}(p_{treat})$,
  • In the real world, grades are often approximately normally distributed. We can thus assume that $u \sim \mathcal{N}(0, \sigma_u^2)$. $\sigma_u$ will represent the standard deviation of the grades (absent the treatment) and $\alpha_0$ their mean.

Setting the parameters

To be able to generate the data, we now need to set values for the parameters ($n$, $\sigma_u$, $\alpha_0$, $\beta_0$, and $p_{treat}$). We can distinguish between parameters the researcher controls and those they do not. They can choose how many participants to include in their sample ($n$) and the proportion of treated observations ($p_{treat}$) but cannot affect the rest.

These other parameters are generally fixed for the researcher. Here, since we are generating the data, we control them. We will therefore pick them such that the generated data is realistic and the distribution of grades resembles an actual distribution of grades.

n <- 100
alpha_0 <- 10
beta_0 <- 1
sigma_u <- 2
p_treat <- 0.5

Generating the data

The data generating process we defined enables us to generate the variables.

Code
u <- rnorm(n, 0, sigma_u)
treat <- rbernoulli(n, p = p_treat)
grade <- alpha_0 + beta_0*treat + u

Note that, since we know the true data generating process, we can compute the potential outcomes for each individual: for each individual, we can compute the grade they would have gotten if they had taken the treatment and the one they would have gotten if they did not. This is of course impossible to observe in an actual study; for each student, we only observe one of the potential outcomes.

grade_1 <- alpha_0 + beta_0 + u
grade_0 <- alpha_0 + u

From this, we can derive the treatment effect for each student, the individual treatment effect.

indiv_effect <- grade_1 - grade_0

We can now store all these variables in a data frame (a tibble).

gen_data <- 
  tibble(grade, treat, u, grade_1, grade_0, indiv_effect) 

Let’s see what it looks like. To print the data nicely, we can use the kable and scroll_box functions.

gen_data |> 
  knitr::kable() |> 
  kableExtra::scroll_box(width = "100%", height = "200px")
grade treat u grade_1 grade_0 indiv_effect
12.130251 FALSE 2.1302507 13.130251 12.130251 1
11.265762 TRUE 0.2657624 11.265762 10.265762 1
9.931912 FALSE -0.0680880 10.931912 9.931912 1
10.910047 TRUE -0.0899527 10.910047 9.910047 1
17.675676 TRUE 6.6756757 17.675676 16.675676 1
10.214165 TRUE -0.7858351 10.214165 9.214165 1
11.821132 TRUE 0.8211320 11.821132 10.821132 1
8.875629 FALSE -1.1243709 9.875629 8.875629 1
13.382453 TRUE 2.3824533 13.382453 12.382453 1
10.698136 FALSE 0.6981363 11.698136 10.698136 1
12.023108 TRUE 1.0231082 12.023108 11.023108 1
10.890681 FALSE 0.8906808 11.890681 10.890681 1
9.463238 TRUE -1.5367623 9.463238 8.463238 1
8.997782 FALSE -1.0022181 9.997782 8.997782 1
6.182826 FALSE -3.8171743 7.182826 6.182826 1
7.442656 TRUE -3.5573437 7.442656 6.442656 1
12.728501 TRUE 1.7285013 12.728501 11.728501 1
9.308524 FALSE -0.6914758 10.308524 9.308524 1
14.676523 TRUE 3.6765226 14.676523 13.676523 1
14.346167 TRUE 3.3461670 14.346167 13.346167 1
11.368485 FALSE 1.3684853 12.368485 11.368485 1
12.913915 TRUE 1.9139145 12.913915 11.913915 1
13.003782 TRUE 2.0037825 13.003782 12.003782 1
13.552473 TRUE 2.5524733 13.552473 12.552473 1
9.358722 FALSE -0.6412779 10.358722 9.358722 1
12.156908 TRUE 1.1569084 12.156908 11.156908 1
11.974202 TRUE 0.9742019 11.974202 10.974202 1
12.105695 TRUE 1.1056949 12.105695 11.105695 1
11.135126 FALSE 1.1351261 12.135126 11.135126 1
10.699134 FALSE 0.6991345 11.699134 10.699134 1
13.344878 TRUE 2.3448775 13.344878 12.344878 1
12.224396 FALSE 2.2243958 13.224396 12.224396 1
12.448768 FALSE 2.4487679 13.448768 12.448768 1
9.640514 FALSE -0.3594857 10.640514 9.640514 1
9.651325 TRUE -1.3486753 9.651325 8.651325 1
14.868253 FALSE 4.8682527 15.868253 14.868253 1
11.223819 TRUE 0.2238194 11.223819 10.223819 1
9.793142 TRUE -1.2068577 9.793142 8.793142 1
14.278753 TRUE 3.2787529 14.278753 13.278753 1
8.609292 FALSE -1.3907081 9.609292 8.609292 1
9.227352 FALSE -0.7726476 10.227352 9.227352 1
10.098029 FALSE 0.0980288 11.098029 10.098029 1
8.653697 TRUE -2.3463033 8.653697 7.653697 1
9.374474 TRUE -1.6255256 9.374474 8.374474 1
12.178305 FALSE 2.1783050 13.178305 12.178305 1
12.537926 TRUE 1.5379263 12.537926 11.537926 1
10.866863 TRUE -0.1331368 10.866863 9.866863 1
13.129058 TRUE 2.1290579 13.129058 12.129058 1
10.167460 FALSE 0.1674599 11.167460 10.167460 1
12.459794 TRUE 1.4597944 12.459794 11.459794 1
10.286978 FALSE 0.2869784 11.286978 10.286978 1
13.177599 TRUE 2.1775985 13.177599 12.177599 1
8.055879 FALSE -1.9441213 9.055879 8.055879 1
8.202991 TRUE -2.7970085 8.202991 7.202992 1
11.544127 TRUE 0.5441266 11.544127 10.544127 1
8.461452 FALSE -1.5385476 9.461452 8.461452 1
12.727441 FALSE 2.7274407 13.727441 12.727441 1
7.663704 FALSE -2.3362963 8.663704 7.663704 1
9.348784 FALSE -0.6512156 10.348784 9.348784 1
9.971264 TRUE -1.0287357 9.971264 8.971264 1
13.314684 FALSE 3.3146844 14.314684 13.314684 1
12.061729 TRUE 1.0617286 12.061729 11.061729 1
9.443522 FALSE -0.5564781 10.443522 9.443522 1
8.353766 FALSE -1.6462339 9.353766 8.353766 1
10.734081 TRUE -0.2659188 10.734081 9.734081 1
15.760676 FALSE 5.7606763 16.760676 15.760676 1
11.375374 TRUE 0.3753741 11.375374 10.375374 1
8.249464 FALSE -1.7505361 9.249464 8.249464 1
12.075871 TRUE 1.0758709 12.075871 11.075871 1
8.401415 TRUE -2.5985854 8.401415 7.401415 1
5.906489 FALSE -4.0935110 6.906489 5.906489 1
10.740369 FALSE 0.7403691 11.740369 10.740369 1
12.391943 TRUE 1.3919425 12.391943 11.391943 1
14.443107 TRUE 3.4431071 14.443107 13.443107 1
11.521485 TRUE 0.5214853 11.521485 10.521485 1
9.297947 TRUE -1.7020534 9.297947 8.297947 1
9.493486 FALSE -0.5065144 10.493486 9.493486 1
11.726207 FALSE 1.7262067 12.726207 11.726207 1
10.967600 TRUE -0.0323996 10.967600 9.967600 1
10.727962 FALSE 0.7279618 11.727962 10.727962 1
11.630992 TRUE 0.6309915 11.630992 10.630992 1
7.396978 FALSE -2.6030225 8.396978 7.396978 1
8.647235 FALSE -1.3527651 9.647235 8.647235 1
7.863456 TRUE -3.1365439 7.863456 6.863456 1
11.123284 TRUE 0.1232844 11.123284 10.123284 1
11.866272 FALSE 1.8662717 12.866272 11.866272 1
10.703307 TRUE -0.2966932 10.703307 9.703307 1
10.441176 FALSE 0.4411759 11.441176 10.441176 1
11.146730 TRUE 0.1467301 11.146730 10.146730 1
12.135158 TRUE 1.1351579 12.135158 11.135158 1
9.962801 FALSE -0.0371985 10.962801 9.962801 1
10.071319 FALSE 0.0713195 11.071319 10.071319 1
11.345728 FALSE 1.3457283 12.345728 11.345728 1
11.123247 FALSE 1.1232471 12.123247 11.123247 1
9.768669 TRUE -1.2313305 9.768669 8.768669 1
14.157733 TRUE 3.1577331 14.157733 13.157733 1
14.212989 TRUE 3.2129891 14.212989 13.212989 1
9.651103 TRUE -1.3488971 9.651103 8.651103 1
10.330899 TRUE -0.6691009 10.330899 9.330899 1
7.312035 TRUE -3.6879654 7.312035 6.312035 1
Note

kable and scroll_box come from other packages. For simplicity and clarity, we loaded those packages at the beginning of the document.

Important

In an actual study, we would only observe the first two columns: the treatment status of each individual and their actual grade.

Plotting the data

We can then plot the data to check that the distribution of grades is similar to what we wanted.

Code
gen_data |> 
  ggplot(aes(x = grade)) +
  geom_density() +
  geom_vline(xintercept = mean(gen_data$grade), size = 1) +
  labs(
    title = "Distribution of the observed grades in the sample",
    x = "Grade", 
    y = "Density"
  )

Similarly, we compute the mean and standard deviation of our variable.

Code
gen_data |> 
  summarise(
    mean_grade = mean(grade),
    sd_grade = sd(grade)
  ) |> 
  kable()
mean_grade sd_grade
10.93328 2.105984

This distribution seems realistic: we chose realistic parameter values.

Warning

Note that the mean of our grade distribution depends on the treatment effect size and the proportion of treated: $\alpha_0$ sets the mean of $Grade(0)$, the grades in the absence of treatment, not of the observed grades (which are a mixture of $Grade(0)$ and $Grade(1)$).
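
A quick arithmetic check of this, with the parameter values set above:

```r
# The population mean of observed grades is a mixture of the two
# potential-outcome means:
# p_treat * E[Grade(1)] + (1 - p_treat) * E[Grade(0)]
#   = alpha_0 + p_treat * beta_0
alpha_0 <- 10
beta_0 <- 1
p_treat <- 0.5

alpha_0 + p_treat * beta_0  # 10.5, not alpha_0 = 10
```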

Now that we have generated a data set, we can estimate the effect of the treatment in this data set.

Estimating the effect of the treatment

We now want to regress grade on treat and estimate the regression model $Grade = \alpha + \beta Treat$. Note that our regression model represents the actual DGP here. In reality, in economics, the model basically never represents the DGP, which adds a layer of complexity.

reg <- lm(grade ~ treat, data = gen_data) 

We then print the output of the regression, focusing only on the values of interest: the point estimate $\hat{\beta}$ and its standard error.

reg_interest <- reg |> 
  summary() |> 
  broom::tidy() |> 
  filter(term == "treatTRUE") 
Code
reg_interest |> 
  kable()
term estimate std.error statistic p.value
treatTRUE 1.286566 0.4051383 3.175621 0.0019992

The estimate is statistically significant at the 5% level: we can reject the null of no effect (and indeed, we know that there is an effect; we put it in the data ourselves).

Comparison to the true effect

The whole game here is to check whether the estimate we obtained is close to the true effect of the treatment and whether it is significant.

What is the true effect of the treatment?

At the population level, we set this true effect to be equal to $\beta_0$. However, the true effect in the sample may differ from the true effect in the population. We can compute the true effect in the sample by averaging the individual treatment effects, i.e., the differences between the potential outcomes. Here, since the treatment effect is homogeneous, all individual treatment effects are identical and equal to the population average treatment effect. We can therefore compute the “bias” of our estimate: the difference between the estimate and the true treatment effect.

bias <- reg_interest$estimate - beta_0

The bias is 0.29, i.e., 29% of the true effect. This is quite a large bias.

So far, we have generated one data set and estimated the average treatment effect in this data set. Yet, there is some uncertainty associated with this estimate: the associated standard error is 0.41. This means that, with a different data set, we may have obtained a different point estimate. We will now explore this question: what would have happened if we had another sample?

Repeat

For loop

To know what would have happened if we had another sample, we can simply rerun the analysis, recompiling our code several times. This generates a new data set and re-estimates the ATE on this different data set. However, recompiling everything by hand is quite cumbersome. Instead, we will write a for loop to automatically repeat the analysis (and store the results).

Let’s start again and act as if we had not written anything yet.

Tip

To write a for loop, first write what is inside the loop, make sure it runs and then put it into a loop.

n_iter <- 100
res <- NULL

n <- 100
alpha_0 <- 10
beta_0 <- 1
sigma_u <- 2
p_treat <- 0.5

for (i in 1:n_iter) {
  res <- 
    tibble(
      u = rnorm(n, 0, sigma_u),
      treat = rbernoulli(n, p = p_treat),
      grade = alpha_0 + beta_0*treat + u,
      grade_1 = alpha_0 + beta_0 + u,
      grade_0 = alpha_0 + u,
      indiv_effect = grade_1 - grade_0
    ) %>% 
    lm(grade ~ treat, data = .) |> 
    summary() |> 
    broom::tidy(conf.int = TRUE, conf.level = 0.95) |> 
    filter(term == "treatTRUE") |> 
    select(-term) |> 
    mutate(sim_id = i) |> 
    bind_rows(res)
}
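
As a side note, growing res with bind_rows() inside the loop copies the accumulated results at every iteration and leaves sim_id in decreasing order. A sketch of an alternative (same DGP; runif(n) < p_treat stands in for rbernoulli()): preallocate a list and bind once at the end.

```r
library(tidyverse)
library(broom)

set.seed(35)
n_iter <- 100
n <- 100; alpha_0 <- 10; beta_0 <- 1; sigma_u <- 2; p_treat <- 0.5

res_list <- vector("list", n_iter)  # preallocate one slot per iteration

for (i in 1:n_iter) {
  data_i <- tibble(
    u = rnorm(n, 0, sigma_u),
    treat = runif(n) < p_treat,  # Bernoulli draws
    grade = alpha_0 + beta_0 * treat + u
  )
  res_list[[i]] <- lm(grade ~ treat, data = data_i) |>
    tidy(conf.int = TRUE, conf.level = 0.95) |>
    filter(term == "treatTRUE") |>
    mutate(sim_id = i)
}

res_alt <- bind_rows(res_list)  # bind once; sim_id in increasing order
```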

What do we get?

Code
res |> 
  kable() |> 
  scroll_box(width = "100%", height = "200px")
estimate std.error statistic p.value conf.low conf.high sim_id
0.8725538 0.4083589 2.1367326 0.0351095 0.0621788 1.6829288 100
1.0451750 0.4080325 2.5614994 0.0119463 0.2354478 1.8549022 99
1.1312827 0.4537682 2.4930852 0.0143398 0.2307945 2.0317709 98
1.1421168 0.3995833 2.8582695 0.0052035 0.3491567 1.9350769 97
0.6471430 0.4526530 1.4296670 0.1559929 -0.2511321 1.5454180 96
0.2971704 0.3841368 0.7736058 0.4410261 -0.4651365 1.0594774 95
0.5436053 0.3710357 1.4651024 0.1460939 -0.1927030 1.2799136 94
0.7115554 0.4149337 1.7148651 0.0895306 -0.1118671 1.5349778 93
1.0924949 0.4290158 2.5465142 0.0124374 0.2411270 1.9438629 92
0.2461773 0.4175948 0.5895124 0.5568742 -0.5825259 1.0748805 91
1.0333288 0.4163229 2.4820365 0.0147643 0.2071494 1.8595081 90
1.6530151 0.4170113 3.9639579 0.0001401 0.8254698 2.4805603 89
0.5895902 0.3960097 1.4888278 0.1397438 -0.1962781 1.3754586 88
0.9123130 0.4033208 2.2620035 0.0259059 0.1119360 1.7126899 87
1.3177480 0.3634680 3.6254856 0.0004598 0.5964576 2.0390385 86
0.9406685 0.3827689 2.4575364 0.0157460 0.1810761 1.7002610 85
0.6354703 0.3906715 1.6266104 0.1070312 -0.1398045 1.4107451 84
1.7939966 0.3950838 4.5408007 0.0000160 1.0099657 2.5780275 83
1.2135079 0.3997138 3.0359421 0.0030716 0.4202889 2.0067269 82
1.1593593 0.3800090 3.0508729 0.0029357 0.4052437 1.9134749 81
0.9147777 0.3896745 2.3475431 0.0209083 0.1414813 1.6880741 80
0.3692997 0.4357736 0.8474575 0.3988051 -0.4954790 1.2340783 79
0.8272228 0.3746005 2.2082799 0.0295560 0.0838403 1.5706053 78
1.2797161 0.3867299 3.3090691 0.0013103 0.5122631 2.0471690 77
0.3297614 0.4404869 0.7486295 0.4558736 -0.5443704 1.2038933 76
1.1104987 0.4518764 2.4575276 0.0157464 0.2137647 2.0072327 75
1.1296937 0.3376454 3.3457991 0.0011641 0.4596474 1.7997401 74
0.7509941 0.4380324 1.7144716 0.0896031 -0.1182669 1.6202552 73
1.0488365 0.4008660 2.6164271 0.0102917 0.2533311 1.8443420 72
0.8001015 0.3906172 2.0483008 0.0432046 0.0249344 1.5752686 71
0.7950048 0.4004750 1.9851548 0.0499222 0.0002753 1.5897343 70
0.5999817 0.4078927 1.4709301 0.1445137 -0.2094682 1.4094316 69
1.0625139 0.3586906 2.9622013 0.0038329 0.3507040 1.7743237 68
0.6729487 0.4133360 1.6280912 0.1067164 -0.1473032 1.4932006 67
0.7723290 0.4318465 1.7884343 0.0767961 -0.0846562 1.6293143 66
1.5703815 0.3984381 3.9413437 0.0001520 0.7796941 2.3610690 65
1.3377061 0.4367320 3.0629909 0.0028294 0.4710257 2.2043865 64
1.8131116 0.3566687 5.0834617 0.0000018 1.1053142 2.5209090 63
0.6942697 0.4343603 1.5983729 0.1131794 -0.1677041 1.5562436 62
1.2638621 0.4200097 3.0091264 0.0033306 0.4303667 2.0973576 61
0.4145652 0.3852562 1.0760767 0.2845350 -0.3499632 1.1790936 60
0.6091501 0.4391088 1.3872417 0.1685149 -0.2622471 1.4805473 59
1.0047163 0.4481372 2.2419839 0.0272171 0.1154027 1.8940300 58
1.1027469 0.4256980 2.5904444 0.0110467 0.2579631 1.9475306 57
0.6859754 0.3981062 1.7230966 0.0880250 -0.1040534 1.4760041 56
0.9697924 0.4222566 2.2966895 0.0237651 0.1318378 1.8077469 55
0.9573040 0.3979120 2.4058186 0.0180117 0.1676607 1.7469474 54
0.8836766 0.4370793 2.0217762 0.0459256 0.0163069 1.7510464 53
1.0097388 0.4199880 2.4042088 0.0180867 0.1762863 1.8431913 52
0.7924508 0.4013860 1.9742860 0.0511639 -0.0040867 1.5889884 51
0.4043102 0.3696081 1.0938889 0.2766852 -0.3291650 1.1377853 50
0.9576403 0.3681667 2.6011051 0.0107311 0.2270254 1.6882551 49
0.9118646 0.4236947 2.1521737 0.0338398 0.0710563 1.7526729 48
1.5369827 0.4024160 3.8193875 0.0002347 0.7384012 2.3355642 47
0.1715084 0.4209315 0.4074496 0.6845665 -0.6638165 1.0068333 46
0.8917278 0.4123860 2.1623622 0.0330240 0.0733613 1.7100943 45
0.8516098 0.3777465 2.2544477 0.0263941 0.1019841 1.6012355 44
0.9764321 0.3730199 2.6176406 0.0102576 0.2361862 1.7166781 43
1.5398226 0.3884399 3.9641203 0.0001401 0.7689762 2.3106690 42
1.0493278 0.4281191 2.4510187 0.0160168 0.1997395 1.8989162 41
0.9541602 0.4202993 2.2701924 0.0253858 0.1200900 1.7882304 40
0.9499480 0.4075083 2.3311132 0.0217962 0.1412610 1.7586351 39
0.3014509 0.3734616 0.8071806 0.4215172 -0.4396714 1.0425732 38
1.1349699 0.3863511 2.9376642 0.0041225 0.3682687 1.9016712 37
0.9241645 0.3459322 2.6715189 0.0088431 0.2376732 1.6106557 36
0.1684784 0.4049658 0.4160311 0.6782969 -0.6351630 0.9721198 35
0.2660638 0.4023680 0.6612449 0.5100069 -0.5324224 1.0645500 34
0.6540052 0.3969428 1.6476057 0.1026364 -0.1337148 1.4417253 33
1.2893119 0.4134889 3.1181292 0.0023895 0.4687566 2.1098672 32
1.3135402 0.4514491 2.9096086 0.0044784 0.4176541 2.2094262 31
1.0253142 0.4134565 2.4798600 0.0148493 0.2048233 1.8458052 30
1.0730963 0.3795312 2.8274256 0.0056894 0.3199289 1.8262636 29
0.6123842 0.4077969 1.5016891 0.1363928 -0.1968755 1.4216438 28
1.5374353 0.3900948 3.9411841 0.0001521 0.7633049 2.3115658 27
0.6956575 0.4224739 1.6466283 0.1028377 -0.1427282 1.5340433 26
0.7640088 0.4310064 1.7726161 0.0793996 -0.0913093 1.6193269 25
1.0218385 0.4063789 2.5144971 0.0135482 0.2153928 1.8282841 24
1.2499663 0.3542607 3.5283797 0.0006383 0.5469475 1.9529852 23
1.6103230 0.4078807 3.9480245 0.0001484 0.8008970 2.4197489 22
0.9244306 0.3693394 2.5029296 0.0139708 0.1914885 1.6573726 21
0.4136146 0.3672289 1.1263128 0.2627837 -0.3151392 1.1423684 20
0.5514754 0.4430437 1.2447427 0.2161948 -0.3277304 1.4306812 19
0.6549261 0.4323959 1.5146445 0.1330812 -0.2031496 1.5130018 18
0.5372136 0.4122891 1.3030020 0.1956274 -0.2809608 1.3553880 17
1.2956686 0.4321035 2.9985142 0.0034386 0.4381732 2.1531640 16
0.6674213 0.3931016 1.6978343 0.0927124 -0.1126760 1.4475186 15
0.9725591 0.4551595 2.1367432 0.0351086 0.0693098 1.8758084 14
-0.0312996 0.3960280 -0.0790339 0.9371668 -0.8172043 0.7546051 13
1.2320876 0.3974476 3.0999999 0.0025266 0.4433657 2.0208095 12
0.0957748 0.3800946 0.2519764 0.8015870 -0.6585104 0.8500601 11
1.2650406 0.4096523 3.0880839 0.0026206 0.4520990 2.0779823 10
1.2604960 0.4302230 2.9298668 0.0042187 0.4067325 2.1142596 9
1.2012604 0.4159299 2.8881322 0.0047696 0.3758611 2.0266596 8
0.9617566 0.3897398 2.4676892 0.0153323 0.1883307 1.7351825 7
0.8132844 0.4121962 1.9730516 0.0513066 -0.0047056 1.6312743 6
1.1310331 0.4255266 2.6579610 0.0091813 0.2865894 1.9754768 5
1.3008521 0.4142995 3.1398830 0.0022341 0.4786882 2.1230161 4
0.7035932 0.4304360 1.6346059 0.1053400 -0.1505930 1.5577795 3
1.6768648 0.4048088 4.1423620 0.0000730 0.8735348 2.4801947 2
0.8299627 0.3889488 2.1338610 0.0353501 0.0581064 1.6018189 1


We get n_iter estimates of our effect of interest, estimated on n_iter different data sets (generated from the same DGP). Let’s plot some of these estimates:

Code
res |> 
  arrange(sim_id) |> 
  slice(1:30) |> 
  ggplot(aes(x = sim_id, y = estimate)) + 
  geom_point() + 
  geom_pointrange(aes(ymin = conf.low, ymax = conf.high)) +
  geom_hline(yintercept = beta_0) +
  geom_hline(yintercept = 0, linetype = "solid") +
  labs(
    title = "Estimates of the ATE",
    subtitle = paste("Computed on 30 different data sets"),
    x = "Simulation id",
    y = "Estimate"
  ) 

Some are close to the true effect, some are further away. For a clearer summary of the distribution of estimates, we can plot it directly:

Code
res |> 
  ggplot(aes(x = estimate)) + 
  geom_dotplot() + 
  geom_vline(aes(xintercept = mean(res$estimate)), size = 1.2) + 
  labs(
    title = paste("Distribution of", n_iter, "estimates of the ATE"),
    subtitle = paste("Computed on", n_iter, "different data sets"),
    x = "Estimate",
    y = NULL,
    caption = "Each dot represents one estimate
      The dotted line represents the mean of the estimates"
  ) +
  theme(panel.grid.major.y = element_blank(), axis.text.y = element_blank())

We also notice that the average of the estimates is rather close to the true effect: our estimator seems to be unbiased (in the sense that $\mathbb{E}(\hat{\beta}) = \beta_0$).

However, if we implement our experiment with these chosen parameters ($n$ and $p_{treat}$), assuming that the DGP is as described, it is quite likely that the estimate we would obtain would be rather far from the true effect.

Note

We know that this estimator should be distributed normally. It is not clear from this graph since we have only a limited number of iterations. We will however double check that later in this document.
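
One quick way to see this already, as a sketch with hard-coded parameter values from above: redraw many estimates of the ATE under the assumed DGP (using the difference in means, which equals the regression coefficient) and inspect a normal QQ-plot.

```r
set.seed(35)
# Draw 1000 estimates of the ATE under the assumed DGP
# (n = 100, alpha_0 = 10, beta_0 = 1, sigma_u = 2, p_treat = 0.5)
estimates <- replicate(1000, {
  treat <- runif(100) < 0.5
  grade <- 10 + 1 * treat + rnorm(100, 0, 2)
  mean(grade[treat]) - mean(grade[!treat])
})

qqnorm(estimates)
qqline(estimates)  # points falling on the line suggest normality
```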

Statistical power of the analysis

Let’s now explore our initial question: would we have enough statistical power to detect the true effect?

Which simulations would yield a significant estimate?

Code
res |>  
  kable() |> 
  column_spec(
    column = 2, 
    color = "white",
    background = spec_color(
      res$p.value < 0.05, 
      pal = c("#C51E3A", "#009E60")
    )
  ) |> 
  scroll_box(height = "250px")
[Same table as above, with the estimate column colored by significance (p.value < 0.05).]


Let’s compute the proportion of significant results at the 5% level:

power <- mean(res$p.value < 0.05)

The statistical power of this design would be 64%. In 36% of cases, we would not be able to reject the null of no effect.

Why is that problematic? Since we wrote the DGP, we know the true effect of the treatment on grades: the treatment increases grades by 1 point. If we invest resources into actually implementing this experiment, and if the DGP is close to what we simulated, in 36% of cases we would not be able to state that the average treatment effect is non-null, even though it is actually equal to 1.
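
Note also that the power estimate itself carries Monte Carlo error: it is a proportion computed from only 100 draws. A binomial confidence interval (64 significant results out of 100 iterations, as above) shows how imprecise it still is:

```r
# Exact (Clopper-Pearson) 95% CI for the simulated power estimate,
# based on 64 significant results out of 100 iterations
binom.test(x = 64, n = 100)$conf.int
# roughly 0.54 to 0.73: more iterations would tighten this interval
```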

Functions

Now that we have computed the statistical power for this set of parameters, we may wonder how it varies with the values of the parameters. We therefore want to re-run the analysis for another set of parameters. We could just rerun all this code, changing only the parameter values, but this copy-and-paste approach would be cumbersome and hurt reproducibility. Instead, we will write a function that takes as inputs the parameters we want to vary and runs one simulation. Then, we will be able to call this function as many times as we want, varying the values of the parameters.

A good practice in R programming is “one action, one function” and vice-versa. We will therefore create several functions and then wrap it all within one overall function (compute_sim()): we first generate_data() and then run_estim(). I chose short names that clearly describe what the functions do.

Important

The simulation workflow we will use throughout the class is as follows:

  • Define a set of baseline parameters and store them in a data frame,
  • Generate data for a given set of parameters,
  • Run the estimation and extract the outcomes of interest,
  • Compute one simulation by combining the previous steps,
  • Replicate the process for a number of iterations and a set of parameters.

generate_data()

Let’s first write a function to generate data and run it on our set of parameters:

generate_data <- function(n,
                          p_treat,
                          alpha_0,
                          beta_0,
                          sigma_u) {
  tibble(
    u = rnorm(n, 0, sigma_u),
    treat = rbernoulli(n, p = p_treat),
    grade = alpha_0 + beta_0 * treat + u,
    grade_1 = alpha_0 + beta_0 + u,
    grade_0 = alpha_0 + u,
    indiv_effect = grade_1 - grade_0
  )
}
Note

We will only be able to easily vary the parameters that are inputs of the functions. To change other aspects of the DGP, such as its functional form (introducing heterogeneous effects, for instance), we would need to add parameters to generate_data() enabling us to do so.
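To illustrate, here is a hypothetical variant of generate_data() (not part of the original code) with an extra sigma_beta parameter governing the spread of individual treatment effects; setting sigma_beta = 0 recovers the homogeneous DGP:

```r
library(tibble)

# Hypothetical extension: individual treatment effects are drawn around
# beta_0, with a spread controlled by the extra parameter sigma_beta.
generate_data_het <- function(n, p_treat, alpha_0, beta_0, sigma_u, sigma_beta) {
  tibble(
    u = rnorm(n, 0, sigma_u),
    beta_i = rnorm(n, beta_0, sigma_beta),  # heterogeneous effects
    treat = runif(n) < p_treat,             # Bernoulli draw
    grade = alpha_0 + beta_i * treat + u,
    indiv_effect = beta_i                   # no longer constant across students
  )
}
```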

To pass the values of the parameters to the function, we can store them in a data frame and use the function purrr::pmap(). You will see that this approach comes in handy. Let’s therefore create a tibble to store our baseline parameters. Note that pmap() returns a list; to turn it into a tibble, we use the function list_rbind().

baseline_param <- tibble(
  n = 100,
  p_treat = 0.5,
  alpha_0 = 10,
  beta_0 = 1,
  sigma_u = 2
)

baseline_param |> 
  purrr::pmap(.f = generate_data) |> 
  list_rbind() |> 
  kable() |> 
  scroll_box(height = "200px", width = "100%")
u treat grade grade_1 grade_0 indiv_effect
1.2961732 TRUE 12.296173 12.296173 11.296173 1
1.9286980 FALSE 11.928698 12.928698 11.928698 1
3.4106607 TRUE 14.410661 14.410661 13.410661 1
-0.4692821 TRUE 10.530718 10.530718 9.530718 1
1.7861525 FALSE 11.786153 12.786153 11.786153 1
-1.4067537 TRUE 9.593246 9.593246 8.593246 1
-1.2030811 FALSE 8.796919 9.796919 8.796919 1
-2.2918364 FALSE 7.708164 8.708164 7.708164 1
0.9110209 TRUE 11.911021 11.911021 10.911021 1
2.1620647 FALSE 12.162065 13.162065 12.162065 1
-0.4607435 TRUE 10.539257 10.539257 9.539257 1
2.7333430 FALSE 12.733343 13.733343 12.733343 1
-0.8427977 TRUE 10.157202 10.157202 9.157202 1
3.0035888 TRUE 14.003589 14.003589 13.003589 1
2.4184287 TRUE 13.418429 13.418429 12.418429 1
0.0717223 TRUE 11.071722 11.071722 10.071722 1
-1.2980604 FALSE 8.701940 9.701940 8.701940 1
-1.5655372 TRUE 9.434463 9.434463 8.434463 1
1.3812708 TRUE 12.381271 12.381271 11.381271 1
1.3003620 FALSE 11.300362 12.300362 11.300362 1
-2.2595389 TRUE 8.740461 8.740461 7.740461 1
1.2613916 TRUE 12.261392 12.261392 11.261392 1
3.1617608 FALSE 13.161761 14.161761 13.161761 1
2.6719146 TRUE 13.671915 13.671915 12.671915 1
-3.0066248 FALSE 6.993375 7.993375 6.993375 1
0.1842885 TRUE 11.184288 11.184288 10.184288 1
0.9189807 TRUE 11.918981 11.918981 10.918981 1
-0.7760913 FALSE 9.223909 10.223909 9.223909 1
0.2753327 FALSE 10.275333 11.275333 10.275333 1
-2.3150007 FALSE 7.684999 8.684999 7.684999 1
-0.4048655 FALSE 9.595135 10.595135 9.595135 1
-0.7786494 TRUE 10.221351 10.221351 9.221351 1
3.3547019 FALSE 13.354702 14.354702 13.354702 1
0.7892373 TRUE 11.789237 11.789237 10.789237 1
0.3069108 TRUE 11.306911 11.306911 10.306911 1
-0.2129346 TRUE 10.787065 10.787065 9.787065 1
-1.1470196 TRUE 9.852980 9.852980 8.852980 1
-0.4848390 FALSE 9.515161 10.515161 9.515161 1
0.7981876 FALSE 10.798188 11.798188 10.798188 1
1.3369512 FALSE 11.336951 12.336951 11.336951 1
-0.2514232 TRUE 10.748577 10.748577 9.748577 1
1.6246313 FALSE 11.624631 12.624631 11.624631 1
3.0434204 TRUE 14.043420 14.043420 13.043420 1
0.3017659 FALSE 10.301766 11.301766 10.301766 1
0.1910647 FALSE 10.191065 11.191065 10.191065 1
-0.1891320 TRUE 10.810868 10.810868 9.810868 1
-2.7064128 FALSE 7.293587 8.293587 7.293587 1
0.6709531 TRUE 11.670953 11.670953 10.670953 1
0.1858125 TRUE 11.185813 11.185813 10.185813 1
1.9276119 TRUE 12.927612 12.927612 11.927612 1
1.0297105 TRUE 12.029711 12.029711 11.029711 1
-2.2170420 FALSE 7.782958 8.782958 7.782958 1
-4.8140107 TRUE 6.185989 6.185989 5.185989 1
2.2621623 FALSE 12.262162 13.262162 12.262162 1
3.0677530 FALSE 13.067753 14.067753 13.067753 1
0.1824523 FALSE 10.182452 11.182452 10.182452 1
-2.1729546 FALSE 7.827045 8.827045 7.827045 1
-0.5478949 TRUE 10.452105 10.452105 9.452105 1
-0.2598406 TRUE 10.740159 10.740159 9.740159 1
-1.6644754 TRUE 9.335525 9.335525 8.335525 1
3.8395333 TRUE 14.839533 14.839533 13.839533 1
-1.7928331 FALSE 8.207167 9.207167 8.207167 1
-1.6051881 TRUE 9.394812 9.394812 8.394812 1
-0.5268842 FALSE 9.473116 10.473116 9.473116 1
3.2744683 TRUE 14.274468 14.274468 13.274468 1
-0.7001067 TRUE 10.299893 10.299893 9.299893 1
-0.4020117 FALSE 9.597988 10.597988 9.597988 1
0.6493842 FALSE 10.649384 11.649384 10.649384 1
7.7224753 FALSE 17.722475 18.722475 17.722475 1
0.1179377 TRUE 11.117938 11.117938 10.117938 1
-1.2592594 FALSE 8.740741 9.740741 8.740741 1
-2.2122149 TRUE 8.787785 8.787785 7.787785 1
3.2661898 FALSE 13.266190 14.266190 13.266190 1
-1.4583202 FALSE 8.541680 9.541680 8.541680 1
2.0347647 TRUE 13.034765 13.034765 12.034765 1
0.5586079 TRUE 11.558608 11.558608 10.558608 1
2.8451873 TRUE 13.845187 13.845187 12.845187 1
0.0975708 FALSE 10.097571 11.097571 10.097571 1
-2.3853189 TRUE 8.614681 8.614681 7.614681 1
1.9161986 TRUE 12.916199 12.916199 11.916199 1
3.5303603 FALSE 13.530360 14.530360 13.530360 1
-0.3455935 FALSE 9.654407 10.654407 9.654407 1
-4.0094273 TRUE 6.990573 6.990573 5.990573 1
0.7104924 TRUE 11.710492 11.710492 10.710492 1
-0.4935580 TRUE 10.506442 10.506442 9.506442 1
2.4614205 TRUE 13.461420 13.461420 12.461420 1
0.7822820 TRUE 11.782282 11.782282 10.782282 1
0.2109578 FALSE 10.210958 11.210958 10.210958 1
0.0585406 FALSE 10.058541 11.058541 10.058541 1
2.4452118 TRUE 13.445212 13.445212 12.445212 1
0.5311174 TRUE 11.531117 11.531117 10.531117 1
2.9355915 TRUE 13.935591 13.935591 12.935591 1
1.3721081 FALSE 11.372108 12.372108 11.372108 1
2.9175897 TRUE 13.917590 13.917590 12.917590 1
-2.2117099 TRUE 8.788290 8.788290 7.788290 1
1.2260678 TRUE 12.226068 12.226068 11.226068 1
-0.9260947 TRUE 10.073905 10.073905 9.073905 1
-4.4512894 TRUE 6.548711 6.548711 5.548711 1
3.0451754 FALSE 13.045175 14.045175 13.045175 1
1.0369026 FALSE 11.036903 12.036903 11.036903 1


run_estim()

We can then write a function that takes as input a tibble generated by generate_data(), runs the estimation and selects the information of interest. Here, since our model is very simple, the estimation is also very simple.

run_estim <- function(data) {
  lm(grade ~ treat, data = data) |> 
    summary() |> 
    broom::tidy(conf.int = TRUE, conf.level = 0.95) |> 
    filter(term == "treatTRUE") |> 
    select(-term)
}

It returns the following dataset:

Code
baseline_param |> 
  purrr::pmap(.f = generate_data) |> 
  list_rbind() |> #pmap returns a list, turn it into a data frame
  run_estim() |> 
  kable()
estimate std.error statistic p.value conf.low conf.high
0.0987152 0.3982716 0.247859 0.8047621 -0.6916417 0.8890722

compute_sim()

We then combine generate_data() and run_estim() into a function that computes one iteration of the simulation. We can also add a line of code to store the parameters used in this simulation.

compute_sim <- function(...) {
  generate_data(...) |> 
    run_estim() |> 
    cbind(as_tibble(list(...))) |> #add the parameters used for generation
    as_tibble()
}

Taking the baseline parameters as inputs, it returns this data frame:

Code
baseline_param |> 
  purrr::pmap(.f = compute_sim) |> 
  list_rbind() |> 
  kable()
estimate std.error statistic p.value conf.low conf.high n p_treat alpha_0 beta_0 sigma_u
0.4705079 0.4120373 1.141906 0.2562755 -0.3471668 1.288183 100 0.5 10 1 2

We now have a function that runs one simulation for a given set of parameters.

Replication of the process

To run several iterations of the simulation, we can build a data frame of parameters such that each row contains the values of the parameters for one single iteration of the simulation.

If we want to run n_iter iterations of the simulation for the baseline parameter values, we only have to replicate the one-row tibble baseline_param, for instance with the function uncount().

n_iter <- 100

param <- baseline_param |> 
  uncount(n_iter) #replicates the rows in the dataset

param |> 
  slice(1:10) |> 
  kable()
n p_treat alpha_0 beta_0 sigma_u
100 0.5 10 1 2
100 0.5 10 1 2
100 0.5 10 1 2
100 0.5 10 1 2
100 0.5 10 1 2
100 0.5 10 1 2
100 0.5 10 1 2
100 0.5 10 1 2
100 0.5 10 1 2
100 0.5 10 1 2

Then, we can pass the resulting set of parameter values to compute_sim():

result_sim <- pmap(param, compute_sim) |> 
  list_rbind()
Code
result_sim |> 
  kable() |> 
  scroll_box(height = "200px", width = "100%")
estimate std.error statistic p.value conf.low conf.high n p_treat alpha_0 beta_0 sigma_u
1.0645568 0.3867274 2.7527314 0.0070431 0.2971088 1.832005 100 0.5 10 1 2
1.1946148 0.4422535 2.7011990 0.0081418 0.3169771 2.072253 100 0.5 10 1 2
1.2150933 0.3948763 3.0771493 0.0027098 0.4314742 1.998712 100 0.5 10 1 2
1.1871223 0.3608696 3.2896152 0.0013946 0.4709882 1.903256 100 0.5 10 1 2
0.7033259 0.3803249 1.8492764 0.0674320 -0.0514165 1.458068 100 0.5 10 1 2
0.5692859 0.4132414 1.3776109 0.1714614 -0.2507783 1.389350 100 0.5 10 1 2
1.5939605 0.3429683 4.6475453 0.0000105 0.9133512 2.274570 100 0.5 10 1 2
1.2099362 0.4287868 2.8217664 0.0057830 0.3590227 2.060850 100 0.5 10 1 2
0.7437744 0.3744973 1.9860607 0.0498199 0.0005967 1.486952 100 0.5 10 1 2
0.7024094 0.4130424 1.7005744 0.0921943 -0.1172599 1.522079 100 0.5 10 1 2
1.1624069 0.3687204 3.1525432 0.0021481 0.4306933 1.894121 100 0.5 10 1 2
0.7839064 0.4117092 1.9040294 0.0598398 -0.0331171 1.600930 100 0.5 10 1 2
1.1076757 0.4491706 2.4660468 0.0153986 0.2163113 1.999040 100 0.5 10 1 2
1.0024697 0.4285300 2.3393222 0.0213485 0.1520659 1.852874 100 0.5 10 1 2
1.1721744 0.3874149 3.0256303 0.0031689 0.4033620 1.940987 100 0.5 10 1 2
0.9295248 0.4125592 2.2530698 0.0264840 0.1108144 1.748235 100 0.5 10 1 2
0.9012358 0.3637596 2.4775588 0.0149395 0.1793667 1.623105 100 0.5 10 1 2
0.4123999 0.3670412 1.1235792 0.2639364 -0.3159814 1.140781 100 0.5 10 1 2
1.1331686 0.4381715 2.5861300 0.0111768 0.2636314 2.002706 100 0.5 10 1 2
1.8214171 0.4070740 4.4744129 0.0000207 1.0135920 2.629242 100 0.5 10 1 2
1.3204910 0.4355786 3.0315794 0.0031125 0.4560995 2.184882 100 0.5 10 1 2
0.7404594 0.3861122 1.9177315 0.0580566 -0.0257676 1.506687 100 0.5 10 1 2
0.6741736 0.4182454 1.6119091 0.1101978 -0.1558208 1.504168 100 0.5 10 1 2
0.7304740 0.3967597 1.8410994 0.0686322 -0.0568827 1.517831 100 0.5 10 1 2
0.8634665 0.4158509 2.0763847 0.0404756 0.0382239 1.688709 100 0.5 10 1 2
0.9867908 0.4141252 2.3828321 0.0191087 0.1649728 1.808609 100 0.5 10 1 2
1.2709567 0.3748693 3.3903996 0.0010072 0.5270408 2.014873 100 0.5 10 1 2
1.5695414 0.3448147 4.5518404 0.0000153 0.8852679 2.253815 100 0.5 10 1 2
1.1056177 0.4461091 2.4783570 0.0149082 0.2203286 1.990907 100 0.5 10 1 2
1.4369849 0.3715280 3.8677702 0.0001978 0.6996997 2.174270 100 0.5 10 1 2
1.1964734 0.4109437 2.9115263 0.0044532 0.3809690 2.011978 100 0.5 10 1 2
1.1439945 0.4066184 2.8134353 0.0059233 0.3370736 1.950916 100 0.5 10 1 2
0.6454429 0.3849875 1.6765296 0.0968219 -0.1185522 1.409438 100 0.5 10 1 2
0.5345560 0.3787496 1.4113706 0.1613024 -0.2170602 1.286172 100 0.5 10 1 2
1.0950670 0.3875495 2.8256183 0.0057192 0.3259876 1.864146 100 0.5 10 1 2
1.7787401 0.3967829 4.4829051 0.0000200 0.9913374 2.566143 100 0.5 10 1 2
1.4809992 0.4121122 3.5936797 0.0005122 0.6631760 2.298822 100 0.5 10 1 2
0.3194561 0.4135999 0.7723794 0.4417485 -0.5013196 1.140232 100 0.5 10 1 2
0.4411000 0.3721907 1.1851453 0.2388260 -0.2975003 1.179700 100 0.5 10 1 2
1.5850684 0.4101843 3.8642837 0.0002002 0.7710711 2.399066 100 0.5 10 1 2
1.0056107 0.4127429 2.4364092 0.0166389 0.1865358 1.824686 100 0.5 10 1 2
1.3198551 0.4345467 3.0373147 0.0030589 0.4575113 2.182199 100 0.5 10 1 2
1.0019786 0.4355416 2.3005348 0.0235377 0.1376605 1.866297 100 0.5 10 1 2
0.8072194 0.4318800 1.8690828 0.0645975 -0.0498323 1.664271 100 0.5 10 1 2
1.7876702 0.3359593 5.3210915 0.0000007 1.1209699 2.454371 100 0.5 10 1 2
1.1752437 0.3581907 3.2810561 0.0014332 0.4644260 1.886062 100 0.5 10 1 2
1.1249549 0.3440297 3.2699355 0.0014849 0.4422392 1.807671 100 0.5 10 1 2
0.9778338 0.4517861 2.1643736 0.0328650 0.0812791 1.874389 100 0.5 10 1 2
1.5472503 0.3990567 3.8772690 0.0001912 0.7553352 2.339165 100 0.5 10 1 2
1.4181649 0.4404056 3.2201337 0.0017387 0.5441943 2.292136 100 0.5 10 1 2
1.0141149 0.3548033 2.8582457 0.0052039 0.3100194 1.718211 100 0.5 10 1 2
0.8694589 0.4244201 2.0485807 0.0431767 0.0272109 1.711707 100 0.5 10 1 2
1.2472834 0.4157420 3.0001380 0.0034218 0.4222569 2.072310 100 0.5 10 1 2
0.8974011 0.4030800 2.2263599 0.0282804 0.0975020 1.697300 100 0.5 10 1 2
1.0005158 0.4116257 2.4306444 0.0168902 0.1836579 1.817374 100 0.5 10 1 2
1.0426250 0.4254577 2.4505961 0.0160345 0.1983180 1.886932 100 0.5 10 1 2
1.9718991 0.4152773 4.7483916 0.0000070 1.1477949 2.796003 100 0.5 10 1 2
0.9237137 0.3735763 2.4726236 0.0151348 0.1823636 1.665064 100 0.5 10 1 2
0.6975123 0.4330569 1.6106714 0.1104677 -0.1618749 1.556900 100 0.5 10 1 2
0.8992152 0.3487402 2.5784675 0.0114112 0.2071517 1.591279 100 0.5 10 1 2
1.3709782 0.3762106 3.6441771 0.0004313 0.6244004 2.117556 100 0.5 10 1 2
0.8694626 0.3829040 2.2707062 0.0253534 0.1096020 1.629323 100 0.5 10 1 2
0.4704114 0.3535182 1.3306568 0.1863899 -0.2311339 1.171957 100 0.5 10 1 2
0.6912771 0.3828058 1.8058167 0.0740169 -0.0683885 1.450943 100 0.5 10 1 2
1.8918798 0.4018441 4.7079949 0.0000082 1.0944333 2.689326 100 0.5 10 1 2
1.5497550 0.3865996 4.0086826 0.0001192 0.7825607 2.316949 100 0.5 10 1 2
0.4022261 0.4223832 0.9522776 0.3432990 -0.4359797 1.240432 100 0.5 10 1 2
1.2696831 0.3631001 3.4967855 0.0007093 0.5491228 1.990243 100 0.5 10 1 2
1.9611435 0.4064663 4.8248608 0.0000051 1.1545243 2.767763 100 0.5 10 1 2
1.1781857 0.3789875 3.1087722 0.0024594 0.4260974 1.930274 100 0.5 10 1 2
0.8484961 0.3652482 2.3230673 0.0222430 0.1236730 1.573319 100 0.5 10 1 2
1.3673080 0.3729516 3.6661812 0.0004000 0.6271978 2.107418 100 0.5 10 1 2
1.0944645 0.4192586 2.6104761 0.0104604 0.2624595 1.926470 100 0.5 10 1 2
0.6579463 0.4015362 1.6385728 0.1045090 -0.1388892 1.454782 100 0.5 10 1 2
0.5460748 0.4288729 1.2732788 0.2059311 -0.3050095 1.397159 100 0.5 10 1 2
1.5092727 0.4275526 3.5300279 0.0006348 0.6608084 2.357737 100 0.5 10 1 2
1.0264178 0.3960114 2.5918895 0.0110034 0.2405461 1.812290 100 0.5 10 1 2
0.7438798 0.3885072 1.9147131 0.0584455 -0.0271001 1.514860 100 0.5 10 1 2
0.9649140 0.3801549 2.5382129 0.0127173 0.2105090 1.719319 100 0.5 10 1 2
1.2350336 0.3722354 3.3178837 0.0012738 0.4963447 1.973723 100 0.5 10 1 2
1.0572469 0.3991354 2.6488431 0.0094154 0.2651758 1.849318 100 0.5 10 1 2
1.0608290 0.4167005 2.5457830 0.0124619 0.2339005 1.887758 100 0.5 10 1 2
1.2794563 0.4984122 2.5670644 0.0117684 0.2903734 2.268539 100 0.5 10 1 2
0.6165308 0.3930177 1.5687102 0.1199388 -0.1634000 1.396462 100 0.5 10 1 2
1.1463968 0.4061707 2.8224510 0.0057716 0.3403644 1.952429 100 0.5 10 1 2
0.5624193 0.4217443 1.3335553 0.1854410 -0.2745185 1.399357 100 0.5 10 1 2
1.3684339 0.3930059 3.4819681 0.0007451 0.5885266 2.148341 100 0.5 10 1 2
0.9069241 0.3880047 2.3374053 0.0214523 0.1369415 1.676907 100 0.5 10 1 2
1.7209916 0.4321884 3.9820402 0.0001313 0.8633278 2.578655 100 0.5 10 1 2
0.9090895 0.4173288 2.1783534 0.0317781 0.0809142 1.737265 100 0.5 10 1 2
0.9640210 0.4466262 2.1584516 0.0333351 0.0777059 1.850336 100 0.5 10 1 2
1.7227144 0.3678578 4.6830991 0.0000091 0.9927126 2.452716 100 0.5 10 1 2
0.8246534 0.3826758 2.1549661 0.0336145 0.0652457 1.584061 100 0.5 10 1 2
1.1426723 0.3698293 3.0897291 0.0026075 0.4087581 1.876587 100 0.5 10 1 2
0.7958239 0.3838384 2.0733308 0.0407650 0.0341092 1.557539 100 0.5 10 1 2
1.8743224 0.4198037 4.4647591 0.0000215 1.0412356 2.707409 100 0.5 10 1 2
1.0666256 0.4018105 2.6545490 0.0092683 0.2692458 1.864005 100 0.5 10 1 2
0.9026697 0.3869554 2.3327484 0.0217063 0.1347692 1.670570 100 0.5 10 1 2
1.4861488 0.4193743 3.5437286 0.0006063 0.6539141 2.318384 100 0.5 10 1 2
0.7394798 0.4003117 1.8472598 0.0677264 -0.0549258 1.533885 100 0.5 10 1 2
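As before, the power is the share of significant results across iterations. A small helper sketching this computation (compute_power and its grouping on n are my additions, not from the original code; the grouping anticipates the next section, where n varies):

```r
library(dplyr)

# Share of iterations rejecting the null at the 5% level, by sample size.
compute_power <- function(results) {
  results |> 
    group_by(n) |> 
    summarise(power = mean(p.value < 0.05), .groups = "drop")
}

# e.g. compute_power(result_sim) for the simulations above
```

Applied to result_sim, it returns a single row since n is constant; once n varies, the same call gives one power estimate per sample size.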


Using this method, we can also easily run a series of simulations for several parameter values.

Varying parameters

Remember, the initial goal was to find the right sample size for our RCT: which sample size should we choose to have enough statistical power? To answer this, we can vary the sample size in our simulations and see at which sample size the power exceeds a given threshold.

Varying the sample size

To run simulations for several values of n, we can modify the param data frame such that, when passed to compute_sim(), it will implement the analysis for several values of n. To do so, we use the function crossing():

n_iter <- 500

vect_n <- seq(from = 50, to = 300, by = 50)

param_var_n <- baseline_param |> 
  select(-n) |> 
  crossing(n = vect_n) |> 
  uncount(n_iter)

This new data set contains 6 (the length of vect_n) times n_iter rows. For each value of vect_n, the parameters are repeated n_iter times:

Code
param_var_n |>
  kable() |> 
  scroll_box(height = "200px", width = "100%")
p_treat alpha_0 beta_0 sigma_u n
0.5 10 1 2 50
0.5 10 1 2 50
0.5 10 1 2 50
… (n_iter = 500 identical rows for each value of n: 50, 100, 150, 200, 250, 300)
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 150
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 200
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 250
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300
0.5 10 1 2 300


To choose the range of values for the sample size, we first proceed somewhat at random, specifying a wide but coarse grid of values (for instance `vect_n <- seq(10, 1000, by = 100)`), and then narrow it down and refine it based on the simulation results.
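This coarse-to-fine approach can be sketched as follows (the refined grid is illustrative, chosen to bracket the sample sizes reported in the results below):

```r
# Wide but coarse first pass over plausible sample sizes
vect_n_coarse <- seq(10, 1000, by = 100)

# Refined grid around the region where power approaches the 80% target
vect_n <- seq(50, 300, by = 50)
```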

Tip

When exploring which parameter values to consider, use a small number of iterations. The results will be imprecise and highly variable, but quick to compute.

We can now compute the simulation for all parameters values and store the results in a data frame.

sim_res_var_n <- param_var_n |> 
  pmap(compute_sim, .progress = TRUE) |> 
  list_rbind()

beepr::beep()
Tip

When the simulations take a long time to run, we use the `.progress` argument of the `pmap()` function to display a progress bar. In addition, we use the `beepr::beep()` function to play a sound once the simulation is done.

We can now compute the statistical power, i.e., the proportion of significant estimates. We also compute the standard deviation of the significance indicator across iterations; note that these values are not yet divided by the square root of the number of iterations and should therefore not be read directly as standard errors of the power estimates.

Code
sim_power_var_n <- sim_res_var_n |> 
  mutate(signif = (p.value < 0.05)) |> 
  group_by(n) |> 
  summarise(
    power = mean(signif),
    se_power = sd(signif)
  ) 

sim_power_var_n |> 
  kable()
| n | power | se_power |
|----:|------:|---------:|
| 50 | 0.352 | 0.4780723 |
| 100 | 0.680 | 0.4669433 |
| 150 | 0.882 | 0.3229312 |
| 200 | 0.928 | 0.2587468 |
| 250 | 0.982 | 0.1330843 |
| 300 | 0.994 | 0.0773043 |
Code
sim_power_var_n |> 
  ggplot(aes(x = n, y = power)) + 
  geom_point() +
  geom_line() + 
  labs(
    title = "Statistical Power as Function of Sample Size",
    subtitle = paste("Computed with", n_iter, "simulations"),
    x = "Number of students",
    y = "Statistical Power"
  )

We notice that power is an increasing function of the number of observations. This result is well known and thus not surprising. Power depends on the precision (i.e., the variance) of the estimator: the more precise the estimator, the greater the power. The variance of this univariate OLS estimator is given by $\sigma_{\hat{\beta}}^2 = \dfrac{\sigma_u^2}{n\sigma_x^2}$ and is therefore a strictly decreasing function of the number of observations.
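Since treatment status is binary with $\Pr(T_i = 1) = p_{treat}$, the variance of the regressor is $\sigma_x^2 = p_{treat}(1 - p_{treat})$, so the formula above can be rewritten in terms of both design parameters:

```latex
\sigma_{\hat{\beta}}^2 = \frac{\sigma_u^2}{n \, p_{treat}(1 - p_{treat})}
```

This expression is decreasing in $n$ and, for a given $n$, minimized at $p_{treat} = 1/2$, which is why a balanced design tends to maximize power.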

To reach the usual (but arbitrary) 80% power threshold, the graph indicates that we would need about 150 observations. This answers part of the question of interest.

Warning

This result only holds conditional on the true DGP being the one we simulated. This is very unlikely; the true DGP is certainly more complex.

Varying the proportion of treated

We could also have varied other parameters, like the proportion of treated. There are two approaches to do so: we can modify it ceteris paribus, holding the sample size constant, or vary both at the same time. We will start with the latter; to obtain the former, simply filter the results for a single sample size.

We define a new set of parameters and run the simulations for this set of parameters.

Warning

If we consider a proportion of treated that is too small, chances are that in some of the data sets, the number of treated (or non-treated) students will be 0, preventing the estimation of the effect of interest.
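How likely is such an all-control (or all-treated) draw? Under independent Bernoulli assignment, the probability of drawing zero treated students is $(1 - p_{treat})^n$, which a one-liner makes concrete (the values below are illustrative):

```r
# Probability of drawing zero treated students under Bernoulli assignment
# with a small treated share (illustrative values)
p_treat <- 0.05
n <- c(10, 20, 50, 100)
dbinom(0, size = n, prob = p_treat)
```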

vect_p_treat <- seq(0.2, 0.8, by = 0.1)

param_var_n_ptreat <- param_var_n |> 
  select(-p_treat) |> 
  crossing(p_treat = vect_p_treat) |> 
  uncount(n_iter)

sim_res_var_n_ptreat <- param_var_n_ptreat |> 
  pmap(compute_sim, .progress = TRUE) |> 
  list_rbind()

beepr::beep()

saveRDS(
  sim_res_var_n_ptreat, 
  file = here::here("content", "sim", "data", "sim_res_var_n_ptreat.RDS")
)
Tip

This simulation takes a little while to run (about 40 seconds). To avoid running it over and over again, in particular when compiling the Quarto document, we save the results of the simulation and do not re-run the code of this chunk.

Code
sim_res_var_n_ptreat <- 
  readRDS(here("content", "sim", "data", "sim_res_var_n_ptreat.RDS"))

sim_res_var_n_ptreat |> 
  group_by(n, p_treat) |> 
  summarise(power = mean(p.value < 0.05)) |> 
  ungroup() |> 
  ggplot(aes(x = n, y = p_treat, fill = power)) + 
  geom_tile(color = "white", linewidth = 1) +
  scale_mediocre_c(pal = "portal", gradient = "left") + 
  labs(
    title = "Given a sample size, power is maximal when half of the sample is treated",
    subtitle = "Power, number of observations and proportion of treated",
    x = "Number of students",
    y = "Proportion of treated students",
    fill = "Statistical Power"
  )

Statistical power is (unsurprisingly) maximized for a proportion of treated of 0.5. This might however be clearer with another graph:

Code
sim_res_var_n_ptreat |> 
  group_by(n, p_treat) |> 
  summarise(power = mean(p.value < 0.05)) |> 
  ungroup() |> 
  ggplot(aes(x = p_treat, y = power, color = factor(n))) + 
  geom_line(linewidth = 1.2) +
  labs(
    title = "Given a sample size, power is maximal when half of the sample is treated",
    subtitle = "Power, number of observations and proportion of treated",
    x = "Proportion of treated students", 
    y = "Statistical Power",
    color = "Number of students"
  )
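This pattern follows directly from the variance formula above: for a binary treatment, $\sigma_x^2 = p_{treat}(1 - p_{treat})$, which peaks at 0.5, so the standard error is smallest (and power largest) there. A quick sketch, with illustrative values for n and sigma_u:

```r
# Standard error of the treatment-effect estimate as a function of the
# treated share: sigma_x^2 = p * (1 - p) is largest at p = 0.5, so the
# SE is smallest there. Illustrative values: n = 100, sigma_u = 1.
p <- seq(0.1, 0.9, by = 0.1)
se_beta <- 1 / sqrt(100 * p * (1 - p))
round(se_beta, 3)
```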

Power and effect size

The statistical power we computed relied on a crucial assumption: the hypothesized effect size. If the true effect is in fact smaller than what we hypothesized, we expect statistical power to be lower. Simulations confirm this:

Code
vect_beta_0 <- seq(0.4, 1.6, by = 0.2)

param_var_n_beta_0 <- param_var_n |> 
  select(-beta_0) |> 
  crossing(beta_0 = vect_beta_0) |> 
  uncount(n_iter)

sim_res_var_n_beta_0 <- param_var_n_beta_0 |> 
  pmap(compute_sim, .progress = TRUE) |> 
  list_rbind()

beepr::beep()

saveRDS(
  sim_res_var_n_beta_0, 
  file = here::here("content", "sim", "data", "sim_res_var_n_beta_0.RDS")
)
Code
sim_res_var_n_beta_0 <- 
  readRDS(here("content", "sim", "data", "sim_res_var_n_beta_0.RDS"))

sim_res_var_n_beta_0 |> 
  group_by(n, beta_0) |> 
  summarise(power = mean(p.value < 0.05)) |> 
  ungroup() |> 
  ggplot(aes(x = n, y = beta_0, fill = power)) + 
  geom_tile(color = "white", linewidth = 1) +
  scale_mediocre_c(pal = "portal", gradient = "left") + 
  labs(
    title = "Power increases with the hypothesized effect size",
    subtitle = "Power, number of observations and effect size",
    x = "Number of students",
    y = "Hypothesized effect size",
    fill = "Statistical Power"
  )

Hypotheses regarding the true effect size thus matter a great deal for the resulting statistical power.
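Inverting the variance formula shows why: the sample size needed to reach a given power grows with the inverse square of the hypothesized effect, so halving the effect roughly quadruples the required sample. A sketch of this relationship (the helper name and default values are ours, for illustration):

```r
# Approximate sample size required for a two-sided test of size alpha to
# reach a target power, derived from the OLS variance formula.
# Halving beta quadruples the required n. Illustrative defaults.
required_n <- function(beta, sigma_u = 1, p_treat = 0.5,
                       alpha = 0.05, power = 0.8) {
  (qnorm(1 - alpha / 2) + qnorm(power))^2 * sigma_u^2 /
    (beta^2 * p_treat * (1 - p_treat))
}

round(required_n(beta = c(0.4, 0.8, 1.6)))
```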

Complexifying the DGP

As discussed above, our whole analysis relies on having accurately represented the true DGP. However, it is very likely that the actual DGP differs from the one we modeled. For instance, effects are very likely heterogeneous across individuals.

Note

We are often interested in estimating a version of an ATE (Average Treatment Effect). The wording itself ("average") implies that effects are heterogeneous across individuals.
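As a starting point for such a richer DGP, one could draw an individual effect for each student around the average effect. A minimal sketch, in which all names and parameter values are illustrative assumptions:

```r
# Minimal sketch of a DGP with heterogeneous treatment effects:
# each student draws an individual effect beta_i ~ N(beta, tau^2).
# OLS on the treatment dummy still targets the average effect beta.
set.seed(1)
n <- 500
p_treat <- 0.5
beta <- 0.5   # average treatment effect
tau  <- 0.3   # dispersion of individual effects

treated <- rbinom(n, 1, p_treat)
beta_i  <- rnorm(n, mean = beta, sd = tau)
grade   <- 10 + beta_i * treated + rnorm(n)

coef(lm(grade ~ treated))["treated"]
```

Re-running the power simulation on a DGP like this one would show how effect heterogeneity changes the sample size requirements.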

Footnotes

  1. That is, the design implies an 80% probability of getting a significant result if the effect exists.↩︎