The package offers two functions:

- Data-generating: auction_generate_data(), which simulates outcomes of a procurement auction, where the winning bid is the amount that a single buyer will pay to the top bidding supplier.
- Estimating: auction_model(), which recovers the parameters of the Weibull distribution for private costs (\(\mu\) and \(\alpha\)), the Log-Normal variance of the unobserved heterogeneity (\(\sigma\)), and the loads on the observed heterogeneity (\(\beta_i\)).
The code below generates a vector of winning bids, winning_bid, with a corresponding number of bids, n_bids, and a set of observed heterogeneity covariates, X1 and X2.
set.seed(5)
dat <- auction_generate_data(obs = 100, mu = 10, alpha = 2,
                             sigma = 0.2, beta = c(-1,1),
                             new_x_mean = c(-1,1),
                             new_x_sd = c(0.5,0.8))
head(dat)
#> winning_bid n_bids X1 X2
#> 1 68.839954 3 -1.2637836 1.1013900
#> 2 13.591898 10 -0.5770019 0.6112398
#> 3 9.865958 10 -0.5892253 0.1969346
#> 4 242.628261 10 -1.6678847 2.4273626
#> 5 8.754863 6 -0.4784410 0.3062844
#> 6 35.739652 8 -1.5614955 0.2235402
The simulated sample above was constructed in such a way that, when passed to the estimation procedure with certain initial values, it will not produce standard errors for the MLE estimates. This is because the Hessian matrix is approximated numerically, so there is no guarantee that it will be positive definite:
## Standard error calculation fails in the following single run
res <- auction_model(dat,
                     init_param = c(8, 2, .5, .4, .6),
                     num_cores = 1,
                     std_err = TRUE)
#> Warning in auction_model(dat, init_param = c(8, 2, 0.5, 0.4, 0.6), num_cores
#> = 1, : The estimated Hessian matrix is not positive definite, so standard
#> errors will not be produced. The routine may not have found the global minimum.
#> We suggest re-running the routine with different starting values or using a
#> different optimization method (see ?optim for a full list).
res
#>
#> Estimated parameters (SE):
#> mu 11.855447 (--)
#> alpha 1.670615 (--)
#> sigma 0.180057 (--)
#> beta[1] -0.965104 (--)
#> beta[2] 0.981842 (--)
#>
#> Maximum log-likelihood = -422.825
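The gate on the standard errors is a positive-definiteness check of the numerically approximated Hessian: all of its eigenvalues must be strictly positive for the inverse Hessian to yield valid standard errors. A minimal base-R sketch of that check, using a toy objective that is not part of the package:

``` r
## Toy illustration (not part of the package): optim() can return a
## numerically approximated Hessian at the solution.
f <- function(p) (p[1] - 1)^2 + 3 * (p[2] - 2)^2   # smooth toy objective
opt <- optim(c(0, 0), f, hessian = TRUE)

## Positive definite <=> all eigenvalues strictly positive
ev <- eigen(opt$hessian, only.values = TRUE)$values
if (all(ev > 0)) {
  ## Standard errors follow from the inverse Hessian
  std_err <- sqrt(diag(solve(opt$hessian)))
} else {
  ## This is the situation auction_model() warns about
  std_err <- rep(NA, length(ev))
}
```

Because the objective is minimized (a negative log-likelihood in the MLE setting), the Hessian at the optimum plays the role of the observed information matrix, and its inverse approximates the covariance of the estimates.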
This issue can be addressed by following a classic best-practice recommendation for optimization problems: use several sets of initial values and run the procedure multiple times to ensure that the final solution is a global optimum. The code below runs the estimation procedure 4 times and keeps only the runs where the standard errors were obtained from a valid Hessian:
## Solving the issue with multiple runs
res_list <- list()
max_llik <- c()
init_param0 <- c(8, 2, .5, .4, .6)

set.seed(100)
for (i in 1:4){
  init_param <- c(abs(init_param0[1:3]*rnorm(3) + 5*rnorm(3)), init_param0[4:5] + .5*rnorm(2))
  res <- auction_model(dat, init_param = init_param, num_cores = 1, std_err = TRUE)
  print(res)
  ## Only keeping results with valid standard errors
  if (all(!is.na(res$std_err))){
    res_list <- c(res_list, list(res))
    max_llik <- c(max_llik, res$value)
  }
}
#> Warning in auction_model(dat, init_param = init_param, num_cores = 1, std_err
#> = TRUE): The estimated Hessian matrix is not positive definite, so standard
#> errors will not be produced. The routine may not have found the global minimum.
#> We suggest re-running the routine with different starting values or using a
#> different optimization method (see ?optim for a full list).
#>
#> Estimated parameters (SE):
#> mu 11.856610 (--)
#> alpha 1.670351 (--)
#> sigma 0.179924 (--)
#> beta[1] -0.965032 (--)
#> beta[2] 0.981821 (--)
#>
#> Maximum log-likelihood = -422.825
#>
#> Estimated parameters (SE):
#> mu 11.850268 (0.795955)
#> alpha 1.670388 (0.096351)
#> sigma 0.179752 (0.021367)
#> beta[1] -0.964933 (0.064690)
#> beta[2] 0.982139 (0.028195)
#>
#> Maximum log-likelihood = -422.82
#> Warning in auction_model(dat, init_param = init_param, num_cores = 1, std_err
#> = TRUE): The estimated Hessian matrix is not positive definite, so standard
#> errors will not be produced. The routine may not have found the global minimum.
#> We suggest re-running the routine with different starting values or using a
#> different optimization method (see ?optim for a full list).
#>
#> Estimated parameters (SE):
#> mu 11.856782 (--)
#> alpha 1.670366 (--)
#> sigma 0.179926 (--)
#> beta[1] -0.965022 (--)
#> beta[2] 0.981811 (--)
#>
#> Maximum log-likelihood = -422.825
#>
#> Estimated parameters (SE):
#> mu 11.857747 (0.897312)
#> alpha 1.669761 (0.106674)
#> sigma 0.179310 (0.028005)
#> beta[1] -0.964912 (0.059557)
#> beta[2] 0.981907 (0.029767)
#>
#> Maximum log-likelihood = -422.824
Two out of four auction_model() runs produced standard errors. We then select the one that reports the highest likelihood:
res_final <- res_list[[which.max(max_llik)]]
res_final
#>
#> Estimated parameters (SE):
#> mu 11.850268 (0.795955)
#> alpha 1.670388 (0.096351)
#> sigma 0.179752 (0.021367)
#> beta[1] -0.964933 (0.064690)
#> beta[2] 0.982139 (0.028195)
#>
#> Maximum log-likelihood = -422.82
Note that the estimated parameters are close to the true values of mu = 10, alpha = 2, sigma = 0.2, beta = c(-1,1).