This thesis consists of two parts: high-dimensional bootstrap and shape-constrained inference. The first part focuses on bootstrap consistency theory in high dimensions; the second on estimation and inference for monotonicity- and convexity-constrained models.
In Part I, we reduce the existing sample-size requirement for the consistency of the empirical bootstrap and the multiplier/wild bootstrap for the maxima of sums of independent random vectors. We also develop a slightly conservative bootstrap approach that provably reduces the sample-size requirement much further, with vanishing type I error in large-scale inference. Along the way, we develop new comparison and anti-concentration theorems that are of independent interest.
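To make the setting concrete, the following is a minimal sketch of the Gaussian multiplier/wild bootstrap for the maximum coordinate of a normalized sum of independent random vectors. The data, dimensions, and number of bootstrap draws are illustrative assumptions, not taken from the thesis.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: n independent p-dimensional observations (illustrative only).
n, p = 200, 50
X = rng.normal(size=(n, p))

# Statistic of interest: the max over coordinates of the normalized sum.
Xbar = X.mean(axis=0)
T = np.sqrt(n) * np.max(Xbar)

# Multiplier/wild bootstrap: multiply the centered observations by i.i.d.
# N(0,1) weights and recompute the max statistic.
B = 1000
Xc = X - Xbar  # center at the sample mean
boot = np.empty(B)
for b in range(B):
    e = rng.normal(size=n)
    boot[b] = np.max(e @ Xc) / np.sqrt(n)

# The empirical (1 - alpha) quantile of the bootstrap draws serves as the
# critical value for T.
alpha = 0.05
crit = np.quantile(boot, 1 - alpha)
```

The consistency theory in Part I concerns how large n must be, relative to p, for quantiles computed this way to be valid.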
In Part II, we study, in two chapters, estimation and local inference for multiple isotonic regression using block estimators. We show that the block estimators achieve the minimax rate in the worst case and a near-parametric rate when the unknown function is piecewise constant; they also adapt to the case where the unknown function depends only on a subset of the variables, matching the oracle minimax rate. For local inference, we propose the first valid tuning-free confidence interval (CI) for the function value at a fixed point in multiple isotonic regression. We show that the CI has asymptotically exact confidence level and oracle length, and that it extends to many other common monotone models. In the last chapter, we show that similar inference procedures also apply to convexity-constrained models, including convex regression and log-concave density estimation.
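As a one-dimensional illustration of a block estimator, the sketch below implements the classical max-min formula for isotonic regression, where the fitted value at each point is a maximum over left endpoints of a minimum over right endpoints of block averages. In one dimension this coincides with the isotonic least-squares estimator; the multivariate block estimators studied in the thesis generalize this idea. The function name and toy data are my own.

```python
import numpy as np

def isotonic_maxmin(y):
    """Max-min (block) estimator for 1-D isotonic regression.

    fhat[i] = max over u <= i of min over v >= i of mean(y[u:v+1]).
    In one dimension this equals the isotonic least-squares fit.
    """
    y = np.asarray(y, dtype=float)
    n = len(y)
    # Prefix sums give O(1) block averages.
    cs = np.concatenate(([0.0], np.cumsum(y)))
    avg = lambda u, v: (cs[v + 1] - cs[u]) / (v + 1 - u)
    fhat = np.empty(n)
    for i in range(n):
        fhat[i] = max(min(avg(u, v) for v in range(i, n))
                      for u in range(i + 1))
    return fhat

# Toy example: violations of monotonicity are pooled into blocks.
y = np.array([1.0, 3.0, 2.0, 5.0, 4.0])
print(isotonic_maxmin(y))  # [1.  2.5 2.5 4.5 4.5]
```

The pooling of (3, 2) into 2.5 and (5, 4) into 4.5 shows the block averaging that the rate and adaptation results in Part II analyze.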