Probability bounds for polynomial functions in random variables

Simai He, Bo Jiang, Zhening Li, Shuzhong Zhang

Research output: Contribution to journal › Article › peer-review


Abstract

Random sampling is a simple but powerful method in statistics and in the design of randomized algorithms. In a typical application, random sampling can be applied to estimate an extreme value, say the maximum, of a function f over a set S ⊆ ℝⁿ. To do so, one may select a simpler (even finite) subset S₀ ⊆ S, draw a number of random samples over S₀, and pick the best sample. The hope is to find a good approximate solution with reasonable probability. This paper presents a number of scenarios for f, S and S₀ in which certain probability bounds can be established, providing a quality assurance for the procedure. In our setting, f is a multivariate polynomial function. We prove that if f is a d-th order homogeneous polynomial in n variables with corresponding super-symmetric tensor F, and ξᵢ (i = 1, 2, …, n) are i.i.d. Bernoulli random variables taking the values 1 and −1 with equal probability, then Prob{f(ξ₁, ξ₂, …, ξₙ) ≥ τ n^(−d/2) ∥F∥₁} ≥ θ, where τ, θ > 0 are two universal constants and ∥·∥₁ denotes the sum of the absolute values of all the entries of F. Several new inequalities concerning probabilities of the above nature are presented in this paper. Moreover, we show that the bounds are tight in most cases. Applications of our results in optimization are discussed as well.
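The sampling procedure described in the abstract can be sketched in a few lines of Python. This is a minimal illustration, not the paper's algorithm: the tensor representation (a dict mapping index tuples to entries), the function names, and the example tensor below are all assumptions made for the sake of the sketch. Each trial draws ξ uniformly from {−1, +1}ⁿ, evaluates the homogeneous polynomial, and keeps the best sample; the theorem guarantees that a single draw exceeds τ n^(−d/2) ∥F∥₁ with probability at least θ, so the best of many draws does so with high probability.

```python
import math
import random

def norm1(F):
    """∥F∥₁: the sum of the absolute values of all tensor entries."""
    return sum(abs(c) for c in F.values())

def f_eval(F, x):
    """Evaluate f(x) = Σ F[i₁,…,i_d] · x_i₁ ⋯ x_i_d for a tensor F
    given as a dict {index tuple: entry} (assumed super-symmetric)."""
    return sum(c * math.prod(x[i] for i in idx) for idx, c in F.items())

def sample_best(F, n, trials=500, seed=0):
    """Draw ξ uniformly from {-1, +1}^n repeatedly; return the best f(ξ)."""
    rng = random.Random(seed)
    best = float("-inf")
    for _ in range(trials):
        xi = [rng.choice((-1, 1)) for _ in range(n)]
        best = max(best, f_eval(F, xi))
    return best

# Hypothetical example: a quadratic form (d = 2) in n = 3 variables,
# f(x) = 2x₀² + 3x₀x₁ − x₂², stored as a symmetric tensor.
F = {(0, 0): 2.0, (0, 1): 1.5, (1, 0): 1.5, (2, 2): -1.0}
best = sample_best(F, n=3)
```

Since |ξᵢ| = 1 for every coordinate, any sample satisfies |f(ξ)| ≤ ∥F∥₁, so the returned value always lies between −∥F∥₁ and ∥F∥₁; the content of the theorem is that the best sample is, with constant probability, within a factor τ n^(−d/2) of that upper bound.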
Original language: English
Pages (from-to): 889-907
Number of pages: 19
Journal: Mathematics of Operations Research
Volume: 39
Issue number: 3
Early online date: 20 Jan 2014
Publication status: Published - 1 Aug 2014

Keywords

  • random sampling
  • probability bound
  • tensor form
  • polynomial function
  • polynomial optimization
  • approximation algorithm

