Paper title
An information upper bound for probability sensitivity
Paper authors
Paper abstract
Uncertain inputs of a mathematical model induce uncertainties in the output, and probabilistic sensitivity analysis identifies the influential inputs to guide decision-making. Of practical concern is the probability that the output would, or would not, exceed a threshold, and the probability sensitivity depends on this threshold, which is often uncertain. The Fisher information and the Kullback-Leibler divergence have recently been proposed in the literature as threshold-independent sensitivity metrics. We present a mathematical proof that these information-theoretic metrics provide an upper bound for the probability sensitivity. The proof is elementary, relying only on a special version of the Cauchy-Schwarz inequality known as Titu's lemma. Although various inequalities exist for probabilities, little is known about bounds on probability sensitivities, and the one proposed here is, to the present authors' knowledge, new. The probability sensitivity bound is extended, analytically and with numerical examples, to the Fisher information of both the input and output. It thus provides a solid mathematical basis for decision-making based on probabilistic sensitivity metrics.
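As background for readers unfamiliar with the lemma named in the abstract: Titu's lemma (also called the Engel form of the Cauchy-Schwarz inequality) is a standard result; the statement below is supplied for reference and is not quoted from the paper itself. In its discrete form, for real $a_i$ and positive $b_i$,

\[
\sum_{i=1}^{n} \frac{a_i^{2}}{b_i} \;\ge\; \frac{\left(\sum_{i=1}^{n} a_i\right)^{2}}{\sum_{i=1}^{n} b_i},
\]

with equality if and only if the ratios $a_i/b_i$ are all equal. Its integral analogue, $\int f^{2}/g \, dx \ge \left(\int f \, dx\right)^{2} \big/ \int g \, dx$ for $g > 0$, is presumably the form relevant to bounding Fisher-information integrals, though the paper's exact usage is not reproduced here.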