Paper Title
The Right to be an Exception to a Data-Driven Rule
Paper Authors
Paper Abstract
Data-driven tools are increasingly used to make consequential decisions. They have begun to advise employers on which job applicants to interview, judges on which defendants to grant bail, lenders on which homeowners to give loans, and more. In such settings, different data-driven rules result in different decisions. The problem is: to every data-driven rule, there are exceptions. While a data-driven rule may be appropriate for some, it may not be appropriate for all. As data-driven decisions become more common, there are cases in which it becomes necessary to protect the individuals who, through no fault of their own, are the data-driven exceptions. At the same time, it is impossible to scrutinize every one of the increasing number of data-driven decisions, begging the question: When and how should data-driven exceptions be protected? In this piece, we argue that individuals have the right to be an exception to a data-driven rule. That is, the presumption should not be that a data-driven rule--even one with high accuracy--is suitable for an arbitrary decision-subject of interest. Rather, a decision-maker should apply the rule only if they have exercised due care and due diligence (relative to the risk of harm) in excluding the possibility that the decision-subject is an exception to the data-driven rule. In some cases, the risk of harm may be so low that only cursory consideration is required. Although applying due care and due diligence is meaningful in human-driven decision contexts, it is unclear what it means for a data-driven rule to do so. We propose that determining whether a data-driven rule is suitable for a given decision-subject requires the consideration of three factors: individualization, uncertainty, and harm. We unpack this right in detail, providing a framework for assessing data-driven rules and describing what it would mean to invoke the right in practice.