Search results

1 – 5 of 5
Book part
Publication date: 15 January 2010

Sean M. Puckett and John M. Rose

Abstract

Currently, the state of practice in experimental design centres on orthogonal designs (Alpizar et al., 2003), which are suitable when applied to surveys with a large sample size. In a stated choice experiment involving interdependent freight stakeholders in Sydney (see Hensher & Puckett, 2007; Puckett et al., 2007; Puckett & Hensher, 2008), one significant empirical constraint was the difficulty of recruiting unique decision-making groups to participate. The expected relatively small sample size led us to seek an alternative experimental design. That is, we decided to construct an optimal design that utilised extant information regarding the preferences and experiences of respondents, to achieve statistically significant parameter estimates under a relatively low sample size (see Bliemer & Rose, 2006).
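
The seeding idea can be sketched as follows. This is a minimal illustration, not the study's procedure: the prior coefficients, attribute dimensions, and candidate designs are all invented for the example. For a multinomial logit model, the D-error of a candidate design is the determinant of the inverse Fisher information (evaluated at the prior parameter values), normalised by the number of parameters; an efficient design minimises it.

```python
import numpy as np

# Hypothetical prior coefficients for two attributes (invented for
# illustration; the study seeded its design with extant estimates).
beta = np.array([-0.6, -0.3])

def d_error(design, beta):
    """Fixed-prior D-error of a multinomial logit design.
    design: (S, J, K) array -- S choice sets, J alternatives, K attributes."""
    S, J, K = design.shape
    info = np.zeros((K, K))
    for X in design:                    # X: (J, K) attribute levels in one set
        p = np.exp(X @ beta)
        p /= p.sum()                    # MNL choice probabilities under priors
        cov = np.diag(p) - np.outer(p, p)
        info += X.T @ cov @ X           # Fisher information contribution
    det = np.linalg.det(info)
    return np.inf if det <= 1e-12 else det ** (-1 / K)

# Pick the most efficient of 200 random candidate designs.
rng = np.random.default_rng(0)
candidates = [rng.integers(0, 3, size=(6, 2, 2)).astype(float)
              for _ in range(200)]
best = min(candidates, key=lambda d: d_error(d, beta))
print("lowest D-error found:", round(d_error(best, beta), 4))
```

In practice the search over candidates is far more sophisticated than random draws, but the objective — minimise D-error under the priors — is the same.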

The D-efficient experimental design developed for the study is unique, in that it centred on the choices of interdependent respondents. Hence, the generation of the design had to account for the preferences of two distinct classes of decision makers: buyers and sellers of road freight transport. This paper discusses the process by which these (non-coincident) preferences were used to seed the generation of the experimental design, and then examines the relative power of the design through an extensive bootstrap analysis of increasingly restricted sample sizes for both decision-making classes in the sample. We demonstrate the strong potential for efficient designs to achieve empirical goals under sampling constraints, whilst identifying limitations to their power as sample size decreases.
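
The bootstrap check of design power under shrinking samples can be sketched like this. It is a simplified stand-in: the real analysis re-estimated choice models on each restricted resample, whereas here a single synthetic parameter (with invented values) stands in, to show how the bootstrap standard error grows as the sample is restricted.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical per-respondent estimates of one parameter (e.g. a cost
# sensitivity); in the study these would come from fitted choice models.
full_sample = rng.normal(loc=-0.8, scale=0.5, size=200)

def bootstrap_se(data, n_sub, n_boot=2000):
    """Bootstrap standard error of the mean at a restricted sample size."""
    means = [rng.choice(data, size=n_sub, replace=True).mean()
             for _ in range(n_boot)]
    return np.std(means)

# Increasingly restricted sample sizes: precision degrades as n falls.
for n in (200, 100, 50, 25):
    print(f"n={n:3d}  bootstrap SE = {bootstrap_se(full_sample, n):.3f}")
```

The widening standard errors at small n correspond to the limitation noted above: even an efficient design loses statistical power as the sample shrinks.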

Details

Choice Modelling: The State-of-the-art and The State-of-practice
Type: Book
ISBN: 978-1-84950-773-8

Content available
Book part
Publication date: 15 January 2010

Abstract

Details

Choice Modelling: The State-of-the-art and The State-of-practice
Type: Book
ISBN: 978-1-84950-773-8

Book part
Publication date: 15 January 2010

David A. Hensher

Abstract

It has long been recognised that humans draw from a large pool of processing aids to help manage the everyday challenges of life. It is not uncommon to observe individuals adopting simplifying strategies when faced with ever increasing amounts of information to process, especially for decisions where the chosen outcome will have a very marginal impact on their well-being. The transactions costs associated with processing all new information often exceed the benefits from such a comprehensive review. The accumulating life experiences of individuals are also often brought to bear as reference points to assist in selectively evaluating information placed in front of them. These features of human processing and cognition are not new to the broad literature on judgment and decision-making, where heuristics are offered up as deliberative analytic procedures intentionally designed to simplify choice. What is surprising is the limited recognition of heuristics that individuals use to process the attributes in stated choice experiments. In this paper we present a case for a utility-based framework within which some appealing processing strategies are embedded (without the aid of supplementary self-stated intentions), as well as models conditioned on self-stated intentions represented as single items of process advice, and illustrate the implications for willingness to pay for travel time savings of embedding each heuristic in the choice process. Given the controversy surrounding the reliability of self-stated intentions, we introduce a framework in which mixtures of process advice embedded within a belief function might be used in future empirical studies to condition choice, as a way of progressively judging the strength of the evidence.
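
One such simplifying strategy is attribute non-attendance, a heuristic in which a respondent simply ignores an attribute when choosing. Embedded in a linear utility function, it amounts to zeroing that attribute's marginal utility. A minimal sketch, with coefficients and attribute levels invented for the example:

```python
# Illustrative only: additive utility for a travel alternative with cost and
# time attributes; the coefficients and levels are invented.
def utility(cost, time, beta_cost=-0.10, beta_time=-0.05, attend_cost=True):
    """Utility under full attendance, or with cost ignored (non-attendance)."""
    b_cost = beta_cost if attend_cost else 0.0  # heuristic zeroes the weight
    return b_cost * cost + beta_time * time

print(utility(cost=20, time=30))                      # full attendance
print(utility(cost=20, time=30, attend_cost=False))   # cost-ignoring heuristic
```

Because willingness to pay for travel time savings is the ratio of the time and cost coefficients, assuming full attendance when a respondent in fact ignored cost distorts the estimate — which is the implication the paper quantifies.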

Details

Choice Modelling: The State-of-the-art and The State-of-practice
Type: Book
ISBN: 978-1-84950-773-8

Article
Publication date: 5 April 2021

Sean E. Goodison

Abstract

Purpose

The study aims to examine the effect of detective experience on the likelihood of clearing a homicide, while controlling for additional extralegal and case/investigative characteristics.

Design/methodology/approach

This study uses homicide and policing data collected from case files in a mid-sized US city. Detective experience is measured in multiple ways. Analytical models include extralegal variables, case characteristics, and proxies of investigative quality as controls. The study uses logistic regression with a dichotomous clearance outcome.
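
The estimation approach — logistic regression on a dichotomous clearance outcome — can be sketched as below. The data are simulated: the predictors, coefficients, and sample size are invented, chosen only so that the signs mirror the reported findings, and the model is fitted by Newton's method (iteratively reweighted least squares) rather than any particular statistics package.

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulated stand-in for the case-file data (all values invented).
n = 500
det_years = rng.uniform(0, 15, n)               # years as a homicide detective
dept_years = det_years + rng.uniform(0, 20, n)  # total years with the department
logit = 0.5 - 0.25 * det_years + 0.15 * dept_years
cleared = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(float)

# Fit logistic regression by Newton's method (IRLS).
X = np.column_stack([np.ones(n), det_years, dept_years])
beta = np.zeros(3)
for _ in range(25):
    p = 1 / (1 + np.exp(-X @ beta))              # predicted clearance probability
    W = p * (1 - p)                              # working weights
    hessian = X.T @ (X * W[:, None])
    beta += np.linalg.solve(hessian, X.T @ (cleared - p))

print("intercept, detective-years, department-years:", beta.round(3))
```

In the study itself the model also includes extralegal variables, case characteristics, and investigative-quality proxies as controls; the sketch keeps only the two experience measures to show the mechanics.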

Findings

The results suggest a robust and significant inverse relationship between the years spent as a homicide detective and the likelihood of case closure. However, years of experience with the department overall have a significant and positive relationship to clearance. Investigation-related variables and case characteristics contribute more to model explanatory power than extralegal factors.

Originality/value

The potential role of experience has not been fully explored, with contradictory findings over time. This work builds on previous research to highlight the potential role of experience in clearing cases, while questioning previous assumptions tied to the belief that more experience improves investigative outcomes.

Details

Policing: An International Journal, vol. 44 no. 4
Type: Research Article
ISSN: 1363-951X

Abstract

Details

Corporate Fraud Exposed
Type: Book
ISBN: 978-1-78973-418-8
