Search results
1 – 10 of 522
Bin Yao, Richard T.R. Qiu, Daisy X.F. Fan, Anyu Liu and Dimitrios Buhalis
Abstract
Purpose
Due to product diversity, traditional quality signals in the hotel industry, such as star ratings and brand affiliation, do not work well in the accommodation booking process on sharing economy platforms. From a supplier's perspective, this study applies signaling theory to the booking of Airbnb listings and explores the influence of quality signals on the odds of an Airbnb listing being booked.
Design/methodology/approach
A binomial logistic model is used to describe the influence of different attributes on market demand. Because of the large sample size, the sequential Bayesian updating method is applied, for the first time in the hospitality and tourism field.
Findings
Results show that, in addition to host-specific information such as “Superhost” status and identity verification, attributes including price, extra charges, regional competitiveness and house rules are all effective signals on Airbnb. The signaling impact is stronger for listings without any review comments.
Originality/value
This study contributes to the literature by incorporating signaling theory into the analysis of the booking probability of Airbnb accommodation. The findings are valuable to hosts seeking to improve their booking rates and revenue. In addition, they allow governments and industry management organizations to plan strategy and policy more efficiently.
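The sequential Bayesian updating strategy mentioned in the methodology can be illustrated on a conjugate toy model (a Beta-Binomial booking rate, not the study's actual binomial logistic specification with listing attributes): processing the data batch by batch, with each posterior serving as the next prior, reproduces the full-sample posterior exactly, which is what makes the approach attractive for large samples.

```python
# Sequential Bayesian updating on a Beta-Binomial toy model.
# Illustration only: the study itself uses a binomial logistic model
# with listing attributes, not this conjugate sketch.

def update(prior_a, prior_b, bookings, inquiries):
    """Beta(a, b) prior + binomial data -> Beta posterior."""
    return prior_a + bookings, prior_b + (inquiries - bookings)

# Three hypothetical batches of (bookings, inquiries), e.g. monthly dumps.
batches = [(12, 40), (7, 25), (21, 60)]

a, b = 1.0, 1.0  # flat Beta(1, 1) prior
for booked, total in batches:
    a, b = update(a, b, booked, total)  # posterior becomes next prior

# Sequential result matches the one-shot full-sample posterior.
a_full, b_full = update(1.0, 1.0, 12 + 7 + 21, 40 + 25 + 60)
assert (a, b) == (a_full, b_full)

print(f"posterior mean booking rate: {a / (a + b):.3f}")
```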
Garland Durham and John Geweke
Abstract
Massively parallel desktop computing capabilities now well within the reach of individual academics modify the environment for posterior simulation in fundamental and potentially quite advantageous ways. But to fully exploit these benefits, algorithms that conform to parallel computing environments are needed. This paper presents a sequential posterior simulator designed to operate efficiently in this context. The simulator makes fewer analytical and programming demands on investigators, and is faster, more reliable, and more complete than conventional posterior simulators. The paper extends existing sequential Monte Carlo methods and theory to provide a thorough and practical foundation for sequential posterior simulation that is well suited to massively parallel computing environments. It provides detailed recommendations on implementation, yielding an algorithm that requires only code for simulation from the prior and evaluation of prior and data densities and works well in a variety of applications representative of serious empirical work in economics and finance. The algorithm facilitates Bayesian model comparison by producing marginal likelihood approximations of unprecedented accuracy as an incidental by-product, is robust to pathological posterior distributions, and provides estimates of numerical standard error and relative numerical efficiency intrinsically. The paper concludes with an application that illustrates the potential of these simulators for applied Bayesian inference.
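A heavily stripped-down sketch of the core idea, sequential importance sampling with resampling over batches of data, is shown below for a normal-mean posterior. The Durham-Geweke simulator adds mutation (MCMC) phases, tempering and parallel particle groups, none of which are reproduced here; the batch structure and jitter step are assumptions of the sketch.

```python
# Minimal sequential posterior simulator (importance sampling with
# resampling) for the mean of a normal with known variance 1.
# A toy sketch only, not the Durham-Geweke algorithm.
import numpy as np

rng = np.random.default_rng(0)
data = rng.normal(2.0, 1.0, size=200)    # true mean 2.0, known sd 1.0
batches = np.array_split(data, 10)       # data introduced sequentially

N = 5000
particles = rng.normal(0.0, 10.0, N)     # draws from the N(0, 10^2) prior

for batch in batches:
    # Reweight particles by the likelihood of the new batch.
    logw = -0.5 * ((batch[None, :] - particles[:, None]) ** 2).sum(axis=1)
    w = np.exp(logw - logw.max())
    w /= w.sum()
    # Multinomial resampling, then a small jitter as a crude
    # stand-in for the mutation phase of serious algorithms.
    particles = particles[rng.choice(N, size=N, p=w)]
    particles += rng.normal(0.0, 0.05, N)

post_mean = particles.mean()
print(f"posterior mean ~ {post_mean:.2f}")
```

With a nearly flat prior, the particle mean should land close to the sample mean of the data.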
Abstract
Purpose
Change propagation is the major source of schedule delays and cost overruns in design projects. One way to mitigate the risk of change propagation is to impose a design freeze on components at some point prior to completion of the process. The purpose of this paper is to propose a model-driven approach to optimal freeze sequence identification based on change propagation risk.
Design/methodology/approach
A dynamic Bayesian network was used to represent the change propagation process within a system. According to the model, when a freeze decision is made for a component, a probabilistic inference algorithm within the Bayesian network updates the uncertain state of each component. Based on this mechanism, a set of algorithms was developed to derive the optimal freeze sequence.
Findings
The authors derived the optimal freeze sequence of a helicopter design project from a real product development process. The experimental results showed that the proposed method can significantly improve the effectiveness of freeze sequencing compared with arbitrary freeze sequencing.
Originality/value
The methodology identifies the optimal sequence for resolving entire-system uncertainty in the most effective manner. By progressively updating the state of each component, this mechanism enables an analyst to continuously evaluate the effectiveness of the freeze sequence.
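A caricature of freeze sequencing under change propagation risk, with hypothetical components and probabilities: greedily freeze the component whose summed propagation risk into the still-active components is largest, re-evaluating after each freeze. The paper's method instead performs probabilistic inference in a dynamic Bayesian network; this additive score is only a stand-in.

```python
# Greedy freeze-sequence sketch. P[i][j] is a hypothetical probability
# that a change in component i propagates to component j; the helicopter
# component names are invented for illustration.
components = ["rotor", "gearbox", "fuselage", "avionics"]
P = [
    [0.0, 0.6, 0.3, 0.1],   # rotor drives many downstream changes
    [0.2, 0.0, 0.4, 0.1],
    [0.1, 0.1, 0.0, 0.2],
    [0.0, 0.1, 0.1, 0.0],
]

def freeze_sequence(names, P):
    frozen, active = [], list(range(len(names)))
    while active:
        # Risk of i = summed propagation probability into still-active parts.
        risk = {i: sum(P[i][j] for j in active if j != i) for i in active}
        nxt = max(active, key=lambda i: risk[i])  # freeze riskiest first
        frozen.append(names[nxt])
        active.remove(nxt)
    return frozen

seq = freeze_sequence(components, P)
print(seq)
```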
Ishan Kashyap Hazarika and Ashutosh Yadav
Abstract
Purpose
This study combines different perspectives on herding, viewing it as a social network heuristic in comparison to other heuristics. The purpose is to use the heuristic view of herding as found in early literature and test it on grounds of efficiency and payoff, in essence, combining the heuristic and rational agent view of herding. The simulated double auction setting includes agents embedded in a social network, allowing for an examination of herding alongside rational behaviour and imperfect signals.
Design/methodology/approach
In each round of the simulation, the levels of homophily and density and the fractions of the types of agents are set, and agents follow their respective heuristics under those conditions. Characteristics of the social network, such as its size, the levels of different homophilies, density and the fractions of different types of agents, are varied randomly through a simulation-based approach to gauge their effect on the performance of herders vis-à-vis others and on overall market efficiency. The simulation was developed in Python, and the linear models are estimated using R.
Findings
Herding decreases total surplus in private-value double auctions, but herders are not worse off than other agents and perform equally well in common-value auctions. Further, herders and random offerers reduce the payoffs of other agents as well, and herding affects the surplus per transaction rather than the number of transactions.
Research limitations/implications
This study explores herding as a strategic behaviour coexisting with rationality and other strategies in specific circumstances. It presents intriguing findings on the impact of herding on individual outcomes and market efficiency, raising new avenues for future research. For research, the findings dent the “sieve” argument that markets root out irrationality; the policy implication that follows is the need for corrective measures, as markets cannot self-correct, given that herders do not perform worse than others.
Originality/value
The study links the phenomenon of herding to the dynamics of social networks and heuristic-based learning mechanisms, which sets this research apart from the majority of the existing literature that predominantly conceptualizes herding as the outcome of a perfect Bayesian equilibrium and a rational learning process.
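The social-network heuristic view of herding can be sketched as an agent that ignores its private signal and copies the majority action of its network neighbours; the network, agents and tie-breaking rule below are invented for illustration, and the study embeds such agents in a full double-auction simulation.

```python
# A herder as a social-network heuristic: ignore the private signal,
# copy the majority action of network neighbours. Purely illustrative.
from collections import Counter

def herd_action(agent, actions, neighbours):
    """Return the most common neighbour action ('buy'/'sell');
    fall back on the agent's own signal when neighbours are tied."""
    votes = Counter(actions[n] for n in neighbours[agent])
    (top, top_n), *rest = votes.most_common()
    if rest and rest[0][1] == top_n:        # tie among neighbours
        return actions[agent]               # fall back on own signal
    return top

neighbours = {"h": ["a", "b", "c"]}         # h observes agents a, b, c
actions = {"a": "buy", "b": "buy", "c": "sell", "h": "sell"}
print(herd_action("h", actions, neighbours))   # majority of neighbours
```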
Jörg Breitung and Sandra Eickmeier
Abstract
This paper compares alternative estimation procedures for multi-level factor models which imply blocks of zero restrictions on the associated matrix of factor loadings. We suggest a sequential least squares algorithm for minimizing the total sum of squared residuals and a two-step approach based on canonical correlations; both are much simpler and faster than the Bayesian approaches previously employed in the literature. An additional advantage is that our approaches can estimate more complex multi-level factor structures in which the number of levels is greater than two. Monte Carlo simulations suggest that the estimators perform well in typical sample sizes encountered in the factor analysis of macroeconomic data sets. We apply the methodologies to study international comovements of business and financial cycles.
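One alternation step of such a sequential least squares scheme can be sketched for a two-level model (one global factor loading on all series plus one factor per block, with zero loadings across blocks). The block names, block sizes and the single-pass structure are illustrative assumptions; the paper iterates steps like these to minimize the total sum of squared residuals.

```python
# One sequential pass for a two-level factor model via principal
# components: global factor first, then block factors from residuals.
# Data-generating process and block labels are invented.
import numpy as np

rng = np.random.default_rng(1)
T, blocks = 120, {"US": 4, "EA": 4}           # hypothetical block sizes
g = rng.normal(size=T)                         # global factor
X_parts = []
for name, n in blocks.items():
    f_b = rng.normal(size=T)                   # block-level factor
    lam_g, lam_b = rng.normal(size=n), rng.normal(size=n)
    X_parts.append(np.outer(g, lam_g) + np.outer(f_b, lam_b)
                   + 0.1 * rng.normal(size=(T, n)))
X = np.hstack(X_parts)

def first_pc(M):
    """First principal component: factor (T,) and loadings (n,)."""
    u, s, vt = np.linalg.svd(M, full_matrices=False)
    return u[:, 0] * s[0], vt[0]

# Step 1: global factor estimated from the pooled panel.
g_hat, lam_hat = first_pc(X)
resid = X - np.outer(g_hat, lam_hat)

# Step 2: block factors from block-wise residuals; zero loadings
# outside the own block are imposed by construction.
sse, col = 0.0, 0
for name, n in blocks.items():
    f_hat, l_hat = first_pc(resid[:, col:col + n])
    sse += ((resid[:, col:col + n] - np.outer(f_hat, l_hat)) ** 2).sum()
    col += n

print(f"residual SSE after one sequential pass: {sse:.1f}")
```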
Abstract
Several activity-based transportation models are now becoming operational and are entering the stage of application for modelling travel demand. In our application, we use decision rules to support the decision-making of the model instead of principles of utility maximization, which means our work can be interpreted as an application of the concept of bounded rationality in the transportation domain. In this chapter, we explore a novel idea of combining decision trees and Bayesian networks to improve decision-making while maintaining the potential advantages of both techniques. The results of this study suggest that integrated Bayesian networks and decision trees can model the different choice facets of a travel demand model with better predictive power than CHAID decision trees. Another conclusion is that there are initial indications that the new way of integrating decision trees and Bayesian networks produces a decision tree that is structurally more stable.
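A minimal sketch of a decision rule backed by Bayesian-network-style conditional probabilities for one choice facet (transport mode). All variable names and probabilities are invented for illustration; the chapter integrates learned decision trees with learned networks rather than using a hand-written table.

```python
# Toy decision rule over a conditional probability table,
# P(mode | distance, owns_car). Values are purely illustrative.
cpt = {
    ("short", True):  {"car": 0.5, "bike": 0.4, "bus": 0.1},
    ("short", False): {"car": 0.0, "bike": 0.7, "bus": 0.3},
    ("long",  True):  {"car": 0.8, "bike": 0.0, "bus": 0.2},
    ("long",  False): {"car": 0.1, "bike": 0.1, "bus": 0.8},
}

def choose_mode(distance, owns_car):
    """Decision rule: pick the most probable mode under the table
    (no utility maximization, in the bounded-rationality spirit)."""
    dist = cpt[(distance, owns_car)]
    return max(dist, key=dist.get)

print(choose_mode("long", False))
```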
Fangqi Hong, Pengfei Wei and Michael Beer
Abstract
Purpose
Bayesian cubature (BC) has emerged as one of the most competitive approaches for estimating multi-dimensional integrals, especially when the integrand is expensive to evaluate, and alternative acquisition functions, such as the posterior variance contribution (PVC) function, have been developed for the adaptive experimental design of the integration points. However, these sequential design strategies also prevent BC from being implemented in a parallel scheme. Therefore, this paper aims to develop a parallelized adaptive BC method to further improve computational efficiency.
Design/methodology/approach
By theoretically examining the multimodal behavior of the PVC function, it is concluded that the multiple local maxima all make important contributions to the integration accuracy and can be selected as design points, providing a practical way to parallelize adaptive BC. Inspired by this finding, four multimodal optimization algorithms, including one newly developed in this work, are introduced for finding multiple local maxima of the PVC function in one run, and thus for parallel implementation of the adaptive BC.
Findings
The superiority of the parallel schemes and the performance of the four multimodal optimization algorithms are demonstrated and compared with the k-means clustering method using two numerical benchmarks and two engineering examples.
Originality/value
The multimodal behavior of the acquisition function for BC is comprehensively investigated. All the local maxima of the acquisition function contribute to the accuracy of adaptive BC, and parallelization of adaptive BC is realized with four multimodal optimization methods.
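The parallelization-by-local-maxima idea can be illustrated with a one-dimensional stand-in for the PVC function: every interior grid point larger than both neighbours is a local maximum of the acquisition surface, and all of them are harvested in one sweep as the next batch of design points. The acquisition function below is made up for illustration, not the paper's PVC.

```python
# Harvesting multiple local maxima of an acquisition function in one
# sweep, so the expensive integrand can be evaluated at all of them
# in parallel. The acquisition surface here is a made-up surrogate.
import numpy as np

x = np.linspace(0.0, 10.0, 2001)
acq = np.sin(3.0 * x) * np.exp(-0.1 * x)     # hypothetical acquisition

# Interior points larger than both neighbours are local maxima.
interior = (acq[1:-1] > acq[:-2]) & (acq[1:-1] > acq[2:])
peaks = np.where(interior)[0] + 1

batch = x[peaks]                              # next batch of design points
print(np.round(batch, 2))
```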
Mohamed Ibrahim and Mohamed Shehata
Abstract
Hogarth and Einhorn (1990) posited a psychological model for updating beliefs that is based on an anchoring and adjustment process incorporating a contrast, or surprise, effect: in particular, the larger the current belief in a hypothesis or outcome, the more it is discounted by negative information and the less it is increased by positive information. The model provides a set of predictions that could have important implications for financial decisions. It predicts strong recency effects for mixed or conflicting information (negative and positive) and no order effects for consistent information (all positive or all negative). Furthermore, an earlier version of the model (1985) predicts that simultaneous processing of consistent information leads to more extreme responses than sequential processing of the same information. Einhorn and Hogarth refer to this phenomenon as a “dilution effect.” This paper reports the results of testing these qualitative predictions of the belief updating model. Three experiments involving a content-rich asset valuation judgment scenario were conducted with a sample of 120 subjects enrolled in two MBA courses. The results support the model's prediction that there are no order effects attributable to sequential processing of consistent information. The results also support the existence of recency effects for mixed information regardless of the response mode. However, no significant effects were observed for processing consistent information under different response modes.
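One common formalisation of the belief-adjustment model (evaluation encoding) makes the qualitative predictions easy to verify numerically: belief S in [0, 1] is adjusted by evidence s in [-1, 1] with a weight that produces the contrast effect described above. The evidence strengths below are chosen purely for illustration.

```python
# Belief-adjustment sketch in the spirit of Hogarth-Einhorn (1990):
# strong beliefs are discounted more by negative evidence and moved
# less by positive evidence (contrast / surprise weighting).

def update(S, s):
    w = S if s <= 0 else (1.0 - S)   # contrast weighting
    return S + w * s

def final_belief(S0, evidence):
    for s in evidence:
        S0 = update(S0, s)
    return S0

# Mixed evidence: order matters (recency effect).
pos_then_neg = final_belief(0.5, [+0.6, -0.6])   # ends low (~0.32)
neg_then_pos = final_belief(0.5, [-0.6, +0.6])   # ends high (~0.68)
print(pos_then_neg, neg_then_pos)

# Consistent evidence: order does not matter.
assert abs(final_belief(0.5, [+0.2, +0.4])
           - final_belief(0.5, [+0.4, +0.2])) < 1e-12
```

The final belief is pulled toward the most recent item for mixed sequences, while same-sign sequences commute, matching the two qualitative predictions tested in the paper.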
Abstract
Using a decision support system (DSS) delays the decision‐making process and commits the user to the cost of invoking the system. The existing configurations of decision support systems do not guarantee the profitability of the DSS. If the DSS generates messages that the decision maker can anticipate, then the cost and waiting time as a result of invoking the DSS will not be justified. Proposes a decision support system equipped with a knowledge‐based model that tells the decision maker, prior to invoking the DSS, whether or not it is profitable to invoke the DSS; if invoking the DSS is not profitable, then the decision maker will have to base the decision on pure managerial subjective judgement. Uses a numerical example to illustrate the work of the proposed DSS.
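The invocation rule can be sketched as an expected-value-of-information comparison; the message probabilities, payoffs and cost figure are invented for illustration. When the decision maker can fully anticipate the messages, the two payoff tables coincide, the expected gain is zero and the net value is simply minus the invocation cost.

```python
# Invoke-or-not sketch: invoking the DSS is profitable only when the
# expected payoff improvement from its message exceeds the cost of
# invoking. All numbers below are hypothetical.

def value_of_invoking(msg_probs, payoff_with, payoff_without, cost):
    """Expected gain of acting on each possible DSS message versus
    acting on unaided judgement, net of the invocation cost."""
    gain = sum(p * (payoff_with[m] - payoff_without[m])
               for m, p in msg_probs.items())
    return gain - cost

msg_probs = {"favourable": 0.7, "unfavourable": 0.3}
payoff_with = {"favourable": 100.0, "unfavourable": 60.0}     # act on message
payoff_without = {"favourable": 100.0, "unfavourable": 20.0}  # unaided choice

net = value_of_invoking(msg_probs, payoff_with, payoff_without, cost=10.0)
print("invoke DSS" if net > 0 else "decide unaided")
```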