ABSTRACT: A stochastic model of the vacancy/occupancy pattern of a collection of rental units (a building) is used to obtain expressions relating vacancy rates to measurable market parameters. The model is well known in queueing theory, but yields new insights when interpreted in this context. By investigating the effects of the number of units, the mixture of units, and the "start-up" time, one can obtain a number of useful principles with implications for public and private planning.
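The abstract does not specify the model, but the kind of queueing calculation involved can be sketched with a simple birth-death chain: vacancies fill at a constant rate and each occupied unit vacates independently. Everything below (the rates, the Erlang-loss-style truncation) is an illustrative assumption, not the paper's model:

```python
from math import factorial

def occupancy_distribution(n_units, arrival_rate, vacate_rate):
    """Steady-state distribution of the number of occupied units in a
    birth-death model: renters arrive at a constant rate and each
    occupied unit vacates independently (an illustrative assumption)."""
    rho = arrival_rate / vacate_rate
    weights = [rho**k / factorial(k) for k in range(n_units + 1)]
    total = sum(weights)
    return [w / total for w in weights]

def expected_vacancy_rate(n_units, arrival_rate, vacate_rate):
    """Long-run fraction of units standing vacant."""
    pi = occupancy_distribution(n_units, arrival_rate, vacate_rate)
    mean_occupied = sum(k * p for k, p in enumerate(pi))
    return 1.0 - mean_occupied / n_units
```

Varying `n_units` while holding per-unit demand fixed is the sort of experiment the abstract's "effects of the number of units" refers to.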
ABSTRACT: This paper presents a comparison of statistical techniques versus a sales force-executive opinion approach to forecasting single-item demand over a five-year period for a single company. Three statistical techniques are used: Winters' three-parameter exponential smoothing model, Brown's harmonic model, and Box-Jenkins methodology. These techniques are compared against the company's forecasts and actual sales for the five-year period. The results indicate an interesting area for further research.
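The paper's models and data are not reproduced here, but a minimal sketch of a Winters-type three-parameter (level, trend, seasonal) exponential smoother, with a simple heuristic initialisation chosen purely for illustration, might look like this:

```python
def winters_additive(series, period, alpha, beta, gamma, horizon):
    """Additive three-parameter exponential smoothing (level, trend,
    seasonal). Initialisation below is a simple heuristic, not the
    paper's procedure."""
    # initialise level and trend from the first two seasons
    level = sum(series[:period]) / period
    second = sum(series[period:2 * period]) / period
    trend = (second - level) / period
    # initialise seasonal effects from the first season
    seasonal = [series[i] - level for i in range(period)]
    for t, y in enumerate(series):
        s = seasonal[t % period]
        last_level = level
        level = alpha * (y - s) + (1 - alpha) * (level + trend)
        trend = beta * (level - last_level) + (1 - beta) * trend
        seasonal[t % period] = gamma * (y - level) + (1 - gamma) * s
    # project level + trend forward, reusing the smoothed seasonals
    return [level + (h + 1) * trend + seasonal[(len(series) + h) % period]
            for h in range(horizon)]
```

The three smoothing constants (alpha, beta, gamma) are the "three parameters" of the model's name; in a comparison study they would be tuned on a holdout portion of the sales history.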
ABSTRACT: The committee or group approach is often used where considerable uncertainties have to be handled by skillful judgment. In this paper the authors investigate the subject of influence sharing in judgmental forecasting of commodity prices. They propose a conceptual framework for patterns of influence sharing in groups and for the influence roles of individuals. A statistical analysis of the influence of group members on the group forecast leads to inferences about influence patterns and roles in a group.
ABSTRACT: This paper deals with the economics of computer simulation applied to problems in the operations management/industrial engineering area. First, the literature dealing with the economics of simulation is summarized; the amount of information available turns out to be very small, and a number of hypotheses are presented for this paucity of material. Since many managers operate in an environment in which costs and benefits are relevant to their decision-making process, the lack of data on the economics of simulation may be a barrier to the utilization of the technique. In an attempt to improve this situation, a framework for the collection and specification of the economics of simulation modeling projects is proposed, as a means of urging management scientists to begin to collect and publish data on such projects.
ABSTRACT: The purpose of this study was to examine the effects of different interdisciplinary, problem-oriented formats on a student's knowledge of and retention of disciplinary concepts and principles, his application of these concepts and principles to disciplinary problems, and his use of a discipline in his analysis of a complex problem. Performances of students enrolled in the experimental sections were compared to those of students enrolled in standard lecture-discussion sections. Regression analysis was used to analyze students' performance in order to control for the effects of differences in student backgrounds. It was found that altering the pedagogical format and reward system within the experimental sections had no significant effect upon student performances in the experimental, problem-oriented program. In general, this study indicates that students taught in the standard lecture-discussion format, in which grades are determined by examination over course material, retain concepts and principles, apply these concepts and principles to disciplinary problems, and integrate disciplinary concepts into their analysis of complex problems as well as, if not better than, students taught in an interdisciplinary, problem-solving format.
ABSTRACT: The problem of estimating steady-state absorption probabilities for first-order stationary Markov chains having a finite state space is examined. As model parameters, these probabilities are analytic functions of the transition probabilities Q and R, and they can be represented as P = (I − Q)⁻¹R. Estimators P̌ may be obtained by replacing the transition probabilities by their maximum likelihood estimators Q̌ and Ř under multinomial theory. Using large-sample multivariate normal theory, one can derive the asymptotic distribution of these estimators and can obtain large-sample confidence intervals. Finally, an application related to estimating loss reserves for an installment loan portfolio assumed to satisfy a Markov chain is discussed.
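The identity P = (I − Q)⁻¹R can be illustrated numerically by solving the linear system (I − Q)P = R directly; the small chain below is invented for illustration (the paper's loan-portfolio application is not reproduced):

```python
def absorption_probabilities(Q, R):
    """Solve (I - Q) P = R by Gauss-Jordan elimination, giving the
    matrix of absorption probabilities for a finite absorbing chain
    (Q: transient-to-transient, R: transient-to-absorbing)."""
    n, m = len(Q), len(R[0])
    # build the augmented system [I - Q | R]
    A = [[(1.0 if i == j else 0.0) - Q[i][j] for j in range(n)] + list(R[i])
         for i in range(n)]
    for col in range(n):
        # partial pivoting for numerical stability
        pivot = max(range(col, n), key=lambda r: abs(A[r][col]))
        A[col], A[pivot] = A[pivot], A[col]
        p = A[col][col]
        A[col] = [x / p for x in A[col]]
        for r in range(n):
            if r != col and A[r][col]:
                f = A[r][col]
                A[r] = [x - f * y for x, y in zip(A[r], A[col])]
    # the right-hand block is now P
    return [row[n:] for row in A]
```

Each row of P sums to one, since every transient state is eventually absorbed somewhere; that makes a convenient sanity check on estimated Q̌ and Ř.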
ABSTRACT: While many problems of uncertainty are commonly analyzed by means of stochastic models, under certain circumstances this may not be an appropriate approach. The latter situation arises when the decision maker knows that the uncertain variables are not generated by a stochastic process, or when he is unwilling, or unable, to compute subjective probabilities. One of the nonstochastic approaches to uncertainty is the expectational approach, in which the decision maker forms deterministic expectations about the uncertain aspects of his environment. This paper is concerned with some criteria for selecting among available expectations, or anticipations functions, and the possibility of ordering them according to these criteria. This study focuses especially on the learning criterion. The discussion brings out conceptual problems in connection with the definition of learning, as well as some technical difficulties that one encounters when attempting to compare different anticipations functions from the point of view of the learning criterion. As an illustration of the issues discussed, the paper reports the results of some simulated decision rules. These show that decision rules in which no learning takes place, and in which some information is ignored, may perform better than more sophisticated rules.
ABSTRACT: The author generated several residual patterns under controlled conditions in order to observe the effects of various specification errors. The results and their interpretation are presented in this article, extending the work of Richards [3]. A list of conditions which affect the residual pattern resulting from the misspecifications is included.
ABSTRACT: This paper demonstrates the use of computer-based multivariate techniques in analysing survey data and segmenting markets. Such techniques are increasingly used to deal with the vast amounts of data currently being collected by market researchers. Three techniques are illustrated and compared in detail: regression analysis, AID (automatic interaction detection), and cluster analysis. These methods have wide application in marketing. A simple data set is used to illustrate their principles and comparative advantages.
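The paper's data set is not reproduced here, but the flavour of the cluster-analysis step can be sketched with plain Lloyd's k-means on invented two-variable survey scores (points, initial centres, and the Euclidean distance choice are all illustrative assumptions):

```python
def kmeans(points, centers, iters=20):
    """Lloyd's k-means with caller-supplied initial centres - one common
    form of the cluster analysis used for market segmentation."""
    for _ in range(iters):
        # assign each respondent to the nearest segment centre
        groups = [[] for _ in centers]
        for p in points:
            j = min(range(len(centers)),
                    key=lambda c: sum((a - b) ** 2
                                      for a, b in zip(p, centers[c])))
            groups[j].append(p)
        # recompute each centre as the mean of its segment
        centers = [tuple(sum(x) / len(g) for x in zip(*g)) if g else centers[j]
                   for j, g in enumerate(groups)]
    return centers, groups
```

With two well-separated clouds of respondents, the centres converge to the cloud centroids, which is exactly the segment structure a market researcher would read off.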
ABSTRACT: The entropy measure H = −Σ pᵢ log pᵢ is being used with increasing frequency in the analysis of business and economic data. It is, however, simply another measure of dispersion which can be related to the moments of the probability function. Its virtues stem from its decomposition and interpretative properties. This paper surveys the uses to which the measure has been put in the literature, and discusses whether its use has been appropriate and innovative.
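The decomposition property mentioned above can be checked numerically: if the probabilities are split into groups with totals P_g, then H equals the entropy of the group totals plus the weighted average of the within-group entropies. The probabilities below are invented for illustration:

```python
from math import log

def entropy(p):
    """H = -sum p_i log p_i (natural log)."""
    return -sum(x * log(x) for x in p if x > 0)

# illustrative probabilities, split into two groups
p = [0.2, 0.3, 0.1, 0.4]
groups = [[0.2, 0.3], [0.1, 0.4]]

totals = [sum(g) for g in groups]                     # between-group shares P_g
within = sum(t * entropy([x / t for x in g])          # renormalise within group
             for g, t in zip(groups, totals))
decomposed = entropy(totals) + within                 # equals entropy(p)
```

It is this exact additive split into "between" and "within" components that makes the measure attractive for, e.g., industry concentration broken down by region.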
ABSTRACT: After a brief review of the role of dummy variables in regression analysis and the current state of the art in rounding/truncation error detection in computerized least-squares programs, this paper presents a theorem that can be used to detect this type of error whenever an analyst is running a regression program that has one or more dummy variables as independent variables.
ABSTRACT: The robustness of linear programming regression estimators is examined where the disturbance terms are normally distributed and there are observation errors in the explanatory variables. These errors are occasional gross biases between one set of observations and another. The simulation of short-series data offers preliminary evidence that when these biases have a non-zero mean, MSAE (minimum sum of absolute errors) estimation is more robust than least squares.
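A toy version of the comparison, assuming MSAE means minimum sum of absolute errors: for simple regression an L1-optimal line passes through at least two of the data points, so on a small sample it can be found by enumerating pairs rather than by the paper's linear program. The data, with one grossly biased observation, are invented for illustration:

```python
from itertools import combinations

def ols_line(xs, ys):
    """Ordinary least squares for y = a + b x; returns (a, b)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    return my - b * mx, b

def msae_line(xs, ys):
    """Brute-force MSAE fit: an L1-optimal simple-regression line passes
    through at least two data points, so enumerate all pairs (fine for
    small n; the paper would solve this as a linear program)."""
    best = None
    for (x1, y1), (x2, y2) in combinations(zip(xs, ys), 2):
        if x1 == x2:
            continue
        b = (y2 - y1) / (x2 - x1)
        a = y1 - b * x1
        sae = sum(abs(y - (a + b * x)) for x, y in zip(xs, ys))
        if best is None or sae < best[0]:
            best = (sae, a, b)
    return best[1], best[2]
```

On data lying on y = 2x except for one observation biased upward by a gross amount, the MSAE line recovers the true slope while the least-squares line is dragged toward the biased point, which is the robustness contrast the abstract describes.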
ABSTRACT: Although spectral analysis has previously been discussed in a number of business journals, the discussion has not been detailed enough for non-mathematicians. The objective of this paper is to review in detail the concepts and computations of spectral analysis as they pertain to forecasting. To gain insight into the model-building technique of spectral analysis, a passing comparison is made with a familiar model, regression. Regression analysis attempts to find a set of independent variables that shed some light on the dependent variable to be forecast; if the independent variables have some functional relationship with the dependent variable, a reliable forecast of the dependent variable can then be made. Forecasting using spectral analysis, on the other hand, is based on the assumption that the variation of a time series can be explained by some mixture of sine and cosine waves. Model parameters can then be estimated for these waves and forecasts made. These parameters have the same least-squares property as in ordinary regression analysis. A transformation of these parameters gives the spectrum of the time series; the spectrum is related to the explained variation present in regression analysis. An extension of the spectrum gives a set of coefficients of an autoregressive forecasting model. This latter model is referred to as the Wiener-Kolmogorov forecasting model.
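A minimal sketch of the quantities the abstract describes: the sine/cosine coefficients at the Fourier frequencies, which have the least-squares property mentioned, and the periodogram (sample spectrum) built from them. The series and normalisation choices below are illustrative, and the Nyquist term is omitted for simplicity:

```python
from math import cos, sin, pi

def fourier_coefficients(series):
    """Cosine/sine coefficients (a_k, b_k) at the Fourier frequencies
    2*pi*k/n. At these frequencies the regression of the series on the
    cos/sin terms has this closed form, so the fitted waves are
    least-squares estimates, as in ordinary regression."""
    n = len(series)
    coeffs = []
    for k in range(1, (n - 1) // 2 + 1):  # Nyquist term omitted
        a = (2.0 / n) * sum(y * cos(2 * pi * k * t / n)
                            for t, y in enumerate(series))
        b = (2.0 / n) * sum(y * sin(2 * pi * k * t / n)
                            for t, y in enumerate(series))
        coeffs.append((a, b))
    return coeffs

def periodogram(series):
    """I(k) = (n/2)(a_k^2 + b_k^2): variation explained by the k-th
    harmonic, the spectral analogue of regression's explained variation."""
    n = len(series)
    return [(n / 2.0) * (a * a + b * b) for a, b in fourier_coefficients(series)]
```

Feeding in a pure cosine at one Fourier frequency returns a coefficient of one at that frequency and (up to rounding) zero elsewhere, so the periodogram shows a single spike, which is how a dominant cycle in a business series would reveal itself.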