Applied Decision Analysis: Utilizing Poliheuristic Theory to Explain and Predict Foreign Policy and National Security Decisions
In: International Studies Perspectives 6(1):94 - 98, 2005
SSRN
Working paper
In: Vestnik MGIMO-Universiteta: naučnyj recenziruemyj žurnal = MGIMO review of international relations: scientific peer-reviewed journal, Issue 5(38), pp. 56-78
ISSN: 2541-9099
Applied analysis of international relations began to take shape at MGIMO-University in the 1970s. This kind of research consistently attracted considerable interest from the Ministry of Foreign Affairs of the USSR and other executive institutions of the government, and received their support. The Ministry of Foreign Affairs initiated the creation of a special unit at MGIMO - the Problem Research Laboratory of Systems Analysis in International Relations. The Laboratory used systems analysis and quantitative methods to produce scientific information that would help decision-makers make "more informed decisions in the field of international relations in order to reduce the level of uncertainty in the assessment of the expected impact of these decisions". In 2004, the successor to the Problem Laboratory, the Center for International Studies, was transformed into a Research Coordination Council for International Studies, which in 2009 handed its functions to the Institute of International Studies. Compared with previous periods, the Institute of International Studies has significantly increased the volume of research it produces for the Ministry of Foreign Affairs. It has also moved functionally beyond its institutional boundaries: it produces unclassified research for the public and serves as a venue for lively public discussions among IR specialists. The Institute has gained international recognition as well: the "Go To Think Tank" ranking produced annually at the University of Pennsylvania placed MGIMO-University 10th in the category of university-based think tanks.
In: International studies perspectives: ISP, Vol. 6, Issue 1, pp. 94-98
ISSN: 1528-3585
In: International studies perspectives: a journal of the International Studies Association, Vol. 6, Issue 1, pp. 94-98
ISSN: 1528-3577
World Affairs Online
SSRN
Working paper
In: Review of Pacific Basin Financial Markets and Policies, Vol. 14, Issue 4, pp. 715-735
ISSN: 1793-6705
The main purpose of this paper is to advocate a rule-based forecasting technique for anticipating stock index volatility. The paper sets up a stock index projection prototype using a multiple-criteria decision-making model that combines the cluster analysis (CA) technique and Rough Set Theory (RST) to select the important attributes and forecast the TSEC Capitalization Weighted Stock Index. The prototype was then applied to forecast the stock index in the first half of 2009, achieving an accuracy of 66.67%. The results indicate that the extracted decision rules can appropriately be employed to forecast stock index volatility.
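The pipeline the abstract describes - discretize indicators, then extract decision rules in the rough-set style - can be sketched in miniature. The data, attribute names, quantile-based binning (standing in for cluster analysis), and the toy direction labels below are illustrative assumptions, not the paper's actual CA/RST procedure or TSEC data:

```python
# Sketch of a discretize-then-rules pipeline in the spirit of CA + Rough Set Theory.
# All data, attributes, and parameters here are invented for illustration.
from collections import defaultdict

def discretize(values, k=3):
    """Quantile binning as a stand-in for cluster analysis: map each value to a bin label."""
    s = sorted(values)
    cuts = [s[len(s) * i // k] for i in range(1, k)]
    return [sum(v >= c for c in cuts) for v in values]

def certain_rules(conditions, decisions):
    """Rough-set 'lower approximation': keep condition patterns whose decision is unanimous."""
    table = defaultdict(set)
    for cond, dec in zip(conditions, decisions):
        table[cond].add(dec)
    return {cond: decs.pop() for cond, decs in table.items() if len(decs) == 1}

# Toy indicators (e.g. momentum, volume change) and next-period direction (1 = up).
momentum = [0.5, 1.2, -0.3, 0.8, -1.1, 1.5, -0.2, 0.9]
volume = [1.0, 2.0, 0.5, 1.8, 0.4, 2.2, 0.6, 1.9]
direction = [0, 1, 0, 1, 0, 1, 0, 1]

conds = list(zip(discretize(momentum), discretize(volume)))
rules = certain_rules(conds, direction)
hits = sum(rules.get(c) == d for c, d in zip(conds, direction))
print(f"certain rules: {len(rules)}, in-sample accuracy: {hits / len(direction):.2%}")
```

Configurations whose cases disagree on the outcome yield no certain rule, which is the rough-set analogue of an attribute set that fails to discriminate.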
In: History of political economy, Vol. 49, Supplement, pp. 78-102
ISSN: 1527-1919
In: Forum qualitative Sozialforschung: FQS = Forum: qualitative social research, Vol. 14, Issue 3
ISSN: 1438-5627
The key to using an analytic method is to understand its underlying logic and figure out how to incorporate it into the research process. In the case of Qualitative Comparative Analysis (QCA), these issues have so far been addressed only partly. While general introductions and user's guides for QCA software packages are available, prospective users find little guidance on how the method works in applied data analysis. How can QCA be used to produce comprehensive, ingenious explanations of social phenomena? In this article, the author provides such a hands-on introduction to QCA. In the first two parts, he offers a concise overview of (1) the method's main principles and advantages and (2) its vital concepts. In the subsequent part, he offers suggestions on (3) how to employ QCA's analytic tools in the research process and how to interpret their output. Lastly, the author shows (4) how QCA results can inform the data analysis. As the main contribution, he provides a template for reassessing cases, causal recipes, and single conditions on the basis of QCA results in order to produce better explanations of what is happening in the data. With these contributions, the article helps prospective QCA users to utilize the full potential the method offers for social science research.
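The core analytic step of crisp-set QCA - building a truth table and keeping configurations whose outcome is sufficiently consistent - can be illustrated with a toy example. The conditions A, B, C, the cases, and the 0.8 consistency threshold below are invented for illustration; this is not the author's template or any QCA software package:

```python
# Minimal crisp-set QCA truth table with a sufficiency-consistency cutoff.
# Hypothetical binary conditions A, B, C and outcome Y.
from collections import defaultdict

cases = [  # (A, B, C, Y) for eight hypothetical cases
    (1, 1, 0, 1), (1, 1, 0, 1), (1, 0, 1, 1),
    (1, 0, 1, 0), (0, 1, 1, 0), (0, 1, 1, 0),
    (0, 0, 0, 0), (1, 1, 1, 1),
]

rows = defaultdict(lambda: [0, 0])  # configuration -> [n_cases, n_showing_outcome]
for *config, y in cases:
    rows[tuple(config)][0] += 1
    rows[tuple(config)][1] += y

# A configuration counts as sufficient for Y if its consistency
# (share of its cases showing Y) meets the threshold, here 0.8.
sufficient = {cfg for cfg, (n, k) in rows.items() if k / n >= 0.8}
for cfg in sorted(sufficient, reverse=True):
    print("sufficient configuration (A, B, C):", cfg)
```

Configurations below the cutoff (here, A·~B·C with one positive and one negative case) are exactly the rows a QCA user would go back and reassess case by case.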
In: Izvestiya of Altai State University, Issue 1(123), pp. 89-94
ISSN: 1561-9451
The article deals with the problem of reconciling observation results, which arises when solving interval-analysis problems over a database. The values of the input variables and the output variable are consistent if, in each observation, the graph of the sought dependence passes through the interior points of the interval hyper-rectangle. In this case, it is proposed to use special solutions of interval systems of linear algebraic equations (ISLAE) to analyze data from linear processes. In real and model conditions, however, this property of the database is not always satisfied a priori. In such cases, it is proposed to apply the principle of robust estimation: inconsistent observations should either be excluded from the sample or adjusted. The paper presents the results of studying these reconciliation methods on an experimental database of model linear processes, under conditions where the basic assumptions of interval estimation of dependences hold. In addition, variant computational experiments were carried out to examine whether preliminary correction of observations can increase the accuracy of interval analysis, including the possibility of guaranteed estimation of the sought dependences.
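The consistency criterion described above - the sought linear dependence must pass through the interior of each observation's interval box - together with the robust step of excluding inconsistent observations, can be sketched as follows. The synthetic data, the candidate line, and the exact-x/interval-y setup are the sketch's own assumptions, not the article's model:

```python
# Illustrative consistency check for interval data: x is exact, y is an
# interval [lo, hi]; a linear model y = a + b*x is consistent with an
# observation if its prediction falls strictly inside the interval.
def inconsistent(obs, a, b):
    """Return indices of observations the line misses (candidates to drop or adjust)."""
    return [i for i, (x, lo, hi) in enumerate(obs) if not (lo < a + b * x < hi)]

obs = [  # (x, y_lo, y_hi) -- synthetic data around y = 1 + 2x, with one outlier
    (0.0, 0.5, 1.5), (1.0, 2.5, 3.5), (2.0, 4.5, 5.5), (3.0, 9.0, 10.0),
]
bad = inconsistent(obs, a=1.0, b=2.0)
print("inconsistent observations:", bad)  # the robust step: exclude or correct these
cleaned = [o for i, o in enumerate(obs) if i not in bad]
print("remaining observations:", len(cleaned))
```

After removing (or adjusting) the flagged observation, every remaining interval contains the line's prediction, which is the precondition the article states for applying ISLAE-based estimation.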
In: Journal of multi-criteria decision analysis, Vol. 25, Issue 5-6, pp. 142-161
ISSN: 1099-1360
Abstract: An overview of uncertainty issues associated with the analytic hierarchy process (AHP) is provided. Further, an explicit understanding of uncertainty (designation, categorization, and quantification) with respect to the methodological properties of the AHP is developed and used to analyse a hypothetical group decision problem located in the context of environmental decision making (EDM). To calculate the numerical impact of especially designed uncertainty scenarios (USs) on the final ranking given by the AHP, a simulation experiment is conducted using R. It evaluates the impact of uncertainty within three variants of a hypothetical decision-making case by calculating an "overall uncertainty" measure. The consideration of uncertainty may lead to a rank reversal in comparison with an analysis neglecting uncertainty (best alternative given by the AHP). The results show that the absolute maximal impact caused by a US is approximately 0.03. With respect to a single US and the specific case characteristics, a rank reversal occurs in about 50% of the simulated runs. Similar shares of rank reversal across different USs within a single variant of the case raise the question of which uncertainty should be given priority attention in decision-making practice. For decision analysts in EDM, this result implies that additional resources may be necessary to negotiate with decision makers about which uncertainties should be addressed. From a theoretical, normative point of view, the effects of considering uncertainty issues in the AHP methodology cannot satisfy the ideal of a rational decision analysis. From a descriptive point of view, considering the practice of decision makers, the impacts of the considered uncertainties stay within reasonable limits, meaning that the maximal numerical impact remains in the hundredths decimal place.
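The mechanism the abstract studies - an uncertainty scenario perturbing a pairwise judgment and possibly reversing the AHP ranking - can be reproduced in miniature. The article's experiment uses R; the sketch below uses Python, and the 3x3 comparison matrix and the perturbed judgment are invented, not the article's simulated case:

```python
# AHP priorities via power iteration, plus one hypothetical uncertainty scenario.
# The matrices below are illustrative judgments, not the article's data.
def priorities(M, iters=50):
    """Approximate the principal-eigenvector priority weights by power iteration."""
    n = len(M)
    w = [1.0 / n] * n
    for _ in range(iters):
        w = [sum(M[i][j] * w[j] for j in range(n)) for i in range(n)]
        s = sum(w)
        w = [x / s for x in w]
    return w

# Baseline pairwise comparisons for alternatives 0, 1, 2 (reciprocal matrix).
base = [[1, 2, 1 / 2], [1 / 2, 1, 1 / 3], [2, 3, 1]]
w0 = priorities(base)

# Uncertainty scenario: the 0-vs-1 judgment flips from "0 twice as important"
# to "1 twice as important" (reciprocity preserved).
shifted = [[1, 1 / 2, 1 / 2], [2, 1, 1 / 3], [2, 3, 1]]
w1 = priorities(shifted)

rank0 = sorted(range(3), key=lambda i: -w0[i])
rank1 = sorted(range(3), key=lambda i: -w1[i])
print("baseline ranking:", rank0)
print("scenario ranking:", rank1)
```

Here the best alternative is unchanged but the second and third places swap, which is the kind of rank reversal the simulation experiment counts across scenarios.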
In: Public choice, Vol. 162, Issue 3-4, pp. 451-452
ISSN: 1573-7101
In: Summers, G., "Friction and Decision Rules in Portfolio Decision Analysis," Decision Analysis, articles in advance, Feb. 26, 2021.
SSRN
In: Decision analysis: a journal of the Institute for Operations Research and the Management Sciences, INFORMS, Vol. 18, Issue 2, pp. 101-120
ISSN: 1545-8504
In portfolio decision analysis, features comprise the objectives, alternatives, physics, and information that define a decision context. By modeling features, decision analysts forecast the expected utilities of the alternatives. A model is complete if it contains all the features. A model is well-calibrated if it correctly predicts the probability distributions of each alternative's utility, whereas ill-calibrated models, like those that suffer the optimizer's curse, do not. Friction identifies qualities of a situation that prevent decision analysts from creating complete, well-calibrated models. When friction is significant, can maximizing expected utility be a suboptimal decision rule? Is satisfying decision theory's axioms a necessary or sufficient condition for good decision making? Can rules that violate the axioms outperform rules that satisfy them? A simulation study of how unbiased, imprecise forecasts of payoffs affect project selection finds that, for the example tested, the answers are yes, no, and yes, which suggests that further studies of friction may be worthwhile. Discussions of friction bookend the study, starting the paper by defining friction and concluding by presenting three frameworks, each one from a different field of study, that provide mathematical tools for studying friction.
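The simulation the abstract refers to concerns unbiased but imprecise payoff forecasts. A minimal Monte Carlo illustration of the underlying effect (the optimizer's curse) is sketched below; the project counts, noise level, and Gaussian assumptions are the sketch's own, not the paper's study design:

```python
# Monte Carlo illustration of the optimizer's curse: with unbiased but noisy
# forecasts, the project selected by "maximize estimated utility" has an
# estimated value that systematically exceeds its true value.
import random

random.seed(0)
n_projects, n_runs, noise_sd = 20, 2000, 2.0
gap = 0.0
for _ in range(n_runs):
    true = [random.gauss(0, 1) for _ in range(n_projects)]          # true payoffs
    est = [t + random.gauss(0, noise_sd) for t in true]             # unbiased forecasts
    best = max(range(n_projects), key=lambda i: est[i])             # pick the apparent best
    gap += est[best] - true[best]                                   # post-decision surprise
print(f"average estimate-minus-truth for the selected project: {gap / n_runs:.2f}")
```

The average gap is strictly positive even though every individual forecast is unbiased; selection itself induces the bias, which is one face of the "friction" between an ill-calibrated model and a good decision rule.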
In: Prakash, Gyan, Ram Kumar Jha and R C Sharma (2006), "Consumption of Nutrients: An Applied Microeconomic Analysis", ICFAI Journal of Applied Economics, Hyderabad, Vol. V, No. 4, July, pp. 54-66 (ISSN: 0972-6861).
SSRN
In: Journal of economic studies, Vol. 45, Issue 6, pp. 1106-1123
ISSN: 1758-7387
Purpose
The purpose of this paper is to study the reasons for, and the decision-making processes behind, heterogeneous firms' bribery behavior, and how these decisions affect an aggregate economy's development and corruption status.
Design/methodology/approach
The authors build a dynamic model of a firm's joint decision to bribe and invest, and of how that decision is determined by the firm's production and infrastructure status. They simulate firm-level decisions and development paths, and then build an aggregate economy consisting of heterogeneous firms. They then simulate the economy's development and corruption growth paths, calibrating the model to Chinese manufacturing firms in 2012.
Findings
Based on the simulation results, the authors conduct counterfactual policy analyses. By comparing the simulation results of two counterfactual scenarios, they study how a government could better control bribery, that is, decrease both the number of bribers and the average bribery payment. They find that directly raising the cost of bribery controls corruption more effectively than reducing the benefits received by bribers. The finding provides insightful policy implications for a government seeking to clean up its economy.
Originality/value
The paper makes a novel contribution to the literature by filling a theoretical gap: the authors introduce a dynamic firm-level model that interprets firms' bribery decisions and replicates the aggregate stylized facts. The paper innovatively treats bribery as both a discrete and a continuous decision. With both types of bribery decisions modeled, the authors can simulate and quantify a firm's intertemporal status and growth path.
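The discrete-plus-continuous structure of the bribery decision, and the two counterfactual levers (raising costs versus reducing benefits), can be sketched in a stylized way. The functional forms (square-root benefit, linear cost), the parameter values, and the firm heterogeneity below are assumptions of this sketch, not the authors' calibrated model, and the sketch makes no claim about which lever is more effective:

```python
# Stylized joint bribery decision: a firm chooses whether to bribe (discrete)
# and how much (continuous). Benefit theta*sqrt(b), cost c*b plus a fixed cost f
# are illustrative forms, not the paper's model.
import random

def decide(theta, cost, fixed):
    """Return (bribes?, amount): the firm bribes only if the interior optimum
    b* = (theta / (2*cost))**2 yields a positive net payoff theta*sqrt(b*) - cost*b* - fixed."""
    b_star = (theta / (2 * cost)) ** 2
    payoff = theta * b_star ** 0.5 - cost * b_star - fixed
    return (True, b_star) if payoff > 0 else (False, 0.0)

random.seed(1)
thetas = [random.uniform(0.5, 2.0) for _ in range(1000)]  # heterogeneous firms

def summarize(cost, fixed, benefit_scale=1.0):
    """Count bribers and average bribe under a policy (cost, fixed, benefit scale)."""
    picks = [decide(t * benefit_scale, cost, fixed) for t in thetas]
    amounts = [amt for flag, amt in picks if flag]
    avg = sum(amounts) / len(amounts) if amounts else 0.0
    return len(amounts), avg

b_base = summarize(cost=1.0, fixed=0.3)
b_cost = summarize(cost=1.5, fixed=0.3)            # counterfactual: raise bribery costs
b_benefit = summarize(cost=1.0, fixed=0.3, benefit_scale=0.8)  # counterfactual: cut benefits
print("baseline        (bribers, avg bribe):", b_base)
print("raise cost      (bribers, avg bribe):", b_cost)
print("reduce benefits (bribers, avg bribe):", b_benefit)
```

Either policy shrinks the set of bribers relative to the baseline; comparing the two columns across policies is the shape of the counterfactual exercise the abstract describes.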