The Changing International Law of High Seas Fisheries. By FRANCISCO ORREGO VICUNA. Cambridge: Cambridge University Press, 1999. 338 pp. 45
In: The British yearbook of international law, Vol. 71, No. 1, pp. 401-403
ISSN: 2044-9437
In: Oxford review of economic policy, Vol. 16, No. 1, pp. 84-94
ISSN: 1460-2121
In: Survey review, Vol. 35, No. 271, p. 70
ISSN: 1752-2706
In: Common market law review, Vol. 34, No. 3, pp. 571-602
ISSN: 0165-0750
In: Intelligence and national security, Vol. 10, No. 4, pp. 113-132
ISSN: 0268-4527
A consistent theme in British government publications concerning Britain's intelligence system has been the emphasis on its increased overall centralization, and on the increasingly central role therein of the Joint Intelligence Committee. However, the problems and issues which have formed the focus of current discussions of intelligence arising out of the 1992 Matrix Churchill trial and the Scott Inquiry have all resulted from the fact that the British intelligence system is, in fact, profoundly decentralized. This essay examines how this optical illusion of centralization has developed, and how Britain's intelligence machinery has acquired such a split personality.
In: Politics, Vol. 8, Oct 1988
ISSN: 0263-3957
Looks at what has happened to Britain's nationalized industries since 1979, and then offers an interpretation of these events. The principal theme is that considerable continuity exists between the policies adopted by governments before and after 1979, except that ministers have not wanted to become involved in the management of nationalized industries, in order to protect and promote their political status. (PFB)
In: Statistica Neerlandica, Vol. 49, No. 2, pp. 185-245
ISSN: 1467-9574
This article attempts to provide a formal framework for data-based inference which explicitly and consistently recognizes the approximate nature of probability models. It is based on the idea that a stochastic model is adequate if samples generated under the model are very much like the sample actually obtained. The formalization is based on the concept of a data feature. Examples are given of applying the ideas to different areas of statistics, including location-scale models, densities, non-parametric regression, interlaboratory tests, auto-regressive processes and the analysis of variance. The four cornerstones of the approach are direct comparison, approximation, weak topologies and parsimony. The approach is contrasted with that of much of conventional statistics, many of whose concepts are pathologically discontinuous with respect to the topology of data analysis and common sense.
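The adequacy idea summarized in this abstract can be illustrated with a minimal sketch (an illustration of the general idea, not the article's own formalism): fit a location-scale (normal) model to a sample, generate many samples under the fitted model, and ask whether a chosen data feature of the observed sample — here the Kolmogorov-Smirnov distance between the empirical CDF and the fitted CDF, one plausible feature among many — is typical of the features computed from the simulated samples. All function names and the 95% cut-off are assumptions made for this sketch.

```python
import math
import random
import statistics

def normal_cdf(x, mu, sigma):
    # Normal CDF via the error function.
    return 0.5 * (1.0 + math.erf((x - mu) / (sigma * math.sqrt(2.0))))

def ks_feature(sample, mu, sigma):
    """Kolmogorov-Smirnov distance between the empirical CDF of
    `sample` and the fitted N(mu, sigma) CDF: one possible 'data
    feature' in the sense of the abstract (hypothetical choice)."""
    xs = sorted(sample)
    n = len(xs)
    return max(
        max(abs((i + 1) / n - normal_cdf(x, mu, sigma)),
            abs(i / n - normal_cdf(x, mu, sigma)))
        for i, x in enumerate(xs)
    )

def adequacy_check(sample, n_sim=500, level=0.95, rng=None):
    """Declare the fitted normal model adequate if the observed
    feature is no larger than the `level` quantile of the features
    of samples generated under the fitted model."""
    rng = rng or random.Random(0)
    mu = statistics.fmean(sample)
    sigma = statistics.stdev(sample)
    observed = ks_feature(sample, mu, sigma)
    simulated = sorted(
        ks_feature([rng.gauss(mu, sigma) for _ in sample], mu, sigma)
        for _ in range(n_sim)
    )
    threshold = simulated[int(level * n_sim) - 1]
    return observed, threshold, observed <= threshold

# Data actually drawn from a normal model should pass the check.
data_rng = random.Random(42)
normal_data = [data_rng.gauss(10.0, 2.0) for _ in range(100)]
obs, thr, ok = adequacy_check(normal_data)
print(f"feature={obs:.3f} threshold={thr:.3f} adequate={ok}")
```

The "direct comparison" cornerstone shows up in the last line of `adequacy_check`: the observed feature is compared directly against the simulated features, with no appeal to an exact sampling distribution.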
In: International journal of information management, Vol. 14, No. 2, pp. 84-94
ISSN: 0268-4012
In: Yearbook of European law, Vol. 11, No. 1, pp. 610-611
ISSN: 2045-0044
In: Business history, Vol. 32, No. 3, pp. 210-211
ISSN: 1743-7938
In: Yearbook of European law, Vol. 9, No. 1, pp. 21-53
ISSN: 2045-0044
In: Business history, Vol. 31, No. 1, pp. 94-95
ISSN: 1743-7938
In: Business history, Vol. 29, No. 2, pp. 229-230
ISSN: 1743-7938
In: Unemployment, Social Vulnerability, and Health in Europe; Health Systems Research, pp. 323-325
In: Business history, Vol. 23, No. 2, p. 254
ISSN: 1743-7938