The authors believe this free-of-charge book, Fundamentals of Infrastructure Management, will expand the impact of the material and help improve the practice of infrastructure management. By 'free of charge,' we mean that the material can be freely obtained, but readers should devote time and effort to mastering it. We have provided problem assignments for various chapters, and we strongly urge readers to undertake the problems as a learning experience. This book grew out of a decade of co-teaching a course entitled 'Infrastructure Management' at Carnegie Mellon University. Our teaching philosophy was to prepare students for work in the field of infrastructure management. We believe that infrastructure management is a professional endeavor and an attractive professional career. The book is co-authored by two accomplished engineers, together representing professional practice, academic research and theoretical evaluation. Their collective strengths are presented throughout the text and serve to support both the practice of infrastructure management and a role for infrastructure management inquiry and research. Importantly, both co-authors have academic research interests (and a number of research publications) on various topics of infrastructure management. That said, the primary audience for this book is expected to be professionals intending to practice infrastructure management, and only secondarily individuals who intend to pursue a career of research in the area. The text draws examples from and discusses a wide variety of infrastructure systems, including roadways, telecommunications, power generation, buildings and systems of infrastructure. We have found that some common fundamentals of asset management, analysis tools and informed decision-making are useful for a variety of such systems. Certainly, many infrastructure managers encounter a variety of infrastructure types during their professional careers. 
Moreover, due to the functional inter-dependencies of different infrastructure systems, it is certainly advantageous for managers of one infrastructure type to understand other types of infrastructure. For example, roadway managers rely upon the power grid for traffic signal operation. The first segment of this book presents fundamental concepts and processes (e.g., the FHWA Asset Management Process), followed by chapters on specific types of infrastructure. In the first segment of the book, we generally, but not exclusively, use roadways as an example infrastructure application. We have chosen roadways since they are ubiquitous and nearly everyone is familiar with their use (and deterioration!). We are convinced that a life cycle or long-term viewpoint is essential for good infrastructure management. There are always pressures to adopt short-term thinking in making investment and management decisions. Political election cycles and short-term stock performance certainly focus attention on immediate priorities or issues. Nevertheless, many infrastructure investments will last for decades or more, and providing good performance over an entire lifetime is critical for good infrastructure management. Even decisions as intangible as the adoption of a particular performance standard for a facility component are likely to have long-term implications. Of course, infrastructure managers may face budget limits or other constraints that preclude long-term optimization. A result is the deferred maintenance and functional obsolescence that exist in many infrastructure systems. However, understanding the effects and implications of these constraints is an important task for infrastructure managers. As a fourth organizational concept, we believe that infrastructure management is a process with multiple objectives (as well as multiple constraints). In particular, infrastructure management should adopt a 'triple bottom line' to consider economic, environmental and social impacts. 
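As a small illustration of the life cycle viewpoint discussed above, the sketch below compares two hypothetical maintenance strategies for a roadway segment by the net present value of their costs. All figures (costs, timing, discount rate) are invented for illustration and are not drawn from the text.

```python
# Hypothetical life cycle cost comparison via net present value (NPV).
# All cost figures and the discount rate are assumptions for illustration.

def npv(cash_flows, rate):
    """Discount a list of (year, cost) pairs to present value."""
    return sum(cost / (1 + rate) ** year for year, cost in cash_flows)

# Strategy A: defer maintenance, then a costly rehabilitation in year 15.
defer = [(15, 900_000)]
# Strategy B: preventive resurfacing every 5 years at modest cost.
preventive = [(5, 150_000), (10, 150_000), (15, 150_000)]

rate = 0.04  # assumed real discount rate
print(f"Deferred:   {npv(defer, rate):,.0f}")
print(f"Preventive: {npv(preventive, rate):,.0f}")
```

Under these assumed numbers the preventive strategy has the lower discounted life cycle cost, illustrating why a long-term viewpoint can reverse the ranking suggested by short-term budgets.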
Again, infrastructure managers may be charged with focusing solely on the costs of providing services, but infrastructure certainly has implications for the natural environment and for society. For example, infrastructure management typically involves a large number of workers and affects a large number of users, so social impacts are significant. Throughout this book, we will try to address the impacts of infrastructure decision making with regard to these multiple objectives. Students in our Infrastructure Management course were usually first-year graduate students or senior undergraduate students. While most were majoring in engineering disciplines, we also had architects, computer scientists and public policy students successfully complete the course. Indeed, many of our students ended up pursuing a career in some form of infrastructure management, and we are particularly grateful to all our students for their insights, their questions and their feedback on the material. Our course in Infrastructure Management was a full semester, with class sessions totaling 30 to 35 hours over the course of the semester. The order of coverage of material generally followed the order in this book, except that we usually covered one or two infrastructure chapters early in the course to provide context for examples. The course involved class sessions (with a mix of lecture, discussion, videos and in-class exercises), homework assignments and a semester-long group project of the students' choosing. We considered the group project fundamental to a substantive understanding of the concepts expressed in the text. Our course readings ranged from peer-reviewed journal papers to the literal infrastructure "news of the day", and our use of those materials is in part responsible for the framework of this text. We also invited a few practicing infrastructure managers to guest lecture on their own activities, problems and successes as practitioners in this sub-discipline. 
We always included a tour of campus infrastructure, visiting utility tunnels, rooftops, and mechanical rooms – spaces not generally open to students. The text features discussions and materials covering the following infrastructure management related topics: Introduction to Infrastructure Management, Importance of Infrastructure, Goals for Infrastructure Management, Role of Infrastructure Managers, Organizations for Infrastructure Management, Asset Management Process, Inventory, Inspection and Condition Assessment, Deterioration Modeling, Decision Making and Forecasting from Condition Assessment, Regression Models, Markov Deterioration Models, Artificial Neural Network Deterioration Models, Failure Rates and Survival Probabilities, Fault Tree Analysis, Optimization and Decision Making, Performance, Usage, Budget and Cost Functions, Short Run Cost Functions for Infrastructure, Life Cycle Costs, Long Run Investment Decisions and Cost Functions, Decision Analysis and Monte Carlo Simulation for Investment Decisions, Interdependence & Resiliency, Contract and Workflow Management, Commissioning New Facilities, Benchmarking and Best Practices. In addition, the text examines in some detail the following infrastructure systems: Roadway Infrastructure, Building Infrastructure, Water Infrastructure, Telecommunications Infrastructure, Electricity Power Generation Infrastructure, Base, Campus, Park and Port Infrastructure. This list is certainly not exhaustive, and in fact, the various strategies, techniques and tools expressed in the text can be readily extended to virtually any infrastructure system. The second edition is available at: https://doi.org/10.1184/R1/5334379.v1. The third edition is available at: https://doi.org/10.1184/R1/5334379.v2.
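As a taste of one of the analysis topics listed above, the following sketch shows a minimal Markov deterioration model: a facility's condition evolves among discrete states according to a transition probability matrix, and the state distribution after a number of years is obtained by repeated multiplication. The four states and all transition probabilities below are invented for illustration.

```python
# Minimal Markov deterioration model sketch. States: 0 (good) .. 3 (failed).
# Transition probabilities are assumptions for illustration only.
import numpy as np

P = np.array([
    [0.80, 0.20, 0.00, 0.00],   # good  -> stays good or drops to fair
    [0.00, 0.70, 0.30, 0.00],   # fair  -> stays fair or drops to poor
    [0.00, 0.00, 0.60, 0.40],   # poor  -> stays poor or fails
    [0.00, 0.00, 0.00, 1.00],   # failed is absorbing
])

state = np.array([1.0, 0.0, 0.0, 0.0])  # new facility: all mass in "good"
for year in range(10):
    state = state @ P               # propagate one year of deterioration

print(state)  # probability of each condition state after 10 years
```

The same propagation underlies budget forecasting: multiplying the state distribution by per-state maintenance costs yields an expected annual cost trajectory.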
Israel is a country in the Near East, 95% of which consists of arid regions, with the Negev Desert covering some 60% of the territory. Water resources are therefore scarce and are formed mostly by atmospheric precipitation. Over the period from 1989 to 2005, average precipitation was 6 billion cu. m, of which 60–70% evaporated soon after rainfall, at least 5% ran off via rivers to the sea (mostly in winter) and the remaining 25% infiltrated into the soil, from where the greater part reached the sea as groundwater flow. Israel has two groups of water resources: surface and underground. The country is not rich in surface water. Its natural reservoir of surface fresh water is Lake Kinneret in the northeast, fed by the Jordan River and its tributaries. The average annual amount of available water from this lake is around 370 million cu. m, which accounts for one-third of the country's water needs and an even higher share of its drinking water needs. The greater part of fresh water (37% of Israel's water supply as of 2011) comes from groundwater sources. Owing to the insufficiency of available natural resources, the unevenness of precipitation across years and seasons, and population and economic growth, the provision of quality drinking water to the population, agriculture and industry, as well as the rehabilitation of the natural environment, are causes of permanently growing concern. In view of the water shortage, sustained efforts have been made to improve irrigation efficiency and to reduce water use through better irrigation techniques and advanced system management approaches. Water-saving technologies applied in Israel include drip irrigation, advanced filtration, modern methods of detecting leaks in water networks, and rainwater collection and processing systems. 
At the same time, measures such as water metering, water pricing policy, a shift to cultivation of high-value crops, thermal water recycling, and computer-based, remotely controlled irrigation are also applied. The search for new techniques of fresh water production continues. Water saving is considered the most reliable and least costly way to increase the country's water resources, and this task is being pursued in all sectors. In 1964 the National Water Carrier of Israel was completed. The main task of this project is to reliably compensate for differences in water availability between regions (north and south), between seasons (summer and winter) and between years (with sufficient and insufficient precipitation). In 1999 the Israeli government initiated a long-term, large-scale program of seawater desalination to produce drinking water for internal use. Reverse osmosis was adopted as the basic technique for desalinating brackish water and seawater. Currently five seawater desalination plants produce about 600 million cu. m of desalinated water per year, equivalent to approximately 42% of the country's drinking water needs. Israel adopted the General Plan of Water Economy Development for 2010–2050, which envisages complete coverage of the water deficit through full wastewater treatment and construction of additional seawater desalination facilities with a capacity of up to 1500 million cu. m by 2050. Any additional desalinated water that becomes available in these years will be used to replenish the water supply in Israel.
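The desalination figures quoted above can be cross-checked with a line of arithmetic: if 600 million cu. m per year covers about 42% of drinking water needs, the implied total demand is of the same order as the planned desalination capacity.

```python
# Back-of-the-envelope check of the desalination figures quoted above.
desalinated = 600   # million cu. m of desalinated water per year (five plants)
share = 0.42        # stated share of the country's drinking water needs

implied_demand = desalinated / share
print(f"Implied drinking water demand: {implied_demand:.0f} million cu. m/yr")
```

The result, roughly 1430 million cu. m per year, is consistent in magnitude with the up-to-1500 million cu. m desalination capacity planned for 2050.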
The Agreement on the Conservation of African-Eurasian Migratory Waterbirds (AEWA) aims at the conservation of migratory waterbird populations through the joint production of knowledge and collaborative, sustainable management. Within this framework, the RESSOURCE project (Strengthen Southern Sahara Expertise on Birds and their Rational Use for Communities and their Environment), funded by FAO, the EU and FFEM, was developed. The aim of the project is to improve knowledge of waterbirds and their use in order to promote better management of waterbird populations and habitats on the following sites: the Senegal River Delta (Senegal), the Inner Niger Delta (Mali), Lakes Chad and Fitri (Chad), the Nile Delta and Lake Nasser (Egypt) and the Khor Abu Habil (Sudan). Component 3 of the project focuses on uses of waterbirds for which there is little or no data available: ecotourism, sport hunting, recreation, and food or commercial hunting. Our aim is to address the local and national socio-economic impacts of these sectors. Therefore, to strengthen the conservation of waterbirds and Sahelian wetlands, the project aims to build national observatories. Defined as information systems, these observatories facilitate a shared understanding of the issues, support decision-making, promote collective and coherent action, and organize the management of information flows and their links to actions. This information is derived from data collected according to a series of indicators generated by the same collection process; facilitating such collection therefore also facilitates the creation of these observatories. New technologies and social networks are now considered effective tools for conservation. They can contribute significantly to the collection and sharing of data in real time. Today, a number of free and open tools are used for species conservation against the illegal wildlife trade (Traffic 2017). 
Experiments have shown that SMS may be limited in terms of collection and transfer of complex information (Le Bel, Chavernac et al. 2014). It is therefore necessary to choose the technology best suited to producing direct data in real time. To meet our objectives of building observatories and of managing complex data and information flows for sharing, decision-making and collective action, we turned to the OpenDataKit system, with KoBoToolbox and its KoBoCollect Android smartphone application developed by the Harvard Humanitarian Initiative and OCHA (Kreutzer 2014). The tool supports the following four main actions: i) collecting field data on a smartphone, ii) centralizing the data on the Internet platform, iii) analyzing the data (automatic pre-analysis and downloading to spreadsheet) and iv) giving data back to the field. This tool does not require any qualification or advanced technical knowledge. It can be used in areas without internet coverage because, once collected, the data is stored on the smartphone; when the investigator next has internet access, the data can be synchronized to the platform. Since the tool was previously used for inter-stakeholder study and decision-making projects on human-wildlife conflict mitigation (Le Bel, Chavernac et al. 2016), we benefit from the experience and advice of a growing 'community'. Likewise, since the project is long-term, it aims at the transfer and appropriation of this digital tool by national scientific, administrative and citizen actors, which can improve consultation and decision-making. The use of KoBoCollect involves a three-step process. Following a first exploratory survey in November 2016, we were able to identify socio-cultural, economic and legal indicators and logistical constraints (in particular the fear of speaking and contradictory statements) and to design a first questionnaire. 
During the second phase of co-construction, in January 2017 with the national consultants, a questionnaire with five themes and two entry points on waterbird utilization, trade and consumption was tested in order to design the survey. The third step was to develop an electronic form under KoBoCollect, to test and improve it, and to train the team of enumerators in digital note-taking. The survey at Lakes Chad and Fitri took place in 20 villages (360 interviews) from 5 March to 9 April 2017. The information collected is intended to test hypotheses on the following behaviors: food choice, food access, income, leisure and pleasure. We collected quantitative and qualitative data through a set of questions focusing on the multiple uses of waterbirds in these territories. Additional questions were added on stakeholders' perceptions and knowledge of wetlands, waterbirds, migration, legislation and hunting. The first Chadian data is currently stored on the KoBoToolbox site and nearly ready for analysis. The innovation within this socio-economic, socio-cultural and cognitive survey protocol is the use of KoBoCollect as a co-production tool. This digital interface has several advantages: speed of access to collected data and of its sharing, standardization of qualitative data for statistical analysis, pre-analysis carried out on the KoBoToolbox storage platform, compilation and creation of a database, remote access and data sharing, and finally solution sharing by the "KoBoToolbox community". In this way, it could constitute the first step toward building the observatory. Because of these facilities, and by attempting to minimize the bias of protocol changes, we can improve the collection with the investigators more reactively during the process. This saves time and meets scientific and financial objectives. 
The consultants appropriated the tool quickly and without specific problems, although they mentioned difficulties navigating within the digital questionnaire, much as can occur in an interview with a paper questionnaire and handwritten notes. Similarly, the challenge of collecting qualitative, literal and complex data with a digital interface remains to be considered. Field tests have thus enabled us to identify important points for optimizing and improving the tool whenever possible; for the moment these points concern software bugs and the correspondence of the environment between the website and the smartphones. Since the African continent has a high level of mobile penetration and the highest mobile growth rate (Aker and Mbiti 2010), this type of mobile data collection system could be an efficient tool for strengthening the monitoring and management of natural resources in terms of speed and access. However, for these gains to be realized, the choice of such a tool must take into account: knowledge of the terrain and its constraints (energy, internet coverage and GSM), co-construction of the questionnaire, standardization of the data collection, numerous field tests, a simplified questionnaire for simplified navigation within the application, and the relationship between the time spent on direct digitization of data and the time spent on preparation and digital collection.
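The offline collection and later synchronization pattern described above (store records on the device, push them to the platform when a connection appears) can be sketched as a simple offline-first queue. The class and function names below are illustrative only and are not part of KoBoCollect's actual API.

```python
# Illustrative offline-first queue, sketching the collect-then-sync pattern.
# LocalStore, save and sync are hypothetical names, not KoBoCollect's API.
import json

class LocalStore:
    def __init__(self):
        self.pending = []          # records awaiting synchronization

    def save(self, record):
        """Store a survey record locally, regardless of connectivity."""
        self.pending.append(record)

    def sync(self, upload, online):
        """Try to push pending records; keep them if offline or upload fails."""
        if not online:
            return 0
        sent = 0
        for record in list(self.pending):
            if upload(json.dumps(record)):
                self.pending.remove(record)
                sent += 1
        return sent

store = LocalStore()
store.save({"village": "A", "species": "duck", "use": "food"})
store.save({"village": "B", "species": "goose", "use": "trade"})
print(store.sync(upload=lambda payload: True, online=False))  # 0: still offline
print(store.sync(upload=lambda payload: True, online=True))   # 2: both synced
```

Keeping failed uploads in the queue is what makes the pattern robust in areas with intermittent GSM or internet coverage.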
The main objective of this thesis is to develop analysis and mitigation techniques that can be used to counter the effects of radiation-induced soft errors: external and internal disturbances produced by radioactive particles that affect the reliability and safe operation of complex microelectronic circuits. This thesis aims to provide industrial solutions and methodologies for terrestrial application areas requiring the utmost reliability (telecommunications, medical devices, etc.), complementing previous work on soft errors traditionally oriented toward aerospace, nuclear and military applications. The work presented uses a decomposition of the error sources inside current circuits to highlight the most important contributors. Single Event Effects in sequential logic cells represent the current target for analysis and improvement efforts in both industry and academia. This thesis presents a state-aware analysis methodology that improves the accuracy of Soft Error Rate (SER) data for individual sequential instances based on the circuit and application. Furthermore, the intrinsic imbalance between the SEU (Single Event Upset) susceptibility of different flip-flop states is exploited to implement a low-cost SER improvement strategy. Single Event Transients (SETs) affecting combinational logic are considerably more difficult to model, simulate and analyze than the closely related Single Event Upsets. The working environment may cause a myriad of distinctive transient pulses in the various cell types that are used in widely different configurations. This thesis presents a practical approach to an exhaustive Single Event Transient evaluation flow in an industrial setting. 
The main steps of this process are to: a) fully characterize the standard cell library using a process- and library-aware SER tool, b) evaluate SET effects in the logic networks of the circuit using a variety of dynamic (simulation-based) and static (probabilistic) methods, and c) compute overall SET figures taking into account the particularities of the circuit's implementation and its environment. Fault injection remains the primary method for analyzing the effects of soft errors. This document presents the results of a functional analysis of a complex CPU; three representative benchmarks were considered for this analysis. Accelerated simulation techniques (probabilistic calculations, clustering, parallel simulations) have been proposed and evaluated in order to develop an industrial validation environment able to take very complex circuits into account. The results obtained allowed the development and evaluation of a hypothetical mitigation scenario that aims to significantly improve the reliability of the circuit at the lowest cost. They show that the SDC (Silent Data Corruption) and DUE (Detectable Uncorrectable Errors) error rates can be significantly reduced by hardening a small part of the circuit (selective mitigation). In addition to the main axis of research, some tangential topics were studied in collaboration with other teams. One of these consisted of studying a technique for the mitigation of flip-flop soft errors through an optimization of the Temporal De-Rating (TDR) by selectively inserting delay on the input or output of flip-flops. The methodologies, algorithms and CAD tools proposed and validated as part of this work are intended for industrial use and have been included in a commercial CAD framework that offers a complete solution for assessing the reliability of complex circuits and electronic systems. 
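The state-aware idea described above can be illustrated with a toy calculation: an instance's soft error rate is the occupancy-weighted sum of its per-state SEU rates, rather than a state-blind average. The FIT values and occupancy fractions below are invented for illustration and are not from the thesis.

```python
# Toy illustration of state-aware SER estimation. All numbers are assumed.

def state_aware_ser(ser_per_state, state_occupancy):
    """Instance SER as the occupancy-weighted sum of per-state SEU rates."""
    return sum(r * p for r, p in zip(ser_per_state, state_occupancy))

ser = [120.0, 40.0]      # assumed FIT when the flip-flop holds 0 vs holds 1
naive = sum(ser) / 2     # state-blind estimate: implicit 50/50 occupancy
biased = state_aware_ser(ser, [0.1, 0.9])  # instance mostly holds '1'
print(naive, biased)     # → 80.0 48.0
```

The gap between the two estimates also hints at the low-cost improvement strategy mentioned above: if an application can be biased toward the less susceptible state, the effective SER drops with no hardware hardening.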
The technological advances and success of Service-Oriented Architectures and the Cloud computing paradigm have produced a revolution in Information and Communications Technology (ICT). Today, a wide range of services are provisioned to users in a flexible and cost-effective manner, thanks to the encapsulation of several technologies within modern business models. These services not only offer high-level software functionalities such as social networks or e-commerce but also middleware tools that simplify application development, and low-level data storage, processing and networking resources. Hence, with the advent of the Cloud computing paradigm, today's ICT allows users to completely outsource their IT infrastructure and benefit significantly from the economies of scale. At the same time, with the widespread use of ICT, the amount of data being generated, stored and processed by private companies, public organizations and individuals is rapidly increasing. The in-house management of data and applications is proving to be highly cost intensive, and Cloud computing is becoming the destination of choice for an increasing number of users. As a consequence, Cloud computing services are being used to realize a wide range of applications, each having unique dependability and Quality-of-Service (QoS) requirements. For example, a small enterprise may use a Cloud storage service as a simple backup solution, requiring high data availability, while a large government organization may execute a real-time mission-critical application using the Cloud compute service, requiring high levels of dependability (e.g., reliability, availability, security) and performance. Service providers are presently able to offer sufficient resource heterogeneity, but are failing to satisfy users' dependability requirements, mainly because failures and vulnerabilities in Cloud infrastructures are the norm rather than the exception. 
This thesis provides a comprehensive solution for improving the dependability of Cloud computing, so that users can justifiably trust Cloud computing services for building, deploying and executing their applications. A number of approaches, ranging from the use of trustworthy hardware to secure application design, have been proposed in the literature. The solution proposed here consists of three inter-operable yet independent modules, each designed to improve dependability under a different system context and/or use-case. A user can selectively apply a single module or combine them suitably to improve the dependability of her applications both at design time and at runtime. Based on the modules applied, the overall solution can increase dependability at three distinct levels. In the following, we provide a brief description of each module. The first module comprises a set of assurance techniques that validates whether a given service supports a specified dependability property with a given level of assurance and, accordingly, awards it a machine-readable certificate. To achieve this, we define a hierarchy of dependability properties, where a property represents the dependability characteristics of the service and its specific configuration. A model of the service is also used to verify the validity of the certificate using runtime monitoring, thus accommodating the dynamic nature of the Cloud computing infrastructure and making the certificate usable both at discovery time and at runtime. This module also extends the service registry to allow users to select services with a set of certified dependability properties, hence offering the basic support required to implement dependable applications. We note that this module directly considers services implemented by service providers and provides awareness tools that allow users to be aware of the QoS offered by potential partner services. 
We denote this passive technique as the solution that offers the first level of dependability in this thesis. Service providers typically implement a standard set of dependability mechanisms that satisfy the basic needs of most users. Since each application has unique dependability requirements, assurance techniques alone are not always effective, and a proactive approach to dependability management is also required. The second module of our solution advocates the innovative approach of offering dependability as a service to users' applications and realizes a framework containing all the mechanisms required to achieve this. We note that this approach relieves users from implementing low-level dependability mechanisms and system management procedures during application development and satisfies the specific dependability goals of each application. We denote the module offering dependability as a service as the solution that offers the second level of dependability in this thesis. The third and last module of our solution concerns secure application execution. This module considers complex applications and presents advanced resource management schemes that deploy applications with improved optimality compared to the algorithms of the second module. It improves the dependability of a given application by minimizing its exposure to existing vulnerabilities, while being subject to the same dependability policies and resource allocation conditions as in the second module. Our approach to secure application deployment and execution denotes the third level of dependability offered in this thesis. The contributions of this thesis can be summarized as follows.
• With respect to assurance techniques, our contributions are: i) definition of a hierarchy of dependability properties, an approach to service modeling, and a model transformation scheme; ii) definition of a dependability certification scheme for services; iii) an approach to service selection that considers users' dependability requirements; iv) definition of a solution to dependability certification of composite services, where the dependability properties of a composite service are calculated on the basis of the dependability certificates of its component services.
• With respect to offering dependability as a service, our contributions are: i) definition of a delivery scheme that transparently functions on users' applications and satisfies their dependability requirements; ii) design of a framework that encapsulates all the components necessary to offer dependability as a service to users; iii) an approach to translating high-level user requirements into low-level dependability mechanisms; iv) formulation of constraints that allow enforcement of deployment conditions inherent to dependability mechanisms, and an approach to satisfying such constraints during resource allocation; v) a resource management scheme that masks the effect of system changes by adapting the current allocation of the application.
• With respect to security management, our contributions are: i) an approach that deploys users' applications in the Cloud infrastructure such that their exposure to vulnerabilities is minimized; ii) an approach to building interruptible elastic algorithms whose optimality improves as processing time increases, eventually converging to an optimal solution.
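The security-management contribution above, deploying applications so that exposure to vulnerabilities is minimized, can be illustrated with a simple greedy placement sketch. This is a hedged toy version, not the thesis's algorithm: host names, vulnerability scores, and the single capacity constraint are all invented, and the thesis's interruptible elastic algorithms would refine such a placement toward optimality.

```python
# Illustrative greedy sketch: place application tasks on the feasible host
# with the lowest vulnerability score, subject to a CPU capacity constraint.
def deploy(tasks, hosts):
    """tasks: {name: cpu_demand}; hosts: {name: (capacity, vuln_score)}.
    Returns {task: host}, assigning larger tasks first."""
    free = {h: cap for h, (cap, _) in hosts.items()}
    placement = {}
    for task, demand in sorted(tasks.items(), key=lambda kv: -kv[1]):
        candidates = [h for h in hosts if free[h] >= demand]
        if not candidates:
            raise ValueError(f"no host can fit task {task!r}")
        best = min(candidates, key=lambda h: hosts[h][1])  # least-vulnerable fit
        placement[task] = best
        free[best] -= demand
    return placement

hosts = {"h1": (4, 0.7), "h2": (4, 0.2), "h3": (2, 0.1)}
tasks = {"web": 2, "db": 3, "cache": 1}
print(deploy(tasks, hosts))
```

Here the large `db` task cannot fit on the least-vulnerable host `h3`, so it lands on `h2`; an anytime (interruptible) algorithm would keep improving such an assignment as processing time allows.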
This set of three case studies explores the intersection of openness, digital governance, and high-quality information in Estonia, Finland, and Norway with the aim of identifying lessons that will support the same objectives in lower-resource countries. Openness, a key aspect of the international agenda for increasing transparency and accountability, for reducing public-sector corruption, and for strengthening economic performance, rests on the principle that citizens have a right to know what their governments are doing and to benefit from using government information. Goals for open, accountable, and inclusive governance rest on the assumption that trustworthy information is available and can be shared meaningfully through strategies for digital governance. This assumption needs to be examined. Does reliable and complete information exist across lower-resource countries? Can it be accessed readily? Will it survive through time?
Agglomeration economies are among the most important factors in increasing firm productivity. However, there is little evidence supporting this in Africa. Using the firm registry database in Tanzania, this paper examines a new application of the logit approach that takes two empirical issues into account: spatial autocorrelation and the endogeneity of infrastructure placement. The paper finds significant agglomeration economies. It also finds that firms are more likely to be located where local connectivity and access to markets are good. Dealing with infrastructure endogeneity and spatial autocorrelation in the empirical model proves important: according to the exogeneity test, the infrastructure variables are likely endogenous, and the spatial autoregressive term is significant. As expected, therefore, there are positive externalities of firm location choice on neighboring areas.
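The logit approach the abstract describes can be sketched in miniature. This is a stdlib-only illustration on synthetic data, not the paper's estimator: the covariates `x1` (infrastructure access) and `x2` (neighbors' firm share, standing in for the spatial-lag term) and all coefficient values are invented, and correcting for infrastructure endogeneity would additionally require an instrumental-variables step not shown here.

```python
# Toy binary logit of firm location choice, fit by gradient ascent on the
# log-likelihood. Synthetic data; true slopes are 2.0 (infrastructure) and
# 1.5 (spatial lag), so both estimates should come out positive.
import math, random

random.seed(0)
n = 500
data = []
for _ in range(n):
    x1, x2 = random.random(), random.random()   # infrastructure, spatial lag
    p = 1 / (1 + math.exp(-(-1.0 + 2.0 * x1 + 1.5 * x2)))
    data.append((x1, x2, 1 if random.random() < p else 0))

b0 = b1 = b2 = 0.0
lr = 0.05
for _ in range(2000):
    g0 = g1 = g2 = 0.0
    for x1, x2, y in data:
        p = 1 / (1 + math.exp(-(b0 + b1 * x1 + b2 * x2)))
        g0 += y - p; g1 += (y - p) * x1; g2 += (y - p) * x2
    b0 += lr * g0 / n; b1 += lr * g1 / n; b2 += lr * g2 / n

print(round(b1, 2), round(b2, 2))  # both slope estimates should be positive
```

In practice one would use a packaged estimator with standard errors (and a spatial-autoregressive specification) rather than hand-rolled gradient ascent; the sketch only shows how location-choice covariates enter the logit.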
Ethiopia's rapid and consistent economic growth over the past decade has contributed to reducing the number of people living in poverty. The Government of Ethiopia has created the Growth and Transformation Plan (GTP), focusing on two overarching themes: fostering competitiveness and employment, and enhancing resilience and reducing vulnerabilities. This plan recognizes that for poverty reduction and economic growth to be sustainable, the Ethiopian economic structure will have to undergo a fundamental transformation. In accordance with its focus on poverty reduction and economic growth, the GTP has identified five main levers for change: public-sector investment in infrastructure to lay the ground for private-sector development, enhancement of policies and regulations to provide an environment conducive to competitiveness and productivity, expanded access to credit for small and medium-size enterprises, provision of training and education to augment the supply of skilled labor, and improved access to land. Technological adoption and innovation will play a crucial role in delivering the goals laid out in the GTP. Participation in foreign markets also induces firms to become more innovative, a phenomenon known as learning through exporting, as observed among Ethiopian leather exporters. Another vital determinant of innovative activity is the accumulation of human capital and the skill level of the workforce. This study seeks: (i) to empirically analyze the extent of innovative activities that formal firms are undertaking in Ethiopia; (ii) to conduct a review of the existing innovation landscape; and (iii) to identify opportunities to foster innovations at the base of the pyramid (BoP) in Ethiopia. The study is structured as follows: chapter one gives the introduction.
Chapter two provides a detailed overview of the characteristics of growth and innovation, offering key insights from the enterprise survey analysis on the characteristics, motivations, operational and market environment, and constraints of innovators in Ethiopia. Chapter three assesses the innovation landscape in Ethiopia by looking at the governmental and private agencies responsible for promoting innovation, as well as active programs and donor initiatives that may play a role in promoting firm-level and pro-poor innovations. Chapter four provides policy recommendations to promote innovation in Ethiopia, both at the firm level and in the form of pro-poor initiatives.
For the purposes of this project, the East African countries included in the study were Kenya, Rwanda, Tanzania, and Uganda. The focus of the project was Small and Medium-sized Enterprises (SMEs), defined as for-profit or nonprofit organizations with fewer than 50 employees and annual revenues/turnover not exceeding USD 1,000,000. The main output of the project was a proposed program of interventions to drive transformational change. To succeed in this ambitious endeavor, the project articulated clear objectives and designed a blueprint for implementation, including levels of resourcing, budget, and monitoring metrics. Over the course of the project, the team conducted brief surveys with over 90 entrepreneurs, over 50 percent of whom had 3-10 years of experience in the Information and Communication Technology (ICT) sector and primarily worked at companies with 5 employees or fewer.
In: Decision Analysis: A Journal of the Institute for Operations Research and the Management Sciences, INFORMS, Volume 8, Issue 2, pp. 158-162
ISSN: 1545-8504
Ali E. Abbas (" Decomposing the Cross Derivatives of a Multiattribute Utility Function into Risk Attitude and Value ") received the M.S. degree in electrical engineering, the M.S. degree in engineering economic systems and operations research, the Ph.D. degree in management science and engineering, and the Ph.D. (minor) degree in electrical engineering, all from Stanford University, Stanford, California. He was a lecturer in the Department of Management Science and Engineering at Stanford. He previously worked for Schlumberger Oilfield Services, where he held several international positions in wireline logging, operations management, and international training. He was also involved with several consulting projects for mergers and acquisitions in California, and was a co-teacher of several executive seminars on decision analysis at Strategic Decisions Group, Menlo Park, California. He is currently an associate professor in the Department of Industrial and Enterprise Systems Engineering, University of Illinois at Urbana–Champaign, Champaign. His research interests include utility theory, decision making with incomplete information and preferences, dynamic programming, and information theory. Dr. Abbas is a member of INFORMS, a senior member of the IEEE, an associate editor for Decision Analysis and Operations Research, and an editor of the DA column in education for Decision Analysis Today. Address: Department of Industrial and Enterprise Systems Engineering, College of Engineering, University of Illinois at Urbana–Champaign, 117 Transportation Building, MC-238, 104 South Mathews Avenue, Urbana, IL 61801; e-mail: aliabbas@uiuc.edu . Vicki M. 
Bier (" Deterring the Smuggling of Nuclear Weapons in Container Freight Through Detection and Retaliation ") holds a joint appointment as professor in the Department of Industrial and Systems Engineering and the Department of Engineering Physics at the University of Wisconsin–Madison, where she chairs the Department of Industrial and Systems Engineering. She has directed the Center for Human Performance and Risk Analysis (formerly the Center for Human Performance in Complex Systems) since 1995. She has more than 20 years of experience in risk analysis for the nuclear power, chemical, petrochemical, and aerospace industries. Before returning to academia, she spent seven years as a consultant at Pickard, Lowe and Garrick, Inc. While there, her clients included the U.S. Nuclear Regulatory Commission, the U.S. Department of Energy, and a number of nuclear utilities, and she prepared testimony for Atomic Safety and Licensing Board hearings on the safety of the Indian Point nuclear power plants. Dr. Bier's current research focuses on applications of risk analysis and related methods to problems of security and critical infrastructure protection, under support from the Department of Homeland Security. She is also currently serving as a special term appointee for the Infrastructure Assurance Center at Argonne National Laboratory. Address: Department of Industrial and Systems Engineering, 1513 University Avenue, University of Wisconsin–Madison, Madison, WI 53706; e-mail: bier@engr.wisc.edu . Robert F. Bordley (" Using Bayes' Rule to Update an Event's Probabilities Based on the Outcomes of Partially Similar Events ") is an INFORMS Fellow and a winner of the best publication award from the Decision Analysis Society as well as five major application awards from General Motors. He is a General Motors Technical Fellow with experience in research, planning, quality, marketing, corporate strategy, and procurement. 
He is also an adjunct professor at the University of Michigan, Ann Arbor, and was formerly program director of Decision, Risk and Management Sciences at the National Science Foundation. Dr. Bordley has published 75 papers in decision analysis, marketing, and operations management. He has also served as chair of the American Statistical Association's Risk Analysis Section (which now has 1000 members), vice president of the Production and Operations Management Society, and a member of the INFORMS Board and the Decision Analysis Society Council. He earned a Ph.D. and M.S. in operations research and an M.B.A. in finance from the University of California, Berkeley. His primary interests have been in theoretical developments enabling high-impact application of decision analysis in a wide variety of corporate contexts (e.g., engineering design, corporate strategy, procurement, program management, etc.). Address: General Motors, Pontiac Centerpoint Campus North, 585 South Boulevard, Pontiac, MI 48341; e-mail: robert.bordley@gm.com , rbordley@umich.edu . Heidi M. Crane (" Whether to Retest the Lipids of HIV-Infected Patients: How Much Does Fasting Bias Matter? ") is an assistant professor of medicine at the University of Washington (UW) School of Medicine and the associate director of Clinical Epidemiology and Health Services Research at the UW Center for AIDS Research (CFAR), which promotes research comparing the effectiveness of management strategies for HIV-infected patients in routine clinical practice. She is co–principal investigator (PI) of a PROMIS (Patient-Reported Outcomes Measurement Information Systems) National Institutes of Health Roadmap initiative U01 on measuring patient reported outcomes in clinical care for HIV-infected patients and PI of a National Institute of Mental Health R01 project on measuring and improving adherence for HIV-infected patients in clinical care. 
She is also medical director of the Madison HIV Metabolic clinic, PI of an American Heart Association grant on myocardial infarction and metabolic complications among patients with HIV, and PI of an Agency for Healthcare Research and Quality grant on comparative effectiveness of antihypertensive and lipid-lowering medication among HIV-infected patients. She provides care and training in the clinical care of HIV-infected individuals, and she also mentors junior investigators in HIV research in the UW Division of Infectious Diseases. Dr. Crane is a member of the Data Management Centers for the National Institute of Allergy and Infectious Diseases–funded CFAR Network of Integrated Clinical Systems (CNICS) research platform of real-time electronic health record data for 22,000 patients from eight CFARs across the United States, and the International Epidemiological Databases to Evaluate AIDS project's North American AIDS Cohort Collaboration on Research and Design (NA-ACCORD), which merges data on 110,000 HIV-infected individuals in care at 60 sites across the United States and Canada. Dr. Crane leads the CNICS Patient Reported Outcomes Committee and the CNICS and NA-ACCORD myocardial infarction event adjudication teams. Dr. Crane's research focuses on methods to improve clinical care for HIV-infected individuals as well as metabolic and other chronic comorbidities of HIV. She received her internal medicine residency training from Barnes and Jewish Hospitals, and her B.A., B.S., M.D., M.P.H. and Infectious Disease Fellowship training from the UW. Address: Harborview Medical Center, 325 9th Avenue, Box 359931, Seattle, WA 98104; e-mail: hcrane@u.washington.edu . Naraphorn Haphuriwat (" Deterring the Smuggling of Nuclear Weapons in Container Freight Through Detection and Retaliation ") is a researcher at the National Metal and Materials Technology Center in Thailand. 
She applies tools including optimization, decision analysis, and process simulation to improve production processes and operations for small and medium enterprises. She earned her Ph.D. from the University of Wisconsin–Madison in the Department of Industrial and Systems Engineering in August 2010. During her doctoral study, she was supported by the Center for Risk and Economic Analysis of Terrorism Events (CREATE) at the University of Southern California, where she conducted game-theoretic studies in the applications of security. She also received an honorable mention in the 2004–2005 University Book Store Academic Excellence Award Competition for a project related to computer security. Address: 114 Thailand Science Park, Paholyothin Road, Klong 1, Klong Luang, Pathumthani 12120, Thailand; e-mail: naraphoh@mtec.or.th . Joseph B. Kadane (" Whether to Retest the Lipids of HIV-Infected Patients: How Much Does Fasting Bias Matter? ") is Leonard J. Savage University Professor of Statistics and Social Sciences, Emeritus, at Carnegie Mellon University. His research focus is on both foundational issues of Bayesian analysis and applications in many settings. These currently include physics, phylogenetics, air pollution, Internet security, law, and medicine, as well as Internet auctions. He also serves as an expert witness in legal matters. Address: Tepper School of Business, Carnegie Mellon University, 5000 Forbes Avenue, Pittsburgh, PA 15213; e-mail: kadane@andrew.cmu.edu . L. Robin Keller (" From the Editors: Deterrence, Multiattribute Utility, and Probability and Bayes' Updating ") is a professor of operations and decision technologies in the Merage School of Business at the University of California, Irvine. She received her Ph.D. and M.B.A. in management science and her B.A. in mathematics from the University of California, Los Angeles. She has served as a program director for the Decision, Risk, and Management Science Program of the U.S. 
National Science Foundation (NSF). Her research is on decision analysis and risk analysis for business and policy decisions and has been funded by NSF and the U.S. Environmental Protection Agency. Her research interests cover multiple-attribute decision making, riskiness, fairness, probability judgments, ambiguity of probabilities or outcomes, risk analysis (for terrorism, environmental, health, and safety risks), time preferences, problem structuring, cross-cultural decisions, and medical decision making. She is currently Editor-in-Chief of Decision Analysis, published by the Institute for Operations Research and the Management Sciences (INFORMS). She is a Fellow of INFORMS and has held numerous roles in INFORMS, including board member and chair of the INFORMS Decision Analysis Society. She is a recipient of the George F. Kimball Medal from INFORMS. She has served as the decision analyst on three National Academy of Sciences committees. Address: The Paul Merage School of Business, University of California, Irvine, Irvine, CA 92697-3125; e-mail: lrkeller@uci.edu . Mari M. Kitahata (" Whether to Retest the Lipids of HIV-Infected Patients: How Much Does Fasting Bias Matter? ") is professor of medicine at the University of Washington (UW) School of Medicine, director of Clinical Epidemiology and Health Services Research at the Center for AIDS Research (CFAR), and principal investigator of the UW HIV Cohort. She has provided care and training in the clinical management of HIV-infected individuals for two decades, and she mentors investigators in HIV research in the UW Division of Infectious Diseases. Dr. Kitahata studies the outcomes of care for persons with HIV infection, and her research has elucidated key determinants of increased survival, including care managed by physicians with HIV expertise and earlier initiation of antiretroviral treatment. 
The need for observational research to complement the invaluable information provided by randomized controlled trials has grown tremendously; recognizing this, she established the CFAR Clinical Epidemiology and Health Services Research program at UW in 1995, making it among the first CFARs in the United States to do so. Dr. Kitahata developed the structure and methods to merge comprehensive HIV patient data and biological specimens from multiple settings into a powerful resource for researchers conducting basic, translational, clinical outcomes/comparative effectiveness, and behavioral/prevention research. She has led efforts to establish networks of national and international HIV research collaborations to address the most pressing questions regarding treatment and outcomes for HIV-infected individuals that cannot be answered through smaller cohort studies. Dr. Kitahata directs the Data Management Centers for the National Institute of Allergy and Infectious Diseases–funded CFAR Network of Integrated Clinical Systems (CNICS) research platform of real-time electronic health record (EHR) data for 22,000 patients from eight CFARs across the United States, and the International Epidemiological Databases to Evaluate AIDS project's North American AIDS Cohort Collaboration on Research and Design (NA-ACCORD), which merges data on 110,000 HIV-infected individuals in care at 60 sites across the United States and Canada. Dr. Kitahata serves on the Board of Directors for the Infectious Diseases Society of America (IDSA) HIV Medicine Association, the U.S. Public Health Service/IDSA Guidelines Committee for Prevention of Opportunistic Infections, and the International Training and Education Center on HIV (I-TECH), where she developed a national EHR system for the Haitian Ministry of Health. Dr. Kitahata received her B.S. from Yale University, M.D. from the University of Pennsylvania, internal medicine residency training at the University of California, San Francisco, and M.P.H.
and Fellowship training at the University of Washington, where she was a Robert Wood Johnson Clinical Scholar. Address: University of Washington Medical Center, 1959 NE Pacific Street, UW Box 356423, Seattle, WA 98195-6423; e-mail: kitahata@u.washington.edu . Sanjeev R. Kulkarni (" Aggregating Large Sets of Probabilistic Forecasts by Weighted Coherent Adjustment ") is a professor in the Department of Electrical Engineering at Princeton University. He is also an affiliated faculty member in the Department of Operations Research and Financial Engineering and the Department of Philosophy. Prior to joining Princeton, he was a member of the technical staff at MIT Lincoln Laboratory. During his time at Princeton, he has held visiting or consulting positions with Australian National University, Susquehanna International Group, and Flarion Technologies. Professor Kulkarni has served as an associate editor for the IEEE Transactions on Information Theory, and he is a Fellow of the IEEE. His research interests include statistical pattern recognition, nonparametric statistics, learning and adaptive systems, information theory, wireless networks, and image/video processing. Address: School of Engineering and Applied Science, Princeton University, Princeton, NJ 08544; e-mail: kulkarni@princeton.edu . Daniel N. Osherson (" Aggregating Large Sets of Probabilistic Forecasts by Weighted Coherent Adjustment ") earned his Ph.D. in psychology at the University of Pennsylvania in 1973. Since then he has taught at Stanford University, the University of Pennsylvania, Massachusetts Institute of Technology, Università San Raffaele, Rice University, and Princeton University. His work centers on probability judgment and learning. Address: Department of Psychology, Princeton University, Princeton, NJ 08544; e-mail: osherson@princeton.edu . H.
Vincent Poor (" Aggregating Large Sets of Probabilistic Forecasts by Weighted Coherent Adjustment ") is dean of the School of Engineering and Applied Science at Princeton University, where he is also the Michael Henry Strater University Professor of Electrical Engineering. He holds a Ph.D. from Princeton. His research interests are in the areas of statistical signal processing, stochastic analysis, and information theory, and their applications to wireless networks and related fields. Among his publications in these areas are the recent books Quickest Detection (Cambridge University Press, 2009) and Information Theoretic Security (NOW Publishers, 2009). Dean Poor is a member of the U.S. National Academy of Engineering and the U.S. National Academy of Sciences, and he is a Fellow of the IEEE, the Institute of Mathematical Statistics, the American Academy of Arts and Sciences, and the Royal Academy of Engineering of the United Kingdom. A former Guggenheim Fellow, recent recognition of his work included the Institution of Engineering and Technology Ambrose Fleming Medal, the IEEE Eric E. Sumner Award, and an honorary D.Sc. from the University of Edinburgh. Address: School of Engineering and Applied Science, Princeton University, Princeton, NJ 08544; e-mail: poor@princeton.edu . Guanchun Wang (" Aggregating Large Sets of Probabilistic Forecasts by Weighted Coherent Adjustment ") received an undergraduate degree in electrical engineering at Shanghai Jiao Tong University. He is currently a Ph.D. student in the Department of Electrical Engineering at Princeton University. His research interests include statistical learning, information retrieval, and judgment aggregation. He also worked as a summer associate for McKinsey's technology practice. Address: School of Engineering and Applied Science, Princeton University, Princeton, NJ 08544; e-mail: guanchun@princeton.edu . Henry H. 
Willis (" Deterring the Smuggling of Nuclear Weapons in Container Freight Through Detection and Retaliation ") is a professor of policy analysis at the Pardee RAND Graduate School and the associate director of the RAND Homeland Security and Defense Center. His research has applied risk analysis tools to resource allocation and risk management decisions in the areas of public health and emergency preparedness, terrorism and national security policy, energy and environmental policy, and transportation planning. Dr. Willis serves on the editorial board of the journal Risk Analysis and served on the National Academies of Science Committee on Evaluating Testing, Costs, and Benefits of Advanced Spectroscopic Portals. He earned his Ph.D. from the Department of Engineering and Public Policy at Carnegie Mellon University and holds degrees in chemistry and environmental studies from the University of Pennsylvania (B.A.) and in environmental science from the University of Cincinnati (M.A.). Address: RAND Corporation, 4570 Fifth Avenue, Suite 600, Pittsburgh, PA 15213; e-mail: hwillis@rand.org . Xiting (Cindy) Yang (" Whether to Retest the Lipids of HIV-Infected Patients: How Much Does Fasting Bias Matter? ") completed her Ph.D. from Carnegie Mellon University in the area of elicitation, specifically focusing on elicitation of expert knowledge on phylogenies in the format of rooted trees. She is currently a statistical reviewer at the Center for Devices and Radiological Health, U.S. Food and Drug Administration. Her current research focuses on clinical trials and elicitation. Address: U.S. Food and Drug Administration, 10903 New Hampshire Avenue, Building 66, Room 2223, Silver Spring, MD 20993-0002; e-mail: xiting.yang@fda.hhs.gov .
Technical Report 2018-08-ECE-138. Enterprise Engineering - A Transdisciplinary Activity: Mapping IT to Core Competency. Rajani S. Sadasivam, Urcun J. Tanik, Murat M. Tanik. This technical report is a reissue of Technical Report 2002-09-ECE-007, issued December 2002. Department of Electrical and Computer Engineering, University of Alabama at Birmingham, August 2018.
ENGINEERING OF ENTERPRISES: A TRANSDISCIPLINARY ACTIVITY. Mapping Information Technology to Core Competency. Contributors: Murat M. Tanik, Rajani S. Sadasivam, Urcun J. Tanik.
1. ELEMENTS OF INTERNET ENTERPRISE ENGINEERING
1.1 Business Language Structure
Before any engineering takes place, the requirements of the project must be identified in as much detail as possible to satisfy the customer. Hence a customer-driven system is created, with the goal of ensuring that the end product is profitable. Business analysts should be able to accurately assess customer needs and break them down into manageable pieces for the technology analysts and engineering team, and that analysis can be done effectively on common ground with an object-oriented design language called the Unified Modeling Language (UML). Introduced in November 1997, UML quickly became the standard modeling language for software development and was later adapted by business analysts to systematically design business processes [15]. UML offers a business-model approach that provides a plan for engineering an orchestrated set of business functions. It provides a framework by which business is to be performed, allowing for changes and various improvements in the process.
The model is designed to anticipate changes in business function and adapt the software implementation accordingly so that a business can maintain a competitive edge. One of the advantages of modeling in UML is that it can visually depict functions, relationships, and paradigms. UML is recommended for business analysts to break down a large-scale business operation into its constituent parts for restructuring and design.
1.2 Strategic Guidance
One innovative approach to safely guiding an IEE venture through uncertain waters is the Cosmos model, proposed by Yeh [10]. It is a model designed to help a business manage change through holistic, three-dimensional modeling. An important aspect of this model is that the three dimensions exist interdependently: each dimension behaves as both an enabler of and an inhibitor to the other dimensions. The Cosmos model provides a conceptual tool for managers to guide their company along the best possible path by providing a structure for effective decision-making while navigating from one organizational situation to another over time. The managerial decision made at each point in the path determines the future course of the company and the optimal trade-offs along the way.
1.3 General Business Types and Characterizations
The Enterprise Maturity Model is introduced at the beginning of Chapter 2 to serve as a general reference for business and technology analysts working together to build an Internet enterprise [10]. This model helps to accurately define the objectives for the specific type of organization they wish to build with respect to the maturity level of the organizational structure. In order to characterize a business in terms of its level of maturity, focus, activity, coordination, and infrastructure, these various factors are covered explicitly.
It is very helpful to understand the maturity level of a given organization in order to apply the most appropriate management techniques for the enterprise type.
1.4 Online Business Model Selection and Analysis
An enterprise business model is one of the most important aspects of constructing a viable business initiative. The combination of a company's policy, operations, technology, and ideology defines its business model [16]. Entrepreneurs who wish to launch e-businesses need to be aware of these models and how to implement them effectively. An array of business model types that have proven profitable for actual enterprises in operation today is provided, including the storefront model, auction model, portal model, and dynamic pricing model [11]. A case study is used as an example of online business model selection and analysis.
1.5 Online Financial Transactions
In financial transactions, it is critical to have a reliable method to collect payment. Since the scope of this thesis covers IEE systems, it describes those methods relevant to Cyberspace operations. There are various methods and mechanisms that online merchants use to collect income through electronic transactions. The types of transactions covered include credit card, e-wallet, debit card, digital currency, peer-to-peer, smartcard, micropayment, and e-billing mechanisms [11].
1.6 Online Legal Contracts
For proper oversight of monetary operations, legal issues must be addressed for IEE projects, so the concept of the online contract is introduced. An online contract can be accomplished through the use of digital signatures [11]. These electronic signatures are the electronic equivalent of written signatures. The "Electronic Signatures in Global and National Commerce Act of 2000" (E-Sign Bill) was recently passed into law [11, 4]. This technology was developed for use in public-key cryptography to solve the problems of authentication and integrity.
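The sign-and-verify idea behind the digital signatures discussed above can be sketched with textbook RSA. This is a toy illustration only, with tiny invented primes; it is not secure and is not the DSA standard the report mentions, which (like real deployments) relies on vetted parameters and padding schemes.

```python
# Toy textbook-RSA signature: hash the message, "sign" the hash with the
# private exponent, verify with the public exponent. NOT secure -- the key
# is tiny and there is no padding; for exposition of the concept only.
import hashlib

p, q, e = 61, 53, 17                 # toy key material (hypothetical)
n = p * q                            # public modulus
d = pow(e, -1, (p - 1) * (q - 1))    # private exponent (Python 3.8+)

def sign(msg: bytes) -> int:
    h = int.from_bytes(hashlib.sha256(msg).digest(), "big") % n
    return pow(h, d, n)              # only the private-key holder can do this

def verify(msg: bytes, sig: int) -> bool:
    h = int.from_bytes(hashlib.sha256(msg).digest(), "big") % n
    return pow(sig, e, n) == h       # anyone with (n, e) can check

m = b"online contract"
s = sign(m)
print(verify(m, s))  # -> True
```

Verification fails for any message whose hash differs, which is how a signature provides both authentication (who signed) and integrity (what was signed).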
The purpose of a digital signature is electronic authentication. The U.S. government's digital-authentication standard is called the Digital Signature Algorithm (DSA) [11]. The U.S. government also recently passed digital-signature legislation that makes digital signatures as legally binding as hand-written signatures. This legislation is designed to promote more activity in e-business by legitimizing online contractual agreements.

1.7 Online Security

For centuries in human society, whenever something of value was transferred, a method to protect that shipment or trade had to be established. Naturally, this subject is discussed to illustrate the measures taken by current security agencies to protect value on the Internet. For example, Netscape Communications developed the Secure Sockets Layer (SSL) protocol, a non-proprietary protocol commonly used to secure communication on the Internet and the Web. SSL is designed to use public-key technology and digital certificates to authenticate the server in a transaction and to protect private information as it passes from one party to another over the Internet. The Secure Electronic Transaction (SET) protocol was developed by Visa International and Mastercard and was designed specifically to protect e-commerce payment transactions [11, 12]. SET uses digital certificates to authenticate each party in an e-commerce transaction, including the customer, the merchant, and the merchant's bank.

1.8 Online Business Prototyping Technologies and Development

In order for technologists to satisfy the requirements set forth by the business analysts, expressed in a language such as UML, various high-level tools are needed to develop an acceptable solution. One such tool, Macromedia Drumbeat 2000, recently released by Elemental Software, is capable of accepting and delivering complex information and functionality through a Web interface [20].
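The server-authentication behavior SSL introduced survives in today's TLS stacks, and Python's standard `ssl` module can sketch it concisely. The snippet below is a minimal sketch: the host name is only an example, and the actual network call is left commented out. It builds a client context that, as described above, demands a certificate from the server and validates it against trusted certificate authorities before any private data flows.

```python
import socket
import ssl

# A default client context enforces the guarantees described above:
# the server must present a certificate signed by a trusted authority,
# and its hostname must match that certificate.
ctx = ssl.create_default_context()
print(ctx.verify_mode == ssl.CERT_REQUIRED)  # True
print(ctx.check_hostname)                    # True

def tls_version(host: str, port: int = 443) -> str:
    """Open an authenticated TLS connection and report the protocol used."""
    with socket.create_connection((host, port)) as sock:
        with ctx.wrap_socket(sock, server_hostname=host) as tls:
            return tls.version()  # e.g. "TLSv1.3"

# tls_version("example.org")  # uncomment where network access is available
```

If the certificate check fails, `wrap_socket` raises an exception and no application data is ever exchanged, which is precisely the protection against impostor servers that SSL was designed to provide.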
This technology is introduced as a recommended tool for building a professional enterprise through rapid prototyping. The tool helps a visually skilled Web designer build a competitive website without necessarily having to do any coding. It is a sophisticated tool that interacts with the back-end database, generating user-friendly client-facing applications with server-side Active Server Pages (ASP) Web technology.

1.9 Government Initiatives in Cyberspace

An enterprise planning for the future should consider the overall development of the global Internet infrastructure. It is important to understand the territory in which an Internet enterprise is to be launched. In addition to the current Internet environment, a new government initiative is in the works, entitled the Next Generation Internet (NGI). This multi-agency, national U.S. research and development program began on October 1, 1997 with the participation of the following agencies: DARPA, DOE, NASA, NIH, NIST, and NSF. These agencies are charged with the responsibility of developing advanced networking technologies, developing revolutionary applications that require advanced networking, and demonstrating these capabilities on test beds that are 100 to 1,000 times faster than today's Internet.
Ensuring access to essential medicines is a key objective of all health systems and an integral component of progress towards universal health coverage (UHC). Despite global and national efforts to improve the access and affordability of medicines, millions of people, particularly in low- and middle-income countries, remain without access to quality-assured and affordable medicines. This study aims to contribute to existing knowledge on regulatory systems and harmonization efforts in Southeast Asia. Focusing on five member states of the Association of Southeast Asian Nations (ASEAN) – Indonesia, Malaysia, the Philippines, Thailand, and Vietnam – this study gives an overview of pharmaceutical markets and key pharmaceutical policies in the region, provides a cross-country comparison of medicines regulatory systems, and details harmonization efforts, opportunities, and challenges.
The Pacific region is in the midst of an information and communications technology (ICT) revolution in which small island countries are increasingly connected to the global economy. The region's improving internet connectivity presents an opportunity for Pacific Island countries (PICs) to overcome their inherent limitations and help address the long-standing issues of employment and income generation, a step that is vital to accelerating progress towards ending poverty and creating an inclusive society. The PICs possess unique vulnerabilities that make addressing poverty and development challenges particularly difficult and complex. The purpose of this study is to examine the feasibility of leveraging ICT to help generate job opportunities in the PICs, motivated by the positive experiences that several countries, including small island economies, have had in generating new types of employment opportunities. The study's main focus is on cities and urban areas, given the still-low availability of internet infrastructure and services in rural and remote areas and islands of the Pacific region. The study focuses specifically on the information technology (IT)-enabled global outsourcing services (GOS) industry. ICT covers diverse subsectors that include telecommunications and IT hardware and software.