Die Verarbeitung großer Datenmengen sowie die hohe Relevanz von Datenanalysen sind in den produzierenden Unternehmen mittlerweile angekommen. Bekannte Anwendungsbeispiele sind Digital Mock-Up in der Produktentwicklung oder Prozessoptimierung durch Predictive Maintenance. Die in letzter Zeit entwickelten Referenzarchitekturen in diesen breitgefächerten Themenfeldern betrachten dementsprechend verschiedene Aspekte in unterschiedlichen Ausprägungen. Dieser aus zwei Teilen bestehende Fachbeitrag rekapituliert und bewertet diese Entwicklungen, um Unternehmen bei der Umsetzung ihrer eigenen individuellen Architektur Hilfestellung zu geben. Im Teil 1 werden aktuelle Referenzarchitekturen mit ihren Architekturbausteinen im Bereich Industrie 4.0 vorgestellt. Im zweiten Teil (Ausgabe 6-2017 der wt Werkstattstechnik online) werden die Referenzarchitekturen unter dem Gesichtspunkt der Themenfelder Analytics sowie Datenmanagement untersucht und bewertet.

The processing of huge amounts of data as well as the importance of data analytics have arrived in the manufacturing industry by now. Well-known usage examples are digital mock-ups in product engineering or process optimization through predictive maintenance. Recently developed reference architectures in these wide-ranging subject areas accordingly consider multiple aspects to varying degrees. This two-part article recapitulates and evaluates these developments and aims to support companies in implementing their own individual architecture. In this first part, current reference architectures for Industrie 4.0 and their architectural building blocks are introduced. In the second part (to be published in issue 6-2017 of wt Werkstattstechnik online), these architectures are compared and assessed with regard to analytics and data management.
Die Verarbeitung großer Datenmengen sowie die Erkenntnis, dass Datenanalysen eine hohe Relevanz haben, sind in den produzierenden Unternehmen angekommen. Bekannte Anwendungsbeispiele sind Digital Mock-Up in der Produktentwicklung oder Prozessoptimierung durch Predictive Maintenance. Die in letzter Zeit entwickelten Referenzarchitekturen in diesen breitgefächerten Themenfeldern betrachten dementsprechend verschiedene Aspekte in unterschiedlichen Ausprägungen. Dieser aus zwei Teilen bestehende Beitrag rekapituliert und bewertet diese Entwicklungen, um Unternehmen bei der Umsetzung ihrer eigenen individuellen Architektur Hilfestellung zu geben. Im ersten Teil des Beitrags (Ausgabe 3-2017: wt Werkstattstechnik online) wurden aktuelle Referenzarchitekturen mit ihren Architekturbausteinen im Bereich Industrie 4.0 vorgestellt. In diesem zweiten Teil werden nun die Referenzarchitekturen unter dem Gesichtspunkt der Themenfelder Analytics sowie Datenmanagement untersucht und bewertet.

The processing of huge amounts of data as well as the importance of data analytics have arrived in the manufacturing industry by now. Well-known usage examples are digital mock-ups in product engineering or process optimization through predictive maintenance. Recently developed reference architectures in these wide-ranging subject areas accordingly consider multiple aspects to varying degrees. This two-part article recapitulates and evaluates these developments and aims to support companies in implementing their own individual architecture. In the first part, published in issue 3-2017 of wt Werkstattstechnik online, current reference architectures for Industrie 4.0 and their architectural building blocks were introduced. In this second part, these architectures are compared and assessed with regard to analytics and data management.
Abstract. Mountain debris cones in the Alpine region often provide space for dense population and cultivation. Hence, a great number of buildings are exposed to torrential hazards. In order to protect the settlement areas against flooding and overbank sedimentation, torrent defence structures are implemented at various locations within catchments. Directly at the debris cones, these protection measures often include a deposition basin at the fan apex and/or a confined channel that passes through the settlement. The work presented within this paper deals with the effect of specific outlet structure layouts, situated at the lower end of a selected deposition basin, on bed-load transport processes and flood protection. A case study analysis was carried out, comprising a 3-D numerical model (FLOW-3D) and a physical scale model test (1:30). The subject of investigation was the deposition basin of the Larsennbach torrent in the Austrian Northern Limestone Alps. The basin is situated on a large debris cone and opens out into a paved channel. Since the basin is undersized and the accumulation of sediment in the outlet section reduces the available cross section during floods, adjoining settlements are considerably endangered by lateral overtopping of both clear water and sediment. Aiming for an upgrade in flood protection, certain layouts for a "closing-off structure" at the outlet were tested within this project. For the most efficient design layout, its effects on flood protection, on a continuous bed-load output from the basin and on the best possible use of the retention volume are pointed out. The simple design of the structure and the key aspects that have to be taken into consideration for its implementation are highlighted.
Abstract. Since the 1990s, land subsidence due to rapid urbanization has been considered a severely destructive hazard in the center of Hanoi City. Although previous studies and measurements have quantified the subsiding deformation in the Hanoi center, no data exist for the newly established districts in the south and the west, where construction development has been most significant and where groundwater pumping has been very intensive over the last decade. With a multi-temporal InSAR approach, we quantify the spatial distribution of land subsidence in the entire Hanoi urban region using ALOS images over the 2007–2011 period. The map of the mean subsidence velocity reveals that the northern bank of the Red River appears stable, whereas some areas on the southern bank are subsiding at a mean vertical rate of up to 68.0 mm yr−1, especially within the three new urban districts of Hoang Mai, Ha Dong – Thanh Xuan and Hoai Duc – Tu Liem. We interpret the spatial distribution of the surface deformation as the combination of the nature of the unsaturated layer, the lowering of groundwater in the aquifers due to pumping withdrawal, the increase of built-up surfaces and the type of building foundation. The piezometric level in the Qp aquifer lowers particularly after 2008, whereas the groundwater level in the Qh aquifer remains steady, even though it loses its seasonal fluctuation in urban areas and draws down near neighboring water production plants. The time evolution deduced from the InSAR time series is consistent with previous leveling data and shows that the lowering rate of the surface slightly decreases until 2008. The analysis of groundwater levels in instrumented wells shows a correlation between the groundwater behavior and both the urban development and the acceleration of groundwater withdrawal.
Also, the time variations suggest that the deformation became non-stationary, with upward and downward transient displacements related to the recharge and discharge of the aquifers.
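As an aside on the method, the mean-velocity map described above reduces, per pixel, to fitting a linear trend to the line-of-sight displacement time series. The Python sketch below illustrates that core step; the `mean_velocity` function and the sample series are purely hypothetical and are not data or code from the study.

```python
# Illustrative only: least-squares estimate of the mean displacement
# rate (mm/yr) at a single pixel from an InSAR-style time series.
def mean_velocity(dates_yr, disp_mm):
    """Least-squares slope (mm/yr) of displacement versus time."""
    n = len(dates_yr)
    t_mean = sum(dates_yr) / n
    d_mean = sum(disp_mm) / n
    num = sum((t - t_mean) * (d - d_mean) for t, d in zip(dates_yr, disp_mm))
    den = sum((t - t_mean) ** 2 for t in dates_yr)
    return num / den

# Invented example: a pixel subsiding steadily over 2007-2011.
dates = [2007.0, 2008.0, 2009.0, 2010.0, 2011.0]
disp = [0.0, -68.0, -136.0, -204.0, -272.0]
print(mean_velocity(dates, disp))  # negative slope = subsidence
```

In a real processing chain the fit would be applied to unwrapped, atmosphere-corrected interferometric phases at every coherent pixel, which is what produces the velocity map discussed above.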
The investigation and reporting of foodborne outbreaks became mandatory with Directive 2003/99/EC. In 2006 and 2007, the Community reporting system for foodborne outbreaks was further developed in an interdisciplinary approach, which is described in this paper. This involved experts on investigating and reporting foodborne outbreaks as well as experts on communicable diseases, in addition to the European Food Safety Authority (EFSA) Task Force for Zoonoses Data Collection, the European Centre for Disease Prevention and Control (ECDC) Advisory Forum and representatives of ECDC, the World Health Organization (WHO), the World Organization for Animal Health (OIE) and the European Commission. European Union Member States participated in a survey regarding their national reporting systems and the needs for information on foodborne outbreaks at the Community level. The acceptability, functionality and data quality of the current reporting system were evaluated. The results were used to propose new variables on which data should be reported. Pick-lists were developed to facilitate reporting and a better integration of the Community system with Member States' reporting systems. The new system is expected to yield better-quality data on foodborne outbreaks relevant for risk assessment and risk management, while reducing the workload for Member States.
In: Azuma-Dicke, N., Morthorst, P. E., Ravn, H. F., Schmidt, R. & Weber, C. 2004, CO2-emission trading and green markets for renewable electricity. Wilmar Deliverable 4.1. Risoe-R No. 1470(EN), Risø National Laboratory, Roskilde, Denmark.
This report is Deliverable 4.1 of the EU project "Wind Power Integration in Liberalised Electricity Markets" (WILMAR) and describes the application of two policy instruments, Tradable Emission Permits (TEPs) and Tradable Green Certificates (TGCs), for electricity produced from renewable energy sources in the European Union and the implications for implementation in the Wilmar model. The introduction of a common emission-trading system in the EU is expected to have an upward effect on the spot prices at the electricity market. The variations of the spot price imply that some types of power generation may move from earning money to losing money despite the increasing spot price. Heavy restrictions on emissions penalise the fossil-fuelled technologies significantly, and the associated increase in the spot price need not compensate for this. Therefore, a market for TEPs is expected to have a significant influence on the electricity spot price. However, the expected price level of TEPs is subject to great uncertainty: a survey of economic studies shows a price span between zero and 270 USD per ton of CO2, depending on the participation or non-participation of countries in the scheme. The price determination at the TGC market is expected to be closely related to the price at the power spot market, as the RE producers of electricity will have expectations regarding the total price paid for the energy produced, i.e., the price of electricity at the spot market plus the price per kWh obtained at the green certificate market. In the Wilmar model, the TGC market can either be handled exogenously, i.e., the increase in renewable capacity and an average annual TGC price are determined outside the model, or a simple TGC module is developed, including the long-term supply functions for the most relevant renewable technologies and an overall TGC quota. Both solutions are rather simple, but developing a more advanced model for the TGC market seems to be out of scope for handling the interplay with the Wilmar model. The obligation in the TGC market is normally given on an annual basis, i.e., the certificate quota has to be fulfilled within a given year. This implies that establishing a TGC price on an hourly basis throughout the year is not only difficult, but irrelevant as well. The incorporation of model elements representing an annual quota for emissions and deriving a TEP price seems more relevant for the Wilmar model.
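The total-price reasoning above (spot price plus certificate price per kWh) can be restated as a small sketch. The functions and numbers below are hypothetical illustrations, not part of the Wilmar model: they merely show the arithmetic of a renewable producer's expected revenue and the TGC price implied by a required total price.

```python
# Illustrative sketch (not from the report): the TGC price a renewable
# producer implicitly requires is the gap between its required total
# price per kWh and the power spot price, floored at zero.
def implied_tgc_price(required_total_price, spot_price):
    """Per-kWh certificate price so that spot + TGC covers the requirement."""
    return max(0.0, required_total_price - spot_price)

def producer_revenue(energy_kwh, spot_price, tgc_price):
    """Total payment: spot-market income plus green-certificate income."""
    return energy_kwh * (spot_price + tgc_price)

# Invented example: a producer requiring 0.06 EUR/kWh in total while the
# spot market pays 0.035 EUR/kWh implies a TGC price of 0.025 EUR/kWh.
tgc = implied_tgc_price(0.06, 0.035)
print(tgc)
print(producer_revenue(1000.0, 0.035, tgc))
```

This also makes the report's point concrete: because the quota is annual, only an annual (or average) TGC price like the one above is meaningful, not an hourly one.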
Abstract. Snow avalanches pose a threat to settlements and infrastructure in alpine environments. Due to the catastrophic events of recent years, the public has become more aware of this phenomenon. Alpine settlements have always been confronted with natural hazards, but changes in land use and in dealing with avalanche hazards have led to a changing perception of this threat. In this study, a multi-temporal risk assessment is presented for three avalanche tracks in the municipality of Galtür, Austria. Changes in avalanche risk as well as changes in the risk-influencing factors (process behaviour, values at risk (buildings) and vulnerability) between 1950 and 2000 are quantified. An additional focus is put on the interconnection between these factors and their influence on the resulting risk. The avalanche processes were calculated using different simulation models (SAMOS as well as ELBA+). For each avalanche track, different scenarios were calculated according to the development of mitigation measures. Since the focus of the study was on multi-temporal risk assessment, the models used could be replaced with other snow avalanche models providing the same functionality. The monetary values of buildings were estimated using the volume of the buildings and average prices per cubic meter. The changing size of the buildings over time was inferred from construction plans. The vulnerability of a building is understood as its degree of loss within the area affected by natural hazards. A vulnerability function for different construction types of buildings, depending on avalanche pressure, was used to assess the degree of loss. No general risk trend could be determined for the studied avalanche tracks. Due to the high complexity of the variations in risk, small changes in one of several influencing factors can cause considerable differences in the resulting risk.
This multi-temporal approach leads to a better understanding of today's risk by identifying the main changes and the underlying processes. Furthermore, this knowledge can be implemented in strategies for sustainable development in Alpine settlements.
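The risk relation the study quantifies (risk as the product of event probability, monetary value at risk and a pressure-dependent degree of loss) can be sketched as follows. The linear vulnerability function, the unit price and the probability below are invented for illustration and are not the curves or values used in the paper.

```python
# Hedged sketch of risk = probability * value * vulnerability(pressure).
def vulnerability(pressure_kpa):
    """Illustrative degree of loss (0..1) as a function of impact pressure."""
    if pressure_kpa <= 0:
        return 0.0
    return min(1.0, pressure_kpa / 30.0)  # assume total loss from 30 kPa upward

def building_value(volume_m3, price_per_m3):
    """Monetary value estimated from building volume and an average unit price."""
    return volume_m3 * price_per_m3

def annual_risk(event_probability, value, pressure_kpa):
    """Expected annual loss for one building in one scenario."""
    return event_probability * value * vulnerability(pressure_kpa)

# Invented example: a 1500 m3 house at 400 EUR/m3, hit with 15 kPa in a
# 1-in-100-year event.
value = building_value(1500.0, 400.0)  # 600000.0 EUR
print(annual_risk(0.01, value, 15.0))  # expected annual loss in EUR
```

The sketch also shows why small changes in one factor can shift the result considerably, as the abstract notes: the three factors enter multiplicatively, so a change in any one of them scales the risk directly.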