How the transformation of modern societies continues in the coming years depends centrally on the development, implementation, and social control of GNR technologies (the combination of genetic, nano-, and robotic technologies). The debate on artificial intelligence conducted over the last decade has found a new arena, as it were, in the field of robotics and has shifted to that area. Fundamental questions, some of them pedagogically central, such as the question of a concept of the person, are now being discussed there (cf. Richards et al. 2002). Central concerns based on the dangers associated with the new technologies have been raised repeatedly (Joy 2000; Moravec 1999). Without reconstructing this ramified debate here, one finding can be highlighted in general terms: to the extent that societies experience a surge in complexity due to the use of new technologies, one that reaches into the lifeworlds of individuals, a "mechanism" of sociality moves ever more into the center of attention: trust. It is not only from the perspective adopted here that this resource becomes precarious. Rather, since the beginning of the 1990s there has been an increase in publications on trust from various perspectives, namely in sociology, pedagogy, philosophy, political science, and economics. Trust is seen as an elementary precondition of social processes. But when trust can no longer be taken for granted as such a precondition, measures of trust-building proliferate, and the phenomenon of trust becomes the focus of systematic reflection.
In: Cyber Warfare and Cyber Terrorism, p. 421-418
In: Cyber Warfare and Cyber Terrorism, p. 154-160
In: Cyber Warfare and Cyber Terrorism
Part 3: Extended Abstracts ; Data sets of biometric or forensic samples are an important basis for evaluations and research. Biometric data in particular is considered personal data, which is protected by privacy regulations. Since, at least in some countries, the data cannot be altered or revoked, this poses a challenge because rights must be granted to the data subject. In Germany in particular, and probably in the entire European Union after the reform of its data protection legislation, it is challenging to use such data. Furthermore, with respect to latent fingerprints, only very few public data sets exist today. We propose the creation of a public data set without privacy implications, consisting of latent fingerprints made from artificial fingerprint patterns. Based on a first set of 50 fingerprints on a compact disc surface, we report challenges that need to be solved in order to create realistic samples.
In: Proceedings of the Weizenbaum Conference 2022: Practicing Sovereignty - Interventions for Open Digital Futures, p. 142-161
This paper first investigates possible contributions that an AI-based deepfake detector could make to the challenge of responding to disinformation as a threat to democracy. Second, it investigates the implications of such a tool, which was developed, among other reasons, for security purposes, for the emerging European discourse on digital sovereignty in a global environment. While disinformation is surely not a new topic, recent technological developments relating to AI-generated deepfakes have increased the manipulative potential of video and audio content spread online, making it a specific but important current challenge in the global and interconnected information context.