Non-representational approaches to the unconscious in the phenomenology of Husserl and Merleau-Ponty
In: Phenomenology and the Cognitive Sciences, Vol. 17, Issue 1, pp. 199-224
ISSN: 1572-8676
In: Synthese: An International Journal for Epistemology, Methodology and Philosophy of Science, Vol. 198, Issue 2, pp. 1517-1547
ISSN: 1573-0964
Abstract: Despite the ubiquity of uncertainty, scientific attention has focused primarily on probabilistic approaches, which predominantly rely on the assumption that uncertainty can be measured and expressed numerically. At the same time, a growing body of research from a range of areas including psychology, economics, and sociology testifies that in the real world, people's understanding of risky and uncertain situations cannot be satisfactorily explained in probabilistic and decision-theoretic terms. In this article, we offer a theoretical overview of an alternative approach to uncertainty developed in the framework of the ecological rationality research program. We first trace the origins of the ecological approach to uncertainty in Simon's bounded rationality and Brunswik's lens model framework and then outline the theoretical view of uncertainty that ensues from the ecological rationality approach. We argue that the ecological concept of uncertainty relies on a systemic view that characterizes uncertainty as a property of the organism–environment system. We also show how simple heuristics can deal with unmeasurable uncertainty and in what cases ignoring probabilities emerges as a proper response to uncertainty.
In: European Psychologist, Vol. 28, Issue 3, pp. 206-224
ISSN: 1878-531X
Abstract: The spread of false and misleading information in online social networks is a global problem in need of urgent solutions. It is also a policy problem because misinformation can harm both the public and democracies. To address the spread of misinformation, policymakers require a successful interface between science and policy, as well as a range of evidence-based solutions that respect fundamental rights while efficiently mitigating the harms of misinformation online. In this article, we discuss how regulatory and nonregulatory instruments can be informed by scientific research and used to reach EU policy objectives. First, we consider what it means to approach misinformation as a policy problem. We then outline four building blocks for cooperation between scientists and policymakers who wish to address the problem of misinformation: understanding the misinformation problem, understanding the psychological drivers and public perceptions of misinformation, finding evidence-based solutions, and co-developing appropriate policy measures. Finally, through the lens of psychological science, we examine policy instruments that have been proposed in the EU, focusing on the strengthened Code of Practice on Disinformation 2022.
In: Kozyreva, A., Lewandowsky, S., & Hertwig, R. (2020). 'Citizens Versus the Internet: Confronting Digital Challenges With Cognitive Tools', Psychological Science in the Public Interest, vol. 21, no. 3, pp. 103-156. https://doi.org/10.1177/1529100620946707
The Internet has evolved into a ubiquitous digital environment in which people communicate, seek information, and make decisions. Online environments are replete with smart, highly adaptive choice architectures designed primarily to maximize commercial interests, capture and sustain users' attention, monetize user data, and predict and influence future behavior. This online landscape holds multiple negative consequences for society, such as a decline in human autonomy, rising incivility in online conversation, the facilitation of political extremism, and the spread of disinformation. Benevolent choice architects working with regulators may curb the worst excesses of manipulative choice architectures, yet the strategic advantages, resources, and data remain with commercial players. One way to address this imbalance is with interventions that empower Internet users to gain some control over their digital environments, in part by boosting their information literacy and their cognitive resistance to manipulation. Our goal is to present a conceptual map of interventions that are based on insights from psychological science. We begin by systematically outlining how online and offline environments differ despite being increasingly inextricable. We then identify four major types of challenges that users encounter in online environments: persuasive and manipulative choice architectures, AI-assisted information architectures, distractive environments, and false and misleading information. Next, we turn to how psychological science can inform interventions to counteract these challenges of the digital world. 
After distinguishing between three types of behavioral and cognitive interventions—nudges, technocognition, and boosts—we focus on boosts, of which we identify two main groups: (1) those aimed at enhancing people's agency in their digital environments (e.g., self-nudging, deliberate ignorance) and (2) those aimed at boosting competences of reasoning and resilience to manipulation (e.g., simple decision aids, inoculation). These cognitive tools are designed to foster the civility of online discourse and protect reason and human autonomy against manipulative choice architectures, attention-grabbing techniques, and the spread of false information.
In: Kozyreva, A., Lorenz-Spreen, P., Hertwig, R., Lewandowsky, S., & Herzog, S. (2021). 'Public attitudes towards algorithmic personalization and use of personal data online: Evidence from Germany, Great Britain, and the United States', Humanities & Social Sciences Communications, vol. 8, no. 1, 117. https://doi.org/10.1057/s41599-021-00787-w
People rely on data-driven AI technologies nearly every time they go online, whether they are shopping, scrolling through news feeds, or looking for entertainment. Yet despite their ubiquity, personalization algorithms and the associated large-scale collection of personal data have largely escaped public scrutiny. Policy makers who wish to introduce regulations that respect people's attitudes towards privacy and algorithmic personalization on the Internet would greatly benefit from knowing how people perceive personalization and personal data collection. To contribute to an empirical foundation for this knowledge, we surveyed public attitudes towards key aspects of algorithmic personalization and people's data privacy concerns and behaviour using representative online samples in Germany (N = 1,065), Great Britain (N = 1,092), and the United States (N = 1,059). Our findings show that people object to the collection and use of sensitive personal information and to the personalization of political campaigning and, in Germany and Great Britain, to the personalization of news sources. Encouragingly, attitudes are independent of political preferences: People across the political spectrum share the same concerns about their data privacy and show similar levels of acceptance regarding personalized digital services and the use of private data for personalization. We also found an acceptability gap: People are more accepting of personalized services than of the collection of personal data and information required for these services. A large majority of respondents rated, on average, personalized services as more acceptable than the collection of personal information or data. The acceptability gap can be observed at both the aggregate and the individual level. Across countries, between 64% and 75% of respondents showed an acceptability gap. 
Our findings suggest a need for transparent algorithmic personalization that minimizes use of personal data, respects people's preferences on personalization, is easy to adjust, and does not extend to political advertising.
In: Kozyreva, A., Lorenz-Spreen, P., Lewandowsky, S., Garrett, P. M., Herzog, S., Pachur, T., & Hertwig, R. (2021). 'Psychological Factors Shaping Public Responses to COVID-19 Digital Contact Tracing Technologies in Germany', Scientific Reports, vol. 11, no. 1, 18716. https://doi.org/10.1038/s41598-021-98249-5
The COVID-19 pandemic has seen one of the first large-scale uses of digital contact tracing to track a chain of infection and contain the spread of a virus. The new technology has posed challenges both for governments aiming at high and effective uptake and for citizens weighing its benefits (e.g., protecting others' health) against the potential risks (e.g., loss of data privacy). Our cross-sectional survey with repeated measures across four samples in Germany (N = 4,357) focused on psychological factors contributing to the public adoption of digital contact tracing. We found that public acceptance of privacy-encroaching measures (e.g., granting the government emergency access to people's medical records or location tracking data) decreased over the course of the pandemic. Intentions to use contact tracing apps—hypothetical ones or the Corona-Warn-App launched in Germany in June 2020—were high. Users and non-users of the Corona-Warn-App differed in their assessment of its risks and benefits, in their knowledge of the underlying technology, and in their reasons to download or not to download the app. Trust in the app's perceived security and belief in its effectiveness emerged as psychological factors playing a key role in its adoption. We incorporate our findings into a behavioral framework for digital contact tracing and provide policy recommendations.