Search Results
6,459,720 results
YOUTH PARTIES IN EU COUNTRIES
This article undertakes a comparative analysis of youth parties in the EU. It investigates theoretical approaches to understanding the concept of a "political party" and highlights the main features and characteristics of youth parties.
BASE
Challenges to established parties: The effects of party system features on the electoral fortunes of anti-political-establishment parties
In: European journal of political research: official journal of the European Consortium for Political Research, Volume 41, Issue 4, pp. 551-583
ISSN: 0304-4130
The rise of parties that challenge the political establishment has recently sparked the interest of political scientists. Scholars have identified several factors that lie behind the success of such anti-political-establishment parties. Most empirical studies, however, have concentrated their attention either on the importance of electoral system features or on the effects of socio-economic conditions. This article focuses instead on the role that party system factors play in the electoral success of these parties. Using three datasets from studies conducted in three different time periods, it tests two seemingly contradictory hypotheses. The first is the claim that where the established parties have converged toward centrist positions, and thus fail to present voters with an identity that is noticeably different from that of their established competitors, the electorate will be more susceptible to the markedly different policies put forward by anti-political-establishment parties. The second is the argument that these parties profit more from increasing polarization and the subsequent enlargement of the political space than from a convergence toward the median. The results of the analyses show that anti-political-establishment parties generally profit from a close positioning of the establishment parties on the left-right scale. However, there is no consistent support for the notion that party system polarization by itself is associated with an increase in the support for parties that challenge the political establishment. (European Journal of Political Research / FUB)
World Affairs Online
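The two party-system quantities contrasted in the abstract above, convergence of the establishment parties and overall polarization, can be made concrete with a small sketch. This is not the article's actual measurement: the party positions, vote shares, and the Dalton-style normalization below are illustrative assumptions.

```python
# Hedged sketch (not from the article): two party-system quantities,
# computed from hypothetical left-right scores (0-10) and vote shares.

def convergence(positions):
    """Range of establishment-party positions; smaller = more converged."""
    return max(positions) - min(positions)

def polarization(positions, shares):
    """Vote-share-weighted dispersion around the weighted mean position
    (in the spirit of a Dalton-style polarization index, normalized to
    0-1 for a 0-10 scale)."""
    mean = sum(p * s for p, s in zip(positions, shares))
    var = sum(s * (p - mean) ** 2 for p, s in zip(positions, shares))
    return (var ** 0.5) / 5  # 5 = half the scale width

# Illustrative only: three establishment parties.
positions = [4.0, 5.0, 6.0]
shares = [0.35, 0.40, 0.25]
print(convergence(positions))                    # spread on the scale
print(round(polarization(positions, shares), 3))
```

The article's finding can then be read as: small `convergence` values are associated with anti-political-establishment success, while a large `polarization` value by itself is not.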
Evaluation of Parties and Coalitions After Parliamentary Elections
Five leading German political parties and their coalitions are evaluated with regard to party manifestos and results of the 2005 parliamentary elections. For this purpose, the party manifestos are converted into Yes/No answers to 95 topical questions (Relax the protection against dismissals? Close nuclear power plants? etc.). On each question, every party represents its adherents as well as those of the parties with the same position. Therefore, a party usually represents a larger group than its voters. The popularity of a party is understood to be the percentage of the electorate represented, averaged over all 95 questions. The universality of a party is the frequency of representing a majority of electors. The questions are considered either unweighted, or weighted by an expert, or weighted by the number of GOOGLE results for given keywords (the more important the question, the more documents on the Internet). The weighting, however, plays a negligible role because the party answers are backed up by the party "ideology", which determines a high intra-question correlation. The SPD (Social-Democratic Party) did not receive the highest percentage of votes but nevertheless remains the most popular and the most universal German party. A comparison of the election results with the position of the German Trade Union Federation (DGB) reveals its high representativeness as well. Finally, all coalitions with two and three parties are also evaluated. The coalition CDU/SPD (which is currently in power) is the most popular, and the coalition SPD/Green/Left-Party (which failed due to personal conflicts) is the most universal.
BASE
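The popularity and universality measures described in the abstract above reduce to simple arithmetic over Yes/No questions. A minimal sketch, with made-up answers and electorate shares rather than the paper's 95 questions or data:

```python
# Hedged sketch of the two measures on invented data. A party
# "represents" on a question the share of the electorate that answers
# the same way it does.

def popularity(party_answers, electorate_yes_share):
    """Mean share of the electorate represented, averaged over questions."""
    rep = [yes if a else 1 - yes
           for a, yes in zip(party_answers, electorate_yes_share)]
    return sum(rep) / len(rep)

def universality(party_answers, electorate_yes_share):
    """Frequency of representing a majority of the electorate."""
    rep = [yes if a else 1 - yes
           for a, yes in zip(party_answers, electorate_yes_share)]
    return sum(r > 0.5 for r in rep) / len(rep)

# Hypothetical: 5 questions, the party's Yes/No answers, and the share
# of voters answering Yes to each question.
answers = [True, False, True, True, False]
yes_share = [0.6, 0.3, 0.4, 0.7, 0.55]
print(round(popularity(answers, yes_share), 3))
print(universality(answers, yes_share))
```

Question weighting, as the abstract notes, would simply replace the plain averages with weighted ones.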
Influence of Political Parties in Elections: Evidence from Nepal
This article measures the influence of political-party variables on the electoral process in Nepal. In general, the research finds that elections in Nepal are not free and fair on par with their counterparts in the Western world. It is the confrontational political culture among Nepal's political parties that makes free and fair elections difficult to ensure.
BASE
Ensuring Reproducibility in Large Research Teams
Blog: CEGA - Medium
Thomas Brailey shares takeaways from his Catalyst training project, which involved onboarding reproducible workflows for members of the J-PAL Payments and Governance Research Program. Check out the training materials developed as part of the project and read on to learn more! This post was originally published on the CEGA-managed Berkeley Initiative for Transparency in the Social Sciences (BITSS) blog here.

Campaign Creators via Unsplash

Holding all else equal, ensuring a reproducible and transparent research pipeline is more straightforward with fewer team members. When we discuss achieving reproducible social science in the abstract, there are four broad steps that need clear documentation: 1) obtaining the data; 2) cleaning and wrangling the data; 3) analyzing and visualizing the data; and 4) archiving or releasing the data to the public. With a few principal investigators and research assistants to collect and work on the data, this process has been, in my experience, relatively straightforward. However, ensuring a reproducible workflow becomes markedly more tricky when the project has many team members or is integrated into non-academic bodies such as non-profits or governments. Such organizations face an uphill battle in keeping to the ground rules of transparent and ethical research, especially if their partners do not emphasize the norms of transparent social science.

"[E]nsuring a reproducible workflow becomes markedly more tricky when the project has many team members or is integrated into non-academic bodies such as non-profits or governments."

One might assume that whatever works for a small research team simply scales up for larger teams, but I would argue that far more care needs to be taken with the latter. This is because individual team members will have different levels of exposure to reproducible practices, expectations of the research process, and deliverables and responsibilities.
Does non-analysis code (e.g., back-checks, logic checks, cleaning, and recoding code) need to be treated the same as analysis code, even though it won't be included in a manuscript's replication package? Do policy reports or updates for government officials need to emphasize replicability, even if those industries do not place the same emphasis on transparency as academia? The answers to these questions, I believe, are absolutely, yes. That said, there appears to be very little literature focusing on this particular aspect of reproducible social science, so I will discuss some concrete options for ensuring transparency in large research teams (this guide offers a fantastic overview of the whole research pipeline for large teams, but does not focus on the interplay between, and the challenges faced by, the whole team).

First, it is important to ensure that all code is version controlled, irrespective of what it does or who it is for. The industry-standard (at least in political science) version control software is GitHub, and there are plenty of useful guides for getting this set up. Broadly speaking, each project should be stored as a single repository, with separate folders for cleaning, analysis, and replication code. Each researcher should create their own pull request when working on a specific task, then assign another RA to review the changes before merging them into the main branch. Beyond reproducibility, this method ensures accountability among researchers and allows teams to see all changes made to code files from the beginning of the project (by contrast, Dropbox only allows version history tracking for 180 days). Datasets can be stored on GitHub, but it is not necessary to do this, given that there usually isn't a reason to overwrite a raw dataset. There also exist several trusted data storage sites which guarantee permanence and catalog stored data. Documents (.docx, .tex, .pdf, etc.)
can be stored on GitHub and version controlled, but it is not considered industry standard to do so. A bifurcated system where all code is version controlled and non-code files are kept in a shared storage space can work well for large research teams, though for simplicity, storing all files on GitHub (e.g., linked through the repository's Wiki page) might be helpful.

Second, within this reproducible framework, it is important to ensure that cleaning and analysis are kept parsimonious and well-documented. The findings that you publish and present to governments may well be replicable, but if they are based on bad analysis, then they are meaningless. A 2015 PNAS article suggests that the best way to prevent replicable but poor analysis is to "increase the number of trained data analysts in the scientific community and […] identify statistical software and tools that can be shown to improve reproducibility and replicability of studies". Having a well-documented standard for conducting data analysis and data visualization that is uniform across the organization helps thwart potential mistakes or misleading results.

"The findings that you publish and present to governments may well be replicable, but if they are based on bad analysis, then they are meaningless."

Third, large research teams should encourage non-academic entities with whom they interact to publish codebooks and thorough documentation accompanying any data that they share. Even if these data are not to be shared with the broader public, it is important for the research team to know exactly how the data were generated. It is exciting to see organizations such as J-PAL focus on bridging the gap between their survey experiments and the administrative data they use for analysis. J-PAL's Innovations in Data and Experiments for Action (IDEA) Initiative "supports governments, firms, and non-profit organizations […] who want to make their administrative data accessible in a safe and ethical way".
With a survey, the research team has full control over the instrument and knows exactly how each variable is generated, but it is just as important to verify the validity of any external data used for analysis, because bad data, like the bad analysis practices discussed above, cause misleading results.

Fourth, it can be very helpful to have at least one team member, or an outside consultant, who remains up to date on the latest reproducible-science practices to monitor the codebase and train the team members. This ensures that all researchers working with code and data can easily collaborate in a single repository. It is vital that all team members, even those who are not in direct contact with code and data, are aware of the importance of reproducible best practices and have exposure to the version control software that their team uses.

In this post, I have outlined some of the challenges faced by large research teams with regard to ensuring transparency throughout their research pipeline. I have also pointed to a few potentially useful practices that can help these diverse and complex organizations adhere to the tenets of reproducible social science. For those who are interested, all of our team's onboarding materials can be found in our dedicated Open Science Framework repository. Want to share your experience and helpful resources for collaborating in large teams? Get in touch!

Ensuring Reproducibility in Large Research Teams was originally published in CEGA on Medium, where people are continuing the conversation by highlighting and responding to this story.
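The branch-and-review workflow described in this post can be walked through on a throwaway local repository. Everything below is illustrative: the directory, branch name, and file are made up, and the push/pull-request step is shown only as a comment since it needs a real remote.

```shell
# Illustrative only: a throwaway local repo standing in for a project repo.
mkdir -p /tmp/demo-project && cd /tmp/demo-project
git init -q
git -c user.name=ra -c user.email=ra@example.org commit -q --allow-empty -m "initial commit"
default_branch=$(git symbolic-ref --short HEAD)

# One branch per task (-B resets the branch if it already exists) ...
git checkout -q -B cleaning/recode-survey-vars
date +%s >> cleaning-notes.txt
git add cleaning-notes.txt
git -c user.name=ra -c user.email=ra@example.org commit -q -m "Recode survey variables"

# ... then push and open a pull request for another RA to review:
#   git push -u origin cleaning/recode-survey-vars
# After review, the task branch is merged into the main branch:
git checkout -q "$default_branch"
git -c user.name=ra -c user.email=ra@example.org merge -q --no-ff cleaning/recode-survey-vars -m "Merge reviewed task branch"
git log --oneline
```

The `--no-ff` merge keeps an explicit merge commit per reviewed task, so the full change history remains visible in the log.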
SOCIAL DEMOCRATIC PARTIES OF THE EUROPEAN COMMUNITY
In: Journal of common market studies: JCMS, Volume 13, Issue 4, pp. 415-418
ISSN: 1468-5965
SOCIAL DEMOCRATIC PARTIES OF THE EUROPEAN COMMUNITY
In: Journal of common market studies: JCMS, Volume 13, Issue 3, pp. 415-418
ISSN: 1468-5965
Spring Cleaning: Refreshing CEGA's Annual Priorities
Blog: CEGA - Medium
CEGA's Director of Operations Lauren Russell and Executive Director Carson Christiano share the center's ambitions for 2023.

Spring is upon us and we're cleaning CEGA's proverbial house. This means reflecting on our priorities and commitments, and tidying our goals for the year — each of which we hope will bring us closer to a world where people are better off because decision-makers use insights and tools backed by rigorous, inclusive, and transparent evidence.

A woman in India assembles a jharu, a broom made of grass, used for cleaning | Hewlett Foundation

Below we outline five strategic ambitions for CEGA that we believe will generate new insights and tools leaders can use to improve policies, programs, and lives.

Incubating new research portfolios on forced displacement, conflict and security, and gender and agency.

These are topics of central importance to decision-makers and researchers, for which data and evidence remain lacking. We have made headway on each: a new suite of CEGA studies is focused on generating more and better data (including panel data) on the refugee and host community experience, as well as the effectiveness of interventions designed to improve outcomes for both. Meanwhile, we are building a portfolio on conflict and security, leveraging an ongoing project on post-conflict security structures in Latin America. Finally, we're scoping a new, cross-cutting research portfolio on gender and agency, designed to answer important questions about social norms, wellbeing, and measurement, and to inform improvements to social programs that affect underserved groups.

Promoting the use of novel data and data-intensive analytical approaches by the development research community.

New types of data — including call detail records, sound and text data, and satellite data — and new methods to analyze them (like machine learning and AI) can generate more accurate, nuanced, and useful insights on global poverty and development than traditional surveys.
Through our Data Science for Development (DS4D) portfolio, CEGA is seeding frontier research leveraging these data and approaches — like employment matching, new poverty estimates, and using historical satellite imagery to predict growth — and building the capacity of early-career researchers and partners, including in low- and middle-income countries (LMICs), to use similar tools. In parallel, our Digital Credit Observatory (DCO) recently launched a new focus on data privacy, which is generating evidence on the effectiveness of privacy-enhancing technologies.

Centering the voices of women, LMIC scholars, and other underrepresented groups in our work.

CEGA continues to promote diversity, equity, inclusion, and justice (DEIJ) in all that we do, for example, by empowering African scholars to generate policy-relevant research through fellowships, networking and dissemination opportunities, and access to dedicated research funding and mentorship. This year, our Collaboration for Inclusive Development Research (CIDR) is taking a structural view to investigate the need for — and effectiveness of — various inclusion strategies, in close partnership with the Network of Impact Evaluation Researchers in Africa (NIERA). Finally, CEGA works tirelessly to make social science research more transparent, benefitting underrepresented scholars by increasing access to knowledge (more below). We are eager to expand these activities, and to serve as a partner and resource to other organizations seeking to make development research more open and inclusive.

Advancing open and transparent research.

In 2023, CEGA is redoubling efforts to promote ethical, transparent, and reproducible research practices that can improve scientific integrity and inspire better public policy, while making the entire research process more inclusive.
We are particularly excited to grow our Cost Transparency Initiative (CTI), which will drive new efficiencies in global development by helping to standardize the way the cost (and cost-effectiveness) of development interventions is measured. Importantly, we are further investing in our work on Open Policy Analysis (OPA), a crucial element of democratic and effective policymaking, which is advancing through an ongoing collaboration with the Ministry of Finance in Chile.

Investing in partnerships to strengthen policy impact.

The pathways by which evidence improves people's lives are rarely linear (or even clear). CEGA's impact stories highlight some of the many circuitous ways in which evidence-based tools and insights have guided improvements in programming, policy, and practice. Our approach to policy engagement has long involved investing in LMIC researchers, facilitating the co-creation of research through strategic matchmaking activities, and prioritizing demand-driven research in our competitive grantmaking. This year, CEGA seeks to partner with organizations in LMICs that can inform our research agendas and deliver key insights to decision-makers at opportune moments. Meanwhile, we are continuing to investigate our own impact to understand how CEGA investments have contributed to policy change so that we can incorporate lessons into our evolving policy engagement strategy.

Marie-Kondo-ing our annual goals renews our motivation to continue advancing rigorous, transparent research that informs critical decisions impacting people experiencing poverty. We are deeply grateful for our diverse and committed network of affiliated faculty, LMIC scholars, partners, supporters, and staff.
We invite you to engage with the CEGA community by reading about our research, attending our events, following us on social media, and sharing our data and resources as we work to meaningfully improve people's lives.
GENESIS AND FUNCTIONAL FEATURES OF POLITICAL PARTIES
In: Visnyk Nacional'noho jurydyčnoho universytetu "Jurydyčna akademija Ukraïny imeni Jaroslava Mudroho". Serija filosofija, filosofija prava, politologija, sociologija, Volume 2, Issue 37, pp. 31-45
ISSN: 2663-5704
Networks of Effective Action: Implementing an Integrated Approach to Peacebuilding
In: Security dialogue, Volume 34, Issue 4, pp. 445-462
ISSN: 1460-3640
Organizations in the peacebuilding field face the imperative of taking a holistic, integrated approach to peacebuilding, one that combines traditionally distinct disciplines such as human rights, humanitarian assistance, sustainable development, environment, conflict resolution, security, and the rule of law, in order to be effective in today's complex conflicts. The concept of a Network of Effective Action is proposed as a set of collaboration practices capable of facilitating integrated approaches to peacebuilding, both on the ground and in the theoretical development of the field.
The Effective Mechanisms in Monitoring and Management of E-Government
In: Journal of public administration and governance, Volume 5, Issue 2, p. 55
ISSN: 2161-7104
This paper analyzes the main principles of the formation and management of e-government. The application of web analytics is investigated as a way to improve indicators of the effectiveness of e-government management. Web analytics is analyzed as an effective feedback mechanism in the monitoring and management of e-government, and some recommendations on how to design e-government programs are made.
The Fading of Costa Rica's Old Parties
In: Journal of democracy, Volume 29, Issue 4, pp. 43-53
ISSN: 1086-3214
Fluid–Structure Interactions and Unsteady Kinematics of a Low-Reynolds-Number Rotor
Micro air vehicles are used for both civil and military applications, like rescue or surveillance. The aerodynamic performance of the rotor is known to be lower than for classical large rotors, due to increased drag at low Reynolds numbers. However, the rotor performance can be improved by taking advantage of the flow unsteadiness and considering unsteady rotor kinematics, like a periodic variation of the rotor pitch. To study such behaviors, it is necessary to develop numerical methods adapted to these fluid–structure interaction phenomena, which are the main objectives of this paper. The method relies on the implementation of fluid–structure interaction capabilities in a lattice–Boltzmann flow solver, which is implemented in a monolithic fashion using generalized coordinates. The validation is first conducted on a vortex-induced vibration test case. Then, numerical simulations are performed for a rotor test case 1) with a forced motion and 2) by coupling the flow with the equation of the dynamics. Some semianalytical models are derived and validated against the numerical simulations to predict the effects of pitching, flapping, and surging on the thrust. The results show that flapping and surging significantly increase the rotor thrust, but at the price of a penalty on the power consumption.
BASE
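The "forced motion" case in the abstract above refers to prescribed kinematics such as a periodic variation of the rotor pitch. As a toy illustration only, not the paper's lattice-Boltzmann model, the sketch below prescribes theta(t) = theta0 + dtheta*sin(2*pi*f*t) and uses an invented linear thrust proxy; every number is made up.

```python
# Toy sketch of forced periodic pitch kinematics. The linear
# thrust-coefficient proxy below is a stand-in assumption, not the
# semianalytical model from the paper.
import math

theta0, dtheta, f = 10.0, 3.0, 50.0   # mean pitch (deg), amplitude (deg), Hz

def pitch(t):
    """Prescribed pitch angle in degrees at time t (seconds)."""
    return theta0 + dtheta * math.sin(2 * math.pi * f * t)

def thrust_proxy(t, k=0.02):
    """Quasi-steady stand-in: thrust coefficient proportional to pitch."""
    return k * pitch(t)

# Average the proxy over one pitch period sampled at 100 points.
ts = [i / (100 * f) for i in range(100)]
mean_ct = sum(thrust_proxy(t) for t in ts) / len(ts)
print(round(mean_ct, 4))
```

Under this linear proxy the sinusoidal part averages out over a period, so the mean equals k*theta0; capturing a net unsteady benefit, as the paper does for flapping and surging, requires a nonlinear model.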
How does Measurement Contribute to a Habitable Planet for All?
Blog: CEGA - Medium
Last month, CEGA held its ninth annual Measuring Development (MeasureDev) conference on "Mitigating the Risks and Impacts of Climate Change," in partnership with the World Bank's Development Impact Evaluation (DIME) Department, Data Analytics and Tools Unit (DECAT), and the University of Chicago's Development Innovation Lab (DIL). Speakers showcased innovative approaches for measuring and tracking climate-related risk, developing effective responses, and evaluating outcomes in data-sparse environments. Sean Luna McAdams, CEGA's Data Science for Development Program Manager, shares key insights from the event here.

Climate change is disrupting weather patterns around the world. Look no further than the unhealthy levels of smoke in the Northeast's skies last week. The impacts on human activity require urgent investments in mitigation and resilience for those most vulnerable. Last month, CEGA, DIL, and the World Bank brought together some of the most innovative social and natural scientists working on this existential challenge to share how they are pushing the frontiers of data collection, for example by using remote sensing technologies, engaging in participatory data collection, and effectively (and meaningfully) integrating different data streams.

University of Chicago's Rachel Glennerster emphasized the importance of measurement to help diagnose, mitigate, and adapt to climate change, particularly to incentivize green investments in LMICs. Credit: World Bank.

A Call for Better Measurement

"Mitigation is one of the true global public goods," noted the University of Chicago's Rachel Glennerster in her keynote address. Indeed, the efforts by one country or group of countries to reduce carbon emissions will have benefits that are felt worldwide.
Recognizing that low- and middle-income countries (LMICs) — who have historically contributed little to climate change — nevertheless face growing opportunities to mitigate emissions for the whole planet, Rachel suggested high-income countries could fund highly cost-effective mitigation efforts in LMICs. These payments should not be considered aid, as they benefit the world and offset high-income countries' damage to the atmosphere. To do this effectively, we need scalable approaches to measuring emissions, among many other critical indicators.

Cost-Effective Measurement with Remote Sensing

Many speakers addressed the challenge of cost-effective measurement through the use of remote sensing. CEGA Affiliate Tamma Carleton highlighted the promise of satellite imagery and machine learning (SIML) to improve climate management. Her own work on MOSAIKS demonstrates the potential for these data and predictive models to increase the spatial coverage and resolution of survey and administrative georeferenced data, while lowering barriers to access for decision-makers in low-resource settings. Similarly, Dieter Wang showcased how higher-resolution and higher-frequency satellite imagery alongside cloud-penetrating sensors can improve estimates of how well conservation policies in the Brazilian Amazon are preventing deforestation. Better measurement in this case makes it possible to reward governments through bonds whose rates are tied to mitigation performance. Kangogo Sogomo discussed a novel approach that leverages satellite imagery to predict maize yields at a finer scale with fewer computational resources.

Since 2010, new satellites have come online that increase both sensor resolution and cloud-free revisit rate. These advances provide researchers with more granular and frequent imagery data to incorporate into their analyses. Credit: Burke et al., Science 2021.

Of course, remote sensing is not just limited to satellites and can inform adaptation and resilience alongside mitigation.
Samuel Seo, for example, compared measurement strategies for methane emissions from a large, unmanaged landfill in Dakar, Senegal, by collecting data using human enumerators, drones, and satellites. Across the board, these measurements suggest that current approaches used by the IPCC underestimate total emissions from these sites by more than half. Bridget Hoffman instead used low-cost air pollution sensors along bus routes and within buses in Dakar to understand the effects of an infrastructure project on air quality. Drones, stationary sensors, and other instruments can all provide rich data at scale to improve the evaluation and monitoring of climate mitigation and adaptation strategies.

The Role of Participatory Data Collection

Researchers and climate practitioners not only think creatively about the sensors they use to collect data, they also innovate data collection and its infrastructure to make it more participatory. Kangogo Sogomo noted increasing mobile phone use and internet penetration across the global South, suggesting, "climate action is urgent… there is still an opportunity for having participatory methods [for data collection]." Tom Bewick, for example, has trained indigenous communities in Africa and Latin America to collect georeferenced data on planted trees to improve the monitoring of their growth and local collective governance. Similarly, Kenneth Mubea, who works to conserve mangrove forests, discussed how his research assembled teams of students to work with local communities to collect georeferenced data. Participatory approaches can extend to model validation, as in the case of Alejandra Mortarini, who worked with organizations that have long-standing relationships with communities living in informal settlements in Honduras to help validate the outputs of a predictive model and calibrate it to improve its performance.
By incorporating local actors into data collection efforts, we can increase its frequency, provide greater access, and contribute to a local culture of evidence use.

New Approaches to Data Integration

A third strategy to make data collection cheaper and more effective relies on exploiting efficiencies generated by integrating different data streams. The World Bank's Stéphane Hallegatte stressed the opportunity of integrating different data sources in his remarks.

"We have all this fantastic progress in measurement with remote sensing and big data, we have these household surveys that are playing an absolutely critical role to measure what we are doing and to prioritize," said Hallegatte. "One of the big challenges is to make them completely interlinked and to flow smoothly from the spatial to household surveys, and have household surveys that can be more flexible when there is a shock that can use data coming from satellites to maybe focus and do dedicated surveys in places that have been affected by a shock."

In particular, Hallegatte stressed that traditional measures of vulnerability may lead us to miss some individuals who may be critically underprepared to face the "long tails" of climate shocks. Adaptive research designs can help us understand which interventions work best in particular contexts and communities, improving our understanding of how climate systems affect those who are socioeconomically and environmentally most vulnerable and how we may build resilience together.

Hallegatte stressed how different metrics of climate vulnerability can lead policy makers to prioritize different areas. Here we see how four different risk indicators — annual asset risk, annual consumption poverty increase, socioeconomic resilience, and annual well-being risk — map onto the Philippines. Source: Hallegatte 2023.

Paola Agostini, Mohammed Basheer, and Erwin Knippenberg simulated physical and social systems in their research designs.
These simulations enabled each of them to estimate new quantities of interest, like the decision-space of negotiations for potential dam designs in the Nile River Basin, the cost-per-benefit of different land restoration interventions in Tajikistan, or the percentage of the population at risk of falling into poverty due to weather shocks in Afghanistan. Ben Brunckhorst showed how the incorporation of weather predictions unlocks the possibility of anticipatory cash transfers, with demonstrable effects on household resilience to flooding in Bangladesh.

Through better measurement we can improve our collective efforts to meet the challenge of climate change. As Hallegatte reminded us in his keynote remarks, how we construct these measures of impact fundamentally affects what regions, communities, and interventions we prioritize. A critical part of this effort will be to leverage measurement strategies highlighted during MeasureDev 2023 to channel resources to the places and communities where interventions to mitigate and adapt to climate change will have the greatest impact. In so doing, measurement can contribute to a more equitable future by incentivizing green investments in LMICs.
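The satellite imagery and machine learning (SIML) approach mentioned in this post can be sketched in miniature: in the spirit of MOSAIKS, fixed random features are extracted from imagery and a ridge regression maps them to a ground-truth label. Everything below is synthetic stand-in data, not the MOSAIKS pipeline itself, and the feature sizes and penalty are arbitrary choices.

```python
# Hedged sketch of a random-features-plus-ridge-regression pipeline.
# Real SIML work would use actual satellite patches and survey labels;
# here both are random stand-ins.
import numpy as np

rng = np.random.default_rng(0)
n_patches, n_pixels, n_features = 200, 64, 32

X_img = rng.normal(size=(n_patches, n_pixels))   # fake flattened patches
W = rng.normal(size=(n_pixels, n_features))      # fixed random weights
feats = np.maximum(X_img @ W, 0)                 # ReLU random features

true_w = rng.normal(size=n_features)
y = feats @ true_w + 0.1 * rng.normal(size=n_patches)  # synthetic label

lam = 1.0                                        # ridge penalty
beta = np.linalg.solve(feats.T @ feats + lam * np.eye(n_features),
                       feats.T @ y)
pred = feats @ beta
r2 = 1 - ((y - pred) ** 2).sum() / ((y - y.mean()) ** 2).sum()
print(round(r2, 3))
```

The appeal of this design is that the feature extraction is computed once and shared, so new outcomes can be predicted by refitting only the cheap ridge step.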