The study of digital marketing communications on the Web is inseparable from the study of the metadata with which the web content of these communications is, or should be, described. This article examines how digital marketing communication functions and is understood in the interactive, interconnected hypermedia environment of the Web, on the levels of both text and metadata. The article argues that the texts of digital marketing communications on the Web should be perceived (and created) both as a collective action and from the point of view of their function as digital objects.
Privacy protections against government surveillance are often scoped to communications content and exclude communications metadata. In the United States, the National Security Agency operated a particularly controversial program, collecting bulk telephone metadata nationwide. We investigate the privacy properties of telephone metadata to assess the impact of policies that distinguish between content and metadata. We find that telephone metadata is densely interconnected, can trivially be reidentified, enables automated location and relationship inferences, and can be used to determine highly sensitive traits.
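To make the relationship-inference claim concrete, the following is a minimal sketch, not drawn from the paper's actual pipeline: given hypothetical call detail records containing only caller, callee, start time, and duration, repeated and late-night calling patterns already rank likely close relationships without touching any content.

```python
# Minimal sketch of relationship inference from telephone metadata alone.
# The call detail records below are hypothetical and for illustration only.
from collections import Counter
from datetime import datetime

# Hypothetical call detail records: (caller, callee, start time, seconds).
cdrs = [
    ("+15550001", "+15550002", "2013-06-01T08:05", 310),
    ("+15550001", "+15550002", "2013-06-01T22:41", 1200),
    ("+15550001", "+15550003", "2013-06-02T09:15", 45),
    ("+15550002", "+15550001", "2013-06-03T23:02", 960),
]

edge_calls = Counter()    # how often each pair talks
edge_seconds = Counter()  # total time each pair talks
late_night = Counter()    # off-hours calls often signal close ties

for caller, callee, start, seconds in cdrs:
    edge = tuple(sorted((caller, callee)))  # treat relationships as undirected
    edge_calls[edge] += 1
    edge_seconds[edge] += seconds
    hour = datetime.fromisoformat(start).hour
    if hour >= 22 or hour < 6:
        late_night[edge] += 1

# Rank pairs by call frequency; duration and timing refine the picture.
for edge, n in edge_calls.most_common():
    print(edge, "calls:", n, "total_s:", edge_seconds[edge],
          "late_night:", late_night[edge])
```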
This research tested the conventional wisdom that interactive voice response (IVR) calls, also known as robocalls or auto calls, are not listened to by receivers. The study found that three out of four people (75%) listen to over 19 s of a message, which equates to over 40 words. The vast majority of people, 97%, listen to at least 6 s. This innovative, unobtrusive field approach to measuring actual listening time eliminates self-report bias. The unique data set, provided by a third-party vendor, consisted of 156 call projects with a total of 389,588 live-answered phone calls from the last week of the 2012 election.
The intelligence leaks from Edward Snowden in 2013 unveiled the sophistication and extent of data collection by the United States' National Security Agency and major global digital firms, prompting domestic and international debates about the balance between security and privacy, openness and enclosure, accountability and secrecy. It is difficult not to see a clear connection between the Snowden leaks and the sharp acceleration of new national security legislation in Australia, a long-term member of the Five Eyes Alliance. In October 2015, the Australian federal government passed controversial laws that require telecommunications companies to retain the metadata of their customers for a period of two years. The new acts pose serious threats to the profession of journalism, as they enable government agencies to easily identify and pursue journalists' sources. Bulk collection of this type of information deters future whistleblowers from approaching journalists, making the performance of the latter's democratic role a challenge. After situating this debate within the scholarly literature at the intersection of surveillance studies and communication studies, this article discusses the political context in which journalists operate in Australia, assesses how metadata laws have affected journalism practices, and addresses the possibility for resistance.
Brief Description: The Technical Communication Body of Knowledge (TCBOK) is a landmark project by the Society for Technical Communication (STC) to establish a body of disciplinary knowledge for technical communicators. The initiative has its roots in connecting academics and practitioners and in professionalizing technical communication (TC). Purpose: This report is aimed at infrastructural inversion, a way of externalizing the architecture and organization of the TCBOK classification system. Infrastructural inversion can help us find problems that are obscured from the surface and see existing problems with fresh eyes. Method: This article focuses primarily on the practical politics, materiality and texture, indeterminacy of knowledge, and ubiquity of the controlled systems that are intrinsic to the TCBOK (Bowker and Star, 1999). Results: The TCBOK reflects the current political and ethical environment of both the society and the profession. Decisions regarding the TCBOK's design and development can support communities of practice that work toward professional consciousness and the professional status of TC. The TCBOK provides a place to negotiate that professional consciousness through user mediation. Conclusion: The TCBOK allows the STC to govern the profession of TC. The core elements of the TCBOK, its strongest premises, validate or reject discourse through social elitism. This governing isn't necessarily bad, but it can be dangerous: a body of knowledge without governance risks unruliness, while over-governance risks professional inequality through exclusion. Viewing the structure of a controlled system through a critical lens can identify overlooked problems, improve meta-cognition through methodology, and establish a vocabulary for critical analysis through metaphor and genre.
In June 2013, two classified National Security Agency (NSA) collection programs received increased media attention following unauthorized disclosures of classified documents by a contractor working for the NSA. Under one program, the NSA collects domestic telephone metadata (i.e., call records) in bulk. Under the other program, implemented under the Foreign Intelligence Surveillance Act (FISA), the government collects the contents of electronic communications, including telephone calls and emails, where the target is reasonably believed to be a non-U.S. person located outside the United States.
Research is now digital in execution and in reporting and preservation. The end-to-end process of research – from idea to proposal to funded project to outputs – is supported by ICT (Information and Communication Technologies). The various entities of the research domain require digital description to facilitate discovery, contextualization (for relevance and quality, but also covering rights, costs, security, and privacy restrictions), and action. Metadata is the magic key for this: starting with datasets, metadata now describes software, workflows, persons, organisations, equipment, computing resources, and more. Metadata must have formal syntax and declared (multilingual) semantics. Most existing metadata 'standards' do not meet these criteria, but many can be interconverted to a subset of a canonical form that does, permitting interoperation. CERIF (Common European Research Information Format: a European Union Recommendation to Member States) is a widely used canonical data model that meets these objectives. RDA (Research Data Alliance) is evolving a list of metadata elements to be recommended to support the operations described above; the set accords well with CERIF and, like CERIF, is a superset of other metadata 'standards'. The philosopher's stone was reputed to turn base substances into valuable ones; the Rosetta Stone permitted multilinguality. Metadata has both properties. Keith Jeffery is an independent consultant and past Director of IT at STFC Rutherford Appleton Laboratory (http://www.stfc.ac.uk/about-us/where-we-work/rutherford-appleton-laboratory/), with 360,000 users, 1100 servers, and 140 staff. Keith holds 3 honorary visiting professorships, is a Fellow of the Geological Society of London and the British Computer Society, is a Chartered Engineer and Chartered IT Professional, and is an Honorary Fellow of the Irish Computer Society. Keith is past President of ERCIM and past President of euroCRIS, and serves on international expert groups, conference boards, and assessment panels. He has advised government on security and ...
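As an illustration of the interconversion the abstract describes, here is a minimal sketch of mapping a Dublin Core-style record onto a small canonical subset. The canonical field names are assumptions for illustration; CERIF's real entity-relationship model is far richer than this toy target.

```python
# Minimal sketch of metadata interconversion: mapping a Dublin Core-style
# record onto a small canonical subset. The canonical element names here
# are assumptions for illustration, not CERIF's actual schema.

# A legacy record using Dublin Core-like element names.
dc_record = {
    "dc:title": "Metadata Interoperation for Research Infrastructures",
    "dc:creator": ["Doe, J."],
    "dc:date": "2015",
    "dc:language": "en",
}

# Crosswalk table: legacy element -> canonical element (assumed names).
CROSSWALK = {
    "dc:title": "result_title",
    "dc:creator": "person_names",
    "dc:date": "publication_year",
    "dc:language": "language_code",
}

def to_canonical(record: dict) -> dict:
    """Convert a legacy record to the canonical subset, keeping
    unmapped elements aside so no information is silently lost."""
    canonical, unmapped = {}, {}
    for key, value in record.items():
        if key in CROSSWALK:
            canonical[CROSSWALK[key]] = value
        else:
            unmapped[key] = value
    return {"canonical": canonical, "unmapped": unmapped}

print(to_canonical(dc_record))
```

Interoperation then happens over the canonical subset: two systems that each provide such a crosswalk can exchange records without agreeing on each other's native schema.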
Legal frameworks exist within democracies to prevent the misuse and abuse of personal data that law enforcement authorities obtain from private communication service providers. The fundamental rights to respect for private life and the protection of personal data underpin this framework within the European Union. Accordingly, the protection of the principles and safeguards required by these rights is key to ensuring that the oversight of State surveillance powers is robust and transparent. Furthermore, without the robust scrutiny of independent judicial review, the principles and safeguards guaranteed by these rights may become more illusory than real. Following the Edward Snowden revelations, major concerns have been raised worldwide regarding the legality, necessity and proportionality standards governing these laws. In 2014, the highest court in the EU struck down the legal framework that imposed a mandatory duty on communication service providers to undertake the mass retention of metadata for secret intelligence and law enforcement authorities across the EU. This article considers the influence of the Snowden revelations on this landmark judgment. Subsequently, the analysis explores the significance of this ruling for the future reform of EU law governing metadata surveillance and its contribution to the worldwide debate on indiscriminate and covert monitoring in the post-Snowden era.
Research is becoming increasingly digital, interdisciplinary, and data-driven, and affects environments beyond academia, such as industry and government. Research output representation, publication, mining, analysis, and visualization are being taken to a new level, driven by the increased use of Web standards and digital scholarly communication initiatives. The number of scientific publications produced by new players and the increasing digital availability of scholarly artifacts and their associated metadata are further drivers of the substantial growth in scholarly communication. The heterogeneity of scholarly artifacts and of their metadata, spread over different Web data sources, poses a major challenge for researchers with regard to search, retrieval, and exploration. For example, it has become difficult to keep track of relevant scientific results, to stay up to date with new scientific events and running projects, and to find potential future collaborators. Assisting researchers with a broader integration, management, and analysis of scholarly metadata can thus lead to new opportunities in research and to new ways of conducting research. The data integration problem has been extensively addressed by communities in the Database, Artificial Intelligence, and Semantic Web fields. However, a share of the interoperability issues is domain-specific, and new challenges with regard to schema, structure, or domain arise in the context of scholarly metadata integration. Thus, a method is needed to support scientific communities in integrating and managing heterogeneous scholarly metadata in order to derive insightful analyses (e.g., quality assessment of scholarly artifacts). This thesis tackles the problem of scholarly metadata integration and develops a life cycle methodology to facilitate the integrated use of different methods, analysis techniques, and tools for improving scholarly communication. Some key steps of the metadata life cycle are implemented using a collaborative platform, which keeps the research communities in the loop. In particular, the use of collaborative methods is beneficial for the acquisition, integration, curation, and utilization of scholarly metadata. We conducted empirical evaluations to assess the effectiveness and efficiency of the proposed approach. Our metadata transformation from legacy resources achieves reasonable performance and results in better metadata maintainability. The interlinking of metadata enhances the coherence of scholarly information spaces both qualitatively and quantitatively. Our metadata analysis techniques provide a precise quality assessment of scholarly artifacts, taking into account the perspectives of multiple stakeholders while maintaining compatibility with existing ranking systems. These empirical evaluations and the concrete applications, with a particular focus on collaborative aspects, demonstrate the benefits of integrating distributed scholarly metadata.
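A minimal sketch of the kind of transformation and interlinking the thesis describes, using assumed record shapes rather than the thesis's actual pipeline: legacy publication records from two sources are normalized into one canonical shape, and author entries that refer to the same person across sources are linked via a shared identifier.

```python
# Minimal sketch (assumed record shapes, not the thesis's actual pipeline):
# normalize legacy publication records from two sources, then interlink
# author entries that share a normalized name across the sources.

def normalize_name(name: str) -> str:
    """Crude matching key: lowercase, strip punctuation, ignore word order."""
    parts = name.replace(",", " ").replace(".", " ").lower().split()
    return " ".join(sorted(parts))

# Two legacy sources with different schemas.
source_a = [{"title": "Scholarly Metadata at Scale", "author": "Doe, Jane"}]
source_b = [{"paper_name": "Linking Research Artifacts", "creator": "Jane Doe"}]

# Step 1: transform both into one canonical shape.
records = (
    [{"title": r["title"], "author": r["author"], "source": "A"} for r in source_a]
    + [{"title": r["paper_name"], "author": r["creator"], "source": "B"} for r in source_b]
)

# Step 2: interlink -- assign one identifier per normalized author name.
author_ids = {}
for rec in records:
    key = normalize_name(rec["author"])
    author_ids.setdefault(key, f"person:{len(author_ids) + 1}")
    rec["author_id"] = author_ids[key]

for rec in records:
    print(rec)  # both records now share the same author_id
```

Real pipelines would of course use more robust entity resolution than a name key, but the two steps, schema transformation followed by cross-source linking, mirror the life cycle stages the abstract evaluates.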
Nowadays, digital communications are pervasive and, as such, carry a huge amount of both professional and private information all around the world. Given the knowledge that can be extracted from such information, its confidentiality is of utmost importance for both companies and individuals. Recent news of massive breaches of privacy by external actors, such as government agencies and rogue teams, and by internal actors, such as communication service providers (e.g., Google, Apple, Facebook, Microsoft, Amazon), has heightened the need for more secure communication technologies. Although message content can be encrypted end-to-end by so-called off-the-record techniques, message metadata such as sender, recipient, time sent, and size can still leak a lot of information about the communicating parties. Oblivious RAM (ORAM) systems form a promising new branch of research for hiding such metadata from the hosting servers, but they have not yet been deployed in production environments. Due to their complexity and performance penalty, they can currently be used only for very simple client-server applications such as instant messaging (IM). In this context, we show that accessing metadata on a messaging server can leak information that could be concealed by ORAM systems. More specifically, we show the differences observed in metadata collection between a classic XMPP server and two ORAM-based servers. In order to assess those systems, we have designed a new attack based on live forensic techniques to retrieve metadata from the RAM of a running IM server. We have used two datasets of instant messages for carrying out this assessment. Our experimental attack setup can highlight the leak of metadata from a standard messaging server and can also be used for testing the security of an ORAM-based messaging server.
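To make the XMPP side of this concrete: even when the message body is end-to-end encrypted, the stanza envelope that a classic XMPP server routes still exposes sender, recipient, time, and size. The following is a minimal sketch (a hypothetical stanza, not the paper's forensic tooling) of reading that envelope metadata directly.

```python
# Minimal sketch: extracting envelope metadata from an XMPP message stanza.
# The stanza below is hypothetical; its 'from'/'to' attributes and opaque
# body mirror what a classic XMPP server handles in the clear, even when
# the payload itself is end-to-end encrypted.
import xml.etree.ElementTree as ET
from datetime import datetime, timezone

stanza = (
    '<message from="alice@example.org/phone" to="bob@example.org" type="chat">'
    "<body>BASE64-CIPHERTEXT-PLACEHOLDER</body>"
    "</message>"
)

elem = ET.fromstring(stanza)
metadata = {
    "sender": elem.get("from"),          # visible to the routing server
    "recipient": elem.get("to"),         # visible to the routing server
    "observed_at": datetime.now(timezone.utc).isoformat(),
    "size_bytes": len(stanza.encode()),  # size leaks even with encryption
}
print(metadata)
```

An ORAM-based server aims to deny exactly this view: access patterns are shuffled so that neither the stored records nor their in-memory traces reveal who is talking to whom.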