We’re happy and we know it : Documentary: Data: Montage

Dovey, J. and Rose, M. (2012) We’re happy and we know it : Documentary: Data: Montage. Studies in Documentary Film, 6 (2). pp. 159-173. ISSN 1750-3280 Available from: http://eprints.uwe.ac.uk/17041



This article is concerned with the social praxis of documentary in the sea of ‘ubiquitous data’ that is both a consequence and a driver of online social mediation. The topic is given importance by the morphing character of video in the context of the latest web coding language, HTML5. Until now, web video has been impervious to its networked context, reproducing the conditions of the TV screen in a hypermediated space. Now existing databases and live information drawn from social media can be connected to the documentary environment, offering opportunities for the production of new kinds of knowledge and application. The affordances of networked connectivity offer the potential to re-contextualise documentary material by mobilising the enormous co-creative potential of human discourse captured in the web. The challenge in these marriages of mass media form and rhizomatic network is to find new ways of shaping attention into a coherent experience. To do so we have to re-invent the social praxis of documentary, creating new visual and informational grammars.

Item Type: Article
Uncontrolled Keywords: web, data, digital, documentary, interactive, online, semantic
Faculty/Department: Faculty of Arts, Creative Industries and Education > School of Art and Design
Faculty of Arts, Creative Industries and Education > School of Film and Journalism
Depositing User: M. Rose
Date Deposited: 26 Jul 2012 11:53
Last Modified: 11 Aug 2018 23:52
URI: http://eprints.uwe.ac.uk/id/eprint/17041

