ETD Guide/Students/Integrating multimedia elements
Inaccessible, dead media has little value. If you are incorporating moving images, audio, or live data streams into your ETD, do not underestimate the work involved in managing these resources. How they are created, and the form and format they are created in, will determine how your ETD can be managed, used, preserved, and even re-used in the future.
When your ETD enters a networked electronic environment, it does not exist in isolation. It becomes part of a boundless resource space in which descriptions of the work and its components (metadata) need to be in an internationally recognized form if the work is to be accessible.
Interoperable standards that allow this metadata to be understood by everyone (even machines) are now available. However, applying these standards requires a new form of collaborative relationship between you, the creator, and indexing initiatives such as the NDLTD.
This means that you, the creator, have to take responsibility for describing each layer or 'object' of content as an independent entity capable of being accessed and manipulated in its own right. From its conception, a compound digital resource should be seen as a composition of objects in an encoding architecture that expresses the spatial and temporal relationships between these objects.
- These objects may be audio (mono or stereo) and visual (2D or 3D) or text.
- They may also be composed from several sources.
- They may be simultaneously acquired, processed, transmitted and used in real time.
If, in the future, the encoding of these objects is to be decoded, the metadata must include format information. In the interests of interoperability, it is good practice to select formats from the list of Internet Media Types (MIME values) whenever possible; the list is a registry with a procedure for adding new types when necessary. Available [on-line] http://www.isi.edu/in-notes/iana/assignments/media-types/media-types/
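As a sketch of what such format metadata might look like, the fragment below describes a single multimedia object using Dublin Core elements, with the format recorded as a registered MIME value. The title and file name are invented for illustration; your institution or the NDLTD may prescribe a different metadata schema.

```xml
<!-- Hypothetical Dublin Core description of one multimedia object.
     The dc:format value is a registered Internet Media Type (MIME). -->
<metadata xmlns:dc="http://purl.org/dc/elements/1.1/">
  <dc:title>Chapter 3 experiment footage</dc:title>
  <dc:type>MovingImage</dc:type>
  <dc:format>video/mpeg</dc:format>
  <dc:identifier>etd-video-ch3.mpg</dc:identifier>
</metadata>
```

Because the format element carries a registry value rather than a free-text description, future software can decide unambiguously how to decode the object.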
An important (proposed) standard that multimedia content providers should investigate is the Synchronized Multimedia Integration Language (SMIL). This standard will allow hypertext creators to define and synchronize multimedia elements (video, sound, still images) for web presentation and interaction. Available [on-line] http://www.w3.org/AudioVideo/
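As an illustration of how SMIL expresses the temporal relationships discussed above, the sketch below plays a video clip and a narration track in parallel, then shows a still image. All file names are hypothetical, and real presentations would need layout and timing tuned to the content.

```xml
<!-- Minimal SMIL sketch: video and narration run in parallel (par),
     followed in sequence (seq) by a still image.
     All source file names are hypothetical. -->
<smil>
  <head>
    <layout>
      <root-layout width="320" height="240"/>
      <region id="main" left="0" top="0" width="320" height="240"/>
    </layout>
  </head>
  <body>
    <seq>
      <par>
        <video src="experiment.mpg" region="main"/>
        <audio src="narration.wav"/>
      </par>
      <img src="results-chart.png" region="main" dur="5s"/>
    </seq>
  </body>
</smil>
```

The par and seq containers are what make the synchronization explicit: elements inside par begin together, while elements inside seq play one after another.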
Next Section: Providing metadata – inside, outside documents

Last modified on 18 June 2009, at 04:48