Interactive Visualisation of Sound Objects

Visualise: (vb) to form a mental image or vision of
Cognitive ability
Allows us to internalise data
– Gain insight and understanding
Internal Map = Cognitive Model
5 basic approaches
Indentation
Containment
Node-link diagrams
Clustering
Geographic
Visualisation Techniques / Methods
Indentation
• Tree control
• Fisheye
Containment
Clustering
• Galaxy of News
• ThemeScape
Geographic
• Treemaps
• Floor plans
• ZUIs
• Street maps
Node-link diagrams
• 2D diagrams
• Cone Tree
• Fisheye Cone Tree
• Hyperbolic viewer
• XML3D
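Of the geographic methods listed above, treemaps are the simplest to show in code: each node's rectangle gets an area proportional to its size, and children subdivide their parent's rectangle. A minimal slice-and-dice layout sketch, with a hypothetical node structure (this is an illustration, not the Sonic Browser's implementation):

```python
# Minimal slice-and-dice treemap layout: a node's rectangle area is
# proportional to its "size"; split direction alternates per level.
# The dict-based node structure is an illustrative assumption.

def treemap(node, x, y, w, h, horizontal=True, out=None):
    """Recursively assign a rectangle (x, y, w, h) to every node."""
    if out is None:
        out = []
    out.append((node["name"], (x, y, w, h)))
    children = node.get("children", [])
    if children:
        total = sum(c["size"] for c in children)
        offset = 0.0
        for c in children:
            frac = c["size"] / total
            if horizontal:  # split the rectangle left-to-right
                treemap(c, x + offset * w, y, frac * w, h, False, out)
            else:           # split top-to-bottom on the next level
                treemap(c, x, y + offset * h, w, frac * h, True, out)
            offset += frac
    return out

sounds = {"name": "sounds", "size": 4, "children": [
    {"name": "impacts", "size": 3},
    {"name": "friction", "size": 1},
]}
rects = treemap(sounds, 0, 0, 100, 100)
```

Containment methods like this trade precise tree topology for an at-a-glance view of relative collection sizes.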
Sonic Browser Interface
Sonic Browser
• Visual
• Auditory
• Actions
• Psychophysical
• Cataloguing
Tool for Psychophysics
Visual Encoding
Auditory Encoding
• Conduct simple experiments
– Based on psychophysical scaling
– 1- to 4-dimension comparison of sounds (X, Y, Shape, Colour)
– Supplements traditional discrimination methods by providing increased participant motivation
• The scaling of natural sounds vs. the model sounds
– Do they scale?
– Might validate sound models in relation to recordings of real sounds
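The 1- to 4-dimension comparison above can be sketched as a mapping from scaled sound attributes onto the slide's four visual dimensions (X, Y, Shape, Colour). The attribute names and value ranges here are illustrative assumptions, not the tool's actual schema:

```python
# Map up to four normalised (0..1) sound attributes onto visual dimensions.
# Attribute names (pitch, loudness, brightness, roughness) are assumed
# examples of psychophysically scaled properties.

SHAPES = ["circle", "square", "triangle", "diamond"]
COLOURS = ["blue", "green", "yellow", "red"]

def encode(sound, width=800, height=600):
    """Return a visual encoding: screen position, shape, and colour."""
    x = sound["pitch"] * width              # dimension 1 -> X
    y = (1 - sound["loudness"]) * height    # dimension 2 -> Y (screen y grows downward)
    shape = SHAPES[min(int(sound["brightness"] * len(SHAPES)), len(SHAPES) - 1)]
    colour = COLOURS[min(int(sound["roughness"] * len(COLOURS)), len(COLOURS) - 1)]
    return {"x": x, "y": y, "shape": shape, "colour": colour}

glass = {"pitch": 0.8, "loudness": 0.5, "brightness": 0.9, "roughness": 0.2}
encoding = encode(glass)
```

In an experiment, participants would compare a natural sound and a model sound placed in this space and judge whether they scale consistently.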
Actions
Bouncing Balls
Breaking Glass
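For the bouncing-ball action, the temporal pattern alone is characteristic: with a restitution coefficient e, each rebound arc is shorter than the last by a factor of e, so the impacts audibly "speed up". A physics sketch with illustrative parameter values (not the project's actual impact model):

```python
import math

# Impact times for a ball dropped from height h0 with restitution e.
# Each rebound height is e^2 times the previous one, so each flight
# time shrinks by a factor of e. Parameters are illustrative.

def bounce_times(h0=1.0, e=0.7, g=9.81, n=6):
    """Return the times of the first n impacts, in seconds."""
    t = math.sqrt(2 * h0 / g)               # time of the first impact
    times = [t]
    flight = 2 * e * math.sqrt(2 * h0 / g)  # duration of the first rebound arc
    for _ in range(n - 1):
        t += flight
        times.append(t)
        flight *= e                          # each arc shorter by factor e
    return times

ts = bounce_times()
intervals = [b - a for a, b in zip(ts, ts[1:])]
# successive intervals shrink geometrically by the factor e
```

Driving an impact sound model with this timing pattern is what makes a sequence of clicks read as "a ball bouncing" rather than unrelated taps.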
Cataloguing
Tool for Cataloguing
Visual Encoding
Auditory Encoding
• Utilise traditional IR methods such as indexing and filtering
– Controlled by text filters of meta-information and by alphasliders of sound properties
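The combination described above, free-text filters on metadata plus alphaslider-style range filters on numeric sound properties, can be sketched as a single predicate over a catalogue. The field names here are assumptions about the catalogue schema, for illustration only:

```python
# Combine a free-text metadata filter with (lo, hi) range filters
# standing in for alphasliders. Field names ("meta", "duration", "pitch")
# are illustrative assumptions.

def filter_sounds(catalogue, text="", ranges=None):
    """Return metadata of entries matching the text filter and all ranges."""
    ranges = ranges or {}
    hits = []
    for entry in catalogue:
        if text and text.lower() not in entry["meta"].lower():
            continue  # text filter on meta-information
        if all(lo <= entry[prop] <= hi for prop, (lo, hi) in ranges.items()):
            hits.append(entry["meta"])  # every slider range satisfied
    return hits

catalogue = [
    {"meta": "glass breaking, kitchen", "duration": 1.2, "pitch": 0.8},
    {"meta": "ball bouncing, wood floor", "duration": 3.5, "pitch": 0.4},
    {"meta": "glass clink, bar", "duration": 0.3, "pitch": 0.9},
]
glass_hits = filter_sounds(catalogue, text="glass", ranges={"duration": (0.0, 2.0)})
```

In the interface, each slider movement would re-run this predicate and update the visible set of sound objects immediately.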
Actions
• Combine sound retrieval methods such as
– Automatic classification & clustering among several possible AIR methods
– Cue-point analysis and browsing of sound resources
– Supplement with results of psychophysical scaling
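Automatic clustering, one of the AIR methods mentioned above, can be illustrated with plain k-means over per-sound feature vectors. The choice of features (duration, normalised brightness) is an assumption for the sketch, not the project's actual method:

```python
import random

# Plain k-means clustering of sounds by feature vectors, e.g.
# (duration in seconds, normalised brightness). Illustrative only.

def kmeans(points, k=2, iters=20, seed=0):
    """Partition points into k clusters by nearest squared distance."""
    rng = random.Random(seed)
    centres = rng.sample(points, k)
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            i = min(range(k),
                    key=lambda i: sum((a - b) ** 2 for a, b in zip(p, centres[i])))
            clusters[i].append(p)
        # recompute each centre as the mean of its cluster (keep old if empty)
        centres = [tuple(sum(c) / len(cl) for c in zip(*cl)) if cl else centres[i]
                   for i, cl in enumerate(clusters)]
    return clusters

# short/bright impacts vs long/dull rolls, as (duration, brightness)
features = [(0.2, 0.9), (0.3, 0.8), (2.5, 0.2), (3.0, 0.3)]
clusters = kmeans(features, k=2)
```

Cluster membership could then drive the spatial layout, so that perceptually similar sounds sit near each other in the browser.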
Domain/Task Focus
• Work is user-centred, informed by ethnography
• Trying to understand how real people might work with sound objects
– E.g. Foley artists, composers, HCI designers
• Simplify user navigation to sound objects
• Find and manipulate sound objects
• Design visual aspects
• Design auditory aspects
• Evaluate effectiveness & quality of use with respect to real working contexts
My SOHO Objectives
• Perform a pilot experiment (3–5 participants) comparing natural sounds versus sound models in a psychophysics scenario
• Investigate the results and use them to complement future work on sound models
• Gain insight and feedback from participants on current and future dimensions of this research
• Provide a roadmap for the production of a tool for navigating sound model collections
Grey Areas / Dimensions
• How to map parameters from the Sonic Browser to the Pure Data models in real time?
• How to do this for multiple models or model instances simultaneously (as when under the Aura)?
• How to catalogue dynamic models, such as the impact model, with multiple variable parameters?
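One candidate answer to the first question is to stream parameter updates to Pd over TCP, where a [netreceive] object in the patch parses FUDI messages (space-separated atoms terminated by ";"). The receiver name, parameter names, and port below are assumptions about a hypothetical patch, not part of the Sonic Browser:

```python
import socket

# Stream browser parameters to a Pd patch listening with [netreceive 3000].
# Pd's FUDI wire format is "atom atom ... ;\n". The model/parameter names
# and port number are illustrative assumptions.

def fudi(*atoms):
    """Format one FUDI message as accepted by Pd's netreceive."""
    return " ".join(str(a) for a in atoms) + ";\n"

def send_params(host, port, model, **params):
    """Send each (parameter, value) pair as 'model param value;' to Pd."""
    with socket.create_connection((host, port)) as s:
        for name, value in params.items():
            s.sendall(fudi(model, name, value).encode("ascii"))

# e.g. as the cursor passes over an object under the aura:
# send_params("localhost", 3000, "impact1", hardness=0.7, size=0.3)
```

For multiple simultaneous instances, each model under the aura could be addressed by its own receiver name over the same connection, though rate-limiting the updates would likely be needed to keep the audio thread responsive.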