3️⃣

Scenario-Based Design to Envision How GenAI Can Support Sound Design Practices [SMACC LAB]

Interfaces for storyboard creation and chord assignment: A) Storyboard theme input; B) Storyboard phase theme input; C) Image upload for phase description; D) Add phases button; E) Initiate storyboard ideation (phase count option); F) Further inquiry button; G) Image generation button; H) Progress bar; I) Checkbox for chord assignment; J) Chord input; K) Initiate chord generation (phase count, function, tonal key options); L) Further inquiry button; M) Note components button; N) Initiate note arrangement; O) Play rhythmic style button; P) Play chord button.

Sound design involves crafting communicative sonic interfaces such as warning and confirmation signals. An interactive sound design method based on musical tension and release, supported by a dedicated tool, has been explored for practitioners without musical expertise; however, the tool still requires further development, particularly regarding recommended samples and translated musical knowledge. This paper reports a project that investigates the use of generative AI applications in sonic interaction design through practitioner-based scenarios. Drawing on a prior study of sound and UX designers, we present two case studies that highlight domain challenges, develop two practitioner scenarios from these contexts, and create a prototype of an AI-powered sound design tool with four GenAI applications and four supporting features tailored to the scenarios. We conclude with reflections on the overall design process and the prototype. This study contributes by extending conceptual domain needs into a tangible AI-driven interface through scenario-based design.

Under Review for Australian Conference on Human-Computer Interaction 2024 🤔