
Wireless sensor networks for art & entertainment applications


Presentation Transcript


  1. Wireless sensor networks for art & entertainment applications Center for Embedded Networked Sensing April 18, 2003 Jeff Burke UCLA School of Theater, Film and Television jburke@hypermedia.ucla.edu

  2. HyperMedia Studio Background • Founded in 1997 by film professor Fabian Wagmister. • Located in a re-assigned television studio and edit rooms in the UCLA Department of Film, Television and Digital Media. Behind the Bars (1999) Iliad Project Team (2002)

  3. Goals Invocation and Interference (2001) • Investigate the impact of emerging technologies on traditional production of theater, film and television. • Explore new work inspired and enabled by their unique capabilities and qualities. • Build tools that help creators, educators, and students explore novel uses of instrumented environments. Iliad Project Rehearsal (2002)

  4. Arena • Installation artworks • Live performance • Educational spaces • Film production Behind the Bars (1999) Macbett (2001)

  5. Outline • Background • Application examples • Live performance • Installation artworks • Film production • Research topics • Collaboration • Research • Curriculum

  6. Ubicomp as a catalyst for creation • Ubiquitous computing technologies can make action and presence in physical space relevant to computing, networks, and media. • Process • Presence • Context

  7. Art as a laboratory for ubicomp • Controlled slices of the real world • Development / rehearsal process • need to author on-the-fly, in-the-field • Specific performance requirements → different tradeoffs • Challenge of basic assumptions → different perspectives

  8. Live performance • Responsive stage environments • Arizona State University’s Intelligent Stage (Lovell, et al.) • University of Georgia (Saltz) • MIT Media Lab (Sparacino, et al.) • Audience interaction / incorporation • Carnegie Mellon ETC • Blast Theory (UK) • Many other companies Fahrenheit 451 (2000)

  9. Macbett Directed by Adam Shive, Interactive systems by Jeff Burke (2001) • Department of Theater subscription-series production. • Large-scale theatrical lighting and sound controlled in response to actor movement, as sensed by an off-the-shelf wireless tracking system. • Team of one graduate student and four undergraduate programmers, as well as the entire design team, cast, and crew.

  10. Macbett • Seven networked workstations and servers communicated via UDP for • Sensor management, • Feature calculation, • Lighting control, • Sound control, • “Authoring” of interactive relationships, • System monitoring.
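
As a concrete illustration of the UDP plumbing slide 10 describes, here is a minimal sketch of one workstation sending a control value to another. The host address, port, and "path=value" payload format are assumptions for illustration only; the actual Macbett wire protocol is not described in these slides.

    import java.net.DatagramPacket;
    import java.net.DatagramSocket;
    import java.net.InetAddress;
    import java.nio.charset.StandardCharsets;

    // Sketch: send one control message over UDP, e.g. from the
    // feature-calculation workstation to the lighting workstation.
    // Host, port, and payload format are illustrative assumptions.
    public class ControlSender {
        public static void main(String[] args) throws Exception {
            String command = "lights.ML001.intensity=0.75";        // hypothetical message
            byte[] payload = command.getBytes(StandardCharsets.UTF_8);
            InetAddress host = InetAddress.getByName("127.0.0.1"); // stand-in for the lighting host
            try (DatagramSocket socket = new DatagramSocket()) {
                socket.send(new DatagramPacket(payload, payload.length, host, 7000)); // hypothetical port
            }
        }
    }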

  11. Macbett

  12. The Iliad Project Architecture: Jeff Burke, Jared Stein / Research direction: Fabian Wagmister, Edit Villarreal, Jose Luis Valenzuela • An ongoing research project that involves the simultaneous development of an original theatrical script, design and acting technique, and technology. • Emerging focus: customization of the script and media based on the attending audience.

  13. The Iliad Project • Core technologies • Audience database • Radio-Frequency Identification • Geographic Information System • Image capture and manipulation • Text processing • Live video streaming • Middleware – “Kolo”

  14. Live performance • Sensor network research areas • Localization: robust, precise, and scalable • Actor (and audience) ‘state’ • Middleware • Monitoring and realtime feedback • Authoring tools

  15. Middleware for experimentation • Lightweight, consistent access to inputs and outputs. • Very high-level abstractions can be counterproductive, unless they can be built by the author/creator. [Diagram: the 'lights' namespace tree: lights.1.intensity, lights.2.intensity, and lights.ML001 with intensity, colorC, colorM, colorY, focus, pan, and tilt. lights.ML001.colorM is the magenta level of moving light #1.]
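
A minimal sketch of the dotted-path addressing idea from this slide, using a flat map keyed by paths like lights.ML001.colorM. The DeviceRegistry class and its methods are illustrative assumptions, not Kolo's actual API.

    import java.util.HashMap;
    import java.util.Map;

    // Sketch: lightweight, consistent access to device attributes
    // through dotted paths. Class and method names are assumptions.
    public class DeviceRegistry {
        private final Map<String, Double> values = new HashMap<>();

        public void set(String path, double value) { values.put(path, value); }
        public Double get(String path) { return values.get(path); }

        public static void main(String[] args) {
            DeviceRegistry registry = new DeviceRegistry();
            registry.set("lights.ML001.colorM", 0.6);  // magenta level of moving light #1
            registry.set("lights.2.intensity", 1.0);
            System.out.println(registry.get("lights.ML001.colorM")); // prints 0.6
        }
    }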

  16. Middleware for experimentation • First, make input and output devices on the network available in a consistent way. • Let authors define hierarchies and abstractions. [Diagram: the same position data exposed under author-defined hierarchies: tracking.macbett.position.{x, y, z}, or actors.macbett.{x, y, z} and actors.banco.{x, y, z}.]

  17. Middleware for experimentation • Synchronization of actions in the environment without centralized show control. • The 'group' abstraction: e.g. syncCue1, whose members cross device types: lights.ML001 and sound.pb1.left. [Diagram: syncCue1.members pointing into the lights and sound namespace trees.]
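
A minimal sketch of the 'group' idea on this slide: one named group fans a value out to several device paths so that light and sound change together without a central show controller. The class names and the map standing in for the device network are illustrative assumptions.

    import java.util.ArrayList;
    import java.util.HashMap;
    import java.util.List;
    import java.util.Map;

    // Sketch: a group fans one value out to all of its members.
    public class DeviceGroup {
        private final List<String> members = new ArrayList<>();
        private final Map<String, Double> network = new HashMap<>(); // stand-in for real devices

        public void add(String path) { members.add(path); }

        // Write the same value to every member, e.g. a synchronized fade.
        public void set(double value) {
            for (String path : members) network.put(path, value);
        }

        public static void main(String[] args) {
            DeviceGroup syncCue1 = new DeviceGroup();
            syncCue1.add("lights.ML001.intensity");
            syncCue1.add("sound.pb1.left");
            syncCue1.set(0.0); // light and sound cue fire together
        }
    }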

  18. Middleware for experimentation • Managing connections • Support for constructing 'real-time' relationships between data sources and sinks. • Arbitration between competing relationships. • Kolo middleware • Java-based middleware API (device drivers in C/C++) • Scripting language
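
The arbitration bullet suggests a policy for when two real-time relationships drive the same sink. Here is a minimal sketch under one assumed policy (highest priority wins); the Arbiter class is hypothetical, not part of Kolo.

    import java.util.HashMap;
    import java.util.Map;

    // Sketch: arbitration between competing source->sink relationships.
    // Policy assumed here: the highest-priority writer owns the sink.
    public class Arbiter {
        private final Map<String, Integer> owners = new HashMap<>();

        // Returns true if the update was applied to the sink.
        public boolean offer(String sinkPath, double value, int priority) {
            int current = owners.getOrDefault(sinkPath, Integer.MIN_VALUE);
            if (priority < current) return false; // a stronger relationship holds the sink
            owners.put(sinkPath, priority);
            System.out.println(sinkPath + " <- " + value);
            return true;
        }

        public static void main(String[] args) {
            Arbiter arbiter = new Arbiter();
            arbiter.offer("lights.ML001.pan", 0.3, 1); // actor tracking drives the light
            arbiter.offer("lights.ML001.pan", 0.9, 5); // operator override wins
            arbiter.offer("lights.ML001.pan", 0.4, 1); // tracking is now ignored
        }
    }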

  19. Specifying simple spatial relationships

  20. Specifying simple spatial relationships

  21. Specifying simple spatial relationships

  22. Specifying simple spatial relationships

  23. Installation artwork • Non-narrative participatory experiences • Without performance's definitions of performer and audience • Extensions into media-rich educational spaces • Participatory simulations • Embodied and kinesthetic learning hamletmachine (2000-2) cheLA demonstration (2002)

  24. Time & Time Again… Fabian Wagmister and Lynn Hershman (1999) • Distributed interactive installation exploring complex relationships between our increasingly interlinked bodies and machines. • Site-specific work in the Ruhr region of Germany. • Live video streamed into the museum. • Real-time video compositing. • Silhouette image choice controlled by sensors in the installation space. • Robotic, telematic doll streaming images to and being controlled by web users. • History database of video fragments recorded by web visitors.

  25. Time & Time Again… Fabian Wagmister and Lynn Hershman (1999)

  26. Behind the Bars Fabian Wagmister (1999) • Confrontational interactive environment treating Latin America’s history of physical and intellectual oppression. • Premiere: • Central American Film and Video Festival in Nicaragua.

  27. Media-rich educational spaces • Research areas • Adapt a simple wireless ADC platform? • Middleware • Data storage / mining • Space configuration • Authoring tools

  28. Film production • Equipping the film set with a wireless sensor network to support all aspects of production through • Instrumentation • Observation • ‘Augmented footage’ • Decision support • Control • Developing collaboration with CENS faculty.

  29. Sensor networks on the film set • Not just application… laboratory: • ‘Articulated chaos’, • Complex coordination to generate small slices of controlled reality, • Tradition of precise documentation, • Repetitive action, • Rapid, large scale field deployments, • Rigorous test environment already available on our campus (over 100 student films per year).

  30. Terminology • Script • Location • Scene (e.g., Exterior, Day, Suburban Street) • Shot (e.g., Closeup of Heather at the fruit stand) • Take

  31. Three time frames

  32. Editing in post-production

  33. Filmmaking Application Drivers • Primary / initial • Asset management • Continuity management • Secondary / future • Real-time control • General post-production • Special effects

  34. Platform • Wireless articulated imagers • Commercial x86 low-power architecture • Active tags • Low-cost, low-power tags providing RF and/or acoustic localization • Passive RFID tags • Primarily for asset management

  35. Continuity Management • Ensuring repeatable action by the crew and talent. • ‘Script supervision’ is facilitated by several ‘continuity assistants’. • Fine- to coarse-grained documentation: • Squirrel twitches her nose in take 1a, but not in take 1d. • Male bystander correctly on the woman’s right in both shots.

  36. Continuity / Asset Management • A superset of asset management: • Must track on-camera and off-camera items that create the film camera’s field of view for a shot: • Presence and qualities of props, scenery, and actors. • Equipment positions and settings.
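
One way to picture the superset described here is a single observation record tying an item to a scene/shot/take and a position at a given frame. The fields below are illustrative assumptions about what such a continuity database might store.

    // Sketch: one continuity observation. Field choices are assumptions.
    public class ContinuityRecord {
        final String item;                  // e.g. "prop.fruitStand" or "actor.heather"
        final String scene, shot, take;     // script-derived identifiers
        final long frame;                   // timecode-derived absolute frame number
        final double x, y, z;               // estimated position on set, in meters

        ContinuityRecord(String item, String scene, String shot, String take,
                         long frame, double x, double y, double z) {
            this.item = item; this.scene = scene; this.shot = shot; this.take = take;
            this.frame = frame; this.x = x; this.y = y; this.z = z;
        }

        public static void main(String[] args) {
            ContinuityRecord r = new ContinuityRecord(
                "prop.fruitStand", "12", "3", "1a", 41234, 2.1, 0.0, 4.7);
            System.out.println(r.item + " at frame " + r.frame);
        }
    }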

  37. Needs and constraints • Localization with multiple sensing modalities • Synchronization • Privileged viewpoint • Event repetition • The Script – a priori scene knowledge

  38. Needs and constraints • Localization with multiple sensing modalities • Synchronization • Privileged viewpoint • Event repetition • The Script – a priori scene knowledge • Eventually, must localize O(100) objects for continuity on a large set. • Coarse-grained localization of O(1000) objects is needed for asset tracking alone. • Would like to know more than just location: orientation, gaze direction in closeups, etc.

  39. Needs and constraints • Localization with multiple sensing modalities • Synchronization • Privileged viewpoint • Event repetition • The Script – a priori scene knowledge • Data recorded must be ‘frame-accurate’, synchronized with the SMPTE time code of the shoot. • 24 frames/sec for film. • 29.97 frames/sec for NTSC video. • Reasonable but fine-grained synchronization of the wireless sensor network to this external clock.
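
The frame rates above make the alignment arithmetic concrete: a non-drop-frame SMPTE timecode hh:mm:ss:ff maps to an absolute frame count as sketched below. Drop-frame 29.97 NTSC timecode needs an additional correction that this sketch omits.

    // Sketch: non-drop-frame SMPTE timecode (hh:mm:ss:ff) to an
    // absolute frame count, for aligning sensor samples with footage.
    public class Timecode {
        public static long toFrames(int hh, int mm, int ss, int ff, int fps) {
            return (hh * 3600L + mm * 60L + ss) * fps + ff;
        }

        public static void main(String[] args) {
            // 00:01:30:12 at 24 fps (film): 90 s * 24 + 12 = 2172 frames.
            System.out.println(toFrames(0, 1, 30, 12, 24));
        }
    }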

  40. Needs and constraints • Localization with multiple sensing modalities • Synchronization • Privileged viewpoint • Event repetition • The Script – a priori scene knowledge • All that matters is what the film camera sees – it is the ‘privileged viewpoint’ of the system. • Camera and film stock parameters can be used to estimate what is seen and what isn’t. • With this information, manage: • focus of attention, • power consumption, • processing cycles, • bandwidth, • sensor articulation.
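
A minimal sketch of the estimate this slide describes: derive the camera's horizontal angle of view from focal length and aperture (gauge) width, then test whether an object's bearing falls inside it. The 24.9 mm width is the Super 35 value; the flat 2-D geometry and all other numbers are illustrative assumptions.

    // Sketch: is an object inside the privileged viewpoint's
    // horizontal cone of view? 2-D geometry for simplicity.
    public class ViewpointCheck {
        // Horizontal angle of view: 2 * atan(apertureWidth / (2 * focalLength)).
        static double horizontalFov(double apertureMm, double focalMm) {
            return 2.0 * Math.atan(apertureMm / (2.0 * focalMm));
        }

        // True if the object's bearing is within half the field of view
        // of the camera's heading (angles in radians).
        static boolean inView(double camX, double camY, double camHeading,
                              double objX, double objY, double fov) {
            double bearing = Math.atan2(objY - camY, objX - camX);
            double off = Math.abs(Math.atan2(Math.sin(bearing - camHeading),
                                             Math.cos(bearing - camHeading)));
            return off <= fov / 2.0;
        }

        public static void main(String[] args) {
            double fov = horizontalFov(24.9, 50.0); // Super 35 gauge, 50 mm lens: ~28 degrees
            System.out.println(inView(0, 0, 0, 5, 1, fov)); // object ~11 degrees off-axis: true
        }
    }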

  41. Needs and constraints • Localization with multiple sensing modalities • Synchronization • Privileged viewpoint • Event repetition • The Script – a priori scene knowledge • Need to track and take advantage of the repetitive action built into the process: • multiple shots of one scene, • multiple takes of one shot, • off-camera rehearsal. • Leverage filmmakers’ experience to develop heuristics for what action is ‘crucial’ and what is not.

  42. Needs and constraints • Localization with multiple sensing modalities • Synchronization • Privileged viewpoint • Event repetition • The Script – a priori scene knowledge • Unlike many ubicomp scenarios, the film shoot has a carefully defined blueprint for the action that will occur: the script. • Could consider ‘script reading’ research, or… • Use the consistent format of film scripts to recognize who and what are (or will be, or were) present in a scene or shot.
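
The consistent format mentioned in the last bullet is mechanically recognizable. Here is a minimal sketch that pulls location and time of day out of a slugline; the pattern is a deliberate simplification of real screenplay formatting.

    import java.util.regex.Matcher;
    import java.util.regex.Pattern;

    // Sketch: recognize a screenplay slugline such as
    // "EXT. SUBURBAN STREET - DAY" to learn, a priori, where and when
    // a scene takes place. The pattern is a simplification.
    public class SluglineParser {
        private static final Pattern SLUG =
            Pattern.compile("^(INT|EXT)\\.\\s+(.+?)\\s+-\\s+(DAY|NIGHT)\\s*$");

        public static void main(String[] args) {
            Matcher m = SLUG.matcher("EXT. SUBURBAN STREET - DAY");
            if (m.matches()) {
                System.out.println("interior/exterior=" + m.group(1)
                    + " location=" + m.group(2) + " time=" + m.group(3));
            }
        }
    }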

  43. Decision support example: the 180° rule

  44. Decision support example: the 180° rule

  45. Decision support example: the 180° line

  46. Decision support example: the 180° rule • A simple position-based metric for perceived discontinuity of scene layouts from multiple perspectives. • Well-suited for real-time visualization. • Important use of ‘coarser’-grained continuity management.
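
The position-based metric can be as simple as a side-of-line test: the two subjects define the axis of action, and the sign of a 2-D cross product says which side of that axis a camera setup is on. The coordinates below are illustrative.

    // Sketch: 180-degree-rule check. If two camera setups fall on
    // opposite sides of the axis of action between two subjects,
    // screen direction will appear to flip between the shots.
    public class AxisOfActionCheck {
        // Sign of the 2-D cross product: which side of the line
        // (ax,ay)->(bx,by) does the point (px,py) lie on?
        static int side(double ax, double ay, double bx, double by,
                        double px, double py) {
            double cross = (bx - ax) * (py - ay) - (by - ay) * (px - ax);
            return cross >= 0 ? 1 : -1;
        }

        public static void main(String[] args) {
            // Two actors at (0,0) and (4,0) define the axis of action.
            int shot1 = side(0, 0, 4, 0, 2, -3); // camera south of the line
            int shot2 = side(0, 0, 4, 0, 2,  3); // camera north of the line
            System.out.println(shot1 == shot2
                ? "consistent coverage" : "180-degree line crossed");
        }
    }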

  47. Application Driver for What? • Localization and identification… • using several sensing platforms, high resolution for O(100) objects, low resolution for O(1000). • Asset management and tracking… • with multiple action-dependency relationships, and across multiple time frames. • Embedding sensor-network-generated metadata into media… • to create augmented footage. • Scene observation techniques… • for recognizing discontinuities across multiple perspectives. • Field decision support… • considering an arbitrary ‘privileged viewpoint’.

  48. Art as a laboratory for ubicomp • Controlled slices of the real world • Development / rehearsal process • need to author on-the-fly, in-the-field • Specific performance requirements → different tradeoffs • Challenge of basic assumptions • different perspectives

  49. Research Agenda • Short term goals • Localization platform • Middleware for ‘real-time’ relationships • Writing many input / output (and storage) drivers • Longer term interests • Fusing vision with other sensing modalities • Authoring tools

  50. Collaboration • Research • Project-based focus • Commissions • Externally funded research • Students and faculty from EE/CS, Theater, Film/TV, Design • Curriculum • Existing courses • FTVD 144/244, Theater 144C • Planned courses • Concurrent TFT / Electrical Engineering
