UR LOCAL CYBORG

DANCExDANCE @ UMD's Moving with Screens + Machines Symposium

When I first started this blog, I was determined to leave my institution out of it. But as I said last post, UR LOCAL CYBORG has become a travelogue of me running around the DMV to review my professors' work. At this point, I've fully embraced it. For this post, I didn't have to run very far to write the review: DANCE^2 was part of the "Moving with Screens + Machines" symposium that took place right on campus.

MWS+M was organized by Professor Kate Ladenheim, who taught my motion capture class... which I unfortunately dropped on a stress-induced whim right before the deadline to withdraw. But they weren't getting rid of me that easily! One of our assignments was either to volunteer at the symposium or to attend an event and write a reflection. Ya 'borg didn't quite get the memo that dropping a class = less work, because they're doing both.

The brainchild of dance, theater, immersive media, and robotics faculty at the University of Maryland, DANCE^2 is an interactive dance performance x human-computer interaction study featuring wearable robots. The half-hour performance showcases the Calico, a miniature robot developed by the University of Maryland's Small Artifacts Lab, as it buzzes and glows and maneuvers across the dancers' suits. DANCE^2 responds to its audience, and according to our hopes and fears, the robot can shapeshift from endearing little pet to surveillance device patrolling the limbs of its host.

As audience members, we experienced this performance on stage alongside the dancers and stagehands. This gave me a close-up appreciation of the costumes, which looked like the wardrobe of a sci-fi movie. While the stagehands wore the archetypal lab coat of a mad scientist (some of whom were themselves university faculty), the dancers wore sleek white suits like Yelena's from Black Widow, sewn with helical tracks for the robots to move along. As the experience unfolded around us, a disembodied robotic voice prompted us to arrange ourselves onstage according to our relationship with technology: walk to the purple lights if you use a smartwatch, walk to the yellow if you don't.

But the bulk of our interaction with DANCE^2 came through a mobile app---a timely twist on the traditional request to silence our phones at the door. At the beginning of the performance, we scanned a QR code that led to a Qualtrics-esque interface (they really didn't let us forget that we were their lab rats). At multiple points in the performance, the robotic voice read aloud a multiple-choice question, a bar graph was projected on stage, and we were asked to answer on our phones. More often than not, there were only two options: a hopeful response and a jaded one, nothing in between.

For some questions we were neck and neck. What will technology feel like in the future? Hostile or friendly? We were a group of thirty on a tightrope, walking the line between dry honesty and naive optimism. Participating in this performance brought me back to Cookie Clicker, or maybe Kahoot, watching the bar graphs teeter with our votes. I wonder how everyone else experienced it, because for a performance that responded to the audience as a single being, the experience itself was intimate and individual, confronting me with what was in my own head. My first instinct was to answer each question more critically---but mid-performance, sometimes mid-question, I found myself switching sides, clicking desperately for puppies and hope.

#500-1000 words #dance #live #performance #robots #umd