<span>ATLAS researchers converge at TEI’26 to showcase their work on tangible, embedded, and embodied interaction</span> <span><span>Michael Kwolek</span></span> <span><time datetime="2026-03-09T09:06:26-06:00" title="Monday, March 9, 2026 - 09:06">Mon, 03/09/2026 - 09:06</time> </span> <div> <div class="imageMediaStyle focal_image_wide"> <img loading="lazy" src="/atlas/sites/default/files/styles/focal_image_wide/public/2026-02/TEI%2026%20Conference.png?h=d4c4cd0a&amp;itok=Bbs5T1Mw" width="1200" height="800" alt="TEI 2026 Conference"> </div> </div> <div role="contentinfo" class="container ucb-article-categories" itemprop="about"> <span class="visually-hidden">Categories:</span> <div class="ucb-article-category-icon" aria-hidden="true"> <i class="fa-solid fa-folder-open"></i> </div> <a href="/atlas/taxonomy/term/703"> Feature </a> <a href="/atlas/taxonomy/term/855"> Feature News </a> <a href="/atlas/taxonomy/term/144"> News </a> </div> <div role="contentinfo" class="container ucb-article-tags" itemprop="keywords"> <span class="visually-hidden">Tags:</span> <div class="ucb-article-tag-icon" aria-hidden="true"> <i class="fa-solid fa-tags"></i> </div> <a href="/atlas/taxonomy/term/396" hreflang="en">ACME</a> <a href="/atlas/taxonomy/term/1464" hreflang="en">brainmusic</a> <a href="/atlas/taxonomy/term/390" hreflang="en">do</a> <a href="/atlas/taxonomy/term/1463" hreflang="en">leslie</a> <a href="/atlas/taxonomy/term/773" hreflang="en">research</a> </div> <a href="/atlas/michael-kwolek">Michael Kwolek</a> <div class="ucb-article-content ucb-striped-content"> <div class="container"> <div class="paragraph paragraph--type--article-content paragraph--view-mode--default"> <div class="ucb-article-text" itemprop="articleBody"> <div><p dir="ltr"><span>Sound, vision, 
movement and touch—ATLAS researchers explore many different ways humans can interact with computers, collect and analyze data, and empower creative exploration.</span></p><p dir="ltr"><span>Nearly a dozen current and former ATLAS lab members will participate in&nbsp;</span><a href="https://tei.acm.org/2026/" rel="nofollow"><span>ACM TEI’26</span></a><span> in Chicago (March 8-11, 2026), the 20th annual conference presenting the latest results in tangible, embedded, and embodied interaction.</span></p><p dir="ltr"><span>This year’s conference theme is “Tide + Tied”. Organizers note, “By becoming a venue to bring multi-folded 'Tides' across diverse, interdisciplinary fields, the conference aims to bring researchers, designers, and artists with different backgrounds and interests together to be 'Tied,' weaving the future of the TEI community together.”</span></p><p dir="ltr"><span>ATLAS has been involved with the TEI conference since its early years, with Professor and ACME Lab director Ellen Do and ATLAS director Mark Gross both working actively behind the scenes.&nbsp;</span></p><p dir="ltr"><span>Do, who is a co-author on three papers and three works-in-progress accepted at TEI’26, explains, “Each one of the projects is a documentation of how researchers think about ideas and how to implement them and get them to fruition.”&nbsp;</span></p><p dir="ltr"><span>She elaborates, “The conference is called Tangible, Embedded and Embodied Interaction, so a lot of work we're doing is beyond the screen. 
Things that we touch and put together.”</span></p><h3><span>Papers</span></h3><div class="row ucb-column-container"><div class="col ucb-column"><h4><a href="https://programs.sigchi.org/tei/2026/program/content/223849" rel="nofollow"><span><strong>Sound of Kigumi: A Playful VR Joinery Adjustment with Hammering Sound Feedback</strong></span></a></h4><p dir="ltr"><span>Kosei Ueda,&nbsp;</span><a href="/atlas/ellen-yi-luen-do" rel="nofollow"><span><strong>Ellen Yi-Luen Do</strong></span></a><span>, Hironori Yoshida&nbsp;</span></p><p dir="ltr"><span><strong>Abstract</strong>: Traditional carpentry faces a critical shortage of skilled workers due to limited opportunities for potential apprentices to access onsite woodworking experience. Through expert interviews, we learned the importance of hammering sound to judge the precision in Kigumi assembly, as master carpenters rely on differences between “soft sound” and “sharp sound” without relying on visuals. This paper presents Sound of Kigumi (SoK), a playful VR system for inexperienced users to casually experience sound sensory skills through the loop of hammering and chiseling. In SoK, users listen to hammering sound in relation to tightness, assess the precision of their work, and return to chiseling for further adjustments. Furthermore, SoK implements pseudo-haptic feedback by visually modifying hammering resistance based on chiseling progress. 
Expert evaluation indicated SoK replicates the hammering process and serves as an effective introductory tool, and user feedback confirmed SoK provides an immersive woodworking experience and effective Kigumi learning.</span></p></div><div class="col ucb-column"> <div class="imageMediaStyle large_image_style"> <img loading="lazy" src="/atlas/sites/default/files/styles/large_image_style/public/2026-03/Sound%20of%20Kigumi.png?itok=hzG0hFYY" width="1500" height="903" alt="Sound of Kigumi processing technique"> </div> <span class="media-image-caption"> <p><em>The first prototype: The user observes and hammers two types of Kigumi - one correctly processed without visible gaps when hammered and one incorrectly processed that reveals gaps upon hammering.</em></p> </span> </div></div><div class="row ucb-column-container"><div class="col ucb-column"><h4><a href="https://programs.sigchi.org/tei/2026/program/content/223840" rel="nofollow"><span><strong>Why (Not) ReacTIVision: Emerging Challenges and Opportunities for Building Tangible User Interfaces with Computer Vision Toolkits</strong></span></a></h4><p dir="ltr"><a href="/atlas/krithik-ranjan" rel="nofollow"><span><strong>Krithik Ranjan</strong></span></a><span>, S. Sandra Bae, Peter Gyory,&nbsp;</span><a href="/atlas/ellen-yi-luen-do" rel="nofollow"><span><strong>Ellen Yi-Luen Do</strong></span></a><span>, Clement Zheng, Rong-Hao Liang</span></p><p dir="ltr"><span><strong>Abstract</strong>: Outdated Computer Vision (CV) toolkits for Tangible User Interfaces (TUI) have led to fragmented practices, diminished reproducibility, and reduced community support. This paper examines the past, present, and future trajectory of CV-TUI toolkits. 
First, our scoping review of ACM literature reveals a divergence between applications using the limited interactions of established toolkits like ReacTIVision and the fragmented, bespoke systems built for complex interactions, highlighting the need for advanced toolkits that enable accessible making. Second, we present proof-of-concept applications using the contemporary ArUco fiducial marker library. We demonstrate how accessible hardware, like a top-down camera and a flat-panel display, can support a comprehensive design space of tangible interactions beyond 2D manipulation, including 3D spatial interaction, multi-device interaction, and actuated tangibles within canonical applications. Finally, reflecting on our findings, we offer six suggestions for building next-generation CV-TUI toolkits. This study provides the TUI community with an updated perspective to inform future research.</span></p></div><div class="col ucb-column"> <div class="imageMediaStyle large_image_style"> <img loading="lazy" src="/atlas/sites/default/files/styles/large_image_style/public/2026-03/ReacTIVision_0.png?itok=eSliRMj7" width="1500" height="431" alt="Sensing touch input on tokens with a capacitive touchscreen"> </div> <span class="media-image-caption"> <p><em>Sensing touch input on tokens with a capacitive touchscreen: a) Each ArUco-marked knob is augmented with a vinyl-cut copper sheet pattern. 
b-c) The knob transfers finger-touch inputs to the touchscreen when users interact with the token.</em></p> </span> </div></div><div class="row ucb-column-container"><div class="col ucb-column"><h4><a href="https://programs.sigchi.org/tei/2026/program/content/223837" rel="nofollow"><span><strong>HyperDance: Real-Time Vibrotactile Stimulation Feedback of Inter-Brain Connectivity in Partner Dance</strong></span></a></h4><p dir="ltr"><a href="/atlas/thiago-roque" rel="nofollow"><span><strong>Thiago Rossi Roque</strong></span></a><span>, Ruojia Sun,&nbsp;</span><a href="/atlas/ellen-yi-luen-do" rel="nofollow"><span><strong>Ellen Yi-Luen Do</strong></span></a><span>,&nbsp;</span><a href="/atlas/grace-leslie" rel="nofollow"><span><strong>Grace Leslie</strong></span></a><span>&nbsp;</span></p><p dir="ltr"><span><strong>Abstract</strong>: Building on the growing interest in technology-supported dance practice, neural imaging offers novel opportunities to reveal dancers’ internal states and expand the possibilities for augmented, embodied interaction. Despite advances in social neuroscience, the exploration of dance through brain imaging remains limited by technical challenges. To overcome these barriers, we developed and validated a real-time vibrotactile biofeedback system based on inter-brain coupling (IBC) measures from tango dancers using a mobile, synchronous multi-brain EEG system. We first conducted an empirical study recording synchronized EEG and motion data to test whether behavioral synchronization enhances inter-brain coupling. Insights from this study informed the design of our tangible neurofeedback system, which experienced dancers evaluated. Our findings support the Synchronicity Hypothesis of Dance and demonstrate how embodied technologies can enhance collective dance practice. 
This work introduces a novel methodological and interaction paradigm, bridging neural measurement with wearable feedback for socially situated embodied experiences.</span></p></div><div class="col ucb-column"> <div class="imageMediaStyle large_image_style"> <img loading="lazy" src="/atlas/sites/default/files/styles/large_image_style/public/2026-03/HyperDance.png?itok=TyomgJTX" width="1500" height="880" alt="Two dancers wearing EEG caps"> </div> <span class="media-image-caption"> <p><em>HyperDance enables real-time measurement and tactile feedback of inter-brain coupling during natural partner dance practice.</em></p> </span> </div></div><hr><h3>Art and Performance</h3><div class="row ucb-column-container"><div class="col ucb-column"><h4><a href="https://programs.sigchi.org/tei/2026/program/content/224275" rel="nofollow"><span>Bioactuated Tapestry: Converging Textile Craft and Moisture-Responsive Biomaterials</span></a></h4><p><a href="/atlas/eldy-lazaro" rel="nofollow"><span><strong>Eldy S. Lazaro Vasquez</strong></span></a><span>;&nbsp;</span><a href="/atlas/viola-arduini" rel="nofollow"><span><strong>Viola Arduini</strong></span></a><span>;&nbsp;</span><a href="/atlas/etta-sandry" rel="nofollow"><span><strong>Etta W Sandry</strong></span></a><span>;&nbsp;</span><a href="/atlas/katerina-houser" rel="nofollow"><span><strong>Katerina Houser</strong></span></a><span>;&nbsp;</span><a href="/atlas/srujana-golla" rel="nofollow"><span><strong>Srujana Golla</strong></span></a><span>;&nbsp;</span><a href="/atlas/mirela-alistar" rel="nofollow"><span><strong>Mirela Alistar</strong></span></a></p><p><span><strong>Abstract</strong>: Bioactuated Tapestry is an installation that explores how biomaterials and textile craft unfold multiple temporalities of interaction. 
Structured in three zones, the installation moves from milk-based bioplastic samples that change shape quickly when misted, to a Sample Book that documents iterations of bioplastic integration into weaving, to a woven tapestry that changes shape slowly in response to humidity in the surrounding space. Together, these zones demonstrate how interaction can emerge from material behavior shaped through biomaterial formulation and, when woven, through structure. The work foregrounds biomaterial agency, weaving, and situated sustainability grounded in sourcing, fabrication, and practices of care. Through this convergence of biodesign and textile craft, Bioactuated Tapestry aligns with the TEI theme of Resurgence and Convergence, highlighting how material-led practices reconnect material experimentation, environmental attunement, and embodied ways of knowing.</span></p><p><a class="ucb-link-button ucb-link-button-blue ucb-link-button-default ucb-link-button-regular" href="https://www.eldylazaro.com/?portfolio=bioactuated-tapestry" rel="nofollow"><span class="ucb-link-button-contents">Learn More</span></a></p></div><div class="col ucb-column"> <div class="imageMediaStyle large_image_style"> <img loading="lazy" src="/atlas/sites/default/files/styles/large_image_style/public/2026-03/Bioactuated%20Tapestry%202.jpg?itok=CD47wz59" width="1500" height="1125" alt="Detail of bioactuated textile"> </div> <span class="media-image-caption"> <p><em>Detail of Bioactuated Tapestry, showing colored casein-based bioplastic strips woven through black cotton yarns. 
Moisture causes the bioplastic to change shape, and the weave directs that change into curling.</em></p> </span> </div></div><hr><h3>Pictorials</h3><div class="row ucb-column-container"><div class="col ucb-column"><h4><a href="https://programs.sigchi.org/tei/2026/program/content/224206" rel="nofollow"><span>Designing for the Leaky Body: Exploring Biomaterial Absorption as Body-Material Interaction</span></a></h4><p dir="ltr"><a href="/atlas/viola-arduini" rel="nofollow"><span><strong>Viola Arduini</strong></span></a><span>;&nbsp;</span><a href="/atlas/eldy-lazaro" rel="nofollow"><span><strong>Eldy S. Lazaro Vasquez</strong></span></a><span>;&nbsp;</span><a href="/atlas/srujana-golla" rel="nofollow"><span><strong>Srujana Golla</strong></span></a><span>;&nbsp;</span><a href="/atlas/mirela-alistar" rel="nofollow"><span><strong>Mirela Alistar</strong></span></a></p><p><span><strong>Abstract</strong>: Leaking bodies are often concealed or disregarded in both society and design. Likewise, bodily fluids are rarely leveraged as triggers for material interaction in HCI. In this pictorial, we investigate how fluid-responsive biomaterials can enable porous, expressive, and cyclical interactions co-shaped by the body. We focus on a milk-derived bioplastic with reversible shape-changing properties, examining fluid absorption as a meaningful design affordance. Our material-led approach contributes both formulation and fabrication methods of casein bioplastic, while autoethnographic inquiry with a lactating body informed the development of Leaky Body Maps and speculative garments that position leakage as a generative site of body-material interaction. 
This work contributes to the discourse of feminist and posthuman HCI by centering bodily permeability, material responsiveness, and the potential of designing with – rather than concealing – leaky bodies.</span></p></div><div class="col ucb-column"> <div class="imageMediaStyle small_500px_25_display_size_"> <img loading="lazy" src="/atlas/sites/default/files/styles/small_500px_25_display_size_/public/2026-03/Leaky%20Body.jpg?itok=VSsJmDqY" width="375" height="610" alt="Leaky body garment prototype"> </div> <span class="media-image-caption"> <p><em>Garment prototype showing where casein-based bioplastic was placed based on a body leak map of possible milk leakage.</em></p> </span> </div></div><hr><h3><span>Works In Progress</span></h3><div class="row ucb-column-container"><div class="col ucb-column"><h4><a href="https://programs.sigchi.org/tei/2026/program/content/224242" rel="nofollow"><span><strong>ArUcoTUI: Software Toolkit for Prototyping Tangible Interactions on Portable Flat-Panel Displays with OpenCV</strong></span></a></h4><p dir="ltr"><span>Rong-Hao Liang, Steven Houben,&nbsp;</span><a href="/atlas/krithik-ranjan" rel="nofollow"><span><strong>Krithik Ranjan</strong></span></a><span>, S. Sandra Bae, Peter Gyory,&nbsp;</span><a href="/atlas/ellen-yi-luen-do" rel="nofollow"><span><strong>Ellen Yi-Luen Do</strong></span></a><span>, Clement Zheng</span></p><p dir="ltr"><span><strong>Abstract</strong>: Tangible User Interfaces (TUIs) that integrate digital information with physical interaction require specialized hardware and complex calibration, limiting their adoption in portable or mobile display systems. This paper introduces ArUcoTUI, a computer vision (CV) toolkit for prototyping tangible interactions on portable screens, leveraging standard cameras and the OpenCV library. ArUcoTUI uses ArUco fiducial markers to detect physical inputs. 
The software toolkit offers streamlined calibration, a signal processing pipeline, and a client application that translates tangible input into structured events for use in HCI applications. Using a conventional camera in a top-down setting with a flat-panel display, we demonstrate how this toolkit supports the development of interactive surface TUIs with advanced features, including 3D spatial interaction, multi-device interaction, and actuated tangibles within applications. We describe the software implementation, which utilizes accessible hardware to support the development of these tangible interactions. We provide the results of a preliminary evaluation with users, including design implications and suggestions for future research and development.</span></p></div><div class="col ucb-column"> <div class="imageMediaStyle large_image_style"> <img loading="lazy" src="/atlas/sites/default/files/styles/large_image_style/public/2026-03/ArUCoTUI.png?itok=jTHxkdnf" width="1500" height="287" alt="ArUcoTUI is a software toolkit for rapid TUI prototyping on portable screens"> </div> <span class="media-image-caption"> <p>ArUcoTUI is a software toolkit for rapid TUI prototyping on portable screens. It uses standard cameras, OpenCV, and ArUco markers for real-time object tracking. 
We demonstrate the applicability using an overhead camera for a) multi-token music control, b) above-screen gesture detection, c) multi-display board games, and d) actuated data visualization using robots.</p> </span> </div></div><div class="row ucb-column-container"><div class="col ucb-column"><h4><a href="https://programs.sigchi.org/tei/2026/program/content/224189" rel="nofollow"><span><strong>Rig-a-Doodle: Tangible Kit for Dynamic Hand-drawn Character Animation</strong></span></a></h4><p dir="ltr"><a href="/atlas/krithik-ranjan" rel="nofollow"><span><strong>Krithik Ranjan</strong></span></a><span>, Khushbu Kshirsagar, Harrison Jesse Smith,&nbsp;</span><a href="/atlas/ellen-yi-luen-do" rel="nofollow"><span><strong>Ellen Yi-Luen Do</strong></span></a></p><p dir="ltr"><span><strong>Abstract</strong>: Character animation remains challenging for novices and children despite advances in digital tools. While recent tangible interfaces have lowered barriers by enabling creators to animate their drawings on paper, they are limited to preset animation sequences and support for only human-like characters. We present Rig-a-Doodle, a tangible kit and web application for fully open-ended character rigging animation, where creators can draw any character and construct a custom physical rig using everyday materials to animate it. This work-in-progress contributes a system of tangible interaction to animate hand-drawn characters by direct physical manipulation of custom rigs in real-time. 
We share findings from a preliminary workshop with adults to explore the kinds of expressive animation the kit enables, discover issues with interaction, and source ideas for future directions.</span></p></div><div class="col ucb-column"> <div class="imageMediaStyle large_image_style"> <img loading="lazy" src="/atlas/sites/default/files/styles/large_image_style/public/2026-03/Rig-a-Doodle.png?itok=uGx3UhhN" width="1500" height="1212" alt="Rig-a-Doodle character template"> </div> <span class="media-image-caption"> <p>(Top-left) Rig-a-Doodle character template to draw the character and cut out CV markers for the rig. (Top-right, bottom-left, bottom-right) The three steps of Capture, Assign, and Play illustrated with screenshots from the Rig-a-Doodle application.</p> </span> </div></div><div class="row ucb-column-container"><div class="col ucb-column"><h4><a href="https://programs.sigchi.org/tei/2026/program/content/224167" rel="nofollow"><span><strong>Lighting the Reef: Modular Paper Circuits as Ecological Metaphor</strong></span></a></h4><p dir="ltr"><a href="/atlas/ruhan-yang" rel="nofollow"><span><strong>Ruhan Yang</strong></span></a><span>,&nbsp;</span><a href="/atlas/yuchen-zhang" rel="nofollow"><span><strong>Yuchen Zhang</strong></span></a><span>,&nbsp;</span><a href="/atlas/ellen-yi-luen-do" rel="nofollow"><span><strong>Ellen Yi-Luen Do</strong></span></a></p><p dir="ltr"><span><strong>Abstract</strong>: We present Lighting the Reef, an interactive installation that uses modular 3D paper circuits to explore ecological fragility. Participants build coral structures from foldable paper blocks with copper tape and low-voltage components. When connections align, coral modules glow, metaphorically expressing the energy exchange between coral and zooxanthellae, the symbiotic algae crucial to coral metabolism. Pollution modules add resistance that dims light or interrupts the current entirely, mirroring environmental disruption. 
We position Lighting the Reef as a Research through Design case that articulates fragility as an interaction aesthetic and ecological metaphor. We reflect on how modular circuitry, material constraints, and embodied play make precarity tangible. We also report workshops with 15 participants that discussed themes of care, collapse, and interdependence. We contribute insights into designing for fragility with modular circuits, ecological storytelling through tangible interaction, and accessible and reproducible designs for participatory sustainability education.</span></p></div><div class="col ucb-column"> <div class="imageMediaStyle large_image_style"> <img loading="lazy" src="/atlas/sites/default/files/styles/large_image_style/public/2026-03/Lighting%20the%20Reef.png?itok=kmLGkp1V" width="1500" height="1073" alt="Lighting the Reef installation built from 3D paper circuit modules"> </div> <span class="media-image-caption"> <p><em>Lighting the Reef is a tangible installation built from 3D paper circuit modules, whose illumination depends on alignment and balance. 
As participants assemble and adjust the blocks, the lights brighten, dim, or turn off, reflecting the changing conditions of the coral system.</em></p> </span> </div></div><p dir="ltr">&nbsp;</p></div> </div> </div> </div> </div> <div>Members of several ATLAS labs show off the latest research on human-computer interactivity.</div> <span>ATLAS community presents new research on interactive systems at DIS 2025</span> <span><span>Michael Kwolek</span></span> <span><time datetime="2025-06-26T11:14:27-06:00" title="Thursday, June 26, 2025 - 11:14">Thu, 06/26/2025 - 11:14</time> </span> <div> <div class="imageMediaStyle focal_image_wide"> <img loading="lazy" src="/atlas/sites/default/files/styles/focal_image_wide/public/2025-06/DIS%202025%20logo_0.png?h=252f27fa&amp;itok=iTkbKstP" width="1200" height="800" alt="DIS 2025 conference"> </div> </div> <div role="contentinfo" class="container ucb-article-categories" itemprop="about"> <span class="visually-hidden">Categories:</span> <div class="ucb-article-category-icon" aria-hidden="true"> <i class="fa-solid fa-folder-open"></i> </div> <a href="/atlas/taxonomy/term/703"> Feature </a> <a href="/atlas/taxonomy/term/855"> Feature News </a> <a href="/atlas/taxonomy/term/144"> News </a> </div> <div role="contentinfo" class="container ucb-article-tags" itemprop="keywords"> <span class="visually-hidden">Tags:</span> <div class="ucb-article-tag-icon" aria-hidden="true"> <i class="fa-solid fa-tags"></i> </div> <a href="/atlas/taxonomy/term/396" hreflang="en">ACME</a> <a href="/atlas/taxonomy/term/729" hreflang="en">alistar</a> 
<a href="/atlas/taxonomy/term/342" hreflang="en">devendorf</a> <a href="/atlas/taxonomy/term/390" hreflang="en">do</a> <a href="/atlas/taxonomy/term/731" hreflang="en">living matter</a> <a href="/atlas/taxonomy/term/771" hreflang="en">phd</a> <a href="/atlas/taxonomy/term/1426" hreflang="en">phd student</a> <a href="/atlas/taxonomy/term/376" hreflang="en">unstable</a> </div> <div class="ucb-article-content ucb-striped-content"> <div class="container"> <div class="paragraph paragraph--type--article-content paragraph--view-mode--default"> <div class="ucb-article-text" itemprop="articleBody"> <div><a href="https://dis.acm.org/2025/" rel="nofollow"> <div class="align-right image_style-small_500px_25_display_size_"> <div class="imageMediaStyle small_500px_25_display_size_"> <img loading="lazy" src="/atlas/sites/default/files/styles/small_500px_25_display_size_/public/2025-06/DIS%202025%20logo.png?itok=mbKo8dOI" width="375" height="179" alt="ACM designing interactive systems '25 Madeira, Portugal"> </div> </div> </a><p dir="ltr"><span>The 2025&nbsp;</span><a href="https://dis.acm.org/2025/" rel="nofollow"><span>ACM Designing Interactive Systems Conference</span></a><span> (DIS) in Madeira, Portugal, features work from ten ATLAS community members representing three labs. This year’s event has five focus areas: Critical Computing and Design Theory, Design Methods and Processes, Artifacts and Systems, Research Through Design, and AI and Design with an overall theme around “design that transcends human-centered perspectives.”</span></p><p dir="ltr"><span>ATLAS researchers study a broad range of topics, from human-computer interaction to biomaterials to woven forms.&nbsp;</span></p><p dir="ltr"><span>Ellen Do, professor and ACME director, explains what connects the work our community is presenting at the conference: “I think all of the papers and presentations we have are on designing interactive systems. 
Some of the systems could be physical, some could be digital, some could be human-and-people, human-and-physical objects. So I think the theme about interactive systems and how you make systems interactive, what kind of user experience or human experience or immersive experience with the object or system or even the ecosystem, or the human communication system—I think that's all there.”</span></p><h3>ATLAS research at DIS 2025</h3><p dir="ltr"><a href="https://programs.sigchi.org/dis/2025/program/content/200707" rel="nofollow"><span><strong>"Chaotic, Exciting, Impactful": Stories of Material-led Designers in Interdisciplinary Collaboration</strong></span></a><br><span>Gabrielle Benabdallah,&nbsp;</span><a href="/atlas/eldy-lazaro" rel="nofollow"><span>Eldy S. Lazaro Vasquez</span></a><span> (ATLAS PhD student),&nbsp;</span><a href="/atlas/laura-devendorf" rel="nofollow"><span>Laura Devendorf</span></a><span> (ATLAS Unstable Design Lab director, associate professor),&nbsp;</span><a href="/atlas/mirela-alistar" rel="nofollow"><span>Mirela Alistar</span></a><span> (ATLAS Living Matter Lab director, assistant professor)</span></p><p dir="ltr"><span>This paper explores the dynamics of interdisciplinary collaboration between designers, scientists, and engineers through ten stories as told from the perspective of material-led designers. These stories focus on material-led designers working in contexts like biodesign and smart textiles, where novel materials, fabrication methods, and technology often intersect, requiring cross-disciplinary collaboration. By including perspectives from designers within and adjacent to HCI, the study broadens the understanding of interdisciplinary teamwork that combines scientific, technical, and craft-based expertise. Our analysis highlights how designers navigate challenges like differing terminologies, epistemic hierarchies, and conflicting priorities. 
We discuss strategies such as material prototypes, attitudes of inquiry and openness, switching lexicons, and the value of interdisciplinary contexts. This research underscores designers as “translators” who mediate epistemological tensions, use tangible artifacts to communicate, and articulate possible applications. This research contributes ten stories as narrative resources for understanding strategies and fostering interdisciplinary spaces within HCI.</span><br>&nbsp;</p><p dir="ltr"><a href="https://programs.sigchi.org/dis/2025/program/content/200861" rel="nofollow"><span><strong>Towards Yarnier Interactive Textiles: Mapping a Design Journey through Hand Spun Conductive Yarns</strong></span></a><br><a href="/atlas/etta-sandry" rel="nofollow"><span>Etta W. Sandry</span></a><span> (ATLAS PhD student),&nbsp;</span><a href="/atlas/lily-gabriel" rel="nofollow"><span>Lily M. Gabriel</span></a><span> (ATLAS undergraduate student),&nbsp;</span><a href="/atlas/eldy-lazaro" rel="nofollow"><span>Eldy S. Lazaro Vasquez</span></a><span> (ATLAS PhD student),&nbsp;</span><a href="/atlas/laura-devendorf" rel="nofollow"><span>Laura Devendorf</span></a><span> (ATLAS Unstable Design Lab Director, associate professor)</span></p><p dir="ltr"><span>The ability to create a wide and varied set of interactive textiles depends on the materials that one has available. Currently, the range of yarns that can be used to bring interactivity to textiles is greatly limited, especially considering the diversity available in non-conductive yarns. This pictorial traces a design journey into hand spinning that seeks to address this limitation and contributes samples of techniques and materials that could be used to create conductive yarns along with reflection on design methods that enabled us to explore a wider range of aesthetic expressions. 
We advocate for an approach that reconnects with the textiles in e-textiles, embraces divergence, and prioritizes the material as the driver of a design concept. We offer pathways for readers and researchers to continue this exploration within varied domains and practices.</span></p> <div class="align-center image_style-large_image_style"> <div class="imageMediaStyle large_image_style"> <img loading="lazy" src="/atlas/sites/default/files/styles/large_image_style/public/2024-12/spinningConductiveYarnBanner.jpg?itok=7PkmpUu3" width="1500" height="1000" alt="A table with a variety of different yarns varying in texture and size spread out."> </div> </div> <p dir="ltr">&nbsp;</p><p dir="ltr"><a href="https://programs.sigchi.org/dis/2025/program/content/200738" rel="nofollow"><span><strong>Connect! A Circuit-Driven Card Game</strong></span></a><br><a href="/atlas/ruhan-yang" rel="nofollow"><span>Ruhan Yang</span></a><span> (ATLAS PhD alum),&nbsp;</span><a href="/atlas/ellen-yi-luen-do" rel="nofollow"><span>Ellen Yi-Luen Do</span></a><span> (ATLAS ACME Lab director, professor)</span></p><p dir="ltr"><span>Hybrid physical-digital games often rely on screen-based interactions, which can detract from their tactile nature. We introduce Connect!, a card game that integrates paper circuits and real-time LED feedback, enabling players to construct functional circuits as part of gameplay. Unlike traditional hybrid games, Connect! embeds feedback directly into physical components while preserving material interaction. We conducted a user study comparing gameplay with and without electronic feedback. Our findings suggest that real-time feedback not only increased engagement but also altered players' behavior, encouraging rule exploration and emergent play. 
Our work contributes to tangible interaction and game-based learning, demonstrating the potential of low-cost electronics in enhancing interactive experiences.</span></p> <div class="align-center image_style-large_image_style"> <div class="imageMediaStyle large_image_style"> <img loading="lazy" src="/atlas/sites/default/files/styles/large_image_style/public/2025-06/Connect%20Card%20Game.jpg?itok=IJZECkiT" width="1500" height="882" alt="Connect game cards"> </div> <span class="media-image-caption"> <p><em>Connect! game cards</em></p> </span> </div> <p dir="ltr"><a href="https://programs.sigchi.org/dis/2025/program/content/200557" rel="nofollow"><span><strong>From Data to Discussion: Interfaces for Collective Inquiry and Open-Ended Data Creation</strong></span></a><br><a href="/atlas/david-hunter" rel="nofollow"><span>David Hunter</span></a><span> (ATLAS PhD student)</span></p><p dir="ltr"><span>Data can enrich our understanding of the world and improve our society. However, the datafication of our society comes with challenges for empowering communities. In designing systems for recording and representing data, a theme has emerged: these interfaces become sites of conversation and sense-making, and their participatory nature is valuable beyond the data itself. This insight has led me to investigate tools and experiences that enable open-ended data creation and exploration as a grounding for discussion and prompting action. The goal is to design interfaces and systems for exploring places and futures through data, to empower communities and support civic participation, learning and making, situational awareness, and scenario planning. 
In this pictorial I present five ongoing research projects investigating these ideas.</span></p> <div class="align-center image_style-large_image_style"> <div class="imageMediaStyle large_image_style"> <img loading="lazy" src="/atlas/sites/default/files/styles/large_image_style/public/2025-06/How%20To%20Data%20Walk%20Hunter.jpg?itok=uoUZXzxJ" width="1500" height="1281" alt="Graphic depicting steps to data walking"> </div> <span class="media-image-caption"> <p><em>How to Data Walk</em></p> </span> </div> <p dir="ltr"><a href="https://programs.sigchi.org/dis/2025/program/content/200627" rel="nofollow"><span><strong>Knitting with unknown trees: assembling a more-than-human practice</strong></span></a><br><span>Doenja Oogjes, Ege Kökel,&nbsp;</span><a href="/atlas/netta-ofer" rel="nofollow"><span>Netta Ofer</span></a><span> (ATLAS PhD alum), Hsiang-Lin Kuo, Jasmijn Vugts, Troy Nachtigall,&nbsp;</span><a href="/atlas/torin-hopkins" rel="nofollow"><span>Torin Hopkins</span></a><span> (ATLAS PhD alum)</span></p><p dir="ltr"><span>In this pictorial, we explore alternative ways of knowing urban trees through a more-than-human lens. Using a municipal tree dataset, we focus on “unknown” trees—entries unclassified due to error, decay, or absence—highlighting the limits of quantification and fixed knowledge systems. Urban trees, while critical for ecosystems, are often shaped by technological interventions (e.g., GIS, IoT sensors, AI diagnostics) that prioritize their utility over other expressions. We engage in knitting as a material inquiry to foreground nonhuman agencies and relational entanglements. Through reflective shifts and compromises, this project questions normative design practices, seeking to amplify nonhuman participation. We make two contributions. Firstly, we offer insights into fostering alternative, relational engagements with urban ecologies. 
Secondly, we reflect on our process of surfacing and working with agentic capacities, articulating guidance for other design researchers. Through this, we advocate for fragmented approaches that embrace complicity and complexity in more-than-human design.</span><br>&nbsp;</p><p dir="ltr"><a href="https://programs.sigchi.org/dis/2025/program/content/200577" rel="nofollow"><span><strong>Designing Interfaces that Support Temporal Work Across Meetings with Generative AI</strong></span></a><br><a href="/atlas/rishi-vanukuru" rel="nofollow"><span>Rishi Vanukuru</span></a><span> (ATLAS PhD student), Payod Panda, Xinyue Chen, Ava Elizabeth Scott, Lev Tankelevitch, Sean Rintel</span></p><p dir="ltr"><span>Temporal work is an essential part of the modern knowledge workplace, where multiple threads of meetings and projects are connected across time by the acts of looking back (retrospection) and ahead (prospection). As we develop Generative AI interfaces to support knowledge work, this lens of temporality can help ground design in real workplace needs. Building upon research in routine dynamics and cognitive science, and an exploratory analysis of real recurring meetings, we develop a framework and a tool for the synergistic exploration of temporal work and the capabilities of Generative AI. We then use these to design a series of interface concepts and prototypes to better support work that spans multiple scales of time. 
Through this approach, we demonstrate how the design of new Generative AI tools can be guided by our understanding of how work really happens across meetings and projects.</span></p></div> </div> </div> </div> </div> <div>Members of three ATLAS labs show how interactive technology can create possibilities for new means of productivity, data analysis, creativity and play.</div> <h2> <div class="paragraph paragraph--type--ucb-related-articles-block paragraph--view-mode--default"> <div>Off</div> </div> </h2> <div>Traditional</div> <div>0</div> <div>On</div> <div>White</div> Thu, 26 Jun 2025 17:14:27 +0000 Michael Kwolek 5090 at /atlas Colorado-based Computer Graphics Professionals Make Their Mark at SIGGRAPH 2024 /atlas/2024/08/02/colorado-based-computer-graphics-professionals-make-their-mark-siggraph-2024 <span>Colorado-based Computer Graphics Professionals Make Their Mark at SIGGRAPH 2024</span> <span><span>Anonymous (not verified)</span></span> <span><time datetime="2024-08-02T10:30:29-06:00" title="Friday, August 2, 2024 - 10:30">Fri, 08/02/2024 - 10:30</time> </span> <div> <div class="imageMediaStyle focal_image_wide"> <img loading="lazy" src="/atlas/sites/default/files/styles/focal_image_wide/public/article-thumbnail/ruhan_yang_at_conference.jpeg?h=982fb0dd&amp;itok=dCtC-aIu" width="1200" height="800" alt="Ruhan Yang sits behind a table showing off paper circuits research at the conference"> </div> </div> <div role="contentinfo" class="container ucb-article-categories" itemprop="about"> <span class="visually-hidden">Categories:</span> <div class="ucb-article-category-icon" aria-hidden="true"> <i class="fa-solid fa-folder-open"></i> </div> <a href="/atlas/taxonomy/term/703"> Feature </a> <a href="/atlas/taxonomy/term/855"> Feature News </a> <a href="/atlas/taxonomy/term/144"> News </a> </div> <div role="contentinfo" class="container ucb-article-tags" itemprop="keywords"> <span class="visually-hidden">Tags:</span> <div class="ucb-article-tag-icon" 
aria-hidden="true"> <i class="fa-solid fa-tags"></i> </div> <a href="/atlas/taxonomy/term/396" hreflang="en">ACME</a> <a href="/atlas/taxonomy/term/390" hreflang="en">do</a> <a href="/atlas/taxonomy/term/34" hreflang="en">news</a> <a href="/atlas/taxonomy/term/1426" hreflang="en">phd student</a> <a href="/atlas/taxonomy/term/374" hreflang="en">phdstudent</a> <a href="/atlas/taxonomy/term/773" hreflang="en">research</a> <a href="/atlas/taxonomy/term/883" hreflang="en">yang</a> </div> <div class="ucb-article-content ucb-striped-content"> <div class="container"> <div class="paragraph paragraph--type--article-content paragraph--view-mode--default 3"> <div class="ucb-article-row-subrow row"> <div class="ucb-article-text col-lg d-flex align-items-center" itemprop="articleBody"> </div> <div class="ucb-article-content-media ucb-article-content-media-right col-lg"> <div> <div class="paragraph paragraph--type--media paragraph--view-mode--default"> </div> </div> </div> </div> </div> </div> </div> <div>ATLAS community members, including professor Ellen Do and PhD student Ruhan Yang, presented at this year's conference in Denver.</div> <script> window.location.href = `https://www.koaa.com/news/covering-colorado/colorado-based-computer-graphics-professionals-make-their-mark-at-siggraph-2024`; </script> <h2> <div class="paragraph paragraph--type--ucb-related-articles-block paragraph--view-mode--default"> <div>Off</div> </div> </h2> <div>Traditional</div> <div>0</div> <div>On</div> <div>White</div> Fri, 02 Aug 2024 16:30:29 +0000 Anonymous 4738 at /atlas Public-private partnership drives attention for ATLAS research in augmented and mixed reality /atlas/2024/07/18/public-private-partnership-drives-attention-atlas-research-augmented-and-mixed-reality <span>Public-private partnership drives attention for ATLAS research in augmented and mixed reality</span> <span><span>Anonymous (not verified)</span></span> <span><time datetime="2024-07-18T10:41:59-06:00" title="Thursday, July 18, 
2024 - 10:41">Thu, 07/18/2024 - 10:41</time> </span> <div> <div class="imageMediaStyle focal_image_wide"> <img loading="lazy" src="/atlas/sites/default/files/styles/focal_image_wide/public/article-thumbnail/suibi_ppi_award.jpg?h=68f59cd4&amp;itok=aZvQv4Zm" width="1200" height="800" alt="Suibi Che-Chuan Weng receives his award certificate "> </div> </div> <div role="contentinfo" class="container ucb-article-categories" itemprop="about"> <span class="visually-hidden">Categories:</span> <div class="ucb-article-category-icon" aria-hidden="true"> <i class="fa-solid fa-folder-open"></i> </div> <a href="/atlas/taxonomy/term/144"> News </a> </div> <div role="contentinfo" class="container ucb-article-tags" itemprop="keywords"> <span class="visually-hidden">Tags:</span> <div class="ucb-article-tag-icon" aria-hidden="true"> <i class="fa-solid fa-tags"></i> </div> <a href="/atlas/taxonomy/term/396" hreflang="en">ACME</a> <a href="/atlas/taxonomy/term/390" hreflang="en">do</a> <a href="/atlas/taxonomy/term/34" hreflang="en">news</a> <a href="/atlas/taxonomy/term/1426" hreflang="en">phd student</a> <a href="/atlas/taxonomy/term/374" hreflang="en">phdstudent</a> <a href="/atlas/taxonomy/term/773" hreflang="en">research</a> </div> <a href="/atlas/michael-kwolek">Michael Kwolek</a> <div class="ucb-article-content ucb-striped-content"> <div class="container"> <div class="paragraph paragraph--type--article-content paragraph--view-mode--default 3"> <div class="ucb-article-text" itemprop="articleBody"> <div><p>Partnerships between universities and industry can yield important research and commercial breakthroughs. 
ATLAS professor Ellen Do has worked to cultivate relationships between CU Boulder and industry players, including as a&nbsp;member of the Pervasive Personalized Intelligence (PPI) Center, to support graduate students and enhance opportunities for commercialization of ATLAS research.</p><p>The <a href="https://www.ppicenter.org/" rel="nofollow">PPI Center</a>, which recently concluded its tenure, was founded “with a mission of bringing industry and university talent together to solve the intelligence challenges faced by software and computer engineers in Internet of Things systems.” It operated under the supervision of the National Science Foundation and included members from NEC, Intel and Trimble.</p><blockquote><p><em>“It’s been such a good experience. We’ve learned a lot. Ellen Do and her team have helped to expand our thinking and encouraged us to explore new areas.”</em> - Dr. Haifeng Chen, Head of Data Science Department at NEC Laboratories, and his colleague Kai Ishikawa, Principal Researcher&nbsp;(PPI Center event recap)</p></blockquote><p>The PPI Center’s <a href="https://www.ppicenter.org/post/the-ppi-center-s-profound-impact-on-industry-faculty-students" rel="nofollow">Spring 2024 Industry Advisory Board Meeting</a> in Portland, OR, included a research poster session, and ATLAS students were honored with three of the four awards industry attendees voted on at the event.&nbsp;</p><ul><li><strong>Suibi Che-Chuan Weng</strong>, PhD student, won "Most Industry Ready" for <a href="/atlas/sites/default/files/attached-files/weng-editing_reality.pdf" rel="nofollow"><em>Editing Reality: Empowering Users to Manipulate Reality through Addition, Erasing, and Modification with Speech to Prompt in Mixed Reality</em></a>.</li><li><strong>Rishi Vanukuru</strong>, PhD student, won "Most Impactful" for <a href="/atlas/sites/default/files/attached-files/vanukuru-asynchronous_spatial_guidance.pdf" rel="nofollow"><em>Asynchronous spatial guidance using mobile devices and Augmented 
Reality</em></a>.</li><li><strong>Ada Zhao</strong>, MS student, won "Most Impactful" for <a href="/atlas/sites/default/files/attached-files/zhao-wizard_and_apprentice.pdf" rel="nofollow"><em>The WizARd and Apprentice: Augmented Reality Expert Capture for Training Novices</em></a>.</li></ul><p class="text-align-center"> </p><div class="imageMediaStyle medium_750px_50_display_size_"> <img loading="lazy" src="/atlas/sites/default/files/styles/medium_750px_50_display_size_/public/article-image/ppi_suibi.jpg?itok=5E30jhrA" width="750" height="563" alt="Suibi Che-Chuan Weng receives his award certificate"> </div> .&nbsp;&nbsp; <div class="imageMediaStyle medium_750px_50_display_size_"> <img loading="lazy" src="/atlas/sites/default/files/styles/medium_750px_50_display_size_/public/article-image/ppi_rishi.jpg?itok=hz2hKwGz" width="750" height="563" alt="Rishi Vanukuru receives his award certificate"> </div> &nbsp; &nbsp; <div class="imageMediaStyle medium_750px_50_display_size_"> <img loading="lazy" src="/atlas/sites/default/files/styles/medium_750px_50_display_size_/public/article-image/ppi_zhao.jpg?itok=gD50IKdc" width="750" height="563" alt="Ada Zhao receives her award certificate"> </div> <p>2 more ATLAS PhD students participated: <strong>Krithik Ranjan</strong> presented <a href="/atlas/sites/default/files/attached-files/ranjan-puppet_guide.pdf" rel="nofollow"><em>PuppetGuide: Tangible Personalized Museum Tour Guides using LLMs</em></a> and <strong>David Hunter</strong> presented <a href="/atlas/sites/default/files/attached-files/hunter-tangible_interaction.pdf" rel="nofollow"><em>Tangible Interaction with Object Detection and Large Language Models</em></a>.</p><p>As for the experience participating in the PPI Center, Do says, “it is good to know that the industry is interested in supporting research and considers our research relevant.” She sees ways ATLAS could form partnerships within several industry sectors on a range of themes due to the multidisciplinary 
nature of the research conducted here.</p><p>Since their involvement with the PPI Center began, Do and her team have had a series of meetings with mentors from global technology firms, discussing collaborative research opportunities.</p><p>Vanukuru is currently doing an internship at Microsoft Research Cambridge focused on spatial computing in its VR/AR group. Weng and Zhao are working on research in the ACME Lab this summer, extending the Editing Reality, PuppetGuide, and WizARd and Apprentice projects with interns from the <a href="/engineering/students/research-opportunities/summer-program-undergraduate-research-cu-spur" rel="nofollow">CU SPUR program</a>. Zhao is also conducting a pilot study, interviewing expert laser cutter operators about how they would demonstrate operations and annotate their demonstrations using the WizARd prototype for novice learners. Hunter has embarked on an internship with Trimble this summer, while he and Ranjan are also working in the ACME Lab.</p></div> </div> </div> </div> </div> <div>ACME Lab members built relationships with industry players through the Pervasive Personalized Intelligence (PPI) Center by collaborating on solutions to challenges in building Internet of Things systems. 
Three ATLAS students took home awards from the PPI Center's Spring 2024 Advisory Board Meeting.</div> <h2> <div class="paragraph paragraph--type--ucb-related-articles-block paragraph--view-mode--default"> <div>Off</div> </div> </h2> <div>Traditional</div> <div>0</div> <div>On</div> <div>White</div> Thu, 18 Jul 2024 16:41:59 +0000 Anonymous 4698 at /atlas ATLAS in Ireland: 12 community members present at TEI’24 /atlas/atlas-ireland-12-community-members-present-tei24 <span>ATLAS in Ireland: 12 community members present at TEI’24</span> <span><span>Anonymous (not verified)</span></span> <span><time datetime="2024-02-09T12:05:23-07:00" title="Friday, February 9, 2024 - 12:05">Fri, 02/09/2024 - 12:05</time> </span> <div> <div class="imageMediaStyle focal_image_wide"> <img loading="lazy" src="/atlas/sites/default/files/styles/focal_image_wide/public/article-thumbnail/screenshot_2024-02-09_at_12.09.34_pm.png?h=8681559e&amp;itok=KvBy9zBf" width="1200" height="800" alt="Art and Demo Exhibition Venue building on the harbor in Cork, Ireland"> </div> </div> <div role="contentinfo" class="container ucb-article-categories" itemprop="about"> <span class="visually-hidden">Categories:</span> <div class="ucb-article-category-icon" aria-hidden="true"> <i class="fa-solid fa-folder-open"></i> </div> <a href="/atlas/taxonomy/term/703"> Feature </a> <a href="/atlas/taxonomy/term/144"> News </a> </div> <div role="contentinfo" class="container ucb-article-tags" itemprop="keywords"> <span class="visually-hidden">Tags:</span> <div class="ucb-article-tag-icon" aria-hidden="true"> <i class="fa-solid fa-tags"></i> </div> <a href="/atlas/taxonomy/term/396" hreflang="en">ACME</a> <a href="/atlas/taxonomy/term/729" hreflang="en">alistar</a> <a href="/atlas/taxonomy/term/342" hreflang="en">devendorf</a> <a href="/atlas/taxonomy/term/390" hreflang="en">do</a> <a href="/atlas/taxonomy/term/168" hreflang="en">feature</a> <a href="/atlas/taxonomy/term/514" hreflang="en">gyory</a> <a 
href="/atlas/taxonomy/term/731" hreflang="en">living matter</a> <a href="/atlas/taxonomy/term/34" hreflang="en">news</a> <a href="/atlas/taxonomy/term/376" hreflang="en">unstable</a> <a href="/atlas/taxonomy/term/883" hreflang="en">yang</a> <a href="/atlas/taxonomy/term/641" hreflang="en">zheng</a> </div> <a href="/atlas/michael-kwolek">Michael Kwolek</a> <div class="ucb-article-content ucb-striped-content"> <div class="container"> <div class="paragraph paragraph--type--article-content paragraph--view-mode--default 3"> <div class="ucb-article-text" itemprop="articleBody"> <div> <div class="align-right image_style-small_500px_25_display_size_"> <div class="imageMediaStyle small_500px_25_display_size_"> <img loading="lazy" src="/atlas/sites/default/files/styles/small_500px_25_display_size_/public/article-image/93b9319e-7438-f5ee-2a56-bc5dd1fd765d.png?itok=R-va1_rw" width="375" height="375" alt="TEI 2024 logo"> </div> </div> <p>ATLAS is well-represented at #TEI2024 - the 18th ACM International Conference on Tangible, Embedded and Embodied Interaction. This year’s conference, in Cork, Ireland, celebrates “cutting-edge scientific research and art that is on the edge of disciplines and on the edge of new unique developments and possibilities.”</p><p>Research from 12 members of the ATLAS community including faculty, alumni and students is featured at the conference. The work spans a range of disciplines, including weaving, biomaterials, mixed reality and robotics. 
In addition, ACME Lab director Ellen Do served as Co-Chair of the Graduate Student Consortium; PhD student Sandra Bae was an Associate Chair for Pictorials; and ATLAS PhD alum Fiona Bell was an Associate Chair for Papers.</p><p><strong>Research ATLAS PhD students presented at TEI’24</strong><br><br><a href="https://doi.org/10.1145/3623509.3633358" rel="nofollow"><strong>Loom Pedals: Retooling Jacquard Weaving for Improvisational Design Workflows</strong></a><br><a href="/atlas/shanel-wu" rel="nofollow"><strong>Shanel Wu</strong></a><strong>, </strong><a href="/atlas/xavier-corr" rel="nofollow"><strong>Xavier A Corr</strong></a><strong>, Xi Gao, </strong><a href="/atlas/sasha-de-koninck" rel="nofollow"><strong>Sasha De Koninck</strong></a><strong>, Robin Bowers, and</strong><a href="/atlas/laura-devendorf" rel="nofollow"><strong> Laura Devendorf</strong></a></p><p><strong>Abstract</strong>: We present the Loom Pedals, an open-source hardware/software interface for enhancing a weaver’s ability to create on-the-fly, improvised designs in Jacquard weaving. Learning from traditional handweaving and our own weaving experiences, we describe our process of designing, implementing, and using the prototype Loom Pedals system with a TC2 Digital Jacquard loom. The Loom Pedals include a set of modular, reconfigurable foot pedals which can be mapped to parametric Operations that generate and transform digital woven designs. Our novel interface integrates design and loom control, providing a customizable workflow for playful, improvisational Jacquard weaving. We conducted a formative evaluation of the prototype through autobiographical methods and collaboratively developed future Loom Pedals features. 
We contribute our prototype, design process, and conceptual reflections on weaving as a human-machine dialog between a weaver, the loom, and many other agents.</p><p><a href="https://doi.org/10.1145/3623509.3633386" rel="nofollow"><strong>Bio-Digital Calendar: Attuning to Nonhuman Temporalities for Multispecies Understanding</strong></a><br><a href="/atlas/fiona-bell" rel="nofollow"><strong>Fiona Bell</strong></a><strong>, </strong><a href="/atlas/joshua-coffie" rel="nofollow"><strong>Joshua Coffie</strong></a><strong>, and </strong><a href="/atlas/mirela-alistar" rel="nofollow"><strong>Mirela Alistar</strong></a></p><p><strong>Abstract</strong>:&nbsp;We explore how actively engaging with the temporalities of a nonhuman organism can lead to multispecies understanding. To do so, we design a bio-digital calendar that brings attention to the growth and health of kombucha SCOBY, a symbiotic culture of bacteria and yeast that lives in a tea medium. The non-invasive bio-digital calendar surrounds the kombucha SCOBY to track (via sensors) and enhance (via sound) its growth. As we looked at and listened to our kombucha SCOBY calendar on a daily basis, we became attuned to the slowness of kombucha SCOBY. This multisensory noticing practice with the calendar, in turn, destabilized our preconceived human-centered positionality, leading to a more humble, decentered relationship between us and the organism. 
Through our experiences with the bio-digital calendar, we gained a better relational multispecies understanding of temporalities based on care, which, in the long term, might be a solution to a more sustainable future.</p><p><a href="https://doi.org/10.1145/3623509.3633395" rel="nofollow"><strong>Wizard of Props: Mixed Reality Prototyping with Physical Props to Design Responsive Environments</strong></a><br><strong>Yuzhen Zhang, Ruixiang Han, </strong><a href="/atlas/ran-zhou" rel="nofollow"><strong>Ran Zhou</strong></a><strong>, </strong><a href="/atlas/peter-gyory" rel="nofollow"><strong>Peter Gyory</strong></a><strong>, </strong><a href="/atlas/clement-zheng" rel="nofollow"><strong>Clement Zheng</strong></a><strong>, Patrick C. Shih, </strong><a href="/atlas/ellen-yi-luen-do" rel="nofollow"><strong>Ellen Yi-Luen Do</strong></a><strong>, Malte F Jung, Wendy Ju, and </strong><a href="/atlas/daniel-leithinger" rel="nofollow"><strong>Daniel Leithinger</strong></a></p><p><strong>Abstract</strong>:&nbsp;Driven by the vision of future responsive environments, where everyday surroundings can perceive human behaviors and respond through intelligent robotic actuation, we propose Wizard of Props (WoP): a human-centered design workflow for creating expressive, implicit, and meaningful interactions. This collaborative experience prototyping approach integrates full-scale physical props with Mixed Reality (MR) to support ideation, prototyping, and rapid testing of responsive environments. We present two design explorations that showcase our investigations of diverse design solutions based on varying technology resources, contextual considerations, and target audiences. Design Exploration One focuses on mixed environment building, where we observe fluid prototyping methods. In Design Exploration Two, we explore how novice designers approach WoP, and illustrate their design ideas and behaviors. 
Our findings reveal that WoP complements conventional design methods, enabling intuitive body-storming, supporting flexible prototyping fidelity, and fostering expressive environment-human interactions through in-situ improvisational performance.</p><p><a href="https://doi.org/10.1145/3623509.3634740" rel="nofollow"><strong>Making Biomaterials for Sustainable Tangible Interfaces</strong></a><br><a href="/atlas/fiona-bell" rel="nofollow"><strong>Fiona Bell</strong></a><strong>, </strong><a href="/atlas/shanel-wu" rel="nofollow"><strong>Shanel Wu</strong></a><strong>, Nadia Campo Woytuk, </strong><a href="/atlas/eldy-lazaro" rel="nofollow"><strong>Eldy S. Lazaro Vasquez</strong></a><strong>, </strong><a href="/atlas/mirela-alistar" rel="nofollow"><strong>Mirela Alistar</strong></a><strong>, and Leah Buechley</strong></p><p><strong>Abstract</strong>:&nbsp;In this studio, we will explore sustainable tangible interfaces by making a range of biomaterials that are bio-based and readily biodegradable. Building off of previous TEI studios that were centered around one specific biomaterial (i.e., bioplastics at TEI’22 and microbial cellulose at TEI’23), this studio will provide participants the ability to experience a wide variety of biomaterials from algae-based bioplastics, to food-waste-based bioclays, to gelatin-based biofoams. We will teach participants how to identify types of biomaterials that are applicable to their own research and how to make them. Through hands-on activities, we will demonstrate how to implement biomaterials in the design of sustainable tangible interfaces and discuss topics sensitized by biological media such as more-than-human temporalities, bioethics, care, and unmaking. 
Ultimately, our goal is to facilitate a space in which HCI researchers and designers can collaborate, create, and discuss the opportunities and challenges of working with sustainable biomaterials.</p><p><a href="https://dl.acm.org/doi/10.1145/3623509.3634899" rel="nofollow"><strong>Paper Modular Robot: Circuit, Sensation Feedback, and 3D Geometry</strong></a><br><a href="/atlas/ruhan-yang" rel="nofollow"><strong>Ruhan Yang</strong></a></p><p><strong>Abstract</strong>: Modular robots have proven valuable for STEM education. However, modular robot kits are often expensive, which makes them limited in accessibility. My research focuses on using paper and approachable techniques to create modular robots. The kit’s design encompasses three core technologies: paper circuits, sensation feedback mechanisms, and 3D geometry. I have developed proof-of-concept demonstrations of technologies for each aspect. I will integrate these technologies to design and build a paper modular robot kit. This kit includes various types of modules for input, output, and other functions. My dissertation will discuss the development of these technologies and how they are integrated. 
This research will address the considerations and techniques for paper as an interactive material, providing a guideline for future research and development of paper-based interaction.</p><p>&nbsp;</p></div> </div> </div> </div> </div> <div>Research from 12 members of the ATLAS community including faculty, alumni and students is featured at the 18th ACM International Conference on Tangible, Embedded and Embodied Interaction.</div> <h2> <div class="paragraph paragraph--type--ucb-related-articles-block paragraph--view-mode--default"> <div>Off</div> </div> </h2> <div>Traditional</div> <div>0</div> <div>On</div> <div>White</div> Fri, 09 Feb 2024 19:05:23 +0000 Anonymous 4676 at /atlas Ellen Yi-Luen Do Presents Keynote on Fun with Creative Technology & Design at TaiCHI 2023 /atlas/2023/09/13/ellen-yi-luen-do-presents-keynote-fun-creative-technology-design-taichi-2023 <span>Ellen Yi-Luen Do Presents Keynote on Fun with Creative Technology &amp; Design at TaiCHI 2023</span> <span><span>Anonymous (not verified)</span></span> <span><time datetime="2023-09-13T12:51:39-06:00" title="Wednesday, September 13, 2023 - 12:51">Wed, 09/13/2023 - 12:51</time> </span> <div> <div class="imageMediaStyle focal_image_wide"> <img loading="lazy" src="/atlas/sites/default/files/styles/focal_image_wide/public/article-thumbnail/ellen_speaking.jpeg?h=3fb1951d&amp;itok=5X8R_Kv_" width="1200" height="800" alt="Do speaking on stage at TaiCHI 2023"> </div> </div> <div role="contentinfo" class="container ucb-article-categories" itemprop="about"> <span class="visually-hidden">Categories:</span> <div class="ucb-article-category-icon" aria-hidden="true"> <i class="fa-solid fa-folder-open"></i> </div> <a href="/atlas/taxonomy/term/703"> Feature </a> <a href="/atlas/taxonomy/term/144"> News </a> </div> <div role="contentinfo" class="container ucb-article-tags" itemprop="keywords"> <span class="visually-hidden">Tags:</span> <div class="ucb-article-tag-icon" aria-hidden="true"> <i class="fa-solid 
fa-tags"></i> </div> <a href="/atlas/taxonomy/term/396" hreflang="en">ACME</a> <a href="/atlas/taxonomy/term/390" hreflang="en">do</a> <a href="/atlas/taxonomy/term/168" hreflang="en">feature</a> <a href="/atlas/taxonomy/term/34" hreflang="en">news</a> <a href="/atlas/taxonomy/term/773" hreflang="en">research</a> </div> <a href="/atlas/michael-kwolek">Michael Kwolek</a> <div class="ucb-article-content ucb-striped-content"> <div class="container"> <div class="paragraph paragraph--type--article-content paragraph--view-mode--default 3"> <div class="ucb-article-text" itemprop="articleBody"> <div> <div class="align-right image_style-small_500px_25_display_size_"> <div class="imageMediaStyle small_500px_25_display_size_"> <img loading="lazy" src="/atlas/sites/default/files/styles/small_500px_25_display_size_/public/article-image/image.png?itok=baMRG0m0" width="375" height="93" alt="TaiCHI logo"> </div> </div> <p dir="ltr">ATLAS Professor Ellen Yi-Luen Do had the opportunity to be a keynote speaker at <a href="https://taichi2023.taiwanchi.org/" rel="nofollow">TaiCHI 2023</a>, a symposium hosted by the Taiwan Human-Computer Interaction Society at Taiwan University in Taipei. The event gathered researchers and practitioners across a range of backgrounds in technology, design and human factors to deepen community connections and explore new ideas.&nbsp;</p><p dir="ltr">Sessions included presentations on fabrication, perception, interactions and other timely topics, with a surprising range in mediums from humble materials like felt and puppets to advanced VR technologies and metaverse interactivity.</p><p dir="ltr">As director of the <a href="/atlas/acme-lab" rel="nofollow">ACME Lab</a> at ATLAS, Do and her team conduct research on using everyday items as interfaces, creating objects to think with, new ways of working, and methods and tools to help others make things. 
Do delivered her presentation, entitled “Fun with Creative Technology &amp; Design”, advocating for playful computing with easily accessible materials like paper and cardboard, while highlighting ways to make toolkits for others to create for themselves.&nbsp;</p> <div class="align-center image_style-medium_750px_50_display_size_"> <div class="imageMediaStyle medium_750px_50_display_size_"> <img loading="lazy" src="/atlas/sites/default/files/styles/medium_750px_50_display_size_/public/article-image/fun_with_creative_technology.jpg?itok=Yf7tNubw" width="750" height="379" alt="Title slide of Do's presentation on Fun with Creative Technology and Design"> </div> </div> <p dir="ltr">&nbsp;</p><p dir="ltr">The audience, which included experts in computer science, psychology, media, art, design and business responded enthusiastically, finding common ground in this relatable, inclusive approach to otherwise complex technologies. Do received a particularly warm reception from students in the field. She noted, “Several students came to thank me for my talk, stating that they learned so much from me, and that they never thought research could be this fun and interesting.”&nbsp;</p> <div class="align-center image_style-medium_750px_50_display_size_"> <div class="imageMediaStyle medium_750px_50_display_size_"> <img loading="lazy" src="/atlas/sites/default/files/styles/medium_750px_50_display_size_/public/article-image/ellen_speaking.jpeg?itok=WZ9fgX8N" width="750" height="500" alt="Ellen Do speaking on stage"> </div> </div> <p dir="ltr">&nbsp;</p><p dir="ltr">Do expressed excitement for a few standout presentations from the conference including <a href="https://www.edchi.net/" rel="nofollow">Ed Chi</a>, Distinguished Scientist at Google DeepMind, who delivered a keynote on the large language model revolution. 
She said, “I was happy to learn that Bard will be a tool-use application applying to many of the Google apps and services people already use, including Maps, Sheets, Gmail, Docs, and more.”&nbsp;</p><p dir="ltr">She also called out <a href="https://www.youtube.com/watch?v=IbRG8cLv4mo" rel="nofollow">FeltingReel: Density Varying Soft Fabrication with Reeling and Felting</a> by Ping-Yi Wang and Lung-Pan Cheng as particularly intriguing.</p><p dir="ltr">Back in 2015, Do wrote the article “<a href="https://dl.acm.org/doi/10.1145/2694475" rel="nofollow">A flourishing field: a guide to HCI in China, Taiwan, and Singapore</a>”, and saw the founding of Taiwan HCI. Looking back, she reflects, “I’m happy to see TaiCHI 2023 have 300 people registered with vibrant discussions, demos and posters. It's definitely growing!”</p> <div class="align-center image_style-medium_750px_50_display_size_"> <div class="imageMediaStyle medium_750px_50_display_size_"> <img loading="lazy" src="/atlas/sites/default/files/styles/medium_750px_50_display_size_/public/article-image/group_shot.jpeg?itok=eWyTxcEw" width="750" height="500" alt="Group shot of TaiCHI 2023 attendees"> </div> </div> </div> </div> </div> </div> </div> <div>ATLAS Professor Ellen Yi-Luen Do presented on Fun with Creative Technology &amp; Design as keynote speaker at TaiCHI 2023.</div> <h2> <div class="paragraph paragraph--type--ucb-related-articles-block paragraph--view-mode--default"> <div>Off</div> </div> </h2> <div>Traditional</div> <div>0</div> <div>On</div> <div>White</div> Wed, 13 Sep 2023 18:51:39 +0000 Anonymous 4634 at /atlas 16 Members of the ATLAS Community Present Groundbreaking Research on Human-Computer Interaction at ACM DIS 2023 /atlas/2023/07/05/16-members-atlas-community-present-groundbreaking-research-human-computer-interaction-acm <span>16 Members of the ATLAS Community Present Groundbreaking Research on Human-Computer Interaction at ACM DIS 2023</span> <span><span>Anonymous (not verified)</span></span> 
<span><time datetime="2023-07-05T13:43:45-06:00" title="Wednesday, July 5, 2023 - 13:43">Wed, 07/05/2023 - 13:43</time> </span> <div> <div class="imageMediaStyle focal_image_wide"> <img loading="lazy" src="/atlas/sites/default/files/styles/focal_image_wide/public/article-thumbnail/artboard_3.png?h=fe6e0176&amp;itok=NfZZ8GUu" width="1200" height="800" alt="DIS23 logo"> </div> </div> <div role="contentinfo" class="container ucb-article-categories" itemprop="about"> <span class="visually-hidden">Categories:</span> <div class="ucb-article-category-icon" aria-hidden="true"> <i class="fa-solid fa-folder-open"></i> </div> <a href="/atlas/taxonomy/term/703"> Feature </a> <a href="/atlas/taxonomy/term/144"> News </a> </div> <div role="contentinfo" class="container ucb-article-tags" itemprop="keywords"> <span class="visually-hidden">Tags:</span> <div class="ucb-article-tag-icon" aria-hidden="true"> <i class="fa-solid fa-tags"></i> </div> <a href="/atlas/taxonomy/term/729" hreflang="en">alistar</a> <a href="/atlas/taxonomy/term/1181" hreflang="en">bsctd</a> <a href="/atlas/taxonomy/term/342" hreflang="en">devendorf</a> <a href="/atlas/taxonomy/term/390" hreflang="en">do</a> <a href="/atlas/taxonomy/term/168" hreflang="en">feature</a> <a href="/atlas/taxonomy/term/1463" hreflang="en">leslie</a> <a href="/atlas/taxonomy/term/731" hreflang="en">living matter</a> <a href="/atlas/taxonomy/term/1269" hreflang="en">msctd</a> <a href="/atlas/taxonomy/term/34" hreflang="en">news</a> <a href="/atlas/taxonomy/term/1426" hreflang="en">phd student</a> <a href="/atlas/taxonomy/term/773" hreflang="en">research</a> <a href="/atlas/taxonomy/term/1511" hreflang="en">rivera</a> <a href="/atlas/taxonomy/term/376" hreflang="en">unstable</a> <a href="/atlas/taxonomy/term/1510" hreflang="en">utility</a> </div> <a href="/atlas/michael-kwolek">Michael Kwolek</a> <div class="ucb-article-content ucb-striped-content"> <div class="container"> <div class="paragraph paragraph--type--article-content 
paragraph--view-mode--default 3"> <div class="ucb-article-text" itemprop="articleBody"> <div><p dir="ltr">ATLAS is well-represented at this year’s <a href="https://dis.acm.org/2023/" rel="nofollow">ACM Designing Interactive Systems (DIS) 2023 </a>conference convening at Carnegie Mellon University in Pittsburgh from July 10-14, 2023. This year’s theme is <strong>resilience</strong>.&nbsp;</p> <div class="align-right image_style-small_500px_25_display_size_"> <div class="imageMediaStyle small_500px_25_display_size_"> <img loading="lazy" src="/atlas/sites/default/files/styles/small_500px_25_display_size_/public/article-image/artboard_1.png?itok=8RWNHC1P" width="375" height="90" alt="DIS23 rebuilding &amp; resilience logo"> </div> </div> <p dir="ltr"><em>"Resilience is at once about flexibility, durability, and strength as well as a sense of mutuality and hope where solidaristic modes of engagement make new kinds of worlds possible.&nbsp;</em></p><p dir="ltr"><em>This also recognizes that resilience takes many forms in design discourse, ranging across: indigenous knowledge, more-than-human perspectives, and the relationship between human, material and artificial intelligences.</em>"</p><p dir="ltr">It is exciting to see members across more than half of ATLAS labs represented in this year’s proceedings, with broad-reaching research covering microbiomes as materials for interactive design; 3D printing with spent coffee grounds; personal informatics systems; improving cross-disciplinary collaboration among artists and researchers; expressive movement for altering emotions and awareness; and the intersection of crocheting and data. 
Take a look:</p><p dir="ltr"><a href="https://programs.sigchi.org/dis/2023/program/content/118180" rel="nofollow"><strong>µMe: Exploring the Human Microbiome as an Intimate Material for Living Interfaces</strong></a><br><a href="/atlas/fiona-bell" rel="nofollow"><em>Fiona Bell</em></a><em> (ATLAS PhD alum), </em><a href="/atlas/michelle-ramsahoye" rel="nofollow"><em>Michelle Ramsahoye</em></a><em> (ATLAS affiliate PhD student), </em><a href="/atlas/joshua-coffie" rel="nofollow"><em>Joshua Coffie</em></a><em>&nbsp;(ATLAS MS alum), </em><a href="/atlas/julia-tung" rel="nofollow"><em>Julia Tung</em></a><em> (ATLAS BS student), and </em><a href="/atlas/mirela-alistar" rel="nofollow"><em>Mirela Alistar</em></a><em> (ATLAS Living Matter Lab director, assistant professor)</em></p><p dir="ltr">Our bodies are home to an unseen ecosystem of microbes that live in symbiosis with us. In this work, we extend the “human” in Human-Computer Interaction (HCI) to include these microbes. Specifically, we explore the skin microbiome as an intimate material for interaction design. Viewing the body as a microbial interface, we start by presenting a method to grow our microbiome such that it becomes visible to the human eye. We then present a design space that explores how different environmental parameters, such as temperature and growth media, can be controlled to influence the color of the microbiome. We further investigate how our interactions in a daily uncontrolled environment (e.g., exercising, hugging, typing) can impact the microbiome. We demonstrate several wearable applications that reveal and control the microbiome. 
Lastly, we address the challenges and opportunities of working with the microbiome as an intimate, living material for interaction design.</p> <div class="align-center image_style-medium_750px_50_display_size_"> <div class="imageMediaStyle medium_750px_50_display_size_"> <img loading="lazy" src="/atlas/sites/default/files/styles/medium_750px_50_display_size_/public/article-image/human_microbiome.png?itok=1-iayA_x" width="750" height="268" alt="Human microbiome research image collage"> </div> </div> <p dir="ltr">&nbsp;</p><p dir="ltr"><br><a href="https://programs.sigchi.org/dis/2023/program/content/118166" rel="nofollow"><strong>Designing a Sustainable Material for 3D Printing with Spent Coffee Grounds</strong></a><br><a href="/atlas/michael-rivera" rel="nofollow"><em>Michael L. Rivera</em></a><em> (ATLAS Utility Research Lab Director, assistant professor), </em><a href="/atlas/sandra-bae" rel="nofollow"><em>S. Sandra Bae</em></a><em> (ATLAS PhD student)</em></p><p dir="ltr">The widespread adoption of 3D printers exacerbates existing environmental challenges as these machines increase energy consumption, waste output, and the use of plastics. Material choice for 3D printing is tightly connected to these challenges, and as such researchers and designers are exploring sustainable alternatives. Building on these efforts, this work explores using spent coffee grounds as a sustainable material for prototyping with 3D printing. This material, in addition to being compostable and recyclable, can be easily made and printed at home. We describe the material in detail, including the process of making it from readily available ingredients, its material characteristics and its printing parameters. We then explore how it can support sustainable prototyping practices as well as HCI applications. In reflecting on our design process, we discuss challenges and opportunities for the HCI community to support sustainable prototyping and personal fabrication. 
We conclude with a set of design considerations for others to weigh when exploring sustainable materials for 3D printing and prototyping.</p><p dir="ltr"><em>For additional details, see </em><a href="/atlas/2023/05/08/atlas-innovators-win-big-reprap-festival" rel="nofollow"><em>our article</em></a><em> on how this and other Utility Research Lab projects won awards at the Rocky Mountain RepRap Festival.</em></p> <div class="align-center image_style-medium_750px_50_display_size_"> <div class="imageMediaStyle medium_750px_50_display_size_"> <img loading="lazy" src="/atlas/sites/default/files/styles/medium_750px_50_display_size_/public/article-image/rivera_coffee_cups_0.jpg?itok=adP-SmiS" width="750" height="477" alt="Michael Rivera spent coffee grounds 3D printed mugs"> </div> </div> <p dir="ltr">&nbsp;</p><p dir="ltr"><br><a href="https://programs.sigchi.org/dis/2023/program/content/118135" rel="nofollow"><strong>Being, Having, Doing, and Interacting: A Personal Informatics Approach to Understanding Human Need Satisfaction in Everyday Life</strong></a><br><em>Michael Jeffrey Daniel Hoefer, </em><a href="/atlas/stephen-voida" rel="nofollow"><em>Stephen Voida</em></a><em> (ATLAS affiliate assistant professor, founding faculty, information science)</em></p><p dir="ltr">A grand challenge for computing is to better understand fundamental human needs and their satisfaction. In this work, we design a personal informatics technology probe that scaffolds reflection on how time-use satisfies Max-Neef's fundamental needs of being, having, doing, and interacting via self-aspects, relationships and organizations, activities, and environments. Through a combination of a think-aloud study (N=10) and a week-long in situ deployment (N=7), participants used the probe to complete self-aspect elicitation and Day Reconstruction Method tasks. Participants then interacted with network visualizations of their daily lives, and discovered insights about their lives. 
During the study, we collected a dataset of 662 activities annotated with need satisfaction ratings. Despite challenges in operationalizing a theory of need through direct elicitation from individuals, personal informatics systems show potential as a participatory and individually meaningful approach for understanding need satisfaction in everyday life.</p><p dir="ltr"><br><br>&nbsp;</p><p><a href="https://www.softrobotics.io/dis23" rel="nofollow"><strong>Enhancing Accessibility in Soft Robotics: Exploring Magnet-Embedded Paper-Based Interactions</strong></a><br><a href="/atlas/ruhan-yang" rel="nofollow"><em>Ruhan Yang</em></a><em> (ATLAS PhD student),&nbsp;</em><a href="/atlas/ellen-yi-luen-do" rel="nofollow"><em>Ellen Yi-Luen Do</em></a><em> (ATLAS ACME Lab director,&nbsp;faculty member)</em></p><p>This paper explores the implementation of embedded magnets to enhance paper-based interactions. The integration of magnets in paper-based interactions simplifies the fabrication process, making it more accessible for building soft robotics systems. 
We discuss various interaction patterns achievable through this approach and highlight their potential applications.</p><p>&nbsp;</p><p><br><strong>[Workshop] </strong><a href="https://programs.sigchi.org/dis/2023/program/content/118476" rel="nofollow"><strong>Towards Mutual Benefit: Reflecting on Artist Residencies as a Method for Collaboration in DIS</strong></a><br><a href="/atlas/laura-devendorf" rel="nofollow"><em>Laura Devendorf</em></a><em> (ATLAS Unstable Design Lab director, assistant professor), Leah Buechley, Noura Howell, Jennifer Jacobs, Hsin-Liu (Cindy) Kao, Martin Murer, Daniela Rosner, Nica Ross, Robert Soden, Jared Tso, </em><a href="/atlas/clement-zheng" rel="nofollow"><em>Clement Zheng</em></a><em> (ATLAS PhD alum)</em></p><p dir="ltr">While cross-disciplinary collaboration has long been, and continues to be, a cornerstone of inventive work in interactive design, the infrastructures of academia, as well as barriers to participation imposed by our professional organizations, make collaboration harder for some groups than for others. In this workshop, we’ll focus specifically on how artist residencies are addressing (or not) the challenges that artists, craftspeople, and/or independent designers face when collaborating with researchers affiliated with DIS. While focusing on the question “what is mutual benefit”, this workshop seeks to combine the perspectives of artists as well as researchers collaborating with artists (through residencies or otherwise) to (1) reflect on benefits or deficiencies in what we are currently doing and (2) generate resources for our community to effectively structure and evaluate our methods of collaboration with artists. 
Our hope is to provide recognition of and pathways for equitable inclusion of artists as a first step towards broader infrastructural change.&nbsp;</p><p dir="ltr"><em>Refer to the </em><a href="https://unstable.design/mutualbenefit/" rel="nofollow"><em>Unstable Design Lab website</em></a><em> for more details on this research.&nbsp;</em><br><br>&nbsp;</p><p><strong>[Demo] </strong><a href="https://programs.sigchi.org/dis/2023/program/content/118533" rel="nofollow"><strong>SoniSpace: Expressive Movement Interaction to Encourage Taking Up Space with the Body</strong></a><br><a href="/atlas/ruojia-sun" rel="nofollow"><em>Ruojia Sun</em></a><em> (ATLAS PhD student), </em><a href="/atlas/althea-wallop" rel="nofollow"><em>Althea Vail Wallop</em></a><em> (ATLAS MS student), </em><a href="/atlas/grace-leslie" rel="nofollow"><em>Grace Leslie</em></a><em> (ATLAS Brain Music Lab director, assistant professor), </em><a href="/atlas/ellen-yi-luen-do" rel="nofollow"><em>Ellen Yi-Luen Do</em></a><em> (ATLAS ACME Lab director,&nbsp;faculty member)</em></p><p dir="ltr">Movement forms the basis of our thoughts, emotions, and ways of being in the world. Informed by somaesthetics, we design for "taking up space" (e.g. encouraging expansive body movements), which may in turn alter our emotional experience. We demonstrate SoniSpace, an expressive movement interaction experience that uses movement sonification and visualization to encourage users to take up space with their body. We use a first-person design approach to embed qualities of awareness, exploration, and comfort into the sound and visual design to promote authentic and enjoyable movement expression regardless of prior movement experience. Preliminary results from 20 user experiences with the system show that users felt more comfortable with taking up space and with movement in general following the interaction. 
We discuss our findings about designing for somatically-focused movement interactions and directions for future work.</p><p dir="ltr">&nbsp;</p><p dir="ltr"><br><strong>[Demo] </strong><a href="https://programs.sigchi.org/dis/2023/program/content/118473" rel="nofollow"><strong>Crochet and Data Activity Book</strong></a><br><a href="/atlas/mikhaila-friske" rel="nofollow"><em>Mikhaila Friske</em></a><em> (ATLAS affiliate PhD student)</em></p><p dir="ltr">This demo focuses on crocheting and data. In addition to a physical workbook for conference goers to peruse and try, there will be a few small set-ups for specific activities and a small craft circle for people to craft within if they so choose.</p></div> </div> </div> </div> </div> <h2> <div class="paragraph paragraph--type--ucb-related-articles-block paragraph--view-mode--default"> <div>Off</div> </div> </h2> <div>Traditional</div> <div>0</div> <div>On</div> <div>White</div> Wed, 05 Jul 2023 19:43:45 +0000 Anonymous 4568 at /atlas ATLAS Members Explore Childhood Play and Learning Through Interactive Design at IDC 2023 /atlas/2023/06/19/atlas-members-explore-childhood-play-and-learning-through-interactive-design-idc-2023 <span>ATLAS Members Explore Childhood Play and Learning Through Interactive Design at IDC 2023</span> <span><span>Anonymous (not verified)</span></span> <span><time datetime="2023-06-19T12:29:52-06:00" title="Monday, June 19, 2023 - 12:29">Mon, 06/19/2023 - 12:29</time> </span> <div> <div class="imageMediaStyle focal_image_wide"> <img loading="lazy" src="/atlas/sites/default/files/styles/focal_image_wide/public/article-thumbnail/idc_article_thumb.jpg?h=efa0e3d8&amp;itok=D5UqygFE" width="1200" height="800" alt="IDC conference logo"> </div> </div> <div role="contentinfo" class="container ucb-article-categories" itemprop="about"> <span class="visually-hidden">Categories:</span> <div class="ucb-article-category-icon" aria-hidden="true"> <i class="fa-solid fa-folder-open"></i> </div> <a 
href="/atlas/taxonomy/term/703"> Feature </a> <a href="/atlas/taxonomy/term/144"> News </a> </div> <div role="contentinfo" class="container ucb-article-tags" itemprop="keywords"> <span class="visually-hidden">Tags:</span> <div class="ucb-article-tag-icon" aria-hidden="true"> <i class="fa-solid fa-tags"></i> </div> <a href="/atlas/taxonomy/term/396" hreflang="en">ACME</a> <a href="/atlas/taxonomy/term/400" hreflang="en">THING</a> <a href="/atlas/taxonomy/term/390" hreflang="en">do</a> <a href="/atlas/taxonomy/term/168" hreflang="en">feature</a> <a href="/atlas/taxonomy/term/392" hreflang="en">leithinger</a> <a href="/atlas/taxonomy/term/34" hreflang="en">news</a> <a href="/atlas/taxonomy/term/773" hreflang="en">research</a> <a href="/atlas/taxonomy/term/1511" hreflang="en">rivera</a> <a href="/atlas/taxonomy/term/1510" hreflang="en">utility</a> </div> <div class="ucb-article-content ucb-striped-content"> <div class="container"> <div class="paragraph paragraph--type--article-content paragraph--view-mode--default 3"> <div class="ucb-article-text" itemprop="articleBody"> <div> <div class="align-center image_style-medium_750px_50_display_size_"> <div class="imageMediaStyle medium_750px_50_display_size_"> <img loading="lazy" src="/atlas/sites/default/files/styles/medium_750px_50_display_size_/public/article-image/idc_article_banner.png?itok=4CKQiCk6" width="750" height="117" alt="IDC conference logo with background in the style of children's drawings"> </div> </div> <p dir="ltr">&nbsp;</p><p dir="ltr">11 ATLAS community members have contributed to work featured at the 22nd annual <a href="https://idc.acm.org/2023/" rel="nofollow">ACM Interaction Design and Children (IDC) Conference</a> to be held on June 19-23, 2023 at Northwestern University in Chicago, Illinois. 
IDC is the premier international conference for researchers, educators and practitioners to share the latest research findings, innovative methodologies and new technologies in the areas of inclusive child-centered design, learning and interaction. IDC’23 is hosted by the Center for Computer Science and Learning Sciences at Northwestern University.</p><p>Coming out of the pandemic, this year’s theme asks participants to “rediscover childhood” to understand what it means to be a child in this and coming decades and what adults can do to provide a sustainable and equitable future for the next generation. Key topics include privacy, ethics, equity, social and emotional wellbeing, sustainability, and healthy human development.</p><p>&nbsp;</p><h3 dir="ltr">Research presented by ATLAS faculty, students and affiliates</h3><p dir="ltr"><a href="https://dl.acm.org/doi/10.1145/3585088.3589359" rel="nofollow"><strong>Designing Together, Miles Apart: A Longitudinal Tabletop Telepresence Adventure in Online Co-Design with Children</strong></a><br><a href="/atlas/casey-hunt" rel="nofollow"><em>Casey Lee Hunt</em></a><em> (ATLAS THING Lab member, PhD student), Kaiwen Sun, Zahra Dhuliawala, Fumi Tsukiyama, Iva Matkovic, </em><a href="/atlas/zachary-schwemler" rel="nofollow"><em>Zachary Schwemler</em></a><em> (ATLAS MS alumnus), Anastasia Wolf, Zihao Zhang, Allison Druin, Amanda Huynh, </em><a href="/atlas/daniel-leithinger" rel="nofollow"><em>Daniel Leithinger</em></a><em> (ATLAS THING Lab Director, Computer Science faculty member), Jason Yip</em></p><p dir="ltr">Children’s online co-design has become prevalent since COVID-19. However, related research focuses on insights gained across several shorter-term projects, rather than longitudinal investigations. To explore longitudinal co-design online, we engaged in participatory design with children (ages 8 - 12) for 20 sessions in two years on a single project: an online collaboration platform with tabletop telepresence robots. 
We found that (1) the online technology space required children to play a role as technology managers and troubleshooters, (2) the home setting shaped online social dynamics, and (3) providing children the ability to choose their design techniques prevented gridlock from situational uncertainties. We discuss how each finding resulted from interplay between our long-term technology design and online co-design processes. We then present insights about the future of online co-design, a conceptual model for longitudinal co-design online, and describe opportunities for further longitudinal online co-design research to generate new methods, techniques, and theories.</p><p>&nbsp;</p><p dir="ltr"><a href="https://dl.acm.org/doi/10.1145/3585088.3589365" rel="nofollow"><strong>Exploring Computational Thinking with Physical Play through Design</strong></a><br><em>Junnan Yu, </em><a href="/atlas/ronni-hayden" rel="nofollow"><em>Ronni Hayden</em></a><em> (PhD student), </em><a href="/cmci/people/information-science/ricarose-roque" rel="nofollow"><em>Ricarose Roque</em></a><em> (Assistant Professor, Information Science)</em></p><p dir="ltr">Physical play has often been leveraged to provide children with active and engaging learning experiences. However, coding activities are predominantly sedentary in front of the screen, and the application of physical play in Computer Science education is less explored, e.g., how can we engage in computational thinking (CT) through physical play? In this design-based exploration, we conducted three design activities where young children, college students, and researchers were invited to create physical play projects using the BBC micro:bit and reflect on their experiences. 
By examining participants’ projects and creating processes, we provide empirical evidence that remixing physical play activities with coding can engage learners in various CT concepts and practices, reveal how CT concepts and practices can be represented in physical play, and highlight implications for designing physical play-mediated computational learning experiences. Ultimately, we encourage more learning experiences to incorporate physical play into computing education for children.</p><p dir="ltr"><em>Ricarose Roque chairs the session “Computational and Data Literacy” in which this paper is included.</em></p><p dir="ltr">&nbsp;</p><p dir="ltr"><strong>[Pictorial] </strong><a href="https://dl.acm.org/doi/10.1145/3585088.3589235" rel="nofollow"><strong>Imagining Alternative Visions of Computing: Photo-Visuals of Material, Social, and Emotional Contexts from Family Creative Learning</strong></a><br><a href="/cmci/people/information-science/ricarose-roque" rel="nofollow"><em>Ricarose Roque</em></a><em> (Assistant Professor, Information Science)</em></p><p dir="ltr">This pictorial presents visuals of families engaging with creative technologies as “knowledge-building artifacts” to provoke reflection on the social, material, and emotional context of designed interactions (“things that make you think”) as well as provocations to re-value these contexts and promote alternative visions in what and how engagement with computing can look like (“things that matter”). The selected images are from a large and ongoing collection of documentation from a family technology program. 
The images were captured using the Reggio Emilia approach to documentation, which aims to “make learning visible.”</p><p dir="ltr"><em>Ricarose Roque is one of three Pictorial Chairs on the conference Organizing Committee.</em></p><p dir="ltr">&nbsp;</p><p dir="ltr"><strong>[Work-in-progress] </strong><a href="https://dl.acm.org/doi/10.1145/3585088.3593886" rel="nofollow"><strong>Cartoonimator: A Low-cost, Paper-based Animation Kit for Computational Thinking</strong></a><br><a href="/atlas/krithik-ranjan" rel="nofollow"><em>Krithik Ranjan</em></a><em> (ATLAS ACME Lab member, PhD student), </em><a href="/atlas/peter-gyory" rel="nofollow"><em>Peter Gyory</em></a><em> (ATLAS ACME Lab member, PhD Candidate), </em><a href="/atlas/michael-rivera" rel="nofollow"><em>Michael L. Rivera</em></a><em> (Utility Research Lab Director, Assistant Professor, Human-Computer Interaction and Digital Fabrication), and </em><a href="/atlas/ellen-yi-luen-do" rel="nofollow"><em>Ellen Yi-Luen Do</em></a><em> (ATLAS ACME Lab Director, Computer Science faculty member)</em></p><p dir="ltr">Computational thinking has been identified as an important skill for children to learn in the 21st century, and many innovative kits and tools have been developed to integrate it into children’s learning. Yet, most solutions require the use of devices like computers or other expensive hardware, making them inaccessible to low-income schools and communities. We present Cartoonimator, a low-cost, paper-based computational kit for children to create animations and engage with computational thinking. Cartoonimator requires only paper and a smartphone to use, offering an affordable learning experience. Children can draw the scenes and characters for their animation on the paper, which is printed with computer vision markers. We developed a mobile web app to provide an interface to capture keyframes and compile them into animations. 
In this paper, we describe the implementation and workflow of Cartoonimator, its deployment with children at a local STEAM event, and a planned evaluation for the kit.</p><p dir="ltr">&nbsp;</p><p dir="ltr"><strong>[Work-in-progress]&nbsp;</strong><a href="https://dl.acm.org/doi/10.1145/3585088.3593860" rel="nofollow"><strong>Empower Children in Nigeria to Design the Future of Artificial Intelligence (AI) through Writing</strong></a><br><em>Cornelius Onimisi Adejoro, Luise Arn, </em><a href="/atlas/larissa-schwartz" rel="nofollow"><em>Larissa Schwartz</em></a><em> (Master's student), </em><a href="/atlas/tom-yeh" rel="nofollow"><em>Tom Yeh</em></a><em> (Associate Professor, Computer Science)</em></p><p dir="ltr">This paper presents a new approach to engaging children in Nigeria to share their views of AI. This approach is centered on an inclusive writing contest for children in a secondary school in Abuja to write about AI to compete for prizes and share their writings with others. A preliminary analysis of the first 11 articles we received exhibits diverse gender and ethnic representation that conveys cultural values and perspectives distinct from those of the children in Western countries. 
This finding suggests future work to conduct an in-depth cross-cultural analysis of the articles and to replicate similar writing contests to engage children in other underrepresented countries.</p></div> </div> </div> </div> </div> <h2> <div class="paragraph paragraph--type--ucb-related-articles-block paragraph--view-mode--default"> <div>Off</div> </div> </h2> <div>Traditional</div> <div>0</div> <div>On</div> <div>White</div> Mon, 19 Jun 2023 18:29:52 +0000 Anonymous 4563 at /atlas ATLAS affiliates receive seed grants to study AI-augmented learning /atlas/2023/05/24/atlas-affiliates-receive-seed-grants-study-ai-augmented-learning <span>ATLAS affiliates receive seed grants to study AI-augmented learning</span> <span><span>Anonymous (not verified)</span></span> <span><time datetime="2023-05-24T11:12:39-06:00" title="Wednesday, May 24, 2023 - 11:12">Wed, 05/24/2023 - 11:12</time> </span> <div> <div class="imageMediaStyle focal_image_wide"> <img loading="lazy" src="/atlas/sites/default/files/styles/focal_image_wide/public/article-thumbnail/combineai.png.jpeg?h=ddf71b99&amp;itok=GhYWTWJ7" width="1200" height="800" alt="illustration of AI interactivity"> </div> </div> <div role="contentinfo" class="container ucb-article-categories" itemprop="about"> <span class="visually-hidden">Categories:</span> <div class="ucb-article-category-icon" aria-hidden="true"> <i class="fa-solid fa-folder-open"></i> </div> <a href="/atlas/taxonomy/term/703"> Feature </a> <a href="/atlas/taxonomy/term/144"> News </a> </div> <div role="contentinfo" class="container ucb-article-tags" itemprop="keywords"> <span class="visually-hidden">Tags:</span> <div class="ucb-article-tag-icon" aria-hidden="true"> <i class="fa-solid fa-tags"></i> </div> <a href="/atlas/taxonomy/term/396" hreflang="en">ACME</a> <a href="/atlas/taxonomy/term/390" hreflang="en">do</a> <a href="/atlas/taxonomy/term/168" hreflang="en">feature</a> <a href="/atlas/taxonomy/term/34" hreflang="en">news</a> <a 
href="/atlas/taxonomy/term/771" hreflang="en">phd</a> <a href="/atlas/taxonomy/term/773" hreflang="en">research</a> <a href="/atlas/taxonomy/term/1511" hreflang="en">rivera</a> <a href="/atlas/taxonomy/term/1510" hreflang="en">utility</a> </div> <div class="ucb-article-content ucb-striped-content"> <div class="container"> <div class="paragraph paragraph--type--article-content paragraph--view-mode--default 3"> <div class="ucb-article-row-subrow row"> <div class="ucb-article-text col-lg d-flex align-items-center" itemprop="articleBody"> <div><p>The Engineering Education and AI-Augmented Learning Interdisciplinary Research Theme awarded multiple seed grants this spring to help spur research teaming in the college and boost early projects with the high potential for societal impact, including to several ATLAS Institute affiliates.</p></div> </div> <div class="ucb-article-content-media ucb-article-content-media-right col-lg"> <div> <div class="paragraph paragraph--type--media paragraph--view-mode--default"> </div> </div> </div> </div> </div> </div> </div> <script> window.location.href = `/irt/engineering-education-ai/2023/05/19/new-seed-grants-engineering-education-and-ai-augmented-learning-research-theme-will`; </script> <h2> <div class="paragraph paragraph--type--ucb-related-articles-block paragraph--view-mode--default"> <div>Off</div> </div> </h2> <div>Traditional</div> <div>0</div> <div>On</div> <div>White</div> Wed, 24 May 2023 17:12:39 +0000 Anonymous 4558 at /atlas ACME Lab Champions Humble Materials for Innovative Human-Computer Interactions at CHI 2023 /atlas/2023/05/22/acme-lab-champions-humble-materials-innovative-human-computer-interactions-chi-2023 <span>ACME Lab Champions Humble Materials for Innovative Human-Computer Interactions at CHI 2023</span> <span><span>Anonymous (not verified)</span></span> <span><time datetime="2023-05-22T10:39:13-06:00" title="Monday, May 22, 2023 - 10:39">Mon, 05/22/2023 - 10:39</time> </span> <div> <div class="imageMediaStyle 
focal_image_wide"> <img loading="lazy" src="/atlas/sites/default/files/styles/focal_image_wide/public/article-thumbnail/screenshot_2023-05-22_at_9.40.53_am.png?h=f3cdff55&amp;itok=IAfXBGpX" width="1200" height="800" alt="Example of paper marker in action"> </div> </div> <div role="contentinfo" class="container ucb-article-categories" itemprop="about"> <span class="visually-hidden">Categories:</span> <div class="ucb-article-category-icon" aria-hidden="true"> <i class="fa-solid fa-folder-open"></i> </div> <a href="/atlas/taxonomy/term/144"> News </a> </div> <div role="contentinfo" class="container ucb-article-tags" itemprop="keywords"> <span class="visually-hidden">Tags:</span> <div class="ucb-article-tag-icon" aria-hidden="true"> <i class="fa-solid fa-tags"></i> </div> <a href="/atlas/taxonomy/term/396" hreflang="en">ACME</a> <a href="/atlas/taxonomy/term/390" hreflang="en">do</a> <a href="/atlas/taxonomy/term/168" hreflang="en">feature</a> <a href="/atlas/taxonomy/term/34" hreflang="en">news</a> <a href="/atlas/taxonomy/term/771" hreflang="en">phd</a> <a href="/atlas/taxonomy/term/773" hreflang="en">research</a> </div> <div class="ucb-article-content ucb-striped-content"> <div class="container"> <div class="paragraph paragraph--type--article-content paragraph--view-mode--default 3"> <div class="ucb-article-text" itemprop="articleBody"> <div><p>Over the years, the human-computer interaction field has seen many trends. For a time, gesture and pen-based interactions were key; then, with the rising ubiquity of smartphones, came a focus on haptic technologies.
Now, according to ATLAS <a href="/atlas/acme-lab" rel="nofollow">ACME Lab</a> director Ellen Do, “material exploration” was the theme of the day at <a href="https://chi2023.acm.org/" rel="nofollow">CHI 2023</a>, the premier conference on human-computer interaction.&nbsp;</p><p>While many researchers focus on advanced electronics and innovative fibers, Do champions everyday components as the basis for radical creativity and invention.&nbsp;</p><p>Do notes that her ACME Lab students consistently look to humble, easy-to-source materials like paper, cardboard and even couch cushion foam for inspiration. These ubiquitous materials are often overlooked, yet they offer many possibilities for developing ingenious human-computer interaction tools that can be printed, cut and assembled with ease.&nbsp;</p> <div class="align-center image_style-medium_750px_50_display_size_"> <div class="imageMediaStyle medium_750px_50_display_size_"> <img loading="lazy" src="/atlas/sites/default/files/styles/medium_750px_50_display_size_/public/article-image/acme_lab_team.jpeg?itok=g3JA6048" width="750" height="563" alt="Group photo from Beyond Prototyping workshop at CHI 2023"> </div> </div> <p>Access to off-the-shelf tools lets ACME Lab researchers speed the prototyping process, giving them the freedom to iterate quickly and push their thinking. Paired with the advanced cameras and sensors found in everyday smartphones and off-the-shelf electronics like microprocessors, the team can create virtually limitless forms of human-computer interaction.</p><p>The team discovered that paper markers offer a viable alternative to physical circuits, demanding fewer tools and less expertise and giving designers an easier path into physical computing. They have built a programming library called Beholder, which lets users develop their own software using computer vision (CV) markers.
They presented this research at CHI.</p><p><a href="https://programs.sigchi.org/chi/2023/program/content/96604" rel="nofollow"><strong>Marking Material Interactions with Computer Vision</strong></a><br><a href="/atlas/peter-gyory" rel="nofollow">Peter Gyory</a> (ATLAS ACME Lab member, PhD candidate), <a href="/atlas/sandra-bae" rel="nofollow">S. Sandra Bae</a> (ATLAS ACME Lab member, PhD student), <a href="/atlas/ruhan-yang" rel="nofollow">Ruhan Yang</a> (ATLAS ACME Lab member, PhD student), <a href="/atlas/ellen-yi-luen-do" rel="nofollow">Ellen Yi-Luen Do</a> (ATLAS ACME Lab Director, Computer Science faculty member), <a href="/atlas/clement-zheng" rel="nofollow">Clement Zheng</a> (PhD alumnus, ATLAS ACME Lab)</p><p>[video:https://youtu.be/xYf1VJoqpBQ]</p><p>Do notes that ACME Lab has focused much of its energy on developing platforms upon which others can build out their own ideas. Put simply, she says, “We celebrate building things to help people build things,” including useful computational tools, process documents and 3D-printed devices. It is not just about prototyping; ACME Lab aims toward “iso-typing”—functional tools that empower creatives, artists and scientists to make new and better things more easily.&nbsp;</p><p>With the ACME Lab’s simple toolkit, designers no longer need to study advanced computer vision to use the technology. Instead of spending time just getting things to work, they can focus on rapid prototyping with alternative controllers and interfaces, building their own gadgets much more quickly. Do and her team presented this concept to fellow researchers and global business leaders at the Beyond Prototyping Boards workshop at CHI.</p><p>By putting technology integrations together first and then exploring the design space, designers discover more opportunities for innovation.
As Do notes, “ATLAS is doing such a diverse research, which is pretty unique,” creating more opportunities to cross-pollinate ideas and support one another’s work across the institute.&nbsp;</p><h3><strong>Related research presented at CHI 2023</strong></h3><p><strong>Fabricating Paper Circuits with Subtractive Processing</strong><br><a href="/atlas/ruhan-yang" rel="nofollow">Ruhan Yang</a> (ATLAS ACME Lab member, PhD student), <a href="/atlas/krithik-ranjan" rel="nofollow">Krithik Ranjan</a> (ATLAS ACME Lab member, PhD student), <a href="/atlas/ellen-yi-luen-do" rel="nofollow">Ellen Yi-Luen Do</a> (ATLAS ACME Lab Director, Computer Science faculty member)</p><p>[video:https://youtu.be/v9W9n_Lstns]</p><p><strong>Facilitating Physical Computing with Computer Vision Markers</strong><br><a href="/atlas/clement-zheng" rel="nofollow">Clement Zheng</a> (PhD alumnus, ATLAS ACME Lab member), <a href="/atlas/peter-gyory" rel="nofollow">Peter Gyory</a> (ATLAS ACME Lab member, PhD candidate), <a href="/atlas/ellen-yi-luen-do" rel="nofollow">Ellen Yi-Luen Do</a> (ATLAS ACME Lab Director and Computer Science faculty member)</p><p>[video:https://youtu.be/c_pXZIerZY0]</p></div> </div> </div> </div> </div> Mon, 22 May 2023 16:39:13 +0000 Anonymous 4556 at /atlas