Projects

MobiPrint: A Mobile 3D Printer for Environment-Scale Design and Fabrication

Daniel Campos Zamora, Liang He, and Jon Froehlich

We present MobiPrint, a prototype mobile fabrication system that combines elements from robotics, architecture, and Human-Computer Interaction (HCI) to enable environment-scale design and fabrication in ad-hoc indoor environments. MobiPrint provides a multi-stage fabrication pipeline: first, the robotic 3D printer automatically scans and maps an indoor space; second, a custom design tool converts the map into an interactive CAD canvas for editing and placing models in the physical world; finally, the MobiPrint robot prints the object directly on the ground at the defined location. Through a “proof-by-demonstration” validation, we highlight our system’s potential across different applications, including accessibility, home furnishing, floor signage, and art.
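
To make the pipeline concrete, here is a minimal sketch of the map-to-world conversion such a system needs: turning a pixel picked on the scanned-map canvas into a ground location the robot can print at. The resolution, origin, and function names are illustrative assumptions, not MobiPrint's actual code.

    # Hypothetical occupancy-grid-style map metadata (assumed values):
    # resolution in meters/pixel and the world coordinate of pixel (0, 0).
    MAP_RESOLUTION = 0.05          # 5 cm per pixel
    MAP_ORIGIN = (-2.0, -3.5)      # world (x, y) of pixel (0, 0), in meters

    def canvas_to_world(px, py):
        """Convert a canvas pixel picked in the design tool to a world
        coordinate the mobile printer can drive to and print at."""
        wx = MAP_ORIGIN[0] + px * MAP_RESOLUTION
        wy = MAP_ORIGIN[1] + py * MAP_RESOLUTION
        return wx, wy

    # Example: the user drops a model at canvas pixel (240, 180).
    print(canvas_to_world(240, 180))  # -> (10.0, 5.5)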

3D Printing Magnetophoretic Displays

Zeyu Yan, Hsuanling Lee, Liang He, and Huaishu Peng

We present a pipeline for printing interactive and always-on magnetophoretic displays using affordable FDM 3D printers. Using our pipeline, an end-user can convert the surface of a 3D shape into a matrix of voxels. The generated model can be sent to an FDM 3D printer equipped with an additional syringe-based injector. During the printing process, an oil and iron powder-based liquid mixture is injected into each voxel cell, allowing the appearance of the once-printed object to be editable with external magnetic sources. To achieve this, we made modifications to the 3D printer hardware and the firmware. We also developed a 3D editor to prepare printable models. We demonstrate our pipeline with a variety of examples, including a printed Stanford bunny with customizable appearances, a small espresso mug that can be used as a post-it note surface, a board game figurine with a computationally updated display, and a collection of flexible wearable accessories with editable visuals.
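
The core data-preparation step, converting a 3D surface into a matrix of voxel cells, can be sketched as a simple binning of surface samples. The cell size and helpers below are our own assumptions, not the paper's actual pipeline.

    import numpy as np

    VOXEL_SIZE = 2.0  # mm per voxel cell (assumed)

    def surface_to_voxels(points, voxel_size=VOXEL_SIZE):
        """Bin (N, 3) surface sample points (in mm) into the set of
        voxel cells that would each receive one liquid injection."""
        cells = np.floor(points / voxel_size).astype(int)
        return {tuple(c) for c in cells}  # unique cells only

    # Example: sample a 20 mm sphere's surface and count its cells.
    theta = np.random.uniform(0, np.pi, 5000)
    phi = np.random.uniform(0, 2 * np.pi, 5000)
    pts = 20.0 * np.column_stack([np.sin(theta) * np.cos(phi),
                                  np.sin(theta) * np.sin(phi),
                                  np.cos(theta)])
    print(len(surface_to_voxels(pts)), "voxel cells to fill")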

Kinergy: Creating 3D Printable Motion using Embedded Kinetic Energy

Liang He, Xia Su, Huaishu Peng, Jeffrey I. Lipton, and Jon E. Froehlich

We present Kinergy—an interactive design tool for creating self-propelled motion by harnessing the energy stored in 3D printable springs. To produce controllable output motions, we introduce 3D printable kinetic units, a set of parameterizable designs that encapsulate 3D printable springs, compliant locks, and transmission mechanisms for three non-periodic motions—instant translation, instant rotation, and continuous translation—and four periodic motions—continuous rotation, reciprocation, oscillation, and intermittent rotation. Kinergy allows the user to create motion-enabled 3D models by embedding kinetic units, customize output motion characteristics by parameterizing embedded springs and kinematic elements, control energy by operating the specialized lock, and preview the resulting motion in an interactive environment. We demonstrate the potential of our techniques via example applications from spring-loaded cars to kinetic sculptures and close with a discussion of key challenges such as geometric constraints.
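
As a rough illustration of the energy budget behind such kinetic units, the textbook relation for a linear torsion spring, E = 1/2 k θ², gives the energy a wound printed spring can release. The stiffness value below is an assumption for illustration, not a measured Kinergy spring.

    import math

    def stored_energy(k, theta):
        """Elastic energy in a linear torsion spring: E = 0.5 * k * theta^2.
        k in N*m/rad, theta in radians."""
        return 0.5 * k * theta ** 2

    # Example: a 0.02 N*m/rad spring wound three full turns.
    print(stored_energy(0.02, 3 * 2 * math.pi), "J")  # ~3.55 J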

FlexHaptics: A Design Method for Passive Haptic Inputs Using Planar Compliant Structures

Hongnan Lin, Liang He, Fangli Song, Yifan Li, Tingyu Cheng, Clement Zheng, Wei Wang, and Hyunjoo Oh

This paper presents FlexHaptics, a design method for creating custom haptic input interfaces. Our approach leverages planar compliant structures whose force-deformation relationship can be altered by adjusting the geometries. Embedded with such structures, a FlexHaptics module exerts a fine-tunable haptic effect (i.e., resistance, detent, or bounce) along a movement path (i.e., linear, rotary, or ortho-planar). These modules can work separately or combine into an interface with complex movement paths and haptic effects. To enable the parametric design of FlexHaptic modules, we provide a design editor that converts user-specified haptic properties into underlying mechanical structures of haptic modules. We validate our approach and demonstrate the potential of FlexHaptic modules through six application examples, including a slider control for a painting application and a piano keyboard interface on touchscreens, a tactile low vision timer, VR game controllers, and a compound input device of a joystick and a two-step button.
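
To give a feel for what a fine-tunable haptic effect means in practice, here is a toy force-displacement profile for a detent along a linear path: a periodic restoring term (the detent bumps) plus a constant sliding resistance. The shape and parameters are illustrative, not FlexHaptics' actual structural model.

    import numpy as np

    def detent_force(x, depth=1.0, pitch=10.0, resistance=0.2):
        """Toy force profile (N) along a linear path: periodic detent
        bumps every `pitch` mm plus a constant sliding resistance."""
        return depth * np.sin(2 * np.pi * x / pitch) + resistance

    # Sample one 10 mm travel to see the detent bump.
    xs = np.linspace(0, 10, 11)
    print(np.round(detent_force(xs), 2))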

ModElec: A Design Tool for Prototyping Physical Computing Devices Using Conductive 3D Printing

Liang He, Jarrid A. Wittkopf, Ji Won Jun, Kris Erickson, and Rafael 'Tico' Ballagas

Integrating electronics with highly custom 3D designs for the physical fabrication of interactive prototypes is traditionally cumbersome and requires numerous iterations of manual assembly and debugging. With the new capabilities of 3D printers, combining electronic design and 3D modeling workflows can lower the barrier to achieving interactive functionality and iterating on the overall design. We present ModElec—an interactive design tool that enables the coordinated expression of electronic and physical design intent by allowing designers to integrate 3D-printable circuits with 3D forms. With ModElec, the user can arrange electronic parts in a 3D body, modify the model design while the embedded circuits update accordingly, and preview the auto-generated 3D traces, which can be printed directly on a multi-material 3D printer.
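
The trace-generation step can be illustrated with a toy router: a breadth-first search through free voxels between two pin sites. ModElec's actual routing algorithm is not shown here; this sketch only demonstrates the kind of 3D pathfinding involved.

    from collections import deque

    def route_trace(blocked, start, goal, size=20):
        """Toy BFS router: find a path of free voxels for a conductive
        trace between two pin sites inside a size^3 grid."""
        steps = [(1, 0, 0), (-1, 0, 0), (0, 1, 0),
                 (0, -1, 0), (0, 0, 1), (0, 0, -1)]
        prev, frontier = {start: None}, deque([start])
        while frontier:
            cur = frontier.popleft()
            if cur == goal:  # walk the chain back to the start
                path = []
                while cur is not None:
                    path.append(cur)
                    cur = prev[cur]
                return path[::-1]
            for dx, dy, dz in steps:
                nxt = (cur[0] + dx, cur[1] + dy, cur[2] + dz)
                if (nxt not in prev and nxt not in blocked
                        and all(0 <= v < size for v in nxt)):
                    prev[nxt] = cur
                    frontier.append(nxt)
        return None  # no free route exists

    print(route_trace(set(), (0, 0, 0), (3, 2, 1)))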

HulaMove: Using Commodity IMU for Waist Interaction

Xuhai Xu, Jiahao Li, Tianyi Yuan, Liang He, Xin Liu, Yukang Yan, Yuntao Wang, Yuanchun Shi, Jennifer Mankoff, and Anind K. Dey

We present HulaMove, a novel interaction technique that leverages the movement of the waist as a new eyes-free and hands-free input method for both the physical world and the virtual world. We first conducted a user study (N=12) to understand users’ ability to control their waist. We found that users could easily discriminate eight shifting directions and two rotating orientations, and quickly confirm actions by returning to the original position (quick return). We developed a design space with eight gestures for waist interaction based on the results and implemented an IMU-based real-time system. Using a hierarchical machine learning model, our system could recognize waist gestures at an accuracy of 97.5%. Finally, we conducted a second user study (N=12) for usability testing in both real-world scenarios and virtual reality settings.
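
A minimal sketch of a hierarchical recognizer in this spirit appears below: one classifier first separates shifts from rotations, and a second resolves the eight shifting directions. The features, labels, and models are random stand-ins, not HulaMove's actual pipeline.

    import numpy as np
    from sklearn.ensemble import RandomForestClassifier

    # Random stand-ins for windowed IMU features and labels.
    X = np.random.randn(200, 12)              # 200 windows, 12 features
    coarse_y = np.random.randint(0, 2, 200)   # 0 = shift, 1 = rotate
    fine_y = np.random.randint(0, 8, 200)     # 8 shifting directions

    # Stage 1: shift vs. rotation; stage 2: which shifting direction.
    coarse = RandomForestClassifier().fit(X, coarse_y)
    shift_rows = coarse_y == 0
    fine = RandomForestClassifier().fit(X[shift_rows], fine_y[shift_rows])

    def classify(window):
        if coarse.predict(window)[0] == 0:
            return "shift", int(fine.predict(window)[0])
        return "rotate", None

    print(classify(X[:1]))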

Ondulé: Designing and Controlling 3D Printable Springs

Liang He, Huaishu Peng, Michelle Lin, Ravikanth Konjeti, François Guimbretière, and Jon E. Froehlich

We present Ondulé—an interactive design tool that allows novices to create parameterizable deformation behaviors in 3D-printable models using helical springs and embedded joints. Informed by spring theory and our empirical mechanical experiments, we introduce spring and joint-based design techniques that support a range of parameterizable deformation behaviors, including compress, extend, twist, bend, and various combinations. To enable users to design and add these deformations to their models, we introduce a custom design tool for Rhino. With the tool, users can convert selected geometries into springs, customize spring stiffness, and parameterize their design with mechanical constraints for desired behaviors.
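
For intuition about the spring parameterization, the standard helical-spring formula k = G d^4 / (8 D^3 n) relates stiffness to wire diameter, coil diameter, and coil count. The material and dimensions below are illustrative guesses for a printed plastic spring, not values from the paper.

    def spring_stiffness(G, d, D, n):
        """Helical compression spring stiffness: k = G*d^4 / (8*D^3*n).
        G: shear modulus (Pa), d: wire diameter (m),
        D: mean coil diameter (m), n: number of active coils."""
        return G * d ** 4 / (8 * D ** 3 * n)

    # Rough values for a printed plastic spring (assumed):
    # G ~ 0.8 GPa, 2 mm wire, 15 mm coil diameter, 8 active coils.
    print(spring_stiffness(0.8e9, 0.002, 0.015, 8), "N/m")  # ~59 N/m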

SqueezaPulse: Adding Interactive Input to Fabricated Objects

Liang He, Gierad Laput, Eric Brockmeyer, and Jon E. Froehlich

We present SqueezaPulse, a technique for embedding interactivity into fabricated objects using soft, passive, low-cost bellow-like structures. When a soft cavity is squeezed, air pulses travel along a flexible pipe and into a uniquely designed corrugated tube that shapes the airflow into predictable sound signatures. A microphone captures and identifies these air pulses, enabling interactivity. Informed by the underlying acoustic theory, we describe an informal examination of the effect of different 3D-printed corrugations on air signatures and our resulting SqueezaPulse implementation. To demonstrate and evaluate the potential of SqueezaPulse, we present four prototype applications and a small, lab-based user study (N=9). Our evaluations show that our approach is accurate across users and robust to external noise.
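
The identification step can be sketched as spectral template matching: normalize the magnitude spectrum of an incoming pulse and pick the most similar pre-recorded signature. This is a toy stand-in, not the SqueezaPulse classifier.

    import numpy as np

    def spectral_signature(audio, n_fft=2048):
        """Unit-norm magnitude spectrum of an air-pulse recording."""
        spec = np.abs(np.fft.rfft(audio[:n_fft], n=n_fft))
        return spec / (np.linalg.norm(spec) + 1e-9)

    def identify(pulse, templates):
        """Label whose template has the highest cosine similarity."""
        sig = spectral_signature(pulse)
        return max(templates, key=lambda k: float(sig @ templates[k]))

    # Random stand-ins for per-cavity signature templates.
    templates = {"cavity_a": spectral_signature(np.random.randn(2048)),
                 "cavity_b": spectral_signature(np.random.randn(2048))}
    print(identify(np.random.randn(2048), templates))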

MakerWear: A Tangible Approach to Interactive Wearable Creation

Majeed Kazemitabaar, Jason McPeak, Alexander Jiao, Liang He, Thomas Outing, and Jon E. Froehlich

Wearable construction toolkits have shown promise in broadening participation in computing and empowering users to create personally meaningful computational designs. However, these kits present a high barrier of entry for some users, particularly young children (K-6). In this paper, we introduce MakerWear, a new wearable construction kit for children that uses a tangible, modular approach to wearable creation. We describe our participatory design process, the iterative development of MakerWear, and results from single- and multi-session workshops with 32 children (ages 5-12; M=8.3 years). Our findings reveal how children engage in wearable design, what they make (and want to make), and what challenges they face. As a secondary analysis, we also explore age-related differences.    

Best Paper Award at CHI'17 | Best LBW Paper Award at CHI'16

New Interaction Tools for Preserving an Old Language

Beryl Plimmer, Liang He, Tariq Zaman, Kasun Karunanayaka, Alvin W. Yeo, Garen Jengan, Rachel Blagojevic, and Ellen Yi-Luen Do

The Penan people of Malaysian Borneo were traditionally nomads of the rainforest. They would leave messages in the jungle for each other by shaping natural objects into language tokens and arranging these symbols in specific ways -- much like words in a sentence. With settlement, the language is being lost as it is not being used by the younger generation. We report here a tangible system designed to help the Penan preserve their unique object writing language. The key features of the system are that its tangibles are made of real objects, it works in the wild, and new tangibles can be fabricated and added to the system by the users. Our evaluations show that the system is engaging and encourages intergenerational knowledge transfer, and thus has the potential to help preserve this language.

Honorable Mention Award at CHI'15

CozyMaps: Real-time Collaboration With Multiple Displays

Kelvin Cheng, Liang He, Xiaojun Meng, David A. Shamma, Dung Nguyen, and Anbarasan T.

CozyMaps is a multi-display system that combines several tablet devices with a shared large display to support real-time collocated collaboration on a shared map. This paper builds on existing work and introduces rich user interactions, proposing awareness, notification, and view-sharing techniques to enable seamless information sharing and integration in map-based applications. Our exploratory study showed that participants were satisfied with these newly proposed interactions. We found that view-sharing techniques should be location-focused rather than user-focused. Our results provide implications for the design of interaction techniques in collaborative multi-display map systems.

Exploratory Projects

Towards Rapid Fabrication of Custom Tactile Surface Indicators for Indoor Navigation

Daniel Campos Zamora, Liang He, and Jon E. Froehlich

  • ASSETS 2024 Poster

Tactile surface indicators (TSIs) provide ground-based tactile cues that help pedestrians who are blind or have low vision navigate different environments safely and independently. In this exploratory work, we examine how digital fabrication technologies such as 3D printing, CNC milling, vacuum forming, and heat transfer melting can enable the production of custom TSIs. To compare different fabrication approaches, we designed and evaluated a series of prototypes with varied surface materials and design features (e.g., bump height).

Fluxable: A Tool for Making 3D Printable Sensors and Actuators

Hsuanling Lee, Yujie Shan, Huachao Mao, and Liang He

We present Fluxable, a tool for making custom sensors and actuators 3D printable with consumer-grade Stereolithography (SLA) 3D printers. With this tool, the user converts an arbitrary 3D model into a deformable body with integrated helix-and-lattice structures, which comprise a hollow helical channel in the center, lattice padding, and a wireframe structure on the surface. The tool allows for the parameterization of the helix for sensing performance and customization of the lattice for actuation. By inserting a conductive shape-memory alloy (SMA) into a printed object through the helical channel, the converted shape becomes a sensor that detects various shape-changing behaviors using inductive sensing, or an actuator that triggers movements through temperature control. We demonstrate our tool with a series of example sensors and actuators, including an interactive timer, a DJ station, and a caterpillar robot.
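
The sensing principle can be approximated with the ideal-solenoid formula L = mu0 * N^2 * A / l: stretching the printed helix changes its length and hence the inductance being read. A real printed helix deviates from this model; the numbers below are assumptions for illustration.

    import math

    MU0 = 4 * math.pi * 1e-7  # vacuum permeability (H/m)

    def coil_inductance(n_turns, radius, length):
        """Ideal solenoid approximation: L = mu0 * N^2 * A / l."""
        return MU0 * n_turns ** 2 * math.pi * radius ** 2 / length

    rest = coil_inductance(40, 0.004, 0.05)       # 5 cm coil at rest
    stretched = coil_inductance(40, 0.004, 0.06)  # stretched to 6 cm
    print(rest, "->", stretched)  # inductance drops as the coil extends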

A Multi-modal Toolkit to Support DIY Assistive Technology Creation for Blind and Low Vision People

Liwen He, Yifan Li, Mingming Fan, Liang He, and Yuhang Zhao

We design and build A11yBits, a tangible toolkit that empowers blind and low vision (BLV) people to easily create personalized do-it-yourself assistive technologies (DIY-ATs). A11yBits includes (1) a series of Sensing modules to detect both environmental information and user commands, (2) a set of Feedback modules to send multi-modal feedback, and (3) two Base modules (Sensing Base and Feedback Base) to power and connect the sensing and feedback modules. The toolkit enables accessible and easy assembly via a "plug-and-play" mechanism. BLV users can select and assemble their preferred modules to create personalized DIY-ATs.

Understanding the Experiences, Challenges, and Needs of Dementia Caregivers in the Indian Subcontinent

Srishti Shekhar Agrawal, Shrey Panchal, and Liang He

  • ASSETS 2023 Poster

In the context of dementia caregiving in India, caregivers have long been unsung heroes. However, there remains a dearth of comprehensive data on the emotional and physical challenges caregivers face in the Indian subcontinent. We present a qualitative study that delves into the realities of being a dementia caregiver in India, investigating caregivers' experiences and examining the impact of several challenges on their daily lives. In this study, we conducted interviews with four primary caregivers of persons with dementia to explore the impact of caregiving on their social lives and their mental and emotional well-being. The findings highlight the importance of having an emotional outlet, a support system, and accessible resources for enhancing caregivers' quality of life. Drawing from these insights, we propose a set of design implications that can guide future endeavors focused on enhancing the overall well-being of dementia caregivers in India. This research addresses a significant gap in the understanding and support of dementia caregivers, providing valuable recommendations for supporting this crucial group.

sPrintr: Towards In-Situ Personal Fabrication using a Mobile 3D Printer

Daniel Campos Zamora, Liang He, Yueqian Zhang, Xuhai Xu, Jennifer Mankoff, and Jon E. Froehlich

We present our early work on sPrintr, a pipeline consisting of a mobile 3D printer and a graphical interface that enables in-situ fabrication with consumer-grade hardware and fabrication tools. We prototyped two initial components of that pipeline: (i) a mobile 3D printer and (ii) a user interface that helps users arrange, preview, and plan prints in their environment using a floor plan layout. We identify challenges in the automation of mobile printing systems, on-the-go printing, and human-machine interfaces for in-situ design and fabrication.

PneuFetch: Supporting BVI People to Fetch Nearby Objects

Liang He, Ruolin Wang, and Xuhai Xu

Blind and visually impaired (BVI) people can fetch objects in a familiar environment by touching objects or relying on their memory. However, in a complex and less familiar situation, these strategies become less useful or even dangerous (e.g., touching hazardous obstacles). We present PneuFetch, a light haptic cue-based wearable device that supports BVI people in fetching nearby objects in an unfamiliar environment. In our design, we generate friendly, non-intrusive, and gentle presses and drags to deliver direction and distance cues on a BVI user's wrist and forearm. As a proof of concept, we discuss our PneuFetch wearable prototype, contrast it with past work, and describe a preliminary user study.

A Multi-Modal Approach for BVI Developers to Edit Webpages

Venkatesh Potluri, Liang He, Christine Chen, Jon E. Froehlich, and Jennifer Mankoff

Blind and visually impaired (BVI) individuals are increasingly creating visual content online; however, there is a lack of tools that allow these individuals to modify the visual attributes of the content and verify the validity of those modifications. We discuss the design and preliminary exploration of a multi-modal and accessible approach for BVI developers to edit visual layouts of webpages while maintaining visual aesthetics. The system includes three parts: an accessible canvas, a code editor, and a controller that checks if the updates violate design guidelines.
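
One concrete guideline check such a controller could run is the WCAG contrast-ratio test. The sketch below implements the standard WCAG 2.x formula; whether this specific check appears in the system is our assumption.

    def relative_luminance(rgb):
        """WCAG 2.x relative luminance of an sRGB color (0-255 channels)."""
        def channel(c):
            c /= 255.0
            return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4
        r, g, b = (channel(c) for c in rgb)
        return 0.2126 * r + 0.7152 * g + 0.0722 * b

    def contrast_ok(fg, bg, threshold=4.5):
        """True if the pair meets WCAG AA contrast for normal text."""
        hi, lo = sorted((relative_luminance(fg), relative_luminance(bg)),
                        reverse=True)
        return (hi + 0.05) / (lo + 0.05) >= threshold

    print(contrast_ok((0, 0, 0), (255, 255, 255)))  # black on white -> True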

TacTILE: A Toolchain for Creating Accessible Graphics with 3D-Printed Overlays and Auditory Annotations

Liang He, Zijian Wan, Leah Findlater, and Jon E. Froehlich

Tactile overlays with audio annotations can increase the accessibility of touchscreens for blind users; however, preparing these overlays is complex and labor-intensive. We introduce TacTILE, a novel toolchain to more easily create tactile overlays with audio annotations for arbitrary touchscreen graphics (e.g., graphs, pictures, maps). The workflow includes: (i) an annotation tool to add audio to graphical elements, (ii) a fabrication process that generates 3D-printed tactile overlays, and (iii) a custom app for the user to explore graphics with these overlays. We close with a pilot study with one blind participant who explored three examples (floor plan, photo, and chart), and a discussion of future work.
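
In the exploration app, the core interaction reduces to hit-testing a touch against annotated regions and playing the matching clip. The region layout and file names below are hypothetical.

    # Hypothetical annotation store: rectangular regions (screen pixels)
    # mapped to audio descriptions.
    ANNOTATIONS = [
        {"rect": (0, 0, 200, 150), "audio": "kitchen.mp3"},
        {"rect": (200, 0, 400, 150), "audio": "living_room.mp3"},
    ]

    def annotation_at(x, y):
        """Return the audio file under the user's touch, or None."""
        for a in ANNOTATIONS:
            x0, y0, x1, y1 = a["rect"]
            if x0 <= x < x1 and y0 <= y < y1:
                return a["audio"]
        return None

    print(annotation_at(250, 80))  # -> "living_room.mp3"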

PneuHaptic: Delivering Haptic Cues with a Pneumatic Armband

Liang He, Cheng Xu, Ding Xu, and Ryan Brill

A common approach to creating haptic cues is moving the contact surface with electromechanical actuators such as vibrating electric motors, piezoelectric motors, or voice coils. While these actuators can be configured to convey rich information effectively, their high-frequency movements can elicit negative responses after lengthy exposure. PneuHaptic is a pneumatically actuated, arm-worn haptic interface. The system triggers a range of tactile sensations on the arm by alternately pressurizing and depressurizing a series of custom-molded silicone chambers. We detail the implementation of our functional prototype and explore the possibilities for interaction enabled by the system.
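
A minimal control-loop sketch shows how such chambers might be sequenced to create a stroking sensation along the arm; set_valve is a hypothetical driver hook, not PneuHaptic's actual firmware.

    import time

    def set_valve(chamber, pressurize):
        """Hypothetical driver hook; a real build would toggle a solenoid
        valve (e.g., via microcontroller GPIO) for the given chamber."""
        print(("inflate" if pressurize else "vent"), "chamber", chamber)

    def sweep(chambers=4, dwell=0.3):
        """Pressurize each silicone chamber in turn, venting it before
        moving on, to sweep a sensation along the arm."""
        for c in range(chambers):
            set_valve(c, True)
            time.sleep(dwell)
            set_valve(c, False)

    sweep()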