- 83
- 2 704 109
Future Interfaces Group
United States
Joined Apr 26, 2014
The Future Interfaces Group (FIG) is an interdisciplinary research laboratory within the Human-Computer Interaction Institute at Carnegie Mellon University. We create new sensing and interface technologies that aim to make interactions between humans and computers more fluid, intuitive, and powerful. These efforts often lie in emerging use modalities, such as wearable computing, touch interfaces and gestural interaction.
TeslaTouch: Electrovibration for Touch Surfaces (ACM UIST 2010)
Bau, O., Poupyrev, I., Israr, A., and Harrison, C. 2010. TeslaTouch: Electrovibration for Touch Surfaces. In Proceedings of the 23rd Annual ACM Symposium on User Interface Software and Technology (New York, NY, October 3-6, 2010). UIST '10. ACM, New York, NY, 283-292.
TeslaTouch infuses finger-driven interfaces with physical feedback. The technology is based on the electrovibration principle, which can programmatically vary the electrostatic friction between fingers and a touch panel. Importantly, there are no moving parts, unlike most tactile feedback technologies, which typically employ mechanical actuators. This allows for different fingers to feel different sensations. When combined with an interactive graphical display, TeslaTouch enables the design of a wide variety of interfaces that allow the user to feel virtual elements through touch. For example, when dragging a file, the level of friction could convey the file size. Objects could "snap" into place when designing a presentation. Or perhaps with a quick "rub" of your email application's icon, you could sense how many emails are unread. Finally, imagine a (flat) touch keyboard where the virtual keys can be felt.
Tactile feedback based on electrovibration has several compelling properties. It is fast, low-powered, dynamic, and can be used in a wide range of interaction scenarios and applications, including multitouch interfaces. Our system demonstrates an exceptionally broad bandwidth and uniformity of response across a wide range of frequencies and amplitudes. Furthermore, the technology is highly scalable and can be used efficiently on touch surfaces of any size, shape and configuration, including large interactive tables, hand-held mobile devices, as well as curved, flexible and irregular touch surfaces. Lastly, because our design does not have any moving parts, it can be easily added to existing devices with minimal physical modification.
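The "feel virtual elements through friction" examples above can be sketched in code. The following is a minimal, hypothetical illustration, not the paper's actual drive electronics: all function names and parameter values (frequency, peak voltage) are illustrative assumptions. The idea is that the drive signal is a periodic voltage whose amplitude sets the perceived friction, and UI state, such as a dragged file's size, is mapped onto that amplitude.

```python
import math

def electrovibration_drive(friction_level, freq_hz=120.0, v_max=115.0,
                           sample_rate=8000, duration_s=0.01):
    """Sketch of an electrovibration drive signal: a sinusoidal voltage
    whose amplitude scales with the desired friction level (0.0-1.0).
    Frequency and peak voltage are illustrative, not values from the paper."""
    amplitude = max(0.0, min(1.0, friction_level)) * v_max
    n = int(sample_rate * duration_s)
    return [amplitude * math.sin(2 * math.pi * freq_hz * i / sample_rate)
            for i in range(n)]

def friction_for_file_size(size_bytes, max_bytes=1 << 30):
    """Map a dragged file's size to a friction level, as in the
    'drag a file, feel its size' example (hypothetical mapping)."""
    return min(1.0, size_bytes / max_bytes)

# Dragging a 256 MB file produces a drive signal at a quarter of full friction.
signal = electrovibration_drive(friction_for_file_size(256 << 20))
```

Other UI mappings from the paragraph above (snap-to-grid, unread-email count) would follow the same pattern: compute a friction level from application state, then regenerate the drive waveform.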
Views: 3,820
Videos
DynaButtons: Fast Interactive Soft Buttons with Analog Control (IEEE HAPTICS 2024)
385K views · 1 month ago
More Research: Fluid Reality Gloves: ua-cam.com/video/UJXLBqG9E_s/v-deo.html Flat Panel Haptics: ua-cam.com/video/j_rErbhxNFM/v-deo.html Research Team: Tucker Rae-Grant, Chris Harrison, and Craig Shultz Carnegie Mellon University and University of Illinois Urbana-Champaign While mechanical buttons are ubiquitous, their haptic response is fixed, reducing interface flexibility and precluding an av...
Expressive, Scalable, Mid-Air Haptics with Synthetic Jets
7K views · 2 months ago
Non-contact, mid-air haptic devices have been utilized for a wide variety of experiences, including those in extended reality, public displays, medical, and automotive domains. In this work, we explore the use of synthetic jets as a promising and under-explored mid-air haptic feedback method. We show how synthetic jets can scale from compact, low-powered devices, all the way to large, long-rang...
SmartPoser (ACM UIST 2023 Talk)
765 views · 7 months ago
Demo Video: ua-cam.com/video/AHh2vYQVb_8/v-deo.html Abstract: The ability to track a user’s arm pose could be valuable in a wide range of applications, including fitness, rehabilitation, augmented reality input, life logging, and context-aware assistants. Unfortunately, this capability is not readily available to consumers. Systems either require cameras, which carry privacy issues, or utilize ...
Pantœnna (ACM UIST 2023 Talk)
617 views · 7 months ago
Demo video: ua-cam.com/video/ya_KWEJTKsU/v-deo.html Abstract: Methods for faithfully capturing a user's holistic pose have immediate uses in AR/VR, ranging from multimodal input to expressive avatars. Although body-tracking has received the most attention, the mouth is also of particular importance, given that it is the channel for both speech and facial expression. In this work, we describe a ...
Fluid Reality (ACM UIST 2023 Talk)
1K views · 7 months ago
Academic talk at ACM UIST 2023. More details: www.figlab.com/research/2023/FluidReality
Pantœnna: Mouth Pose Estimation for VR/AR Headsets Using Low-Profile Antenna
997 views · 7 months ago
Methods for faithfully capturing a user's holistic pose have immediate uses in AR/VR, ranging from multimodal input to expressive avatars. Although body-tracking has received the most attention, the mouth is also of particular importance, given that it is the channel for both speech and facial expression. In this work, we describe a new RF-based approach for capturing mouth pose using an antenn...
Fluid Reality: High-Resolution, Untethered Haptic Gloves Using Electroosmotic Pump Arrays
200K views · 7 months ago
Same tech under touchscreen: ua-cam.com/video/j_rErbhxNFM/v-deo.html (keyboard you can feel!) Virtual and augmented reality headsets are making significant progress in audio-visual immersion and consumer adoption. However, their haptic immersion remains low, due in part to the limitations of vibrotactile actuators which dominate the AR/VR market. In this work, we present a new approach to creat...
SmartPoser: Arm Pose Estimation with a Smartphone and Smartwatch Using UWB and IMU Data
996 views · 7 months ago
The ability to track a user’s arm pose could be valuable in a wide range of applications, including fitness, rehabilitation, augmented reality input, life logging, and context-aware assistants. Unfortunately, this capability is not readily available to consumers. Systems either require cameras, which carry privacy issues, or utilize multiple worn IMUs or markers. In this work, we describe how a...
WorldPoint: Finger Pointing as a Rapid and Natural Trigger for In-the-Wild Mobile Interactions
1.2K views · 7 months ago
Pointing with one's finger is a natural and rapid way to denote an area or object of interest. It is routinely used in human-human interaction to increase both the speed and accuracy of communication, but it is rarely utilized in human-computer interactions. In this work, we use the recent inclusion of wide-angle, rear-facing smartphone cameras, along with hardware-accelerated machine learning,...
IMUPoser: Full-Body Pose Estimation using IMUs in Phones, Watches, and Earbuds
6K views · 1 year ago
Tracking body pose on-the-go could have powerful uses in fitness, mobile gaming, context-aware virtual assistants, and rehabilitation. However, users are unlikely to buy and wear special suits or sensor arrays to achieve this end. Instead, in this work, we explore the feasibility of estimating body pose using IMUs already in devices that many users own - namely smartphones, smartwatches, and ea...
Flat Panel Haptics: Embedded Electroosmotic Pumps for Scalable Shape Displays
56K views · 1 year ago
We present a new, miniaturizable type of shape-changing display using embedded electroosmotic pumps (EEOPs). Our pumps, controlled and powered directly by applied voltage, are 1.5mm in thickness, and allow complete stackups under 5mm. Nonetheless, they can move their entire volume's worth of fluid in 1 second, and generate pressures of ±50kPa, enough to create dynamic, millimeter-scale tactile...
Surface I/O: Creating Devices with Functional Surface Geometry for Haptics and User Input
1.6K views · 1 year ago
Surface I/O is a novel interface approach that functionalizes the exterior surface of devices to provide haptic and touch sensing without dedicated mechanical components. Achieving this requires a unique combination of surface features spanning the macro-scale (5cm~1mm), meso-scale (1mm~200um), and micro-scale (less than 200um). This approach simplifies interface creation, allowing designers to...
SweepSense: Ad Hoc Configuration Sensing Using Reflected Swept-Frequency Ultrasonics
5K views · 1 year ago
More info: www.gierad.com/projects/sweepsense
DynaTags: Low-Cost Fiducial Marker Mechanisms
3.3K views · 1 year ago
Published at ICMI 2022. www.figlab.com/research/2022/dynatags Printed fiducial markers are inexpensive, easy to deploy, robust and deservedly popular. However, their data payload is also static, unable to express any state beyond being present. For this reason, more complex electronic tagging technologies exist, which can sense and change state, but either require special equipment to read or a...
Pull Gestures with Coordinated Graphics on Dual Touchscreen Devices
1.3K views · 1 year ago
EtherPose: Continuous Hand Pose Tracking with Wrist-Worn Antenna Impedance Characteristic Sensing
2K views · 1 year ago
DiscoBand: Multiview Depth-Sensing Smartwatch Strap for Hand, Body and Environment Tracking
1.8K views · 1 year ago
TriboTouch: Micro-Patterned Surfaces for Low Latency Touchscreens
6K views · 2 years ago
ControllerPose: Inside-Out Body Capture with VR Controller Cameras
9K views · 2 years ago
ElectriPop: Low-Cost, Shape-Changing Displays
4.5K views · 2 years ago
Mouth Haptics in VR using a Headset Ultrasound Phased Array
119K views · 2 years ago
LRAir: Non-Contact Haptics Using Synthetic Jets
4K views · 2 years ago
FarOut: Extending the Range of ad hoc Touch Sensing with Depth Cameras
2.4K views · 2 years ago
Retargeted Self-Haptics for Increased Immersion in VR without Hand Instrumentation
2K views · 2 years ago
3D Hand Pose Estimation on Conventional Capacitive Touchscreens
1.3K views · 2 years ago
EyeMU Interactions: Gaze + IMU Gestures on Mobile Devices
6K views · 2 years ago
Vibrosight++: City-Scale Sensing Using Existing Retroreflective Signs and Markers
3.8K views · 3 years ago
Super-Resolution Capacitive Touchscreens
1.6K views · 3 years ago
Classroom Digital Twins with Instrumentation-Free Gaze Tracking
1.3K views · 3 years ago
im nerdin out thinkin about these gloves
Can you use it to play Doom?
I wonder if this will also give me goosebumps when touching corduroy in VR, since doing so IRL puts a shiver up my spine.
Looks like one of Guldies's animations
1. useless when you take a second thought 2. not really touch, you just "sorta" feel the bump pattern of textures. and the textures have to have these rough surface bumps which means more spec 3. not even full range of touch, just pushes against your skin, no friction 4. no pain
What does this really add? You can already see what you're touching. Haptic gloves should grant you the new feel for the weight of an object, or the pressure of a force.
why does this voice sound so familiar
Wait is this audit the audits voice
I WAS WONDERING IF THIS WAS POSSIBLE. I WILL BE ORDERING FROM YOU FOR NEAR FUTURE DIY PROJECTS
more innovation to the adult toys industry
Pimply skin
BOIOIOING
Is this audit the audit??
No.
This is a tech demo. All of this stuff has to be programmed in game. That will be the disconnect. Very few if any games are going to bother to program that kind of detail
The next step is to have this for the whole hand and the next after that is a full body suit 👌
Yeah my only thoughts where: Can you make it pop itself? How easy can it pop? Can you tear it off? Will it function with a tear? OMG is it going to pop? What safety features does it have against tearing and popping? I want to pop it.... Yeah I'd definitely pop it... Am I an A-hole? Also glad someone thought about braille use and tactile feedback for VR.
Booba
Yay, Biopunk is real
get someone that can read braille to test if its really good.
clitoratron?
go outside
Pimply skin?
That second glove is literally the prototype jaeger glove from Pacific Rims intro
Is this the same voice as from blowing fact, the guy who does cave diving and caving videos
I don't get it
Oh boy here we go again
Or you could just go outside and touch grass
This is absolutely amazing! Imagine the possibilities for blind people!
Nothing says future tech like 360p video.
3 hour battery time new is a joke, hell no
Using an xbox controller to play BotW.... hmmm. I smell ryujinx....
of course it glows pink when it inflates
AI voice using the Audit the Audit voice?
"pimply skin" I'm sorry...what?
Looks cool, but is this narrated by the same guy behind audit the audit? The voice and cadence is dead on.
Hi! What is the use case for this development?
another piece of tech that is home made , and will never get to the public hands.
why does this sound like the lock picking lawyer?
i keep having opportunities to make the comments a nice number 666, 700 and now 900!
I can tell this is what the control panel of a human conversion UAP will look like. Seems like a button that can offer operational feedback for complex technology with the state of the button representing different mechanical or operational states or speeds.
audit the audit auditing VR technology it appears
What's it for?
This is awesome
why was this in my feed? and why tf did i even watch this?
gj pls add it to sechs bot k
Electronic Pop-its
Braille, blind people language
I want this as a light switch for my house
Is this the tech they were using in Superman movie?