
Soon Our Phones Will Know Which Hand They're In

by Eliza Castaneda 2 months ago in mobile

A few days ago, one of the lesser-known conferences, the conference on Human Factors in Computing Systems, or "CHI" for short, was held in New Orleans, Louisiana. Although this conference is not widely known, it revolves around a very important field: finding new ways for humans to interact with technology.

Photo by NordWood Themes on Unsplash

This field has produced inventions that will amaze you, and perhaps leave you uneasy about the technical future that awaits us. How could they not, when I am about to tell you that a group of scientists managed to simulate the feeling of a kiss using virtual reality technology, as well as the feel of pet fur, as if the metaverse were not unsettling enough already!

But before we get to these strange inventions, let's first talk about a clever use of selfie cameras that a group of Japanese researchers came up with. It is perhaps the least baffling invention at this conference, one that could be harnessed in everyday functions and applications to make the way we use our smartphones much easier.

We all carry a smartphone with a front-facing selfie camera, and whatever angle you hold your phone at, if the front camera is active, it can see you at all times. Starting from this idea, a group of researchers from Keio University, Yahoo, and Tokyo University of Technology devised a way to harness the front camera to see how we hold our smartphones, then use that information to create an adaptive user interface that changes according to the way we grip our devices. The project, named "ReflecTouch," was revealed at the CHI 2022 conference.

With suitable screen brightness and a high-resolution camera, the researchers captured a number of pictures of the user's eyes while the user was holding the phone, then analysed and processed those pictures: they zoomed in on the eye and examined the bright white regions in it, which are the reflection of the phone screen and reveal the way the user was holding the device.

The researchers published a clip on YouTube showing the experiment in practice: a person holds an iPhone running a specially programmed application that asks the user to hold the phone in certain ways, such as in the right hand, the left hand, then both hands together, and to touch the screen with different fingers such as the thumb and forefinger. A picture of the user is taken in each position, with a distance of about 15 cm kept between the user and the phone.

The captured images are then processed with a convolutional neural network (CNN): the network zooms in on the user's eye and analyses the light reflection visible there, and by identifying those white regions it can determine the specific way you are holding your smartphone.
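The paper's pipeline relies on a trained CNN, but the core cue can be illustrated with a far simpler heuristic. The sketch below is my own toy stand-in, not ReflecTouch's actual model: it thresholds the bright screen reflection in a grayscale eye crop and guesses the grip from the horizontal position of the lit region (the decision rule here is purely illustrative).

```python
import numpy as np

def classify_grip(eye_crop, bright_thresh=200):
    """Guess the grip from the screen reflection in a grayscale eye crop.

    Toy stand-in for the paper's CNN: threshold the pixels lit by the
    screen, then use the horizontal skew of the lit region as a cue.
    The mapping from skew to grip below is illustrative only; in the
    real system that mapping is learned from labelled photos.
    """
    mask = np.asarray(eye_crop) >= bright_thresh   # pixels lit by the screen
    if not mask.any():
        return "unknown"                            # no reflection visible
    ys, xs = np.nonzero(mask)
    skew = xs.mean() / eye_crop.shape[1]            # 0 = left edge, 1 = right
    if skew < 0.4:
        return "right-hand"
    if skew > 0.6:
        return "left-hand"
    return "two-hands"
```

In practice the learned model is what makes this robust across lighting, eye shape, and viewing angle; the heuristic only shows which signal it is reading.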

You may wonder about the possible applications of this unusual experiment. The researchers concluded by demonstrating a group of interesting ones: for example, if you use an application with two hands and then switch to using only one hand, the phone can detect this and shift the application's buttons to that side for easy access. Likewise, if you type with both hands and then switch to one hand, the keyboard moves toward that hand. Did you get the idea? In general, the goal of this experiment is to create a dynamic user interface that adapts to the user.
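On the app side, such an adaptive layer can be as small as a lookup from detected grip to layout. The sketch below is hypothetical app logic of my own, not code from the paper; the alignment values simply follow the idea that a one-handed grip pulls controls toward the active thumb.

```python
def layout_for_grip(grip):
    """Map a detected grip to a UI layout (hypothetical app logic).

    The alignment choices are illustrative: one-handed grips pull
    buttons and keyboard toward the thumb in use, two-handed use
    keeps a symmetric full-width layout.
    """
    layouts = {
        "left-hand":  {"buttons": "left",   "keyboard": "left"},
        "right-hand": {"buttons": "right",  "keyboard": "right"},
        "two-hands":  {"buttons": "center", "keyboard": "full-width"},
    }
    # Fall back to the symmetric layout when the grip is unrecognised.
    return layouts.get(grip, {"buttons": "center", "keyboard": "full-width"})
```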

Mark Zuckerberg promised us that the metaverse would be the future of the Internet, and he began pumping millions of dollars into developing technologies for interacting with this digital world. Yet although virtual reality has advanced greatly in recent years, it still goes no further than visual and audio effects plus a controller that produces some vibrations. Scientists and researchers still have a long way to go to create an integrated experience that engages our senses and makes the human mind believe the illusion is real!

And now science has come a long way down that road: a group of researchers from Carnegie Mellon University's Future Interfaces Group managed to simulate touch on the mouth, which means you could feel the sensation of drinking water, brushing your teeth, smoking a cigarette, or even being kissed, as we saw at this year's CHI conference.

Perhaps the most important part of this achievement is that it requires no additional accessory beyond the virtual reality headset: the researchers used a headset believed to be a Meta Quest 2 and added a set of ultrasonic transducers, all focused on the user's mouth, which can create a sense of touch on the user's lips, teeth, and even tongue when the mouth is open.

And we're not just talking about pulses being sent to simulate gentle touches on the lips. These transducers can send out pulses in specific patterns, creating an endless number of applications that can take advantage of this.
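The basic trick behind steering ultrasound from a fixed array of transducers is well established: fire each element with a delay chosen so that every wavefront arrives at the same focal point simultaneously, concentrating pressure there. The sketch below is my own illustration of that phased-array principle, not the researchers' driving code; the names and values are assumptions.

```python
import numpy as np

SPEED_OF_SOUND = 343.0  # m/s in air at room temperature

def focus_delays(transducer_positions, focal_point):
    """Firing delays (seconds) that focus a transducer array on one point.

    Each element's wavefront must arrive at the focal point at the same
    instant, so elements farther from the focus fire earlier. The
    farthest element fires immediately (zero delay).
    """
    pos = np.asarray(transducer_positions, dtype=float)
    dist = np.linalg.norm(pos - np.asarray(focal_point, dtype=float), axis=1)
    travel_time = dist / SPEED_OF_SOUND
    return travel_time.max() - travel_time
```

Sweeping the focal point along the lips over time, or pulsing it in a pattern, is what turns this into distinct sensations such as raindrops versus a steady stream.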

The researchers presented a clip showing the idea in action with some practical applications, including the user hitting a spider web, drinking from a tap, rain falling on their face, and more. Each example delivered a specific pulse pattern to the user's lips to create a more realistic experience, one step at a time.

Researchers at National Taiwan University came up with another interesting idea: simulating the feeling of fur when you pet a metaverse cat, or even the feel of different textures of cloth. This time, however, you need an additional accessory, a controller-like attachment that includes two tufts of artificial fur that can be felt with the finger. The researchers called it "HairTouch."

The idea of this tool is not only to let the user feel the artificial fur when interacting with an animal or object, but also to simulate different types of fur and other surfaces by controlling the length and angle of the hairs: the artificial fur can be made softer and more flexible when fully extended, or stiffer and coarser when only a short length of the hairs is exposed.
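There is simple physics behind why the exposed length controls softness: a brush hair bends roughly like a cantilever beam, whose bending stiffness scales with the inverse cube of its free length, so halving the exposed length makes the hair about eight times stiffer. A small sketch under that beam assumption (the reference values are illustrative, not from the HairTouch paper):

```python
def hair_stiffness(exposed_length_mm, k_ref=1.0, l_ref=10.0):
    """Relative bending stiffness of a brush hair as a cantilever beam.

    For a cantilever, stiffness k = 3*E*I / L^3, so k scales as
    (l_ref / L)**3 relative to a reference length. k_ref is the
    stiffness at l_ref mm; both are illustrative placeholder values.
    """
    return k_ref * (l_ref / exposed_length_mm) ** 3
```

This is why retracting the hairs into the housing makes the same tuft feel coarse and firm, while extending them fully makes it feel soft.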

The CHI 2022 conference showed us that the technological future awaiting us is impossible to predict and will never fail to amaze, and that the inventions we have seen today are nothing but a prelude to what we will see in the coming months, both in enhancing the virtual reality experience and in harnessing technology generally to make modern life easier.
