Google revealed a new augmented reality feature called "occlusion", explaining that smartphones running the Android operating system will be equipped with it. The feature lets a user point the phone's camera at a room to which he wants to add a new piece of furniture, and smoothly projects a clear, three-dimensional image of that piece, using augmented reality, in the spot the user chooses. As the user changes the viewing angle, from each side, from above, and from below, the apparent position of the furniture changes automatically, and parts of it appear and disappear according to the angle, as if it were already there. The user can move it in every direction, examine how well it fits with the rest of the room's contents, and judge its suitability to the free space, the prevailing colors, and so on.

Google reported that occlusion is one of the future technologies the company's augmented reality team is working on.

This came during a meeting between a group of team members and some technology editors in the United States, at the company's headquarters. The Verge published a report on the technology based on the live demonstration its editors attended.

ARCore

The occlusion feature is part of a set of improvements Google intends to add to ARCore, the augmented reality platform it officially launched early last year, whose second version it is preparing to release soon. Google defines occlusion as the situation in which part or all of a virtual object can be blocked from view by real objects in a scene. The result is a more believable scene, because detecting the depth relationships beyond or below the area where objects overlap means the augmented reality application understands each object in the scene, and how the objects differ from one another, more accurately.
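The idea of hiding a virtual object behind nearer real surfaces can be sketched as a per-pixel depth comparison. The following toy example (an illustrative sketch only, not Google's actual ARCore implementation; all array names are hypothetical) compares a depth map of the real scene with the depth of a rendered virtual object and keeps the virtual pixel only where it is closer to the camera:

```python
import numpy as np

# Hypothetical toy data, distances in meters.
# real_depth: estimated distance to the real scene at each pixel.
# virt_depth: distance to the rendered virtual object
#             (infinity where the object is not drawn).
real_depth = np.array([[2.0, 2.0, 0.8],
                       [2.0, 2.0, 0.8]])
virt_depth = np.array([[1.5, 1.5, 1.5],
                       [np.inf, 1.5, 1.5]])

# The virtual object is shown only where it is nearer than the real
# surface; elsewhere the real scene occludes it.
visible = virt_depth < real_depth
print(visible.tolist())
```

In the right-hand column the real surface sits at 0.8 m, closer than the virtual object at 1.5 m, so the virtual pixels there are hidden, much as the sofa hides the virtual cat in Google's demo.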

Dental occlusion

According to The Verge, the idea behind the new feature is inspired by the occlusion between the teeth of the upper and lower jaws in humans: closing the mouth hides part of the lower teeth behind the upper ones, and opening the mouth reveals the full detail of both rows of teeth again.

The team drew on this idea to rethink how objects projected with augmented reality appear within the live scenes captured by the camera and displayed on a device's screen, such as a smartphone's. With the new feature, deep occlusion and full interaction occur between the virtual object presented in augmented reality and the real objects in the picture, rather than the merely surface-level overlay that happens today when people view the scene through the camera. Currently the interaction has no depth: the virtual object is always drawn in full detail on top of the scene, hiding whatever real objects it encounters on the screen, whatever the viewing angle.

Angle of view

The occlusion feature means that the virtual object presented in an augmented reality scene interleaves in depth with the things that appear around it, and interacts with them as if it were a real, physical object. In one of the videos The Verge presented, Google researchers opened the camera of a Pixel 3 smartphone and panned it around a living room full of objects, then dropped a virtual cat into the scene with augmented reality and made it roam the room among the furniture. As the cat wandered behind a sofa, it gradually disappeared until only part of its tail remained visible, while its head and neck gradually emerged on the other side. When the cat stood between two pieces of furniture and the researchers changed the assumed viewing angle, the visible parts of the cat changed accordingly: sometimes it appeared in full, and sometimes only one side of it was visible.

Freedom of movement

Another video showed Google researchers adding a new piece of furniture to a living room using the occlusion feature. The augmented reality technology rendered the room scene after a new table was added, then the researchers moved the table from place to place to gauge the distances between it and the seats and the sofa, exploring the freedom of movement around it and whether the table's size was appropriate.

Early stage

Technical reports covering the development of the occlusion feature noted that it is still in its early stages and is not yet flexible, efficient, or fast.

The reports pointed out that Google's live demonstrations of the new technology ran at a noticeably lower quality than what appeared in the explanatory videos.