"Quasar IPS" -prototype of innovative system for indoor navigation, using Augmented Reality, with practical application in proximity marketing, entertainment industry (Optimized navigation in museums, galleries and other cultural events), and in education - educational and complementary training

Quasar IPS uses, in a new way, information that is already collected and routinely stored on a device. This new type of analysis enables the following:

Movement tracking - the system monitors, stores and analyzes its own movement and orientation in indoor spaces.

Learning and memorizing a specific confined space - the system orients itself in an indoor location based on two main factors: a preset floorplan, combined with the movement tracking above, which together yield the subject's position within the preset area.

Depth perception - this functionality takes the environment type (room, staircase, etc.) into account.

On this basis, elements of augmented reality can be created to assist user orientation: visual signs, similar to the turn-by-turn instructions in GPS systems, but projected onto the indoor scene and displayed on the smartphone screen for greater user convenience. The algorithm takes predictable variables into account, allowing it to analyze and archive the above information; as its predictive abilities improve with use, Quasar IPS will profile patterns of movement and points of interest more and more accurately. This will allow the application to predict and analyze larger volumes of data on consumer habits, which can be used for marketing purposes such as strategic positioning by sellers in major retail outlets. Combining this information leads to higher recognition accuracy of the environment for the user, which in turn yields more accurate localization and navigation instructions within an enclosed space.
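The movement-tracking idea described above - inferring position from the device's own motion sensors relative to a preset floorplan - can be sketched as simple pedestrian dead reckoning. This is a minimal illustration only, assuming a fixed average stride length and step/heading events already extracted from the phone's sensors; the function name and parameters are not part of Quasar IPS.

```python
import math

def dead_reckoning(start, headings, step_length=0.7):
    """Update an indoor position estimate from detected step events.

    start:       (x, y) in metres on the floorplan's coordinate grid.
    headings:    iterable of headings in radians, one per detected step
                 (e.g. derived from the phone's compass/gyroscope).
    step_length: assumed average stride in metres (a simplification;
                 a real system would calibrate this per user).
    """
    x, y = start
    for heading in headings:
        # Advance one stride in the current heading direction.
        x += step_length * math.cos(heading)
        y += step_length * math.sin(heading)
    return (x, y)

# Four steps at heading 0 ("east" on the floorplan) from the entrance:
pos = dead_reckoning((0.0, 0.0), [0.0] * 4)  # roughly (2.8, 0.0)
```

In practice such an estimate drifts over time, which is why the description pairs it with the preset floorplan and environment recognition: map constraints (walls, corridors, staircases) can repeatedly correct the accumulated error.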

Overview

Status Closed (completion date)
Start date 31 Jul, 2017
End date 19 Nov, 2019
Contract date 31 Jul, 2017

Financial information

Total cost 415,274.20
Grant 373,746.78
Self-financing 41,527.42
Total paid 371,335.92
EU participation percent 85.0%

Location