A Real-Time Virtual Dressing Room Application Using OpenCV
Rshami S. Shinkar1, Nagaraju Bogiri2

1Rshami S. Shinkar, Computer Engineering Department, K. J. College of Engineering & Management, Savitribai Phule Pune University, Pune, India.
2Nagaraju Bogiri, Computer Engineering Department, K. J. College of Engineering & Management, Savitribai Phule Pune University, Pune, India.
Manuscript received on 25 August 2019. | Revised Manuscript received on 12 September 2019. | Manuscript published on 30 September 2019. | PP: 987-992 | Volume-8 Issue-11, September 2019. | Retrieval Number: H7396068819/2019©BEIESP | DOI: 10.35940/ijitee.H7396.0981119
© The Authors. Blue Eyes Intelligence Engineering and Sciences Publication (BEIESP). This is an open access article under the CC-BY-NC-ND license (http://creativecommons.org/licenses/by-nc-nd/4.0/)

Abstract: Long waiting queues for trial rooms are a common problem in many stores. The proposed system processes clothing in real time through a data-driven approach and renders it on the user's body. To give the skeleton an accurate appearance, the system uses the user's height and skin color to construct a clone of the person prior to real-time simulation. The hardware comprises motion, light, and camera sensors controlled through GUI (Graphical User Interface) software; it uses the depth camera of the Kinect sensing element together with the Unity SDK. The operating system communicates with user-friendly software that manages the sensor controllers. The proposed approach offers a GUI that is friendly both to the end user and to the retailer, and this usability can raise the level of marketing above the current system. The proposed system therefore provides not only a good solution for virtual dressing but also addresses issues faced by retailers and end users.
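The core rendering step the abstract describes, drawing a garment over the user's body in the camera feed, reduces to alpha-blending a transparent clothing image onto each frame at a position derived from the tracked skeleton. The paper does not publish its implementation; the following is a minimal sketch of that blending step, where the function name `overlay_garment` and the fixed anchor point are assumptions for illustration. In a full system, the frame would come from the Kinect/OpenCV capture and the anchor from skeleton joint coordinates.

```python
import numpy as np


def overlay_garment(frame, garment_rgba, x, y):
    """Alpha-blend an RGBA garment image onto a BGR frame at (x, y).

    frame        -- HxWx3 uint8 camera frame (e.g. from cv2.VideoCapture)
    garment_rgba -- hxwx4 uint8 garment image with a transparency channel
    (x, y)       -- top-left anchor, e.g. derived from a shoulder joint.
    Boundary clipping is omitted to keep the sketch short.
    """
    h, w = garment_rgba.shape[:2]
    roi = frame[y:y + h, x:x + w].astype(np.float32)
    rgb = garment_rgba[..., :3].astype(np.float32)
    alpha = garment_rgba[..., 3:4].astype(np.float32) / 255.0  # hxwx1
    blended = alpha * rgb + (1.0 - alpha) * roi
    frame[y:y + h, x:x + w] = blended.astype(np.uint8)
    return frame


# Tiny demonstration: a fully opaque white 10x10 garment on a black frame.
frame = np.zeros((100, 100, 3), dtype=np.uint8)
garment = np.full((10, 10, 4), 255, dtype=np.uint8)
frame = overlay_garment(frame, garment, 5, 5)
```

Per-pixel blending in NumPy keeps the loop vectorized, which matters when this runs on every frame of a live feed.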
Keywords: Depth Image, Kinect Sensor, Augmented Reality, Unity SDK, Skeleton