Summary: | In the dynamic landscape of online shopping, the integration of Augmented Reality (AR) technologies has emerged as a transformative force, redefining the way consumers engage with products in virtual environments. This research project investigates the intersection of deep learning and AR in the context of online shopping, with a particular focus on a Watch Try-On application. The experiments employ SSD MobileNet models for real-time object detection, aimed at enhancing the user experience during online watch shopping. Both SSD MobileNet V1 and V2 were trained for 50,000 iterations, and the results reveal notable differences in their performance. SSD MobileNet V1 demonstrated superior accuracy, achieving a mean average precision (mAP) of 0.9725 and a reduction in total loss from 0.774 to 0.5405. However, its longer training time of 7 hours and 42 minutes, together with V2's faster inference, prompted the selection of SSD MobileNet V2 for the real-time application. Extending beyond traditional online shopping experiences, the research explores the potential of AR technologies to revolutionize product visualization and interaction. The choice of the Vuforia model target for the Watch Try-On application showcases the synergy between deep learning and AR, allowing users to virtually try on watches and visualize them in their real-world environment. The application detects users' hands with high accuracy, creating an immersive and visually rich experience. In conclusion, this project contributes to the ongoing discourse on the fusion of deep learning and AR for online shopping. The exploration of SSD MobileNet models, coupled with the integration of AR technologies, underscores the potential to elevate the online shopping experience by providing users with dynamic, interactive, and personalized ways to engage with products. © 2025, Bright Publisher. All rights reserved.
|
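The abstract describes running a trained SSD MobileNet detector in real time to locate the user's hand before anchoring the virtual watch. The paper's own implementation details are not given here; the following Python sketch is a rough illustration only, assuming a model exported with the TensorFlow Object Detection API and an OpenCV webcam loop. The model path, score threshold, and drawing logic are hypothetical placeholders, not the authors' configuration.

```python
# Illustrative sketch (assumed setup): run an exported SSD MobileNet detector
# on live webcam frames and draw boxes for high-confidence detections.
import cv2
import numpy as np
import tensorflow as tf

MODEL_DIR = "exported_model/saved_model"  # hypothetical export path
SCORE_THRESHOLD = 0.5                     # assumed confidence cutoff

# Load the exported detection model (TF2 Object Detection API SavedModel).
detect_fn = tf.saved_model.load(MODEL_DIR)

cap = cv2.VideoCapture(0)
while True:
    ok, frame = cap.read()
    if not ok:
        break

    # The detector expects a uint8 batch of shape [1, H, W, 3] in RGB order.
    rgb = cv2.cvtColor(frame, cv2.COLOR_BGR2RGB)
    input_tensor = tf.convert_to_tensor(rgb[np.newaxis, ...], dtype=tf.uint8)
    detections = detect_fn(input_tensor)

    boxes = detections["detection_boxes"][0].numpy()   # normalized [ymin, xmin, ymax, xmax]
    scores = detections["detection_scores"][0].numpy()

    h, w = frame.shape[:2]
    for box, score in zip(boxes, scores):
        if score < SCORE_THRESHOLD:
            continue
        ymin, xmin, ymax, xmax = box
        cv2.rectangle(frame,
                      (int(xmin * w), int(ymin * h)),
                      (int(xmax * w), int(ymax * h)),
                      (0, 255, 0), 2)

    cv2.imshow("hand detection", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break

cap.release()
cv2.destroyAllWindows()
```

In an AR pipeline such as the one described, the detected hand region would then be passed to the rendering side (here, Unity with a Vuforia model target) to position and overlay the 3D watch model on the user's wrist.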