Mapping points of cloud of single image onto MRI for 3D cardiac modeling for augmented reality

Bibliographic Details
Published in: ARPN Journal of Engineering and Applied Sciences
Authors: Faudzi A.A.; Rahmat R.W.O.K.; Sulaiman P.S.; Dimon M.Z.
Format: Article
Language: English
Published: Asian Research Publishing Network, 2015
Online Access: https://www.scopus.com/inward/record.uri?eid=2-s2.0-84940061521&partnerID=40&md5=3c3a1cc3df618346912d1f92ddfed02c
Description
Summary: This research is about achieving realism in virtual heart surgery. Normally, novice cardiac surgeons learn surgery by observing experts perform operations on patients, so their physical involvement is minimal. Virtual surgery can help by simulating cardiac surgery and providing the feeling of performing a real operation without putting any patient's life at risk. The main challenge in virtual surgery is to make the 3D model of the heart as realistic as the real heart. Currently, an artificial 3D heart is used in virtual cardiac surgery. To provide realism, MRI is used to capture the internal part of the heart, and camera-captured images are used to portray the external part. The MRI slices are combined to create a 3D heart. To increase the accuracy of the texture mapping, the captured image is transformed into a 3D point cloud based on the depth of the surface before it is mapped onto the 3D heart. Both types of images (MRI slices and camera-captured) are taken from real patients. Here, the MRI and point cloud provide accuracy, while the captured images provide realism. © 2006-2015 Asian Research Publishing Network (ARPN).
ISSN: 1819-6608
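
The record itself contains no code; as a rough illustration of the image-to-point-cloud step described in the abstract, the sketch below back-projects a single RGB image into a colored 3D point cloud from a per-pixel depth map, assuming a pinhole camera model. The function name, the intrinsics (fx, fy, cx, cy), and the use of NumPy are all assumptions for illustration, not details taken from the paper.

```python
import numpy as np

def image_to_point_cloud(rgb, depth, fx, fy, cx, cy):
    """Back-project an H x W x 3 image plus an H x W depth map
    into an N x 6 array of (x, y, z, r, g, b) points.

    Hypothetical sketch: assumes a pinhole camera with focal
    lengths (fx, fy) and principal point (cx, cy) in pixels.
    """
    h, w = depth.shape
    # Pixel grid: u runs along columns, v along rows.
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    z = depth
    # Pinhole back-projection: X = (u - cx) * Z / fx, Y = (v - cy) * Z / fy.
    x = (u - cx) * z / fx
    y = (v - cy) * z / fy
    # Drop pixels with no depth estimate.
    valid = z > 0
    points = np.stack([x[valid], y[valid], z[valid]], axis=1)
    colors = rgb[valid].astype(np.float64)
    return np.hstack([points, colors])
```

Given, say, a 480 x 640 depth map and intrinsics from camera calibration, this yields one colored point per valid pixel. The resulting cloud could then be registered to the MRI-derived 3D heart (the mapping step the abstract describes) with a standard method such as ICP; the paper's own registration procedure is not detailed in this record.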