Motion Parallax Image Capturing and Display System for Virtual Museum Applications
2025-06-28 16:34:11 - Adil Khan
Project Area of Specialization: Augmented and Virtual Reality

Project Summary
The involvement of technology in the fields of arts, archaeology and history has increased over the past few decades. 3D imaging and display systems are widely used to capture, record and visualize archaeological objects and artwork. In this project we propose an integrated system, enabled with the motion parallax 3D cue, that provides both capture and display of 3D content. The system includes an imaging subsystem that photographs the object from different angles to allow multi-view capture. The captured views are displayed on a regular 2D screen, and the motion parallax cue is used to convey 3D depth information. Motion parallax is enabled by tracking the position of the viewer with a head-tracking device (such as a Kinect sensor); the views are rendered according to the position and angle of the viewer with respect to the screen. The capture system is based on a regular 2D camera attached to a moving camera rig, whose translational movement is controlled by a microcontroller. The integrated system is used to capture archaeological objects, and the captured multi-views are rendered on the screen to visualize the object in three dimensions.
The system presents an image in 3D on a simple 2D screen by means of head tracking, with a Kinect used for the head-tracking task. Multiple shots of an object are captured from multiple angles, and each shot is allotted a specific angle by a sequence-generation script written in Python. A freely moving camera, driven by a stepper motor, performs the task of capturing the images of the object. At the display stage, the Kinect is placed in front of the user to track the user's head and report the viewing angle to the Python application, which fetches the image shot at that angle and displays it on the screen. This process repeats continuously and very quickly: as the user moves his head, a different image is shown, so the user perceives the image as a 3D display. The whole process runs in real time. We will also include virtual objects, built in Unity 3D, that respond to head movement in the same way and give the user the same impression of a 3D display.
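The display-side lookup described above, fetching the image allotted to the tracked viewing angle, can be sketched as follows. The ±30° viewing range is an illustrative assumption, not a final parameter of the project:

```python
# Minimal sketch: map a tracked head angle to the index of the
# captured view to display. The +/-30 degree viewing range is an
# assumed placeholder; the 100-view count follows the capture plan.

NUM_VIEWS = 100
MIN_DEG, MAX_DEG = -30.0, 30.0  # assumed angular viewing range

def angle_to_view_index(angle_deg: float) -> int:
    """Clamp the viewer angle and map it linearly onto a view index."""
    clamped = max(MIN_DEG, min(MAX_DEG, angle_deg))
    frac = (clamped - MIN_DEG) / (MAX_DEG - MIN_DEG)
    return min(NUM_VIEWS - 1, int(frac * NUM_VIEWS))
```

The Python application would call this once per tracking update and display the image file allotted to the returned index.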
Project Objectives
The aim of the project is to develop an image capturing and display system based on a 3D motion parallax view for virtual museum applications.
Objectives:
- Develop the image-capture hardware assembly using a camera, motors and a conveyor belt.
- Interface the capture hardware assembly with a PC for automated image capture.
- Develop a head-tracking sub-module using a camera/Kinect depth sensor.
- Render and display animated and captured multi-views on the screen in real time.
- Implement the system for 3D capture and 3D display of museum objects.
- Test and characterize the performance of the developed system.
The project is divided into two parts. The first part is the image capture subsystem, which consists of a stepper-motor and conveyor-belt assembly. The camera is attached to the belt, allowing translational movement so that multiple shots can be taken from different viewpoints. Camera movement and shot capture are controlled by a customized Python application; the motor movement is controlled by an Arduino microcontroller, which receives the movement signal from the Python application. The camera range is fixed from 1 to 100 cm with a step size of 1 cm, so 100 multi-views of the object are captured. The angular position of each view is also recorded.
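The capture-side bookkeeping above can be sketched as two small helpers: one listing the 100 rig positions at 1 cm steps, and one computing the viewing angle recorded with each shot. The 50 cm object distance is a hypothetical value for illustration only:

```python
# Sketch of capture bookkeeping: rig positions at 1 cm steps over a
# 1 m rail, and the viewing angle recorded with each shot. The 50 cm
# object distance is an assumed value, not a project parameter.

import math

STEP_CM = 1.0
NUM_VIEWS = 100
OBJECT_DIST_CM = 50.0  # assumed perpendicular distance to the object

def view_positions_cm():
    """Camera positions along the rail, centred on the object (cm)."""
    offset = (NUM_VIEWS - 1) * STEP_CM / 2.0
    return [i * STEP_CM - offset for i in range(NUM_VIEWS)]

def view_angles_deg():
    """Viewing angle recorded for each captured shot (degrees)."""
    return [math.degrees(math.atan2(x, OBJECT_DIST_CM))
            for x in view_positions_cm()]
```

Storing the angle alongside each shot lets the display stage fetch the view whose recorded angle best matches the tracked viewer angle.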
The second part involves viewer tracking and display of the rendered views. A flat-panel 2D display is used for image display, while a head-tracking device such as a Kinect tracks the distance and angular position of the viewer. The required view is rendered based on the angular position of the viewer and is displayed on the screen. The viewer tracking, rendering and display functions are performed in real time to provide the motion parallax cue in 3D.
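The tracking step above can be sketched as a helper that converts the head position reported by the depth sensor into a viewing angle relative to the screen normal. The axis convention (x lateral, z toward the viewer, both in metres) is an assumption and would need to match the actual Kinect SDK output:

```python
# Sketch: convert a tracked head position (x = lateral offset,
# z = distance from the screen, both in metres) into the viewing
# angle used to select the rendered view. Axis convention assumed.

import math

def head_angle_deg(head_x_m: float, head_z_m: float) -> float:
    """Viewing angle relative to the screen normal, in degrees."""
    return math.degrees(math.atan2(head_x_m, head_z_m))
```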

Figure-1: Block diagram representation of the system


Figure-2: Conceptual view of the image capture assembly (left) and display system (right)
Benefits of the Project
The proposed system allows conventional 2D display screens to provide a 3D depth cue using motion parallax. Though the developed system is demonstrated for museum scenarios, its use can be extended to other, similar applications. It can be used to set up virtual museums in rural areas across the country by capturing the views of an original antique object once and rendering those views on a regular 2D screen using viewer tracking. The proposed system can also benefit digital marketing and online shopping, where it would be possible to provide 3D views of a product to market and sell it better. The system can likewise benefit entertainment and gaming by introducing a more realistic feel to the experience, and medical visualization, where CT scans and MRIs can be viewed in 3D.
Technical Details of the Final Deliverable
The resultant product is an image capturing and display system for virtual museum applications. On the software side it uses Unity 3D, a cross-platform game engine, and the Python programming language; on the hardware side it uses a Kinect sensor to track the user and a webcam that is moved along a belt by a stepper motor under the control of an Arduino Uno, capturing 100 images over a 1 m range. The system presents a 2D image as 3D on a 2D screen, removing the flatness of an object seen by the user in 2D. The final system tracks the user's viewing angle and shows a 3D rendering of an object that was originally captured in 2D.
Final Deliverable of the Project: HW/SW integrated system
Core Industry: Education
Other Industries: IT
Core Technology: Augmented & Virtual Reality
Other Technologies: Augmented & Virtual Reality
Sustainable Development Goals: Quality Education; Decent Work and Economic Growth; Industry, Innovation and Infrastructure

Required Resources
| Item Name | Type | No. of Units | Per Unit Cost (in Rs) | Total (in Rs) |
|---|---|---|---|---|
| Microsoft Kinect V2 | Equipment | 1 | 15000 | 15000 |
| Stepper Motors (Nema 17) | Equipment | 2 | 3000 | 6000 |
| Motor Driver IC (A4988) | Equipment | 2 | 600 | 1200 |
| Conveyor Belt | Equipment | 1 | 500 | 500 |
| 2D display screen for 3D visualization | Equipment | 1 | 10000 | 10000 |
| Multi-view capturing camera (i.e. webcam) | Equipment | 1 | 3000 | 3000 |
| Gantry mechanical assembly for moving camera | Equipment | 1 | 5000 | 5000 |
| Microcontroller (Arduino, WiFi shield) | Equipment | 1 | 2000 | 2000 |
| Plastic case for hardware, wiring, soldering items, stationery | Miscellaneous | 1 | 3000 | 3000 |
| Total (in Rs) | | | | 45700 |