Adil Khan 9 months ago
AdiKhanOfficial #FYP Ideas

Project Title

Emotion Recognition in Response to Mulsemedia

Project Area of Specialization

Augmented and Virtual Reality

Project Summary

Multimedia is everything we watch, listen to, or read in the form of sound, text, graphics (images/video), animation, etc. It has become a common component of modern software and hardware, appearing in both desktop and mobile applications. Traditional multimedia, however, engages only the user's auditory and visual senses. Humans have five basic senses, and we can create a more immersive and enjoyable environment by engaging more than two of them. For this purpose, we can use Mulsemedia (MULtiple SEnsorial MEDIA). Engaging additional senses such as touch and smell can enhance the user experience. A multiple-sensorial environment is one that engages more than two human senses while a video is being watched, by synchronizing different components with its audio-visual content. Mulsemedia has opened a new era of modern technology by providing a new perspective across many fields. With this advancement, however, new hardware and software solutions are also required for the new media.

In this project, a setup is developed that engages more than two senses, i.e., hearing, vision, touch (haptics), and others. A video is selected that contains different environments with different weather conditions. These conditions are created artificially by synchronizing the audio-visual content of the video with a fan, a heater, and a haptic device to generate cold air, hot air, and other effects. We will use videos with two to three different weather conditions and observe how users respond to the mulsemedia environment.

Conventional multimedia uses a combination of two senses: vision and hearing. The traditional multimedia experience does not allow the user to fully immerse in what they are viewing. This is our motivation to set up and work on this project.

Project Objectives

Aim:

The main aim of the project is to develop a setup that enhances the conventional multimedia experience by creating a mulsemedia environment.

Objective:

  • Selecting multimedia videos with various effects taking place in the background of the clip.
  • Synchronizing the videos with the devices that provide the mulsemedia effects (i.e., fan, heater, olfaction dispenser, and haptic vest). These devices create an augmented reality for the user, so the participant will feel the difference from traditional multimedia.
  • Developing a setup for the mulsemedia environment. This will give the user the sort of experience the main characters have in the virtual world.
  • Acquiring EEG data from different subjects in response to mulsemedia and traditional multimedia content. Data acquisition using EEG will help us evaluate the subjects' viewing experience. This is advantageous in many ways because it enables us to record and analyse the data without relying on subjective self-analysis.
  • Evaluating the user experience in response to the mulsemedia environment and comparing the difference against traditional multimedia.

Project Implementation Method

A four-step approach, i.e., setup development, data acquisition and pre-processing, feature extraction, and classification of emotion, is used for emotion recognition in response to mulsemedia using EEG.
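The four stages after setup development could be chained as in the following minimal sketch. All function names, the mean/variance-style features, and the power threshold are illustrative assumptions, not the project's actual code:

```python
# Illustrative sketch of the processing pipeline: pre-processing ->
# feature extraction -> classification. The toy features and the
# threshold rule are assumptions for demonstration only.

def preprocess(raw_window):
    """Remove the DC offset from one EEG window (a list of floats)."""
    mean = sum(raw_window) / len(raw_window)
    return [s - mean for s in raw_window]

def extract_features(window):
    """Two toy features: mean signal power and peak amplitude."""
    power = sum(s * s for s in window) / len(window)
    peak = max(abs(s) for s in window)
    return (power, peak)

def classify(features, threshold=1.0):
    """Toy rule: high power -> 'aroused', otherwise 'calm'."""
    power, _ = features
    return "aroused" if power > threshold else "calm"

def recognize_emotion(raw_window):
    """Run one EEG window through the full pipeline."""
    return classify(extract_features(preprocess(raw_window)))
```

In the real pipeline, `extract_features` would compute richer descriptors (e.g., band powers) and `classify` would be a trained model rather than a fixed threshold.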

Setup Development:

The main steps for designing a mulsemedia environment are explained below.

Video Selection:

Videos are selected on the basis of their emotions and background effects. We are dealing with six effects, i.e., cold air, hot air, olfaction (smell), haptics, visuals, and audio. A total of 12 videos were selected on the basis of the arousal scale: for each quadrant of the arousal scale, 3 videos were chosen, and the senses involved in each video differ. The senses involved are shown in the table below.

Senses involved in videos

Videos | Senses Involved
1      | Audio, Visual, Cold air, Olfaction
2      | Audio, Visual, Hot air, Olfaction
3      | Audio, Visual, Haptic, Olfaction

Every video clip is 90 to 120 seconds long, and each effect lasts about 60 seconds (see image).
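The quadrant-based selection could be modelled as a simple catalogue, as in the sketch below. Every clip name is made up for illustration; only the structure (four quadrants, three clips each, twelve in total) follows the text:

```python
# Hypothetical catalogue mapping each arousal-scale quadrant to 3 clips.
# Clip names are invented; the 4 x 3 = 12 structure matches the text.

VIDEOS = {
    ("high_arousal", "positive"): ["beach.mp4", "rollercoaster.mp4", "race.mp4"],
    ("high_arousal", "negative"): ["storm.mp4", "fire.mp4", "chase.mp4"],
    ("low_arousal", "positive"): ["garden.mp4", "snowfall.mp4", "lake.mp4"],
    ("low_arousal", "negative"): ["rain.mp4", "fog.mp4", "ruins.mp4"],
}

def clips_for(arousal, valence):
    """Return the three clips chosen for one quadrant of the scale."""
    return VIDEOS[(arousal, valence)]
```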

Video Synchronization:

After video selection, we need to synchronize each video clip with the hardware. The framework is designed around an Arduino, which controls the other hardware components. As the video plays, the participant feels the sensations produced by the hardware components at the corresponding moments. The videos are synchronized by integrating hardware and software, and the integration must be tested to make sure it works properly. Arduino code was written in which the time stamps of the videos and the hardware components are synchronized. The video player and the Arduino are connected through the computer's serial port. In short, before video synchronization we need to complete these steps:

  • The video player designing
  • The Arduino coding
  • Hardware component designing
  • Integrating all these elements together
  • Testing of the synchronization of videos
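The timestamp-based synchronization described above can be sketched as a per-video effect schedule that the player consults on every playback tick. The schedule entries below are invented for illustration:

```python
# Sketch of timestamp-based synchronization: each video carries a
# schedule of (start_s, end_s, effect) entries, and at every playback
# tick the player works out which effects should currently be ON.
# The times and effects below are made-up example values.

SCHEDULE = [
    (10, 70, "fan"),      # cold-air scene
    (80, 110, "heater"),  # hot-air scene
]

def active_effects(t, schedule=SCHEDULE):
    """Return the set of effects that should be ON at playback time t (seconds)."""
    return {effect for start, end, effect in schedule if start <= t < end}
```

On each tick, the player would compare the current set against the previous one and send ON/OFF signals to the Arduino only for the effects that changed.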

Software Development:

We are designing a GUI in Visual Studio that includes a multimedia player. A video is selected in the media player and played on the screen. The user watches the video, and after each one a subjective-analysis form appears that the user has to fill in, for classifying the human emotions. Since we will detect emotion from brain signals using EEG, we will compare those results with the subjective analysis. So, after every video a form appears, and once it is filled in, the user sees the next video.
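The per-video form could collect a record like the following sketch. The 1-9 valence/arousal rating scales (in the style of self-assessment manikins) are an assumption; the source does not specify the form's fields:

```python
# Minimal sketch of the subjective-analysis record collected after each
# clip; the 1-9 rating scales are assumed, not taken from the project.

from dataclasses import dataclass

@dataclass
class SelfAssessment:
    video_id: int
    valence: int  # 1 (negative) .. 9 (positive)
    arousal: int  # 1 (calm) .. 9 (excited)

    def __post_init__(self):
        # Reject ratings outside the assumed 1-9 scale.
        if not (1 <= self.valence <= 9 and 1 <= self.arousal <= 9):
            raise ValueError("ratings must be on the 1-9 scale")
```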

Emotion Classification:

Emotion is a strong feeling that derives from one's circumstances, mood, and relationships with others. We can classify emotions into different categories, e.g., happy, sad, angry, relaxed, and excited. We detect the emotions of different users through facial expressions and brain activity using EEG, as studies conclude that human emotions are distinguishable from brain signals. The videos are selected using the arousal scale.
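One common way to map ratings to the discrete categories named above is by quadrant of the valence-arousal plane, sketched below. The label names follow the text's categories, but the midpoint threshold is an illustrative assumption:

```python
# Toy mapping from the valence-arousal plane to discrete emotion labels,
# one per quadrant. Thresholds are assumed, not taken from the project.

def quadrant_emotion(valence, arousal, midpoint=5):
    """Classify a (valence, arousal) rating pair on 1-9 scales."""
    if arousal >= midpoint:
        return "happy" if valence >= midpoint else "angry"
    return "relaxed" if valence >= midpoint else "sad"
```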

Benefits of the Project

  • This project will enhance the viewing experience of users.
  • It is a step toward innovation in a multi-billion-dollar industry. If Pakistan or Pakistani businesses can adopt this innovation quickly, it can give them a competitive edge to innovate further.
  • It will create an augmented-reality viewing experience for end users. The technology is still in its early phase and much progress remains to be made, so this is a big leap forward.
  • This setup lets users have the experience in their own homes rather than paying for the luxury of going to cinemas or entertainment centres for a different viewing experience.
  • The product will add the senses of smell and touch on top of vision and hearing, so this is a life-changing sort of innovation for home users.
  • The same implementation can be used by big corporations and entertainment centres to provide the experience to their users.

Technical Details of Final Deliverable

Hardware:

We have designed a control system based on an Arduino microcontroller, which is connected to a heater, a cooling fan, a haptic vest, and an olfaction dispenser through relays, transistors, and resistors. These components create the immersive mulsemedia environment. We have also developed a media-player GUI that not only plays media on the device but also regulates the fan and the heater at the required points in time with respect to the playing video. The fan provides an immersive environment for airy conditions, the heater is used for warm, heated scenery, the haptic vest recreates impact situations, and the olfaction dispenser produces aromas.
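The effect-to-relay wiring can be modelled as a simple pin map, as in the sketch below. The pin numbers are invented, and the logic is written in Python rather than Arduino C++ so the switching behaviour can be tested without hardware:

```python
# Sketch of the effect-to-relay mapping; pin numbers are illustrative.
# On the real Arduino the same mapping would drive digitalWrite calls.

RELAY_PINS = {"fan": 2, "heater": 3, "haptic": 4, "olfaction": 5}

def relay_states(active):
    """Map the set of currently active effects to a per-pin HIGH/LOW state."""
    return {pin: ("HIGH" if name in active else "LOW")
            for name, pin in RELAY_PINS.items()}
```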

Setup Development:

After the complete hardware synchronization, we will record participants' data using EEG and PSG for the classification of emotion. For this purpose, we need to develop a setup.

Software:

On the software side, we have developed a media player using Visual Studio. At the back end, it works with the Arduino software to stay synchronized with the chosen media clips and videos. It is a media player with basic playback capabilities: pause, play, and skipping forward to the next clip. It can play video/music files on the computer. Other features, such as fast forward, reverse, file markers (if present), and variable playback speed, can be added later.

The Arduino software is used for synchronizing the videos with the hardware. For this purpose, we use the time stamps of the videos: according to the effect due at each time stamp, the Arduino sends an ON or OFF signal to the hardware, and the hardware works accordingly.

For proper operation, all the components are connected to one another. Visual Studio plays the video and sends a signal to the Arduino. The Arduino receives the signal and waits for the moments when effects occur in the video. Each time an effect occurs, the Arduino sends an ON signal to the hardware, and the hardware responds accordingly.
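The player-to-Arduino signalling described above could use a one-line text protocol over the serial link, sketched below. The `EFFECT:ON`/`EFFECT:OFF` message format is an assumption for illustration, not the project's actual protocol:

```python
# Sketch of a one-line serial protocol between the media player and the
# Arduino: the player sends "EFFECT:ON"/"EFFECT:OFF" strings, which the
# firmware parses back into (effect, state). The format is assumed.

def encode(effect, on):
    """Build the line the media player writes to the serial port."""
    return f"{effect.upper()}:{'ON' if on else 'OFF'}\n"

def decode(line):
    """Parse one received line back into (effect, is_on)."""
    effect, state = line.strip().split(":")
    return effect.lower(), state == "ON"
```

A text protocol like this is easy to debug from a serial monitor, at the cost of a few extra bytes per message compared with a binary encoding.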

Final Deliverable of the Project: Hardware System
Core Industry: Media
Other Industries: (none)
Core Technology: Augmented & Virtual Reality
Other Technologies: (none)
Sustainable Development Goals: Industry, Innovation and Infrastructure

Required Resources

Elapsed time since start of the project | Milestone | Deliverable
Month 1 | Solution design and planning | Estimation of all the work and a clear-cut working idea of the project
Month 2 | Literature review and setup development | The setup of the working model in light of work already done
Month 3 | Setup development | Developing the complete setup of the final-year project and setting up the mulsemedia environment
Month 4 | Video selection | Selection of the videos so that the project's hardware is fully up and running
Month 5 | Data acquisition | Acquisition of data from the participants
Month 6 | Feature extraction & data processing | Features are extracted from the acquired data and processed to show the impact of the setup
Month 7 | Result analysis | The impact of mulsemedia and multimedia will be compared
Month 8 | Additional work, optimization, and research | Results are optimized; further research and report writing are done
If you need this project, please contact me on contact@adikhanofficial.com