Urdu Mobile Assistant
2025-06-28 16:29:54 - Adil Khan
Project Area of Specialization: Artificial Intelligence

Project Summary
Few existing mobile assistants perform well for the Urdu language. We are developing our own Urdu Mobile Assistant (UMA). UMA will help users open specific apps by speaking Urdu commands, making the experience more user-friendly. The user gives a voice command to the system; the command is converted into Urdu text and shown to the user. That text is then processed further, and the event related to that particular voice command is triggered. Users of the application can also manage their profile information and submit feedback about any problem they face while using the application.
Project Objectives
A large number of people in Pakistan cannot speak English fluently and are therefore unable to use the mobile assistants provided by Google, Huawei, and others. There is a need for an app that gives Urdu-speaking users a friendly experience, so that they can find their apps easily without the hassle of scrolling through many identical icons. Many people in Pakistan are casual mobile users who often forget where the icons of their most-used apps (such as Facebook) are placed, and then try to find the icon by scrolling through many pages of apps. Our assistant solves this problem by letting the user open common apps with a single voice command.
Our project will perform the following functions:
- Receive voice command
- Extract keywords from voice data
- Open desired mobile application
- Submit feedback
- Log in / sign up
The user gives a voice command to the system, and the command is converted into text. The text is first cleaned and then fed to the model. The model returns a probability that the user wants to open an app. If that probability crosses a threshold, the system checks which app the user named and launches it. In addition, logging in, resetting a forgotten password, signing up, giving feedback, and managing the profile are other use cases covered in this project.
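As a rough illustration of the flow described above, the Python sketch below models the clean-then-classify-then-threshold logic. All names here (the keyword table, the threshold value, the stand-in classifier) are hypothetical placeholders, not the project's actual implementation.

```python
# Hypothetical sketch of the command-handling pipeline.
# The keyword table, threshold, and classifier are assumptions for illustration.

APP_KEYWORDS = {
    "فیس بک": "com.facebook.katana",   # "Facebook" in Urdu -> package name
    "واٹس ایپ": "com.whatsapp",        # "WhatsApp" in Urdu -> package name
}

THRESHOLD = 0.7  # assumed confidence cutoff


def clean_text(text: str) -> str:
    """Basic cleaning: trim whitespace and drop punctuation."""
    return "".join(ch for ch in text.strip() if ch.isalnum() or ch.isspace())


def classify_intent(text: str) -> float:
    """Stand-in for the model: probability that the user wants to open
    an app (here, simply high if any known keyword appears)."""
    return 0.9 if any(k in text for k in APP_KEYWORDS) else 0.1


def handle_command(text: str):
    """Return the package to launch, or None if below the threshold."""
    cleaned = clean_text(text)
    if classify_intent(cleaned) < THRESHOLD:
        return None
    for keyword, package in APP_KEYWORDS.items():
        if keyword in cleaned:
            return package  # in the real app: launched via an Android intent
    return None
```

In the actual Android app, the returned package name would be launched through the platform's intent mechanism (e.g. `PackageManager.getLaunchIntentForPackage`); the sketch above only models the decision logic.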
Benefits of the Project
The purpose of this document is to describe the behavior of the Urdu Mobile Assistant (UMA). The Vision Document contains all requirements and design constraints, giving the reader an understanding of the application to be developed. The requirements specification defines and describes the operations, interfaces, and performance of the UMA project.
Technical Details of Final Deliverable
The software will be delivered as an Android app. Any Android user can install and use the app.
Final Deliverable of the Project: Software System
Core Industry: IT
Other Industries: Others
Core Technology: Artificial Intelligence (AI)
Other Technologies: Others
Sustainable Development Goals: Peace and Justice Strong Institutions
Required Resources: