Vision Based Real-time Recognition of Hand Gestures for Indian Sign Language using Histogram of Oriented Gradients Features


Pradip Patel
Narendra Patel

Abstract

Sign language is a method of communication used by deaf people, in which gestures are used to convey meaning. Because most hearing people are not familiar with sign language, a communication gap exists between them and the deaf community. Very little work has been done on recognition of Indian Sign Language, owing to the lack of standardization and the complexity of its hand gestures. This has created a need for an automatic system that can recognize Indian Sign Language. Such a system, built using image processing and computer vision techniques, would help deaf people communicate with hearing people and thus bridge the communication gap. This paper presents a vision-based system for real-time recognition of hand gestures in Indian Sign Language. The system is first trained on a training data set: the hand region is cropped from every image in the data set by segmentation using thresholding in the YCbCr color space, and Histogram of Oriented Gradients (HOG) features computed from these cropped images are used to train the classifier. During testing, features extracted from the hand region of frames taken from real-time video are presented to the classifier for classification. Recognition rates of different classifiers, namely Support Vector Machine (SVM), K-Nearest Neighbors (KNN), and Linear Discriminant Analysis (LDA), are discussed.
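Purely as an illustration, the sketch below shows one way the pipeline described in the abstract could be assembled in Python with OpenCV, scikit-image, and scikit-learn: YCbCr skin thresholding to crop the hand region, HOG feature extraction on the crop, and an SVM classifier. The threshold values, HOG parameters, and helper names (segment_hand, hog_features) are assumptions for illustration only and are not taken from the paper.

```python
# Minimal sketch of a YCbCr-segmentation + HOG + SVM gesture pipeline.
# Not the authors' implementation; thresholds and parameters are assumed.
import cv2
import numpy as np
from skimage.feature import hog
from sklearn.svm import SVC


def segment_hand(bgr_image):
    """Crop the largest skin-colored region found by YCbCr thresholding."""
    ycrcb = cv2.cvtColor(bgr_image, cv2.COLOR_BGR2YCrCb)
    # Commonly used skin range in (Y, Cr, Cb); assumed values, tune per data set.
    mask = cv2.inRange(ycrcb, np.array([0, 135, 85]), np.array([255, 180, 135]))
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)  # OpenCV >= 4
    if not contours:
        return None
    x, y, w, h = cv2.boundingRect(max(contours, key=cv2.contourArea))
    return bgr_image[y:y + h, x:x + w]


def hog_features(hand_bgr, size=(64, 64)):
    """Compute a HOG descriptor on a resized grayscale crop of the hand."""
    gray = cv2.cvtColor(cv2.resize(hand_bgr, size), cv2.COLOR_BGR2GRAY)
    return hog(gray, orientations=9, pixels_per_cell=(8, 8),
               cells_per_block=(2, 2), block_norm='L2-Hys')


# Training (illustrative): X is a list of HOG vectors, y the gesture labels.
#   clf = SVC(kernel='linear').fit(X, y)
# Real-time testing: for each video frame, crop the hand with segment_hand,
# extract hog_features, and call clf.predict([features]).
```

To compare the three classifiers evaluated in the paper, the SVC above could be swapped for scikit-learn's KNeighborsClassifier or LinearDiscriminantAnalysis with the same feature vectors.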


How to Cite
Pradip Patel, & Narendra Patel. (2019). Vision Based Real-time Recognition of Hand Gestures for Indian Sign Language using Histogram of Oriented Gradients Features. International Journal of Next-Generation Computing, 10(2), 92–102. https://doi.org/10.47164/ijngc.v10i2.158
