Publications

  1. Hosseini, H., Mandal, D., & Puhan, A. (2024). Surprisingly Popular Voting for Concentric Rank-Order Models. arXiv preprint arXiv:2411.08367. (Accepted at ACM's The Web Conference 2025) - read the paper here

  2. Hosseini, H., Mandal, D., & Puhan, A. (2024). The Surprising Effectiveness of SP Voting with Partial Preferences. arXiv preprint arXiv:2406.00870. (Accepted at NeurIPS 2024) - read the paper here

  3. Hazra, S., Pisipati, M., Puhan, A., Nandy, A., & Scherer, R. (2021, December). Two Novel Methods for Multiple Kinect v2 Sensor Calibration. In International Conference on Computer Vision and Image Processing (pp. 403-414). Cham: Springer International Publishing. - read the paper here

  4. Pisipati, M., Puhan, A., Kumar, A., Semwal, V. B., & Agrawal, H. (2020). A dynamic hand gesture-based password recognition system. In Communication and Intelligent Systems: Proceedings of ICCIS 2019 (pp. 21-34). Springer Singapore. - read the paper here 

Research Projects

Fine-Tuning Large Language Models using Reinforcement Learning with Human Feedback

(Nov 2023 - Present)

Since this research is ongoing, more details will be provided soon.

The Surprising Effectiveness of SP Voting with Partial Preferences

Collaborators: Dr. Hadi Hosseini, Pennsylvania State University (Thesis Advisor), and Dr. Debmalya Mandal, University of Warwick. 

(Jan 2023 - May 2024)

This research focuses on Surprisingly Popular Voting (SP-Voting), an aggregation rule that can recover the ground truth even when experts are a minority of the voting population. The original method requires voters to provide: 1) their personal candidate preference, and 2) their perception of the broader electorate's preferences. Our work devises a mechanism that effectively implements SP-Voting even with partial preference profiles, widening its applicability and effectiveness in various democratic contexts. Additionally, we propose a mathematical model that can accurately simulate voter preferences when the population contains a mixture of experts and non-experts, and we provide theoretical bounds on the sample complexity in this setup. You can find the full paper here.
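
For illustration, here is a minimal sketch of the basic (binary-choice) Surprisingly Popular rule. It is not the partial-preference mechanism from the paper; the variable names and toy data below are my own simplification.

```python
from collections import Counter

def surprisingly_popular(votes, predictions):
    """Basic binary Surprisingly Popular rule (illustrative sketch).

    votes       -- each voter's own answer, e.g. "A" or "B"
    predictions -- each voter's predicted fraction of the population
                   voting "A" (a float in [0, 1])
    """
    # Actual share of voters choosing "A"
    actual_a = Counter(votes)["A"] / len(votes)
    # Average predicted share of voters choosing "A"
    predicted_a = sum(predictions) / len(predictions)
    # The answer that is more popular than voters predicted wins
    return "A" if actual_a > predicted_a else "B"

# Toy example: experts are a minority and know "B" is correct,
# yet everyone expects "A" to dominate the vote.
votes = ["A", "A", "A", "B", "B"]
predictions = [0.9, 0.8, 0.85, 0.7, 0.75]
print(surprisingly_popular(votes, predictions))  # -> "B"
```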

Keywords: Multi-agent systems, Surprisingly Popular Voting, Partial Preference Profiles, Algorithmic Game Theory, Truth Recovery.

Our work has been accepted at NeurIPS 2024.

Detection of Breast Cancer Based on Mammograms with Metadata Integration

(Jan 2023 - May 2023)

Collaborators: Irfan Tekdir, Dept. of Economics, Pennsylvania State University, and Akshaya Jayant Patil, Dept. of CSE, Pennsylvania State University.

Supervisor: Dr. C. Lee Giles

Conducted an empirical research project using prominent CNN models (ResNet-50, ResNet-101, DenseNet-121) and attention-based models (DeiT, DeiT-III, PatchConvnet) to predict breast cancer from mammogram images in the Kaggle RSNA Mammogram dataset. The distinguishing feature of our approach was the integration of image metadata into the models' training process. Results showed that incorporating metadata notably improved the AUC-ROC score across all models and enhanced generalization, pointing towards a viable avenue for advancing medical diagnostic AI. You can find the full report here.
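
As an illustration of the metadata-integration idea, the PyTorch-style sketch below fuses tabular metadata with CNN image features by concatenation before the classification head. The feature dimensions, metadata size, and fusion-by-concatenation detail are assumptions for illustration, not the exact architecture used in the project.

```python
import torch
import torch.nn as nn
from torchvision import models

class MammogramWithMetadata(nn.Module):
    """Illustrative fusion model: CNN image features + tabular metadata."""

    def __init__(self, num_metadata_features=8):
        super().__init__()
        backbone = models.resnet50(weights=None)   # image encoder
        backbone.fc = nn.Identity()                # expose 2048-d features
        self.backbone = backbone
        self.meta_mlp = nn.Sequential(             # embed the metadata
            nn.Linear(num_metadata_features, 32), nn.ReLU()
        )
        self.head = nn.Linear(2048 + 32, 1)        # binary cancer logit

    def forward(self, image, metadata):
        img_feat = self.backbone(image)            # (batch, 2048)
        meta_feat = self.meta_mlp(metadata)        # (batch, 32)
        fused = torch.cat([img_feat, meta_feat], dim=1)
        return self.head(fused)

# Usage sketch with dummy tensors
model = MammogramWithMetadata()
logits = model(torch.randn(2, 3, 224, 224), torch.randn(2, 8))
```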

Keywords: CNN, Attention-Based Models, Breast Cancer Prediction, Mammogram Image Analysis, Metadata Integration, Deep Learning.

Analysis of Human Gait as a Predictor of Cognitive State

Thesis Advisor: Dr. Anup Nandy, CSE Dept, NIT Rourkela

(Aug 2019 - May 2020)

In my undergraduate thesis, I examined the potential correlation between human cognitive states and gait. Data was gathered across six distinct cognitive states using eight Microsoft Kinect V2 sensors, an Emotiv EPOC EEG headset, and inertial measurement units. Modeling the temporal EEG and gait data with time-series Multivariate Adaptive Regression Splines revealed a 95.56% correlation between cognitive states and gait variations, suggesting a dependency between emotional states and gait.
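
A minimal sketch of the modeling step is shown below. It assumes the py-earth implementation of MARS and synthetic stand-ins for the EEG and gait features; the actual feature pipeline and data dimensions from the thesis are not reproduced.

```python
import numpy as np
from pyearth import Earth   # py-earth implementation of MARS

rng = np.random.default_rng(0)

# Synthetic stand-ins for time-aligned features (assumed shapes)
eeg_features = rng.normal(size=(500, 14))    # e.g. 14 EEG channels
gait_signal = rng.normal(size=500)           # e.g. a stride-length series

# Fit a MARS model of gait as a function of the EEG features
mars = Earth(max_degree=2)
mars.fit(eeg_features, gait_signal)
predicted_gait = mars.predict(eeg_features)

# Correlate the MARS-predicted gait with the observed gait signal
corr = np.corrcoef(predicted_gait, gait_signal)[0, 1]
print(f"correlation: {corr:.3f}")
```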

Keywords: Cognitive States, Human Gait, Kinect V2 Sensors, EEG Data, Time-Series Analysis, Multivariate Adaptive Regression Splines, Correlation Analysis, Biomechanics.

Camera Calibration Using Multiple Microsoft Kinect Sensors

(May 2018 - Jan 2020)

Two camera calibration techniques were proposed for a setup of eight Kinect v2.0 sensors: a six-step pairwise method and a novel octagonal model for simultaneous single-camera calibration. These methods achieved low reprojection errors of 0.63 pixels and 0.27 pixels, respectively, outperforming existing calibration techniques when evaluated on metrics such as root mean square error and reprojection error. Our work lays a foundation for improved camera calibration practices, ensuring better measurement accuracy in applications that use multiple Kinect sensors. You can find the full paper here.
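
The reprojection-error metric used to evaluate the calibrations can be computed as in the short OpenCV sketch below. It assumes checkerboard corner detections are already available and illustrates only the evaluation step, not the pairwise or octagonal procedures themselves.

```python
import numpy as np
import cv2

def calibrate_and_reprojection_error(object_points, image_points, image_size):
    """Calibrate a single camera and return its mean reprojection error.

    object_points -- list of (N, 3) float32 arrays of 3D target points
    image_points  -- list of (N, 1, 2) float32 arrays of detected 2D corners
    image_size    -- (width, height) of the calibration images
    """
    ret, K, dist, rvecs, tvecs = cv2.calibrateCamera(
        object_points, image_points, image_size, None, None
    )
    total_err, total_pts = 0.0, 0
    for obj, img, rvec, tvec in zip(object_points, image_points, rvecs, tvecs):
        # Project the 3D points back into the image with the estimated pose
        projected, _ = cv2.projectPoints(obj, rvec, tvec, K, dist)
        total_err += np.sum(np.linalg.norm(img - projected, axis=2))
        total_pts += len(obj)
    return total_err / total_pts   # mean reprojection error in pixels
```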

Keywords: Kinect v2.0, Camera Calibration, Pairwise Method, Octagonal Model, Reprojection Error, Measurement Accuracy, Multiple Kinect Applications.

Our work was accepted at the CVIP 2021 conference and published in Springer's CCIS series.


Dynamic Hand Gesture-Based Password Recognition System

(Jan 2019 - May 2019)

Developed a gesture recognition system using a low-cost webcam and a Multilayer Perceptron (MLP) for feature extraction and gesture classification. The model classified nine hand signs with an accuracy of 96.16%, surpassing previous studies that achieved 86.5% for three hand signs and 96% for a single sign. Additionally, a novel password recognition system was proposed, showcasing an innovative application of gesture technology and enhancing secure access through physical user interaction. You can find the full paper here.
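
A minimal sketch of the classification stage is shown below, with a scikit-learn MLP standing in for the trained network. The random feature matrix is a placeholder for real gesture features, and the layer sizes are assumptions; the feature-extraction pipeline from the paper is not reproduced.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.metrics import accuracy_score

# Synthetic stand-in: 900 gesture samples, 30-d feature vectors, 9 sign classes
rng = np.random.default_rng(0)
X = rng.normal(size=(900, 30))
y = rng.integers(0, 9, size=900)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=0
)

# Train the MLP classifier and report held-out accuracy
mlp = MLPClassifier(hidden_layer_sizes=(64, 32), max_iter=500, random_state=0)
mlp.fit(X_train, y_train)
print("accuracy:", accuracy_score(y_test, mlp.predict(X_test)))
```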

Keywords: Gesture Recognition, Multilayer Perceptron, Feature Extraction, Hand Sign Classification, Password Recognition System, Human-Computer Interaction, Machine Learning, Security.

Our work was accepted at the ICCIS 2019 conference and published in Springer's LNNS series.

Facial Detection and Tracking System

(Aug 2018 - Dec 2018)

Product Development Laboratory Project

Supervisor: Dr. Sujata Mohanty, CSE Dept, NIT Rourkela

Implemented a real-time face tracking system using MATLAB and Arduino. The system uses the Viola-Jones algorithm to detect and localize faces in a video stream and directs the camera's position via Arduino-controlled servo motors, keeping the face centered within the frame. This project demonstrates a seamless integration of computer vision with physical actuation, with potential applications in interactive technology and security.
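
A rough Python/OpenCV equivalent of the detection-and-centering loop is sketched below. The original was implemented in MATLAB; here the Arduino commands are sent with pyserial over an assumed port, and the one-byte pan protocol is hypothetical.

```python
import cv2
import serial   # pyserial, for sending servo commands to the Arduino

cap = cv2.VideoCapture(0)
arduino = serial.Serial("/dev/ttyUSB0", 9600)   # assumed Arduino port
detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
)  # Viola-Jones cascade bundled with OpenCV

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    if len(faces) > 0:
        x, y, w, h = faces[0]
        face_cx = x + w // 2
        frame_cx = frame.shape[1] // 2
        # Nudge the pan servo toward the face (hypothetical 1-byte protocol)
        if face_cx < frame_cx - 20:
            arduino.write(b"L")   # step left
        elif face_cx > frame_cx + 20:
            arduino.write(b"R")   # step right
    if cv2.waitKey(1) == 27:      # Esc to quit
        break

cap.release()
arduino.close()
```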

Keywords: Arduino, Viola-Jones Algorithm, Servo Motor Control, Computer Vision, Face Detection, Face Tracking.