
Sharat Agarwal

Ph.D. Candidate

IIIT Delhi, India


CV | Google Scholar | GitHub


I am a final-year Ph.D. candidate advised by Prof. Saket Anand and Prof. Chetan Arora.

My research interests lie at the intersection of Computer Vision and Deep Learning, with a focus on Active Learning, Data Fairness, and Domain Adaptation. I am broadly interested in designing data-efficient learning systems that can perform effectively under limited supervision. Toward this goal, my work explores the contextual richness of visual data and leverages model uncertainty to guide sample selection and improve model generalization. A key emphasis of my research is on reducing the reliance on large-scale annotated datasets by identifying and utilizing the most informative data.

I am currently a Computer Vision Consultant at The Habitat Trust, where I collaborate closely with ecologists and conservation practitioners to co-develop tools for wildlife monitoring and data annotation. My work involves curating large-scale Indian species datasets, building context-aware vision models for species detection and classification, and designing Active Learning and Human-in-the-Loop (HITL) pipelines to address sparse and imbalanced ecological data. These efforts aim to accelerate annotation workflows while improving the adaptability and robustness of deployed models in real-world conservation settings.

I am actively engaged in solving novel research challenges at the intersection of biodiversity, representation learning, and data-efficient training. I envision building inclusive, human-centered AI systems that support environmental resilience and sustainability.

I am on the job market. Please reach out if you think I could be a good fit for your team.

Contact

  • sharata [at] iiitd.ac.in

    sharat29ag [at] gmail.com

  • LAB B413, R&D Block, IIIT-Delhi, Delhi, 110020

Education

  • B.Tech in CSE, 2016

    GEU, Dehradun, India

Publications

  • NCAL: Neural Collapse-Guided Active Learning for Robust and Generalizable Representations

    Sharat Agarwal, Atharv Goel, Saket Anand, Chetan Arora.

    Under Submission

  • Reducing Annotation Effort by Identifying and Labeling Contextually Diverse Classes for Semantic Segmentation Under Domain Shift

    Sharat Agarwal, Saket Anand, Chetan Arora.

    WACV 2023

    [ Paper ] [ Code ]

  • Does Data Repair Lead to Fair Models? Curating Contextually Fair Data to Reduce Model Bias

    Sharat Agarwal, Sumanyu Muku, Chetan Arora, Saket Anand.

    WACV 2022

    [ Paper ] [ Code ] [ Project Page ]

  • Contextual Diversity For Active Learning

    Sharat Agarwal, Himanshu Singh, Saket Anand, Chetan Arora.

    ECCV 2020

    [ Paper ] [ Code ] [ Slides ]

  • Improved Dynamic Time Warping Based Approach for Activity Recognition

    Vikas Tripathi, Sharat Agarwal, A. Mittal, D. Gangodkar.

    FICTA 2017

  • Modified Dense Trajectory for Real Time Action Recognition

    Vikas Tripathi, Piyush Bhatt, Sharat Agarwal, Monika Semwal.

    IJCTA 2017

Academic Projects

  • Domain Adaptation for Semantic Segmentation. [ Slides ] Advisors: Dr. Saket Anand and Dr. Chetan Arora
  • Using Image Processing for Detecting People with Down Syndrome. [ Report ] Advisor: Dr. A. V. Subramanyam
  • Pairwise Confusion Loss for Semantic Segmentation. [ Slides ] Advisor: Dr. Koteswar Rao
  • Depression Detection Using Tweets. [ Slides ] Advisor: Dr. Tanmoy Chakraborty
  • Improved Study of Heart Disease Prediction Using Data Mining Classification Techniques. [ Slides ] Advisor: Dr. G. P. S. Raghava
  • Quora Question Duplicate Detection. [ Report ] Advisor: Dr. Saket Anand
  • Driver Drowsiness Detection on Long Videos. [ Slides ] Advisor: Dr. Saket Anand

Professional Services

  • Journal Reviewer: TPAMI (2022)
  • Conference Reviewer: ICCV 2023; ECCV 2022, 2024; CVPR 2022, 2023; WACV 2022, 2023, 2024
  • Technical Program Committee: CVAD Workshop, COMSNETS 2023, 2024
  • Committee Member, ICVGIP Data Challenge 2021
  • DL Tutorial, AI Assisted Data Analytics (AIDA) 2020, IIITD
  • ML Tutorial, Economics Workshop, 2019, IIITD

Academic Services

Teaching Assistant