This is the official PyTorch implementation of the following publication:
Expert Knowledge-Guided Decision Calibration for Accurate Fine-Grained Tree Species Classification
Chen Long, Dian Chen, Ruifei Ding, Zhe Chen, Zhen Dong, Bisheng Yang
Under Review
Paper
Expert Knowledge-Guided Decision Calibration for Accurate Fine-Grained Tree Species Classification
Abstract: Accurate fine-grained tree species classification is critical for forest inventory and biodiversity monitoring. Existing methods predominantly focus on designing complex architectures to fit local data distributions. However, they often overlook the long-tailed distributions and high inter-class similarity inherent in limited data, and therefore struggle to distinguish between few-shot or confusing categories. In the process of knowledge dissemination in the human world, individuals actively seek expert assistance to transcend the limitations of local thinking. Inspired by this, we introduce an external "Domain Expert" and propose an Expert Knowledge-Guided Classification Decision Calibration Network (EKDC-Net) to overcome these challenges. Our framework addresses two core issues: expert knowledge extraction and utilization. Specifically, we first develop a Local Prior Guided Knowledge Extraction Module (LPKEM). By leveraging Class Activation Map (CAM) analysis, LPKEM guides the domain expert to focus exclusively on discriminative features essential for classification. Subsequently, to effectively integrate this knowledge, we design an Uncertainty-Guided Decision Calibration Module (UDCM). This module dynamically corrects the local model's decisions by considering both overall category uncertainty and instance-level prediction uncertainty. Furthermore, we present a large-scale classification dataset covering 102 tree species, named CU-Tree102, to address the scarce diversity of current benchmarks. Experiments on three benchmark datasets demonstrate that our approach achieves state-of-the-art performance. Crucially, as a lightweight plug-and-play module, EKDC-Net improves backbone accuracy by **6.42%** and precision by **11.46%** using only **0.08M** additional learnable parameters.
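To make the calibration idea concrete, here is a minimal illustrative sketch (our own simplification, not the paper's exact UDCM formulation): blend the local model's prediction with the expert's prediction, weighting the expert more heavily when the local model's predictive entropy is high. All function names and the entropy-based weighting are assumptions for illustration.

```python
import numpy as np

def softmax(z):
    """Numerically stable softmax over a 1-D logit vector."""
    e = np.exp(z - np.max(z))
    return e / e.sum()

def calibrated_probs(local_logits, expert_logits):
    """Blend local and expert predictions, weighting the expert by the
    local model's normalized predictive entropy.

    This is an illustrative sketch only; the actual UDCM combines
    category-level and instance-level uncertainty as described in the paper.
    """
    p_local = softmax(np.asarray(local_logits, dtype=float))
    p_expert = softmax(np.asarray(expert_logits, dtype=float))
    n = p_local.size
    # Normalized entropy in [0, 1]; 1 means the local model is maximally uncertain.
    entropy = -np.sum(p_local * np.log(p_local + 1e-12))
    w = entropy / np.log(n)
    return (1.0 - w) * p_local + w * p_expert
```

When the local model is confident, its own prediction dominates; when it is uncertain, the expert's prediction takes over.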
- 2026-01-26: Code, Preprint paper are available! 🎉
The code has been trained on:
- Ubuntu 22.04
- CUDA 11.8
- Python 3.9.18
- Pytorch 2.1.2
- GeForce RTX 4090 $\times$ 1
- First, create the conda environment and activate it:

```shell
conda env create -f environment.yaml
conda activate treecls
```
We used two datasets for training and three datasets for evaluation.
Our proposed CU-Tree102 dataset is available at Google Drive.
We reprocessed WHU-RSTree to obtain this classification dataset, available at Google Drive.
The Jekyll dataset is available at the Jekyll website.
You can download the pretrained model from Google Drive and put it in the records/ folder.
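The expected dataset layout is not documented here; as an assumption (please check the repository's dataloader before relying on it), an ImageFolder-style arrangement with one subfolder per species would look like:

```
CU-Tree102/
├── train/
│   ├── species_001/
│   │   ├── 0001.jpg
│   │   └── ...
│   └── ...
└── val/
    ├── species_001/
    └── ...
```

Under this assumed layout, "val_root" in the config would point at the val/ directory.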
To train EKDC-Net, first prepare the dataset and replace "val_root" with your data path. Then run:
```shell
$ python main.py --c config/whole_pipeline.yaml  # for CGL baseline
```

To evaluate EKDC-Net on the CU-Tree102 dataset, use the following command:
```shell
$ python eval.py --pr records/whole_pipeline
```

We sincerely thank the excellent projects:
