Evaluation Rubric

Teams will be evaluated on:

  1. Model performance

  2. Presentation of methods and results

Presentations must be no more than 5 minutes.

 

A rubric for the final team score is provided below.

 

Judging Criteria

Model Performance
Weight: 50% (Max: 50 points)
  • Accuracy on real test data (e.g., mAP, precision/recall, IoU): 1-30 points
  • Performance consistency across object classes (avoiding overfitting): 1-15 points
  • Effective use of synthetic data to generalize: 1-5 points
Actual Score: ______ / 50 points

Presentation of Solution
Weight: 25% (Max: 25 points)
  • Problem and approach were clearly defined: 1-5 points
  • Presentation was structured and easy to follow: 1-5 points
  • Visuals effectively explained challenges and results: 1-5 points
  • Delivery was confident and engaging: 1-5 points
  • Presentation was within the allotted time limit (5 min): 1-5 points
Actual Score: ______ / 25 points

Model Interpretability and Insights
Weight: 15% (Max: 15 points)
  • Interpretation of model predictions and errors: 1-5 points
  • Identified key performance bottlenecks: 1-5 points
  • Addressed challenges of domain adaptation: 1-5 points
Actual Score: ______ / 15 points

Collaboration and Teamwork
Weight: 10% (Max: 10 points)
  • Effective division of tasks and collaboration: 1-5 points
  • Problem-solving abilities under pressure: 1-3 points
  • Adaptability to new information or constraints: 1-2 points
Actual Score: ______ / 10 points

Total Score

Maximum Possible Score: 100 points
Actual Score: ______ / 100 points
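
For reference, the Model Performance criteria above mention mAP, precision/recall, and IoU. The sketch below is a minimal, illustrative Python example of how IoU and precision/recall are commonly computed; it assumes axis-aligned boxes in [x1, y1, x2, y2] format and an assumed 0.5 IoU match threshold, and it is not the official scoring code.

```python
def iou(box_a, box_b):
    """Intersection over Union for two axis-aligned boxes in [x1, y1, x2, y2] format."""
    # Coordinates of the intersection rectangle
    x1 = max(box_a[0], box_b[0])
    y1 = max(box_a[1], box_b[1])
    x2 = min(box_a[2], box_b[2])
    y2 = min(box_a[3], box_b[3])
    inter = max(0.0, x2 - x1) * max(0.0, y2 - y1)
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    union = area_a + area_b - inter
    return inter / union if union > 0 else 0.0


def precision_recall(tp, fp, fn):
    """Precision and recall from true-positive, false-positive, and false-negative counts."""
    precision = tp / (tp + fp) if (tp + fp) else 0.0
    recall = tp / (tp + fn) if (tp + fn) else 0.0
    return precision, recall


# Example usage: a detection is typically counted as a true positive when
# IoU >= 0.50 (illustrative threshold, not the official cutoff).
print(iou([0, 0, 10, 10], [5, 5, 15, 15]))   # ~0.143
print(precision_recall(tp=8, fp=2, fn=4))    # (0.8, ~0.667)
```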
