Duckietown Challenges

Job 587

Job ID: 587
submission: 54
user: Bea Baselines 🐤
user label: baseline-RL-sim-pytorch
challenge: aido-LFV_multi-sim-validation
step: sim-1of4
status: success
up to date: no (the challenge has been changed since this job ran)
evaluator: gpu-staging-spot-0-04
date started:
date completed:
duration: 0:19:22
message:
survival_time_median: 59.99999999999873
in-drivable-lane_median: 20.549999999999493
driven_lanedir_consec_median: -4.088788303637268
deviation-center-line_median: 3.048481102869623


other stats

deviation-center-line_max: 5.224761926851118
deviation-center-line_mean: 3.351803550967361
deviation-center-line_min: 2.0854900712790796
deviation-heading_max: 32.84335340649624
deviation-heading_mean: 29.670365247413805
deviation-heading_median: 30.16499435436135
deviation-heading_min: 25.508118874436285
distance-from-start_max: 0.21935620855780744
distance-from-start_mean: 0.19394557932581705
distance-from-start_median: 0.19493835525253225
distance-from-start_min: 0.16654939824039608
driven_any_max: 14.903987163779872
driven_any_mean: 13.440133792182188
driven_any_median: 13.217096504889462
driven_any_min: 12.422354995169956
driven_lanedir_consec_max: -3.4659575894049484
driven_lanedir_consec_mean: -4.087459527269676
driven_lanedir_consec_min: -4.706303912399219
driven_lanedir_max: -3.9860606967757257
driven_lanedir_mean: -5.651747347437626
driven_lanedir_median: -5.092477587697946
driven_lanedir_min: -8.435973517578889
in-drivable-lane_max: 28.49999999999934
in-drivable-lane_mean: 20.287499999999536
in-drivable-lane_min: 11.549999999999828
per-episodes details:

autolab-000-ego0:
  driven_any: 13.324342687278648
  survival_time: 59.99999999999873
  driven_lanedir: -4.824755239464533
  in-drivable-lane: 28.49999999999934
  deviation-heading: 25.508118874436285
  distance-from-start: 0.16654939824039608
  deviation-center-line: 2.180407458123194
  driven_lanedir_consec: -4.191515910498811

autolab-000-ego1:
  driven_any: 13.109850322500272
  survival_time: 59.99999999999873
  driven_lanedir: -3.9860606967757257
  in-drivable-lane: 26.849999999999408
  deviation-heading: 28.456721097036105
  distance-from-start: 0.21935620855780744
  deviation-center-line: 2.0854900712790796
  driven_lanedir_consec: -3.9860606967757257

autolab-000-ego2:
  driven_any: 12.422354995169956
  survival_time: 59.99999999999873
  driven_lanedir: -5.360199935931358
  in-drivable-lane: 14.249999999999575
  deviation-heading: 31.873267611686597
  distance-from-start: 0.18814186249738343
  deviation-center-line: 3.9165547476160527
  driven_lanedir_consec: -3.4659575894049484

autolab-000-ego3:
  driven_any: 14.903987163779872
  survival_time: 59.99999999999873
  driven_lanedir: -8.435973517578889
  in-drivable-lane: 11.549999999999828
  deviation-heading: 32.84335340649624
  distance-from-start: 0.201734848007681
  deviation-center-line: 5.224761926851118
  driven_lanedir_consec: -4.706303912399219
simulation-passed: 1
survival_time_max: 59.99999999999873
survival_time_mean: 59.99999999999873
survival_time_min: 59.99999999999873
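The summary rows above (median/mean/max/min per metric) appear to be aggregates over the four episodes listed under per-episodes. A minimal sketch reproducing them with the standard-library `statistics` module, using two of the metrics for brevity (the variable names here are illustrative, not part of the Duckietown API):

```python
import statistics

# Per-episode values copied from the job's per-episodes details
# (abridged to two metrics for brevity).
details = {
    "autolab-000-ego0": {"survival_time": 59.99999999999873,
                         "driven_lanedir_consec": -4.191515910498811},
    "autolab-000-ego1": {"survival_time": 59.99999999999873,
                         "driven_lanedir_consec": -3.9860606967757257},
    "autolab-000-ego2": {"survival_time": 59.99999999999873,
                         "driven_lanedir_consec": -3.4659575894049484},
    "autolab-000-ego3": {"survival_time": 59.99999999999873,
                         "driven_lanedir_consec": -4.706303912399219},
}

# Collect each metric's values across the four episodes
per_metric = {}
for episode_stats in details.values():
    for name, value in episode_stats.items():
        per_metric.setdefault(name, []).append(value)

# Summary rows: median / mean / max / min per metric
summary = {
    name: {
        "median": statistics.median(values),
        "mean": statistics.mean(values),
        "max": max(values),
        "min": min(values),
    }
    for name, values in per_metric.items()
}

# With four episodes the median is the average of the two middle values;
# the page reports driven_lanedir_consec_median: -4.088788303637268
print(summary["driven_lanedir_consec"]["median"])
```

Note that with an even number of episodes, `statistics.median` averages the two middle values, which is consistent with the reported `driven_lanedir_consec_median` falling between the ego0 and ego1 values.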

Highlights


autolab-000

Artifacts

The artifacts are hidden.

Container logs

The logs are hidden.