- This topic has 17 replies, 7 voices, and was last updated 5 years, 3 months ago by samuel_admin.
August 1, 2019 at 1:18 pm | #756 | 20Years (Member)
Dear Kurt, Is it possible to post the ground truth crack lengths of validation specimens now? Thanks!
August 4, 2019 at 10:31 pm | #760 | kjd02002 (Moderator)
Dear 20Years,
We will be sharing measured crack lengths for the validation data within the next few days.
Final team results (scores and rank) will be posted on 18 August 2019; this allows time for the results and preliminary-winner review process.
Please note: preliminary winners will be contacted soon.
August 6, 2019 at 2:28 pm | #764 | 20Years (Member)
Thanks Kurt! Looking forward to it.
August 13, 2019 at 3:35 am | #768 | cranthena (Member)
I wonder if the ground truth / measured crack lengths for the validation data have been posted anywhere, as per the message above?
August 14, 2019 at 3:24 am | #769 | realai (Member)
Dear Kurt,
We have the same question; please post the ground truth crack lengths. Thanks!

August 17, 2019 at 3:10 am | #770 | kjd02002 (Moderator)
Sorry for the delay in posting the actual crack lengths for T7 and T8. These will be posted on a results page soon, but I am posting them here in the interim:
Specimen | Cycle | Crack length (mm)
T7 | 36001 | 0
T7 | 40167 | 0
T7 | 44054 | 2.07
T7 | 47022 | 3.14
T7 | 49026 | 3.56
T7 | 51030 | 4.13
T7 | 53019 | 5.05
T7 | 55031 | 7.22
T8 | 40000 | 0
T8 | 50000 | 0
T8 | 70000 | 0
T8 | 74883 | 1.94
T8 | 76931 | 2.5
T8 | 89237 | 3.71
T8 | 92315 | 3.88
T8 | 96475 | 4.61
T8 | 98492 | 4.96
T8 | 100774 | 5.52

August 17, 2019 at 11:08 am | #772 | 20Years (Member)
Thanks Kurt!
When the final results are posted, it would also be interesting to see the scores achieved by each team on the two validation datasets T7 and T8 separately.
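For anyone wanting to check their own predictions against the measured crack lengths posted above, a minimal sketch is below. Note that the plain RMSE used here is only an illustration; the actual challenge scoring formula is not given in this thread.

```python
import math

# Ground-truth crack lengths (mm) by cycle count, as posted for T7 and T8.
GROUND_TRUTH = {
    "T7": {36001: 0.0, 40167: 0.0, 44054: 2.07, 47022: 3.14,
           49026: 3.56, 51030: 4.13, 53019: 5.05, 55031: 7.22},
    "T8": {40000: 0.0, 50000: 0.0, 70000: 0.0, 74883: 1.94,
           76931: 2.5, 89237: 3.71, 92315: 3.88, 96475: 4.61,
           98492: 4.96, 100774: 5.52},
}

def rmse(predictions, specimen):
    """Root-mean-square error of predicted crack lengths (mm),
    evaluated at the measured cycle counts of one specimen."""
    truth = GROUND_TRUTH[specimen]
    sq_errs = [(predictions[cycle] - actual) ** 2
               for cycle, actual in truth.items()]
    return math.sqrt(sum(sq_errs) / len(sq_errs))

# A perfect prediction scores 0.
print(rmse(dict(GROUND_TRUTH["T7"]), "T7"))  # 0.0
```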
August 20, 2019 at 8:37 pm | #773 | tsinghua_wcy (Member)
Thanks Kurt!
According to the posted final results, the ground truth may have been leaked (publicly) beforehand. Reproducibility should be double-checked before the winners are announced.
August 20, 2019 at 10:32 pm | #774 | kjd02002 (Moderator)
Dear tsinghua_wcy,
Thank you for your message. I appreciate that you’ve brought this to our attention in the interest of a fair competition.
Please note that preliminary winners of the data challenge must submit explanations of their models, as well as journal papers, before the committee approves the final competition winners. Therefore, submitting the predictions that yield the best score will not by itself guarantee a team a top-3 position in the competition.
We have been working to validate the approaches of all potential winners over the past 2 weeks, which is why final results have not yet been posted.
August 20, 2019 at 11:39 pm | #775 | tsinghua_wcy (Member)
Thanks Kurt and the committee for your work. Looking forward to seeing the final results!
Some tools you might consider for checking reproducibility, which would be better than manuscripts alone:
1. CodaLab: https://worksheets.codalab.org/
2. Jupyter Notebook

August 26, 2019 at 2:00 pm | #776 | SyRRA (Member)
Dear Kurt,
We’re still waiting to see the final rankings, just to know how well we did compared to others.
You mentioned August 18th, and 8 days have passed. When do you think you can post the final results?
Best,
August 27, 2019 at 11:06 am | #780 | 20Years (Member)
+?
How about posting the original scores/rankings first, if more time is still needed to evaluate the winners? Thanks!

August 28, 2019 at 12:12 am | #781 | kjd02002 (Moderator)
Final results are still pending validation/review. However, here are preliminary results, which will be posted to the website soon:
RANK | TEAM | SCORE
1-6 | Angler, KoreanAvengers, pyEstimate, Seoulg, SLUNG, ValyrianAluminumers | Not Yet Announced (alphabetical order)
7 | trybest | 24.39
8 | tsinghua_wcy | 29.72
9 | HIAI | 31.61
10 | LDM | 34.97
11 | PHiMa | 43.46
12 | TeamBlue | 46.32
13 | HITF | 48.17
14 | JCHA | 52.54
15 | ChoochooTrain | 53.83
16 | NaN | 55.94
17 | realai | 58.95
18 | beat_real | 59.00
19 | 20Years | 72.56
20 | Runtime_Terror | 78.57
21 | CAPE_HM301 | 88.67
22 | RRML | 96.04
23 | SyRRA | 128.86
24 | 553 | 153.86
25 | ACES | 165.68
26 | Cracker | 189.35
27 | FMAKE | 192.39
28 | Xukunv | 194.88
29 | NTS01 | 199.43
30 | cranthena | 239.02
31 | _______ | 243.54
32 | Apostov | 253.86
33 | TeamKawakatsu | 282.46
34 | U-Q | 526.79
35 | ISNDE | 594.61
36 | UTJS-1 | 621.93
37 | ISX-MPO | 755.49
38 | UBC-Okanagan | 1140.69
39 | NukeGrads | 2175.56
40 | NUTN_DSG_TW | 2381.42
41 | GTC | 5513.57
42 | TWT | 6346.47
43 | dataking | 6346.47
44 | Arundites | 16571.80
45 | LIACS | 86762.72
46 | TPRML | 10^7
47 | Mizzou | 10^19
48 | DSBIGINNER333 | > 10^100
- This reply was modified 4 years, 1 month ago by samuel_admin.
August 29, 2019 at 11:34 pm | #784 | realai (Member)
What about asking everyone for a self-contained Jupyter notebook? Then we could have a ranking in which all teams have released their code. Thanks!
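A self-contained, seeded script is the kind of artifact a committee could rerun to verify reproducibility. A minimal sketch follows; the "model" here is only a stand-in (an ordinary least-squares line fit to the posted T8 crack-growth points), not any team's actual method.

```python
import random

def fit_line(points):
    """Ordinary least-squares fit y = a*x + b to (cycle, length) points."""
    n = len(points)
    sx = sum(x for x, _ in points)
    sy = sum(y for _, y in points)
    sxx = sum(x * x for x, _ in points)
    sxy = sum(x * y for x, y in points)
    a = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    return a, (sy - a * sx) / n

def run(seed=0):
    # Fix every source of randomness up front so reruns are identical.
    random.seed(seed)
    pts = [(74883, 1.94), (76931, 2.5), (89237, 3.71), (92315, 3.88),
           (96475, 4.61), (98492, 4.96), (100774, 5.52)]
    random.shuffle(pts)  # processing order must not change the result
    return fit_line(pts)

# Two independent runs with the same seed must produce identical output.
assert run(0) == run(0)
```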
September 2, 2019 at 3:38 am | #1685 | _______ (Member)
Hello Kurt,
Since not everyone can come to the conference, could there be a way for the other competitors to share their ideas and code? By the way, the email address for the student poster session seems to be unavailable now.
Thank you
- The forum ‘2019 Data Challenge: General Competition Questions’ is closed to new topics and replies.