
Commit 57b9fc3

Author: Bojan Misic (committed)

Modify baseline test files for multiclass classification with Iris dataset and LightGBM

1 parent 67072c5

12 files changed: +316 -316 lines

test/BaselineOutput/SingleDebug/LightGBMMC/LightGBMMC-CV-iris.key-out.txt

Lines changed: 4 additions & 4 deletions
@@ -21,8 +21,8 @@ TRUTH ||========================
 Precision ||1.0000 |0.9310 |0.8966 |
 Accuracy(micro-avg): 0.936709
 Accuracy(macro-avg): 0.942857
-Log-loss: 0.312681
-Log-loss reduction: 71.248182
+Log-loss: 0.312759
+Log-loss reduction: 71.240938

 Confusion table
 ||========================
@@ -42,8 +42,8 @@ OVERALL RESULTS
 ---------------------------------------
 Accuracy(micro-avg): 0.947228 (0.0105)
 Accuracy(macro-avg): 0.947944 (0.0051)
-Log-loss: 0.253035 (0.0596)
-Log-loss reduction: 76.717466 (5.4693)
+Log-loss: 0.253074 (0.0597)
+Log-loss reduction: 76.713844 (5.4729)

 ---------------------------------------
 Physical memory usage(MB): %Number%
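
Side note on the metric shift in this file: ML.NET reports log-loss reduction as (to my understanding) 100 * (E - L) / E, where L is the model's log-loss and E is the entropy of the test-fold label distribution. The short Python sketch below is an illustrative sanity check only, not part of the baseline tooling; it back-solves the implied prior entropy from both the removed and the added value pairs in the first hunk above.

```python
def implied_prior_entropy(log_loss, reduction_pct):
    # reduction = 100 * (E - L) / E  =>  E = L / (1 - reduction / 100)
    return log_loss / (1.0 - reduction_pct / 100.0)

old_e = implied_prior_entropy(0.312681, 71.248182)  # values from the removed lines
new_e = implied_prior_entropy(0.312759, 71.240938)  # values from the added lines

print(old_e, new_e)  # both come out around 1.0875 nats
assert abs(old_e - new_e) < 1e-4
```

Both pairs imply the same prior entropy (about 1.0875 nats, close to ln 3 ≈ 1.0986 for three roughly balanced Iris classes), so the baseline update reflects a small drift in the model's scores rather than a change in how the metric is computed.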
Lines changed: 1 addition & 1 deletion
@@ -1,4 +1,4 @@
 LightGBMMC
 Accuracy(micro-avg) Accuracy(macro-avg) Log-loss Log-loss reduction /iter /lr /nl /mil /nt Learner Name Train Dataset Test Dataset Results File Run Time Physical Memory Virtual Memory Command Line Settings
-0.947228 0.947944 0.253035 76.71747 10 0.2 20 10 1 LightGBMMC %Data% %Output% 99 0 0 maml.exe CV tr=LightGBMMC{nt=1 iter=10 v=- lr=0.2 mil=10 nl=20} threads=- dout=%Output% loader=Text{col=Label:TX:0 col=Features:1-*} data=%Data% seed=1 xf=Term{col=Label} /iter:10;/lr:0.2;/nl:20;/mil:10;/nt:1
+0.947228 0.947944 0.253074 76.71384 10 0.2 20 10 1 LightGBMMC %Data% %Output% 99 0 0 maml.exe CV tr=LightGBMMC{nt=1 iter=10 v=- lr=0.2 mil=10 nl=20} threads=- dout=%Output% loader=Text{col=Label:TX:0 col=Features:1-*} data=%Data% seed=1 xf=Term{col=Label} /iter:10;/lr:0.2;/nl:20;/mil:10;/nt:1
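
The Settings column above records the trainer arguments: iter (boosting iterations), lr (learning rate), nl (leaves per tree), mil (minimum instances per leaf), and nt (threads). For orientation only, here is a rough stand-alone equivalent using the lightgbm Python package on the scikit-learn copy of Iris; the flag-to-parameter mapping is my assumption, and the resulting probabilities will not match the ML.NET baseline bit-for-bit (different data split, binning, and defaults).

```python
import lightgbm as lgb
from sklearn.datasets import load_iris

X, y = load_iris(return_X_y=True)

params = {
    "objective": "multiclass",
    "num_class": 3,
    "learning_rate": 0.2,    # lr=0.2
    "num_leaves": 20,        # nl=20
    "min_data_in_leaf": 10,  # mil=10
    "num_threads": 1,        # nt=1
    "verbosity": -1,         # v=-
}

# iter=10 boosting rounds; trained on the full dataset here rather than the CV folds used by maml.exe.
booster = lgb.train(params, lgb.Dataset(X, label=y), num_boost_round=10)
probs = booster.predict(X)   # shape (150, 3): per-class probabilities, as in the .key baseline files
```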

test/BaselineOutput/SingleDebug/LightGBMMC/LightGBMMC-CV-iris.key.txt

Lines changed: 74 additions & 74 deletions
@@ -1,83 +1,83 @@
 Instance Label Assigned Log-loss #1 Score #2 Score #3 Score #1 Class #2 Class #3 Class
-5 0 1 0.25328578422472414 0.776246 0.1675262 0.0562277846 0 1 2
-6 0 1 0.12225769559664824 0.8849203 0.0591422766 0.05593741 0 2 1
-8 0 1 0.13903052119099127 0.870201468 0.07016017 0.05963834 0 1 2
-9 0 1 0.13970851121881944 0.8696117 0.07047898 0.0599093363 0 1 2
-10 0 1 0.12269405525649343 0.88453424 0.0593406856 0.0561250672 0 2 1
-11 0 1 0.12269405525649343 0.88453424 0.0593406856 0.0561250672 0 2 1
-18 0 1 0.25328578422472414 0.776246 0.1675262 0.0562277846 0 1 2
-20 0 1 0.25328578422472414 0.776246 0.1675262 0.0562277846 0 1 2
-21 0 1 0.12269405525649343 0.88453424 0.0593406856 0.0561250672 0 2 1
-25 0 1 0.13970851121881944 0.8696117 0.07047898 0.0599093363 0 1 2
-28 0 1 0.12225769559664824 0.8849203 0.0591422766 0.05593741 0 2 1
-31 0 1 0.12269405525649343 0.88453424 0.0593406856 0.0561250672 0 2 1
-32 0 1 0.12269405525649343 0.88453424 0.0593406856 0.0561250672 0 2 1
-35 0 1 0.13903052119099127 0.870201468 0.07016017 0.05963834 0 1 2
-37 0 1 0.13970851121881944 0.8696117 0.07047898 0.0599093363 0 1 2
-40 0 1 0.12225769559664824 0.8849203 0.0591422766 0.05593741 0 2 1
-41 0 1 0.17550956509619134 0.8390294 0.09255582 0.0684148148 0 1 2
-44 0 1 0.25328578422472414 0.776246 0.1675262 0.0562277846 0 1 2
-45 0 1 0.13903052119099127 0.870201468 0.07016017 0.05963834 0 1 2
-46 0 1 0.12269405525649343 0.88453424 0.0593406856 0.0561250672 0 2 1
-48 0 1 0.12269405525649343 0.88453424 0.0593406856 0.0561250672 0 2 1
-50 1 2 0.48031316690941278 0.61858964 0.2931589 0.08825144 1 2 0
-51 1 2 0.18552267596609509 0.83067 0.09896274 0.07036729 1 0 2
-52 1 1.8686139310181762 0.745523036 0.154337436 0.1001395 2 1 0
-54 1 2 0.45819815025419125 0.632422149 0.3149078 0.0526700951 1 2 0
-56 1 2 0.58631345356437459 0.5563746 0.357763946 0.0858614147 1 2 0
-60 1 2 0.54904634529954011 0.5775003 0.363432646 0.0590670444 1 0 2
-63 1 2 0.442888085987238 0.6421791 0.304338247 0.0534826852 1 2 0
-64 1 2 0.14288655580917453 0.8668524 0.06818299 0.06496459 1 2 0
-66 1 2 0.13927185898584951 0.8699915 0.06910439 0.060904108 1 2 0
-68 1 2 0.1475586146516118 0.862811863 0.08110718 0.0560809337 1 2 0
-69 1 2 0.13690026149264065 0.8720572 0.07104707 0.056895718 1 2 0
-70 1 2 0.58631345356437459 0.5563746 0.357763946 0.0858614147 1 2 0
-71 1 2 0.15194427686527462 0.859036148 0.07716796 0.06379592 1 2 0
-72 1 1.4639003870351257 0.712372541 0.231332228 0.0562952235 2 1 0
-73 1 2 0.45819815025419125 0.632422149 0.3149078 0.0526700951 1 2 0
-74 1 2 0.13796619253742226 0.871128142 0.06828712 0.0605847277 1 2 0
-76 1 2 0.45819815025419125 0.632422149 0.3149078 0.0526700951 1 2 0
-77 1 2.0734221020246566 0.815010846 0.1257547 0.05923444 2 1 0
-79 1 2 0.54904634529954011 0.5775003 0.363432646 0.0590670444 1 0 2
-82 1 2 0.13641919697507263 0.8724768 0.07081407 0.0567091331 1 2 0
-88 1 2 0.13407533925580511 0.8745242 0.06425438 0.0612214245 1 2 0
-90 1 2 0.1425659799992052 0.867130339 0.0762954 0.0565742739 1 2 0
-91 1 2 0.442888085987238 0.6421791 0.304338247 0.0534826852 1 2 0
-92 1 2 0.13641919697507263 0.8724768 0.07081407 0.0567091331 1 2 0
-93 1 2 0.54904634529954011 0.5775003 0.363432646 0.0590670444 1 0 2
-95 1 2 0.13407533925580511 0.8745242 0.06425438 0.0612214245 1 2 0
-96 1 2 0.13407533925580511 0.8745242 0.06425438 0.0612214245 1 2 0
-97 1 2 0.13796619253742226 0.871128142 0.06828712 0.0605847277 1 2 0
-98 1 2 0.54904634529954011 0.5775003 0.363432646 0.0590670444 1 0 2
-99 1 2 0.13879074815854195 0.870410144 0.06865643 0.06093342 1 2 0
-100 2 0.16223550400064654 0.850240946 0.0928473249 0.0569117554 2 0 1
-102 2 0.12282263476045074 0.8844205 0.0591996 0.056379877 2 1 0
-104 2 0.12282263476045074 0.8844205 0.0591996 0.056379877 2 1 0
-105 2 0.12282263476045074 0.8844205 0.0591996 0.056379877 2 1 0
-106 2 2 2.3434392875794119 0.8476237 0.09599691 0.05637939 1 2 0
-108 2 0.22657594234978759 0.7972588 0.1479769 0.0547643229 2 1 0
-109 2 0.16223550400064654 0.850240946 0.0928473249 0.0569117554 2 0 1
-111 2 0.177848875720656 0.8370689 0.108788572 0.0541424938 2 1 0
+5 0 1 0.25328601458204941 0.776245832 0.167526156 0.0562280267 0 1 2
+6 0 1 0.12225796502047259 0.884920061 0.05914253 0.0559373945 0 2 1
+8 0 1 0.13903079517192601 0.87020123 0.07016016 0.0596385933 0 1 2
+9 0 1 0.13970878538557358 0.869611442 0.07047896 0.05990959 0 1 2
+10 0 1 0.12269432479790915 0.884534 0.05934094 0.05612505 0 2 1
+11 0 1 0.12269432479790915 0.884534 0.05934094 0.05612505 0 2 1
+18 0 1 0.25328601458204941 0.776245832 0.167526156 0.0562280267 0 1 2
+20 0 1 0.25328601458204941 0.776245832 0.167526156 0.0562280267 0 1 2
+21 0 1 0.12269432479790915 0.884534 0.05934094 0.05612505 0 2 1
+25 0 1 0.13970878538557358 0.869611442 0.07047896 0.05990959 0 1 2
+28 0 1 0.12225796502047259 0.884920061 0.05914253 0.0559373945 0 2 1
+31 0 1 0.12269432479790915 0.884534 0.05934094 0.05612505 0 2 1
+32 0 1 0.12269432479790915 0.884534 0.05934094 0.05612505 0 2 1
+35 0 1 0.13903079517192601 0.87020123 0.07016016 0.0596385933 0 1 2
+37 0 1 0.13970878538557358 0.869611442 0.07047896 0.05990959 0 1 2
+40 0 1 0.12225796502047259 0.884920061 0.05914253 0.0559373945 0 2 1
+41 0 1 0.17550530270540357 0.839032948 0.09255621 0.068410866 0 1 2
+44 0 1 0.25328601458204941 0.776245832 0.167526156 0.0562280267 0 1 2
+45 0 1 0.13903079517192601 0.87020123 0.07016016 0.0596385933 0 1 2
+46 0 1 0.12269432479790915 0.884534 0.05934094 0.05612505 0 2 1
+48 0 1 0.12269432479790915 0.884534 0.05934094 0.05612505 0 2 1
+50 1 2 0.48031528673730972 0.6185883 0.2931604 0.0882512555 1 2 0
+51 1 2 0.1855230347406718 0.8306697 0.09896271 0.0703676 1 0 2
+52 1 1.8686158620047173 0.7455236 0.154337138 0.100139305 2 1 0
+54 1 2 0.45820041221338154 0.6324207 0.3149093 0.05266998 1 2 0
+56 1 2 0.58631409634709242 0.556374252 0.357764423 0.08586135 1 2 0
+60 1 2 0.5490426296940486 0.5775024 0.363434 0.0590636 1 0 2
+63 1 2 0.44289031357940112 0.642177641 0.3043398 0.053482566 1 2 0
+64 1 2 0.14288683084862894 0.866852164 0.06818328 0.06496458 1 2 0
+66 1 2 0.13881478450725465 0.8703892 0.0686788 0.0609319545 1 2 0
+68 1 2 0.14755364077036032 0.862816155 0.0811026245 0.0560812131 1 2 0
+69 1 2 0.13689581878715465 0.8720611 0.07104298 0.0568959676 1 2 0
+70 1 2 0.58631409634709242 0.556374252 0.357764423 0.08586135 1 2 0
+71 1 2 0.15245577872775815 0.858596861 0.07763986 0.0637633 1 2 0
+72 1 1.4638898231048283 0.7123695 0.231334671 0.05629582 2 1 0
+73 1 2 0.45820041221338154 0.6324207 0.3149093 0.05266998 1 2 0
+74 1 2 0.13842123633682313 0.870731831 0.068711035 0.06055716 1 2 0
+76 1 2 0.45820041221338154 0.6324207 0.3149093 0.05266998 1 2 0
+77 1 2.0734225760002563 0.815010965 0.12575464 0.0592344068 2 1 0
+79 1 2 0.5490426296940486 0.5775024 0.363434 0.0590636 1 0 2
+82 1 2 0.13641482472258923 0.872480631 0.07081 0.0567093827 1 2 0
+88 1 2 0.13407561188247241 0.874523938 0.06425465 0.0612214059 1 2 0
+90 1 2 0.14206322054986673 0.8675664 0.07583086 0.0566027239 1 2 0
+91 1 2 0.44289031357940112 0.642177641 0.3043398 0.053482566 1 2 0
+92 1 2 0.13641482472258923 0.872480631 0.07081 0.0567093827 1 2 0
+93 1 2 0.5490426296940486 0.5775024 0.363434 0.0590636 1 0 2
+95 1 2 0.13407561188247241 0.874523938 0.06425465 0.0612214059 1 2 0
+96 1 2 0.13407561188247241 0.874523938 0.06425465 0.0612214059 1 2 0
+97 1 2 0.13842123633682313 0.870731831 0.068711035 0.06055716 1 2 0
+98 1 2 0.5490426296940486 0.5775024 0.363434 0.0590636 1 0 2
+99 1 2 0.13879102207379132 0.8704099 0.06865671 0.0609334 1 2 0
+100 2 0.16223550400064654 0.850240946 0.0928473249 0.05691175 2 0 1
+102 2 0.12282263476045074 0.8844205 0.0591995977 0.0563798733 2 1 0
+104 2 0.12282263476045074 0.8844205 0.0591995977 0.0563798733 2 1 0
+105 2 0.12282263476045074 0.8844205 0.0591995977 0.0563798733 2 1 0
+106 2 2 2.3492272871651649 0.84814316 0.09544288 0.05641394 1 2 0
+108 2 0.22657586758781198 0.797258854 0.14797686 0.0547643043 2 1 0
+109 2 0.16223550400064654 0.850240946 0.0928473249 0.05691175 2 0 1
+111 2 0.177848875720656 0.8370689 0.108788565 0.0541424938 2 1 0
 112 2 0.13281455464792449 0.875627458 0.06831084 0.0560617261 2 1 0
-113 2 0.19621674447868781 0.8218341 0.12273933 0.05542656 2 1 0
+113 2 0.1962166719523184 0.821834147 0.122739322 0.0554265566 2 1 0
 115 2 0.17200937673419167 0.8419713 0.09234353 0.0656852052 2 0 1
-117 2 0.16223550400064654 0.850240946 0.0928473249 0.0569117554 2 0 1
-120 2 0.16223550400064654 0.850240946 0.0928473249 0.0569117554 2 0 1
-121 2 0.16411842591849909 0.8486415 0.09412396 0.05723452 2 1 0
+117 2 0.16223550400064654 0.850240946 0.0928473249 0.05691175 2 0 1
+120 2 0.16223550400064654 0.850240946 0.0928473249 0.05691175 2 0 1
+121 2 0.16411842591849909 0.8486415 0.09412395 0.0572345145 2 1 0
 122 2 0.13716672321276682 0.871824861 0.07206305 0.0561121143 2 1 0
-123 2 0.28256671512014453 0.753846347 0.189867079 0.05628657 2 1 0
-125 2 0.20564890133993838 0.814118862 0.09413585 0.09174529 2 0 1
+123 2 0.28256655698542577 0.753846467 0.18986699 0.0562865436 2 1 0
+125 2 0.20564882812625254 0.8141189 0.09413582 0.09174526 2 0 1
 128 2 0.13716672321276682 0.871824861 0.07206305 0.0561121143 2 1 0
-129 2 0.16567795334648433 0.847319067 0.09548671 0.057194218 2 1 0
-131 2 0.16223550400064654 0.850240946 0.0928473249 0.0569117554 2 0 1
+129 2 0.16567788300150446 0.8473191 0.09548668 0.0571942 2 1 0
+131 2 0.16223550400064654 0.850240946 0.0928473249 0.05691175 2 0 1
 132 2 0.13716672321276682 0.871824861 0.07206305 0.0561121143 2 1 0
-133 2 0.29113037794713281 0.7474182 0.191831991 0.0607497729 2 1 0
-137 2 0.22116862531406531 0.8015815 0.104995139 0.09342336 2 1 0
-138 2 2 0.99148905684440769 0.5769956 0.3710238 0.05198058 1 2 0
-141 2 0.18520119392899573 0.8309371 0.09454043 0.07452248 2 0 1
-144 2 0.16223550400064654 0.850240946 0.0928473249 0.0569117554 2 0 1
-145 2 0.14505497806361808 0.864974737 0.07757514 0.0574501 2 1 0
-147 2 0.14505497806361808 0.864974737 0.07757514 0.0574501 2 1 0
+133 2 0.29112966022097542 0.747418761 0.191831574 0.06074964 2 1 0
+137 2 0.22116855095526036 0.801581562 0.1049951 0.09342333 2 1 0
+138 2 2 0.99148777165233459 0.5769952 0.371024281 0.05198054 1 2 0
+141 2 0.18520119392899573 0.8309371 0.09454043 0.07452247 2 0 1
+144 2 0.16223550400064654 0.850240946 0.0928473249 0.05691175 2 0 1
+145 2 0.14505497806361808 0.864974737 0.07757514 0.0574500971 2 1 0
+147 2 0.14505497806361808 0.864974737 0.07757514 0.0574500971 2 1 0
 0 0 1 0.12805244799353907 0.879807234 0.0614267066 0.0587660335 0 1 2
 1 0 1 0.13227381045206946 0.8761011 0.06422711 0.0596717857 0 1 2
 2 0 1 0.12805244799353907 0.879807234 0.0614267066 0.0587660335 0 1 2
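
One way to read the per-instance rows above: the Log-loss column is the negative natural log of the probability the model assigned to the instance's true class, taken from whichever of the ranked #1/#2/#3 scores corresponds to the label. A tiny check against two of the updated rows (illustrative only, values copied from the diff):

```python
import math

# Instance 5: label 0 is the #1-ranked class, with score 0.776245832.
assert abs(-math.log(0.776245832) - 0.25328601458204941) < 1e-6

# Instance 106: label 2 is only the #2-ranked class (score 0.09544288), hence the large log-loss.
assert abs(-math.log(0.09544288) - 2.3492272871651649) < 1e-6

print("per-instance log-loss matches -ln(p of true class)")
```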

test/BaselineOutput/SingleDebug/LightGBMMC/LightGBMMC-CV-iris.keyU404-out.txt

Lines changed: 4 additions & 4 deletions
@@ -23,8 +23,8 @@ TRUTH ||========================================
 Precision ||1.0000 |0.9310 |0.8966 |0.0000 |0.0000 |
 Accuracy(micro-avg): 0.936709
 Accuracy(macro-avg): 0.942857
-Log-loss: 0.312681
-Log-loss reduction: 71.248176
+Log-loss: 0.312759
+Log-loss reduction: 71.240931

 Confusion table
 ||========================================
@@ -46,8 +46,8 @@ OVERALL RESULTS
 ---------------------------------------
 Accuracy(micro-avg): 0.947228 (0.0105)
 Accuracy(macro-avg): 0.947944 (0.0051)
-Log-loss: 0.253035 (0.0596)
-Log-loss reduction: 76.717461 (5.4693)
+Log-loss: 0.253074 (0.0597)
+Log-loss reduction: 76.713839 (5.4729)

 ---------------------------------------
 Physical memory usage(MB): %Number%
Lines changed: 1 addition & 1 deletion
@@ -1,4 +1,4 @@
 LightGBMMC
 Accuracy(micro-avg) Accuracy(macro-avg) Log-loss Log-loss reduction /iter /lr /nl /mil /nt Learner Name Train Dataset Test Dataset Results File Run Time Physical Memory Virtual Memory Command Line Settings
-0.947228 0.947944 0.253035 76.71746 10 0.2 20 10 1 LightGBMMC %Data% %Output% 99 0 0 maml.exe CV tr=LightGBMMC{nt=1 iter=10 v=- lr=0.2 mil=10 nl=20} threads=- dout=%Output% loader=Text{col=Label:U4[0-4]:0 col=Features:1-4} data=%Data% seed=1 /iter:10;/lr:0.2;/nl:20;/mil:10;/nt:1
+0.947228 0.947944 0.253074 76.71384 10 0.2 20 10 1 LightGBMMC %Data% %Output% 99 0 0 maml.exe CV tr=LightGBMMC{nt=1 iter=10 v=- lr=0.2 mil=10 nl=20} threads=- dout=%Output% loader=Text{col=Label:U4[0-4]:0 col=Features:1-4} data=%Data% seed=1 /iter:10;/lr:0.2;/nl:20;/mil:10;/nt:1
