Documentation service address: http://47.92.0.57:3000/
Weekly report index address: http://47.92.0.57:3000/s/NruNXRYmV
王肇一 / Im · Commits

Commit 86d39706
authored Feb 11, 2020 by 王肇一
parent 515970c7

Cycle learning rate for unet

Showing 6 changed files with 515 additions and 1770 deletions
data/voc/train.txt     +500  -1753
mrnet/mrnet_module.py  +2    -1
mrnet/train.py         +2    -2
train_ignite.py        +5    -3
unet/train.py          +4    -11
utils/eval.py          +2    -0
data/voc/train.txt (view file @ 86d39706)
319_4
561
93_0
97
491_1
101
278_0
357
536_3
81
555_0
220
184_0
69
471_4
55
510_0
179
434_4
439
304_1
573
179_0
132
341_1
388
367_2
183
322_2
523
48_3
113
285_0
133
429_1
448
493_3
170
91_2
157
219_1
91
105_4
64
140_4
421
12_4
161
398_0
440
57_4
233
534_1
160
512_2
309
186_2
87
557_2
118
343_3
462
306_3
67
320_0
458
204_4
226
118_1
405
365_0
82
241_4
442
448_0
115
529_4
504
287_2
394
29_2
27
103_2
300
8_0
581
146_2
272
160_1
143
358_0
537
125_1
150
239_4
520
97_4
302
430_0
147
77_1
10
514_4
417
32_1
308
475_0
553
180_4
95
551_4
196
14_2
155
453_3
127
416_3
434
51_2
297
202_2
505
247_2
531
488_0
2
261_1
154
138_4
490
224_1
345
509_1
60
281_4
254
383_3
578
144_0
478
339_1
23
101_0
386
127_3
180
162_3
73
477_2
65
30_3
538
75_3
390
432_2
444
53_0
529
414_1
231
451_1
251
16_0
568
245_0
369
361_4
203
200_0
469
324_4
124
226_3
271
88_3
76
263_3
208
409_4
86
381_1
33
381_0
159
88_2
517
226_2
471
263_2
298
245_1
455
159_4
565
200_1
404
414_0
411
53_1
166
530_4
140
16_1
371
451_0
240
30_2
327
477_3
428
432_3
243
75_2
270
127_2
215
162_2
274
258_4
306
144_1
549
101_1
255
339_0
413
383_2
387
509_0
562
468_4
445
261_0
449
488_1
299
345_4
545
224_0
38
300_4
72
202_3
149
247_3
214
453_2
8
14_3
198
51_3
206
416_2
397
77_0
427
430_1
195
475_1
74
32_0
6
358_1
32
495_4
495
160_0
416
125_0
262
8_1
263
103_3
273
146_3
524
29_3
144
287_3
525
448_1
474
385_4
44
118_0
365
320_1
88
365_1
266
343_2
83
306_2
389
512_3
264
557_3
319
186_3
156
398_1
563
455_4
205
534_0
360
410_4
39
219_0
227
378_4
572
493_2
446
91_3
401
548_4
486
199_4
232
285_1
536
429_0
383
48_2
265
367_3
245
322_3
201
304_0
287
220_4
314
341_0
119
179_1
443
265_4
547
36_4
552
184_1
92
298_4
362
555_1
165
73_4
4
510_1
433
536_2
85
121_4
367
93_1
250
164_4
521
278_1
435
491_0
503
491_2
185
278_3
221
93_3
96
510_3
481
555_3
292
184_3
368
457_4
325
536_0
15
412_4
175
322_1
564
367_1
311
179_3
138
341_2
189
304_2
289
429_2
406
285_3
194
387_4
589
48_0
588
219_2
30
123_4
475
91_1
502
166_4
102
493_0
355
34_4
468
186_1
269
557_1
373
71_4
258
512_1
137
534_2
9
398_3
396
365_3
89
320_3
146
118_2
318
306_0
480
222_4
337
343_0
493
267_4
376
287_1
224
29_1
294
448_3
117
125_2
141
160_2
548
358_3
98
146_1
202
103_1
483
8_3
148
416_0
575
51_1
459
532_4
162
14_1
234
453_0
317
32_2
130
475_3
482
430_3
510
77_2
560
224_2
108
488_3
380
261_2
400
247_1
466
202_1
583
383_0
230
509_2
541
497_4
296
162_0
582
127_0
19
339_2
22
101_3
286
144_3
126
451_2
122
16_3
290
53_3
42
414_2
377
75_0
408
432_1
399
477_1
43
30_0
431
263_0
169
347_4
403
226_0
382
88_0
544
302_4
242
200_3
249
245_3
358
381_2
24
68_4
152
283_4
425
381_3
378
200_2
282
245_2
479
263_1
379
88_1
293
226_1
341
432_0
239
75_1
375
516_4
410
30_1
51
477_0
315
182_4
123
553_4
257
16_2
191
451_3
513
414_3
307
53_2
71
101_2
129
339_3
188
144_2
577
162_1
7
127_1
181
95_4
241
509_3
361
383_1
291
247_0
142
363_4
285
202_0
514
326_4
450
224_3
328
261_3
80
488_2
465
475_2
63
32_3
5
77_3
585
430_2
158
51_0
571
416_1
225
453_1
441
14_0
53
146_0
268
8_2
267
103_0
366
125_3
320
358_2
14
160_3
500
448_2
99
29_0
447
287_0
527
306_1
496
343_1
463
365_2
301
118_3
284
320_2
45
534_3
49
398_2
567
557_0
494
186_0
453
473_4
178
512_0
586
436_4
420
91_0
332
493_1
322
219_3
432
48_1
492
429_3
173
285_2
182
341_3
100
179_2
106
304_3
556
322_0
37
206_4
28
367_0
346
243_4
558
10_4
107
55_4
574
536_1
551
510_2
550
184_2
75
555_2
204
278_2
17
491_3
370
93_2
507
107_4
451
142_4
436
298_3
94
36_3
580
471_2
385
434_2
103
73_3
57
412_1
528
55_0
279
10_0
509
457_1
66
142_0
519
107_0
535
319_2
350
121_3
488
164_3
93
199_3
168
548_3
363
387_1
247
243_0
77
367_4
554
206_0
331
322_4
526
220_3
508
265_3
90
71_1
429
436_0
120
512_4
515
473_0
342
34_1
518
186_4
18
557_4
334
455_3
110
12_2
460
57_2
276
410_3
200
105_2
151
140_2
344
378_3
236
166_1
312
123_1
246
91_4
339
287_4
177
29_4
477
385_3
392
529_2
534
204_2
384
241_2
506
267_1
415
222_1
21
14_4
235
51_4
164
532_1
414
514_2
391
180_2
356
551_2
330
495_3
217
239_2
222
97_2
470
103_4
393
146_4
349
281_2
277
468_3
25
345_3
559
300_3
228
138_2
34
326_0
219
202_4
259
363_0
423
247_4
398
530_3
192
553_0
430
182_0
210
477_4
473
516_0
176
432_4
40
95_0
58
497_1
1
258_3
354
409_2
193
283_0
41
68_0
199
302_1
539
347_1
121
159_3
238
361_2
438
324_2
374
361_3
125
159_2
338
324_3
484
302_0
348
88_4
253
226_4
305
347_0
135
263_4
569
283_1
244
68_1
498
409_3
213
258_2
261
127_4
280
95_1
186
162_4
329
497_0
61
30_4
111
182_1
229
553_1
131
75_4
184
516_1
260
530_2
584
326_1
464
363_1
372
345_2
333
138_3
487
300_2
591
468_2
79
281_3
145
383_4
216
495_2
295
97_3
461
239_3
223
514_3
59
551_3
275
180_3
530
453_4
454
532_0
456
416_4
35
267_0
424
343_4
153
222_0
20
306_4
256
204_3
418
241_3
522
385_2
283
529_3
52
493_4
190
166_0
499
123_0
62
105_3
491
378_2
278
140_3
511
12_3
587
455_2
546
410_2
343
57_3
335
436_1
172
71_0
174
34_0
304
473_1
457
220_2
187
265_2
243_1
206_1
48_4
387_0
548_2
199_2
121_2
319_3
164_2
142_1
107_1
55_1
412_0
536_4
457_0
10_1
471_3
36_2
298_2
73_2
434_3
10_3
457_2
412_2
55_3
434_1
73_0
298_0
36_0
471_1
491_4
164_0
319_1
121_0
107_3
142_3
387_2
429_4
199_0
548_0
265_0
341_4
220_0
304_4
206_3
243_3
57_1
410_0
534_4
455_0
12_1
473_3
34_2
71_2
436_3
123_2
166_2
140_1
378_0
219_4
105_1
529_1
385_0
222_2
267_2
241_1
204_1
118_4
32_4
180_1
551_1
77_4
514_1
532_2
125_4
239_1
97_1
160_4
495_0
281_1
468_0
509_4
363_3
326_3
300_0
138_1
224_4
345_0
261_4
516_3
553_3
182_3
451_4
530_0
414_4
339_4
258_0
497_2
95_3
68_3
283_3
381_4
409_1
324_1
159_0
361_1
347_2
302_2
347_3
302_3
324_0
200_4
361_0
245_4
159_1
409_0
68_2
283_2
497_3
95_2
101_4
258_1
144_4
16_4
53_4
530_1
516_2
182_2
553_2
138_0
300_1
488_4
345_1
363_2
326_2
468_1
281_0
97_0
239_0
358_4
495_1
8_4
532_3
551_0
180_0
475_4
514_0
430_4
241_0
365_4
204_0
320_4
222_3
267_3
529_0
448_4
385_1
378_1
140_0
105_0
123_3
166_3
34_3
473_2
436_2
71_3
410_1
57_0
398_4
12_0
455_1
206_2
243_2
179_4
265_1
220_1
548_1
199_1
285_4
387_3
107_2
142_2
164_1
278_4
121_1
93_4
319_0
73_1
434_0
510_4
471_0
36_1
184_4
298_1
555_4
457_3
10_2
55_2
412_3
126_0
163_0
496_4
145_3
338_2
100_3
415_2
52_3
17_3
450_2
31_0
476_1
433_1
74_0
89_0
227_0
303_4
262_0
346_4
244_3
201_3
380_2
161_2
359_3
124_2
102_1
9_3
147_1
452_0
15_1
50_1
417_0
533_4
76_2
431_3
474_3
33_2
260_2
489_3
225_2
203_1
246_1
382_0
508_2
218_2
492_0
167_4
90_1
122_4
70_4
513_1
187_1
35_4
556_1
399_3
535_2
321_3
119_2
364_3
342_0
266_4
307_0
223_4
28_1
286_1
449_3
92_3
279_3
490_2
554_3
185_3
511_3
537_0
413_4
456_4
366_1
323_1
305_2
178_3
340_2
284_3
428_2
49_0
386_4
49_1
284_2
428_3
305_3
340_3
178_2
366_0
242_4
323_0
207_4
54_4
537_1
11_4
185_2
554_2
511_2
92_2
490_3
279_2
143_4
106_4
449_2
286_0
28_0
342_1
307_1
119_3
321_2
364_2
399_2
535_3
513_0
437_4
556_0
472_4
187_0
492_1
90_0
218_3
508_3
382_1
203_0
327_4
246_0
362_4
489_2
260_3
225_3
431_2
76_3
33_3
474_2
15_0
452_1
417_1
50_0
9_2
102_0
147_0
359_2
161_3
124_3
282_4
69_4
380_3
244_2
201_2
227_1
89_1
262_1
183_4
476_0
31_1
552_4
74_1
433_0
517_4
52_2
415_3
450_3
17_2
145_2
100_2
338_3
94_4
126_1
163_1
338_1
100_0
145_0
163_3
126_3
433_2
74_3
31_3
476_2
17_0
450_1
415_1
52_0
201_0
325_4
244_0
360_4
262_3
89_3
227_3
380_1
408_4
147_2
102_2
9_0
96_4
124_1
238_4
161_1
359_0
181_4
474_0
33_1
550_4
76_1
431_0
515_4
50_2
417_3
452_3
15_2
246_2
203_2
139_4
225_1
260_1
489_0
280_4
508_1
382_3
90_2
492_3
141_4
218_1
104_4
56_4
535_1
13_4
399_0
187_2
556_2
513_2
307_3
342_3
364_0
240_4
321_0
205_4
119_1
528_4
449_0
28_2
286_2
279_0
490_1
318_4
92_0
537_3
511_0
435_4
554_0
470_4
185_0
178_0
340_1
305_1
323_2
366_2
49_3
428_1
284_0
428_0
549_4
198_4
284_1
49_2
323_3
366_3
340_0
178_1
264_4
305_0
221_4
72_4
511_1
185_1
299_4
37_4
554_1
537_2
490_0
165_4
279_1
92_1
120_4
286_3
28_3
449_1
384_4
364_1
119_0
321_1
307_2
342_2
556_3
187_3
513_3
535_0
411_4
399_1
454_4
379_4
218_0
90_3
492_2
382_2
469_4
508_0
225_0
301_4
489_1
260_0
344_4
246_3
203_3
417_2
50_3
15_3
452_2
33_0
474_1
431_1
76_0
124_0
359_1
161_0
494_4
147_3
9_1
102_3
380_0
262_2
227_2
89_2
201_1
244_1
158_4
450_0
17_1
52_1
415_0
531_4
74_2
433_3
476_3
31_2
163_2
126_2
100_1
338_0
259_4
145_1
552_3
183_3
517_3
531_0
415_4
450_4
259_0
338_4
94_3
496_2
282_3
69_3
408_1
380_4
158_0
360_1
325_1
303_2
346_2
76_4
515_1
181_1
33_4
550_1
533_2
494_0
161_4
96_1
124_4
238_1
508_4
469_0
280_1
327_3
362_3
344_0
260_4
301_0
139_1
225_4
13_1
454_0
411_0
56_1
535_4
437_3
70_2
35_2
472_3
167_2
122_2
218_4
104_1
141_1
379_0
384_0
528_1
266_2
223_2
205_1
119_4
240_1
54_3
413_2
456_2
11_3
470_1
37_0
299_0
72_0
435_1
318_1
120_0
165_0
490_4
143_3
106_3
386_2
198_0
549_0
428_4
221_0
305_4
264_0
340_4
242_3
207_3
242_2
207_2
221_1
178_4
264_1
549_1
198_1
284_4
386_3
143_2
106_2
92_4
120_1
318_0
165_1
279_4
185_4
299_1
37_1
470_0
554_4
435_0
72_1
511_4
413_3
54_2
11_2
456_3
205_0
321_4
240_0
364_4
266_3
223_3
449_4
384_1
528_0
104_0
379_1
141_0
167_3
122_3
70_3
437_2
472_2
35_3
399_4
454_1
13_0
56_0
411_1
489_4
344_1
139_0
301_1
327_2
362_2
280_0
469_1
359_4
494_1
238_0
96_0
9_4
533_3
515_0
431_4
550_0
474_4
181_0
303_3
346_3
360_0
244_4
158_1
325_0
201_4
408_0
282_2
69_2
94_2
496_3
259_1
145_4
100_4
52_4
531_1
17_4
183_2
552_2
517_2
531_3
517_0
433_4
552_0
476_4
183_0
496_1
94_0
259_3
408_2
69_0
282_0
346_1
303_1
325_2
158_3
360_2
50_4
533_1
15_4
181_2
550_2
515_2
96_2
238_2
494_3
147_4
102_4
469_3
280_2
301_3
139_2
344_3
362_0
246_4
327_0
203_4
187_4
35_1
472_0
556_4
437_0
70_1
513_4
411_3
56_2
13_2
454_3
141_2
379_3
104_2
90_4
122_1
167_1
28_4
286_4
528_2
384_3
240_2
205_2
223_1
266_1
72_3
435_2
470_2
37_3
299_3
456_1
11_0
54_0
413_1
106_0
143_0
165_3
318_2
120_3
198_3
549_3
386_1
207_0
323_4
242_0
366_4
264_3
221_3
264_2
221_2
207_1
242_1
386_0
49_4
549_2
198_2
165_2
120_2
318_3
106_1
143_1
11_1
456_0
413_0
54_1
537_4
435_3
72_2
299_2
37_2
470_3
223_0
307_4
266_0
342_4
240_3
205_3
528_3
384_2
122_0
167_0
492_4
379_2
141_3
104_3
56_3
411_2
454_2
13_3
472_1
35_0
70_0
437_1
362_1
327_1
139_3
301_2
344_2
280_3
469_2
382_4
238_3
96_3
494_2
550_3
181_3
515_3
533_0
417_4
452_4
325_3
360_3
158_2
346_0
262_4
303_0
227_4
89_4
69_1
282_1
408_3
259_2
496_0
163_4
94_1
126_4
74_4
517_1
183_1
31_4
552_1
531_2
41_0
406_1
443_1
465_2
22_3
67_3
420_2
135_3
348_2
170_3
156_0
113_0
393_1
519_3
234_3
271_3
498_2
257_0
373_4
212_0
336_4
441_3
404_3
43_2
422_0
65_1
506_4
20_1
467_0
192_4
543_4
172_1
137_1
85_4
111_2
329_3
154_2
391_3
78_4
293_4
273_1
98_1
236_1
210_2
255_2
500_2
194_2
545_2
563_1
45_4
526_1
117_4
152_4
268_2
481_3
83_2
439_3
295_2
58_1
332_0
1_3
216_4
377_0
253_4
351_3
169_2
314_3
547_0
196_0
463_4
502_0
426_4
524_3
388_2
561_3
209_3
81_0
483_1
39_0
297_0
458_2
375_2
108_3
3_1
330_2
316_1
353_1
316_0
232_4
353_0
277_4
375_3
330_3
3_0
108_2
458_3
297_1
39_1
133_4
81_1
176_4
483_0
209_2
524_2
561_2
388_3
24_4
196_1
547_1
61_4
502_1
169_3
351_2
314_2
1_2
332_1
377_1
397_4
58_0
439_2
295_3
481_2
268_3
83_3
563_0
447_4
526_0
402_4
500_3
545_3
194_3
210_3
7_4
255_3
273_0
357_4
236_0
98_0
312_4
391_2
329_2
111_3
154_3
487_4
172_0
137_0
65_0
422_1
467_1
20_0
441_2
43_3
404_2
257_1
212_1
234_2
498_3
271_2
519_2
393_0
19_4
156_1
113_1
135_2
170_2
348_3
22_2
465_3
420_3
67_2
406_0
41_1
522_4
443_0
67_0
420_1
465_1
22_0
443_2
41_3
406_2
113_3
156_3
348_1
485_4
170_0
135_0
519_0
478_4
393_2
212_3
5_4
257_3
271_0
498_1
355_4
234_0
310_4
20_2
467_3
422_3
65_2
404_0
43_1
520_4
441_0
154_1
248_4
111_1
329_0
137_2
172_2
391_0
149_4
255_1
210_1
98_2
236_2
273_2
526_2
563_2
26_4
288_4
194_1
545_1
63_4
500_1
131_4
83_1
268_1
174_4
481_0
58_2
558_4
295_1
189_4
439_0
314_0
230_4
351_0
275_4
169_1
377_3
332_3
1_0
388_1
561_0
445_4
524_0
400_4
502_3
547_3
196_3
483_2
81_3
209_0
368_4
458_1
395_4
39_3
297_3
353_2
316_2
108_0
3_2
330_1
375_1
330_0
3_3
108_1
214_4
375_0
251_4
353_3
316_3
297_2
39_2
458_0
539_4
115_4
209_1
150_4
483_3
81_2
502_2
196_2
547_2
561_1
388_0
47_4
524_1
377_2
1_1
332_2
314_1
169_0
351_1
295_0
439_1
58_3
309_4
83_0
481_1
268_0
545_0
194_0
461_4
500_0
424_4
526_3
563_3
236_3
98_3
273_3
255_0
371_4
210_0
334_4
419_4
391_1
137_3
172_3
154_0
329_1
111_0
43_0
404_1
441_1
467_2
20_3
65_3
422_2
498_0
271_1
234_1
128_4
212_2
257_2
393_3
519_1
291_4
170_1
348_0
229_4
135_1
87_4
113_2
156_2
443_3
406_3
41_2
420_0
\ No newline at end of file
mrnet/mrnet_module.py (view file @ 86d39706)

@@ -42,7 +42,8 @@ class MultiUnet(nn.Module):
         self.pool = nn.MaxPool2d(2)
         self.outconv = nn.Sequential(
             nn.Conv2d(self.res9.outc, n_classes, kernel_size = 1),
-            nn.Sigmoid()
+            nn.Softmax()
+            #nn.Sigmoid()
         )
         # self.outconv = nn.Conv2d(self.res9.outc, n_classes, kernel_size = 1)
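Since this hunk swaps the final activation from Sigmoid to Softmax, a standalone sketch of what the swap changes may help. The (N, n_classes, H, W) shape and the toy tensor below are assumptions for illustration, not taken from the model:

```python
import torch
from torch import nn

# Hypothetical logits with shape (N, n_classes, H, W); not the model's real output.
logits = torch.randn(1, 2, 4, 4)

sig = nn.Sigmoid()(logits)        # independent per-channel probabilities in (0, 1)
soft = nn.Softmax(dim=1)(logits)  # probabilities that sum to 1 across the class dim

assert ((sig > 0) & (sig < 1)).all()
assert torch.allclose(soft.sum(dim=1), torch.ones(1, 4, 4))
```

Note that `nn.Softmax()` without an explicit `dim`, as written in the hunk, makes recent PyTorch versions emit an implicit-dimension warning; `dim=1` is the usual choice for (N, C, H, W) outputs.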
mrnet/train.py (view file @ 86d39706)

@@ -28,8 +28,8 @@ def train_net(net, device, epochs = 5, batch_size = 1, lr = 0.1):
     val_loader = DataLoader(evalset, batch_size = batch_size, shuffle = False, num_workers = 8, pin_memory = True)

     optimizer = optim.Adam(net.parameters(), lr = lr)
-    criterion = nn.BCELoss()#nn.BCEWithLogitsLoss()
-    scheduler = lr_scheduler.StepLR(optimizer, 30, 0.5)# lr_scheduler.ReduceLROnPlateau(optimizer, 'min')
+    criterion = nn.BCELoss()  # nn.BCEWithLogitsLoss()
+    scheduler = lr_scheduler.StepLR(optimizer, 30, 0.5)  # lr_scheduler.ReduceLROnPlateau(optimizer, 'min')

     for epoch in range(epochs):
         net.train()
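The StepLR schedule kept here decays the learning rate by a fixed factor on a fixed epoch interval. A minimal standalone sketch of that behaviour, assuming a toy model and a base lr of 0.1 (both are illustration-only, not taken from the project):

```python
import torch
from torch import nn, optim
from torch.optim import lr_scheduler

net = nn.Linear(2, 1)  # toy stand-in for the real network
optimizer = optim.Adam(net.parameters(), lr=0.1)
scheduler = lr_scheduler.StepLR(optimizer, 30, 0.5)  # step_size=30, gamma=0.5

lrs = []
for epoch in range(60):
    optimizer.step()   # one (empty) training epoch
    scheduler.step()   # stepped once per epoch, as in the loop above
    lrs.append(optimizer.param_groups[0]['lr'])

assert lrs[0] == 0.1     # epochs before the first step keep the base lr
assert lrs[29] == 0.05   # halved after 30 epochs
assert lrs[59] == 0.025  # halved again after 60
```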
train_ignite.py (view file @ 86d39706)

@@ -7,7 +7,7 @@ from torch.utils.data import DataLoader
 from torch.optim import lr_scheduler
 from ignite.contrib.handlers.param_scheduler import LRScheduler
 from ignite.engine import Events, create_supervised_trainer, create_supervised_evaluator
-from ignite.metrics import Accuracy, Loss, DiceCoefficient, ConfusionMatrix, RunningAverage
+from ignite.metrics import Accuracy, Loss, DiceCoefficient, ConfusionMatrix, RunningAverage, mIoU
 from ignite.contrib.handlers import ProgressBar
 from argparse import ArgumentParser
@@ -34,11 +34,13 @@ def run(train_batch_size, val_batch_size, epochs, lr):
     optimizer = optim.Adam(model.parameters(), lr = lr)
     cm = ConfusionMatrix(num_classes = 1)
     dice = DiceCoefficient(cm)
+    iou = mIoU(cm)
     loss = torch.nn.BCELoss()  # torch.nn.NLLLoss()
-    scheduler = LRScheduler(lr_scheduler.ReduceLROnPlateau(optimizer))
+    scheduler = LRScheduler(lr_scheduler.StepLR(optimizer, 30, 0.5))
     trainer = create_supervised_trainer(model, optimizer, loss, device = device)
     evaluator = create_supervised_evaluator(model, metrics = {'accuracy': Accuracy(), 'dice': dice, 'nll': Loss(loss)},
                                             device = device)
     RunningAverage(output_transform = lambda x: x).attach(trainer, 'loss')
     trainer.add_event_handler(Events.EPOCH_COMPLETED, scheduler)
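The new `mIoU(cm)` metric derives mean intersection-over-union from the confusion matrix. As a sketch of the underlying arithmetic in plain NumPy rather than ignite (the toy 2-class matrix is an assumption for illustration):

```python
import numpy as np

# Toy 2-class confusion matrix (rows: true class, cols: predicted class).
cm = np.array([[4, 1],
               [2, 3]])

tp = np.diag(cm)             # true positives per class
fp = cm.sum(axis=0) - tp     # false positives per class (column sum minus TP)
fn = cm.sum(axis=1) - tp     # false negatives per class (row sum minus TP)

iou = tp / (tp + fp + fn)    # per-class IoU: TP / (TP + FP + FN)
miou = iou.mean()            # mean IoU over classes

assert np.allclose(iou, [4 / 7, 0.5])
assert abs(miou - (4 / 7 + 0.5) / 2) < 1e-12
```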
unet/train.py (view file @ 86d39706)

@@ -47,8 +47,8 @@ def train_net(net, device, epochs = 5, batch_size = 1, lr = 0.1, save_cp = True)
     # optimizer = optim.Adam(net.parameters(), lr=lr, weight_decay = 1e-8)
     optimizer = optim.RMSprop(net.parameters(), lr = lr, weight_decay = 1e-8)
-    scheduler = lr_scheduler.ReduceLROnPlateau(optimizer, 'min')
-    # criterion = nn.BCEWithLogitsLoss()
+    # scheduler = lr_scheduler.ReduceLROnPlateau(optimizer, 'min')
+    scheduler = lr_scheduler.CyclicLR(optimizer, base_lr = 1e-10, max_lr = 0.01)
     if net.n_classes > 1:
         criterion = nn.CrossEntropyLoss()
     else:
@@ -59,13 +59,6 @@ def train_net(net, device, epochs = 5, batch_size = 1, lr = 0.1, save_cp = True)
         epoch_loss = 0
         with tqdm(total = n_train, desc = f'Epoch {epoch + 1}/{epochs}', unit = 'img') as pbar:
             for imgs, true_masks in train_loader:
-                # imgs = batch['image']
-                # true_masks = batch['mask']
-                # assert imgs.shape[1] == net.n_channels, \
-                #     f'Network has been defined with {net.n_channels} input channels, ' \
-                #     f'but loaded images have {imgs.shape[1]} channels. Please check that ' \
-                #     'the images are loaded correctly.'
                 imgs = imgs.to(device = device, dtype = torch.float32)
                 mask_type = torch.float32 if net.n_classes == 1 else torch.long
                 true_masks = true_masks.to(device = device, dtype = mask_type)
@@ -80,11 +73,11 @@ def train_net(net, device, epochs = 5, batch_size = 1, lr = 0.1, save_cp = True)
                 optimizer.zero_grad()
                 loss.backward()
                 optimizer.step()
+                scheduler.step()
                 pbar.update(imgs.shape[0])
                 global_step += 1
                 # if global_step % (len(dataset) // (10 * batch_size)) == 0:
             val_score = eval_net(net, val_loader, device, n_val)
-            scheduler.step(val_score)
+            # scheduler.step(val_score)
             if net.n_classes > 1:
                 logging.info('Validation cross entropy: {}'.format(val_score))
                 writer.add_scalar('Loss/test', val_score, global_step)
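This is the change the commit message refers to: ReduceLROnPlateau, stepped once per validation score, is replaced by CyclicLR, stepped once per mini-batch. A minimal standalone sketch of that pattern (the toy model, dummy loss, and batch count are assumptions for illustration):

```python
import torch
from torch import nn, optim
from torch.optim import lr_scheduler

model = nn.Linear(4, 1)  # toy stand-in for the UNet
optimizer = optim.RMSprop(model.parameters(), lr=0.1, weight_decay=1e-8)
scheduler = lr_scheduler.CyclicLR(optimizer, base_lr=1e-10, max_lr=0.01)

lrs = []
for _ in range(5):                           # pretend mini-batches
    optimizer.zero_grad()
    loss = model(torch.randn(2, 4)).mean()   # dummy loss
    loss.backward()
    optimizer.step()
    scheduler.step()                         # per-batch step, matching the new loop
    lrs.append(optimizer.param_groups[0]['lr'])

# the lr climbs from base_lr toward max_lr over the first half-cycle
assert all(b > a for a, b in zip(lrs, lrs[1:]))
assert lrs[-1] < 0.01
```

Note that CyclicLR overrides the `lr` passed to the optimizer with `base_lr`, and with the default `step_size_up=2000` the commit's settings climb from 1e-10 to 0.01 over 2000 batches before descending again.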
utils/eval.py (view file @ 86d39706)

@@ -2,6 +2,7 @@ import torch
 import torch.nn.functional as F
 from tqdm import tqdm
 from sklearn.metrics import jaccard_score
+import numpy as np

 from utils.dice_loss import dice_coeff, dice_coef
@@ -42,6 +43,7 @@ def eval_jac(net, loader, device, n_val):
             pred_masks = torch.round(pred_masks).cpu().detach().numpy()
             true_masks = torch.round(true_masks).cpu().numpy()
+            pred_masks = np.array([1 if x > 0 else 0 for x in pred_masks])
             jac += jaccard_score(true_masks.flatten(), pred_masks.flatten())
             pbar.update(imgs.shape[0])
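The added line re-binarises the rounded predictions before scoring. A small standalone check of that thresholding plus `jaccard_score` (the toy 1-D arrays are assumptions for illustration):

```python
import numpy as np
from sklearn.metrics import jaccard_score

pred_masks = np.array([0.0, 1.0, 1.0, 0.0])  # toy rounded predictions
true_masks = np.array([0, 1, 0, 0])          # toy ground-truth mask

pred_masks = np.array([1 if x > 0 else 0 for x in pred_masks])
jac = jaccard_score(true_masks.flatten(), pred_masks.flatten())

# intersection = 1 (index 1), union = 2 (indices 1 and 2) -> 1/2
assert abs(jac - 0.5) < 1e-9
```

Note the comprehension only works elementwise on a 1-D array; iterating a multi-dimensional array yields sub-arrays, for which `x > 0` is not a single boolean.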