2025-02-23 15:57:25 +08:00
================ Training Loss (Sun Feb 23 15:46:44 2025) ================
================ Training Loss (Sun Feb 23 15:52:29 2025) ================
2025-02-23 16:02:17 +08:00
================ Training Loss (Sun Feb 23 16:00:07 2025) ================
2025-02-23 16:04:38 +08:00
================ Training Loss (Sun Feb 23 16:02:40 2025) ================
2025-02-23 16:06:28 +08:00
================ Training Loss (Sun Feb 23 16:05:19 2025) ================
2025-02-23 16:09:20 +08:00
================ Training Loss (Sun Feb 23 16:06:44 2025) ================
2025-02-23 16:44:28 +08:00
================ Training Loss (Sun Feb 23 16:09:38 2025) ================
Traceback (most recent call last):
  File "/home/openxs/jj/roma_unsb/train.py", line 47, in <module>
    model.optimize_parameters()   # calculate loss functions, get gradients, update network weights
  File "/home/openxs/jj/roma_unsb/models/roma_unsb_model.py", line 315, in optimize_parameters
    self.forward()
  File "/home/openxs/jj/roma_unsb/models/roma_unsb_model.py", line 445, in forward
    Xt_1 = self.netG(Xt, self.time, z)
  File "/home/openxs/anaconda3/envs/I2V/lib/python3.9/site-packages/torch/nn/modules/module.py", line 1501, in _call_impl
    return forward_call(*args, **kwargs)
  File "/home/openxs/anaconda3/envs/I2V/lib/python3.9/site-packages/torch/nn/parallel/data_parallel.py", line 169, in forward
    return self.module(*inputs[0], **kwargs[0])
  File "/home/openxs/anaconda3/envs/I2V/lib/python3.9/site-packages/torch/nn/modules/module.py", line 1501, in _call_impl
    return forward_call(*args, **kwargs)
  File "/home/openxs/jj/roma_unsb/models/networks.py", line 980, in forward
    feat = layer(feat)
  File "/home/openxs/anaconda3/envs/I2V/lib/python3.9/site-packages/torch/nn/modules/module.py", line 1501, in _call_impl
    return forward_call(*args, **kwargs)
  File "/home/openxs/anaconda3/envs/I2V/lib/python3.9/site-packages/torch/nn/modules/conv.py", line 463, in forward
    return self._conv_forward(input, self.weight, self.bias)
  File "/home/openxs/anaconda3/envs/I2V/lib/python3.9/site-packages/torch/nn/modules/conv.py", line 459, in _conv_forward
    return F.conv2d(input, weight, bias, self.stride,
RuntimeError: Given groups=1, weight of size [64, 3, 7, 7], expected input[1, 1, 1734, 774] to have 3 channels, but got 1 channels instead
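The RuntimeError above means the first convolution of netG has weight shape [64, 3, 7, 7] (3 input channels), but the batch that reached it was [1, 1, 1734, 774], i.e. a single-channel (grayscale) image was fed to a network built for RGB. The cleanest fix is usually at data-loading time; as an illustration only, here is a minimal sketch of widening the tensor itself (the helper name `ensure_rgb` is hypothetical, not part of roma_unsb):

```python
import torch

def ensure_rgb(x: torch.Tensor) -> torch.Tensor:
    """Repeat a single-channel NCHW tensor along the channel dim so a conv
    expecting 3 input channels accepts it; 3-channel input passes through."""
    if x.dim() == 4 and x.size(1) == 1:
        return x.repeat(1, 3, 1, 1)
    return x

# The exact shape from the traceback: a 1-channel image crashed the 3-channel conv.
gray = torch.randn(1, 1, 1734, 774)
print(ensure_rgb(gray).shape)  # torch.Size([1, 3, 1734, 774])
```

Converting at load time (e.g. `Image.open(path).convert('RGB')` in the dataset) is generally preferable to patching tensors mid-forward, since it keeps the normalization statistics consistent with a 3-channel pipeline; alternatively, building the generator with `input_nc=1` avoids the mismatch entirely.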
2025-02-23 16:49:26 +08:00
================ Training Loss (Sun Feb 23 16:44:56 2025) ================
2025-02-23 18:42:21 +08:00
================ Training Loss (Sun Feb 23 16:49:46 2025) ================
================ Training Loss (Sun Feb 23 16:51:03 2025) ================
================ Training Loss (Sun Feb 23 16:51:23 2025) ================
================ Training Loss (Sun Feb 23 18:04:02 2025) ================
================ Training Loss (Sun Feb 23 18:04:39 2025) ================
================ Training Loss (Sun Feb 23 18:05:17 2025) ================
================ Training Loss (Sun Feb 23 18:06:40 2025) ================
================ Training Loss (Sun Feb 23 18:11:48 2025) ================
================ Training Loss (Sun Feb 23 18:13:31 2025) ================
================ Training Loss (Sun Feb 23 18:14:11 2025) ================
================ Training Loss (Sun Feb 23 18:14:29 2025) ================
================ Training Loss (Sun Feb 23 18:16:27 2025) ================
================ Training Loss (Sun Feb 23 18:16:44 2025) ================
================ Training Loss (Sun Feb 23 18:20:39 2025) ================
================ Training Loss (Sun Feb 23 18:21:44 2025) ================
================ Training Loss (Sun Feb 23 18:35:27 2025) ================
================ Training Loss (Sun Feb 23 18:39:21 2025) ================
================ Training Loss (Sun Feb 23 18:40:15 2025) ================
================ Training Loss (Sun Feb 23 18:41:15 2025) ================
2025-02-23 19:06:35 +08:00
================ Training Loss (Sun Feb 23 18:47:46 2025) ================
================ Training Loss (Sun Feb 23 18:48:36 2025) ================
================ Training Loss (Sun Feb 23 18:50:20 2025) ================
================ Training Loss (Sun Feb 23 18:51:50 2025) ================
================ Training Loss (Sun Feb 23 18:58:45 2025) ================
================ Training Loss (Sun Feb 23 18:59:52 2025) ================
================ Training Loss (Sun Feb 23 19:03:05 2025) ================
================ Training Loss (Sun Feb 23 19:03:57 2025) ================
2025-02-23 21:15:41 +08:00
================ Training Loss (Sun Feb 23 21:11:47 2025) ================
2025-02-23 22:26:04 +08:00
================ Training Loss (Sun Feb 23 21:17:10 2025) ================
================ Training Loss (Sun Feb 23 21:20:14 2025) ================
================ Training Loss (Sun Feb 23 21:29:03 2025) ================
================ Training Loss (Sun Feb 23 21:34:57 2025) ================
================ Training Loss (Sun Feb 23 21:35:26 2025) ================
2025-02-23 22:40:34 +08:00
================ Training Loss (Sun Feb 23 22:28:43 2025) ================
================ Training Loss (Sun Feb 23 22:29:04 2025) ================
================ Training Loss (Sun Feb 23 22:29:52 2025) ================
================ Training Loss (Sun Feb 23 22:30:40 2025) ================
================ Training Loss (Sun Feb 23 22:33:48 2025) ================
2025-02-23 22:40:36 +08:00
================ Training Loss (Sun Feb 23 22:39:16 2025) ================
================ Training Loss (Sun Feb 23 22:39:48 2025) ================
2025-02-23 23:15:25 +08:00
================ Training Loss (Sun Feb 23 22:41:34 2025) ================
================ Training Loss (Sun Feb 23 22:42:01 2025) ================
================ Training Loss (Sun Feb 23 22:44:17 2025) ================
================ Training Loss (Sun Feb 23 22:45:53 2025) ================
================ Training Loss (Sun Feb 23 22:46:48 2025) ================
================ Training Loss (Sun Feb 23 22:47:42 2025) ================
================ Training Loss (Sun Feb 23 22:49:44 2025) ================
================ Training Loss (Sun Feb 23 22:50:29 2025) ================
================ Training Loss (Sun Feb 23 22:51:47 2025) ================
================ Training Loss (Sun Feb 23 22:55:56 2025) ================
================ Training Loss (Sun Feb 23 22:56:19 2025) ================
================ Training Loss (Sun Feb 23 22:57:58 2025) ================
================ Training Loss (Sun Feb 23 22:59:09 2025) ================
================ Training Loss (Sun Feb 23 23:02:36 2025) ================
================ Training Loss (Sun Feb 23 23:03:56 2025) ================
================ Training Loss (Sun Feb 23 23:09:21 2025) ================
================ Training Loss (Sun Feb 23 23:10:05 2025) ================
================ Training Loss (Sun Feb 23 23:11:43 2025) ================
================ Training Loss (Sun Feb 23 23:12:41 2025) ================
================ Training Loss (Sun Feb 23 23:13:05 2025) ================
================ Training Loss (Sun Feb 23 23:13:59 2025) ================
================ Training Loss (Sun Feb 23 23:14:59 2025) ================
2025-02-24 23:00:25 +08:00
================ Training Loss (Mon Feb 24 22:59:41 2025) ================
2025-02-24 23:10:23 +08:00
================ Training Loss (Mon Feb 24 23:01:03 2025) ================
================ Training Loss (Mon Feb 24 23:02:59 2025) ================
================ Training Loss (Mon Feb 24 23:07:07 2025) ================
(epoch: 1, iters: 100, time: 0.155, data: 0.229) G_GAN: 6.946 D_real_ViT: 0.807 D_fake_ViT: 0.309 G: 7.258 SB: 0.056
(epoch: 1, iters: 200, time: 0.174, data: 0.004) G_GAN: 5.494 D_real_ViT: 0.538 D_fake_ViT: 0.287 G: 5.808 SB: 0.054
(epoch: 1, iters: 300, time: 0.185, data: 0.003) G_GAN: 4.615 D_real_ViT: 0.424 D_fake_ViT: 0.302 G: 4.865 SB: 0.073
(epoch: 1, iters: 400, time: 0.191, data: 0.003) G_GAN: 4.170 D_real_ViT: 0.357 D_fake_ViT: 0.315 G: 4.751 SB: 0.055
(epoch: 1, iters: 500, time: 0.196, data: 0.004) G_GAN: 4.437 D_real_ViT: 0.323 D_fake_ViT: 0.265 G: 4.986 SB: 0.062
(epoch: 1, iters: 600, time: 0.199, data: 0.003) G_GAN: 4.024 D_real_ViT: 0.289 D_fake_ViT: 0.252 G: 4.247 SB: 0.073
(epoch: 1, iters: 700, time: 0.202, data: 0.004) G_GAN: 3.999 D_real_ViT: 0.273 D_fake_ViT: 0.232 G: 4.461 SB: 0.058
(epoch: 1, iters: 800, time: 0.204, data: 0.003) G_GAN: 3.754 D_real_ViT: 0.233 D_fake_ViT: 0.241 G: 4.065 SB: 0.054
(epoch: 1, iters: 900, time: 0.205, data: 0.003) G_GAN: 4.083 D_real_ViT: 0.249 D_fake_ViT: 0.207 G: 4.281 SB: 0.059
2025-02-24 23:35:03 +08:00
(epoch: 1, iters: 1000, time: 0.206, data: 0.003) G_GAN: 4.079 D_real_ViT: 0.244 D_fake_ViT: 0.207 G: 4.516 SB: 0.062
(epoch: 1, iters: 1100, time: 0.205, data: 0.005) G_GAN: 3.630 D_real_ViT: 0.212 D_fake_ViT: 0.248 G: 3.740 SB: 0.062
(epoch: 1, iters: 1200, time: 0.207, data: 0.004) G_GAN: 4.639 D_real_ViT: 0.174 D_fake_ViT: 0.159 G: 4.870 SB: 0.058
(epoch: 1, iters: 1300, time: 0.208, data: 0.004) G_GAN: 3.868 D_real_ViT: 0.224 D_fake_ViT: 0.198 G: 4.078 SB: 0.078
(epoch: 1, iters: 1400, time: 0.206, data: 0.003) G_GAN: 3.604 D_real_ViT: 0.170 D_fake_ViT: 0.215 G: 3.819 SB: 0.058
(epoch: 1, iters: 1500, time: 0.206, data: 0.003) G_GAN: 3.718 D_real_ViT: 0.156 D_fake_ViT: 0.199 G: 4.209 SB: 0.057
(epoch: 1, iters: 1600, time: 0.206, data: 0.002) G_GAN: 4.552 D_real_ViT: 0.184 D_fake_ViT: 0.142 G: 4.772 SB: 0.054
(epoch: 1, iters: 1700, time: 0.205, data: 0.003) G_GAN: 3.300 D_real_ViT: 0.189 D_fake_ViT: 0.255 G: 3.759 SB: 0.076
(epoch: 1, iters: 1800, time: 0.204, data: 0.004) G_GAN: 3.840 D_real_ViT: 0.193 D_fake_ViT: 0.185 G: 4.465 SB: 0.073
(epoch: 1, iters: 1900, time: 0.204, data: 0.002) G_GAN: 3.524 D_real_ViT: 0.155 D_fake_ViT: 0.207 G: 4.182 SB: 0.053
(epoch: 1, iters: 2000, time: 0.204, data: 0.003) G_GAN: 3.437 D_real_ViT: 0.169 D_fake_ViT: 0.223 G: 3.529 SB: 0.061
(epoch: 1, iters: 2100, time: 0.204, data: 0.003) G_GAN: 4.786 D_real_ViT: 0.153 D_fake_ViT: 0.132 G: 5.162 SB: 0.061
(epoch: 2, iters: 95, time: 0.203, data: 0.003) G_GAN: 4.433 D_real_ViT: 0.192 D_fake_ViT: 0.147 G: 4.737 SB: 0.054
(epoch: 2, iters: 195, time: 0.204, data: 0.004) G_GAN: 3.592 D_real_ViT: 0.182 D_fake_ViT: 0.199 G: 3.722 SB: 0.054
(epoch: 2, iters: 295, time: 0.204, data: 0.003) G_GAN: 4.473 D_real_ViT: 0.238 D_fake_ViT: 0.133 G: 4.573 SB: 0.062
(epoch: 2, iters: 395, time: 0.203, data: 0.003) G_GAN: 3.919 D_real_ViT: 0.175 D_fake_ViT: 0.172 G: 4.029 SB: 0.076
(epoch: 2, iters: 495, time: 0.204, data: 0.003) G_GAN: 3.096 D_real_ViT: 0.204 D_fake_ViT: 0.234 G: 3.726 SB: 0.054
(epoch: 2, iters: 595, time: 0.202, data: 0.005) G_GAN: 2.927 D_real_ViT: 0.168 D_fake_ViT: 0.254 G: 3.603 SB: 0.059
(epoch: 2, iters: 695, time: 0.204, data: 0.003) G_GAN: 3.383 D_real_ViT: 0.198 D_fake_ViT: 0.209 G: 3.787 SB: 0.054
(epoch: 2, iters: 795, time: 0.205, data: 0.003) G_GAN: 3.755 D_real_ViT: 0.198 D_fake_ViT: 0.185 G: 4.026 SB: 0.075
(epoch: 2, iters: 895, time: 0.205, data: 0.006) G_GAN: 3.717 D_real_ViT: 0.201 D_fake_ViT: 0.170 G: 3.863 SB: 0.062
(epoch: 2, iters: 995, time: 0.208, data: 0.003) G_GAN: 3.538 D_real_ViT: 0.268 D_fake_ViT: 0.182 G: 4.136 SB: 0.054
(epoch: 2, iters: 1095, time: 0.207, data: 0.003) G_GAN: 3.946 D_real_ViT: 0.173 D_fake_ViT: 0.166 G: 4.251 SB: 0.058
(epoch: 2, iters: 1195, time: 0.206, data: 0.003) G_GAN: 4.461 D_real_ViT: 0.162 D_fake_ViT: 0.136 G: 4.591 SB: 0.058
(epoch: 2, iters: 1295, time: 0.205, data: 0.005) G_GAN: 3.981 D_real_ViT: 0.132 D_fake_ViT: 0.158 G: 4.403 SB: 0.054
(epoch: 2, iters: 1395, time: 0.206, data: 0.004) G_GAN: 3.391 D_real_ViT: 0.155 D_fake_ViT: 0.190 G: 3.982 SB: 0.078
(epoch: 2, iters: 1495, time: 0.206, data: 0.005) G_GAN: 3.082 D_real_ViT: 0.264 D_fake_ViT: 0.212 G: 3.343 SB: 0.061
(epoch: 2, iters: 1595, time: 0.206, data: 0.003) G_GAN: 4.052 D_real_ViT: 0.149 D_fake_ViT: 0.153 G: 4.585 SB: 0.058
(epoch: 2, iters: 1695, time: 0.206, data: 0.003) G_GAN: 4.096 D_real_ViT: 0.142 D_fake_ViT: 0.153 G: 4.442 SB: 0.058
(epoch: 2, iters: 1795, time: 0.206, data: 0.003) G_GAN: 2.935 D_real_ViT: 0.337 D_fake_ViT: 0.258 G: 3.227 SB: 0.061
(epoch: 2, iters: 1895, time: 0.205, data: 0.003) G_GAN: 3.350 D_real_ViT: 0.149 D_fake_ViT: 0.214 G: 3.626 SB: 0.057
(epoch: 2, iters: 1995, time: 0.205, data: 0.003) G_GAN: 5.271 D_real_ViT: 0.147 D_fake_ViT: 0.101 G: 5.485 SB: 0.073
(epoch: 2, iters: 2095, time: 0.209, data: 0.003) G_GAN: 4.324 D_real_ViT: 0.171 D_fake_ViT: 0.129 G: 4.600 SB: 0.054
(epoch: 3, iters: 90, time: 0.208, data: 0.003) G_GAN: 3.161 D_real_ViT: 0.242 D_fake_ViT: 0.206 G: 3.435 SB: 0.055
(epoch: 3, iters: 190, time: 0.207, data: 0.003) G_GAN: 4.150 D_real_ViT: 0.264 D_fake_ViT: 0.143 G: 4.741 SB: 0.056
(epoch: 3, iters: 290, time: 0.207, data: 0.003) G_GAN: 5.194 D_real_ViT: 0.268 D_fake_ViT: 0.096 G: 5.567 SB: 0.054
(epoch: 3, iters: 390, time: 0.208, data: 0.004) G_GAN: 4.214 D_real_ViT: 0.126 D_fake_ViT: 0.145 G: 4.984 SB: 0.061
(epoch: 3, iters: 490, time: 0.209, data: 0.003) G_GAN: 4.436 D_real_ViT: 0.283 D_fake_ViT: 0.137 G: 4.779 SB: 0.057
(epoch: 3, iters: 590, time: 0.208, data: 0.003) G_GAN: 4.017 D_real_ViT: 0.162 D_fake_ViT: 0.172 G: 4.254 SB: 0.061
(epoch: 3, iters: 690, time: 0.210, data: 0.006) G_GAN: 3.531 D_real_ViT: 0.100 D_fake_ViT: 0.179 G: 3.961 SB: 0.062
(epoch: 3, iters: 790, time: 0.210, data: 0.003) G_GAN: 3.909 D_real_ViT: 0.325 D_fake_ViT: 0.154 G: 4.414 SB: 0.073
(epoch: 3, iters: 890, time: 0.209, data: 0.006) G_GAN: 5.494 D_real_ViT: 0.221 D_fake_ViT: 0.082 G: 6.089 SB: 0.072
(epoch: 3, iters: 990, time: 0.207, data: 0.003) G_GAN: 4.783 D_real_ViT: 0.165 D_fake_ViT: 0.111 G: 5.441 SB: 0.057
(epoch: 3, iters: 1090, time: 0.207, data: 0.004) G_GAN: 3.759 D_real_ViT: 0.335 D_fake_ViT: 0.166 G: 4.676 SB: 0.056
(epoch: 3, iters: 1190, time: 0.207, data: 0.005) G_GAN: 3.415 D_real_ViT: 0.166 D_fake_ViT: 0.193 G: 4.474 SB: 0.073
(epoch: 3, iters: 1290, time: 0.206, data: 0.003) G_GAN: 3.669 D_real_ViT: 0.204 D_fake_ViT: 0.169 G: 4.728 SB: 0.060
(epoch: 3, iters: 1390, time: 0.205, data: 0.004) G_GAN: 3.358 D_real_ViT: 0.199 D_fake_ViT: 0.187 G: 4.412 SB: 0.055
(epoch: 3, iters: 1490, time: 0.206, data: 0.003) G_GAN: 3.018 D_real_ViT: 0.302 D_fake_ViT: 0.235 G: 4.051 SB: 0.055
(epoch: 3, iters: 1590, time: 0.206, data: 0.005) G_GAN: 4.606 D_real_ViT: 0.168 D_fake_ViT: 0.138 G: 5.547 SB: 0.061
(epoch: 3, iters: 1690, time: 0.205, data: 0.003) G_GAN: 3.548 D_real_ViT: 0.269 D_fake_ViT: 0.195 G: 4.058 SB: 0.071
(epoch: 3, iters: 1790, time: 0.205, data: 0.003) G_GAN: 3.814 D_real_ViT: 0.201 D_fake_ViT: 0.182 G: 4.869 SB: 0.057
(epoch: 3, iters: 1890, time: 0.207, data: 0.003) G_GAN: 3.740 D_real_ViT: 0.238 D_fake_ViT: 0.173 G: 4.679 SB: 0.058
(epoch: 3, iters: 1990, time: 0.206, data: 0.004) G_GAN: 4.623 D_real_ViT: 0.227 D_fake_ViT: 0.129 G: 4.969 SB: 0.071
(epoch: 3, iters: 2090, time: 0.206, data: 0.003) G_GAN: 3.498 D_real_ViT: 0.254 D_fake_ViT: 0.183 G: 4.168 SB: 0.077
(epoch: 4, iters: 85, time: 0.206, data: 0.003) G_GAN: 3.626 D_real_ViT: 0.252 D_fake_ViT: 0.167 G: 4.710 SB: 0.055
(epoch: 4, iters: 185, time: 0.207, data: 0.003) G_GAN: 4.156 D_real_ViT: 0.266 D_fake_ViT: 0.140 G: 4.893 SB: 0.057
(epoch: 4, iters: 285, time: 0.206, data: 0.003) G_GAN: 3.351 D_real_ViT: 0.233 D_fake_ViT: 0.192 G: 4.369 SB: 0.062
(epoch: 4, iters: 385, time: 0.205, data: 0.004) G_GAN: 3.621 D_real_ViT: 0.296 D_fake_ViT: 0.157 G: 4.613 SB: 0.077
(epoch: 4, iters: 485, time: 0.207, data: 0.003) G_GAN: 3.686 D_real_ViT: 0.259 D_fake_ViT: 0.170 G: 4.029 SB: 0.072
(epoch: 4, iters: 585, time: 0.207, data: 0.003) G_GAN: 4.395 D_real_ViT: 0.075 D_fake_ViT: 0.131 G: 4.856 SB: 0.056
(epoch: 4, iters: 685, time: 0.207, data: 0.003) G_GAN: 4.882 D_real_ViT: 0.104 D_fake_ViT: 0.108 G: 5.622 SB: 0.054
(epoch: 4, iters: 785, time: 0.205, data: 0.003) G_GAN: 3.992 D_real_ViT: 0.254 D_fake_ViT: 0.134 G: 4.976 SB: 0.075
(epoch: 4, iters: 885, time: 0.206, data: 0.006) G_GAN: 4.044 D_real_ViT: 0.187 D_fake_ViT: 0.143 G: 5.105 SB: 0.058
(epoch: 4, iters: 985, time: 0.206, data: 0.002) G_GAN: 3.378 D_real_ViT: 0.065 D_fake_ViT: 0.195 G: 3.964 SB: 0.062
(epoch: 4, iters: 1085, time: 0.207, data: 0.003) G_GAN: 3.711 D_real_ViT: 0.184 D_fake_ViT: 0.172 G: 4.842 SB: 0.077
(epoch: 4, iters: 1185, time: 0.207, data: 0.003) G_GAN: 3.574 D_real_ViT: 0.120 D_fake_ViT: 0.186 G: 3.918 SB: 0.070