2025-02-23 15:57:25 +08:00
================ Training Loss (Sun Feb 23 15:46:44 2025) ================
================ Training Loss (Sun Feb 23 15:52:29 2025) ================
2025-02-23 16:02:17 +08:00
================ Training Loss (Sun Feb 23 16:00:07 2025) ================
2025-02-23 16:04:38 +08:00
================ Training Loss (Sun Feb 23 16:02:40 2025) ================
2025-02-23 16:06:28 +08:00
================ Training Loss (Sun Feb 23 16:05:19 2025) ================
2025-02-23 16:09:20 +08:00
================ Training Loss (Sun Feb 23 16:06:44 2025) ================
2025-02-23 16:44:28 +08:00
================ Training Loss (Sun Feb 23 16:09:38 2025) ================
Traceback (most recent call last):
  File "/home/openxs/jj/roma_unsb/train.py", line 47, in <module>
    model.optimize_parameters()  # calculate loss functions, get gradients, update network weights
  File "/home/openxs/jj/roma_unsb/models/roma_unsb_model.py", line 315, in optimize_parameters
    self.forward()
  File "/home/openxs/jj/roma_unsb/models/roma_unsb_model.py", line 445, in forward
    Xt_1 = self.netG(Xt, self.time, z)
  File "/home/openxs/anaconda3/envs/I2V/lib/python3.9/site-packages/torch/nn/modules/module.py", line 1501, in _call_impl
    return forward_call(*args, **kwargs)
  File "/home/openxs/anaconda3/envs/I2V/lib/python3.9/site-packages/torch/nn/parallel/data_parallel.py", line 169, in forward
    return self.module(*inputs[0], **kwargs[0])
  File "/home/openxs/anaconda3/envs/I2V/lib/python3.9/site-packages/torch/nn/modules/module.py", line 1501, in _call_impl
    return forward_call(*args, **kwargs)
  File "/home/openxs/jj/roma_unsb/models/networks.py", line 980, in forward
    feat = layer(feat)
  File "/home/openxs/anaconda3/envs/I2V/lib/python3.9/site-packages/torch/nn/modules/module.py", line 1501, in _call_impl
    return forward_call(*args, **kwargs)
  File "/home/openxs/anaconda3/envs/I2V/lib/python3.9/site-packages/torch/nn/modules/conv.py", line 463, in forward
    return self._conv_forward(input, self.weight, self.bias)
  File "/home/openxs/anaconda3/envs/I2V/lib/python3.9/site-packages/torch/nn/modules/conv.py", line 459, in _conv_forward
    return F.conv2d(input, weight, bias, self.stride,
RuntimeError: Given groups=1, weight of size [64, 3, 7, 7], expected input[1, 1, 1734, 774] to have 3 channels, but got 1 channels instead
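The traceback shows the first convolution of the generator (weight shape [64, 3, 7, 7], i.e. 3 input channels) receiving a single-channel tensor of shape [1, 1, 1734, 774] — a grayscale image fed to a network built for RGB. A minimal sketch of the usual fix is to replicate the channel dimension before the forward pass; the shapes and layer below are reconstructed from the error message, not taken from the project's code:

```python
import torch
import torch.nn as nn

# A 7x7 conv expecting 3 input channels, matching the weight shape
# [64, 3, 7, 7] reported in the RuntimeError.
conv = nn.Conv2d(in_channels=3, out_channels=64, kernel_size=7)

# A grayscale batch shaped like the failing input (smaller spatial
# size here for brevity): [N, 1, H, W].
x_gray = torch.randn(1, 1, 64, 64)

# Replicate the single channel into 3 identical planes so the conv's
# in_channels matches; pixel values are unchanged.
x_rgb = x_gray.expand(-1, 3, -1, -1)

out = conv(x_rgb)
print(tuple(out.shape))  # spatial size shrinks by kernel_size - 1
```

An alternative is to fix this at the data-loading stage, e.g. with torchvision's `transforms.Grayscale(num_output_channels=3)`, so every image reaches the network with three channels regardless of how it was stored on disk.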
2025-02-23 16:49:26 +08:00
================ Training Loss (Sun Feb 23 16:44:56 2025) ================
2025-02-23 18:42:21 +08:00
================ Training Loss (Sun Feb 23 16:49:46 2025) ================
================ Training Loss (Sun Feb 23 16:51:03 2025) ================
================ Training Loss (Sun Feb 23 16:51:23 2025) ================
================ Training Loss (Sun Feb 23 18:04:02 2025) ================
================ Training Loss (Sun Feb 23 18:04:39 2025) ================
================ Training Loss (Sun Feb 23 18:05:17 2025) ================
================ Training Loss (Sun Feb 23 18:06:40 2025) ================
================ Training Loss (Sun Feb 23 18:11:48 2025) ================
================ Training Loss (Sun Feb 23 18:13:31 2025) ================
================ Training Loss (Sun Feb 23 18:14:11 2025) ================
================ Training Loss (Sun Feb 23 18:14:29 2025) ================
================ Training Loss (Sun Feb 23 18:16:27 2025) ================
================ Training Loss (Sun Feb 23 18:16:44 2025) ================
================ Training Loss (Sun Feb 23 18:20:39 2025) ================
================ Training Loss (Sun Feb 23 18:21:44 2025) ================
================ Training Loss (Sun Feb 23 18:35:27 2025) ================
================ Training Loss (Sun Feb 23 18:39:21 2025) ================
================ Training Loss (Sun Feb 23 18:40:15 2025) ================
================ Training Loss (Sun Feb 23 18:41:15 2025) ================