Thank you for the great work. While debugging the example file, I found that the last 24 rows of the target (power_usage) in data_input have the same values as the corresponding rows in data_output, where data_input has shape (batch_size, 192, 5) and data_output has shape (batch_size, 24, 1). This can be verified with a print inside def __getitem__(self, idx) of class TFTDataset. Obviously the input should not include the values in the output; in my opinion the proper input length is 192 - 24 = 168 time steps, i.e. excluding the last 24 values of power_usage. What is the problem? Thanks.
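For reference, a minimal sketch of the check described above. It assumes power_usage is the last of the 5 input columns and that the per-item shapes are (192, 5) and (24, 1), i.e. the shapes quoted above include the DataLoader batch dimension; both points are assumptions about the example code.

import torch

def check_overlap(dataset, num_encoder_steps=168):
    # One sample from a TFTDataset instance built as in the example notebook.
    data_input, data_output = dataset[0]
    # Tail of the input window; the target column is assumed to be the last.
    input_tail = torch.as_tensor(data_input)[num_encoder_steps:, -1]
    targets = torch.as_tensor(data_output).squeeze(-1)
    # True if the last 24 input rows duplicate the decoder targets.
    return torch.allclose(input_tail, targets)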
# Isolate known and observed historical inputs.
if unknown_inputs is not None:
    historical_inputs = torch.cat([
        unknown_inputs[:, :self.num_encoder_steps, :],
        known_combined_layer[:, :self.num_encoder_steps, :],
        obs_inputs[:, :self.num_encoder_steps, :]
    ], dim=-1)
else:
    historical_inputs = torch.cat([
        known_combined_layer[:, :self.num_encoder_steps, :],
        obs_inputs[:, :self.num_encoder_steps, :]
    ], dim=-1)
Here, only the first num_encoder_steps = 168 steps of the 192-step window are used to build historical_inputs, so the encoder never sees the last 24 target values. The dataset hands the model the full 192-step window, and this slicing inside the model removes the overlap.
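A toy demonstration of that slicing with dummy tensors. Feature counts are arbitrary, the 192/168/24 sizes follow the thread, and the future_inputs line reflects my understanding that only the known inputs after num_encoder_steps feed the decoder (an assumption about the rest of the model code).

import torch

batch_size, total_steps, num_encoder_steps = 4, 192, 168

# Dummy stand-ins for the three input groups the model concatenates.
unknown_inputs = torch.randn(batch_size, total_steps, 1)
known_combined_layer = torch.randn(batch_size, total_steps, 2)
obs_inputs = torch.randn(batch_size, total_steps, 1)  # contains power_usage

# The encoder only sees the first 168 steps, so the 24 decoder targets
# inside obs_inputs are never used as history.
historical_inputs = torch.cat([
    unknown_inputs[:, :num_encoder_steps, :],
    known_combined_layer[:, :num_encoder_steps, :],
    obs_inputs[:, :num_encoder_steps, :],
], dim=-1)

# Decoder steps receive only inputs that are known in advance.
future_inputs = known_combined_layer[:, num_encoder_steps:, :]

print(historical_inputs.shape)  # torch.Size([4, 168, 4])
print(future_inputs.shape)      # torch.Size([4, 24, 2])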