Hey, I tried to generate a beatmap for the following song: https://songwhip.com/krankzinnig/little-story.
However, I was always getting very weird and off-beat timings, so I made the timings myself, because I saw the argument:
--timing_points_from TIMING_POINTS_FROM
beatmap file to take timing points from (optional)
However, it seems they cannot be used in the Google Colab.
Can you please add an (optional) option for uploading / inputting the timings file inside the Google Colab script?
I hope to see much better results then. It would be much appreciated <3
I am also unable to use the project locally for some reason.
I followed the setup instructions, but apparently it installed PyTorch for CPU instead of for GPU, so I currently can't use the custom timings at all: RuntimeError: Attempting to deserialize object on a CUDA device but torch.cuda.is_available() is False.
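For anyone hitting the same RuntimeError: it typically means a checkpoint saved on a GPU is being loaded on a CPU-only PyTorch build. A minimal sketch of the usual workaround, passing map_location to torch.load (the in-memory buffer here just stands in for the project's checkpoint file, whose path I don't know):

```python
import io
import torch

# Save a tensor to an in-memory buffer, then reload it with
# map_location="cpu" -- the same argument that lets a CUDA-saved
# checkpoint be deserialized when torch.cuda.is_available() is False.
buf = io.BytesIO()
torch.save(torch.ones(3), buf)
buf.seek(0)

t = torch.load(buf, map_location="cpu")
print(t.device)  # cpu
```

The other fix is to install the CUDA build of PyTorch (via the install selector on pytorch.org) instead of the default CPU-only pip wheel.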
Thanks in advance, and keep up the good stuff :)