Hi,
I am currently using DTLN-aec for real-time acoustic echo cancellation testing.
The model's performance is impressive, but it needs to be more lightweight for real-time processing, so I am trying to quantize the tflite file.
I want to convert float32 to float16 through dynamic range quantization.
However, during quantization, the tf.lite.TFLiteConverter.from_saved_model function takes a TensorFlow SavedModel (.pb) as its argument, so I need a .pb file.
Therefore, could you provide a .pb file or a quantized tflite file?
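For reference, this is roughly what I am trying (float16 post-training quantization); the SavedModel directory and output filename below are just placeholders:

```python
import tensorflow as tf

# Load the SavedModel (placeholder path; this is the part I am missing)
converter = tf.lite.TFLiteConverter.from_saved_model("./dtln_aec_saved_model")

# Post-training quantization with weights stored as float16
converter.optimizations = [tf.lite.Optimize.DEFAULT]
converter.target_spec.supported_types = [tf.float16]

tflite_model = converter.convert()

# Write the quantized model to disk
with open("dtln_aec_fp16.tflite", "wb") as f:
    f.write(tflite_model)
```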
If I succeed in quantization, I can share my code and results with you.
Thank you for reading. Have a nice day.