Conversation
Changes requested. Reviewed entire PR up to commit f4b5f76
Reviewed 467 lines of code across 2 files in 3 minute(s) and 13 second(s).
Details:
- Skipped files: 0 (please contact us to request support for these files)
- Confidence threshold: 85%
- Drafted 2 additional comments
- Workflow ID: wflow_SXferBdYytUUTJzJ
These comments were drafted by Ellipsis, but were filtered out of the final review. They're included here so you can see our internal thought process and help you configure your ellipsis.yaml.
Drafted 2 additional comments
Comment at applications/finetune-quora-embeddings/eval.py:2
The result import from unittest is not used in the code. Consider removing it to keep the code clean.
Comment at applications/finetune-quora-embeddings/eval.py:3
The embed import from IPython is not used in the code. Consider removing it to keep the code clean.
Something look wrong? You can customize Ellipsis by editing the ellipsis.yaml for this repository.
Generated with ❤️ by ellipsis.dev
@@ -1,26 +1,44 @@
from distutils.log import Log
The Log import from distutils.log is not used in the code. Consider removing it to keep the code clean.
@@ -1,4 +1,8 @@
import os
from IPython import embed
from regex import D
The D import from regex is not used in the code. Consider removing it to keep the code clean.
@@ -1,4 +1,8 @@
import os
from IPython import embed
The embed import from IPython is not used in the code. Consider removing it to keep the code clean.
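If the hunks above are taken at face value, the cleanup is just dropping the flagged lines. A minimal sketch of the resulting import block, assuming nothing else in the module references embed, D, or Log:

```python
# Top of the module after removing the imports flagged as unused.
# Sketch only: assumes no remaining code references `embed`, `D`, or `Log`.
import os
```

A linter rule such as pyflakes' F401 ("imported but unused"), available through flake8 or ruff, catches these automatically.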
train_file: str,
test_file: str,
model= None,
):x
There is a syntax error at the end of the function signature. Consider removing :x.
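For reference, a sketch of the corrected signature; the enclosing function's name is not visible in the hunk, so scorer below is only a guess based on the PR summary:

```python
def scorer(
    train_file: str,
    test_file: str,
    model=None,
):
    # "):x" in the diff leaves a stray "x" after the body colon, which raises a
    # SyntaxError; removing it (as the comment suggests) restores a valid def.
    ...
```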
# Initialize the model
model = LogisticRegression()
model = ()
The variable model is initialized but not used in the code. Consider removing it to keep the code clean.
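If the intent was to fall back to a default classifier only when the caller does not supply one, one option is to guard the initialization instead of overwriting model with an empty tuple. A sketch, assuming LogisticRegression here refers to scikit-learn's estimator:

```python
from sklearn.linear_model import LogisticRegression

# Initialize the model only when the caller did not pass one; this keeps the
# `model=None` default usable and avoids the dead `model = ()` assignment.
if model is None:
    model = LogisticRegression()
```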
Summary:
The PR refactors several functions in the finetune-quora-embeddings application to improve code efficiency and readability, and introduces new functions for handling sentence embeddings and model scoring.

Key points:
- Updated the yield_dataset_with_embeddings function in embed.py to yield a dictionary with the sentence embeddings and other related information.
- Added process_embeddings in embed.py to handle the creation of sentence embeddings.
- Updated the split_embed_train_test function in embed.py to use the new process_embeddings function.
- Updated the generate_embeddings function in embed.py to use a generator for saving the embeddings.
- Added scorer in eval.py to handle the scoring of models.
- Updated the main function in eval.py to use the new scorer function.

Generated with ❤️ by ellipsis.dev