
Text Summarization using BART

This project demonstrates text summarization using the BART (Bidirectional and Auto-Regressive Transformers) model. BART is a transformer model trained as a denoising autoencoder and is effective for text generation tasks such as summarization.
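For a quick illustration of what the model does, here is a minimal summarization sketch built on the Hugging Face transformers pipeline API; the facebook/bart-large-cnn checkpoint and the generation parameters are assumptions for the example, not necessarily what this repository's scripts use.

    from transformers import pipeline

    # Assumption: the off-the-shelf "facebook/bart-large-cnn" checkpoint; the
    # project's own scripts may load a different or fine-tuned BART model.
    summarizer = pipeline("summarization", model="facebook/bart-large-cnn")

    text = (
        "BART is a sequence-to-sequence transformer pretrained as a denoising "
        "autoencoder: the input text is corrupted with noise and the model "
        "learns to reconstruct the original, which makes it effective for "
        "generation tasks such as summarization."
    )

    # max_length and min_length bound the summary length in tokens.
    summary = summarizer(text, max_length=60, min_length=10, do_sample=False)
    print(summary[0]["summary_text"])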

Features

  • Text Preprocessing: Load and preprocess text data.
  • Summarization Model: A BART model for text summarization.
  • Training Script: Train the model with text data (a minimal fine-tuning sketch follows this list).
  • Summarization Script: Generate text summaries using the trained model.
  • Results Visualization: Save and view generated summaries.
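
The sketch below shows roughly how such a training script could fine-tune BART with the transformers Seq2SeqTrainer; the checkpoint name, toy dataset, output paths, and hyperparameters are illustrative assumptions rather than the repository's actual configuration.

    from datasets import Dataset
    from transformers import (
        BartForConditionalGeneration,
        BartTokenizer,
        DataCollatorForSeq2Seq,
        Seq2SeqTrainer,
        Seq2SeqTrainingArguments,
    )

    # Hypothetical toy data; the real training script would load its own
    # corpus of (text, summary) pairs.
    raw = Dataset.from_dict({
        "text": ["A long article about transformer models ..."],
        "summary": ["A short summary of the article."],
    })

    model_name = "facebook/bart-base"  # assumption: a small BART checkpoint
    tokenizer = BartTokenizer.from_pretrained(model_name)
    model = BartForConditionalGeneration.from_pretrained(model_name)

    def preprocess(batch):
        # Tokenize source texts and target summaries; BART uses the same
        # tokenizer for both encoder inputs and decoder labels.
        inputs = tokenizer(batch["text"], max_length=512, truncation=True)
        inputs["labels"] = tokenizer(batch["summary"], max_length=128,
                                     truncation=True)["input_ids"]
        return inputs

    tokenized = raw.map(preprocess, batched=True, remove_columns=raw.column_names)

    args = Seq2SeqTrainingArguments(
        output_dir="./results",            # hypothetical checkpoint directory
        num_train_epochs=1,
        per_device_train_batch_size=2,
        logging_steps=10,
    )

    trainer = Seq2SeqTrainer(
        model=model,
        args=args,
        train_dataset=tokenized,
        data_collator=DataCollatorForSeq2Seq(tokenizer, model=model),
    )
    trainer.train()

    # Save the fine-tuned model so a summarization script can reload it later.
    model.save_pretrained("./results/bart-summarizer")
    tokenizer.save_pretrained("./results/bart-summarizer")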

Setup

  1. Clone the repository:

    git clone https://github.com/SreeEswaran/Text-Summarization-using-BART.git
    cd Text-Summarization-using-BART

  2. Install the dependencies:

    pip install -r requirements.txt
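
As an optional sanity check (not a step from the repository's own instructions), you can confirm the dependencies were installed by loading a BART checkpoint once; this assumes requirements.txt pulls in transformers and a backend such as torch.

    # Optional check: downloads and loads a small BART checkpoint once.
    from transformers import BartForConditionalGeneration, BartTokenizer

    tokenizer = BartTokenizer.from_pretrained("facebook/bart-base")
    model = BartForConditionalGeneration.from_pretrained("facebook/bart-base")
    print("Loaded model type:", model.config.model_type)  # prints "bart"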