Learning to Skim Text: Transformer Replication
- Fine-tuned a Transformer-based NLP model that surpassed the previous state of the art on text-skimming tasks.
- Achieved significant improvements over the LSTM-Jump model proposed in the original paper (see the sketch after this list):
  - Increased accuracy by 5%
  - Doubled computational speed
- Authored and presented a detailed technical report to the professor and class, demonstrating strong communication and presentation skills.
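For context, below is a minimal sketch of the kind of fine-tuning and benchmarking loop such a replication involves, using the Hugging Face Transformers and PyTorch libraries from the stack listed under Technologies. The checkpoint name, toy data, and single training step are illustrative assumptions, not the project's actual code.

```python
# Illustrative sketch only: fine-tune a Transformer classifier, then measure
# the two quantities compared against LSTM-Jump (accuracy and inference time).
import time

import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

model_name = "distilbert-base-uncased"  # placeholder checkpoint, not the project's model
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSequenceClassification.from_pretrained(model_name, num_labels=2)

# Toy batch standing in for a text-skimming benchmark dataset.
texts = ["The movie was long but rewarding.", "Skip this one."]
labels = torch.tensor([1, 0])
batch = tokenizer(texts, padding=True, truncation=True, return_tensors="pt")

# One fine-tuning step: passing labels makes the model return the loss directly.
optimizer = torch.optim.AdamW(model.parameters(), lr=2e-5)
model.train()
loss = model(**batch, labels=labels).loss
loss.backward()
optimizer.step()

# Evaluate: classification accuracy and wall-clock inference time.
model.eval()
start = time.perf_counter()
with torch.no_grad():
    preds = model(**batch).logits.argmax(dim=-1)
elapsed = time.perf_counter() - start
accuracy = (preds == labels).float().mean().item()
print(f"accuracy={accuracy:.2f}, inference time={elapsed * 1000:.1f} ms")
```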
Technologies
- Developed the project using a comprehensive tech stack:
  - Python
  - Google Colab
  - PyTorch
  - Scikit-learn
  - NumPy
  - Pandas
  - Hugging Face Transformers
  - SQL
  - Git
  - TensorFlow