This project provides a web application for visualizing the attention weights produced by the DistilBERT model. Users can input text, select layers and heads, and view the attention weights in several formats.

Try the app on Streamlit Community Cloud: https://bertattentionviz.streamlit.app/
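Under the hood, attention weights like these can be extracted with the Hugging Face Transformers library. Below is a minimal, hypothetical sketch (not the app's actual code; the `distilbert-base-uncased` checkpoint, the example sentence, and the plotting details are assumptions) of pulling out one head's attention matrix and drawing it as a heatmap:

```python
import matplotlib.pyplot as plt
import torch
from transformers import DistilBertModel, DistilBertTokenizer

tokenizer = DistilBertTokenizer.from_pretrained("distilbert-base-uncased")
model = DistilBertModel.from_pretrained(
    "distilbert-base-uncased", output_attentions=True
)
model.eval()

inputs = tokenizer("Time flies like an arrow.", return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# outputs.attentions is a tuple of 6 tensors (one per layer), each of shape
# (batch, num_heads, seq_len, seq_len); DistilBERT-base has 6 layers x 12 heads.
layer, head = 0, 0  # arbitrary choice for illustration
attn = outputs.attentions[layer][0, head].numpy()

tokens = tokenizer.convert_ids_to_tokens(inputs["input_ids"][0])
fig, ax = plt.subplots(figsize=(6, 6))  # the app lets you adjust this size
ax.imshow(attn)
ax.set_xticks(range(len(tokens)))
ax.set_xticklabels(tokens, rotation=90)
ax.set_yticks(range(len(tokens)))
ax.set_yticklabels(tokens)
ax.set_title(f"Layer {layer}, head {head}")
fig.tight_layout()
plt.show()
```

The web app wraps this workflow in an interactive UI with the following features: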
- Visualize attention weights for each layer and head.
- Optionally exclude special tokens ([CLS] and [SEP]) from the visualization.
- Plot top predictions for masked tokens (see the sketch after this list).
- Customizable figure sizes for the visualizations.
- User-friendly Streamlit interface.
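For the masked-token feature, top predictions can be read off DistilBERT's masked-language-modeling head. A minimal sketch, again assuming the `distilbert-base-uncased` checkpoint (the app's actual code may differ):

```python
import torch
from transformers import DistilBertForMaskedLM, DistilBertTokenizer

tokenizer = DistilBertTokenizer.from_pretrained("distilbert-base-uncased")
model = DistilBertForMaskedLM.from_pretrained("distilbert-base-uncased")
model.eval()

inputs = tokenizer("The capital of France is [MASK].", return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits  # (batch, seq_len, vocab_size)

# Locate the [MASK] position and take the top-5 predicted tokens.
mask_pos = (inputs["input_ids"][0] == tokenizer.mask_token_id).nonzero(as_tuple=True)[0]
probs = logits[0, mask_pos[0]].softmax(dim=-1)
top = probs.topk(5)
for token_id, p in zip(top.indices, top.values):
    print(f"{tokenizer.decode([int(token_id)]):>10}  {p.item():.3f}")
```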
To run the application locally, follow these steps:

- Clone the repository: `git clone https://github.com/insdout/bert-attention-viz.git`
- Navigate to the project directory: `cd bert-attention-viz`
- Install the required dependencies: `pip install -r requirements.txt`
- Run the Streamlit app: `streamlit run streamlit_app.py`
- Open the app in your web browser at http://localhost:8501.
Once the app is running:

- Input a sentence in the text box.
- Select the layers and heads to visualize.
- Toggle whether special tokens are included.
- Adjust the figure sizes for better visualization.
- Click "Submit" to render the attention plots (a widget sketch follows this list).
Contributions are welcome! If you'd like to contribute to this project, please follow these steps:
- Fork the repository.
- Create a new branch (`git checkout -b feature/new-feature`).
- Make your changes.
- Commit your changes (`git commit -am 'Add new feature'`).
- Push to the branch (`git push origin feature/new-feature`).
- Create a new Pull Request.
This project is licensed under the MIT License - see the LICENSE file for details.
- This project was inspired by the paper "Analyzing Multi-Head Self-Attention: Specialized Heads Do the Heavy Lifting, the Rest Can Be Pruned" (ACL 2019) by Elena Voita, David Talbot, Fedor Moiseev, Rico Sennrich, and Ivan Titov.
- Special thanks to the Streamlit team for providing a powerful and easy-to-use platform for building web applications.