keon/seq2seq: Minimal Seq2Seq model with Attention for Neural Machine Translation in PyTorch
This repository was archived by the owner on Apr 25, 2023. It is now read-only.
Repository files (37 commits):
- .gitignore
- LICENSE
- README.md
- model.py
- train.py
- utils.py
mini seq2seq
Minimal Seq2Seq model with attention for neural machine translation in PyTorch.
This implementation focuses on the following features:
- Modular structure to be used in other projects
- Minimal code for readability
- Full utilization of batches and the GPU
This implementation relies on torchtext to keep the dataset management and preprocessing code to a minimum.
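For orientation, here is a rough sketch of what such a torchtext-based pipeline can look like. It uses the legacy Field/BucketIterator API and the Multi30k German-English dataset purely for illustration; the actual pipeline in utils.py may differ.

```python
# A sketch of a torchtext-based preprocessing pipeline (legacy Field/BucketIterator
# API, available as torchtext.legacy in torchtext >= 0.9). The Multi30k dataset
# and all field settings are illustrative assumptions, not taken from utils.py.
import spacy
import torch
from torchtext.data import Field, BucketIterator
from torchtext.datasets import Multi30k

spacy_de = spacy.load('de')   # the models installed by `python -m spacy download de` / `en`
spacy_en = spacy.load('en')

def tokenize_de(text):
    return [tok.text for tok in spacy_de.tokenizer(text)]

def tokenize_en(text):
    return [tok.text for tok in spacy_en.tokenizer(text)]

SRC = Field(tokenize=tokenize_de, init_token='<sos>', eos_token='<eos>', lower=True)
TRG = Field(tokenize=tokenize_en, init_token='<sos>', eos_token='<eos>', lower=True)

# Multi30k ships German-English sentence pairs; exts selects source/target files.
train_data, valid_data, test_data = Multi30k.splits(exts=('.de', '.en'),
                                                    fields=(SRC, TRG))
SRC.build_vocab(train_data, min_freq=2)
TRG.build_vocab(train_data, min_freq=2)

# BucketIterator groups sentences of similar length, so padded batches stay small
# and the GPU stays busy.
device = torch.device('cuda' if torch.cuda.is_available() else 'cpu')
train_iter, valid_iter, test_iter = BucketIterator.splits(
    (train_data, valid_data, test_data), batch_size=32, device=device)
```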
Model description
- Encoder: Bidirectional GRU
- Decoder: GRU with Attention Mechanism
- Attention: as described in Neural Machine Translation by Jointly Learning to Align and Translate (Bahdanau et al., 2015)
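A minimal sketch of this encoder-decoder with additive attention is shown below. Module names, dimensions, and the way the bidirectional outputs are combined are illustrative assumptions, not lifted from model.py.

```python
# Bidirectional GRU encoder + GRU decoder with additive (Bahdanau-style) attention.
# A sketch of the architecture above, not the repository's exact implementation.
import torch
import torch.nn as nn
import torch.nn.functional as F

class Encoder(nn.Module):
    def __init__(self, vocab_size, embed_dim, hidden_dim):
        super().__init__()
        self.embedding = nn.Embedding(vocab_size, embed_dim)
        self.gru = nn.GRU(embed_dim, hidden_dim, bidirectional=True)

    def forward(self, src):                        # src: (src_len, batch)
        embedded = self.embedding(src)             # (src_len, batch, embed_dim)
        outputs, hidden = self.gru(embedded)       # outputs: (src_len, batch, 2*hidden_dim)
        half = outputs.size(2) // 2
        # Sum forward and backward directions so the decoder sees hidden_dim features.
        outputs = outputs[:, :, :half] + outputs[:, :, half:]
        # Use the final state of the last direction as the initial decoder state.
        return outputs, hidden[-1].unsqueeze(0)

class Attention(nn.Module):
    def __init__(self, hidden_dim):
        super().__init__()
        self.attn = nn.Linear(2 * hidden_dim, hidden_dim)
        self.v = nn.Linear(hidden_dim, 1, bias=False)

    def forward(self, hidden, encoder_outputs):    # hidden: (1, batch, hidden_dim)
        src_len = encoder_outputs.size(0)
        hidden = hidden.repeat(src_len, 1, 1)      # broadcast decoder state over source positions
        energy = self.v(torch.tanh(self.attn(torch.cat((hidden, encoder_outputs), dim=2))))
        return F.softmax(energy.squeeze(2), dim=0) # (src_len, batch) attention weights

class Decoder(nn.Module):
    def __init__(self, vocab_size, embed_dim, hidden_dim):
        super().__init__()
        self.embedding = nn.Embedding(vocab_size, embed_dim)
        self.attention = Attention(hidden_dim)
        self.gru = nn.GRU(embed_dim + hidden_dim, hidden_dim)
        self.out = nn.Linear(hidden_dim, vocab_size)

    def forward(self, token, hidden, encoder_outputs):    # token: (batch,)
        embedded = self.embedding(token).unsqueeze(0)      # (1, batch, embed_dim)
        weights = self.attention(hidden, encoder_outputs)  # (src_len, batch)
        # Weighted sum of encoder outputs = context vector for this decoding step.
        context = (weights.unsqueeze(2) * encoder_outputs).sum(0, keepdim=True)
        output, hidden = self.gru(torch.cat((embedded, context), dim=2), hidden)
        return self.out(output.squeeze(0)), hidden         # logits: (batch, vocab_size)
```

At each decoding step the decoder consumes the previous target token plus a context vector computed by attending over all encoder outputs, which is the mechanism from the Bahdanau et al. paper referenced above.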
Requirements
- GPU & CUDA
- Python 3
- PyTorch
- torchtext
- spaCy
- numpy
- Visdom (optional)
Download the spaCy tokenizer models with:
python -m spacy download de
python -m spacy download en
References
Based on the following implementations