AI2 OLMo 7B does not support Flash Attention 2.0. ValueError: OLMoForCausalLM does not support Flash Attention 2.0 yet.
Description
Model description
Model Name: allenai/OLMo-7B
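For context, a minimal sketch of how the error in the title is typically triggered (assumed usage; the exact loading arguments and transformers version may differ from the reporter's setup):

```python
# Hypothetical reproduction sketch: requesting the Flash Attention 2 backend
# when loading OLMo-7B. The specific arguments below are assumptions, not
# taken from the original report.
import torch
from transformers import AutoModelForCausalLM

model = AutoModelForCausalLM.from_pretrained(
    "allenai/OLMo-7B",
    torch_dtype=torch.bfloat16,
    attn_implementation="flash_attention_2",
    trust_remote_code=True,
)
# Expected failure on affected transformers versions:
# ValueError: OLMoForCausalLM does not support Flash Attention 2.0 yet
```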
Open source status
- The model implementation is available
- The model weights are available
Provide useful links for the implementation
No response