1 parent eb8ef42 commit 4232b3f
requirements/molmo.txt
@@ -0,0 +1,20 @@
+# Core vLLM-compatible dependencies with Molmo accuracy setup (tested on L40)
+torch==2.5.1
+torchvision==0.20.1
+transformers==4.48.1
+tokenizers==0.21.0
+tiktoken==0.7.0
+vllm==0.7.0
+
+# Optional but recommended for improved performance and stability
+triton==3.1.0
+xformers==0.0.28.post3
+uvloop==0.21.0
+protobuf==5.29.3
+openai==1.60.2
+opencv-python-headless==4.11.0.86
+pillow==10.4.0
+
+# FlashAttention (installed for float16 only)
+flash-attn>=2.5.6  # not used with float32, but documented here for completeness
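As a quick sanity check of these pins, a minimal offline-inference sketch with vLLM is shown below. The Molmo checkpoint name, the prompt formatting, and the `max_model_len` value are assumptions not taken from this commit; `dtype="float16"` is chosen so the optional flash-attn pin is actually exercised (per the comment above, FlashAttention is skipped under float32).

```python
# Sketch only: exercising the pinned stack above for offline Molmo inference.
# The checkpoint name and prompt formatting are assumptions, not part of this
# commit; float16 is used so the optional flash-attn dependency applies.
from PIL import Image
from vllm import LLM, SamplingParams

llm = LLM(
    model="allenai/Molmo-7B-D-0924",  # assumed Molmo checkpoint
    trust_remote_code=True,
    dtype="float16",
    max_model_len=4096,               # assumed; adjust to the GPU's memory budget
)

image = Image.open("example.jpg").convert("RGB")
outputs = llm.generate(
    {
        "prompt": "Describe this image.",  # Molmo's chat template may need to be applied
        "multi_modal_data": {"image": image},
    },
    SamplingParams(temperature=0.0, max_tokens=128),
)
print(outputs[0].outputs[0].text)
```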