The following model configurations has been modified according to `config.json` or kwargs:
{'num_layers', 'num_attention_heads', 'hidden_size', 'vocab_size'}
W20230627 17:50:38.624037 18121 cuda_stream.cpp:48] Runtime version 11.2 of cuBLAS incompatible with compiletime version 11.10.
W20230627 17:50:39.510000 18121 cuda_stream.cpp:48] Runtime version 8.4 of cuDNN incompatible with compiletime version 8.5.
Traceback (most recent call last):
File "/home/ubuntu/newspace/deploy/oneflow/libai/demo_glm.py", line 124, in<module>
outputs = model.generate(
File "/home/ubuntu/anaconda3/envs/libai/lib/python3.9/site-packages/oneflow/autograd/autograd_mode.py", line 154, in wrapper
return func(*args, **kwargs)
File "/home/ubuntu/newspace/deploy/oneflow/libai/libai/inference/generator/generation_utils.py", line 1002, in generate
return self.greedy_search(
File "/home/ubuntu/newspace/deploy/oneflow/libai/libai/inference/generator/generation_utils.py", line 490, in greedy_search
outputs = self(**model_inputs)
File "/home/ubuntu/anaconda3/envs/libai/lib/python3.9/site-packages/oneflow/nn/modules/module.py", line 163, in __call__
res = self.forward(*args, **kwargs)
File "/home/ubuntu/newspace/deploy/oneflow/libai/projects/GLM/modeling_glm.py", line 397, in forward
lm_logits, mems = self.glm(
File "/home/ubuntu/anaconda3/envs/libai/lib/python3.9/site-packages/oneflow/nn/modules/module.py", line 163, in __call__
res = self.forward(*args, **kwargs)
File "/home/ubuntu/newspace/deploy/oneflow/libai/projects/GLM/modeling_glm.py", line 204, in forward
logits, mem_layers = self.transformer(
File "/home/ubuntu/anaconda3/envs/libai/lib/python3.9/site-packages/oneflow/nn/modules/module.py", line 163, in __call__
res = self.forward(*args, **kwargs)
File "/home/ubuntu/newspace/deploy/oneflow/libai/projects/GLM/modeling_glm.py", line 75, in forward
hidden_states = layer(hidden_states, attention_mask, mem=mem_i)
File "/home/ubuntu/anaconda3/envs/libai/lib/python3.9/site-packages/oneflow/nn/modules/module.py", line 163, in __call__
res = self.forward(*args, **kwargs)
File "/home/ubuntu/newspace/deploy/oneflow/libai/projects/GLM/layers/transformer_layer.py", line 103, in forward
attention_output = self.attention(
File "/home/ubuntu/anaconda3/envs/libai/lib/python3.9/site-packages/oneflow/nn/modules/module.py", line 163, in __call__
res = self.forward(*args, **kwargs)
File "/home/ubuntu/newspace/deploy/oneflow/libai/projects/GLM/layers/attention_layer.py", line 111, in forward
context = flow._C.fused_multi_head_attention_inference_v2(
AttributeError: module 'oneflow._C' has no attribute 'fused_multi_head_attention_inference_v2'
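The traceback ends at the fused attention call in `projects/GLM/layers/attention_layer.py`: the installed OneFlow wheel does not expose `flow._C.fused_multi_head_attention_inference_v2`. A possible stop-gap, while the version mismatch is sorted out, is to guard that call and fall back to plain scaled dot-product attention. The sketch below is only illustrative: the names `HAS_FUSED_MHA` and `sdpa_fallback`, the tensor names, the `[batch, num_heads, seq_len, head_dim]` layout, and the additive-mask convention are assumptions, not the project's actual code.

```python
import math
import oneflow as flow

# True only when the installed wheel ships the fused kernel.
HAS_FUSED_MHA = hasattr(flow._C, "fused_multi_head_attention_inference_v2")

def sdpa_fallback(query, key, value, attention_mask=None):
    """Plain scaled dot-product attention, usable when the fused op is missing.

    Assumed layout: [batch, num_heads, seq_len, head_dim]; the mask, if given,
    is assumed to be additive (0 to keep, large negative to mask).
    """
    scores = flow.matmul(query, key.transpose(-2, -1)) / math.sqrt(query.shape[-1])
    if attention_mask is not None:
        scores = scores + attention_mask
    probs = flow.softmax(scores, dim=-1)
    return flow.matmul(probs, value)
```

With a guard like this, the existing fused call in attention_layer.py would only be taken when `HAS_FUSED_MHA` is true.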
Summary
Running the GLM example in the projects directory fails with the error shown above.
Code to reproduce bug
python -m oneflow.distributed.launch --nproc_per_node 1 demo_glm.py
The command fails with the error above.
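Independently of the GLM demo, the missing symbol can be probed directly (a minimal check; `flow._C` is the internal op namespace referenced in the traceback):

```python
import oneflow as flow

# On this install, both lines point at the same problem: the 0.9.0 wheel
# reported below does not expose the fused attention kernel.
print(flow.__version__)
print(hasattr(flow._C, "fused_multi_head_attention_inference_v2"))  # False here
```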
System Information
python3 -m oneflow --doctor: 0.9.0