Reproduction Steps

```
mistral-chat $LLM_MODEL --instruct --max_tokens 256
```

where `$LLM_MODEL` points to the mamba-codestral-7B-v0.1 model folder.
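For reference, a hedged example of the invocation; the model path here is an assumption:

```
# assumed location of the downloaded model folder
export LLM_MODEL=$HOME/models/mamba-codestral-7B-v0.1
mistral-chat $LLM_MODEL --instruct --max_tokens 256
```

Running the command fails with: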
```
Traceback (most recent call last):
  File "/usr/local/bin/mistral-chat", line 8, in <module>
    sys.exit(mistral_chat())
  File "/usr/local/lib/python3.10/dist-packages/mistral_inference/main.py", line 203, in mistral_chat
    fire.Fire(interactive)
  File "/usr/local/lib/python3.10/dist-packages/fire/core.py", line 143, in Fire
    component_trace = _Fire(component, args, parsed_flag_args, context, name)
  File "/usr/local/lib/python3.10/dist-packages/fire/core.py", line 477, in _Fire
    component, remaining_args = _CallAndUpdateTrace(
  File "/usr/local/lib/python3.10/dist-packages/fire/core.py", line 693, in _CallAndUpdateTrace
    component = fn(*varargs, **kwargs)
  File "/usr/local/lib/python3.10/dist-packages/mistral_inference/main.py", line 117, in interactive
    generated_tokens, _ = generate_fn(  # type: ignore[operator]
  File "/usr/local/lib/python3.10/dist-packages/torch/utils/_contextlib.py", line 115, in decorate_context
    return func(*args, **kwargs)
  File "/usr/local/lib/python3.10/dist-packages/mistral_inference/generate.py", line 21, in generate_mamba
    output = model.model.generate(
  File "/usr/local/setup/mamba/mamba_ssm/utils/generation.py", line 260, in generate
    output = decode(
  File "/usr/local/lib/python3.10/dist-packages/torch/utils/_contextlib.py", line 115, in decorate_context
    return func(*args, **kwargs)
  File "/usr/local/setup/mamba/mamba_ssm/utils/generation.py", line 221, in decode
    scores.append(get_logits(sequences[-1], inference_params))
  File "/usr/local/setup/mamba/mamba_ssm/utils/generation.py", line 184, in get_logits
    logits = model(
  File "/usr/local/lib/python3.10/dist-packages/torch/nn/modules/module.py", line 1532, in _wrapped_call_impl
    return self._call_impl(*args, **kwargs)
  File "/usr/local/lib/python3.10/dist-packages/torch/nn/modules/module.py", line 1541, in _call_impl
    return forward_call(*args, **kwargs)
  File "/usr/local/setup/mamba/mamba_ssm/models/mixer_seq_simple.py", line 279, in forward
    hidden_states = self.backbone(input_ids, inference_params=inference_params, **mixer_kwargs)
  File "/usr/local/lib/python3.10/dist-packages/torch/nn/modules/module.py", line 1532, in _wrapped_call_impl
    return self._call_impl(*args, **kwargs)
  File "/usr/local/lib/python3.10/dist-packages/torch/nn/modules/module.py", line 1541, in _call_impl
    return forward_call(*args, **kwargs)
  File "/usr/local/setup/mamba/mamba_ssm/models/mixer_seq_simple.py", line 194, in forward
    hidden_states, residual = layer(
  File "/usr/local/lib/python3.10/dist-packages/torch/nn/modules/module.py", line 1532, in _wrapped_call_impl
    return self._call_impl(*args, **kwargs)
  File "/usr/local/lib/python3.10/dist-packages/torch/nn/modules/module.py", line 1541, in _call_impl
    return forward_call(*args, **kwargs)
  File "/usr/local/setup/mamba/mamba_ssm/modules/block.py", line 67, in forward
    hidden_states = self.mixer(hidden_states, inference_params=inference_params, **mixer_kwargs)
  File "/usr/local/lib/python3.10/dist-packages/torch/nn/modules/module.py", line 1532, in _wrapped_call_impl
    return self._call_impl(*args, **kwargs)
  File "/usr/local/lib/python3.10/dist-packages/torch/nn/modules/module.py", line 1541, in _call_impl
    return forward_call(*args, **kwargs)
  File "/usr/local/setup/mamba/mamba_ssm/modules/mamba2.py", line 233, in forward
    self.conv1d(xBC.transpose(1, 2)).transpose(1, 2)[:, -(self.dconv - 1):]
  File "/usr/local/lib/python3.10/dist-packages/torch/nn/modules/module.py", line 1709, in __getattr__
    raise AttributeError(f"'{type(self).__name__}' object has no attribute '{name}'")
AttributeError: 'Mamba2' object has no attribute 'dconv'. Did you mean: 'd_conv'?
```
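The final frame points at mamba_ssm/modules/mamba2.py line 233, and the interpreter's hint names the likely culprit: the code reads `self.dconv`, while the attribute that `Mamba2.__init__` actually defines is `d_conv`. A minimal sketch of a local workaround under that assumption (upstream may ship a proper fix):

```python
# mamba_ssm/modules/mamba2.py, around line 233 (mamba-ssm 2.2.2)
# before -- raises AttributeError because the attribute is named `d_conv`:
#   self.conv1d(xBC.transpose(1, 2)).transpose(1, 2)[:, -(self.dconv - 1):]
# after -- reference the attribute under its actual name:
self.conv1d(xBC.transpose(1, 2)).transpose(1, 2)[:, -(self.d_conv - 1):]
```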
Expected Behavior

Chat output.
Additional Context

I installed mistral-inference and causal-conv1d from pip; mamba-ssm (2.2.2) was built from the GitHub source because the pip package raised an undefined-symbol error.
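A sketch of the install sequence described above, assuming the state-spaces/mamba repository and its v2.2.2 tag (the /usr/local/setup/mamba checkout path matches the traceback):

```
pip install mistral-inference causal-conv1d      # both from PyPI
git clone https://github.com/state-spaces/mamba.git /usr/local/setup/mamba
cd /usr/local/setup/mamba && git checkout v2.2.2
pip install -e .   # editable install, so imports resolve to this checkout as in the traceback
```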
Suggested Solutions
No response