BF16 mmproj crashes

#1
by Volnovik - opened

I built llama-server today. The BF16 mmproj crashes, while the F16 one works fine.

srv process_chun: processing image...
encoding image slice...
llama.cpp/llama.cpp/ggml/src/ggml-cuda/im2col.cu:84: GGML_ASSERT(dst->type == GGML_TYPE_F16 || dst->type == GGML_TYPE_F32) failed
llama.cpp/llama.cpp/llama-server(+0xdb1aab)[0x60e8d5233aab]
llama.cpp/llama.cpp/llama-server(+0xdb206c)[0x60e8d523406c]
llama.cpp/llama.cpp/llama-server(+0xdb2247)[0x60e8d5234247]
llama.cpp/llama.cpp/llama-server(+0x6b2393)[0x60e8d4b34393]
llama.cpp/llama.cpp/llama-server(+0x6a46ce)[0x60e8d4b266ce]
llama.cpp/llama.cpp/llama-server(+0x6a80d3)[0x60e8d4b2a0d3]
llama.cpp/llama.cpp/llama-server(+0x6aa67a)[0x60e8d4b2c67a]
llama.cpp/llama.cpp/llama-server(+0xdd17a3)[0x60e8d52537a3]
llama.cpp/llama.cpp/llama-server(+0xdd252d)[0x60e8d525452d]
llama.cpp/llama.cpp/llama-server(+0x4315da)[0x60e8d48b35da]
llama.cpp/llama.cpp/llama-server(+0x3bedf3)[0x60e8d4840df3]
llama.cpp/llama.cpp/llama-server(+0x42ac49)[0x60e8d48acc49]
llama.cpp/llama.cpp/llama-server(+0x1850f3)[0x60e8d46070f3]
llama.cpp/llama.cpp/llama-server(+0x1cd364)[0x60e8d464f364]
llama.cpp/llama.cpp/llama-server(+0x174c98)[0x60e8d45f6c98]
llama.cpp/llama.cpp/llama-server(+0xb0375)[0x60e8d4532375]
/lib/x86_64-linux-gnu/libc.so.6(+0x2a1ca)[0x7af30562a1ca]
/lib/x86_64-linux-gnu/libc.so.6(__libc_start_main+0x8b)[0x7af30562a28b]
llama.cpp/llama.cpp/llama-server(+0x100db5)[0x60e8d4582db5]
Aborted (core dumped)
