[ot][spam][crazy] coping spam while trying to make code to prompt a model
Undescribed Horrific Abuse, One Victim & Survivor of Many
gmkarl at gmail.com
Sat Jan 7 00:29:21 PST 2023
actually the error is different:
RuntimeError: CUDA error: CUBLAS_STATUS_NOT_SUPPORTED when calling
`cublasGemmStridedBatchedExFix(handle, opa, opb, (int)m,
(int)n, (int)k, (void*)&falpha, a, CUDA_R_16BF, (int)lda, stridea, b,
CUDA_R_16BF, (int)ldb, strideb, (void*)&fbeta, c,
CUDA_R_16BF, (int)ldc, stridec, (int)num_batches, CUDA_R_32F,
CUBLAS_GEMM_DEFAULT_TENSOR_OP)`
it looks like it's telling me cuBLAS doesn't support bfloat16 for this
strided-batched GEMM, when I can do bfloat16 inference in the REPL O_o
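For context, a minimal sketch of the call path that hits this error, assuming PyTorch: a batched matmul (`torch.bmm`) on CUDA bfloat16 tensors dispatches to `cublasGemmStridedBatchedEx` with `CUDA_R_16BF`, which is the call in the traceback, and on GPUs or CUDA/cuBLAS versions without bf16 strided-batched GEMM support it raises `CUBLAS_STATUS_NOT_SUPPORTED`. The helper name and the float32 fallback here are illustrative, not from the original post.

```python
import torch

def batched_matmul_bf16(a, b):
    # On CUDA, bf16 torch.bmm goes through cublasGemmStridedBatchedEx
    # (the call in the traceback); eager single matmuls in a REPL can
    # take a different kernel path, which is one way bf16 can work
    # interactively yet fail inside a batched model forward pass.
    try:
        return torch.bmm(a, b)
    except RuntimeError:
        # Illustrative workaround: do the GEMM in float32 and cast
        # back (assumption: acceptable accuracy for inference).
        return torch.bmm(a.float(), b.float()).to(a.dtype)

a = torch.randn(4, 8, 16, dtype=torch.bfloat16)
b = torch.randn(4, 16, 32, dtype=torch.bfloat16)
out = batched_matmul_bf16(a, b)
print(out.shape, out.dtype)
```

Running the same `torch.bmm` on both a CUDA tensor and its CPU copy is a quick way to check whether the failure is specific to the cuBLAS path rather than to bfloat16 itself.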
More information about the cypherpunks mailing list