Deploy app with llama-cpp-python dependency on Vercel


I can't deploy my app that requires llama-cpp-python to Vercel (sorry if this is a newbie question):

    (venv) bacelar@bnr:~/www/2023/python/<app>$ vercel --force
    Vercel CLI 30.2.3
    Inspect: https://vercel.com/<account> [1s]
    Error: Command failed: pip3.9 install --disable-pip-version-check --target . --upgrade -r /vercel/path1/requirements.txt
      error: subprocess-exited-with-error
      
      × Building wheel for llama-cpp-python (pyproject.toml) did not run successfully.
      │ exit code: 1
      ╰─> [118 lines of output]
          
          
          --------------------------------------------------------------------------------
          -- Trying 'Ninja' generator
          --------------------------------
          ---------------------------
          ----------------------
          -----------------
          ------------
          -------
          --
          Not searching for unused variables given on the command line.
          -- The C compiler identification is GNU 7.3.1
          -- Detecting C compiler ABI info
          -- Detecting C compiler ABI info - done
          -- Check for working C compiler: /usr/bin/cc - skipped
          -- Detecting C compile features
          -- Detecting C compile features - done
          -- The CXX compiler identification is GNU 7.3.1
          -- Detecting CXX compiler ABI

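The requirements.txt that pip is installing contains llama-cpp-python (any other entries are omitted here, and no version pin is shown in the log above):

    llama-cpp-python
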
Local setup:

python: 3.9.17

    nvcc: NVIDIA (R) Cuda compiler driver
    Copyright (c) 2005-2023 NVIDIA Corporation
    Built on Tue_Jun_13_19:16:58_PDT_2023
    Cuda compilation tools, release 12.2, V12.2.91
    Build cuda_12.2.r12.2/compiler.32965470_0
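
Locally, the usual way to build the package against CUDA is via the CMAKE_ARGS documented by llama-cpp-python (shown only to illustrate the local toolchain; these flags are not taken from the failing Vercel build):

    # local CUDA-enabled (cuBLAS) build of llama-cpp-python, per the package docs
    CMAKE_ARGS="-DLLAMA_CUBLAS=on" FORCE_CMAKE=1 pip install llama-cpp-python

On Vercel, by contrast, the log above shows the same wheel being compiled from source with the build image's GCC 7.3.1 toolchain.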
