Using llama_index but avoiding the tiktoken API call


I want to use llama_index, but when I import the package I get the following error:

ConnectionError: ('Connection aborted.', ConnectionResetError(10054, 'An existing connection was forcibly closed by the remote host', None, 10054, None))

This happens because llama_index makes a network request via OpenAI's tiktoken package. You can see this at line 52 of llama_index/utils.py: https://github.com/run-llama/llama_index/blob/main/llama_index/utils.py#L52

That code ultimately downloads the encoding file from this endpoint (https://openaipublic.blob.core.windows.net/gpt-2/encodings/main/vocab.bpe), as defined in tiktoken here: https://github.com/openai/tiktoken/blob/main/tiktoken_ext/openai_public.py#L11
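
For reference, the download can be reproduced with tiktoken alone, without llama_index (a minimal sketch, assuming tiktoken is installed):

```python
import tiktoken

# llama_index requests the "gpt2" encoding; loading it is what triggers
# the download of vocab.bpe and encoder.json from the blob endpoint above.
enc = tiktoken.get_encoding("gpt2")
print(enc.encode("hello world"))
```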

Is there a way to avoid this API call and instead load the file locally? I can't access the internet in the environment I'm working in, so I can't make any external calls.
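
For what it's worth, one workaround that has worked with recent tiktoken versions is to pre-populate tiktoken's on-disk cache on a machine that does have internet access, then point the TIKTOKEN_CACHE_DIR environment variable at that directory in the offline environment. This is only a sketch; it assumes tiktoken's cache layout (files named after the SHA-1 hex digest of each blob URL, per tiktoken/load.py) hasn't changed in your version:

```python
# Run this on a machine WITH internet access.
import hashlib
import os

import requests

CACHE_DIR = "tiktoken_cache"  # any directory you can copy to the offline machine
BLOB_URLS = [
    # Files used by the "gpt2" encoding that llama_index requests.
    "https://openaipublic.blob.core.windows.net/gpt-2/encodings/main/vocab.bpe",
    "https://openaipublic.blob.core.windows.net/gpt-2/encodings/main/encoder.json",
]

os.makedirs(CACHE_DIR, exist_ok=True)
for url in BLOB_URLS:
    # tiktoken names each cached file after the SHA-1 hex digest of its URL.
    cache_key = hashlib.sha1(url.encode()).hexdigest()
    with open(os.path.join(CACHE_DIR, cache_key), "wb") as f:
        f.write(requests.get(url).content)
```

Then, after copying that directory to the offline environment, set the variable before anything imports tiktoken:

```python
import os

# Must be set before llama_index (and therefore tiktoken) is imported,
# so the encoding lookup hits the local cache instead of the network.
os.environ["TIKTOKEN_CACHE_DIR"] = "tiktoken_cache"

import llama_index  # noqa: E402  -- imported after setting the env var
```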

