
[Question]: Checkout your internet connection or see how to run the library in offline mode at 'https://huggingface.co/docs/transformers/installation#offline-mode'. #206

Open
LiuJiaoShouYa opened this issue Dec 19, 2024 · 0 comments
Labels
question Further information is requested

Describe the issue

When I run the example, instantiating PromptCompressor raises a connection exception because it tries to download the model from Hugging Face, and I have not found any configuration option for loading the model locally or running offline. Can you help me?

from llmlingua import PromptCompressor

# Instantiation downloads the default model from Hugging Face.
llm_lingua = PromptCompressor()

# demonstration_str, instruction, and question are assumed to be defined earlier.
compressed_prompt = llm_lingua.compress_prompt(
    demonstration_str.split("\n"),
    instruction=instruction,
    question=question,
    target_token=500,
    condition_compare=True,
    condition_in_question="after",
    rank_method="longllmlingua",
    use_sentence_level_filter=False,
    context_budget="+100",
    dynamic_context_compression_ratio=0.4,  # enable dynamic_context_compression_ratio
    reorder_context="sort",
)
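
Based on the traceback, PromptCompressor() forwards a model_name to load_model and by default tries to fetch NousResearch/Llama-2-7b-hf from huggingface.co. Is passing a locally downloaded copy the intended way to avoid the network call? A rough sketch of what I mean (the parameter name comes from the traceback; the local path is just a placeholder):

from llmlingua import PromptCompressor

# Hypothetical local directory containing config.json, the tokenizer files,
# and the weights of NousResearch/Llama-2-7b-hf, downloaded on a machine
# that does have access to huggingface.co.
local_model_dir = "C:/models/Llama-2-7b-hf"

# model_name is the argument forwarded to load_model() in the traceback;
# pointing it at a local path should skip the call to huggingface.co.
llm_lingua = PromptCompressor(model_name=local_model_dir)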

error:
Traceback (most recent call last):
File "C:\Virtualenvs\fiv-coder\lib\site-packages\huggingface_hub\file_download.py", line 1374, in _get_metadata_or_catch_error
metadata = get_hf_file_metadata(
File "C:\Virtualenvs\fiv-coder\lib\site-packages\huggingface_hub\utils_validators.py", line 114, in _inner_fn
return fn(*args, **kwargs)
File "C:\Virtualenvs\fiv-coder\lib\site-packages\huggingface_hub\file_download.py", line 1294, in get_hf_file_metadata
r = _request_wrapper(
File "C:\Virtualenvs\fiv-coder\lib\site-packages\huggingface_hub\file_download.py", line 278, in _request_wrapper
response = _request_wrapper(
File "C:\Virtualenvs\fiv-coder\lib\site-packages\huggingface_hub\file_download.py", line 301, in _request_wrapper
response = get_session().request(method=method, url=url, **params)
File "C:\Virtualenvs\fiv-coder\lib\site-packages\requests\sessions.py", line 589, in request
resp = self.send(prep, **send_kwargs)
File "C:\Virtualenvs\fiv-coder\lib\site-packages\requests\sessions.py", line 703, in send
r = adapter.send(request, **kwargs)
File "C:\Virtualenvs\fiv-coder\lib\site-packages\huggingface_hub\utils_http.py", line 93, in send
return super().send(request, *args, **kwargs)
File "C:\Virtualenvs\fiv-coder\lib\site-packages\requests\adapters.py", line 507, in send
raise ConnectTimeout(e, request=request)
requests.exceptions.ConnectTimeout: (MaxRetryError("HTTPSConnectionPool(host='huggingface.co', port=443): Max retries exceeded with url: /NousResearch/Llama-2-7b-hf/resolve/main/config.json (Caused by ConnectTimeoutError(<urllib3.connection.HTTPSConnection object at 0x0000021BDDB63490>, 'Connection to huggingface.co timed out. (connect timeout=10)'))"), '(Request ID: e50c0aa5-041c-4a91-b1d1-ef4bd36e7bf0)')

The above exception was the direct cause of the following exception:
Traceback (most recent call last):
File "C:\Virtualenvs\fiv-coder\lib\site-packages\transformers\utils\hub.py", line 403, in cached_file
resolved_file = hf_hub_download(
File "C:\Virtualenvs\fiv-coder\lib\site-packages\huggingface_hub\utils_validators.py", line 114, in _inner_fn
return fn(*args, **kwargs)
File "C:\Virtualenvs\fiv-coder\lib\site-packages\huggingface_hub\file_download.py", line 860, in hf_hub_download
return _hf_hub_download_to_cache_dir(
File "C:\Virtualenvs\fiv-coder\lib\site-packages\huggingface_hub\file_download.py", line 967, in _hf_hub_download_to_cache_dir
_raise_on_head_call_error(head_call_error, force_download, local_files_only)
File "C:\Virtualenvs\fiv-coder\lib\site-packages\huggingface_hub\file_download.py", line 1485, in _raise_on_head_call_error
raise LocalEntryNotFoundError(
huggingface_hub.errors.LocalEntryNotFoundError: An error happened while trying to locate the file on the Hub and we cannot find the requested files in the local cache. Please check your connection and try again or make sure your Internet connection is on.

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
File "C:\pyfile\test_script\llmlingua_test\test2.py", line 3, in
llm_lingua = PromptCompressor()
File "C:\Virtualenvs\fiv-coder\lib\site-packages\llmlingua\prompt_compressor.py", line 89, in init
self.load_model(model_name, device_map, model_config)
File "C:\Virtualenvs\fiv-coder\lib\site-packages\llmlingua\prompt_compressor.py", line 122, in load_model
config = AutoConfig.from_pretrained(model_name, **model_config)
File "C:\Virtualenvs\fiv-coder\lib\site-packages\transformers\models\auto\configuration_auto.py", line 1021, in from_pretrained
config_dict, unused_kwargs = PretrainedConfig.get_config_dict(pretrained_model_name_or_path, **kwargs)
File "C:\Virtualenvs\fiv-coder\lib\site-packages\transformers\configuration_utils.py", line 590, in get_config_dict
config_dict, kwargs = cls._get_config_dict(pretrained_model_name_or_path, **kwargs)
File "C:\Virtualenvs\fiv-coder\lib\site-packages\transformers\configuration_utils.py", line 649, in _get_config_dict
resolved_config_file = cached_file(
File "C:\Virtualenvs\fiv-coder\lib\site-packages\transformers\utils\hub.py", line 446, in cached_file
raise EnvironmentError(
OSError: We couldn't connect to 'https://huggingface.co' to load this file, couldn't find it in the cached files and it looks like NousResearch/Llama-2-7b-hf is not the path to a directory containing a file named config.json.
Checkout your internet connection or see how to run the library in offline mode at 'https://huggingface.co/docs/transformers/installation#offline-mode'.

Process finished with exit code 1
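
If downloading the model in advance is not possible, is the offline mode from the linked docs the right approach? My understanding (untested here, since this machine cannot reach huggingface.co) is that the environment variables have to be set before anything from transformers or huggingface_hub is imported, and that the model must already be in the local cache:

import os

# Per https://huggingface.co/docs/transformers/installation#offline-mode,
# these variables tell transformers / huggingface_hub not to contact
# huggingface.co and to rely on the local cache only. They must be set
# before the libraries are imported.
os.environ["HF_HUB_OFFLINE"] = "1"
os.environ["TRANSFORMERS_OFFLINE"] = "1"

from llmlingua import PromptCompressor

# This only works if NousResearch/Llama-2-7b-hf is already present in the
# local Hugging Face cache; otherwise it fails with LocalEntryNotFoundError.
llm_lingua = PromptCompressor()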
