Suppressed UserWarning: TORCH_CUDA_ARCH_LIST (#3160)
### Changes

The TORCH_CUDA_ARCH_LIST `UserWarning` emitted while the CUDA extensions are being loaded is now suppressed with `warnings.catch_warnings()` and a targeted `warnings.filterwarnings("ignore", ...)` filter around the extension loading call.

Please let me know if any changes are required.

### Related tickets

Solves issue #3141
devesh-2002 authored Jan 7, 2025
1 parent 2030ec1 commit d90d285
nncf/torch/extensions/__init__.py (6 additions, 3 deletions)
@@ -12,6 +12,7 @@
 import enum
 import os
 import textwrap
+import warnings
 from abc import ABC
 from abc import abstractmethod
 from multiprocessing.context import TimeoutError as MPTimeoutError
@@ -98,9 +99,11 @@ def get(self, fn_name: str) -> Callable:

         with extension_is_loading_info_log(self._loader.name()):
             try:
-                pool = ThreadPool(processes=1)
-                async_result = pool.apply_async(self._loader.load)
-                self._loaded_namespace = async_result.get(timeout=timeout)
+                with warnings.catch_warnings():
+                    warnings.filterwarnings("ignore", message="TORCH_CUDA_ARCH_LIST is not set")
+                    pool = ThreadPool(processes=1)
+                    async_result = pool.apply_async(self._loader.load)
+                    self._loaded_namespace = async_result.get(timeout=timeout)
             except MPTimeoutError as error:
                 msg = textwrap.dedent(
                     f"""\
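For readers unfamiliar with the pattern, here is a minimal, self-contained sketch (not part of the commit) of how the suppression above works: `warnings.catch_warnings()` saves the warning filters and restores them on exit, and `filterwarnings("ignore", message=...)` treats `message` as a regex matched against the start of the warning text, so only the TORCH_CUDA_ARCH_LIST warning is hidden. The `load_extension` function and its exact warning text are hypothetical stand-ins for the real loader.

```python
# Minimal sketch of the suppression pattern used in the diff above.
# `load_extension` and its warning text are hypothetical stand-ins.
import warnings


def load_extension() -> str:
    # Pretend loader that emits the UserWarning we want to silence.
    warnings.warn("TORCH_CUDA_ARCH_LIST is not set, all archs for visible cards are included")
    return "loaded namespace"


with warnings.catch_warnings():
    # `message` is a regex matched against the start of the warning text,
    # so only warnings beginning with this prefix are ignored here.
    warnings.filterwarnings("ignore", message="TORCH_CUDA_ARCH_LIST is not set")
    namespace = load_extension()  # no warning is printed

# Outside the context manager the original filters are restored,
# so the same warning would be shown again elsewhere.
print(namespace)
```

Scoping the filter to the `with` block keeps the process-wide warning configuration untouched, which is why the diff wraps only the extension loading call rather than installing a global filter.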
