Plugins incorrectly loaded during test runs #626
I got a failing test which turned out to be because llm-ollama was installed.
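As a hypothetical illustration (not the actual failing test), this is the kind of assertion that passes in a clean environment but breaks once an installed plugin such as llm-ollama gets loaded during the test run; the model ID here is made up:

```python
import llm
import pytest


def test_unknown_model_raises():
    # Passes when only the default plugins are registered; fails if a local
    # llm-ollama install has registered a model with this (made-up) ID.
    with pytest.raises(llm.UnknownModelError):
        llm.get_model("llama3.2")
```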
That's not supposed to happen because of this code: llm/plugins.py, lines 15 to 17 in d34eac5.
And: tests/conftest.py, lines 11 to 14 in d34eac5.
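Reconstructed here from the context lines quoted in the diffs below, the referenced code is:

```python
# llm/plugins.py, lines 15-17 in d34eac5:
if not hasattr(sys, "_called_from_test") and LLM_LOAD_PLUGINS is None:
    # Only load plugins if not running tests
    pm.load_setuptools_entrypoints("llm")


# tests/conftest.py, lines 11-14 in d34eac5:
def pytest_configure(config):
    import sys

    sys._called_from_test = True
```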
But I added some print statements and saw this:
So those things are firing in the opposite order from what I expect.

```diff
diff --git a/llm/plugins.py b/llm/plugins.py
index 933725c..25d6bc3 100644
--- a/llm/plugins.py
+++ b/llm/plugins.py
@@ -14,6 +14,7 @@ LLM_LOAD_PLUGINS = os.environ.get("LLM_LOAD_PLUGINS", None)
 
 if not hasattr(sys, "_called_from_test") and LLM_LOAD_PLUGINS is None:
     # Only load plugins if not running tests
+    print("Gonna load those plugins", getattr(sys, "_called_from_test", False))
     pm.load_setuptools_entrypoints("llm")
diff --git a/tests/conftest.py b/tests/conftest.py
index 7eb3dd5..322666b 100644
--- a/tests/conftest.py
+++ b/tests/conftest.py
@@ -10,6 +10,7 @@ from typing import Optional
 
 def pytest_configure(config):
     import sys
+    print("Setting that sys._called_from_test = True")
     sys._called_from_test = True
```
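A plausible explanation for that ordering (a sketch, assuming the test suite imports llm at module level, for example in tests/conftest.py): pytest has to import conftest.py before it can call any of its hooks, so anything conftest.py imports runs first, and the module-level check in llm/plugins.py fires before pytest_configure() gets to set the flag.

```python
# tests/conftest.py (hypothetical, simplified sketch of the ordering problem)
import sys

import llm  # importing llm executes llm/plugins.py at module level:
            # sys._called_from_test is not set yet, so plugins get loaded here


def pytest_configure(config):
    # pytest only calls this after conftest.py has been imported,
    # which is too late to stop the entry points from loading above.
    sys._called_from_test = True
```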
This seems to fix it, by only loading plugins when a CLI command is invoked:

```diff
diff --git a/llm/cli.py b/llm/cli.py
index bb9075e..eb585f0 100644
--- a/llm/cli.py
+++ b/llm/cli.py
@@ -137,6 +137,9 @@ def cli():
     llm 'Five outrageous names for a pet pelican'
     """
+    from .plugins import load_plugins
+
+    load_plugins()
 
 
 @cli.command(name="prompt")
diff --git a/llm/plugins.py b/llm/plugins.py
index 933725c..5c00b9e 100644
--- a/llm/plugins.py
+++ b/llm/plugins.py
@@ -12,27 +12,36 @@ pm.add_hookspecs(hookspecs)
 
 LLM_LOAD_PLUGINS = os.environ.get("LLM_LOAD_PLUGINS", None)
 
-if not hasattr(sys, "_called_from_test") and LLM_LOAD_PLUGINS is None:
-    # Only load plugins if not running tests
-    pm.load_setuptools_entrypoints("llm")
-
-
-# Load any plugins specified in LLM_LOAD_PLUGINS")
-if LLM_LOAD_PLUGINS is not None:
-    for package_name in [name for name in LLM_LOAD_PLUGINS.split(",") if name.strip()]:
-        try:
-            distribution = metadata.distribution(package_name)  # Updated call
-            llm_entry_points = [
-                ep for ep in distribution.entry_points if ep.group == "llm"
-            ]
-            for entry_point in llm_entry_points:
-                mod = entry_point.load()
-                pm.register(mod, name=entry_point.name)
-                # Ensure name can be found in plugin_to_distinfo later:
-                pm._plugin_distinfo.append((mod, distribution))  # type: ignore
-        except metadata.PackageNotFoundError:
-            sys.stderr.write(f"Plugin {package_name} could not be found\n")
-
-for plugin in DEFAULT_PLUGINS:
-    mod = importlib.import_module(plugin)
-    pm.register(mod, plugin)
+_loaded = False
+
+
+def load_plugins():
+    global _loaded
+    if _loaded:
+        return
+    _loaded = True
+    if not hasattr(sys, "_called_from_test") and LLM_LOAD_PLUGINS is None:
+        # Only load plugins if not running tests
+        pm.load_setuptools_entrypoints("llm")
+
+    # Load any plugins specified in LLM_LOAD_PLUGINS")
+    if LLM_LOAD_PLUGINS is not None:
+        for package_name in [
+            name for name in LLM_LOAD_PLUGINS.split(",") if name.strip()
+        ]:
+            try:
+                distribution = metadata.distribution(package_name)  # Updated call
+                llm_entry_points = [
+                    ep for ep in distribution.entry_points if ep.group == "llm"
+                ]
+                for entry_point in llm_entry_points:
+                    mod = entry_point.load()
+                    pm.register(mod, name=entry_point.name)
+                    # Ensure name can be found in plugin_to_distinfo later:
+                    pm._plugin_distinfo.append((mod, distribution))  # type: ignore
+            except metadata.PackageNotFoundError:
+                sys.stderr.write(f"Plugin {package_name} could not be found\n")
+
+    for plugin in DEFAULT_PLUGINS:
+        mod = importlib.import_module(plugin)
+        pm.register(mod, plugin)
```

But... won't that break the Python library use-case, which also relies on plugins being loaded so that things like … work? Maybe the …
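One way to keep the library use-case working (a sketch only; it assumes the model-lookup helpers in llm/__init__.py are the relevant entry points and relies on load_plugins() staying idempotent via the _loaded flag) would be to also call load_plugins() lazily from those helpers, so `import llm` stays cheap but looking up a model still picks up plugin-provided models:

```python
# Hypothetical wiring, not the actual implementation:
from .plugins import load_plugins, pm


def get_models_with_aliases():
    load_plugins()  # no-op after the first call thanks to the _loaded flag
    model_aliases = []

    def register(model, aliases=None):
        model_aliases.append((model, list(aliases or [])))

    # register_models is the hook that plugins implement to add their models
    pm.hook.register_models(register=register)
    return model_aliases
```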
A later commit that referenced this issue (part of #613) included the item "Fix for not loading plugins during tests, refs #626".