bug: Guardrails won't work when llm is set to with_structured_output #924

Open · 3 of 4 tasks
kanzyai-emirarditi opened this issue Jan 8, 2025 · 0 comments

Labels
bug (Something isn't working) · status: needs triage (New issues that have not yet been reviewed or categorized)

kanzyai-emirarditi commented Jan 8, 2025

Did you check docs and existing issues?

  • I have read all the NeMo-Guardrails docs
  • I have updated the package to the latest version before submitting this issue
  • (optional) I have used the develop branch
  • I have searched the existing issues of NeMo-Guardrails

Python version (python --version)

3.12.1

Operating system/version

Windows 11

NeMo-Guardrails version (if you must use a specific version and not the latest)

No response

Describe the bug

Package versions used:

aiofiles==24.1.0
aiohappyeyeballs==2.4.4
aiohttp==3.11.11
aiolimiter==1.2.1
aiosignal==1.3.2
aiosqlite==0.20.0
alembic==1.14.0
annotated-types==0.7.0
annoy==1.17.3
anthropic==0.37.1
anyio==4.8.0
appdirs==1.4.4
asyncer==0.0.8
asyncio==3.4.3
asyncpg==0.30.0
attrs==24.3.0
backoff==2.2.1
boto3==1.35.94
botocore==1.35.94
cachetools==5.5.0
certifi==2024.12.14
cffi==1.17.1
cfgv==3.4.0
charset-normalizer==3.4.1
click==8.1.8
cloudpickle==3.1.0
cohere==5.13.6
colorama==0.4.6
coloredlogs==15.0.1
colorlog==6.9.0
cryptography==44.0.0
dataclasses-json==0.6.7
datasets==3.2.0
deepeval==1.6.2
defusedxml==0.7.1
Deprecated==1.2.15
dill==0.3.8
diskcache==5.6.3
distlib==0.3.9
distro==1.9.0
dnspython==2.7.0
docx2txt==0.8
dspy==2.5.42
dspy-ai==2.5.43
ecdsa==0.19.0
email_validator==2.2.0
execnet==2.1.1
fastapi==0.111.1
fastapi-cli==0.0.7
fastavro==1.10.0
fastembed==0.5.0
filelock==3.16.1
flatbuffers==24.12.23
frozenlist==1.5.0
fsspec==2024.9.0
googleapis-common-protos==1.66.0
greenlet==3.1.1
grpcio==1.63.2
h11==0.14.0
httpcore==1.0.7
httptools==0.6.4
httpx==0.28.1
httpx-sse==0.4.0
huggingface-hub==0.27.1
humanfriendly==10.0
identify==2.6.5
idna==3.10
importlib-metadata==7.0.0
iniconfig==2.0.0
Jinja2==3.1.5
jiter==0.8.2
jmespath==1.0.1
joblib==1.4.2
json_repair==0.35.0
jsonpatch==1.33
jsonpointer==3.0.0
jsonschema==4.23.0
jsonschema-specifications==2024.10.1
langchain==0.3.14
langchain-anthropic==0.2.4
langchain-aws==0.2.10
langchain-community==0.3.14
langchain-core==0.3.29
langchain-milvus==0.1.7
langchain-openai==0.2.14
langchain-text-splitters==0.3.5
langchain-voyageai==0.1.3
langgraph==0.2.61
langgraph-checkpoint==2.0.9
langgraph-checkpoint-sqlite==2.0.1
langgraph-sdk==0.1.48
langsmith==0.2.10
lark==1.1.9
litellm==1.51.0
loguru==0.7.3
magicattr==0.1.6
Mako==1.3.8
markdown-it-py==3.0.0
MarkupSafe==3.0.2
marshmallow==3.24.1
mdurl==0.1.2
mmh3==4.1.0
mpmath==1.3.0
msgpack==1.1.0
multidict==6.1.0
multiprocess==0.70.16
mypy-extensions==1.0.0
nemoguardrails==0.11.0
nest-asyncio==1.6.0
nodeenv==1.9.1
numpy==2.2.1
onnx==1.17.0
onnxruntime==1.20.1
openai==1.59.4
opentelemetry-api==1.24.0
opentelemetry-exporter-otlp-proto-common==1.24.0
opentelemetry-exporter-otlp-proto-grpc==1.24.0
opentelemetry-proto==1.24.0
opentelemetry-sdk==1.24.0
opentelemetry-semantic-conventions==0.45b0
optuna==4.1.0
orjson==3.10.13
packaging==24.2
pandas==2.2.3
parameterized==0.9.0
pillow==10.4.0
platformdirs==4.3.6
pluggy==1.5.0
portalocker==3.1.1
pre-commit==3.8.0
prompt_toolkit==3.0.48
propcache==0.2.1
protobuf==4.25.5
psycopg2==2.9.10
py_rust_stemmers==0.1.3
pyarrow==18.1.0
pyasn1==0.6.1
pycparser==2.22
pydantic==2.10.4
pydantic-settings==2.7.1
pydantic_core==2.27.2
Pygments==2.19.1
pymilvus==2.5.3
pyreadline3==3.5.4
pysbd==0.3.4
pytest==8.3.4
pytest-asyncio==0.25.2
pytest-order==1.3.0
pytest-repeat==0.9.3
pytest-xdist==3.6.1
python-dateutil==2.9.0.post0
python-dotenv==1.0.1
python-jose==3.3.0
python-multipart==0.0.20
pytz==2024.2
pywin32==308
PyYAML==6.0.2
ragas==0.2.10
redis==5.2.1
referencing==0.35.1
regex==2024.11.6
requests==2.32.3
requests-toolbelt==1.0.0
rich==13.9.4
rich-toolkit==0.12.0
rpds-py==0.22.3
rsa==4.9
s3transfer==0.10.4
sentry-sdk==2.19.2
setuptools==75.7.0
shellingham==1.5.4
simpleeval==1.0.3
six==1.17.0
sniffio==1.3.1
SQLAlchemy==2.0.36
sqlglot==25.34.1
starlette==0.37.2
sympy==1.13.3
tabulate==0.9.0
tenacity==8.4.2
tiktoken==0.8.0
tokenizers==0.21.0
tqdm==4.67.1
typer==0.15.1
types-requests==2.32.0.20241016
typing-inspect==0.9.0
typing_extensions==4.12.2
tzdata==2024.2
ujson==5.10.0
urllib3==2.3.0
uvicorn==0.29.0
virtualenv==20.28.1
voyageai==0.2.4
watchdog==6.0.0
watchfiles==1.0.3
wcwidth==0.2.13
websockets==14.1
win32_setctime==1.2.0
wrapt==1.17.0
xxhash==3.5.0
yarl==1.18.3
zipp==3.21.0

When I use the guardrails with a regular LLM, everything works fine, but when the LLM is wrapped with LangChain's with_structured_output, I get the following error:

AttributeError("'ChatPromptValue' object has no attribute 'get'")

Steps To Reproduce

from langchain_openai import ChatOpenAI
from langchain_core.prompts import ChatPromptTemplate
from nemoguardrails import RailsConfig
from nemoguardrails.integrations.langchain.runnable_rails import RunnableRails

llm = ChatOpenAI()
llm = llm.with_structured_output(SOME_PYDANTIC_CLASS)  # SOME_PYDANTIC_CLASS: any Pydantic model
prompt = ChatPromptTemplate([("user", "{question}")])
guardrail_config = RailsConfig.from_path("src/agent/guardrail")
guardrails = RunnableRails(guardrail_config)
llm = prompt | (guardrails | llm)
llm.invoke({"question": "How are you?"})  # raises the AttributeError below
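
For contrast, the same chain with the plain model (no with_structured_output) runs without the error, as described above. A minimal sketch reusing the imports, prompt, and guardrail config from the reproduction:

# Same imports, prompt, and guardrails as in the reproduction above;
# only the model wrapping changes.
plain_llm = ChatOpenAI()  # not wrapped with with_structured_output
chain = prompt | (guardrails | plain_llm)
chain.invoke({"question": "How are you?"})  # completes without the AttributeError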

Expected Behavior

This should work fine with or without the with_structured_output functionality.

Actual Behavior

When I invoke the chain with the LLM wrapped by with_structured_output, I receive the following error:

AttributeError("'ChatPromptValue' object has no attribute 'get'")
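
A possibly relevant detail: with_structured_output does not return a chat model but a Runnable pipeline (the bound model piped into an output parser), so RunnableRails may be handling the wrapped object along a different code path. A quick way to see the difference in types (Answer below is a hypothetical stand-in for SOME_PYDANTIC_CLASS; assumes OPENAI_API_KEY is set, as in the reproduction):

from pydantic import BaseModel
from langchain_openai import ChatOpenAI

class Answer(BaseModel):  # hypothetical model, stands in for SOME_PYDANTIC_CLASS
    text: str

print(type(ChatOpenAI()))                                  # ChatOpenAI, a BaseChatModel
print(type(ChatOpenAI().with_structured_output(Answer)))   # a Runnable (e.g. RunnableSequence), not a BaseChatModel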