Python Integrations / INTPYTHON-553

[Langchain] Reset global cache to None after test_cache

    • Type: Bug
    • Resolution: Fixed
    • Priority: Unknown
    • Fix Version/s: langchain-mongodb-0.6
    • Affects Version/s: None
    • Component/s: None
    • Python Drivers
    • Not Needed

      We've noticed features that require an LLM behaving oddly. The workaround has been to explicitly pass cache=False as a kwarg when constructing the model, but this shouldn't be necessary.
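
      For reference, a minimal sketch of that workaround, assuming the model is built with langchain_openai's ChatOpenAI (the model name is only an example):

          from langchain_openai import ChatOpenAI

          # Workaround: disable caching on the model itself so a leaked global cache is ignored.
          llm = ChatOpenAI(model="gpt-4o", cache=False)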

      The problem is introduced in test_cache.py, which sets a global LLM cache that persists after its tests finish.

      The solution I've come up with is to add a module-scoped fixture that simply resets the cache to None after the tests have run. I've verified that all tests pass with this change, even without the workaround above. Hence we can now do llm = ChatOpenAI(model="gpt-4o") without further kwargs. A sketch of the fixture follows.
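
      A minimal sketch of such a fixture, assuming the global cache is managed via langchain_core.globals.set_llm_cache (the fixture name is illustrative):

          import pytest
          from langchain_core.globals import set_llm_cache

          @pytest.fixture(scope="module", autouse=True)
          def reset_llm_cache():
              """Reset the global LLM cache to None once this module's tests finish."""
              yield
              set_llm_cache(None)

      Because it is autouse and module-scoped, the teardown runs once after the last test in the module, so later test modules start with no global cache set.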

      Only the one in integration_tests has bitten us so far, but I'll add the same safeguard to the others as well.

            Assignee: Casey Clements (casey.clements@mongodb.com)
            Reporter: Casey Clements (casey.clements@mongodb.com)
            Votes: 0
            Watchers: 1
