[Langchain] Reset global cache to None after test_cache


    • Type: Bug
    • Resolution: Fixed
    • Priority: Unknown
    • Fix Version/s: langchain-mongodb-0.6
    • Affects Version/s: None
    • Component/s: None
    • Python Drivers
    • Not Needed

      We've noticed features requiring an LLM behaving strangely. The workaround has been to explicitly pass cache=False as a kwarg, but this shouldn't be necessary.

      The problem is introduced in test_cache.py, which sets the global LLM cache but never resets it, so the cached state leaks into subsequent test modules.

      The solution I've come up with is to add a module-scoped fixture that simply sets the cache back to None after the module's tests have run. I've verified that all tests pass with this change, even without the workaround. Hence we can now do llm = ChatOpenAI(model="gpt-4o") without further kwargs.

      Only the one in integration_tests has bitten us so far, but I'll add the same safeguard elsewhere as well.

            Assignee:
            Casey Clements
            Reporter:
            Casey Clements
            Votes:
            0
            Watchers:
            1
