  Python Integrations / INTPYTHON-314

ai-ml-pipeline-testing: test-llama-index in nightly build

    • Type: Task
    • Resolution: Fixed
    • Priority: Unknown
    • Affects Version/s: None
    • Component/s: AI/ML
    • Python Drivers
    • Not Needed

      1. What would you like to communicate to the user about this feature?
      2. Would you like the user to see examples of the syntax and/or executable code and its output?
      3. Which versions of the driver/connector does this apply to?


      Context

      Nightly builds have been passing recently. LlamaIndex failed overnight with a problem we have encountered before: we got a response back, but it did not contain the number of documents that we requested.
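
      A hypothetical illustration of the failure mode (made-up names, not the actual nightly test code): the retrieval call returns successfully, but the result list is shorter than the top-k that was requested.

      def fake_retrieve(query: str, k: int) -> list[str]:
          # Simulates the observed behavior: the call succeeds but returns
          # fewer documents than the k that was requested.
          return ["doc-1", "doc-2", "doc-3"][: max(k - 2, 0)]

      results = fake_retrieve("What is MongoDB Atlas Vector Search?", k=5)
      print(f"requested 5 documents, got {len(results)}")  # prints: requested 5 documents, got 3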

       

      Failing build: https://spruce.mongodb.com/version/6621b07d2e272f0007c8a352/tasks?sorts=STATUS%3AASC%3BBASE_STATUS%3ADESC

       

      Definition of done

      For now, we update the llama-index code (which has not yet been merged upstream) to continue requesting until we get the number we expect.
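
      A minimal sketch of that stopgap, assuming the retrieval call can be wrapped in a zero-argument callable; the helper name, max_attempts, and delay_seconds are illustrative, not the actual patch to the llama-index integration.

      import time
      from typing import Callable, List, TypeVar

      T = TypeVar("T")

      def retrieve_until_count(
          retrieve: Callable[[], List[T]],
          expected_count: int,
          max_attempts: int = 10,
          delay_seconds: float = 5.0,
      ) -> List[T]:
          """Re-run `retrieve` until it returns at least `expected_count` results."""
          results: List[T] = []
          for _ in range(max_attempts):
              results = retrieve()
              if len(results) >= expected_count:
                  return results
              time.sleep(delay_seconds)
          # Fail loudly so the nightly task still reports a clear error
          # instead of silently accepting a short result list.
          raise TimeoutError(
              f"got {len(results)} results after {max_attempts} attempts; "
              f"expected {expected_count}"
          )

      Usage would look something like retrieve_until_count(lambda: retriever.retrieve(query), expected_count=5), where retriever and query stand in for whatever the test already builds.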

      Pitfalls

      The solution - continuing to request until we get the number we expect - does NOT lead to a good user experience.

            Assignee: Casey Clements (casey.clements@mongodb.com)
            Reporter: Casey Clements (casey.clements@mongodb.com)
            Votes: 0
            Watchers: 1
