[DOCS-12257] [Spark] Explain how to query Mongo Date fields Created: 31/May/17  Updated: 25/May/21  Resolved: 25/May/21

Status: Closed
Project: Documentation
Component/s: Spark Connector
Affects Version/s: None
Fix Version/s: None

Type: Improvement Priority: Minor - P4
Reporter: Georgios Andrianakis Assignee: Unassigned
Resolution: Won't Do Votes: 0
Labels: sp-docs
Remaining Estimate: Not Specified
Time Spent: Not Specified
Original Estimate: Not Specified

Participants:
Last reply: 2 years, 37 weeks, 1 day ago
Epic Link: DOCSP-6205

 Description   

Greetings,

I am using Mongo-Spark to query a collection that has an index on a field of type Date.
Despite my efforts, I have not been able to use Spark DataFrames to filter on that field.
Every attempt fails with a type-conversion error.

I believe it would be very helpful to have an example in the documentation showing how to write such a query.

Thank you very much,
George
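
One way to avoid the type-conversion errors is to push the date comparison down to MongoDB as an aggregation pipeline, using extended JSON's `{"$date": "<ISO-8601>"}` form so the value is compared as a BSON Date rather than a string. The sketch below is a minimal, hedged example: the field name `createdAt` is hypothetical, and the exact read-option names (`"pipeline"`, format `"mongo"`) vary between connector versions, so check them against the connector release you use.

```python
import json
from datetime import datetime, timezone

def date_match_pipeline(field, since):
    """Build a $match stage that filters a BSON Date field.

    Extended JSON's {"$date": "<ISO-8601>"} tells MongoDB to compare
    the value as a Date, not as a string, so the index on the field
    can still be used.
    """
    # Normalize to UTC and render as ISO-8601 with millisecond precision.
    iso = since.astimezone(timezone.utc).strftime("%Y-%m-%dT%H:%M:%S.%f")[:-3] + "Z"
    return json.dumps([{"$match": {field: {"$gte": {"$date": iso}}}}])

# Hypothetical field name; substitute your own Date-typed field.
pipeline = date_match_pipeline("createdAt", datetime(2017, 5, 1, tzinfo=timezone.utc))
print(pipeline)
```

The resulting JSON string could then be passed as the connector's pipeline read option (in some connector versions, something like `spark.read.format("mongo").option("pipeline", pipeline).load()`), so the filter runs server-side before the data reaches Spark.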



 Comments   
Comment by Anthony Sansone (Inactive) [ 25/May/21 ]

This ticket has been closed due to age and inactivity. Please file a new ticket with recent details if needed. Thank you.

Generated at Thu Feb 08 08:04:47 UTC 2024 using Jira 9.7.1#970001-sha1:2222b88b221c4928ef0de3161136cc90c8356a66.