Type: Bug
Resolution: Gone away
Priority: Major - P3
Affects Version/s: None
Component/s: None
Labels: None
Hi
Ref: https://jira.mongodb.org/browse/SPARK-387
****************************************************
When reading a time series collection through the Spark connector, the following error is thrown:
Error
py4j.protocol.Py4JJavaError: An error occurred while calling o32.showString.
: com.mongodb.spark.sql.connector.exceptions.MongoSparkException: Partitioning failed.
at com.mongodb.spark.sql.connector.read.partitioner.PartitionerHelper.generatePartitions(PartitionerHelper.java:69)
......
......
Caused by: org.bson.BsonInvalidOperationException: Document does not contain key count
at org.bson.BsonDocument.throwIfKeyAbsent(BsonDocument.java:870)
*****************************************************
Connectors:
The error occurs with both connector builds:
mongo-spark-connector_2.13-10.1.1-all.jar and mongo-spark-connector_2.12-10.1.1-all.jar
**************************************************
Other scenarios:
The same script works fine against a non-time-series collection.
***************************************************
Attachments:
a. Spark 2.12 error dump
b. Spark 2.13 error dump
c. Sample PySpark script used for testing
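A possible workaround sketch, under the assumption that the failure comes from the connector's default partitioner reading collection statistics (which expect a "count" field that time series collections may not report): select a partitioner that does not rely on those statistics, such as `SinglePartitionPartitioner`, via the connector 10.x read options. The connection URI, database, and collection names below are placeholders, not taken from the original report.

```python
# Sketch of a workaround for the "Document does not contain key count"
# partitioning failure, assuming MongoDB Spark Connector 10.x option names.
# All connection details here are hypothetical -- replace with your own.
read_options = {
    "connection.uri": "mongodb://localhost:27017",
    "database": "test",
    "collection": "sensor_data",  # hypothetical time series collection
    # The default SamplePartitioner inspects collection statistics; forcing
    # a single partition sidesteps the missing "count" field entirely.
    "partitioner": (
        "com.mongodb.spark.sql.connector.read.partitioner."
        "SinglePartitionPartitioner"
    ),
}

# With a live Spark session and MongoDB deployment this would be used as:
#   df = spark.read.format("mongodb").options(**read_options).load()
#   df.show()
```

A single partition loses read parallelism, so this is only a stopgap to confirm the partitioner is the source of the failure, not a recommended production setting.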