- Type: Improvement
- Resolution: Unresolved
- Priority: Major - P3
- Affects Version/s: None
- Component/s: None
- Labels:
This is a suggested change. Currently we have numerous tests in Evergreen which upload artifacts of any size, as long as they can zip the artifact within 15 minutes. In my testing I found that we uploaded a 12.5GB artifact for one of our perf tests.
To prevent future tests from uploading large artifacts, we could add a test-failing stage to the 'upload artifact' function in evergreen.yml.
My suggested change:
    # This script fails a test if it generates an artifact over 15GB.
    # Some tests, e.g. perf-test-long-500m-btree-populate, create large
    # artifacts.
    - command: shell.exec
      params:
        script: |
          echo "Echoing size of archive: "
          ls -lh ${upload_filename|wiredtiger.tgz}
          filesize=$(stat -c %s ${upload_filename|wiredtiger.tgz})
          if [ "$filesize" -gt 15000000000 ]; then
            echo "Failing upload on artifacts > 15gb"
            exit 1
          fi
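For reference, here is a minimal, self-contained sketch of the same size check that can be run locally (outside Evergreen). It creates a tiny temporary file in place of the real archive, so the check passes; the 15000000000-byte limit matches the proposal above, and it assumes GNU `stat` (`stat -c %s`), as in the snippet:

```shell
#!/bin/sh
# Sketch of the proposed artifact-size check, run against a
# small temporary file instead of a real wiredtiger.tgz.
limit=15000000000                      # 15GB, the proposed hard limit
artifact=$(mktemp)
dd if=/dev/zero of="$artifact" bs=1024 count=1 2>/dev/null
filesize=$(stat -c %s "$artifact")     # GNU stat; on macOS use: stat -f %z
if [ "$filesize" -gt "$limit" ]; then
  echo "Failing upload on artifacts > 15gb"
  rm -f "$artifact"
  exit 1
fi
echo "artifact within limit: $filesize bytes"
rm -f "$artifact"
```

The comparison uses the byte count from `stat` rather than parsing `ls -lh`, since the human-readable sizes round to units and are awkward to compare numerically.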
Before this goes in, some discussion is required, which is partly why I am making this ticket. Additionally, a reasonable limit would need to be agreed on, and tests that exceed that limit may need to be modified.
The limit itself is hard to define because it depends on task frequency; I am proposing a hard limit without accounting for frequency.
cc: luke.chen@mongodb.com, what do you think?