[COMPASS-7337] Unable to decompress log file while Compass is still running Created: 11/Oct/23  Updated: 27/Oct/23  Resolved: 18/Oct/23

Status: Closed
Project: Compass
Component/s: Logging
Affects Version/s: None
Fix Version/s: No version

Type: Bug Priority: Minor - P4
Reporter: Jeffrey Yemin Assignee: Unassigned
Resolution: Works as Designed Votes: 1
Labels: None
Remaining Estimate: Not Specified
Time Spent: Not Specified
Original Estimate: Not Specified
Environment:

OS: OSX
Compass Version 1.40.3


Documentation Changes: Not Needed

 Description   

Problem Statement/Rationale


The "current" Compass log file cannot be decompressed (e.g. with macOS gunzip) until Compass is shut down.


Steps to Reproduce


  • Help => Open Log File
  • Copy to Clipboard
  • Open a terminal and run gunzip on the path from the clipboard. gunzip fails with an error.
  • Shut down Compass.
  • Run gunzip again. Now it succeeds.

Expected Results


The log file can be decompressed while Compass is still running.

Actual Results


I get an error:

~/.mongodb/compass$ gunzip 65271050f5e1ead47a2a48d6_log.gz
gunzip: 65271050f5e1ead47a2a48d6_log.gz: unexpected end of file
gunzip: 65271050f5e1ead47a2a48d6_log.gz: uncompress failed

After shutting down Compass, I can uncompress the file.




 Comments   
Comment by Le Roux Bodenstein [ 12/Oct/23 ]

anna.henningsen@mongodb.com recently explained this to me: https://mongodb.slack.com/archives/G2L10JAV7/p1695118200772879?thread_ts=1695025789.325329&cid=G2L10JAV7

> Sorry to disappoint, but no, they’re truncated, not corrupt.

> That the gzip CLI on macOS can’t unzip them is the tool’s fault (and I wouldn’t be surprised if there was actually some way of overriding this behavior, I think it’s just extra integrity checks that get in the way), not Compass’s. GNU gzip and Node.js can handle these files just fine.
> We do flush the zlib stream after each write so that we don’t end up with partially written log lines: https://github.com/mongodb-js/mongodb-log-writer/blob/495a6c2bcc89c7413ba01a53cba4d53c3c2edec6/src/index.ts#L419

> This is kind of an inherent problem with using compression on files that are still being written at the time where they were copied/uploaded/etc.; we do cleanly close the files when Compass exits, but that’s not something that users wait for a lot of the time.

>> Can we be sure these files contain every line?
> We can be as sure as we would be if these files were plaintext, yes.

>> how compression would flush all output yet still do compression
> Yeah, it does degrade the compression quality a bit. With Z_SYNC_FLUSH, zlib ensures that all data that is necessary to decode the stream is sent to the filesystem, so it writes more bytes than it otherwise might have. But unlike Z_FULL_FLUSH, it doesn’t fully reset compression; a decompressor is expected to have all data from the beginning of the compressed file still available, so that it can use that prior context for decompression.

The way we usually uncompress these:

node -e 'fs.createReadStream("650863c9105154a02a63e8d4_log.gz").pipe(zlib.createGunzip()).pipe(process.stdout)'

Generated at Wed Feb 07 22:46:13 UTC 2024 using Jira 9.7.1#970001-sha1:2222b88b221c4928ef0de3161136cc90c8356a66.