• Dan Layman
    I move files between different AWS S3 accounts regularly. Among those are compressed tar files (.tar.gz). It wasn't a problem until we started using the metadata tag "Content-Encoding" to make the files easier for a browser/user to handle.

    Now, when I move from S3 to S3 there isn't an issue, but if the files copy/move through the local computer they arrive corrupt or decompressed on the other end. Corruption appears when the file is large (>2 GB) and contains compiled/unreadable data.

    Basic test: Make two 100 MB text files, tar and compress them. Upload to S3 using the tool, then add the metadata tag "Content-Encoding" with a value of gzip. Download it back to the computer and the file is now uncompressed. If I remove the tag, this won't occur. A rough boto3 sketch of the same test follows below.
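
    In case it helps reproduce the issue outside the tool, here is a rough boto3 sketch of the same test; the bucket name and key are placeholders, and the point is just to compare hashes before upload and after download so you can see whether a given client modifies the file when "Content-Encoding: gzip" is set:

    import hashlib
    import os
    import tarfile

    import boto3

    BUCKET = "my-test-bucket"        # placeholder: substitute a bucket you control
    KEY = "test/archive.tar.gz"      # placeholder key
    FILE_SIZE = 100 * 1024 * 1024    # 100 MB per file, as in the test above

    def sha256(path):
        """Return the SHA-256 of a file so we can compare before/after transfer."""
        h = hashlib.sha256()
        with open(path, "rb") as f:
            for chunk in iter(lambda: f.read(1024 * 1024), b""):
                h.update(chunk)
        return h.hexdigest()

    # 1. Make two 100 MB text files and pack them into a .tar.gz
    for name in ("file1.txt", "file2.txt"):
        with open(name, "wb") as f:
            f.write(b"0123456789abcdef" * (FILE_SIZE // 16))

    with tarfile.open("archive.tar.gz", "w:gz") as tar:
        tar.add("file1.txt")
        tar.add("file2.txt")

    original_hash = sha256("archive.tar.gz")
    original_size = os.path.getsize("archive.tar.gz")

    # 2. Upload with the Content-Encoding: gzip metadata set on the object
    s3 = boto3.client("s3")
    s3.upload_file(
        "archive.tar.gz",
        BUCKET,
        KEY,
        ExtraArgs={"ContentEncoding": "gzip"},
    )

    # 3. Download again and check whether the bytes survived unchanged
    s3.download_file(BUCKET, KEY, "archive.downloaded.tar.gz")
    downloaded_hash = sha256("archive.downloaded.tar.gz")
    downloaded_size = os.path.getsize("archive.downloaded.tar.gz")

    print(f"original:   {original_size} bytes, sha256 {original_hash}")
    print(f"downloaded: {downloaded_size} bytes, sha256 {downloaded_hash}")
    print("identical" if original_hash == downloaded_hash else "MODIFIED in transit")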

    Is there any way to tell the tool not to modify the file because of the metadata?
    Thanks in advance.
    Cloudberry PRO w/maint Build 6.6.0.3