I have this scenario happening in my bucket: I have a file called red.dat in my storage, and this file is updated regularly by Jenkins. Once the file has been updated, I trigger an event to deploy red.dat. I want to check the MD5 hash of the file before and after the update, and only do the deployment if the values differ.

This is how I upload the file to GCS:

gsutil cp red.dat gs://example-bucket

and I have tried this command to get the hash:

gsutil hash -h gs://example-bucket/red.dat

and the result is this:

Hashes [hex] for red.dat:
    Hash (crc32c):      d4c9895e
    Hash (md5):     732b9e36d945f31a6f436a8d19f64671

But I'm a little confused about how to compare the MD5 before and after the update, since the file always stays in a remote location (GCS). I would like some advice or a pointer in the right direction; a solution in shell commands or Ansible is fine.

question from: https://stackoverflow.com/questions/65947159/cloud-storage-how-to-check-md5-on-object


1 Answer

You can use the gsutil hash command on the local file, and then compare the output with what you saw from gsutil hash against the cloud object:

gsutil hash red.dat
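
For example, here is a minimal Bash sketch of that comparison. It assumes gsutil is already authenticated for the bucket, reuses the bucket and object names from the question, and uses a hypothetical ./deploy.sh as a placeholder for your actual deploy step:

#!/usr/bin/env bash
set -euo pipefail

LOCAL_FILE="red.dat"
REMOTE_OBJECT="gs://example-bucket/red.dat"

# MD5 of the freshly built local file (-h prints hex, -m limits output to MD5).
local_md5=$(gsutil hash -h -m "${LOCAL_FILE}" | awk '/Hash \(md5\)/ {print $3}')

# MD5 of the object currently in the bucket; gsutil hash accepts cloud URLs
# too, as shown in the question.
remote_md5=$(gsutil hash -h -m "${REMOTE_OBJECT}" | awk '/Hash \(md5\)/ {print $3}')

if [ "${local_md5}" != "${remote_md5}" ]; then
    echo "MD5 changed (${remote_md5} -> ${local_md5}); uploading and deploying"
    gsutil cp "${LOCAL_FILE}" gs://example-bucket
    # ./deploy.sh   # hypothetical deploy step
else
    echo "MD5 unchanged; skipping deployment"
fi

The same pattern works if you would rather compare the remote object before and after the Jenkins upload: capture the first awk value before running gsutil cp, read it again afterwards, and deploy only when the two values differ.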
