@gchait
Last active March 27, 2026 15:46
List S3 bucket sizes and object counts via CloudWatch
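The script queries CloudWatch's daily S3 storage metrics, which can lag by a day or more, so it looks back over a 3-day window and keeps the latest datapoint. A minimal standalone sketch of that time-window computation (GNU `date` assumed; the `%FT%TZ` format produces the ISO-8601 UTC timestamps CloudWatch expects):

```shell
#!/bin/bash
# Build the same 3-day UTC window the script passes to get-metric-statistics.
# %F = YYYY-MM-DD, %T = HH:MM:SS; the literal Z marks the time as UTC.
START=$(date -u -d "3 days ago" +%FT%TZ)
END=$(date -u +%FT%TZ)
echo "${START} -> ${END}"
```

With `--period 86400` (one day), this window yields at most three datapoints, and the JMESPath `sort_by(Datapoints, &Timestamp)[-1]` picks the newest one.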
#!/bin/bash -eu
set -o pipefail

# S3 storage metrics are published daily and may lag; query a 3-day window
# and take the most recent datapoint.
START=$(date -u -d "3 days ago" +%FT%TZ)
END=$(date -u +%FT%TZ)

get_metric() {
  local bucket="${1}"
  local region="${2}"
  local metric="${3}"
  local storage="${4}"
  aws cloudwatch get-metric-statistics \
    --namespace AWS/S3 \
    --metric-name "${metric}" \
    --dimensions Name=BucketName,Value="${bucket}" Name=StorageType,Value="${storage}" \
    --statistics Average \
    --start-time "${START}" \
    --end-time "${END}" \
    --period 86400 \
    --region "${region}" \
    --query 'sort_by(Datapoints, &Timestamp)[-1].Average' \
    --output text 2> /dev/null || echo "None"
}

printf "%-63s %-12s %20s %15s\n" "BUCKET" "REGION" "SIZE" "OBJECTS"

while read -r bucket; do
  [ -z "${bucket}" ] && continue
  region=$(aws s3api get-bucket-location \
    --bucket "${bucket}" \
    --query LocationConstraint \
    --output text)
  # get-bucket-location returns None (or null) for buckets in us-east-1.
  [[ "${region}" == "None" || "${region}" == "null" ]] && region="us-east-1"
  size=$(get_metric "${bucket}" "${region}" BucketSizeBytes StandardStorage)
  count=$(get_metric "${bucket}" "${region}" NumberOfObjects AllStorageTypes)
  # CloudWatch returns averages as floats; round to whole bytes/objects.
  [[ "${size}" != "None" ]] && size=$(printf "%.0f" "${size}")
  [[ "${count}" != "None" ]] && count=$(printf "%.0f" "${count}")
  if [[ "${size}" != "None" ]]; then
    hr=$(numfmt --to=iec --suffix=B "${size}" 2> /dev/null || echo "${size}")
  else
    hr="N/A"
  fi
  printf "%-63s %-12s %20s %15s\n" \
    "${bucket}" "${region}" "${hr}" "${count:-0}"
done < <(aws s3api list-buckets --query 'Buckets[].Name' --output text | tr '\t' '\n')
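The human-readable sizes come from GNU coreutils' `numfmt`; a quick standalone check of the same invocation the script uses, with an assumed example byte count:

```shell
#!/bin/bash
# Humanize a byte count exactly as the script does: IEC (binary) units
# with a trailing "B" suffix. 1048576 bytes is 1 MiB.
size=1048576
hr=$(numfmt --to=iec --suffix=B "${size}")
echo "${hr}"   # 1.0MB
```

The `2> /dev/null || echo "${size}"` fallback in the script keeps the raw byte count when `numfmt` is unavailable (e.g. on macOS without coreutils installed).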