Using the AWS CLI to Collect Amazon Elastic Block Store (EBS) Information

Most of my recent blogs have focused not just on how to "do" something, but on how things actually work. This post covers how to use the AWS CLI to gather information and format it in a useful way. The collection provides not only details about each EBS volume, but also the instance it is attached to, identifies the boot volumes, and even includes the instance type.

You may be asking why this is important. There are several reasons. It can be useful to understand the instances in your environment and the types of disks they use. Maybe you want to move from one storage type to another and want to make sure the instances are sized appropriately.

In my case here at Pure Storage, we help optimize customers' cloud storage spend with Pure Cloud Block Store. When an EC2 instance uses Cloud Block Store, it can host all of the data volumes, but the boot volume will still live on EBS.

My initial use case for this script was to gather all of the existing EBS volumes and identify which disk was allocated as the root (boot) volume, so those could be excluded when it comes to sizing.

There are two versions of this script: the first gathers all volumes and instances for a specific region; the second gathers all volumes and instances for every region.

The first example pulls in details for one specific region, for both instances and volumes, and presents the results of the multiple queries in a consolidated view.

#!/bin/bash
print_help() {
    echo "Usage:"
    echo -e "  get-all-ebs region|help [aws-profile]"
    exit "$1"
}

if [ "$#" -eq 0 ] || [ "$1" == "help" ]; then
    print_help 0
elif [ -n "$2" ]; then
    # Left unquoted where used below so it expands to two arguments
    _profile="--profile $2"
fi

_region="$1"
_volumes_file="/tmp/volumes-$_region.txt"
_sorted_volumes_file="/tmp/sorted-volumes-$_region.txt"
_instances_file="/tmp/instances-$_region.txt"
_volumes_and_instances="/tmp/volume-and-instance-details-$_region.txt"

# Truncate any leftover files from a previous run
: > "$_volumes_file"
: > "$_sorted_volumes_file"
: > "$_instances_file"
: > "$_volumes_and_instances"

# Gather all attached volumes in the region
echo "Gathering Volumes"
aws ec2 describe-volumes --filter Name=attachment.status,Values=attached --query 'Volumes[*].{VolumeID:VolumeId,Size:Size,Type:VolumeType,Iops:Iops,Throughput:Throughput,AvailabilityZone:AvailabilityZone,State:State,Device:Attachments[0].Device,InstanceId:Attachments[0].InstanceId}' $_profile --region "$_region" --output text --no-cli-pager > "$_volumes_file"
sort --reverse "$_volumes_file" > "$_sorted_volumes_file"

# Put all the instance IDs (third column of the text output) into an array
aws_instances_array=($(cut -f 3 "$_sorted_volumes_file"))

echo "Gathering Instances"
if [ ${#aws_instances_array[@]} -ne 0 ]; then
    # Get the instance type and root device name for each instance ID
    for (( i=0; i<${#aws_instances_array[@]}; i++ )); do
        aws ec2 describe-instances --instance-ids "${aws_instances_array[$i]}" --output text --query 'Reservations[*].Instances[*].{Type:InstanceType,RootDevice:RootDeviceName}' $_profile --region "$_region" >> "$_instances_file"
    done
fi

# Combine the volume and instance columns into the final report,
# then clean up the intermediate files
paste "$_sorted_volumes_file" "$_instances_file" > "$_volumes_and_instances"
rm -f "$_volumes_file" "$_sorted_volumes_file" "$_instances_file"

echo "results located in $_volumes_and_instances"
cat "$_volumes_and_instances"
exit 0

Here is an example of the output. There is no header row, but you can work out what each column is from the query fields in the AWS CLI command.
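As a hedged illustration of how the boot volumes can be picked out of that output: with `--output text`, the AWS CLI emits map keys in alphabetical order, so the attachment device lands in column 2 and the root device name appended by `paste` lands in column 10. The file name and sample rows below are fabricated for the demonstration.

```shell
#!/bin/bash
# Hedged example: flag boot volumes by comparing the attachment device
# (column 2) with the instance's root device name (column 10). Sample rows
# are made up; column positions assume the AWS CLI's alphabetical key
# ordering for --output text.
sample="/tmp/ebs-demo.txt"
printf 'us-east-1a\t/dev/xvda\ti-0abc\t3000\t8\tin-use\t125\tgp3\tvol-0111\t/dev/xvda\tt3.micro\n'  > "$sample"
printf 'us-east-1a\t/dev/xvdf\ti-0abc\t3000\t100\tin-use\t125\tgp3\tvol-0222\t/dev/xvda\tt3.micro\n' >> "$sample"
# Print only the rows where the volume is the instance's root device
awk -F'\t' '$2 == $10 { print $9 " is a boot volume on " $3 }' "$sample"
# -> vol-0111 is a boot volume on i-0abc
```

The same one-liner could be pointed at the real `/tmp/volume-and-instance-details-<region>.txt` file to exclude boot volumes from a sizing exercise.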

If you are looking to pull in all details for all regions in your account, you can wrap the same logic in a loop over every region.
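A minimal sketch of that loop is below (not the original all-regions script), assuming the per-region script above is saved as `get-all-ebs`. In real use the region list would come from `aws ec2 describe-regions`; it is stubbed here so the loop shape can be shown without AWS credentials.

```shell
#!/bin/bash
# Minimal sketch of the all-regions variant. In practice, populate
# region_list with:
#   aws ec2 describe-regions --query 'Regions[*].RegionName' --output text
# It is stubbed here so the loop can be demonstrated offline.
region_list="us-east-1 us-west-2 eu-west-1"
for _region in $region_list; do
    # "get-all-ebs" is the hypothetical name of the per-region script above;
    # replace the echo with an actual invocation such as ./get-all-ebs "$_region"
    echo "get-all-ebs $_region"
done
```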

Huge thanks to Vincent Lee at Ahead, who provided some samples I was able to modify to meet the requirements.

This way of collecting data via the CLI and combining it via files is new to me, but it is definitely something I will continue to use. I am looking forward to seeing how else we can expand on collecting info, and hopefully these queries will come to the AWS PowerShell module soon too!
