How to Build Multi-Region S3 Storage with Selectel Using Python and Boto3
Introduction
In today’s IT landscape, ensuring your data is resilient to disasters and always available is more important than ever. Multi-region object storage is a powerful approach to achieve high availability, disaster recovery, and reduced latency for distributed applications. In this guide, we’ll explore how to leverage Selectel’s multi-region S3-compatible object storage using Python and the popular boto3 library. Whether you’re a developer, DevOps engineer, or just getting started with cloud storage, this tutorial will help you set up, manage, and replicate data across regions like Moscow and Saint Petersburg with ease.
Table of Contents
- Overview of Selectel Object Storage Regions
- Quick Setup in the Selectel Control Panel
- Configuring boto3 Clients for Multiple Regions
- Creating Buckets in Different Regions
- Listing Objects Across Regions
- Copying Objects Between Regions
- End-to-End Example: Multi-Region Storage Workflow
- Conclusion & Tips
Overview of Selectel Object Storage Regions
Selectel offers S3-compatible object storage in multiple regions, currently including:
- Saint Petersburg (ru-1): Hosted in the Dubrovka data center group, comprising three independent Tier III data centers with robust power, networking, and cooling systems.
- Moscow (gis-1 and ru-7): Located in the Berzarina data center group, with each pool on separate, independent floors, also Tier III certified.
Note: The gis-1 pool is compliant with Russian government information security regulations, making it suitable for sensitive or regulated data.
Selectel is also expanding with a new Tier IV data center in Moscow, featuring advanced cooling and power for high-performance workloads.
Quick Setup in the Selectel Control Panel
Before diving into code, let’s outline the essential steps to prepare your storage environment via the Selectel web interface:
- Log in to the Selectel Control Panel.
- Navigate to Object Storage.
- Choose your desired region and click Create Container.
- Select the vHosted addressing type. For sensitive data, choose a private container; for public access, select public.
- Repeat these steps for each region you want to use (ru-1, gis-1, ru-7).
- Create a service user with the Object Storage Administrator role and grant access to the relevant project.
- Generate an S3 Access Key and Secret Key. Save these credentials securely—they are shown only once!
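Once the keys are generated, avoid pasting them directly into scripts. Here is a minimal sketch of reading them from environment variables instead; the variable names SELECTEL_S3_ACCESS_KEY and SELECTEL_S3_SECRET_KEY are illustrative choices, not a Selectel convention:

import os

# Read the S3 credentials from environment variables instead of hard-coding them
ACCESS_KEY = os.environ['SELECTEL_S3_ACCESS_KEY']
SECRET_KEY = os.environ['SELECTEL_S3_SECRET_KEY']

The snippets below use placeholder strings for readability, but in real code you would pass these variables to boto3 instead.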
Configuring boto3 Clients for Multiple Regions
To interact with different Selectel regions programmatically, you’ll need to specify the correct endpoint for each region. Here’s a Python function to create an S3 client for a given region:
import boto3

def get_s3_client(region):
    endpoints = {
        'ru-1': 'https://s3.ru-1.storage.selcloud.ru',
        'gis-1': 'https://s3.gis-1.storage.selcloud.ru',
        'ru-7': 'https://s3.ru-7.storage.selcloud.ru'
    }
    return boto3.client(
        's3',
        endpoint_url=endpoints[region],
        aws_access_key_id='YOUR_ACCESS_KEY',
        aws_secret_access_key='YOUR_SECRET_KEY'
    )
Replace 'YOUR_ACCESS_KEY' and 'YOUR_SECRET_KEY' with your actual credentials.
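To confirm the credentials and endpoint work before going further, a quick sanity check is to list the buckets visible in one region (a minimal sketch using the function above):

# Quick sanity check: list the buckets visible in the ru-1 region
s3 = get_s3_client('ru-1')
for bucket in s3.list_buckets().get('Buckets', []):
    print(bucket['Name'])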
Creating Buckets in Different Regions
A bucket (container) in Selectel is uniquely named and accessible via a domain like bucketname.s3.storage.selcloud.ru. You can create buckets via the API as follows:
s3_ru1 = get_s3_client('ru-1')
s3_gis1 = get_s3_client('gis-1')
s3_ru7 = get_s3_client('ru-7')
bucket_name_ru1 = 'my-bucket-ru1'
bucket_name_gis1 = 'my-bucket-gis1'
bucket_name_ru7 = 'my-bucket-ru7'
# Create buckets in each region
s3_ru1.create_bucket(Bucket=bucket_name_ru1)
s3_gis1.create_bucket(Bucket=bucket_name_gis1)
s3_ru7.create_bucket(Bucket=bucket_name_ru7)
print('Buckets created successfully in all regions.')
Listing Objects Across Regions
To see what files are stored in each region, use the following helper function:
def list_objects(s3_client, bucket_name):
    response = s3_client.list_objects_v2(Bucket=bucket_name)
    return [obj['Key'] for obj in response.get('Contents', [])]
print('Files in ru-1:', list_objects(s3_ru1, bucket_name_ru1))
print('Files in gis-1:', list_objects(s3_gis1, bucket_name_gis1))
print('Files in ru-7:', list_objects(s3_ru7, bucket_name_ru7))
This allows you to easily monitor your distributed storage.
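Note that list_objects_v2 returns at most 1,000 keys per call. For larger buckets, boto3's built-in paginator handles the continuation tokens for you; here is a small sketch:

def list_all_objects(s3_client, bucket_name):
    # Paginate through all keys, since list_objects_v2 caps each page at 1,000 objects
    paginator = s3_client.get_paginator('list_objects_v2')
    keys = []
    for page in paginator.paginate(Bucket=bucket_name):
        keys.extend(obj['Key'] for obj in page.get('Contents', []))
    return keys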
Copying Objects Between Regions
To replicate or move data between regions (for backup or disaster recovery), you can use the copy_object method:
file_name = 'example.txt'
copy_source = {'Bucket': bucket_name_ru1, 'Key': file_name}
s3_gis1.copy_object(CopySource=copy_source, Bucket=bucket_name_gis1, Key=file_name)
print(f'File {file_name} copied from ru-1 to gis-1')
This ensures your data is available even if one region becomes unavailable.
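Keep in mind that copy_object asks the destination endpoint to perform a server-side copy, and whether it can read a source bucket in another region depends on the storage backend. If cross-region server-side copy is not available in your setup, a safe fallback is to stream the object through the client; here is a minimal sketch using the clients and bucket names defined above:

import io

def replicate_object(src_client, src_bucket, dst_client, dst_bucket, key):
    # Download from the source region into memory, then upload to the destination region
    buffer = io.BytesIO()
    src_client.download_fileobj(src_bucket, key, buffer)
    buffer.seek(0)
    dst_client.upload_fileobj(buffer, dst_bucket, key)

replicate_object(s3_ru1, bucket_name_ru1, s3_gis1, bucket_name_gis1, file_name)

For large objects you would stream to a temporary file rather than memory, but the flow is the same.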
End-to-End Example: Multi-Region Storage Workflow
Here’s a complete example that checks for bucket existence, creates them if needed, and prepares clients for all regions:
import boto3
import botocore.exceptions

def get_s3_client(region):
    # Build an S3 client pointed at the endpoint of the requested Selectel region
    endpoints = {
        'ru-1': 'https://s3.ru-1.storage.selcloud.ru',
        'gis-1': 'https://s3.gis-1.storage.selcloud.ru',
        'ru-7': 'https://s3.ru-7.storage.selcloud.ru'
    }
    return boto3.client(
        's3',
        endpoint_url=endpoints[region],
        region_name=region,
        aws_access_key_id="YOUR_ACCESS_KEY",
        aws_secret_access_key="YOUR_SECRET_KEY"
    )

def bucket_exists(s3_client, bucket_name):
    # head_bucket succeeds if the bucket exists and the credentials can reach it
    try:
        s3_client.head_bucket(Bucket=bucket_name)
        return True
    except botocore.exceptions.ClientError as e:
        if e.response['Error']['Code'] == '404':
            return False
        print(f'Error checking bucket {bucket_name}: {e}')
        return False

def create_bucket_if_not_exists(s3_client, bucket_name, region):
    if bucket_exists(s3_client, bucket_name):
        print(f'Bucket {bucket_name} already exists.')
        return
    try:
        s3_client.create_bucket(
            Bucket=bucket_name,
            CreateBucketConfiguration={'LocationConstraint': region}
        )
        print(f'Bucket {bucket_name} created in region {region}.')
    except Exception as e:
        print(f'Error creating bucket {bucket_name}: {e}')

# Set up clients and buckets for every region
s3_clients = {
    'ru-1': get_s3_client('ru-1'),
    'gis-1': get_s3_client('gis-1'),
    'ru-7': get_s3_client('ru-7'),
}
buckets = {
    'ru-1': 'habr-ru-1',
    'gis-1': 'habr-gis-1',
    'ru-7': 'habr-ru-7',
}
for region, s3_client in s3_clients.items():
    create_bucket_if_not_exists(s3_client, buckets[region], region)
print('All buckets are ready.')
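To round out the workflow, here is a short sketch that uploads a local file to one region and then replicates it to the others, reusing the s3_clients and buckets dictionaries above. The file name data.txt and the choice of ru-1 as the primary region are assumptions for illustration:

# Upload a file to the primary region, then replicate it to the other regions
primary = 'ru-1'
key = 'data.txt'  # assumed local file name, for illustration only

s3_clients[primary].upload_file(key, buckets[primary], key)

for region, s3_client in s3_clients.items():
    if region == primary:
        continue
    # Server-side copy from the primary bucket; swap in the download/upload
    # fallback shown earlier if cross-region copy is not supported
    s3_client.copy_object(
        CopySource={'Bucket': buckets[primary], 'Key': key},
        Bucket=buckets[region],
        Key=key
    )
    print(f'{key} replicated to {region}')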
Conclusion & Tips
Multi-region object storage with Selectel empowers you to build robust, highly available, and low-latency applications. By combining Python and boto3, you can automate bucket management, data replication, and disaster recovery strategies across regions.
Tips:
- Always secure your access keys and never hard-code them in production code.
- Regularly test your disaster recovery procedures by simulating region failures.
- Monitor storage usage and access patterns to optimize costs and performance.
Ready to try multi-region storage with Selectel? Share your experiences or questions in the comments below!