Introduction to
Amazon S3
Ashay Shirwadkar
Agenda
What is Amazon S3?
Storage Classes
Namespace
Security
Server Side Encryption
Access Control
S3 APIs
But before that...
- Cloud computing, also called on-demand computing, is a kind of
Internet-based computing that provides shared processing
resources.
- Resources being
- Networks
- Servers
- Storage
- Applications and services
What?
Why?
- The term cloud is used as a metaphor for the Internet.
- So by itself it means nothing; it's just a nice word that happens to be hot right now...
Types of storage
Client Server
Object
NFS/SMB/rsync | iSCSI/AoE/Fibre Channel | REST APIs
What is Amazon S3?
Amazon S3 stands for Simple Storage Service.
S3 is a web store, not a file system: a simple write once, read many (WORM) object
store with eventual consistency.
“Write once” means that an object cannot be changed after it is written, and “read many” means that
multiple copies of the object are made across different availability zones.
S3 is secure, durable and highly scalable. It is accessed via APIs (SOAP and REST)
Server side encryption
Data is stored with 99.999999999% durability
Stores objects ranging from 1 byte to 5 TB
A bedrock architectural component for many applications
Dropbox, Bitcasa, and Tahoe-LAFS-on-S3, among others, use S3 for online backup and synchronization
services. Tumblr, Spotify, and Pinterest host media on S3.
Consistency
Durability
[Diagram] A region contains multiple availability zones; each availability zone has its own
load balancers, web servers, indexing, and storage.
Cloud Storage Classes
Standard
● Designed to provide high durability and high availability.
● Designed to sustain the concurrent loss of data in two availability zones.
● For objects you want to have high durability.
● E.g. the master copy of a movie.
Reduced Redundancy Storage (see the request sketch after this list)
● Designed to provide lower redundancy while keeping availability.
● Reduces cost by storing data at a lower level of redundancy than standard storage.
● For objects you can afford to lose or can recreate.
● E.g. a different encoding of the movie.
Glacier
● Suitable for archiving data, where data access is infrequent and a retrieval time of
several hours is acceptable.
● Uses the very low cost Amazon Glacier service, but managed through S3.
● For objects you want to put in archive (rare use).
● E.g. a digital archive of old movie media.
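As an aside not on the original slide: on a per-object basis the storage class can be selected with the x-amz-storage-class header on the PUT request (Glacier is normally reached through a bucket lifecycle rule rather than a direct PUT). A minimal sketch, assuming a bucket named mybucket, an object photo.jpg, and placeholder credential variables in the style of the demo scripts at the end of the deck:

curl -X PUT -T photo.jpg \
  -H "Host: mybucket.s3.amazonaws.com" \
  -H "Date: ${dateValue}" \
  -H "x-amz-storage-class: REDUCED_REDUNDANCY" \
  -H "Authorization: AWS ${s3Key}:${signature}" \
  https://mybucket.s3.amazonaws.com/photo.jpg

With the Signature v2 scheme shown later, any x-amz-* header must also be folded into the CanonicalizedAmzHeaders part of the string to sign.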
Namespaces
S3 consists of buckets and objects; a single bucket can hold multiple objects.
Bucket names are globally unique.
bucket name + object name (key) => uniquely identifies each object in the S3 cloud. Every object can be
addressed through the bucket and key combination (see the URL examples below).
Buckets are similar to directories.
Object names must be unique within the bucket
Max 1024 bytes of UTF-8
Can have a ‘path’ prefix
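As a quick illustration not on the original slide, an object with key drafts/rpt.doc in the bucket johns-docshare (names taken from the namespace example on the next slide) can be addressed in either of the two common URL styles:

http://johns-docshare.s3.amazonaws.com/drafts/rpt.doc   (virtual-hosted style)
http://s3.amazonaws.com/johns-docshare/drafts/rpt.doc   (path style)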
Namespaces
Amazon S3
Buckets: johns-docshare, userdocs, src
Objects: drafts/rpt.doc, style.css, img/icon.ico, swf/flash.swf, user/foo.c, user/bar.c
Security
S3 is a regional service
Data never leaves the region unless you move it
Server Side Encryption
Automatic encryption of data at rest
Strong AES-256
Enabled using a simple PUT header (see the sketch below)
Self-managed, i.e. no need to manage a key store
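A minimal sketch, not from the original slides, of the PUT header that turns on server side encryption for an upload; the bucket, file, and credential variables are placeholders in the style of the demo scripts at the end of the deck:

curl -X PUT -T file.txt \
  -H "Host: mybucket.s3.amazonaws.com" \
  -H "Date: ${dateValue}" \
  -H "x-amz-server-side-encryption: AES256" \
  -H "Authorization: AWS ${s3Key}:${signature}" \
  https://mybucket.s3.amazonaws.com/file.txt

With Signature v2, the x-amz-server-side-encryption header must be included in the CanonicalizedAmzHeaders part of the string to sign.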
Server Side Encryption
[Diagram] Each object is encrypted with a per-object key; the per-object key is itself encrypted with a
master key held by S3 key management (rotated monthly). The encrypted data and the encrypted
per-object key together form the encrypted object stored in the bucket.
Access Control
S3 provides policies, ACLs, and IAM (Identity and Access Management).
Use these to define rules for sharing objects or buckets.
IAM
● Fine grained
● Provides role-based access
● Apply policies at the role, user, and group level.
Example: Allow Actions: PutObject, Resource: arn:aws:s3:::mybucket/* for users Bob and John.
Bucket Policies
● Fine grained
● Apply policies on a bucket from the AWS console (a JSON sketch follows this slide).
● Incorporate user restrictions without using IAM.
Example: Allow Bob, John Actions: PutObject, Resource: arn:aws:s3:::mybucket/* on My bucket.
ACLs
● Coarse grained
● Apply access control at the object or bucket level.
Example: Allow Bob, John Actions: Read on My bucket / My Object.
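The bucket-policy column corresponds to a JSON document attached to the bucket (via the PUT Bucket policy operation or the AWS console). A hedged sketch of what the "Allow Bob, John PutObject on arn:aws:s3:::mybucket/*" example could look like; the account ID 111122334455 and the user names are placeholders, not values from the deck:

{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Principal": {
        "AWS": [
          "arn:aws:iam::111122334455:user/Bob",
          "arn:aws:iam::111122334455:user/John"
        ]
      },
      "Action": "s3:PutObject",
      "Resource": "arn:aws:s3:::mybucket/*"
    }
  ]
}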
S3 API
Accessible through SOAP and REST APIs
In S3, the operations can be divided into 3 categories
- Operations on Service/s.
Get the list of all buckets owned by the authenticated sender of the request.
- Operations on Bucket/s.
- Operations on Object/s.
The user must have an Access Key and a Secret Access Key
- These provide access to the services
- Keys can be generated through IAM.
String to sign
- Every request has a different string to sign.
S3 Authentication - Client side
GET /foo/bar.jpg HTTP/1.1
Host: johnsmith.s3.amazonaws.com
Date: Mon, 26 Mar 2007 19:37:58 +0000
Request
● Create Request
● Create HMAC-SHA1 Signature
String to Sign (example)
GET\n
\n
\n
Mon, 26 Mar 2007 19:37:58 +0000\n
/johnsmith/foo/bar.jpg

String to Sign (format)
StringToSign = HTTP-Verb + "\n" +
    Content-MD5 + "\n" +
    Content-Type + "\n" +
    Date + "\n" +
    CanonicalizedAmzHeaders +
    CanonicalizedResource;
[Diagram] String to Sign + Secret Access Key → HMAC calculation and Base64 encoding → Your Signature
● Send Request
GET /foo/bar.jpg HTTP/1.1
Host: johnsmith.s3.amazonaws.com
Date: Mon, 26 Mar 2007 19:37:58 +0000
Authorization: AWS AccessKey:Signature
Request
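Anticipating the demo scripts at the end of the deck, here is a minimal bash sketch of the signing step: HMAC-SHA1 over the string to sign with the secret access key, then Base64 of the binary digest. The key value is a placeholder, and the Date header sent in the request must match ${dateValue} exactly:

dateValue=`date -R`
stringToSign="GET\n\n\n${dateValue}\n/johnsmith/foo/bar.jpg"
s3Secret='Your Secret Access Key'
# HMAC-SHA1 over the string to sign, Base64-encode the raw digest
signature=`echo -en "${stringToSign}" | openssl sha1 -hmac "${s3Secret}" -binary | base64`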
S3 Authentication - Server side
● Retrieve Access Key
● Create HMAC-SHA1 Signature
[Diagram] String to Sign + Secret Access Key → HMAC calculation and Base64 encoding → Calculated Signature
● Compare the two Signatures
GET /foo/bar.jpg HTTP/1.1
Host: johnsmith.s3.amazonaws.com
Date: Mon, 26 Mar 2007 19:37:58 +0000
Authorization: AWS AccessKey:Signature
Request
[Diagram] Get the Access key from the request, look up the matching Secret Access key, recompute the
Calculated Signature, and compare it with Your Signature.
Yes: Request is authenticated
No: Request authentication fails
Operations on Buckets
Standard Operations
Put Bucket - Creates the bucket if it does not exist.
Get Bucket - Lists all the objects within the bucket.
Delete Bucket - Deletes the bucket. All the objects within the bucket must be deleted first.
Other operations
Bucket lifecycle configuration - Set the lifecycle of objects within the bucket
Bucket policies - Set policies on the bucket
Bucket location - Set the location of the bucket
Bucket notification - Receive notifications when certain events happen in your bucket
Bucket logging - Enable logging for a bucket
Bucket request payment - Returns the request payment configuration of a bucket
Bucket versioning - Enable versioning of objects within the bucket (a request sketch follows this list)
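As a sketch not present in the deck, enabling versioning is a PUT on the bucket's ?versioning sub-resource with a small XML body. The bucket name and credential variables are placeholders, and with Signature v2 the CanonicalizedResource in the string to sign must include /mybucket/?versioning:

curl -X PUT \
  -H "Host: mybucket.s3.amazonaws.com" \
  -H "Date: ${dateValue}" \
  -H "Authorization: AWS ${s3Key}:${signature}" \
  --data-binary '<VersioningConfiguration xmlns="http://s3.amazonaws.com/doc/2006-03-01/"><Status>Enabled</Status></VersioningConfiguration>' \
  "https://mybucket.s3.amazonaws.com/?versioning"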
Operations on Objects
Standard Operations
Put Object - Creates an object.
Post Object - POST is an alternate form of PUT that enables browser-based uploads.
Get Object - Gets the object along with its metadata.
Head Object - Gets only the metadata.
Delete Object - Deletes the object.
Multipart Upload
Upload a single object as a set of parts.
Each part is a contiguous portion of the object's data.
Used for objects from 5 MB up to 5 TB in size.
Other operations
Object ACLs - Set the ACL permissions for an object that already exists in a bucket.
Object Copy - Creates a copy of an object (a request sketch follows this list).
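A minimal sketch, not from the deck, of Object Copy: a PUT on the destination key with the x-amz-copy-source header naming the source bucket and key. The bucket and key names and the credential variables are placeholders, and the x-amz-copy-source header must be included in the CanonicalizedAmzHeaders of the string to sign:

curl -X PUT \
  -H "Host: destbucket.s3.amazonaws.com" \
  -H "Date: ${dateValue}" \
  -H "x-amz-copy-source: /sourcebucket/source.txt" \
  -H "Authorization: AWS ${s3Key}:${signature}" \
  https://destbucket.s3.amazonaws.com/copy.txt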
Multipart Upload
Initiate Multipart Upload
Initiates a multipart upload and returns an upload ID.
Provide this upload ID in each subsequent upload part request.
POST /example-object?uploads HTTP/1.1
Host: example-bucket.s3.amazonaws.com
Date: Mon, 1 Nov 2010 20:34:56 GMT
Authorization: authorization string
Request
HTTP/1.1 200 OK
x-amz-id-2: Uuag1LuByRx9e6j5Onimru9pO4ZVKnJ2Qz7/C1NPcfTWAtRPfTaOFg==
x-amz-request-id: 656c76696e6727732072657175657374
Date: Mon, 1 Nov 2010 20:34:56 GMT
Content-Length: 197
Connection: keep-alive
Server: AmazonS3
<?xml version="1.0" encoding="UTF-8"?>
<InitiateMultipartUploadResult xmlns="http://s3.amazonaws.com/doc/2006-03-01/">
<Bucket>example-bucket</Bucket>
<Key>example-object</Key>
<UploadId>VXBsb2FkIElEIGZvciA2aWWpbmcncyBteS1tb3ZpZS5tMnRzIHVwbG9hZA</UploadId>
</InitiateMultipartUploadResult>
Response
Multipart Upload
Upload Part
Uploads a part in a multipart upload.
PUT /example-object?partNumber=1&uploadId=VXBsb2FkIElEIGZvciA2aWWpbmcncyBteS1tb3ZpZS5tMnRzIHVwbG9hZA HTTP/1.1
Host: example-bucket.s3.amazonaws.com
Date: Mon, 1 Nov 2010 20:34:56 GMT
Content-Length: 10485760
Content-MD5: pUNXr/BjKK5G2UKvaRRrOA==
Authorization: authorization string
***part data omitted***
Request
HTTP/1.1 200 OK
x-amz-id-2:
Vvag1LuByRx9e6j5Onimru9pO4ZVKnJ2QRPfTaOFg==
x-amz-request-id: 656c76696e6727732072657175657374
Date: Mon, 1 Nov 2010 20:34:56 GMT
ETag: "b54357faf0632cce46e942fa68356b38"
Content-Length: 0
Connection: keep-alive
Server: AmazonS3
Response
Multipart Upload
Complete Multipart Upload
Completes a multipart upload by assembling previously uploaded parts.
POST /example-object?uploadId=VXBsb2FkIElEIGZvciA2aWWpbmcncyBteS1tb3ZpZS5tMnRzIHVwbG9hZA HTTP/1.1
Host: example-bucket.s3.amazonaws.com
Date: Mon, 1 Nov 2010 20:34:56 GMT
Content-Length: 391
Authorization: authorization string
<CompleteMultipartUpload>
<Part>
<PartNumber>1</PartNumber>
<ETag>"b54357faf0632cce46e942fa68356b38"</ETag>
</Part>
<Part>
…
</Part>
</CompleteMultipartUpload>
Request
HTTP/1.1 200 OK
x-amz-id-2:
Uuag1LuByRx9e6j5Onimru9pO4ZVKnJ2Qz7/C1NPcfTWAtRPfTaOFg==
x-amz-request-id: 656c76696e6727732072657175657374
Date: Mon, 1 Nov 2010 20:34:56 GMT
Connection: close
Server: AmazonS3
<?xml version="1.0" encoding="UTF-8"?>
<CompleteMultipartUploadResult xmlns="http://s3.amazonaws.com/doc/2006-03-01/">
<Location>http://Example-Bucket.s3.amazonaws.com/Example-Object</Location>
<Bucket>Example-Bucket</Bucket>
<Key>Example-Object</Key>
<ETag>"3858f62230ac3c915f300c664312c11f-9"</ETag>
</CompleteMultipartUploadResult>
Response
Multipart Upload
Abort Multipart Upload
DELETE /example-object?uploadId=VXBsb2FkIElEIGZvciBlbHZpbmcncyBteS1tb3ZpZS5tMnRzIHVwbG9hZ HTTP/1.1
Host: example-bucket.s3.amazonaws.com
Date: Mon, 1 Nov 2010 20:34:56 GMT
Authorization: authorization string
Request

Get Object
GET /example-object HTTP/1.1
Host: example-bucket.s3.amazonaws.com
Date: Mon, 1 Nov 2010 20:34:56 GMT
Authorization: authorization string
Request
API Operations
API Operations contd.
DEMO
Create Bucket
#!/bin/bash -x
bucket=$1
if [ -z "$1" ]
then
    echo "usage: ./bucket_put <bucket_name>"
    exit 1
fi
resource="/${bucket}/"
dateValue=`date -R`
stringToSign="PUT\n\n\n${dateValue}\n${resource}"
s3Key='Your Access Key'
s3Secret='Your Secret Access Key'
signature=`echo -en "${stringToSign}" | openssl sha1 -hmac "${s3Secret}" -binary | base64`
curl -v -X PUT \
  -H "Host: ${bucket}.s3.amazonaws.com" \
  -H "Date: ${dateValue}" \
  -H "Authorization: AWS ${s3Key}:${signature}" \
  http://${bucket}.s3.amazonaws.com/
< HTTP/1.1 200 OK
< x-amz-id-2:
jWt9BVmZkL1eU/i1gRoUrsB19/RHYwHGJZdst5ttGlLx7IvFPzHDSSNFluRyDRrCewG4xoFi
oJA=
< x-amz-request-id: 9FC84EB054B018F9
< Date: Wed, 12 Aug 2015 11:52:02 GMT
< Location: /casoft
< Content-Length: 0
* Server AmazonS3 is not blacklisted
< Server: AmazonS3
<
* Connection #0 to host calsoft.s3.amazonaws.com left intact
Request Response
#!/bin/bash -x
bucket=$1
if [ -z "$1" ]
then
    echo "usage: ./bucket_list <bucket_name>"
    exit 1
fi
resource="/${bucket}/"
dateValue=`date -R`
stringToSign="GET\n\n\n${dateValue}\n${resource}"
s3Key='Your Access Key'
s3Secret='Your Secret Access Key'
signature=`echo -en "${stringToSign}" | openssl sha1 -hmac "${s3Secret}" -binary | base64`
curl -v -X GET \
  -H "Host: ${bucket}.s3.amazonaws.com" \
  -H "Date: ${dateValue}" \
  -H "Authorization: AWS ${s3Key}:${signature}" \
  http://${bucket}.s3.amazonaws.com/
List Bucket
< HTTP/1.1 200 OK
< x-amz-id-2:
nBbw0yclRZ4jPzPEECEKI0oCRMQrdIihEXbCVuUvYdXl75CGYH3/IcsPu/jxkCJb
< x-amz-request-id: 90A0E3312B09453A
< Date: Wed, 12 Aug 2015 11:55:48 GMT
< x-amz-bucket-region: us-east-1
< Content-Type: application/xml
< Transfer-Encoding: chunked
* Server AmazonS3 is not blacklisted
< Server: AmazonS3
<
<?xml version="1.0" encoding="UTF-8"?>
* Connection #0 to host calsoft.s3.amazonaws.com left intact
<ListBucketResult xmlns="http://s3.amazonaws.com/doc/2006-03-01/"><Name>calsoft</Name><Prefix></Prefix><Marker></Marker><MaxKeys>1000</MaxKeys><IsTruncated>false</IsTruncated></ListBucketResult>
Request Response
#!/bin/bash -x
bucket=$1
if [ -z "$1" ]
then
    echo "usage: ./bucket_delete <bucket_name>"
    exit 1
fi
resource="/${bucket}/"
dateValue=`date -R`
stringToSign="DELETE\n\n\n${dateValue}\n${resource}"
s3Key='Your Access Key'
s3Secret='Your Secret Access Key'
signature=`echo -en "${stringToSign}" | openssl sha1 -hmac "${s3Secret}" -binary | base64`
curl -v -X DELETE \
  -H "Host: ${bucket}.s3.amazonaws.com" \
  -H "Date: ${dateValue}" \
  -H "Authorization: AWS ${s3Key}:${signature}" \
  https://${bucket}.s3.amazonaws.com/
Delete Bucket
< HTTP/1.1 204 No Content
< x-amz-id-2:
4uuTflJqeUnewAYGmgghfiaBf/yfdja3DE7GmC9+e0QmBmE9T+2c/Ylt19jcndrM
< x-amz-request-id: 96AEC8E1A534EC3E
< Date: Wed, 12 Aug 2015 12:06:44 GMT
* Server AmazonS3 is not blacklisted
< Server: AmazonS3
<
* Connection #0 to host calsoft.s3.amazonaws.com left intact
Request Response
#!/bin/bash -x
bucket=$1
file=$2
if [[ (-z "$1") || (-z "$2") ]]
then
    echo "usage: ./object_upload <bucket_name> <object_name>"
    exit 1
fi
resource="/${bucket}/${file}"
contentType="application/text"
dateValue=`date -R`
stringToSign="PUT\n\n${contentType}\n${dateValue}\n${resource}"
s3Key='Your Access Key'
s3Secret='Your Secret Access Key'
signature=`echo -en "${stringToSign}" | openssl sha1 -hmac "${s3Secret}" -binary | base64`
curl -v -X PUT -T "${file}" \
  -H "Host: ${bucket}.s3.amazonaws.com" \
  -H "Date: ${dateValue}" \
  -H "Content-Type: ${contentType}" \
  -H "Authorization: AWS ${s3Key}:${signature}" \
  https://${bucket}.s3.amazonaws.com/${file}
Create Object
< HTTP/1.1 100 Continue
* We are completely uploaded and fine
< HTTP/1.1 200 OK
< x-amz-id-2:
OzR2U7CgsWwtHVbF8qcTiIpezFk5FVt9PxoFq9Px2QP8y7L0kOR2gQysfu9/EFNjUBdqIrzs
E2o=
< x-amz-request-id: 301904328CA5B6FF
< Date: Wed, 12 Aug 2015 12:08:01 GMT
< ETag: "78d5333e735ae15f5f19f2e76838b728"
< Content-Length: 0
* Server AmazonS3 is not blacklisted
< Server: AmazonS3
<
* Connection #0 to host calsoft.s3.amazonaws.com left intact
Request Response
#!/bin/bash -x
bucket=$1
file=$2
if [[ (-z "$1") || (-z "$2") ]]
then
    echo "usage: ./object_download <bucket_name> <object_name>"
    exit 1
fi
resource="/${bucket}/${file}"
dateValue=`date -R`
stringToSign="GET\n\n\n${dateValue}\n${resource}"
s3Key='Your Access Key'
s3Secret='Your Secret Access Key'
signature=`echo -en "${stringToSign}" | openssl sha1 -hmac "${s3Secret}" -binary | base64`
curl -v -X GET \
  -H "Host: ${bucket}.s3.amazonaws.com" \
  -H "Date: ${dateValue}" \
  -H "Authorization: AWS ${s3Key}:${signature}" \
  https://${bucket}.s3.amazonaws.com/${file}
Get Object
< HTTP/1.1 200 OK
< x-amz-id-2:
7yX1htWSKHVn+ssy32emeQoaF8WgFjRQuEio4PNzSyxjoJiPnmjcsmJvbJFZTTDx
< x-amz-request-id: 1786B98B373C737E
< Date: Wed, 12 Aug 2015 12:11:42 GMT
< Last-Modified: Wed, 12 Aug 2015 12:08:01 GMT
< ETag: "78d5333e735ae15f5f19f2e76838b728"
< Accept-Ranges: bytes
< Content-Type: application/text
< Content-Length: 12
* Server AmazonS3 is not blacklisted
< Server: AmazonS3
<
Hello,World
* Connection #0 to host calsoft.s3.amazonaws.com left intact
Request Response
#!/bin/bash -x
bucket=$1
file=$2
if [[ (-z "$1") || (-z "$2") ]]
then
    echo "usage: ./object_delete <bucket_name> <object_name>"
    exit 1
fi
resource="/${bucket}/${file}"
dateValue=`date -R`
stringToSign="DELETE\n\n\n${dateValue}\n${resource}"
s3Key='Your Access Key'
s3Secret='Your Secret Access Key'
signature=`echo -en "${stringToSign}" | openssl sha1 -hmac "${s3Secret}" -binary | base64`
curl -v -X DELETE \
  -H "Host: ${bucket}.s3.amazonaws.com" \
  -H "Date: ${dateValue}" \
  -H "Authorization: AWS ${s3Key}:${signature}" \
  https://${bucket}.s3.amazonaws.com/${file}
Delete Object
< HTTP/1.1 204 No Content
< x-amz-id-2:
swMz88s6IV8i3dCwP6fSuklrubABX0O7XV1jBt7fUZtCP2x86IPozq+5Usy5wE7x
< x-amz-request-id: 247E152DAB64A000
< Date: Wed, 12 Aug 2015 12:14:58 GMT
* Server AmazonS3 is not blacklisted
< Server: AmazonS3
<
* Connection #0 to host calsoft.s3.amazonaws.com left intact
Request Response
Q & A