Upload statics to S3 server


A little script to upload statics to an S3 bucket using the official Amazon SDK.

In order to use this module, you'll need AWS credentials. You can load them in two ways:

* By passing them directly to the method as the second parameter.
* By setting an ENV variable with the path to a file containing the credentials.

The ENV variable is `AWS_CREDENTIALS_PATH` and the file it points to should contain `accessKeyId`, `secretAccessKey`, `region` and `bucket`.
```bash
npm install s3-folder-upload -D
```
In case you want to use the CLI, you can run it directly with npx:

```bash
npx s3-folder-upload
```
```javascript
const s3FolderUpload = require('s3-folder-upload')
// or the ES6 way
// import s3FolderUpload from 's3-folder-upload'

const directoryName = 'statics'

// I strongly recommend saving your credentials in a JSON file, in ENV variables or as command line args
const credentials = {
  "accessKeyId": "<Your Access Key Id>",
  "secretAccessKey": "<Your Secret Access Key>",
  "region": "<Your AWS Region>",
  "bucket": "<Your Bucket Name>"
}

// optional options to be passed as a parameter to the method
const options = {
  useFoldersForFileTypes: false,
  useIAMRoleCredentials: false
}

// optional CloudFront invalidation rule
const invalidation = {
  awsDistributionId: "<Your CloudFront Distribution Id>",
  awsInvalidationPath: "<The path to invalidate, e.g. '/js/*'>"
}

s3FolderUpload(directoryName, credentials, options, invalidation)
```
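If you prefer the `AWS_CREDENTIALS_PATH` approach mentioned above, a minimal sketch could look like this. It assumes the file the variable points to is plain JSON with the four documented keys, and that the library picks it up when no credentials object is passed; the file and script names are just examples:

```javascript
// upload.js — hypothetical script name used for this sketch
const s3FolderUpload = require('s3-folder-upload')

// No credentials object here: the library is expected to read them
// from the file referenced by AWS_CREDENTIALS_PATH
s3FolderUpload('statics')

// Run with something like:
//   AWS_CREDENTIALS_PATH=./aws-credentials.json node upload.js
```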
- `useFoldersForFileTypes` (default: `true`): Upload files to a specific subdirectory according to their file type.
- `useIAMRoleCredentials` (default: `false`): Ignore all the credentials passed via parameters or environment variables and use the instance IAM credentials profile instead.
- `uploadFolder` (default: `undefined`): If specified, the statics will be uploaded to that folder. For example, if you upload `static.js` to https://statics.s3.eu-west-1.amazonaws.com with an `uploadFolder` of `my-statics`, the file will end up at https://statics.s3.eu-west-1.amazonaws.com/my-statics/static.js.
- `ACL` (default: `public-read`): Defines which AWS accounts or groups are granted access and the type of access.
- `CacheControl` (default: `public, max-age=31536000`): HTTP header that holds directives (instructions) for caching in both requests and responses.
- `Expires` (default: `31536000`): Header that contains the date/time after which the response is considered stale. If there is a `Cache-Control` header with the `max-age` or `s-maxage` directive in the response, the `Expires` header is ignored.
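As a combined illustration of the options above, here is a sketch with placeholder values (only the option names come from the list; the concrete values are arbitrary):

```javascript
const options = {
  useFoldersForFileTypes: false,         // don't group files into subdirectories by file type
  useIAMRoleCredentials: false,          // use the credentials object instead of the instance IAM profile
  uploadFolder: 'my-statics',            // prefix every uploaded file with my-statics/
  ACL: 'public-read',                    // explicit here, but this is already the default
  CacheControl: 'public, max-age=86400'  // cache for one day instead of the default year
}

s3FolderUpload(directoryName, credentials, options)
```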
If you use the library programmatically, you can override the `ACL`, `CacheControl` and `Expires` values at file level:
```javascript
const options = {
  useFoldersForFileTypes: false,
  useIAMRoleCredentials: false,
}

// Per-file overrides: index.html gets a short cache lifetime,
// everything else keeps the defaults from the options above
const filesOptions = {
  'index.html': {
    CacheControl: 'public, max-age=300',
    Expires: new Date("Fri, 01 Jan 1971 00:00:00 GMT")
  }
}

s3FolderUpload(directoryName, credentials, options, filesOptions)
```
```bash
s3-folder-upload <folder>
```

Example:

```bash
s3-folder-upload statics
```
For the AWS Credentials

* you can define an ENV variable called `AWS_CREDENTIALS_PATH` with the path of the file with the needed info.
* you can pass the needed info via command line parameters:

```bash
s3-folder-upload
```

* you can use the `useIAMRoleCredentials` option in order to rely on the instance IAM profile instead of any credentials passed via parameters or environment variables.
For Options

* you can pass the needed info via command line parameters:

```bash
s3-folder-upload
```
For CloudFront invalidation

* you can pass the needed info via command line parameters; the invalidation needs both parameters:

```bash
s3-folder-upload
```
`S3_FOLDER_UPLOAD_LOG`: You can specify the level of logging for the library.

* `none`: No logging output
* `only_errors`: Only errors are logged
* `all` (default): Errors, progress and useful messages are logged.

Example of use:

```bash
S3_FOLDER_UPLOAD_LOG=only_errors s3-folder-upload
```
If you use the library programmatically, this environment variable will be read as well. For example:

```bash
S3_FOLDER_UPLOAD_LOG=only_errors node upload-script.js
```
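For reference, `upload-script.js` in the command above could be as simple as this sketch (the directory name and the credential placeholders are illustrative):

```javascript
// upload-script.js — hypothetical script matching the command above.
// Nothing log-related is configured here: the library reads
// S3_FOLDER_UPLOAD_LOG from the environment on its own.
const s3FolderUpload = require('s3-folder-upload')

const credentials = {
  accessKeyId: '<Your Access Key Id>',
  secretAccessKey: '<Your Secret Access Key>',
  region: '<Your AWS Region>',
  bucket: '<Your Bucket Name>'
}

s3FolderUpload('statics', credentials)
```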
Roadmap

- [x] Upload an entire folder to S3 instead of a single file
- [x] Async upload of files to improve time
- [x] Detect automatically the content type of the files (limited support)
- [x] Return the list of files uploaded with the final URL
- [x] Better support for parameters with the CLI
- [ ] Improve the content type detection in order to support more and better file types
- [ ] Avoid re-uploading files if they didn't change
- [ ] Check if cache is blocking updates of statics on the website
- [ ] Map uploaded paths to create a default invalidation paths rule in CloudFront