# Amplify Data Migration Tool
- Implementation of migration in TypeScript
- Execute export using DynamoDB Point-In-Time Recovery
- Management of executed migrations
```sh
npm install -D @purpom-media-lab/amplify-data-migration
```
To use this tool, you need the following AWS IAM permissions:
#### DynamoDB (Migration Management and Data Operations)
- dynamodb:CreateTable - Create migration table
- dynamodb:DeleteTable - Delete migration table
- dynamodb:PutItem - Save migration execution records
- dynamodb:Query - Retrieve executed migrations
- dynamodb:Scan - Scan model data
- dynamodb:BatchWriteItem - Batch write data
- dynamodb:ExportTableToPointInTime - Execute Point-in-Time Recovery export
- dynamodb:DescribeExport - Check export status
- dynamodb:DescribeContinuousBackups - Check Point-in-Time Recovery status
#### S3 (Export Data Storage)
- s3:CreateBucket - Create export bucket
- s3:DeleteBucket - Delete bucket
- s3:GetObject - Read export data
- s3:PutObject - Write export data (via DynamoDB export)
- s3:ListBucket - List objects in bucket
#### AWS Amplify (Application Information)
- amplify:GetBranch - Get branch and backend stack information
#### CloudFormation (DynamoDB Table Information)
- cloudformation:ListStackResources - List stack resources
- cloudformation:DescribeStacks - Get stack information
```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "DynamoDBPermissions",
      "Effect": "Allow",
      "Action": [
        "dynamodb:CreateTable",
        "dynamodb:DeleteTable",
        "dynamodb:PutItem",
        "dynamodb:Query",
        "dynamodb:Scan",
        "dynamodb:BatchWriteItem",
        "dynamodb:ExportTableToPointInTime",
        "dynamodb:DescribeExport",
        "dynamodb:DescribeContinuousBackups"
      ],
      "Resource": [
        "arn:aws:dynamodb:*:*:table/amplify-data-migration-*",
        "arn:aws:dynamodb:*:*:table/*-*-*"
      ]
    },
    {
      "Sid": "S3Permissions",
      "Effect": "Allow",
      "Action": [
        "s3:CreateBucket",
        "s3:DeleteBucket",
        "s3:GetObject",
        "s3:PutObject",
        "s3:ListBucket"
      ],
      "Resource": [
        "arn:aws:s3:::amplify-export-*",
        "arn:aws:s3:::amplify-export-*/*"
      ]
    },
    {
      "Sid": "AmplifyPermissions",
      "Effect": "Allow",
      "Action": [
        "amplify:GetBranch"
      ],
      "Resource": "arn:aws:amplify:*:*:apps/*"
    },
    {
      "Sid": "CloudFormationPermissions",
      "Effect": "Allow",
      "Action": [
        "cloudformation:ListStackResources",
        "cloudformation:DescribeStacks"
      ],
      "Resource": "arn:aws:cloudformation:*:*:stack/*/*"
    }
  ]
}
```

Note:
- Adjust the Resource values according to your actual environment
- Follow the principle of least privilege by restricting access to only necessary resources
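
For example, to scope the DynamoDB statement to a single region and account, replace the wildcard segments with concrete values (the region and account ID below are placeholders):

```json
"Resource": [
  "arn:aws:dynamodb:us-east-1:123456789012:table/amplify-data-migration-*"
]
```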
#### Creating a migration table

At the beginning (for each app and branch), create a migration table and an S3 bucket with the following command.
```sh
data-migration init --appId '<appId>'
```
You can create a migration file template by specifying the name of the migration:
```sh
data-migration create --name '<migrationName>'
```
#### Update Models
Suppose the Todo model already exists as follows.
```ts
const schema = a.schema({
  Todo: a
    .model({
      content: a.string(),
    })
    .authorization((allow) => [allow.owner()]),
});
```
After the release, suppose you need a completed field, so you add it to the Todo model as follows.
```ts
const schema = a.schema({
  Todo: a
    .model({
      content: a.string(),
      completed: a.boolean().required(),
    })
    .authorization((allow) => [allow.owner()]),
});
```
This change adds a completed field, but existing records in the DynamoDB table do not have it. Because completed is a required field, an error occurs when you try to fetch such an existing record via AppSync.
You can implement the migration in a class that implements the Migration interface. The following is an example of a migration that adds the completed field with the value false.
```ts
import {
  Migration,
  MigrationContext,
  ModelTransformer,
} from "@purpom-media-lab/amplify-data-migration";

export default class AddCompletedField_1725285846599 implements Migration {
  readonly name = "add_completed_field";
  readonly timestamp = 1725285846599;

  async run(context: MigrationContext) {
    type OldTodo = { id: string; content: string };
    type NewTodo = { id: string; content: string; completed: boolean };
    // Transform each existing record, defaulting completed to false.
    const transformer: ModelTransformer<OldTodo, NewTodo> = async (
      oldModel
    ) => {
      return { ...oldModel, completed: false };
    };
    await context.modelClient.updateModel("Todo", transformer);
  }
}
```
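
Since the transform is a pure function of each record, its logic can be sanity-checked outside the migration runner. A minimal sketch in plain TypeScript (no package APIs involved):

```typescript
type OldTodo = { id: string; content: string };
type NewTodo = { id: string; content: string; completed: boolean };

// Same per-record transform as the migration above, as a plain function.
const addCompleted = (oldModel: OldTodo): NewTodo => ({
  ...oldModel,
  completed: false,
});

const migrated = addCompleted({ id: "t1", content: "buy milk" });
console.log(migrated); // { id: 't1', content: 'buy milk', completed: false }
```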
#### Put Models
Suppose you add a Profile model that did not exist before the release. A Profile exists for each user, and since the application requires it, the migration needs to register a Profile for each user in DynamoDB.
```ts
const schema = a.schema({
  Profile: a
    .model({
      name: a.string().required(),
    })
    .authorization((allow) => [allow.owner()]),
});
```
You can use the ModelClient.putModel method in amplify-data-migration to register new data. The following migration registers a new Profile for each user in the Cognito user pool:
```ts
import {
  Migration,
  MigrationContext,
  ModelGenerator,
} from "@purpom-media-lab/amplify-data-migration";
import {
  CognitoIdentityProviderClient,
  ListUsersCommand,
} from "@aws-sdk/client-cognito-identity-provider";
import crypto from "node:crypto";

const client = new CognitoIdentityProviderClient();

type Profile = {
  id: string;
  name: string;
  owner: string;
  createdAt?: string;
  updatedAt?: string;
};

export default class AddProfileModel_1725285846601 implements Migration {
  readonly name = "add_profile_model";
  readonly timestamp = 1725285846601;
  private userPoolId: string = process.env.USER_POOL_ID!;

  async run(context: MigrationContext) {
    const userPoolId = this.userPoolId;
    // Yield a Profile for every user, paginating through ListUsers.
    const generator: ModelGenerator<Profile> = async function* () {
      let token: string | undefined;
      do {
        const command: ListUsersCommand = new ListUsersCommand({
          UserPoolId: userPoolId,
          Limit: 20,
          PaginationToken: token,
        });
        const response = await client.send(command);
        token = response.PaginationToken;
        for (const user of response.Users ?? []) {
          const owner = `${user.Username}::${user.Username}`;
          const now = new Date().toISOString();
          yield {
            id: crypto.randomUUID(),
            name:
              user?.Attributes?.find((attribute) => attribute.Name === "email")
                ?.Value ?? "",
            owner,
            createdAt: now,
            updatedAt: now,
          };
        }
      } while (token);
    };
    await context.modelClient.putModel("Profile", generator);
  }
}
```
When you run the data-migration migrate command as shown below, amplify-data-migration will run pending migrations.
```sh
data-migration migrate --appId '<appId>'
```
Suppose the Book model exists as follows.
```ts
const schema = a.schema({
  Book: a.model({
    author: a.string(),
    title: a.string(),
  }),
});
```
After the release, suppose you need to use the author and title fields as the key, and you change the Book model as follows.
```ts
const schema = a.schema({
  Book: a
    .model({
      author: a.id().required(),
      title: a.string(),
    })
    .identifier(["author", "title"])
    .authorization((allow) => [allow.owner()]),
});
```
If you change a model's key on AWS Amplify, the DynamoDB table is replaced and the existing data is deleted. Therefore, the existing data cannot be migrated by implementing the Migration.run function alone. In this case, implement the export of the existing data in the Migration.export function: there you can export the model's existing data by calling context.modelClient.exportModel. You can then import the exported data into the replaced table by calling context.modelClient.runImport in the Migration.run function.
```ts
import {
  ExportContext,
  Migration,
  MigrationContext,
  ModelTransformer,
} from "@purpom-media-lab/amplify-data-migration";

export default class ChangeBookKey_1725285846600 implements Migration {
  readonly name = "change_book_key";
  readonly timestamp = 1725285846600;

  async export(context: ExportContext): Promise<Record<string, string>> {
    // Export Book table to S3 bucket with Point-in-Time Recovery
    const key = await context.modelClient.exportModel("Book");
    return { Book: key };
  }

  async run(context: MigrationContext) {
    type OldBook = { id: string; author: string; title: string };
    type NewBook = { author: string; title: string };
    const newKeys: string[] = [];
    const transformer: ModelTransformer<OldBook, NewBook> = async (
      oldModel
    ) => {
      const { id, ...newModel } = oldModel;
      if (newKeys.includes(`${newModel.author}:${newModel.title}`)) {
        // Skip if the same key already exists.
        return null;
      }
      newKeys.push(`${newModel.author}:${newModel.title}`);
      return newModel;
    };
    // Import exported data to new Book table.
    await context.modelClient.runImport("Book", transformer);
  }
}
```
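
The key-dedup step above can be sketched as a standalone function in plain TypeScript (no package APIs; a Set is used here instead of an array for faster lookups):

```typescript
type OldBook = { id: string; author: string; title: string };
type NewBook = { author: string; title: string };

// Drops the old id field and skips records whose author:title key was already seen.
function makeBookTransformer() {
  const seen = new Set<string>();
  return (oldBook: OldBook): NewBook | null => {
    const { id, ...newBook } = oldBook;
    const key = `${newBook.author}:${newBook.title}`;
    if (seen.has(key)) {
      return null; // duplicate under the new composite key; skip it
    }
    seen.add(key);
    return newBook;
  };
}

const transform = makeBookTransformer();
console.log(transform({ id: "1", author: "Alice", title: "Book A" })); // kept
console.log(transform({ id: "2", author: "Alice", title: "Book A" })); // null
```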
Run the data-migration export command as follows, and amplify-data-migration executes the export for pending migrations. Usually, this command is expected to be called before deploying with npx ampx pipeline-deploy.
```sh
data-migration export --appId '<appId>'
```
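
As a sketch of that ordering in an Amplify CI/CD build (this file is an assumption, not part of this tool; AWS_APP_ID and AWS_BRANCH are standard Amplify build environment variables, and any flags beyond --appId must be adapted to your setup):

```yaml
# amplify.yml (sketch): export before the backend deploy, migrate after it.
backend:
  phases:
    build:
      commands:
        - npm ci
        - npx data-migration export --appId "$AWS_APP_ID"
        - npx ampx pipeline-deploy --branch "$AWS_BRANCH" --app-id "$AWS_APP_ID"
        - npx data-migration migrate --appId "$AWS_APP_ID"
```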
If you no longer want to use the Amplify Data Migration Tool, run the following command to destroy the migration table and S3 bucket.
```sh
data-migration destroy --appId '<appId>'
```