Direct Connect enables failover testing, CodePipeline can now trigger Step Functions, AWS introduces the CloudEndure Migration Factory Solution, and Redshift adds a useful storage feature.
Kicking off this week is the announcement that AWS Direct Connect now supports failover testing. Direct Connect is a service that lets you privately connect your data center, office, or co-location environment directly to your AWS account without traversing the internet. Until now, AWS hadn't provided any tools to induce failures so you could test the resiliency of your connection. Now, AWS have added a failover testing feature to the Resiliency Toolkit, which lets you shut down your BGP (Border Gateway Protocol) sessions for a configurable time period.
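As a rough sketch, here is what kicking off such a test might look like from code. This assumes the Direct Connect `StartBgpFailoverTest` API (invoked via boto3 as `client("directconnect").start_bgp_failover_test(**params)`); the virtual interface ID and duration below are placeholder values, so check the parameter names against the current docs:

```python
# Sketch: building the request for Direct Connect's StartBgpFailoverTest API.
# You would pass the resulting dict to boto3, e.g.:
#   boto3.client("directconnect").start_bgp_failover_test(**params)
# The virtual interface ID used below is a placeholder.

def build_failover_test_request(virtual_interface_id, minutes=180, bgp_peers=None):
    """Assemble parameters for a BGP failover test.

    minutes is how long the BGP session stays down before AWS restores it;
    bgp_peers optionally limits the test to specific peers (by default all
    peers on the virtual interface are taken down).
    """
    params = {
        "virtualInterfaceId": virtual_interface_id,
        "testDurationInMinutes": minutes,
    }
    if bgp_peers:
        params["bgpPeers"] = list(bgp_peers)
    return params

# Take the BGP session down for 10 minutes on a (placeholder) virtual interface.
params = build_failover_test_request("dxvif-ffabc123", minutes=10)
```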
You can also cancel the failover test at any time and return to your previous working configuration. This is a very useful feature for verifying that your connections are highly available and resilient. Next up is some news for all of you DevOps engineers out there using AWS CodePipeline: it has a new action type that triggers AWS Step Functions, making it easier to invoke complex workflows as part of your release process.
With the new action type, CodePipeline stages can now trigger Step Functions state machines, which support error handling and asynchronous tasks, and can easily invoke other AWS services through service integrations. This will help everyone using CodePipeline to keep their release pipelines simple and delegate complex workflows to Step Functions.
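As an illustration, the new action could be declared like this in a pipeline definition. The state machine ARN and action name are placeholders, and the configuration keys (`StateMachineArn`, `Input`) should be verified against the CodePipeline documentation:

```python
# Sketch of a CodePipeline action declaration that invokes a Step Functions
# state machine via the new Invoke/StepFunctions action type.
# All names and ARNs below are placeholders.

def step_functions_action(name, state_machine_arn, input_json=None):
    """Build an action declaration for the Invoke/StepFunctions action type."""
    configuration = {"StateMachineArn": state_machine_arn}
    if input_json is not None:
        # Literal JSON handed to the state machine execution as its input.
        configuration["Input"] = input_json
    return {
        "name": name,
        "actionTypeId": {
            "category": "Invoke",
            "owner": "AWS",
            "provider": "StepFunctions",
            "version": "1",
        },
        "configuration": configuration,
        "runOrder": 1,
    }

action = step_functions_action(
    "RunReleaseWorkflow",
    "arn:aws:states:us-east-1:123456789012:stateMachine:release-checks",
    input_json='{"stage": "prod"}',
)
```

This dict would slot into a stage's action list in a pipeline definition, alongside your existing build and deploy actions.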
Now, two weeks ago, I mentioned that AWS released a game analytics pipeline solution as part of their AWS Solutions Library, which was a really cool look at how AWS solutions architects construct answers to the different problems their customers bring them. Well, this week AWS have released the AWS CloudEndure Migration Factory Solution, which details how to migrate a large number of servers over to AWS using CloudEndure in a simplified and fast way, but still at scale.
It automates many time-consuming tasks that enterprises face when migrating to the cloud, like checking source machine prerequisites, installing and uninstalling software on the source and target machines, and so on. Thousands of servers have been migrated to AWS using this solution, so I highly suggest you check it out. Finally this week, a quick note from the Redshift team. You can now restrict the amount of disk space used by a schema in Amazon Redshift.
This lets you set quotas on the maximum amount of storage consumed by your schemas, allowing you to control and monitor the amount of storage used by different applications and users across your organization.
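For illustration, setting a quota is a one-liner in Redshift SQL; the schema name and sizes here are placeholders, so treat this as a sketch and confirm the syntax against the Redshift docs:

```sql
-- Create a schema capped at 2 GB of disk usage (schema name is a placeholder).
CREATE SCHEMA sales_analytics QUOTA 2 GB;

-- Raise the quota later as the application grows.
ALTER SCHEMA sales_analytics QUOTA 5 GB;

-- Or remove the cap entirely.
ALTER SCHEMA sales_analytics QUOTA UNRESTRICTED;
```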