Designing an enterprise-level cloud solution on the AWS public cloud for SAP S/4HANA systems can be a double-edged sword. On the one hand, you gain a more agile and scalable business with increased innovation; on the other, you may be hindered by the need for a defined roadmap, the choice of solution, and the challenges of integrating your existing landscape. Having recently had the opportunity to design a solution for a few of these processes on the AWS cloud, here are some things I have learnt and would like to share.
The Application Scheduler
This automation module enables pay-per-use of compute running on the AWS cloud. Importantly, SAP applications have to be gracefully started and shut down to ensure data integrity is maintained. The cloud-integrated scheduling tool has an easy-to-use UI that allows controlled shutdowns of SAP systems and their related compute instances. When an SAP system is stopped as a planned activity, a blackout window is set in the corresponding SOLMAN monitoring via the core Python (boto3) script.
A stop request should perform the following steps:
- Set a blackout in SOLMAN monitoring for the planned scheduled outage.
- Shut down the SAP system in a controlled manner, ensuring the application servers, ASCS and database are stopped in the correct sequence.
- Once the SAP components on the EC2 instances are successfully shut down, take snapshots of the relevant file systems and store them in AWS S3 buckets.
- Check that the SAP system has shut down safely (and report if not).
- Shut down the AWS EC2 instances.
- Retain logs on AWS.
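The stop sequence above can be sketched as a small orchestration plan. This is a minimal, hypothetical illustration: the function and step names are my own inventions, not part of any real tool, and in practice each step would call boto3 and SAP host agent commands rather than return strings.

```python
# Illustrative sketch of the stop-request sequence described above.
# Step names are hypothetical; each would wrap real boto3 / sapcontrol calls.

SAP_STOP_ORDER = ["application_servers", "ascs", "database"]

def build_stop_plan(sid: str) -> list:
    """Return the ordered steps a stop request should execute for one SID."""
    plan = [f"set_solman_blackout:{sid}"]                    # suppress alerts first
    plan += [f"stop_sap:{sid}:{c}" for c in SAP_STOP_ORDER]  # apps -> ASCS -> DB
    plan += [
        f"snapshot_filesystems:{sid}",   # snapshots stored in S3 after shutdown
        f"verify_sap_down:{sid}",        # report if anything is still running
        f"stop_ec2:{sid}",               # only now stop the EC2 instances
        f"retain_logs:{sid}",
    ]
    return plan
```

Encoding the sequence as data makes the ordering explicit and easy to audit: the snapshot and EC2 stop can never run before the SAP components are down.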
Along the same lines, a start request must include the following steps:
- Check that the requested EC2 instance is not already running.
- Start the EC2 instance.
- Start the SAP system in a controlled manner, ensuring the database, ASCS and application servers are started in the correct sequence.
- Retain logs on AWS.
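The start request's idempotency check can be sketched similarly. The `get_state` callable below stands in for a boto3 `describe_instances` lookup; the names are illustrative assumptions, not a real API.

```python
# Hypothetical sketch of the start-request sequence. get_state stands in
# for a boto3 EC2 describe_instances state lookup.

SAP_START_ORDER = ["database", "ascs", "application_servers"]

def build_start_plan(instance_id: str, get_state) -> list:
    """Return start steps, skipping the EC2 start if already running."""
    plan = []
    if get_state(instance_id) != "running":       # idempotency check
        plan.append(f"start_ec2:{instance_id}")
    plan += [f"start_sap:{c}" for c in SAP_START_ORDER]  # DB -> ASCS -> apps
    plan.append("retain_logs")
    return plan
```

Passing the state lookup in as a function keeps the sequencing logic testable without touching AWS.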
Rapid Provisioning
This module provides an enterprise-level cloud platform on AWS to auto-provision SAP applications within 24 hours of request initiation, deploying the correct SAP components in the shortest time possible while meeting all customer security requirements. This rapid response lowers the overall time and cost of delivery for SAP S/4HANA. A user logs in to a UI portal from which they can deploy an SAP application tier and an SAP database tier within a virtual private cloud (VPC) on AWS, selecting one installation pattern from a list of available options.
Examples of the installation patterns include:
- All applications and the database on the same EC2 instance/server.
- All applications on one server and the database on a different EC2 instance.
- Only a database server, for native HANA on a single EC2 instance.
- Applications spread across different EC2 instances and the database on another EC2 instance.
- High availability on AWS: the ABAP SAP Central Services (ASCS) instance and Primary Application Server (PAS) instance on the same EC2 instance, an Additional Application Server (AAS) instance on a different EC2 instance, and the database on another instance.
A user is able to deploy this set of components in any of the relevant "installation patterns", ranging from all components on a single EC2 instance to a fully highly available multi-availability-zone pattern on AWS. This deployment is delivered within four hours of initiating the rapid provisioning.
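The installation patterns above are essentially a mapping from a pattern name to a component layout. A minimal sketch of how such a catalogue might be encoded follows; the pattern keys and instance labels are assumptions made up for the illustration.

```python
# Hypothetical catalogue of the installation patterns described above.
# Pattern names and instance labels are illustrative only.

INSTALL_PATTERNS = {
    "all_in_one":   {"app": "ec2-1", "db": "ec2-1"},              # same instance
    "app_db_split": {"app": "ec2-1", "db": "ec2-2"},              # app/db split
    "native_hana":  {"db": "ec2-1"},                              # DB-only
    "distributed":  {"app": ["ec2-1", "ec2-2"], "db": "ec2-3"},   # apps spread out
    "ha_multi_az":  {"ascs_pas": "ec2-1a", "aas": "ec2-1b", "db": "ec2-1c"},
}

def instances_for(pattern: str) -> set:
    """Distinct EC2 instances the chosen pattern will provision."""
    layout = INSTALL_PATTERNS[pattern]
    out = set()
    for placement in layout.values():
        out.update(placement if isinstance(placement, list) else [placement])
    return out
```

Driving the provisioning from a declarative catalogue like this is what makes a single UI selection sufficient: the automation derives the instance count and component placement from the chosen pattern.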
DevOps on SAP
For DevOps to work with SAP systems, there is a need for self-contained, "short-life" systems that are representative of the production system. This enterprise-level cloud platform module provides the ability to automate the Clone, Copy and Refresh of SAP source systems on AWS.
This was the most challenging module, especially in terms of:
- SAP Cloning – Provides the ability to create an exact replica of an SAP source system in a secure, inaccessible private network on the AWS cloud. Cloning an SAP system is not often required; when it is, it is usually for forensic analysis, a key technical SAP application issue or data integrity concerns. In a traditional data centre, bringing such a system online would require several days of effort, normally from a backup, which given the severity of the issue is usually deemed insufficient. Using the clone automation on the AWS public cloud, the cloned system can be made available within a few hours.
- SAP Copy – Provides the ability to create a copy of an SAP source system on the AWS cloud, which is then amended to be a unique system with its own SID, integration, batch schedule, user access and so forth. System copies are required when building out new SAP environments, such as a dual track. Building out a new environment takes a month on average; with the copy automation on AWS, this drops to a week. Individual system copies, specifically for DevOps-type deliverables, can be achieved in two days, and single system copies can be set up as a self-service with no intervention.
- SAP Refresh – Provides the ability to refresh an SAP system within an existing SAP landscape. The refresh copies another system's configuration, data and repository objects while the target retains its own SID, integration, batch schedule, user access and so forth. The average refresh of a standard SAP environment takes two weeks; the refresh automation tooling on AWS can reduce this to between 24 and 72 hours, depending on the integration, the number of components and cost.
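The essential difference between a copy and a refresh is what happens to the target system's identity. A minimal sketch of that distinction follows; the field names and the `plan_target` helper are hypothetical, invented purely to make the contrast concrete.

```python
# Hypothetical illustration of copy vs refresh. A copy becomes a brand-new
# system (new SID, rebuilt integration); a refresh keeps the target's own
# SID and interfaces and replaces only its data. Field names are invented.

def plan_target(source: dict, mode: str, new_sid: str = None) -> dict:
    """Derive the target system's identity for a copy or refresh."""
    target = dict(source)                  # start from the source system
    if mode == "copy":
        target["sid"] = new_sid            # unique SID for the new system
        target["integration"] = "rebuild"  # RFCs, batch, users re-pointed
    elif mode == "refresh":
        target["integration"] = "preserve" # keep target SID and interfaces
    return target
```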
A few important points to note while automating on AWS
The expectation is that the following pre-work is done to support the automation:
a) The number of application servers per SID needs to be pre-determined.
b) Each application server must be pre-installed with a specific instance number, IP address and hostname.
c) Each application server should be added to the respective logon group and operation mode.
d) Specific parameters need to be set in advance for each application server in scope for scaling.
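The pre-work checklist above amounts to a per-SID configuration that the automation validates before scaling. A minimal sketch under that assumption follows; the SID, hostnames, IPs and parameter names are illustrative placeholders, not real values.

```python
# Sketch of the pre-work items (a)-(d) as a per-SID config. All values
# below are illustrative placeholders.

SCALING_PREWORK = {
    "PRD": {
        "app_server_count": 2,                      # (a) pre-determined count
        "servers": [                                # (b) pre-installed servers
            {"instance_no": "01", "ip": "10.0.1.11", "host": "prdapp1"},
            {"instance_no": "02", "ip": "10.0.1.12", "host": "prdapp2"},
        ],
        "logon_group": "PRD_USERS",                 # (c) logon group
        "operation_mode": "DAY",                    # (c) operation mode
        "profile_params": {"rdisp/wp_no_dia": 20},  # (d) pre-set parameters
    },
}

def prework_complete(sid: str) -> bool:
    """Check that all four pre-work items are present for a SID."""
    cfg = SCALING_PREWORK.get(sid, {})
    required = {"app_server_count", "servers", "logon_group",
                "operation_mode", "profile_params"}
    return required <= cfg.keys() and bool(cfg["servers"])
```

Gating the scaling automation on a check like this turns the four prerequisites from tribal knowledge into a machine-verifiable contract.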
While this is a high-level design for the automation on AWS, there are many more interesting factors I experienced while working on this. If you'd like to explore deeper or have any questions, please feel free to reach out to me!
Published by: Asish Joshi
Cloud & Data Solutions Architect | Capgemini
Asish is a passionate Cloud & Data Solutions Architect with around 14 years of experience in large-scale IT & telco enterprises, architecting and designing smart solutions to help clients on their cloud- and data-driven journey. He has hands-on AWS solution design-to-delivery experience in cloud migration, process automation, data & analytics and SAP on AWS, across the retail, telecom, healthcare, media & entertainment and IoT domains. He is trained in advanced architecting on AWS and AWS's Big Data & Analytics services, and is Red Hat certified. He combines business acumen with a deep understanding of cloud-native technologies.