
Scoping Sizing Topics for SAP BW on HANA


In any implementation, several inputs affect the project timeline and budget. Project planning should lay out the scope, the required resources, and the expected TCO benefits. Some of these TCO improvements will be soft benefits, such as gains in business efficiency, but they should still be counted among the overall expected results.

Sizing — Any project, whether a new installation or an upgrade, should include a sizing exercise to determine the resources required for a successful productive go-live. These resources are CPU, memory (SAP BW and SAP HANA), and disk storage. Customers performing new installations need to determine how much data volume and which scenarios they will implement. Their SAP account executive can assist in determining the required initial sizing. The SAP Service Marketplace also offers the Quick Sizer tool, which helps customers approximate their system requirements.

Figure 10: SAP BW on SAP HANA Landscape Options

Upgrade project sizing can be a bit more challenging. Customers need to take into consideration the scope of the upgrade to determine the resource requirements. One example is whether the upgrade scope includes a phase to perform a Unicode conversion. Typically, the SAP BW system will require more CPU and memory when running Unicode. Disk storage requirements for Unicode conversions can vary. When a Unicode conversion is performed, the database is re-organized, and all empty space between data blocks is eliminated. In some cases, this process reduces the database size.

When a company upgrades to SAP HANA, the size of the data in the existing system is a direct input to the sizing of the appliance and the disk storage. SAP HANA uses sophisticated compression algorithms to compress the data, and this compression determines how much RAM is required to hold it. Because SAP HANA is an in-memory database, additional RAM is also required as working space for processing; the current rule of thumb is to reserve roughly as much working space as data RAM, so the total RAM comes to about twice the compressed data footprint.

The amount of RAM required for a system depends on the amount of source data and on the database product used in the current system. Some RDBMSs already apply a degree of compression, which reduces the effective overall compression that SAP HANA will achieve after migration. Through the course of many implementations, SAP has observed varying levels of compression. In general, the basic rule is to expect a 6:1 compression rate. After adding the required working RAM, the effective ratio of source data size to total SAP HANA RAM is about 3:1. Some customers have reported much higher compression rates, but the makeup of the data and the source RDBMS play a significant role in determining the effective compression rate.
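As a rough illustration of that arithmetic only, the sketch below turns an uncompressed source database size into a ballpark RAM figure. The 6:1 compression ratio and the doubling for working space are the rules of thumb quoted above, passed in as assumptions rather than guarantees; a formal sizing should still go through SAP's sizing tools.

    def estimate_hana_ram_gb(source_db_gb, compression_ratio=6.0, working_space_factor=2.0):
        """Ballpark SAP HANA RAM estimate for a BW migration.

        Assumptions (rules of thumb from the text, not a formal sizing):
          - compression_ratio: expected SAP HANA compression vs. the source DB (6:1)
          - working_space_factor: total RAM is ~2x the compressed data footprint
        """
        data_ram_gb = source_db_gb / compression_ratio   # compressed data footprint
        return data_ram_gb * working_space_factor        # data plus working space

    # Example: a 3 TB (3,072 GB) source database
    print(estimate_hana_ram_gb(3072))  # ~1,024 GB, i.e. an effective 3:1 ratio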


The required SAP HANA appliance size also depends on two additional factors:

  1. Data-archiving strategy (archiving and near-line storage)
  2. Active/not-active data (SAP BW and SAP HANA)

SAP recommends that customers create a data-archiving strategy. Archiving and near-line storage (NLS) follow a fairly similar process. Archiving is the process of exporting older data into offline storage. Near-line storage solutions also export older data, but to a compressed online storage facility. NLS solutions have the benefit of reducing the size of the online database while still allowing queries to access the archived data if needed. An archive solution, by contrast, requires the data to be restored to the system before queries can show the archived data. The sooner a company develops an archiving strategy, the more manageable its system will become.


With SAP BW 7.30 Service Pack 8 and SAP HANA 1.0 Service Pack 5, SAP introduced new functionality. This functionality is referred to as Active/Not-Active data. With the release of the updated SAP BW and SAP HANA, certain data are not loaded into memory by default. In addition, customers can elect to flag certain content as not-active. The not-active data are loaded into memory only when they are needed. They are also the first to be flushed when they are no longer being used. By default, BW now automatically marks all PSA tables and all write-optimized DSOs as not-active. According to initial customer reviews, the not-active data concept reduced the SAP HANA system size by approximately 20%.
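Purely as a hedged illustration, the snippet below extends the earlier estimate with the roughly 20% reduction reported by early adopters of the not-active data concept; the actual saving depends on how much PSA and write-optimized DSO data a given system holds.

    def estimate_with_not_active(source_db_gb, not_active_saving=0.20,
                                 compression_ratio=6.0, working_space_factor=2.0):
        """Apply the ~20% reduction reported for the active/not-active data
        concept to the rough RAM estimate (an assumed figure, not a guarantee)."""
        base_estimate = (source_db_gb / compression_ratio) * working_space_factor
        return base_estimate * (1.0 - not_active_saving)

    # Example: the same 3 TB source database as above
    print(estimate_with_not_active(3072))  # ~819 GB instead of ~1,024 GB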

The size of some SAP BW systems will require a “scale-out” implementation of the SAP HANA appliance. As with all SAP HANA implementations, the configuration has to be certified by the hardware partner. In addition, SAP provides extended monitoring for all scale-out projects. Some elements of the system require specific tuning and configurations to provide optimal functionality. SAP recommends that all customers implementing SAP BW powered by SAP HANA in a scale-out scenario register for extended monitoring with SAP Active Global Support.


SAP BW Powered by SAP HANA Scale-out — Best Practices

If the memory requirements exceed the available memory of a single server node, a scale-out solution consisting of multiple server nodes can be deployed. The architecture of a scale-out solution consists of a master node, which stores the row-store data, and several slave nodes, which hold partitions of the business data tables (InfoCubes, DSOs, PSA tables). By increasing the number of slave nodes, additional capacity can be added to the system as needed; a rough node-count estimate is sketched below.
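As a minimal sketch under assumed per-node RAM sizes (512 GB is used here only as an example), the following estimates how many slave nodes a given total RAM requirement translates into. Real scale-out layouts must use configurations certified by SAP and the hardware partner, as noted above.

    import math

    def estimate_slave_nodes(total_ram_gb, node_ram_gb=512, reserve_master=True):
        """Rough node count for a scale-out landscape.

        Assumptions: each node offers node_ram_gb of usable RAM, the master node
        is kept for row-store data and coordination, and the column-store
        partitions (InfoCubes, DSOs, PSA tables) are spread across the slaves.
        """
        slaves = math.ceil(total_ram_gb / node_ram_gb)
        total_nodes = slaves + 1 if reserve_master else slaves
        return slaves, total_nodes

    slaves, total = estimate_slave_nodes(2400, node_ram_gb=512)
    print(f"{slaves} slave nodes, {total} nodes including the master")
    # -> 5 slave nodes, 6 nodes including the master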

Customers who create an effective archiving strategy and implement the latest SAP BW release will be better able to estimate the required initial system size. Note that SAP HANA Appliances must be certified by both SAP and the hardware vendor prior to implementation.

 
