In an SAP BPC Standard environment, the main interface for entering data is MS Excel. To communicate with your SAP BPC model, the BPC EPM Add-in must be installed. In some cases, avoiding this installation on every BPC user's laptop is a real advantage.
SAP has released a full integration scenario between SAP Analytics Cloud and SAP BPC Standard (as of version 10.x). This scenario makes it possible to avoid installing the EPM Add-in for each user: alongside your BPC input forms, you can create SAC input forms that are available in your web browser and communicate with your SAP BPC model.
Implementing this scenario is quite straightforward and involves three main steps:
After enabling communication between your BPC system and SAC in the SAP Cloud Connector, you can create a new SAP BPC connection.
When creating your connection, you will need to enter your BPC server address and port. The BPC user entered here will be used to transfer and import data.
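As a minimal sketch of the connection details described above, the snippet below collects them in a plain dictionary and derives the base URL that SAC would reach through the Cloud Connector. All field names and values here are illustrative assumptions, not the actual labels of the SAC connection dialog.

```python
# Hypothetical sketch of the SAC-to-BPC connection details
# (names and values are illustrative assumptions, not SAC dialog labels).
bpc_connection = {
    "connection_name": "BPC_PLANNING",   # any descriptive name
    "server": "bpc.example.corp",        # your BPC server address
    "port": 8000,                        # your BPC HTTP(S) port
    "user": "BPC_SAC_INTEGRATION",       # technical user used to transfer and import data
}

def connection_url(cfg: dict, https: bool = True) -> str:
    """Build the base URL reached through the SAP Cloud Connector."""
    scheme = "https" if https else "http"
    return f"{scheme}://{cfg['server']}:{cfg['port']}"

print(connection_url(bpc_connection))  # https://bpc.example.corp:8000
```

In practice the credentials of this technical user determine what data SAC can read from and write back to BPC, so it is worth giving it only the authorizations the integration needs.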
This scenario is currently available for the BPC Standard model only; integration with BPC Embedded is planned for future SAC releases.
Once your connection is created, you need to import your model from BPC:
In order to input data in SAC and publish it back to BPC, you should select the option below:
Before importing the model, you need to define the dimension mapping between your source BPC dimensions and your target SAC dimensions. Because BPC Standard is an account-based model, you need to assign a dimension type to each dimension (Account, Generic, Organization, Version and Time). This mapping is very important, as it cannot easily be changed afterwards.
Once your model is imported and saved, an initial synchronization will be executed. Your BPC model is now ready to be used in SAC:
With the help of SAP Analytics Cloud planning functionality, we now have a solution to input data into BPC easily and efficiently via a web browser, without any extra installation on your laptop.
Furthermore, SAC offers planning functions such as allocations and distributions that can fulfill most planning requirements.
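To make the idea of a distribution function concrete, here is a conceptual sketch of what such a function does: spread a planned total across periods in proportion to reference values (for instance, last year's actuals). This is not SAC's actual planning engine, just an illustration of the calculation.

```python
# Conceptual sketch of a "distribute" planning function:
# spread a planned total across periods in proportion to reference weights.
def distribute(total: float, reference: list) -> list:
    """Proportionally distribute `total` according to `reference` weights."""
    ref_sum = sum(reference)
    return [round(total * r / ref_sum, 2) for r in reference]

# e.g. distribute a planned budget of 1200 over three periods
# weighted by hypothetical reference values 100, 100 and 200
print(distribute(1200, [100, 100, 200]))  # [300.0, 300.0, 600.0]
```

In SAC itself this is configured declaratively in the story or data action rather than coded, but the underlying proportional logic is the same.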
Note: this is the second part of the blog. Missed the first part? Read it here.