Process chain (flows) optimization Part 2
In the previous article I gave a short introduction and showed you how to check whether a process chain should be optimized. In this article I'll explain how I optimize the InfoPackages and DTPs in the chains.
As for InfoPackages, I have only encountered two things that can slow the process down. When you look at the detail view of the InfoPackage in 'ST13', either it simply loads a lot of data, or the '1st package arrived' step takes a long time.
For the first case there is not much you can do: it is simply a lot of data and it will take time. For the second case: go to 'RSA1' and search for the InfoPackage. Double-click the InfoPackage, then double-click its DataSource and copy the DataSource name. Open the system you are requesting the data from, for example ECC (if you have the authorization), go to 'RSA2' and paste the DataSource name. On the extraction tab you can see how the data is requested. Unfortunately, for this part I only have a solution for one specific case: when the extraction tab shows that a custom function module loads the data. You will then have to debug that function module and see where it can be improved. If it is a standard SAP function module, it is best to leave it as it is.
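To make that custom function module case a bit more concrete: what I usually find while debugging is a database access inside a loop over the extracted records. A minimal sketch of the pattern and its fix, with a made-up lookup table ZCUSTTEXT and made-up fields (so just an illustration, not a real extractor), looks roughly like this:

* Made-up structures for the example; in a real extractor the loop would
* run over the data package the function module fills (e.g. E_T_DATA).
TYPES: BEGIN OF ty_row,
         custid   TYPE c LENGTH 10,
         custname TYPE c LENGTH 35,
       END OF ty_row.

DATA: lt_data TYPE STANDARD TABLE OF ty_row,
      lt_text TYPE STANDARD TABLE OF ty_row,
      ls_text TYPE ty_row.

FIELD-SYMBOLS: <ls_row> TYPE ty_row.

* Slow pattern: one database round trip per extracted record.
LOOP AT lt_data ASSIGNING <ls_row>.
  SELECT SINGLE custname FROM zcusttext
    INTO <ls_row>-custname
    WHERE custid = <ls_row>-custid.
ENDLOOP.

* Faster pattern: one array fetch, then in-memory reads.
IF lt_data IS NOT INITIAL.  " FOR ALL ENTRIES with an empty table would read everything
  SELECT custid custname FROM zcusttext
    INTO TABLE lt_text
    FOR ALL ENTRIES IN lt_data
    WHERE custid = lt_data-custid.
  SORT lt_text BY custid.
ENDIF.

LOOP AT lt_data ASSIGNING <ls_row>.
  READ TABLE lt_text INTO ls_text
       WITH KEY custid = <ls_row>-custid BINARY SEARCH.
  IF sy-subrc = 0.
    <ls_row>-custname = ls_text-custname.
  ENDIF.
ENDLOOP.

The second version goes to the database once instead of once per record, which on big extractions already makes a noticeable difference.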
That is all the advice I can give on optimizing InfoPackages at the moment, because I am currently not authorized to look into loads that come from other source systems or databases.
Now let's have a look at DTPs. For DTPs there are two things that most commonly slow down the process chain. The first is transformations. I'm not going to go into too much detail here, but you should always investigate the transformation thoroughly. Always look into the start routine, end routine and rule-type routines; they can have a massive impact on the loading time. Avoid having a lot of loops in your code and only use routines if they are really necessary for changing data in the back end. The best advice I can give here is 'DEBUG'. Debug the transformation and see what is taking a long time and what can be changed for the better. Every routine is different and there is no single general rule for making routine code faster. Put in some break-points where you think the code can be optimized (DON'T FORGET TO DELETE THE BREAK-POINTS AFTERWARDS!). Small tip: in the InfoProvider you can already check whether it is the start routine, the end routine or a rule type that is taking the most time.
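As a small addition to the break-point approach: you can also roughly time the sections of a routine to see where the time actually goes before rewriting anything. This is only a sketch; the two commented sections stand for whatever your own start or end routine does:

* Rough timing of two sections of a routine (values are in microseconds).
DATA: lv_t0     TYPE i,
      lv_t1     TYPE i,
      lv_t2     TYPE i,
      lv_lookup TYPE i,
      lv_loop   TYPE i.

GET RUN TIME FIELD lv_t0.

* ... section 1: for example the SELECTs that fill your lookup tables ...

GET RUN TIME FIELD lv_t1.

* ... section 2: for example the LOOP over the data package ...

GET RUN TIME FIELD lv_t2.

lv_lookup = lv_t1 - lv_t0.   " time spent in section 1
lv_loop   = lv_t2 - lv_t1.   " time spent in section 2
BREAK-POINT.                 " check the values in the debugger, then delete this line again!

This quickly tells you which part of the routine to concentrate on, and just like the break-points the timing code should of course be removed again afterwards.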
If you are sure that the DTP doesn't have routines slowing it down, the second most common cause is whether the DTP runs as a delta or a full load. If you use daily chains and just want to add new data every day, it is best to use a delta load; this will save you a lot of time. Be aware that you will sometimes notice that even delta loads with no routines are slow. Look in the target at how many records are transferred and how many are actually added. I, for example, had a DTP that loaded 20 million records each day and added around 500. This was because we didn't clean up the PSA in the process chain. A delta load will pick up all the data that is still in the PSA, so if you don't clean up your PSA requests it behaves almost like a full load, except that it adds fewer records. So always try to delete your old PSA requests; that way your load will be a lot faster.
This concludes my second article on process chain optimization. In the next article I'll continue with what you can do about the performance of DSOs and Cubes.