2013-01-17

MD_STOCK_REQUIREMENTS_LIST_API performant bulk use

Stock figures (what is in stock and what is available) are a challenge in any ERP system. You should not calculate stock figures yourself; use an existing function or API. In SAP I have used BAPI_MATERIAL_AVAILABILITY, but I have now learnt that MD_STOCK_REQUIREMENTS_LIST_API is a better RFC function for this.
It is simple to set up a job in ITL (Integration Tag Language) that extracts the stock figures.
This job selects the materials we want to fetch stock figures for (from a MySQL table, via the <forevery> job iterator). The job runs MD_STOCK_REQUIREMENTS_LIST_API for each material, creates a MySQL database table (if it does not already exist) for the E_MDSTA output, and loads the data. If the table exists it is truncated prior to loading.
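The original job XML did not survive here, so the following is a hypothetical sketch of what such an ITL job could look like. Only `<forevery>`, the function name MD_STOCK_REQUIREMENTS_LIST_API, and the E_MDSTA output are taken from the text; every other element and attribute name is an illustrative guess.

```xml
<!-- Hypothetical ITL job sketch; element and attribute names other than
     <forevery> are guesses, not actual ITL syntax -->
<job name='stock_figures'>
  <!-- iterate over the materials stored in a MySQL table -->
  <forevery>
    <sql>SELECT werks, matnr FROM material_list</sql>
  </forevery>
  <!-- one RFC call per material: log on, call the function, log off -->
  <sap function='MD_STOCK_REQUIREMENTS_LIST_API'>
    <import name='WERKS'>@werks</import>
    <import name='MATNR'>@matnr</import>
    <!-- load the E_MDSTA output into MySQL, creating the table on first run -->
    <export name='E_MDSTA' table='E_MDSTA' createtable='ifnotexists' truncate='yes'/>
  </sap>
</job>
```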
This job does the work correctly, but it took more than 70 minutes for 23,000 materials. The problem is that the SAP part of the job is executed once per material: log on to SAP, extract the data, log off, all sequentially, which takes an awfully long time. The best way to solve this would be an ABAP RFC that does the whole job on the SAP side, but this time we will do it without ABAP coding.
First we want to process the materials in parallel chunks, and that is easily fixed:
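In sketch form (chunksize and parallel are the attribute names described in the text, with the values the text gives; the element body is illustrative):

```xml
<forevery chunksize='1000' parallel='10'>
  <!-- body executes once per 1000-row chunk,
       with at most 10 chunks running concurrently -->
</forevery>
```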
Adding chunksize and parallel attributes chops the <forevery> array into 1000-row chunks and processes them in parallel, but not more than 10 at a time. This does not, however, fix the problem of logging on and off SAP for each material. That can be fixed either by introducing initialization and termination routines in the <forevery> job iterator, or by creating a piggyback iterator. The piggyback iterator is the simplest solution and requires no PHP coding.
Now we have a job that chops the <forevery> array into chunks and processes each chunk in parallel, but this time the job iterator is pushed down to the SAP communicator sap2.php, which logs on only once for the entire chunk. There is a snag, though: since we process the chunks in parallel, we cannot truncate the target table in this job. We must create a separate truncate job that runs first.
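Such a truncate job might look like the following hypothetical sketch. The `<prereq>` element and the @TABLE placeholder are mentioned in the text; everything else is guessed, not actual ITL syntax.

```xml
<!-- Hypothetical truncate job, run before the parallel extraction -->
<job name='truncate_stock_table'>
  <!-- intercept execution if the target table does not exist yet -->
  <prereq>exists table @TABLE</prereq>
  <sql>TRUNCATE TABLE @TABLE</sql>
</job>
```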
And that is what this job does. The <prereq> statement intercepts execution if @TABLE does not exist in MySQL. (A ‘TRUNCATE IF EXISTS’ command in SQL would be nice.)
And that’s it: we now extract stock figures for 23,000 materials in less than 8 minutes, without any ABAP code. This could most likely be optimized to below 5 minutes (see ‘when fast is not enough’).
Mixing iterators can create some amazing functionality. If you take a close look at the example above, you will find quite a lot of functionality packed in there with little code. Creating a job for each chunk with job templates is an alternative, but it makes the parallel processing harder to control.
If you like this post I suggest you also read:
My series on the Integration Tag Language, ‘Job Scheduling with PHP 1-4’.

3 comments:

  1. You have done exactly what I want to do. Can you please tell me how to create this job in SAP? You can contact me via driller@conta-clip.de.

  2. Hi,
    The job runs on the external server, setting up one process for each call to SAP via SAPRFC.
    The XML you see in the post is the job. The XML is interpreted and executed by the ITL interpreter I have written in PHP, which you can read more about on my blog; start with the links in the post.

    The setup in SAP is simple: it is just calling the BAPI with parameters via SAPRFC. The hard work is done in PHP: setting up a list of all parts (plant and matnr) we want to examine, chopping the list into chunks, calling SAP in parallel, defining the MySQL table if it does not exist, and finally appending the data into the table.

    You can follow the daily progress of jobs at https://twitter.com/tooljn
