As we all know, SNP primarily supports Make-to-Stock (MTS) functionality, so all the demands and supplies in the MTS segment can be seen in the SNP planning book without any enhancement.

 

However, when we need to see data from other segments (Make-to-Order or Planning without Final Assembly) in the SNP planning book, we may need some enhancements and changes to the SNP planning area.

This document describes how to activate this functionality.

 

1) You will have to use custom key figures with key figure functions 2006 and 2008 for supplies and demands in the Make-to-Order or Planning without Final Assembly segment.

Implement the BAdI /SAPAPO/SDP_INTERACT, method GET_KEYF_SPECIALS, and set the parameter CV_KEYF_SWITCH to 3 (a minimal implementation sketch follows below).

If you are working only with the Make-to-Order segment, you can set this parameter to 2.

If you are working only with the Planning without Final Assembly segment, you can set this parameter to 3.
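A minimal sketch of such a BAdI implementation is shown below. It assumes the classic BAdI framework, where the generated implementing class implements the interface /SAPAPO/IF_EX_SDP_INTERACT (that interface name follows the standard IF_EX_ naming and is an assumption here); only the method body matters, and the fixed value is the only logic needed.

METHOD /sapapo/if_ex_sdp_interact~get_keyf_specials.
  " Switch on special key-figure handling for the SNP planning book.
  " Per the note above: 2 = Make-to-Order segment only,
  " 3 = Planning without Final Assembly (or both segments together).
  cv_keyf_switch = 3.
ENDMETHOD.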

 

2) Create and activate two custom key figures, one each for demand and supply.

 

3) Planning Area De-initialization

 

4) Add Key Figure function to Key Figures in Planning Area

In change mode of the planning area, go to the Key Figs tab and click Details.

For the custom demand key figure, add key figure function 2008; for the custom supply key figure, add key figure function 2006. Also make sure you assign the correct category group to these custom key figures.

 

5) Planning Area Initialization

Now you can see these key figures in the planning area.

 

6) Add these Key figures to the desired Planning Books.

Forecast and supply in the “Planning without Final Assembly” segment can be seen in the Forecast (ATO) and Supply (ATP) key figures, respectively.

Users are facing a problem when running some normal copy macros in which we use the ROW_INPUT function before the copy happens (because we have set the 'KF lock' on the planning area).


 

Basically, the issue with this macro can be explained with a few observations:

  • We have activated the KF lock on the planning area with no lock on the read-only KFs: so, when user A opens the data view, the modifiable KFs of the view are locked for him.
  • As part of the macro logic, we modify the attributes of one 'output only' KF to edit mode.
  • There is a default macro in the same view which tries to close all the KFs (bring all the KFs back to non-edit mode).


Suppose user A opens the view first – he locks all the modifiable KFs of the view. If user B now opens the view and executes the macro, he is actually trying to bring one read-only KF to edit mode, because the macro first switches the KF to edit mode before updating it. Mind you, when switching to edit mode the data must be read again: since this KF was previously not locked for user B, it could (as far as the system knows) have been changed by some other user in the meantime.

 

Since it is user B who tries to bring the KF into edit mode via the macro, the system ideally has to lock this KF for user B. But it is user A who holds the locks on all the modifiable KFs, as he was the first to open the data view. The consequence: the system cannot lock this KF for user B, and hence it cannot be modified. Hence the error: “KF not locked, data not saved”.

 

To put it briefly: exactly at the point where the macro tries to change the attribute of one KF from display to change mode, there is an internal clash between user A and user B caused by the locking logic of the planning area, and the system throws the error in question as a consequence.

 

Well, this is fine, but the error still appears even when only one user runs the macro – how do I explain that to myself? This is where the default macro comes into action. What exactly happens here is:


  • I run my macro – the target KF gets updated.
  • I now have to save the results – so I click the SAVE button, and then the error appears.

 

So it is while saving that we face the error – meaning the screen is regenerated, and it is the default macro that runs at this point and does not allow the KF to go into edit mode.

 

Well, we came across consulting note 1649757, which recommends using CELL_INPUT instead of ROW_INPUT. Unfortunately, even this didn't work and gave the error below.



After analysing further, we understood that for CELL_INPUT to work correctly, a minor change is needed in the function module /SAPAPO/ADVF_CELL_INPUT, as specified in note 1328806. But the fact is: we are already on SCM 7.02 and this note is already implemented!

 

Where do we go from here? We let SAP look into it, and they too asked us to use CELL_INPUT instead of ROW_INPUT, explaining the cause of the issue as the default macro which does not allow the KF to go into edit mode.

 

I finally tried running the macro with only the function in it and no extra arithmetic.


 

And this gives the following:


 

I changed it to CELL_INPUT( 1 ) in the above macro, and the error changes accordingly.



Not sure why this KF lock causes so many problems – finally, we handed this over to SAP, and they came back saying that CELL_INPUT works correctly only when the KF type is INPUT/OUTPUT; what we had were OUTPUT-type KFs. Hence the problem – thanks to SAP.

Transferring data between ERP and APO can be a tricky, error-prone activity, which is why we asked Claudio Gonzalez of SCM Connections to host an online Q&A to address the biggest questions users are facing and tips to optimize CIF performance. Check out the transcript from this Q&A session and see whether your questions were answered.


Claudio will be one of the featured speakers at SAPinsider’s SAP SCM 2014 conference in Las Vegas, April 1-4. For more info on the event visit our website at Logistics & SCM, PLM, Manufacturing, and Procurement 2014 and follow us on Twitter @InsiderSCM


Comment From Pat H.: Is CIF a standard, delivered SAP tool with ERP / APO or a separate add-on?

Claudio Gonzalez: The CIF is a standard delivered interface.  CIF is an integrated part of the ERP system.


Comment From Brad Antalik: If ECC and SCM were both on HANA does this eliminate the need for the CIF? If the CIF is still used how would HANA affect it?

Claudio Gonzalez: It should not affect it as of right now, as there is no dependency. CIF is an integration module independent of the underlying database. Looking to the future, there has been some thinking that with the ERP DB in memory, we could see the day that the master data (and transactional) datasets for both merge, allowing a unified system with Global ATP (SCM-APO-GATP) available out of the box. The same goes for CRM and ERP integration.


Comment From Ana Parra: Can APO be based on the SAP HANA Platform? Do you have supporting evidence of before-and-after scenarios for SAP HANA performance? Thanks!

Claudio Gonzalez: This one is a bit off the CIF topic, but to answer the question: HANA is available on SCM as of version 7.02. Here is a good link with some details on it http://www.saphana.com/docs/DOC-3316


Comment From Guest: Just implemented SAP TM9.0 and SAP EM9.0 with SAP Optimizer 10.0. ECC is EhP6. In the past, I was able to use the CIF cockpit to monitor CIF data transfer and troubleshoot any issues. What's the equivalent process/method to monitor and troubleshoot CIF-related issues and data transfer in the newer versions?

Claudio Gonzalez: As far as I know, the CIF Cockpit should be available as long as you have the SCM Suite. I do recommend using the CCR report (/SAPAPO/CCR), the Queue Manager (/SAPAPO/CQ) and the CPP report (/SAPAPO/CPP) in conjunction with the CIF Cockpit to troubleshoot CIF-related issues.


Comment From Axel Völcker: Our ERP system has to provide two SAP systems with material master data. How can we set up the IMs (integration models) for the material master data?

Claudio Gonzalez: You would create two integration models. Each integration model would have a different logical system. Each logical system is tied to a specific SAP SCM system.


Comment From Pavan Kumar Bhattu: I have two questions:

1) If we CIF purchase info records with data for two purchasing organizations, then how will it reflect in APO?

2) If we CIF Sub-contracting SNP PDS, then will it create any transportation lane in APO?

Claudio Gonzalez:
1) The info record would CIF to APO, and on the external procurement relationship the different Purch Orgs would show under the General data tab. I am assuming the different Purch Orgs will also have different destination locations, which would then create two lanes; this is the most common scenario.

2) Yes, when you CIF a subcontracting PDS to APO, be it SNP or PP/DS, the system will create the lanes for the input components from the manufacturing location to the subcontracting location. The lanes from the subcontracting location to the manufacturing locations for the FG are created by the PIR or contract.


Comment From AJ: Can you please further elaborate on PDS_MAINT in ECC?

Claudio Gonzalez: PDS_MAINT is a transaction on the ECC side that's used to update changes to the PDS in APO. As of SCM 7.0 EHP1 you could not make changes to the PDS directly in APO; thus, PDS_MAINT was used to make changes such as costing, priorities, consumption, bucket offset and so on. In EHP2 there is functionality to mass-maintain the PDS directly in APO, but it has its limitations, and it seems that if you re-send the PDS from ECC it will overwrite the changes (this still needs to be verified).


Comment From Ayyapann Kaaliidos: Is there no WEEKLY R/3 Consumption mode available in R/3? We are in ECC6 SP10. But we use WEEKLY Consumption mode in APO. We are in SCM 7.01 SP 5.

We had to handle this in user exit only, as we have configured CIF Master data update as Instant and the CIF will overwrite the values in APO if it's not handled in user exit. Will this functionality be part of any future R/3 versions?

Claudio Gonzalez: I have not heard of any plans to add this functionality in ECC yet. But before you go about modifying the CIF so as not to override the ‘W' value in APO, try the following: ECC has a consumption mode ‘4' (Forward/Backward) which is not supported by APO. I believe if you set it to this value, it would not override the APO value. It is not pretty, but it will save you a custom change.


Comment From Mukesh Lohana: What criterion determines whether inbound queues or outbound queues are faster? We use outbound queues and transfers for the huge volume of planning data from APO to ECC. If we change to inbound queues to transfer data from APO to ECC, then in my understanding ECC will be responsible for handling the data load. Since ECC is an execution system with a lot of activities occurring, will the change from outbound to inbound queues slow down the execution system (ECC)? What do you suggest?

Claudio Gonzalez: Since this question is more technical than functional, let’s first quickly explain the difference between Outbound and Inbound.

- Communication Method: Outbound Queue - The calling system sends the queues to the receiving system without taking care of the receiving system's load. No scheduling of the processes happens in the receiving system. This can lead to overloading of the receiving system, which leads to deterioration of CIF performance with high data volumes.

- Communication Method: Inbound Queue - The calling system sends the queues to the 'entrance' of the receiving system, which allows the receiving system to control its queue load on its own. Scheduling of the processes happens in the receiving system. Therefore, in theory, this leads to better CIF performance.

Based on the above, and if you go through the SAP recommendations, it is stated that you should use inbound queues if you have performance issues in the target system.

At the end of the day, any actual performance degradation on the ECC side won't be known until the change is made and tested.

The following notes deal with how to change the communication from outbound to inbound: 388001, 388528, 388677.

Also, I always recommend keeping note 384077 in your favorites, as it deals with how to optimize CIF communication and is updated regularly.


Comment from Brad: If transactional data goes real time to APO, what is the purpose of the batch stock, sales orders, and purchase orders CIF jobs?

Claudio Gonzalez: Regardless of whether the data goes real time or not (by the way, I recommend real time), you would need POs, sales orders and other transactional data to integrate into APO so that your planning system has all the necessary data for accurate planning.

 

To view the rest of the transcript, click here

When developing and testing complex macros, it is useful to be able to see what is going on behind the scenes.  For example when performing a calculation and putting the results into a hidden key figure or a variable for use in a later macro step, it is often useful to see these interim results.  Equally, when retrieving product master data to use in a macro it can be useful to see this data before it is used to ensure the correct data is being retrieved.

 

A simple technique to achieve this is to add temporary coding (either within a macro as additional steps or as a separate macro) to pop up relevant data.  This has the effect of stopping the macro at each step and putting relevant data on screen in a pop-up box that can be checked before going on to the next step.  Given that a data view often contains over 100 buckets, it is worth temporarily restricting the number of iterations that a calculation being investigated will work on – it saves a lot of keying.

 

Below is a very simple example

MACROBLOG1.png

Here the requirement is to pop up the initial stock on hand so that a note of it can be made. Thus in step 1, the standard macro function is used to retrieve the data, and in step 2, a message is created that pops up with this information. It’s a simple example that just displays the contents of one key figure once, although it is possible to put several pieces of information into the pop-up box.

 

Here is another example to pop up data in key figures that are not always easy to see on the data view being used:

MACROBLOG2.png

Here an additional macro has been developed for the purpose of displaying the capacity consumption and planned production in a single pop up box.

 

After development, this temporary work can be completely removed. Alternatively, the relevant macros or macro steps can simply be deactivated in case they are needed in the future for debugging.

Subcontracting by definition is a process which involves the procurement of a product from a supplier (the subcontractor) to whom the procuring entity provides components for the purpose.

E.g., a chemical manufacturer produces a chemical in bulk and sends it to a subcontractor, who in turn fills and packs small containers of the chemical. These small containers are the final saleable product.

 

Why map Subcontracting as In-house production?

Assume a situation where the subcontracting vendor has multiple resources dedicated to your production site, and you (as a supply chain planner) want to perform capacity planning of those resources in SNP because you often run into capacity-related issues, such as orders getting delayed or re-prioritized and rescheduled at the last minute. Of course, you have the option of analyzing the capacity manually, setting order priorities and then changing them repeatedly because you have no visibility of the capacity load. Would it not be great if you could model these dedicated resources in APO SNP to perform capacity leveling? Let’s explore how we can achieve this!

 

Master data required:

ECC:

  • Material Master: Procurement Type “F” with Special Procurement “30” at Subcontracting Plant
  • Material Master: Production version with details of BOM and Routing at Subcontracting Plant
  • Purchasing Info record (Subcontracting Type)
  • Source list
  • Bill of Material
  • Work center: To be created at Subcontracting plant in ECC. Should have all the capacity details of the dedicated resource available at vendor
  • Routing: Should have at least one operation (scheduling relevant) based on the Work center

APO:

  • Product Master at Subcontracting Location (Via CIF)
  • SNP PDS (created by the Core Interface (CIF) integration model). Select the normal SNP PDS, not the SNP Subcontracting PDS, while generating the integration model
  • Vendor Location master
  • Resource Master data at Subcontracting Plant (Transferred via CIF Integration model)

 

Ways of working:

  • Executing the SNP heuristics run for the subcontracting product-location combination will create SNP planned orders. The SNP PDS will automatically be adopted as the source of supply
  • These planned orders will show the exact resource utilization, and hence SNP capacity leveling can be performed using the capacity data view in the SNP planning book
  • The SNP planned order is created for the header product. This also creates dependent demand for the component products (via PDS explosion).
  • During the execution cycle, convert SNP Planned order → Purchase Requisition → Purchase order
  • Once the planned order is converted to a purchase requisition, the dependent demand (SNP:DepDmd) for the component will be reflected as subcontractor requirements (SubReq)

Benefits:

  • SNP capacity planning & capacity leveling can be done for vendor resources
  • Alternative bills of material can easily be used: just by changing a production version in ECC or changing an SNP PDS in APO, dependent demand will propagate to all relevant components
  • Alternative resources can also be used to plan capacity

 

Challenges:

 

  1. Planned order to Purchase order conversion

- Problem: In a subcontracting scenario, end users are generally used to purchase requisitions as the output of the planning run; these purchase requisitions are later converted to purchase orders as part of execution. But when we map subcontracting as in-house production, the heuristics run creates SNP planned orders. These planned orders must first be converted to purchase requisitions, and only then can a purchase order be created. This can therefore be a point of resistance from end users, as it adds one step to their manual work.

 

- Solution: Use transaction MD14 (single order) or MDUM (multiple orders) to convert planned orders to purchase requisitions. Report “RMCVPLRQ” can be customized to convert planned orders to purchase requisitions in bulk using background jobs (a rough scheduling sketch follows below).
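As a rough illustration of the background-job option, the sketch below schedules the report with a saved selection variant via the standard function modules JOB_OPEN and JOB_CLOSE. The job name and the variant name ZSUBCON_WEEKLY are placeholders; in practice you would maintain a variant with your plant / MRP controller selection and run this wrapper periodically.

" Rough sketch: run report RMCVPLRQ in a background job with a saved variant.
" The job name and variant name below are placeholders, not standard objects.
DATA: lv_jobname  TYPE tbtcjob-jobname VALUE 'Z_PLNORD_TO_PREQ',
      lv_jobcount TYPE tbtcjob-jobcount.

CALL FUNCTION 'JOB_OPEN'
  EXPORTING
    jobname  = lv_jobname
  IMPORTING
    jobcount = lv_jobcount
  EXCEPTIONS
    OTHERS   = 1.

IF sy-subrc = 0.
  " The variant restricts plants, MRP controllers, horizon, etc.
  SUBMIT rmcvplrq USING SELECTION-SET 'ZSUBCON_WEEKLY'
         VIA JOB lv_jobname NUMBER lv_jobcount AND RETURN.

  CALL FUNCTION 'JOB_CLOSE'
    EXPORTING
      jobname   = lv_jobname
      jobcount  = lv_jobcount
      strtimmed = abap_true   " start immediately; use a scheduled start in real life
    EXCEPTIONS
      OTHERS    = 1.
ENDIF.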

 

2. Vendor stock consumption

- Problem: It is quite possible that items like packaging material, labels, etc. are supplied in bulk to the vendor. So the stock of these items is maintained as vendor stock, and replenishment of these items is planned via the ECC MRP run instead of being planned in APO. The ECC MRP run will not net the vendor stock if the component demand is in the form of “DepDmd”; it will net only against “SubReq”.

 

- Solution:

Either plan component replenishment in APO, where vendor stock is netted against DepDmd as well as SubReq,

Or

Change the policy: issue only limited stock to vendors; the remaining stock should be maintained as plant stock, which can be netted against dependent demand,

Or

Convert SNP planned orders to purchase requisitions in the near-term horizon, e.g. 3-4 weeks. This horizon should be decided empirically such that it at least covers the lead time of the subcontracting product.

 

 

Key Points to remember:

  • The vendor and the stock should be part of the same integration model for vendor stock to be visible in APO at the subcontracting location (Special Stocks at Vendor indicator in the integration model)
  • Vendor stock available at Subcontracting plant will be considered to calculate Projected Available Balance in SNP Planning book. Default Stock category group should include vendor stock category CE.
  • PPDS PDS is not required in APO
  • The BOM status in ECC should always be “Active”; otherwise, planned orders can’t be converted to purchase requisitions
  • Maintain a setup time in the routing equal to the planned delivery time. Otherwise, the dependent demand on the component will fall on the same day as the planned order date of the header product
  • In MD04, the planned order details will show “Procurement Type E” even though the material master has procurement type “F”
  • An alternate production version / PDS can only be changed in planned orders; it is not possible to change it once the planned order is converted to a purchase requisition
  • In case of manual intervention:
  • Users can create only SNP planned orders, and only in the SNP planning book
  • Users can change planned orders only in the SNP planning book or in the ECC system (MD04); the product view in APO can’t be used
  • Manual creation or changes will “fix” the order immediately
  • A direct purchase requisition can only be created in ECC; use the respective order type for the plant and item category “L”

 

Hope this information helps fellow consultants. Your comments are appreciated.

SCAL is the transaction used for calendar maintenance in SAP APO. This transaction introduces the following sub-objects:

  • Public Holidays
  • Holiday Calendar
  • Factory Calendar

 

Calendar configuration in APO can either be transferred from ECC to APO or be maintained directly in APO. If you opt for the first method, you cannot update only a single calendar; you must update all calendars in SCM, which means all calendars in ECC will be transferred to APO. This kind of process is particularly risky when there are multiple ECC systems in the landscape. However, if you want to make limited changes, it is better to configure those changes directly in APO. Once you have performed the required holiday / calendar changes in APO (development client) and are about to create a transport request to move the changes to the production client, there is an important point to note. The moment you click the transport icon (in SCAL), you will get a warning message as below.

 

1.JPG

 

So irrespective of how limited the changes are, all public holidays, holiday calendars & factory calendars will be transported. To avoid this, we should edit the transport request (TR) such that only the specific changes are transported to the production client. The tables below should be edited within the TR number generated to transfer the changes from the development to the production client.

 

Step 1: Decide which Tables should be retained and which ones are to be deleted.

 

Let us understand what each table signifies:

 

2.JPG

 

Case 1: If changes are made only to a factory calendar (e.g., changed validity dates), then edit tables 2 to 6 only. Delete the rest of the table rows from the TR.

 

Case 2: If changes are made only to a holiday calendar (e.g., a new public holiday is assigned), then edit tables 7 to 11 only. Delete the rest of the table rows from the TR.

 

Case 3: If changes are made only to public holidays (e.g., a new holiday is created), then edit tables 12, 13 & 14 only. Delete the rest of the table rows from the TR.

 

Follow similar logic if changes are made to both a factory calendar and a holiday calendar.

 

Step 2: Decide which Calendars / Holidays are to be transported

 

Assume that we have to transport changes to calendar ID A1. We then restrict the transport by filtering the table keys for calendar A1 only.

 

Table key is:

 

  • For holiday changes: the table key is the same as the public holiday ID
  • For calendar changes: the table key is the combination of the primary key of the respective table (i.e., the calendar ID) with <*> as either a suffix or a prefix.

        E.g.: A1, *A1, A1*, *A1*

 

Refer to the matrix below to know when to use * as a suffix or a prefix.

 

3.JPG

If the primary key is in the first column of the table and there is only one entry (row) in the table, then the table key is the same as the primary key, i.e., the calendar ID (A1).

 

If the primary key is in the first column of the table and there are multiple entries (rows) in the table, then the table key is the primary key with the suffix *, i.e., A1*.

 

Below is a table-wise summary of the various table keys.

 

5.JPG

 

If changes are made to more than one public holiday, include the public holiday keys of all of them. You may also specify a range, e.g., 12* will cover all public holidays with public holiday IDs 120 to 129. This is applicable to the tables THOL, THOLT and THOLU.

Harsh Singh

PHARMA BEST APO PRACTICES

Posted by Harsh Singh Aug 28, 2013

1) Customers forecast their demand.

2) The forecast figure is converted to a sales order.

3) The API (bulk material) is produced in one plant but packed by a second plant (demand assigned to one plant can also result from dependent requirements of another plant).

4) Plants act as contractors for the sourcing company, transforming APIs into finished goods.

5) DCs & warehouses should be close to the plants.

6) Finished goods / packaged goods are shipped directly to customers/affiliates.

7) Finished goods from one plant can be sent to another plant for regulatory checks and then sent to customers.

8) A rough-cut MPS is generated by SNP heuristic planning for each SKU using global heuristics.

9) Integration of APO DP with local ERP systems provides the data used to generate forecasts; statistical forecast models embedded in DP generate market forecasts per product, which are then adjusted via the consensus forecasting function.

10) APO SNP takes the final DP forecasts per product & generates a global MPS per product, taking inventory levels, plant capacities & other restrictions into account. These draft schedules are reviewed with the plants & refined & optimized.

11) The planning cycle starts every month with the generation of fresh forecasts by each affiliate.

12) The planning cycle ends with the release of the new global supply chain plan to the local planning systems.

13) APO DP can be implemented to enable sales forecast generation at the affiliates.

14) In the SNP VMI process, heuristic methods can be used to determine the replenishment quantities necessary to fulfill forecast sales. Corresponding sales orders are generated automatically.

15) SNP shelf-life planning through heuristics is used.

16) SNP & CIF user exits are to be used.

 

 

Harsh Singh

APO SNP Business Scenarios

Posted by Harsh Singh Aug 27, 2013

 

SNP Optimisation planning functionality is used where planning is based on constraints and penalty costs, and also where the business incurs high production and storage costs. The optimiser takes costs into account.

 

We can use the SNP heuristics engine where we want the best demand fulfillment, mostly in the pharma, consumer packaged goods and retail industries.

It does not take constraints & costs into account.

 

 

 

The SNP CTM planning engine is used mostly where material availability is a hard constraint; it is mostly used by big companies with large volumes of data.

 

Regards

Harsh

Extraction of Historical Demand from R/3

 

This document basically explains how to extract shipment data from R/3 so that one understands the delivered sales orders (SOs) at any point in time. The challenge is to get data for a collection of SOs which is a mix of Assemble-to-Order (ATO) and non-ATO SOs. Moreover, it also shows how to get the configured components from the ATO SOs in the mix. One can further restrict or modify the selection criteria based on one’s requirement or business scenario.

The logic is as follows (a rough ABAP sketch of these steps is given after step 8):

1: Go to the table VBFA, which is the sales document flow table. Select all the SOs with their line items, material and quantity, i.e. VBELN, POSNV, MATNR and RFMNG respectively, based on the condition that the preceding document was an order (VBTYP_V = C), the subsequent document has had a goods movement (VBTYP_N = R), and the creation date (ERDAT). Please note that the shipped SO number may come from VBFA-VBELV.

2: Then go to the table VBAP (SO items table) and get the plant and item category (PSTYV) details.

3: Get other header details and customer details, as desired, from VBAK and KNA1 respectively.

4: Based on the item category, check whether the SO is an ATO order or a standard order, i.e. whether the material is a configurable material or a normal material.

5: If the SO is a non-ATO order, you may directly show all the details such as quantity, customer, plant, material and date, i.e. VBFA-RFMNG, VBAK-KUNNR, VBAP-WERKS, VBFA-MATNR, VBFA-ERDAT. You can sum all the shipment quantities for a particular material for a particular date.

6: If the SO is an ATO order, you also need to get the components for that order, so you need to explode the ATO SO. For this you have to get the details of the production order or planned order (based on whatever strategy group is set in the MRP view). With the SO reference, get the production order details AFPO-AUFNR and AFPO-PSMNG from table AFPO, i.e. VBAP-VBELN = AFPO-KDAUF, VBAP-POSNR = AFPO-KDPOS. Do the same for planned orders from PLAF.

7: Fetch all reservation data from RESB based on the production order number (AFPO-AUFNR) or planned order (PLAF-PLNUM) above. Thus you get all component details. You can use the field RESB-KFPOS to get the option components used in the ATO SO.

8: To get the actual quantity of the option items of an ATO sales order shipped on any particular date, you need to take the quantity per for that option and multiply it by the shipment quantity, i.e. (RESB-BDMNG / AFPO-PSMNG) * VBFA-RFMNG. Now you can show all the details, such as quantity, customer, plant, material, option material, option material quantity and date, i.e. VBFA-RFMNG, VBAK-KUNNR, VBAP-WERKS, VBFA-MATNR, RESB-MATNR, (RESB-BDMNG / AFPO-PSMNG) * VBFA-RFMNG, VBFA-ERDAT. You can sum all the shipment quantities for a particular material for a particular date.
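Below is a minimal ABAP sketch of steps 1-8, kept deliberately simple (no joins, no performance tuning, only the production order path). The tables, fields and the option-quantity formula are exactly those described in the steps; the report name and all Z/lv_/lt_/ls_ names are illustrative only.

REPORT z_shipped_so_extract.
" Minimal sketch of the extraction logic in steps 1-8 above.

TABLES vbfa.

SELECT-OPTIONS s_erdat FOR vbfa-erdat.     " creation date of the goods movement

TYPES: BEGIN OF ty_flow,
         vbelv TYPE vbfa-vbelv,            " shipped sales order
         posnv TYPE vbfa-posnv,            " sales order item
         matnr TYPE vbfa-matnr,
         rfmng TYPE vbfa-rfmng,            " shipped quantity
         erdat TYPE vbfa-erdat,
       END OF ty_flow,
       BEGIN OF ty_comp,
         matnr TYPE resb-matnr,            " option / component material
         bdmng TYPE resb-bdmng,            " component requirement quantity
       END OF ty_comp.

DATA: lt_flow   TYPE STANDARD TABLE OF ty_flow,
      ls_flow   TYPE ty_flow,
      lt_comp   TYPE STANDARD TABLE OF ty_comp,
      ls_comp   TYPE ty_comp,
      lv_werks  TYPE vbap-werks,
      lv_pstyv  TYPE vbap-pstyv,
      lv_kunnr  TYPE vbak-kunnr,
      lv_aufnr  TYPE afpo-aufnr,
      lv_psmng  TYPE afpo-psmng,
      lv_optqty TYPE vbfa-rfmng.

START-OF-SELECTION.

  " Step 1: document flow - preceding document is an order (C),
  " subsequent document is a goods movement (R).
  SELECT vbelv posnv matnr rfmng erdat
    FROM vbfa
    INTO TABLE lt_flow
    WHERE vbtyp_v = 'C'
      AND vbtyp_n = 'R'
      AND erdat   IN s_erdat.

  LOOP AT lt_flow INTO ls_flow.

    " Step 2: plant and item category of the shipped SO item.
    SELECT SINGLE werks pstyv FROM vbap
      INTO (lv_werks, lv_pstyv)
      WHERE vbeln = ls_flow-vbelv
        AND posnr = ls_flow-posnv.

    " Step 3: sold-to party from the SO header (KNA1 for further details).
    SELECT SINGLE kunnr FROM vbak
      INTO lv_kunnr
      WHERE vbeln = ls_flow-vbelv.

    " Steps 4-5: the item category decides ATO vs. standard; for non-ATO
    " items, ls_flow-rfmng can be summed per material and date directly.

    " Step 6: for ATO items, find the production order tied to the SO item
    " (for planned orders, read PLAF via its sales order reference instead).
    CLEAR: lv_aufnr, lv_psmng.
    SELECT SINGLE aufnr psmng FROM afpo
      INTO (lv_aufnr, lv_psmng)
      WHERE kdauf = ls_flow-vbelv
        AND kdpos = ls_flow-posnv.

    " Steps 7-8: component reservations and the option quantity
    " = ( RESB-BDMNG / AFPO-PSMNG ) * VBFA-RFMNG.
    IF lv_aufnr IS NOT INITIAL AND lv_psmng > 0.
      SELECT matnr bdmng FROM resb
        INTO TABLE lt_comp
        WHERE aufnr = lv_aufnr.
      LOOP AT lt_comp INTO ls_comp.
        lv_optqty = ls_comp-bdmng / lv_psmng * ls_flow-rfmng.
        " collect lv_optqty per option material and date as needed
      ENDLOOP.
    ENDIF.

  ENDLOOP.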

Yeah, it's better to have this here rather than explaining it every time someone asks about how to make a key-figure or a column (non-)editable in DP by a macro.

 

The macros below make it possible:

  1. to make columns non-editable [COL_INPUT] in the period defined in the step;
  2. to make key-figure [CELL_INPUT] non-editable [for the period defined in the step, of course];

 

Parameter 0 is false and 1 is true. So COL_INPUT( 0 ) means that input into the column is false, i.e. not possible: COL_INPUT( 0 ) turns a column into non-edit mode, while parameter 1 puts it into edit mode. The same applies to the CELL_INPUT function, except that it works on a single key figure, i.e. at row level rather than column level.

 

macro.jpg


So, I was just trying to see if there is anything I can learn by analyzing the waterfall model myself. I built it in Excel, and you can see the figures below.

blog.jpg

The point I finally concluded: with this waterfall analysis, I came to see that the forecast was constantly increasing while the actual demand kept falling short of expectations as each period closed. And this obviously results in inventory cost, as we are constantly increasing our stock!

 

Ok then, I hope you observed the pattern [Apr: 120 --> 130, May: 115 --> 140, June: 130 --> 162, July: 125 --> 182, Aug: 134 --> 202]; if we take snapshots of the forecast in each month from Mar to Aug, there is a constant increase in the forecast with each month we enter, and this we see considering only one parameter: the actual sales. In trying to level our forecast, which was previously made based on the actual demand received, we keep adjusting the forecast in the future buckets. Does that mean I have to ignore whatever forecast is not met in a particular period so that I don't end up increasing my forecast in the future?

 

Hmm, not quite; this is really for the S&OP team to decide by taking these snapshots and looking at how the forecast needs to be adjusted in the coming months. Maybe the forecast in some future months should be reduced as well, as you try to adjust the demand for the very next bucket! Err, but this will again create a problem at the sourcing locations, as they will not have clear visibility of how the forecast changes in the future, and this will lead to inefficient planning, consequently resulting in a situation where the plants will at some point say: 'Boss, you cannot give me this forecast now!!'

 

Whatever it is: at least for me, the conclusion is that you can never rest by doing the forecast once and then sitting silent. You need to keep a close eye on several parameters that affect the forecast: here we have considered only one such parameter, actual sales. So guys, THIS adjustment of the forecast is, unfortunately, a never-ending story.

 

And hence I conclude that if I were to take up the profession of demand planner tomorrow, moving from my current role of APO consultant where I help the demand planners make their operations run smoothly, I think I would end up restless again.

 

Thank goodness I realized this now!

Today I was unsuccessful in extracting the logs of some deleted jobs in SM37, as none of the possibilities below worked for me.


  1. I thought the table TBTCO contained the details of all job logs, even of those which are deleted, with a column maintained as "Deleted"; unfortunately, I didn't find that (see the small check at the end of this post),
  2. I thought I could extract the details of the deleted job logs from transaction SCU3, where I entered my table to see if I could get any records with Deleted status. But the trace is not activated on these tables, unfortunately,
  3. I thought the spool of the job SAP_REORG_JOBS contained the details of all the deleted jobs and tried to check it, but it only shows which jobs got deleted, without any detail on the details [you read that right!];

 

... and so I have been unsuccessful in extracting some older job logs that I badly needed right now ;-(
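For what it's worth, a small check like the sketch below illustrates point 1: TBTCO is queried by job name (the job name here is a placeholder), and once a job has been deleted there is simply no row left to find – there is no "Deleted" status value to select on.

REPORT z_check_deleted_job.
" Look up a job in TBTCO by name. 'Z_MY_JOB' is a placeholder.
" Status values are e.g. 'F' (finished) or 'A' (cancelled); a deleted
" job returns no row at all, which is exactly the problem described above.
DATA lv_status TYPE tbtco-status.

SELECT SINGLE status FROM tbtco
  INTO lv_status
  WHERE jobname = 'Z_MY_JOB'.

IF sy-subrc <> 0.
  WRITE: / 'No TBTCO entry - the job (and its log) is gone.'.
ELSE.
  WRITE: / 'Job status:', lv_status.
ENDIF.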

Got the below error while trying to load the data in the planning book? Hmm....

scc.jpg

And this is because you should have compounded the info objects! The superordinate characteristic [info obj 1] is the one that is compounded within info obj 2. But what does this compounding mean?

 

OK, let me put it simply: there is a lover, and he is like me. That means he simply loves silently, without expressing it to the one he loves dearly. And this is a one-sided love!

 

'Compounding' means exactly the same thing! The master info object [info obj 2], which compounds the other info object [info obj 1], says: 'Hey dear one, I cannot allow anyone to access me without their speaking of you. If someone wants to access me, they should know of you and they should tell me of you. I shall not allow them to access me while saying nothing of you. I should always appear based on you.' But the other info object [info obj 1] simply doesn't care about this! She ignores the one who loves her. She allows anyone to access her and is not restricted to her mate who loves her so much!

 

In technical terms:

  • The above message appears when "info obj 2" compounds "info obj 1".
  • You cannot load any data from the shuffler of the planning book for "info obj 2" unless you place some condition on "info obj 1".


Okay, I don't want to say much here. My macro below is shouting at me: 'Give me a chance to speak!'

macro.jpg

By the way, my dear macro, I would just like to mention one case here to my viewers.

 

Dear viewers, one thing you might miss here is the case where you need to change the data type of a key figure in the macros. If you are using a key figure under a macro function, then keep the data type of the key figure as 'Attributes' [whether Row or Column doesn't matter].

 

As an example, in the above case you can see that in the last step a key figure is used under the function BUCKET_BDATE (which gives the begin date of the week). Here, the key figure type should be 'Attributes' [I have used 'Row attributes'].

 

... and to add some additional knowledge: just as I have used WEEK, you can also use YEAR to fetch the year. There are also functions similar to BUCKET_BDATE, such as FISCAL_BDATE and FISCAL_EDATE. In any case, you can always explore further for more functions.

 

I actually wrote this post because I took on the challenge of identifying how many weeks are still remaining in the current fiscal period (assuming the calendar uses 5-4-4 logic). Instead of achieving that, I kept going with these things... and the macro finally dominated me by shouting at me to let her explain things to you all! Anyway, I shall have to try my challenge in some free time...

Guru Charan

Well, just like that...

Posted by Guru Charan Jun 28, 2013

Ever tried to get to the place where you can type the transaction code via the keyboard?

 

Well, this is just that ... nothing more

 

tcode.jpg
