Hello all,



SAP Note 2083799 provides composite scenarios involving posting with the Accounting BAPIs. I want to share some more details on the foreign currency scenario:


The currencies are maintained in transaction OB22 for each company code. You can also see the 2-digit currency type keys there, which are used in the accounting interface as well.

Currency type = 00 (document currency) and currency type = 10 (local currency) are the only truly hard-coded keys. You can see this in OB22 as well: you cannot change the currency type key for the local currency. For the 2nd and 3rd local currency, the value depends on the kind of currency you are using. You can find a complete list of the different currency types in SE11 by checking the value range of the domain CURTP.

Notice that the currency type basically always ends with a 0, so the typical values are 10, 20, 30, 40 and so on. Directly underneath there is a single-digit valuation field, which can indicate legal valuation, profit-center valuation and so on. For FI, only the legal valuation is relevant. If another valuation is maintained, its value is added to the currency type. So you can have customizing like this:

2nd LC, Currency Type 30, Valuation 0 – legal valuation, currency USD
3rd LC, currency type 30, valuation 2 – profit-center valuation, currency USD


In this case LC2 would be currency type 30 and LC3 would be currency type 32. Only LC2 would be transferred to FI; LC3 would be relevant for Profit Center Accounting. Even though the amounts in LC3 are not relevant for FI, they are determined in the FI interface (function FI_DOCUMENT_CHECK) if they have not been passed to the accounting interface.
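As a rough illustration (plain Python, not SAP code), the effective currency type key in this customizing can be derived like this:

```python
def effective_currency_type(base_type: int, valuation: int) -> int:
    # Legal valuation (0) keeps the base key; other valuation views
    # are added to it, e.g. 30 + 2 = 32 for profit-center valuation.
    return base_type + valuation

# LC2: currency type 30, legal valuation (0)        -> key 30
# LC3: currency type 30, profit-center valuation (2) -> key 32
```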


You do not have to provide the local currency amounts to the accounting interface; only the document currency is mandatory. But of course you CAN submit local currency amounts to accounting (as MM does, for example); then only the missing local currencies are determined in the FI interface. The one important rule is: IF you submit a local currency, you should submit it in all line items and not just in some of them. Otherwise a balance issue can occur.
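The all-or-nothing rule can be sketched as a simple consistency check (illustrative Python; the field name local_amount is a stand-in, not a real interface field):

```python
def local_currency_consistent(line_items):
    """True if the local currency amount is supplied either in every
    line item or in none of them (the all-or-nothing rule)."""
    supplied = [item.get("local_amount") is not None for item in line_items]
    return all(supplied) or not any(supplied)
```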


More information on the correct setup of OB22 can be found in note 335608.

I hope this helps!





SAP Note 2083799 provides composite scenarios involving posting with the Accounting BAPIs. I want to share some more details on rounding differences in cross-company scenarios using the BAPIs.



1) In case of cross-company scenarios the rounding difference is distributed at:

FORM  SUBST_CLEARING (here the company code clearing line items are generated)



2) If the document is not a cross-company posting, the rounding difference is distributed at:

or (depending on the AWTYP)



It is important to know that the rounding difference is always distributed as described in note 106094 (point 2 is relevant for postings via the accounting interface). This means that even in cross-company scenarios the rounding difference is distributed to the first non-tax G/L line item, no matter which company code this item belongs to. The logic in FB01 is the same.


In the form routine get_tolerance the tolerance level is determined, which is (in most cases) simply 2 * the number of line items of the document. If the rounding difference is higher than this, it cannot be distributed by the FI interface.
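A simplified model of this check (illustrative Python, not the actual ABAP of get_tolerance):

```python
def within_rounding_tolerance(rounding_difference, num_line_items):
    # In most cases the tolerance is simply 2 * number of line items
    # (in the smallest currency units).
    tolerance = 2 * num_line_items
    return abs(rounding_difference) <= tolerance
```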


If the rounding difference is less than the tolerance, the first non-tax G/L line item is determined at:
loop at accit_fi where koart ca 'MSA'
                         and   taxit <> 'X'


If it cannot be distributed to those items either, the interface tries to distribute it to a customer/vendor line item. If this fails as well, an error is raised. For some countries there is even a legal requirement to create a separate rounding-difference line item, which is also done in those form routines, at the very end in insert_rdf_items.
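The overall distribution order can be sketched roughly like this (illustrative Python; the real logic in the FI interface is more involved):

```python
def pick_target_item(items):
    """Toy model of the distribution order: first non-tax G/L item
    (koart M, S or A), then a customer/vendor item (koart D or K),
    else None (separate rounding-difference item or error)."""
    for item in items:
        if item["koart"] in "MSA" and not item.get("taxit"):
            return item
    for item in items:
        if item["koart"] in "DK":
            return item
    return None
```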


I hope this helps!





Activating number range buffering for FI documents raises many questions from customers. SAP has various notes and WIKIs on this topic, but I decided to share some specific questions that I have already worked on.



1 - What kind of gaps can we have with parallel buffering, and how can we document them to the business?


2 - Do we have to run reports RSSNR0S1 and RFVBER00 daily/weekly/yearly?


3 – Can buffering that has been activated be deactivated later? What is the impact of making this change?


4 - What is the recommendation on the testing required for this buffering?


5 - What is the overall impact of activating number range buffering?




1 - Q: What kind of gaps can we have with parallel buffering, and how can we document them to the business?

A: As described in note 1522367, point 3, "gaps" can occur at fiscal year end if the number assignment is year-dependent and the numbers in buffer are more than 1.
Example: parallel buffering with numbers in buffer = 10 and 2 instances. Both instances draw 10 numbers: instance 1 gets numbers 100010 to 100019 and instance 2 gets numbers 100020 to 100029. Until year-end close, 7 documents are posted on instance 1 and 3 documents on instance 2. So numbers 100017 to 100019 on instance 1 and 100023 to 100029 on instance 2 will not be used. But again, these are not real gaps, and report RSSNR0S1 exists for documentation.
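The example can be replayed in a few lines (illustrative Python):

```python
# Year-end example: two instances each buffer 10 numbers.
buffer_size = 10
instance1 = list(range(100010, 100010 + buffer_size))  # 100010..100019
instance2 = list(range(100020, 100020 + buffer_size))  # 100020..100029
used = instance1[:7] + instance2[:3]                   # 7 + 3 documents posted
unused = sorted(set(instance1 + instance2) - set(used))
# 10 numbers stay unused at year end -- documented gaps, not real ones
```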


2 -  Q: Do we have to run reports RSSNR0S1 and RFVBER00 daily/weekly/yearly?

A: As the answers above may have illustrated, neither parallel buffering nor buffering in ascending order will cause more real gaps than a standard buffered number range object. But of course, as a developer I cannot give you a recommendation on how often reports RSSNR0S1 and RFVBER00 should be executed; this depends on your system administration and reporting requirements. As said, buffering does not cause more update terminations, so there is no need to execute RFVBER00 more often only because buffering is activated. But of course RFVBER00 has to be executed before update terminations are deleted by your system administration.


3 – Q: Can buffering that has been activated be deactivated later? What is the impact of making this change?

A: Yes, number range buffering can be deactivated later. But of course, postings created while number range buffering was activated will show document numbers derived according to the logic of the respective buffering type.

One possible setting in number range buffering is e.g. SNRO -> RF_BELEG: activate the flag 'parallel buffering' and set the number of numbers in buffer to 1.

This means each application server picks one document number from the number range and buffers it locally. In this way parallel postings are possible across application servers. With this setting, postings executed within one application server will be serial and chronological; only across different application servers will the document numbers not be derived chronologically. In some countries the law requires document numbering to be serial, but it is not said whether this relates to the complete system, and in such cases parallel buffering is an option. This means the local law must be studied carefully before choosing one of the different buffering types.


4 - Q: What is the recommendation on the testing required for this buffering?

A: Once a decision is made, extensive testing is not necessary, as this is a common setting used by nearly all customers. You just need to choose a way of buffering that is allowed by the country-specific law of your company codes.


5 - Q: What is the overall impact of activating number range buffering?

A: It depends on the way of buffering. E.g. if you choose 100 as the number of numbers in buffer in SNRO, your application server will pick 100 document numbers, and 100 postings for each number range can be done locally on this application server in parallel. But if only 70 postings are done before the application server shuts down, 30 document numbers will remain unused. This means you would have a number range gap of 30 document numbers, because after booting the application server again, it will increase the current number level by 100 and store 100 new document numbers locally. But of course you needn't choose 100; it is also possible to choose 10 or even 1. The advantages and disadvantages are described in note 1398444 and the notes mentioned there.
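The arithmetic of such a gap is simple (illustrative Python):

```python
def gap_after_shutdown(numbers_in_buffer, postings_done):
    """Unused document numbers lost when an application server shuts
    down before its local buffer is exhausted."""
    return max(numbers_in_buffer - postings_done, 0)

# With numbers in buffer = 100 and only 70 postings before shutdown,
# the gap is 30 document numbers.
```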


I hope this helps!







Frequently I receive questions about the difference between standard parallel buffering and buffering in pseudo-ascending order in FI. So I decided to share some findings on this topic:




There are 2 differences.



1.a) Parallel buffering uses the number of numbers in buffer as maintained in transaction SNRO.

1.b) Buffering in pseudo-ascending order always uses numbers in buffer = 1. This is hard-coded for buffering in pseudo-ascending order.





Usually parallel buffering with numbers in buffer = 1 already solves performance / lock-wait issues for RF_BELEG.



2.a) Parallel buffering: in case of a rollback, the number(s) will be reused.

2.b) Buffering in pseudo-ascending order: in case of a rollback, the number(s) will not be used again; otherwise the number assignment would no longer be chronological.
Numbers that are not used again are transferred to table NRIV_RESTE. So with pseudo-ascending order you may see gaps in BKPF, but these are not real gaps, as report RSSNR0S1 exists for documentation.
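The difference in rollback handling can be modeled like this (a toy sketch in Python, not the actual number range logic):

```python
def rollback(buffering, number, buffer, reste):
    """Toy model of a rollback: parallel buffering puts the number
    back into the local buffer for reuse; pseudo-ascending order
    records it in NRIV_RESTE instead, keeping the assignment
    chronological."""
    if buffering == "parallel":
        buffer.append(number)   # number will be reused
    else:  # pseudo-ascending order
        reste.append(number)    # documented gap in NRIV_RESTE
```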




For an overall explanation of number ranges in FI, I usually check the following WIKIs, which link the most relevant notes for each specific topic:



NUMBER RANGE : Buffering

NUMBER RANGE : Gaps in Number Range

NUMBER RANGE : General Information





The SAP note below is an FAQ note and is also very useful:

1398444 - Buffering the document number assignment for RF_BELEG




I hope this can help!





In this post we will look at the convergence of two long-standing pieces of SAP ERP finance master data: the G/L account and the cost element. As anyone who has worked with SAP CO knows, the cost element is key to the controlling side of SAP. It is an object that allows you to identify the type of activities that can be done within controlling with that account. Cost elements are generally divided into primary and secondary.


  • Primary cost elements have an associated GL account and are generally expense or revenue accounts.
  • Secondary cost elements exist only in CO and are used for internal settlements, assessments, and allocations. 


When creating a new revenue or expense account in the G/L, you had to create a corresponding cost element in CO, and typically all you were doing was selecting a cost element category.


With the Simple Finance add-on 2.0 (now called On-Premise 1503), the traditional cost element create, change, and display transactions are gone. Their functions have been combined into FS00 - Manage G/L Account Centrally. This greatly simplifies the act of creating a new account and eliminates the need to maintain separate masters.


On the Type/Description screen there is a new field called Account Type. If you select either "Primary Costs or Revenue" or "Secondary Costs", a new field appears on the Control Data tab where you enter the cost element category.


Primary Cost Example:


FS00 screen 1.jpg


FS00 screen 2.jpg

Secondary Cost Example:



FS00 screen 3.jpg

FS00 screen 4.jpg


I think this is a good step forward in simplifying the SAP finance master data and driving the convergence of FI and CO.


Recurring Entries

Posted by SUMIT MISHRA Mar 12, 2015

What are Recurring Entries?



Recurring entries provide the business with a function for the automatic creation of accounting entries based on predefined parameters.



Once the recurring entries are created, they are posted in the SAP system according to the schedule defined by the business.



Use of this functionality is only recommended if the account assignment objects and general ledger accounts do not change when the document is posted.



Recurring entries can be used in General Ledger, Accounts Payable, and Accounts Receivable postings, so this functionality can be used for various recurring document posting requirements.



For example, if the business requires a rent payment of $1000 to be executed each month, we can create a recurring entry to post that particular expense.



Also, if anything needs to change, we can make the change, and the recurring entry functionality tracks all of these changes so we can see everything that has been done.



How can we set up recurring entries in SAP?



The process to create recurring entries in SAP is straightforward, and it can be used for a variety of purposes.



The example shown in this document is just a reference to show how to proceed with the process.



You can create a recurring document for anything, such as a GL, AP, or AR posting; the process will be similar to the one displayed in the screenshots below.



Create the recurring document (please note that the recurring document number range is X1).

Transaction code FBD1











To display recurring documents: Transaction code FBD3





Create recurring documents in the books: Transaction code F.14

You need to fill in the details of the recurring document created earlier.

If the schedule has been defined, the recurring documents will be created as per the schedule.

Press Execute as shown in the screenshot below, and the session will be created.
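For illustration only, here is a rough sketch of what a monthly schedule expands to (plain Python; monthly_run_dates is a hypothetical helper, not an SAP function, and it ignores month-end edge cases):

```python
from datetime import date

def monthly_run_dates(first_run: date, last_run: date, day: int):
    """Hypothetical helper: the run dates a monthly recurring entry
    schedule would produce between first and last run (inclusive)."""
    dates, year, month = [], first_run.year, first_run.month
    while True:
        d = date(year, month, day)
        if d > last_run:
            break
        if d >= first_run:
            dates.append(d)
        month += 1
        if month > 12:
            month, year = 1, year + 1
    return dates
```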



You can see the session for the recurring document using transaction code SM35.



Once the session is completed, the recurring document is created in the General Ledger. You can display it using transaction code FB03.


Recurring 8.png

Changes In Recurring document


For example, if the situation arises that we need to change a recurring document:

Execute transaction code FBD2. Here we changed the payment terms from ZK26 to ZK25 and saved.




Please note:

Sometimes the business needs to see the change log of a recurring document. We can see it using transaction code FBD4; notice that it tracks all the changes made to the recurring document in the past.

The screen below shows, for reference, changes made to a different recurring document than the one in the example above.



Recurring 10.png

Conclusion: Recurring entries are a very helpful way to post repetitive accounting/invoice postings in an easy and accurate manner, and they save end users the time of doing this activity every month.

This functionality is mostly used for booking accruals if they are the same each month, rent or lease payments, utility bills, etc.

By deploying Simple Finance, IT can bring transformation to finance. In the introduction to this series, we learned the need for simplification and the role HANA plays.


Simple Finance is our industry-leading financial solution re-built to take advantage of SAP HANA. Perhaps the most significant change from this re-build is with reconciliation. If we understand reconciliation and how Simple Finance eliminates it, we can communicate the benefits of Simple Finance to our colleagues in finance using an example that makes sense to them.


There are two general areas of accounting:


  1. Financial (FI). For external entities. Main reports are balance sheet and P&L statements.
  2. Controlling (CO). For internal reports to management, mainly focused on cost.


In software today, FI and CO are separate components or systems, historically thought to be independent areas. Certainly they have different structures and key figures. Executives, though, wanted a holistic view, and this meant a huge reconciliation challenge to understand the differences between systems, and bring them together in a single ledger. Reconciliation is a massive, time-consuming effort that has to occur often:


  • Within components. Ensuring totals match up with underlying line-items.
  • Between components. Comparing figures between functions, like results in the P&L module to the cost-based profitability analysis module (CO-PA).


There have been improvements to make reconciliation easier, notably the New G/L (General Ledger) in ERP 2004. Ultimately, though, none of the improvements solved the underlying issue: detail is stored separately by all components (such as General Ledger, Controlling, Asset Accounting, Material Ledger, Profitability Analysis).


HANA’s most important capability is aggregating hundreds of millions of items in one table in memory within seconds. Thus, it is the ideal architecture to solve reconciliation. In Simple Finance, we combine all the data structures of the different components into one table: the Universal Journal. HANA’s columnar store with superior compression makes this possible.


We can’t adequately describe the structure of the Universal Journal in a short article. The important thing for now is that HANA and the Universal Journal solve both reconciliation problems. Recall that with HANA, we no longer need redundant data (aggregates and indices) to do analysis, so the first issue of reconciliation within components is solved. All totals are derived on-the-fly from the line-items directly.
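The idea of deriving totals on the fly can be illustrated in a few lines (plain Python, not the actual HANA implementation):

```python
from collections import defaultdict

def totals_from_line_items(line_items):
    """Sketch: derive totals on the fly by aggregating line items,
    instead of maintaining redundant totals tables."""
    totals = defaultdict(float)
    for item in line_items:
        totals[item["account"]] += item["amount"]
    return dict(totals)
```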


HANA before after.jpg


The more difficult reconciliation between components is solved as well. We merge the components in the Universal Journal to guarantee real-time integration. For example, with FI and CO combined logically in the Universal Journal, users drill down to the same line items from the key figures and reports of either component.


Since HANA provides unprecedented speed for multidimensional analysis, it is no longer necessary to replicate data to OLAP. Even should OLAP be needed, ETL is much simpler from the Universal Journal instead of multiple components.


All of this means dramatic simplification. In one case, a Fortune 500 early adopter cut 120 person-days per month from reconciliation efforts. This is time that can be spent on more strategic planning and analysis tasks.

Are you thinking about the next big accomplishment IT can provide for our colleagues in the finance department? You should be, because now with SAP Simple Finance, we have a great opportunity to transform how finance operates.


CFO.com reported recently that 76% of finance executives think strategic planning will be the biggest area of new demand. You can’t act strategically, however, when so much time is spent on the tactical. Consider:


  • 70% of analytics effort is preparing data (IDC, Feb 2013)
  • 76% of global companies do not have financial performance data at the ready (Harvard Business Review, 2014)
  • 73% of executives think complexity is their biggest IT challenge (Forrester, 2013)


At SAP, we want to attack complexity in finance systems. That is why we took our industry-leading finance solution and re-built it to take full advantage of HANA. The result is Simple Finance. From an IT deployment point-of-view, you can think of Simple Finance as the next version of our finance solution.


I hear all the time from our customers: “What exactly makes Simple Finance simple? What have you simplified? Will it be worth the effort to upgrade?”


This is the first in a series of articles where I will answer these questions. Each article will describe how Simple Finance transforms a key tactical process like reconciliation, analysis, and closing. Simple Finance makes these processes simpler, freeing up finance to focus on the strategic. If you know about these processes, you can explain the benefits of Simple Finance to your finance colleagues in examples that are meaningful to them.


Before we look at specific finance problems, we must first understand the foundation of Simple Finance: HANA. You probably know that HANA at its core is an in-memory database engine. In-memory means that queries run exceedingly fast. But speed alone isn’t good enough. In some cases, it may be worth the upgrade to get queries to run faster. But we want to go further and use the speed to simplify the underlying architecture. How do we do this?


The simplest way to explain it is this: we put everything into one database engine in memory. Queries, transactions—all of it. Because HANA is so fast, and the column store lends itself to excellent compression, we can finally do this. Previously, database architects needed to separate transactions carefully from queries and design redundant data (indices, subtotals) to handle the query workload. Not anymore.


HANA one source.jpg


This has a profound impact on finance systems. First, the database footprint is reduced dramatically since we no longer need redundant data. But more importantly for finance, it means we can combine data for two different tasks—transactions and queries—into one data store. This is the beginning of the simplification of finance systems. I hope you’ll read on in the series to see the specific examples. Part 2.

Hi all,

Before implementing the 1042-S note for the US Legal Change 2014, please implement note 2095167 for the DDIC changes.

2095204 - US Legal Change 2014 - 1042S Form Reporting


At note 2095204 you have 'adobeform_upload.pdf' with instructions to change the interface 'GS_IDWTCERT_US_1042'.

If you don't have the 2013 interface, please apply notes:

1949019 and 1949020 -> Smart forms
1949021 and 1949022 -> Adobe forms


I hope it helps to address your concern about this subject.


Best Regards,

Manuela Valente.

Business need


Our finance department engaged with a local bank to accelerate vendor payments with a p-card program. Once set up on the bank's website, they would just need a simple comma-separated text file (CSV) with some basic data like our vendor number, the vendor's invoice number, the payment date, and the amount. This was the minimum requirement; there were several other optional fields, but for this pilot project, just these four were needed. Finance asked us to configure the new payment program and help them get a test file to the bank.



Up to this point, we had been using mainly paper checks. A few vendors were accepting ACH payments, but not many. Both of these payment methods relied on the classic payment programs (RFFOUS_T and RFFOUS_C). As this would not be printing data to a form, nor outputting a standard file format like ACH, we would have to come up with a custom DME solution.


  1. Configure a Data Medium Exchange format tree
  2. Configure a Payment Medium Format that calls the DME format tree
  3. Configure a variant in the standard program
  4. Associate the new payment method with some vendors
  5. Test


Create DME format tree

This was by far the easiest part. Using transaction code DMEE, I created a flat-file format. The DME blog and SAP Help were surprisingly helpful in this (see links below). Some of the key things to remember were the following:

  • Remember your format tree name because your Payment Medium Format has to be named the same thing. This hung me up for the longest time.
  • Set your field type to 1 and your segment delimiter to a comma to get a CSV output.
  • Don't forget to set the carriage return and line feed check boxes on the File Data tab. Whereas the comma was your field delimiter, the CR-LF will be your record delimiter.
  • A segment is like a row.
  • An element is the equivalent to a field. These elements are mapped to specific fields in the various payment structures (FPAYP, FPAYH, etc.)
  • A technical node is similar to an element, but it does not get output into the file. In my case, I had to create a technical node for the currency code so SAP would configure my amounts correctly.
  • Conversion functions are available for elements to format currencies, dates, and other strings of text in any way you might need. Need to get rid of leading zeroes in a cost center field? There's a conversion function for that, and there's even a handy wizard to walk you through getting the correct one in plain English.
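For illustration, the kind of CSV record the format tree was meant to produce could be sketched like this (plain Python; the field names and sample values are made up, not the bank's actual spec):

```python
import csv
import io

# Hypothetical field layout -- the real layout came from the bank's
# specification, not from SAP.
fields = ["vendor_number", "invoice_number", "payment_date", "amount"]
payments = [
    {"vendor_number": "100042", "invoice_number": "INV-9001",
     "payment_date": "2015-03-31", "amount": "1250.00"},
]

buf = io.StringIO()
# Comma as field delimiter, CR-LF as record delimiter (matching the
# File Data tab settings described above).
writer = csv.DictWriter(buf, fieldnames=fields, lineterminator="\r\n")
writer.writerows(payments)
csv_text = buf.getvalue()
```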


Configuring the payment medium format

This was a challenge before I carefully read the DME blog. I cannot overstate how helpful this blog was. Since I was going into this effort without much formal training in configuring the payment program, this no-nonsense, plain-language blog was much more useful to me than SAP's documentation, which I found cryptic at times. Some of the key lessons I learned were the following:

  • Create New Entries rather than copying from an existing format if your format is non-standard. If you are tweaking an existing format, copying makes sense. My first attempt was a disaster because I copied an ACH format and wondered why my output looked like an ACH file.
  • Your format name must be the same as your DME format tree name. This caused me no end of pain, too, when the payment program either produced nothing (without errors) or produced error messages that weren't very helpful. When you name both things the same, everything just falls into place.
  • Step through everything in FBZP so you don't miss anything. Every button; every setting -- check and recheck that you configured everything you needed to. If you miss a setting in FBZP, you won't get the output you want. Trust me.


Configure a variant in the standard program

Once SAP sees that you are trying to pay using the Payment Medium Workbench and DME, it's going to steer clear of the classic RFFOUS* programs and use SAPFPAYM instead. The variant you create will reference the payment medium format you configured in the previous step, tell the program to use DME to create the output, and specify a default directory and file name. This is just like setting up any other variant in SAP, but you have to take another step after this.


You must go to transaction OBPM4 and associate the variant with your payment medium format. You must or you will bang your head on your desk as much as I did, even though it doesn't help get you your output file at all. Only configuring OBPM4 does that.


Associate the payment method with vendors and test

Here is where you engage your business users and start testing. Once you start using the new payment method (for instance, putting it on the vendor master record), then when you run F110, the DME magic happens.


Now what?

I have written here in very general terms about the configuration steps required to configure a custom, non-standard payment method. I have tried to identify the painful lessons I learned trying to do it myself. There are many details and nuances I did not cover, but that the DME blog covers exhaustively. I look forward to your questions and comments below.






In the standard SAP process, the instruction key maintained in the vendor master (LFA1-DTAWS) is captured into REGUH-DTAWS when F110 or RFF110S is executed. As a next step, when SAPFPAYM (transaction code FBPM) is executed, this gets populated into the XML file.


For some reason, that does not seem to happen for me. I have identified SAP Note 2106391 (which includes 1574937 and 1779966) for this issue.


Did someone come across a similar issue? Do these notes fix the issue?


Best regards




How do you decide upon the best approach to clear your check register in SAP for your company? Let me help to demystify this question and others over the next three weeks of postings. Stay tuned to find out that keeping the check register up to date doesn’t need to be a difficult exercise. I will provide three different options on how you can do this within your company.  But how do you decide which option makes the most sense for you?  Every company will have a different perspective, but here are some considerations for your project approach to managing the check register:


1.     Volume and Timing: How many checks are being issued from this account each month? Is receiving an encashment file once per month sufficient? At what point would potential error corrections be too large? Does the volume become too large to manage errors with only one file per month? Perhaps it is easier to receive activity each day for reconciliation purposes.


2.     Cost: Do you have to pay extra to have the individual checks included in your bank statement file? What is the additional charge? Could you use the bank statement and eliminate the need for a separate interface just for cashed checks?


3.     Monitoring: Who will perform the ongoing monitoring to ensure checks are being updated in the register? Will this be done as part of your bank reconciliation process, by your team supporting payments, or someone else? Who will make sure that either solution continues to work properly? Will this person participate in regression testing for the solution once it is live?


4.     Knowledge and Know-how: Does your technical team understand how to set up the different solutions? Do they understand the potential failure points? What will be your processes for ongoing monitoring?


5.     Staffing availability: Does your internal project team have the capacity to set up these interfaces? Often project teams have competing priorities with everything demanding top position in the project scope. If there is a constraint on internal staffing from either the business function or IT, perhaps it is best to update checks manually while other priorities are completed.


6.     Cost of compliance: In cases where the processes are completely manual, what is it costing you to comply with the various state escheatment laws? Would an automated interface make your compliance efforts easier and less expensive?

Now that the stage is set, over the next few weeks I will post how you can actually do this in SAP. 

Hello everyone,



The purpose of this blog is to help identify the root cause of error FS 861 with an external tax system (SABRIX, VERTEX, or TAXWARE).



Error FS 861 can be a generic error returned by RFC_CALCULATE_TAXES_DOC when external tax calculation is used. The cause can be a wrong jurisdiction code, a wrong jurisdiction code structure, an unsupported country, and so on.



The steps below should be followed to identify the root cause of error FS 861.



1. Check whether the company code country is US, CA, or PR:



According to SAP Note 1738657, SAP supports external tax systems only for the country codes US, CA, and PR (Puerto Rico). This means the country of the company code must be one of these so that taxes are calculated in the external tax system and also updated there.




2. Check jurisdiction code structure defined in transaction OBCO:








The default jurisdiction structure is:

Sabrix    = 2,2,5,5

Taxware = 2,5,2,0

Vertex    = 2,3,4,1
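Splitting a jurisdiction code into its levels according to the OBCO structure can be sketched like this (illustrative Python):

```python
def split_jurisdiction_code(code, structure):
    """Split a jurisdiction code into its levels according to the
    OBCO structure, e.g. (2, 5, 2, 0) for Taxware. Levels with
    length 0 are skipped."""
    parts, pos = [], 0
    for length in structure:
        if length:
            parts.append(code[pos:pos + length])
            pos += length
    return parts
```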




3. For non-taxable scenarios, check the default jurisdiction code defined in transaction OBCL:


Enter I0 (for input tax), O0 (for output tax), and a dummy jurisdiction code for the relevant company code(s) using TAXUSX. The system also uses this dummy jurisdiction code for export business transactions (if you create a billing document in a country with tax jurisdiction codes while the goods recipient is in a different country, with or without tax jurisdiction codes). Please have a look at note 419124, where the export jurisdiction codes suggested by the different external tax interface partners are listed.




spro obcl.PNG





  4. For export scenarios, check the jurisdiction code assigned in the customer master data:


As stated in SAP Note 419124, the Vertex jurisdiction code for export transactions is '770000000'. For Taxware, the jurisdiction code is IT0000000.


Please check that all newly released notes are implemented:





2016990 - Export between United States and Canada

1628962 - Export with invalid tax jurisdiction

1768395 - Export with invalid tax jurisdiction (1)

1809374 - Export with invalid tax jurisdiction (2)

1899214 - Export with invalid tax jurisdiction (3)

2016058 - Export with invalid tax jurisdiction (4)


2095331 - Export with invalid tax jurisdiction (5)




5. Check the conditions used in SD:


Trigger conditions UTXD and UTXE are used in external tax calculation. Both conditions must be used together in the pricing procedure (for technical reasons). Condition UTXD has value formula 500 and condition UTXE has value formula 501. They are triggered only by function module PRICING_COMPLETE, and the RFC is called only once per document. This is called the MaxTax procedure (developed by FI), and it is supposed to be faster. For more details on the SD customizing, please check the SAP WIKI: Tax jurisdiction - ERP SD - SCN Wiki


6. Check whether the maintained jurisdiction code is valid:


Transaction SE37 -> RFC_DETERMINE_JURISDICTION -> Single Test (F8) -> Test Data Directory

In the field "RFC target sys", enter the RFC destination of the external tax system.
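If you prefer to perform the same check in code, the function module can also be called directly. A minimal ABAP sketch follows; the address values and the destination name 'VERTEX_RFC' are just examples, and the parameter names should be verified against the function interface in SE37:

```abap
DATA: ls_location TYPE com_jur,                      " address to be checked
      lt_results  TYPE STANDARD TABLE OF com_jur,    " candidate jurisdictions
      ls_error    TYPE com_err.

ls_location-country = 'US'.
ls_location-state   = 'PA'.
ls_location-zipcode = '19103'.

CALL FUNCTION 'RFC_DETERMINE_JURISDICTION'
  DESTINATION 'VERTEX_RFC'       " RFC destination of the external tax system
  EXPORTING
    location_data    = ls_location
  IMPORTING
    location_err     = ls_error
  TABLES
    location_results = lt_results.

IF ls_error IS NOT INITIAL.
  " Invalid or unknown address data: to be analyzed in the external tax system.
ENDIF.
```

If the table of results comes back empty or with an error, the jurisdiction code is not known to the external system, which is a frequent cause of FS 861.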




7. If the above customizing is correct, check in debug mode the parameters passed to the external tax system and the values received back:




  1. Go to transaction SE24.
  2. Open class CL_XTAX_RULES_RFC.
  3. Double-click method RFC_CALCULATE_TAXES_DOC.
  4. Scroll down until you see where the RFC_CALCULATE_TAXES_DOC function call is made.
  5. Set a breakpoint at CALL FUNCTION 'RFC_CALCULATE_TAXES_DOC' (to check the input parameters) and another one right after the call (to check the output parameters).
  6. Run the transaction again.
  7. Inspect the input structures: tax_cal_item_inxx contains the outgoing document values sent to the tax system. If you see wrong information here, it is caused by wrong customizing; in this case, please review in detail the configuration guide attached to SAP Note 392696 - R/3 Tax Interface Configuration Guide.
  8. Press F6 to step over the call.
  9. Now inspect the error structure and the output structures: tax_cal_item_outxx contains the incoming values from the tax system, and tax_cal_jur_level_outxx the jurisdiction levels. If you see an error here, it has to be analyzed by the company responsible for the external tax system.
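For orientation while debugging, the call you are looking for inside the method has roughly this shape. This is a simplified sketch: the variable names are invented, and the real parameter list is longer.

```abap
" Breakpoint 1 goes on this CALL FUNCTION statement (input parameters),
" breakpoint 2 on the first statement after it (output parameters).
CALL FUNCTION 'RFC_CALCULATE_TAXES_DOC'
  DESTINATION lv_rfcdest                     " external tax system destination
  TABLES
    tax_cal_item_in       = lt_item_in       " outgoing document values
    tax_cal_item_out      = lt_item_out      " incoming values from tax system
    tax_cal_jur_level_out = lt_jur_level_out " jurisdiction levels
  EXCEPTIONS
    communication_failure = 1
    system_failure        = 2.
```

Comparing lt_item_in (what SAP sends) with lt_item_out (what the tax system returns) usually makes it clear on which side the problem lies.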




I hope this helps!




Hi Colleagues,


Through the transactions or programs below, you can find out the number of customers, vendors, G/L accounts, assets, or standard FI documents in a client, by company code, period, chart of accounts, and so on. Basically, the programs read the master data tables, filtered by company code, chart of accounts, and so on.

RFAUDI01 - Number of Customer Master Records - S_ALR_87101051


The screenshot above indicates the number of customers according to the KNA1 table and the KNB1 table per company code.
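Technically, the counts correspond to selections like the following. This is only a sketch of the idea (assuming 7.40+ Open SQL syntax), not the actual report code:

```abap
" Total customers in the client (general data, table KNA1):
SELECT COUNT(*) FROM kna1 INTO @DATA(lv_customers_total).

" Customers per company code (company code data, table KNB1):
SELECT bukrs, COUNT(*) AS cnt
  FROM knb1
  GROUP BY bukrs
  INTO TABLE @DATA(lt_customers_per_cc).
```

The same pattern applies to the other RFAUDIxx reports, only with different master data tables.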

RFAUDI02 - Number of Vendor Master Records - S_ALR_87101052


The program above counts all vendors in the LFA1 table and can also split the count by company code.

RFAUDI03 - Number of G/L Master Records - S_ALR_87101049



The screenshot above indicates how many records there are in the SKA1 table per chart of accounts. Below that, it indicates how many records there are in the SKB1 table per company code within each chart of accounts. This is very useful when the client has many company codes assigned to a chart of accounts. There is no option to enter selection parameters.


RFAUDI04 - Number of Asset Master Records - S_ALR_87101050


Technically, the program counts the records in the ANLH table. There are no selection parameters, such as a restriction by company code.



RFAUDI07 - Number of Standard FI Documents - S_ALR_87101054 is shown below:



In this case, the program will check the BKPF table by company code, ledger, document type, period, and year, and you can enter a posting date to restrict the count to a specific period. Remember that standard documents are all documents that are not special G/L items, parked items, or noted items.
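The selection behind this count can be pictured like this (again a sketch assuming 7.40+ Open SQL; p_from and p_to stand for the posting date interval on the selection screen):

```abap
" Count standard FI documents in BKPF, grouped the way the report shows them:
SELECT bukrs, blart, gjahr, COUNT(*) AS cnt
  FROM bkpf
  WHERE budat BETWEEN @p_from AND @p_to   " optional posting date restriction
  GROUP BY bukrs, blart, gjahr
  INTO TABLE @DATA(lt_doc_counts).
```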


Also, you can run these transactions in background mode and get the list in SM37.




Hi Colleagues,


When you run transaction NGLM, many General Ledger settings are displayed for analysis, as in the screenshots below. You find detailed information on the organizational units: ledger, company code, controlling area, and operating concern. There is also a Document Splitting area, where you find options for the detailed analysis of the document splitting settings.


In the Data Analysis area, the monitor offers functions for the quantitative analysis of the transaction data updated in the line item tables, totals record tables, and document splitting tables of General Ledger Accounting. First, start the Run Data Analysis function as a background job. The analysis data is stored automatically in a data file, which you can then analyze using various criteria.


This helps, for example, when you want to check the Document Splitting characteristics, which chart of accounts is assigned, the fiscal year variant, the Document Splitting scenarios, which totals table is active per ledger, CO details, and many other settings.


Here are some steps:







The transaction also allows you to run related transactions, like Maintain Company Code (OBY6) and many others.



To view the Document Splitting characteristics:




And again, you can run transactions to maintain Document Splitting:




It is also possible to check the CO details:







