By deploying Simple Finance, IT can transform finance. In the introduction to this series, we covered the need for simplification and the role HANA plays.


Simple Finance is our industry-leading financial solution re-built to take advantage of SAP HANA. Perhaps the most significant change from this re-build concerns reconciliation. If we understand reconciliation and how Simple Finance eliminates it, we can communicate the benefits of Simple Finance to our colleagues in finance using an example that makes sense to them.


There are two general areas of accounting:


  1. Financial (FI). For external entities. Main reports are balance sheet and P&L statements.
  2. Controlling (CO). For internal reports to management, mainly focused on cost.


In software today, FI and CO are separate components or systems, historically thought of as independent areas. Certainly they have different structures and key figures. Executives, though, wanted a holistic view, and this meant a huge reconciliation challenge: understanding the differences between systems and bringing them together in a single ledger. Reconciliation is a massive, time-consuming effort that has to occur often:


  • Within components. Ensuring totals match up with underlying line-items.
  • Between components. Comparing figures between functions, like results in the P&L module to the cost-based profitability analysis module (CO-PA).


There have been improvements to make reconciliation easier, notably the New G/L (General Ledger) in ERP 2004. Ultimately, though, none of the improvements solved the underlying issue: detail is stored separately by all components (such as General Ledger, Controlling, Asset Accounting, Material Ledger, Profitability Analysis).


HANA’s most important capability is aggregating hundreds of millions of line items in one in-memory table within seconds. Thus, it is the ideal architecture to solve reconciliation. In Simple Finance, we combine all the data structures of the different components into one table: the Universal Journal. HANA’s columnar store, with its superior compression, makes this possible.


We can’t adequately describe the structure of the Universal Journal in a short article. The important thing for now is that HANA and the Universal Journal solve both reconciliation problems. Recall that with HANA, we no longer need redundant data (aggregates and indices) to do analysis, so the first issue of reconciliation within components is solved. All totals are derived on-the-fly from the line-items directly.
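To make the "totals on the fly" point concrete, here is a minimal sketch (not from the original article) of how a G/L balance that used to live in a totals table can be derived directly from Universal Journal line items. It assumes a release in which the Universal Journal is stored in table ACDOCA (ledger RLDNR, company code RBUKRS, account RACCT, amount in company code currency HSL); the selection values are illustrative only.

REPORT z_universal_journal_demo.

" Derive account balances on the fly from Universal Journal line items.
" No totals table is read; the aggregation runs in the HANA database.
PARAMETERS: p_bukrs TYPE bukrs DEFAULT '1000',
            p_gjahr TYPE gjahr DEFAULT '2015'.

SELECT racct,                    " G/L account
       SUM( hsl ) AS balance     " amount in company code currency
  FROM acdoca
  WHERE rldnr  = '0L'            " leading ledger
    AND rbukrs = @p_bukrs
    AND gjahr  = @p_gjahr
  GROUP BY racct
  ORDER BY racct
  INTO TABLE @DATA(lt_balances).

LOOP AT lt_balances INTO DATA(ls_balance).
  WRITE: / ls_balance-racct, ls_balance-balance.
ENDLOOP.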


[Image: HANA architecture before and after]


The more difficult reconciliation between components is solved as well. We merge the components in the Universal Journal to guarantee real-time integration. For example, with FI and CO combined logically in the Universal Journal, users drill down to the same line items from the key figures and reports of either component.


Since HANA provides unprecedented speed for multidimensional analysis, it is no longer necessary to replicate data to an OLAP system. Even where OLAP is still needed, ETL is much simpler from the Universal Journal than from multiple components.


All of this means dramatic simplification. In one case, a Fortune 500 early adopter cut 120 person-days per month from reconciliation efforts. This is time that can be spent on more strategic planning and analysis tasks.

Are you thinking about the next big accomplishment IT can provide for our colleagues in the finance department? You should be, because now with SAP Simple Finance, we have a great opportunity to transform how finance operates. One recent survey reported that 76% of finance executives think strategic planning will be the biggest area of new demand. You can’t act strategically, however, when so much time is spent on the tactical. Consider:


  • 70% of analytics effort is preparing data (IDC, Feb 2013)
  • 76% of global companies do not have financial performance data at the ready (Harvard Business Review, 2014)
  • 73% of executives think complexity is their biggest IT challenge (Forrester, 2013)


At SAP, we want to attack complexity in finance systems. That is why we took our industry-leading finance solution and re-built it to take full advantage of HANA. The result is Simple Finance. From an IT deployment point-of-view, you can think of Simple Finance as the next version of our finance solution.


I hear all the time from our customers: “What exactly makes Simple Finance simple? What have you simplified? Will it be worth the effort to upgrade?”


This is the first in a series of articles in which I will answer these questions. Each article will describe how Simple Finance transforms a key tactical process such as reconciliation, analysis, or closing. Simple Finance makes these processes simpler, freeing up finance to focus on the strategic. If you know about these processes, you can explain the benefits of Simple Finance to your finance colleagues using examples that are meaningful to them.


Before we look at specific finance problems, we must first understand the foundation of Simple Finance: HANA. You probably know that HANA at its core is an in-memory database engine. In-memory means that queries run exceedingly fast. But speed alone isn’t good enough. In some cases, it may be worth the upgrade to get queries to run faster. But we want to go further and use the speed to simplify the underlying architecture. How do we do this?


The simplest way to explain it is this: we put everything into one database engine in memory. Queries, transactions—all of it. Because HANA is so fast, and the column store lends itself to excellent compression, we can finally do this. Previously, database architects needed to separate transactions carefully from queries and design redundant data (indices, subtotals) to handle the query workload. Not anymore.


[Image: One in-memory source for transactions and queries]


This has a profound impact on finance systems. First, the database footprint is reduced dramatically since we no longer need redundant data. But more importantly for finance, it means we can combine data for two different tasks (transactions and queries) into one data store. This is the beginning of the simplification of finance systems. I hope you’ll read on in the series to see the specific examples, starting with Part 2.

Hi all,

Before implementing the 1042-S note for US Legal Change 2014, please implement Note 2095167 for the DDIC changes.

2095204 - US Legal Change 2014 - 1042S Form Reporting


Note 2095204 includes the attachment 'adobeform_upload.pdf' with instructions for changing the interface 'GS_IDWTCERT_US_1042'.

If you don't have the 2013 interface, please apply the following notes:

1949019 and 1949020 -> Smart forms
1949021 and 1949022 -> Adobe forms


I hope this helps to address your concerns about this subject.


Best Regards,

Manuela Valente.

Business need


Our finance department engaged with a local bank to accelerate vendor payments with a p-card program. Once set up on the bank's website, they would just need a simple comma-separated text file (CSV) with some basic data like our vendor number, the vendor's invoice number, the payment date, and the amount. This was the minimum requirement; there were several other optional fields, but for this pilot project, just these four were needed. Finance asked us to configure the new payment program and help them get a test file to the bank.
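To illustrate the target, the file could be as simple as the lines below. The column order, header row, and formatting here are made up for the example; the real layout is whatever the bank's file specification dictates.

VENDOR_NO,INVOICE_NO,PAYMENT_DATE,AMOUNT
0000104752,INV-20150311,04/15/2015,1250.00
0000104752,INV-20150347,04/15/2015,87.50
0000108891,556677,04/15/2015,10400.25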



Up to this point, we had been paying mainly with paper checks. A few vendors were accepting ACH payments, but not many. Both of these payment methods relied on the classic payment programs (RFFOUS_T and RFFOUS_C). As this new method would not print data to a form or output a standard file format like ACH, we would have to come up with a custom DME (Data Medium Exchange) solution. The plan had five steps:


  1. Configure a Data Medium Exchange format tree
  2. Configure a Payment Medium Format that calls the DME format tree
  3. Configure a variant in the standard program
  4. Associate the new payment method with some vendors
  5. Test


Create DME format tree

This was by far the easiest part. Using transaction code DMEE, I created a flat file format. The DME blog and SAP Help were surprisingly helpful here (see links below). Some of the key things to remember were the following:

  • Remember your format tree name because your Payment Medium Format has to be named the same thing. This hung me up for the longest time.
  • Set your field type to 1 and your segment delimiter to a comma to get a CSV output.
  • Don't forget to set the carriage return and line feed check boxes on the File Data tab. The comma is your field delimiter; the CR-LF will be your record delimiter.
  • A segment is like a row.
  • An element is the equivalent of a field. These elements are mapped to specific fields in the various payment structures (FPAYP, FPAYH, etc.).
  • A technical node is similar to an element, but it does not get output into the file. In my case, I had to create a technical node for the currency code so SAP would format my amounts correctly.
  • Conversion functions are available for elements to format currencies, dates, and other strings of text in any way you might need. Need to get rid of leading zeroes in a cost center field? There's a conversion function for that, and there's even a handy wizard to walk you through getting the correct one in plain English.


Configuring the payment medium format

This was a challenge before I carefully read the DME blog. I cannot overstate how helpful this blog was. Since I was going into this effort without much formal training in configuring the payment program, this no-nonsense, plain-language blog was much more useful to me than SAP's documentation, which I found cryptic at times. Some of the key lessons I learned were the following:

  • Create New Entries rather than copying from an existing format if your format is non-standard. If you are tweaking an existing format, copying makes sense. My first attempt was a disaster because I copied an ACH format and wondered why my output looked like an ACH file.
  • Your format name must be the same as your DME format tree name. This caused me no end of pain, too, when the payment program either produced nothing (without errors) or produced error messages that weren't very helpful. When you name both things the same, everything just falls into place.
  • Step through everything in FBZP so you don't miss anything. Every button; every setting -- check and recheck that you configured everything you needed to. If you miss a setting in FBZP, you won't get the output you want. Trust me.


Configure a variant in the standard program

Once SAP sees that you are trying to pay using the Payment Medium Workbench and DME, it's going to steer clear of the classic RFFOUS* programs and use SAPFPAYM instead. The variant you create will reference the payment medium format you configured in the previous step, tell the program to use DME to create the output, and specify a default directory and file name. This is just like setting up any other variant in SAP, but you have to take another step after this.


You must go to transaction OBPM4 and associate the variant with your payment medium format. If you skip this step, you will bang your head on your desk as much as I did, and that won't get you your output file at all. Only configuring OBPM4 does that.


Associate the payment method with vendors and test

Here is where you engage your business users and start testing. Once you start using the new payment method (for instance, by putting it on the vendor master record), then when you run F110, the DME magic happens.


Now what?

I have written here in very general terms about the configuration steps required to set up a custom, non-standard payment method, and I have tried to identify the painful lessons I learned doing it myself. There are many details and nuances I did not cover, but the DME blog covers them exhaustively. I look forward to your questions and comments below.






In the standard SAP process, the instruction key maintained in the vendor master (LFA1-DTAWS) is captured in REGUH-DTAWS when F110 or RFF110S is executed. As a next step, when SAPFPAYM (transaction FBPM) is executed, this value gets populated in the XML file.


For some reason, that does not seem to happen for me. I have identified SAP Note 2106391 (which includes 1574937 and 1779966) for this issue.


Has anyone come across a similar issue? Do these notes fix the issue?


Best regards



How do you decide upon the best approach to clear your check register in SAP for your company? Let me help to demystify this question and others over the next three weeks of postings. Stay tuned to find out that keeping the check register up to date doesn’t need to be a difficult exercise. I will provide three different options on how you can do this within your company.  But how do you decide which option makes the most sense for you?  Every company will have a different perspective, but here are some considerations for your project approach to managing the check register:


1.     Volume and Timing: How many checks are being issued from this account each month? Is receiving an encashment file once per month sufficient? At what point would potential error corrections become too large? Does managing errors become unwieldy when you receive a file only once a month? Perhaps it is easier to receive activity each day for reconciliation purposes.


2.     Cost: Do you have to pay extra to have the individual checks included on your bank statement file? What is the additional charge? Could you use the bank statement and eliminate the need for a separate interface containing only cashed checks?


3.     Monitoring: Who will perform the ongoing monitoring to ensure checks are being updated in the register? Will this be done as part of your bank reconciliation process, by your team supporting payments, or someone else? Who will make sure that either solution continues to work properly? Will this person participate in regression testing for the solution once it is live?


4.     Knowledge and Know-how: Does your technical team understand how to set up the different solutions? Do they understand the potential failure points? What will be your processes for ongoing monitoring?


5.     Staffing availability: Does your internal project team have the capacity to set up these interfaces? Often project teams have competing priorities with everything demanding top position in the project scope. If there is a constraint on internal staffing from either the business function or IT, perhaps it is best to update checks manually while other priorities are completed.


6.     Cost of compliance: In cases where the processes are completely manual, what is it costing you to comply with the various state escheatment laws? Would an automated interface make your compliance efforts easier and less expensive?

Now that the stage is set, over the next few weeks I will post how you can actually do this in SAP. 

Hello everyone,



The purpose of this blog is to help identify the root cause of error FS 861 when an external tax system (Sabrix, Vertex, or Taxware) is used.



Error FS 861 is a generic error returned by RFC_CALCULATE_TAXES_DOC when external tax calculation is used. The cause can be a wrong jurisdiction code, a wrong jurisdiction code structure, an unsupported country, and so on.



The steps below should be followed to identify the root cause of error FS 861.



1. Check whether the company code country is US, CA, or PR:



According to SAP Note 1738657, SAP supports external tax systems only for the country codes US, CA, and PR (Puerto Rico). This means the country of the company code must be one of these countries so that taxes are calculated using the external tax system and are also updated there.




2. Check jurisdiction code structure defined in transaction OBCO:








The default jurisdiction structure is:

Sabrix    = 2,2,5,5

Taxware = 2,5,2,0

Vertex    = 2,3,4,1




3. For non-taxable scenarios check the default jurisdiction code defined in transaction OBCL:


Enter I0 (for input tax), O0 (for output tax), and a dummy jurisdiction code for the relevant company code(s) using TAXUSX. The system also uses this dummy jurisdiction code for export business transactions (for example, if you create a billing document in a country with a tax jurisdiction code while the goods recipient is in a different country, with or without a tax jurisdiction code). Please have a look at Note 419124, where the suggested export jurisdiction codes used by the different external tax interface partners are listed.




[Image: OBCL settings in SPRO]





  4. For export scenarios, check the jurisdiction code assigned in the customer master data:


As stated in SAP Note 419124, the Vertex jurisdiction code for export transactions is '770000000'. For Taxware, the jurisdiction code is 'IT0000000'.


Please check that all newly released notes are implemented:





2016990 - Export between United States and Canada

1628962 - Export with invalid tax jurisdiction

1768395 - Export with invalid tax jurisdiction (1)

1809374 - Export with invalid tax jurisdiction (2)

1899214 - Export with invalid tax jurisdiction (3)

2016058 - Export with invalid tax jurisdiction (4)


2095331 - Export with invalid tax jurisdiction (5)




5. Check the conditions used in SD:


Trigger conditions UTXD and UTXE are used in external tax calculation. Both conditions must be used together in the pricing procedure (for technical reasons). Condition UTXD has value formula 500 and condition UTXE has value formula 501. They are triggered only by function module PRICING_COMPLETE, and the RFC is called only once per document. This is called the MaxTax procedure (developed by FI), and it is supposed to be faster. For more details on the related SD customizing, please check the SAP wiki: Tax jurisdiction - ERP SD - SCN Wiki


6. Check whether the jurisdiction code maintained is valid:


Transaction SE37 -> RFC_DETERMINE_JURISDICTION -> Single Test (F8) -> Test Data Directory

*** "in RFC Target Sys" inform the external tax system (RFC) destination***




7. If the above customizing is correct, check in debug mode the parameters passed to and received from the external tax system:




  1. Go to transaction SE24.
  2. Enter class CL_XTAX_RULES_RFC.
  3. Double-click the method RFC_CALCULATE_TAXES_DOC.
  4. Scroll down until you see where the RFC_CALCULATE_TAXES_DOC function call is made.
  5. Set one breakpoint at the CALL FUNCTION 'RFC_CALCULATE_TAXES_DOC' statement (to check the input parameters) and one after the call (to check the output parameters).
  6. Execute the transaction again.
  7. Inspect the input structures tax_cal_item_inxx (outgoing document values to the tax system). If you see wrong information here, it is caused by wrong customizing; in that case, please review in detail the configuration guide attached to SAP Note 392696 - R/3 Tax Interface Configuration Guide.
  8. Press F6 to step over the call.
  9. Now inspect the error structure and the output structures: tax_cal_item_outxx (incoming values from the tax system) and tax_cal_jur_level_outxx (jurisdiction levels). If you see an error here, it has to be analyzed by the company responsible for the external tax system.




I hope this helps!




Hi Colleagues,


Through the transactions and programs below, you can discover the number of customers, vendors, G/L accounts, assets, or normal FI documents in a client, broken down by company code, period, chart of accounts, and so on. Basically, the programs read the master data tables by company code, chart of accounts, and the like.

RFAUDI01 - Number of Customer Master Records - S_ALR_87101051


This report indicates the number of customers according to the KNA1 table, and per company code according to the KNB1 table.

RFAUDI02 - Number of Vendor Master Records - S_ALR_87101052


The program above counts all the vendors in LFA1 and can also break them down by company code.

RFAUDI03 - Number of G/L Master Records - S_ALR_87101049



This report indicates how many records there are in the SKA1 table per chart of accounts, and below that, how many records there are in the SKB1 table per company code within each chart of accounts. This is very useful when the client has many company codes assigned to one chart of accounts. There is no option to enter selection parameters.


RFAUDI04 - Number of Asset Master Records - S_ALR_87101050


Technically, the program counts the ANLH table. You do not have access to selection parameters, such as a search by company code.



RFAUDI07 - Number of Standard FI Documents - S_ALR_87101054 is shown below:



In this case, the program counts the BKPF table by company code, ledger, document type, period, and year, and you can enter a posting date to restrict the search to a specific period. Remember that normal documents are all documents that are not special G/L items, parked documents, or noted items.
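For the curious, the counting RFAUDI07 performs is conceptually similar to the aggregation sketched below. This is only an illustration of the idea, not the report's actual logic; in particular, the filter BSTAT = ' ' for "normal" documents and the grouping fields are my assumptions.

" Count FI documents in BKPF per company code and fiscal year.
" BSTAT = ' ' keeps normal documents only (parked and noted documents carry other statuses).
SELECT bukrs, gjahr, COUNT( * ) AS doc_count
  FROM bkpf
  WHERE bstat = ' '
  GROUP BY bukrs, gjahr
  ORDER BY bukrs, gjahr
  INTO TABLE @DATA(lt_doc_counts).

LOOP AT lt_doc_counts INTO DATA(ls_count).
  WRITE: / ls_count-bukrs, ls_count-gjahr, ls_count-doc_count.
ENDLOOP.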


Also, you can run these transactions in background mode and retrieve the list in SM37.




Hi Colleagues,


When you run transaction NGLM, many General Ledger settings are displayed for analysis, as in the screenshots below. You find detailed information on the organizational units: ledger, company code, controlling area, and operating concern. There is also a Document Splitting area, where you find options for the detailed analysis of the document splitting settings.


In the Data Analysis area, the monitor offers functions for the quantitative analysis of the transaction data updated in the line item tables, totals record tables, and document splitting tables of General Ledger Accounting. First start the Run Data Analysis function as a background job. The analysis data is stored automatically in a data file. You can then analyze it using various criteria.


This helps, for example, when you want to check the document splitting characteristics, which chart of accounts is assigned, the fiscal year variant, the document splitting scenarios, which totals table is active per ledger, CO details, and many other settings.


Here are some of the steps:







The transaction also allows you to run other related transactions, such as Maintain Company Code (OBY6) and many others.



To view Document Splitting characteristics




And again you can run transactions to maintain Document Splitting




It is also possible to check CO details







I attended this SAP webcast this morning and was surprised at the low attendance. There are many improvements in FI-CO, and no, I am not talking about "Simple Finance" but about the core FI-CO modules.


See below:


Customer Connection for Financials started last December


The usual Legal Disclaimer applies


Figure 1: Source: SAP


This Customer Connection started last December, and the workspace closed in February


SAP checked which ideas had the most subscriptions


They evaluated them during the selection phase


In April they held the selection call


Subscribing means you will use the improvements productively


Figure 2: Source: SAP


Figure 2 shows 17 improvement requests are delivered in CO


2 are in progress


2 are in handover


FI has 11 delivered, with 1 improvement request rejected


Figure 3: Source: SAP


Figure 3 shows one of the most popular requests, display controlling area


Figure 4: Source: SAP


Some of the more interesting notes (to me) listed in Figure 4 are below:


Figure 5: Source: SAP


Figure 5 shows an improvement for KOK5


Figure 6: Source: SAP


More CO improvements are listed in Figure 6


Figure 7: Source: SAP


I’m surprised at Figure 7, exporting to Excel from background jobs, but I understand it at the same time too, having done this many times myself.


This note was just released


Figure 8: Source: SAP


I think the most interesting improvement is the OB52 posting period automation and all the notes listed with it – I only list a few below:




Parallel opening of periods


Figure 9: Source: SAP


Figure 9 shows “display full user name”, but looking at the note, it says “Branch to user data”


Visit to find delivered improvements

Hello Everyone,



The idea of this blog is to summarize the most common actions for improving performance in FBRA, as well as the SAP notes related to each suggested topic.



-> If changing the selection parameters does not solve the issue, you can use transaction FBRA_LOAC as described in SAP Note 487347:


487347 - FBRA: Overflow of the lock table


This note suggests using transaction FBRA_LOAC as a workaround for problematic cases.

To avoid the lock table overflow, there are these possibilities:

1. Increase the profile parameter "enque/table_size", as per Note 13907

2. Reduce the BKORM entries as much as possible, i.e. for all company codes.
   The fewer entries that exist in table BKORM, the faster FBRA or
   FBRA_LOAC runs. Please refer to steps 2 and 3 of Note 487347.


Additionally, please review the following recommendations:

1.      Use report Z_ENQUEUE_PERF from Note 1320810 during both high and low load.

2.      Test changing the parameter enque/server/threadcount and setting enque/server/use_spinning to true.

3.      Run niping during both high and low load.

4.      Use the 'test' OK code in SM12, switch logging on for 15 seconds, run mass calls (starting with a thousand) during load, and check the STAD records for enqueue performance.

5.      During the problem time, check for rejects in SM12. If refreshing shows increases in the hundreds, this points more to an application issue.

6.      Consider moving the standalone enqueue server to a dedicated system.

7.      Continue application analysis.




-> If you are receiving runtime error TSV_TNEW_PAGE_ALLOC_FAILED in transaction FBRA, it can be that table BKORM has too many entries.

This table can contain millions of old and very old correspondence requests that have never been printed and therefore have never been deleted from this table.

If you don't print your correspondence requests, you have to reorganize table BKORM from time to time by means of report SAPF140D, which each user can reach via transaction F.63. Normally, BKORM should contain only current correspondence requests that have not yet been printed, plus already printed requests no older than 14 days (kept for a possible reprint). There is no need to keep printed correspondence in BKORM for a long time.
Many of these correspondence requests were created a long time ago, and you do not intend to print and send them. Such correspondence requests can be safely deleted. Please take the following precautions (a minimal job-scheduling sketch follows below):
  1. Run SAPF140D according to Note 17831 on a regular basis (every two days, or at least weekly) to delete from BKORM all accounting correspondence requests that have been finished for a while (e.g. one week).
  2. Use the same report to delete all accounting correspondence requests that have no expiry date at all ('entries without print date' under 'Further selections'). It is also possible to run the report in test mode.
  3. In the future, use SAPF140 as follows: on the selection screen, under 'Program Control', there is a field called 'Delete if finished since'. Fill this field with a certain number of days (e.g. 7). You could create a variant with this field set and then always use that variant for SAPF140.
If the entries in BKORM are reduced to a reasonable size, FBRA should work fine.
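Here is the job-scheduling sketch mentioned above. It assumes a variant for SAPF140D (called 'Z_WEEKLY' here, a made-up name) has already been created with the deletion options described in the precautions; the JOB_OPEN / SUBMIT / JOB_CLOSE sequence is the standard way to run it in the background from code. For a recurring run, simply defining the job in SM36 with a daily or weekly period is usually the easier route.

DATA: lv_jobname  TYPE tbtcjob-jobname VALUE 'Z_BKORM_REORG',
      lv_jobcount TYPE tbtcjob-jobcount.

" Create the background job
CALL FUNCTION 'JOB_OPEN'
  EXPORTING
    jobname  = lv_jobname
  IMPORTING
    jobcount = lv_jobcount.

" Add a step that runs SAPF140D with the prepared deletion variant
SUBMIT sapf140d USING SELECTION-SET 'Z_WEEKLY'
       VIA JOB lv_jobname NUMBER lv_jobcount
       AND RETURN.

" Release the job for immediate start (use SDLSTRTDT/SDLSTRTTM for a scheduled start)
CALL FUNCTION 'JOB_CLOSE'
  EXPORTING
    jobname   = lv_jobname
    jobcount  = lv_jobcount
    strtimmed = abap_true.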



I hope this helps!



Kind Regards,


Hi guys,


Just a tip about a posting scenario where I had doubts about why the posting was not going through the BAdI.



If the posting is a down payment request or a noted item, you must complement the BAPI_ACC_DOCUMENT_POST call using:


    WHEN 'DP'.

      ls_bapiache09-obj_type = 'BKPFF'.

      ls_extension2-structure = space.

      ls_extension2-valuepart1 = 'BUS_ACT'.

      ls_extension2-valuepart2 = 'RFST'.
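For context, here is a rough, self-contained sketch of a full call that uses this EXTENSION2 marker. The header and item values are illustrative placeholders only; the point is that the extension row built above is appended to the EXTENSION2 table of BAPI_ACC_DOCUMENT_POST, which is exactly what the BAdI implementation shown next looks for.

DATA: ls_bapiache09 TYPE bapiache09,                      " document header
      ls_extension2 TYPE bapiparex,
      lt_extension2 TYPE STANDARD TABLE OF bapiparex,
      lt_payable    TYPE STANDARD TABLE OF bapiacap09,    " vendor items
      lt_amount     TYPE STANDARD TABLE OF bapiaccr09,    " currency amounts
      lt_return     TYPE STANDARD TABLE OF bapiret2.

" Illustrative header values only
ls_bapiache09-obj_type   = 'BKPFF'.
ls_bapiache09-username   = sy-uname.
ls_bapiache09-comp_code  = '1000'.
ls_bapiache09-doc_date   = sy-datum.
ls_bapiache09-pstng_date = sy-datum.

" Marker evaluated later in the CHANGE method of BADI_ACC_DOCUMENT
ls_extension2-structure  = space.
ls_extension2-valuepart1 = 'BUS_ACT'.
ls_extension2-valuepart2 = 'RFST'.
APPEND ls_extension2 TO lt_extension2.

" ... fill lt_payable / lt_amount with the down payment request or noted item line(s) ...

CALL FUNCTION 'BAPI_ACC_DOCUMENT_POST'
  EXPORTING
    documentheader = ls_bapiache09
  TABLES
    accountpayable = lt_payable
    currencyamount = lt_amount
    extension2     = lt_extension2
    return         = lt_return.

" Commit only if no error message came back
READ TABLE lt_return TRANSPORTING NO FIELDS WITH KEY type = 'E'.
IF sy-subrc <> 0.
  CALL FUNCTION 'BAPI_TRANSACTION_COMMIT'
    EXPORTING
      wait = abap_true.
ENDIF.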



Implement the BADI_ACC_DOCUMENT:



The BADI_ACC_DOCUMENT has to have the following filter:



And the method CHANGE should look like this:


METHOD if_ex_acc_document~change.

  FIELD-SYMBOLS <accit> TYPE accit.

  DATA ls_extension2 TYPE bapiparex.

  IF c_extension2 IS NOT INITIAL.

    READ TABLE c_extension2 INDEX 1 INTO ls_extension2.

    IF ls_extension2-valuepart1 = 'BUS_ACT' AND ls_extension2-valuepart2 = 'RFST'.

      " Set the business transaction for the statistical posting
      c_acchd-glvor = 'RFST'.

      " Flag every line item as a noted item
      LOOP AT c_accit ASSIGNING <accit>.
        <accit>-bstat = 'S'.
      ENDLOOP.

    ENDIF.

  ENDIF.

ENDMETHOD.





Hello everyone,




In 2014, SAP released Notes 2092366 and 2083799, which might be important for the document parking via BAPI scenario.


Previously, document parking with the accounting BAPIs could only be achieved using a custom enhancement (a BTE or BAdI implementation).


With Note 2092366, SAP now officially supports parking documents using the accounting BAPI, but only under the restrictions listed in Notes 2092366 and 2021422: in this scenario, no manipulation of TCODE is allowed, and it works only for object type BKPFF.


The official solution in Note 2092366 is only delivered in support packages. A manual implementation using SNOTE is not intended; this is to guarantee that all previous notes regarding this functionality are available in the customer system.


The more important note for you might be 2083799. This SAP Note describes various posting processes and specifies which data must be transferred to the BAPI to post each one. The idea is to document in this note all posting scenarios supported by the BAPIs. The developers are still working on this note, so if a customer wants to post a scenario that is not yet documented there, it does not mean the BAPI does not support it.


Related notes:

2092366               Parking with BAPI_ACC_DOCUMENT_POST

2083799               Composite SAP Note: Postings with Accounting BAPIs

2021422               Vorerfassung über RWIN mit unerlaubtem TCODE oder AWTYP (Parking via RWIN with a disallowed TCODE or AWTYP; not translated yet)



Note related to parking documents using the BAPI, released in 2015:

2229113 - Using BAPI_ACC_DOCUMENT_POST to park a document with multilevel tax code



I hope it helps! If you have any suggestions, please let me know!





Hi all,


New notes have just been released for 1099 reporting, in which print form layouts in Smart Forms and PDF format are available.


Two mainstream notes are available:



2074358 - US TAX REPORTS - PRINT FORMS LAYOUTS 1099-MISC, 1099-G, 1099-INT, 1099-K, 1042-S




2074359 - US TAX REPORTS - PRINT FORMS LAYOUTS 1099-MISC, 1099-G, 1099-INT, 1099-K, 1042-S

for customers on releases 46C, 470, and 500.


Those notes are announcements; they will be updated when the final notes with corrections and updated forms and files are released.


They will be the main channel through which new corrections and notes related to 1099 are maintained as they are released.


Please follow those notes for the latest updates on the 1099 print form layouts for tax year 2014, to be submitted in 2015.





I hope this helps to address your concerns about this subject.


Danton Prestes

Head Office and Branch Accounts


In some industries, branches of a company sell their goods independently, but the accounting for these sales is performed centrally (at the head office). You can represent this type of organizational structure in the R/3 System by using head office and branch accounts.

First you need to create head office and branch accounts. The sales orders are managed in the branch account. The sales and transaction figures, however, are not posted to this account but rather automatically to the head office account. Payments are cleared centrally by the head office, meaning that outgoing payments can be made for several branches in one step, using the head office account.




Link between Branch Accounts and Head Office Account

To link branch accounts to a head office account, you must enter the number of the head office account in the Head office field in the branch account master record. This field is contained in the company code area of the master record.

The head office account can be any vendor account except one-time accounts or branch accounts themselves. Branch accounts and head office accounts must belong to the same company code.










Line Item Display

When you are entering the parameters for line item display, you should note the following: for head office accounts, enter the key 004 in the field Sort key. This instructs the system to display the line items for the head office account sorted by branch. This key is defined in the table for allocation rules.
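Because sort key 004 puts the branch account number into the assignment field of every line item posted to the head office, a quick way to see open payables per branch is to aggregate on that field. The following is only a sketch; it assumes the open-item index table BSIK and that the assignment field (ZUONR) really holds the branch, which is only the case when sort key 004 is maintained as described above, and it ignores debit/credit signs and currencies for simplicity.

PARAMETERS: p_bukrs TYPE bukrs,
            p_lifnr TYPE lifnr.    " head office vendor account

" Open vendor items on the head office account, grouped by branch
" (sort key 004 fills the assignment field ZUONR with the branch account number).
SELECT zuonr AS branch,
       COUNT( * )   AS item_count,
       SUM( dmbtr ) AS amount_lc   " amount in local currency
  FROM bsik
  WHERE bukrs = @p_bukrs
    AND lifnr = @p_lifnr
  GROUP BY zuonr
  ORDER BY zuonr
  INTO TABLE @DATA(lt_branch_totals).

LOOP AT lt_branch_totals INTO DATA(ls_total).
  WRITE: / ls_total-branch, ls_total-item_count, ls_total-amount_lc.
ENDLOOP.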

Invoice posting to branch vendor






Document gets posted in the head office with branch in assignment field



FBL1n shows in head office account



FBL1N for the branch account gives the message below




FBL1N for all branches under the head office



Payment run



FBL1N after payments



You can set up your system to handle written correspondence with vendors a) for the head office, broken down per branch, or b) for each branch individually. If you want to create correspondence (such as dunning notices and account statements) for the individual branches instead of the head office, you have to select the Local processing field on the correspondence screen in the vendor master record of the head office.


You can also define payment methods in the master records of the branches and head offices. For example, if you want to have certain payment methods for particular branches, enter these in the master records of the branches concerned and do not enter any payment method in the head office master record. If you enter payment methods in both head office and branch master records, all payment methods are possible.



Head office customer



Branch office


Invoice posting to branch customer




Posted successfully to the head office customer with branch details



Dunning run for the head office customer


Head office customer successfully dunned


Important Notes :


> At branch account level, the system will not allow you to check clearing with customers or vendors. This has to be done at head office level.


> All the dunning notices related to the branch go to the head office account, and the payments (both incoming and outgoing) are made by the head office. However, if you want the dunning and payment programs to use the branch account instead, you need to select the "Decentralized processing" field on the "Correspondence" tab in the head office master data.

