Hello,

 

 

Frequently I receive questions about the difference between standard parallel buffering and buffering in pseudo ascending order in FI, so I decided to share some findings on this topic:

 

 

 

There are 2 differences.

 

 

1.a) Parallel buffering uses the number of numbers in the buffer as maintained in transaction SNRO.

1.b) Buffering in pseudo ascending order always uses numbers in buffer = 1. This value is hard-coded for buffering in pseudo ascending order.

 

 

 

 

Usually parallel buffering with numbers in buffer = 1 solves the performance / lock wait issues for RF_BELEG.

 

 

2.a) Parallel buffering: in case of a rollback, the number(s) will be reused.

2.b) Buffering in pseudo ascending order: in case of a rollback, the number(s) will not be used again; otherwise the number assignment would no longer be chronological.
Numbers which were not used are transferred to table NRIV_RESTE. So with pseudo ascending order you may have gaps in BKPF. These are not real gaps, however, as report RSSNR0S1 documents them.
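If you want a quick impression of how many unused numbers have been parked for RF_BELEG, a small check on NRIV_RESTE can complement report RSSNR0S1. This is only a minimal sketch; the report name is hypothetical and the selection on field OBJECT is an assumption about the table key, so verify the field names in SE11 before using it:

REPORT z_check_nriv_reste.

DATA lv_gaps TYPE i.

" count the 'gap' entries parked for the FI document number range object
SELECT COUNT( * ) FROM nriv_reste
  INTO lv_gaps
  WHERE object = 'RF_BELEG'.     " field OBJECT assumed as part of the table key

WRITE: / 'Unused RF_BELEG numbers recorded in NRIV_RESTE:', lv_gaps.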

 

 

 

For an overall explanation of number ranges in FI, I usually check the following wikis. They link the most relevant notes for each specific topic:

 

NUMBER RANGE: BUFFERS (Blog)

NUMBER RANGE : Buffering

NUMBER RANGE : Gaps in Number Range

NUMBER RANGE : General Information

NUMBER RANGE: DEBUG and CREATION

 

 

 

The SAP note below is an FAQ note and also very useful:

1398444 - Buffering the document number assignment for RF_BELEG

 

 

 

I hope this can help!

 

 

Regards,

Raquel

In this post we will look at the convergence of two long-standing pieces of SAP ERP finance master data: the G/L account and the cost element. As anyone who has worked with SAP CO knows, the cost element is key to the controlling side of SAP. It is an object that identifies the type of activities that can be performed in controlling with that account. Cost elements are generally divided into Primary and Secondary cost elements.

 

  • Primary cost elements have an associated GL account and are generally expense or revenue accounts.
  • Secondary cost elements exist only in CO and are used for internal settlements, assessments, and allocations. 

 

When creating a new revenue or expense account in the G/L, you also had to create a corresponding cost element in CO, where typically all you were doing was selecting a Cost Element Category.

 

With the Simple Finance Add-on 2.0 (now called On Prem 1503) the traditional cost element create, change and display transactions are gone. The functions have been combined in FS00 - Manage G/L Account Centrally. This greatly simplifies the act of creating a new account and eliminates the need to maintain separate master records.

 

On the Type/Description screen there is a new field called Account Type. If you select either "Primary Costs or Revenue" or "Secondary Costs", a new field will appear on the Control Data tab for you to enter the cost element category.

 

Primary Cost Example:

 

FS00 screen 1.jpg

 

FS00 screen 2.jpg


Secondary Cost Example:

 

 

FS00 screen 3.jpg

FS00 screen 4.jpg

 

I think this is a good step forward in simplifying the SAP finance master data and driving the convergence of FI and CO.

SUMIT MISHRA

Recurring Entries

Posted by SUMIT MISHRA Mar 12, 2015

What are Recurring Entries?

 

 

The recurring entries function allows the business to create accounting entries automatically based on predefined parameters.

 

 

Once recurring entries are created, they are posted in the SAP system according to the schedule defined by the business.

 

 

Use of this functionality is recommended only if the account assignment objects and general ledger accounts do not change when the document is posted.

 

 

Recurring entries can be used for General Ledger, Accounts Payable, and Accounts Receivable postings, so this functionality can cover a variety of recurring document posting requirements.

 

 

For example, if the business needs a rent payment of $1,000 executed each month, we can create a recurring entry to post that expense.

 

 

If changes are needed, they can be made, and the recurring entry functionality tracks all of them, so we can see every change that has been done.

 

 

How can we set up recurring entries in SAP?

 

 

The process to create recurring entries in SAP is straightforward and can be used for a variety of purposes.

 

 

The example shown in this document is just a reference to show how to proceed with the process.

 

 

You can create a recurring document for anything, such as a G/L, AP, or AR posting; the process will be similar to the one displayed in the screenshots below.

 

 

Create the recurring document (please note that the recurring document number range is X1).

Transaction code FBD1

Recurring1.png

 

Recurring2.png

 

 

 

Recurring3.png

Recurring4.png

 

 

To display recurring documents, use transaction code FBD3.

 

Recurring6.png

 

 

Create the recurring documents in the books: transaction code F.14

You need to fill in the details of the recurring document created earlier.

If a schedule has been defined, the recurring documents will be created as per the schedule.

Press Execute as shown in the screenshot below and the session will be created.

 

Recurring7.png

You can see the batch input session of the recurring document using transaction code SM35.
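If the business wants the documents created without anyone starting F.14 manually, the run can also be scheduled as a periodic background job. A minimal sketch, assuming you have saved a variant (the name 'ZRENT_MONTH' here is hypothetical) for program SAPF120, the program behind F.14:

" schedule this call (or F.14 directly via SM36) as a monthly background job
SUBMIT sapf120
  USING SELECTION-SET 'ZRENT_MONTH'   " hypothetical variant holding the recurring document selection
  AND RETURN.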

 

 

Once the session is completed, the recurring document is created in the General Ledger. You can display it using transaction code FB03.

 

Recurring 8.png


Changes In Recurring document

 

For example, if a situation arises where we need to change a recurring document:

Execute transaction code FBD2; here we changed the payment terms from ZK26 to ZK25 and saved.

 

Recurring9.png

 

Please note:

Sometimes the business needs to see the change log of recurring documents. This can be displayed using transaction code FBD4, which tracks all the changes made to the recurring document in the past.

The screen below shows, for reference, the changes made to another recurring document not used in the example above.

 

 

Recurring 10.png

Conclusion: Recurring entries are a very helpful way to post repetitive accounting/invoice postings easily and accurately, and they save end users the time of doing this activity every month.

This functionality is mostly used for booking accruals if they are the same each month, rent or lease payments, utility bills, etc.


By deploying Simple Finance, IT can bring transformation to finance. In the introduction to this series, we learned the need for simplification and the role HANA plays.

 

Simple Finance is our industry-leading financial solution re-built to take advantage of SAP HANA. Perhaps the most significant change from this re-build is with reconciliation. If we understand reconciliation and how Simple Finance eliminates it, we can communicate Simple Finance benefits to our colleagues in finance using an example that makes sense to them.

 

There are two general areas of accounting:

 

  1. Financial (FI). For external entities. Main reports are balance sheet and P&L statements.
  2. Controlling (CO). For internal reports to management, mainly focused on cost.

 

In software today, FI and CO are separate components or systems, historically thought to be independent areas. Certainly they have different structures and key figures. Executives, though, wanted a holistic view, and this meant a huge reconciliation challenge to understand the differences between systems, and bring them together in a single ledger. Reconciliation is a massive, time-consuming effort that has to occur often:

 

  • Within components. Ensuring totals match up with underlying line-items.
  • Between components. Comparing figures between functions, like results in the P&L module to the cost-based profitability analysis module (CO-PA).

 

There have been improvements to make reconciliation easier, notably the New G/L (General Ledger) in ERP 2004. Ultimately, though, none of the improvements solved the underlying issue: detail is stored separately by all components (such as General Ledger, Controlling, Asset Accounting, Material Ledger, Profitability Analysis).

 

HANA’s most important capability is aggregating within seconds hundreds of millions of items in one table in memory. Thus, it is the ideal architecture to solve reconciliation. In Simple Finance, we combine all the data structures of the different components into one table: the Universal Journal. HANA’s columnar store with superior compression makes this possible.

 

We can’t adequately describe the structure of the Universal Journal in a short article. The important thing for now is that HANA and the Universal Journal solve both reconciliation problems. Recall that with HANA, we no longer need redundant data (aggregates and indices) to do analysis, so the first issue of reconciliation within components is solved. All totals are derived on-the-fly from the line-items directly.
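To make this concrete, here is a minimal sketch of what "totals on-the-fly" looks like against the Universal Journal line-item table (ACDOCA in Simple Finance). The ledger '0L', company code '1000', and fiscal year are placeholder values, and the field names (RLDNR, RBUKRS, GJAHR, RACCT, HSL) should be verified in your release:

" account balances aggregated directly from the line items - no totals table needed
SELECT racct, SUM( hsl ) AS balance
  FROM acdoca
  WHERE rldnr  = '0L'          " leading ledger (example)
    AND rbukrs = '1000'        " company code (example)
    AND gjahr  = '2015'
  GROUP BY racct
  INTO TABLE @DATA(lt_balances).

cl_demo_output=>display( lt_balances ).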

 

HANA before after.jpg

 

The more difficult reconciliation between components is solved as well. We merge the components in the Universal Journal to guarantee real-time integration. For example, with FI and CO combined logically in the Universal Journal, users drill down to the same line items from the key figures and reports of either component.

 

Since HANA provides unprecedented speed for multidimensional analysis, it is no longer necessary to replicate data to OLAP. Even should OLAP be needed, ETL is much simpler from the Universal Journal instead of multiple components.

 

All of this means dramatic simplification. In one case, a Fortune 500 early adopter cut 120 person-days per month from reconciliation efforts. This is time that can be spent on more strategic planning and analysis tasks.

Are you thinking about the next big accomplishment IT can provide for our colleagues in the finance department? You should be, because now with SAP Simple Finance, we have a great opportunity to transform how finance operates.

 

CFO.com reported recently that 76% of finance executives think strategic planning will be the biggest area of new demand. You can’t act strategically, however, when so much time is spent on the tactical. Consider:

 

  • 70% of analytics effort is preparing data (IDC, Feb 2013)
  • 76% of global companies do not have financial performance data at the ready (Harvard Business Review, 2014)
  • 73% of executives think complexity is their biggest IT challenge (Forrester, 2013)

 

At SAP, we want to attack complexity in finance systems. That is why we took our industry-leading finance solution and re-built it to take full advantage of HANA. The result is Simple Finance. From an IT deployment point-of-view, you can think of Simple Finance as the next version of our finance solution.

 

I hear all the time from our customers: “What exactly makes Simple Finance simple? What have you simplified? Will it be worth the effort to upgrade?”

 

This is the first in a series of articles where I will answer these questions. Each article will describe how Simple Finance transforms a key tactical process like reconciliation, analysis, and closing. Simple Finance makes these processes simpler, freeing up finance to focus on the strategic. If you know about these processes, you can explain the benefits of Simple Finance to your finance colleagues in examples that are meaningful to them.

 

Before we look at specific finance problems, we must first understand the foundation of Simple Finance: HANA. You probably know that HANA at its core is an in-memory database engine. In-memory means that queries run exceedingly fast. But speed alone isn’t good enough. In some cases, it may be worth the upgrade to get queries to run faster. But we want to go further and use the speed to simplify the underlying architecture. How do we do this?

 

The simplest way to explain it is this: we put everything into one database engine in memory. Queries, transactions—all of it. Because HANA is so fast, and the column store lends itself to excellent compression, we can finally do this. Previously, database architects needed to separate transactions carefully from queries and design redundant data (indices, subtotals) to handle the query workload. Not anymore.

 

HANA one source.jpg

 

This has a profound impact on finance systems. First, the database footprint is reduced dramatically since we no longer need redundant data. But more importantly for finance, it means we can combine data for two different tasks—transactions and queries—into one data store. This is the beginning of the simplification of finance systems. I hope you’ll read on in the series to see the specific examples. Part 2.

Hi all,


Before implementing the 1042S note regarding US Legal Change 2014, please implement note 2095167 for the DDIC changes.


2095204 - US Legal Change 2014 - 1042S Form Reporting

 

In note 2095204 you will find 'adobeform_upload.pdf' with instructions to change the interface 'GS_IDWTCERT_US_1042'.

If you don't have the 2013 interface, please apply these notes:


1949019 and 1949020 -> Smart forms
1949021 and 1949022 -> Adobe forms

 

I hope it helps to address your concern about this subject.

 

Best Regards,

Manuela Valente.

Business need

 

Our finance department engaged with a local bank to accelerate vendor payments with a p-card program. Once set up on the bank's website, they would just need a simple comma-separated text file (CSV) with some basic data like our vendor number, the vendor's invoice number, the payment date, and the amount. This was the minimum requirement; there were several other optional fields, but for this pilot project, just these four were needed. Finance asked us to configure the new payment program and help them get a test file to the bank.

 

Concept

Up to this point, we had been using mainly paper checks. A few vendors were accepting ACH payments, but not many. Both of these payment methods relied on the classic payment programs (RFFOUS_T and RFFOUS_C). As this would not be printing data to a form, nor would it be outputting a standard file format like ACH, we would have to come up with a custom DME solution.

Steps

  1. Configure a Data Medium Exchange format tree
  2. Configure a Payment Medium Format that calls the DME format tree
  3. Configure a variant in the standard program
  4. Associate the new payment method with some vendors
  5. Test

 

Create DME format tree

This was by far the easiest thing. Using transaction code DMEE, I created a flat file format. The DME Blog and SAP Help were surprisingly helpful in this (see links below). Some of the key things to remember were the following:

  • Remember your format tree name because your Payment Medium Format has to be named the same thing. This hung me up for the longest time.
  • Set your field type to 1 and your segment delimiter to a comma to get a CSV output.
  • Don't forget to set the carriage return and line feed check boxes on the File Data tab. Whereas the comma was your field delimiter, the CR-LF will be your record delimiter.
  • A segment is like a row.
  • An element is the equivalent to a field. These elements are mapped to specific fields in the various payment structures (FPAYP, FPAYH, etc.)
  • A technical node is similar to an element, but it does not get output into the file. In my case, I had to create a technical node for the currency code so SAP would format my amounts correctly.
  • Conversion functions are available for elements to format currencies, dates, and other strings of text in any way you might need. Need to get rid of leading zeroes in a cost center field? There's a conversion function for that, and there's even a handy wizard to walk you through getting the correct one in plain English (see the sketch after this list).
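For comparison, the DMEE conversion function for removing leading zeroes does essentially what the standard ALPHA conversion exit does in ABAP. A minimal sketch; the cost center value is just an example:

DATA: lv_kostl_in  TYPE kostl VALUE '0000410100',
      lv_kostl_out TYPE kostl.

" strip leading zeroes the same way the DMEE conversion function would
CALL FUNCTION 'CONVERSION_EXIT_ALPHA_OUTPUT'
  EXPORTING
    input  = lv_kostl_in
  IMPORTING
    output = lv_kostl_out.

WRITE: / lv_kostl_out.   " -> 410100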

 

Configuring the payment medium format

This was a challenge before I carefully read the DME blog. I cannot overstate how helpful this blog was. Since I was going into this effort without much formal training in configuring the payment program, this no-nonsense, plain-language blog was much more useful to me than SAP's documentation, which I found cryptic at times. Some of the key lessons I learned were the following:

  • Create New Entries rather than copying from an existing format if your format is non-standard. If you are tweaking an existing format, copying makes sense. My first attempt was a disaster because I copied an ACH format and wondered why my output looked like an ACH file.
  • Your format name must be the same as your DME format tree name. This caused me no end of pain, too, when the payment program either produced nothing (without errors) or produced error messages that weren't very helpful. When you name both things the same, everything just falls into place.
  • Step through everything in FBZP so you don't miss anything. Every button; every setting -- check and recheck that you configured everything you needed to. If you miss a setting in FBZP, you won't get the output you want. Trust me.

 

Configure a variant in the standard program

Once SAP sees that you are trying to pay using the Payment Medium Workbench and DME, it's going to steer clear of the classic RFFOUS* programs and use SAPFPAYM instead. The variant you create will reference the payment medium format you configured in the previous step, tell the program to use DME to create the output, and specify a default directory and file name. This is just like setting up any other variant in SAP, but you have to take another step after this.

 

You must go to transaction OBPM4 and associate the variant with your payment medium format. You must, or you will bang your head on your desk as much as I did; the head-banging doesn't get you your output file at all. Only configuring OBPM4 does that.

 

Associate the payment method with vendors and test

Here is where you engage your business users and start testing. Once you start using the new payment method (for instance, putting it on the vendor master record), then when you run F110, the DME magic happens.

 

Now what?

I have written here in very general terms about the configuration steps required to configure a custom, non-standard payment method. I have tried to identify the painful lessons I learned trying to do it myself. There are many details and nuances I did not cover, but that the DME blog covers exhaustively. I look forward to your questions and comments below.

 

 

References


Hi

 

In the standard SAP process, the instruction key maintained in the vendor master (LFA1-DTAWS) is captured into REGUH-DTAWS when F110 or RFF110S is executed. As a next step, when SAPFPAYM (transaction code FBPM) is executed, this value is populated in the XML file.
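A quick way to see whether the instruction key actually reached the payment data is to look at REGUH for the payment run in question. A minimal sketch: the run date and identification are example values, and the field names (LAUFD, LAUFI, LIFNR, DTAWS) are taken from the standard table and should be verified in SE11:

DATA ls_reguh TYPE reguh.

" list vendor and instruction key for one payment run (example run date / ID)
SELECT * FROM reguh
  INTO ls_reguh
  UP TO 50 ROWS
  WHERE laufd = '20150301'
    AND laufi = 'TEST1'.
  WRITE: / ls_reguh-lifnr, ls_reguh-dtaws.
ENDSELECT.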

 

For some reason that does not seem to happen for me. I have identified SAP note 2106391 (which includes 1574937 and 1779966) for this issue.

 

Did someone come across a similar issue? Do these notes fix it?

 

Best regards

 

Dasaradh



How do you decide upon the best approach to clear your check register in SAP for your company? Let me help to demystify this question and others over the next three weeks of postings. Stay tuned to find out that keeping the check register up to date doesn’t need to be a difficult exercise. I will provide three different options on how you can do this within your company.  But how do you decide which option makes the most sense for you?  Every company will have a different perspective, but here are some considerations for your project approach to managing the check register:

 

1.     Volume and Timing: How many checks are being issued from this account each month? Is receiving an encashment file once per month sufficient? At what point would potential error corrections be too large? Does the volume become too large by having a file only once a month to manage errors? Perhaps it is easier to receive activity each day for reconciliation purposes.

 

2.     Cost: Do you have to pay extra to have the individual checks included on your bank statement file? What is the additional charge? Could you use the bank statement and eliminate the need to have a separate interface for only cashed checks?

 

3.     Monitoring: Who will perform the ongoing monitoring to ensure checks are being updated in the register? Will this be done as part of your bank reconciliation process, by your team supporting payments, or someone else? Who will make sure that either solution continues to work properly? Will this person participate in regression testing for the solution once it is live?

 

4.     Knowledge and Know-how: Does your technical team understand how to set up the different solutions? Do they understand the potential failure points? What will be your processes for ongoing monitoring?

 

5.     Staffing availability: Does your internal project team have the capacity to set up these interfaces? Often project teams have competing priorities with everything demanding top position in the project scope. If there is a constraint on internal staffing from either the business function or IT, perhaps it is best to update checks manually while other priorities are completed.

 

6.     Cost of compliance: In cases where the processes are completely manual, what is it costing you to comply with the various state escheatment laws? Would an automated interface make your compliance efforts easier and less expensive?


Now that the stage is set, over the next few weeks I will post how you can actually do this in SAP. 

Hello everyone,

 

 

The purpose of this blog is to help identify the root cause of error FS 861 when an external tax system (SABRIX, VERTEX, or TAXWARE) is used.

 

 

Error FS 861 can be a generic error resulting from RFC_CALCULATE_TAXES_DOC when external tax calculation is used. The cause can be a wrong jurisdiction code, a wrong jurisdiction code structure, an unsupported country, and so on.

 

 

The steps below should be followed to identify the root cause of error FS 861.

 

 

1. Check if the company code country is US, CA, or PR:

 

 

According to SAP Note 1738657, SAP supports external tax systems only for the country codes US, CA and PR (Puerto Rico). This means the country of the company code must be one of these countries so that taxes are calculated using the external tax system and are also updated there.

 

 

 

2. Check the jurisdiction code structure defined in transaction OBCO:

 

 

spro.PNG

 

 

 

OBCO.PNG

The default jurisdiction structure is:

Sabrix    = 2,2,5,5

Taxware = 2,5,2,0

Vertex    = 2,3,4,1

 

 

 

3. For non-taxable scenarios check the default jurisdiction code defined in transaction OBCL:

 

Enter I0 (for input tax), O0 (for output tax) and a dummy jurisdiction code for the relevant company code(s) using TAXUSX. The system uses this dummy jurisdiction code also for export business transactions (for example, if you create a billing document in a country with a tax jurisdiction code while the goods recipient is in a different country, with or without a tax jurisdiction code). Please have a look at note 419124, where the suggested export jurisdiction codes used by the different external tax interface partners are listed.

 

 

 

spro obcl.PNG

 

OBCL.PNG

 

 

4. For export scenarios, check the jurisdiction code assigned in the customer master data:

 

As stated in SAP Note 419124, the Vertex jurisdiction code for export transactions is '770000000'. For Taxware, the jurisdiction code is IT0000000.

 

Please check that all newly released notes are implemented:

 

 

 

 

2016990 - Export between United States and Canada

1628962 - Export with invalid tax jurisdiction

1768395 - Export with invalid tax jurisdiction (1)

1809374 - Export with invalid tax jurisdiction (2)

1899214 - Export with invalid tax jurisdiction (3)

2016058 - Export with invalid tax jurisdiction (4)

 

2095331 - Export with invalid tax jurisdiction (5)

 

 

 

5. Check the Condition used in SD:

 

Trigger conditions UTXD and UTXE are used in external tax calculation. Both conditions must be used together in the pricing procedure (for technical reasons). Condition UTXD has value formula 500 and condition UTXE has value formula 501. They are triggered only by FM PRICING_COMPLETE, and the RFC is called only once per document. This is called the MaxTax procedure (developed by FI), and it is supposed to be faster. For more details on the SD customizing, please check the SAP wiki: Tax jurisdiction - ERP SD - SCN Wiki

 

6. Check if the Jurisdiction code maintained is valid:

 

Transaction SE37 -> RFC_DETERMINE_JURISDICTION -> Single Test (F8) -> Test Data Directory

*** "in RFC Target Sys" inform the external tax system (RFC) destination***

 

 

 

7. If the above customizing is correct, check in debug mode the parameters passed to and received from the external tax system:

 

SU24new.png

 

  1. Go to transaction SE24.
  2. Open class CL_XTAX_RULES_RFC.
  3. Double-click the method RFC_CALCULATE_TAXES_DOC.
  4. Scroll down until you see where the RFC_CALCULATE_TAXES_DOC function call is made.
  5. Set a break-point at the statement CALL FUNCTION 'RFC_CALCULATE_TAXES_DOC' (to check the input parameters) and one after the call (to check the output parameters).
  6. Run the transaction again.
  7. Inspect the input structures (tax_cal_item_inXX: outgoing document values sent to the tax system). If you see wrong information here, it is caused by wrong customizing; in that case, please review in detail the configuration guide attached to SAP Note 392696 - R/3 Tax Interface Configuration Guide.
  8. Press F6 to step over the call.
  9. Now inspect the error structure and the output structures (tax_cal_item_outXX: incoming values from the tax system; tax_cal_jur_level_outXX: jurisdiction levels). If you see an error here, it has to be analyzed by the company responsible for the external tax system.

 

 

 

I hope this helps!

 

Regards,

Raquel

Hi Colleagues,

 

Through the transactions and programs below, you can discover the number of customers, vendors, G/L accounts, assets, or normal FI documents in a client or company code, by period, chart of accounts, etc. Basically, the programs search the master data tables by company code, chart of accounts, and so on.



RFAUDI01 - Number of Customer Master Records - S_ALR_87101051

ScreenHunter_059.jpg


The screenshot above indicates the number of customers according to the KNA1 table and, per company code, the KNB1 table.
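If you just want the raw figures without running the report, an equivalent check can be done with a simple aggregate query on KNB1. This is not the report's own logic, only a sketch of the same idea:

DATA: lv_bukrs TYPE bukrs,
      lv_count TYPE i.

" customers per company code, counted directly from KNB1
SELECT bukrs COUNT( * ) FROM knb1
  INTO (lv_bukrs, lv_count)
  GROUP BY bukrs.
  WRITE: / lv_bukrs, lv_count.
ENDSELECT.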




RFAUDI02 - Number of Vendor Master Records - S_ALR_87101052

ScreenHunter_058.jpg

The program above checks all the vendors in LFA1 and can also break them down by company code.



RFAUDI03 - Number of G/L Master Records - S_ALR_87101049

ScreenHunter_056.jpg

 

The screenshot above indicates how many entries there are in the SKA1 table per chart of accounts. The one below indicates how many entries there are in the SKB1 table per company code in each chart of accounts. This is very useful when the client has many company codes assigned to a chart of accounts. There is no option to enter selection parameters.

ScreenHunter_057.jpg



RFAUDI04 - Number of Asset Master Records - S_ALR_87101050

ScreenHunter_055.jpg

Technically, the program checks the ANLH table. There are no selection parameters, such as a selection by company code.

 

 


RFAUDI07 - Number of Standard FI Documents - S_ALR_87101054 is shown below:

ScreenHunter_054.jpg

 

In this case, the program checks the BKPF table by company code, ledger, document type, period, and year, but you can enter a posting date to restrict the selection to a specific period. Remember that normal documents are all documents that are not special G/L items, parked items, or noted items.
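For a quick cross-check of the figures from RFAUDI07, you can count BKPF entries yourself. Note that this simplified sketch does not apply the report's filters for parked or noted documents; the company code and year are example values:

DATA: lv_blart TYPE blart,
      lv_count TYPE i.

" documents per document type for one company code and fiscal year
SELECT blart COUNT( * ) FROM bkpf
  INTO (lv_blart, lv_count)
  WHERE bukrs = '1000'
    AND gjahr = '2015'
  GROUP BY blart.
  WRITE: / lv_blart, lv_count.
ENDSELECT.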

 

Also, you can run these transactions in background mode and get the list in SM37.

 

 

JPA

Hi Colleagues,

 

Running transaction NGLM displays many G/L settings to analyze, as in the screenshots below. You find detailed information on the organizational units: ledger, company code, controlling area, and operating concern. There is also a Document Splitting area, where you find options for the detailed analysis of the document splitting settings.

 

In the Data Analysis area, the monitor offers functions for the quantitative analysis of the transaction data updated in the line item tables, totals record tables, and document splitting tables of General Ledger Accounting. First start the Run Data Analysis function as a background job. The analysis data is stored automatically in a data file. You can then analyze the data using various criteria.

 

This helps, e.g., when you want to check the document splitting characteristics, which chart of accounts is assigned, the fiscal year variant, the document splitting scenarios, which totals table is active per ledger, CO details, and many other settings.

 

Here are some steps:

 

SPRO

ScreenHunter_019.jpg


Execute


ScreenHunter_020.jpg


ScreenHunter_021.jpg


The transaction allows you to run other related transactions, like Maintain Company Code (OBY6) and many others.

ScreenHunter_022.jpg


ScreenHunter_023.jpg


To view Document Splitting characteristics


Execute

ScreenHunter_029.jpg

ScreenHunter_025.jpg


And again you can run transactions to maintain Document Splitting


ScreenHunter_026.jpg

ScreenHunter_027.jpg

 

It is also possible to check CO details

 

ScreenHunter_031.jpg

 

 

 

JPA

I attended this SAP webcast this morning and was surprised at the low attendance. There are many improvements in FI-CO and no, I am not talking about "Simple Finance" but about the core FI-CO modules.

 

See below:

 

Customer Connection for Financials started last December

 

The usual Legal Disclaimer applies

1fig.png

Figure 1: Source: SAP

 

This Customer Connection started last December and its workspace closed in February

 

SAP checked ideas which had the most subscriptions

 

They evaluated during the select phase

 

In April they held selection call

 

Subscribe means you will use improvements productively

2fig.png

Figure 2: Source: SAP

 

Figure 2 shows that 17 improvement requests were delivered in CO

 

2 are in progress

 

2 are in handover

 

11 were delivered in FI, with 1 improvement request rejected

3fig.png

Figure 3: Source: SAP

 

Figure 3 shows one of the most popular requests, display controlling area

4fig.png

Figure 4: Source: SAP

 

Some of the more interesting notes (to me) listed in Figure 4 are below:

5fig.png

Figure 5: Source: SAP

 

Figure 5 shows an improvement for KOK5

6fig.png

Figure 6: Source: SAP

 

More CO improvements are listed in Figure 6

7fig.png

Figure 7: Source: SAP

 

I’m surprised at Figure 7, exporting to Excel from background jobs, but I understand it at the same time too, having done this many times myself.

 

http://service.sap.com/sap/support/notes/1991518

 

This note was just released

8fig.png

Figure 8: Source: SAP

 

I think the most interesting improvement includes all the notes listed with OB52 posting period automation – I only list a few below:

 

OB52 http://service.sap.com/sap/support/notes/1993365

 

Parallel opening of periods http://service.sap.com/sap/support/notes/1993365

9fig.png

Figure 9: Source: SAP

 

Figure 9 shows “display full user name” but looking at the note it says “Branch to user data” http://service.sap.com/sap/support/notes/2073896

 

Visit www.sapimprovementfinder.com to find delivered improvements

Hello Everyone,

 

 

The idea of this blog is to summarize the most common actions to improve the performance of FBRA, as well as the SAP notes related to each suggested topic.

 

 

-> If changing the selection parameters does not solve the issue, you can use transaction FBRA_LOAC as described in SAP note 487347:

 

487347 - FBRA: Overflow of the lock table

 

This note suggests to use transaction FBRA_LOAC as a workaround for problematic cases.

To avoid the lock table overflow, there are these possibilities:

1. Extend the profile-parameter "enqueue/table_size", as per note 13907

2. Reduce BKORM entries as much as possible, i.e. for all company codes. The fewer entries that exist in table BKORM, the faster FBRA or FBRA_LOAC runs. Please refer to steps 2 and 3 of note 487347.

 

Additionally, please review the following recommendations:

1.      Use report Z_ENQUEUE_PERF from Note 1320810 during both high and low load.

2.      Test changing the following parameters enque/server/threadcount and enque/server/use_spinning to true.

3.      Run niping during both high and low load.

4.      Use 'test' in the OK code in SM12, set logging on for 15 seconds, run mass calls (starting with a thousand) during load, and check the STAD record for enqueue performance.

5.      During the problem time, check for rejects in SM12. If refreshing shows increases of hundreds, it points more to an application issue.

6.      Consider moving the standalone enqueue server to a dedicated system.

7.      Continue application analysis.

 

 

 


-> If you are receiving runtime error TSV_TNEW_PAGE_ALLOC_FAILED in transaction FBRA, it may be that table BKORM has too many entries.

This table can contain millions of old correspondence requests which have never been printed and therefore never been deleted from the table.

If you do not print your correspondence requests, you have to reorganize table BKORM from time to time by means of report SAPF140D. This report can be reached via transaction F.63. BKORM should normally contain only current correspondence requests that have not been printed yet, plus already printed ones no older than 14 days (for a possible reprint). There is no need to keep printed correspondence in BKORM for a long time.

Many of these correspondence requests were created a long time ago and you do not intend to print and send them; such correspondence requests can be safely deleted. Please take the following precautions:
  1. Run SAPF140D according to note 17831 on a regular basis (every two days or at least weekly) to delete all accounting correspondence requests from BKORM that have been finished for some time (e.g. one week).
  2. Use the same report to delete all accounting correspondence requests that have no print date at all ('entries without print date' under 'Further selections'). There is also an option to run the report in test mode.
  3. In the future, use SAPF140 as follows: on the selection screen, under 'Program Control', there is a field called 'Delete if finished since'. Fill this field with a certain number of days (e.g. 7). You could create a variant with this field set and always use that variant for SAPF140.

If the entries in BKORM are reduced to a sensible size, FBRA should work fine.
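Before scheduling the clean-up, it can be useful to see how large BKORM actually is per company code. A minimal sketch, assuming BKORM carries the company code in field BUKRS (verify the field names in SE11):

DATA: lv_bukrs TYPE bukrs,
      lv_count TYPE i.

" correspondence requests per company code
SELECT bukrs COUNT( * ) FROM bkorm
  INTO (lv_bukrs, lv_count)
  GROUP BY bukrs.
  WRITE: / lv_bukrs, lv_count.
ENDSELECT.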

 

 

I hope that I could help!

 

 

Kind Regards,

Raquel

Hi guys,

 

Just a tip about this kind of posting, where I had some doubts about why the posting was not going through the BAdI.

 

 

If the posting is a down payment request or a noted item, you must complement the BAPI_ACC_DOCUMENT_POST call with the following:

 

    WHEN 'DP'.
      " down payment request / noted item: flag the call for the BAdI implementation below
      ls_bapiache09-obj_type   = 'BKPFF'.
      ls_extension2-structure  = space.
      ls_extension2-valuepart1 = 'BUS_ACT'.
      ls_extension2-valuepart2 = 'RFST'.
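For context, this is roughly how the calling program could pass that marker to BAPI_ACC_DOCUMENT_POST. Only a sketch: the header values are placeholders, and the G/L / customer line items and the BAPI_TRANSACTION_COMMIT call are omitted:

DATA: ls_bapiache09 TYPE bapiache09,
      lt_extension2 TYPE STANDARD TABLE OF bapiparex,
      ls_extension2 TYPE bapiparex,
      lt_return     TYPE STANDARD TABLE OF bapiret2.

ls_bapiache09-obj_type   = 'BKPFF'.
ls_bapiache09-username   = sy-uname.
ls_bapiache09-comp_code  = '1000'.        " placeholder company code
ls_bapiache09-doc_date   = sy-datum.
ls_bapiache09-pstng_date = sy-datum.
ls_bapiache09-doc_type   = 'DZ'.          " placeholder document type

" marker evaluated by the BAdI implementation shown below
ls_extension2-structure  = space.
ls_extension2-valuepart1 = 'BUS_ACT'.
ls_extension2-valuepart2 = 'RFST'.
APPEND ls_extension2 TO lt_extension2.

CALL FUNCTION 'BAPI_ACC_DOCUMENT_POST'
  EXPORTING
    documentheader = ls_bapiache09
  TABLES
    extension2     = lt_extension2
    return         = lt_return.
" ...fill ACCOUNTGL / ACCOUNTRECEIVABLE / CURRENCYAMOUNT as usual and
" call BAPI_TRANSACTION_COMMIT afterwards.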

 

 

Implement the BADI_ACC_DOCUMENT:

 

 

The BADI_ACC_DOCUMENT has to have the following filter:

 

 

And the method CHANGE should look like this:

 

METHOD if_ex_acc_document~change.
  FIELD-SYMBOLS <accit> TYPE accit.
  DATA ls_extension2    TYPE bapiparex.

  IF c_extension2 IS NOT INITIAL.
    READ TABLE c_extension2 INDEX 1 INTO ls_extension2.
    IF ls_extension2-valuepart1 = 'BUS_ACT' AND ls_extension2-valuepart2 = 'RFST'.
      " business transaction RFST: post the items as noted (statistical) items
      c_acchd-glvor = 'RFST'.
      LOOP AT c_accit ASSIGNING <accit>.
        <accit>-bstat = 'S'.
      ENDLOOP.
    ENDIF.
  ENDIF.
ENDMETHOD.
