
TIME_OUT runtime error

Former Member

Hi,

Could someone help me resolve the runtime error below?

We are getting a TIME_OUT error with the following details:

Short text
    Time limit exceeded.

What happened?
    The program "ZXXXXXX" has exceeded the maximum permitted runtime
    without interruption and has therefore been terminated.

Error analysis
    After a specific time, the program is terminated to make the work area
    available to other users who may be waiting.
    This is to prevent a work area being blocked unnecessarily long by, for
    example:
    - Endless loops (DO, WHILE, ...),
    - Database accesses with a large result set
    - Database accesses without a suitable index (full table scan)

    The maximum runtime of a program is limited by the system profile
    parameter "rdisp/max_wprun_time". The current setting is 1800 seconds.
    If this time limit is exceeded, the system attempts to cancel any
    running SQL statement or signals the ABAP processor to stop the
    running program. Then the system waits another 60 seconds maximum.
    If the program is then still active, the work process is restarted.

The above-mentioned parameter is already set to 1800 seconds, yet we still get the runtime error whenever the Z program is called; the program reads several DB tables to fulfill the business process.

Is there another way to minimize these runtime errors?

Kindly help; it is much appreciated.

Thanks in advance!

Best Regards,

Venky

Accepted Solutions (1)


hendrik_brandes
Contributor

Hello Venky,

it is very difficult to answer your question directly without more information.

When a TIME_OUT error occurs, it usually has one of the causes described in the dump:

1) Loops over a large number of lines, or loops without an exit condition

2) Non-performant database access without proper indexes

3) A large amount of data that has to be transferred from the DB to the application server

What can you do?

On the database (see the sketch after this list):

* Minimize the amount of data by pushing as much logic as possible from the ABAP code into the DB select (grouping, joins, subselects, ...)

* Do not transfer every field if you only need a subset

* Inspect your SQL and find out whether the indexes are correct
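
For illustration, a minimal sketch of the first two points. The table ZSALES_ITEMS and the parameter P_GJAHR are purely hypothetical examples, not from the original program:

TYPES: BEGIN OF ty_total,
         bukrs TYPE c LENGTH 4,
         total TYPE p LENGTH 8 DECIMALS 2,
       END OF ty_total.
DATA lt_totals TYPE STANDARD TABLE OF ty_total.
PARAMETERS p_gjahr TYPE gjahr.

* Let the database do the grouping and transfer only two fields,
* instead of SELECT * followed by ABAP-side summing:
SELECT bukrs SUM( dmbtr )
  FROM zsales_items                " hypothetical table
  INTO TABLE lt_totals
  WHERE gjahr = p_gjahr
  GROUP BY bukrs.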


On the business logic:

* Use loops/reads on sorted or hashed tables instead of standard tables (see the sketch after this list)

* Try to avoid overly complex structures, which lead to "spaghetti" code with the risk of unreachable conditions
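
A minimal sketch of the lookup pattern, with purely illustrative names: replacing a scan of a standard table inside a loop with a key access on a hashed table turns each inner lookup from O(n) into O(1).

TYPES: BEGIN OF ty_addr,
         partner TYPE c LENGTH 10,
         name    TYPE c LENGTH 40,
       END OF ty_addr,
       BEGIN OF ty_order,
         vbeln   TYPE c LENGTH 10,
         partner TYPE c LENGTH 10,
       END OF ty_order.
DATA: lt_addr   TYPE HASHED TABLE OF ty_addr WITH UNIQUE KEY partner,
      ls_addr   TYPE ty_addr,
      lt_orders TYPE STANDARD TABLE OF ty_order,
      ls_order  TYPE ty_order.

LOOP AT lt_orders INTO ls_order.
  " Direct key access instead of scanning a standard table
  READ TABLE lt_addr INTO ls_addr
       WITH TABLE KEY partner = ls_order-partner.
  IF sy-subrc = 0.
    " ... process the match ...
  ENDIF.
ENDLOOP.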

There is a good blog about this topic (although it is about ABAP on HANA, it applies just as well to classical environments).

Nevertheless: make an estimate of the amount of data and the time the logic should take. Use trace tools to clarify where most of the time is lost.

If nothing helps, try running your report as a background job.
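
Background processing sidesteps rdisp/max_wprun_time entirely, because that limit only applies to dialog work processes. A minimal scheduling sketch using the standard function modules JOB_OPEN and JOB_CLOSE (the report name ZXXXXXX is taken from the dump above; the job name Z_LONG_RUNNER is just an example, and error handling is omitted):

DATA: lv_jobname  TYPE tbtcjob-jobname VALUE 'Z_LONG_RUNNER',
      lv_jobcount TYPE tbtcjob-jobcount.

CALL FUNCTION 'JOB_OPEN'
  EXPORTING
    jobname  = lv_jobname
  IMPORTING
    jobcount = lv_jobcount.

* The report now runs in a batch work process
SUBMIT zxxxxxx VIA JOB lv_jobname NUMBER lv_jobcount AND RETURN.

CALL FUNCTION 'JOB_CLOSE'
  EXPORTING
    jobname   = lv_jobname
    jobcount  = lv_jobcount
    strtimmed = 'X'.             " start the job immediately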

Kind regards,

Hendrik

Former Member

Thanks Hendrik,

Yes, the timeout is caused by loops over a large number of lines in the program.

It loops over almost 1.7 lakh (170,000) records, and the timeout occurs while looping over that many records.

I have suggested that they include some conditions to fetch fewer records before looping.

If you have any experience beyond this, please do share.

Thanks in Advance!

Best Regards,

Venky

Answers (5)


former_member196490
Active Participant

Check the performance of the program and try restricting the selection criteria.

If the program is written well and still fails with TIME_OUT, try running the report in the background and check the output in the spool.

Also, COMMIT WORK can be used after huge selects and loop statements: it resets the runtime measurement and thus prevents an early timeout (see the sketch below).
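
A minimal sketch of that idea, with purely illustrative names. Note that COMMIT WORK also ends the current database LUW, so it has to fit the program's update logic:

TYPES ty_line TYPE c LENGTH 100.        " stand-in for the real structure
DATA: lt_big_table TYPE STANDARD TABLE OF ty_line,
      ls_line      TYPE ty_line,
      lv_count     TYPE i.

LOOP AT lt_big_table INTO ls_line.
  " ... expensive processing of one record ...
  lv_count = lv_count + 1.
  IF lv_count MOD 10000 = 0.
    COMMIT WORK.   " resets the runtime measurement of the dialog step
  ENDIF.
ENDLOOP.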

ronaldo_aparecido
Contributor

If the error occurs in your SELECT statement, it means you are pulling a lot of data at once. The solution is then to select in packages of 1000 rows and append each package. Example:

SELECT *
  FROM zmb_tclientes
  INTO TABLE lt_zmb_tcliente2
  PACKAGE SIZE 1000
  WHERE audat IN s_audat.

  " Each pass of the SELECT loop delivers the next 1000 rows
  LOOP AT lt_zmb_tcliente2 INTO ls_zmb_tcliente2.
    APPEND ls_zmb_tcliente2 TO lt_zmb_tcliente.
  ENDLOOP.

  REFRESH lt_zmb_tcliente2.
ENDSELECT.

The TIME_OUT limit can also be circumvented by issuing a COMMIT WORK after each selected package, because the commit resets the runtime measurement.

meenakshi-btp
Explorer

Try restricting the selection criteria.

arindam_m
Active Contributor

Hi,

It looks like a timeout error. Before you dig into which code is slow, consider that it may not be slow at all: you may simply be executing the program in the foreground, so it hits the timeout limit set for your profile. You can check the following and change it if necessary:

T-code RZ10 -> select the instance profile -> choose the Extended Maintenance radio button and click the Change button.

Check the parameter rdisp/j2ee_timeout; you can increase the time here for a greater processing-time window. Also check the parameter rdisp/gui_auto_logout to see the auto-logout time.

Cheers,

Arindam

venkateswaran_k
Active Contributor

Dear Venky,

Are you using any SELECT statement inside your program that fetches a large number of rows?

For example, do you use a cluster table like BSEG?

What is your Z program? Please share the SELECT statement code.

Regards,

Venkat

Former Member

Thanks Venkat,

Yes, we are using SELECT statements. When I observed the program, the SELECT statement below fetches 1.7 lakh (170,000) records, and after that the results are looped over for processing; this is where the timeout occurs.

SELECT partner mc_name1 mc_name2
  FROM bbpv_buyer_addr
  INTO TABLE li_bbpv_buyer_addr
  WHERE country = iv_country
    AND ( mc_name1 LIKE '%A%' OR mc_name2 LIKE '%A%' ).

LOOP AT li_bbpv_buyer_addr ...
ENDLOOP.

The timeout happens in this loop.

I debugged this Z function module and found this, and I suggested using some more conditions to minimize the results.
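
Building on that suggestion, one option is to combine additional WHERE conditions with package-wise processing, so that no single step runs too long. In this sketch the select-option S_REGION and the region column are hypothetical (they may not exist in bbpv_buyer_addr), and declarations are as in the snippet above. Note also that the leading '%' in LIKE '%A%' prevents index use on those columns, so the remaining conditions should be as selective as possible:

SELECT partner mc_name1 mc_name2
  FROM bbpv_buyer_addr
  INTO TABLE li_bbpv_buyer_addr
  PACKAGE SIZE 10000
  WHERE country = iv_country
    AND region  IN s_region          " hypothetical extra restriction
    AND ( mc_name1 LIKE '%A%' OR mc_name2 LIKE '%A%' ).

  LOOP AT li_bbpv_buyer_addr INTO ls_bbpv_buyer_addr.
    " ... process one record ...
  ENDLOOP.
ENDSELECT.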

Thanks,

Venky

former_member219762
Contributor

Hi Venky,

First analyze the program with runtime analysis (SAT/SE30) and an SQL trace (ST05). From that we can find where the problem is and tune the program accordingly. If we are not able to improve the performance that way, we can use parallel processing to solve the problem (a rough sketch follows).
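
A rough sketch of the classic asynchronous-RFC pattern for parallel processing. The function module Z_PROCESS_PACKAGE is hypothetical and would have to be RFC-enabled; the packaging of the data is assumed to have happened beforehand:

TYPES: ty_record  TYPE c LENGTH 100,                " stand-in line type
       ty_package TYPE STANDARD TABLE OF ty_record WITH DEFAULT KEY.
DATA:  lt_packages TYPE STANDARD TABLE OF ty_package,
       lt_package  TYPE ty_package,
       gt_result   TYPE ty_package,
       gv_received TYPE i,
       gv_total    TYPE i,
       lv_index    TYPE n LENGTH 4,
       lv_task     TYPE c LENGTH 10.

gv_total = lines( lt_packages ).

LOOP AT lt_packages INTO lt_package.
  lv_index = sy-tabix.
  CONCATENATE 'TASK_' lv_index INTO lv_task.
  CALL FUNCTION 'Z_PROCESS_PACKAGE'                 " hypothetical RFC-enabled FM
    STARTING NEW TASK lv_task
    DESTINATION IN GROUP DEFAULT
    PERFORMING receive_result ON END OF TASK
    TABLES
      it_package = lt_package.
ENDLOOP.

* Block until every task has delivered its result
WAIT UNTIL gv_received >= gv_total.

FORM receive_result USING pv_task TYPE clike.
  RECEIVE RESULTS FROM FUNCTION 'Z_PROCESS_PACKAGE'
    TABLES
      et_result = gt_result.                        " hypothetical result table
  gv_received = gv_received + 1.
ENDFORM.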

Regards,

Sreenivas.