
File Input Adapter - Dynamic Mode

Former Member
0 Kudos

Hi,

I want to use dynamicFile and dynamicPath together for the file input adapter, but I get an error when I put both of them in the dynamicMode parameter. Is this possible somehow?

Let me explain my scenario. A server is generating logs and I need to read them from ESP. The server writes logs to a file, and when the size of the file reaches 100 MB, it starts writing logs to a new log file. So I need to read each newly arriving entry from a single log file and also pick up entries from a new file once the previous log file reaches its maximum size.

How can I handle this kind of request?

Thanks and regards,

Bulut

Accepted Solutions (0)

Answers (1)

former_member217348
Participant
0 Kudos

Hi Bulut,

Please share your project and, if possible, a sample input CSV file that I can use to reproduce the problem.

What is the error that is reported? Please share the log files where the error is reported.

You can send the files to me privately if needed.

Thank you,

Alice

Former Member
0 Kudos

Hi Alice,

To be able to use a wildcard in the File parameter, I have to choose dynamicPath in the dynamicMode parameter. But when I choose dynamicPath, the adapter doesn't read newly written entries (incremental data) in the log files.

Vice versa, to be able to get incremental data from a log file I have to choose dynamicFile in dynamicMode. In that case I cannot use a wildcard (*) and I have to specify a particular file in the File parameter.

You can see my adapter code below:

     1- DynamicFile

CREATE INPUT WINDOW InputWindow1 SCHEMA (
       Column1 string ,
       Column2 string ) PRIMARY KEY ( Column1 ) ;

ATTACH INPUT ADAPTER File_Hadoop_CSV_Input1 TYPE toolkit_file_csv_input TO InputWindow1 PROPERTIES
       csvExpectStreamNameOpcode = FALSE ,
       dir = 'E:/gg/t' ,
       file = 'new2.txt' ,
       dynamicMode = 'dynamicFile' ,
       removeAfterProcess = FALSE ,
       csvDelimiter = ',' ,
       csvDateFormat = '' ,
       csvTimestampFormat = '' ,
       csvHasHeader = FALSE ,
       pollingPeriod = 5 ,
       scanDepth = 0 ;

     2- DynamicPath

CREATE INPUT WINDOW InputWindow1 SCHEMA (
       Column1 string ,
       Column2 string ) PRIMARY KEY ( Column1 ) ;

ATTACH INPUT ADAPTER File_Hadoop_CSV_Input1 TYPE toolkit_file_csv_input TO InputWindow1 PROPERTIES
       csvExpectStreamNameOpcode = FALSE ,
       dir = 'E:/gg/t' ,
       file = '*' ,
       dynamicMode = 'dynamicPath' ,
       removeAfterProcess = FALSE ,
       csvDelimiter = ',' ,
       csvDateFormat = '' ,
       csvTimestampFormat = '' ,
       csvHasHeader = FALSE ,
       pollingPeriod = 5 ,
       scanDepth = 0 ;

My new2.txt file includes:

A,A
B,B
C,C
D,D,

I could not find a way to handle this situation. Do you have any idea? How can I use both dynamicFile and dynamicPath options together?

Regards,

Bulut

former_member217348
Participant
0 Kudos


Hi Bulut,

I think that what you want to use is the Log File Input Adapter rather than the File/Hadoop CSV Input Adapter. Have you taken a look at that one yet?

Thanks,

Alice

Former Member
0 Kudos

Hi Alice,

I had a chance to look at the Log File Input Adapter today. Yes, you are right, it is more appropriate for me. I have one question related to the Log File Input Adapter.

My log file is an advancing file, which is defined in the docs as below:

For advancing files, when one file is full, the log file writer creates a new file and starts writing to that new file. The naming convention is typically a base name plus a suffix, where the suffix might be based on date/time or a sequential number (for example, "access-log.2007-01-01", "access-log.2007-01-02", and so on, or "access.log.1", "access.log.2", and so on). Regardless of the naming convention, the adapter opens these log files in chronological order by the most recent modification date.


For the .properties file, the documentation says to set the Input.WaitForGrowth parameter to true. I also set Input.FileName. With these settings the adapter read the appended data, but when I create a new log file and write new logs into it, the adapter doesn't read the new file (a minimal sketch of my settings is shown after the documentation excerpt below). Do I need to set some other parameters? Do you have any suggestions on how I can read advancing files?


You can use this adapter to read either live or historical log files.

  • To read a historical file, specify the file name in the Input.Filename property, and set the Input.WaitForGrowth property to false.
  • To read a live file, whether rotated or advancing, set the Input.WaitForGrowth property to true. The Log File Input adapter goes to the end of the file and then reads as new data is appended to the file. When the file size shrinks to zero (after the old log file is renamed and a new, empty one is created), the Log File Input adapter continues reading from the beginning of the new file.
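
For reference, here is a minimal sketch of the relevant entries in my adapter .properties file. It only shows the two properties discussed above (I am not sure whether the exact casing is Input.FileName or Input.Filename, so please correct me if needed); the file path is just an example from my environment, and any other required entries are omitted:

       # Log File Input Adapter - live (advancing) file, sketch of my settings
       # Only the two properties discussed above are shown; the path is an example.
       Input.FileName = E:/gg/t/access.log.1
       Input.WaitForGrowth = true

With these settings the adapter tails the named file and reads appended data, but as described above it does not move on to the next file when the writer rolls over to a new one.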

Thanks and regards,

Bulut

Former Member
0 Kudos

Hi Bulut,

There are at least 6 bugs in ESP 5.1 SP04 that limit the Log File Adapter's functionality. Some of these have been fixed in SP08, others have not. One of those that have not been fixed is:

  761559 - Log File Input Adapter does not work on advancing files

Rather than discussing them here, it would be easier if you could open a technical support case so that we can discuss your project requirements (what type of log files you are reading, what language/locale they are in), your timeline, how far along you are in your current project, whether you would feel comfortable upgrading to SP08, and so on.

Thanks,

  Neal

Former Member
0 Kudos

Hi Neal,

I have opened a technical case through Sybase Case-Express. The case ID is 11717464. If you have a chance to look at it and make some comments, they would be valuable and helpful for me.

Thanks and regards,

Bulut

Former Member
0 Kudos

Hello,

Since your technical support contract was migrated from Sybase to SAP, we are not allowed to work under the old Sybase systems.  As long as you log a case under the SAP component BC-SYB-ESP, I will get a notification and we can start working with you directly.

Thanks,

  Neal

Former Member
0 Kudos

Hello Neal,

The technical support contract migration has been completed and I was able to open an incident. The incident name and ID are: Log File Input Adapter - Advancing Files (791723 / 2014).

Thanks and regards,

Bulut