
Uploading long text from Excel using Data Services

Former Member

Hi,

We are facing an issue when uploading long texts (longer than 255 characters) from an Excel file using Data Services.

The problem is that if the first 16 rows of the source Excel file don't contain texts longer than 255 characters, the Data Services ODBC driver truncates all further texts in that column to 255 characters, even if the field length is defined as varchar(2500) in the Excel file format in Data Services and the column contains longer texts in subsequent rows. If I put a long text in one of the first rows of the source file, all further long texts are uploaded correctly with full length.

I've tested this issue using the simplest possible data flow with one source file in Excel format and one target file in CSV format; the problem occurs at the very first step of reading data via the Excel Workbooks file format.

We've tried to set the parameter TypeGuessRows = 0 following Microsoft's recommendations (http://support2.microsoft.com/kb/189897/en-us), but we are still facing this issue. With TypeGuessRows = 0, if the first 16384 rows of the Excel file don't contain long texts, all texts in this column are truncated to 255 characters.
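
For clarity, this is the kind of registry change the KB article describes (just a sketch; the exact key path depends on whether the Jet or the ACE driver is in use, so the path and version below are placeholders):

# Sketch: set TypeGuessRows = 0 for the Excel driver in the Windows registry.
# The key path below is an assumption - it differs between the Jet driver
# (SOFTWARE\Microsoft\Jet\4.0\Engines\Excel) and the ACE driver versions,
# and may sit under Wow6432Node on 64-bit systems.
import winreg

KEY_PATH = r"SOFTWARE\Microsoft\Office\14.0\Access Connectivity Engine\Engines\Excel"

key = winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, KEY_PATH, 0, winreg.KEY_SET_VALUE)
winreg.SetValueEx(key, "TypeGuessRows", 0, winreg.REG_DWORD, 0)
winreg.CloseKey(key)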

We can't add a dummy first row with long texts manually, because these .xlsx files are received from an external system daily and must be uploaded automatically.

Do you have any ideas about a workaround for this issue? Is there a way to add a dummy row automatically with some kind of script, or to convert the xlsx to csv somehow?
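
For the csv conversion, for example, I imagine something along these lines (only a rough sketch; it assumes Python with the openpyxl package would be available on the job server, and the file names are just placeholders):

# Sketch: convert the incoming .xlsx to .csv before DS reads it, so the Excel
# driver (and its 255-character column guess) is bypassed entirely.
# Assumes the openpyxl package is installed; file names are placeholders.
import csv
from openpyxl import load_workbook

wb = load_workbook("incoming_file.xlsx", read_only=True, data_only=True)
ws = wb.active  # first worksheet

with open("incoming_file.csv", "w", newline="", encoding="utf-8") as f:
    writer = csv.writer(f)
    for row in ws.iter_rows(values_only=True):
        # Keep blank cells as empty strings so the column count stays stable.
        writer.writerow(["" if cell is None else cell for cell in row])

Such a script could perhaps be launched from a script step before the data flow, but I'm not sure how reliable that would be for daily loads.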

Thanks,

Jeny

Accepted Solutions (0)

Answers (2)


beat_honegger
Participant

Hello Viacheslav

I have the same problem, but it doesn't work for me even when I add a field with more than 256 characters in the first row.

Windows Server 2012 R2

Data Services 14.2.3.549

Excel 2010 (tried with xls and xlsx)

Did you generate the format out of the Excel file or define it manually?

Can you send me the ATL which works and the Excel file?

Any idea?

Thanks a lot 🙂

Greetings

Beat

vnovozhilov
Employee

As per KBA 1675110 - Excel columns greater than 255 characters are truncated on output - Data Services - this seems to be a known issue / limitation.

What is the version of DS?

What is the platform (Win or UNIX)?

Could you provide a simplified ATL and Excel for a quick test?

Thank you,

Viacheslav.

Former Member

Hi Viacheslav,

We are using DS 4.2 SP1 on Windows.

Thanks for the KBA. In our case Excel can read 16K+ rows without truncating, though SAP mentions only about 100 rows in KBA 1675110 that you referenced. I still hope there's another workaround besides adding fake records to the source file.

BTW, if there's any way to add a fake record to the source file using a batch script run from DS, I'd appreciate ideas on how to do this.
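
Something along these lines is what I have in mind for the fake record (again only a sketch, assuming Python with openpyxl could be called from DS before the data flow; the file name, column index and dummy text are placeholders):

# Sketch: insert a dummy row with an over-255-character value right below the
# header row, so the driver's type sampling sees a long text and keeps the
# full length. Assumes openpyxl; file name, column index and text are placeholders.
from openpyxl import load_workbook

LONG_TEXT_COLUMN = 5      # 1-based index of the long-text column
DUMMY_VALUE = "X" * 300   # anything longer than 255 characters

wb = load_workbook("incoming_file.xlsx")
ws = wb.active

ws.insert_rows(2)  # new empty row directly under the header
ws.cell(row=2, column=LONG_TEXT_COLUMN, value=DUMMY_VALUE)
wb.save("incoming_file.xlsx")

The dummy row would then have to be filtered out again in the data flow, e.g. by its dummy value.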

The test ATL is attached; it uses 2 source files - one with long texts in the first row, the other with long texts in the bottom row of the file (row 17K+). You just need to change the path on the Format tab of both Excel file formats to load the data. The resulting .csv file will contain both truncated and non-truncated texts.

The attached file is a renamed zip archive without a password. The password to the ATL is DS_long_texts.

Thanks,

Jeny

vnovozhilov
Employee

I have imported the job and can see the scenario you described. I don't see an immediate fix; let me ask around.

Thank you,

Viacheslav.

vnovozhilov
Employee

As promised, I had a chat with my colleagues. I personally agree that it shouldn't be designed this way. It sounds logical to me that the 100-row sampling is needed to update the schema with a maximum length, but it is equally logical that if one wants to change the maximum length of a column in the Excel file format properties, one should be able to do so, and DS should comply.

The only option I can offer is to post this on SAP Ideas Place as an enhancement request, or to wait for a member of the community to offer a better workaround.

Thank you,

Viacheslav.