Recently, I was tasked with pulling data from various Web Services and integrating it with data from other data sources (BW, SFDC, Oracle, etc.).  When I finished, I realized that the information available online was sparse and lacked a clear, concise, step-by-step guide through the process.  Below is what I put together, both for my own reference and to help others doing similar tasks.


  1. Define a Datastore that points to the WSDL for the Web Service you want to call.

  2. Double click the name of the datastore and import the function(s) you wish to call.

  3. The default size of the value returned from the function is a varchar(1024), which in most cases is too small.  If you right click the function name in the datastore and select Open, the function opens for modification.  Double click the object(s) you wish to resize and change them.  I made it a varchar(5000000) – yes, you can create a varchar of 5 million characters!  Just don’t output that column straight to Oracle, where a standard VARCHAR2 tops out at 4000 bytes.

  4. Create the first of two data flows: it will read the information from the Web Service and export it to an XML file.

  5. The source table contains all the information needed to call the WSDL.

  6. Create the initial query as follows – you will pass the second (nested) schema (Weather_Call) to the WSDL function to pull back the XML response, since you cannot pass the Query itself.

  7. In the next query, right click and select New Function Call, then select your WSDL function.  When it opens, drag the sub-schema with your user parameters (the schema itself) to the function call.  In this case we drag the schema Weather_Call to GetWeather.
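
To make it clearer what that function call is doing under the hood, here is a minimal sketch of the equivalent SOAP call outside Data Services, written in Python with the zeep library.  The WSDL URL and the parameter names (CityName, CountryName) are assumptions for illustration only – substitute whatever fields your Weather_Call schema actually carries.

    # Conceptual sketch only – Data Services performs this call for you.
    # The WSDL URL and parameter names are assumptions for illustration.
    from zeep import Client

    client = Client("http://www.webservicex.net/globalweather.asmx?WSDL")  # hypothetical WSDL location

    # Roughly what happens when the Weather_Call schema is passed to GetWeather:
    response = client.service.GetWeather(CityName="Berlin", CountryName="Germany")

    print(response)  # the raw XML payload that the next steps write out to a file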

  8. Create another query to un-nest the getCoreDataXMLResponse.

  9. Create a new Flat File output.  This will contain the XML output from your WSDL call.

  10. The Type needs to be Delimited, and Adaptable Schema needs to be set to Yes (for error handling).  In our environment the Location is Job Server and the directory is /export/share.

  11. Drag that file format into your data flow and make it the target.

  12. When this data flow completes, the output is an XML file called Global_Weather.xml.

  13. Open the XML output in TextPad or a similar application.

  14. Now you need to generate an XML schema to tell Data Services how to parse this information out.  There are many free web utilities that can do this.

  15. Create an .xsd file based on the XML.
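
Before importing the .xsd into Data Services, it is worth a quick sanity check that the generated schema really does validate the XML file.  Here is a minimal sketch in Python using the xmlschema package, where global_weather.xsd is a hypothetical name for the schema created in the previous step:

    # Sanity-check sketch; 'global_weather.xsd' is a hypothetical file name for
    # the schema generated from the XML in the previous step.
    import xmlschema

    schema = xmlschema.XMLSchema("global_weather.xsd")
    print(schema.is_valid("Global_Weather.xml"))  # True if the schema describes the file
    schema.validate("Global_Weather.xml")         # raises a descriptive error if it does not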

  16. Next, create an XML Schema format in Data Services.  Give it a Format Name without spaces, select your .xsd file in the File Name field, and select the root element.

  17. Now, create a second data flow that will read the XML output using the XML schema you just created.

  18. Drag the XML schema you created into the data flow as the source.  Double click it and set where the XML file itself exists – in this case, the /export/share directory.

  19. In the query, select all the data from the XML.
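
If you want to double-check outside Data Services that the file un-nests the way you expect, a short sketch with Python's standard library lists the element structure the query is selecting from (the path assumes the /export/share location used above):

    # Quick look at the element structure of the exported file; the path assumes
    # the /export/share location used earlier in this post.
    import xml.etree.ElementTree as ET

    tree = ET.parse("/export/share/Global_Weather.xml")
    for elem in tree.getroot().iter():
        value = (elem.text or "").strip()
        print(elem.tag, value[:60])  # tag name plus a preview of its value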

  20. Incorporate any additional tables/information as you would normally do.

  21. Select a table or template table as the output.

  22. Run the job and view the output.

