
Any ETL (Extract, Transform and Load) tool in the current market requires its users to be proficient, if not expert, in technology. SAP BO FIM is an exception to that story. The product philosophy is straightforward: keep it simple. It is designed for business users to move data across SAP EPM products. If the EPM strategy in your company is clearly defined with a forward-looking approach, FIM is going to be an integral part of your SAP landscape.

Why FIM

Going forward, SAP seems interested in detaching the EPM tools from IT teams, probably to drive adoption by making the business own the EPM tools. In such a scenario, FIM is a no-frills ETL tool with no overhead of specialized IT skills.

In the current version of the tool (FIM 10.0 SP4), you can connect between the following systems:

  1. Databases (both Microsoft SQL Server and Oracle)
  2. Financial Consolidation (FC)
  3. Planning and Consolidation (BPC)
  4. Profitability and Cost Management (PCM)
  5. Intercompany Reconciliation
  6. ECC 5.0 +
  7. SAP BW 7.01
  8. Strategy Management (SM is present in the SAP documentation; however, it was missing in the FIM data store creation wizard)

The following EPM tools are not currently integrated with FIM:

  1. Disclosure Management
  2. Notes Management
  3. Spend Performance Management
  4. Supply Chain Performance Management
  5. Sales and Operations Planning

I hope these will be integrated in future releases.

What lies under the hood

Below the FIM layer, the ETL activities are handled by SAP BusinessObjects Data Services (BODS). BODS is a powerful ETL tool that can move data between heterogeneous systems; however, BODS requires skilled IT resources to manage the application, which is not the case with FIM. The FIM installation also involves installing the BusinessObjects Enterprise (BOE) platform server. User management for FIM is done through the Central Management Console (CMC), which is part of BOE.

ETL Activities

The ETL activities involve two processes: a) creation of data stores and b) creation of jobs.

Data Store Creation

A data store can be either a source or a target. To move data to and from any EPM tool, you need to create a data store. The only exception is loading data from a flat file, which can be used directly without a data store. Data store creation is a two-step, wizard-driven process. Depending on the object on which the data store is built, the required parameters vary.

Step 1 - Data store type selection. The type of data store to be created is selected in this step.

Step 2 - Details for the data store are provided at this step. The information needed varies depending on the data store type selected in the previous step; the chart below summarizes it.
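To make the two-step wizard concrete, here is a minimal Python sketch of the idea. The type names and per-type parameter sets are hypothetical illustrations of how requirements vary by type, not FIM's actual internal model.

```python
# Illustrative sketch only: FIM's wizard is GUI-driven; these names are
# hypothetical and simply model the two-step flow described above.
from dataclasses import dataclass, field

# Step 1 choices: loosely based on the connection types listed earlier.
DATA_STORE_TYPES = {"DATABASE", "FC", "BPC", "PCM", "ICR", "ECC", "BW"}

# Hypothetical examples of how required parameters differ by type.
REQUIRED_PARAMS = {
    "DATABASE": {"server", "database", "user", "password"},
    "BPC": {"server_url", "environment", "user", "password"},
    "BW": {"system", "client", "user", "password"},
}

@dataclass
class DataStore:
    name: str
    ds_type: str                                 # chosen in wizard step 1
    params: dict = field(default_factory=dict)   # filled in wizard step 2

    def validate(self) -> None:
        # Reject unknown types and incomplete parameter sets.
        if self.ds_type not in DATA_STORE_TYPES:
            raise ValueError(f"Unknown data store type: {self.ds_type}")
        missing = REQUIRED_PARAMS.get(self.ds_type, set()) - self.params.keys()
        if missing:
            raise ValueError(f"Missing parameters for {self.ds_type}: {missing}")
```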

Once the data store is created, you can create the jobs that move data.

FIM Job creation

Running a job is the actual process that moves data between data stores. When you run a job, it moves the data between the data stores according to the selected parameters. Creating a job is a one-time activity, and jobs can be run any number of times. Job creation is a five-step process.

Step 1. Define General Properties

As you can see in the screenshot above, a job name and job description need to be provided. The job name field has a character limit of 8. The job description can be used for a detailed description of the job. If there is a large number of jobs, serial numbers can be prefixed to the job name so that jobs can be easily identified, e.g. 01_Abcd, 02_Pqrs, etc.
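As a small illustration of that naming convention, the following snippet builds serial-prefixed job names and enforces the 8-character limit mentioned above; the helper itself is hypothetical, not part of FIM.

```python
def make_job_name(serial: int, stem: str, limit: int = 8) -> str:
    """Build a serial-prefixed job name such as '01_Abcd'.

    The 8-character limit mirrors the job name field described above;
    the prefixing convention is the one suggested in this post.
    """
    name = f"{serial:02d}_{stem}"
    if len(name) > limit:
        raise ValueError(f"Job name '{name}' exceeds {limit} characters")
    return name

print(make_job_name(1, "Abcd"))   # 01_Abcd
print(make_job_name(2, "Pqrs"))   # 02_Pqrs
```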

If a custom job with complex data transformations is needed, it can be created in the BODS layer and used in the FIM interface. In that case, select the check box at the top of the page (Connect to an SAP Business Objects Data Services Job) and provide the BODS job name. A transaction table with the details of the job will be created in the backend, and the data will be saved there.

Step 2. User assignment

Each job needs to be assigned to at least one user. These users are created and managed at the BO CMC level. Job-level rights are granted at this step; if a user tries to execute a job that is not assigned to him/her, the job run ends with an error message. Four types of users can be defined for the FIM application; the table below provides the details.

Step 3. Define Source Properties

The source details are selected in this step. A source can be a previously created data store or a flat file. When using a flat file, there is a provision to preview the data (100 rows). When a data store is selected, you also need to select the application.

Step 4. Define Target Properties

The target details are selected in this step. Once the data store is selected, the application to which the data will be loaded is also selected at this stage.

Step 5. Field Mapping

The field mapping between the source and target is done at this stage, and any data transformation rules are defined in this step as well.
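Conceptually, this step pairs each source field with a target field and optionally applies a transformation rule on the way through. A minimal Python sketch of that idea, with purely illustrative field names and rules (not FIM's internal representation):

```python
# Hypothetical mapping table and transformation rules.
FIELD_MAP = {"ACCT": "ACCOUNT", "CC": "COSTCENTER", "AMT": "AMOUNT"}
TRANSFORMS = {"AMOUNT": lambda v: round(float(v), 2)}  # example rule

def map_row(source_row: dict) -> dict:
    """Rename fields per the mapping and apply any transformation rules."""
    target_row = {}
    for src_field, tgt_field in FIELD_MAP.items():
        value = source_row[src_field]
        transform = TRANSFORMS.get(tgt_field)
        target_row[tgt_field] = transform(value) if transform else value
    return target_row

print(map_row({"ACCT": "1001", "CC": "C100", "AMT": "1234.567"}))
# {'ACCOUNT': '1001', 'COSTCENTER': 'C100', 'AMOUNT': 1234.57}
```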

Job Run Details

The job run history is available for each job, and for the most recent run it is possible to view the data that was uploaded. The screenshot below provides a sample job run history.

Mapping table history – Audit compliance

Changes made to the mapping table are captured and remain available for future reference, which makes the system compliant with audit requirements. In the screenshot above, account 1001 was mapped to 'ABCD' by User 1 on 3/22/12 and was updated on 3/27/12 by User 2, who remapped it to account 'PQRS'.
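The behavior amounts to an append-only audit log: every remapping adds a new entry with the user and date, rather than overwriting the previous one. A conceptual Python sketch of that idea, using the example from the screenshot (field names are illustrative):

```python
from datetime import date

mapping_history = []  # append-only log of mapping changes

def remap(account: str, target: str, user: str, on: date) -> None:
    """Record a mapping change without discarding earlier entries."""
    mapping_history.append(
        {"account": account, "mapped_to": target, "user": user, "date": on}
    )

remap("1001", "ABCD", "User 1", date(2012, 3, 22))
remap("1001", "PQRS", "User 2", date(2012, 3, 27))

for entry in mapping_history:   # the full history remains available for audit
    print(entry)
```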

Import and Export of Jobs

Jobs created in one system can easily be moved to another using the Import and Export functionality in the FIM application, so moving jobs between Development, Quality, and Production is an easy task. Exporting a job is a two-step, wizard-driven process.

Select the job and choose Export; please refer to the screenshot below.

Make sure you move the dependencies when you move a job (you need to import the data stores before or along with the job). If a data store has already been moved, there is no need to move it again. You can move multiple jobs in a single export process. The file will be saved in XML format.
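Since the export is plain XML, it can also be inspected outside FIM if needed. Below is a minimal sketch using Python's standard library; the element names in a real export depend on FIM's schema, which is not documented here, so check an actual export file.

```python
import xml.etree.ElementTree as ET

tree = ET.parse("fim_export.xml")   # path to an exported file (example name)
root = tree.getroot()

# List whatever top-level objects (jobs, data stores) the export contains.
for child in root:
    print(child.tag, child.attrib)
```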

Importing a job is a three-step, wizard-driven process done from the same place (please refer to the screenshot above). If a selected object is already present in the target system, you have the option not to import it during the last step.

Jobs can also be copied within the application using the Save As option on the job, and jobs can be run both from the main window and from the individual job level.

Wish list

1. I would really like an option to clear the FIM tables from the front end where I load the data. For example, consider loading data into BPC from a flat file using FIM. If you load the same data again, a drill-through from BPC will show all the data that was ever loaded, which defeats the purpose of drill-through from the BPC application side. Ideally there should be an option to delete the tables from the FIM interface.

2. An option to trigger processes on the application side once a FIM job completes. For example, if you want to run default logic on the BPC side after a FIM job completes, you currently need to do this from BPC. It would be great to have a way to define succeeding actions after job execution, as sketched below.
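For illustration, here is a rough Python sketch of the kind of "succeeding actions" hook item 2 asks for; the function names are placeholders, not FIM or BPC APIs.

```python
from typing import Callable, Iterable

def run_fim_job() -> None:              # placeholder for the FIM job run
    print("FIM job finished")

def run_bpc_default_logic() -> None:    # placeholder for a BPC-side action
    print("BPC default logic triggered")

def run_with_followups(job: Callable[[], None],
                       followups: Iterable[Callable[[], None]]) -> None:
    """Run a job, then trigger each configured succeeding action."""
    job()
    for action in followups:
        action()

run_with_followups(run_fim_job, [run_bpc_default_logic])
```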

To conclude, FIM is a very user-friendly ETL tool that requires minimal IT support. It is very useful when data from multiple systems needs to be brought to a common place, and it enables integration between EPM tools for closed-loop performance management.
