Channel: Process Integration (PI) & SOA Middleware

New SAP HANA Cloud Integration (HCI) content category


Dear Community Members,

 

We are pleased to announce that a new content category for SAP HANA Cloud Integration (HCI) has been created in the Process Integration & SOA Middleware community on SCN.

 

The aim is to give everyone interested quick and easy access, and to consolidate everything about SAP integration (on premise, in the cloud, and hybrid) in a single place. Everyone will be able to filter the content area and easily display only the HCI content if they want to.

 

We will be transferring existing content as soon as we have the go-ahead from the SCN strategy team, and you will soon see the list grow.

 

You are invited to share new content (blogs, documents and questions) related to HANA Cloud Integration in the PI community and associate it with the HCI category. Since there has been inappropriate use of categories in other spaces, we kindly ask you to apply the category only if your content is related to HCI. Also, please remember to tag your content to help others find it.

 

New to HANA Cloud Integration? Follow the Getting started with HANA Cloud Integration guide and let us know if you have any questions.

 

Also do not forget to join the HANA Cloud Integration info days world tour starting in Germany on July 30, 2014! Follow Udo Platzer to find out when new locations are confirmed.

 

No opportunity to travel? No worries - join the HANA Cloud Integration webcasts and enjoy a new event with Q&A every month!


SFTP using custom adapter module Part1 - Receiver


We have often faced challenges using the SFTP protocol in SAP PI, especially in the days when there was no standard solution for it. Back then we used workarounds based on shell scripts or bought third-party adapters. Now that SAP itself provides a standard SFTP adapter as of SP 8 and higher service packs for PI 7.11 (Here), using SFTP in SAP PI has become much simpler.

 

However, older service packs still have to rely on custom solutions or third-party adapters, so this blog should be helpful for them. In this article I will explain how SFTP can be handled using a custom adapter module.

 

For simplicity I created two separate modules: one for the sender (to read files from the SFTP server) and one for the receiver (to create files on the SFTP server). This blog covers the receiver module; I will write another blog for the sender.

To build these modules and to connect to the SFTP server I used the JSch library from JCraft, mainly because of its rich set of methods and the ease of using them in custom classes. Moreover, I found that JSch can handle several authentication methods: password-based, public key, and host-based. There are lots of examples on http://www.jcraft.com/jsch showing how to use these classes. For instance, below is a snippet of one of the methods (authenticating with a password and connecting to the SFTP server) that I used in the receiver module.

 

import com.jcraft.jsch.*;
import java.io.*;

try {
    JSch jsch = new JSch();
    Session session = jsch.getSession(Username, HostName, Portno);
    session.setPassword(Password);
    session.setConfig("StrictHostKeyChecking", "no");
    session.connect();

    Channel channel = session.openChannel("sftp");
    channel.connect();
    ChannelSftp sftpChannel = (ChannelSftp) channel;
    sftpChannel.cd(Directory);
    Audit.addAuditLogEntry(amk, AuditLogStatus.SUCCESS, "Connection to SFTP location successful");

    // Determine the output file name
    String outfname = filename;
    if (Outfilename != null) {
        outfname = Outfilename;
    }
    if (Timestamp != null) {
        // Insert the current timestamp before the file extension
        Date date = new Date();
        SimpleDateFormat dateFormatter = new SimpleDateFormat("yyyyMMddHHmmssSSS");
        String dfmt = dateFormatter.format(date) + ".";
        outfname = outfname.replace(".", dfmt);
    }

    // Optionally write the file under a temporary name first
    String toutfname = outfname;
    if (tempfname != null) {
        toutfname = toutfname + ".tmp";
        Audit.addAuditLogEntry(amk, AuditLogStatus.SUCCESS, "Writing file with temporary name");
    }
    sftpChannel.put(isXML, toutfname);

    // Rename the temporary file to its final name once writing is complete
    if (tempfname != null) {
        try {
            sftpChannel.rename(toutfname, outfname);
            Audit.addAuditLogEntry(amk, AuditLogStatus.SUCCESS, "File renaming is successful");
        } catch (Exception e) {
            Audit.addAuditLogEntry(amk, AuditLogStatus.SUCCESS, "Module exception caught:");
            throw new ModuleException(e);
        }
    }
    sftpChannel.exit();
    session.disconnect();
    Audit.addAuditLogEntry(amk, AuditLogStatus.SUCCESS, outfname + " written successfully in " + Directory);

    // Set the content as a byte array into the payload
    xpld.setContent(inpbyt);
    // Set the principal data that represents the message to be processed
    inputModuleData.setPrincipalData(msg);
} catch (Exception e) {
    Audit.addAuditLogEntry(amk, AuditLogStatus.SUCCESS, "Module exception caught:");
    throw new ModuleException(e);
}

Likewise, we can perform various operations, for example creating the file with a temporary name while it is being written and renaming it to the final file name once writing is completed. Once the module code is complete and deployed in SAP PI, the next step is to use it. I used it in the module section of the standard File adapter. With this, my interface calls the custom adapter module to post the file to the SFTP server, but it also runs the standard processing sequence of the File adapter, creating an additional file at the location configured on the Parameters tab of the standard File adapter. In the longer run, this additional file can serve as an archive of the data sent to the SFTP server; if that is not needed, we can simply pass an empty payload to it for data-security reasons (the module code would have to be modified accordingly to pass an empty payload to the standard CallSapAdapter processing step of the module section).


For content conversion we can use the standard bean StrictXml2PlainBean and include it in the module section of the File adapter before our custom SFTP module.


2014-07-10_182635.jpg


And for monitoring, we can add audit logs at every step inside the module code; these will be visible in the communication channel monitoring logs.

CC.jpg

 

References :

 

http://www.sdn.sap.com/irj/scn/go/portal/prtroot/docs/library/uuid/c0b39e65-981e-2b10-1c9c-fc3f8e6747fa?overridelayout=t…

JSch - Java Secure Channel - Examples

SAP PI SALESFORCE BULK API


In this blog I'm going to give an idea of how we can achieve the Bulk API using the SOAP adapter and CCBPM in PI. Salesforce supports three APIs for data integration:

1. SOAP API (synchronous) – supports only 200 records per call for operations like insert/update/delete.

2. REST API (synchronous) – also supports only 200 records per call.

3. BULK API (asynchronous) – supports 10,000 records per message.

The SOAP API is already well explained in this link by Prasanna; I tried the Bulk API using the SOAP adapter with CCBPM in PI.


BULK API

It is useful when you are bulk inserting, updating or upserting records in Salesforce. An interface that uses the Bulk API performs the three operations below.

1. Create Job

2. Add Job

3. Close Job


Logical Flow:
logical diagram.PNG


Please check this link for more information on the Bulk API: https://www.salesforce.com/us/developer/docs/api_asynch/index_Left.htm#CSHID=asynch_api_quickstart_requests_intro.htm|StartTopic=Content%2Fasynch_api_quickstart_requests_intro.htm|SkinName=webhelp . A sample example is provided there for loading data using cURL; here I have tried to compare the cURL example with my development in PI.


To perform the above steps we need a BPM. Here I used CCBPM to run the steps in a single interface.


CCBPM for Bulk API:

CCBPM for Salesforce.png


Step1 - Salesforce Login:

For this we can use a SOAP lookup in message mapping to log in to Salesforce and get the SessionId and ServerURL. Please check the link http://scn.sap.com/community/pi-and-soa-middleware/blog/2006/11/20/webservice-calls-from-a-user-defined-function for the SOAP lookup.


Step2 - Create Job:

Before you send data to perform an operation in Salesforce, you need to prepare the job by specifying which operation you are going to perform on which object. For that you prepare a job XML request. Once you send the request, Salesforce prepares the job and returns a response message with a jobId; using this jobId you can then send the bulk data to perform the operation in Salesforce.
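For reference, the create-job request is a small jobInfo XML document. Below is a minimal plain-Java sketch of building it; the operation and object values are illustrative, and the namespace and element names follow the Salesforce Bulk API documentation:

```java
public class BulkJobXml {
    // Namespace used by the Salesforce Bulk API jobInfo documents
    static final String NS = "http://www.force.com/2009/06/asyncapi/dataload";

    // Build the jobInfo payload posted to .../services/async/30.0/job
    public static String createJob(String operation, String object) {
        return "<?xml version=\"1.0\" encoding=\"UTF-8\"?>"
             + "<jobInfo xmlns=\"" + NS + "\">"
             + "<operation>" + operation + "</operation>"
             + "<object>" + object + "</object>"
             + "<contentType>XML</contentType>"
             + "</jobInfo>";
    }

    public static void main(String[] args) {
        System.out.println(createJob("insert", "Account"));
    }
}
```

In the PI scenario this payload is produced by the request message mapping rather than hand-built, but the target structure is the same.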


Create the request mapping that prepares Job.xml, with a SOAP lookup UDF to get the session id and assign it dynamically to the HTTP headers for the SOAP adapter.

createjob request.png

UDF Snippets: (Assign session id and content type to the headers)

createjob udf.png

Communication Channel:

Channel to send Job.xml to salesforce:

Http url - https://instance.salesforce.com/services/async/30.0/job

Http headers - X-SFDC-Session and Content-Type

cc createjob.png

Use Variable transport binding for dynamic session and content type in http headers which we assigned in UDF (XHeaderName1 & XHeaderName2).

cc createjob headers.png

The response message will contain the jobId; pass this id to the BPM by using an export parameter in the response message mapping.

Response message mapping:

createjob response.png

Use UDF to assign jobid to export parameter for CCBPM.

createjob response udf.png

Create Export parameter in mapping.

createjob export.png


Step 3 - Add Job:

Once you have completed step 2 (create job) in PI, you will have the jobId in CCBPM to send the data for the bulk operation in Salesforce. You need to prepare the add-job URL by appending jobid/batch to the job URL. Here we can send the XML data that performs the operation on the Salesforce object.

Create Request mapping:

Add job request.png

     CCBPM:                                                                          Import Parameter:

Add job bpm.png

UDF Snippets: (Assign session id, content type and http url to the headers)

Add job udf.png

Communication Channel:

Channel to send Job.xml to salesforce:

Http url - TServerLocation

Http headers - X-SFDC-Session and Content-Type

cc addjob.png

Use Variable transport binding for dynamic URL, session and content type in http headers which we assigned in UDF (XHeaderName1, XHeaderName2 & XHeaderName3).

cc addjob headers.png

Step 4 - Close Job:

Once the above steps are completed, we can close the job using the same jobId. Here we need to pass XML data with the status Closed to Salesforce, using the URL with the job id, to close the job.
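A minimal sketch of the close-job body and URL in plain Java; the class and method names are my own, while the state element and the job URL pattern follow the Salesforce Bulk API documentation:

```java
public class BulkJobClose {
    // Namespace used by the Salesforce Bulk API jobInfo documents
    static final String NS = "http://www.force.com/2009/06/asyncapi/dataload";

    // Body posted to the job URL to set the job state to Closed
    public static String closeJob() {
        return "<?xml version=\"1.0\" encoding=\"UTF-8\"?>"
             + "<jobInfo xmlns=\"" + NS + "\">"
             + "<state>Closed</state>"
             + "</jobInfo>";
    }

    // The close call goes to the job URL with the jobId appended
    public static String closeUrl(String instance, String jobId) {
        return "https://" + instance + ".salesforce.com/services/async/30.0/job/" + jobId;
    }

    public static void main(String[] args) {
        System.out.println(closeUrl("instance", "750x000000000001") + "\n" + closeJob());
    }
}
```

In PI, the URL part is what gets placed into TServerLocation via the UDF, and the body comes from the close-job mapping.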

closejob.png

     CCBPM:                                                                                     Import Parameter

closejob bpm.png

closejob map.png

UDF Snippets: (Assign session id, content type and http url to the headers)

closejob udf.png

Communication Channel:

Channel to send Job.xml to salesforce:

Http url - TServerLocation

Http headers - X-SFDC-Session and Content-Type

cc closejob.png

Use variable transport binding for the dynamic URL, session and content type in the HTTP headers which we assigned in the UDF (XHeaderName1, XHeaderName2 & XHeaderName3).

cc closejob headers.png

This blog just gives an idea that the Salesforce Bulk API is possible in PI using the SOAP adapter, parameterized mapping, BPM and variable transport binding. In the next blog I will try to cover a step-by-step procedure for using the Bulk API in PI.

This is my first blog, so please share your thoughts, feedback and your own solutions for the Salesforce Bulk API.

Enabling Component Based Message Alerting in an upgraded PI system from 7.* to 7.31


This blog applies only to systems upgraded from a dual-stack PI to the latest 7.31 (single or dual stack). It covers a small piece of configuration in the NetWeaver Administrator and the ABAP Integration Engine which is buried deep, both in the documentation and in the system itself.

 

When a dual-stack system is upgraded to 7.31, the system offers a choice to switch from the classic ABAP-based Alert Framework, which is enabled by default, to the new feature. This step is often not carried out during the upgrade (the Basis person may not know about the applicability of this new feature and may simply go with the default Alert Framework option). To enable the new component-based message alerting mechanism, which is based purely on the Java stack (but capable of alerting on errors in both stacks), please read ahead.

 

Procedure for Java Central Adapter Engine:

Log on to NWA and navigate to Configuration -> Infrastructure -> Java System Properties -> Services: XPI Service: AF Core.

 

Change the parameter alerting.target from its default value of 0 to 1. Save and restart your Java engine (required).

 

Procedure for the ABAP Integration Engine:

Log on to the ABAP stack, go to transaction SXMB_ADM, choose Integration Engine Configuration and then Configuration.

Add the below parameters with value 1 under the category Monitor

  • ALERTING_IS_ACTIVE
  • ALERTING_TARGET

 

Save and Exit.

 

Further reading -

 

You can refer to the below blog on how to use the Component Based Message Alerting -

Michal's PI tips: Component-Based Message Alerting

 

Hope this helps.

 

Vijayashankar Konam.


Unlock the Value of Your SAP Investments


Date: Wednesday, August 13, 2014
Time: 11:00 am ET / 15:00 UK / 16:00 Central Europe
Duration: 60 minutes

User experience encompasses a variety of aspects, of which UI is just one. The newer SAP Fiori apps are merely the beginning. Learn how you can build your own Fiori-like apps and employ a more intuitive approach to developing business process applications, and learn about advances that let you achieve greater interoperability between your SAP and Microsoft applications, providing your end users with an enhanced user experience.

Attend this webinar to learn:

• How to leverage SAP offerings for a comprehensive UX strategy
• How to build your own Fiori-like apps
• How to leverage third-party tools and partner offerings to complement SAP tools


REGISTER HERE!

Generate Dynamic Queue in PI Java Only.


We have a requirement to generate the queue name dynamically, depending on a key from the inbound XML message.


What for? We could use encoded headers in the sender channel and add &QueueId= to the endpoint URL of the ICO. But the client won't agree to change the URL or add parameters. The client wants one unchangeable endpoint and sends the queue name in the inbound XML message.


What can we do in this situation?


We can use two iFlows. How to chain one iFlow to another is described in this blog.


The first iFlow will have the unchangeable URL and an EOIO sender channel with some queue name.


Receiver channel will be a little tricky.

Channel: SOAP HTTP.

Target URL: http://

ASMA and Transport Binding checked.

Variable Header (XHeaderName1): TServerLocation

Variable Header (XHeaderName2): TAuthKey


We will set these variables in the operation mapping of this first iFlow.

TServerLocation is used to set the receiver channel's target URL with the &QueueId= parameter; TAuthKey is used to set the credentials of the user that will run the second iFlow.

Also add a View Authorization Key with username and password:

Key: username

Password: the user's password.


Receiver Channel Configuration in first Iflow:


receiverChannelConfig.png

UDF for setting these values (var1 is used for QUEUE_NAME):


@LibraryMethod(title="GenerateQueue", description="", category="FL_local", type=ExecutionType.SINGLE_VALUE)
public String calculate1(@Argument(title="") String var1, Container container) throws StreamTransformationException {

    // Get the dynamic configuration from the container
    DynamicConfiguration conf = (DynamicConfiguration) container
            .getTransformationParameters()
            .get(StreamTransformationConstants.DYNAMIC_CONFIGURATION);

    // Put the URL value from the input under the TServerLocation key
    DynamicConfigurationKey key = DynamicConfigurationKey.create("http://sap.com/xi/XI/System/SOAP", "TServerLocation");
    String varURL = "http://[host:port]/XISOAPAdapter/MessageServlet?senderService=TEST_BC&interface=SIOA_AAA&interfaceNamespace=http://some.ns&QueueId=Q" + var1;
    conf.put(key, varURL);

    // Put the authorization key name under the TAuthKey key
    key = DynamicConfigurationKey.create("http://sap.com/xi/XI/System/SOAP", "TAuthKey");
    conf.put(key, "username");

    // Return the data for mapping to the output
    return var1;
}

Use this code in the operation mapping of the first iFlow.

But this has a limitation on the TServerLocation length: 200 characters maximum. So don't give your queues and interfaces long names if you want to implement this.
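Since over-long values break at runtime, it can help to build the URL the same way the UDF does and check the 200-character ceiling up front. A small standalone sketch (host, service and interface names are illustrative):

```java
public class QueueUrl {
    // Observed limit on the TServerLocation value
    static final int MAX_TSERVERLOCATION = 200;

    // Assemble the endpoint URL the same way the UDF does
    public static String build(String hostPort, String senderService,
                               String iface, String ns, String queueKey) {
        return "http://" + hostPort + "/XISOAPAdapter/MessageServlet"
             + "?senderService=" + senderService
             + "&interface=" + iface
             + "&interfaceNamespace=" + ns
             + "&QueueId=Q" + queueKey;
    }

    // True if the URL fits within the TServerLocation limit
    public static boolean fits(String url) {
        return url.length() <= MAX_TSERVERLOCATION;
    }

    public static void main(String[] args) {
        String url = build("host:50000", "TEST_BC", "SIOA_AAA", "http://some.ns", "07");
        System.out.println(url + " -> fits: " + fits(url));
    }
}
```

Running this for each planned queue/interface name combination tells you in advance whether a name needs shortening.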


Then in second Iflow you need to configure Sender Channel:

Channel: SOAP HTTP

Keep Headers, Use Encoded Headers, Use Query String checked

QoS: EOIO with a DummyQueueName (this will be overridden by the receiver channel of the first iFlow)

Remaining configuration is straightforward.

 

Have fun

Bridges Construction Site (3/3): SAP PI bridges - "exotic" and recommendations.


Previous posts:

Bridges Construction Site (1/3): Asynchronous - synchronous bridge.

Bridges Construction Site (2/3): Synchronous - asynchronous bridge.


----------------------------------------------------------------------------------------------------------------------------------------------------------------

nbridge4.jpg


Hello again, colleagues!

 

Let's finish reviewing solutions to the problem of integration in a heterogeneous interface landscape using SAP PI bridges.

 

Contents:

 

4. Using System Acknowledgement in Async-Sync bridges (from PI version 7.3).

 

Starting from SAP PI 7.3, a couple of new modules were added to the bridge construction pool for sending a technical delivery confirmation to the initial system. In "normal" (non-bridge) interfaces, the communication channel returns a technical confirmation message automatically if the initial system requested one. This confirmation is called a System Acknowledgement.

 

In the case of an asynchronous-synchronous bridge, automatic sending of such a confirmation does not work.

 

To send this confirmation in SAP PI 7.3 you need to use following modules:

 

  • SendAckBean - used to send the system acknowledgement to the initial system. This module must be the last in the chain of modules. You must also use CloneMessageBean.
  • CloneMessageBean - used to replace the original message with a copy. All further modules in the chain work with the copy, leaving the original message intact. In the context of an asynchronous-synchronous bridge, this module must be used only together with SendAckBean.

 

So the first module keeps the initial message intact (the technical header is the most important part), and the second module uses this saved message to create the System Acknowledgement.

 

To use these modules in the communication channel, go to the «Module» tab and enter the following values:


 

Module name                    Type                     Module key
AF_Modules/CloneMessageBean    Local Enterprise Bean    key1
.....                          Local Enterprise Bean    all other modules
AF_Modules/SendAckBean         Local Enterprise Bean    key2

 

«Module Key» can be anything. These modules do not have any settings.


5. Async-Sync bridge via mapping lookup.

 

I would also like to mention the possibility to create an asynchronous-synchronous bridge via mapping look-up functionality.

It is not a "real" bridge, but it really looks like one.

 

Diagram of look-up bridge:

bridge22.jpg

 

Pros:

 

  • Convenient for a small amount of data.
  • Starting with SAP PI 7.1, RFC and JDBC lookups are included in the standard mapping functions of the Message Mapping editor and do not require knowledge of Java.

 

Cons:

 

  • Requires knowledge of Java (for RFC and JDBC in PI versions up to 7.0, and for any PI version if you need other adapters).
  • Performance strongly depends on the data volume.
  • Mixes message mapping and transport/routing in one Message Mapping object.
  • Since you are using Java, it is custom development.

 

I must mention again that the main purpose of lookups is to obtain additional data from an external system during message mapping. Lookups were not originally designed for bridges.

 

You need to create the following objects to build the bridge:

bridge23.jpg

Configured RFC Lookup in the mapping looks like this:

sapbridge_example_57_c.jpg

 

You need to configure a standard asynchronous interface with all communication channels and routing rules in the Integration Directory. Additionally, you need to create one more communication channel (which will be used for RFC-lookup call) and use this communication channel in the Interface Determination settings.

 

If everything is configured correctly, then RFC-call will be executed during the mapping and results will be seen in the target message.

 

More details about the look-up can be found here:

Usage of SOAP Lookup and RFC Lookup in a Single Mapping Program

Adding Lookups to Mapping Programs

 

 

6. Recommendations for choosing a bridge variant.

 

So, what kind of bridge should you choose for your particular interface? This question leaves much room for creativity, but I'll try to give some recommendations:

 

  • First, look at the interface and try to understand: is it possible to do without the bridge at all? Yes, this is the main question. It might be easier to change the logic of one of the connected systems than to start using modules and increase the complexity of the interface.

       This is one of the "golden" rules of working with PI: the simpler the interface, the faster and more reliably it works.

 

  • Look at the source system and understand the type of bridge: asynchronous-synchronous or  synchronous-asynchronous.

 

  • Choose the "main" communication channel for the bridge. In general it is preferable to use the receiver communication channel: the message is then processed in PI in normal mode and changes its properties in one place only, just before the adapter for the target system. But this may not be possible in some situations, for example when the communication channel is already configured and shared by multiple interfaces, or when there is no way to add the necessary modules to the adapter (safety rules, an adapter already loaded with tons of modules, etc.). In such circumstances, put all bridge modules in the sender communication channel.

 

  • Check the amount and complexity of the information passing through the asynchronous-synchronous bridge. If you only need the values of one or two fields from ERP, a database or a web service, use a mapping lookup. If you need more complex information, you need a full bridge.

 

  • Do we need a mapping? If yes, on one side or on both sides? Then check whether that kind of mapping can be used in the selected bridge variant.

 

And the main recommendation: have no fear and go experimenting, and you will get what you need!

 

7. Additional information.

 

Official information about bridges with modules on help.sap.com (SAP Netweaver 7.4):
Configuring Async/Sync and Sync/Async Bridge in the JMS Adapter

Step-by-step instruction and example from SAP Official on SDN:
«Async/Sync and Sync/Async Bridge Scenarios Configuring Async/Sync Bridge and Sync/Async Bridge
without BPM»

-----------------------------------------------------------------------------------------------------------------------------

So, this is all that I have about bridges for now. I hope these posts will help you build the best integration in the SAP world.

 

Links to previous posts:

Bridges Construction Site (1/3): Asynchronous - synchronous bridge.

Bridges Construction Site (2/3): Synchronous - asynchronous bridge.

 

 

With best regards,

Alexey Petrov

Freelance Integration Expert

Comparison on Application Interface Framework (AIF) and Process Orchestration (PO)


    SAP Application Interface Framework ( AIF )

 

  • A powerful framework for interface implementation, monitoring  and error handling
  • A proven solution with more than six years of development and customer history
  • A cross-industry solution with customers from 22 industries

 

      Interface Implementation (Design Time)

  • Interface implementation mainly through Customizing menus
  • Easy access to relevant data or functions of underlying SAP application
  • Re-use of interface components (e.g. checks, mappings, actions)
  • Supports variants of interfaces (exceptions or additional steps)
  • Independent implementation and testing of interface components and interface variants

      Monitoring and Error-Handling (Runtime & Operations)

  • Business user monitoring (power user)
  • Customizable authorizations for interface access
  • Overview of interface status

                  AIF.png

 

 

      Application Interface Framework Benefits :

  • Fast implementation of Interfaces
  • Enforcement of implementation guidelines
  • Re-use of components in multiple interfaces
  • Allows versioning of interfaces
  • Unification of required skills
  • Test tool for automated testing
  • Tool-supported documentation of interfaces
  • Transparency of interfaces within area of responsibility
  • User-friendly Interface Information
  • Ability to correct errors within their system on User Friendly Screens
  • Capabilities for mass error handling
  • Highlighted Errors in Interface Documents which offer forward navigation to issue
  • User Alerts in case of Errors
  • Interfaces with significantly reduced implementation costs
  • Efficient Interface Monitoring leads to reduced monitoring costs
  • Secure compliance by providing a multi-layer authorization concept
  • Restrict changes of interface data down to field level
  • Central compliance report to track changes
  • Transparency and governance throughout complete interface development life cycle

 

SAP Process Integration / Process Orchestration

 

SAP Process Integration / Process Orchestration provides an easy and flexible middleware platform to design, model, implement, execute and monitor business processes, ensuring seamless connectivity to many different business and technical systems and applications (SAP and non-SAP).


  

     PO.png

 

  • Centralized governance, maintenance, mapping, routing and monitoring
  • De-coupled runtimes and distributed load
  • Separating pure messaging from process orchestration
  • Localization: Reduced message travel time and reduced network load in global networks
  • Localization: Local AAEs serve local business needs
  • Business continuity: Minimize planned downtime via switch over
  • Security: Secure deployment of AAE instances in DMZ enforcing network isolation for B2B scenarios
  • Standard based interoperability with SAP and Non-SAP apps – on premises and in the Cloud

 

******************************************************************************************************************************************

          “I already have PI, does it still make sense for me to use AIF?” - Yes

          “I want AIF, does it still make sense for me to use PI?” – Yes

******************************************************************************************************************************************


Joint usage of PI and AIF: PI/PO is an integration platform; AIF is an integration add-on for your business applications


         AIF and PO.png

 

As a general recommendation for an IT landscape with both business applications and integration platforms/middleware:

  • No business logic in the integration platform!
  • No integration logic in the business applications!
  • Business logic goes into AIF, not PI (typical business logic patterns: calculations, validations, checks)
  • Integration logic goes into PI, not AIF (typical integration logic patterns: mapping of data structures from different sender formats to the AIF
    format, routing of messages, connectivity to various protocols (FTP, RFC, ….))

 

In such a landscape, the typical AIF user is a business or functional user, whereas the PO/PI user is a technical user.

 

With both PO/PI and AIF combined in one landscape, you can have two (field/structure) mappings in one integration scenario.


      Example.png

Which Tool for which mapping


SAP PI / PO - (Structure) Mapping

  1. Technical Mapping
  2. Mainly format conversions (example: file structure to web service)
  3. No business logic in mapping
  4. Error Handling for Mapping Errors by Technical User

 

SAP AIF – Field Mapping

  1. Business Level Mapping
  2. Contains Business logic
    • Checks (Example: „is company code valid?“)
    • Validations and Derivations (Example: derive account number from document data)
  3. No technical format conversions
  4. Error Handling for Mapping Errors by Business User


I have PO / PI and AIF: where should I perform mappings?

The answer: typically in both (reason: a clean, future-proof integration architecture).

 

The data structures are the decision-making factor for whether to go for PO/PI or AIF. Ask questions like:

In which data structure does the data arrive in PI? (All format conversions are typically performed in PI.)

In which data structure is the data submitted to the backend system? (All business logic and validations, as well as the final handover into BAPIs etc., are done in AIF.)

 

 

I have many old legacy applications. Can I connect them directly to AIF? No: connectivity to different protocols (files, JMS, …) is a typical task of PO/PI.

 

SAP PI – Value  Mapping  (If you want a landscape-wide value mapping mechanism: use PI)

 

  • Technical Level value Mapping
  • Focused on the complete landscape that is brokered by PI
  • Data Maintenance either directly in PI or via Consulting Solution “Value Mapping Tool”
  • Error Handling for Mapping Errors by Technical User (if Value Mapping Tool use: Business User)

   

SAP AIF – Value Mapping (If you want a business application-centric value mapping mechanism in one or more specific business application(s), use AIF )

 

  • Business Level Value Mapping
  • Focused on the business application it is deployed in
  • Data Maintenance performed by business user
  • Error Handling (missing values, …) by Business User

 

My sincere thanks to Markus Gille for sharing his knowledge for the above blog.


Applexus Brucke and Hana Cloud Integration



Hi All,


Before you get worried by the title HANA Cloud Integration, let me tell you up front that it is not complicated; rather, it is a very easy tool for integrating an SAP R/3 system with cloud applications. In short, our scenario was to send data from the SAP back end to different e-commerce applications like Amazon, IBM WebSphere, Magento, Oracle, etc. Some people might think that only those with a PI background can work with HCI, assuming it is an extension of SAP PI (Process Integration). Just smile: it's not. HCI is a completely new product, not a successor of SAP PI. Those already working with PI can also smile, as you can reuse your SAP PI objects in HCI.


Overview of Applexus Brucke: Applexus Brucke is a product by Applexus Technology Pvt Ltd which connects different e-commerce platforms such as IBM WebSphere, Magento, Hybris and Amazon to the SAP R/3 system via HANA Cloud Integration.

This integration will be a key benefit for e-commerce, as it helps the business keep pace through the following activities.

  • Avoids out-of-stock situations.
  • Decreases the time to replenish inventory.
  • Real-time inventory checks and quick order processing.
  • CRM activities for customer order management and complaint management.



bruckeflow.PNG

Rather than asking why Applexus chose HCI, it is better to ask why not HCI; given below are some reasons to support this.

  • Sends messages from one sender to multiple receivers.
  • Splits complex messages into simple ones and sends them to the receiver.
  • Various connectivity options like SFTP, SOAP and IDoc.
  • Import of already existing integration content.
  • Security and monitoring of messages and channels.
  • Future integration with mobile.


A sample scenario from Applexus Brucke integrating SAP R/3 via HCI, using the SOAP and SFTP adapters in Eclipse.


Requirements: Eclipse Juno, SOAP UI (SmartBear), tenant management node (available on request from SAP).

  1. Install Eclipse and add all the HANA Cloud Integration tools via Help -> Install New Software.
  2. Enter the operations server details (tenant details provided by SAP) under Preferences -> SAP HANA Cloud Integration -> Operations Server.
  3. Create a working set and deploy a sample project to it. A sample integration flow template is given below; we created it for our scenario, which is a SOAP to SFTP connection.


pic1.png

In the figure above you can see a sender and a receiver, linked by the integration process. We can configure the sender channel and the receiver channel by right-clicking on a channel and selecting Configure Channel.

 

An important point to note: to connect HCI with our back end, an SSL certificate has to be imported, which will look like the one given below.


pic2.png


You have to right click on sender and choose import certificate.

To configure the sender and receiver channels, right-click on the SOAP or SFTP link, click Configure Channel and specify the configuration. A sample is given below.

 

Sender channel configurations

pic3.png

Receiver channel configurations

pic4.png


pic5.png


The receiver directory should be an SFTP site. Please refer to the documentation on how to create an SFTP site.

Once you are done with this basic configuration, you can test the scenario with SOAP UI (SmartBear) using a sample WSDL file.


pic6.png


You have to specify the endpoint in SOAP UI. The endpoint can be taken from our integration flow model by choosing Services.


pic7.png

Once this is done, just click the start button and you will see the message arrive on your Eclipse HCI screen and finally at the SFTP destination.

Reading huge delimiter-separated file(s) whose columns may jumble


Dear SCN Members,

 

Recently we had a requirement to read a .csv file from a third-party file server whose columns may jumble; the first row/record tells you the column names. I developed this interface with the approach below. It worked effectively with small and medium-sized files.

 

Reading a delimiter-separated file whose columns may jumble

 

During the testing phase we came to know from the business that we can expect a 15 to 30 MB file, and that only one file will be processed per month. We tried to test the interface with a 30 MB file, and because of this the Java stack went into a hung state. So I was asked to fine-tune the code so that it would be able to process huge files. I want to share my knowledge on this. This concept will help us understand how to tune our code for processing huge files.

 

Follow the steps below to achieve this kind of requirement.

 

Read the CSV file using normal FCC (File Content Conversion).
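For reference, a typical sender-side File Content Conversion setup for such a comma-separated file might look as follows. The parameter keys are the standard FCC ones; the message type name and the generic field names are illustrative, matching the generic Fld1…Fld74 inputs used in the UDF below:

```
Document Name:          MT_Source
Recordset Structure:    Record,*
Record.fieldSeparator:  ,
Record.endSeparator:    'nl'
Record.fieldNames:      Fld1,Fld2,Fld3,...,Fld74
```

Because the columns may arrive in any order, FCC deliberately uses neutral field names; the header row itself is read as the first Record and decoded later in the mapping.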

Map.JPG

 

Pass all source field values and the number of columns as input to the UDF below, get the respective reordered values as output from the UDF, and map those values to the respective target fields.

 

NOTE: The core logic for reordering remains the same as explained in my previous blog, but here, instead of taking the entire payload into an array variable for reordering, I take 10 records at a time, apply the reordering logic and process further.
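Stripped of the PI mapping API, the chunk-wise idea can be sketched in plain Java. The class and method names here are illustrative only; what it reflects from the approach above is the header-to-index lookup and processing 10 records at a time instead of one full-payload array:

```java
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

public class ChunkedReorder {

    // Process 10 data records at a time, as in the UDF below
    static final int CHUNK = 10;

    /**
     * Reorders jumbled CSV columns into the order given by targetOrder.
     * rows.get(0) must be the header row naming the columns as they
     * actually appear in the file.
     */
    static List<String[]> reorder(List<String[]> rows, String[] targetOrder) {
        String[] header = rows.get(0);
        // Map each column name in the file to its actual position
        Map<String, Integer> pos = new HashMap<>();
        for (int i = 0; i < header.length; i++) {
            pos.put(header[i].toLowerCase(), i);
        }
        List<String[]> out = new ArrayList<>();
        // Walk the data rows chunk by chunk instead of all at once
        for (int start = 1; start < rows.size(); start += CHUNK) {
            int end = Math.min(start + CHUNK, rows.size());
            for (int r = start; r < end; r++) {
                String[] src = rows.get(r);
                String[] dst = new String[targetOrder.length];
                for (int c = 0; c < targetOrder.length; c++) {
                    Integer p = pos.get(targetOrder[c].toLowerCase());
                    if (p == null) {
                        throw new IllegalArgumentException("Column name "
                            + targetOrder[c]
                            + " in the file is not matching with our column names");
                    }
                    dst[c] = src[p];
                }
                out.add(dst);
            }
        }
        return out;
    }
}
```

Keeping only the header map and one small chunk in memory at a time is what prevents the single huge working array that hung the Java stack with the 30 MB file.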

 

UDF: I have used the UDF below to pass the values to the appropriate fields in the target structure.

 

public void Colum_Reorg(String[] Fld1, String[] Fld2, String[] Fld3, String[] Fld4, String[] Fld5, String[] Fld6, String[] Fld7, String[] Fld8, String[] Fld9, String[] Fld10, String[] Fld11, String[] Fld12, String[] Fld13, String[] Fld14, String[] Fld15, String[] Fld16, String[] Fld17, String[] Fld18, String[] Fld19, String[] Fld20, String[] Fld21, String[] Fld22, String[] Fld23, String[] Fld24, String[] Fld25, String[] Fld26, String[] Fld27, String[] Fld28, String[] Fld29, String[] Fld30, String[] Fld31, String[] Fld32, String[] Fld33, String[] Fld34, String[] Fld35, String[] Fld36, String[] Fld37, String[] Fld38, String[] Fld39, String[] Fld40, String[] Fld41, String[] Fld42, String[] Fld43, String[] Fld44, String[] Fld45, String[] Fld46, String[] Fld47, String[] Fld48, String[] Fld49, String[] Fld50, String[] Fld51, String[] Fld52, String[] Fld53, String[] Fld54, String[] Fld55, String[] Fld56, String[] Fld57, String[] Fld58, String[] Fld59, String[] Fld60, String[] Fld61, String[] Fld62, String[] Fld63, String[] Fld64, String[] Fld65, String[] Fld66, String[] Fld67, String[] Fld68, String[] Fld69, String[] Fld70, String[] Fld71, String[] Fld72, String[] Fld73, String[] Fld74, ResultList AcctEntryId, ResultList ValueDate, ResultList Entity, ResultList Folder, ResultList DenomCcy, ResultList FunctCcy, ResultList TradeId, ResultList TradeVersion, ResultList TradeType, ResultList LinkId, ResultList LinkVersion, ResultList Counterparty, ResultList AcctProcess, ResultList CashAccount, ResultList AcctEntryDescr, ResultList ReversesAcctEntryId, ResultList AcctEntrySubSysId, ResultList DenomCcyAmount, ResultList FunctCcyAmount, ResultList AcctRule, ResultList AcctChartAcctNumber, ResultList AcctChartDescr, ResultList AcctChartAlternateId, ResultList AcctCategory, ResultList IsParentAcct, ResultList ParentAcctChartAcctNumber, ResultList ParentAcctChartDescr, ResultList AcctChartType, ResultList EntityAltId1, ResultList EntityAltId2, ResultList EntityAltId3, 
ResultList EntityAltId4, ResultList EntityAltId5, ResultList CptyAltId1, ResultList CptyAltId2, ResultList CptyAltId3, ResultList CptyAltId4, ResultList CptyAltId5, ResultList FolderAltId1, ResultList FolderAltId2, ResultList FolderAltId3, ResultList FolderAltId4, ResultList FolderAltId5, ResultList TradeGroup, ResultList TradeAcctCode, ResultList TradeAcctCode2, ResultList CounterpartyTradeId, ResultList AlternateTradeId, ResultList TradeGroupAltId, ResultList TradeAcctCodeAltId, ResultList LinkGroup, ResultList LinkAcctCode, ResultList LinkAcctCode2, ResultList LinkGroupAltId, ResultList LinkAcctCodeAltId, ResultList AcctChartAltId1, ResultList AcctChartAltId2, ResultList AcctChartAltId3, ResultList AcctChartAltId4, ResultList AcctChartAltId5, ResultList ParentAcctChartAltId1, ResultList ParentAcctChartAltId2, ResultList ParentAcctChartAltId3, ResultList ParentAcctChartAltId4, ResultList ParentAcctChartAltId5, ResultList CashAccountAltID1, ResultList CashAccountAltID2, ResultList CashAccountAltID3, ResultList CashAccountAltID4, ResultList CashAccountAltID5, ResultList OriginalValueDate, ResultList ProductType, ResultList ProductSubType, ResultList AlternateLinkId, Container container) throws StreamTransformationException{
AbstractTrace trace = container.getTrace();    int counter =0;    int NoofRec = Fld1.length;    int l1=0;int v1 =10 ;int c1=0;    int count = (int)Math.ceil(NoofRec/10.0);       for(int l2=0;l2<count;l2++)    { 
if(count>1)
{
if(l2 == (count-1)){
v1 =(NoofRec-(10*l2)) +l2  ;
}}
if(count ==1)
{v1=NoofRec ;}    
String[][] temp = new String[v1][ Noofcols*v1];    for(int i=1;i<v1;i++)    {        temp[0][0] =Fld1[0] ;  //AcctEntry0d        temp[0][1] =Fld2[0] ;        temp[0][2] =Fld3[0] ;        temp[0][3] =Fld4[0] ;        temp[0][4] =Fld5[0] ;        temp[0][5] =Fld6[0] ;        temp[0][6] =Fld7[0] ;        temp[0][7] =Fld8[0] ;        temp[0][8] =Fld9[0] ;        temp[0][9] =Fld10[0] ;             temp[0][10] =Fld11[0] ;        temp[0][11] =Fld12[0] ;        temp[0][12] =Fld13[0] ;        temp[0][13] =Fld14[0] ;        temp[0][14] =Fld15[0] ;        temp[0][15] =Fld16[0] ;        temp[0][16] =Fld17[0] ;        temp[0][17] =Fld18[0] ;        temp[0][18] =Fld19[0] ;        temp[0][19] =Fld20[0] ;             temp[0][20] =Fld21[0] ;        temp[0][21] =Fld22[0] ;        temp[0][22] =Fld23[0] ;        temp[0][23] =Fld24[0] ;        temp[0][24] =Fld25[0] ;        temp[0][25] =Fld26[0] ;        temp[0][26] =Fld27[0] ;        temp[0][27] =Fld28[0] ;        temp[0][28] =Fld29[0] ;        temp[0][29] =Fld30[0] ;             temp[0][30] =Fld31[0] ;        temp[0][31] =Fld32[0] ;        temp[0][32] =Fld33[0] ;        temp[0][33] =Fld34[0] ;        temp[0][34] =Fld35[0] ;        temp[0][35] =Fld36[0] ;        temp[0][36] =Fld37[0] ;        temp[0][37] =Fld38[0] ;        temp[0][38] =Fld39[0] ;        temp[0][39] =Fld40[0] ;             temp[0][40] =Fld41[0] ;        temp[0][41] =Fld42[0] ;        temp[0][42] =Fld43[0] ;        temp[0][43] =Fld44[0] ;        temp[0][44] =Fld45[0] ;        temp[0][45] =Fld46[0] ;        temp[0][46] =Fld47[0] ;        temp[0][47] =Fld48[0] ;        temp[0][48] =Fld49[0] ;        temp[0][49] =Fld50[0] ;             temp[0][50] =Fld51[0] ;        temp[0][51] =Fld52[0] ;        temp[0][52] =Fld53[0] ;        temp[0][53] =Fld54[0] ;        temp[0][54] =Fld55[0] ;        temp[0][55] =Fld56[0] ;        temp[0][56] =Fld57[0] ;        temp[0][57] =Fld58[0] ;        temp[0][58] =Fld59[0] ;        temp[0][59] =Fld60[0] ;             temp[0][60] =Fld61[0] ;      
  temp[0][61] =Fld62[0] ;        temp[0][62] =Fld63[0] ;        temp[0][63] =Fld64[0] ;        temp[0][64] =Fld65[0] ;        temp[0][65] =Fld66[0] ;        temp[0][66] =Fld67[0] ;        temp[0][67] =Fld68[0] ;        temp[0][68] =Fld69[0] ;        temp[0][69] =Fld70[0] ;             temp[0][70] =Fld71[0] ;        temp[0][71] =Fld72[0] ;        temp[0][72] =Fld73[0] ;        temp[0][73] =Fld74[0] ;      temp[i][counter++] =Fld1[i+l1] ;  //AcctEntryId        temp[i][counter++] =Fld2[i+l1] ;        temp[i][counter++] =Fld3[i+l1] ;        temp[i][counter++] =Fld4[i+l1] ;        temp[i][counter++] =Fld5[i+l1] ;        temp[i][counter++] =Fld6[i+l1] ;        temp[i][counter++] =Fld7[i+l1] ;        temp[i][counter++] =Fld8[i+l1] ;        temp[i][counter++] =Fld9[i+l1] ;        temp[i][counter++] =Fld10[i+l1] ;             temp[i][counter++] =Fld11[i+l1] ;        temp[i][counter++] =Fld12[i+l1] ;        temp[i][counter++] =Fld13[i+l1] ;        temp[i][counter++] =Fld14[i+l1] ;        temp[i][counter++] =Fld15[i+l1] ;        temp[i][counter++] =Fld16[i+l1] ;        temp[i][counter++] =Fld17[i+l1] ;        temp[i][counter++] =Fld18[i+l1] ;        temp[i][counter++] =Fld19[i+l1] ;        temp[i][counter++] =Fld20[i+l1] ;             temp[i][counter++] =Fld21[i+l1] ;        temp[i][counter++] =Fld22[i+l1] ;        temp[i][counter++] =Fld23[i+l1] ;        temp[i][counter++] =Fld24[i+l1] ;        temp[i][counter++] =Fld25[i+l1] ;        temp[i][counter++] =Fld26[i+l1] ;        temp[i][counter++] =Fld27[i+l1] ;        temp[i][counter++] =Fld28[i+l1] ;        temp[i][counter++] =Fld29[i+l1] ;        temp[i][counter++] =Fld30[i+l1] ;             temp[i][counter++] =Fld31[i+l1] ;        temp[i][counter++] =Fld32[i+l1] ;        temp[i][counter++] =Fld33[i+l1] ;        temp[i][counter++] =Fld34[i+l1] ;        temp[i][counter++] =Fld35[i+l1] ;        temp[i][counter++] =Fld36[i+l1] ;        temp[i][counter++] =Fld37[i+l1] ;        temp[i][counter++] =Fld38[i+l1] ;        
temp[i][counter++] =Fld39[i+l1] ;        temp[i][counter++] =Fld40[i+l1] ;        temp[i][counter++] =Fld41[i+l1] ;        temp[i][counter++] =Fld42[i+l1] ;        temp[i][counter++] =Fld43[i+l1] ;        temp[i][counter++] =Fld44[i+l1] ;        temp[i][counter++] =Fld45[i+l1] ;        temp[i][counter++] =Fld46[i+l1] ;        temp[i][counter++] =Fld47[i+l1] ;        temp[i][counter++] =Fld48[i+l1] ;        temp[i][counter++] =Fld49[i+l1] ;        temp[i][counter++] =Fld50[i+l1] ;        temp[i][counter++] =Fld51[i+l1] ;        temp[i][counter++] =Fld52[i+l1] ;        temp[i][counter++] =Fld53[i+l1] ;        temp[i][counter++] =Fld54[i+l1] ;        temp[i][counter++] =Fld55[i+l1] ;        temp[i][counter++] =Fld56[i+l1] ;        temp[i][counter++] =Fld57[i+l1] ;        temp[i][counter++] =Fld58[i+l1] ;        temp[i][counter++] =Fld59[i+l1] ;        temp[i][counter++] =Fld60[i+l1] ;             temp[i][counter++] =Fld61[i+l1] ;        temp[i][counter++] =Fld62[i+l1] ;        temp[i][counter++] =Fld63[i+l1] ;        temp[i][counter++] =Fld64[i+l1] ;        temp[i][counter++] =Fld65[i+l1] ;        temp[i][counter++] =Fld66[i+l1] ;        temp[i][counter++] =Fld67[i+l1] ;        temp[i][counter++] =Fld68[i+l1] ;        temp[i][counter++] =Fld69[i+l1] ;        temp[i][counter++] =Fld70[i+l1] ;        temp[i][counter++] =Fld71[i+l1] ;        temp[i][counter++] =Fld72[i+l1] ;        temp[i][counter++] =Fld73[i+l1] ;        temp[i][counter++] =Fld74[i+l1] ;        counter =0;                c1 =i ;        }        l1 = l1+c1 ;        for (int j=0,k = 0; k < Noofcols; k++)        {            if (temp[j][k].equalsIgnoreCase("AcctEntryId") ||                    temp[j][k].equalsIgnoreCase("ValueDate")     ||                             temp[j][k].equalsIgnoreCase("Entity") ||                    temp[j][k].equalsIgnoreCase("Folder") ||                    temp[j][k].equalsIgnoreCase("DenomCcy") ||                    temp[j][k].equalsIgnoreCase("FunctCcy") ||                    
temp[j][k].equalsIgnoreCase("TradeId") ||                    temp[j][k].equalsIgnoreCase("TradeVersion") ||                    temp[j][k].equalsIgnoreCase("TradeType") ||                    temp[j][k].equalsIgnoreCase("LinkId") ||                    temp[j][k].equalsIgnoreCase("LinkVersion") ||                    temp[j][k].equalsIgnoreCase("Counterparty") ||                    temp[j][k].equalsIgnoreCase("AcctProcess") ||                    temp[j][k].equalsIgnoreCase("CashAccount") ||                    temp[j][k].equalsIgnoreCase("AcctEntryDescr") ||                    temp[j][k].equalsIgnoreCase("ReversesAcctEntryId") ||                    temp[j][k].equalsIgnoreCase("AcctEntrySubSysId") ||                    temp[j][k].equalsIgnoreCase("DenomCcyAmount") ||                    temp[j][k].equalsIgnoreCase("FunctCcyAmount")  ||                    temp[j][k].equalsIgnoreCase("AcctRule") ||                    temp[j][k].equalsIgnoreCase("AcctChartAcctNumber") ||                    temp[j][k].equalsIgnoreCase("AcctChartDescr") ||                    temp[j][k].equalsIgnoreCase("AcctChartAlternateId") ||                    temp[j][k].equalsIgnoreCase("AcctCategory") ||                    temp[j][k].equalsIgnoreCase("IsParentAcct") ||                    temp[j][k].equalsIgnoreCase("ParentAcctChartAcctNumber") ||                    temp[j][k].equalsIgnoreCase("ParentAcctChartDescr") ||                    temp[j][k].equalsIgnoreCase("AcctChartType") ||                    temp[j][k].equalsIgnoreCase("EntityAltId1") ||                    temp[j][k].equalsIgnoreCase("EntityAltId2") ||                    temp[j][k].equalsIgnoreCase("EntityAltId3") ||                    temp[j][k].equalsIgnoreCase("EntityAltId4") ||                    temp[j][k].equalsIgnoreCase("EntityAltId5") ||                    temp[j][k].equalsIgnoreCase("CptyAltId1") ||                    temp[j][k].equalsIgnoreCase("CptyAltId2") ||                    temp[j][k].equalsIgnoreCase("CptyAltId3") ||         
           temp[j][k].equalsIgnoreCase("CptyAltId4") ||                    temp[j][k].equalsIgnoreCase("CptyAltId5") ||                    temp[j][k].equalsIgnoreCase("FolderAltId1") ||                    temp[j][k].equalsIgnoreCase("FolderAltId2") ||                    temp[j][k].equalsIgnoreCase("FolderAltId3") ||                    temp[j][k].equalsIgnoreCase("FolderAltId4") ||                    temp[j][k].equalsIgnoreCase("FolderAltId5") ||                    temp[j][k].equalsIgnoreCase("TradeGroup") ||                    temp[j][k].equalsIgnoreCase("TradeAcctCode") ||                    temp[j][k].equalsIgnoreCase("TradeAcctCode2") ||                    temp[j][k].equalsIgnoreCase("CounterpartyTradeId") ||                    temp[j][k].equalsIgnoreCase("AlternateTradeId") ||                    temp[j][k].equalsIgnoreCase("TradeGroupAltId") ||                    temp[j][k].equalsIgnoreCase("TradeAcctCodeAltId") ||                    temp[j][k].equalsIgnoreCase("LinkGroup") ||                    temp[j][k].equalsIgnoreCase("LinkAcctCode") ||                    temp[j][k].equalsIgnoreCase("LinkAcctCode2") ||                    temp[j][k].equalsIgnoreCase("LinkGroupAltId") ||                    temp[j][k].equalsIgnoreCase("LinkAcctCodeAltId") ||                    temp[j][k].equalsIgnoreCase("AcctChartAltId1") ||                    temp[j][k].equalsIgnoreCase("AcctChartAltId2") ||                    temp[j][k].equalsIgnoreCase("AcctChartAltId3") ||                    temp[j][k].equalsIgnoreCase("AcctChartAltId4") ||                    temp[j][k].equalsIgnoreCase("AcctChartAltId5") ||                    temp[j][k].equalsIgnoreCase("ParentAcctChartAltId1") ||                    temp[j][k].equalsIgnoreCase("ParentAcctChartAltId2") ||                    temp[j][k].equalsIgnoreCase("ParentAcctChartAltId3") ||                    temp[j][k].equalsIgnoreCase("ParentAcctChartAltId4") ||                    temp[j][k].equalsIgnoreCase("ParentAcctChartAltId5") ||             
       temp[j][k].equalsIgnoreCase("CashAccountAltID1") ||                    temp[j][k].equalsIgnoreCase("CashAccountAltID2") ||                    temp[j][k].equalsIgnoreCase("CashAccountAltID3") ||                    temp[j][k].equalsIgnoreCase("CashAccountAltID4") ||                    temp[j][k].equalsIgnoreCase("CashAccountAltID5") ||                    temp[j][k].equalsIgnoreCase("OriginalValueDate") ||                    temp[j][k].equalsIgnoreCase("ProductType") ||                    temp[j][k].equalsIgnoreCase("ProductSubType") ||                    temp[j][k].equalsIgnoreCase("AlternateLinkId"))                         {                 if(temp[j][k].equalsIgnoreCase("AcctEntryId")) {for(int m1 = 1; m1 < v1; m1++){AcctEntryId.addValue(temp[m1][k]);}}             else if(temp[j][k].equalsIgnoreCase("ValueDate"))   {for(int m2 = 1; m2 < v1; m2++){ValueDate.addValue(temp[m2][k]);}}             else if(temp[j][k].equalsIgnoreCase("Entity"))      {for(int m3 = 1; m3 < v1; m3++){Entity.addValue(temp[m3][k]);}}             else if(temp[j][k].equalsIgnoreCase("Folder"))      {for(int m4 = 1; m4 < v1; m4++){Folder.addValue(temp[m4][k]);}}             else if(temp[j][k].equalsIgnoreCase("DenomCcy"))    {for(int m5 = 1; m5 < v1; m5++){DenomCcy.addValue(temp[m5][k]);}}                       else if(temp[j][k].equalsIgnoreCase("FunctCcy"))    {for(int m6 = 1; m6 < v1; m6++){FunctCcy.addValue(temp[m6][k]);}}             else if(temp[j][k].equalsIgnoreCase("TradeId"))     {for(int m7 = 1; m7 < v1; m7++){TradeId.addValue(temp[m7][k]);}}             else if(temp[j][k].equalsIgnoreCase("TradeVersion")){for(int m8 = 1; m8 < v1; m8++){TradeVersion.addValue(temp[m8][k]);}}             else if(temp[j][k].equalsIgnoreCase("TradeType"))   {for(int m9 = 1; m9 < v1; m9++){TradeType.addValue(temp[m9][k]);}}             else if(temp[j][k].equalsIgnoreCase("LinkId"))      {for(int m10 = 1; m10 < v1; m10++){LinkId.addValue(temp[m10][k]);}}                            else 
if(temp[j][k].equalsIgnoreCase("LinkVersion"))          {for(int m11 = 1; m11 < v1; m11++){LinkVersion.addValue(temp[m11][k]);}}             else if(temp[j][k].equalsIgnoreCase("Counterparty"))         {for(int m12 = 1; m12 < v1; m12++){Counterparty.addValue(temp[m12][k]);}}             else if(temp[j][k].equalsIgnoreCase("AcctProcess"))         {for(int m13 = 1; m13 < v1; m13++){AcctProcess.addValue(temp[m13][k]);}}             else if(temp[j][k].equalsIgnoreCase("CashAccount"))        {for(int m14 = 1; m14 < v1; m14++){CashAccount.addValue(temp[m14][k]);}}             else if(temp[j][k].equalsIgnoreCase("AcctEntryDescr"))        {for(int m15 = 1; m15 < v1; m15++){AcctEntryDescr.addValue(temp[m15][k]);}}                       else if(temp[j][k].equalsIgnoreCase("ReversesAcctEntryId")){for(int m16 = 1; m16 < v1; m16++){ReversesAcctEntryId.addValue(temp[m16][k]);}}             else if(temp[j][k].equalsIgnoreCase("AcctEntrySubSysId"))    {for(int m17 = 1; m17 < v1; m17++){AcctEntrySubSysId.addValue(temp[m17][k]);}}             else if(temp[j][k].equalsIgnoreCase("DenomCcyAmount"))        {for(int m18 = 1; m18 < v1; m18++){DenomCcyAmount.addValue(temp[m18][k]);}}             else if(temp[j][k].equalsIgnoreCase("FunctCcyAmount"))        {for(int m19 = 1; m19 < v1; m19++){FunctCcyAmount.addValue(temp[m19][k]);}}             else if(temp[j][k].equalsIgnoreCase("AcctRule"))            {for(int m20 = 1; m20 < v1; m20++){AcctRule.addValue(temp[m20][k]);}}                       else if(temp[j][k].equalsIgnoreCase("AcctChartAcctNumber"))    {for(int m21 = 1; m21 < v1; m21++){AcctChartAcctNumber.addValue(temp[m21][k]);}}             else if(temp[j][k].equalsIgnoreCase("AcctChartDescr"))            {for(int m22 = 1; m22 < v1; m22++){AcctChartDescr.addValue(temp[m22][k]);}}             else if(temp[j][k].equalsIgnoreCase("AcctChartAlternateId"))    {for(int m23 = 1; m23 < v1; m23++){AcctChartAlternateId.addValue(temp[m23][k]);}}             else 
if(temp[j][k].equalsIgnoreCase("AcctCategory"))            {for(int m24 = 1; m24 < v1; m24++){AcctCategory.addValue(temp[m24][k]);}}             else if(temp[j][k].equalsIgnoreCase("IsParentAcct"))            {for(int m25 = 1; m25 < v1; m25++){IsParentAcct.addValue(temp[m25][k]);}}                       else if(temp[j][k].equalsIgnoreCase("ParentAcctChartAcctNumber")){for(int m26 = 1; m26 < v1; m26++){ParentAcctChartAcctNumber.addValue(temp[m26][k]);}}             else if(temp[j][k].equalsIgnoreCase("ParentAcctChartDescr"))    {for(int m27 = 1; m27 < v1; m27++){ParentAcctChartDescr.addValue(temp[m27][k]);}}             else if(temp[j][k].equalsIgnoreCase("AcctChartType"))            {for(int m28 = 1; m28 < v1; m28++){AcctChartType.addValue(temp[m28][k]);}}             else if(temp[j][k].equalsIgnoreCase("EntityAltId1"))            {for(int m29 = 1; m29 < v1; m29++){EntityAltId1.addValue(temp[m29][k]);}}             else if(temp[j][k].equalsIgnoreCase("EntityAltId2"))            {for(int m30 = 1; m30 < v1; m30++){EntityAltId2.addValue(temp[m30][k]);}}                            else if(temp[j][k].equalsIgnoreCase("EntityAltId3"))            {for(int m31 = 1; m31 < v1; m31++){EntityAltId3.addValue(temp[m31][k]);}}             else if(temp[j][k].equalsIgnoreCase("EntityAltId4"))            {for(int m32 = 1; m32 < v1; m32++){EntityAltId4.addValue(temp[m32][k]);}}             else if(temp[j][k].equalsIgnoreCase("EntityAltId5"))            {for(int m33 = 1; m33 < v1; m33++){EntityAltId5.addValue(temp[m33][k]);}}             else if(temp[j][k].equalsIgnoreCase("CptyAltId1"))                {for(int m34 = 1; m34 < v1; m34++){CptyAltId1.addValue(temp[m34][k]);}}             else if(temp[j][k].equalsIgnoreCase("CptyAltId2"))                {for(int m35 = 1; m35 < v1; m35++){CptyAltId2.addValue(temp[m35][k]);}}                       else if(temp[j][k].equalsIgnoreCase("CptyAltId3"))                {for(int m36 = 1; m36 < v1; m36++){CptyAltId3.addValue(temp[m36][k]);}}         
    else if(temp[j][k].equalsIgnoreCase("CptyAltId4"))                {for(int m37 = 1; m37 < v1; m37++){CptyAltId4.addValue(temp[m37][k]);}}             else if(temp[j][k].equalsIgnoreCase("CptyAltId5"))                {for(int m38 = 1; m38 < v1; m38++){CptyAltId5.addValue(temp[m38][k]);}}             else if(temp[j][k].equalsIgnoreCase("FolderAltId1"))            {for(int m39 = 1; m39 < v1; m39++){FolderAltId1.addValue(temp[m39][k]);}}             else if(temp[j][k].equalsIgnoreCase("FolderAltId2"))            {for(int m40 = 1; m40 < v1; m40++){FolderAltId2.addValue(temp[m40][k]);}}                                        else if(temp[j][k].equalsIgnoreCase("FolderAltId3"))            {for(int m41 = 1; m41 < v1; m41++){FolderAltId3.addValue(temp[m41][k]);}}             else if(temp[j][k].equalsIgnoreCase("FolderAltId4"))            {for(int m42 = 1; m42 < v1; m42++){FolderAltId4.addValue(temp[m42][k]);}}             else if(temp[j][k].equalsIgnoreCase("FolderAltId5"))            {for(int m43 = 1; m43 < v1; m43++){FolderAltId5.addValue(temp[m43][k]);}}             else if(temp[j][k].equalsIgnoreCase("TradeGroup"))                {for(int m44 = 1; m44 < v1; m44++){TradeGroup.addValue(temp[m44][k]);}}             else if(temp[j][k].equalsIgnoreCase("TradeAcctCode"))            {for(int m45 = 1; m45 < v1; m45++){TradeAcctCode.addValue(temp[m45][k]);}}                       else if(temp[j][k].equalsIgnoreCase("TradeAcctCode2"))            {for(int m46 = 1; m46 < v1; m46++){TradeAcctCode2.addValue(temp[m46][k]);}}             else if(temp[j][k].equalsIgnoreCase("CounterpartyTradeId"))    {for(int m47 = 1; m47 < v1; m47++){CounterpartyTradeId.addValue(temp[m47][k]);}}             else if(temp[j][k].equalsIgnoreCase("AlternateTradeId"))        {for(int m48 = 1; m48 < v1; m48++){AlternateTradeId.addValue(temp[m48][k]);}}             else if(temp[j][k].equalsIgnoreCase("TradeGroupAltId"))        {for(int m49 = 1; m49 < v1; m49++){TradeGroupAltId.addValue(temp[m49][k]);}}    
         else if(temp[j][k].equalsIgnoreCase("TradeAcctCodeAltId"))        {for(int m50 = 1; m50 < v1; m50++){TradeAcctCodeAltId.addValue(temp[m50][k]);}}                            else if(temp[j][k].equalsIgnoreCase("LinkGroup"))            {for(int m51 = 1; m51 < v1; m51++){LinkGroup.addValue(temp[m51][k]);}}             else if(temp[j][k].equalsIgnoreCase("LinkAcctCode"))        {for(int m52 = 1; m52 < v1; m52++){LinkAcctCode.addValue(temp[m52][k]);}}             else if(temp[j][k].equalsIgnoreCase("LinkAcctCode2"))        {for(int m53 = 1; m53 < v1; m53++){LinkAcctCode2.addValue(temp[m53][k]);}}             else if(temp[j][k].equalsIgnoreCase("LinkGroupAltId"))        {for(int m54 = 1; m54 < v1; m54++){LinkGroupAltId.addValue(temp[m54][k]);}}             else if(temp[j][k].equalsIgnoreCase("LinkAcctCodeAltId"))    {for(int m55 = 1; m55 < v1; m55++){LinkAcctCodeAltId.addValue(temp[m55][k]);}}                       else if(temp[j][k].equalsIgnoreCase("AcctChartAltId1"))    {for(int m56 = 1; m56 < v1; m56++){AcctChartAltId1.addValue(temp[m56][k]);}}             else if(temp[j][k].equalsIgnoreCase("AcctChartAltId2"))    {for(int m57 = 1; m57 < v1; m57++){AcctChartAltId2.addValue(temp[m57][k]);}}             else if(temp[j][k].equalsIgnoreCase("AcctChartAltId3"))    {for(int m58 = 1; m58 < v1; m58++){AcctChartAltId3.addValue(temp[m58][k]);}}             else if(temp[j][k].equalsIgnoreCase("AcctChartAltId4"))    {for(int m59 = 1; m59 < v1; m59++){AcctChartAltId4.addValue(temp[m59][k]);}}             else if(temp[j][k].equalsIgnoreCase("AcctChartAltId5"))    {for(int m60 = 1; m60 < v1; m60++){AcctChartAltId5.addValue(temp[m60][k]);}}                                           else if(temp[j][k].equalsIgnoreCase("ParentAcctChartAltId1"))    {for(int m61 = 1; m61 < v1; m61++){ParentAcctChartAltId1.addValue(temp[m61][k]);}}             else if(temp[j][k].equalsIgnoreCase("ParentAcctChartAltId2"))    {for(int m62 = 1; m62 < v1; 
m62++){ParentAcctChartAltId2.addValue(temp[m62][k]);}}             else if(temp[j][k].equalsIgnoreCase("ParentAcctChartAltId3"))    {for(int m63 = 1; m63 < v1; m63++){ParentAcctChartAltId3.addValue(temp[m63][k]);}}             else if(temp[j][k].equalsIgnoreCase("ParentAcctChartAltId4"))    {for(int m64 = 1; m64 < v1; m64++){ParentAcctChartAltId4.addValue(temp[m64][k]);}}             else if(temp[j][k].equalsIgnoreCase("ParentAcctChartAltId5"))    {for(int m65 = 1; m65 < v1; m65++){ParentAcctChartAltId5.addValue(temp[m65][k]);}}                       else if(temp[j][k].equalsIgnoreCase("CashAccountAltID1"))        {for(int m66 = 1; m66 < v1; m66++){CashAccountAltID1.addValue(temp[m66][k]);}}             else if(temp[j][k].equalsIgnoreCase("CashAccountAltID2"))        {for(int m67 = 1; m67 < v1; m67++){CashAccountAltID2.addValue(temp[m67][k]);}}             else if(temp[j][k].equalsIgnoreCase("CashAccountAltID3"))        {for(int m68 = 1; m68 < v1; m68++){CashAccountAltID3.addValue(temp[m68][k]);}}             else if(temp[j][k].equalsIgnoreCase("CashAccountAltID4"))        {for(int m69 = 1; m69 < v1; m69++){CashAccountAltID4.addValue(temp[m69][k]);}}             else if(temp[j][k].equalsIgnoreCase("CashAccountAltID5"))        {for(int m70 = 1; m70 < v1; m70++){CashAccountAltID5.addValue(temp[m70][k]);}}                       else if(temp[j][k].equalsIgnoreCase("OriginalValueDate"))    {for(int m71 = 1; m71 < v1; m71++){OriginalValueDate.addValue(temp[m71][k]);}}             else if(temp[j][k].equalsIgnoreCase("ProductType"))        {for(int m72 = 1; m72 < v1; m72++){ProductType.addValue(temp[m72][k]);}}             else if(temp[j][k].equalsIgnoreCase("ProductSubType"))        {for(int m73 = 1; m73 < v1; m73++){ProductSubType.addValue(temp[m73][k]);}}                                                           else              {                for(int m74 = 1; m74 < v1; m74++)                {                    AlternateLinkId.addValue(temp[m74][k]);                
}              }                   }            else            {            throw new StreamTransformationException( " Column name " + temp[j][k] + " in the file is not matching with our column     names");            }        }// for loop
}//close brace for top forloop

 

We have tested the interface with different file sizes ranging from 15 MB to 30 MB. We got the expected output in all cases.

 

Regards

Venkat

Re-Cap SAP HANA Cloud Integration Info Days


Another great event, the SAP HANA Cloud Integration Info Days at SAP AG. This event was all about sharing, learning and meeting fellow professionals. It was an open-minded day where we had time for discussions and to get hands-on experience with SAP HANA Cloud Integration.

 

Benefits

SAP made the statement that it wants to hear more from its partners. That's why several partners (including myself and Igor Mitrovic) were asked what we think of SAP HANA Cloud Integration: what we see as its key benefits and its missing capabilities. Broadly, everybody agrees on the benefits:

  1. Reduced cost of ownership
  2. Centralized technical monitoring
  3. Pre-packaged integration content


Missing capabilities

The main missing capabilities:

  1. Connectivity options (additional adapters)
  2. Monitoring needs improvement (e.g. payload retrieval, adapter & performance monitoring)

 

One of the missing capabilities we did not all fully agree upon at first was the B2B/EDI add-on. SAP HCI definitely needs a B2B/EDI add-on to be a full-blown middleware tool. The B2B/EDI add-on needs to cover at least ANSI X12, EDIFACT, TRADACOMS and plain text. Without this functionality SAP HANA Cloud Integration will not take the next step forward; it will be a middleware tool to connect existing SAP on-demand/on-premise applications, and that's it.

 

License/subscription model

Another great discussion was held on this topic; customers can choose between two different subscription models:

  1. Standard Edition
    1. 10 Connections
    2. 100 GB Bandwidth
  2. Professional Edition
    1. 25 connections
    2. 1 TB Bandwidth
    3. Additional connections possible

 

From my perspective this is a hybrid form of a cloud-based subscription model. When I use a cloud application, I really want to pay per usage; in this case I would like to pay per connection, perhaps in combination with bandwidth. For example, when I already have 10 connections and want to add 1 new connection, I need to move up to the Professional Edition. That is overkill for just one connection.

 

SAP is also busy exploring a new variant, "HCI for Partners". This will be a low-priced subscription especially for partners. This HCI instance cannot be used in a productive environment and is intended for demos, developing integration content or developing new adapters.

 

What do you think?

Is a B2B/EDI add-on a must-have or a nice-to-have in SAP HCI? Is the subscription model fine? Please share your thoughts.

PI 7.4 - Adapter Module Creation using EJB 3.0


Hello,


The scenario described below is very common, and a custom adapter module needs to be implemented for it.

The example will also show how to build an adapter module in SAP PI 7.4 using EJB 3.0.

 

The Scenario:

The business requirement is to transfer files from one folder to multiple folders on FTP. The files are consolidated in a single folder on the source side and need to be placed into different folders on the destination side according to their file names.

 

For this Scenario: Configurations are done in Integration Directory only and no ESR objects are built.


ESR objects are not required because the files are transferred unchanged from one directory to another, and no message mapping or data processing takes place. Hence configuration is done directly in the ID and involves communication channels, receiver determination, interface determination, sender agreement and receiver agreement.

 

The two major components involved in the scenario are:

 

Sender Communication Channel:

 

This channel needs to pick up multiple files from the source directory. For this, either the "Advanced Selection for Source File" option in the sender communication channel can be used, or *.* can be entered in File Name to pick up all files from the source directory.

 

Advanced Selection for Source File option is shown below:

1.png


Receiver Communication Channel:

 

This channel needs to place the files in different folders on FTP. For a single communication channel to place the files dynamically in different folders, an adapter module needs to be developed which selects the target directory name and file name scheme at runtime.

 

The Adapter Module FileAdapModule is highlighted below.

2.png

 

Adapter Module Creation Process for SAP PI 7.4:

 

The configurations and steps required to build adapter module is mentioned in detail below:

 

  1. Install the latest version of NWDS 7.3 EHP 1. The latest version of NWDS available is NWDS 7.3 EHP1 SP 12.
  2. Use the JDK 1.6.0_07 or higher in NWDS for building the project.
  3. The following JAR files need to be imported from the SAP PI system to create the adapter module. The JAR files and their locations on the PI system are listed below:
  4. EJB Project and Session Bean creation in NWDS: For building an Adapter Module EJB Project needs to be created and stateless session bean needs to be created in the EJB project which holds the business logic.


              In NWDS: Go to File – New – EJB Project

            3.png
         

              Enter the details for EJB as mentioned below and click on Next: Select the EJB Module Version as 3.0 and add it to EAR project.

          4.png         

 

          Uncheck the client interfaces and classes checkbox. Select the “Generate ejb-jar.xml” deployment descriptor checkbox and click on Finish.

          5.png

 

          An EJB and an EAR project will be created as shown below. The EJB project holds the business logic, whereas the EAR project is required for deployment.

          6.png

    5.  Adding JAR files to the EJB project: as the adapter module needs the standard SAP JAR files for compilation, these JAR files need to be added to the EJB project.


    Put all the JAR files mentioned in Step 3 in a Folder in your local machine.

    Right click on EJB project and select Build Path – Configure Build Path.

 

      7.png

 

        Select the Libraries tab and click on Add Variable

    8.png

 

    Select the JAR Files Folder from machine and click on Extend:

      9.png

 

    Select all the JAR Files and click on OK.

    10.png

    6.  Adding the Stateless Session bean into the EJB Project:

 

        Right click on the EJB Project and select New – Session Bean(EJB 3.x)

        11.png

         

          Enter the session bean details as mentioned below:

  •           Enter the Java package and Class names.
  •           Select the State type as “Stateless”.
  •           Select the checkboxes for Remote and Local Interfaces.

          12.png

        Click on Next and Finish. Session bean will be added in the mentioned package in the EJB Project as shown below:

          13.png

 

    Open the FileAdapModule.java file; the logic to determine the directory and file names is added in this Java file. Add the method ModuleData process(ModuleContext moduleContext, ModuleData inputModuleData) to the bean as shown below. Reference code is attached in the attachment section.

14.png

   

    Right-click on the EJB project and build the project.
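The routing logic inside such a process() method can be sketched in plain Java. This is only an illustrative sketch: the prefix-to-folder table and the folder names below are invented for the example, and the SAP module API classes (ModuleContext, ModuleData and the message interfaces) are deliberately omitted so the snippet stays self-contained. In the real bean, process() would call a helper like this and write the result into the message's dynamic configuration so that the receiver file adapter (with ASMA enabled) uses the chosen directory and file name.

```java
import java.util.LinkedHashMap;
import java.util.Map;

// Hypothetical routing helper. In the real module, process() would call
// targetDirectory(fileName) and store the result under the dynamic
// configuration attributes used by the File adapter (File Name / Directory).
class FileRouter {

    // Example prefix-to-folder table (assumed values, for illustration only)
    private static final Map<String, String> ROUTES = new LinkedHashMap<>();
    static {
        ROUTES.put("INV", "/target/invoices");
        ROUTES.put("ORD", "/target/orders");
    }

    private static final String DEFAULT_DIR = "/target/unsorted";

    /** Selects the target directory based on the file name prefix. */
    static String targetDirectory(String fileName) {
        for (Map.Entry<String, String> route : ROUTES.entrySet()) {
            if (fileName.startsWith(route.getKey())) {
                return route.getValue();
            }
        }
        return DEFAULT_DIR; // no prefix matched
    }
}
```

Inside the bean, the incoming file name is available to the module because the ASMA File Name attribute is enabled on the sender channel (see the communication channel settings later in this post).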

 

7. In the EJB Project maintain the META-INF Files as mentioned below for correct deployment. The ejb-jar.xml file is attached for reference:

    15.png

Maintain ejb-j2ee-engine.xml file as below:

Capture.PNG


 

8. Exporting the EJB JAR file: the source code and class files of the EJB are stored in a JAR file, which is deployed on the J2EE server.

 

    Right click on the EJB Project and select Export – SAP EJB JAR File.

    16.png

    Select the EJB Project and Destination as mentioned below:

    17.png

 

    9. EAR Project Settings:

      This Project holds the EJB project JAR file and has SAP Standard EAR Content in the form of application-j2ee-engine.xml file.

18.png         

 

    Open the application-j2ee-engine.xml file and insert the source code attached below. Build the EAR project in the same way as the EJB project.

 

10.  Exporting the EAR file for deployment:



  • Right click on the EAR Project and select Export – SAP EAR File.
  • Select the EAR project and destination as mentioned below.

 

      19.png

 

20.png


11. EAR Deployment and Settings: Setting the PI system as the deployment system in NWDS:



  • Go to Window – Preferences – SAP AS Java and click on Add.
  • Provide the Details and system will be added as shown below.

21.png

 

 

 

Deploy the EAR Project on PI system as follows:

 

  • Right click on EAR Project and select Run As – Run on Server.
  • Select the PI system on which deployment needs to be done.
  • Select the EAR project which needs to be deployed and click Finish.


The screenshots for these steps are shown below:

22.png

 

23.png

24.png


In the Deployment view console, the deployment status can be checked. The deployment should complete without any errors.

This is shown in the screenshot below:

 

12.  Communication channel settings:

 

   

Sender communication channel

The source file name needs to be read by the receiver adapter module to select the corresponding directory, so the ASMA attribute File Name must be checked in the sender channel.

Sender File.png

 

Receiver Communication Channel


The adapter module needs to set the directory name and file name dynamically, hence the ASMA checkboxes for File Name and Directory need to be selected.

Receiver File.png

 

  • The custom adapter module (identified by the JNDI name specified in the EJB) needs to be added in the Module tab above the standard SAP module.
  • The directory path is passed as a parameter to the custom adapter module logic, hence it needs to be added in the Module Configuration section as shown below.
  • Then run the scenario; the audit log can be checked in communication channel monitoring and will also contain traces of the custom adapter module call.

26.png

Steer Your Business Operations With Real-time Visibility


The initial promise of Business Process Management (BPM) was that it would enable our businesses to be managed holistically through business process models that were implemented in software. The vision was enticing, but the reality – as always – is more complex. The business reality of restricted budgets, organizational change complexity, M&A activity, and more means that processes will always be implemented using patchworks of systems.


Operational Process Intelligence platforms offer a practical, pragmatic way forward. Working side-by-side with existing systems, they provide real-time insights into end-to-end business process health and performance, and they help operations managers and leaders to make quick, evidence-based decisions to manage risks and tackle problems.

 

The webinar will also include a demo of how SAP® provides real-time intelligence across end-to-end processes with its solution SAP Operational Process Intelligence powered by SAP HANA.

 

Attend this webinar to learn:

  • The business value of Operational Process Intelligence
  • Likely use cases for the technology
  • How to gain real-time insights into your business operations across complex, heterogeneous system landscapes
  • How to better navigate daily operations by knowing the most critical situations in real time and prioritizing workloads accordingly

Date: Wednesday, August 27, 2014
Time: 11:00 am ET / 16:00 UK / 17:00 Central Europe
Duration: 60 minutes

 

 

Register Here:

http://event.on24.com/r.htm?e=779199&s=1&k=AA720D84525B5E08AD65C63C96B61044&cb=on24&partnerref=NA-SCN5

How to Ride The Wave of The Digital Economy With Simplification


Think back a few decades and API was purely a geek term related to developers and software architects – how times have changed. … Today we see APIs as the key enabler for anything a company wants to sell, such as products and services. Think about the business models of Netflix, Amazon Web Services, Google, Fitbit, etc. As the world moves beyond simply providing consumer information on a website to engaging consumers in new modalities including mobile applications, social media, wearables, and other specialized user interfaces, the challenges of providing simple access in a secure and scalable manner have created a perfect storm that breaks the traditional IT governance models.

Join SAP to understand how provisioning scalable and secure APIs can unlock your digital assets to new developer ecosystems and support new business innovations, while at the same time leveraging out-of-the-box integrations with your SAP and non-SAP systems.

 

Attend this webinar to learn:

  • What the API Economy means to SAP & its ecosystem
  • How you can scale and secure the access to your back-end systems to support new digital channels and consumer-oriented applications
  • How you can reuse non-SAP and SAP back-end systems together to support new consumer facing use cases with a standards-based approach
  • What SAP’s strategy is and what key capabilities will be introduced in this exciting new area of API management

 

Date: Wednesday, September 10, 2014

Time: 11:00 am ET / 16:00 UK / 17:00 Central Europe

Duration: 60 minutes

 

Register Here: http://event.on24.com/r.htm?e=779199&s=1&k=AA720D84525B5E08AD65C63C96B61044&cb=on24&partnerref=NA-SCN6

New book: SOA Integration - Enterprise Service Monitoring (LIE, FEH/ECH, AIF)


As more and more new functions from SAP are delivered in the form of Enterprise Services, and many customers develop new interfaces on the basis of ABAP proxies instead of IDocs, it is becoming more and more important to understand how to properly monitor those interfaces. As SAP provides multiple tools for performing these tasks on the SAP backend systems, developers and business users must know when to use each of them. In my new book "SOA Integration - Enterprise Service Monitoring" I try to compare the three most popular Enterprise Service backend system monitoring tools available from SAP:

 

- Local Integration Engine

 

- Forward Error Handling/Error and Conflict Handler

 

- Application and Interface Framework

 

As in my previous books, you will not only find a pure feature comparison of the tools but, best of all, tons of customizing examples for all three technologies. The book does not concentrate on coding as such (with a few exceptions) but tries to show what you can achieve with customizing, so apart from developers it can also be used by key users who are in charge of monitoring interfaces and would like to discover more options in the tools they use every day.

4book_1.png

 

 

Where to buy:

 

- paper edition - Amazon: https://www.createspace.com/4877273

 

- ebook edition for all platforms - coming soon!

 

 

Reference:

 

My previous books:

 

- New book: Mastering IDoc Business Scenarios with SAP XI

 

- Second Edition: Mastering IDoc Business Scenarios with SAP PI

 

- New Book: The Essentials on SAP NetWeaver Process Integration - A SAP Mentor 2010 Series


Using exception behavior of Fix Values for retrying RFCLookup


My first blog

 

In my recent project, I encountered a scenario where I had to retry when the RFC lookup returned an error message, i.e. a BAPI_RETURN structure with a message number. The RFC lookup did not throw any exception but returned a proper BAPI_RETURN message with message type E, so I could not use the default behavior of the RFC lookup to throw an exception.

 

So I mapped the BAPI_RETURN message type through the standard "FixValues" conversion function, did not maintain any conversion values, and selected the option to throw an exception if no value is found. Thus, whenever the RFC lookup returns an error message, the mapping raises an exception and the message is retried every 5 minutes, 3 times by default. This default behavior can be configured for "n" retries before the message is moved to the error queue.
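For comparison, the same fail-and-retry effect can be produced with a small user-defined function instead of the FixValues trick: a UDF that inspects the BAPI return type and throws a RuntimeException makes the mapping fail, which likewise triggers the queue retry. The sketch below is plain Java with an invented class and method name, not a standard API:

```java
// Sketch of a UDF-style check: throws when the RFC lookup reported an error,
// so the mapping fails and the message is retried from the queue.
class BapiReturnCheck {

    /**
     * Returns the type unchanged when the BAPI return type is not an error;
     * throws a RuntimeException (failing the mapping) when the type is "E".
     */
    static String failOnError(String type, String message) {
        if ("E".equals(type)) {
            throw new RuntimeException("RFC lookup returned error: " + message);
        }
        return type;
    }
}
```

In a graphical mapping, the body of failOnError would simply be pasted into a UDF that receives the TYPE and MESSAGE fields of BAPI_RETURN as inputs.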

 

This helped me a lot. So thought of sharing it with the community. May be you might have better way of doing it. If so please comment/share your ideas.

SRM integration with PI using standard content -Part1


Introduction


For integrating SRM with PI, SAP provides standard content that can be downloaded from the Service Marketplace. SAP XI content is one of the best features provided by SAP for integrating two different systems and exchanging data between them using the ESOA methodology, which exchanges messages using proxies from the ABAP back end. By using standard content we can integrate SAP ERP with other systems such as SAP SRM, CRM and SCM. A lot of predefined content is already available that can be reused, which helps customers save time and effort in their integration development projects. A typical use case is that customers use the data types, service interfaces and mappings provided by SAP, so there is no need to develop ESR content. For standard implementations like SAP ERP to SAP SRM, the predefined content works perfectly for development projects, as it contains all the relevant ESR content.


Business Scenario:


The business scenario is to integrate an SAP SRM system with a web-service to exchange purchase order data through PI using the ESOA methodology. The scenario is XI-to-SOAP, asynchronous. Integration using SAP PI uses the standard ABAP runtime of PI for message processing. See the diagram below for a sample interface flow of a standard interface implementation; the application servers may differ depending on the implementation project.

Capture.JPG

Pre-requisites


SAP SRM 7.0

  • Proxy Configurations are in place.
  • SLD Bridge is active and running in RZ70.
  • Import relevant IR content from Service Market Place.
  • Functional configuration relevant for business functionality.


SAP PI 7.11

  • Latest CR is available in SLD SAP_CR 6.6 version.
  • Technical System for SAP ERP and SAP SRM got created using SLD Bridge.
  • Created Business Systems for the corresponding client for both Application systems.
  • Import the IR content relevant for both Application Systems in to ESR.


Systems Involved


The systems involved in this integration are SAP SRM 7.0 and SAP PI 7.1 EHP 1. The SAP integration content has to be imported from the Service Marketplace; let us see the integration content available there for the respective systems involved in our integration.


SAP SRM 7.0


A. Go to SAP Support Portal --> Software Download --> Support Packages and Patches --> Entry by Application Group

Capture1.JPG

B. Go to SAP Application Components --> SAP SRM

Capture2.JPG

C. Go to SAP SRM 7.0

Capture3.JPG


D. Go to Entry by Component -->XI Content-->SRM Server 7.0-->Database Independent.

 

Capture4.JPG

Check the latest available zip and download.


Similarly download XI Content SRM Server IC 7.0

Capture5.JPG


We need to import the integration content to ESR as follows


  • After downloading the Integration content, we should extract and place the file in the respective path for importing "<systemDir>\xi\repository_server\import".
  • After you start the Integration Builder, follow the menu path "Tools -> Import design objects..." to import the XI Content.
  • After a successful import, the Integration Builder moves the imported tpz files into the directory "<systemDir>/xi/repository_server/importedFiles". The Support Package level of imported software component versions is available in the Integration Builder on the "Details" tab for the relevant software component version. This information is important for support if an error occurs.
  • After a successful import, the XI content will be visible under the corresponding software component version in the ESR. 


Capture6.JPG

Configurations at Sender system [SRM]


Pre-requisites


1. The business systems should be based on SAP Web AS 6.20 and above

2. The business systems and your central Integration Server are maintained in the System Landscape Directory (SLD).

 

Use T Code SM59 to create RFC Destination in SRM to connect to PI (PID) system.

Capture7.JPG

Connection between Business System and System Landscape Directory (SRM System) (Transaction SM59)

 

RFC Destination: LCRSAPRFC

Capture8.JPG

Connection test

Capture9.JPG

RFC Destination: SAPSLDAPI

Capture11.JPG

Connection test

 

Capture10.JPG

RFC Destination:DX1CLNT001


Capture12.JPG

Configuring the Business System as a Local Integration Engine (SRM System) (Transaction SXMB_ADM)

 

  1. Run Transaction code SXMB_ADM
  2. Go to Integration Engine Configuration -> Choose Edit
  3. Select Role of Business System: Application system
  4. Corresponding integration server: http://<host>:<port>/sap/xi/engine?type=entry

 

Now click on the Specific Configuration button and create an entry with Category RUNTIME, Parameter IS_URL and Current Value: http://<host>:<port>/sap/xi/engine?type=entry

 

Maintaining the SAP J2EE Connection parameters for LCRSAPRFC and SAPSLDAPI in SAP J2EE Engine (Transaction SM59)

 

  1. Go to the J2EE Engine and choose Cluster --> Server --> Services --> JCo RFC Provider
  2. Under RFC destination specify the following:

 

Program ID: LCRSAPRFC_DX1, Gateway Host (this is your Integration Server host),
Gateway Service, and Number of Processes: 3

  3. Under Repository, specify your Integration Server host as your application server and choose Set.
  4. Finally, maintain the SLD access details in transaction SLDAPICUST.

 

Configurations in PI

RFC Destination in PI

  1. SM59 --> ABAP connections --> type 3 connections --> Provide SRM system details


Capture13.JPG


Connection test

Capture14.JPG

Login to PI Server -> T-code: SXMB_IFR


Capture15.JPG

System Landscape Directory [SLD]

 

For the standard implementation we need to import the latest Component Repository content (SAP_CR) into SAP PI's SLD. The reason is that the latest CR contains the relevant products and software components for the technical systems (the SRM application system).


Capture16.JPG

After the post-installation steps, the Data tab in the SLD shows the Component Repository version, e.g. SAP_CR 3.0 or 4.0. This CR needs to be updated to the latest available level. Refer to SAP Note 669669: Updating the SAP Component Repository in the SLD.

In the SLD we should define the business systems for SRM and for the web-service. Click on the System Landscape Directory option shown in the screen above; it will take you to the screen below.

Capture18.JPG

Click on Business Systems and then New Business System, provide the SRM system details and select the business system type "ABAP". XXXCLNT110 is the business system created for the SRM system. Since we do not have enough details to create a business system for Grainger, we create a business component directly in the Integration Directory instead. Assume all the necessary configurations are completed.

Configuration in Enterprise service repository

In the Enterprise Services Repository we need to import the IR content that we downloaded in section 5.1. Generally we import software components and versions for custom development, but for a standard implementation we import the IR content into the ESR directly. Let us see how to import the IR content.

Capture19.JPG

Fig: import design objects

Capture20.JPG

Fig: import from client or server

If the IR content was downloaded to your desktop, select Client; if it is on the server, select Server. Unzip the IR .ZIP file to obtain the .TPZ files and import all .TPZ files into the ESR.


Capture21.JPG

Fig: ESR Content Import

After importing IR content your ESR should look as follows:

Capture22.JPG

As a developer we don’t need to create any mappings for Interfaces, as SAP has provided extensive pre-delivered IR content, which has similar global data types at both application system ends. But in our case we are not using the SAP system on target side [Web-service]. So we have to define the data type / external definition/ service interface / message mapping / operation mapping. Go to the service interface namespace where purchase order request message type is defined.

Capture24.JPG

As the target system is a web-service, we should import its WSDL [message structure] file into PI. Since the client did not provide the WSDL, the options available are to generate the WSDL from PI or to convert the target structure into an XSD and import it as an external definition in SAP PI. Select External Definition and right-click to import the XSD.

Capture25.JPG

Purchase order request is the standard message type defined by SAP

Capture23.JPG

Service Interface

 

1)    Outbound Asynchronous referring to the source message type.

2)    Inbound Asynchronous referring to the target message type.

 

Sender service interface [SRM]


Capture26.JPG

Receiver Service Interface [Web-service]


Capture27.JPG

Message Mapping

Create message mapping to map the source and target fields and apply the mapping conversions.

Right click on name space and click new--> Mapping objects--> Message mapping

Give the name for message mapping and provide the name space and S/W component version.


Drag and drop the source and target message types in the respective fields.

Do the required mapping between source and target fields


Capture28.JPG

Operation mapping

Give the sender and receiver service interface details along with message mapping.

Capture29.JPG

Continue: Part-2


SRM integration with PI using standard content - Part 2

SRM integration with PI using standard content - Part 2


Configuration in Integration directory

Click on integration directory in PI home page and create the configuration scenario with name “CS_SRM_TO_XXXXX_PurchaseOrder”

Capture30.JPG

Assign sender business system and business component


Sender Communication channel

Create the sender communication channel to establish communication with the SRM system and receive the data. Since we use the standard service interface and proxy programming of the SRM standard content, we use the XI adapter, which uses the PI standard protocol [XI 3.0].

Capture31.JPG

Receiver communication channel

Our requirement is to send the data to the web-service, to achieve this we are using SOAP receiver adapter.


Capture32.JPG

Go to the advanced tab and make the communication channels active.

 

Receiver determination

The receivers of the message are determined in this step. Conditions can be specified, and the receiving systems are selected according to these conditions.

Capture33.JPG

Interface Determination


Capture34.JPG

Receiver agreement

Capture35.JPG

With this we are done with the design and configuration.

 

POs are successfully received in PI

Capture36.JPG

Part 1 SRM integration with PI using standard content -Part1


Referred and useful Blogs:


http://wiki.scn.sap.com/wiki/display/SRM/Overview+of+the+SUS+scenarios+and+interfaces+used

http://scn.sap.com/community/pi-and-soa-middleware/blog/2006/11/14/xipi-data-type-enhancements-standard-business-partner

http://pdpsappireference.blogspot.in/2012/06/srm-standard-datatypes-enhancements-in.html

http://service.sap.com/~sapidb/011000358700002897342004E/CCMConfig10_01_05.pdf

http://service.sap.com/~sapidb/011000358700003992672005E/Config_Guide_CCM200_640doc.pdf

http://service.sap.com/~sapidb/011000358700002897402004E/ServiceProcurementNEW.pdf

http://service.sap.com/~sapidb/011000358700002897382004E/Plan-DrivenwithSInew.pdf



Content Conversion for fixed length files with key field value shorter than key field length


It is common for hierarchical flat files to have a key field indicating the type of structure for each line. For fixed-length files, the value of the key field is normally the same length as the maximum length defined for the key field.

 

However, there are occasions where the value of the key field is shorter than the maximum defined length. Below is an example where the file has 3 different structures - Header, Line and Trailer. The maximum length of the key field is defined as 10, but the actual content is only 6 characters long.

filestruct.png

 

This can cause a potential issue during configuration and execution of content conversion for such a file format. If the fieldFixedLengths parameter is set to 10 for the key field while the keyFieldValue has a length of 6 (i.e. MSGHDR), runtime processing will not be able to match that line. Even if we try to pad the keyFieldValue with additional white space, these spaces are automatically removed during channel activation.

 

A simple technique can be employed to resolve this. It involves splitting the key field into two parts: an actual key field and a dummy field.

 

Using the example above, the key field is split into the following two fields:-

CORCID – length 6

TEMP – length 4

 

The corresponding data type in ESR should be modified to match this split of fields.

 

In the parameter configuration of the content conversion, the field split is reflected as below:-

xml.keyFieldName = CORCID

xml.Header.fieldFixedLengths = 6,4,<Other field lengths>

xml.Header.fieldNames = CORCID,TEMP,<Other fields>

xml.Header.keyFieldValue = MSGHDR

xml.Line.fieldFixedLengths = 6,4,<Other field lengths>

xml.Line.fieldNames = CORCID,TEMP,<Other fields>

xml.Line.keyFieldValue = MSGLIN

xml.Trailer.fieldFixedLengths = 6,4,<Other field lengths>

xml.Trailer.fieldNames = CORCID,TEMP,<Other fields>

xml.Trailer.keyFieldValue = MSGTRL
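The reason the split works can be illustrated in plain Java: the key match is performed against the 6-character CORCID field only, so the 4 trailing padding characters in TEMP never take part in the comparison. The helper below is only an illustration of that matching, using the structure and field names from the example above:

```java
// Illustrates why splitting the 10-character key into CORCID (6) + TEMP (4)
// makes the keyFieldValue match: the comparison uses the 6-character field only.
class KeyFieldSplit {

    /** Extracts the 6-character key (CORCID) from a fixed-length record line. */
    static String keyOf(String line) {
        return line.substring(0, 6);
    }

    /** Classifies a record line by its key field, as content conversion would. */
    static String structureOf(String line) {
        switch (keyOf(line)) {
            case "MSGHDR": return "Header";
            case "MSGLIN": return "Line";
            case "MSGTRL": return "Trailer";
            default:       return "Unknown";
        }
    }
}
```

With the single 10-character key field, the comparison would effectively be against "MSGHDR    " (with padding), which can never equal the 6-character keyFieldValue after activation strips the trailing spaces.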

 

Below are screenshots of an example configuration based on the example. Please note that the example uses the MessageTransformBean module, but this splitting technique is also applicable for FCC configuration on the File/FTP adapter.

 

Module processing sequence

module.png

 

Module parameters

Set conversion class

contenttype.png

Set conversion type and document details

mtbtop.png

Set record structure

mtbstruct.png

Set field configuration for Header structure

mtbhead.png

Set field configuration for Line structure

mtbline.png

Set field configuration for Trailer structure

mtbtrail.png

"It does not work? Refresh the Cache!". Architecture and caching process in SAP PI 7.4


Hi colleagues!

 

Today is a good day to talk about such a thing as cash... Oh no, sorry, I mean "cache" - not money, unfortunately.

What is Cache?

 

Wikipedia tells:

 

Cache (from the French cacher, "to hide"; pronounced [kæʃ]) - an intermediate buffer with fast access, containing the information that is most likely to be requested. Accessing data in the cache is faster than fetching the original data from slower memory or from a remote source, but its volume is considerably limited compared with the original data storage.

 

There is a standard "highly professional" comment on any issue in SAP PI: "Update the cache." Indeed, it often helps. Why?
Let's figure it out.
Let's deal.

 

I assume that when the architecture of SAP PI was being designed, the development team faced two problems:

 

  • Separate the development environment from already productive interface objects, giving developers the opportunity to change an interface without affecting productive operation.
  • Given the number of different components in SAP XI/PI, establish the interaction between them with performance in mind.

 

Most likely, the idea was to use the cache as intermediate storage for all interface objects.

 

 

The overall caching scheme in PI looks like this:

cache1.jpg

Pic. 1: Cache architecture in SAP PI

 

 

How does it work?

 

A cache update starts automatically after an object is activated in the ESR or the Integration Directory; an update can also be initiated manually.

 

When an object is activated in the Enterprise Service Repository, PI performs the following sequence of actions:

 

  • it collects all related and linked objects that also need to be updated;
  • the Cache Refresh Notification Service provides information about the required update to the Integration Directory;
  • all linked and relevant objects in the Integration Directory are searched for;
  • the Cache Refresh Notification Service sends all collected objects to the "consumers", the various cache mechanisms in PI.

 

"Consumers" are the following cache mechanisms:

  • Mapping cache - storage for ready (compiled) mapping programs;
  • CPA cache - all development objects and settings needed for interface execution in the Advanced Adapter Engine;
  • Integration Engine cache - all design and configuration objects required for interface execution in the Integration Engine; the objects needed by the ccBPM runtime are also stored here;
  • Business system cache - the cache of an SAP business system (ABAP) connected to PI; it contains all interface objects necessary for proxy and web-service interfaces.

 

In addition to the chain, there is SLD-caches in the ESR and Integration Directory:

  • The SLD cache in the ESR is updated whenever the developer imports a new software component. 
  • The SLD cache in the ID is updated when the developer imports a business system.

 

The SLD caches can also be updated manually (forced refresh).

 

Monitoring and Administration.

 

There are a number of tools to monitor the status and contents of the cache mechanisms in PI, as well as for manual cache updates.

 

 

The first tool is a part of the administration tools located on the home page of PI:

PI-cache-2.jpg

Select the "Repository" tab, the "Lock and Cache Administration" tool group, and then the "Java Virtual Machine Cache" tool.

PI-cache-3.jpg

As you can see, the virtual machine has several different cache mechanisms. We are particularly interested in the SLD cache. On this page we can force a cache update manually by clicking "Refresh Selected Caches".

 

You can also use "Data Cache" in the same group of tools:

PI-cache-7.jpg

Pic.4: CPACache Update.

 

PI-cache-8.jpg

Pic.5: Mapping Cache update

 

You can force an update of the CPACache and the Mapping Cache here. When you click the "Refresh" link next to the appropriate cache mechanism, PI opens the page of the update tool.

 

CPACache update has two options: "Delta" and "Full".

PI-cache-20.jpg

A delta update tries to minimize the synchronization time and updates only the items that have changed since the last update. The "Full" option performs a full synchronization of the ESR and Integration Directory repositories with the CPACache.

 

TIP: You can copy URLs for CPACache and Mapping Cache updates and use them directly.
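If you want to script such a refresh, the request URL can be assembled as sketched below. The /CPACache/refresh servlet with mode=delta or mode=full is the commonly used CPA cache refresh endpoint on the Java stack, but the host, port and exact path are assumptions here; verify them on your own release before relying on this.

```java
// Builds the CPA cache refresh URL (assumed endpoint: /CPACache/refresh).
class CacheRefreshUrl {

    /** Assembles the refresh URL for the given host, port and mode ("delta" or "full"). */
    static String build(String host, int port, String mode) {
        if (!"delta".equals(mode) && !"full".equals(mode)) {
            throw new IllegalArgumentException("mode must be delta or full");
        }
        return "http://" + host + ":" + port + "/CPACache/refresh?mode=" + mode;
    }
}
```

The resulting URL is then called with any HTTP client, authenticating with a PI service user that has the cache refresh permission.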

 

 

The "Directory" tab contains the same tools:

PI-cache-9.jpg

Pic.7: "Directory" tab - SLD Cache update

PI-cache-10.jpg

Pic.8: "Directory" tab - CPACache refresh

 

These tools are effective when you want to quickly eliminate a possible cache error during interface execution.

But they are not enough for a complete analysis or troubleshooting.

 

You can find more advanced tools in the "Configuration and Monitoring Home" toolset, located on the initial PI page:

Go to http://<host>:<port>/dir and select "Configuration and Monitoring Home" in the lower right corner.

 

You will see the following portal:

PI-cache-15.jpg

Pic.9: Monitoring and customizing tools for SAP PI


You can choose from the menu at the top of the page which part of PI you want to see: Integration Engine, Adapter Engine, Business Process Engine or Mapping Runtime.

PI-cache-151.jpg

Pic.10: Cache Monitor for Integration Engine (ABAP)

PI-cache-16.jpg

Pic. 11: Cache Monitor for Adapter Engine (J2EE)

PI-cache-17.jpg

Pic. 12: Cache Monitor for mappings (J2EE)


Choose "Cache Monitor" and enjoy full control over all caching mechanisms of SAP PI.

PI-cache-18.jpg

Pic.13:Cache Monitor for Adapter Engine — content view

 

Here you can see what is in the cache and synchronize individual objects or the entire cache.

 

Caching and Developer.

 

There are some useful cache-related tools for developers you can use to monitor and control the caching process.

 

Open the Enterprise Service Builder, choose "Environment":

PI-cache-5.jpg

Pic. 14: Cache tools in Enterprise Service Builder


The item "Clear SLD Data Cache" resets all objects cached from the SLD; the next time they are needed, all objects will be read directly from the SLD (not from the cache).

This tool is useful when you have just created a new software component in the SLD, but it is not yet visible in the list during the import to the ESR.

 

The item "Cache Status Overview" calls the monitor for the Cache Refresh Notification Service.

 

For example, the monitor now shows that some development object was activated and this change was successfully provided to Integration Directory cache:

PI-cache-6.jpg

Pic. 15: successful object activation message in the ESR cache monitor


The life of this object can now be traced in the Integration Directory. Run the Integration Builder and choose the menu "Environment" -> "Cache Status Overview":

PI-cache-13.jpg

Pic. 16: successful object activation message in the Integration Directory cache monitor


The monitor shows that the activated object has been successfully transferred to its "consumers": the Integration Cache (ABAP) and the Adapter Engine Cache (J2EE).

 

 

Here you can also refresh the SLD Cache, which stores the list of business systems from the SLD. After clearing it, the next import of business systems reads the business system list directly from the SLD.

 

Caching in ABAP.

 

If you want to inspect the cache of the ABAP part (the Integration Server itself or a connected SAP system), use transaction SXI_CACHE:

PI-cache-22.jpg

Pic. 17: SXI_CACHE transaction


You can also find tools for delta and full cache updates in the transaction menu:

PI-cache-23.jpg

Pic.18: SXI_CACHE menu


That's all about cache universe in PI for now.


If you want more information, you can use these sources:

 

SDN article "How to Handle XPI Caches in SAP NetWeaver 2004s (NW7.0)"

Analyzing the Runtime Cache (help.sap.com, PI 7.4)

Runtime Caches (help.sap.com, NW 7.4)


Good luck in your integrations!


Alexey Petrov

Freelance Integration Expert


