Channel: Process Integration (PI) & SOA Middleware

SAP PI ABAP Proxy to JDBC Synchronous Interface Part III


In this part I will show how to configure the integration runtime for the ABAP proxy, and how to configure the PI Integration Directory (ID).

 

4. Configure Integration Runtime

4.1 SLD Connection from ABAP to AEX SLD

Tcode: SLDAPICUST


 

Tcode: SLDCHECK


 

4.2 RFC Destination to PI SOAP Adapter

Tcode: SM59

Create the destination to PI SOAP XI Adapter


Because I want to use the “XI Message Protocol” in the SOAP adapter, the “Path Prefix” is:

/XISOAPAdapter/MessageServlet?ximessage=true

If you use XI Adapter, the “Path Prefix” should be:

/sap/xi/engine?type=entry
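Either path prefix is appended to the host and HTTP port of the PI system maintained in the RFC destination. A small sketch of the two resulting endpoint URLs (host and port are hypothetical placeholders, not values from this document):

```java
public class XiEndpoint {
    // Hypothetical PI host and HTTP port - replace with your own values.
    static final String HOST = "pi-host.example.com";
    static final int PORT = 50000;

    // SOAP adapter with the XI 3.0 message protocol
    static String soapXiUrl() {
        return "http://" + HOST + ":" + PORT
                + "/XISOAPAdapter/MessageServlet?ximessage=true";
    }

    // Classic XI adapter (integration engine)
    static String xiEngineUrl() {
        return "http://" + HOST + ":" + PORT + "/sap/xi/engine?type=entry";
    }

    public static void main(String[] args) {
        System.out.println(soapXiUrl());
        System.out.println(xiEngineUrl());
    }
}
```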


Maintain the PI user info.

 

4.3 Integration Engine Configuration

Tcode: SXMB_ADM


The value of “IS_URL” is “dest://<RFC_NAME>”, where <RFC_NAME> is the RFC destination maintained above.
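For example, if the RFC destination created in step 4.2 were named PI_AEX_SOAP (a name chosen here purely for illustration), the entry would look like this:

```
Category:  RUNTIME
Parameter: IS_URL
Value:     dest://PI_AEX_SOAP
```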

 

4.4 Register the jobs to auto execute queues


 

4.5 Activate the XI engine service

Note: This step is not needed for a proxy consumer, but it is required for a proxy provider.

Tcode: SICF


 

 

5. Configure ID

5.1 Configure Scenario via wizard


Choose the service interface on the ERP side and the SOAP adapter.


Choose the service interface on the third-party side and the JDBC adapter.


Enter the CC name


Specify the Operation Mapping


Enter the Receiver CC name


Enter the scenario name

 

5.2 Configure the SOAP adapter


Adapter Type is “SOAP”, and Message Protocol is “XI 3.0”.

 

5.3 Configure the JDBC adapter


The values of “JDBC Driver” and “Connection” can be found in the JDBC driver's help documentation.


 

Finally, activate all the objects configured above!

 

The links of other parts:

SAP PI ABAP Proxy to JDBC Synchronous Interface Part IV


In this last part I will show how to test the interface.

 

6. Test

We can test the ABAP proxy consumer in transaction SPROXY.


 

Or write some ABAP code to test the ABAP proxy consumer:

 

DATA: lo_proxy   TYPE REF TO zco_si_a2j.
DATA: ls_output  TYPE zmt_a2j_req,
      ls_input   TYPE zmt_a2j_res.
DATA: wa_results TYPE zdt_a2j_res_results.
DATA: lo_fault   TYPE REF TO cx_ai_system_fault.

PARAMETERS p_empc TYPE string DEFAULT '0010009151'.

CREATE OBJECT lo_proxy.

TRY.
    ls_output-mt_a2j_req-emp_code = p_empc.
    CALL METHOD lo_proxy->si_a2j
      EXPORTING
        output = ls_output
      IMPORTING
        input  = ls_input.
    COMMIT WORK.
    WRITE: / 'Success!'.
    LOOP AT ls_input-mt_a2j_res-results INTO wa_results.
      WRITE: / 'structurecode:',   wa_results-scode.
      WRITE: / 'empcode:',         wa_results-empcode.
      WRITE: / 'c_name:',          wa_results-name.
      WRITE: / 'parentid:',        wa_results-pid.
      WRITE: / 'superempcode:',    wa_results-superempc.
      WRITE: / 'defaultposition:', wa_results-defposit.
      WRITE: / 'approve:',         wa_results-approve.
    ENDLOOP.
  CATCH cx_ai_system_fault INTO lo_fault.
    " catch into an object reference so the actual error text is shown
    WRITE: / lo_fault->errortext.
ENDTRY.

7. Reference

JDBC Receiver scenarios best practices - Stored procedure design-Part4

http://scn.sap.com/community/pi-and-soa-middleware/blog/2012/09/07/jdbc-receiver-scenarios-best-practices--stored-procedure-design-part4

 

JDBC Stored Procedures

http://scn.sap.com/people/siva.maranani/blog/2005/05/21/jdbc-stored-procedures

 

Defining XML Documents for Message Protocol XML SQL Format

http://help.sap.com/saphelp_nw74/helpdata/en/2e/96fd3f2d14e869e10000000a155106/content.htm

 

Defining an EXECUTE Statement

http://help.sap.com/saphelp_nw74/helpdata/en/44/7b72b2fde93673e10000000a114a6b/content.htm

 

JDBC Receiver Adapter -- Synchronous Select – Step by Step

http://scn.sap.com/people/bhavesh.kantilal/blog/2006/07/03/jdbc-receiver-adapter--synchronous-select-step-by-step

 

JDBC receiver adapter - stored procedure response

https://scn.sap.com/thread/2015436

 

Please contact me if there are any mistakes in the document or if you have any doubts.

Sorry for my poor English.

The links of other parts:

Create email with body and attachments for binary payload with Java mapping


Some years ago I wrote a blog about emails with body and attachment with help of the MailPackage structure:

XI Mail Adapter: An approach for sending emails with attachment with help of Java mapping

 

In this blog I will present another solution which does not use MailPackage and I show also how to add binary payloads to an email.

 

Meanwhile there are many blogs available that show the use of the mail adapter and module configuration. However, using a Java mapping to create the whole MIME stream is the most flexible way to create a mail exactly in the way that it should look like.

 

The following code should show the basics of the MIME creation, feel free to use and enhance it to your needs:

 

Java Mapping to create MIME parts of an email

package sample;

import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;

import javax.xml.bind.DatatypeConverter;

import com.sap.aii.mapping.api.AbstractTransformation;
import com.sap.aii.mapping.api.StreamTransformationException;
import com.sap.aii.mapping.api.TransformationInput;
import com.sap.aii.mapping.api.TransformationOutput;

public class MyBinaryMessageAsAttachment extends AbstractTransformation {

  String attachmentName = "file.pdf";
  String boundary = "--AaZz";
  String mailContent = "This is a sample file";
  String CRLF = "\r\n";

  public void transform(TransformationInput arg0, TransformationOutput arg1) throws StreamTransformationException {

    InputStream in = arg0.getInputPayload().getInputStream();
    OutputStream out = arg1.getOutputPayload().getOutputStream();

    try {
      // create the declaration of the MIME parts

      // First part
      String output = "--" + boundary + CRLF
        + "Content-Type: text/plain; charset=UTF-8" + CRLF
        + "Content-Disposition: inline" + CRLF + CRLF
        + mailContent // this should be some more useful text
        + CRLF + CRLF

      // Second part
        + "--" + boundary + CRLF
        + "Content-Transfer-Encoding: base64" + CRLF
        + "Content-Type: application/pdf; name=" + attachmentName + CRLF
        + "Content-Disposition: attachment; filename=" + attachmentName + CRLF + CRLF;
      out.write(output.getBytes());

      // convert InputStream to byte array
      // (in.available() is assumed to return the full payload size here,
      //  which holds for the buffered PI payload; for arbitrary streams,
      //  read in a loop instead)
      byte[] input = new byte[in.available()];
      in.read(input);

      // convert payload to base64
      output = DatatypeConverter.printBase64Binary(input);

      // split into lines of 76 characters
      output = addLinefeeds(output);
      out.write(output.getBytes());

      // last boundary
      output = CRLF + CRLF + "--" + boundary + "--" + CRLF;
      out.write(output.getBytes());
    } catch (IOException e) {
      throw new StreamTransformationException(e.getMessage());
    }
  }

  public String addLinefeeds(String str) {
    StringBuffer result = new StringBuffer(str);
    int n = 76; // maximum number of characters per MIME line
    int l = str.length();
    int i = n;
    while (l > n) {
      result.insert(i, CRLF);
      i = i + n + 2; // skip past the inserted CRLF
      l = l - n;
    }
    return result.toString();
  }
}

 

The Java mapping will create two MIME parts. The first part is plain text; the second part is binary, so we encode it to base64 and divide it into lines of 76 characters (the maximum line length allowed by the MIME protocol). The result will look like this:

 

Sample output of Java mapping

----AaZz

Content-Type: text/plain; charset=UTF-8

Content-Disposition: inline

 

 

This is a sample file

 

 

----AaZz

Content-Transfer-Encoding: base64

Content-Type: application/pdf; name=file.pdf

Content-Disposition: attachment; filename=file.pdf

 

 

PD94bWwgdmVyc2lvbj0iMS4wIiBlbmNvZGluZz0iVVRGLTgiPz4KPGNmZGk6Q29tcHJvYmFudGUg

cnRlPSIxNjA4Ni4xMyI+PC9jZmRpOlRyYXNsYWRvPjwvY2ZkaTpUcmFzbGFkb3M+PC9jZmRpOklt

cHVlc3Rvcz48Y2ZkaTpDb21wbGVtZW50bz48L2NmZGk6Q29tcGxlbWVudG8+PC9jZmRpOkNvbXBy

b2JhbnRlPg==

 

 

----AaZz--
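As a cross-check of the addLinefeeds logic used above: on Java 8 and later, java.util.Base64.getMimeEncoder() produces the same 76-character, CRLF-delimited base64 encoding in a single call. A small sketch (not part of the original mapping):

```java
import java.util.Base64;

public class MimeEncodeDemo {
    // Encode a payload as base64 split into lines of at most 76
    // characters, separated by CRLF, as required by the MIME spec.
    static String encodeMime(byte[] payload) {
        return Base64.getMimeEncoder().encodeToString(payload);
    }

    public static void main(String[] args) {
        byte[] payload = new byte[100]; // dummy binary content
        // 100 bytes -> 136 base64 characters
        // -> one full 76-character line plus one 60-character line
        String[] lines = encodeMime(payload).split("\r\n");
        System.out.println(lines[0].length() + " / " + lines[1].length()); // prints 76 / 60
    }
}
```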

 

This scenario requires special settings of the Mail adapter channel.

First of all, it is very important that the mail attribute Use Mail Package is not checked, Content Encoding is set to None, and Keep Attachments is not checked.


Furthermore, we need to set a special Content-Type multipart/mixed; boundary="--AaZz"

The boundary declaration in the Content-Type must be identical with the boundary used in the Java mapping, otherwise the mail will just be the whole MIME stream as plain text.


We set the Content-Type with the MessageTransformBean in the Processing Sequence under tab Module like this:


 

Module                                                 | Type                  | Module Key
AF_Modules/MessageTransformBean                        | Local Enterprise Bean | ContentType
sap.com/com.sap.aii.adapter.mail.app/XIMailAdapterBean | Local Enterprise Bean | mail


 

Module Key  | Parameter Name        | Parameter Value
ContentType | Transform.ContentType | multipart/mixed; boundary="--AaZz"

 

If you want to know how the MIME parameters Content-Type and Content-Transfer-Encoding work, and what other parameters can be used, you can look at RFC 1341 about MIME (Multipurpose Internet Mail Extensions): http://www.w3.org/Protocols/rfc1341/0_TableOfContents.html

Signing the request message using Certificates and Encoding the message using Base64 - Bouncy Castle API Format


All of you will, at some point in your PI career, have worked on integration scenarios with banks. I have worked on three such scenarios in a career of three years and, trust me, the bank folks are very stubborn: expecting even the slightest change from them is like asking someone to lend you a million dollars.

 

The intention of this blog post is to help people who are looking for a signing method that can be done in a mapping, and who specifically need CMS standards. Here I am signing the message using the Bouncy Castle API.

 

There are many other standard methods available for signing the request message, such as the ones below:

 

1. PGP encryption, where you can turn off the encryption part and just sign the message using your private key; the message can then be verified at the receiver end using the public key. Below you can find the blogs from Shabarish Vijayakumar and William Li with the detailed steps to perform PGP signing.

 

PGPEncryption Module: A Simple How to Guide

 

Using PGP in Process Integration

 

2. Signing using WSSE - this is the standard web service security provided in the SOAP receiver channel to sign the message. With this method we can also go for only signing, only encryption, or both signing and encryption. The detailed steps have been listed out very nicely by Rajendra in the blog below. You can refer to it if you are looking for this method of signing.

 

http://people/rajendra.badi/blog/2011/08/24/configuring-wsse-digital-signing-and-encryption-using-sap-pi-711-aae-soap-ad…

 

In my case, I tried all the above-mentioned methods and they worked perfectly and without any issue. But the issue came up when the bank entered the picture: they started rejecting the above methods because they don't use web services at their end and do not expect a SOAP envelope in the message. Signing using WSSE works only with a SOAP envelope, so the 2nd method was rejected. For the 1st method, I used ASCII-armored keys (.asc), and the bank said they wanted us to sign only with a private key that they could verify using our public key, i.e. our X.509 certificate.

 

Hence, I came up with the idea of writing Java code for the signing; the additional part was to encode the signature using Base64.

 

Here you can find the complete code for signing your request message using the Bouncy Castle API. Please note I have used the private key (.pfx or .p12) of the SAP PI system to sign the message.

 

And I am not so great at Java, so please excuse any redundancy, irregular method declarations or unnecessary imports of libraries. Trust me, this is a very simple piece of code - even an amateur in Java like me could write it.
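Before diving into the full CMS code, the core sign-then-Base64 step can be sketched with plain JDK classes. This is only an illustration: the key pair is generated in memory here, whereas the real code below loads it from the .p12 keystore and wraps the signature in a CMS/PKCS#7 structure via Bouncy Castle:

```java
import java.security.KeyPair;
import java.security.KeyPairGenerator;
import java.security.PrivateKey;
import java.security.PublicKey;
import java.security.Signature;
import java.util.Base64;

public class SignDemo {
    // Sign a payload with SHA1withRSA and return the Base64-encoded signature.
    static String signBase64(byte[] payload, PrivateKey key) throws Exception {
        Signature signer = Signature.getInstance("SHA1withRSA");
        signer.initSign(key);
        signer.update(payload);
        return Base64.getEncoder().encodeToString(signer.sign());
    }

    // Verify the Base64-encoded signature - this is what the receiver does
    // with the sender's X.509 certificate (i.e. its public key).
    static boolean verify(byte[] payload, String sigB64, PublicKey key) throws Exception {
        Signature verifier = Signature.getInstance("SHA1withRSA");
        verifier.initVerify(key);
        verifier.update(payload);
        return verifier.verify(Base64.getDecoder().decode(sigB64));
    }

    public static void main(String[] args) throws Exception {
        // In-memory key pair, standing in for the .p12 keystore contents
        KeyPairGenerator kpg = KeyPairGenerator.getInstance("RSA");
        kpg.initialize(2048);
        KeyPair kp = kpg.generateKeyPair();

        byte[] payload = "<PaymentRequest/>".getBytes("UTF-8");
        String sigB64 = signBase64(payload, kp.getPrivate());
        System.out.println(verify(payload, sigB64, kp.getPublic())); // prints true
    }
}
```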

 

 

Java code for Signing and encoding of PI message payload

package com.javamapping.signing;

import java.io.File;
import java.io.FileInputStream;
import java.io.InputStream;
import java.security.KeyStore;
import java.security.PrivateKey;
import java.security.Security;
import java.security.cert.CertStore;
import java.security.cert.CollectionCertStoreParameters;
import java.security.cert.X509Certificate;
import java.util.ArrayList;
import java.util.Enumeration;

import org.apache.commons.codec.binary.Base64;
import org.bouncycastle.cms.CMSProcessableByteArray;
import org.bouncycastle.cms.CMSSignedData;
import org.bouncycastle.cms.CMSSignedDataGenerator;
import org.bouncycastle.jce.provider.BouncyCastleProvider;

import com.sap.aii.mapping.api.AbstractTransformation;
import com.sap.aii.mapping.api.StreamTransformationException;
import com.sap.aii.mapping.api.TransformationInput;
import com.sap.aii.mapping.api.TransformationOutput;
import com.sap.aii.utilxi.core.io.IOUtil;

public class SigningBouncyCastle extends AbstractTransformation {

  public void transform(TransformationInput input, TransformationOutput output)
      throws StreamTransformationException {
    String finalString = "";
    String pass = "";

    try {
      InputStream ins = input.getInputPayload().getInputStream();
      String input_data = IOUtil.copyToString(ins, "UTF-8");
      finalString = signRequest(input_data, pass);
      output.getOutputPayload().getOutputStream().write(finalString.getBytes());
    } catch (Exception ie) {
      // do not swallow errors silently - fail the mapping instead
      throw new StreamTransformationException(ie.getMessage(), ie);
    }
  }

  private String signRequest(String strPaymentRequest, String strPassword) {

    X509Certificate cert = null;
    PrivateKey priv = null;

    try {
      // Register the Bouncy Castle provider used to sign the request message
      Security.addProvider(new BouncyCastleProvider());

      String pass = "Your Private Key Password";
      File file = new File("Path of the SAP Application where Private key is stored");
      InputStream stream = new FileInputStream(file);
      KeyStore store = KeyStore.getInstance("PKCS12");
      store.load(stream, pass.toCharArray());
      PrivateKey key = (PrivateKey) store.getKey("Your Private Key File name", pass.toCharArray());

      // Find the alias of the key entry in the keystore
      Enumeration e = store.aliases();
      String name = "";
      if (e != null) {
        while (e.hasMoreElements()) {
          String n = (String) e.nextElement();
          if (store.isKeyEntry(n)) {
            name = n;
          }
        }
      }

      // Get the private key and the certificate
      priv = key;
      cert = (X509Certificate) store.getCertificate(name);

      java.security.cert.Certificate[] certChain = store.getCertificateChain(name);

      ArrayList certList = new ArrayList();
      CertStore certs = null;
      for (int i = 0; i < certChain.length; i++)
        certList.add(certChain[i]);
      certs = CertStore.getInstance("Collection",
          new CollectionCertStoreParameters(certList), "BC");

      // Sign the data
      CMSSignedDataGenerator sgen = new CMSSignedDataGenerator();

      // Which digest algorithm? SHA1? MD5? ...
      // In our case we are using the SHA1 algorithm
      // CMSSignedDataGenerator.DIGEST_SHA1 = "1.3.14.3.2.26"
      sgen.addSigner(priv, (X509Certificate) cert, CMSSignedDataGenerator.DIGEST_SHA1);

      sgen.addCertificatesAndCRLs(certs);

      // Convert the message to UTF-8 encoding
      byte[] utf8 = strPaymentRequest.getBytes("UTF-8");

      // Generate the signed message using the Bouncy Castle provider (BC).
      // The 2nd parameter (true) encapsulates the original message in the
      // signed structure, so the receiver gets message and signature together.
      CMSSignedData csd = sgen.generate(new CMSProcessableByteArray(utf8), true, "BC");

      // Get the signed message
      byte[] signedData = csd.getEncoded();

      // Get the Base64 representation of the signed message
      byte[] signedDataB64 = Base64.encodeBase64(signedData);
      String str = new String(signedDataB64);

      // (optional: write signedData / signedDataB64 to files for debugging)

      return str;
    } catch (Exception ex) {
      System.out.println("Error signing payment request. Please verify the certificate.");
      ex.printStackTrace();
      return "";
    }
  }
}

 

You can also test the above code by writing a few simple lines in a main class:

 

Main Class

public static void main(String[] args) {

  try {
    // Note: transform() expects TransformationInput/TransformationOutput,
    // which exist only inside the PI runtime. For a standalone test, call
    // signRequest() directly (change its visibility from private to
    // public or package-private first).
    InputStream in = new FileInputStream(new File("Your Input File Path")); // e.g. a file in the workspace where your Java code runs
    byte[] buf = new byte[in.available()];
    in.read(buf);
    in.close();

    SigningBouncyCastle bouncyCastle = new SigningBouncyCastle();
    String signed = bouncyCastle.signRequest(new String(buf, "UTF-8"), "");

    OutputStream out = new FileOutputStream(new File("Your Output File Path"));
    out.write(signed.getBytes());
    out.close();
  } catch (Exception e) {
    e.printStackTrace();
  }
}

 

Once you are done with the testing, build your Java code, export it as a JAR file, import it as an Imported Archive, and use it in your Operation Mapping.

 

 

One more point where I faced difficulty was in searching for the right JAR files. Getting JAR files matching the JDK version was an uphill task. I have attached all the JAR files used in this Java mapping to make it easy for you. Please note the attached JAR files are for JDK version 1.6.

 

 

This is my first blog on Java mapping; I would like to hear feedback from all the SCN members out there.

Import integration content from SAP HCI Web-UI to Eclipse


In this blog I’m going to discuss how to download the Artifacts from SAP HCI web-UI and import into Eclipse [Developer workbench].

 

Introduction


We may get requirements to customize existing integration content to meet business needs. In those cases we cannot customize the objects directly in the SAP HCI Web UI; instead, we use the Eclipse UI (developer workbench) to customize the integration content.

 

The SAP HANA Cloud Integration web tool provides a web-based interface for accessing and managing integrations configured in HCI. You can access and use the prepackaged integration content from the catalogue, or import it into the workspace from the system through which you are accessing HCI. You can use the artifacts available in integration packages to implement your integration scenario. Optionally, you can also import an integration package from a local folder and use it.

 

Integration flows, one of the artifact types in integration packages, are used to implement integration scenarios along with other artifacts such as value mappings, data flows, files and URLs. You can use the integration flows available in packages by configuring and deploying them. You can also edit the integration flows by adding or removing elements before you configure and deploy them.

 

To customize the integration content objects, we should download the integration content locally and import it into the Eclipse workspace.

 

Configuration

 

1) Access the Web UI URL which is assigned to your organization.

2) Click on Catalog.

3) Select the required package from those available on your tenant by using the Discover option.

4) In this example I have chosen the package “eDocument: Electronic invoicing for Chile”.

5) Click on the package; it will give a brief description of the package and show the list of artifacts available under it.

6) Click on the button under Actions to download the artifact to your local system.

7) A zip file will be created on your local system.

8) Extract the zip file.

 

HCI Configuration to Import the artifacts into Eclipse

 

Open integration designer perspective

 

1) Go to Menu --> File --> Import


 

2) Select the option General --> Existing Projects into Workspace --> Next

 


3) Browse for the directory where you extracted the zip files.


Now the artifacts are imported into the Eclipse workspace successfully.


Reference:

 

https://proddps.hana.ondemand.com/dps/d/preview/93810d568bee49c6b3d7b5065a30b0ff/2015.06_CORR/en-US/frameset.html?2fb0aa4dc5194b589adcd1c5534901e3.html

 

https://help.sap.com/cloudintegration

Dynamic File Name in Multimapping - Standard Solution


Fellow SCNers,

 

All of us, at one point in our career as SAP PO consultants, will have faced a requirement needing multimapping and, to go with it, a need for dynamic file names after the multimapping. And at the same time, we will have realized that we cannot assign ASMA values using a UDF in the case of multimapping.

 

This blog does not intend to showcase any hidden functionality which makes it possible as it is still not possible to do a 1:N multimapping without using custom solutions. One such solution has been brilliantly showcased by my friend Praveen Gujjeti in A new approach: Multi-mapping Dynamic Configuration using a generic custom module

 

However, it is possible to combine multimapping and Dynamic Configuration in a 1:N multimapping if the value of N is known and fixed. I have used this approach to dynamically assign ASMA values to child messages in the case of a 1:5 split.

 

I would like to mention our colleague and guru Stefan Grube whose blog Unknown use case of DynamicConfigurationBean: Store file name to JMS header without mapping showed me the light for the approach.

 

Solution:

 

In your multimapping program, insert a UDF with the code snippet below. Please note that you can increase the number of DynamicConfigurationKey objects based on the number of messages you have in your target.

 

public String putFileName(String inputFileName1, String inputFileName2, String inputFileName3, String inputFileName4, String inputFileName5, Container container) throws StreamTransformationException{

try {

  DynamicConfiguration conf = (DynamicConfiguration) container.getTransformationParameters().get(StreamTransformationConstants.DYNAMIC_CONFIGURATION);

  DynamicConfigurationKey key1 = DynamicConfigurationKey.create("http://sap.com/xi/XI/System/Test","Message1");

  DynamicConfigurationKey key2 = DynamicConfigurationKey.create("http://sap.com/xi/XI/System/Test","Message2");

  DynamicConfigurationKey key3 = DynamicConfigurationKey.create("http://sap.com/xi/XI/System/Test","Message3");

  DynamicConfigurationKey key4 = DynamicConfigurationKey.create("http://sap.com/xi/XI/System/Test","Message4");

  DynamicConfigurationKey key5 = DynamicConfigurationKey.create("http://sap.com/xi/XI/System/Test","Message5");

  conf.put(key1,inputFileName1);

  conf.put(key2,inputFileName2);

  conf.put(key3,inputFileName3);

  conf.put(key4,inputFileName4);

  conf.put(key5,inputFileName5);

  return "";

}

catch(Exception e){

return "Could not write filename";

}

}

 

And then, in your receiver communication channels (assuming you will have a different communication channel to process each of the split messages), you need to maintain the corresponding entries in the Module tab, with a module configuration that maps the Dynamic Configuration key assigned to that particular channel to the desired adapter attribute (following the DynamicConfigurationBean approach from Stefan Grube's blog referenced above).

And that's it! The above example picks the value assigned to the Dynamic Configuration key Message3 in the mapping and writes it to the DCJMSMessageProperty0 attribute of the JMS adapter. The same can be done with the attributes of the File adapter to dynamically name the files on the receiver side.
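The keying scheme can be mimicked in plain Java to make the mechanics explicit. This is only an analogy, not the SAP API: DynamicConfiguration behaves like a map keyed by (namespace, name) pairs, the UDF writes one entry per child message, and each receiver channel's module configuration reads back exactly one of them:

```java
import java.util.HashMap;
import java.util.Map;

public class DynConfDemo {
    static final String NS = "http://sap.com/xi/XI/System/Test";

    // Analogy for the UDF: store one file name per child message
    // under keys Message1..MessageN.
    static Map<String, String> putFileNames(String... fileNames) {
        Map<String, String> conf = new HashMap<>();
        for (int i = 0; i < fileNames.length; i++) {
            conf.put(NS + "|Message" + (i + 1), fileNames[i]);
        }
        return conf;
    }

    public static void main(String[] args) {
        Map<String, String> conf =
            putFileNames("a.xml", "b.xml", "c.xml", "d.xml", "e.xml");
        // Analogy for the module configuration of the 3rd receiver
        // channel, which is set up to read key Message3:
        System.out.println(conf.get(NS + "|Message3")); // prints c.xml
    }
}
```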

Using SAP HCI OData Adapter with SAP HANA Cloud Connector


Introduction


A few months ago I was asked by my team lead Chris Paine to investigate using the HANA Cloud Integration solution to move data to and from SAP and SuccessFactors. He wanted to know how easy/hard this was and we thought actually trying it out was the best way to measure this.


It took a bit of effort to understand how everything worked, and we thought that it would be worthwhile sharing our experience here with others in the hope that it's easier for you!


Starting with the Help:

 

As suggested by the online help, we could use the SAP Cloud Connector directly to access an OData service with the SAP HCI adapters.

 

https://proddps.hana.ondemand.com/dps/d/preview/93810d568bee49c6b3d7b5065a30b0ff/2015.05_CORR2/en-US/frameset.html?65a60e750eca49328fef93c0723ad4b8.html


When I first signed up with the SAP HCI Trial programme, I tried to use this approach but failed miserably.  Whilst the online help isn’t particularly descriptive, I couldn’t find any blog or descriptive instruction on how to get this working end to end. Unfortunately it was much later in my trial of the solution that I learned that this approach will not work with trial participants. This is a shame, but hopefully the HCI trial team can look at working something else out, as they were very helpful in solving various other issues we had during the trial.

 

 

As I found it very hard to build an integration without any detailed step-by-step blogs or documentation on SCN, I thought it would be best to share an example: we will use SAP HCI to fetch data from SuccessFactors and then update a custom table in SAP on-premise.


 


SuccessFactors

In SuccessFactors, we will use the Currency Conversion data and its API. You can either update the currency conversion data manually in SuccessFactors, or you can use SAP HCI to leverage free online exchange-rate data (e.g. https://currencylayer.com ).

 


 

On-Premise SAP System


Create a custom Z table. (Or you can use standard SAP tables if you wish)

 


Create a gateway service using the custom table above.

 

Here is a reference on how to create a simple gateway service:

http://scn.sap.com/community/gateway/blog/2014/11/02/simple-step-by-step-sap-gateway-service-guide

 

An example from my Gateway service builder


Go to the Gateway Client and create EDMX file.


Save your EDMX file. We will need this file when configuring the OData adapter in Eclipse. (NB: you have to save the file as you can't directly pull the metadata from the on-premise connection using the Model Operation area in HCI. I don't know why, but this is the solution!)

 

SAP HCI Tenant

Get the tenant ID (Account Name) of your SAP HCI tenant from Eclipse. You will need the tenant ID as the account name when configuring the SAP Cloud Connector.


SAP Cloud Connector Configuration

Install the SAP Cloud Connector on your on-premise system and start the Cloud Connector.

 

Here is a reference on how to install the SAP Cloud Connector on different operating systems:

https://help.hana.ondemand.com/help/frameset.htm?57ae3d62f63440f7952e57bfcef948d3.html

 

Login to the SAP Cloud Connector once the Cloud Connector is running (Default URL - https://localhost:8443 )


Establish connection to your SAP HCI tenant by adding new account.


Configure on-premise SAP system and map it to a virtual host and port. If you would like to use Certificate based Authentication, select HTTPS and the corresponding port. (NB this is the point where the trial landscape doesn't work! )


Once the system has been mapped, define the permitted resources that you wish to grant access to.


SAP HCI Configuration

Now, in Eclipse, we will build an iFlow that fetches the currency conversion rates from SuccessFactors, maps the data to a different format and, finally, updates a custom table in the SAP system using the OData adapter.

 

Here is what the iFlow will look like:


 

Configure the SuccessFactors Adapter as a Receiver Channel


Open the Operation Modeler wizard and specify the SuccessFactors address, Company ID, Username and Password for the system you are connecting to. Select “CurrencyConversion” from the entity list.


Select the Operation and Fields of the Entity


Once the operation modelling is finished, an XSD is generated in src.main.resources.wsdl automatically.


 

Before we configure the receiver channel for SAP, we need to import the EDMX file which we generated earlier.


Now, moving on to the receiver channel for SAP, configure the OData communication channel as below


Using the Operation Modeler, select “Local EDMX File”.  Click on the “Browse” button to select the edmx file to proceed.


Select “CurrencySet” from the entity list.


Select the “PUT” Operation and Fields of the Entity we are updating.


Once the operation modelling is finished, an XSD is generated in src.main.resources.wsdl automatically.

Ensure “On-Premise” is selected as the Proxy Type in order to use the SAP Cloud Connector.

Set the Authentication type as defined in the SAP Cloud Connector. In this example, it will be Basic Authentication.



Deploy User Credentials Artifacts

In order for the adapter to authenticate, we need to deploy “User Credentials” artifacts. The name of each artifact has to be the same as the Credential Name referenced in the corresponding communication channel. In this example, we named the artifacts “sap_odata” and “sf_odata”.


Message mapping 

In this step, we need to map the incoming message to the message type required by SAP. The Mapping step can be added to the integration flow canvas from the Palette. In the Message Mapping Overview tab, add the .XSD files generated earlier and define the mapping as below:


Split Incoming Payload

Here we need to split the incoming payload to individual messages for each currency code.


Add Splitter to the integration flow canvas by selecting from the Palette. Configure a “General” splitter with the following properties:


Here is more information on splitting messages:

http://scn.sap.com/community/pi-and-soa-middleware/blog/2015/01/16/blog-6-splitting-messages-in-integration-flows

 

 

Deploy

Set the Timer Start Event to “Run Once”. This should make the iFlow run immediately once deployed.

 

 


Now, deploy the Integration Project. In the message monitoring view, you should be able to see the status of the iFlow.



Here are the results after a successful run:

 

In the SAP system, the custom Z table is updated with the currency code, timestamp and exchange rates.


If you have activated ICF recording in the SAP system, you will first find a "GET" request with “x-csrf-token: fetch” in the header to obtain the X-CSRF-Token, followed by a "PUT" request that updates the data in the SAP system.
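The token handshake seen in the recording can be sketched as a pure header transformation (the header names are the standard SAP Gateway CSRF ones; the HTTP calls themselves are left out, and the token value below is a made-up placeholder):

```java
import java.util.HashMap;
import java.util.Map;

public class CsrfHandshake {
    // Headers for the initial GET that asks the gateway for a token.
    static Map<String, String> fetchHeaders() {
        Map<String, String> h = new HashMap<>();
        h.put("x-csrf-token", "fetch");
        return h;
    }

    // Headers for the subsequent PUT, echoing back the token that the
    // gateway returned in the GET response.
    static Map<String, String> putHeaders(Map<String, String> getResponseHeaders) {
        Map<String, String> h = new HashMap<>();
        h.put("x-csrf-token", getResponseHeaders.get("x-csrf-token"));
        h.put("Content-Type", "application/xml");
        return h;
    }

    public static void main(String[] args) {
        Map<String, String> response = new HashMap<>();
        response.put("x-csrf-token", "dummy-token"); // value issued by the gateway
        System.out.println(putHeaders(response).get("x-csrf-token")); // prints dummy-token
    }
}
```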



Here is the audit log from SAP Cloud Connector showing my user accessing the ICF service


 

Potential Bug


The iFlow seems pretty simple and straightforward. But during this exercise, there was an error that kept giving me grief whenever the iFlow tried to update the SAP system via the OData adapter. Fortunately the HCI trial team pointed out that the cause was a header left in the message after fetching data from SuccessFactors: the "SystemQueryOption" header with the value "$select=lastModifiedDateTime" flowed all the way through to the OData channel for the SAP system. To resolve the error, they suggested adding a "Content Modifier" step that sets the same header parameter to a blank string.

Picture37.png

 

 

 

I hope you found my blog (it's my first) useful. I'd love to hear your comments. If you're in Australia (Melbourne) and want a demo of it working, please let me know, I'd love to help out.

Yes Rest LookUp is possible in PI


Recently we had a requirement where we needed to perform a lookup against a REST service to obtain a token and then include it in subsequent requests.

I did not find any document covering REST lookups in PI, so I thought I would create one to help others.

 

For this document, I have used the REST API below:

 

http://maps.googleapis.com/maps/api/geocode/json?address=10001


This service takes a zip code as JSON input and returns the status and result in JSON format.


As with other lookups, the first thing we need to do is create a new communication channel of type REST in the Integration Directory (ID).


 



 

pic1.JPG

 

REST URL Tab:

 

pic2.jpg

Notice that we have used a variable parameter {req_zipcode} in the URL. The value of this parameter will be fetched from the input XML message that we pass during the lookup.

 

In the XPath expression we have used the value //zipcode, so the input XML must contain a field named zipcode. Below is our input XML:

 

pic3.jpg
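The XPath evaluation the channel performs for {req_zipcode} can be reproduced with the standard javax.xml.xpath API. This is a minimal sketch (the class name is invented; the sample payload matches the input XML above):

```java
import java.io.StringReader;
import javax.xml.xpath.XPath;
import javax.xml.xpath.XPathFactory;
import org.xml.sax.InputSource;

public class ZipcodeXPath {

    // Evaluate the same expression the channel uses (//zipcode)
    // against the input payload and return the matched text value.
    public static String extractZipcode(String payloadXml) throws Exception {
        XPath xpath = XPathFactory.newInstance().newXPath();
        return xpath.evaluate("//zipcode", new InputSource(new StringReader(payloadXml)));
    }

    public static void main(String[] args) throws Exception {
        String payload = "<?xml version=\"1.0\"?><zipcode>10001</zipcode>";
        System.out.println(extractZipcode(payload)); // prints 10001
    }
}
```

If the expression matches nothing, evaluate() returns an empty string, which is worth checking for when debugging why the URL parameter stays blank.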

Rest Operation tab:

 

pic4.jpg

Data Format tab:

 

pic66.jpg

 

Our input is XML but the service expects JSON, so we have to choose the option 'Convert XML Payload To JSON'.

 

Similarly, the service will return its output as JSON, so we have to choose the option 'Convert to XML'. We also need to select 'Add Wrapper Element'.

The final output in XML format will look like this:

 

pic777.jpg

 

The communication channel is now ready. Next we need to create the source and target structures in the ESR.

 

source:

 

pic888.jpg

target:

 

pic999.jpg

 

Java Mapping Code:

 

package com.test;
import java.io.ByteArrayInputStream;
import java.io.InputStream;
import java.io.OutputStream;
import javax.xml.parsers.DocumentBuilder;
import javax.xml.parsers.DocumentBuilderFactory;
import javax.xml.transform.Transformer;
import javax.xml.transform.TransformerFactory;
import javax.xml.transform.dom.DOMSource;
import javax.xml.transform.stream.StreamResult;
import org.w3c.dom.Document;
import org.w3c.dom.Element;
import org.w3c.dom.Node;
import org.w3c.dom.NodeList;
import com.sap.aii.mapping.api.AbstractTransformation;
import com.sap.aii.mapping.api.StreamTransformationException;
import com.sap.aii.mapping.api.TransformationInput;
import com.sap.aii.mapping.api.TransformationOutput;
import com.sap.aii.mapping.lookup.Channel;
import com.sap.aii.mapping.lookup.LookupService;
import com.sap.aii.mapping.lookup.Payload;
import com.sap.aii.mapping.lookup.SystemAccessor;
public class RestLookInPI extends AbstractTransformation {

    public void transform(TransformationInput arg0, TransformationOutput arg1)
            throws StreamTransformationException {
        this.execute(arg0.getInputPayload().getInputStream(),
                arg1.getOutputPayload().getOutputStream());
    } // end of transform

    public void execute(InputStream in, OutputStream out)
            throws StreamTransformationException {
        SystemAccessor accessor = null;
        try {
            String status = "";

            // Generate the input XML for the REST lookup
            String loginxml = "<?xml version=\"1.0\" encoding=\"UTF-8\"?>"
                    + "<zipcode>10001</zipcode>";

            // Perform the REST lookup via the receiver channel
            Channel channel = LookupService.getChannel("BC_468470_Receiver", "CC_Rest_Rcv");
            accessor = LookupService.getSystemAccessor(channel);
            InputStream inputStream = new ByteArrayInputStream(loginxml.getBytes());
            Payload payload = LookupService.getXmlPayload(inputStream);
            Payload lookupOutPayload = accessor.call(payload);
            InputStream inp = lookupOutPayload.getContent();

            // Parse the lookup response and extract the <status> value
            DocumentBuilderFactory factory = DocumentBuilderFactory.newInstance();
            DocumentBuilder builder = factory.newDocumentBuilder();
            Document document = builder.parse(inp);
            NodeList stats = document.getElementsByTagName("status");
            Node node = stats.item(0);
            if (node != null) {
                node = node.getFirstChild();
                if (node != null) {
                    status = node.getNodeValue();
                }
            }

            // Build the target message with the looked-up status
            Document targetDoc = builder.newDocument();
            Element targetRoot = targetDoc.createElement("ns0:MT_Output");
            targetRoot.setAttribute("xmlns:ns0", "http://xxxxxxx/pi/468470/RestLookUp");
            Element stat = targetDoc.createElement("status");
            stat.setTextContent(status);
            targetRoot.appendChild(stat);
            targetDoc.appendChild(targetRoot);

            // Write the target document to the output stream
            DOMSource domSource = new DOMSource(targetDoc);
            StreamResult result = new StreamResult(out);
            TransformerFactory tf = TransformerFactory.newInstance();
            Transformer transformer = tf.newTransformer();
            transformer.transform(domSource, result);
        } catch (Exception e) {
            // Fail the mapping visibly instead of swallowing the exception
            throw new StreamTransformationException(e.getMessage(), e);
        } finally {
            // Release the lookup connection
            if (accessor != null) {
                accessor.close();
            }
        }
    } // end of execute
}

 

 

Test Result:

 

pic4444.jpg


Maintaining Endpoints in SOAMANAGER


When it comes to exposing certain functionality from SAP using the SOAP protocol, we can either create an interface in PI or expose an SAP backend RFC/BAPI directly and let other applications consume it.

However, IMO, I would suggest avoiding PI in user-facing scenarios where performance is key. In fact, adding any middleware to such time-critical interfaces can prolong the execution time and add a point of failure. That said, I am not going to discuss here when to use what; as the blog's title suggests, I am going to show how we can maintain endpoints in SOAMANAGER when we directly expose applications from the SAP backend system.


So, consider a scenario where you have multiple WSDLs in your landscape and you have to generate endpoints for each of them. Generally, these endpoints are not transportable for obvious reasons, but instead of creating endpoints manually in each environment we can bundle the service definitions under one integration scenario and do a mass generation of endpoints - I believe it's much simpler than maintaining them one by one.

 

In this blog, I will show how to accomplish that in four simple steps:

 

1) Creation of profiles under SOAMANAGER:

In the profile configuration, you define which security/transport settings are to be assigned to the endpoints.

Depending on the provider's security settings, you can choose different authentication methods. Once you assign this profile to a service definition (in step 2), all the authentication mechanisms defined in the profile are assigned to the endpoints by default.

Go to "SOAMANAGER" -> Technical Administration -> Profiles -> Create Profile -> Generic_Profile.

 

Here below, I have selected basic user ID/password authentication.

Capture1.PNG

 

2) Create configuration scenarios:

In this step, you select the service definitions for which endpoints need to be generated and assign a profile to each of them under an integration scenario.

Go to "SOAMANAGER" -> Service Administration -> Local Integration Scenario Configuration -> either create a new scenario or edit an existing one.


Here below, I am editing an existing scenario.

Capture2.PNG

As soon as you do that, a wizard opens in which you search for and select your service definitions.

Capture3.PNG

Capture5.PNG

Once you have selected them, assign a profile (created in step 1) to each of your service definitions and complete the wizard.

Capture.6PNG.PNG

 

3) Pending Task:

In this step, we complete the registration and generate the endpoints. Go to "SOAMANAGER" -> Service Administration -> Pending Tasks.

Capture8.PNG

Capture10.PNG

 

Once done, look for your service definition under "Web Service Configuration" and ensure that the endpoints were generated properly.

Capture11.PNG


4) Export/Import of Profiles and Configuration Scenarios:

The profiles and configuration scenarios created in one client can be reused and transported to another client or to higher environments. Simply export these configurations in XML format and then import them again. Note that once you import a configuration scenario, you have to ensure that profiles are assigned to each of the service definitions.


Reference-

Mass Configuration - ABAP Workbench Tools - SAP Library

Successfactor (LMS) Integration with HCM system Using SAP PI


In one of my projects, we had a requirement to integrate the SuccessFactors Learning Management System (LMS) with SAP HCM for exchanging curricula data.

  • The term qualification in SuccessFactors refers to curricula. A curriculum is a template that contains multiple items (courses), requirements and sub-curricula. When a curriculum is used as a compliance tool, completion of a series of items and requirements (electives) as a whole can be tracked in SuccessFactors. Here the requirement is to track safety certificates for job execution, as some jobs require employees to complete mandatory certification.

 

  • There are two parts to the interface – qualification catalog updates from SF LMS to SAP ECC, and the qualification link along with its status to be assigned to an employee in SAP ECC (PA – IT24).

 

So, in simple words, we need to set up the following two scenarios using SAP standard content:

  1. Employee Curriculum Catalog

  2. Employee Curriculum Status

 

 

Here are the steps to integrate SuccessFactors EC and LMS with SAP HCM through SAP PI:


Step 1: As a prerequisite, the standard content for the integration has to be downloaded from the SAP Service Marketplace.

 

In SAP PI, the following objects need to be imported for the SFSF LMS integration:

  • SFIHCM0360004 (Support Package XI CONTENT SFIHCM03 600 #Database independent)

The ECC versions supported by LMS are 6.0 and above.

Along with that, the following add-ons must be downloaded and imported into SAP ECC:

- SFIHCM03        600        0004      SAPK-60003INSFIHCM03              SFIHCM03: HCM integration with SuccessFactor

- SUCCESSFACTORS_HCM_INTEGR          3.0          sap.com              SUCCESSFACTORS HCM INTEGR 3.0

 

NOTE: SFIHCM03 depends on SFIHCM01 and SFIHCM02 – deploy the complete package in ECC and always go for the latest available version of all packages (unless the client requires a specific version).

 

    

Step 2: Creating channels


For the Catalog interface we are going to create the following channels:


Sender SFSF adapter with REST protocol - to connect to SuccessFactors LMS services for the catalog

 

 

 

Receiver OData adapter with OData protocol - to connect to the on-premise SAP HCM system.

 

For the Curriculum Status interface we are going to create the following channels:


Sender SFSF adapter with SOAP protocol - to connect to SuccessFactors Employee Central services for the curriculum status


Receiver SFSF adapter with REST protocol - for performing a lookup against SuccessFactors LMS services for the status

 

 

Receiver OData adapter with OData protocol - to connect to the on-premise SAP HCM system for the status

 

 

Step 3: Apply the integration model from the ESR.

Use the following process integration scenarios in the ID and complete the configuration part for SAP PI.

 

Integration Model for Curriculumcatalog:

 

 

Integration Model for CurriculumStatus:

 

Step 4: OData service configuration in SAP HCM:

To maintain and activate the OData service HRSFI_QUALIFICATION_SRV in ECC, follow the steps below:

 

Steps:

 

  • Execute transaction SPRO in HCM-ECC and follow the path shown on the screen below for Activate and Maintain Services.

 

  • Once you execute Activate and Maintain Services, the following window opens, where you add the service.

 

Once it is added, it will show up in the service catalog window as follows:

 

 

Step 5: BAdI configuration

 

 

Note: The test case scenarios have been attached; please download the file and change the extension to ".zip".

 

Referenced links -

 

OData Adapter and SFSF Adapter (extensions) for SAP Process Integration

Hana Cloud Integration in comparison to Dell’s Boomi.



 

In my previous four (1, 2, 3, 4) blogs I talked about iFlows in Hana Cloud Integration. As we all know, HCI is not the only middleware solution out there, and if you are working in the integration scene, names such as JitterBit, MuleSoft, BizTalk and Boomi are probably not unfamiliar to you. The last one, Boomi, is especially relevant for SuccessFactors Employee Central users and integration partners. Until a month ago, when purchasing an SAP SFSF enterprise package, the middleware software that came with that package was Boomi. Now it is also possible to go for an SAP SFSF enterprise package with... HANA CLOUD INTEGRATION.

 

1) HCI and Boomi.png

 

Because this option is relatively new, there are still a lot of SFSF customers working with Dell Boomi. Therefore I want to compare some options in Dell Boomi to Hana Cloud Integration. I'm going to build a simple integration flow in both HCI and Boomi and show you my results. The iFlow will contain an inbound connector (SuccessFactors), a simple mapping, an XML-to-CSV converter and an outbound connection (email).

Because the HCI web UI is not fully developed yet (e.g. there is no way to start a new iFlow from scratch), I'll be using Eclipse for the HCI part. For Boomi I will use the web environment (https://platform.boomi.com).

 

Inbound connector

 

2) inbound HCI.png

HCI connector and connection


 

3) inbound HCI.png

Details for the inbound connector



4) inbound HCI.png

Details for the connection

 

In HCI, you create an inbound connector and a connection to the start shape. On the connector itself you can adjust the authentication details. On the connection you see two sections: 'Connection Details' and 'Processing Details'. These are quite self-explanatory: one section determines where the inbound connector has to go, and the other tells it what to do there.
You also have the opportunity to schedule your flow here.

 

2.1) inbound Boomi.png

Boomi connector and operation


3.1) inbound Boomi.png

Boomi connector



4.1) inbound Boomi.png

Boomi operation

 

 

As you can see in the pictures, in Boomi there's one place where you enter the authentication, the connection details, and the operation details (called processing details in HCI).
A big difference between HCI and Boomi is the way these connectors are saved. In HCI, you create a project, in which your iFlow is stored; in the iFlow you'll find the different connectors that have been used. In Boomi, there's one big project where all your iFlows and connectors are saved. That means, for example, that you can create one SuccessFactors connection and use it in all your iFlows, even those that have not yet been created. The same applies to the operation details, so it's possible to use the same SuccessFactors connector in two iFlows but create different operations.

 

 

 

Mapping

 

The mappings in Boomi and HCI are quite similar. It's possible to drag a value from your input directly to your desired output, or you can transform the data using different kinds of functions. For example, if the date format of your input does not match the desired output, you can use a TransformDate function.

 

5) mapping HCI.png

The mapping icon in HCI



6) mapping HCI.png

Overview of a simple mapping in HCI



7) mapping HCI.png

Some of the functions that can be used between your input and output

 


8) mapping HCI.png

Parameters for a TransformDate function in HCI



 

In Boomi the mapping looks like this:

 

5.1) mapping Boomi.png

The mapping icon in Boomi



6.1) mapping Boomi.png

Overview of a simple mapping in Boomi

 



7.1) mapping Boomi.png

A function to modify the date format



8.1) mapping Boomi.png

Parameters for DateTrans in Boomi
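Under the hood, both TransformDate (HCI) and DateTrans (Boomi) boil down to reparsing the string with an input pattern and reformatting it with an output pattern. A minimal Java sketch of that behavior (the patterns here are examples, not the exact ones from the screenshots):

```java
import java.text.ParseException;
import java.text.SimpleDateFormat;

public class TransformDateSketch {

    // Reformat a date string from one pattern to another,
    // e.g. "10/05/2015" (MM/dd/yyyy) -> "2015-10-05" (yyyy-MM-dd)
    public static String transform(String value, String inPattern, String outPattern)
            throws ParseException {
        SimpleDateFormat in = new SimpleDateFormat(inPattern);
        in.setLenient(false); // reject malformed input instead of silently guessing
        SimpleDateFormat out = new SimpleDateFormat(outPattern);
        return out.format(in.parse(value));
    }
}
```

Setting lenient parsing off is worth remembering: both tools will happily produce surprising dates from malformed input otherwise.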



Combine and/or convert

 

Now we come to the part where Boomi and HCI show some differences. In HCI, the requested data rows from SuccessFactors are treated as one document: all records (data rows) are automatically combined into a single document. This is not the case in Boomi, where every single row (record) is treated as one document that can be sent out individually. Normally you'd want one file (.csv) containing all the rows, so it's necessary to combine the documents (each containing one data row) into a single document.
On the other hand, in HCI the data is processed as XML, so we need to convert it to CSV if that is the output we want. In Boomi we just need to indicate that the output should be a .csv file and don't need to adjust or transform any data. Let's see what that looks like in HCI:
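What the XML-to-CSV converter does can be approximated in a few lines of plain Java: parse the record nodes and join the field values with a separator. The element names ("record" etc.) are invented for illustration; they are not the actual SuccessFactors element names.

```java
import java.io.ByteArrayInputStream;
import javax.xml.parsers.DocumentBuilderFactory;
import org.w3c.dom.Document;
import org.w3c.dom.Element;
import org.w3c.dom.Node;
import org.w3c.dom.NodeList;

public class XmlToCsvSketch {

    // Turn every <record> element into one CSV line,
    // joining the text of its child elements with commas.
    public static String convert(String xml) throws Exception {
        Document doc = DocumentBuilderFactory.newInstance().newDocumentBuilder()
                .parse(new ByteArrayInputStream(xml.getBytes("UTF-8")));
        StringBuilder csv = new StringBuilder();
        NodeList records = doc.getElementsByTagName("record");
        for (int i = 0; i < records.getLength(); i++) {
            NodeList fields = records.item(i).getChildNodes();
            StringBuilder line = new StringBuilder();
            for (int j = 0; j < fields.getLength(); j++) {
                Node field = fields.item(j);
                if (field instanceof Element) { // skip whitespace text nodes
                    if (line.length() > 0) {
                        line.append(',');
                    }
                    line.append(field.getTextContent());
                }
            }
            csv.append(line).append('\n');
        }
        return csv.toString();
    }
}
```

In HCI the converter step does this for you via the parameters shown below; the sketch only illustrates the transformation the step performs.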

 

9) converter HCI.png

The converter icon in HCI


10) converter HCI.png

Parameters for the XML to CSV converter

 

And in Boomi, we do not convert but we must combine:

 

 

9.1) combine Boomi.png

The data process icon, from where we can combine

 

 

10.1) combine Boomi.png

Parameters to combine documents

 

Another difference is naming the resulting document. In HCI the document is named in the connector being used (e.g. the mail adapter, which we will discuss in the next chapter). In Boomi, the name of the document must be set as a property before the outbound connector.

 

11) properties Boomi.png

The icon to set the properties



12) properties Boomi.png

The parameters to name the file.

 

 

Outbound connector

 

The outbound connector is set up in the same way as the inbound connector. In HCI you need to create a connector and a connection, and in Boomi you need to create a connector and an operation.

 

 

13) outbound HCI.png

The mail connector icon and connection flow in HCI


14) outbound HCI.png

Parameters for the Connection Details and the Mail Attributes in HCI

 

 

13.1) outbound Boomi.png

The Boomi outbound connector icon

 

 

14.1) outbound Boomi.png

The connections details for email in Boomi

 

 

14.2) outbound Boomi.png

The operation details in Boomi

 

Wrap-up

 

When you deploy and run these simple iFlows in both systems you get the following result:

 

15) deploy HCI.png
Result in HCI

 

15.1) deploy Boomi.png

Result in Boomi

 

 

As shown, Boomi takes more than a minute to run this flow. It has read 668 documents (read: rows of data) out of SuccessFactors and sent one .csv file out.
HCI doesn't display the number of documents in or out, and takes only 16 seconds to complete. Of course this is just one simple test, but I think the speed difference is remarkable.

There are two things that I would love to see in HCI in the near future. In Boomi you have the possibility to “test” the iFlow before actually deploying it to your atom. If your iFlow requests too much data, it will only take the first 10 documents and run the iFlow as if it were deployed. In HCI you have to deploy the whole iFlow to your tenant and run it to see whether the changes you made were successful.
The second thing is the way Boomi handles multiple accounts. I now have one email address to log in with, and from there I can service all my customers. This saves me a lot of time, and I'm very curious in what way HCI is going to develop in this matter.

The thing I like a lot in HCI is the trace functionality. Your iFlow becomes decorated with envelopes where you can see the data that is flowing through. In Boomi the trace function is much more basic, and you cannot directly see what data is going through the different steps.

 

16) trace HCI.png

Trace functionality in HCI


16.1) trace Boomi.png

Trace functionality in Boomi

 

As explained, there are some differences between Boomi and HCI, but definitely also a lot of similarities between the two. For now, the community behind Boomi is a lot bigger, but I'm sure HCI will continue to develop more standard packaged integrations and become more established within the SuccessFactors EC community.

 

I’m very curious whether you are working with HCI or Boomi and what your experiences are with both systems. Please let me know in the comments or drop me a mail at bob@nextmoves.nl

 

 

Blog 1:Starting with Hana Cloud Integration? Keep this in mind!
Blog 2:Starting with Hana Cloud Integration? Create a simple integration flow (iFlow)!

Blog 3:Deep dive in Hana Cloud Integration.

Blog 4:Hana Cloud Integration and SuccessFactors Integration Center

Blog 5: Hana Cloud Integration in comparison to Dell’s Boomi.

Copy files without mapping PI 7.4 java Only (Copy files without changes)


On many occasions we copy files from one application to another without changes, so we do not need a mapping, and these files can be of any type.

 

For such cases, we create dummy interfaces as shown in the example.

 

I hope it will be useful.


1. Create communication channel for sender

28-09-2015 12-08-24.png

2. Create communication channel for receiver

28-09-2015 12-11-33.png

3. Create Integrated Configuration with dummy interfaces

 

Inbound and sender channel

28-09-2015 12-24-50.png

Outbound and receiver channel

28-09-2015 12-26-17.png

28-09-2015 12-35-22.png

4. Create a Configuration Scenario (you can choose with or without a model) and add the objects

28-09-2015 12-31-48.png


Ready

How easy to get started on HANA Cloud Integration


After working for more than four years on SAP's mobile solutions, at this point in my career I feel a bit saturated. I find it difficult to plan my career ahead, as I am confused about what I should update myself on. Usually I do not get hooked on titles and positions; I focus on a role that I will enjoy most and which will bring more opportunities to learn. I have sometimes thought about changing technology to check whether I could still acquire new skills quickly.

 

Recently I got an opportunity to put my hands on a technology I had never tried before - HANA Cloud Integration. I looked at it from two angles.

  • It would be a big addition to the current skills I possess.
  • And it is a big opportunity to work in a cloud based solution, which is the future.

How did I get started?

Right, it is uncomfortable to imagine being a beginner in something again. To learn HCI, I started by reading the documentation and following the video tutorials. But that left me uninterested, because it is not the way I am used to learning. I believe that the time taken to learn a new skill is considerably shorter if we actually sit down and practice instead of reading a bunch of documents.

 

Hence, after understanding the basics of HCI, I instantly took on the responsibility of building a new package for HCI (which will be part of the product soon). And it made me comfortable with the tool within a couple of weeks.

 

Some of the features in the tool excited me. The tracing feature especially tells us that HCI was developed with developer friendliness in mind.

2015-10-05_16-58-15.png

I haven't tried all the features of HCI, but after spending a few weeks with the tool I can guarantee that it is really easy to learn.

 

Here is the thing!

While learning a new skill you don't have to worry about mastering it. Understand what HCI is and why it is needed, then start practicing (that's the key). For a couple of weeks you just need to practice enough to get the results you want.

 

Here are the steps you could follow to start with.

  • Get a trial instance of HCI.
  • Set up the development environment (Eclipse).
  • Create a simple project - Try this iflow.

 

For more hands on follow these blogs.

 

Happy learning.

Using EGit for Java source code management in NWDS


Introduction

Java development is becoming more common as PI/PO moves to the single Java-only stack. Pure Java mappings and adapter module developments are commonplace these days to tackle the myriad challenges faced in integration projects. Together with the continuous enhancements of the PI-centric functionality in the Eclipse-based NWDS, NWDS is poised to become the single IDE for all PI/PO development.

 

While ESR development and ID configuration changes are automatically tracked by PI's internal version control, there is no built-in version control system for Java mapping and adapter module developments. It is not uncommon for the Java source code to be developed and maintained on a developer's local computer, with the compiled JAR/EAR files deployed manually to the server. This approach is a potential risk for an organization: the developer might leave, or the computer could get lost or damaged.

 

SAP's own offering to address this gap is the NetWeaver Development Infrastructure (NWDI). It is a full-blown server-based solution providing a complete Java development environment, which covers not just version control but also landscape management, a transport & deployment mechanism and much more.

 

However, not many organizations have NWDI or are willing to invest in it. The cost may be prohibitive with low ROI, as most organizations that use SAP are still quite ABAP-centric with only minimal Java development.

 

In this blog, I will introduce Git as an alternative to NWDI for version control. It is an open-source, lightweight and popular source code management tool. In particular, there is an Eclipse-based EGit plugin that can be used within NWDS.

 

 

Installation

To install the plugin in NWDS, click Help > Install New Software

 

Unfortunately, as NWDS 7.31 is based on Galileo, an old version of Eclipse, it is only compatible with an old version of the EGit plugin, version 2.1. Therefore, add the following update site for EGit version 2.1: http://archive.eclipse.org/egit/updates-2.1

site2.png

Select Eclipse EGit and proceed with the rest of the installation.

install_egit.png

 

 

Initial Configuration

After the installation has completed, NWDS has to be restarted. Upon restart, it will usually show the following two warnings.

 

Warning 1 - HOME environment variable

home.png

To fix this, edit the environment variables on your computer and add HOME as a new user variable pointing to an appropriate directory of your choice.

envi.png

var.png

 

Warning 2 - Git installation

This can be ignored, so tick the "Do not warn" checkbox and proceed.

warn.png

 

Additionally, configure the following basic user settings under Preferences > Team > Git > Configuration

  • core.autocrlf = false
  • user.email = <your email>
  • user.name = <your name>

config.png
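These settings end up as plain entries in the .gitconfig file in your HOME directory. The equivalent file content would look roughly like this (the email and name values are placeholders):

```ini
[core]
	autocrlf = false
[user]
	email = your.name@example.com
	name = Your Name
```

Editing the dialog and editing this file have the same effect; EGit simply writes the values through to .gitconfig.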

 

 

Basic Usage of EGit

I'll cover some basic usage of EGit to begin with. For further information, please refer to the EGit User Guide in the references section.

 

Create Git repository

Change to Git Repository Exploring perspective. Click the "Create a new Git Repository" button
create.png

Provide the details for the new repository. Note that the user guide recommends not creating the Git repository within the Eclipse workspace.

repo.png

 

Import existing project into Git repository

Switch back to Java EE perspective or any perspective with Project Explorer. Right click on project to be added and select Team > Share Project. Then select Git as the repository type.

share1.png

Select the Git repository that was created previously. Once the configuration is complete, the source code of the project will be moved to the directory of the Git repository.

share.png

 

Add files to be tracked and commit changes

Right click on project node and select Team > Add to Index

index.png

 

Subsequently, right click on the project node and select Team > Commit to commit the files to be tracked by EGit. Note that binary .class files do not need to be committed; they can be excluded from further commits by configuring a .gitignore file.

commit.png
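A minimal .gitignore for such a project could look like this (the bin/ folder name is an assumption; use whatever output folder your project is actually configured with):

```
# compiled output - rebuilt from source, no need to version it
*.class
bin/
```

Place the file in the project root and commit it, so every clone of the repository ignores the same files.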

 

 

Conclusion

With EGit in NWDS, we have a simple solution that provides a source code management system for Java developments for PI/PO. To reduce the risk of valuable Java projects and source code residing only on a local machine, the Git repository can be hosted on a network directory, or even on a managed remote server.

 

 

References

EGit User Guide

NetWeaver Development Infrastructure (NWDI) SCN Wiki

Adapter overview PI / PO 7.31 (7.4)


Ever wished for a central list with the most important details about the standard adapters available for PI/PO?

 

Here it comes... Comments and improvement proposals are welcome.

 

Rename the file to .pdf to read it!


A1.PNG

A2.PNG

A3.PNG

A4.PNG


INT111 Best practice for SAP Integration


I am looking forward to hosting the best practices session I have previously announced.

I have been working with SAP PI/PRO for over 10 years, and have learned quite a bit about what to do when it comes to integration. For over a decade, I have been working with consultants and users from all over the world.

 

There are quite a number of dos and don’ts – you must know these so that you don’t fall into the same traps as other people who have already made those mistakes.

The biggest issue is that there is no 100% effective best practice that works in all situations. You must know what the limits are, and when to apply one best practice or another.

There is a series on my blog that will cover some of the topics presented here.

 

I will be covering the following topics:

  • best practices
  • developer guidelines
  • documentation
  • skills
  • landscape
  • support
  • custom development
  • B2B

 

I will be covering some of the mapping patterns that can help with certain mapping issues you might encounter. If you don’t follow them, you may end up having issues with contexts in your mappings. I have seen situations in which people were not following the guidelines, and this led to a poor understanding of what was happening. Sometimes it can take a long time to identify that there is an issue with the contexts.

See you on Thursday, the 22nd of October 2015, between 08:00 a.m. and 09:00 a.m. at the San Polo 3503 meeting room in Las Vegas.
You can find the session at the agenda builder.

- Cross posted at : http://picourse.com/int111-best-practices-in-sap-integration

My session picks for SAP Teched Las Vegas


I’ve gone through the list of this year’s SAP TechEd Las Vegas sessions and compiled a list of the most interesting ones (from an integration perspective). I hope to attend as many as possible, though I’m not sure whether I can make it to all of them, as I may have other commitments.

 

If you work in integration and wish to stay up-to-date with the news and innovations happening right now in the world of SAP, I strongly recommend attending at least a few of these sessions. If you have other recommendations, please list them in the comments section, and spread the word to all SAP enthusiasts and professionals.

I am looking forward to attending the following sessions (if my own sessions allow it):

INT360 – B2B Add-On Solution for SAP Process Orchestration

Figuring out how to use the B2B add-on is always interesting. This year, the presentation will cover trading partner management. The basics are not that difficult.

EXP27523 – IFG for Integration: Survey Results on SAP Technology Integration

This is my session, where I will cover the results of the IFG (International Focus Group for Integration) survey. We will have a conversation about what is interesting in SAP integration, and discuss the replies to the survey.

INT264 – Connect to Third-Party Cloud Systems with SAP HANA Cloud Integration

I’m helping out with this hands-on session, in which we will build our own adapter from the Camel repository and use it for some integration.

EXP27089 – Interface Serialization with SAP Application Interface Framework

Michael’s session on the AIF (Application Interface Framework). It seems like it will be a topic that is relevant for a lot of integration.

INT101 – What is new in SAP Process Orchestration

I think this is a must, just to keep up with the new innovations that are happening in the SAP world. There are quite a few things happening, as I can see it from the presentation. I hope they will also cover some details regarding the 7.5 release in this presentation.

INT104 – OData Service Deployment Options for On-Premise, Mobile and Cloud Scenarios

I would add this to learn more about the OData direction that SAP is heading toward. I see it as a topic that you must be able to thoroughly understand.

INT100 – Integration and Orchestration: Overview and Outlook

This is about all the different integration products SAP has. I think it does contain a good list of the different sessions to attend if you want to learn more about each topic.

EXP27229 – Best Practices for Building UIs in SAP BPM

I would love to hear this expert session talk. I have been developing some UIs (user interfaces) for BPM, but I’d like to hear their recommendations.

EXP27362 – Innovate with SAP API Management and SAP Gateway

After a week with APIGEE, I believe it is a really interesting topic that we must be able to master as developers. I do hope that it will contain some coding.

TT100 – API First: The New Enterprise Architecture for Systems of Engagement

This is the APIGEE design approach about making APIs (application programming interfaces) both internally and externally.

INT261 – Build UIs Based on SAP Fiori to Enable Mobile Access to SAP BPM

Another session that I’m helping out with. We will be covering how to make UI5/Fiori screens on the BPMN (Business Process Model and Notation).

INT300 – Tune a Process Orchestration Scenario for High Performance

It is always interesting to see how it is possible to improve the performance of the PRO system.

INT201 – How to Deal with Integration and Orchestration Aspects of IoT Scenarios

This is also an interesting topic moving forward, about how the IoT (Internet of Things) should be handled.

INT111 – SAP Best Practice Integration with SAP Process Orchestration and Others

This is also my session, in which I’ll be covering what I see as best practices in the integration world. I hope you will come join in.

INT205 – Simplify Your B2B Integration with the Integration Advisor Tool

Integration Advisor can be a tool that will save us a lot of time on B2B integrations. I would like to see how it works, and how to implement it on customers’ systems.

INT210 – Moving to SAP Process Orchestration B2B Add-On

Should be interesting to see if there is any good way of handling migration from Seeburger or other tools to the B2B add-on.

INT103 – Apply REST with SAP API Management

A presentation that goes into some of the REST design principles that are out there.

You can find the sessions on the agenda builder:

https://sessioncatalog.sapevents.com/go/agendabuilder.myagenda/?l=111&locale=en_US

-  Originally posted on: http://picourse.com/sap-teched-sessions-on-sap-integration/

XML Extended Handling - Function module to build CONTROLLER table


INTRODUCTION


     Sometimes, when dealing with synchronous interfaces, you need your proxy implementation to return optional fields even if they are initial. This can be done by activating the XML Extended Handling feature and populating the CONTROLLER tables of your output structure accordingly. I have created a pair of function modules to make this job easier:

 


FUNCTION MODULE - ZPI_UTIL_XML_EXTENDED_HANDLING

 

  • Description: It activates/deactivates the XML Extended Handling feature.
  • Parameters:
    • ACTIVATE: 'X' to activate and ' ' to deactivate the XML Extended Handling feature.

 


FUNCTION MODULE - ZPI_UTIL_BUILD_CONTROLLER


  • Description: It walks through the input structure looking for initial fields. Every time an initial field is found, an entry is inserted into the corresponding CONTROLLER table with the desired control flag.
  • Parameters:
    • CNTRL_FLAG: Field Control in XML Data Stream (=> Type Group SAI).
    • OUTPUT: Output structure.

 

 

 

THE CODE

 

 

ZPI_UTIL_XML_EXTENDED_HANDLING



function zpi_util_xml_extended_handling.
*"----------------------------------------------------------------------
*"*"Local Interface:
*"  IMPORTING
*"     REFERENCE(ACTIVATE) TYPE  ACT DEFAULT 'X'
*"----------------------------------------------------------------------

  data: lo_server_context   type ref to if_ws_server_context,
        lo_payload_protocol type ref to if_wsprotocol_payload.

  lo_server_context = cl_proxy_access=>get_server_context( ).

  lo_payload_protocol ?= lo_server_context->get_protocol( if_wsprotocol=>payload ).

  lo_payload_protocol->set_extended_xml_handling( extended_xml_handling = activate ).
endfunction.


ZPI_UTIL_BUILD_CONTROLLER



function zpi_util_build_controller.
*"----------------------------------------------------------------------
*"*"Local Interface:
*"  IMPORTING
*"     REFERENCE(CNTRL_FLAG) TYPE  PRX_CONTR
*"  CHANGING
*"     REFERENCE(OUTPUT)
*"----------------------------------------------------------------------

  perform build_controller_table using output cntrl_flag.
endfunction.

 

*&---------------------------------------------------------------------*
*&      Form  BUILD_CONTROLLER_TABLE
*&---------------------------------------------------------------------*
*       text
*----------------------------------------------------------------------*
*      -->P_OUTPUT  text
*      -->P_CNTRL_FLAG  text
*----------------------------------------------------------------------*
form build_controller_table  using    p_output     type any
                                      p_cntrl_flag type prx_contr.

  data: descr_output  type ref to cl_abap_typedescr,
        struc_output  type ref to cl_abap_structdescr,
        ls_components type abap_compdescr,
        ls_prxctrl    type prxctrl,
        lv_initial    type act.

  field-symbols: <fs_controller_tab> type prxctrltab,
                 <fs_field>          type any,
                 <fs_table>          type any table,
                 <fs_output_line>    type any,
                 <fs_output_tab>     type any table,
                 <fs_output>         type any.

  descr_output = cl_abap_typedescr=>describe_by_data( p_output ).

  case descr_output->type_kind.

*   -------------------------------------------------
*   - P_OUTPUT is either a Flat or a Deep structure -
*   -------------------------------------------------
    when cl_abap_typedescr=>typekind_struct1 or "Internal type u (flat structure)
         cl_abap_typedescr=>typekind_struct2.   "Internal type v (deep structure)

      struc_output ?= descr_output.
      assign p_output to <fs_output>.
      if <fs_output> is assigned.
        loop at struc_output->components into ls_components.
          assign component sy-tabix of structure <fs_output> to <fs_field>.

*         -------------------------------------------------------------------
*         -- Inside P_OUTPUT - Flat/Deep Structure: CONTROLLER table found --
*         -------------------------------------------------------------------
          if ls_components-name eq 'CONTROLLER'.
            if <fs_field> is assigned.
              assign <fs_field> to <fs_controller_tab>.
            endif.

*         ---------------------------------------------------------------
*         --  Inside P_OUTPUT - Flat/Deep Structure: Other field found --
*         ---------------------------------------------------------------
          else.
            case ls_components-type_kind.

*             --------------------------------------------------------------------------------
*             --  Inside P_OUTPUT - Flat/Deep Structure: Flat and Deep Structures treatment --
*             --------------------------------------------------------------------------------
              when cl_abap_typedescr=>typekind_struct1 or "Internal type u (flat structure)
                   cl_abap_typedescr=>typekind_struct2.   "Internal type v (deep structure)
                if <fs_field> is assigned.
                  if ls_components-type_kind eq cl_abap_typedescr=>typekind_struct2.
                    perform build_controller_table using <fs_field> p_cntrl_flag.
                  else.
                    if <fs_field> is not initial.
                      perform build_controller_table using <fs_field> p_cntrl_flag.
                    endif.
                  endif.
                endif.

*            ----------------------------------------------------------------------
*            -- Inside P_OUTPUT - Flat/Deep Structure: Internal Tables treatment --
*            ----------------------------------------------------------------------
              when cl_abap_typedescr=>typekind_table.     "Internal Type h (Internal Table)
                if <fs_field> is assigned.
                  assign <fs_field> to <fs_table>.
                  if <fs_table> is assigned.
                    if <fs_table>[] is not initial.
                      perform build_controller_table using <fs_table> p_cntrl_flag.
                    endif.
                  endif.
                endif.

*            -------------------------------------------------------------------
*            --  Inside P_OUTPUT - Flat/Deep Structure: Other types treatment --
*            -------------------------------------------------------------------
              when others.
                if <fs_controller_tab> is assigned.
                  assign component sy-tabix of structure <fs_output> to <fs_field>.
                  if <fs_field> is assigned.
                    if <fs_field> is initial.
                      ls_prxctrl-field = ls_components-name.
                      ls_prxctrl-value = p_cntrl_flag.
                      append ls_prxctrl to <fs_controller_tab>.
                    endif.
                  endif.
                endif.
            endcase.
          endif.
        endloop.
      endif.

*   ---------------------------------
*   - P_OUTPUT is an Internal Table -
*   ---------------------------------
    when cl_abap_typedescr=>typekind_table. "Internal Type h (Internal Table)
      assign p_output to <fs_output_tab>.
      if <fs_output_tab> is assigned.

*       -----------------------------------------------------------------
*       --  Inside P_OUTPUT - Internal Table: Treatment of each record --
*       -----------------------------------------------------------------
        loop at <fs_output_tab> assigning <fs_output_line>.
          perform build_controller_table using <fs_output_line> p_cntrl_flag.
        endloop.
      endif.

    when others.
  endcase.
endform.                    " BUILD_CONTROLLER_TABLE


INVOCATION CODE


These two functions can be used in proxy implementations once the output structure is populated.

 

 

call function 'ZPI_UTIL_XML_EXTENDED_HANDLING'
  exporting
    activate = 'X'.

call function 'ZPI_UTIL_BUILD_CONTROLLER'
  exporting
    cntrl_flag = sai_ctrl_initial
  changing
    output     = output.


Anonymous logon for SOAP Adapter


I would like to present a new way of accessing the SOAP adapter with anonymous logon, without changing the adapter configuration itself and without using any kind of header rewriting.

 

With the use of a simple servlet deployed in a web application, it is possible to do cross-context dispatching. With this little "trick", anonymous logon for the SOAP adapter becomes possible. But be aware of security: you need to extend this simple example to control access.

 

Servlet source code 

 

import java.io.IOException;
import javax.servlet.ServletException;
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;
public class AnonymousSoapServlet extends HttpServlet {

    private static final long serialVersionUID = 1L;

    @Override
    protected void doPost(HttpServletRequest req, HttpServletResponse resp)
            throws ServletException, IOException {
        getServletContext().getContext("/XISOAPAdapter")
            .getRequestDispatcher("/MessageServlet").forward(req, resp);
    }

    @Override
    protected void doGet(HttpServletRequest req, HttpServletResponse resp)
            throws ServletException, IOException {
        getServletContext().getContext("/XISOAPAdapter")
            .getRequestDispatcher("/MessageServlet").forward(req, resp);
    }
}
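Since the servlet above must not stay wide open, one simple way to control access is an explicit allow-list checked before the forward. The sketch below is only an illustration of that idea: the class name, the allow-list contents, and the choice of keying on a request parameter are my assumptions, not part of the original post.

```java
import java.util.Arrays;
import java.util.HashSet;
import java.util.Set;

// Hypothetical helper: keeps an explicit set of permitted interface names
// and denies everything else (including a missing parameter).
class InterfaceAllowList {

    private final Set<String> allowedInterfaces;

    InterfaceAllowList(String... interfaces) {
        this.allowedInterfaces = new HashSet<>(Arrays.asList(interfaces));
    }

    // Returns true only when the requested interface is explicitly allowed.
    boolean isAllowed(String requestedInterface) {
        return requestedInterface != null
            && allowedInterfaces.contains(requestedInterface);
    }
}
```

In doPost/doGet one would then reject unlisted callers before forwarding, for example with resp.sendError(HttpServletResponse.SC_FORBIDDEN) when isAllowed(...) returns false.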

DynamicConfigurationBean in SFTP receiver adapter


Using the DynamicConfigurationBean with variable substitution to rename a file in the receiver SFTP adapter.

Overview:

Dynamic configuration parameters can be used to modify the filename, path and other parameters on the receiver side. This is generally achieved via a UDF or Java mapping. However, SAP provides this functionality through a standard bean, which enables us to modify these parameters without writing any code.

Requirement:

PI Version : 7.3 dual stack

Our requirement was to transfer a file using SFTP adapters from a source file server to a target file system. No mapping was required. Additionally, the file also had to be written to an “AUDIT” directory in zip format.

The name of the file in the output directory and the name of the zip file in the AUDIT directory have to be the same as the source file name.

So if the source file name is AB_YYYYMMDD_HHMMSS.txt, then the output file also has to be AB_YYYYMMDD_HHMMSS.txt, and the zip file name in the AUDIT directory has to be AB_YYYYMMDD_HHMMSS.zip.

For achieving the zip functionality, we used PayloadZipBean.

However, the zip file was named AB_YYYYMMDD_HHMMSS.txt (instead of with a .zip extension) because ASMA was enabled in the receiver adapter.

If we unchecked ASMA on the receiver channel for the zip file, we had to provide a static name in the “File name” parameter, such as “AB.zip”, and then use the “Append Timestamp” option. But this did not satisfy our requirement of having the zip file name match the source file name.

Also, since it was a pass-through scenario, we wanted to avoid any kind of graphical or Java mapping.

Using the OS command option was the last resort; however, we first wanted to try whether we could fulfill this requirement using a standard bean.

Solution:

We were able to achieve the business requirement by using a combination of variable substitution and the DynamicConfigurationBean, as described below.

  1. Enable the ASMA option in the sender file adapter.
  2. Enable variable substitution in the receiver adapter that creates the zip file, and provide a variable name. In the reference field, assign it the value message:interface_name.
  3. Ensure that the ASMA option is disabled in this receiver adapter, because ASMA has higher priority and the file would otherwise be named .txt irrespective of all other configurations.

IMG-1.jpg

 

  4.  Mention the variable used in step 2 in the Filename parameter and append “.zip” as the extension.

IMG-2.jpg

5. In the module tab, add the DynamicConfigurationBean and the PayloadZipBean as shown below.

IMG-3.jpg

The write option of the DynamicConfigurationBean writes the name of the source file into the header field message.interface. This value is then substituted into the Filename parameter via variable substitution, and since the “.zip” extension is already hardcoded there, we get a zip file with the .zip extension.
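For comparison, the renaming that this configuration achieves declaratively amounts to very little logic if written in code. The sketch below is purely illustrative (the class and method names are invented here; the scenario above deliberately avoids custom code): it strips the source extension and appends “.zip”, mirroring what variable substitution plus the hardcoded extension produce.

```java
// Illustrative only: the txt -> zip rename as plain Java.
class ZipFileNamer {

    // AB_YYYYMMDD_HHMMSS.txt -> AB_YYYYMMDD_HHMMSS.zip
    static String toZipName(String sourceFileName) {
        int dot = sourceFileName.lastIndexOf('.');
        // Keep the full name when there is no extension to strip.
        String base = (dot >= 0) ? sourceFileName.substring(0, dot) : sourceFileName;
        return base + ".zip";
    }
}
```

For example, toZipName("AB_20150101_120000.txt") returns "AB_20150101_120000.zip".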

Log from communication channel –

IMG-4.jpg.png

IMG-5.png

 

Related content

http://scn.sap.com/thread/3562745

http://scn.sap.com/people/stefan.grube/blog/2009/06/19/unknown-use-case-of-dynamicconfigurationbean-store-file-name-to-jms-header-without-mapping
