Channel: Process Integration (PI) & SOA Middleware

How to monitor IFlows PI 7.3.1


When working with IFlows, you can choose from several patterns:

  • Point-to-Point Channel : You select this pattern for direct communication between the sender and receiver.
  • Recipient List : You select this pattern when you need to route the message to two or more receivers based on conditions.
  • Message Translator : You select this pattern when the sender and receiver systems have different message formats.
  • Recipient List (Dynamic Conditions) : You select this pattern when you need to dynamically route the messages to the receivers.

 

Also take a look at this link: Creating an Integration Flow - Process Integration Tools (Eclipse-Based) - SAP Library

 

Once you have done the configuration described by Michal in his blog PI/XI: Eclipse based integration flows - how to configure them with PI 7.31, follow the steps below to monitor the IFlow.

 

First of all, you need a test client to execute the IFlow; you can use SoapUI or XMLSpy.

Test_Client.JPG

Once you execute the IFlow, go to NWDS, right-click on the IFlow you want to monitor and select the option Monitor Integration IFlow.

MonitorIntegrationWFL.jpg

Scroll down to the "Processing Details" section and select the corresponding message ID.

CommunicationChannel Entry.JPG

Once you click on the message ID, the Message Monitoring page will appear.

MessageMonitoring Page.JPG

Select the entry and press the "Open Message" button.

Once you are on the "Message Editor" page, you will see in the "CAPTION" column the logging/staging steps selected in the ICO's configuration.

LoggingStaging.jpg

If you want to monitor the structure of the request message, just select the table entry and the Payload tab.

RequestStructure.JPG

 

If you want to monitor the structure of the target message, just select the table entry and the Payload tab.

TargetStructure.JPG

 

 

Thanks.


Simple Way To Find SAP PI/PO Java Class JAR files location


In typical SAP PI development projects, finding a SAP PI/PO Java class JAR file is sometimes a tedious job for various reasons, e.g. the JAR file name might have changed in a new PI version, the JAR file location may have changed on the PI server, etc. There are a couple of good blogs on this topic on SCN (check the references).

 

Well, this blog is just an extension to the wonderful blog concept: Finding com.sap.guid.IGUID (or any other class) on the PI Server by Markus Windhager

 

All that you have to do is use the UDF code below in a draft mapping program and then execute it from the test tab with the required package/class name as input. With this UDF we can find any PI JAR file location without much effort.

 

FindJarFileUDF code

** add import statement: java.net.URL

String str = "";
try {
    // Class c = Class.forName("com.sap.guid.IGUID");
    Class c = Class.forName(var1);
    // The code source location is the JAR (or directory) the class was loaded from
    URL loc = c.getProtectionDomain().getCodeSource().getLocation();
    str = loc.toString();
    // audit.addAuditLogEntry(key, AuditLogStatus.SUCCESS, c + " found at " + loc);
} catch (Exception e) {
    str = e.toString();
}
return str;
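
Outside of PI, you can sanity-check the same technique with a small standalone program. This is a minimal sketch of my own (not part of any SAP API); run it with the class name as a command-line argument on a classpath that contains the class:

import java.net.URL;

public class FindJarLocation {
    public static void main(String[] args) throws ClassNotFoundException {
        // Load the class by its fully qualified name, e.g. "com.sap.guid.IGUID"
        Class<?> c = Class.forName(args[0]);
        // Bootstrap classes (e.g. java.lang.String) have no code source, so guard against null
        URL loc = (c.getProtectionDomain().getCodeSource() == null)
                ? null
                : c.getProtectionDomain().getCodeSource().getLocation();
        System.out.println(c.getName() + " found at " + loc);
    }
}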

 

GM_FindJarFileUDF.jpg

Check the results yourself and share your feedback.

 


References

 

Finding com.sap.guid.IGUID (or any other class) on the PI Server by Markus Windhager

 

How to Find Java Library During PI Development by William Li

 

How to Find Java Library Resource File Possessing Information for Class Name by Vadim Klimov

Export Integration Flows from NWDS To Desktop !!!


The reason for writing this blog is that a PI consultant posted a thread on SCN (http://scn.sap.com/thread/3544808) about an issue while exporting an integration flow to the file system or desktop. So I would like to show the steps to be followed to export an integration flow to the file system.

  1. Log in to the Integration Directory in NWDS.
  2. To export an integration flow, go to the option shown in the picture below.

1.jpg

3. Select Integration Flow as the drop-down option, as shown in the next screen. --> 2.jpg

4. Choose the flow that needs to be exported. --> 3.jpg

5. Choose Next and then select the objects involved in the flows. --> 4.jpg

6. In the next screen, enter the file name under which the integration flows will be saved, and use the Browse button to select the target location. --> 5.jpg

 

7. After pressing OK, the flows will be exported to the desktop. --> 6.jpg

8. The file is available on the desktop. --> 7.jpg

 

 

That's it, very simple to export.


Thanks & Regards,

avinash M

How to Synchronize GUID between Enterprise Service Repository and System Landscape Directory Software Component Versions


Introduction: To begin development work in the Enterprise Service Repository (ESR) of PI/PO, a software component version (SWCV) is normally created in the SLD and imported into the ESR. However, there are situations in which a software component has been created locally in the ESR and the same needs to be maintained in the SLD. It is also possible that the SWCV originally existed in the SLD and was imported into the ESR, but somehow went missing in the SLD (maybe it was deleted). This means a new SWCV with the same name will need to be created in the SLD; the challenge is that this new SWCV will have a new GUID, different from the one in the ESR. This article gives an overview of how to maintain the same GUID for a SWCV in the SLD and the ESR. Some of the steps below can be skipped depending on the stage of the problem you are at.

Step by Step Activities

  1. Create the SWCV locally in ESR

SWCVCreate Object.gif

2. The software component version has the following details when created:

SyncESR_SLDSWCV - Word2.gif

3. To maintain this same software component version in the SLD, a number of steps will be performed. The goal is to keep the GUID of the SWCV created locally the same as the one that will be created in the SLD.

4. Create a new product for this SWCV

5. Create the same SWCV with the same name in the SLD under the product

SyncESR_SLDSWCV - Word5.gif

6. After creating the SWCV, follow the steps below to change its GUID to the one created locally in the ESR

7. Go to the SLD homepage, click on the “Administration” tab

SyncESR_SLDSWCV - Word7.gif

8. Under the content tab, select the option "Maintenance"

SyncESR_SLDSWCV - Word8.gif

9. Under "Maintain content on this SLD", choose the option "CIM Instances"

SyncESR_SLDSWCV - Word9.gif

10. Under “View and Maintain Data on CIM Level”, make the following settings

SyncESR_SLDSWCV - Word10.gif

11. Search for the new SWCV created earlier in the SLD and change the GUID to the one in the ESR using the "Properties" tab

Note: When you copy the ID of the SWCV from the ESR, it is in this format: "40dab440e5bd11e385a60000008396ea". When you change it in the SLD, however, you have to use this format: "40dab440-e5bd-11e3-85a6-0000008396ea". The difference is that you have to insert 4 hyphens. To determine the positions of the hyphens, do the following:

a) Count the first 8 characters and put a hyphen after the 8th character

b) Count the next 4 characters and put a hyphen

c) Count the next 4 characters and put a hyphen

d) Count the next 4 characters and put a hyphen

After this, you should have the second format shown above. You could also compare with an existing SWCV in the SLD and ESR to confirm.
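
If you have to do this conversion often, a small helper can do the counting for you. This is a minimal sketch of my own (the method name is illustrative, not from any SAP API):

// Formats a 32-character ESR GUID into the hyphenated 8-4-4-4-12
// form expected by the SLD CIM editor.
public static String toSldGuid(String esrGuid) {
    if (esrGuid == null || esrGuid.length() != 32) {
        throw new IllegalArgumentException("Expected a 32-character GUID");
    }
    return esrGuid.substring(0, 8) + "-"
         + esrGuid.substring(8, 12) + "-"
         + esrGuid.substring(12, 16) + "-"
         + esrGuid.substring(16, 20) + "-"
         + esrGuid.substring(20);
}

For the example above, toSldGuid("40dab440e5bd11e385a60000008396ea") returns "40dab440-e5bd-11e3-85a6-0000008396ea".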

SyncESR_SLDSWCV - Word11.gif

12. Save the changes and you will have created a SWCV with the same name and GUID as the one in the ESR


Conclusion: With the steps above performed, you will be able to sync the GUID between your SLD and ESR. To test this, try to import the SWCV again into your ESR: you will not find it in the list, because it already exists in the ESR. Also, if you create a dependency on the SWCV from another SWCV and update the SLD info of this dependent SWCV, it should be updated successfully.

Merge multiple files without BPM


BpmPatternCollectMultiIf is the classical BPM pattern used for n:1 scenarios. A classic scenario is to pick multiple files (in different formats), merge them and send a single output to the receiver. Obviously, given its issues, one will go for BPM only when no better option seems feasible. I am sure many would have come across the same requirement where BPM seemed to be the only option.

 

In this blog I am discussing an approach with which BPM can be avoided.

By the way, I myself am guilty of creating a BPM for a similar requirement... picking 2 files (same folder), merging them and posting an IDoc to the receiver. I now ask myself: why didn't I consider this approach back then!

 

Requirement:

  • Read 2 files (XML/text) from a folder, merge them and post the data to the receiver.

Note: there is no message-based correlation here; we just pick the existing files and process them.

 

Approach:

1) Read the 2 files using the File sender adapter, one as the main payload and the other as an application attachment.

- Check Additional File(s) and provide the file name pattern

- Read the 2nd file as mandatory (unless both files are available, the files will not be picked):

AttachmentFile1.optional = NO

 

FileAdapter_AdditionalFiles.png


Note: FCC cannot be used for both files. In such a case, one can use MessageTransformBean and PayloadSwapBean in the sender channel.
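
For orientation, a hedged sketch of what the sender channel's module tab could look like: the bean names are the standard AF_Modules ones, but the parameter values here are assumptions for this scenario (swap.keyValue must match the name of the attachment to be promoted to the main payload, and a MessageTransformBean entry with its conversion parameters would be added wherever content conversion is needed):

Processing Sequence:
1  AF_Modules/PayloadSwapBean   Local Enterprise Bean   swap
2  CallSapAdapter               Local Enterprise Bean   0

Module Configuration:
swap   swap.keyName    payload-name
swap   swap.keyValue   AttachmentFile1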


2) Create mapping to merge Main payload and Attachment(s)

- Check "Read Attachments" in the operation mapping

OperationMapping_ReadAttachments.png

 

This is how we can merge the two: the MainDocument and the ApplicationAttachment. Below are File 1 (the main document), File 2 (the attachment) and the result structure after the Java mapping.

File 1 (Main Document):

<?xml version="1.0" encoding="UTF-8"?>
<ns0:MT_Sender xmlns:ns0="http://AttachmentTest">
   <MainDoc>
      <Row>
         <StudentID>190</StudentID>
         <StudentName>Lorum</StudentName>
         <StudentStream>MS</StudentStream>
      </Row>
      <Row>
         <StudentID>191</StudentID>
         <StudentName>Ipsum</StudentName>
         <StudentStream>B.Tech</StudentStream>
      </Row>
   </MainDoc>
</ns0:MT_Sender>

File 2 (Attachment):

<?xml version="1.0" encoding="UTF-8"?>
<ns0:MT_Attachment xmlns:ns0="http://AttachmentTest">
   <Attachment1>
      <Row>
         <StudentID>190</StudentID>
         <CourseData>
         ...
         </CourseData>
      </Row>
      <Row>
         <StudentID>192</StudentID>
         <CourseData>
         ...
         </CourseData>
      </Row>
   </Attachment1>
</ns0:MT_Attachment>

Result structure (after Java mapping):

<?xml version="1.0" encoding="UTF-8"?>
<ns0:MT_Intermediate xmlns:ns0="http://AttachmentTest/">
   <MainDoc>
      <Row>
         <StudentID>190</StudentID>
         <StudentName>Lorum</StudentName>
         <StudentStream>MS</StudentStream>
      </Row>
      <Row>
         <StudentID>191</StudentID>
         <StudentName>Ipsum</StudentName>
         <StudentStream>B.Tech</StudentStream>
      </Row>
   </MainDoc>
   <Attachment1>
      <Row>
         <StudentID>190</StudentID>
         <CourseData>
         ...
         </CourseData>
      </Row>
      <Row>
         <StudentID>192</StudentID>
         <CourseData>
         ...
         </CourseData>
      </Row>
   </Attachment1>
</ns0:MT_Intermediate>

 

 

Sample Java Mapping Code

package com.sap.JavaMapping;

import java.io.*;
import java.util.*;
import javax.xml.parsers.*;
import javax.xml.transform.*;
import javax.xml.transform.dom.DOMSource;
import javax.xml.transform.stream.StreamResult;
import org.w3c.dom.*;
import org.xml.sax.SAXException;
import com.sap.aii.mapping.api.*;

public class MergeAttachment extends AbstractTransformation {

    public void transform(TransformationInput in, TransformationOutput out)
            throws StreamTransformationException {
        InputAttachments inputAttachments = in.getInputAttachments();
        InputStream inputstream = in.getInputPayload().getInputStream();
        OutputStream outputstream = out.getOutputPayload().getOutputStream();
        try {
            // Parse the main payload into a DOM document
            DocumentBuilderFactory dbfactory = DocumentBuilderFactory.newInstance();
            DocumentBuilder dbuilder = dbfactory.newDocumentBuilder();
            Document mainDoc = dbuilder.parse(inputstream);
            if (inputAttachments != null && inputAttachments.areAttachmentsAvailable()) {
                Collection<String> collectionIDs = inputAttachments.getAllContentIds(true);
                Object[] arrayObj = collectionIDs.toArray();
                for (int i = 0; i < arrayObj.length; i++) {
                    String attachmentID = (String) arrayObj[i];
                    Attachment attachment = inputAttachments.getAttachment(attachmentID);
                    InputStream attachmentInputStream =
                            new ByteArrayInputStream(attachment.getContent());
                    // Parse the attachment and copy its Attachment1 node into the main document
                    Document attachmentDoc = dbuilder.parse(attachmentInputStream);
                    Element element = (Element) attachmentDoc
                            .getElementsByTagName("Attachment1").item(0);
                    Node copiedNode = mainDoc.importNode(element, true);
                    mainDoc.getDocumentElement().appendChild(copiedNode);
                }
            }
            // Serialize the merged document to the output payload
            TransformerFactory tf = TransformerFactory.newInstance();
            Transformer transformer = tf.newTransformer();
            transformer.transform(new DOMSource(mainDoc), new StreamResult(outputstream));
        } catch (ParserConfigurationException e) {
            this.getTrace().addInfo("ParserConfigurationException caught");
            e.printStackTrace();
        } catch (SAXException e) {
            this.getTrace().addInfo("SAXException caught");
            e.printStackTrace();
        } catch (IOException e) {
            this.getTrace().addInfo("IOException caught");
            e.printStackTrace();
        } catch (TransformerException e) {
            this.getTrace().addInfo("TransformerException caught");
            e.printStackTrace();
        }
    }
}

 

This code reads the two XML documents and merges the attachment record set into the main document, as shown above.

 

Please note the following:

- The Read Additional Files option is only available for NFS, not for FTP.

- Also, in case the business system/location is different for the sender, the file cannot be read as an attachment. Though, using a separate interface, files can be brought to a common location and then this approach can be followed.

 

Let me know if this blog helps you.

This approach is also discussed in Praveen Gujjeti's blog N:1 & N:M mappings not possible without BPM

RFC-lookup with caching values


Hi all, in this blog I will show how to use a UDF to cache values in a hashmap in the GlobalContainer.

 

The logic of the UDF is simple.

 

1. We check for the existence of the hashmap in the GlobalContainer. The hashmap should have a unique name per RFC lookup; it can be the input node name of the RFC lookup function.

 

2. If we have no hashmap in the GlobalContainer, we create one, put the value from the RFC lookup into it and push the hashmap to the GlobalContainer.

 

3. If we have a hashmap, we check for the value.

 

3.1 If we have no value, we put the value from the RFC lookup into the hashmap and update the hashmap in the GlobalContainer.

3.2 If we have the value, we return it and exit the UDF.

 

And that's all.
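
Condensed, the caching pattern looks like this; inputNodeName, inputValue and callRfcLookup are illustrative names (the full UDF below inlines the actual RFC call):

GlobalContainer gc = container.getGlobalContainer();
// One hashmap per lookup, keyed in the GlobalContainer by the RFC input node name
HashMap<String, String> cache = (HashMap<String, String>) gc.getParameter(inputNodeName);
if (cache == null) {
    cache = new HashMap<String, String>();
}
String value = cache.get(inputValue);
if (value == null) {
    value = callRfcLookup(inputValue);     // hypothetical helper wrapping the RFC call
    cache.put(inputValue, value);
    gc.setParameter(inputNodeName, cache); // create or update the hashmap in the GlobalContainer
}
return value;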

 

See code below:

 

@LibraryMethod(title="RFC_CALL_HASH", description="", category="FL_local", type=ExecutionType.SINGLE_VALUE)
public String RFC_CALL_HASH(
        @Argument(title="") String input_node_value,
        @Argument(title="Service") String Service,
        @Argument(title="Channel") String Channel,
        @Argument(title="Function Module") String FM,
        @Argument(title="Input Node") String Input_Node_name,
        @Argument(title="Output Node") String Output_Node_name,
        Container container) throws StreamTransformationException {

    AbstractTrace trace = container.getTrace();

    // Set default values
    boolean put_to_cache = false;
    final String SERVICE = Service,        // Name of service defined in XI configuration
            CHANNEL_NAME = Channel,        // Name of communication channel defined for service
            SAP_RFC_NAMESPACE = "urn:sap-com:document:sap:rfc:functions", // Namespace for SAP RFC definitions
            FUNCTION_MODULE = FM,          // Name of the function module called
            VALUE_NOT_FOUND = "NOT FOUND"; // Default return value in case something goes wrong

    // If the input value is empty, just return an empty string and exit the UDF
    if (input_node_value.equals("")) return "";

    GlobalContainer globalContainer;
    HashMap<String, String> cache;

    globalContainer = container.getGlobalContainer();
    // Check hashmap existence
    if (globalContainer.getParameter(Input_Node_name) != null) {
        // Get the hashmap from the GlobalContainer
        cache = (HashMap<String, String>) globalContainer.getParameter(Input_Node_name);
        // Get the value from the hashmap
        String retval = cache.get(input_node_value);
        // If the value is not null, return it; otherwise remember to cache it later
        if (retval == null) {
            put_to_cache = true;
        } else {
            return retval; // exit from the UDF with the cached value
        }
    } else {
        // Create a new hashmap if we do not have one yet
        cache = new HashMap<String, String>();
        put_to_cache = true;
    }

    // This part runs only if there is no hashmap yet or the value is not cached.
    // Start the RFC call. This part is copied from an open source code project.

    // Create a document builder to create the DOM XML document
    DocumentBuilderFactory factory = DocumentBuilderFactory.newInstance();
    DocumentBuilder builder = null;
    factory.setNamespaceAware(false);
    factory.setValidating(false);
    try {
        // Create the XML document using the document builder
        builder = factory.newDocumentBuilder();
    } catch (Exception e) {
        trace.addWarning("Error creating DocumentBuilder - " + e.getMessage());
        return null;
    }

    // Define the XML for the RFC request
    String rfcXML = "<?xml version=\"1.0\" encoding=\"UTF-8\"?><ns0:" +
            FUNCTION_MODULE + " xmlns:ns0=\"" + SAP_RFC_NAMESPACE +
            "\"><" + Input_Node_name + ">" + input_node_value + "</" + Input_Node_name + "></ns0:" +
            FUNCTION_MODULE + ">";

    // Prepare and perform the RFC lookup
    RfcAccessor accessor = null;
    Payload result = null;
    try {
        // Determine a communication channel (business system + communication channel)
        Channel channel = LookupService.getChannel(SERVICE, CHANNEL_NAME);
        // Get an RFC accessor for the channel
        accessor = LookupService.getRfcAccessor(channel);
        // Create an XML input stream that represents the RFC request message
        InputStream is = new ByteArrayInputStream(rfcXML.getBytes());
        // Create the XML payload
        XmlPayload payload = LookupService.getXmlPayload(is);
        // Execute the lookup
        result = accessor.call(payload);
        if (result == null) {
            trace.addWarning("result of RFC call is null");
        }
    } catch (LookupException e) {
        trace.addWarning("Error during lookup - " + e);
    }

    // Parse the RFC response document
    Document docResponse = null;
    InputStream in = result.getContent();
    String returnValue = VALUE_NOT_FOUND;
    NodeList poItems = null;
    try {
        docResponse = builder.parse(in);
        if (docResponse == null) {
            trace.addWarning("docResponse is null");
        }
        // Important: getElementsByTagName returns a *node list*,
        // so the first - even if there is only one - item needs to be picked: item(0).
        // ----- this part depends on your XML structure! -----
        poItems = docResponse.getElementsByTagName(Output_Node_name);
        if (poItems.getLength() > 0)
            if (poItems.item(0).getChildNodes().getLength() > 0)
                returnValue = poItems.item(0).getChildNodes().item(0).getNodeValue();
        // ----------------------------------------------------
    } catch (Exception e) {
        trace.addWarning("Error when parsing RFC Response - " + e.getMessage());
    }

    // Free resources, close the accessor
    if (accessor != null) {
        try {
            accessor.close();
        } catch (LookupException e) {
            trace.addWarning("Error while closing accessor " + e.getMessage());
        }
    }
    // End of the RFC call

    // If put_to_cache == true, add the value from the lookup to our hashmap
    if (put_to_cache) {
        cache.put(input_node_value, returnValue);
        globalContainer.setParameter(Input_Node_name, cache); // update or set our hashmap in the GlobalContainer
    }
    // Return the value from the RFC lookup
    return returnValue;
}

Here I coded the RFC lookup in Java, but you can also use the standard RFC lookup in message mapping; the logic will be a little different.

 

Have fun

GS1 Integration - SAP PI 7.1 – PART - I


One day or another you might get a chance to integrate with GS1 systems.

That day shouldn't be challenging for you.

 

This blog describes how to exchange/create GS1-understandable messages, which will let PI [7.1] post GS1-understandable documents, and presents a sample use case for developing the same.

 

It can be quite painful at the initial stage to find the correct way of integrating GS1 systems via SAP PI, and there is no proper SDN blog or discussion available on this.

 

Hence I am sharing this information based on my project learning.

 

It doesn't contain any step-by-step instructions for deployment or development; it describes how to frame/create/build the main GS1 document XML schema using the available GS1 standard sub-schemas.

 

It is aimed at intermediate PI consultants.

 

Hints /Prerequisites:



  • Here I have explained based on GS1 XML Business Message Standards (BMS) V 2.8.

 

 

 

  • The base business document used for this KB is GDSN Price Synchronisation. You can download all the standard sub-schemas using the given link.

 

  • Get the sample document (here, Price) output XML from the GS1 team.

 

  • Any XML editing tool - here I have used Liquid XML Studio.

    • You can use any third-party XML editing tool like Altova XMLSpy or Stylus Studio, etc.

 

  • The Java mapping code was developed and compiled with the J2SE 1.5 compiler, since PI 7.1 uses the older Java compiler.

 

 

Overview of GS1

 

  • GS1 is dedicated to the design and implementation of global standards and solutions to improve the efficiency and visibility of supply and demand chains globally and across sectors. The GS1 system of standards is the most widely used supply chain standards system in the world.

 

  • GS1 Member Organizations handle all enquiries related to GS1 products, solutions and services.
    • GS1 has close to 40 years' experience in global standards - see our timeline for more information.
    • GS1 offers a range of standards, services and solutions to fundamentally improve efficiency and visibility of supply and demand chains.
    • GS1 standards are used in multiple sectors and industries.

 

GS1 XML Business Message Standards (BMS)


The GS1 main XML message consists of multiple segments/layers:

 

  • Transport
  • Service
  • Message
  • Business document

 

GS1 XML Business Message Architecture

 

GS1_BMA.jpg

 

Basically, we need to generate the XML with the above consolidated segments, which will let us post GS1 documents successfully.

 

 

Sample Use case

 

Our requirement is to post the Pricing (Add/Update) document to GS1 from ECC via PI.

 

This is designed as a simple asynchronous Proxy-to-File scenario.

 

Initial Steps

 

In order to build/frame the GS1 XML Business Message Architecture based XML output, we have to follow these steps:

 

 

 

Step 1:  

Here I have used V2.8 of the GS1 BMS.

 

 

 

Once you have downloaded the schemas, unzip them and keep them on your local desktop.

 

The folder structure would be as follows.

For e.g., the PriceSynchronisationDocument schema is under the folder path ean.ucc\gdsn:


SchemaFolderStruct.jpg

 

 

Step 2:  

 

The schemas downloaded in the earlier step cannot be used/imported directly in XI as external definitions.


We need to adjust\Edit the external definition as follows

 

Once you have extracted the relevant schemas (.xsd), open each XSD and check for any relative paths specified as ../../ in the xsd:import schemaLocation.

 

Remove the relative paths and save.

 

Note :


You need to check all the schemas before you upload them to the XI system. Make sure each one has the proper folder path.


Sample :

<xsd:import namespace="urn:ean.ucc:2" schemaLocation="../../ean.ucc/common/Document.xsd"/>

After correction it would look as follows:

 

SampleSchemaLocation.jpg

 

 

Steps to be Performed in ESR


Importing external Definitions


Once you have removed all the relative paths in the standard GS1 schemas, upload all the XSDs using the Import External Definition option in the ESR.

 

In PI 7.1 we have the option to do a mass upload of external definitions.

 

  1. Go to Tools --> Import external definition from ESR


        ImportExternalDefinition.jpg



Steps to be performed in XML editing tool

 

Step 1 :


Open the sample output XML which you received from the GS1 team for the relevant business document.

 

Sample would be as follows


SampleGS1XML_Message.jpg

 

 

   It has all the GS1 BMS components as mentioned above.

 

Step 2 : Open the sample GS1 xml using XML tool

 

Once you import this XML, the tool will try to load all the corresponding linked schemas for validation.

 

ImportingXML.jpg

 

Once it has loaded successfully, click on Infer XSD Schema.

 

InferXSD_Schema_Main.jpg

 

A pop-up will ask you where to store the generated schema.

 

Path_To_Store.jpg

 

Select the required folder path and click on finish.

 

 

Basically it will generate 4 schemas using the sample xml file as follows

 

 

SampleOutput.jpg

 

If you open the first schema in the XML tool (in this example it is GS_Example_ADD0), you can see the schema layout as follows:



Schema_Layout.jpg


Now we need to adjust the schema locations to the actual file paths.

 

Open GS_Example_ADD1 and change the import schema location as follows

 

<xs:import schemaLocation="ean.ucc/common/DocumentCommand.xsd" />

<xs:import schemaLocation="ean.ucc/common/Message.xsd" />

 

 

 

Open GS_Example_ADD2 and change the import schema location as follows

 

<xs:import schemaLocation="schemapath/GS_Example_ADD1" namespace="urn:ean.ucc:2" />

<xs:import schemaLocation="ean.ucc/gdsn/PriceSynchronisationDocument.xsd" namespace="urn:ean.ucc:gdsn:2" />

 

Open GS_Example_ADD3 and change the import schema location as follows

<xs:import schemaLocation=" schemapath /GS_Example_ADD2.xsd" />

 

Create a folder and zip all the above files. We will call this the "Architecture Schema".

 

We have to import this into the ESR after importing all the standard schemas downloaded in the earlier step.

 

Note :

 

schemapath - this is the folder name in which you have zipped the schemas and uploaded them to XI.

 

E.g,

 

In the below example, the Source field value ean.ucc/gdsn is the schemapath.

 

ESR_Source.jpg

 

 

This blog covers the following:

 

  • Use cases/Building Blocks of GS1 XML Messages
  • Overview of  GS1 XML Business Message Standards (BMS)
  • Sample use case
  • How to build the GS1 Main document schema.

 

 

In the next blog we are going to see how to use this in the ESR to generate the expected GS1 business document XML.

GS1 Integration - SAP PI 7.1 – PART - II


In PART I, we saw how to frame/create the GS1 XML main schema.

 

In this blog we are going to see how to use it in the ESR to get the desired result/GS1 output.

 

Steps to be performed in ESR

 

Step 1:  


Import the design schema which we zipped as the "Architecture Schema" in the earlier step.

 

Step 2:  


Create all the standard objects, like the source DT, MT and SI.

 

Here the source is a proxy, so a source DT is required; the target is an external definition, so no target DT is needed.

 

Step 3:  


Create Message Mapping between Source Proxy structure and GS1 xml Structure (Architecture Schema).

 

Important Note :


  1. Architecture Schema: this can be linked to any business document. Here we are linking it with the Price Synchronisation Document.
  2. Once you have completed your message mapping, the XML output we get won't be directly usable in GS1.
  3. Hence we need to create a cascading Java mapping to achieve this.
  4. The Java code is as follows:

 

 

 

package com.sap.pi.gs1;

import java.io.BufferedReader;
import java.io.BufferedWriter;
import java.io.IOException;
import java.io.InputStream;
import java.io.InputStreamReader;
import java.io.OutputStream;
import java.io.OutputStreamWriter;
import java.util.Date;
import java.util.HashMap;
import java.util.Map;
import java.text.DateFormat;
import java.text.SimpleDateFormat;
import com.sap.aii.mapping.api.AbstractTrace;
import com.sap.aii.mapping.api.StreamTransformation;
import com.sap.aii.mapping.api.StreamTransformationConstants;
import com.sap.aii.mapping.api.StreamTransformationException;

public class MultiTextReplaceGS1 implements StreamTransformation {

    /**
     * This program converts/cleans the mapping output into a GS1 understandable format.
     * Here we have used a HashMap of text replacements.
     */
    private Map map = null;
    private AbstractTrace trace = null;

    public void setParameter(Map arg0) {
        map = arg0; // Store reference to the mapping parameters
        if (map == null) {
            this.map = new HashMap();
        }
    }

    public void execute(InputStream arg0, OutputStream arg1)
            throws StreamTransformationException {
        String line = null;
        BufferedReader reader = new BufferedReader(new InputStreamReader(arg0));
        BufferedWriter bw = new BufferedWriter(new OutputStreamWriter(arg1));
        StringBuffer buffer = new StringBuffer();

        // Set the current date and time for the PriceSync header
        DateFormat dateFormatDate = new SimpleDateFormat("yyyy-MM-dd");
        DateFormat dateFormatTime = new SimpleDateFormat("HH:mm:ss");
        Date date = new Date();

        // Price Sync header creation date and time stamp
        String PriceSyncHeaderDate = dateFormatDate.format(date) + "T" + dateFormatTime.format(date);

        // Get reference to the mapping trace
        trace = (AbstractTrace) map.get(StreamTransformationConstants.MAPPING_TRACE);
        trace.addInfo("Processing message");

        try {
            trace.addWarning("Read Started");
            while ((line = reader.readLine()) != null) {
                buffer.append(line);
                buffer.append("\r\n");
            }
            trace.addWarning("Read Completed");
            reader.close();

            trace.addWarning("Replacement Going to start");
            Map<String, String> ReplacementMap = new HashMap<String, String>();

            ReplacementMap.put("<ns1:StandardBusinessDocument xmlns:ns1=\"http://www.unece.org/cefact/namespaces/StandardBusinessDocumentHeader\">",
                    "<sh:StandardBusinessDocument xmlns:sh=\"http://www.unece.org/cefact/namespaces/StandardBusinessDocumentHeader\" xmlns:eanucc=\"urn:ean.ucc:2\" xmlns:gdsn=\"urn:ean.ucc:gdsn:2\" xmlns:xsi=\"http://www.w3.org/2001/XMLSchema-instance\" xsi:schemaLocation=\"http://www.unece.org/cefact/namespaces/StandardBusinessDocumentHeader http://www.gdsregistry.org/2.8/schemas/sbdh/StandardBusinessDocumentHeader.xsd urn:ean.ucc:2 http://www.gdsregistry.org/2.8/schemas/PriceSynchronisationDocumentProxy.xsd\">");
            trace.addWarning("StandardBusinessDocument Header URL Set completed");

            ReplacementMap.put("ns2:message xmlns:ns2=\"urn:ean.ucc:2\"", "eanucc:message");
            ReplacementMap.put("ns2:message", "eanucc:message");
            ReplacementMap.put("ns1", "sh");
            ReplacementMap.put("ns2:entityIdentification", "entityIdentification");
            ReplacementMap.put("ns2:documentCommand>", "eanucc:documentCommand>");
            ReplacementMap.put("ns2:transaction", "eanucc:transaction");

            // Price tag change, along with the header time stamp
            ReplacementMap.put("<ns3:priceSynchronisationDocument xmlns:ns3=\"urn:ean.ucc:gdsn:2\" lastUpdateDate=\"\" creationDateTime=\"\" documentStatus=\"\">",
                    "<gdsn:priceSynchronisationDocument xsi:schemaLocation=\"urn:ean.ucc:2 http://www.gdsregistry.org/2.8/schemas/PriceSynchronisationDocumentProxy.xsd\" documentStatus=\"ORIGINAL\" creationDateTime=\"" + PriceSyncHeaderDate + "\">");
            trace.addWarning("priceSynchronisationDocument Header URL Set completed");

            ReplacementMap.put("ns3:priceSynchronisationDocument>", "gdsn:priceSynchronisationDocument>");

            String toWrite = buffer.toString();
            for (Map.Entry<String, String> entry : ReplacementMap.entrySet()) {
                toWrite = toWrite.replaceAll(entry.getKey(), entry.getValue());
            }

            // Once we get the complete converted output, we have to replace all "ns2:" with empty
            String updateWrite = toWrite.replaceAll("ns2:", "");
            bw.write(updateWrite);
            trace.addWarning("Replacement Completed Successfully");

            // Flush and close
            bw.flush();
            bw.close();
        } catch (IOException ioe) {
            trace.addWarning("Could not process source message" + ioe.toString());
            throw new RuntimeException(ioe.getMessage());
        }
        trace.addInfo("Processing completed successfully");
    }
}

 

 

 

5. Create the JAR file.

6. Create an imported archive using the JAR.


Step 5: Create the operation mapping and use the Java class after the message mapping.

 

            Java_Mapping.jpg

 

All done!  

 

Happy Learning.


File Lookup with effective error handling


Dear SCN members ,

 

Recently I worked on a requirement of writing an XML payload to a file using a file lookup, and drafting a mail to the application support/business folks for any issue while connecting or writing to the file server, without disturbing the main flow of the interface.

 

Generally we don't prefer file lookups because they need robust code and effective error handling in order to convince the business.

 

There are a couple of wikis and blogs already available on SCN about file lookups, but I'm not happy with the error handling part, which is very vital in real-time critical projects. So I want to share a blog which focuses more on error handling and robust Java code for this kind of requirement.

 

 

Main flow: RFC <--> PI <--> SOAP


Mappingss.JPG

 

Prerequisites:


For connecting to different source file servers, I have used the Apache open source API. It is free to use.

 

You can download the jar file from the below link.

Apache Commons Net -  Download Commons Net

 

For drafting mail from the UDF, I have used java-mail-1.4.4.jar and javax.activation.jar. These are free-to-download Oracle open source APIs. You can easily get them from the Google/Oracle site.

 

UDF code:

 

Import parameters

 

com.sap.aii.mapping.api.*

com.sap.aii.mapping.lookup.*

com.sap.aii.mappingtool.tf7.rt.*

java.io.*

java.lang.reflect.*

java.util.*

org.apache.commons.net.ftp.FTPClientConfig

org.apache.commons.net.ftp.FTPClient

java.io.BufferedWriter

java.io.File

java.io.FileInputStream

java.io.FileWriter

java.io.IOException

javax.mail.Message

javax.mail.MessagingException

javax.mail.Session

javax.mail.Transport

javax.mail.internet.InternetAddress

javax.mail.internet.MimeMessage

 

public String FilesUpload(String Input, String ReturnFieldName, String Dateformat, String Host,
        String Username, String Pwd, String Folderpath, int TimeOut, String FileName, String FieldName,
        String Mail_Host, String Mail_From, String Mail_To, String Mail_Subject, Container container)
        throws StreamTransformationException {

    int TIMEOUT = TimeOut * 1000; // Variable to store the connection timeout
    AbstractTrace trace = container.getTrace();
    FileInputStream fis = null;
    FTPClient client = new FTPClient(); // Creating an FTPClient instance
    try {
        FTPClientConfig conf = new FTPClientConfig(FTPClientConfig.SYST_UNIX);
        conf.setDefaultDateFormatStr(Dateformat);
        client.configure(conf);
        try {
            client.setConnectTimeout(TIMEOUT); // Setting the connection timeout
            client.connect(Host); // Connecting to the FTP server
            // Giving credentials to log in
            if (!client.login(Username, Pwd)) {
                throw new StreamTransformationException("Authorization failed.");
            }
            trace.addInfo("Successfully connected to server" + client.getReplyString());
            // Changing the current directory to the required folder path
            if (!client.changeWorkingDirectory(Folderpath)) {
                throw new StreamTransformationException("Exception occurred while changing folder path to Interface specific path.");
            }
        } catch (Exception c) {
            throw new StreamTransformationException("Exception occurred while connecting to server  :" + c);
        }

        // Create an InputStream of the file to be uploaded
        String srcFilename = FileName + "_Temp" + new Date().getTime();
        String targetfilename = FileName + "_" + new Date().getTime();
        File Sourcefile = new File(srcFilename);
        // If the file doesn't exist, create it
        if (!Sourcefile.exists()) {
            Sourcefile.createNewFile();
        }
        FileWriter fileWritter = new FileWriter(Sourcefile.getName(), true);
        BufferedWriter bufferWritter = new BufferedWriter(fileWritter);
        bufferWritter.write(Input); // Writing the input payload to the file
        bufferWritter.close();
        fis = new FileInputStream(Sourcefile);
        boolean done = client.storeFile(targetfilename, fis); // Store the file on the server
        fis.close();
        if (done) {
            trace.addInfo("!!!----File is uploaded successfully----!!!");
        } else {
            trace.addWarning("Upload Failed");
            throw new StreamTransformationException("Failed to upload file  :Please cross check..:");
        }
        client.logout();
    } catch (Exception e) {
        try {
            Properties properties = System.getProperties(); // Get system properties
            properties.setProperty("mail.smtp.host", Mail_Host); // Set up the mail server
            // Session session = Session.getDefaultInstance(properties); // Get the default Session object
            Session session = Session.getInstance(properties); // Use this if you get UnknownHostException in Java

            // Sending mail to the app support folks
            MimeMessage message = new MimeMessage(session); // Create a default MimeMessage object
            message.setFrom(new InternetAddress(Mail_From)); // Set the From: header field
            String recipients[] = Mail_To.split(";");
            InternetAddress[] addressTo = new InternetAddress[recipients.length];
            for (int i = 0; i < recipients.length; i++) {
                addressTo[i] = new InternetAddress(recipients[i]);
            }
            message.setRecipients(Message.RecipientType.TO, addressTo);
            // message.addRecipient(Message.RecipientType.TO, new InternetAddress(Mail_To)); // Set the To: header field
            message.setSubject(Mail_Subject); // Set the Subject: header field
            // Now set the actual message
            message.setText(e.getMessage() + "." + " Failed message is having field value of "
                    + FieldName + " is " + ReturnFieldName);
            Transport.send(message); // Send the message
            trace.addInfo("Sent alert mail to app support folks successfully");
        } catch (Exception mex) {
            trace.addWarning("Failed to send alert mail : " + mex);
        }
    } finally {
        try {
            if (fis != null) {
                fis.close();
            }
            client.disconnect();
        } catch (IOException k) {
            trace.addWarning("Exception while closing filestream and disconnecting" + k);
        }
    }
    return "File placed successfully";
}

In the above code, for the line below we need to provide input based on the file server that we are going to connect to.

 

FTPClientConfig conf = new FTPClientConfig(FTPClientConfig.SYST_UNIX);

 

     UNIX --> For connecting to a Unix-based FTP server. (Use this for Windows NT servers which have been configured to use a Unix-style listing format.)

     NT --> For connecting to a Windows NT based FTP server

     AS400 --> For connecting to an AS/400 based FTP server

     VMS --> For connecting to a VMS based FTP server

     OS2 --> For connecting to an OS/2 based FTP server

     L8 --> Some servers return an "UNKNOWN Type: L8" message in response to the SYST command. We set this to be a Unix-type system

     NETWARE --> For connecting to a NetWare based FTP server

     MACOS_PETER --> For connecting to a Mac pre-OS X based FTP server

     MVS --> For connecting to an MVS based FTP server

     OS400 --> For connecting to an OS/400 based FTP server
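
For example, to target a Windows NT style listing, the configuration line above becomes:

FTPClientConfig conf = new FTPClientConfig(FTPClientConfig.SYST_NT);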

 

We provide input values for the below parameters at runtime (from the ID):

 

  1. DateFormat --> File server OS-level date format
  2. FieldName --> Field value of a unique field to identify the message when the UDF fails to place the file on the PI file server. Ex: Exception occurred while connecting to server: com.sap.aii.mapping.api.StreamTransformationException: Authorization failed. Failed message is having field value of CALENDAR is 20140521101010
  3. FileName --> File name. Ex: FileName + "_" + current timestamp in milliseconds
  4. Folderpath --> Folder path
  5. Host --> Hostname of the PI file server
  6. Mail_From --> Mail From
  7. Mail_Host --> Mail server host name
  8. Mail_Subject --> Subject of the mail
  9. Mail_To --> Mail To
  10. Pwd --> Password to log in to the PI file server
  11. TimeOut (secs) --> Connection timeout
  12. Username --> Username to log in to the PI file server


ID_OM.JPG


In the below cases, an alert mail will be sent to the concerned parties. The main mapping will not get disturbed by exceptions in the UDF.

 

1) Unable to connect to the given file server host

Ex: Exception occurred while connecting to server: java.net.UnknownHostException: XYZ. Failed message is having field value of CALENDAR is 20140521101010

2) When the credentials are not working

Ex: Exception occurred while connecting to server: com.sap.aii.mapping.api.StreamTransformationException: Authorization failed. Failed message is having field value of CALENDAR is 20140521101010

3) When there is a timeout

Ex: Exception occurred while connecting to server: java.net.SocketTimeoutException: connect timed out. Failed message is having field value of CALENDAR is 20140521101010

4) Exception occurred while changing the folder path to the interface-specific path

Ex: Exception occurred while connecting to server: java.lang.Exception: Exception occurred while changing folder path to Interface specific path. Failed message is having field value of CALENDAR is 20140521101010

5) Exception occurred while writing the file to the file server

Ex: Failed to upload file: Please cross check. Failed message is having field value of CALENDAR is 20140521101010

 

 

Performance and reusability:


1) We can reuse the UDF across all mappings which have a similar requirement.

2) As per our user acceptance testing, in almost all cases it took less than a second.


Regards

Venkat


Processing a complex source file structure using two message mappings


Purpose:

 

A step-by-step guide to processing a complex source file structure using two message mappings.

 

Source File Structure:

21.JPG

ESR Configuration:

 

Step:1

 

Create the source data type

1.JPG


Step:2

Create the target data type

2.JPG

Step: 3

Create the source and target message type

3.JPG

  Step: 4

 

Create the Service interface for outbound and inbound

5.JPG

ss.JPG

Step: 5

Create message mapping_1 to get the value from file

7.JPG

8.JPG

9.JPG

Step: 6

Create message mapping to push the values to RFC.

ccc.JPG

In the below mapping, first of all I store the values of NAME and VALUE in an array using the UDF storeValue.

12.JPG

13.JPG

 

Perform Store:

To store the values temporarily, we need to call the performStore method, as written in the attributes and methods section:

 

PerformGet:

Then get the stored input values from the array by calling the method performGet(int inp).

arr1.JPG

arr2.JPG

 

GetValue:

Use the getValue UDF to get each value from the array by applying a constant like 1, 2, 3, 4, 5 or 6 (see the sketch after the screenshots below).

17.JPG

18.JPG

14.JPG
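
For reference, a minimal sketch of what the attributes-and-methods section could look like; the list name is my own, and the method names follow the ones used above:

// Class-level attribute: buffers the NAME/VALUE entries between UDF calls
java.util.List<String> store = new java.util.ArrayList<String>();

// Called by the storeValue UDF for every value to be buffered
public void performStore(String inp) {
    store.add(inp);
}

// Called by the getValue UDF with the constants 1, 2, 3, ...
public String performGet(int inp) {
    return store.get(inp - 1); // constants are 1-based, the list is 0-based
}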

Step: 7

Create the operation mapping with the two message mappings

dd.JPG

ID Configuration:

 

Step:1

Create the sender communication channel with file content conversion

19.JPG

20.JPG

Step:2

Create receiver RFC communication channel

 

Step:3

Create receiver determination

 

Step:4

Create Interface determination

 

Step:5

Create sender agreement

 

Step:6

Create receiver agreement

 

Mapping Test and Monitoring:

test11.JPG

test12.JPG

aa.JPG

Read SOAP header information


Reading the SOAP header when a web service is consumed seems to be an easy task. Nothing could be further from the truth in my case; it really took some time to realize. That is the reason for this blog: to share my thoughts, problems and insights with the community.

 

System

We use a SAP PI 7.11 SP11 system to realize this scenario.

 

Context and requirement

It all starts with the requirement to know who consumed a web service and to store that information on the SAP back-end system. This means that a user ID must be provided within the SOAP call, but the requirement is not to put this in the message body, but dynamically in the SOAP header.

 

The input message could look like this

BLOG -- 1.jpg

What preceded it were a blog and the follow-up.

 

*** Important remark: Please note that this blog is based on service interfaces containing only 1 operation. Currently, it does not work with multiple operations within 1 service interface. See the Open points and doubts section below.

 

Configuration

The input structure looks like this

 

DT_input

     Name

 

The output structure looks like this

 

DT_output

     UserID

     Name

 

The objective is to fill the UserID field with the value of the AuditUser field from the SOAP header. To accomplish this, an XSLT mapping is used. No other mappings are used within the operation mapping step.

 

This is the XSLT mapping I used:

 

<?xml version="1.0" encoding="UTF-8"?>
<xsl:stylesheet version="1.0" xmlns:xsl="http://www.w3.org/1999/XSL/Transform" xmlns:ns1="namespace">
  <xsl:template match="/">
    <ns1:DT_output>
      <UserID><xsl:value-of select="*/*/AuditUser"/></UserID>
      <Name><xsl:value-of select="*/*/DT_input/Name"/></Name>
    </ns1:DT_output>
  </xsl:template>
</xsl:stylesheet>

 

Concerning the Integration Directory, a sender SOAP adapter is used and the option Do Not Use SOAP Envelope is enabled.

Also, I added nosoap=true to the URL in soapUI, to allow sending messages in no-SOAP mode.
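
For reference, since the whole envelope now reaches the mapping, an input shaped like the following would satisfy the XSLT; the element values are illustrative, and the names are assumed to match the XPath expressions */*/AuditUser and */*/DT_input/Name used above:

<Envelope>
   <Header>
      <AuditUser>JSMITH</AuditUser>
   </Header>
   <Body>
      <DT_input>
         <Name>John</Name>
      </DT_input>
   </Body>
</Envelope>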

 

After successful tests, I still have some doubts and things I want to clarify…

 

Open points and doubts

  • What about multi operation Service Interfaces? I already started with a thread on this topic.
  • For testing purposes, I use soapUI, but for real scenarios, Microsoft Visual Studio will be used. What about enabling the no soap mode there? Check out this thread.
  • What about  the SAP Enterprise Service explorer for Microsoft.NET, compatible with Microsoft Visual Studio 2012 and 2013? Is there any? For MS Visual Studio 2008, I know there is one.

 

Once these open points and doubts are cleared up, I will update this blog to have everything in one place.

 

 

 

Special thanks go to Antonio Sanz for helping me out with the XSLT mapping and the scenario.

HANA Cloud Integration Roadmap Webinar - 3 July


Join us on July 3rd for a live SAP HANA Cloud Integration (HCI) roadmap webinar covering current capabilities and future direction of SAP HANA Cloud Integration. The webinar will include a Q&A session as well.

 

 

Date:

Thursday, July 3, 2014, 5-6 p.m. CET | 8-9 a.m. PST

 

Presenters:

Udo Paltzer and Sindhu Gangadharan, SAP HANA Cloud Integration Product Management

 

Register here.

 

Access the up-to-date SAP HANA Cloud Integration roadmap document on Service Market Place: service.sap.com/roadmaps > Product and solution roadmaps > Database and Technology > Product Roadmaps > Foundation section

 

For more webinars (upcoming and recordings) please check and follow this document: North America (EN) SAP Product Road Map Webinars.

 

Interested in learning more about HCI? Do not miss the global webcast series on HANA Cloud Integration. The next session will be on June 26th, 2014 and will cover the Eclipse IDE: an overview of all the process steps and the various patterns, setting up the environment, and the roadmap and future with Kepler. More information and the registration link are in this blog: HANA Cloud Integration: Webinar Series.

Creating Jar files from NWDS using Ant


When dealing with PI development of Java mappings or XSL(T) mappings, you have to create a JAR or ZIP file with the mapping source. This can take some time every time you make mapping changes. I have found that it is easy to create an Ant build script to create the JAR files.

A hundred years ago (well, almost), I wrote about how you could use an Ant build file to create mapping archives. Eclipse has since changed the way it creates build files, so you also have to adapt to the new format.

I have created a video that shows the different steps required to make this work.

 

The code for the userbuild.xml file is below.

 

<?xml version="1.0" encoding="UTF-8"?><?eclipse.ant.import?><project name="test1" ><!--  Sample XML created by Daniel Graversen figaf.com -->      <import file="build.xml"/>      <target depends="build" name="pack1">  <!-- define the directory to place the packed files in -->  <property name="dist" value="packedfiles"/>  <property name="src" value="src"/>  <!-- delete and create a new directory -->  <delete quiet="true">  <fileset dir="${dist}" includes="*.jar"/>  </delete>  <mkdir dir="${dist}"/>  <echo message="${ant.project.name}: ${ant.file} packing"/>  <!-- create the jarfile -->  <jar destfile="${dist}/CRM.jar" >  <!-- class and jar files to be included -->  <fileset dir="${src}">  <include name="com/figaf/crm/**" />  </fileset>  <!-- xsl file to be included -->  </jar>  </target>  </project>

 

As you can see, it is just a normal Ant build that is required, so you can easily plug all your Ant skills into it to enhance the user build file with tons of useful material.

I often use a build file to also create the JAR files for XSL(T) mappings, even if they are not used for other things in the project.

Do you have anything useful in your Ant build file? Then please share.

 

And do you have an Ant task that can upload the JAR to the repository?

Manual verification of certificate chain of trust


Introduction

In B2B integration, we often encounter the requirement to handle certificates for use in SSL, encryption or authentication. The certificates normally come in the form of a chain of trust, and need to be imported in PI's NWA to be used in the configuration of the interfaces.

 

This blog illustrates a quick way to manually verify a certificate chain of trust, which can be easily done using the certificate keystore functionality of the Windows OS.

 

 

Example

The example will use a chain that consists of 3 certificates (1 end-server certificate, 1 intermediate CA certificate and 1 root CA certificate) in the following tree structure:-

Internal Root CA
--> Internal Gold CA1
    --> b2bgateway.****.net

 

 

Before importing the CAs into the trusted keystore

When the certificate files are opened, it will display the message that the certificate cannot be verified.

 

1) End-server certificate

The certificate below is the end-server certificate which has been issued by Internal Gold CA1.

end_cert.png

 

2) Intermediate CA certificate

However, Internal Gold CA1 (the intermediate CA certificate) is not a trusted certificate. Opening this file will also display the message that the certificate cannot be verified.

 

This intermediate CA certificate is issued by Internal Root CA.

intca_cert.png

 

3) Root CA certificate

Internal Root CA certificate is also not trusted. When the file is opened, it will indicate that the root is not trusted.

 

The root certificate is a self-signed certificate (issued by itself - Internal Root CA.)

rootca_cert.png

 

Note: If there is more than one intermediate CA, repeat the second step to find the next higher intermediate CA until the root CA is reached.

 

 

Import CA certificates into trusted keystore

Now we import the CA certificates into the keystore.

 

1) Root CA certificate

The root CA certificate (Internal Root CA) is imported into the Trusted Root Certification Authorities store.

import_root.png

Select Yes to trust the Root CA certificate.

install.png

 

2) Intermediate CA certificate

After the root CA has been imported and trusted, if we open the intermediate CA file, it can now be verified successfully.

intca_ok.png

Subsequently, we also import the intermediate CA (Internal Gold CA1) into the Intermediate Certification Authorities keystore.

import_intca.png

 

3) End-server certificate

After both the root and intermediate CA certificates have been imported and trusted, when we open the end-server certificate we can see that verification is successful. The Certification Path tab also provides the tree structure of the certificate chain.

end_verified.pngcert_ok.png

With this, we have successfully verified that the set of certificates in the example is valid, and that the chain of trust can be established from the end-server to the trusted root.

 

 

Summary

Using this handy approach, we can verify that received certificates are valid and that the chain of trust can be established, even before importing them into PI and conducting end-to-end testing. This helps to eliminate incorrect certificates or incomplete chains before any development or configuration work is done.
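For those who prefer a scripted check over the Windows GUI, the same validation can also be done with the standard Java security API. The following is a minimal sketch, assuming the three certificates have been exported as local .cer files; the file names are illustrative and not part of the example above:

import java.io.FileInputStream;
import java.security.cert.*;
import java.util.*;

public class ChainCheck {

    public static void main(String[] args) throws Exception {
        CertificateFactory cf = CertificateFactory.getInstance("X.509");

        // Illustrative file names; replace with the actual exported certificates
        X509Certificate server = load(cf, "b2bgateway.cer");
        X509Certificate intermediate = load(cf, "InternalGoldCA1.cer");
        X509Certificate root = load(cf, "InternalRootCA.cer");

        // The chain to validate: end-server certificate first, then the intermediate
        CertPath path = cf.generateCertPath(Arrays.asList(server, intermediate));

        // The self-signed root acts as the trust anchor
        PKIXParameters params = new PKIXParameters(
                Collections.singleton(new TrustAnchor(root, null)));
        params.setRevocationEnabled(false); // skip CRL/OCSP checks for this quick test

        // validate() throws CertPathValidatorException if the chain of trust is broken
        CertPathValidator.getInstance("PKIX").validate(path, params);
        System.out.println("Chain of trust verified successfully.");
    }

    private static X509Certificate load(CertificateFactory cf, String file) throws Exception {
        FileInputStream in = new FileInputStream(file);
        try {
            return (X509Certificate) cf.generateCertificate(in);
        } finally {
            in.close();
        }
    }
}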

 

 

Additional Reference

After the manual verification is done, the certificates can be imported into NWA. The following document details the steps to achieve that:-

Adding Certificates to PI

SAP PI & PMBOK: From business vision to the methodology aligned with technical and best practice experiences

$
0
0

Hi Experts,

 

This is my first blog, and I am glad to share a topic that I have not seen applied directly in projects, and that I identify as one of the many issues we face as SAP PI consultants when working with our project managers, leaders, directors, etc.

 

So, for this reason, I am sharing the abstract of an investigation I have been working on for more than two years. Maybe you will agree with me, or maybe you can help me complement this interesting topic with your constructive criticism.


Note: you can find a video about this in Spanish here: Defensa de Tesis para obtener el grado de Maestría en dirección de proyectos en UNITEC Azael Navarro - YouTube


1. General Idea: As consultants, we should try to align ourselves with the company's vision, with the best practices described in the PMI's PMBOK (or others), and with SAP's best practices for the technical/functional vision; you can even add the ASAP methodology. With these main components, you can begin designing the best methodology specifically for your company because, as you know, all projects and companies are different.

 

 

 

2. Problem - SAP PI Consultants vs. Project Managers: Independent of methodologies and standards such as PMBOK, ASAP or SCRUM, we can agree that there are projects where the project manager has no idea about SAP PI and cannot steer this kind of project beyond meetings, timelines and milestones. The same happens on the consultant side: if we have little project-management experience, we can be limited in our understanding when the project manager requests something from us, and he may not always understand the points we try to express. As a result, the consultants and the project manager have different visions and are not aligned with the organizational objectives.

 

 

 

3. Solution: In brief, SAP PI consultants should gain knowledge about project management, and project managers should gain knowledge about SAP PI. This aligns the vision and the general understanding, and thereby improves communication and delivery, letting each side grow without separating our professional careers from our own professional objectives.


The strengths and weaknesses are various, but the important point here is how we implement this idea, or others, in our projects and companies.

Thanks for your attention and for your recommendations!

 

Best Regards,

 



SAP Info Days on SAP HANA Cloud Integration

$
0
0

Interested in learning more about the latest news on SAP HANA Cloud Integration?

 

If yes, then we are very happy to announce here the start of a series of info days about "SAP HANA Cloud Integration (SAP HCI) - Everything that you want to learn about our Cloud based integration platform and how you can leverage SAP HCI today in integration scenarios!"

 

During these info days on SAP HCI, which will provide an update directly from the development organization, we would like to take the opportunity to share a lot of exciting news around new capabilities, supported scenarios and new editions with you.

 

As you may know already, SAP HCI is SAP’s strategic integration platform to integrate SAP Cloud applications with other systems, SAP and non-SAP, on premise or on the cloud. SAP HCI runs on top of the SAP HANA Cloud Platform and is leveraged for integration of SAP Cloud solutions, such as Cloud for Customer, SuccessFactors, Ariba etc.

 

Our new SAP HCI editions, such as the Standard and Professional Editions, allow you to leverage SAP HCI in arbitrary (i.e. any-system-to-any-system) integration scenarios.

 

The next SAP HCI info day will take place on Wednesday, July 30th 2014, in Walldorf, Germany, at the headquarters of SAP AG:

SAP AG

Dietmar-Hopp-Allee 16 (formerly: Neurottstraße)

69190 Walldorf
Germany

 

Start and end time of the info day: 10:00 am – 3:00 pm CET (please consult www.timeanddate.com if you are in another time zone). The SAP HCI info day is free of charge, but registration via email to udo.paltzer@sap.com is mandatory.

 

The agenda of the info day is as follows:

  • Overview of SAP HANA Cloud Integration
  • Real customer and partner scenarios leveraging SAP HCI today (presented by customers or partners)
  • Deep dive demo on SAP HANA Cloud Integration
  • Questions & Answers
  • Hands-on Session with SAP HANA Cloud Integration (optional)

 

The info day will be held in the English language.

 

If you have any questions regarding the info day about SAP HANA Cloud Integration or regarding the registration to the info day, please contact me at udo.paltzer@sap.com.

 

 

Request of SAP HCI info days in other locations

 

As mentioned above, this info day is the start of a series of info days on SAP HCI. In case you would like to request SAP HCI info days in other countries around the globe, kindly feel free to reach out to me at udo.paltzer@sap.com.

 

We would also be very happy to offer SAP HCI info days directly at a partner site via the well-established format of SAP CodeJam events; further information about SAP CodeJam sessions can be found at http://scn.sap.com/community/events/codejam and http://scn.sap.com/docs/DOC-37775.

 

Last but not least ...

 

We promote the SAP HCI info days via the International Focus Group for PI (IFG for PI). Further information about the IFG for PI can be found at http://scn.sap.com/people/holger.himmelmann2/blog/2010/10/07/introducing-global-special-interest-group-for-process-integration. Please also feel free to participate in our annual survey on PI at https://www.surveymonkey.com/s/IFG_for_PI_2014_Global_PI_Survey. The survey, which offers you a great opportunity to give feedback to SAP, closes on August 19th, 2014.

OData Adapter and SFSF Adapter (extensions) for SAP Process Integration

$
0
0

I am pleased to announce that today we released SP1 of the “SAP Process Integration, connectivity add-on 1.0”. This release of the add-on consists of

  • a new "OData Adapter"
  • and extensions to the existing "SFSF Adapter"


Overview


The OData adapter enables integration with OData service providers. It currently focuses on the consumption of OData services and is hence available only as a receiver channel. All OData operations are supported, and you can choose the desired operation within the channel configuration itself.


The SFSF adapter has been extended to support the SuccessFactors Adhoc and OData APIs, along with several other features. Within the channel configuration, by switching the message protocol from SOAP to ODATA, you can communicate with the OData APIs exposed by the SuccessFactors system.

 

For your reference, here are some example scenarios of OData API integration between SAP ERP HCM and SuccessFactors Employee Central using the SFSF Adapter:


1. Employee Central Employee Data Replication to ERP HCM Employee Replication (ODataQuery.jpg)

2. ERP HCM Employee Replication to Employee Central Employee Data Replication (ODataUpdate.jpg)

3. ERP Employee Data Replication Confirmation to EC Employee Data Replication Confirmation (ODataCreate.jpg)

Capabilities of OData Adapter


  1. OData Operations - The OData adapter provides the Query, Read, Create, Update, Merge & Delete operations.
  2. Authentication Methods - The adapter supports Basic Authentication (username & password) and client-certificate-based authentication.
  3. HTTP Proxy Settings - The OData adapter supports the usage of HTTP proxies.
  4. HTTPS Communication - The OData adapter supports secure communication by using SSL certificates.
  5. Using HTTP Destinations - The adapter supports the use of HTTP destinations created in NetWeaver Administrator within the channel configuration. This reduces the effort of manually entering HTTP connection details during channel configuration.
  6. Pagination - The OData adapter handles server-side pagination. It merges the response from the server, delivered in the form of multiple pages, into one single payload. It also supports client-side pagination using query parameters (see the example after this list).
  7. Integration with Channel Ping - The adapter is integrated with channel ping, which enables validation of the channel configuration.
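As an illustration of client-side pagination, the standard OData system query options $top and $skip can be combined to page through a large result set. The server and entity set in the URLs below are purely illustrative, not tied to a specific provider:

GET https://<server>/odata/v2/Products?$top=100             (records 1-100)
GET https://<server>/odata/v2/Products?$top=100&$skip=100   (records 101-200)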


Capabilities of SFSF Adapter


  1. OData API - The SFSF adapter can now be used to communicate with the SuccessFactors OData APIs. The OData operations supported by the SFSF adapter are Query, Read, Create, Update, Merge & Delete.
  2. Adhoc Entities - The adapter enables querying Adhoc entities from the SuccessFactors system. You can also specify the maximum time the adapter should wait for the query results (timeout).
  3. HTTP Proxy Settings - The adapter supports the usage of HTTP proxies.
  4. HTTPS Communication - The SFSF adapter supports secure communication using SSL certificates.
  5. Using HTTP Destinations - The adapter supports the use of HTTP destinations created in NetWeaver Administrator within the channel configuration. This reduces the effort of manually entering HTTP connection details during channel configuration.
  6. Optimized Message Processing - The adapter enables optimized message processing by packaging the SuccessFactors query results into fewer XI messages.
  7. Integration with Channel Ping - The adapter is integrated with channel ping, which enables validation of the channel configuration.
  8. Integration with Alerting - The adapter is integrated with alerting and can raise alerts for these error situations: inactive channel, error during channel initialization, authentication failure and message processing failure.
  9. Operate Sender Channel using Web Services - The adapter supports the usage of web services for controlling the sender channel.

Compatibility Matrix


SAP NetWeaver Process Integration, connectivity add-on 1.0, SP1 is compatible with:

  • SAP NW PI 7.11 >= SP12
  • SAP NW PI 7.30 >= SP10
  • SAP NW PI 7.31 >= SP09
  • SAP NW PI 7.40 >= SP04

 

The connectivity add-on runs fully on the Java stack and supports all valid deployment options of SAP NetWeaver Process Integration (ABAP + Java, Java only). The add-on is included in your SAP Process Integration license, and you do not need any further license to use the adapters.


Additional Resources


  1. Help Documentation: http://help.sap.com/nw-connectivity-addon101/
  2. Software Download Link: http://service.sap.com/swdc > Software Downloads > Support Packages and Patches > Browse our Download Catalog > SAP NetWeaver and complementary products > SAP NW PI CONNECTIVITY ADDON > PI CONNECTIVITY ADDON 1.0
  3. FAQ Note - 1964868
  4. For information on SP0 of the add-on, refer to the blog: SuccessFactors (SFSF) Adapter for SAP NetWeaver Process Integration

Cancelling Holding Status Messages in Bulk

$
0
0

While dealing with EOIO (Exactly Once In Order) queues, we have often come across situations wherein thousands of messages get stuck in Holding status in the Adapter Engine due to wrong data.

Even when we know that these messages are of no use, we end up deleting them one by one due to the EOIO nature.

 

However, there is a method to terminate the entire sequence, which leads to the cancellation of these Holding status messages in bulk.

 

Following is the process to do so:

 

Step 1: Select the below-mentioned columns from the “Configure Table Columns” option in RWB’s Message Monitoring.


1.jpg

Step 2: Using the sequence ID in the advanced selection criteria, list all messages stuck in the particular sequence; this step gives you a count of the messages in that sequence. In normal behavior, one of the messages should be in System Error status and the rest of them should be in Holding status.

 

2.jpg

 

In the example below, sequence ID “SEQID001” has 8 messages in it.

 

3.jpg

 

Step 3: Open the Sequence Status Monitor using the below URL:

 

http://<host>:<port>/MessagingSystem/monitor/sequenceStatus.jsp

 

<host> is the hostname of the PI system and <port> is the HTTP port of the PI system.
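For example, with purely illustrative host and port values, the URL would look like this:

http://pihost.example.com:50000/MessagingSystem/monitor/sequenceStatus.jsp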

       

Step 4: Enter the sequence ID and an appropriate date range to include all messages in that sequence, and click Apply Filter.

 

4.jpg

 

Step 5: Click on Terminate Sequence (hard) in order to cancel the sequence. The sequence will show the status TERMINATED once done.

 

5.jpg

 

6.jpg

 

Step 6: Go back to Message Monitoring, select all messages of that sequence, and click Resend. This step will set the status of the messages to “Cancelled with Errors”.

 

7.jpg

 

Click “Refresh” to show the updated status.

 

8.jpg

 

Please post your comments to improve the content.

Export Integration Flows from NWDS To Desktop !!!

$
0
0

The reason for writing this blog is that a PI consultant posted a thread on SCN (http://scn.sap.com/thread/3544808) about an issue while exporting an integration flow to the file system or desktop. So I would like to show the steps to be followed to export an integration flow to the file system.

1. Log in to the Integration Directory in NWDS.
2. To export an integration flow, go to the option shown in the picture below.

1.jpg

3. Select Integration Flow as the drop-down option, as shown in the next screen. 2.jpg

4. Choose the flow that needs to be exported. --> 3.jpg

5. Choose Next and then select the objects involved in the flow. --> 4.jpg

6. In the next screen, enter the file name under which the integration flow will be saved, and use the Browse button to select the target location. --> 5.jpg

 

7. After pressing OK, the flow will be exported to the desktop. --> 6.jpg

8. The file is now available on the desktop. --> 7.jpg

 

 

That's it, very simple to export.


Thanks & Regards,

avinash M

How to zip and encode a field as base64 with a UDF in a Message Mapping

$
0
0

In some scenarios you may need to compress a document as a ZIP archive and send it through a web service encoded in Base64. To achieve this, we can use a UDF and some standard Java libraries.

 

The UDF will be this:

 

public String invoice2zip(String invoice, String filename, Container container) throws StreamTransformationException {
    // write the invoice content into an in-memory ZIP archive
    ByteArrayOutputStream baos = new ByteArrayOutputStream();
    ZipOutputStream zos = new ZipOutputStream(baos);
    BASE64Encoder encoder = new BASE64Encoder();
    String encoded = "";
    try {
        zos.putNextEntry(new ZipEntry(filename));
        zos.write(invoice.getBytes());
        zos.close();
        // encode the zipped bytes as a Base64 string
        encoded = encoder.encode(baos.toByteArray());
    } catch (Exception e) {
        return "";
    }
    return encoded;
}

In the UDF's import section we need java.io.ByteArrayOutputStream, java.util.zip.ZipOutputStream and java.util.zip.ZipEntry, plus a library for the Base64 encoding: sun.misc.BASE64Encoder
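Note that sun.misc.BASE64Encoder is an internal JDK class and may be unavailable on newer JVMs. On a Java 8 or later stack, a sketch of the same UDF using the public java.util.Base64 API could look like this (the method signature is kept the same as above; this variant is an assumption, not part of the original blog):

public String invoice2zip(String invoice, String filename, Container container) throws StreamTransformationException {
    // imports assumed: java.io.ByteArrayOutputStream, java.util.zip.ZipOutputStream,
    // java.util.zip.ZipEntry, java.util.Base64
    try {
        ByteArrayOutputStream baos = new ByteArrayOutputStream();
        ZipOutputStream zos = new ZipOutputStream(baos);
        zos.putNextEntry(new ZipEntry(filename));
        zos.write(invoice.getBytes("UTF-8"));
        zos.closeEntry();
        zos.close();
        // java.util.Base64 is part of the public API since Java 8
        return Base64.getEncoder().encodeToString(baos.toByteArray());
    } catch (Exception e) {
        return "";
    }
}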

       

Mapping:

 

mapping.PNG

 

UDF:

 

test.png

 

Regards


