Channel: Process Integration (PI) & SOA Middleware

Step by step: add the current date and timestamp (or any data) in the middle of the receiver file name using ABAP mapping


Hi All,

 

Requirement: I want to add the current date and timestamp in the middle of the receiver file name using the ABAP mapping dynamic configuration functionality.

 

Step 1: Configure the receiver file communication channel like below.

c1.png

Step 2: Click on the Advanced tab and check Adapter-Specific Message Attributes.

c2.png

 

Step 3: Create the mapping class in SE24.

The sample code in the mapping class is shown in the screenshots below.

c3.png

c4.png
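The ABAP code itself is only visible in the screenshots above. As a rough sketch of the same idea for readers on the Java stack (a swapped-in technique, not the ABAP class from the screenshots; the class and constant names are illustrative), the equivalent logic in a Java mapping sets the FileName adapter-specific attribute via DynamicConfiguration:

import java.io.InputStream;
import java.io.OutputStream;
import java.text.SimpleDateFormat;
import java.util.Date;

import com.sap.aii.mapping.api.AbstractTransformation;
import com.sap.aii.mapping.api.DynamicConfiguration;
import com.sap.aii.mapping.api.DynamicConfigurationKey;
import com.sap.aii.mapping.api.StreamTransformationException;
import com.sap.aii.mapping.api.TransformationInput;
import com.sap.aii.mapping.api.TransformationOutput;

public class TimestampFileNameMapping extends AbstractTransformation {

    // Adapter-specific message attribute for the file adapter's file name
    private static final DynamicConfigurationKey FILE_NAME_KEY =
        DynamicConfigurationKey.create("http://sap.com/xi/XI/System/File", "FileName");

    @Override
    public void transform(TransformationInput in, TransformationOutput out)
            throws StreamTransformationException {
        try {
            // Build e.g. Test20160609_170000.csv.pgp (date and timestamp in the middle)
            String timestamp = new SimpleDateFormat("yyyyMMdd_HHmmss").format(new Date());
            DynamicConfiguration conf = in.getDynamicConfiguration();
            conf.put(FILE_NAME_KEY, "Test" + timestamp + ".csv.pgp");

            // Pass the payload through unchanged
            InputStream is = in.getInputPayload().getInputStream();
            OutputStream os = out.getOutputPayload().getOutputStream();
            byte[] buf = new byte[8192];
            for (int len; (len = is.read(buf)) != -1; ) {
                os.write(buf, 0, len);
            }
        } catch (Exception e) {
            throw new StreamTransformationException(e.getMessage(), e);
        }
    }
}

For this to take effect, the receiver file channel must use the adapter-specific message attribute for the file name, as enabled in Step 2.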

 

Step 4: Assign the mapping class at the operation mapping level.

c5.png

 

Step 5: Provide the operation mapping details at the interface determination level.

c6.png

 

The output file name will then be "Test" + current date + timestamp + ".csv.pgp".

 

Hope this will be useful.

 

Thank you,

Narasaiah T.


Salesforce adapter in SAP HANA Cloud Integration


Join this session on June 9th, 2016, at 17:00 (GMT+2) - Amsterdam, Berlin, Rome, Stockholm, Vienna - to learn how you can use the Advantco Salesforce adapter to easily connect to Salesforce.

 

The Salesforce adapter is built on the powerful Adapter Development Kit (ADK) of SAP HANA Cloud Integration. The ADK offers a rich set of capabilities for helping you develop adapters to various systems and applications. Advantco is a leading SAP partner providing integration solutions, consulting services and custom development. With their out-of-the-box Salesforce adapter, you can get your Salesforce systems integrated in no time.

 

In this webinar, we will talk about:

  • Overview of the connectivity options in SAP HANA Cloud Integration
  • SAP and Salesforce integration architecture
  • The Advantco Salesforce adapter
  • Use case demo

 

Register here

 

Presenters:

Peter Ha - Product Manager - Advantco International LLC

Hemachandran, Sujit - Product Manager, SAP HANA Cloud Integration

Delete Operation Using SFSF Odata Adapter


Recently we got a requirement to delete some records from SuccessFactors using the SFSF adapter. We had already performed INSERT, UPDATE and UPSERT operations with the SFSF adapter, but we had never worked with the DELETE operation. There is no DELETE operation in the query modeler in Juno or Kepler, but we finally found a way to use the DELETE operation successfully, so here I'm sharing my experience.


Objective:

To perform a delete operation using the SFSF OData adapter, in order to delete a record from an OData entity in SuccessFactors.


Scenario:

For example, we have employee positions available in SuccessFactors, and we want to delete one record using the SFSF adapter.



Steps:


Open the query modeler in Juno/Kepler, select the entity you are interested in (in this case it's Position) and select the Update (PUT) operation, as there is no DELETE operation available there.

It will select the key fields from that entity. There is no need to add more fields here. Click on Finish so that the XSD file is generated.

Capture.PNG

 

Import this XSD into the ESR. It looks like below.

Capture2.PNG

 

Use this external definition at the receiver side, which is SuccessFactors.

Create all the necessary ESR components and the iFlow.

Copy the resource path generated by the query modeler. In this case it is "Position('code',datetime{effectiveStartDate})".
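At runtime the adapter substitutes the key values from the payload into this resource path, so the request that reaches SuccessFactors should look roughly like the sketch below (the host placeholder and key values are made up for illustration):

DELETE https://<your-sfsf-api-host>/odata/v2/Position('POS_001',datetime'2015-01-01T00:00:00')

This matches the HTTP request visible later in the SuccessFactors OData audit log.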

 

Then you need to configure the SFSF OData receiver adapter as below.

Capture1.PNG

 

As per our XSD, the XML looks as below.

Capture3.PNG

In the SuccessFactors OData audit log, we can see the HTTP request like this.

Capture4.PNG

 

It has successfully deleted the Position entry.

Capture5.PNG

 

You can use the references below to learn more about the SFSF adapter and the query modeler.



References:

 

SFSF Adapter - SuccessFactors (SFSF) Adapter for SAP NetWeaver Process Integration

Modeling - PI.SFSF Integration.DOC - How to Model Successfactors SOAP and ODATA Entities using Eclipse Juno Tool.

Outbound support for TLS 1.1/1.2


Why use it?

The Payment Card Industry (PCI) Security Standards Council has declared SSL and TLS 1.0 no longer secure (see Date Change for Migrating from SSL and Early TLS).

Vendors like Salesforce.com adopt PCI standards and disable SSL and TLS 1.0, so in the near future we will be forced to use TLS 1.1 or 1.2.

 

What's the problem?

Activating TLSv1.1 or TLSv1.2 on the client side unfortunately results in handshake failures with a certain non-marginal number of older servers.

They implement the negotiation of the SSL/TLS protocol version incorrectly (TLS protocol version intolerance). 

Besides version intolerance, TLS extensions in the ClientHello handshake message can cause handshake failures with older servers that do not support them (TLS extension intolerance).

Because of that, all TLS communication has to be tested before using TLSv1.1 or TLSv1.2.

See SAP Note 510007 - Setting up SSL on Application Server ABAP - for further details.

 

Outbound communication using IAIK library

SAP Note 2284059 - Update of SSL library within NW Java server - introduced new TLS versions for outbound communication using the IAIK library.

The default configuration is stored in iaik_ssl.jar in the folder /usr/sap/<SID>/J21/j2ee/cluster/bin/ext/mail-activation.

iaik_ssl.jar contains an SSLContext.properties file in the folder iaik\security\ssl, listing the default configuration parameters.

 

SSLContext.properties

 

 

#########################################
#
#  SSLContext properties
#  supported since iSaSiLk 4.4
#  Location of configuration file is iaik/security/ssl/SSLContext.properties within CLASSPATH
#  It can be redefined with system property iaik.security.ssl.configFile
#  e.g. java -Diaik.security.ssl.configFile=file:c:/java/SSLContext.properties
#
#########################################

# allowLegacyRenegotiation is set to true, otherwise we can't communicate with unpatched peers
allowLegacyRenegotiation=true

# insecure renegotiation is disabled for the SSL server but remains allowed for the SSL client
server.disableRenegotiation=true

# deactivated to avoid regressions after iSaSiLk 5.102
chainVerifier.checkExtensions=false

# avoid issues with IIS servers
extension=signature_algorithms

 

Manual configuration is possible using a custom config file

To enable custom configuration, one has to set the property "iaik.security.ssl.configFile". This is possible using the ConfigTool.

 

SSLConfigFile.PNG

 

Afterwards you have to create a file containing your custom properties, e.g. ssl.config.

Recommendation: Copy the values known from SSLContext.properties to avoid problems.

 

Important custom parameters are listed below:

 

Parameter: client.minProtocolVersion (example value: TLS12)
Requires TLS 1.2 as the minimum version for communication. Lower versions are disabled.

Parameter: client.maxProtocolVersion (example value: TLS11)
Limits usage of TLS to version 1.1.

Parameter: protocolVersions (example value: nfe.fazenda.sp.gov.br(TLS11,TLS11))
Limits communication to specific TLS versions per domain name. The first value is the minimum and the second the maximum version. If ports other than 443 are used, they have to be added using a colon, e.g. example.com:5443(SSL20,TLS12).
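Putting the pieces together, a custom ssl.config might look like the sketch below; it copies the defaults from SSLContext.properties, as recommended above, and adds a per-domain restriction (the domain and port are illustrative):

# defaults copied from SSLContext.properties to avoid regressions
allowLegacyRenegotiation=true
server.disableRenegotiation=true
chainVerifier.checkExtensions=false
extension=signature_algorithms

# require TLS 1.2 for all outbound connections by default
client.minProtocolVersion=TLS12

# illustrative exception: a legacy host that only tolerates TLS 1.1
protocolVersions=legacy.example.com:5443(TLS11,TLS11)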

 

More options can be found in SAP Note 2284059 - Update of SSL library within NW Java server.

 

Testing outbound communication

To test communication you can use the XPI Inspector. Use Example 11 (Authentication, SSL & PP), or Example 50 (XI Channel) if Example 11 does not deliver any results (this seems to happen for FTPS channels).

 

SSL Debug Error

 

Begin IAIK Debug:

 

ssl_debug(21): Starting handshake (iSaSiLk 5.104)...

 

ssl_debug(21): Sending v3 client_hello message to preprod.connect.elemica.com:5443, requesting version 3.3...

 

ssl_debug(21): Sending extensions: renegotiation_info (65281), signature_algorithms (13)

 

ssl_debug(21): Received alert message: Alert Fatal: unexpected message

 

ssl_debug(21): SSLException while handshaking: Peer sent alert: Alert Fatal: unexpected message

 

ssl_debug(21): Shutting down SSL layer...

 

ssl_debug(21): Closing transport...

 

SSL Debug Success

 

Begin IAIK Debug:

 

ssl_debug(1): Starting handshake (iSaSiLk 4.5)...

 

ssl_debug(1): Sending v3 client_hello message to connect.elemica.com:5443, requesting version 3.1...

 

ssl_debug(1): Received v3 server_hello handshake message.

 

ssl_debug(1): Server selected SSL version 3.1.

 

ssl_debug(1): Server created new session EA:05:A1:1E:C5:04:C5:2F...

 

ssl_debug(1): CipherSuite selected by server: SSL_RSA_WITH_3DES_EDE_CBC_SHA

 

Solving problems

If you face any intolerance errors, try to reduce the TLS versions allowed for the affected domain using the parameter protocolVersions. To prevent the client from sending the signature_algorithms extension, the only possible way at the moment is to set the minimum version to SSL20.

 

Find channels using TLS/SSL

There is no standard way to find all channels using TLS/SSL. Some channels can be found with the extended search of the Integration Builder using the attribute "Adapter Type".

 

ExtendedSearch.PNG

 

This does not work for SOAP channels; therefore we used a SQL statement to find all SOAP channels and then filtered for HTTPS in Excel.

 

SQL statement

select a.CONTEXTID, a.OBJECTID, b.channel,
       xmlparse(a.ATTRBTS)
from ppo.sapj2ee."XI_DIRSYNCCHANNEL" a,
     ppo.sapj2ee."XI_DIRKEYCHANNEL"  b
where a.objectid = b.objectid
  and a.msgPROT = 'SOAP'
with ur;

 

Note: we are using DB2; xmlparse is a DB2-specific function.

Integration on the run


Last Saturday evening, as I was running along the river Rhine together with numerous fellow women runners, my mind drifted back to the busy week I had at Sapphire Orlando from May 17 to May 19. Running is a great way to streamline your thought process and kind of get into a flow. I would recommend that everybody who is juggling multiple projects, both at work and at home, consider going for a run, as it helps you streamline your thoughts completely.

 

Out of the tons of messages filling my head when I got back home from Orlando, I was able to distill a few key ones, especially for those of you working in the integration space.

 

  • Across all keynotes it was clear that the Cloud Integration service of the HANA Cloud Platform is the default integration technology for all SAP solutions
  • The SAP API Business Hub which was launched at Sapphire will be the central catalog of APIs exposed by SAP Solutions (HCP, S/4 HANA, SuccessFactors, Ariba etc.)
  • SAP packs a powerful punch with the combination of well-defined APIs, packaged integration content on HANA Cloud Platform and the best in class cloud integration technology with HANA Cloud Integration

 

I myself had quite a few sessions and meetings with customers at Sapphire and ASUG, which explored many of these topics in detail. Especially the session on exploring the extensions and integration in SAP S/4 HANA received a lot of attention from customers who wanted to understand how their integration and extension strategy would shape up with S/4 compared to what they do today. I really enjoyed moderating the ASUG roundtable on cloud integration, as it was completely slideware-free and everybody could share their cloud adoption journey, their challenges with integration, and how SAP and the ecosystem are working together to address most of these challenges.

Sindhu_Sapphire_schedule.png

SapphireASUGRoundtable.jpg

 

I enjoyed showing the demo around sentiment analysis in the context of S/4 HANA, bringing together HCP, the packaged integration content, and the sentiment analysis module in S/4 HANA. This is a classic example of a digital transformation story in action: you use the HANA Cloud Platform Integration service to integrate with social media channels like Twitter, Facebook etc. and push this data into the sentiment analysis module in S/4 HANA, where you can slice and dice the data to derive actionable insights as well as run targeted campaigns for specific user groups interested in your product portfolio. If you are interested in knowing more, I wrote an SAP Insider article as part of a series of articles around 'How SAP S/4 HANA is evolving'.

 

 

If you have questions on any of these sessions or around our cloud integration strategy, do not hesitate to get in touch with me, Sindhu Gangadharan.

Looking forward to hearing from you. Last but not least, check out the keynote by SAP Executive Board Member Bernd Leukert.

SuccessFactors Deltasync - Option 1


Of late, there have been a few discussion topics raised on how Deltasync works in SuccessFactors.

 

Deltasync is only available for SFAPI entities (Compound Employee, FO objects). For OData, there are custom options available.

 

Out-of-the-box

 

A sender adapter polls SuccessFactors periodically (hourly, daily, or even on a schedule you set up for the sender communication channel, e.g. poll at 7 AM, then at 12 PM, and again towards close of day at 6 PM - totally up to your requirements and business hours). It will then fetch all the changes to the entity via the Deltasync option (which is based on the last_modified_on element).

 

For example, the following is the select statement from SuccessFactors for the Location Foundation Object -

 

SELECT address_state, end_date, externalCode, last_modified_on, name, start_date, status FROM FO_location WHERE  last_modified_on > to_datetime('${deltasync.maxDateFromLastRun}')

 

In the Advanced tab of the sender communication channel, switch on Use Additional Settings and enter the parameter deltasync.maxDateFromLastRun with the value from which you want the fetch to start (thanks to Heiko Konert for posting the picture below on another discussion).


Delta Sync.jpg
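As a sketch, the additional-settings entry would then look something like this (the parameter name is from the channel configuration above; the exact datetime format must match what the channel expects, and the value here is purely illustrative):

deltasync.maxDateFromLastRun = 2016-05-01T00:00:00Z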

 

And that is it!


A lot of integration aficionados (including myself) have wondered where this data is held. It is a parameter that cannot be modified! And perhaps for a good reason too.

 

The only way you can "change" it is by creating a totally new communication channel for your scenario. So that tells me that the communication channel name must be part of the key here. Thanks to Heiko Konert for diligently finding out that the key is SFSF + Party + Service + Communication Channel Name + Channel Object Id.

 

This is the most basic scenario supported by SuccessFactors. What if, for some reason, the query needs to be executed from an earlier time than the value set in deltasync.maxDateFromLastRun?

 

We will take a look at potential solutions in another blog.

 

Kevin Laevers, you may find this blog useful!

 

Regards

Arijit

Open Letter to SAP regarding HCI


Dear SAP,

 

 

I was not at the recent Sapphire conference in Orlando, but thanks to social media and SCN, I have been able to get some updates from the event. It was great to hear Daniel Graversen reporting from Sapphire that integration remains a key to SAP's push to the cloud. This was again emphasized by Sindhu Gangadharan in her recent blog, Integration on the run.

 

 

I have been mulling over some of the key messages mentioned in Sindhu's blog and hence the reason for this open letter.

 

 

As someone who plies his trade in SAP's integration space, SAP's direction has an impact on me, if not in the short term, then definitely in the foreseeable future (well, unless I call it a day, and open up my own non-SAP related business!).

 

 

After 10 years of working on XI/PI/PO/PRO, it was a breath of fresh air to be able to bring HCI "out on a test drive" when I got access to a trial tenant. The use of graphical-based Integration Flows to design integration scenarios is so intuitive compared to the early days of XI/PI. I applaud SAP's commitment to open standards by using Apache Camel as the message processing model in HCI, as well as providing HCI development tools on the open Eclipse development environment.

 

 

It is exciting to see the benefit of using Camel's open model as illustrated in Nic Botha's series of HCI related blogs. With access to reusable components publicly available in the non-SAP world, I can envision it reducing the time-to-market for custom adapters/solutions that are not yet available from SAP. Additionally, the availability of prepackaged integration content for a variety of integration scenarios will help to reduce "reinventing the wheel" that is quite prevalent in software development these days.

 

 

However, as much as I want to say that I love HCI, I still have some reservations. I have mentioned some of them in my blog about my experience with HCI. Additionally, per my comment on Sindhu's blog, if HCI is the default integration technology for all of SAP's solutions, I would struggle to implement some integration requirements that I am able to achieve on PI/PRO today. And it is not just me; taking a stroll through the discussion forums, you will find others struggling too:

HCI: big problems in small scenarios

HCI Custom functions using Groovy Script | SCN

 

 

As with any new or yet-to-mature product/service, it is a chicken-and-egg scenario: customers might not opt for HCI due to the lack (whether real or perceived) of HCI skills in the market, and the lack of an installed base means there are not many opportunities for developers to get implementation experience. I have had contacts asking my opinion on whether they should invest their (hard-earned) money to attend HCI training in anticipation of market demand for HCI skills.

 

 

In my humble opinion, the success of SAP is intricately linked to the success of the army of SAP-skilled (internal, customer-based, partner-based or independent) professionals. Again, I applaud SAP for providing free access to a trial HANA Cloud Platform (HCP) system that does not expire and can be easily registered by anyone with an S- or P-user account. This is a fantastic avenue for developers to hone their skills on HCP-related developments. However, although HCI also offers free trial access, it is only provided upon request and expires after 30 days.

 

 

This is an area where I believe SAP can do more. By providing free trial access to HCI that does not expire and is easily registered (similar to HCP), SAP can enable the multitude of seasoned PI/PRO developers to transition to its default integration technology. This removes the entry barrier for developers seeking to attain HCI skills. At the same time, such a trial program would provide SAP with plenty of feedback (similar to a beta testing program) regarding the service's features and functionalities, which is crucial to the improvement of the service. Lastly, customers will be more willing to embrace HCI as their integration platform with the availability of HCI skills in the market.

 

 

In my book, this is a win for SAP, a win for its customers and a win for its consultants/developers.

 

 

These are my two cents in an effort to keep SAP relevant.

 

 

Yours truly,

Eng Swee

Deep Insert Support in OData Provisioning in HCI


Introduction

As per the OData V2 specification, an OData Entry being created may contain links to other Entries in the service. If that is the case, the server is expected to create the inline Entry and the appropriate Links. For example, to create a new product in the catalog that is associated with the category Entry, you would execute a POST request against the OData.svc/Products collection with a product Entry containing a link to the Category Entry (using any URI that resolves to that resource).

Alternatively you can create and link an Entry to a related Entry by leveraging the addressing scheme if the server supports addressing related items. For example, if a server implements the OData URI conventions, the address …/Categories(10)/Products points at all the products in the specified category. When a POST request is issued against that products collection (instead of the top-level products collection), the server will create the new product Entry and automatically link it to the parent category.

You may combine the above two methods to create an Entry that is related to another one implicitly through the relationship implied in the URL, and related to other Entries by explicitly specified Links in the request body. When you need to create multiple related Entries, you can do so as independent operations or perform a single POST with a tree of Entries (if the Links between Entries allow it structurally). The tree is formed by using inline expansion. All expanded Entries are considered new. Servers process a request with inline Entries by creating the individual Entries and then linking them in the same way as linking would have happened in an independent request. The related Entry or collection of Entries is represented as a child element of an m:inline element.
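As a compact sketch of the link-based variant described above (the service URL and entity names are illustrative, not from a specific service), the POST carries the new Entry plus an explicit link to the existing Category:

POST /OData.svc/Products HTTP/1.1
Content-Type: application/atom+xml

<entry xmlns="http://www.w3.org/2005/Atom">
    <!-- properties of the new product go here -->
    <link rel="http://schemas.microsoft.com/ado/2007/08/dataservices/related/Category"
          type="application/atom+xml;type=entry"
          title="Category" href="Categories(10)"/>
</entry>

The deep insert (inline expansion) variant replaces the plain link with an m:inline element containing the related Entries; a full example appears in the OData Request Payload section later in this blog.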


For additional details, refer to the OData V2 links below:

http://www.odata.org/documentation/odata-version-2-0/operations/

http://www.odata.org/documentation/odata-version-2-0/uri-conventions/

http://www.odata.org/documentation/odata-version-2-0/atom-format/

 

Scenario

In HCI, an OData service exposing a SOAP data source should be able to support the deep insert operation on an entity in such a way that if the corresponding entity in the SOAP service data model has a parent-child relationship with another entity, the child entity is persisted along with the parent entity. There are certain prerequisites that define the structural contract for the OData model and the SOAP service data model; these should be met and strictly followed to achieve the deep insert use case.

The edmx should look like this:


<?xml version="1.0" encoding="UTF-8"?>
<edmx:Edmx xmlns:edmx="http://schemas.microsoft.com/ado/2007/06/edmx" xmlns:m="http://schemas.microsoft.com/ado/2007/08/dataservices/metadata" Version="1.0">
    <edmx:DataServices m:DataServiceVersion="2.0">
        <Schema xmlns="http://schemas.microsoft.com/ado/2008/09/edm" xmlns:sap="http://www.sap.com/Protocols/SAPData" Namespace="Test">
            <EntityType Name="User">
                <Key>
                    <PropertyRef Name="userId"/>
                </Key>
                <Property Name="userId" Nullable="false" Type="Edm.String"/>
                <Property Name="address" Type="Edm.String"/>
                <Property Name="contactNo" Type="Edm.String"/>
                <Property Name="email" Type="Edm.String"/>
                <Property Name="userName" Type="Edm.String"/>
                <NavigationProperty FromRole="From_User" Name="ServiceSet" Relationship="Test.UserService" ToRole="To_Service"/>
            </EntityType>
            <EntityType Name="Service">
                <Key>
                    <PropertyRef Name="serviceID"/>
                </Key>
                <Property Name="serviceID" Nullable="false" Type="Edm.String"/>
                <Property Name="serviceName" Type="Edm.String"/>
                <Property Name="userID" Type="Edm.String"/>
            </EntityType>
            <Association Name="UserService">
                <End Multiplicity="1" Role="From_User" Type="Test.User"/>
                <End Multiplicity="*" Role="To_Service" Type="Test.Service"/>
                <ReferentialConstraint>
                    <Principal Role="From_User">
                        <PropertyRef Name="userId"/>
                    </Principal>
                    <Dependent Role="To_Service">
                        <PropertyRef Name="userID"/>
                    </Dependent>
                </ReferentialConstraint>
            </Association>
            <EntityContainer Name="default" m:IsDefaultEntityContainer="true">
                <EntitySet EntityType="Test.User" Name="UserSet"/>
                <EntitySet EntityType="Test.Service" Name="ServiceSet"/>
                <AssociationSet Association="Test.UserService" Name="UserServiceSet">
                    <End EntitySet="UserSet" Role="From_User"/>
                    <End EntitySet="ServiceSet" Role="To_Service"/>
                </AssociationSet>
            </EntityContainer>
        </Schema>
    </edmx:DataServices>
</edmx:Edmx>

Prerequisites

  1. There should be an association property defined between the two OData entities
  2. There should be a one-to-many relationship cardinality defined between the two OData entities. This also applies to the entities belonging to the SOAP service data model.
  3. There should be a referential constraint defined between the two OData entities
  4. The web service API exposed for the create operation should adhere to a nested inline structure format and return the parent entity structure, as shown below:

Note: deep insert can be done to any depth; however, the example below covers only one level.


Web Service API Input/Request Payload XML Format:


<soapenv:Envelope xmlns:soapenv="http://schemas.xmlsoap.org/soap/envelope/"

    xmlns:sei="http://sei.soap.service.metering.rt.gw.sap.com/">

    <soapenv:Header />

    <soapenv:Body>

        <sei:createSubscription>

            <!--Optional: -->

            <arg0>

                <!--Optional: -->

                <services>

                    <!--Zero or more repetitions: -->

                    <service>

                        <!--Optional: -->

                        <serviceID>1</serviceID>

                        <!--Optional: -->

                        <serviceName>getStockUpdates</serviceName>

                        <!--Optional: -->

                        <userID>1</userID>

                    </service>

                    <service>

                        <!--Optional: -->

                        <serviceID>2</serviceID>

                        <!--Optional: -->

                        <serviceName>purchaseStocks</serviceName>

                        <!--Optional: -->

                        <userID>1</userID>

                    </service>

                </services>

                <!--Optional: -->

                <user>

                    <!--Optional: -->

                    <address>Bangalore</address>

                    <!--Optional: -->

                    <contactNo>67867867</contactNo>

                    <!--Optional: -->

                    <email>user1@sap.com</email>

                    <!--Optional: -->

                    <userId>1</userId>

                    <!--Optional: -->

                    <userName>user1</userName>

                </user>

            </arg0>

        </sei:createSubscription>

    </soapenv:Body>

</soapenv:Envelope>

 

Web Service API Output/Response Payload XML Format:


<soap:Envelope xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/">

   <soap:Body>

      <ns2:createSubscriptionResponse xmlns:ns2="http://sei.soap.service.metering.rt.gw.sap.com/">

         <return>

            <address>Bangalore</address>

            <contactNo>67867867</contactNo>

            <email>user1@sap.com</email>

            <userId>1</userId>

            <userName>user1</userName>

         </return>

      </ns2:createSubscriptionResponse>

   </soap:Body>

</soap:Envelope>

 

In the iflow, only one entity can be modelled; in this case it is UserSet. Hence we need a script before the request mapping that fetches the list of services and places this value in a header. We also need a script after the request mapping that combines the output of the mapping with the value stored in the header by the previous script. The result is the request payload for the SOAP call.



Pic1.PNG


Request Mapping


Pic2.PNG


Response Mapping


Pic3.PNG

Script before request mapping

 

import com.sap.gateway.ip.core.customdev.logging.*;
import com.sap.gateway.ip.core.customdev.util.Message;
import org.w3c.dom.Document;
import org.w3c.dom.Node;
import org.w3c.dom.NodeList;

def Message processData(Message message) {
    // Get the body (an XML Document) from the exchange
    Document payload = message.getBody();
    processNodes(payload.getDocumentElement(), message);
    log.logErrors(LogMessage.TechnicalError, "message header1: " + message.getHeaders().get("InlineEntry"));
    return message;
}

def Message processNodes(Node node, Message message) {
    StringBuffer reqXML = new StringBuffer();
    NodeList nodeList = node.getChildNodes();
    for (int i = 0; i < nodeList.getLength(); i++) {
        Node currentNode = nodeList.item(i);
        if (currentNode.getNodeName() == "ServiceSet") {
            // Collect all inline Service entries into a <services> fragment
            reqXML.append("<services>");
            NodeList nodeList1 = currentNode.getChildNodes();
            for (int j = 0; j < nodeList1.getLength(); j++) {
                Node odataEntryNode = nodeList1.item(j);
                if (odataEntryNode.getNodeName() == "Service") {
                    reqXML.append("<service>");
                    NodeList serviceNodeList = odataEntryNode.getChildNodes();
                    for (int k = 0; k < serviceNodeList.getLength(); k++) {
                        Node serviceNode = serviceNodeList.item(k);
                        if (serviceNode.getNodeName() == "userID") {
                            reqXML.append("<userID>" + serviceNode.getTextContent() + "</userID>");
                        } else if (serviceNode.getNodeName() == "serviceID") {
                            reqXML.append("<serviceID>" + serviceNode.getTextContent() + "</serviceID>");
                        } else if (serviceNode.getNodeName() == "serviceName") {
                            reqXML.append("<serviceName>" + serviceNode.getTextContent() + "</serviceName>");
                        }
                    }
                    reqXML.append("</service>");
                }
            }
            reqXML.append("</services>");
        } else {
            // Recurse until the ServiceSet element is found
            processNodes(currentNode, message);
        }
    }
    if (reqXML.toString() != null && reqXML.toString().length() > 0) {
        log.logErrors(LogMessage.TechnicalError, "reqXML: " + reqXML);
        // Stash the fragment in a header so the post-mapping script can re-insert it
        message.setHeader("InlineEntry", reqXML);
        log.logErrors(LogMessage.TechnicalError, "message header: " + message.getHeaders().get("InlineEntry"));
    }
    return message;
}

Script after request mapping


import com.sap.gateway.ip.core.customdev.logging.*;
import com.sap.gateway.ip.core.customdev.util.Message;
import java.util.*;

def Message processData(message) {
    String payload = new String(message.getBody(), "UTF-8");
    log.logErrors(LogMessage.TechnicalError, "payload in script2: " + payload);
    StringBuffer newReqXML = new StringBuffer();
    // Fetch the inline <services> fragment stored by the pre-mapping script
    String inlineEntry = message.getHeaders().get("InlineEntry");
    log.logErrors(LogMessage.TechnicalError, "InlineEntry: " + inlineEntry);
    // Split the payload into tags and text so the fragment can be re-inserted after </user>
    def tokens = payload.split("(?=<)|(?<=>)");
    log.logErrors(LogMessage.TechnicalError, "tokens: " + tokens);
    for (int i = 0; i < tokens.length; i++) {
        newReqXML.append(tokens[i]);
        if (tokens[i].contains("</user>")) {
            newReqXML.append(inlineEntry);
        }
    }
    log.logErrors(LogMessage.TechnicalError, "newReqXML: " + newReqXML.toString());
    message.setBody(newReqXML.toString());
    return message;
}

OData Request Payload


<entry xmlns="http://www.w3.org/2005/Atom"

 

    xmlns:m="http://schemas.microsoft.com/ado/2007/08/dataservices/metadata"

    xmlns:d="http://schemas.microsoft.com/ado/2007/08/dataservices"

    xml:base="https://localhost:8083/gateway/odata/SAP/MTR4;v=1/">

    <id>

        https://localhost:8083/gateway/odata/SAP/MTR4;v=1/UserSet('3')

    </id>

    <title type="text">UserSet</title>

    <updated>2015-09-11T19:14:22.678+05:30</updated>

    <category term="EXPAND.User"

        scheme="http://schemas.microsoft.com/ado/2007/08/dataservices/scheme" />

    <link href="UserSet('3')" rel="edit" title="User" />

    <link href="UserSet('3')/ServiceSet"

        rel="http://schemas.microsoft.com/ado/2007/08/dataservices/related/ServiceSet"

        title="ServiceSet" type="application/atom+xml;type=feed">

        <m:inline>

            <feed xml:base="https://localhost:8083/gateway/odata/SAP/MTR4;v=1/">

                <id>

                    https://localhost:8083/gateway/odata/SAP/MTR4;v=1/ServiceSet

                </id>

                <title type="text">ServiceSet</title>

                <updated>2015-09-11T19:14:22.68+05:30</updated>

                <author>

                    <name />

                </author>

                <link href="UserSet('3')/ServiceSet" rel="self" title="ServiceSet" />

                <entry>

                    <id>

                        https://localhost:8083/gateway/odata/SAP/MTR4;v=1/ServiceSet(serviceID='5')

                    </id>

                    <title type="text">ServiceSet</title>

                    <updated>2015-09-11T19:14:22.681+05:30</updated>

                    <category term="EXPAND.Service"

                        scheme="http://schemas.microsoft.com/ado/2007/08/dataservices/scheme" />

                    <link href="ServiceSet(serviceID='5')" rel="edit" title="Service" />

                    <content type="application/xml">

                        <m:properties>

                            <d:serviceID>5</d:serviceID>

                            <d:userID>3</d:userID>

                            <d:serviceName>getStockHolderNames</d:serviceName>

                        </m:properties>

                    </content>

                </entry>

                <entry>

                    <id>

                        https://localhost:8083/gateway/odata/SAP/MTR4;v=1/ServiceSet(serviceID='6')

                    </id>

                    <title type="text">ServiceSet</title>

                    <updated>2015-09-11T19:14:22.682+05:30</updated>

                    <category term="EXPAND.Service"

                        scheme="http://schemas.microsoft.com/ado/2007/08/dataservices/scheme" />

                    <link href="ServiceSet(serviceID='6')" rel="edit" title="Service" />

                    <content type="application/xml">

                        <m:properties>

                            <d:serviceID>6</d:serviceID>

                            <d:userID>3</d:userID>

                            <d:serviceName>getNetWorth</d:serviceName>

                        </m:properties>

                    </content>

                </entry>

            </feed>

        </m:inline>

    </link>

    <content type="application/xml">

        <m:properties>

            <d:UserID>3</d:UserID>

            <d:UserName>user3</d:UserName>

            <d:ContactNo>324-5443-4353</d:ContactNo>

            <d:Email>user3@sap.com</d:Email>

            <d:Address>Chennai</d:Address>

        </m:properties>

    </content>

</entry>

 

OData Response Payload


<?xml version="1.0" encoding="utf-8"?>

<entry xmlns="http://www.w3.org/2005/Atom" xmlns:m="http://schemas.microsoft.com/ado/2007/08/dataservices/metadata" xmlns:d="http://schemas.microsoft.com/ado/2007/08/dataservices" xml:base="http://localhost:8080/com.sap.gateway.core.gwcoreip.web/odata/SAP/MTR4;v=1/">

    <id>http://localhost:8080/com.sap.gateway.core.gwcoreip.web/odata/SAP/MTR4;v=1/UserSet('3')</id>

    <title type="text">UserSet</title>

    <updated>2015-09-24T13:23:02.103+05:30</updated>

    <category term="EXPAND.User" scheme="http://schemas.microsoft.com/ado/2007/08/dataservices/scheme"></category>

    <link href="UserSet('3')" rel="edit" title="User"></link>

    <link href="UserSet('3')/ServiceSet" rel="http://schemas.microsoft.com/ado/2007/08/dataservices/related/ServiceSet" title="ServiceSet" type="application/atom+xml;type=feed"></link>

    <content type="application/xml">

        <m:properties>

            <d:UserID>3</d:UserID>

            <d:UserName>user3</d:UserName>

            <d:ContactNo>324-5443-4353</d:ContactNo>

            <d:Email>user3@sap.com</d:Email>

            <d:Address>Chennai</d:Address>

        </m:properties>

    </content>

</entry>


Is there anything you ever wanted to know about cloud security but were afraid to ask?


Enterprise digital security is a daunting area and almost impossible to get right. When moving to the cloud, digital security is often the number one concern for most enterprises. Furthermore, getting digital security wrong in the cloud may affect your employees and your customers, and may even result in irreparable damage to public trust in your company's brand on a global scale.

If your company has moved to the cloud or is planning on moving to the cloud, now is the time to learn about cloud security.

275985_l_srgb_s_gl.jpg

 

The SAP Technology RIG is starting a series of web events covering various topics related to cloud integration. In our first virtual session, Paul Todd and Nghia Nugyen will give a "by example" introduction to cloud security. Furthermore, we would like to hear from you (using the comments section below) on other security topics you would like us to cover.

 

UPDATE: Thank you to all of you who have joined. If you missed this session you can view the recording here.

Make your SAP Java deployments lighter and faster!


Introduction:


I was recently working on modifying a huge PI Java development which involved a lot of Java projects. The projects used libraries for:

  • Open SQL for optimizing data access for DB2 databases
  • Reading the configuration set for the application to set system-specific properties
  • Sending IDocs via JRA
  • Checking permissions using the security permissions API


The development was executed as a Java job which sends messages as Java proxy calls in some cases and as IDocs in others.

 

As I was making changes to the project, I realized that the deployment was taking pretty long: around 2 minutes 19 seconds to deploy the two EAR files for the project.

 

1.png

 

 

The two EAR files are around 5.5 MB together and hence it's not surprising that the deployment takes so long. Development systems are always thin in terms of configuration and I was looking for ways to speed this up.

2.png

 

Approach


How to reduce the file size?

 

Assume we have an EJB project which gets deployed as an EAR. The same approach can be extended to any Java EE object.

 

The EJB project stays as is. The screenshot below shows the libraries required for the build. We don't make any changes here.

 

3.png

 

 

 

In the EAR project, we need to do the following (after all, it's the EAR project which gets deployed).


1) The EAR project was set up to add the extra libraries. They need to be removed.

4.png

 

So it appears as follows after the change is done with all libraries removed.

 

5.png

 

2) To satisfy the dependency at runtime, go to the Javadoc for the class and find the DC (development component).


e.g. for the NamePermission class the link is

http://help.sap.com/javadocs/CE/se/com.sap.se/com/sap/security/api/permissions/NamePermission.html

 

and the DC info is :

 

6.png

 

 

Copy the DC name and replace / with ~, so the value is tc~je~usermanagement~api.

 

Search for this in the Java class loader in NWA.

 

 

7.png

 

So the entry to be added to application-j2ee-engine.xml will be:

 

<reference reference-type="hard">
    <reference-target provider-name="sap.com" target-type="library">tc~je~usermanagement~api</reference-target>
</reference>
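For orientation, this reference sits inside the EAR's deployment descriptor roughly as in the minimal sketch below (the surrounding structure is sketched from the standard application-j2ee-engine.xml format; verify it against the existing descriptor in your EAR project):

<?xml version="1.0" encoding="UTF-8"?>
<application-j2ee-engine>
    <reference reference-type="hard">
        <reference-target provider-name="sap.com" target-type="library">tc~je~usermanagement~api</reference-target>
    </reference>
</application-j2ee-engine>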

 


Conclusion:


Now, if we build the project and check the EAR file size, it has reduced a lot.

8.png

 

 

And the deployment takes much less time.

 

9.png

 

 

It's down to 23 seconds from 139 seconds!

 

The same approach can be used for custom libraries if they’re being used in another project. This applies to PI adapter modules development as well since they get deployed as EAR files.


Using this approach, the deployments are optimized in both size and time. More importantly, it optimizes the most valuable resource: a developer's brain cycles, which are no longer spent waiting. I also see it as aligned with the DRY principle of software engineering: there is one library doing a certain activity in a system.


Hope it's useful for the developer community as more developments are created on the Java stack.

You don't need 10 years of Java coding experience to code in PI/PRO! - TechEd 2016 Barcelona


tenyears.gif

 

Struggling with your evil HR director because you don't have 10 years of Java coding experience?

 

seriously_2.jpg

 

Want to improve on Java coding skills for PI/PRO developments but don't know where to start?

 

Overwhelmed with frequent PI/PRO requirements which require complex Java developments?

 

Tired of repeatedly having to copy-and-paste the same block of logic for each new development?

 

Worried that a code change might unintentionally affect existing scenarios or functionalities?

 

 

 

If you have ever pondered upon any of these (with the exception of your evil HR director!), come and join me at SAP TechEd Barcelona this year as I host a Community Session exploring best practices to Enhance Productivity for Java Development in SAP Process Orchestration.

 

 

It will be a lecture-styled session peppered with loads of practical tips and some demos. I will be sharing some of the best practices that I have picked up over the years as a PI/PRO developer. In particular, it will revolve around techniques and tools meant to improve productivity when dealing with custom Java development for PI/PRO. Highlights of the items that will be covered are:

  • Modularisation techniques for coding
  • Release the power of Object Oriented Programming by using Design Patterns
  • Explore Eclipse productivity tools like EGit and JUnit

 

 

The session code is INT37733, and you can also check out the rest of the sessions at Barcelona in the following link.

SAP TechEd Barcelona | November 08-10, 2016 | Home

 

 

This will be my first time ever at a TechEd conference so I'm also really excited at the opportunity to meet face-to-face and network with fellow SAP community members.

 

 

Be there or be square!

 

 

 

 

Images courtesy of:

Dilbert Comic Strip on 1997-10-29 | Dilbert by Scott Adams

No seriously tell me more... | Memes.com

How to hook up Ariba and SAP Business Suite in less than "an hour" using SAP adapter without middleware


Hello all,

 

I wanted to share my experience on using the SAP adapter to connect SAP ECC to the SAP Ariba Network as a buyer, without using middleware.

 

Prerequisites:

 

  • Install Ariba adapter for SAP Business Suite 1.0.
  • Register and obtain Ariba Network ID (ANID) from Ariba (Seller test accounts are free)
  • SOA Manager is configured, the certificate is installed using STRUST, and bgRFC has been set up by Basis (a 1-2 hour activity)


Note: Ariba Network Integration 1.0 for SAP Business Suite consists of 3 components. Download these components and install them on your SAP ERP system. The components are:

  • ARBFNDI1
  • ARBERPI1
  • ARBFNDI2; This component is only required if your SAP ERP system is on EhP4 or higher

Also download all Attribute Change Packages (ACPs) for this product.


Configuration:

Let's get started on the configuration of the system. The config nodes for the SAP Ariba adapter can be found in SPRO >> Integration with Other SAP Components >> SAP Business Suite Integration Component for Ariba. The configuration of the Ariba adapter has 2 parts:

  1. "Framework Settings", which are used to connect SAP to Ariba
  2. "Application-Specific Settings", which are used to configure what is sent to Ariba from SAP.

 

Framework Settings:

This configuration is used to set up the connection framework between the two systems.

  1. Define Credentials and End Points for Ariba Network
    1. Enter your Ariba Network ID (ANID)
    2. Enter the shared secret, which you will need to use in the Ariba Network during enablement
    3. Set the test account flag (only if you are connecting to a test ANID, which ends with XXXXX-T)
    4. Enable end points (by default, end points are not enabled); note: if you are using multiple test systems, you need to enable them here.*
Credentials.JPG
  2. Define Basic Message Settings
    1. Choose New Entries
    2. Add the cXML type for purchase order (ORDR) with direction outbound, mapping version V003 and cXML version 1.2.029*
    3. Add the cXML type for order confirmation (CONF) with direction inbound, mapping version V002, cXML version 1.2.029 and "Send cXML StatusUpdateRequest Message" ticked.*
messageconfig.JPG
  3. Direct Connectivity Settings: please refer to the prerequisites.
    1. Define Settings for Polling Agent; add as per the screenshot below.
Polling.JPG
  4. Map Unit of Measure Codes for cXML Messages
    1. Enter all ISO UOMs that need to be converted to the UN/UOM standard and the other way around.
UoM.JPG

Application-Specific Settings

This configuration is used to set up the sending of documents between SAP and Ariba.

  1. Assign Ariba Network ID to Company Code

Assign CC.PNG    

2. Define Message Output Control


POConfig.PNG
Map.JPG

3.  Define Document-Specific Message Customizing

Doc Setting.JPG   

4. Map Texts of SAP ERP and Ariba Network (map inbound and outbound texts per your requirements)

text.JPG

5. Integration for Buyers

    • Enable Vendors for Ariba Network (activates a new seller ANID if a vendor with that name cannot be found in the Ariba Network, and posts the document)

    quick enable.JPG

    The configuration required for transmitting a purchase order (PO) and receiving a purchase order acknowledgement/confirmation (POA) is now complete.

    Testing the configuration:

     

    • Create a purchase order in SAP and validate that the output generated an XML message.

         PO ouput.JPG

    • Check transaction SXI_MONITOR to validate that the output has been sent successfully.

              Monitor.JPG

    Conclusion

         Who would have thought that you could connect SAP to Ariba so quickly? Good luck prototyping in any of your test environments. Ariba provides test seller accounts for free.


    What's Next

         Over the next few weeks I will add the following blogs related to the SAP Ariba adapter.

    1. Capabilities and limitations of the SAP Ariba adapter's functionality
    2. Gotchas to look out for on the SAP Ariba adapter
    3. How to identify what is mapped and how to extend the XML mapping when using the SAP Ariba adapter
    4. How to handle errors using forward error handling when using the SAP Ariba adapter
    5. Various lessons learnt during the SAP Ariba implementation using the adapter.

     

    * Will be covered in future blogs, where I can explain the required enhancements in detail.

    SAP Ariba Adapter: How to generate the mapping document and map extrinsic segments if required


    Hello All,

     

    In this blog, I share my experience on using standard SAP tools to extract mapping documents and enhance the cXML fields for Ariba.

     

    Prerequisite:

     

    • You are using the SAP adapter to connect from SAP ECC to the Ariba cloud.

     

    Download Mappings

     

    • Firstly, an ABAP program (no transaction is available) can be used to download the mapping between SAP structures and cXML for Ariba in Excel format.
      • Program name: ARBFND_DCXML_EXPORT_TO_EXCEL

     

    Note: SAP Note 1918732 contains the complete cXML structure plus the mappings.

    Excel.JPG

    There are a lot of fields available for use, but not all of them are mapped; e.g. for ORDER_REQUEST there are over 4000 fields.

    fields.JPG

    excelscreen.JPG

    • Secondly, the SAP structures and tables start with ARB and are easy to find or debug.

     

    Enhancement points and user exits

     

    There are 2 enhancement points, with which you can customise the interface to pretty much all needs: one for the message going out and one for the message coming in, each with pre- and post-enhancement options.

     

    Inbound.JPG

    outbound.JPG

     

    The website cXML.org can be used for all cXML related questions.

     

    Good luck!!!

    Gotchas to look out for on the SAP Ariba Adapter


    Hello All,

     

    I would like to share my experience on the features of the SAP Ariba adapter that are available, planned for the future, or were missing during my implementation. My experience is limited to outbound purchase orders, inbound purchase order acknowledgements, inbound shipping notices, supplier invoices and invoice status.

     

    Purchase order

     

    • Standard purchase orders, subcontracting, consignment and service orders are supported.
    • Returns purchase orders are not supported via Ariba (scoped for the future).
    • Purchasing group details are not mapped; they can be mapped to contact details in the XML via an enhancement. Please check my other blog, which details enhancements/exits.
    • The phone and fax number format (area code, city code) is different in Ariba than in SAP and adds extra brackets, so we had to enhance it.
    • Price tolerances on the purchase order are sent to Ariba and influence the inbound purchase order acknowledgement (POA): Ariba can stop purchase order acknowledgements/confirmations based on the tolerance sent from SAP.
    • Freight conditions are not mapped by default; however, there is an estimated freight cost that can be sent to Ariba via XML. We added the sum of all planned freight costs and passed it in a single field in the XML.
    • Some of our suppliers hardcoded the plant number (buyerLocationID) to addresses rather than reading the address details from the XML. This impacted third-party orders sent to suppliers. Hence, for all third-party orders we change the location tag to 'OneTime'. We also updated the mapping information guide for vendors to map the address when they see a 'OneTime' address.

     

    Shipment Notification (ASN)

     

    • A shipment notification creates an inbound delivery only. For third-party orders we needed to post a goods receipt; this had to be enhanced.
    • Serial numbers in SAP are only 18 characters; anything longer is truncated. We enhanced this to read from right to left.
    • If we didn't flag a material as batch-managed but the supplier sent a batch, the data was still populated.
    • Batch numbers in SAP have only 10 characters and had to be validated.

     

    Supplier Invoice

     

    • The Ariba add-on supports invoices and credit memos; however, it doesn't support subsequent credits. We enhanced the logic.
    • All freight is posted as unplanned freight. We added custom logic to handle this.
    • Freight on the invoice header is not supported by the add-on adapter. We added custom logic to handle this.
    • Invoice reference numbers in SAP are always shown in capitals, while Ariba supports both capital and small letters. This can let a duplicate pass the duplicate check in SAP; we enhanced it to convert to capital letters.
    • Payment terms from the supplier override the defaults if they are populated. This means a supplier can override your payment terms.
    • The baseline date calculation had an issue with end-of-month date calculation.
    • Freight on credit memos was enhanced as well.

     

    Invoice status

     

    • Some invoice status error details are not self-explanatory. This may need enhancement.

     

    Hope you find this document useful.

    Debug Outbound Proxy in ECC System - Troubleshoot IS_URL Configuration


    Introduction

    We can send the proxy message to either the integration engine or the adapter engine using the IS_URL parameter in the SXMB_ADM integration engine configuration, as described in the document How to Set Up the Communication between ABAP Backend and SOAP Adapter using XI Protocol. Even when we maintain all the configurations in ECC to route the proxy message to the adapter engine, the message may still reach the ABAP stack, as mentioned in the thread No receiver could be determined for ICO PI 7.3. We just need to make sure all the configurations are in place and guess what went wrong.

     

    In this blog I want to show you how to debug the outbound proxy message to troubleshoot a wrong configuration or other problems in ECC, so that we can identify the exact problem without guessing.

     

    Configuration in PI

    I have created the ICO below in PI; the scenario is PROXY to FILE. I have created all the channels and activated the ICO in the Integration Directory.

    ICO2.png

    Configuration in ECC

    The global configuration of the integration engine in the ECC system points to the ABAP stack of the PI system using an HTTP destination.

    global1.png

    Below is the HTTP destination used above to send the proxy message to the integration server of the PI ABAP stack.

    abap1.png

    But this particular interface is configured as an ICO, and an ICO only runs on the adapter engine, so we need to send the proxy message to the adapter engine of the PI system. I have created a sender ID in transaction SXMSIF, with exactly the same sender component, sender interface and sender namespace as maintained in the ICO.

    sender2.png

    Below is the interface-specific IS_URL configuration in SXMB_ADM -> Integration Engine Configuration, which points to a G type destination.

    is_url.png

    Below is the G type destination used above. It points to the adapter engine of the PI system.

    aae1.png
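    For reference, the interface-specific IS_URL typically points to the destination using the dest:// scheme, while the G type destination carries the adapter engine's XI SOAP endpoint. This follows the how-to guide referenced above; the destination name, host and port below are illustrative:

    IS_URL (sender-ID specific):  dest://AAE_OF_PI
    RFC destination AAE_OF_PI (type G):
        Target host:  pi-host.example.com
        Service no.:  50000
        Path prefix:  /XISOAPAdapter/MessageServlet?ximessage=true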

    Deregistration of Queue

    To be able to debug the outbound proxy message, we need to deregister the outbound proxy queue, which is XBTS*. Go to SMQR, select the XBTS* queue and click the Deregistration button in the application toolbar.

    deregister.png

    After the deregistration is done, we can observe the registration column updated with 'U'.

    unregister.png

     

    Trigger the Message

    Trigger the outbound proxy using the outbound proxy program or a test message from SPROXY.

    report1.png

    After we trigger the outbound proxy, the message is handed over to qRFC and we can see the entry in SMQ2 in the outbound proxy queue XBTS*. As we deregistered the queue, the LUW is not processed automatically; it will be sitting in READY status like below.

    smq2_scheduled.png

     

    Get Class and Method

    Before debugging the message, we need to identify the class and method from the trace of previous messages in SXI_MONITOR. Below is an example message; in the trace you can see the class and method where the interface-specific URL is determined.

     

    Class Name: CL_XMS_PLSRV_IE_ADAPTER

    Method Name: ENTER_PLSRV

    trace2.png

     

    Debug Proxy Message

    Now we can debug the LUW sitting in the queue: select the LUW and click on Debug LUW in the application toolbar.

    debug_luw.png

    The debug session will be started in a separate window. We can set a breakpoint at the specific method identified above: click Breakpoints -> Breakpoint at -> Breakpoint for Method, like below.

    break1.png

    On the next screen, enter the class name and the method name like below, then click on Continue.

    break2.png

    After that, press F8 or Continue in the application toolbar as shown below.

    execute.png

    The system will take you to the above class and method; within the method, line 57 gets the URL of the integration server.

    geturl.png

    Right-click on the line and select Create Session Breakpoint; it will create a session breakpoint on that line. Press F8 again, the control will stop at line 57, and then press F5 to step into that method.

    session_break.png

    Within this method, line 56 gets the list of sender IDs matching the current message header attributes. Create a session breakpoint, press F8, and when the control stops at line 56 press F5 to step into that method.

    getinterfaceid.png

    Within this method, all the configured sender ID values from transaction SXMSIF are read one by one and compared against the current message attributes like party, service, interface and namespace. If a match is found, it is added to the sender ID internal table.

    getIDTab.png

    After the above method finishes, if any entries were found in the sender ID table, the system reads the interface-specific IS_URL for that sender ID using function module SXMB_GET_CONFIG. If the entry is found, it is written to the trace of the message, which we observed initially in SXI_MONITOR. After this, the system sends the proxy message to the adapter engine using the identified IS_URL.

    is_url_final.png

    Below you can find the result of the function module; it identifies the adapter engine URL which we configured in SXMB_ADM.

    iserver_url.png

    If no sender ID is found for this interface based on the message header, the system takes the IS_URL from the global configuration.

    golbal_url.png

    Conclusion

    Using the debug steps described above, we can identify the problem without blind guesses if we have made any wrong configuration for the interface-specific IS_URL. This is just one example; we can debug other issues related to outbound proxies using the same steps. I hope this helps whoever faces a similar problem in the future.


    SAP HCI - Facebook Integration - Part 2


In SAP HCI - Facebook Integration - Part 1 we saw how to obtain the Facebook connectivity artifacts and add them to SAP HCI Secure Parameters. In this blog I will show how to configure the Facebook adapter to read posts and comments.

     

    Integration Flow

     

I have developed a simple Integration Flow triggered every hour. The Request-Reply service call branch invokes the Facebook endpoint and sends the message body as email.

     

    5.PNG

     

    Get Posts Endpoint Configuration:

     

This producer endpoint returns posts (/feed) from a User or Page. In this example I will show how to read the feed from a User page. Configure the receiver channel as below.

     

    1. Choose Endpoint as Get Posts.

2. User / Page ID: This parameter is filled with the User ID copied in Step 6 of the previous blog.

3. OAuth Settings parameters in the channel refer to the Secure Parameter artifact names created in the previous steps.

     

    1.PNG

    2.PNG

     

This service call returns the result as a facebook4j.ResponseList<facebook4j.Post> object in the message body. Below is an example payload from the demo scenario.

    7.png

     

The above message response can be read using a Groovy script, which I will cover in the next blog.
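For illustration, below is a minimal Groovy script sketch that flattens such a response into plain text. It assumes the standard HCI script entry point processData(Message) from com.sap.gateway.ip.core.customdev.util and the Facebook4J Post getters id, createdTime and message; treat it as a sketch under those assumptions, not as the script from the follow-up blog.

    import com.sap.gateway.ip.core.customdev.util.Message
    import facebook4j.Post
    import facebook4j.ResponseList

    // Sketch only: the Facebook receiver is assumed to return a
    // facebook4j.ResponseList<Post> as the message body, as shown above.
    def Message processData(Message message) {
        ResponseList<Post> posts = message.getBody(ResponseList.class)
        def text = new StringBuilder()
        posts.each { Post post ->
            // id, createdTime and message are Facebook4J Post getters
            text.append("${post.id} | ${post.createdTime} | ${post.message}\n")
        }
        message.setBody(text.toString())
        return message
    }

The flattened body could then be sent as-is by the mail step of the Integration Flow.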

     

    Get Post Comments Endpoint Configuration:


This producer endpoint retrieves all the comments of the requested post. It requires a Post ID as a channel parameter, which is retrieved by the Get Posts producer endpoint. Below is the receiver channel configuration.

     

    1. Select Endpoint as Get Posts Comment.

2. Set the Post ID. (Note: In the next blog we will see how to read it from facebook4j.Post using a Groovy script and set it in the channel; a hedged sketch follows the note below.)

    3. OAuth Settings parameters in the channels will refer to the respective Secure Parameters artifact.

    3.PNG

(Note: For demo purposes, the Post ID is hard-coded so the comments can be read using the same Integration Flow.)
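If the Post ID should instead come dynamically from the previous Get Posts call, one conceivable approach is to extract it in a Groovy script step and expose it as a message header. This is only a hedged sketch: the header name postId, and the assumption that the channel's Post ID field can reference it as ${header.postId}, are mine and are not confirmed by this blog.

    import com.sap.gateway.ip.core.customdev.util.Message
    import facebook4j.Post
    import facebook4j.ResponseList

    // Hypothetical sketch: store the first post's ID in a header, so that a
    // later channel parameter could, in theory, reference ${header.postId}.
    def Message processData(Message message) {
        ResponseList<Post> posts = message.getBody(ResponseList.class)
        if (!posts.isEmpty()) {
            message.setHeader("postId", posts[0].id)
        }
        return message
    }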

     

This service call returns the result as a facebook4j.ResponseList<facebook4j.Comment> object in the message body. Below is an example payload from the demo scenario.

    8.png
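Analogous to the posts example above, a minimal Groovy sketch for flattening the comment list might look as follows. It assumes the Facebook4J Comment getters id, from and message; again, a sketch under those assumptions.

    import com.sap.gateway.ip.core.customdev.util.Message
    import facebook4j.Comment
    import facebook4j.ResponseList

    // Sketch only: flatten the ResponseList<Comment> body into readable text.
    def Message processData(Message message) {
        ResponseList<Comment> comments = message.getBody(ResponseList.class)
        def text = new StringBuilder()
        comments.each { Comment c ->
            // from?.name guards against comments without an author object
            text.append("${c.id} | ${c.from?.name} | ${c.message}\n")
        }
        message.setBody(text.toString())
        return message
    }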

    SAP HCI - Facebook Integration - Part 1


SAP HCI is delivered with a receiver Facebook adapter. The adapter is based on the Apache Camel Facebook component, which uses the Facebook4J Java library to invoke the Facebook Graph API.

     

The Apache Camel Facebook component has methods to perform read, add and delete operations on most Facebook objects like posts, likes, comments, photos, albums etc. However, the SAP HCI Facebook adapter only supports reading user, post and comment objects with the three endpoints below. Also see the SAP Help documentation.


        • Get Posts
        • Get Post Comment
        • Get Users

     

In these blogs I will explain the technical steps to configure a communication channel for Facebook integration. Part 1 explains the steps required to generate the connectivity artifacts needed for channel configuration, and SAP HCI - Facebook Integration - Part 2 explains the configuration steps in the Integration Flow. I do not intend to cover a business use case of Facebook integration in this blog, just the technical configuration steps.

     

    Register Facebook Application

     

The Facebook Graph API uses Facebook Login and access tokens to control access permissions. So the first step is to register a Facebook application and obtain the below artifacts to configure the Facebook adapter.

     

        • Application ID
        • Application Secret
        • Access Token

           

    Steps:


1. Log in to https://developers.facebook.com/ with your Facebook account, select "Create New App" and then Website.

2. Select "Skip and Create App ID", type a name for your app, complete the other details and select Create App ID.

3. Navigate to Settings --> Basic and copy the App ID and App Secret.

     

         3.png

     

    4. Navigate to "Access Token Tool" using URL https://developers.facebook.com/tools/accesstoken/.

5. Copy the App Token (does not expire) of the application created. Also generate the User Token (valid for an hour; it can be regenerated).

    4.png

Note: Either the App Token or the User Token can be used as the Access Token. However, the User Token expires and needs to be assigned permissions (scopes) like user_posts, public_profile etc. in User Data Permissions.

     

6. Click the Debug button next to the User Token and copy the User ID from the next page.

    5.png

     

    Creating Secure Parameter


The next step is to add the connectivity artifacts as secure parameters.

     

    From NWDS:

     

    1. Open Integration Operation perspective.

    2. Right Click on the Tenant Node and Select Deploy Artifact.

    3. Select Secure Parameter.

4. Provide a name (e.g. FacebookAppID) and copy-paste the App ID.

    5. Repeat the same steps for App Secret and Access Token.


In the next blog, SAP HCI - Facebook Integration - Part 2, we will see the adapter configuration steps to read data from Facebook.

    Split architecture allows customers to upgrade to NetWeaver 7.5


Many customers are migrating from dual-stack SAP Process Integration (PI) to SAP Process Orchestration (PO) systems, as SAP road maps indicate that support for dual-stack systems will end soon. In this blog I will explain how customers can leverage a dual-stack split of the system so that they don't have to invest time and money in converting complex ccBPM to NetWeaver BPM, for which there is no direct migration path. Whether to upgrade from dual stack to single stack depends on many factors, as described below.

     

    Case 1: Migration to PO 7.5 single stack


• Low-complexity landscape with no complex scenarios
• No BPM scenarios
• Usage of classical scenarios
• No legacy system interfacing using BPM
• No Java mappings
• No XSLT mappings
• No ABAP mappings
• No local tables maintained in the PI ABAP stack

     

    Case2: Upgrade to 7.5 with split architecture to run dual stack environment


• Complex landscape connecting many legacy systems
• Complex BPMs connecting 3rd-party systems
• High usage of Java mappings
• High usage of ABAP mappings
• High usage of XSLT mappings
• High usage of BPM in the landscape
• Usage of Integrated Configuration Objects (ICOs)

     

In case 1 there are no issues, as it would be a straightforward migration from dual to single stack, which can be accomplished using the SAP migration tool as shown below.

    NWSplit.png

In case 2 there is more complexity involved due to multiple integration touch points. We can still perform the migration from dual to single stack, but it would be expensive and not straightforward, which will raise many questions from stakeholders with respect to TCO and ROI. In this situation SAP has provided some relief to customers running on dual stack, allowing them to upgrade to NW 7.5 and still use a dual-stack architecture.

     

Dual-stack SAP PI systems with a release ≤ 7.4 remain supported as dual-stack systems – new installations, updates and upgrades to these releases are still supported.

     

Installation of SAP PI 7.5 and higher: For SAP systems based on SAP NetWeaver 7.5 and higher, dual stack is no longer supported, without exception. As a consequence, no dual-stack installation is offered as of SAP PI 7.5 – instead, the standard installation consists of a separate ABAP stack plus a separate Java stack.

     

• You install Application Server ABAP for SAP Process Integration; there, the Java users for the AS Java of the SAP Process Integration system are also created, and the system is prepared to be connected to AS Java.

     

• You install Application Server Java for SAP Process Integration; the AS Java of the SAP PI system uses the User Management Engine (UME) of the AS ABAP of the SAP PI system, which you must have installed before.

     

Upgrade to SAP PI 7.5 SP1 and higher: After upgrading to SAP PI 7.5, you first have to split any still-existing dual-stack SAP PI systems before their usage is supported – for this, the dual-stack split procedure is now also offered for SAP PI 7.5. For more information please refer to the link below: http://sapassets.edgesuite.net/sapcom/docs/2015/07/96224dc2-5b7c-0010-82c7-eda71af511fa.pdf

     

    Hope this helps!

    How to use external XSLT processor for PI Dual stack installations.


As many of us know, one of the new features introduced in SAP NetWeaver 7.3 EHP1 is the possibility of using a custom XSLT processor to support XSLT 2.0 transformations.

     

There is a wonderful blog written by Chris Christoffersen which describes the steps that need to be done in order to enable this new feature in PI/PO systems.

     

One thing: some users reported in the comments that, after applying all necessary settings and performing all steps, the external processor still wasn't involved in XSLT mappings.

     

After some investigation, including attempts with different XSLT processors' JARs, the hint was found within the SAP Help page describing this new feature:

     

    To enable the use of external transformers, a new property has been introduced in the exchange profile.

     

Yes, the key difference between PI dual-stack and PI AEX/PO systems is that a PI dual-stack system reads the parameter com.sap.aii.ibrun.server.mapping.externalTransfomer from the Exchange Profile, while a PI AEX/PO system takes this parameter from the service XPI Service: AII Config Service.

     

Thus, to enable the use of an external XSLT processor in a PI dual-stack system, we should access the Exchange Profile via the following URL:

     

    http://server:port/webdynpro/dispatcher/sap.com/com.sap.xi.exprofui/XIProfileApp

     

find the parameter in the parameter tree:

     

    ExtXSLTProc.png

     

and set its value to "true" (mark the checkbox).

     

Voilà! Now our PI dual-stack system is ready to work with XSLT 2.0 transformations.

     

    Regards, Evgeniy.

    Michal's Tips: Stop testing your interface scenarios - you're not doing it anyway right ?


When we start a project, functional consultants and developers are both responsible for describing a set of test scenarios which always need to be executed to check if the interface is working properly. Functional consultants will put in all important business scenarios which need to work, and developers will update those with cases where they know the interface is being developed in a complex way (multiple lines, summarizations and other complex mapping logic). Thanks to this cooperation we can get a pretty decent subset of integration scenarios which, once run, will make sure the interface scenario is working perfectly. Running all of the prepared test scripts needs to happen in a few project phases:

     

    a) during the first integration testing phase (when the interface is being executed end to end for the first time ever)

     

    b) after we implement each change to the interface scenario during integration testing, user acceptance testing and any other testing phase which may be performed in between those two but before the first golive

     

    c) after golive when we need to fix any existing scenario or add any new functionality to it

     

What does that look like in reality (at least from my 12 years of experience with >25 clients)?

     

    a) during the first integration testing phase we need to check all possible scenarios, otherwise the interface would not work

     

b) after we implement each change to the interface scenario, we're usually in the middle of "rapid" development where everything needs to be finished ASAP, and in many cases the development was already approved, so testing is only run with a subset of the subset (a maximum of 1-2 test scripts)

     

c) after golive, when we need to fix any existing scenario or add any new functionality to it, we have a few choices:

- hot fix - needs to be done immediately (ASAP is too slow) - so we fix, run one test case and move to production (praying that it will not cause failures in any other scenario)

- new functionality - depending on the possible lead time - a small change is either implemented if the lead time is short (meaning we don't test too much), or we don't implement the change at all (as the testing team needs to run all possible test scripts and it takes 10 days to do so, the business realizes they can live without the change - sad, but it also happens)

     

     

    What does that mean in reality? That we only have two choices:

     

a) we can push for running all prepared test scripts, but risk huge project delays or simply reject any changes to the existing interface scenarios

     

b) we can stop testing (see the article's title), run one or two test scripts and keep on praying when we transport to the production environment

     

What is the reason for that? I've asked myself the same question many times and I came to the conclusion that it's the lack of interface scenario testing tools. I'm not saying that they don't exist, I'm only saying that they do not respond to the needs of both business and developers. What would those two groups need? I'm hoping for your input on the same, but let me just present my short list.

     

    Developers:

     

a) being able to run the full set of interface scenario tests with a single click after implementing each change, without waiting for anyone else (especially from the business)

b) not needing to go to any transaction/entry screen, as module knowledge cannot be mandatory to retest an interface after a change

    c) being able to test the interface both on development and on quality boxes (not only on quality after the change is transported)

     

    Business:

     

a) being able to record a test script case from any existing document which was processed in the past and posted correctly, without the need to recreate it

    b) being able to be sure that all of the fields will always be validated (and not only the ones selected during the initial test script preparation)

c) test script execution in the background every day, validating all transports and changes done by the development teams (as the latter can often change and may not be aware of what needs to be retested from the technical perspective)

     

     

    Request:

     

Does anyone have any input on this topic? It would also be possible for me to organize a session (SAP Mentor expert table) at SAP TechEd 2016 (Barcelona or Las Vegas) if someone is interested in discussing how to test/retest integration scenarios, or in showing how it's done at their company. I'd kindly ask you to provide any input if you think this is a valid but not much-discussed topic.

     

     

    Important info:

If the testing process looks completely different than described, please do let me know, as I can only tell it from what I've experienced.
