
Retaining Mappings in ICS when Endpoints Change


ICS (Integration Cloud Service) is a PaaS offering from Oracle that provides the capability to integrate applications both in the cloud and on-premise. With any integration, there is a requirement to version the integration flows/services, because endpoint schemas change from time to time due to application changes and business needs. When the endpoints change, the integration flow also needs to change to make use of the new schema fields. ICS provides capabilities to version integration flows, which is a great way to add new versions of services/integration flows while keeping the old ones intact.

When a new version of an integration flow is created, there is a need to retain the old mappings and create new mappings only for the new fields. ICS supports this through the "Regenerate" endpoint functionality.

This blog demonstrates the regenerate endpoint functionality.

Sample Integration Flow

Below is a simple ICS integration flow to demonstrate the regenerate endpoint functionality. The integration flow has a SOAP source endpoint and REST destination endpoint. The integration flow was created on version 16.1.5 of ICS.

Flow Overview

figure1

Mapping

The integration flow has a simple Request mapping as shown below.

figure2

figure3

WSDL

The source SOAP endpoint has two input fields, input1 and input2, as shown below.

figure4

Regenerate Endpoint WSDL

Change WSDL

Assume that the SOAP endpoint schema changes and two more fields, input3 and input4, are added as shown below.

figure5
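For illustration, the relevant fragment of the updated inline schema might look like the following (a sketch; only the field names input1–input4 come from the screenshots, the enclosing element and types are assumed):

<xsd:element name="process">
  <xsd:complexType>
    <xsd:sequence>
      <xsd:element name="input1" type="xsd:string"/>
      <xsd:element name="input2" type="xsd:string"/>
      <!-- newly added fields -->
      <xsd:element name="input3" type="xsd:string"/>
      <xsd:element name="input4" type="xsd:string"/>
    </xsd:sequence>
  </xsd:complexType>
</xsd:element>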

Import New WSDL

After the above fields are added, a new WSDL is available. One may then choose to clone the existing integration, or edit it in place if versioning is not required. After this, the connection needs to be updated with the new WSDL.

figure7

Updating the WSDL and trying to save the connection pops up a warning that the connection being modified is used by one or more integrations and that the integrations need to be reactivated for the changes to take effect. Click Yes.

figure8

Regenerate Endpoint

To regenerate the endpoint WSDL, open the integration flow and click the source endpoint (the endpoint whose schema was updated). To retain the mappings, click the regenerate icon as shown below.

figure9

This pops up a confirmation message asking whether you would like to proceed with the regeneration. Click Yes.

figure10

After confirming, a message indicating that the regeneration was successful is shown, as below.

figure11

By clicking on the mappings, one can see that the old mappings have been retained and the new fields added.

figure12

Edit Endpoint Vs Regenerate Endpoint

Editing the endpoint instead of regenerating it does not retain the mappings, as shown below.

figure13

After clicking on the edit icon the following screens pop up.

figure14

figure15

As seen below, the color of the mapping activities changes from green to grey. This indicates that both request and response mappings are lost on edit.

figure17

Summary

The regeneration of endpoints is supported for all endpoints. This blog demonstrated how to regenerate endpoints when the schema of an application changes. This functionality comes in handy when there are incremental schema changes. The regenerate endpoint functionality helps retain the existing mappings, thereby saving the effort of redoing all mappings on schema changes.


Using Event Handling Framework for Outbound Integration of Oracle Sales Cloud using Integration Cloud Service


Introduction:

Oracle's iPaaS solution is the most comprehensive cloud-based integration platform in the market today. Integration Cloud Service (ICS) gives customers an elevated user experience that makes complex integration simple to implement.

Oracle Sales Cloud (OSC) is a SaaS application and is a part of the comprehensive CX suite of applications. Since OSC is usually the customer master and is the center for all Sales related activities, integration with OSC is often a requirement in most use cases.

Although OSC provides useful tools for outbound as well as inbound integration, it is a common practice to use ICS as a tool to integrate OSC and other SaaS as well as on-premises applications. In this article, I will explore this topic in detail and also demonstrate the use of Event Handling Framework (EHF) in OSC to achieve the same.

Main Article:

Within ICS you can leverage the OSC adapter to create an integration flow. OSC can act either as the source (inbound) or as the target (outbound) for integration with other SaaS or on-premises applications, with ICS in the middle acting as the integration agent. While the inbound integration flow is triggered by the source application, invoking the outbound flow is the responsibility of OSC.

InboundIntegration OutboundIntegration

In this article, I will discuss the outbound flow, where OSC acts as the source and other applications serve as the target. There are essentially 2 ways of triggering this integration:

  • Invoking the ICS integration every time the object to be integrated is created or updated. This can be achieved by writing groovy code inside the create/update triggers of the object and invoking the flow web service, passing in the payload (see the sketch after this list).
  • Using the Event Handling Framework (EHF) to generate an update or create event on the object and notify the subscribers. In this case, ICS registers itself with OSC and gets notified, along with the payload, when the event fires.
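For the first option, the trigger body is only a few lines of groovy. A minimal sketch, assuming the ICS flow's SOAP endpoint has already been registered in Application Composer as a web service named ICSContactFlow (the service name, operation, and payload fields are hypothetical):

// After-create trigger on the Contact object (Application Composer)
// Build the payload from fields of the record that was just created ...
def payload = [PartyId: PartyId, LastName: LastName]
// ... and invoke the registered ICS integration endpoint with it
adf.webServices.ICSContactFlow.process(payload)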

 

OSC supports events for the most important business objects, such as Contact, Opportunity, and Partner. More objects are being enabled with EHF support on a continuous basis.

In this article, I will demonstrate how to use EHF to achieve an outbound integration. We will create a flow in ICS which subscribes to the "Contact Created" event and, on being notified of the event, updates the newly created contact object. While this integration is quite basic, it demonstrates the concept. Although we use Update Contact as the target of our integration, you could use another application (for example Siebel or Service Cloud) as the target and create a Contact there.

Integration

 

Detailed steps:

Before starting, let's identify some URLs. For the example, we will need two URLs – one for the CommonDomain and one for the CRMDomain. You can find these in Review Topology under Setup and Maintenance.

CRM_URL FS_URL

The URLs will be of the following form:

CommonDomain: https://<instance_name>.fs.us2.oraclecloud.com

CRMDomain: https://<instance_name>.crm.us2.oraclecloud.com

I will refer to these URLs as COMMON_DOMAIN_URL and CRM_DOMAIN_URL in the rest of the article.

Let's now move on to configuring our environment and creating an example integration based on events.

The first step is to create a CSF key so that Sales Cloud can connect to ICS and invoke the subscriptions. In R11, this can be achieved through SOA Composer. To access SOA Composer, navigate to <CRM_DOMAIN_URL>/soa/composer

Inside SOA Composer, click on "Manage Security" to open the "Manage Credentials" dialog. The name of the csf-key should be the same as the identity domain of the ICS instance. Provide the username and password of the user that OSC should use to invoke ICS subscriptions.

Note: Customers didn’t have this ability in R10 and it had to be done by the operations team.

001_CSF_Key

Log in to the ICS Console and, on the home page, click Create Connections followed by Create New Connection.

01_ICS_Home_Page

02_Connections

Click Select under Oracle Sales Cloud

 

03_Create_Connection

Provide a unique name and identifier for the connection. Optionally, provide a detailed description. Click Create

04_New_Connection

 

You will see a prompt that the connection was created successfully and will automatically be taken to the connection details page, which also tracks your progress. Click Configure Connectivity.

05_Connection_Created

In the Connection Properties page, provide details as follows:

OSC Services Catalog WSDL URL: <COMMON_DOMAIN_URL>/fndAppCoreServices/ServiceCatalogService?wsdl

OSC Events Catalog URL: <CRM_DOMAIN_URL>/soa-infra

06_Connection_Properties

Click Configure Credentials

07_Configure_Credential

Provide the credentials of the service user that will be used for the integration and click OK.

08_Credentials

The Connection details page shows the connection is 85% complete. The only step remaining at this point is to test the connection to make sure all the details provided are correct. Click on Test.

09_Test

If all the provided details are correct, you will see a message confirming the test was successful. The progress indicator also shows 100%. At this point, click Save and then Exit Integration.

10_Test_Successful

You see a confirmation that the connection was saved successfully. You can also see the new connection in the list.

11_Connections

The next step is to use this connection to create an integration. Click on Integrations followed by Create New Integration.

12_Create_Integration

In the Create Integration – Select a Pattern dialog, click Select under Map My Data. You may choose a different pattern based on your integration requirements, but for this example we will use the Map My Data pattern.

13_Select_Pattern

In the New Integration – Information dialog provide the unique name and identifier for this integration, an appropriate version number, and optionally a package name and description.

14_Integration_Information

Drag and drop the connection that we created onto the source. This opens the Configure Sales Cloud Endpoint wizard.

15_Integration_Created

In the Configure Sales Cloud Endpoint wizard, provide the name, and optionally a description of the endpoint. Click Next.

16_Configure_Sales_Cloud_EP

In the section titled Configure a Request, choose With Business Events to create this integration using Business Events in OSC. For this example, we will use the Contact Created Event, which fires when a contact is created in OSC. Click Next.

17_Pick_Event

In the next screen, under the section titled Response Type, choose None and click Next.

18_Response

The wizard shows the endpoint summary. Review the details and click Done.

19_EP_Summary

Now we have to create a target endpoint. Usually this target will be another application that we are integrating with OSC. For our example, we will simply use OSC as a target application itself. Drag and drop the OSC connection we created earlier into the target.

20_EP1_Done

In the Configure Sales Cloud Endpoint wizard, provide the name, and optionally a description of the endpoint. Click Next.

21_Configure_Sales_Cloud_EP

Under the section titled Select a Business Object, find the Contact object and click on it. The drop-down below lists the operations this object supports. For this example, choose updateContact and click Next.

22_Pick_Business_Object

The wizard shows the endpoint summary. Review the details and click Done.

23_EP2_Summary

Now we need to map the source payload to the target payload. Click the Map icon followed by the "+" icon to create a mapping.

24_EP1_Done

In the mapping wizard, you can specify the appropriate mapping. For our example, we will use a very simple mapping to update the PreviousLastName with the value of LastName we received in the payload. This doesn’t add a lot of value, but serves the purpose of illustrating an end-to-end integration. Drag and drop PartyId to PartyId from source to target and LastName to PreviousLastName from source to target. Click Save and Exit Mapper.

25_Map1
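Under the covers, the mapper persists this mapping as XSLT. The generated fragment is roughly equivalent to the following sketch (the source paths and target element names are illustrative; the real ones come from the OSC event and Contact schemas):

<ContactRow>
  <!-- copy the key of the contact that raised the event -->
  <PartyId>
    <xsl:value-of select="/ContactCreatedEvent/PartyId"/>
  </PartyId>
  <!-- store the received LastName in PreviousLastName -->
  <PreviousLastName>
    <xsl:value-of select="/ContactCreatedEvent/LastName"/>
  </PreviousLastName>
</ContactRow>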

The integration details page shows our integration is 77% complete. One final step is to add tracking fields, which allow us to identify the various instances of the integration. Click on Tracking.

26_Tracking

Drag and drop appropriate fields from Source into tracking fields and click Done.

27_Tracking_Identifiers

Now our integration is 100% complete. We can optionally choose an action for the response and fault. For our example, we will skip this step. Click on Save followed by Exit Integration.

28_integration_Complete

The ICS console shows the integration was saved successfully. The newly created integration also shows up in the list of integrations. Click Activate to activate this integration.

29_Integration_Saved

In the confirmation dialog, click Yes.

30_Activation_Confirmation

Once the integration is active, a subscription for it is created in OSC. You can review this subscription, as well as all the other subscriptions by invoking the following URL from your browser:

<CRM_DOMAIN_URL>/soa-infra/PublicEvent/subscriptions

31_Subscriptions
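The same list can also be fetched from the command line, for example with curl (supply a valid OSC user):

curl -u <username>:<password> <CRM_DOMAIN_URL>/soa-infra/PublicEvent/subscriptions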

You can now create a Contact in Sales Cloud and it will almost instantaneously be updated with the new value of Previous Last Name.

 

Enhancing ICS Mappings with Custom Java Classes


Introduction

One of the most common tasks performed during the creation of integrations in ICS (Integration Cloud Service) is the implementation of mappings. In a nutshell, mappings are the resources that ICS uses to allow messages coming from the configured source application to be sent to the configured target application. Failure in properly defining and configuring these mappings directly impacts how integrations are going to behave while sending messages downstream.

In order to build mappings in ICS, users make use of the mapping editor. The mapping editor allows for the creation of complex XPath expressions via an intuitive drag-and-drop interface. Besides the support for XPath expressions, it is also possible to use built-in XSLT functions available within the Mapping Components section of the mapping editor, as shown in figure 1.

fig00-creating-mappings-editor

Figure 1: ICS mapping editor with the functions palette expanded.

However, it is not uncommon to find situations in which the set of built-in functions is not adequate to perform a specific data handling operation. When that happens, most people using ICS feel they've hit a roadblock because there is no way to simply add a custom function. While there is always the possibility to open an SR (Service Request) with Oracle and request an enhancement, sometimes this is not feasible because the ongoing project requires at least a workaround in order to finish the use case in a timely manner.

This blog is going to show how classes from ICS’s Fusion Middleware foundation can be leveraged to provide custom data handling in mappings. To illustrate this, the following sections will show how to perform Base64 data decoding, using a utility class from the Oracle WebLogic API.

Programming in XSLT Directly

In contrast to what many people think, ICS is not a black box. You can access pretty much everything that is generated by ICS when you export the integration, as shown in figure 2. Once you have access to the integration archive file, you can see what ICS generated for you and in case of mappings, even change it.

fig01-exporting-integration

Figure 2: Generating an integration archive.

With this option in mind, most people who are familiar with programming in XSLT feel more comfortable handling each mapping directly, using its own programming constructs, provided of course that they are valid under the XSLT specification. In order to write XSLT code for the mappings, the first thing you need to do is locate the .XSL file that handles the mapping within the integration archive structure.

Be aware that .XSL files are only generated if you perform at least one initial mapping. Therefore, you must create a mapping using the visual editor to generate the file. Once generated, you can change it to suit your needs.

Typically, the .XSL files within the integration archive follow this path pattern:

$ROOT/icspackage/project/$INTEGRATION_NAME_VERSION/resources/processor_XXX/resourcegroup_XXX
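Concretely, after exporting and extracting the archive (the .iar file is a plain zip), the mapping files can be listed from the command line (a sketch; the archive name is hypothetical):

unzip MY_INTEGRATION_01.00.0000.iar -d my_integration
find my_integration/icspackage -name "*.xsl"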

Each mapping in the integration generates a processor_XXX folder. For example, in a typical SOAP-based request-reply integration, there will be at least three mappings: one for the request flow, one for the response flow and one for the fault flow. In this particular case, you can expect that there will be three processor_XXX folders under the resources folder. “XXX” in this context will be a system-generated identifier for each processor, which has the responsibility to uniquely identify that component within the integration. Figure 3 shows an example of a generated .XSL file. We are going to change that file in order to include a Java native function that performs Base64 decoding.

fig02-mapping-before-change

Figure 3: Example of .XSL generated file.

First, you should notice that there is a comment in the .XSL file that states where you should make any code changes: the "User Editing allowed BELOW this line" code comment. Unless you know exactly what you are doing, please do not modify other portions of the code, or you will end up breaking it and will probably experience runtime errors.

Second, before calling any custom function in XSLT, you need to provide the namespace that defines that function. Specifically, you need to provide a namespace declaration that specifies which Java class is being used for the custom function. Figure 4 shows how to declare a custom function for the Base64 decoding use case, and how to use that function within the XSLT code.

fig03-mapping-after-change

Figure 4: Defining and using custom functions in XSLT.

The namespace declaration should specify a fully qualified Java class and associate this class with a prefix. In the example shown in figure 4, the weblogic.apache.xerces.impl.dv.util.Base64 class was associated with the "b64" prefix, so it can be used across the XSLT script through this prefix. Keep in mind that not all Java classes can be used as custom functions. In order to work, the class should expose only static methods, and the data types used must be simple types present in the XML/XSLT specification. Be aware of data type conversion as well: the values passed as arguments to the functions must match the types defined in the method, just as the value returned by the method must match the enclosing tag that will receive the value.
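Putting this together, a stylesheet using the custom function looks roughly like the sketch below (element names are illustrative; the namespace URI follows the Oracle XSLT convention of mapping a prefix to a fully qualified Java class, as shown in figure 4):

<xsl:stylesheet version="1.0"
    xmlns:xsl="http://www.w3.org/1999/XSL/Transform"
    xmlns:b64="http://www.oracle.com/XSL/Transform/java/weblogic.apache.xerces.impl.dv.util.Base64">
  <xsl:template match="/">
    <!-- User Editing allowed BELOW this line -->
    <decodedPayload>
      <!-- calls the class's static decode() method through the b64 prefix -->
      <xsl:value-of select="b64:decode(/request/encodedPayload)"/>
    </decodedPayload>
  </xsl:template>
</xsl:stylesheet>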

Third, you have to save all changes back to the integration archive, and re-import the archive into ICS. When you do this, you can continue your development work using the ICS UI, or just activate it to start testing it.

fig04-importing-integration

Figure 5: Importing the integration archive.

While applying this technique in your integrations, please do so with care. Be particularly careful about which Java classes you choose to use. Oracle provides no guarantee that a specific class will always be available in ICS's Fusion Middleware foundation. For this reason, you may want to prefer classes that belong to the JDK and/or classes that you are relatively certain are available under the ICS runtime architecture.

EDI Processing with B2B in hybrid SOA Cloud Cluster integrating On-Premise Endpoints


Executive Overview

SOA Cloud Service (SOACS) can be used to support the B2B commerce requirements of many large corporations. This article discusses a common use case of EDI processing with Oracle B2B within SOA Cloud Service in a hybrid cloud architecture. The documents are received and sent from on-premise endpoints using SFTP channels configured using SSH tunnels.

Solution Approach

Overview

The overall solution is described in the diagram shown here.

B2BCloudFlow

An XML file with PurchaseOrder content is sent to a SOACS instance running in Oracle Public Cloud (OPC) from an on-premise SFTP server.

The XML file is received by an FTP Adapter in a simple composite for hand-off to B2B. The B2B engine within SOACS then generates the actual EDI file and transmits it over an SFTP delivery channel back to an on-premise endpoint.

In reality, the endpoint can be any endpoint inside or outside the corporate firewall. Communication with an external endpoint is trivial and hence left out of the discussion here. Using the technique of SSH tunnels, the objective here is to demonstrate the ease with which any on-premise endpoint can be seamlessly integrated into the SOA Cloud Service hybrid solution architecture.

Our environment involves a SOACS domain on OPC with 2 managed servers. Hence, the communication with an on-premise endpoint is configured using SSH tunnels as described in my team-mate, Christian Weeks’ blog on SSH tunnel for on-premises connectivity in SOA Cloud clusters[1].

If the SOACS domain contains only a single SOACS node, then a simpler approach can also be used to establish the on-premise connectivity via SSH tunneling, as described in my blog on simple SSH tunnel connectivity for on-premises databases from SOA Cloud instance[2].

The following sections walk through the details of setting up the flow for a PurchaseOrder XML document from an on-premise back-end application, like eBusiness Suite, to the 850 X12 EDI generated for transmission to an external trading partner.

Summary of Steps

  • Copy the private key of SOACS instance to the on-premise SFTP server
  • Update the whitelist for SOACS compute nodes to allow traffic flow between the SOACS compute nodes and the on-premise endpoints via the intermediate gateway compute node, referred to as CloudGatewayforOnPremTunnel in the rest of this post. This topic has also been extensively discussed in Christian's blog[1].
  • Establish an SSH tunnel from the on-premise SFTP Server (OnPremSFTPServer) to the Cloud Gateway Listener host identified within the SOA Cloud Service compute nodes (CloudGatewayforOnPremTunnel). The role of this host in establishing the SSH tunnel for a cluster has been extensively discussed in Christian's blog[1]. This SSH tunnel, as described, will specify a local port and a remote port. The local port will be the SFTP server's listening port (default 22) and the remote port can be any port that is available within the SOACS instance (e.g. 2522).
  • Update FTP Adapter’s outbound connection pool configuration to include the new endpoint and redeploy. Since we have a cluster within the SOA Cloud service, the standard JNDI entries for eis/ftp/HAFtpAdapter should be used.
  • Define a new B2B delivery channel for the OnPremise SFTP server using the redirected ports for SFTP transmission.
  • Develop a simple SOA composite to receive the XML payload via the FTP Adapter and hand it off to B2B using the B2B Adapter.
  • Deploy the B2B agreement and the SOA composite.
  • Test the entire round-trip flow for generation of an 850 X12 EDI from a PurchaseOrder XML file.

sftpTunnel

Task and Activity Details

The following sections will walk through the details of individual steps. The environment consists of the following key machines:

  • SOACS cluster with 2 managed servers and all the dependent cloud services within OPC.
  • A compute node within SOACS instance is identified to be the gateway listener for the SSH tunnel from on-premise hosts (CloudGatewayforOnPremTunnel)
  • Linux machine inside the corporate firewall, used for hosting the On-Premise SFTP Server (myOnPremSFTPServer)

I. Copy the private key of SOACS instance to the on-premise SFTP server

When a SOACS instance is created, a public key file is uploaded for establishing SSH sessions. The corresponding private key has to be copied to the SFTP server. The private key can then be used to start the SSH tunnel from the SFTP server to the SOACS instance.

Alternatively, a private/public key can be generated in the SFTP server and the public key can be copied into the authorized_keys file of the SOACS instance. In the example here, the private key for the SOACS instance has been copied to the SFTP server. A transcript of a typical session is shown below.

slahiri@slahiri-lnx:~/stage/cloud$ ls -l shubsoa_key*
-rw------- 1 slahiri slahiri 1679 Dec 29 18:05 shubsoa_key
-rw-r--r-- 1 slahiri slahiri 397 Dec 29 18:05 shubsoa_key.pub
slahiri@slahiri-lnx:~/stage/cloud$ scp shubsoa_key myOnPremSFTPServer:/home/slahiri/.ssh
slahiri@myOnPremSFTPServer's password:
shubsoa_key                                                                                100% 1679        1.6KB/s     00:00
slahiri@slahiri-lnx:~/stage/cloud$

On the on-premise SFTP server, login and confirm that the private key for SOACS instance has been copied in the $HOME/.ssh directory.

[slahiri@myOnPremSFTPServer ~/.ssh]$ pwd
/home/slahiri/.ssh
[slahiri@myOnPremSFTPServer ~/.ssh]$ ls -l shubsoa_key
-rw-------+ 1 slahiri g900 1679 Jan  9 06:39 shubsoa_key
[slahiri@myOnPremSFTPServer ~/.ssh]$

II. Create whitelist entries to allow communications between different SOACS compute nodes and on-premise SFTP server

The details about the creation of a new security application and rule have been discussed extensively in Christian's blog[1]. For the sake of brevity, just the relevant parameters for the definition are shown here. These entries are created from the Compute Node Service Console under the Network tab.

Security Application
  • Name: OnPremSFTPServer_sshtunnel_sftp
  • Port Type: tcp
  • Port Range Start: 2522
  • Port Range End: 2522
  • Description: SSH Tunnel for On-Premises SFTP Server
Security Rule
  • Name: OnPremSFTPServer_ssh_sftp
  • Status: Enabled
  • Security Application: OnPremSFTPServer_sshtunnel_sftp (as created in last step)
  • Source: Security Lists – ShubSOACS-jcs/wls/ora-ms (select entry that refers to all the managed servers in the cluster)
  • Destination: ShubSOACS-jcs/lb/ora_otd (select the host designated to be CloudGatewayforOnPremTunnel, which could be either the DB or LBR VM)
  • Description: ssh tunnel for On-Premises SFTP Server

III. Create an SSH Tunnel from On-Premise SFTP Server to the CloudGatewayforOnPremTunnel VM’s public IP

Using the private key from Step I, start an SSH session from the on-premise SFTP server host to the CloudGatewayforOnPremTunnel, specifying the local and remote ports. As mentioned earlier, the local port is the standard port for SFTP daemon, e.g. 22. The remote port is any suitable port that is available in the SOACS instance. The syntax of the ssh command used is shown here.

ssh -R :<remote-port>:<host>:<local port> -i <private keyfile> opc@<CloudGatewayforOnPremTunnel VM IP>

The session transcript is shown below.

[slahiri@myOnPremSFTPServer ~/.ssh]$ ssh -v -R :2522:localhost:22 -i ./shubsoa_key opc@CloudGatewayforOnPremTunnel
[opc@CloudGatewayforOnPremTunnel ~]$ netstat -an | grep 2522
tcp        0      0 127.0.0.1:2522              0.0.0.0:*                   LISTEN
tcp        0      0 ::1:2522                         :::*                            LISTEN
[opc@CloudGatewayforOnPremTunnel ~]$

After establishing the SSH tunnel, the netstat utility can confirm that the remote port 2522 is enabled in listening mode within the Cloud Gateway VM. This remote port, 2522 and localhost along with other on-premises SFTP parameters can now be used to define an endpoint in FTP Adapter’s outbound connection pool in Weblogic Adminserver (WLS) console.

IV. Define a new JNDI entry for FTP Adapter that uses the on-premise SFTP server via the SSH  tunnel

From the WLS console, under Deployments, update the FtpAdapter application by defining parameters for the outbound connection pool JNDI entry for clusters, i.e. eis/Ftp/HAFtpAdapter.

The remote port from Step II is used in defining the port within the JNDI entry for FTP Adapter. It should be noted that the host specified will be CloudGatewayforOnPremTunnel instead of the actual on-premise hostname or address of the SFTP server, since the port forwarding with SSH tunnel is now enabled locally within the SOACS instance in Step III.

It should be noted that SOA Cloud instances do not use any shared storage. So, the deployment plan must be copied to the file systems for each node before deployment of the FTP Adapter application.
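For example, from the node where the plan was created (a sketch; the domain path and host name are hypothetical):

# copy the FtpAdapter deployment plan to the second managed server's file system
scp $DOMAIN_HOME/plan/FtpAdapterPlan.xml opc@soacs-node2:$DOMAIN_HOME/plan/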

The process to update the FtpAdapter deployment is fairly straightforward and follows the standard methodology. So, only the primary field values that are used in the JNDI definition are provided below.

  • JNDI under Outbound Connection Pools: eis/Ftp/HAFtpAdapter
  • Host: CloudGatewayforOnPremTunnel
  • Username: <SFTP User>
  • Password: <SFTP User Password>
  • Port: 2522
  • UseSftp: true

V. Configure B2B Metadata

Standard B2B configuration will be required to set up the trading partners, document definitions and agreements. The unique configuration pertaining to this test case involves setting up the SFTP delivery channel to send the EDI document to the SFTP server residing on-premises inside the corporate firewall. Again, the remote port from Step III is used in defining the port for the delivery channel. The screenshot for the channel definition is shown below.

edicloud6

After definition of the metadata, the agreement for outbound 850 EDI is deployed for runtime processing.

VI. Verification of SFTP connectivity

After the deployment of the FTP Adapter, another quick check of netstat for port 2522 may show additional entries indicating an established session corresponding to the newly created FTP Adapter. The connections are established and disconnected based on the polling interval of the FTP Adapter. Another way to verify the SFTP connectivity is to manually launch an SFTP session from the command line as shown here.

[opc@shubsoacs-jcs-wls-1 ~]$ sftp -oPort=2522 slahiri@CloudGatewayforOnPremTunnel
Connecting to CloudGatewayforOnPremTunnel...
The authenticity of host '[cloudgatewayforonpremtunnel]:2522 ([10.196.240.130]:2522)' can't be established.
RSA key fingerprint is 93:c3:5c:8f:61:c6:60:ac:12:31:06:13:58:00:50:eb.
Are you sure you want to continue connecting (yes/no)? yes
Warning: Permanently added '[cloudgatewayforonpremtunnel]:2522' (RSA) to the list of known hosts.
slahiri@cloudgatewayforonpremtunnel's password:
sftp> quit
[opc@shubsoacs-jcs-wls-1 ~]$

While this SFTP session is connected, a quick netstat check on the CloudGatewayforOnPremTunnel host will confirm the established session for port 2522 from the SOACS compute node.

[opc@CloudGatewayforOnPremTunnel ~]$ netstat -an | grep 2522
tcp        0       0 0.0.0.0:2522                       0.0.0.0:*                               LISTEN
tcp        0      0 10.196.240.130:2522         10.196.246.186:14059        ESTABLISHED
tcp        0       0 :::2522                                 :::*                                       LISTEN
[opc@CloudGatewayforOnPremTunnel ~]$

VII. Use the newly created JNDI to develop a SOA composite containing FTP Adapter and B2B Adapter to hand-off the XML payload from SFTP Server to B2B engine

The simple SOA composite diagram built in JDeveloper for this test case is shown below.

The JNDI entry created in step IV (eis/ftp/HAFtpAdapter) is used in the FTP Adapter Wizard session within JDeveloper to set up a receiving endpoint from the on-premises SFTP server. A simple BPEL process is included to transfer the input XML payload to B2B. The B2B Adapter then hands off the XML payload to the B2B engine for generation of the X12 EDI in native format.

edicloud4

Deploy the composite via EM console to complete the design-time activities. We are now ready for testing.

VIII. Test the end-to-end EDI processing flow

After deployment, the entire flow can be tested by copying a PurchaseOrder XML file into the polling directory for incoming files on the on-premise SFTP server. An excerpt from the sample XML file used as the input to trigger the process is shown below.

[slahiri@myOnPremSFTPServer cloud]$ more po_850.xml
<Transaction-850 xmlns="http://www.edifecs.com/xdata/200" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" XDataVersion="1.0" Standard="X12" Version="V4010" CreatedDate="2007-04-10T17:16:24" CreatedBy="ECXEngine_837">
     <Segment-ST>
           <Element-143>850</Element-143>
           <Element-329>16950001</Element-329>
      </Segment-ST>
      <Segment-BEG>
           <Element-353>00</Element-353>
           <Element-92>SA</Element-92>
           <Element-324>815455</Element-324>
           <Element-328 xsi:nil="true"/>
           <Element-373>20041216</Element-373>
        </Segment-BEG>
--More--(7%)

The FTP Adapter of the SOA composite from SOACS instance will pick up the XML file via the SSH tunnel and process it in Oracle Public Cloud within Oracle B2B engine to generate the EDI. The EDI file will then be transmitted back to the on-premise SFTP server via the same SSH tunnel.

Results from the completed composite instance should be visible in the Enterprise Manager, as shown below.

edicloud2

Content of the EDI file along with the SFTP URL used to transmit the file can be seen in the B2B console, under Wire Message Reports section.

edicloud1

Summary

The test case described here is a quick way to demonstrate that SOA Cloud Service can easily be used in a hybrid architecture to model common B2B use cases that require access to on-premise endpoints. The EDI generation process and all the business layer orchestration can be done in Oracle Public Cloud (OPC) with SOA Suite. Most importantly, integration with on-premise server endpoints can be enabled as needed via SSH tunnels to provide a hybrid cloud solution.

Acknowledgements

SOACS Product Management and Engineering teams have been actively involved in the development of this solution for many months. It would not have been possible to deliver such a solution to the customers without their valuable contribution.

References

  1. Setting up SSH tunnels for cloud to on-premise with SOA Cloud Service clusters – Christian Weeks, A-Team
  2. SOA Cloud Service – Quick and Simple Setup of an SSH Tunnel for On-Premises Database Connectivity – Shub Lahiri, A-Team

Using eBS Adapter in Integration Cloud Service – Part 1: Installing eBusiness Suite Integrated SOA Gateway for REST Services


Introduction

Integration Cloud Service (ICS) enables connecting applications in the cloud or on-premise. It also provides an adapter for eBusiness Suite. This eBS adapter is different than the eBS adapter in SOA Suite – it does not use a database connection. Instead it uses the REST services provided by eBS as part of Integrated SOA Gateway (ISG).

This article describes the steps needed to get eBusiness Suite, including ISG REST services, ready – not only for ICS (these instructions apply as well if you want to use REST services without ICS):

ISG requires some additional patches on top of eBS 12.2.4 – this is shown in this first part.

In a second part which you can find here, we show how to enable the REST metadata provider for ICS and test eBS REST services – both from a native REST client and from ICS using the adapter.

Prerequisites

As a starting point, we assume eBusiness Suite version 12.2.4. We will show the steps needed using the eBusiness Suite Virtual Appliance which is available from Oracle Software Delivery Cloud (http://edelivery.oracle.com). Search for product "Oracle VM Virtual Appliances for Oracle E-Business Suite" under category "Linux/OVM/VMs". Be sure to select the 12.2.4 version: "Oracle VM Virtual Appliances for Oracle E-Business Suite 12.2.4.0.0 for x86 64 bit, 41 files". You only need to download the files marked with "Oracle E-Business Suite Release 12.2.4 Single Node Vision Install X86 (64 bit)" for this exercise.

Using this VM, we will focus on the basic steps needed to get REST services running – this will not include steps required for a production type setup, for example setup of SSL etc.

See also blog https://blogs.oracle.com/stevenChan/entry/e_business_suite_12_2

ISG REST services are part of core eBS 12.2 without the need for additional licenses.  (ISG SOAP services – which are not needed for ICS – would require additional license of SOA Suite with eBS 12.2)

For running the VM, we will use Oracle Virtualbox 5.0.16. All steps are executed for Oracle Linux 6 (x86).

Now, download all patches needed later from MOS – see Appendix.

General Patching Procedure

The overall procedure for patching is described in MOS Note 1617461.1 – Applying the Latest AD and TXK Release Update Packs to Oracle E-Business Suite Release 12.2

The overall procedure for installing ISG is described in MOS Note 1311068.1 – Installing Oracle E-Business Suite Integrated SOA Gateway, Release 12.2. Only steps for REST services in sections B and C are relevant.

Step 1 – Extract and Test the downloaded VM Image

After downloading, extract the VM using the following script – and import the resulting OVA file in Virtualbox.

for i in *.zip
do
unzip $i
done

cat Oracle-E-Business-Suite-12.2.4_VISION_INSTALL.ova.00 \
Oracle-E-Business-Suite-12.2.4_VISION_INSTALL.ova.01 \
Oracle-E-Business-Suite-12.2.4_VISION_INSTALL.ova.02 \
Oracle-E-Business-Suite-12.2.4_VISION_INSTALL.ova.03 \
Oracle-E-Business-Suite-12.2.4_VISION_INSTALL.ova.04 \
Oracle-E-Business-Suite-12.2.4_VISION_INSTALL.ova.05 \
Oracle-E-Business-Suite-12.2.4_VISION_INSTALL.ova.06 \
Oracle-E-Business-Suite-12.2.4_VISION_INSTALL.ova.07 \
Oracle-E-Business-Suite-12.2.4_VISION_INSTALL.ova.08 \
Oracle-E-Business-Suite-12.2.4_VISION_INSTALL.ova.09 \
Oracle-E-Business-Suite-12.2.4_VISION_INSTALL.ova.10 \
Oracle-E-Business-Suite-12.2.4_VISION_INSTALL.ova.11 \
Oracle-E-Business-Suite-12.2.4_VISION_INSTALL.ova.12 \
Oracle-E-Business-Suite-12.2.4_VISION_INSTALL.ova.13 \
Oracle-E-Business-Suite-12.2.4_VISION_INSTALL.ova.14 \
Oracle-E-Business-Suite-12.2.4_VISION_INSTALL.ova.15 > Oracle-E-Business-Suite-12.2.4_VISION_INSTALL.ova

After startup of the VM, try logging in as root. This will ask for new passwords for the users root, oracle and applmgr.

Next, check the network configuration: ebs.example.com must be resolvable in the VM. Then start the DB and App tier using the scripts startvisiondb.sh and startvisionapps.sh.

The scripts to manage the Oracle E-Business Suite single node Vision installation are:

SCRIPTS BASE_DIR                      : /u01/install/VISION/scripts/
START SCRIPT FOR DB                   : /u01/install/VISION/scripts/startvisiondb.sh
STOP SCRIPT FOR DB                    : /u01/install/VISION/scripts/stopvisiondb.sh
START SCRIPT FOR APPS                 : /u01/install/VISION/scripts/startvisionapps.sh
STOP SCRIPT FOR APPS                  : /u01/install/VISION/scripts/stopvisionapps.sh
DB RE-CONFIG SCRIPT                   : /u01/install/VISION/scripts/visiondbconfig.sh
APPS RE-CONFIG SCRIPT                 : /u01/install/VISION/scripts/visionappsconfig.sh
DB CLEANUP SCRIPT                     : /u01/install/VISION/scripts/visiondbcleanup.sh
APPS CLEANUP SCRIPT                   : /u01/install/VISION/scripts/visionappscleanup.sh
CONFIGURE A NEW WEB ENTRY POINT       : /u01/install/scripts/configwebentry.sh

You should now be able to log in to the eBS home page at http://ebs.example.com:8000 using, for example, user SYSADMIN/sysadmin.

Step 1a – Change to graphical desktop (optional)

If you would rather work with a graphical desktop than with the default command line shell provided by the VM, log in to the VM as root and execute

yum install oraclelinux-release

yum groupinstall Desktop

Change runlevel in /etc/inittab from 3 to 5.
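For example:

sed -i 's/^id:3:initdefault:/id:5:initdefault:/' /etc/inittab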

Step 1b – Change tnsnames.ora

Edit /u01/install/VISION/11.2.0/network/admin/tnsnames.ora

EBSDB =
(DESCRIPTION =
(ADDRESS = (PROTOCOL = TCP)(HOST = slc01ozg.us.oracle.com)(PORT = 1521))
(CONNECT_DATA =
(SERVER = DEDICATED)
(SERVICE_NAME = EBSDB)
)
)

Change slc01ozg.us.oracle.com to ebs.example.com and verify with tnsping EBSDB that this works fine.
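The corrected entry should read:

EBSDB =
(DESCRIPTION =
(ADDRESS = (PROTOCOL = TCP)(HOST = ebs.example.com)(PORT = 1521))
(CONNECT_DATA =
(SERVER = DEDICATED)
(SERVICE_NAME = EBSDB)
)
)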

Step 2 – Upgrade FMW – Apply Patch 20642039

Verify FMW Version: WLS is 10.3.6.0.7 (using WLS Console http://localhost:7001/console)

Download patch 20642039 and execute

export ORACLE_HOME=/u01/install/VISION/fs2/FMW_Home/oracle_common

opatch apply

If you encounter the following error, then correct the ORACLE_HOME to /u01/install/VISION/fs2/FMW_Home/oracle_common before running opatch.

OPatch could not find OUI based inventory in the Oracle Home

Step 3 – Attach DB Home

cd /u01/install/VISION/11.2.0/oui/bin

./attachHome.sh

Make sure the output looks like

The inventory is located at /u01/install/oraInventory
‘AttachHome’ was successful.

Step 3a – Check DB connectivity

Execute

cd /u01/install/VISION/11.2.0/bin

export ORACLE_HOME=/u01/install/VISION/11.2.0

export ORACLE_SID=EBSDB

./sqlplus /NOLOG

connect / as sysdba

This must work – otherwise adop apply will fail. If "connect / as sysdba" does not work, then check if the $TWO_TASK parameter is set:

echo $TWO_TASK

If this displays a value, execute

export TWO_TASK=

See note

UNIX: Checklist for Resolving Connect AS SYSDBA Issues (Doc ID 69642.1)

Step 4 – Run adgrants.sql

Before we can apply patches to eBS, we need to execute adgrants.sql – otherwise we will get the following error:

AutoPatch error: Please apply adgrants.sql file in database tier before applying this patch

See note: E-Business Suite 12.2 R12.AD.C.DELTA.6 Patch 19197270 Fails With Error ‘Please Apply adgrants.sql File In Database Tier Before Applying This Patch’ (Doc ID 2039459.1)

Note: This issue can occur with other patches also. You need to verify the version of adgrants.sql in all patches and run the highest version prior to attempting to apply the patches.

The EBS context file is located under /u01/install/VISION/fs2/inst/apps/EBSDB_ebs/appl/admin/EBSDB_ebs.xml

(ORACLE_HOME=/u01/install/VISION/11.2.0)

Follow these steps:

Copy adgrants.sql from patch 22123818 (in subdirectory admin) to $ORACLE_HOME/appsutil/admin on the DB server (back up the existing one first).

export ORACLE_SID=EBSDB

cd $ORACLE_HOME/bin

./sqlplus /NOLOG

@$ORACLE_HOME/appsutil/admin/adgrants.sql APPS

(Parameter 1: APPS)

Output should look like

End of Creating PL/SQL Package AD_ZD_SYS.

Start of giving grants. This may take few minutes.

PL/SQL procedure successfully completed.

Start of PURGE DBA_RECYCLEBIN.

PL/SQL procedure successfully completed.

End of PURGE DBA_RECYCLEBIN.

Commit complete.

Step 5 – Run ETCC DB Checker

Install the ETCC DB Checker via patch 17537119: create a new directory /u01/install/VISION/11.2.0/appsutil/etcc, copy the zip file there and extract it.

Run DB-ETCC:

cd /u01/install/VISION/11.2.0/appsutil/etcc

./checkDBpatch.sh /u01/install/VISION/11.2.0/appsutil/EBSDB_ebs.xml

If the utility asks for the Database context file, then enter

Enter full path to Database context file: /u01/install/VISION/11.2.0/appsutil/EBSDB_ebs.xml

The tool ran successfully if you see output similar to

Apply the missing bugfixes and then rerun the script.

Stored Technology Codelevel Checker results in the database successfully.

Finished prerequisite patch testing : Tue Apr 19 11:08:54 EDT 2016

Log file for this session: ./checkDBpatch_2810.log

Do not apply any of the recommendations made by the utility – we will skip those for this exercise. In a production environment, however, these should be installed.

Troubleshooting:

If running the ETCC DB Checker fails with the following error, then the previous step (attach DB home) has not been executed successfully:

/u01/install/VISION/11.2.0/OPatch

./opatch lsinventory returns

Inventory load failed… OPatch cannot load inventory for the given Oracle Home.

Possible causes are:

   Oracle Home dir. path does not exist in Central Inventory

   Oracle Home is a symbolic link

   Oracle Home inventory is corrupted

LsInventorySession failed: OracleHomeInventory gets null oracleHomeInfo

OPatch failed with error code 73

See OPatch Fails With “LsInventorySession failed: OracleHomeInventory gets null oracleHomeInfo” (Doc ID 728417.1)

Step 6 – Run auto config

Source the run environment:

cd /u01/install/VISION

. ./EBSapps.env

and select the Run file system.

Then run

adautocfg.sh

Provide APPS password.

Check that the output ends with

AutoConfig completed successfully.

Step 7 – Install eBS patches

Download all patches listed in the appendix and unzip it in the $PATCH_TOP directory – in our case

/u01/install/VISION/fs_ne/EBSapps/patch:

. <EBS_ROOT>/EBSapps.env run

cd $PATCH_TOP

Make sure you have started DB and Apps tier before the next steps.

After each execution, the output should be

adop exiting with status = 0 (Success)

Step 7a – Run prepare phase

After sourcing the run environment, execute

adop phase=prepare

This should take a while. The result should look like:

adop phase=prepare – Completed Successfully

Log file: /u01/install/VISION/fs_ne/EBSapps/log/adop/12/adop_20160420_074702.log

adop exiting with status = 0 (Success)

If you receive the following error, then you have not executed auto config (step 6):

Worker count determination…

Validation successful. All expected nodes are listed in ADOP_VALID_NODES table.
[UNEXPECTED]adop is not able to detect any application tier nodes in FND_NODES table.
[UNEXPECTED]Ensure ICM is running and run autoconfig on all nodes
[UNEXPECTED]Error while checking if this is a multi node instance
Log file: /u01/install/VISION/fs_ne/EBSapps/log/adop/adop_20160420_073426.log

Step 7b – Install patches 20745242 and 22123818

Execute

adop phase=apply patches=20745242,22123818 merge=yes

If you see this error

AutoPatch error:
Please apply adgrants.sql file in database tier before applying this patch

then an older adgrants.sql than the one from patch 22123818 has been applied before. In our case, the right version is

adgrants.sql 120.67.12020000.37 2015/12/18

Step 7c – Install patches 20784380, 22363475, and 22495069

Execute

adop phase=apply patches=20784380,22363475,22495069 merge=yes

Step 7d – Install patch 19259764

Execute

adop phase=apply patches=19259764

Step 7e – Install ISG consolidated patch 22328483:R12.OWF.C

Execute

adop phase=apply patches=22328483

If this patch is not applied, you will get the following error when restarting eBS Apps Tier:

./startvisionapps.sh
Starting the Oracle E-Business Suite Application Tier Servicessh: /u01/install/VISION/fs2/inst/apps/EBSDB_ebs/admin/scripts/adstrta: No such file or directory
./startvisionapps.sh: line 75: l.sh: command not found

Step 7f – Run finalize and cutover

Execute

adop phase=finalize

adop phase=cutover

Step 7g – Run adop cleanup

EBSDB environment has changed.
All users must re-source the environment using below command:
source /u01/install/VISION/EBSapps.env run|patch

Then execute

adop phase=cleanup

Step 7h – Run adop fs_clone

Execute

adop phase=fs_clone

This step was the last patch step necessary before we proceed in part 2 with configuration and testing.

Step 8 – Verify eBS after Restart

Before we proceed with Part 2, verify that the eBS instance is working as expected after patching:

Restart DB and App tier:

cd /u01/install/VISION/scripts

./stopvisionapps.sh

./stopvisiondb.sh

./startvisiondb.sh

./startvisionapps.sh

Login to eBS in a browser using SYSADMIN/sysadmin.

Navigate to Integrated SOA Gateway, Integration Repository, and search for services using "Employee" as the "Business Entity" filter.

You should see results similar to the following picture:

ISG-search-result

 

Appendix

List of patches to be applied

No. Component Patch Title
20642039 WLS/FMW MERGE REQUEST ON TOP OF 11.1.1.6.0 FOR BUGS 20361466 20484781
20745242 eBS R12.AD.C.delta.7: R12.AD.C.DELTA.7 PATCH
22123818 eBS BUNDLE FIXES II FOR R12.AD.C.DELTA.7 (20745242)
20784380 eBS R12.TXK.C.delta.7: R12.TXK.C.DELTA.7
22363475 eBS BUNDLE FIXES II FOR R12.TXK.C.DELTA.7 (20784380)
22495069 eBS TXK CONSOLIDATED PATCH FOR STARTCD 12.2.0.51
19259764 eBS ERROR WHEN OPENING FORMS IN IE8 ON MULTI-NODE EBS 12.2.3
22328483 eBS ISG Rest Services Consolidated Patch for 12.2.3+
17537119 eBS EBS Technology Codelevel Checker

References

Document  Title
Part 2 http://www.ateam-oracle.com/using-ebs-adapter-in-integration-cloud-service-part-2-configure-and-test-isg-rest-services/
1928303.1 Section 1.2.4: Using the Oracle E-Business Suite Oracle VM-based Installation
1311068.1 Installing Oracle E-Business Suite Integrated SOA Gateway, Release 12.2
1355068.1 Oracle E-Business Suite 12.2 Patching Technology Components Guide
2008451.1 How To Run The 12.2 EBS Technology Code Level Checker (ETCC) ?
728417.1 OPatch Fails With “LsInventorySession failed: OracleHomeInventory gets null oracleHomeInfo”
2039459.1 E-Business Suite 12.2 R12.AD.C.DELTA.6 Patch 19197270 Fails With Error ‘Please Apply adgrants.sql File In Database Tier Before Applying This Patch’

Using eBS Adapter in Integration Cloud Service – Part 2: Configure and Test ISG REST Services


Introduction

Integration Cloud Service (ICS) enables connecting applications in the cloud or on-premise. It also provides an adapter for Oracle eBusiness Suite. This eBS adapter is different than the eBS adapter in SOA Suite – it does not use a database connection. Instead it uses the REST services provided by eBS as part of Integrated SOA Gateway (ISG).

This article describes the steps needed to get eBusiness Suite, including ISG REST services, ready – either for use with any REST client or with ICS. ISG requires some additional patches on top of eBS 12.2.4 – this was shown in the first part, see here.

In this second part, we will show how to enable the REST services, how to enable the metadata provider for ICS, and how to test eBS REST services, first from a native REST client (SOAPUI) and then from ICS. All steps except chapter 4 are also relevant if you want to use Oracle eBusiness Suite ISG REST services without ICS.

Chapter 1 – Configure Integrated SOA Gateway (ISG) in eBS 12.2.4

Enabling ASADMIN User with the Integration Administrator Role

We will execute the steps in section 3 of the MOS note:

  • Log in to Oracle E-Business Suite as the SYSADMIN user and enter the associated password.
  • Expand the User Management responsibility from the main menu of the Oracle E-Business Suite Home Page.
  • Click the Users link to open the User Maintenance page (under "Vision Enterprises").
  • Enter 'ASADMIN' in the User Name field and click Go to retrieve the 'ASADMIN' user.
  • Click the Update icon next to the ASADMIN user to open the Update User window. Remove the Active To date and click Apply.
  • Click the Reset Password icon next to the ASADMIN user to open the Reset Password window. Make sure that ASADMIN's password is at least eight characters long. Enter the new password twice and click Submit.
  • In the Update User window, click Assign Roles.
  • In the search window, select Code from the Search By drop-down list and enter "UMX|FND_IREP_ADMIN" in the value text box. Click Select.
  • Enter a justification in the Justification field and click Apply. You will see a confirmation message indicating you have successfully assigned the role.

In my case, a warning is displayed (which can be ignored because the server is restarted later anyway):

Updates to Role data will not be visible in the application until the following processes are started : Workflow Background Engine

Change ISG agent properties

Execute the following as user oracle (as described in step 3 of MOS note):

mkdir /u01/install/VISION/isg_temp

Create /u01/install/VISION/fs2/inst/apps/EBSDB_ebs/soa/isgagent.properties to update the following line:

EBSDB.ISG_TEMP_DIRECTORY_LOCATION=/u01/install/VISION/isg_temp/

Edit /u01/install/VISION/fs1/inst/apps/EBSDB_ebs/soa/isgagent.properties to update the following line:

EBSDB.ISG_TEMP_DIRECTORY_LOCATION=/u01/install/VISION/isg_temp/

Configure ISG

Run

cd /u01/install/VISION

. ./EBSapps.env run

ant -f $JAVA_TOP/oracle/apps/fnd/txk/util/txkISGConfigurator.xml ebsSetup -DforceStop=yes

as described in step 4 of the MOS note. Enter the passwords as required – including the password of ASADMIN as chosen in the previous step. If asked

The script will forcefully stop the Weblogic Servers now. Do you want to proceed (yes/no)? (yes, no)

then select “yes” to restart the servers.

After a couple of minutes, the result should be "BUILD SUCCESSFUL".

Execute fs_clone

Run

adop phase=fs_clone

Verify that the result is

adop exiting with status = 0 (Success)

Restart the Apps tier.

Chapter 2 – Test eBS REST Services using a PL/SQL package

We will demonstrate deploying a REST service using the HR_EMPLOYEE_API package.

Deploy CreateEmployee REST Service

Log in to eBS in a browser as SYSADMIN. Navigate to Integrated SOA Gateway, Integration Repository and search for Internal Name "HR_EMPLOYEE_API":

ISG-employee-search

Select Employee and move to the "REST Web Service" tab. Deploy by selecting "Create_Employee" and entering "employee" as the Service Alias:

ISG-employee-before-deploy

This should show status “Deployed”:

ISG-employee-after-deploy

Click on “WADL” to save the definition file for later.

Then grant execution to the group "US SuperXXX": click on "Grant", then select "Group of Users" and click on the search icon:

ISG-employee-before-create-grant2

Search for “US Super%” and select the first result:

ISG-employee-before-create-grant4

The result after the grant should report success and show the icon in the Grant column:

ISG-employee-after-create-grant

Now we can proceed with testing this REST service from a client.

Send Request to CreateEmployee REST Service

Create a REST POST request by importing the WADL file into SOAPUI. Add HRMS/welcome as HTTP basic authentication. Don't forget to include the HTTP headers for Content-Language, Content-Type and Accept – and paste all lines after "User-Agent" below as the payload.

A sample of a correct request would look in Raw mode like:

POST http://ebs.example.com:8000/webservices/rest/employee/create_employee/ HTTP/1.1
Accept-Encoding: gzip,deflate
Authorization: Basic SFJNUzp3ZWxjb21l
Content-Language: en-US
Content-Type: application/json
Accept: application/json
Content-Length: 886
Host: ebs.example.com:8000
Connection: Keep-Alive
User-Agent: Apache-HttpClient/4.1.1 (java 1.5)

{
  "CREATE_EMPLOYEE_Input": {
    "@xmlns": "http://xmlns.oracle.com/apps/per/rest/createEmployee/create_employee/",
    "RESTHeader": {
      "xmlns": "http://xmlns.oracle.com/apps/per/rest/createEmployee/header",
      "Responsibility":"US_SHRMS_MANAGER",
      "RespApplication":"PER",
      "SecurityGroup":"STANDARD",
      "NLSLanguage":"AMERICAN",
      "Org_Id":"241"
    },
    "InputParameters": {
      "P_HIRE_DATE": "2015-06-28T09:00:00",
      "P_BUSINESS_GROUP_ID":"202",
      "P_LAST_NAME":"Mueller",
      "P_SEX":"M",
      "P_PER_COMMENTS":"Create From REST Service",
      "P_DATE_OF_BIRTH":"1979-01-04T09:00:00",
      "P_EMAIL_ADDRESS":"michael.mueller@oracle.com",
      "P_FIRST_NAME":"Michael",
      "P_KNOWN_AS":"Michael",
      "P_MARITAL_STATUS":"S",
      "P_MIDDLE_NAMES":"Mueller",
      "P_NATIONALITY":"AM",
      "P_NATIONAL_IDENTIFIER":"183-25-2523",
      "P_REGISTERED_DISABLED_FLAG":"N",
      "P_COUNTRY_OF_BIRTH":"US",
      "P_REGION_OF_BIRTH":"West",
      "P_TOWN_OF_BIRTH":"San Francisco"
    }
  }
}

Execute the request in SOAPUI:

ISG_CreateEmployee_SOAPUI
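If you prefer the command line, roughly the same call can be made with curl (assuming the JSON payload above is saved as create_employee.json):

curl -u HRMS:welcome -X POST \
  -H "Content-Type: application/json" \
  -H "Content-Language: en-US" \
  -H "Accept: application/json" \
  -d @create_employee.json \
  http://ebs.example.com:8000/webservices/rest/employee/create_employee/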

A sample of a correct response would be

HTTP/1.1 200 OK
Date: Thu, 21 Apr 2016 14:25:36 GMT
Server:
Content-Length: 791
X-ORACLE-DMS-ECID: 005CFxvqUBGDkZWFLzUKOA00004x00008d
X-Frame-Options: SAMEORIGIN
Keep-Alive: timeout=15
Connection: Keep-Alive
Content-Type: application/json
Content-Language: en

{
  "OutputParameters" : {
    "@xmlns:xsi" : "http://www.w3.org/2001/XMLSchema-instance",
    "@xmlns" : "http://xmlns.oracle.com/apps/per/rest/employee/create_employee/",
    "P_EMPLOYEE_NUMBER" : "2401",
    "P_PERSON_ID" : "32853",
    "P_ASSIGNMENT_ID" : "34077",
    "P_PER_OBJECT_VERSION_NUMBER" : "2",
    "P_ASG_OBJECT_VERSION_NUMBER" : "1",
    "P_PER_EFFECTIVE_START_DATE" : "2015-06-28T00:00:00.000-04:00",
    "P_PER_EFFECTIVE_END_DATE" : "4712-12-31T00:00:00.000-05:00",
    "P_FULL_NAME" : "Mueller, Michael Mueller (Michael)",
    "P_PER_COMMENT_ID" : "306",
    "P_ASSIGNMENT_SEQUENCE" : {
      "@xsi:nil" : "true"
    },
    "P_ASSIGNMENT_NUMBER" : "2401",
    "P_NAME_COMBINATION_WARNING" : "0",
    "P_ASSIGN_PAYROLL_WARNING" : "0",
    "P_ORIG_HIRE_WARNING" : "0"
  }
}

Congratulations – you have executed your first REST service on eBusiness Suite!

Troubleshooting

HTTP 500 Internal Server Error

In case you use the wrong value for P_BUSINESS_GROUP_ID (or for any other LOV-based element) in the request for CREATE_EMPLOYEE, the server returns an http 500.

{
  "ISGServiceFault" : {
    "Code" : "ISG_SERVICE_EXECUTION_ERROR",
    "Message" : "Error occurred while executing the web service request",
    "Resolution" : "System error, please see service log trace for details.",
    "ServiceDetails" : {
      "ServiceName" : "employee",
      "OperationName" : "create_employee",
      "InstanceId" : "0"
    }
  }
}

The same happens if you have a typo somewhere in the JSON element names.

Chapter 3 – Configure Metadata Provider for using ISG REST with ICS

Open the eBS homepage in a browser and log in as SYSADMIN. Navigate to Integrated SOA Gateway, Integration Repository. Click the "Search" button on the right, enter "oracle.apps.fnd.rep.ws.service.EbsMetadataProvider" in the "Internal Name" field and click "Go". (If this doesn't list anything, you are still missing a patch on the EBS instance. Please check the first part of this article.)

Then click on “Metadata Provider”:
ISG-Metadata-Provider

Click on the “REST Web Service” tab, enter “provider” in the “Service Alias” field and click the “Deploy” button.
Navigate to the “Grants” tab and give grants on all methods to “All users”.

Now you are ready to use the ISG REST services from ICS.

Troubleshooting

If you don’t get any results when searching for “oracle.apps.fnd.rep.ws.service.EbsMetadataProvider”, you are missing one or more of the patches listed in part 1 of this article.

Chapter 4 – Test eBS REST Service from ICS

A previous post by Greg Mally explains how to set up ICS-to-eBS connectivity using the ICS Connectivity Agent:

http://www.ateam-oracle.com/round-trip-on-premise-integration-part1_ics-to-ebs/

Additionally, we will show here how to connect ICS directly with eBS without an agent. This works only if you have exposed the eBS REST services over the public internet.

For this purpose, I have reconfigured the eBS VM to use a public DNS hostname and changed the port to 80.

In ICS, create a new connection with Oracle eBusiness Suite and name it, for example, EBS_TEST:

Enter the connection URL for the server and use HRMS/welcome as credentials. Test and Save the connection.

We want to create a new integration using SOAP inbound and this eBS connection outbound. Create a new SOAP connection, upload the WSDL (from here) and select “No Security”. Test and Save it. (For simplicity, I have left out any SOAP faults.) Create a new integration named “Create_Employee”. Drop the EBS connection on the target side:

Enter a name for the endpoint, for example “EBS_CreateEmployee”.

ISG2-ICS-DropConnection1

Select Product Family “Human Resources Suite”, Product “Human Resources” and API “Employee”. The screen should look like:

ISG2-ICS-select-apipng

Select Next.

Select the operation Create_Employee:

ISG2-ICS-select-createEmployee-method

Select Next, then Done.

Your integration should look similar to:

 

ISG2-ICS-integration-after-drop-ebs

Drop the SOAP Connection as source and save the integration:

ISG2-ICS-after-drop-SOAP

Add the request mapping:

ISG2_ICS-create-mapping-for-SOAP-request

Add the response mapping: map “P_EMPLOYEE_NUMBER” to “result”:

ISG2_ICS-create-mapping-for-SOAP-response

Add a tracking field:

ISG2_ICS-tracking-for-SOAP-test

The result should look like:

ISG-ICS-soap-ebs-integration-final

Activate the integration. Then copy the WSDL URL and use that to create a new SOAPUI project.

 

In SOAPUI, add the payload like below and add HTTP basic auth, a WS Username Token and a WS-Timestamp:

<soapenv:Body>
<cre:process>
<cre:lastname>Mueller</cre:lastname>
<cre:middlenames>Edwin</cre:middlenames>
<cre:firstname>Michael</cre:firstname>
<cre:knownas>Michael</cre:knownas>
<cre:email>michael.mueller@oracle.com</cre:email>
<cre:comments>Create From SOAP Service</cre:comments>
<cre:sex>M</cre:sex>
<cre:martialstatus>S</cre:martialstatus>
<cre:businessgroup>202</cre:businessgroup>
<cre:dateofbirth>1979-01-04T09:00:00</cre:dateofbirth>
<cre:nationality>AM</cre:nationality>
<cre:nationalid>183-25-2523</cre:nationalid>
<cre:countryofbirth>US</cre:countryofbirth>
<cre:regionofbirth>West</cre:regionofbirth>
<cre:townofbirth>San Francisco</cre:townofbirth>
<cre:hiredate>2015-06-28T09:00:00</cre:hiredate>
</cre:process>
</soapenv:Body>

 

After executing the request you should get the created employee id as response:

<nstrgmpr:processResponse xmlns:wsdl="http://schemas.xmlsoap.org/wsdl/" xmlns:nstrgmpr="http://xmlns.oracle.com/CreateEmployee/CreateEmployeeSOAP/CreateEmployeeProcess" xmlns:plnk="http://docs.oasis-open.org/wsbpel/2.0/plnktype" xmlns:xsd="http://www.w3.org/2001/XMLSchema">
    <nstrgmpr:result>2411</nstrgmpr:result>
</nstrgmpr:processResponse>

Congratulations – you have executed the ICS integration using eBusiness Suite Adapter successfully!

Using VNC securely in the Oracle Cloud


Introduction

Having access to a VM in the Cloud via VNC can be very useful in many situations: for example, many customers want to install software such as the Oracle Database using a GUI-based installer. With VNC, a long-running installation can continue even while you are disconnected. The easiest way to achieve this with a reliable and secure mechanism is to use VNC via an SSH tunnel. In this example a simple Oracle Compute Cloud VM is used to configure a GNOME desktop and VNC server. It has been created as shown in the tutorial here. Most other VMs in the Oracle Cloud that run Oracle Linux can be configured in the same way, e.g. DBaaS VMs.

vnc

Configure SSH Tunnel

The SSH tunnel is established using the Putty tool; alternatives will be discussed later in this tutorial. Use the public IP address of the created VM and give the session a name.

image2

Next, expand the session tree on the left hand side and select the category “Data” in the “Connection” branch. By default, Oracle Cloud VMs are configured with the user opc. For easier login, enter “opc” in the Auto-login username field.

image3

Expand the “SSH” branch and select “Tunnels”. Enter 5901 as the source port and the public IP of the VM in the format 1.1.1.1:5901, where 5901 is the destination port. Click Add. All VNC traffic is routed through this tunnel, hence no additional port needs to be opened via Security Lists. Should you want to use iptables, see MOS Note Doc ID 2102424.1.

image4
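If you prefer a command-line client to Putty (for example on Linux or macOS), an OpenSSH command with the same effect as the tunnel configured above would look like the following sketch; the key path and IP address are placeholders:

ssh -i /path/to/private_key -L 5901:<public-ip-of-vm>:5901 opc@<public-ip-of-vm>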

Next navigate to “Auth” in the “SSH” branch and point to the private key that holds the authentication information as provided during the provisioning of the VM. See this tutorial if you are unsure which key to use: SSH Keys

image5

Finally, navigate back to the “Session” category. Press the “Save” button and then press “Open” to establish the connection.

image6

Configure VNC Server

If everything is configured correctly you will be greeted by the usual prompt.

image7

Install the GNOME desktop via yum. To achieve this, switch to the root user and then use the groupinstall function:

sudo su -
yum -y groupinstall "Desktop"

If you have issues with yum, follow this simple tutorial.

image8

Alternatively, the KDE desktop can be installed using:

yum -y groupinstall kde-desktop

image9

Make sure that there are no errors and look for the “Complete!” message once everything is installed.

image10

Install additional tools to help with your activities, like a browser (here Firefox) or even an office suite. Most importantly, install “tigervnc-server” to allow access to the desktop.

yum -y install tigervnc-server
yum -y install firefox
yum -y groupinstall "General Purpose Desktop"

image11

After all packages are installed simply issue:

vncserver

This will start the VNC server with the default settings, e.g. port 5901 for display :1; these settings can be changed in the configuration file /home/opc/.vnc/xstartup. On the very first start you will be prompted to choose a VNC password, which is the password you will later use in the VNC viewer.

Should you want to use iptables, also run:

iptables -I INPUT -m state --state NEW -p tcp --destination-port 5901 -j ACCEPT

image12

Connect to the VNC Server

Next, start the VNC viewer on your local client. The SSH tunnel redirects the VNC output of your VM to your localhost on port 5901. Hence enter localhost:5901 in the VNC Server field and press “Connect”.

image13

The first time you connect you will see a warning that the connection is not encrypted. As we are using an SSH tunnel to encrypt the traffic, this warning can be ignored.

image14

Enter the password you have selected for the VNC Server.

image15

This will connect you to the desktop. The desktop session stays active even if you disconnect the Putty session, which allows you to resume work comfortably.

image16

To stop the VNC Server simply connect via putty or open a terminal and enter:

vncserver -kill :1

If you prefer a different resolution, simply restart the vncserver using the geometry flag and the preferred resolution.

vncserver -kill :1
vncserver -geometry 1600x1200

Note that the desktop has a timeout after which the screen locks, and you have to authenticate with the password of the desktop user (opc). To set that password, run:

sudo passwd opc

Further Reading

My Oracle Support Notes

Access VNC Server Through A Web Browser (Doc ID 1555696.1)

Creating a Mobile-Optimized REST API Using Oracle Mobile Cloud Service – Part 4


Introduction

To build functional and performant mobile apps, the back-end data services need to be optimized for mobile consumption. RESTful web services using JSON as payload format are widely considered as the best architectural choice for integration between mobile apps and back-end systems. At the same time, many existing enterprise back-end systems provide a SOAP-based web service application programming interface (API). In this article series we will discuss how Oracle Mobile Cloud Service (MCS) can be used to transform these enterprise system interfaces into a mobile-optimized REST-JSON API. This architecture layer is sometimes referred to as Mobile Backend as a Service (MBaaS). A-Team has been working on a number of projects using MCS to build this architecture layer. We will explain step-by-step how to build an MBaaS, and we will  share tips, lessons learned and best practices we discovered along the way. No prior knowledge of MCS is assumed. In part 1 we discussed the design of the REST API, in part 2 we covered the implementation of the “read” (GET) resources, in part 3 we discussed implementation of the “write” resources (POST,PUT and DELETE). In this fourth part, we will look at how we can use MCS Storage collections to cache payloads, and while doing so, we will use some more advanced concepts like chaining promises to execute multiple REST calls in sequence.

Main Article

In this article we will implement the GET /jobs endpoint which returns a list of jobs. This list is static, and as such can be cached within MCS to reduce the number of backend calls and speed up overall performance. Obviously, app developers can also choose to cache this list on the mobile device to further enhance performance, but that is beyond the scope of this article. We will use the MCS Storage API to store and retrieve the cached list of jobs. We will use a boolean query parameter refreshCache to force an update of the jobs list in storage.
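To make the contract concrete before we build it, here is a sketch of how the finished endpoint will be called; the host, mobile backend ID and credentials are placeholders:

curl -u <mobile-user>:<password> \
  -H "Oracle-Mobile-Backend-Id: <mobile-backend-id>" \
  "https://<mcs-host>/mobile/custom/hr/jobs?refreshCache=true"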

Setting up the Storage Collection

To store files in MCS, a so-called storage collection must be created. Access to a storage collection is handled through roles. When creating a new storage collection, you assign roles that have read and/or write privileges. Users with the appropriate role(s) can then store files in the collection and/or retrieve them. So, we first create a role named HRManager by clicking on the Mobile User Management menu option, selecting the Roles tab, and then clicking on New Role.

NewRole

After creating the role, we select the Storage menu option and click on New Collection to create the collection.

NewCollection

We leave the collection at its default of Shared. After clicking the Create button, we assign read and write permissions to the role we just created:

CollectionPermissions

Finally, we need to associate this storage collection with our mobile backend. We open the HumanResources mobile backend, click on the Storage tab, click Select Collections, and then enter “HR” to link the collection we just created with our mobile backend.

SelectCollection

Implementing the GET Jobs Endpoint

The behavior for the GET /jobs endpoint that we need to implement is as follows:

If refreshCache query parameter is set to true:

  • Invoke the findJobsView1 SOAP method
  • Transform the SOAP response into JSON format as defined during API design (see part 1)
  • Store the JSON payload in the HR storage collection in a file named JobsList
  • Return the content of JobsList as response

If refreshCache query parameter is not set, or set to false:

  • Retrieve the JobsList JSON file from the HR collection
  • If the file is found, return the content as response
  • If the file is not found, perform the steps described above when refreshCache is true.

To keep our main hr.js file clean, we start by adding a call in this file to a new getJobs function that we must implement in hrimpl.js:

service.get('/mobile/custom/hr/jobs', function (req, res) {
    hr.getJobs(req,res,(req.query.refreshCache === 'true'));        
});

Refer to part 2 for more info on this separation between the contract and the implementation of our human resources API. The signature of the getJobs function in hrimpl.js looks like this:

exports.getJobs = function getJobs(req, res, refreshCache) {
  // TODO: add implementation
}

Implementing Helper Functions

To keep the implementation clean and readable, we will first define 3 helper functions, one for each service call we need to make. The function to invoke the SOAP web service to retrieve the list of jobs looks as follows:

function getJobsFromSOAP(sdk) {
  var requestBody = {Body: {"findJobsView1": null}};
   return sdk.connectors.hrsoap1.post('findJobsView1', requestBody, {inType: 'json', outType: 'json'}).then(
      function (result) {
          var response = result.result;
          var jobList = response.Body.findJobsView1Response.result.map(transform.jobSOAP2REST);
          return jobList;
      },
      function (error) {
         throw(error.error);
      }
    );
}

The transformation function jobSOAP2REST in transformations.js looks like this:

exports.jobSOAP2REST = function(job) {
    var jobRest = {id: job.JobId, name: job.JobTitle, minSalary: job.MinSalary, maxSalary: job.MaxSalary};
    return jobRest;
};
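To make the mapping concrete, here is a hypothetical job record as it would arrive in the JSON-converted SOAP response, and the REST shape the function produces; the sample values are illustrative only:

// Hypothetical input record, as produced by the JSON-converted SOAP response
var soapJob = {JobId: 'AD_PRES', JobTitle: 'President', MinSalary: 20000, MaxSalary: 40000};
var restJob = transform.jobSOAP2REST(soapJob);
// restJob is now {id: 'AD_PRES', name: 'President', minSalary: 20000, maxSalary: 40000}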

With the concepts you learned in previous parts, most of this code should be pretty self-explanatory. We use the hrsoap1 connector to invoke the findJobsView1 SOAP method. By specifying the inType and outType, we instruct MCS to auto-convert the request from JSON to XML, and the response back from XML to JSON. Also notice the return keyword in front of the connector call: by returning the connector call promise, we can chain multiple service calls, as we will explain later in more detail. Finally, note that in case of an error we use the throw function; we will explain this later on as well.

The function to store the jobs list in HR collection looks as follows:

function storeJobsInStorage(sdk,jobs) {
   return sdk.storage.storeById("HR", 'JobsList', JSON.stringify(jobs), {contentType: 'application/json', mobileName: 'JobsList'}).then(
         function (result) {
           return JSON.stringify(jobs);
         },
         function (error) {
           throw(error.error);
         }
       );
}

We use the storeById function of the MCS storage API to store the jobs list as a JSON file. The mobileName property is set to JobsList to ensure that the file ID is set to JobsList instead of a system-generated value, which makes it easier to retrieve the file from storage later on.

And here is the last helper function to retrieve the JobsList from the HR storage collection:

function getJobsFromStorage(sdk) {
   return sdk.storage.getById("HR", 'JobsList').then(
         function (result) {
             return result.result;             
         },
         function (error) {
           throw(error.error);  
         }
       );
}

Implementing the GetJobs Function

With these helper functions in place, we can now code our main getJobs function. Here is the full implementation:

exports.getJobs = function getJobs(req, res, refreshCache) {
    var sdk = req.oracleMobile;
    if (refreshCache) {
        getJobsFromSOAP(sdk)
          .then(function(jobs) {
            return storeJobsInStorage(sdk,jobs);
        }).then(function(jobs) {
            res.send(200,jobs).end();
        }).catch(function(error) {
            res.send(500,error).end();            
        });            
    } 
    else {
      getJobsFromStorage(sdk)
        .then(function (jobs) {
            res.send(200,jobs).end();            
      }).catch(function(error) {
          var statusCode = JSON.parse(error).status;
          if (statusCode===404) {
            // Jobs list not yet in cache, do recursive call to get from 
            // SOAP service and store in cache  
            getJobs(req,res,true);
          }
          else {
            // something else went wrong, just return the error  
            res.send(statusCode,error).end();                                    
          }
      });
    }
};

This code nicely illustrates how we can execute multiple service calls sequentially by simply chaining the promises using then(). This is how it works:

  • By returning the promise from the storeJobsInStorage call, we can chain the next then() function, which in our case writes the response.
  • The input parameter of the function passed into then() is determined by the previous promise: the getJobsFromSOAP method returns the jobs as a JSON array, which is then passed into the storeJobsInStorage method. This method returns the same jobs array, so we can use it in the next then() to send the response of our GET /jobs endpoint.
  • If an unexpected error occurs, the promise function throws an error, which is caught in the catch() function and sent as the response. When an error is thrown, the chain of promises using then() is interrupted, similar to a try-catch block in Java.
  • When no cache refresh is requested, the catch function in the else branch is used to determine whether the JobsList is already in the cache. The error message returned by the getById function of the storage API when the file is not present looks like this:
    {
      "type": "http://www.w3.org/Protocols/rfc2616/rfc2616-sec10.html#sec10.4.1",
      "status": 404,
      "title": "The Storage Service can't find the object in the collection.",
      "detail": "The Storage Service can't find an object with ID JobsList in the HR collection. Verify that the object ID is correct.",
      "o:ecid": "005CWENZaf4A9T3_RlXBid0004il00002l, 0:3",
      "o:errorCode": "MOBILE-82701",
      "o:errorPath": "/mobile/platform/storage/collections/HR/objects/JobsList"
    }

    This error message is thrown in our getJobsFromStorage helper function, which allows us to inspect it inside the catch block. If the value of the status attribute of the error message is 404 (Not Found), we know the file is not present in the cache, either because this is the very first time the endpoint is executed, or because someone deleted the file. If this is the case, we make a recursive call to our getJobs function with the refreshCache flag set to true, which will populate the cache and return the jobs list as the response. Note that for the recursive call to work, we need to repeat the name of the exported function (getJobs) in the function declaration.

  • If the getById function call returns another value for the status attribute, an unexpected error occurred so we simply return the error message to enable the invoker to figure out what is wrong.

As you can see, the use of promises allows us to write clean and easily readable code, much better than the deeply nested callbacks (aka “callback hell”) that we can easily end up with when using callback functions. In addition, we no longer need a library like the Node.js async module to avoid such deeply nested callbacks.

Oracle MCS uses the bluebird library for implementing these promises. In this article, we have seen a simple example where we implemented sequential asynchronous calls by simply chaining then() functions. Bluebird also has support for more advanced scenarios, like executing multiple asynchronous calls in parallel and then doing some final processing when all calls have completed; a minimal sketch follows below. The bluebird documentation has a useful section Coming from Other Libraries that describes these more advanced scenarios.
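As a sketch of such a parallel scenario, reusing the helper functions defined above, two calls could be fired concurrently and joined like this (whether bluebird can be require'd directly in your custom code is an assumption, and the merge logic is hypothetical):

var Promise = require('bluebird'); // assumption: bluebird is resolvable in the custom code module

Promise.all([getJobsFromStorage(sdk), getJobsFromSOAP(sdk)])
  .then(function (results) {
    var cachedJobs = results[0]; // resolved value of the first promise
    var freshJobs = results[1];  // resolved value of the second promise
    // ... compare or merge the two job lists here ...
  })
  .catch(function (error) {
    // either call failing rejects the combined promise
  });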

Testing the GET Jobs Endpoint

To test our implementation we can use Postman REST client as described in part 2:

TestGetJobs1

You can no longer use the anonymous access key in the Authorization header because the anonymous user has no access to the HR collection. You need to create a mobile user for the mobile backend, assign the HRManager role to this user, and use the credentials of this user in the Authorization header.

To verify that the JobsList is indeed cached in our HR storage collection, we can also execute the GET /mobile/platform/storage/collections/HR/objects endpoint, which returns the metadata of all files in the HR collection:

TestGetJobs2

You can also inspect the collection in the MCS web user interface: navigate to the HR collection and click on the Contents tab to see a list of all files in the collection.

Note that if you then call GET /jobs?refreshCache=true to force a refresh of the cache, you will see that the eTag attribute is incremented to “2” when you execute the storage metadata endpoint again. The (new) value of the eTag can be inspected by client applications to check whether they need to download the content of the file again in case on-device caching was applied as well.

Conclusion

This article showed how you can use the MCS storage API to cache payloads to reduce the number of potentially slow calls to backend services. However, that was just a sample used to illustrate the key takeaway of this article: the power of the bluebird promises library used in Oracle MCS which allows you to write complex orchestrations in a clean and concise way.


Integrating PCS, DOCS and BI Publisher using OSB/SOACS – Part 1


Introduction

In this 3-part blog series, I will demonstrate how to use PCS to invoke OSB (on-premises or SOACS) services and upload a report/doc generated using BI Publisher to DOCS. The following components are used in this demonstration:

  • Oracle Document Cloud Service (DOCS)
  • Process Cloud Service (PCS) with DOCS integration configured.
  • BI Publisher 12.2.1 with sample reports installed
  • SOACS with OSB or OSB On-Prem.

In this blog series, I will be covering the following topics:
Part 1 – Overview and Setup (Part 1) – covers the use case and the configuration steps for PCS, BI Publisher and the DOCS REST services.
Part 2 – OSB Services Design (Part 2) – covers the OSB service design in detail and what to look out for when you design your OSB services.
Part 3 – PCS Application Design and Testing (Part 3) – covers the PCS process design and how to test the sample project.

Use Case

We will use PCS to design a Salary Report request process. The process allows the user to submit a salary report request and generates a PDF report in BI Publisher. We also need to create a report folder dynamically and attach the PDF report to the newly created folder in the same PCS process instance, so that we can review and approve the generated report.

PCS_DOCS_BI_OSB_High-Level

PCS and DOCS Integration

We need to configure the PCS and DOCS integration in order for this demonstration to work; the instructions for the PCS and DOCS configuration can be found here.

PCS_DOCS_BI_OSB_PCS-DOCS

We will use the default folder structure in this demonstration. However, you can also configure the folder structure (shown below) if it’s required.

PCS_DOCS_BI_OSB_PCS-DOCS Configuration in PCS

Once the PCS and DOCS integration has been configured, a process folder for each process instance will be created in DOCS when the process is initiated. We can then log in to DOCS, using the same username and password we used to configure the PCS and DOCS integration, to review the folder structure:

PCS_DOCS_BI_OSB_DOCS_FolderStructure

DOCS REST Services

DOCS exposes a number of REST services to retrieve folder/item information and to upload files using multipart messages. In this demonstration, we will use the following DOCS REST APIs to retrieve the folder ID and upload the file:

Resource URI | Method | Description
/api/1.1/folders/items | GET | Get a collection of all items (folders and files) that the user has access to, including folders that others have shared with that user. The type field indicates if an item is a folder or file.
/api/1.1/folders/{folder id} | GET | Get folder information (metadata) for the specified folder.
/api/1.1/folders/{folder id} | POST | Create a new subfolder in the specified destination folder.
/api/1.1/folders/{folder id}/items | GET | Get a collection of child items (folders and files) in the specified folder. The type field indicates if an item is a folder or file.
/api/1.1/files/data | POST | Upload a new file using a multipart request to specify destination and file information.
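For a quick smoke test of these resources outside OSB, the items endpoint can be invoked directly; the host and credentials are placeholders, and depending on your DOCS instance the API may be rooted under an additional context path:

curl -u <docs-user>:<password> "https://<docs-host>/api/1.1/folders/items"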

BI Publisher

BI Publisher exposes a web service called the report service, which has an operation called runReport. This operation allows you to run a report with or without input parameters and returns the generated report as base64-encoded data. A list of the operations available in the BI Publisher report service is here.

In this demonstration, we will be using the sample report called SalaryReport without parameters; the generated report will be a PDF file returned as base64-encoded data.
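For reference, a minimal runReport request sketch is shown below; the v2 service namespace is typical for recent BI Publisher releases but may differ in your version, and the credentials are placeholders:

<soapenv:Envelope xmlns:soapenv="http://schemas.xmlsoap.org/soap/envelope/"
                  xmlns:v2="http://xmlns.oracle.com/oxp/service/v2">
  <soapenv:Body>
    <v2:runReport>
      <v2:reportRequest>
        <v2:attributeFormat>pdf</v2:attributeFormat>
        <v2:reportAbsolutePath>/Samples/1. Overview/Salary Report - No Parameters.xdo</v2:reportAbsolutePath>
      </v2:reportRequest>
      <v2:userID>username</v2:userID>
      <v2:password>password</v2:password>
    </v2:runReport>
  </soapenv:Body>
</soapenv:Envelope>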

SOACS and OSB on-premise

If you are planning to use SOACS, you will need to expose the BI Publisher report service externally so it can be invoked by SOACS. If you are using OSB on-premise to invoke the BI Publisher report service, you will need to expose the OSB service externally to allow PCS to invoke it, and you will also need to import the SSL certificate from DOCS into your OSB managed server trust store.

Summary

In this first part of the blog series, we have covered the use case and the basic integration configuration and setup required for PCS, DOCS, BI Publisher and OSB. In part 2, we will examine the OSB service design in detail.

Integrating PCS, DOCS and BI Publisher using OSB/SOACS – Part 2


Introduction

In this part 2 of the blog, we will examine the OSB design used in this demonstration. OSB is used as an integration layer between PCS, DOCS and BI Publisher: it performs protocol translation and message transformation, and it also serves as a service orchestration layer.

At the time of writing, SOACS is based on SOA Suite 12.1.3, so if you are planning to use SOACS, you will need to use the SOA Suite 12.1.3 Quickstart with JDeveloper 12.1.3 as the IDE to design your OSB services. If you are using OSB on-premise, you can use either SOA 12.1.3 or SOA 12.2.1.

In this demonstration, we will expose a few SOAP-based OSB services to PCS for interaction with DOCS and BI Publisher:

# | Proxy Service | Invoked Proxy/Business Service | DOCS REST API | BI Publisher Service
1 | GenerateBIReportToDOCS | BIPublisherReportService, UploadFile | | ReportService/runReport
2 | GetFolderByID | DOCS_REST | |
3 | GetFolderByName | GetFolderItems, GetFolderItemsByID | |
4 | GetFolderItemsByID | DOCS_REST | /folders/{folderID}/items |
5 | GetFolderItems | DOCS_REST | /api/1.1/folders/items |
6 | UploadFile | UploadFile | |
7 | CreateFolder | DOCS_REST | |
8 | CreateMultipleFolders | GetFolderByName, CreateFolder | /api/1.1/folders/{folder id} |

During report generation, OSB also uploads the generated report into a specific folder created for the PCS instance in DOCS using the HTTP transport. In order to upload a file to DOCS over HTTP, we need to build a multipart message. The following sections of this blog describe the OSB design in detail, including how to build the multipart message.

OSB Design

PCS will invoke 2 SOAP-based proxy services, CreateMultipleFolders and GenerateBIReportToDOCS. They will then invoke other reusable proxy services to perform the tasks required by the process.

CreateMultipleFolders is used to create single or multiple folders in DOCS. PCS passes in an array of folders, along with the DOCS instance and identity domain name and the PCS instance folder path in DOCS. CreateMultipleFolders then loops through the list of folders and invokes the CreateFolder proxy service and the DOCS_REST business service to create each folder in DOCS. The DOCS_REST business service encapsulates the REST API exposed by DOCS; for folder creation it invokes the “/api/1.1/folders/{folder id}” resource in DOCS.

Prior to the DOCS API invocation to create the folder, we need to retrieve the ID of the base parent folder where the subfolder(s) will be created. In this case, the base folder is the PCS process instance folder that you have configured in PCS, e.g. “/SalaryReportProcess_1.0/SalaryReportProcess_Instance_123/Application Documents”. We will discuss GetFolderByName later in this blog.

PCS_DOCS_BI_OSB_CreateMultiFolder

CreateFolder

In the CreateFolder proxy pipeline, we replace the URL with the DOCS instance and identity domain name passed in from PCS.

PCS_DOCS_BI_OSB_CreateFolder

GetFolderByName

In this pipeline design, we will invoke 2 proxy services, GetFolderItems and GetFolderItemsByID. Prior to the invocation, we need to remove the “bpmn:” prefix from the folder path. This is because in the current PCS release (16.2.1) the predefined instance_id variable contains the “bpmn:” prefix, while the folder created in DOCS is named without it; hence, in order to retrieve the correct folder ID in DOCS for the PCS instance, we need to remove the prefix from the request payload.

GetFolderItems retrieves all root folders for the DOCS user that you configured in the PCS-DOCS integration by invoking the DOCS API /api/1.1/folders/items. We then loop through the response payload and find the matching root folder name we are searching for.

Using the matching root folder ID found by GetFolderItems, we then invoke the GetFolderItemsByID proxy service to find the folder ID for the remaining folder names. To do this, we invoke the DOCS REST API /folders/{folderID}/items and pass in the root folder ID. We then loop through the response payload and find the matching folder name we are searching for, repeating this until we find the folder ID of the final destination folder. That folder ID is then returned to the calling proxy service.

PCS_DOCS_BI_OSB_GetFolderByName

GenerateBIReportToDOCS

The GenerateBIReportToDOCS proxy service invokes 2 services, the BI Publisher report service and the UploadFile proxy service.

GenerateBIReportToDOCS invokes the runReport operation of the BI Publisher report service to generate a PDF report; the report data in the response is base64 encoded. It then passes the report data to the UploadFile proxy service, which decodes the data into a byte array and invokes the DOCS file upload REST service to upload the document.

PCS_DOCS_BI_OSB_GenerateBIReportToDOCS

UploadFile

The UploadFile proxy service constructs a multipart message, decodes the base64 data returned by BI Publisher using a Java callout, and then adds the decoded data to the message. After the message has been constructed, it routes the multipart message to the DOCS upload service using the HTTP transport.
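The decoding part of that Java callout can be as small as one static method. A minimal sketch is shown below; the class and method names are hypothetical, as OSB only needs a public static method to call:

import javax.xml.bind.DatatypeConverter;

public class ReportDecoder {
    // Turns the base64-encoded report data returned by BI Publisher into raw bytes
    public static byte[] decode(String base64Data) {
        return DatatypeConverter.parseBase64Binary(base64Data);
    }
}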

When you are building the multipart message, DOCS requires a multipart request message with a specified boundary delimiter, for example:

Content-Type: multipart/form-data; boundary=MIME_Boundary

The DOCS multipart message translates into 2 attachments in OSB, and the boundary delimiter “--MIME_Boundary” will be added by OSB. For example:

--MIME_Boundary
Content-Disposition: form-data; name="jsonInputParameters"

{
"parentID":"FB4CD874EF94CD2CC1B60B72T0000000000100000001"
}
--MIME_Boundary
Content-Disposition: form-data; name="primaryFile"; filename="example.pdf"
Content-Type: application/pdf

<Binary File Content>
--MIME_Boundary--

 

The first attachment consists of the following message:

Content-Disposition: form-data; name="jsonInputParameters"

{
"parentID":"',$folderID,'"
}

To build the 1st message part, you need to add the 1st attachment node using an insert action as the first child of the attachments variable in OSB, and then replace the node with the correct message:
PCS_DOCS_BI_OSB_OSB_CONFIG1  PCS_DOCS_BI_OSB_OSB_CONFIG2

 

The 2nd attachment consists of the following message with the decoded binary data:

Content-Disposition: form-data; name="primaryFile"; filename="example.pdf"
Content-Type: application/pdf

<Text/Binary File Content>

To build the 2nd message part, you need to add a new 2nd attachment node using an insert action as the last child of the attachments variable, then replace the 2nd attachment with the above message.

PCS_DOCS_BI_OSB_OSB_CONFIG3

PCS_DOCS_BI_OSB_OSB_CONFIG4

PCS_DOCS_BI_OSB_UploadFileProxyService

Summary

In this 2nd part of the blog series, we have covered the OSB service design for this demonstration. In part 3, we will discuss the PCS design and how to invoke the services exposed by OSB.

Integrating PCS, DOCS and BI Publisher using OSB/SOACS – Part 3


Introduction

In part 1 and part 2 of this blog series, I covered the overview, setup and OSB design. In this part 3, I will be covering the PCS design.

PCS Design

In PCS, we need to create 2 web service connectors: the first for DOCS folder creation using the CreateMultipleFolders proxy service, and the second for report generation and file upload to DOCS using the GenerateBIReportToDOCS proxy service.

PCS_DOCS_BI_OSB_PCS Design

When a PCS instance is initiated, it creates the process instance folder(s) in DOCS. On some occasions it might take a few seconds for DOCS to create the folder and make it visible when you invoke the DOCS REST service. Hence, we need to put a wait activity before each service call; this ensures that all necessary information is created and available before the next service call, and that the folder(s) will be created in the designated PCS instance folder in DOCS.

PCS_DOCS_BI_OSB_DOCS Folder-1

In the CreateMultipleFolders service, the PCS instance ID is used as an input for the create DOCS folder service. The following screenshot demonstrates the data input mapping for the Create DOCS Folder activity. For the parent folder of the subfolder(s), you need to use the predefined property called instanceId, e.g. "SalaryReportProcess-1.0/SalaryReportProcess_Instance_" + instanceId + "/Application Documents".

PCS_DOCS_BI_OSB_PCS-CreateDOCSFolder_DataMapping

For this demonstration, I am using a data object called reportFolder to store the folder name output from the OSB web service; it will be used as the destination folder in DOCS for the generated report. We will use the first folder created in the list, which you can do by explicitly specifying the index number of the array element as input data:

PCS_DOCS_BI_OSB_PCS-CreateDOCSFolder_DataMapping-output

 

The following screenshot demonstrates the data input mapping for the Generate and Upload Report activity. We map the folder name using the expression "SalaryReportProcess-1.0/SalaryReportProcess_Instance_" + instanceId + "/Application Documents/" + reportFolder; this allows OSB to upload the generated PDF report to the designated report folder in DOCS, and the report will be visible in the PCS Review Report human task.

You also need to provide the BI Publisher report path. I am using the salary report with no parameters, whose absolute path is "/Samples/1. Overview/Salary Report - No Parameters.xdo"; this sample report generates a PDF file without requiring any parameters.

PCS_DOCS_BI_OSB_PCS-CreateDOCSFolder_DataMapping3

 

Test and Verify

To test the PCS application, follow these instructions to deploy it: https://docs.oracle.com/cloud/latest/process_gs/CPRCW/GUID-82201213-DB6D-4324-9CF9-DF83138B0728.htm#CPRCW-GUID-82201213-DB6D-4324-9CF9-DF83138B0728. We will be using the workspace for testing.

Steps:

  • After you have deployed the SalaryReport application, you will start the application using the workspace.
  • Enter the following data in the webform and click submit:
      • Requester Name: <<Your name>>
      • File Name: <<Your name>>_salaryreport.pdf
      • Folder Name: Salary Report
      • Folder Description: Salary Report Folder

    PCS_DOCS_BI_OSB_Test1

    • In another browser window, log in to DOCS using the username and password that you configured for the PCS and DOCS integration in part 1 of this blog. You will notice that the process instance folder and the Salary Report folder have been created. The <<Your name>>_salaryreport.pdf file will be created in the Salary Report folder.
    • Going back to the workspace, you will see a task waiting for you. Open the task and navigate to the attachment and you will find the same PDF report in the Salary Report folder.

    PCS_DOCS_BI_OSB_Test2
    PCS_DOCS_BI_OSB_Test3

    • After you have reviewed the report, click the approve button to complete the task.

    Summary

    In this 3-part blog series, I have demonstrated the PCS, SOACS/OSB, DOCS and BI Publisher integration using REST and SOAP. The sample project for this demonstration can be downloaded from this link: PCS-DOCS-SOACS_Sample. Additional configuration information can be found in the enclosed readme.txt file.

Transport Level Security (TLS) and Java


Know Which Versions of TLS are Supported in Recent Java Versions

In the twenty-plus years of the Internet’s interaction with the Secure Sockets Layer (SSL) and Transport Level Security (TLS) protocols, there have been some rough patches.  Over the years, various vulnerabilities, some of them exposed in a laboratory setting and others discovered and exploited by hackers, have made it necessary to revamp the protocols and to augment specifications.  Major changes to specifications, obviously, make it more difficult to support backwards compatibility, especially in a network as large and as decentralized as the Internet.  After all, it just isn’t realistic to declare by decree that every web site on the Internet upgrade its older, insecure version of SSL or TLS in favor of the new and improved security framework du jour.

Another problem, somewhat related to backwards compatibility, is that the Internet, due to its sheer size and diversity, cannot react instantaneously to an upgraded security protocol, and that the client agents and programming languages need time to assimilate security protocol changes into their own upgrade and version release schedules.  A major case in point is the Java language, which is perhaps the most widely used programming language for developing applications that are designed to communicate with clients and/or with other servers over both private and public networks.  Occasionally, special handling in some versions of Java is required to allow applications to communicate with each other using the latest and greatest versions of these security protocols.  Working with these Java versions as needed given certain versions of TLS is covered here.

A Little History:  SSL Under Siege

In the Internet’s formative years, when it was still the exclusive domain of scientific researchers and developers, before commerce and entertainment became Internet channels, and before it became a part of everyday life and business for millions, few users would have put a very high priority on the need to add a security layer to communications, even though it was a public network.  Of course, priorities changed quickly as the Internet’s usage patterns expanded, as more and more people started using the Internet for more and more things.  With increased popular usage, and with the expanded potential for usage in commerce, the incentive for hijacking network communications became more lucrative potentially.  It became obvious that if the Internet were to take on an expanded role, and if privacy was going to be respected, something would have to be done to protect sensitive data transmissions from unauthorized snoopers and others with criminal intentions.

And thus the Secure Sockets Layer, or SSL, protocol was born.  The original authors and developers of the specification identified three core functional areas:

  • Privacy: Use of encryption
  • Identity authentication: System Identification using certificates
  • Reliability: Maintaining a secure connection with message integrity checking

Attempts to implement these features securely in the first few major versions of the SSL protocol went down a path fraught with false starts and wrong turns.  Indeed, SSL version 1.0, developed by Netscape in the early 1990’s, was never even released publicly due to a large number of security flaws.  (There are claims that SSL v1.0 was cracked ten minutes after Netscape introduced it to the developer community.)  SSL v2.0, released in 1995, also had numerous security gaps, which forced a ground-up redesign and rework of the protocol.  This work culminated in the release of SSL v3.0 in 1996 (IETF RFC 6101).

Up until version 3.0 of SSL, drafting the specifications and doing the actual work largely fell under the purview of one vendor, Netscape.  Technically speaking, therefore, the first few versions of the protocol were proprietary solutions.  Due to security’s increasing significance and importance the IETF governing body took over management of SSL after version 3.0 was released.  Renaming SSL to Transport Level Security (TLS) was one of the body’s first actions, and after that there were a number of incremental revisions to TLS, resulting in TLS 1.0, 1.1, and TLS 1.2, which is the most recent iteration of the specification and protocol.

Historically, vulnerabilities tended to cluster around two key components of the SSL/TLS protocols:  exploiting holes in the initial handshake process, or finding gaps in either the encryption process or the encrypted data.  In SSL v2.0, for example, one of the early exploits manipulated the list of available cipher suites presented to a communications partner during the handshake process, thereby forcing a weaker cipher, smoothing the way for Man-In-The-Middle attacks to be successful.  SSL v3.0 filled this hole, along with incorporating other improvements, but vulnerabilities continued to be uncovered, resulting eventually in TLS 1.0.  Despite improvements in TLS over SSL, this family of cipher suite downgrade attacks continues to be relevant for even newer versions of TLS.

Researchers have been (and continue to be) major contributors in identifying weaknesses and security holes in SSL and TLS.  The BEAST, CRIME, and BREACH attacks were identified in 2011 and 2012, resulting in a number of supplied fixes from browser and O/S vendors.  Some browsers were more open to these types of attacks than others.

Even after fixes and improvements were incorporated into TLS 1.0, Google Security Team researchers were able to uncover a means of accomplishing a Man-In-The-Middle exploit by taking advantage of TLS 1.0’s built-in behavior of reverting back to SSL 3.0 to support backwards compatibility for communications partners.  Exposed in late 2014, this came to be known as the POODLE vulnerability, and it set off a flurry of corrective activity from browser vendors and language providers alike.  The net result was a wholesale movement to remove support for SSL 3.0 in browsers and networked applications.

In addition to the defects uncovered in the various SSL and TLS specifications, bugs in library implementations have also been problematic for secure network communications.  In early 2014, the Heartbleed bug/vulnerability in the extremely popular OpenSSL software stack opened up websites to the potential for compromising their secret private keys, thereby making it possible for attackers to eavesdrop on communications, impersonate identities, and steal valuable data.

SSL/TLS and Java Support

This brief historical overview of SSL and TLS demonstrates that the protocol specifications have been extremely fluid and have never been in a state that other technologies can just take as a constant.  The protocols are in many ways moving targets that other computing technologies, such as Java, need to take into account when patches and major upgrades are released.  There have been occasions when TLS has incorporated a fix for an identified security gap (take the somewhat recent POODLE exploit as an example), and other technologies that interact with SSL/TLS have little choice but to “catch up” in their support of the new additions.  This is especially true when the solution involves removing support for an older protocol, such as what happened with POODLE and SSL 3.0.  The time lags are unfortunate, but due to the “late-breaking” nature of security holes when they are discovered, there does not seem to be much of an alternative.

Over time, Java releases have reacted to the evolution of SSL and TLS by building in support for newer releases of the security protocols.

Java Version | SSL/TLS Default | Other Supported Versions
Java 6 | TLS 1.0 | TLS 1.1 (update 111 and later), SSLv3.0*
Java 7 | TLS 1.0 | TLS 1.2, TLS 1.1, SSLv3.0*
Java 8 | TLS 1.2 | TLS 1.1, TLS 1.0, SSLv3.0*

* SSLv3 support disabled in January, 2015 patch releases

Up until January, 2015, all of the above-listed Java releases had fallback support for SSL 3.0.  As a response to the late-2014 POODLE exploit, Oracle issued CPU releases in early 2015 (JDK 8u31, JDK 7u75, and JDK 6u91), to disable SSL v3 by default.  Publicly available Java 6 releases do not have built-in support for TLS 1.2.  Java 8, with its default support for TLS 1.2, has caught up with the latest released specification of the protocol.

Due to POODLE, the vast majority of web sites have disabled support for SSL 3.0 on their servers.  There has been related momentum building up in the Internet security community to recommend disabling “early” versions of TLS as well, as there are known security issues both in TLS 1.0 and TLS 1.1.  Running only TLS 1.2, however, may block older browsers and other clients from connecting successfully.  This represents a tradeoff between tightly clamping down on security versus being a bit more flexible about supporting older browsers and other clients.

Clients (and servers connecting to other servers as clients) dependent upon the Java 7 JDK and JRE may be affected negatively by sites allowing only newer TLS versions. By default, Java 7 does not successfully negotiate with TLS 1.1 and TLS 1.2 servers. If attempts to connect with a Java 7 client result in a stacktrace with an SSLHandshakeException, it is possible that the default client behavior will need to be modified. The expanded stacktrace (taken from a SOAPUI session) makes it fairly clear that the server is shutting down the client connection before the handshake sequence can complete:

javax.net.ssl.SSLException: Connection has been shutdown: javax.net.ssl.SSLHandshakeException: Remote host closed connection during handshake
        at sun.security.ssl.SSLSocketImpl.checkEOF(Unknown Source)
        at sun.security.ssl.SSLSocketImpl.checkWrite(Unknown Source)
        at sun.security.ssl.AppOutputStream.write(Unknown Source)
        at org.apache.http.impl.io.AbstractSessionOutputBuffer.flushBuffer(AbstractSessionOutputBuffer.java:131)
        at org.apache.http.impl.io.AbstractSessionOutputBuffer.flush(AbstractSessionOutputBuffer.java:138)
        at org.apache.http.impl.conn.LoggingSessionOutputBuffer.flush(LoggingSessionOutputBuffer.java:95)
        at org.apache.http.impl.AbstractHttpClientConnection.doFlush(AbstractHttpClientConnection.java:270)
        at org.apache.http.impl.SocketHttpClientConnection.close(SocketHttpClientConnection.java:245)
        at org.apache.http.impl.conn.DefaultClientConnection.close(DefaultClientConnection.java:164)
        at org.apache.http.impl.conn.AbstractPooledConnAdapter.close(AbstractPooledConnAdapter.java:152)
        at org.apache.http.protocol.HttpRequestExecutor.closeConnection(HttpRequestExecutor.java:142)
        at org.apache.http.protocol.HttpRequestExecutor.execute(HttpRequestExecutor.java:129)
        at org.apache.http.impl.client.DefaultRequestDirector.tryExecute(DefaultRequestDirector.java:633)
        at org.apache.http.impl.client.DefaultRequestDirector.execute(DefaultRequestDirector.java:454)
        at org.apache.http.impl.client.AbstractHttpClient.execute(AbstractHttpClient.java:820)
        at org.apache.http.impl.client.AbstractHttpClient.execute(AbstractHttpClient.java:754)
        at com.eviware.soapui.impl.wsdl.support.http.HttpClientSupport$Helper.execute(HttpClientSupport.java:233)
        at com.eviware.soapui.impl.wsdl.support.http.HttpClientSupport.execute(HttpClientSupport.java:323)
        at com.eviware.soapui.impl.wsdl.submit.transports.http.HttpClientRequestTransport.submitRequest(HttpClientRequestTransport.java:290)
        at com.eviware.soapui.impl.wsdl.submit.transports.http.HttpClientRequestTransport.sendRequest(HttpClientRequestTransport.java:220)
        at com.eviware.soapui.impl.wsdl.WsdlSubmit.run(WsdlSubmit.java:119)
        at java.util.concurrent.Executors$RunnableAdapter.call(Unknown Source)
        at java.util.concurrent.FutureTask.run(Unknown Source)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(Unknown Source)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(Unknown Source)
        at java.lang.Thread.run(Unknown Source)
Caused by: javax.net.ssl.SSLHandshakeException: Remote host closed connection during handshake
        at sun.security.ssl.SSLSocketImpl.readRecord(Unknown Source)
        at sun.security.ssl.SSLSocketImpl.performInitialHandshake(Unknown Source)
        at sun.security.ssl.SSLSocketImpl.writeRecord(Unknown Source)
        at sun.security.ssl.AppOutputStream.write(Unknown Source)
        at org.apache.http.impl.io.AbstractSessionOutputBuffer.flushBuffer(AbstractSessionOutputBuffer.java:131)
        at org.apache.http.impl.io.AbstractSessionOutputBuffer.write(AbstractSessionOutputBuffer.java:151)
        at org.apache.http.impl.conn.LoggingSessionOutputBuffer.write(LoggingSessionOutputBuffer.java:74)
        at org.apache.http.impl.io.ContentLengthOutputStream.write(ContentLengthOutputStream.java:114)
        at org.apache.http.impl.io.ContentLengthOutputStream.write(ContentLengthOutputStream.java:120)
        at org.apache.http.entity.ByteArrayEntity.writeTo(ByteArrayEntity.java:68)
        at org.apache.http.entity.HttpEntityWrapper.writeTo(HttpEntityWrapper.java:96)
        at org.apache.http.impl.client.EntityEnclosingRequestWrapper$EntityWrapper.writeTo(EntityEnclosingRequestWrapper.java:108)
        at org.apache.http.impl.entity.EntitySerializer.serialize(EntitySerializer.java:120)
        at org.apache.http.impl.AbstractHttpClientConnection.sendRequestEntity(AbstractHttpClientConnection.java:263)
        at org.apache.http.impl.conn.AbstractClientConnAdapter.sendRequestEntity(AbstractClientConnAdapter.java:227)
        at org.apache.http.protocol.HttpRequestExecutor.doSendRequest(HttpRequestExecutor.java:255)
        at com.eviware.soapui.impl.wsdl.support.http.HttpClientSupport$SoapUIHttpRequestExecutor.doSendRequest(HttpClientSupport.java:119)
        at org.apache.http.protocol.HttpRequestExecutor.execute(HttpRequestExecutor.java:123)
        ... 14 more
Caused by: java.io.EOFException: SSL peer shut down incorrectly
        at sun.security.ssl.InputRecord.read(Unknown Source)
        ... 32 more		

SSL handshake failures typically occur because the client and the server cannot agree on which version of the protocol to use.  In the case of a default Java 7 client and a server that supports TLSv1.2 and possibly TLSv1.1, there is no common ground for agreement because there are no shared, supported protocols.  At the beginning of the handshake process, the Java 7 client sends a “ClientHello” message with an indication that it is ready to support the TLSv1 protocol (or anything older such as SSLv3.0).  The server sees the request but has no choice other than to close the connection because it knows the client cannot support TLSv1.2 or TLSv1.1.

To work around this issue, supply the command line directive "-Dhttps.protocols=TLSv1.2,TLSv1.1,TLSv1" when starting the Java VM; in code, the equivalent system property directive is System.setProperty("https.protocols", "TLSv1.2,TLSv1.1,TLSv1"). If connecting to a TLS 1.1 or TLS 1.2 site from a JEE server environment, it will be necessary to add the command line directive to the server’s startup scripts. With Weblogic, for example, editing the Java startup properties in the setDomainEnv.sh script should get the job done.
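As a minimal sketch of the in-code variant (the URL is a placeholder; note that the https.protocols property affects HttpsURLConnection-based clients and must be set before the first HTTPS connection is opened):

public class Tls12Client {
    public static void main(String[] args) throws Exception {
        // Equivalent of -Dhttps.protocols=TLSv1.2,TLSv1.1,TLSv1 on the command line
        System.setProperty("https.protocols", "TLSv1.2,TLSv1.1,TLSv1");

        java.net.URL url = new java.net.URL("https://www.example.com/"); // placeholder endpoint
        java.net.HttpURLConnection conn = (java.net.HttpURLConnection) url.openConnection();
        System.out.println("HTTP status: " + conn.getResponseCode());
        conn.disconnect();
    }
}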

With these directives in place the handshake process succeeds.  Initially, the Java 7 client sends a “ClientHello” message, but now it indicates that it can support the TLSv1.2 protocol.  The server responds, certificate information is sent, a cipher suite is agreed upon, and protected communications between the two parties can begin.

To determine with certainty which version(s) of the TLS protocol are supported by a target web site, Qualys SSL Labs provides a free online service (https://www.ssllabs.com) that provides audit reports of SSL URL endpoints.  In addition to certificate details, supported cipher suite listings, and simulated handshake sequences with a variety of user agents (including Java 6, Java 7, and Java 8), the report has a section on enabled protocols for the site.

Not so coincidentally, repeated unsuccessful attempts to connect to a security-upgraded web site with a default Java 7 client was the catalyst for this research and writeup.  Here is the supported protocol summary from the SSL Labs online report for the problematic (at least for Java 7 configured with the defaults) site:

SSLLabsProtocols

Many endpoints protected by SSL/TLS are serving SOAP or REST web services. The current release of Smart Bear Software’s popular testing tool, SOAPUI, is built with Java 7, so it too will need special configuration when trying to connect to a TLS 1.1 or TLS 1.2 SOAP or REST endpoint. One may think that the setup for a successful handshake with the server would be to add the Java command line directive detailed above to the SOAPUI startup sequence. Although seemingly a sound strategy, this does not lead to success; SOAPUI has put in place its own command line configuration modifiers. Adding "-Dsoapui.https.protocols=TLSv1.2,TLSv1.1,TLSv1.0" to the JAVA_OPTS environment variable in the soapui.bat file will work successfully, however.
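On Windows, the adjusted line in soapui.bat might look like the following sketch (extend the existing JAVA_OPTS assignment rather than adding a second one; on Linux or macOS, edit the corresponding assignment in soapui.sh with shell syntax instead):

set JAVA_OPTS=%JAVA_OPTS% -Dsoapui.https.protocols=TLSv1.2,TLSv1.1,TLSv1.0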

Looking Ahead

With default support in Java 8 for TLS 1.2, the current release of the security protocol, it makes the most sense to run Java client applications with Java 8 if there is some flexibility in choosing the Java version. If running Java 7 is the only option, it will be necessary to modify the Java startup parameters for the application whenever the communications target has both SSLv3 and TLSv1.0 disabled. This non-default setup may become more and more of a necessity as more secure endpoints disable TLS 1.0 along with SSLv3.0. But at the same time, applications should slowly but surely move to Java 8, so the non-default configuration requirement for Java 7 should be short-lived.

Additional Resources

The following on-line resources were helpful in compiling this document:

Java SE 7 Security Enhancements Documentation:  http://docs.oracle.com/javase/7/docs/technotes/guides/security/enhancements-7.html

Java SE 8 Security Enhancements Documentation:  https://docs.oracle.com/javase/8/docs/technotes/guides/security/enhancements-8.html

Java Platform Group Blog: Diagnosing TLS, SSL, and HTTPS : https://blogs.oracle.com/java-platform-group/entry/diagnosing_tls_ssl_and_https

 

Hybrid Mobile Apps: Using the Mobile Cloud Service JavaScript SDK with Oracle JET


Introduction

Oracle’s Mobile Cloud Service has a Javascript SDK that makes connecting your hybrid mobile app to your mobile backend service a breeze.  The Javascript SDK for MCS comes in two flavors, one for web applications and one for Cordova applications. The MCS Javascript SDK for Cordova has a few more capabilities than the web version, such as methods for registering a device and notifications.  However, for the most part the two SDKs are quite similar. For creating hybrid mobile apps, choose the Cordova SDK.

To download the Javascript SDKs for MCS, log in to your MCS instance and click on the “Get Started” page. This page has SDK downloads for native apps in Android, iOS, and the MCS Javascript SDKs. You can download the SDK with a starter app or choose to download the SDK alone and add it to an existing project. For the example in this post, I downloaded the SDK by itself and added it to a project created using Oracle JET (Javascript Extension Toolkit). To get started with Oracle JET, follow the Get Started link on the JET home page.

The steps below include one way to connect the hybrid app to Mobile Cloud Service using the MCS JavaScript SDK. I will cover making calls to MCS for authentication and uploading a picture taken on the device to the MCS Storage repository.

NOTE: This example uses the camera plugin of Cordova. To test this on iOS the sample app will have to be run on an actual iOS device, since the iOS simulator does not have a camera. For Android, the emulator does have a camera, so on Android either a device or emulator will work.

Main Article

To get started, use the handy Oracle JET generator to stamp out a mobile app template. The generator can be installed using npm. Using Yeoman, the app template can be created for whatever platform you wish to use. The steps in this post focus primarily on Android, but they also work for iOS hybrid apps.

Install JET generator

To generate a hybrid mobile starter application, the Yeoman generator for Oracle JET must be installed. Use npm to install “generator-oraclejet”. Again, note that on Mac you may need to use sudo.

npm install -g generator-oraclejet

To verify the generator was installed, run the following command:

npm list -g generator-oraclejet

Scaffold a Mobile Application

Using Yeoman, the JET generator can scaffold three different types of starter applications. This example will use the “navBar” template. To see screens of the options, follow this link.

Open a command prompt and create a directory where you want your mobile application to reside. Change directory to that new folder and run the Yeoman command to create an Oracle JET application. The command below creates a hybrid app named “JETMobileDemo” using the “navBar” template. Note that on Windows the platforms option cannot include iOS; on Mac you can use both iOS and Android.

yo oraclejet:hybrid --appName=JETMobileDemo --template=navBar --platforms=android

Once this command completes, the directory that you are in will have a JET mobile application created. The output should show this at the end:

  Done, without errors.
  Oracle JET: Your app is ready! Change to your new app directory and try grunt build and serve…

See this link to understand the folder structure that is created from the template. Below are the descriptions of the project folders from the documentation. Note that the “src” directory is where the app pages are defined and edited; at build time, the src files are copied to the hybrid directory. This is important to understand so that you avoid developing in the hybrid directory, which gets overwritten by the grunt build task (or more specifically the “grunt copy” task).

Build and Serve

If you have installed the Android or iOS tooling, then the template app is ready to be built and run on a device or emulator. To build the app, grunt is used. Set the platform as needed.

grunt build:dev --platform=android

Run the app using “grunt serve”. For more details on the grunt serve command, see the JET documentation on grunt commands. The command below runs the app on an Android device attached to your machine.

grunt serve --platform=android --destination=device

 

On the device or emulator, the starter template should look something like this. If running on iOS, the navigation bar will be on the bottom of the page instead of the top. The navbar can be styled to be on top or bottom for all devices if needed, but the default is top for Android, bottom for iOS. Hint: if you want the header on top for iOS as well, use the JET style “oj-hybrid-applayout-navbar-fixed-top” on the navigation div in index.html. For information on the built-in JET styles see this link.
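
For example, a sketch of how that style might be applied (the other attributes of the navigation div in the scaffolded index.html are omitted here):

<div id="navBar" class="oj-hybrid-applayout-navbar-fixed-top"> ... </div>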

 

scaffold

 

Open the JET Project for Editing

A grunt command can be used to start a local web server to run the project.

grunt serve --platform=android --web=true

For development, use the IDE or editor of your choice.
NOTE: During development of JET hybrid apps, edit the files under the “src” directory, not those under “hybrid/www”. When grunt serve is run on the project, changes made in the src directory are automatically copied over to the hybrid/www directory (this is part of the live reload feature). When grunt build executes, the files from the src directory are copied over to the hybrid directory. If you don’t want to run a full build but just want to copy the files from src to hybrid/www, run the command below, which deletes everything under hybrid/www and then copies the src files over again:

grunt clean copy

 

Main.js and AppController.js

JET uses Require.js for loading modules. In the index.html file, only two script tags are needed to run in the browser, one script tag for Require.js and one for main.js. The main.js file defines the Require configuration as well as the top level view model, named MainViewModel, which initializes the single-page application.

<script type="text/javascript" src="js/libs/require/require.js"></script>
<script type="text/javascript" src="js/main.js"></script>

This can be combined into a single line, if preferred. Require.js has a data-main attribute that can be used to reference main.js on the same line as Require.js.

<script type="text/javascript" data-main="js/main" src="js/libs/require/require.js"></script>

The MainViewModel also initializes the AppController.js view model object. The AppController defines the pages for the router and creates the navigation entries. Finally, the router is synchronized and then the bindings are applied to the body of the index.html file.
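
A condensed sketch of that bootstrap sequence is shown below. The element id “globalBody” and the module names come from the scaffolded template and may differ slightly between JET versions:

require(['ojs/ojcore', 'knockout', 'appController', 'ojs/ojrouter'],
  function (oj, ko, app) {
    // synchronize the router, then bind the top level view model to the page body
    oj.Router.sync().then(function () {
      ko.applyBindings(app, document.getElementById('globalBody'));
    });
  }
);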

What about Cordova.js?

Cordova.js is referenced in the index.html file, but only in the hybrid/www directory; the script tag is added to hybrid/www/index.html by the build process. When “grunt build:dev” is executed, watch for a step in the output that reads “includeCordovaJs”. This step inserts the important cordova.js script tag into index.html.

<script src="cordova.js"></script>

The cordova.js file is needed when running on a device or emulator. When running in the browser, a 404 error will occur for the cordova.js script tag. Do not be concerned with this 404 when testing in the browser, but do be concerned if you see the same error on a device or emulator. The cordova.js file should be available when running on a device or emulator because the build process places it in the platform directory. For Android the location is:

<project>\hybrid\platforms\android\platform_www\cordova.js

Because cordova.js is not available when running in the browser, any usage of Cordova plugins can only be tested on a device or emulator/simulator. However, the live reload capability of “grunt serve” makes it much less time consuming to update the app on the device as you make changes.
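
If you need a programmatic signal that the plugin APIs are usable, Cordova fires the standard deviceready event; a minimal sketch:

// deviceready never fires in a plain browser, so plugin code guarded this way is safely skipped there
document.addEventListener('deviceready', function () {
    console.log("Cordova is ready - plugin APIs such as navigator.camera are now available");
}, false);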

Connect to MCS using Javascript SDK

Ok, now let’s get to the MCS JavaScript SDK! We’ll get started by creating a simple login page in the app that allows user/password login and logout using the SDK.  As a bonus, the JET components ojLabel, ojInputText, and ojInputPassword will be used with Knockout observables to set the username and password. If you haven’t checked out the plethora of JET UI components, run – don’t walk – to the JET cookbook page. This sample only uses very basic input components, but the JET UI components are extensive and powerful. See the Data Visualization Components for proof.

A JavaScript SDK is available for hybrid apps to connect to Oracle Mobile Cloud Service. This library is not available via bower, so it will be manually copied into the project.

At the end of this section the initial page of the app should look like this image.

loginpage

Get the SDK library

Download the MCS JavaScript SDK from your “Get Started” page in Mobile Cloud Service. From that page, links are provided to the different SDKs available in MCS.

Unzip the file to a temp directory and locate the mcs.js and mcs-min.js files.

 

Add SDK to Project and Require.js configuration

In your app project, under “src/js” folder create a new folder called “mcs”. Copy the mcs.js into this folder (or you can use the minified mcs-min.js file for these steps, if you prefer).

Update Require.js Configuration to include the MCS SDK

In the main.js file for the app, add a reference in the Require.js config for the MCS JavaScript SDK. Either mcs.js or the minified mcs-min.js can be referenced; the configuration below uses the non-minified version. One line is added below:

'mcs': 'mcs/mcs'

 

Config portion of main.js

requirejs.config(
{
  baseUrl: 'js',
  
  // Path mappings for the logical module names
  paths:
  //injector:mainReleasePaths
  {
    'knockout': 'libs/knockout/knockout-3.4.0.debug',
    'jquery': 'libs/jquery/jquery-2.1.3',
    'jqueryui-amd': 'libs/jquery/jqueryui-amd-1.11.4',
    'promise': 'libs/es6-promise/promise-1.0.0',
    'hammerjs': 'libs/hammer/hammer-2.0.4',
    'ojdnd': 'libs/dnd-polyfill/dnd-polyfill-1.0.0',
    'ojs': 'libs/oj/v2.0.0/debug',
    'ojL10n': 'libs/oj/v2.0.0/ojL10n',
    'ojtranslations': 'libs/oj/v2.0.0/resources',
    'text': 'libs/require/text',
    'signals': 'libs/js-signals/signals',
    'mcs': 'mcs/mcs'
  }
  //endinjector
  ,
  // Shim configurations for modules that do not expose AMD
  shim:
  {
    'jquery':
    {
      exports: ['jQuery', '$']
    }
  }
}
);

 

Get MCS URL and Keys

Login to the MCS service console and go to the mobile backend that the app will use. On the Settings page, you can obtain the keys needed to wire your mobile app to the MCS backend.  Click the “Show” links to see the URLs and keys for the mobile backend. These will be used to configure the MCS connection object in JavaScript.

settings

Create the view model

To interact with MCS from the app, an interface to MCS needs to be created. Since the MCS JavaScript SDK is already available to the app, a new JavaScript file can be created to call the MCS methods that the application needs to use.
Like the “mcs” folder that was created in the previous step, add another folder called “mbe”, an abbreviation for “Mobile Backend”. Create a JavaScript file called mbe.js. This should be at the same level as the mcs folder previously created.

 

Insert the code below into the file, changing the keys to use your MCS backend keys. This file does several things to configure the application’s connection to the Mobile Backend in MCS:

 

  • The configuration is defined for a backend called “JETSample”. This name doesn’t have to match the name of the backend that is actually in MCS, but can if you wish to match it.
  • The “mcs” reference is included in the define array of dependencies. This reference is available because the Require.js configuration in main.js was already updated in an earlier step.
  • The mcs_config object contains the URL and keys for accessing MCS. This object is defined in the official documentation for MCS. The basicAuth and OAuth portions of the object are defined, but only basicAuth is used in the demo.
  • The init() method in the file declares and initializes the MCS backend object, setting the authentication type to basic auth. A user named jetuser is defined in this particular MCS backend.
  • The methods defined in this file are for login and logout of the backend. A method for anonymous authentication is included but will not be used. The authenticate and logout methods will be used for a demo on the dashboard page to work in concert with JET components for a basic login page.
  • This file also contains methods for adding to a collection in MCS, which we will use later when interacting with the Cordova camera plugin. This example hardcodes a Storage collection name, “JETCollection”, that is assumed to be defined on the MCS backend.

mbe.js

define(['jquery', 'mcs'], function ($) {
    //define MCS mobile backend connection details
    var mcs_config = {
        "logLevel": mcs.logLevelInfo,
        "mobileBackends": {
            "JETSample": {
                "default": true,
                "baseUrl": "https://mobileportalsetrial-yourdomain.mobileenv.us2.oraclecloud.com:443",
                "applicationKey": "0fc655f4-0000-4876-0000-0000000000000",
                "authorization": {
                    "basicAuth": {
                        "backendId": "b00000cf-0000-4cda-a3e0-000000000000",
                        "anonymousToken": "redacted"
                    },
                    "oAuth": {
                        "clientId": "00000000-2c24-4667-b80e-000000000000000",
                        "clientSecret": "mysecretkey",
                        "tokenEndpoint": "https://mobileportalsetrial-yourdomain.mobileenv.us2.oraclecloud.com/oam/oauth2/tokens"
                    }
                }
            }
        }
    };

    function MobileBackend() {
        var self = this;
        self.mobileBackend;
        //Always using the same collection in this example, called JETCollection. Can be dynamic if using multiple collections, but for example using one collection.
        var COLLECTION_NAME = "JETCollection";
        function init() {
            mcs.MobileBackendManager.setConfig(mcs_config);
            //MCS backend name for example is JETSample. 
            self.mobileBackend = mcs.MobileBackendManager.getMobileBackend('JETSample');
            self.mobileBackend.setAuthenticationType("basicAuth");            
        }

        //Handles the success and failure callbacks defined here
        //Not using anonymous login for this example but including here. 
        self.authAnonymous = function () {
            console.log("Authenticating anonymously");
            self.mobileBackend.Authorization.authenticateAnonymous(
                    function (response, data) {                        
                        console.log("Success authenticating against mobile backend");
                    },
                    function (statusCode, data) {
                        console.log("Failure authenticating against mobile backend");
                    }
            );
        };

        //This handles success and failure callbacks using parameters (unlike the authAnonymous example)
        self.authenticate = function (username, password, successCallback, failureCallback) {
            self.mobileBackend.Authorization.authenticate(username, password, successCallback, failureCallback);
        };

        //logout of the backend; the success and failure callback parameters are
        //accepted for symmetry with authenticate, but are not wired to the SDK call here
        self.logout = function (successCallback, failureCallback) {
            self.mobileBackend.Authorization.logout();
        };

        self.isAuthorized = function () {
            return self.mobileBackend.Authorization.isAuthorized;
        };
       
        self.uploadFile = function (filename, payload, mimetype, callback) {            
            self.getCollection().then(success);                        
            
            function success(collection) {                
                //create new Storage object and set its name and payload
                var obj = new mcs.StorageObject(collection);
                obj.setDisplayName(filename);
                obj.loadPayload(payload, mimetype);                
                return self.postObject(collection, obj).then(function (object) {                                        
                    callback(object);
                });
            }
        }
        
        //getCollection taken from official documentation example at site https://docs.oracle.com/cloud/latest/mobilecs_gs/MCSUA/GUID-7DF6C234-8DFE-4143-B138-FA4EB1EC9958.htm#MCSUA-GUID-7A62C080-C2C4-4014-9590-382152E33B24
        //modified to use JQuery deferred instead of $q as shown in documentation
        self.getCollection = function () {
            var deferred = $.Deferred();

            //return a storage collection with the name assigned to the collection_id variable.
            self.mobileBackend.Storage.getCollection(COLLECTION_NAME, self.mobileBackend.Authorization.authorizedUserName, onGetCollectionSuccess, onGetCollectionFailed);

            return deferred.promise();

            function onGetCollectionSuccess(status, collection) {
                console.log("Collection id: " + collection.id + ", description: " + collection.description);
                deferred.resolve(collection);
            }

            function onGetCollectionFailed(statusCode, headers, data) {
                console.log(mcs.logLevelInfo, "Failed to download storage collection: " + statusCode);
                deferred.reject(statusCode);
            }
        };

        //postObject taken from official documentation example at site https://docs.oracle.com/cloud/latest/mobilecs_gs/MCSUA/GUID-7DF6C234-8DFE-4143-B138-FA4EB1EC9958.htm#MCSUA-GUID-7A62C080-C2C4-4014-9590-382152E33B24
        //modified to use JQuery deferred instead of $q as shown in documentation
        self.postObject = function (collection, obj) {
            var deferred = $.Deferred();

            //post an object to the collection
            collection.postObject(obj, onPostObjectSuccess, onPostObjectFailed);
            
            return deferred.promise();

            function onPostObjectSuccess(status, object) {            
                console.log("Posted storage object, id: " + object.id);
                deferred.resolve(object.id);
            }

            function onPostObjectFailed(statusCode, headers, data) {
                console.log("Failed to post storage object: " + statusCode);
                deferred.reject(statusCode);
            }
        };

        init();
    }

    return new MobileBackend();
});

 

Add JET Components to the dashboard page

For this example the dashboard page will have a login form added to it. The JET cookbook can be used for understanding how to add form elements and buttons into a view and viewModel. Knockout knowledge is important when using JET components. For beginning Knockout users the http://learn.knockoutjs.com/ tutorial provides a good hands-on lab.
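
As a minimal illustration of the observable pattern this page relies on (the names here are illustrative only, not part of the sample app):

// an observable notifies any bound DOM element when its value changes
var vm = { username: ko.observable("") };
ko.applyBindings(vm);    // bind the view model to the page
vm.username("jetuser");  // any element bound to 'username' re-renders automatically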

Replace the scaffolded app’s dashboard.html page with the following html. This html adds username and password entry fields using JET components. The login form is only visible when the user is not authorized, and the logout button is only visible when the user is authorized. The Knockout visible binding is used to hide and show elements based on the login status.

The username and password entry field values are bound to Knockout observables defined in dashboard.js. JET bindings for form inputs use the “value” attribute. The login button is bound on the click event to a function in the view model called login. Likewise, the logout button is bound on a click event to a logout function in the view model.

Additional elements that will be used are commented out for the time being.

dashboard.html

<div class="oj-hybrid-padding">
  <h3>Dashboard Content Area</h3>
  <div>
    <div data-bind="visible: !isLoggedIn()" class="oj-flex oj-sm-flex-direction-column oj-md-flex-direction-column"> 
      <div class="oj-flex-item">
        <label for="text-input">Username</label>
        <input id="text-input" type="text" data-bind="ojComponent: {component: 'ojInputText', value: username}"/>
      </div>
      <div class="oj-flex-item">
        <label for="password">Password</label>
        <input type="password" id="password" data-bind="ojComponent: {component: 'ojInputPassword', value: password}"/>
      </div>
      <div class="oj-flex-item">
        <input id="inputButton" type="button" data-bind="click: login, ojComponent: {component: 'ojButton', label: 'Login', chroming: 'full'}"/>
      </div>
    </div> 
  </div>
  <div data-bind="visible: isLoggedIn">
    <input id="inputButton" type="button" data-bind="click: logout, ojComponent: {component: 'ojButton', label: 'Logout', chroming: 'full'}"/> 

    <!--<input id="inputButton" type="button" data-bind="click: takePicture, ojComponent: {component: 'ojButton', label: 'Take a picture', chroming: 'none'}"/> 
    <br>
    <img id="cameraImage" src="" height="250" width="100%" data-bind="attr: { src: picture }">

    <input id="inputButton" type="button" data-bind="click: uploadPicture, ojComponent: {component: 'ojButton', disabled: (picture() === null), label: 'Upload to MCS', chroming: 'half'}"/> -->
  </div> 
</div>

 

Edit the dashboard view model

The dashboard.html page changes require related changes to the view model. Replace the dashboard.js file with the JavaScript below.

The first change is to include the proper dependencies in the define method (first line). The Mobile Backend helper needs to be added here as a reference in order to make calls to the MCS wrapper that we created in mbe.js.

Additional references are needed for JET components used on the page, in this case buttons and form inputs. Three Knockout observables are defined, one for the login status, one for the username, and one for the password. These are bound in the html to the JET input components. The login status is used to hide or show the login form or logout button based on the status (true or false). The username and password are initialized to working values so no typing is required in testing the login. An observable for picture is also defined for when we later add the Cordova camera plugin.

Notice the methods “authenticate” and “logout” call the Mobile Backend helper object’s methods to handle calls to MCS. In the case of the login method, callbacks are passed to the Mobile Backend helper so that upon success or failure the dashboard view model can react.

 

dashboard.js

define(['ojs/ojcore', 'knockout', 'jquery', 'mbe/mbe', 'ojs/ojknockout', 'ojs/ojselectcombobox', 'ojs/ojbutton', 'ojs/ojinputtext'],
        function (oj, ko, $, mbe) {
            function DashboardViewModel() {
                var self = this;
                self.isLoggedIn = ko.observable(false);

                //set user and password to default values to quicken your login testing
                self.username = ko.observable("jetuser");

                //the password observable is bound to the ojInputPassword field in the view;
                //"yourPassword" is a placeholder - use your test user's password
                self.password = ko.observable("yourPassword");

                self.picture = ko.observable(null);

                //pass callbacks to the login to trigger page behavior on success or failure
                self.login = function () {
                    mbe.authenticate(self.username(), self.password(), self.loginSuccess, self.loginFailure);
                };

                //pass callbacks to the login to trigger page behavior on success or failure
                self.logout = function () {
                    mbe.logout();
                    self.isLoggedIn(false);
                };

                self.loginSuccess = function (response) {
                    console.log(response);                    
                    self.isLoggedIn(true);
                };

                self.loginFailure = function (statusCode) {
                    self.isLoggedIn(false);
                    alert("Login failed! " + statusCode);
                };

            }
            return new DashboardViewModel;
        }
);

 

Test the login and logout

You may have to run “grunt clean copy” once to get all the updated files in place, and then run grunt serve. If a 401 error occurs on login, make sure that you are not hitting the CORS error (a workaround is described in the next section).

After login, the Knockout visible binding on the isLoggedIn observable then hides the form and shows the logout button.

loginpagelogout

Cross Origin Error?

Calling backend services on MCS from your app while testing in a browser may cause a CORS error. This shows up in the browser console as an Access-Control-Allow-Origin error. An example of this error is:

Cross-Origin Request Blocked: The Same Origin Policy disallows reading the remote resource at https://mymcshost.mobileenv.us2.oraclecloud.com/mobile/platform/users/login. (Reason: CORS header ‘Access-Control-Allow-Origin’ missing).

 

To avoid this error while developing and testing in the browser, refer to Appendix B of the MCS guide that covers Oracle Mobile Cloud Service Environment Policies.  Specifically, the Security_AllowOrigin policy can be changed from the default value of “disallow” to “allow” in order to workaround this error in your development environment.

Content Security Policy Header in index.html

The Content-Security-Policy meta tag used in Cordova apps is an important addition to your hybrid app’s index.html file. If the tag is not set, warnings will appear in the browser console while the app is running on Android. The details of this meta tag can be found at the Cordova whitelist-plugin github page.

https://github.com/apache/cordova-plugin-whitelist

A default declaration for your app can be the following. Note that the “gap:” entry is needed for iOS when the Cordova Splash plugin is in use. Put this into the head of the index.html file. Specific needs may require alterations to this, and for production consider restricting it to only the hosts your app needs to communicate with, such as the MCS host.

<!--Allows connection to any host. Consider changing this to only allow calls to your MCS host: connect-src 'self' http://mobilecloudservicehost -->
<meta http-equiv="Content-Security-Policy" content="default-src * data: gap:; script-src 'self' 'unsafe-inline' 'unsafe-eval' 127.0.0.1:* localhost:*; style-src 'self' 'unsafe-inline'; media-src *"/>

Working with Cordova Plugins

This set of steps will briefly cover how to add Cordova plugins to the project; for full details on Cordova plugins see the Apache Cordova website. The sample in this demo takes a picture using the device and then uploads it to an MCS Collection as a jpg file.
Cordova has a set of core plugins that can be added to the app. To add a plugin, change directory on the command line into the project’s “hybrid” folder. This is where the config.xml file for Cordova is located, which defines the Cordova plugins, platforms, and hooks.
To add a plugin, use the “cordova plugin add” command. To persist the change to the config.xml file, use the --save option on the command. Add the camera and file plugins using these commands.

cordova plugin add cordova-plugin-camera --save
cordova plugin add cordova-plugin-file --save

Once the plugins are installed, the config.xml will have two new lines added to it.

 

<plugin name="cordova-plugin-camera" spec="~2.1.1" />
<plugin name="cordova-plugin-file" spec="~4.1.1" />

These lines should be near the end of the config.xml file. For questions about the contents of the config.xml, Apache has a reference guide for the config.xml file that describes all of the entries in detail.

Add a Mobile Backend Method for Uploading to a MCS Collection

In the Mobile Backend helper file that uses the MCS JavaScript SDK, the methods required for uploading files to a collection were already added when you created mbe.js. In MCS, create a collection named “JETCollection” (or whatever you like). Set the collection name in the mbe.js file to a variable accessible by the view model methods. This was already done in the mbe.js sample code listed earlier in this post.

var COLLECTION_NAME = "JETCollection";

The getCollection and postObject methods are documented using the $q async/promise library at this MCS documentation link. However, since JET already has jQuery available, the $.Deferred object can be used instead to handle the async calls to MCS for getting a collection and posting a file to it. In the callbacks, the deferred object is resolved or rejected.

From the dashboard, the uploadFile method can be used by passing a filename, file blob (arrayBuffer), the MIME type of the file (image/jpeg), and lastly a callback to run on completion so that the user can be notified that the upload is done.
The uploadFile performs two tasks: first, get the collection from MCS, and second, POST the file object to the storage collection.
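
The adaptation used in both methods boils down to a small wrapper pattern. The sketch below is a generalization of what mbe.js does; the helper name is hypothetical:

// hypothetical helper: turn an MCS callback-style call into a jQuery promise
function asPromise(invoke) {
    var deferred = $.Deferred();
    invoke(
        function (status, result) { deferred.resolve(result); },   // success callback
        function (statusCode) { deferred.reject(statusCode); }     // failure callback
    );
    return deferred.promise();
}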

 

Using the Plugins in the Dashboard

The capabilities of the plugin “cordova-plugin-camera” are on the Apache documentation site where the core Cordova plugins are documented. For this example, we will add a button to the dashboard page that uses the getPicture method of the Camera plugin.
First, edit the view and view model. Add two buttons and an image tag to dashboard.html. (These were commented out in the earlier dashboard.html listing, so you can now uncomment those lines.) This html is placed after the “Logout” button but inside the same div, so that the picture area is only visible when the user is logged in.

For illustration, this JET button shows the chroming option of “none”. This styles the button differently from the login and logout buttons previously created. A third option for chroming is “half”, which will give another look to the button. The “Upload to MCS” button uses the “half” option. Another JET component option on the button is to make it disabled based on certain criteria in the view model. If the picture variable in the view model is null, the disabled option is turned on. This only allows the user to click the button when there is a picture already taken.

 

Add tags on dashboard.html

<input id="inputButton" type="button" data-bind="click: takePicture, ojComponent: {component: 'ojButton', label: 'Take a picture', chroming: 'none'}"/> 
<br>
<img id="cameraImage" src="" height="250" width="100%" data-bind="attr: { src: picture }"> 
<input id="inputButton" type="button" data-bind="click: uploadPicture, ojComponent: {component: 'ojButton', disabled: (picture() === null), label: 'Upload to MCS', chroming: 'half'}"/>

 

The buttons rely on methods named “takePicture” and “uploadPicture” in the view model, so add these functions in the view model dashboard.js. The takePicture method will use the “navigator.camera” object which is available due to the Cordova plugin. Add a check for the existence of the camera object, because running in the browser this object will not exist. To take a picture, the camera plugin has a method called “getPicture”. The parameters are success and failure callback methods. A third parameter for camera options is available. The options parameter allows control over image quality, encoding, height, width, and more. Add the code in the block below to the dashboard.js.

Add this block to dashboard.js

//use the Cordova camera plugin to take a picture
                self.takePicture = function () {
                    if (navigator.camera && typeof navigator.camera !== "undefined") {       
                        //sample camera options, using defaults here but for illustration....
                        //Note that the destinationType can be a DATA_URL but cordova plugin warns of memory usage on that.
                        var cameraOptions = {
                            quality: 50,
                            destinationType: Camera.DestinationType.FILE_URL,
                            sourceType: Camera.PictureSourceType.CAMERA,
                            allowEdit: false,
                            encodingType: Camera.EncodingType.JPEG,                            
                            saveToPhotoAlbum: false,
                            correctOrientation: true
                        };
                        //use camera pluging method to take a picture, use callbacks for handling result
                        navigator.camera.getPicture(cameraSuccess, cameraError, cameraOptions);
                    } else {
                        //running on web, the navigator.camera object will not be available
                        console.log("The navigator.camera object is not available.")
                    }
                };

                function cameraSuccess(imageData) {
                    //returns a file path such as: file:///storage/emulated/0/Android/data/org.oraclejet.mcsexample/cache/1459277993352.jpg
                    //set the observable to the path; the img tag's src attribute in the view will be updated.
                    self.picture(imageData);

                }

                function cameraError(error) {
                    console.log(error);
                }

                self.uploadPicture = function () {
                    //load file as blob, then once loaded add to MCS collection. Use callback for when complete.                    
                    getBlobFromFile(self.picture())
                            .then(function (arrayBuffer) {
                                mbe.uploadFile("picture.jpg", arrayBuffer, "image/jpeg", self.pictureUploadSuccess);                                
                            });
                };

                self.pictureUploadSuccess = function (objectid) {
                    console.log(objectid);
                    //showing alert to notify user that upload completed, MCS object id is shown.
                    alert("Picture uploaded to MCS, id is: " + objectid);
                };

                function getBlobFromFile(filepath) {
                    //Use Cordova file plugin API to get the file:
                    //On success, load the file as array buffer
                    var deferred = $.Deferred();

                    if (window.resolveLocalFileSystemURL && typeof window.resolveLocalFileSystemURL !== "undefined") {
                        window.resolveLocalFileSystemURL(filepath,
                                function (fileEntry) {
                                    //on success use fileEntry handle to read the file, then run callback when onloadend event occurs 
                                    fileEntry.file(function (file) {
                                        var reader = new FileReader();
                                        reader.onloadend = function (e) {                                            
                                            deferred.resolve(this.result);
                                        };
                                        reader.onerror = function (e) {                                            
                                            deferred.reject(e);
                                        };
                                        reader.readAsArrayBuffer(file);
                                    });
                                },
                                function (error) {
                                    console.log("Error getting file. Message: ", error);
                                    deferred.reject(error);                                    
                                }
                        );
                    } else {
                        var msg = "The object window.resolveLocalFileSystemURL does not exist. Cannot get file."
                        console.log(msg);
                        //reject immediately
                        deferred.reject(msg);
                    }
                    return deferred.promise();
                }

Build and Serve to Device

To build and run on an emulator or device, be sure that you have followed the guides for your mobile operating system.

Android: Follow the steps on Cordova’s site for Android setup.
https://cordova.apache.org/docs/en/latest/guide/platforms/android/index.html

iOS: Follow the steps on Cordova’s site for iOS setup.
https://cordova.apache.org/docs/en/latest/guide/platforms/ios/index.html

For Android, if you have an emulator running or device plugged in via USB, you should be able to see them using the Android Debug Bridge (adb). Running adb devices will show all emulators and devices attached.

adb devices

Output:

List of devices attached
c143bx44 device

If your device or emulator doesn’t show up, check the documentation on the Android Debug Bridge page to see if you have set up the device or emulator properly. For devices that fail to appear in the list, USB debugging may not be enabled on the device itself, or a driver for your device isn’t installed on the machine (see the Google OEM drivers page).

 

Build and Serve for Android

The first step once you are ready to deploy is to build the .apk file for Android. This command is already familiar from when you first created the hybrid app and performed a build on the project. Keep in mind that this command should be run from the project root, where the Gruntfile.js is located, not from the hybrid directory. Only cordova commands need to be run in the hybrid directory.
Note: These steps are covered in the official JET documentation.

grunt build:dev --platform=android

Once the .apk file is built, you can serve the app to the device or emulator. The following command installs the app onto the device.

grunt serve:dev --platform=android --destination=device

The serve command should show a successful build followed by the installation and launch of the .apk file on the device or emulator.

BUILD SUCCESSFUL
Total time: 8.941 secs
Built the following apk(s):
D:\jet\mcsexample\hybrid\platforms\android\build\outputs\apk\android-debug.apk
Using apk: D:\jet\mcsexample\hybrid\platforms\android\build\outputs\apk\android-debug.apk
Installing app on device...
Launching application...
LAUNCH SUCCESS
Done, without errors.

 

Take a Picture

Take a picture. Hopefully your picture will be more inspiring than the picture of my desk! The dashboard should show the image after a picture is taken. The image can then be uploaded to MCS by clicking the “Upload to MCS” button.

takepicture

 

If you are using the Android emulator, and the “Back Camera” is set to “Emulated”, the picture you can take is a green square on a background of checkered squares.

takepicture-emulator

 

Once the file is uploaded, a new “picture.jpg” will be visible in the Collection.

uploaded

Inspect in Chrome

To troubleshoot the app for Android, open Chrome and enter the address “chrome://inspect”. This should show a list of running applications on the device. In the image below, the “WebView in org.oraclejet.mcsexample” is the app that was served. Clicking the “inspect” link will bring up Chrome developer tools and allow for debugging the app while it runs on the device.

inspect

The inspect view allows the use of Chrome developer tools while the app runs on the device or emulator.

 inspect-dashboard

Conclusion

The MCS JavaScript SDK provides the link between your Cordova-based application and Oracle Mobile Cloud Service. Just as this example used Oracle JET, the same SDK can be used with Ionic/Angular apps. The MCS JavaScript SDK covers more than just authentication and file storage, including the ability to register the device, log analytics, and call custom APIs. The SDK can save time in developing hybrid apps since the interactions with the backend are already built for you, and all of the power of Oracle Mobile Cloud Service is at your command using a supported SDK.

 

Integration Cloud Service (ICS) Security & Compliance


The attached white paper is the product of a joint A-Team effort that included Deepak Arora, Mike Muller, and Greg Mally.  Oracle Integration Cloud Service (ICS) runs within the Oracle Cloud where the architecture is designed to provide customers with a unified suite of Cloud Services with best-in-class performance, scalability, availability, and security. The Cloud Services are designed to run on a unified data center, hardware, software, and network architecture. This document is based on the Cloud Security Assessment section of the Security for Cloud Computing: 10 Steps to Ensure Success V2.0 document, which is produced by the Cloud Standards Customer Council where Oracle is a member.

For more details, see attached:

ICS Security and Compliance_v1.0

Java API for Integration Cloud Service


Introduction

Oracle ICS (Integration Cloud Service) provides a set of handy REST APIs that allow users to manage and monitor related artifacts such as connections, integrations, lookups and packages. It also allows the retrieval of monitoring metrics for further analysis. More details can be found in the following documentation link.

The primary use case for these REST APIs is to allow command-line interactions to perform tasks such as gathering data about ICS integrations or backing up a set of integrations by exporting their contents. In order to interface with these REST APIs, users may adopt command-line utilities such as cURL. For instance, the command below shows how to retrieve information about a connection:

curl -u userName:password -H "Accept:application/json" -X GET https://your-ics-pod.integration.us2.oraclecloud.com/icsapis/v1/connections/connectionId

If the command above executes successfully, the output should be something like the following JSON payload:

{
   "links":{
      "@rel":"self",
      "@href":"https:// your-ics-pod.integration.us2.oraclecloud.com:443/icsapis/v1/connections/connectionId"
   },
   "connectionproperties":{
      "displayName":"WSDL URL",
      "hasAttachment":"false",
      "length":"0",
      "propertyGroup":"CONNECTION_PROPS",
      "propertyName":"targetWSDLURL",
      "propertyType":"URL_OR_FILE",
      "required":"true"
   },
   "securityproperties":[
      {
         "displayName":"Username",
         "hasAttachment":"false",
         "length":"0",
         "propertyDescription":"A username credential",
         "propertyGroup":"CREDENTIALS",
         "propertyName":"username",
         "propertyType":"STRING",
         "required":"true"
      },
      {
         "displayName":"Password",
         "hasAttachment":"false",
         "length":"0",
         "propertyDescription":"A password credential",
         "propertyGroup":"CREDENTIALS",
         "propertyName":"password",
         "propertyType":"PASSWORD",
         "required":"true"
      }
   ],
   "adaptertype":{
      "appTypeConnProperties":{
         "displayName":"WSDL URL",
         "hasAttachment":"false",
         "length":"0",
         "propertyGroup":"CONNECTION_PROPS",
         "propertyName":"targetWSDLURL",
         "propertyType":"URL_OR_FILE",
         "required":"true"
      },
      "appTypeCredProperties":[
         {
            "displayName":"Username",
            "hasAttachment":"false",
            "length":"0",
            "propertyDescription":"A username credential",
            "propertyGroup":"CREDENTIALS",
            "propertyName":"username",
            "propertyType":"STRING",
            "required":"true"
         },
         {
            "displayName":"Password",
            "hasAttachment":"false",
            "length":"0",
            "propertyDescription":"A password credential",
            "propertyGroup":"CREDENTIALS",
            "propertyName":"password",
            "propertyType":"PASSWORD",
            "required":"true"
         }
      ],
      "appTypeLargeIconUrl":"/images/soap/wssoap_92.png",
      "appTypeMediumGrayIconUrl":"/images/soap/wssoap_g_46.png",
      "appTypeMediumIconUrl":"/images/soap/wssoap_46.png",
      "appTypeMediumWhiteIconUrl":"/images/soap/wssoap_w_46.png",
      "appTypeName":"soap",
      "appTypeSmallIconUrl":"/images/soap/wssoap_32.png",
      "displayName":"SOAP",
      "features":"",
      "source":"PREINSTALLED",
      "supportedSecurityPolicies":"Basic Authentication, Username Password Token, No Security Policy"
   },
   "code":"connectionCode",
   "imageURL":"/images/soap/wssoap_w_46.png",
   "name":"Connection Name",
   "percentageComplete":"100",
   "securityPolicy":"USERNAME_PASSWORD_TOKEN",
   "status":"CONFIGURED",
   "supportsCache":"true"
}

These APIs were designed to return JSON payloads in most cases. However, some operations allow the result to be returned in the XML format. Users can control this by specifying the “Accept” HTTP header in the request.
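
For example, for operations that support XML, the connection lookup shown earlier can return XML simply by changing the header:

curl -u userName:password -H "Accept:application/xml" -X GET https://your-ics-pod.integration.us2.oraclecloud.com/icsapis/v1/connections/connectionId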

Regardless of which format is chosen, users must handle the payload to read the data. This means they will need to develop a program that retrieves the payload, parses it, and then works with the data. The same applies to invoking REST endpoints with path parameters or posting payloads to them. The end result is that a considerable amount of boilerplate code must be written, tested and maintained, no matter which programming language is chosen.

Aiming to make things easier, the Oracle A-Team developed a Java API to abstract the technical details about how to interact with the REST APIs. The result is a simple-to-use, very small JAR file that contains all you need to rapidly create applications that interact with ICS. Because the API is written in Java, it can be reused across a wide set of programming languages that can run on a JVM including Clojure, JavaScript, Groovy, Scala, Ruby, Python, and of course Java.

The Java API for ICS is provided free to use “AS-IS” but without any official support from Oracle. Bugs, feedback and enhancement requests are welcome, but need to be submitted using the comments section of this blog, and the A-Team will address them on a best-effort basis.

This blog will walk you through the steps required to use this Java API, providing code samples that demonstrate how to implement a number of common use cases.

Getting Started with the Java API for ICS

The first thing you need to do to start playing with the Java API for ICS is to download a copy of the library. You can get a free copy here. This library also depends on a few Open-Source libraries so you will need to download these as well. The necessary libraries are:

* FasterXML Jackson 2.0: The library uses this framework to handle JSON transformations back and forth to Java objects. You can download the libraries here. The necessary JAR files are: jackson-core, jackson-annotations and jackson-databind.

* Apache HTTP Components: Used to handle any HTTP interaction with the REST APIs for ICS. You can download the libraries here. The necessary JAR files are: http-core, http-client, http-mime and commons-codec and commons-logging.

It is important to remember that you must use JDK 1.6 or higher. Any JDK older version won’t work. Once all the libraries are on the classpath, you will be ready to get started.

Excuse me Sir – May I Have a Token?

The Java API for ICS was designed to provide the highest level of simplicity possible. Thus, pretty much all operations can be executed through a single object, called Token. In simpler terms, a token gives you access to execute operations against your ICS pod. However, as you may expect, tokens are not freely accessible: in order to create a token, your code needs to authenticate against ICS. The example below shows how to create a token.

import com.oracle.ateam.cloud.ics.javaapi.Token;
import com.oracle.ateam.cloud.ics.javaapi.TokenManager;

public class CreatingTokens {

   public static void main(String[] args) throws Exception {

      String serviceUrl = "https://your-ics-pod.integration.us2.oraclecloud.com";
      String userName = "yourUserName";
      String password = "yourPassword";

      Token token = TokenManager.createToken(serviceUrl, userName, password);
      System.out.println("Yeah... I was able to create a token: " + token);

   }

}

The parameters used to create a token are pretty straightforward, and you should already know them for your ICS pod. When the createToken() method is executed, it tries to authenticate against the ICS pod mentioned in the service URL. If for some reason the authentication fails, an exception is raised with the details. Otherwise, the token is created and returned to the caller.

A token is a very lightweight object that can be reused across your application. Thus, it is a good idea to cache it after its creation. Another important aspect of the token is that it is thread-safe. That means that multiple threads can simultaneously invoke its methods without concerns about locks of any kind.

Using a Token to Perform Operations against ICS

Once you have properly created a token, you can start writing code to retrieve data from ICS and/or invoke operations against it. The examples below will show various ways in which the Java API for ICS can be used.

Listing all the integrations, who created them, and their status

Token token = TokenManager.createToken(serviceUrl, userName, password);
List<Integration> integrations = token.retrieveIntegrations();

for (Integration integration : integrations) {

   System.out.println(integration.getName() + ": Created by '" +
   integration.getCreatedBy() + "' and it is currently " +
   integration.getStatus());

}

Showing source and target connections of one specific integration

Token token = TokenManager.createToken(serviceUrl, userName, password);
Integration integration = token.retrieveIntegration(integrationId, integrationVersion);
		
System.out.println("Integration: " + integration.getName());
Connection sourceConnection = integration.getSourceConnection();
Connection targetConnection = integration.getTargetConnection();
		
System.out.println("   Source Connection: " + sourceConnection.getName() +
         " (" + sourceConnection.getAdapterType().getDisplayName() + ")");
System.out.println("   Target Connection: " + targetConnection.getName() +
         " (" + targetConnection.getAdapterType().getDisplayName() + ")");

Exporting all integrations that are currently active

final String BACKUP_FOLDER = "/home/rferreira/ics/backup/";
Token token = TokenManager.createToken(serviceUrl, userName, password);
List<Integration> integrations = token.retrieveIntegrations();
		
for (Integration integration : integrations) {
						
   if (integration.getStatus().equals("ACTIVATED")) {
				
      Status status = integration.export(BACKUP_FOLDER + integration.getCode());
      System.out.println(integration.getCode() + " = " + status.getStatusInfo());
				
   }
			
}

Alternatively, if you are using JDK 1.8 then you could rewrite the entire for-each code using Lambdas:

integrations.parallelStream()
   .filter(i -> i.getStatus().equals("ACTIVATED"))
   .forEach((i) -> {
				
      String fileName = BACKUP_FOLDER + i.getCode();
      System.out.println(i.export(fileName).getStatusInfo());
			
   });

Printing monitoring metrics of one specific integration

Token token = TokenManager.createToken(serviceUrl, userName, password);
MonitoringMetrics monitoringMetrics = token.retrieveMonitoringMetrics(integrationId, integrationVersion);
		
System.out.println("Flow Name: " + monitoringMetrics.getFlowName());
System.out.println("   Messages Received...: " + monitoringMetrics.getNoOfMsgsReceived());
System.out.println("   Messages Processed..: " + monitoringMetrics.getNoOfMsgsProcessed());
System.out.println("   Number Of Errors....: " + monitoringMetrics.getNoOfErrors());
System.out.println("   Errors in Queues....: " +
   monitoringMetrics.getErrorsInQueues().getErrorObjects().size());
System.out.println("   Success Rate........: " + monitoringMetrics.getSuccessRate());
System.out.println("   Avg Response Time...: " + monitoringMetrics.getAvgRespTime());
System.out.println("   Last Updated By.....: " + monitoringMetrics.getLastUpdatedBy());

Deleting all connections that are currently incomplete

Token token = TokenManager.createToken(serviceUrl, userName, password);
List<Connection> connections = token.retrieveConnections();
		
for (Connection connection : connections) {
			
   if (Integer.parseInt(connection.getPercentageComplete()) < 100) {
				
      connection.delete();
				
   }
			
}

Deactivating all integrations whose name begins with “POC_”

Token token = TokenManager.createToken(serviceUrl, userName, password);
List<Integration> integrations = token.retrieveIntegrations();
		
for (Integration integration : integrations) {
			
   if (integration.getName().startsWith("POC_")) {
				
      System.out.println(integration.deactivate());
				
   }
			
}

Listing all packages and their integrations (using JDK 1.8 Lambdas)

Token token = TokenManager.createToken(serviceUrl, userName, password);
List<com.oracle.ateam.cloud.ics.javaapi.types.Package> pkgs = token.retrievePackages();
		
pkgs.forEach((p) -> {
			
   System.out.println("Package Name: " + p.getPackageName());
   p.getPackageContent().forEach(pc -> System.out.println(pc.getName()));
		
});

Importing integrations into ICS from the previously exported archive

Token token = TokenManager.createToken(serviceUrl, userName, password);
Status status = token.importIntegration("home/rferreira/ics/backup/myInteg.iar", false);
System.out.println(status.getStatusInfo());

Tip: the boolean parameter in the importIntegration() method controls whether or not the integration must be replaced. If that parameter is set to true, the import will override any existing integration that has the same name. If it is set to false, the import assumes that the integration does not exist in ICS and will create it.

Alternatively, you can import a complete set of integrations at once by importing a package:

Status status = token.importPackage("home/rferreira/ics/backup/samplePackage.par");
System.out.println(status.getStatusInfo());

Conclusion

ICS is a powerful iPaaS solution offered by Oracle that provides a robust set of management capabilities. Along with its development console, it also provides a set of REST APIs that enable the creation of custom apps that can fetch data from ICS. Although those REST APIs are useful, developers often need more productivity while writing their code. This blog introduced the Java API for ICS, a simple-to-use library that abstracts the technical details of the REST APIs, explained how to download and configure the library, and provided code samples demonstrating how to use it for a range of common ICS management functions.


Creating custom Fusion Applications User Interfaces using Oracle JET


Introduction

JET is Oracle’s new mobile toolkit, written specifically to help developers build client-side applications using JavaScript. Oracle Fusion Applications implementers are often given the requirement to create mobile, or desktop browser, based custom screens for Fusion Applications. There are many options available to the developer, for example Oracle ADF (Java based) and Oracle JET (JavaScript based). This blog article gives the reader a tutorial-style document on how to build a hybrid application using data from Oracle Fusion Sales Cloud. It is worth highlighting that although this tutorial uses Sales Cloud, the technique below is equally applicable to HCM Cloud, or any other Oracle SaaS cloud product which exposes a REST API.

Main Article

Pre-Requisites

It is assumed that you’ve already read the getting started guide on the Oracle JET website and installed all the pre-requisites. In addition, if you intend to create a mobile application, you will also need to install the mobile SDK from Apple (Xcode) or Google (Android SDK).

 

You must have an Apple Mac to be able to install the Apple iOS developer kit (Xcode); it is not possible to run Xcode on a Windows PC.

Dealing with SaaS Security

Before building the application itself we need to start executing the REST calls and getting our data, and security is the first hurdle we need to cross. Most Sales Cloud installations allow “Basic Authentication” to their APIs, so in REST this involves creating an HTTP header called “Authorization” with the value “Basic <your username:password>”, with the <username:password> section encoded as Base64. An alternative approach, used when embedding the application within Oracle SaaS, is to use a generated JWT token. This token is generated by Oracle SaaS using either Groovy or expression language. When embedding the application in Oracle SaaS you have the option of passing parameters; the JWT token would be one of these parameters and can subsequently be used instead of the <username:password>. When using a JWT token the Authorization string changes slightly: instead of “Basic” it becomes “Bearer”.

 

Usage                  Header Name     Header Value
Basic Authentication   Authorization   Basic <your username:password base64 encoded>
JWT Authentication     Authorization   Bearer <JWT Token>
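
As a hedged JavaScript sketch of building these headers in a client application (the variable names are illustrative; btoa is the browser’s built-in Base64 encoder):

// Basic Authentication: Base64-encode "username:password"
var basicHeaders = { "Authorization": "Basic " + btoa(userName + ":" + password) };

// JWT Authentication: the token is passed in from Oracle SaaS, no encoding required
var jwtHeaders = { "Authorization": "Bearer " + jwtToken };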

 

Groovy Script in SalesCloud to generate a JWT Token

def thirdpartyapplicationurl = oracle.topologyManager.client.deployedInfo.DeployedInfoProvider.getEndPoint("My3rdPartyApplication" )
def crmkey= (new oracle.apps.fnd.applcore.common.SecuredTokenBean().getTrustToken())
def url = thirdpartyapplicationurl +"?jwt="+crmkey
return (url)

Expression Language in Fusion SaaS (HCM, Sales, ERP etc) to generate a JWT Token

#{EndPointProvider.externalEndpointByModuleShortName['My3rdPartApplication']}?jwt=#{applCoreSecuredToken.trustToken}

Getting the data out of Fusion Applications using the REST API

When retrieving data from Sales Cloud we need to make sure we get the right data: not too much and not too little. Oracle Sales Cloud, like many other Oracle SaaS products, now supports a REST API for inbound and outbound data access. Oracle HCM also has a REST API but, at the time of writing this article, that API is in controlled availability.

Looking at the documentation hosted at the Oracle Help Center (http://docs.oracle.com/cloud/latest/salescs_gs/FAAPS/), the REST call to get all Sales Cloud Opportunities looks like this:

https://yourCRMServer/salesApi/resources/latest/opportunities

If you execute the above REST call you will notice that the resulting payload is large, some would say huge. There are good reasons for this: firstly, the Sales Cloud Opportunity object contains a large number of fields; secondly, the result contains not only data but also metadata; and finally, the request above is a select-all query. The metadata includes links to child collections, links to Lists of Values, what tabs are visible in Sales Cloud, custom objects, flexfields etc. Additionally, the query we just executed is the equivalent of a select * from table, i.e. it brings back everything, so we’ll also need to fix that.

 

Example snippet of a Sales Cloud Opportunity REST response showing custom fields, visible tabs, child collections etc.

"Opportunity_NewQuote_14047462719341_Layout6": "https://mybigm.bigmachines.com/sso/saml_request.jsp?RelayState=/commerce/buyside/document.jsp?process=quickstart_commerce_process_bmClone_4%26formaction=create%26_partnerOpportunityId=3000000xxx44105%26_partnerIdentifier=fusion%26_partnerAccountId=100000001941037",
  "Opportunity_NewQuote_14047462719341_Layout6_Layout7": "https://mybigMmachine.bigmachines.com/sso/saml_request.jsp?RelayState=/commerce/buyside/document.jsp?process=quickstart_commerce_process_bmClone_4%26formaction=create%26_partnerOpportunityId=300000060xxxx5%26_partnerIdentifier=fusion%26_partnerAccountId=100000001941037",
  "ExtnFuseOpportunityEditLayout7Expr": "false",
  "ExtnFuseOpportunityEditLayout6Expr": "false",
  "ExtnFuseOpportunityCreateLayout3Expr": "false",
  "Opportunity_NewQuote_14047462719341_Layout8": "https://mybigm-demo.bigmachines.com/sso/saml_request.jsp?RelayState=/commerce/buyside/document.jsp?process=quickstart_commerce_process_bmClone_4%26formaction=create%26_partnerOpportunityId=300000060744105%26_partnerIdentifier=fusion%26_partnerAccountId=100000001941037",
  "ExtnFuseOpportunityEditLayout8Expr": "false",
  "CreateProject_c": null,
  "Opportunity_DocumentsCloud_14399346021091": "https://mydoccloud.documents.us2.oraclecloud.com/documents/embed/link/LF6F00719BA6xxxxxx8FBEFEC24286/folder/FE3D00BBxxxxxxxxxxEC24286/lyt=grid",
  "Opportunity_DocsCloud_14552023624601": "https://mydocscserver.domain.com:7002/SalesCloudDocCloudServlet/doccloud?objectnumber=2169&objecttype=OPPORTUNITY&jwt=eyJhxxxxxy1pqzv2JK0DX-xxxvAn5r9aQixtpxhNBNG9AljMLfOsxlLiCgE5L0bAI",
  "links": [
    {
      "rel": "self",
      "href": "https://mycrmserver-crm.oracledemos.com:443/salesApi/resources/11.1.10/opportunities/2169",
      "name": "opportunities",
      "kind": "item",
      "properties": {
        "changeIndicator": "ACED0005737200136A6176612E7574696C2E41727261794C6973747881D21D99C7619D03000149000473697A65787000000002770400000010737200116A6176612E6C616E672E496E746567657212E2A0A4F781873802000149000576616C7565787200106A6176612E6C616E672E4E756D62657286AC951D0B94E08B020000787200106A6176612E6C616E672E4F626A65637400000000000000000000007870000000017371007E00020000000178"
      }
    },
    {
      "rel": "canonical",
      "href": "https://mycrmserver-crm.oracledemos.com:443/salesApi/resources/11.1.10/opportunities/2169",
      "name": "opportunities",
      "kind": "item"
    },
    {
      "rel": "lov",
      "href": "https://mycrmserver-crm.oracledemos.com:443/salesApi/resources/11.1.10/opportunities/2169/lov/SalesStageLOV",
      "name": "SalesStageLOV",
      "kind": "collection"
    },

Thankfully we can tell the REST API that we:

  • Only want to see the data, achieved by adding the onlyData=true parameter
  • Only want to see the following fields: OptyNumber, Name, CustomerName (TargetPartyName), achieved by adding a fields=<fieldName,fieldName> parameter
  • Only want to see a maximum of 10 rows, achieved by adding the limit=<value> parameter
  • Only want to see open opportunities, achieved by adding the q= parameter with a query string, in our case StatusCode=OPEN

If we want to get the data in pages/blocks we can use the offset parameter. The offset parameter tells the REST service to get the data “from” this offset. Using offset and limit we can effectively page through the data returned by Oracle Fusion Applications REST Service.
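
To make the paging arithmetic concrete, here is a sketch of a small hypothetical helper (not part of the tutorial code) that builds the URL for a given page:

// Hypothetical helper: build the opportunities URL for a 0-based page number
function buildPageUrl(hostname, pageNumber, pageSize) {
    return hostname + "/salesApi/resources/latest/opportunities" +
           "?onlyData=true&limit=" + pageSize +
           "&offset=" + (pageNumber * pageSize);
}
// buildPageUrl("https://yourCRMServer", 2, 10) yields ...&limit=10&offset=20, i.e. rows 21 to 30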

Our final REST request URL would look like this:

https://myCRMServer.oracledemos.com/salesApi/resources/latest/opportunities?onlyData=true&fields=OptyNumber,Name,Revenue,TargetPartyName,StatusCode&q=StatusCode=OPEN&offset=0&limit=10

The Oracle Fusion Applications REST API is documented in the relevant Oracle Fusion Applications documentation (e.g. for Sales Cloud, http://docs.oracle.com/cloud/latest/salescs_gs/FAAPS/). It is also worth noting that the Oracle Fusion Applications REST services are simply an implementation of the Oracle ADF Business Components REST services, which are very well documented here: https://docs.oracle.com/middleware/1221/adf/develop/GUID-8F85F6FA-1A13-4111-BBDB-1195445CB630.htm#ADFFD53992

Our final, tuned JSON result from the REST service will look something like this (truncated):

{
  "items": [
    {
      "Name": "Custom Sentinel Power Server @ Eagle",
      "OptyNumber": "147790",
      "StatusCode": "OPEN",
      "TargetPartyName": "Eagle Software Inc",
      "Revenue": 104000
    },
    {
      "Name": "Ultra Servers @ Beutelschies & Company",
      "OptyNumber": "150790",
      "StatusCode": "OPEN",
      "TargetPartyName": "Beutelschies & Company",
      "Revenue": 175000
    },
    {
      "Name": "Diablo Technologies 1012",
      "OptyNumber": "176800",
      "StatusCode": "OPEN",
      "TargetPartyName": "Diablo Technologies",
      "Revenue": 23650
    }
  ]
}

Creating the Hybrid Application

Now that we have our data source defined we can start to build the application. We want this application to be available on a mobile device, and therefore we will create a “Mobile Hybrid” application using Oracle JET with the navDrawer template.

yo oraclejet:hybrid OSCOptyList --template=navDrawer --platforms=android

Once the yeoman script has built your application, you can test the (basic) application using the following two commands.

grunt build --platform=android 
grunt serve --platform=android --web=true

The second grunt serve command has a web=true parameter at the end; this tells the script that we’re going to be testing this in our browser and not on the device itself. When this is run you should see a basic shell [empty] application in your browser window.


Building Our JavaScript UI

Previously you executed the yo oraclejet:hybrid command, which created a hybrid application from a template. Opening the resulting project in an IDE, like NetBeans, we can see that the project template has created a collection of files, one of which is “dashboard.html” (marked 1 in the image). Edit this file using your editor.

dashboard.html

 

Within the file, delete everything and replace it with this snippet of HTML code:

<div class="oj-hybrid-padding">
    <div class="oj-flex">
        <div class="oj-flex-item">
            <button id= "prevButton" 
                    data-bind="click: previousPage, 
                       ojComponent: { component: 'ojButton', label: 'Previous' }">
            </button>
            <button id= "nextButton"
                    data-bind="click: nextPage, 
                       ojComponent: { component: 'ojButton', label: 'Next' }">
            </button>
        </div>
    </div>
    <div class="oj-flex-item">    
        <div class="oj-panel oj-panel-alt1 oj-margin">
            <table id="table" summary="Opportunity List" aria-label="Opportunity List"
                   data-bind="ojComponent: {component: 'ojTable', 
                                data: opportunityDataSource, 
                                columnsDefault: {sortable: 'none'}, 
                                columns: [{headerText: 'Opty Number', 
                                           field: 'OptyNumber'},
                                          {headerText: 'Name', 
                                           field: 'Name'},
                                          {headerText: 'Revenue', 
                                           field: 'Revenue'},
                                          {headerText: 'Customer Name', 
                                           field: 'TargetPartyName'},
                                          {headerText: 'Status Code', 
                                           field: 'StatusCode'}
           ]}">
            </table>
        </div>    
    </div>
</div>

The above piece of HTML adds a JET table to the page; for prettiness we’ve wrapped the table in a decorative panel and added next and previous buttons. The table definition tells Oracle JET that the data is coming from a JavaScript object called “opportunityDataSource“; it also defines the columns, the column header text, and that the columns are not sortable. The button definitions reference two functions in our JavaScript (to follow) which will paginate the data.

Building The logic

We can now move on to the JavaScript side of things, that is, the part where we get the data from Sales Cloud and make it available to the table object in the HTML file. For this simplistic example we’ll get the data directly from Sales Cloud and display it in the table, with no caching and nothing fancy like collection models for pagination.

Edit the dashboard.js file, marked as 2 in the above image. This file is a RequireJS AMD (Asynchronous Module Definition) file and is pre-populated to support the dashboard.html page.

Within this file, cut-n-paste the following JavaScript snippet.

define(['ojs/ojcore', 'knockout', 'jquery', 'ojs/ojtable', 'ojs/ojbutton'],
        function (oj, ko, $) {
            function DashboardViewModel() {
                var self = this;
                var offset = 0;
                var limit = 10;
                var pageSize = 10;
                var nextButtonActive = ko.observable(true);
                var prevButtonActive = ko.observable(true);
                //
                self.optyList = ko.observableArray([{Name: "Fetching data"}]);
                console.log('Data=' + self.optyList);
                self.opportunityDataSource = new oj.ArrayTableDataSource(self.optyList, {idAttribute: 'Name'});
                self.refresh = function () {
                    console.log("fetching data");
                    var hostname = "https://yourCRMServer.domain.com";
                    var queryString = "/salesApi/resources/latest/opportunities?onlyData=true&fields=OptyNumber,Name,Revenue,TargetPartyName,StatusCode&q=StatusCode=OPEN&limit=10&offset=" + offset;
                    console.log(queryString);
                    $.ajax(hostname + queryString,
                            {
                                method: "GET",
                                dataType: "json",
                                headers: {"Authorization": "Basic " + btoa("username:password")},
                                // Alternative Headers if using JWT Token
                                // headers : {"Authorization" : "Bearer " + jwttoken},
                                success: function (data)
                                {
                                    self.optyList(data.items);
                                    console.log('Data returned ' + JSON.stringify(data.items));
                                    console.log("Rows Returned"+self.optyList().length);
                                    // Enable / Disable the next/prev button based on results of query
                                    if (self.optyList().length < limit)
                                    {
                                        $('#nextButton').attr("disabled", true);
                                    } else
                                    {
                                        $('#nextButton').attr("disabled", false);
                                    }
                                    if (offset === 0)
                                        $('#prevButton').attr("disabled", true);
                                },
                                error: function (jqXHR, textStatus, errorThrown)
                                {
                                    console.log(textStatus, errorThrown);
                                }
                            }
                    );
                };
                // Handlers for buttons
                self.nextPage = function ()
                {

                    offset = offset + pageSize;
                    console.log("off set=" + offset);
                    self.refresh();
                };
                self.previousPage = function ()
                {
                    offset = offset - pageSize;
                    if (offset < 0)
                        offset = 0;
                    self.refresh();
                };
                // Initial Refresh
                self.refresh();
            }
            
            return new DashboardViewModel;
        }
);

Let’s examine the code:

Line 1: Here we’ve modified the standard define so that it includes an ‘ojs/ojtable’ reference. This tells RequireJS, which the JET toolkit uses, that this piece of JavaScript uses a JET Table object.
Line 8 & 9 : These lines maintain variables to indicate if the button should be enabled or not
Line 11: Here we created a variable called optyList, this is importantly created as a knockout observableArray.
Line 13: Here we create another variable called “opportunityDataSource“, which is the variable the HTML page will reference. The main difference here is that this variable is of type oj.ArrayTableDataSource and that its idAttribute (the primary key) is Name.
Lines 14-47: Here we define a function called “refresh”. When this JavaScript function is called we execute a REST call back to Sales Cloud using jQuery’s ajax method. This call retrieves the data and then populates the optyList knockout observable array with data from the REST call. Specifically, note that we don’t assign the results to the optyList variable directly but purposely pass in a child array called “items”. If you execute the REST call we previously discussed, you’ll note that the data is actually stored in an array called items.
Line 23: This line defines the headers; specifically, in this case we’re defining a header called “Authorization”, with the username and password formatted as “username:password” and then Base64 encoded.
Line 24-25  :These lines define an alternative header which would be appropriate if a JWT token was being used. This token would be passed in as a parameter rather than being hardcoded
Lines 31-40 : These query the results of the query and determine if the next and previous buttons should be enabled or not using jQuery to toggle the disabled attribute
Lines 50-63 : These manage the next/previous button events
Finally on line 65 we execute the refresh() method when the module is initiated.

Running the example on your mobile

To run the example on your mobile device execute the following commands:

grunt build --platform=android 
grunt serve --platform=android

or if you want to test on a device

grunt serve --platform=android --destination=[device or emulator name]

If all is well you should see a table of data populated from Oracle Sales Cloud.

 

For more information on building JavaScript applications with the Oracle JET toolkit make sure to check out our other blog articles on JET here, the Oracle JET website here and the excellent Oracle JET YouTube channel here.

Running the example on the browser and CORS

If you try to run the example in your browser you’ll find it probably won’t work. If you look at the browser console (Ctrl+Shift+I on most browsers) you’ll probably see that the error was something like “XMLHttpRequest cannot load…” etc.

cors

This is because the code has violated “Cross Origin Scripting” rules. In a nutshell: a JavaScript application cannot access a resource that was not served up by the same server the application itself was served from. In my case the application was served up by NetBeans on http://localhost:8090, whereas the REST service from Sales Cloud is on a different server. Thankfully there is a solution called “CORS”. CORS stands for Cross Origin Resource Sharing and is a standard for solving this problem; for more information on CORS see this Wikipedia article, or other articles on the internet.
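
At the HTTP level, CORS works by the browser sending the page’s origin with the request and the server echoing it back when access is allowed. A simplified exchange (hostnames illustrative) looks like this:

Origin: http://localhost:8090
Access-Control-Allow-Origin: http://localhost:8090

The first header is sent by the browser with the request; the second is returned by the server if the origin is allowed, which tells the browser it may hand the response over to the JavaScript code.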

Configuring CORS in Fusion Applications

For our application to work in a web browser we need to enable CORS in Fusion Applications. We do this with the following steps:

  1. Log into Fusion Applications (SalesCloud, HCM etc) using a user who has access to “Setup and Maintenance”
  2. Access the Setup and Maintenance screens
  3. Search for Manage Administrator Profile Values and then navigate to that task
  4. Search for the “Allowed Domains” profile name (case sensitive!)
  5. Within this profile name you see a profile option called “site“; this profile option has a profile value
  6. Within the profile value add the hostname, and port number, of the application hosting your JavaScript application. If you want to allow “ALL” domains set this value to “*” (a single asterisk). WARNING: Ensure you understand the security implications of allowing ALL domains using the asterisk notation!
  7. Save and Close and then retry running your JET application in your browser.
setupandMaiteanceCORS

CORS Settings in Setup and Maintenance (Click to enlarge)

If all is good, when you run the application in your browser, or on your mobile device, you’ll now see the application running correctly.

JETApplication

Running JET Application (Click to enlarge)

 

Final Note on Security

To keep this example simple, the security username/password was hard-coded in the mobile application; this is not suitable for a real-world application. For a real application you would create a configuration screen, or use system preferences, to collect and store the username, password and Sales Cloud server URL, which would then be used in the application.

If the JET application is to be embedded inside a Fusion Applications page then you will want to use JWT token authentication. Modify the example so that the JWT token is passed into the application URL as a parameter and then use it in the JavaScript (lines 24-25) accordingly.
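
A minimal sketch of that modification, assuming the token arrives in a URL parameter named jwt (the helper below is hypothetical, not part of the tutorial code):

// Hypothetical helper: extract the jwt parameter from the page URL
function getJwtFromUrl() {
    var match = /[?&]jwt=([^&]+)/.exec(window.location.search);
    return match ? decodeURIComponent(match[1]) : null;
}

// Then, in the ajax call (replacing lines 24-25 of dashboard.js):
// headers: {"Authorization": "Bearer " + getJwtFromUrl()}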

For more information on JWT tokens in Fusion Applications see these blog entries (Link 1, Link 2) and, of course, the documentation.

Conclusion

As we’ve seen above, it’s quite straightforward to create mobile, and browser, applications using the Oracle JET framework. The above example was quite simple and only queried data; a real application would also have some write/delete/update operations, and therefore you would want to start looking at the JET Common Model and Collection Framework (DocLink) instead. Additionally, in the above example we queried data directly from a single Sales Cloud instance and did no processing on it. It is very likely that a single mobile application will need to get its data from multiple data sources and require some backend services to pre-process, and probably post-process, the data; in essence, to provide an API. We call this backend an “MBaaS”, i.e. Mobile Backend As A Service. Oracle also provides an MBaaS in its PaaS suite of products, called “Mobile Cloud Service”.

In a future article we will explore how to use Oracle Mobile Cloud Service (Oracle MCS) to query Sales Cloud and Service Cloud and provide an API to the client, which would use the more advanced technique of the JET Common Model/Collection framework.

 

 

SaaS workflow extensions using Process Cloud Service


Introduction

Oracle Process Cloud Service (PCS), a Platform-as-a-Service offering, enables human workflow capabilities on the cloud with an easy-to-use composer and workspace. PCS allows authoring of processes using business-analyst-friendly BPMN notation over swim-lanes. PCS eliminates the burden of building on-premise process management platforms, while allowing enterprises to leverage knowledge from on-premise implementations.

 

Key features of Process Cloud Service include:

  • Invoke workflows through Web forms, SOAP service and REST service.
  • Invoke external SOAP and REST/JSON services.
  • Synchronous and Asynchronous service invocations.
  • Ability to import existing BPMN based workflows.

With the rapid adoption of Oracle SaaS applications, PCS comes in handy as an option to extend SaaS with human-task workflows. Here are some scenarios where PCS is a strong candidate:

  • Workflows customizations to SaaS products are necessary to meet enterprise needs.
  • Workflow capabilities need to be enabled rapidly for on-premise or Cloud applications.
  • Orchestration use cases with heavy use of human tasks.

Let’s look at extending Supply Chain Management Cloud with a workflow in PCS to capture, review and submit sales orders.

Sample Workflow

In this scenario, users in an enterprise submit orders in a PCS workflow. PCS then sends the orders to Supply Chain Management Cloud’s Distributed Order Orchestration (DOO) web services. The status of the SCM Cloud sales order is retrieved for the user’s review before the workflow ends. This sample demonstrates the capabilities of PCS with basic functions of PCS and SCM Cloud; it could be extended for advanced use cases. Figure 1 shows the high-level workflow.

Figure 1

Workflow overview

 

Environment requirements for the sample

Below are the requirements to enable the sample workflow between PCS and SCM Cloud.

Process Cloud Service

  • Access to PCS composer and workflow provisioned.
  • Network connectivity between PCS and SCM verified.

Supply Chain Management Cloud (R11)

  • Access to SCM Cloud with implementation privileges provisioned.
  • SCM Order Management features provisioned and implemented.
  • Order Management Order Capture service endpoint.
  • Relevant configuration for Source systems and item relationships.
  • Collection of order reference data enabled.

The SCM Cloud Order Management module should be implemented in order for the order capture services to function properly. For more information on configuring Order Management, refer to the white papers listed at the bottom of this post. These documents might require access to the Oracle support portal.

 

SCM Cloud order management services

For the sample, a test instance of Oracle Supply Chain Management Cloud Release 11 was used. The order capture service accepts orders from upstream capture systems through ProcessOrderRequest calls. It also provides details of an order through the GetOrderDetails call. XML payloads for both services were captured from the sample workflow and are provided below; for the sake of brevity, detailed instructions on Order Management configuration are left to the support documentation.

As of Release 11, SCM Cloud only exposes SOA services for order capture. The SOA service endpoint for R11 is not listed in the catalog. The endpoint is:

https://<hostname>:<port>/soa-infra/services/default/DooDecompReceiveOrderExternalComposite/ReceiveOrderRequestService

Append “?WSDL” to the end of the endpoint to retrieve the WSDL.
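
For example, the WSDL would be retrieved from:

https://<hostname>:<port>/soa-infra/services/default/DooDecompReceiveOrderExternalComposite/ReceiveOrderRequestService?WSDL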

 

Sample payload for ProcessOrderRequest:

<?xml version = '1.0' encoding = 'UTF-8'?>
<env:Envelope xmlns:env="http://schemas.xmlsoap.org/soap/envelope/" xmlns:wsa="http://www.w3.org/2005/08/addressing">
   <env:Header>
      <wsa:To>https://eczc-test.scm.em2.oraclecloud.com:443/soa-infra/services/default/DooDecompReceiveOrderExternalComposite/ReceiveOrderRequestService</wsa:To>
      <wsa:Action>ProcessOrderRequestSync</wsa:Action>
      <wsa:MessageID>urn:eebb5147-2840-11e6-9a89-08002741191a</wsa:MessageID>
      <wsa:RelatesTo>urn:eebb5147-2840-11e6-9a89-08002741191a</wsa:RelatesTo>
      <wsa:ReplyTo>
         <wsa:Address>http://www.w3.org/2005/08/addressing/anonymous</wsa:Address>
         <wsa:ReferenceParameters>
            <orasoa:EndpointAddress xmlns:orasoa="http://xmlns.oracle.com/soa">http://localhost:7003/soa-infra/services/testing/SalesOrderProcess!595*soa_8366f568-20d7-4a6a-ad68-58effa7a29e3/SCMWebService%23SCMSalesOrderProcess/Services.Externals.SCMWebService.reference</orasoa:EndpointAddress>
            <orasoa:PortType xmlns:ptns="http://xmlns.oracle.com/apps/scm/doo/decomposition/receiveTransform/receiveSalesOrder/DooDecompReceiveOrderExternalComposite" xmlns:orasoa="http://xmlns.oracle.com/soa">ptns:ReceiveOrderRequestServiceCallback</orasoa:PortType>
            <instra:tracking.ecid xmlns:instra="http://xmlns.oracle.com/sca/tracking/1.0">297bed2c-1dde-4850-8a34-cf2da58d19ca-0001398a</instra:tracking.ecid>
            <instra:tracking.conversationId xmlns:instra="http://xmlns.oracle.com/sca/tracking/1.0">urn:eebb5147-2840-11e6-9a89-08002741191a</instra:tracking.conversationId>
            <instra:tracking.FlowEventId xmlns:instra="http://xmlns.oracle.com/sca/tracking/1.0">40879</instra:tracking.FlowEventId>
            <instra:tracking.FlowId xmlns:instra="http://xmlns.oracle.com/sca/tracking/1.0">40047</instra:tracking.FlowId>
            <instra:tracking.CorrelationFlowId xmlns:instra="http://xmlns.oracle.com/sca/tracking/1.0">0000LKDtQPbFw000jzwkno1NJbIb0000LQ</instra:tracking.CorrelationFlowId>
         </wsa:ReferenceParameters>
      </wsa:ReplyTo>
      <wsa:FaultTo>
         <wsa:Address>http://www.w3.org/2005/08/addressing/anonymous</wsa:Address>
         <wsa:ReferenceParameters>
            <orasoa:EndpointAddress xmlns:orasoa="http://xmlns.oracle.com/soa">http://localhost:7003/soa-infra/services/testing/SalesOrderProcess!595*soa_8366f568-20d7-4a6a-ad68-58effa7a29e3/SCMWebService%23SCMSalesOrderProcess/Services.Externals.SCMWebService.reference</orasoa:EndpointAddress>
            <orasoa:PortType xmlns:ptns="http://xmlns.oracle.com/apps/scm/doo/decomposition/receiveTransform/receiveSalesOrder/DooDecompReceiveOrderExternalComposite" xmlns:orasoa="http://xmlns.oracle.com/soa">ptns:ReceiveOrderRequestServiceCallback</orasoa:PortType>
         </wsa:ReferenceParameters>
      </wsa:FaultTo>
   </env:Header>
   <env:Body>
      <process xmlns="http://xmlns.oracle.com/apps/scm/doo/decomposition/receiveTransform/receiveSalesOrder/DooDecompReceiveOrderExternalComposite">
         <OrchestrationOrderRequest>
            <SourceTransactionIdentifier xmlns="http://xmlns.oracle.com/apps/scm/doo/decomposition/receiveTransform/receiveSalesOrder/model/">1154551RBWM</SourceTransactionIdentifier>
            <SourceTransactionSystem xmlns="http://xmlns.oracle.com/apps/scm/doo/decomposition/receiveTransform/receiveSalesOrder/model/">OPS</SourceTransactionSystem>
            <SourceTransactionNumber xmlns="http://xmlns.oracle.com/apps/scm/doo/decomposition/receiveTransform/receiveSalesOrder/model/">1154551RBWM</SourceTransactionNumber>
            <BuyingPartyName xmlns="http://xmlns.oracle.com/apps/scm/doo/decomposition/receiveTransform/receiveSalesOrder/model/">Computer Service and Rentals</BuyingPartyName>
            <TransactionalCurrencyCode xmlns="http://xmlns.oracle.com/apps/scm/doo/decomposition/receiveTransform/receiveSalesOrder/model/">USD</TransactionalCurrencyCode>
            <TransactionOn xmlns="http://xmlns.oracle.com/apps/scm/doo/decomposition/receiveTransform/receiveSalesOrder/model/">2016-06-01T14:36:42.649-07:00</TransactionOn>
            <RequestingBusinessUnitIdentifier xmlns="http://xmlns.oracle.com/apps/scm/doo/decomposition/receiveTransform/receiveSalesOrder/model/">300000001548368</RequestingBusinessUnitIdentifier>
            <PartialShipAllowedFlag xmlns="http://xmlns.oracle.com/apps/scm/doo/decomposition/receiveTransform/receiveSalesOrder/model/">false</PartialShipAllowedFlag>
            <OrchestrationOrderRequestLine xmlns:ns2="http://xmlns.oracle.com/apps/scm/doo/decomposition/receiveTransform/receiveSalesOrder/model/" xmlns="http://xmlns.oracle.com/apps/scm/doo/decomposition/receiveTransform/receiveSalesOrder/model/">
               <ns2:SourceTransactionLineIdentifier>1</ns2:SourceTransactionLineIdentifier>
               <ns2:SourceTransactionScheduleIdentifier>1</ns2:SourceTransactionScheduleIdentifier>
               <ns2:SourceTransactionLineNumber>1</ns2:SourceTransactionLineNumber>
               <ns2:SourceTransactionScheduleNumber>1</ns2:SourceTransactionScheduleNumber>
               <ns2:ProductNumber>AS54888</ns2:ProductNumber>
               <ns2:OrderedQuantity>1</ns2:OrderedQuantity>
               <ns2:OrderedUOMCode>zzx</ns2:OrderedUOMCode>
               <ns2:OrderedUOM>EA</ns2:OrderedUOM>
               <ns2:RequestingBusinessUnitIdentifier>300000001293806</ns2:RequestingBusinessUnitIdentifier>
               <ns2:ParentLineReference/>
               <ns2:RootParentLineReference/>
               <ns2:ShippingInstructions>BM Ship Instructions- Ship it in a day</ns2:ShippingInstructions>
               <ns2:PackingInstructions/>
               <ns2:RequestedShipDate>2016-12-26T00:00:00</ns2:RequestedShipDate>
               <ns2:PaymentTerms/>
               <ns2:TransactionCategoryCode>ORDER</ns2:TransactionCategoryCode>
               <ns2:BillToCustomerName>Computer Service and Rentals</ns2:BillToCustomerName>
               <ns2:BillToAccountSiteUseIdentifier>300000001469016</ns2:BillToAccountSiteUseIdentifier>
               <ns2:BillToCustomerIdentifier>300000001469002</ns2:BillToCustomerIdentifier>
               <ns2:PartialShipAllowedFlag>false</ns2:PartialShipAllowedFlag>
               <ns2:UnitListPrice>100.0</ns2:UnitListPrice>
               <ns2:UnitSellingPrice>100.0</ns2:UnitSellingPrice>
               <ns2:ContractEndDate>2018-12-13</ns2:ContractEndDate>
               <ns2:ExtendedAmount>100.0</ns2:ExtendedAmount>
               <ns2:TaxExempt>S</ns2:TaxExempt>
               <ns2:ShipSetName>{{SHIPSET}}</ns2:ShipSetName>
               <ns2:OrigSysDocumentReference>ORIGSYS</ns2:OrigSysDocumentReference>
               <ns2:OrigSysDocumentLineReference>ORIGSYSLINE</ns2:OrigSysDocumentLineReference>
               <ns2:LineCharge>
                  <ns2:ChargeDefinitionCode>QP_SALE_PRICE</ns2:ChargeDefinitionCode>
                  <ns2:ChargeSubtypeCode>ORA_PRICE</ns2:ChargeSubtypeCode>
                  <ns2:PriceTypeCode>ONE_TIME</ns2:PriceTypeCode>
                  <ns2:PricedQuantity>1</ns2:PricedQuantity>
                  <ns2:PrimaryFlag>true</ns2:PrimaryFlag>
                  <ns2:ApplyTo>PRICE</ns2:ApplyTo>
                  <ns2:RollupFlag>false</ns2:RollupFlag>
                  <ns2:SourceChargeIdentifier>SC2</ns2:SourceChargeIdentifier>
                  <ns2:ChargeTypeCode>ORA_SALE</ns2:ChargeTypeCode>
                  <ns2:ChargeCurrencyCode>USD</ns2:ChargeCurrencyCode>
                  <ns2:SequenceNumber>2</ns2:SequenceNumber>
                  <ns2:PricePeriodicityCode/>
                  <ns2:GsaUnitPrice/>
                  <ns2:ChargeComponent>
                     <ns2:ChargeCurrencyCode>USD</ns2:ChargeCurrencyCode>
                     <ns2:HeaderCurrencyCode>USD</ns2:HeaderCurrencyCode>
                     <ns2:HeaderCurrencyExtendedAmount>150.0</ns2:HeaderCurrencyExtendedAmount>
                     <ns2:PriceElementCode>QP_LIST_PRICE</ns2:PriceElementCode>
                     <ns2:SequenceNumber>1</ns2:SequenceNumber>
                     <ns2:PriceElementUsageCode>LIST_PRICE</ns2:PriceElementUsageCode>
                     <ns2:ChargeCurrencyUnitPrice>150.0</ns2:ChargeCurrencyUnitPrice>
                     <ns2:HeaderCurrencyUnitPrice>150.0</ns2:HeaderCurrencyUnitPrice>
                     <ns2:RollupFlag>false</ns2:RollupFlag>
                     <ns2:SourceParentChargeComponentId/>
                     <ns2:SourceChargeIdentifier>SC2</ns2:SourceChargeIdentifier>
                     <ns2:SourceChargeComponentIdentifier>SCC3</ns2:SourceChargeComponentIdentifier>
                     <ns2:ChargeCurrencyExtendedAmount>150.0</ns2:ChargeCurrencyExtendedAmount>
                  </ns2:ChargeComponent>
                  <ns2:ChargeComponent>
                     <ns2:ChargeCurrencyCode>USD</ns2:ChargeCurrencyCode>
                     <ns2:HeaderCurrencyCode>USD</ns2:HeaderCurrencyCode>
                     <ns2:HeaderCurrencyExtendedAmount>150.0</ns2:HeaderCurrencyExtendedAmount>
                     <ns2:PriceElementCode>QP_NET_PRICE</ns2:PriceElementCode>
                     <ns2:SequenceNumber>3</ns2:SequenceNumber>
                     <ns2:PriceElementUsageCode>NET_PRICE</ns2:PriceElementUsageCode>
                     <ns2:ChargeCurrencyUnitPrice>150.0</ns2:ChargeCurrencyUnitPrice>
                     <ns2:HeaderCurrencyUnitPrice>150.0</ns2:HeaderCurrencyUnitPrice>
                     <ns2:RollupFlag>false</ns2:RollupFlag>
                     <ns2:SourceParentChargeComponentId/>
                     <ns2:SourceChargeIdentifier>SC2</ns2:SourceChargeIdentifier>
                     <ns2:SourceChargeComponentIdentifier>SCC1</ns2:SourceChargeComponentIdentifier>
                     <ns2:ChargeCurrencyExtendedAmount>150.0</ns2:ChargeCurrencyExtendedAmount>
                  </ns2:ChargeComponent>
               </ns2:LineCharge>
            </OrchestrationOrderRequestLine>
         </OrchestrationOrderRequest>
      </process>
   </env:Body>
</env:Envelope>

Sample payload for GetOrderDetails:

<?xml version = '1.0' encoding = 'UTF-8'?>
<env:Envelope xmlns:env="http://schemas.xmlsoap.org/soap/envelope/" xmlns:wsa="http://www.w3.org/2005/08/addressing">
   <env:Header>
      <wsa:To>https://eczc-test.scm.em2.oraclecloud.com:443/soa-infra/services/default/DooDecompReceiveOrderExternalComposite/ReceiveOrderRequestService</wsa:To>
      <wsa:Action>GetOrderDetailsSync</wsa:Action>
      <wsa:MessageID>urn:109ed9ec-2841-11e6-9a89-08002741191a</wsa:MessageID>
      <wsa:RelatesTo>urn:eebb5147-2840-11e6-9a89-08002741191a</wsa:RelatesTo>
      <wsa:ReplyTo>
         <wsa:Address>http://www.w3.org/2005/08/addressing/anonymous</wsa:Address>
         <wsa:ReferenceParameters>
            <orasoa:EndpointAddress xmlns:orasoa="http://xmlns.oracle.com/soa">http://localhost:7003/soa-infra/services/testing/SalesOrderProcess!595*soa_8366f568-20d7-4a6a-ad68-58effa7a29e3/SCMWebService%23SCMSalesOrderProcess/Services.Externals.SCMWebService.reference</orasoa:EndpointAddress>
            <instra:tracking.ecid xmlns:instra="http://xmlns.oracle.com/sca/tracking/1.0">297bed2c-1dde-4850-8a34-cf2da58d19ca-0001398a</instra:tracking.ecid>
            <instra:tracking.conversationId xmlns:instra="http://xmlns.oracle.com/sca/tracking/1.0">urn:eebb5147-2840-11e6-9a89-08002741191a</instra:tracking.conversationId>
            <instra:tracking.FlowEventId xmlns:instra="http://xmlns.oracle.com/sca/tracking/1.0">40886</instra:tracking.FlowEventId>
            <instra:tracking.FlowId xmlns:instra="http://xmlns.oracle.com/sca/tracking/1.0">40047</instra:tracking.FlowId>
            <instra:tracking.CorrelationFlowId xmlns:instra="http://xmlns.oracle.com/sca/tracking/1.0">0000LKDtQPbFw000jzwkno1NJbIb0000LQ</instra:tracking.CorrelationFlowId>
         </wsa:ReferenceParameters>
      </wsa:ReplyTo>
      <wsa:FaultTo>
         <wsa:Address>http://www.w3.org/2005/08/addressing/anonymous</wsa:Address>
         <wsa:ReferenceParameters>
            <orasoa:EndpointAddress xmlns:orasoa="http://xmlns.oracle.com/soa">http://localhost:7003/soa-infra/services/testing/SalesOrderProcess!595*soa_8366f568-20d7-4a6a-ad68-58effa7a29e3/SCMWebService%23SCMSalesOrderProcess/Services.Externals.SCMWebService.reference</orasoa:EndpointAddress>
         </wsa:ReferenceParameters>
      </wsa:FaultTo>
   </env:Header>
   <env:Body>
      <GetOrderDetailsProcessRequest xmlns="http://xmlns.oracle.com/apps/scm/doo/decomposition/orderDetailServices/DooDecompOrderDetailSvcComposite">
         <SourceOrderInput xmlns:soapenv="http://schemas.xmlsoap.org/soap/envelope/" xmlns:dood="http://xmlns.oracle.com/apps/scm/doo/decomposition/orderDetailServices/DooDecompOrderDetailSvcComposite" xmlns:mod="http://xmlns.oracle.com/apps/scm/doo/decomposition/orderDetailServices/model/">
            <mod:SourceOrderSystem>OPS</mod:SourceOrderSystem>
            <mod:SalesOrderNumber>1154660RBWM</mod:SalesOrderNumber>
         </SourceOrderInput>
      </GetOrderDetailsProcessRequest>
   </env:Body>
</env:Envelope>

PCS workflow in detail

The sample PCS workflow has submission and approval human tasks, associated web forms, a basic gateway rule, a REST web service call to dynamically populate a drop-down list, and two SOAP web service calls to Order Management services. The Order Management service in this case is secured with HTTP basic authentication and accessible only over TLS. Figure 2 shows the swim-lane representation of the workflow with self-describing flow element names. We’ll focus on the SCM Cloud specific aspects of the process flow. Process Cloud Service provides several pre-built samples and pattern templates, both of which could be used for quick development of a flow.

Figure 2

processflow



Adding a connector to SCM Cloud web service

To use a web service in a process, a web service connector, with the WSDL and associated schema files, must be added to the PCS project in the composer. Figure 3 shows how to create a connector. Once a connector is created, it is available for implementation in a Service flow element. As part of the connector setup, the composer allows security to be configured with options such as HTTP basic authentication or WS-Security username token. Note that these settings can be changed on the customization page when the flow is deployed.

Figure 3

WebServiceConnector

 

Associating Data between PCS flow elements

As the PCS flow transitions between flow elements, data input to the element and data output from the element need to be associated with suitable data objects. Data objects could be based on pre-built types such as int or String, or on one of the types defined in imported XML schema files. XML schema types can be imported under the “Business Objects” section of the composer. Figure 4 shows the data association to capture input for the order capture service. As shown, some elements are captured from a web form submitted by a user and many others are hard-coded for this sample flow.

Figure 4

dataassociation

 

Building web forms from pre-defined business types

Human tasks in PCS are represented by a Web form-based UI. Web forms could be built quickly from pre-defined data types, such as XML complex types. Web form elements inherit the data constraints defined in the complex type. Once a form is generated, fields could be re-arranged to improve the UI. Figure 5 shows a web form generated based on the output of the GetOrderDetails web service call, with details returned by SCM Cloud.

Figure 5

OrderDetailsStatus

 

Customizing process during deployment

Process Cloud Service supports deployments from the composer to a test environment and then to a production or subsequent test environments. During the deployment, environment-specific information such as endpoints and credentials can be updated. All web service connectors used by the project are available to be configured. Figure 6 shows the customization page for a test deployment.

Figure 6

DeploymentCustomization

 

Conclusion

Process Cloud Service offers a quick and reliable way to author and deploy workflows that orchestrate human tasks and system interactions using industry-standard notations and protocols. This is very useful for integrating and extending SaaS applications from Oracle and other vendors. PCS allows enterprises to leverage in-house expertise in process development without the hassles of building and maintaining the platform, or having to master process flow implementation techniques in multiple SaaS products. This article covered a specific use case where PCS captures orders through rules and approval tasks, sends the order to SCM Cloud’s order capture service and, finally, obtains order status and other details from SCM Cloud.

 

References

Migrating AMPA Apps to Oracle MAF 2.3.1 Client Data Model


Introduction

Oracle MAF 2.3.1 has just been released. This release contains a major new feature, the client data model (CDM). CDM is the productized version of the A-Team Mobile Persistence Accelerator (AMPA). This article explains how you can migrate your existing MAF app that uses AMPA to MAF 2.3.1 with CDM. We recommend performing this migration as soon as possible since Oracle A-Team will no longer maintain the AMPA open source project in GitHub. The migration steps are pretty straightforward and risk-free since the complete code base of AMPA has been integrated “as is” with MAF; the biggest change is the renaming of Java packages.

Main Article

If you are migrating from MAF 2.3.0, you need to decide whether you want to upgrade your existing JDeveloper 12.2.1 installation to MAF 2.3.1 or you prefer to install another fresh instance of JDeveloper 12.2.1 which allows you to run both MAF 2.3.0 and MAF 2.3.1 apps side by side.

See the article How do I install 2 versions of the same version of JDeveloper for more info.

If you want to upgrade, you need to perform the following steps after you install the MAF 2.3.1 extension:

  • Upgrade the MAF JDK 1.8 Compact profile 2
  • Remove AMPA extension

If you do not want to upgrade, or you are coming from an older MAF version that required JDeveloper 12.1.3, you can start with a fresh install of JDeveloper 12.2.1 and install the MAF extension as documented here. You can then proceed with the migration steps:

  • Perform General Migration Steps (not AMPA specific)
  • Change Namespace in persistence-mapping.xml
  • Rename Java packages
  • Change AMPA EL Expressions
  • Change Reusable Feature Archive References
  • Configure Database to be Unencrypted

The next sections will explain all of these steps in detail. The last section will discuss the available CDM documentation.

Upgrade the MAF JDK 1.8 Compact Profile 2

When you install the MAF 2.3.1 extension over the MAF 2.3.0 extension, the MAF JDK 1.8 Compact Profile 2 is not updated automatically. Since the CDM functionality is added to MAF through new jar files that are not automatically added to this profile, JDeveloper will not be able to find the CDM classes. This will cause compilation errors like:

package oracle.maf.api.cdm.persistence.model does not exist

Note that you will also get these errors when you create a new app using CDM, not just when migrating an AMPA app.

To fix this you need to upgrade the JDK profile as follows:

  • In JDeveloper, go to Tools > Manage Libraries
  • Click on the tab Java SE Definitions
  • Select the MAF JDK 1.8 Compact 2 Profile library and click the Remove button
  • Restart JDeveloper

The MAF JDK 1.8 Compact 2 Profile should now be re-added automatically with the correct jar files. Here is a screen shot of the correct profile definition:

profile

To test whether JDeveloper can find the CDM classes you can use the Go to Java File option (Ctrl-Minus on Windows, Cmd-J on Mac) and enter InitDB in it. This should bring up the oracle.maf.impl.cdm.lifecycle.InitDBLifeCycleListener class.

Remove AMPA Extension

To remove the AMPA extension from JDeveloper, go to the Tools -> Features menu option. Then click on Installed Updates, select the A-Team Mobile Persistence Accelerator and click the Uninstall button.

RemoveAMPA

This removes the AMPA extension registration and jar files. However, it does not remove the oracle.ateam.mobile.persistence folder in the jdeveloper/jdev/extensions folder.

RemoveAMPAFolder2

You can remove this folder manually.

Perform General Migration Steps (not AMPA specific)

General migration steps are documented in the MAF developer’s guide, section Migrating Your Application to MAF 2.3.1. That section also includes a paragraph about migrating AMPA apps; this blog article is a more comprehensive version of that paragraph, with more background info and some additional steps.

One general migration step is not documented there: the network access plugin has been renamed. If you open the maf-plugins.xml file, you will see that the network pluginId is now invalid:

NetworkPluginError

To fix this, you need to change the pluginId to maf-cordova-plugin-network-access. Or you can remove the network plugin line, go to the Overview tab of maf-application.xml and check the Network Access checkbox on the Plugins tab.

Change Namespace in persistence-mapping.xml

Open the persistence-mapping.xml file, located in the META-INF directory of your ApplicationController project, and change the namespace to http://xmlns.oracle.com/adf/mf/amx/cdm/persistenceMapping.

namespace

Rename Java Packages

All non-deprecated AMPA classes have been included in MAF 2.3.1. MAF makes a distinction between public classes and internal implementation classes. Public classes are included in a package name starting with oracle.maf.api; implementation classes are included in a package name starting with oracle.maf.impl. The signature of public classes is guaranteed to ensure upwards compatibility; implementation classes might change over time and might break your custom code if you use these classes and then migrate to a newer MAF version. This is why Oracle recommends using only public MAF framework classes in your custom code. For this first release of CDM, the AMPA code has been included “as is” but over time the code base will be improved to support new features. To keep flexibility in improving and refactoring the code over time, a number of the AMPA classes have been moved into implementation packages starting with oracle.maf.impl.cdm. While Oracle generally recommends avoiding the use of implementation classes in your custom code, it is fine and even inevitable to do so with some of the CDM implementation classes. For example, all your service classes will now extend from oracle.maf.impl.cdm.persistence.service.EntityCRUDService.

With this explanation, we are ready to rename the AMPA java packages. The table below lists the global search and replace actions you should perform in order on all files in your application.

Search Text -> Replace With

oracle.ateam.sample.mobile.v2.persistence.service.EntityCRUDService -> oracle.maf.impl.cdm.persistence.service.EntityCRUDService
oracle.ateam.sample.mobile.v2.persistence.manager.MCSStoragePersistenceManager -> oracle.maf.impl.cdm.persistence.manager.MCSStoragePersistenceManager
oracle.ateam.sample.mobile.v2 -> oracle.maf.api.cdm
oracle.ateam.sample.mobile.mcs.storage -> oracle.maf.api.cdm.mcs.storage
oracle.ateam.sample.mobile -> oracle.maf.impl.cdm

You can perform these global search and replace actions in JDeveloper by navigating to the Search -> Replace in Files option.

SearchReplace

Make sure you set the Scope field to your entire application, not just to one of the projects.

If you now try to compile your application, you might still get a few compilation errors. This is because classes in the same AMPA package might have been divided over both the implementation and public CDM packages. The easiest way to fix these errors is to double-click on the error to jump to the Java class, and rename the .api. part of the import to .impl. or vice versa. Alternatively, you can remove the invalid import statement and let JDeveloper automatically suggest the correct import.

There is one remaining change you have to make manually in maf-application.xml, because this file is in a directory that is not scanned for global search and replace actions: you need to change the Lifecycle Event Listener property from oracle.ateam.sample.mobile.lifecycle.InitDBLifeCycleListener to oracle.maf.impl.cdm.lifecycle.InitDBLifeCycleListener. If you are using a custom lifecycle listener that extends InitDBLifeCycleListener, you don’t have to do anything because your custom class has already been updated to point to the CDM package.

If your application compiles successfully, you can do a final check by doing a global search in your application on the string oracle.ateam.sample, which should no longer return any hits.
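
If you prefer the command line, the same check can be run from the application root directory with a recursive grep (assuming a Unix-like shell):

grep -r "oracle.ateam.sample" .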

Change AMPA EL Expressions

AMPA comes with some standard EL expressions around background tasks and pending data sync actions. To update these expressions for CDM you should perform the following global search and replace actions:

Search Text -> Replace With

applicationScope.ampa_bg_task_running -> applicationScope.maf_bg_task_running
applicationScope.ampa_hasDataSyncActions -> applicationScope.maf_hasDataSyncActions
applicationScope.ampa_dataSyncActionsCount -> applicationScope.maf_dataSyncActionsCount
applicationScope.[entityName]_hasDataSyncActions -> applicationScope.maf_hasDataSyncActions

The last entry in this table is applicable when you are migrating from an earlier AMPA release, not the latest 12.2.1.0.68 release. In previous releases, the data synchronization happened in the context of an entity CRUD service; it only synchronized the data object of the entity CRUD service and its child data objects (if applicable). Therefore the expression to check whether an entity (data object) had pending data sync actions included the entity name as prefix instead of the general ampa_ prefix used in the latest AMPA release.
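
As an illustration, such an expression is typically referenced from a component attribute in an AMX page; a minimal, hypothetical snippet could look like this:

<!-- Hypothetical AMX snippet: disable the button while a background task is running -->
<amx:commandButton text="Synchronize" disabled="#{applicationScope.maf_bg_task_running}"/>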

Change Reusable Feature Archive References

AMPA shipped with two reusable feature archives, to inspect web service calls and to view pending synchronization actions. While the web service calls feature is not yet documented in the CDM chapter, both feature archives are included with CDM. If you run the MAF User Interface Generator they will be automatically added to your application, just like AMPA did. If your existing AMPA application is using one or both of these features, you need to do two things:

  • Change the feature archive library reference
  • Change the feature reference in maf-application.xml

To change the library reference, go to Application -> Application Properties menu option, and click on the Libraries and Classpath option at the left.

FeatureArchives

Remove both jar files, and click the Add Jar/Library button. Navigate to the directory where the CDM feature archives can be found, which is jdeveloper/jdev/extensions/oracle.maf/FARs/CDM.

AddCDMFeatures

Select both jar files and click the Open button to add them to the application.

Now, go to the Overview tab of maf-application.xml and update the invalid feature references oracle.ateam.sample.mobile.datasynch and oracle.ateam.sample.mobile.wscalls with the new CDM IDs of these features.

NewFeatureRef

Configure Database to be Unencrypted

Unfortunately, during the integration of AMPA into CDM a minor code change has introduced an issue which currently prevents you from encrypting the SQLite database. This will be fixed in the next MAF release. For now, you should disable DB encryption by adding the following line to the mobile-persistence-config.properties file, located in the META-INF directory of your ApplicationController project:

db.encryption=false

If you don’t have this entry, or you change the value to true, your application will hang at application start up and the log will display the following error:

CDM Database key is null

MAF Client Data Model Documentation

The CDM chapter in the MAF developer’s guide currently contains a subset of the information that you can find in the AMPA developer’s guide. Sections that refer to AMPA classes that ended up in the implementation packages (see the previous section Rename Java Packages for more info) have not been included yet. In subsequent MAF releases the remaining documentation topics will be added once final decisions have been made about the location of all CDM classes and methods.

For now, you can continue to consult the AMPA Developer’s guide for those topics, because all of the content is still valid. Most notably, you might want to check out the following sections:

 

MCS Mobile Solution for E-Business Suite – Part 1 (Introduction)


Introduction

Earlier this month, a new version of Mobile Cloud Service (MCS) was released. MCS V2.0 (16.2.3) offers many new features and capabilities such as location-based services. For the full list of new capabilities, please check the Oracle MCS V2.0 release notes here.

One of the new features MCS V2.0 offers is a set of pre-packaged applications that allow you to mobilise on-premise Oracle E-Business Suite (SSHR) and JD Edwards (Field Service). These mobile applications are available to all MCS customers who are on MCS V2.0 and will serve as a jump-start kit for your mobile solution, fully utilising MCS as an MBaaS connecting to the backend on-premise application (Ebs or JDE). For more details on the MCS Mobile Solution offering, check the following blog post by Srikant Subramanian.

In this series of blog posts, I’ll go through several topics detailing best practices on how to set up, use and customise the Ebs SSHR packaged mobile solution.

Main Article

E-business Suite Self-Service HR (SSHR)

During my journey with Oracle, I would say that mobilising Ebs Self-Service HR (SSHR) has been the number one concern for all Ebs customers. The ability for employees to submit HR-related processes, and for managers to respond to and approve pending worklist items using a mobile device, will definitely set a new milestone in how enterprises do business.

The MCS SSHR packaged application offers several capabilities for both employees and managers. The functionality will be released in a phased approach: a subset is available today, and the rest will be available in upcoming releases.

Employees SSHR

The first thing a user sees after starting the SSHR mobile app is the login screen, notice that it has a bottom ‘Dev Mode‘ bar with two options:

  • Mock Data from MCS: in this mode, the mobile app connects to MCS; however, MCS uses mock data stored in MCS and doesn’t actually connect to Ebs.
  • Simulate Offline: in this mode, the mobile app doesn’t connect to MCS and uses mock data bundled within the application itself; this is handy in case you want to demo the application and experience its capabilities without setting up the entire ecosystem for the solution.

To test the application in offline mode without connecting to MCS and Ebs, make sure to turn on ‘Simulate Offline‘ mode and then log in with the user ‘BPALMER‘ and an empty password.

02

In production systems, ‘Dev Mode‘ should be disabled so that it doesn’t show on screens.

In a following blog entry I’ll show you how to disable the ‘Dev Mode’ bar.

After logging in, you will be directed to the employee main dashboard. Notice that in the current version of the app (V1), TimeCards, Vacation Balance, Personal/Medical Leave of Absence and Out of Office Message are implemented; Tax Forms & W2 will be implemented in the next scheduled releases.

03

Below are screenshots of all the services available as part of the Employee SSHR; they are self-explanatory so I will not comment on them.

Time Card (screenshots)
Vacation Balance (screenshots)
Personal/Medical Leave of Absence (screenshots; Absence Type and Absence Reason are fetched from Ebs)
Out of Office Message (screenshots)

Managers SSHR

Because the logged-in user (BPALMER) is granted the ‘Manager’ role, he has access to the ‘Manager SSHR’ screen; that is why you can see the navigation ellipses below the user name. To navigate to the Manager SSHR screen, simply drag the screen to the left. Currently, in V1.0, only the ‘Approval Worklist’ is implemented.

Manager SSHR navigation (screenshots)

Approval Worklist (screenshots; pull the list to refresh, filter worklist items)

Architecture

The MCS Ebs SSHR packaged mobile app is designed and certified with Ebs R11; it utilises Oracle Integration Cloud Service (ICS) and the ICS on-premise DB agent to expose Ebs PL/SQL APIs as REST APIs that can be consumed by MCS. The Ebs SSHR app is a hybrid mobile application built using Oracle JET and Cordova; it supports both iOS and Android devices.

mcsat_dt_001

Using the Oracle Mobile Cloud Service SDK with AngularJS and Ionic


Introduction

Oracle’s Mobile Cloud Service (MCS) can be used with any mobile client development tool. To speed up development for the most popular development tools, MCS comes with a Software Development Kit (SDK) for native Android, iOS and Windows development, as well as a JavaScript SDK for hybrid JavaScript-based development. Oracle’s A-Team has gained experience with the MCS JavaScript SDK in combination with various JavaScript toolkits. In a previous post, we discussed how to use the SDK when building Oracle JET applications. In this post we will share code samples, tips and best practices for building a hybrid mobile app using AngularJS (version 1) and Ionic that connects to MCS. We assume you have a basic knowledge of AngularJS and Ionic. Check out their websites for tutorials if you are new to these technologies.

Main Article

In this article, we will first explain how to download and “install” the SDK, and then discuss how to use it to authenticate against MCS, connect to the custom REST APIs you define in MCS, and leverage the MCS storage and analytics platform services. In subsequent articles we will discuss how MCS eases the implementation of push notifications and how the MCS Offline Sync API can be used to make your mobile app usable without an internet connection.

Note that all MCS platform services can be accessed through the Oracle MCS REST API, so strictly speaking you do not need an SDK to connect to MCS. However, using the SDK makes your life much easier as we will demonstrate in this article.

Along the way, we will use a basic CRUD application consisting of the following screens.

HrApp

We used the Ionic Creator tool to quickly create these screens visually using drag and drop, and then exported the project, which gave us a complete Angular/Ionic starter project with the pages, controller and service scaffolds all set up.

Downloading and Configuring the JavaScript Cordova SDK

The JavaScript SDK for MCS comes in two flavours: one for plain JavaScript applications and one for Cordova applications. The MCS Cordova SDK is a superset of the MCS JS SDK, as it provides a few more capabilities that depend on Cordova, such as methods for registering a device for push notifications. Since we are creating a hybrid mobile app, we choose the Cordova SDK.

To download an SDK for MCS, log in to your MCS instance and click the Get Started button on the homepage. This takes you to a page where you can select your target platform and download the SDK with or without a quickstart project.

DownloadSDK

The quickstart project doesn’t really follow the best practices outlined in this post, so we go for the option to download the SDK alone and add it to an existing project. Alternatively, you can click the Applications option in the hamburger menu and then the Download SDK button. This button takes you to a page that lists all the SDKs but doesn’t offer the option to download a quickstart project.

After downloading the file, we unzip it and copy over the following files:

  • Copy mcs.js and mcs.min.js into a new mcs subfolder under the lib folder of your project
  • Copy the oracle_mobile_cloud_config.js file to the js folder.

The folder structure should now look something like this:

McsDirStruc

The folder structure used in this sample app is suitable for smaller tutorial-style applications. If you are planning to build a large Angular/Ionic app, you might want to check out the article AngularJS Best Practices: Directory Structure.

We add the mcs.js file to the index.html file, above the existing script tags that include the Angular controllers, services, and so on:

    <script src="lib/mcs/mcs.js"></script>

    <script src="js/app.js"></script>
    <script src="js/controllers.js"></script>
    <script src="js/routes.js"></script>
    <script src="js/services.js"></script>
    <script src="js/directives.js"></script>

In services.js we define an MCS service that will provide all the functions that our apps needs to interact with MCS:

angular.module('app.services', [])
.factory('mcsService', function($q){

    var mcs_config = {
      "logLevel": mcs.logLevelInfo,
      "mobileBackends": {
        "HR": {
          "default": true,
          "baseUrl": "https://mobileportalsetrial.yourdomain.mobileenv.us2.oraclecloud.com:443",
          "applicationKey": "a4a8af19-38f8-4306-9ac6-adcf7a53deff",
          "authorization": {
            "basicAuth": {
              "backendId": "e045cc30-a347-4f7d-a05f-4d285b6a9abb",
              "anonymousToken": "QVRFQU1ERVZfTUVTREVWMV9NT0JJTEVfQU5PTllNT1VTX0FQUElEOnByczcuYXduOXRlUmhp"
            }
          }
        }
      }
    };

    // initialize MCS mobile backend
    mcs.MobileBackendManager.setConfig(mcs_config);
    var mbe = mcs.MobileBackendManager.getMobileBackend('HR');
    mbe.setAuthenticationType("basicAuth");
})

We pass $q into the service constructor function because we will be using Angular’s promise implementation later on. The structure of the mcs_config variable can be copied from the oracle_mobile_cloud_config.js file, which includes all the declarative SDK configuration settings. Since we are using basic authentication in our app, we left out the settings required for OAuth, Facebook or Single Sign-On (SSO) authentication. The configuration is defined for a backend called HR. This name doesn’t have to match the name of the backend in MCS, but it can if you wish. A description of all possible configuration settings can be found in the chapter Cordova Applications in the MCS Developer’s Guide.

You can find the values for the mcs_config settings baseUrl, backendId and anonymousToken on the overview page of your mobile backend in MCS:

MBE-Settings

The applicationKey is only required for push notifications and can be taken from the Clients tab of your mobile backend page.

In the last three lines we initialize the mobile backend object by setting the configuration, retrieving the HR mobile backend (although uncommon, you can use multiple backends in your application), and setting the authentication type to basic.

With this skeleton mcsService in place, we are ready to use the various functions of the MCS Cordova SDK!

We could have done the SDK configuration directly in oracle_mobile_cloud_config.js as well, and added a reference to that file in index.html. The downside of that approach is that we “pollute” the global space with another global variable, mcs_config. In addition, we prefer to make the mcsService self-contained, including the required configuration settings. If you prefer to keep the configuration in a separate file, it is better to create a JSON file that holds the config object and read the content of that file into your mcs_config variable within services.js. Once you have copied the mcs_config variable, you can remove the oracle_mobile_cloud_config.js file from your project.
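If you do go for a separate JSON file, a minimal sketch could look like this. The js/mcs_config.json path is an assumption, and every public service function should wait for the ready promise before using the backend:

angular.module('app.services', [])
.factory('mcsService', function($q, $http) {
    var mbe;
    // Load the configuration from a separate JSON file and initialize the
    // backend once the file has been read. Callers should chain on this
    // 'ready' promise before touching 'mbe'.
    var ready = $http.get('js/mcs_config.json').then(function(response) {
        mcs.MobileBackendManager.setConfig(response.data);
        mbe = mcs.MobileBackendManager.getMobileBackend('HR');
        mbe.setAuthenticationType("basicAuth");
    });
    // ... function definitions and return object as before ...
})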

To know which SDK functions are available to us it is useful to consult the Oracle Mobile Cloud Service JavaScript SDK for Cordova Reference.

Authenticating Against MCS

MCS basic authentication provides two ways to authenticate: “named” authentication using a username and password, and so-called anonymous authentication, which uses the anonymousToken that we specified in the mcs_config variable. Anonymous authentication might be convenient during initial development when you have not yet set up a user realm with your mobile backend, or when you want to use another authentication mechanism for your app that is unrelated to MCS.

We first add functions to our mcsService to support both ways of authentication and to be able to logout:

var authenticate = function(username,password) {
   var deferred = $q.defer();
    mbe.Authorization.authenticate(username, password 
    , function(statusCode,data) {deferred.resolve(statusCode,data)}
    , function(statusCode,data) {deferred.reject(statusCode,data)});         
    return deferred.promise;
 };     

 var authenticateAnonymous = function() {
   var deferred = $q.defer();
    mbe.Authorization.authenticateAnonymous(
      function(statusCode,data) {deferred.resolve(statusCode,data)}
    , function(statusCode,data) {deferred.reject(statusCode,data)});         
    return deferred.promise;
 };     

 var logout = function() {
    var deferred = $q.defer();
    mbe.Authorization.logout(
      function(statusCode,data) {deferred.resolve(statusCode,data)}
    , function(statusCode,data) {deferred.reject(statusCode,data)});         
    return deferred.promise;
 };

And to make the functions callable from the controllers, we return a JSON object with the public functions at the end of the mcsService:

return {
   authenticate:authenticate,
   authenticateAnonymous:authenticateAnonymous,
   logout:logout
}

As you can see, we “promisified” the SDK calls that expect success and failure callback handlers. We strongly recommend you do the same. For one, it makes your coding easier, in particular if you want to chain multiple SDK calls together. Even more important, by using the Angular promise implementation, two-way data binding automatically kicks in: when an asynchronous SDK callback function returns some data and you need to refresh the user interface with that data, you do not need to call $scope.$apply to force the UI to be updated. We will see an example of this behavior later on when we discuss invocation of MCS custom APIs.
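Since the three functions above repeat the same boilerplate, you could factor the pattern out into a small helper. This is just a sketch, not part of the SDK, and it assumes the SDK functions take trailing success and failure callbacks as shown above:

var promisify = function(fn, context) {
    return function() {
        var deferred = $q.defer();
        // copy the caller's arguments and append the success/failure callbacks
        var args = Array.prototype.slice.call(arguments);
        args.push(function(statusCode, data) { deferred.resolve(data); });
        args.push(function(statusCode, data) { deferred.reject(statusCode, data); });
        fn.apply(context, args);
        return deferred.promise;
    };
};

// the authentication functions then become one-liners:
var authenticate = promisify(mbe.Authorization.authenticate, mbe.Authorization);
var logout = promisify(mbe.Authorization.logout, mbe.Authorization);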

With the authentication functions in place, we can now define a login function in our login controller that is invoked by the login button:

angular.module('app.controllers', ['app.services']) 
.controller('loginCtrl', function($scope,$state, mcsService) {

    // We could add logic here to pre-populate username, password with previously used values 
    $scope.loginData = {
        username: '',
        password: '',
    };

    $scope.doLogin = function () {
     mcsService.authenticate($scope.loginData.username, $scope.loginData.password)
       .then(function() {$state.go('departments')})
       .catch(function(err) {alert('Username or password is invalid')});  
    }
})

In the controller’s constructor function, we use Angular’s dependency injection feature to inject the mcsService into our controller. The doLogin function calls the authenticate function of our MCS service, which returns a promise. To complete this section, here is the template code used for the login page.

<ion-view title="Login" id="login" class=" ">
    <ion-content padding="true" class="has-header">
        <form ng-submit="doLogin()" id="login-form1" class="list ">
            <ion-list id="login-list2" class=" ">
                <label class="item item-input " id="login-input1">
                    <span class="input-label">Username</span>
                    <input type="text" ng-model="loginData.username" placeholder="">
                </label>
                <label class="item item-input " id="login-input2">
                    <span class="input-label">Password</span>
                    <input type="password" ng-model="loginData.password" placeholder="">
                </label>
            </ion-list>
            <div class="spacer" style="height: 40px;"></div>
            <button type="submit"  class=" button button-positive  button-block ">Log in</button>
        </form>
    </ion-content>
</ion-view>

As you would expect, this is standard Angular/Ionic syntax, nothing special.
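Since the mcsService also exposes a logout function, a logout handler follows the same pattern. A quick sketch, where the 'login' state name is an assumption about your route setup:

$scope.doLogout = function () {
    mcsService.logout()
      .then(function() {$state.go('login')})   // 'login' is the assumed route name
      .catch(function(err) {console.log('Error logging out: '+err)});
}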

Invoking the HR Custom API

The screenshots included above show a page with a list of departments; by clicking on a department, we go to a detail page with all department data and a list of employees within the department. We can create, update and delete a department. We will discuss how to add the employee images in the next section, “Using the MCS Storage Service”.

To support these pages, we have defined the following REST endpoints in our custom API in MCS:

  • GET /departments: list of departments with id and name attributes
  • POST /departments: add a new department
  • GET /departments/{id}: all department attributes plus the list of employees in the department
  • PUT /departments/{id}: update a department
  • DELETE /departments/{id}: delete a department

As you can see, the endpoints and payloads map nicely to the screen design. The more your endpoints and payloads are optimized for your mobile application, the faster you can build it, and the better your mobile app performs.
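To make the mapping concrete, the GET endpoints could return payloads shaped as follows. These samples are illustrative only, not the actual API contract; the attribute names simply match what the templates and controllers below expect:

// GET /departments (illustrative sample)
[
  { "id": 10, "name": "Administration" },
  { "id": 20, "name": "Marketing" }
]

// GET /departments/{id} (illustrative sample)
{
  "id": 10,
  "name": "Administration",
  "employees": [
    { "id": 119, "firstName": "Fay", "lastName": "Wood" }
  ]
}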

Oracle A-Team has written an article series on creating a mobile-optimized API using Oracle MCS. This series provides guidelines and best practices for creating such an API from initial design to implementation using the NodeJS engine in Oracle MCS.

To access our custom API endpoints through the SDK, we add the following function to our Angular mcsService:

var invokeCustomAPI = function(uri,method,payload) {
  var deferred = $q.defer();
   mbe.CustomCode.invokeCustomCodeJSONRequest(uri , method , payload
   , function(statusCode,data) {deferred.resolve(data)}
   , function(statusCode,data) {deferred.reject(statusCode,data)});         
   return deferred.promise;
};

And we need to make this function public by adding it to the JSON object returned by the service:

return {
   authenticate:authenticate,
   authenticateAnonymous:authenticateAnonymous,
   logout:logout,
   invokeCustomAPI:invokeCustomAPI
}

In the departments controller, we add the code that retrieves the list of departments when the controller gets instantiated:

.controller('departmentsCtrl', function($scope,mcsService) {
    $scope.departments = [];
    mcsService.invokeCustomAPI("hr/departments" , "GET" , null)
    .then (function(data) {
          $scope.departments = data;          
    })
    .catch(function(err) {
        console.log('Error calling endpoint /departments: '+err);
    });      
})

Note that we only need to update the departments array on the controller scope object; there is no need to call $scope.$apply to update the bindings in the page. As mentioned before, Angular automatically takes care of this because we “promisified” the SDK callback functions.

And for completeness, here is the departments list snippet in the departments template:

<ion-list id="departments-list1" class=" ">
  <ion-item ng-repeat="dep in departments" id="departments-list-item4" 
    href="#/editDepartment/{{dep.id}}" class="  ">{{dep.id+' '+dep.name}}
  </ion-item>
</ion-list>

The implementation of the edit department page is very similar: upon initialization we call the /departments/{id} endpoint, and we define functions for saving and removing a department:

.controller('editDepartmentCtrl', function($scope,$state,$stateParams,mcsService) {
    var currentDepId = $stateParams.departmentId;
    $scope.department = {};
    mcsService.invokeCustomAPI("hr/departments/"+currentDepId , "GET" , null)
    .then (function(data) {
          $scope.department = data;        
    })
    .catch(function(err) {
        console.log('Error calling endpoint /departments/'+currentDepId+': '+err);
    });  
    
    $scope.save = function() {
        mcsService.invokeCustomAPI("hr/departments/"+currentDepId , "PUT" , $scope.department)
        .then (function(data) {
            $state.go('menu.departments');
        })
        .catch(function(err) {
            alert('Error saving department: '+err);
        });          
    }

    $scope.delete = function() {
        mcsService.invokeCustomAPI("hr/departments/"+currentDepId , "DELETE" , null)
        .then (function(data) {
            $state.go('menu.departments');
        })
        .catch(function(err) {
            alert('Error deleting department: '+err);
        });          
    }
})
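The POST endpoint for adding a department is not used on the pages shown above, but a create controller would follow the exact same pattern. A sketch, with hypothetical controller and state names:

.controller('newDepartmentCtrl', function($scope,$state,mcsService) {
    $scope.department = {};

    $scope.create = function() {
        // POST the new department data to the custom API
        mcsService.invokeCustomAPI("hr/departments" , "POST" , $scope.department)
        .then (function(data) {
            $state.go('menu.departments');
        })
        .catch(function(err) {
            alert('Error creating department: '+err);
        });
    }
})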

Using the MCS Storage Service

MCS includes a file storage service where you can store or cache mobile application objects such as text, JSON, or images. You define a storage collection and then add files to it. For our sample application, we store the employee images in an MCS collection named HR. The storage ID of each image includes a reference to the employee ID, so we can easily link each employee with his or her photo in the HR collection:

HRCollection

You can upload files to a storage collection using the MCS UI as shown above. When you use the MCS UI, the ID of the storage object is system-generated. This is inconvenient if you want to associate MCS storage objects with data from the systems of record that you expose through MCS. Fortunately, if you use the PUT method of the storage REST API, you can add new files and determine the storage ID yourself. For example, you can use a curl command to upload an image to a storage collection like this:

curl -i -X PUT  -u steven.king:AuThyRJL!  -H "Oracle-Mobile-Backend-ID:bcda8418-8c23-4d92-b656-9299d691e120" -H "Content-Type:image/png"  --data-binary @FayWood.png https://mobileportalsetrial1165yourdomain.mobileenv.us2.oraclecloud.com:443/mobile/platform/storage/collections/HR/objects/EmpImg119

To use the storage service in our app, we first add functions to read storage objects in our mcsService:

var getCollection = function(collectionName) {
  var deferred = $q.defer();
  mbe.Storage.getCollection(collectionName, null
  , function(collection) {deferred.resolve(collection)}
  , function(statusCode,data) {deferred.reject(statusCode,data)});         
  return deferred.promise;
};

var getStorageObjectFromCollection = function(collection,storageId) {
  var deferred = $q.defer();
  collection.getObject(storageId
  , function(storageObject) {deferred.resolve(storageObject)}
  , function(statusCode,data) {deferred.reject(statusCode,data)}
  ,'blob');         
  return deferred.promise;
};

var getStorageObject = function(collectionName,storageId) {
    //  This is the officially documented way, but fires redundant REST call:
    //  return getCollection(collectionName).then( function (collection) {
    //      return getStorageObjectFromCollection(collection,storageId)           
    //  })
    var collection = new mcs._StorageCollection({id:collectionName,userIsolated:false},null,mbe.Storage);
    return getStorageObjectFromCollection(collection,storageId);
 };

return {
 authenticate:authenticate,
 authenticateAnonymous:authenticateAnonymous,
 logout:logout,
 invokeCustomAPI:invokeCustomAPI,
 getStorageObject:getStorageObject
}

The “official” way of accessing a storage object is through its collection: you call the getCollection method on the SDK’s Storage object, which returns a collection object holding the metadata of all the objects inside the collection. On this collection object you can then call methods like getObject, postObject and putObject. In our app, we want to avoid this additional REST call since we are not interested in the collection as a whole. This is why, in getStorageObject above, we programmatically instantiate the collection object without making a REST call. As indicated by the underscore, the _StorageCollection constructor function was intended to be “private”, but this use case has been identified as valid and it will be made public in the next version of the SDK.

To be able to show the images with the employees, we extend the code in the editDepartment controller as follows:

    var currentDepId = $stateParams.departmentId;
    $scope.images = {};
    $scope.department = {};
    mcsService.invokeCustomAPI("hr/departments/"+currentDepId , "GET" , null)
    .then (function(data) {
          $scope.department = data;        
          data.employees.forEach(function(emp) {
              getEmployeeImage(emp);
          }) 
    })
    .catch(function(err) {
        console.log('Error calling endpoint /departments/'+currentDepId+': '+err);
    });  

    function getEmployeeImage(emp) {
        var storageId =  "EmpImg"+emp.id;
        mcsService.getStorageObject("HR", storageId)
        .then(function(storageObject){
            var url = URL.createObjectURL(storageObject.getPayload());
            $scope.images[emp.id] = url;
         })
         .catch(function(err) {
             console.log('Error getting storage object with ID '+emp.id+': '+err);
         });          
    }

We first initialize an images JSON object that holds key-value pairs, where the key is the employee ID and the value is the image binary converted to a URL, so we can use it in an image tag on the page. We then loop over the employees array included in the department object returned by the custom API REST call and call getEmployeeImage for each employee. This function calls getStorageObject and, once the REST call has completed, converts the response payload to a URL and adds the employee’s key-value pair to the images object.

The code snippet for the employees list in the Ionic page template looks like this:

<ion-list id="emp-list1" class=" ">
  <ion-item ng-repeat="emp in department.employees" id="demp-list-item4"  class="  ">
    <img  ng-src="{{images[emp.id]}}" style="width:50px;" /> {{emp.firstName+' '+emp.lastName}} 
  </ion-item>
</ion-list>
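One caveat when using URL.createObjectURL: each call creates a blob URL that keeps the image data in memory until it is explicitly revoked. A small cleanup sketch you could add to the editDepartment controller:

// release the blob URLs when the user leaves the page
// so the image data can be garbage collected
$scope.$on('$destroy', function() {
    Object.keys($scope.images).forEach(function(empId) {
        URL.revokeObjectURL($scope.images[empId]);
    });
});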

The sample app only reads files from the HR collection. If your app has functionality that requires uploading new files to a storage collection, you can add the following methods to the mcsService:

var mergeStorageObjectInCollection = function(collection,fileName, payload, mimetype) {
  var deferred = $q.defer();
  var storageObject = new mcs.StorageObject(collection,{id:fileName,name:fileName});
  storageObject.loadPayload(payload, mimetype);        
  collection.putObject(storageObject
  , function(storageObject) {deferred.resolve(storageObject)}
  , function(statusCode,data) {deferred.reject(statusCode,data)});         
  return deferred.promise;
};

var mergeStorageObject = function(collectionName,filename, payload, mimetype) {
    //  This is the officially documented way, but fires a redundant REST call:
    //  return getCollection(collectionName).then( function (collection) {
    //      return mergeStorageObjectInCollection(collection,filename, payload, mimetype)
    //  })
    var collection = new mcs._StorageCollection({id:collectionName,userIsolated:false},null,mbe.Storage);
    return mergeStorageObjectInCollection(collection,filename, payload, mimetype);
};

The mergeStorageObjectInCollection function uses the putObject method rather than postObject on the collection object, which allows you to set the storage object ID yourself instead of having MCS auto-generate it. If you specify a storage object ID that already exists in the collection, the storage object metadata is updated in MCS and the content is replaced.
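After adding mergeStorageObject to the JSON object returned by the service, uploading becomes a one-liner from a controller. A usage sketch with an illustrative storage ID and a simple JSON payload:

// 'EmpNotes119' is a hypothetical storage ID, chosen by us rather than by MCS
var payload = new Blob([JSON.stringify({note: 'example'})], {type: 'application/json'});
mcsService.mergeStorageObject('HR', 'EmpNotes119', payload, 'application/json')
.then (function(storageObject) {
    console.log('Storage object uploaded');
})
.catch(function(err) {
    console.log('Error uploading storage object: '+err);
});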

One typical use case is taking a picture with the device camera and uploading it to MCS. A-Team has written another article on using the MCS JavaScript SDK with Oracle JET, which includes a section on using the Cordova camera plugin to implement this use case.

Using the MCS Analytics Service

The MCS Analytics platform service is a powerful tool for getting detailed insight into how your mobile app is used. From your mobile app you can send so-called system events like startSession and endSession to get insight into session duration, device properties, location, and so on. Even better, you can send custom events to capture very specific information about how the app is used, for example which pages are accessed and for how long, or which data is viewed the most.

To support the MCS analytics events, we add the following functions to our mcsService:

var logStartSessionEvent = function() {
    mbe.Analytics.startSession();
}

var logEndSessionEvent = function() {
    mbe.Analytics.endSession();
}

var logCustomEvent = function(eventName, properties) {
    var event = new mcs.AnalyticsEvent(eventName);
    event.properties = properties;
    mbe.Analytics.logEvent(event);
}

var flushAnalyticsEvents = function() {
    mbe.Analytics.flush();
}

return {
   ...
   logStartSessionEvent:logStartSessionEvent,
   logEndSessionEvent: logEndSessionEvent,
   logCustomEvent:logCustomEvent,
   flushAnalyticsEvents:flushAnalyticsEvents
}

When you log an event, it is not yet sent to the MCS server. You can batch up multiple events and then flush them to the server, which is more efficient because all events are sent in one REST call. If you log a custom event without having logged a startSession event first, the SDK automatically creates the startSession event for you.
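In an Ionic app, a natural place to wire up the session events is the run block, tied to Cordova’s pause and resume lifecycle events. This is just a sketch; adjust it to your app’s structure:

.run(function($ionicPlatform, mcsService) {
  $ionicPlatform.ready(function() {
    // start an analytics session when the app starts
    mcsService.logStartSessionEvent();
    // end the session when the app moves to the background...
    document.addEventListener('pause', function() {
      mcsService.logEndSessionEvent();
      mcsService.flushAnalyticsEvents();
    }, false);
    // ...and start a new one when it comes back
    document.addEventListener('resume', function() {
      mcsService.logStartSessionEvent();
    }, false);
  });
})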

In our app, we are going to log a custom event in the editDepartmentCtrl controller whenever the user navigates to the departments detail page to view/edit a department:

    mcsService.invokeCustomAPI("hr/departments/"+currentDepId , "GET" , null)
    .then (function(data) {
          $scope.department = data;      
          mcsService.logCustomEvent('ViewDepartment',{user:$scope.userName,department:$scope.department.name});
          mcsService.flushAnalyticsEvents();
          data.employees.forEach(function(emp) {
              getEmployeeImage(emp);
          }) 
    })
    .catch(function(err) {
        console.log('Error calling endpoint /departments/'+currentDepId+': '+err);
    });

The name of the event is ViewDepartment, and we send the user name and department name as properties of the event. If you check the REST request payload that is sent to MCS, you can see how the SDK is easing your life: the required context object with device information, the startSession event, and the custom event itself are all included in the payload:

AnalyticsPayload

In the MCS user interface, we can navigate to the custom events analytics page, and get some nice graphs that represent our ViewDepartment event data:

AnaGraphs

Making Direct REST Calls to MCS

At the end of the day, every interaction between a client app using the MCS SDK and MCS results in a REST call. The MCS SDK provides a nice abstraction layer that makes your life easier and can save you a lot of time, as we saw with the payload required to send an MCS analytics event. However, there might be situations where you want to make a direct REST call to MCS, for example:

  • to call your custom API with some custom request headers
  • to get access to the raw response object returned by the REST call
  • to call a brand new REST API not yet supported by the JavaScript SDK, like the Locations API.

In such a case, the SDK can still help you by providing the base URL and the Authorization and oracle-mobile-backend-id HTTP headers. Here are two functions you can add to your mcsService to expose this data:

var getHttpHeaders = function() {
    return mbe.getHttpHeaders();
}

var getCustomApiUrl = function(customUri) {
  return mbe.getCustomCodeUrl(customUri);    
}

return {
   ...
   getHttpHeaders:getHttpHeaders,
   getCustomApiUrl:getCustomApiUrl
}

With these functions in place, it becomes quite easy to call an MCS REST API using the Angular $http service object. Here is an example that calls the /departments endpoint in our custom HR API:

$http({method:'GET'
      ,url:mcsService.getCustomApiUrl("hr/departments")
      ,headers:mcsService.getHttpHeaders()})
.then(function(response){ 
    $scope.departments = response.data; })
.catch(function(err) {
     console.log('Error calling endpoint /departments: '+err);
 });

Conclusion

The MCS JavaScript Cordova SDK provides an easy and fast way to connect to the custom APIs you define in MCS as well as to the various MCS platform services. In this article we provided tips and guidelines for using the MCS authentication, storage and analytics services in the context of an Angular/Ionic app. If you are using another JavaScript toolkit, most of the guidelines and code samples should still be useful with some minor tweaks. If you are using Oracle JET, we provide some Oracle JET-specific guidelines.

In follow-up articles we will dive into push notifications and offline sync. Keep an eye on Oracle A-Team Chronicles home page, or if you are specifically interested in articles around Oracle MCS, there is also a list of MCS-related articles.

 
