Introduction
Integration Cloud Service (ICS) is an Oracle Platform-as-a-Service (PaaS) offering to implement message-driven integration scenarios. This article introduces the use of ICS for integrating an on-premise E-Business Suite (EBS) instance via the Database Adapter. While EBS in recent releases offers a broad set of integration features like SOAP and REST support (e.g. via the Integrated SOA Gateway), these interfaces are not available in older versions like 11.5.x. In the past it has been a proven approach to use Oracle Fusion Middleware integration products (SOA, OSB etc.) running on-premise in a customer data center to connect to an EBS database via the DB Adapter. Shortly, this feature will also be available in a cloud-based integration solution, as we will discuss in this article.
While we focus on EBS integration here, the DB Adapter in ICS works similarly against any other custom database. The main reason to use an EBS context is the business case shown below, where ICS is connected to Mobile Cloud Service (MCS) to provide a mobile device solution.
Business Case and Architecture
It is not hard to imagine that Oracle customers running EBS 11.5.x might have a demand to add a mobile channel for their end-users. One option could be an upgrade to a recent release of EBS. As this will in most cases be a bigger project, an alternative could be the creation of a custom mobile solution via Oracle JET and MCS, as shown below. MCS is a PaaS offering and requires access to an underlying database via REST/JSON. This is where ICS appears in this architecture.
In the absence of native SOAP or REST capabilities in the EBS 11.5.x tech stack, integration via ICS closes that gap. Any database access activities (retrieving data, CRUD operations etc.) can run via an ICS/DB Adapter connection to an EBS on-premise database. ICS itself provides a REST/JSON interface for the external interaction with EBS. This external interface is generic and not restricted to MCS as the caller. In our business case, however, ICS with the DB Adapter fulfills the role of a data access layer for a mobile solution.
As shown in the architecture figure above the following components are involved in this end-to-end mobile solution:
- The DB Adapter uses a local component named ICS Agent, installed on-premise in the EBS data center. This agent communicates with the database via JCA and exchanges data between the DB Adapter in ICS and the database
- Communication between the ICS Agent and the DB Adapter is set up via Oracle Messaging Service, tunneled through HTTPS
- DB Adapter provides a standard SQL interface for database access
- Part of the embedded features in ICS are data mapping and transformation capabilities
- The external REST endpoint in ICS will be made public through REST Adapter in ICS
The ICS configuration and communication shown in the architecture figure represent a generic approach. In this sample, the mobile solution for EBS 11.5.x makes use of the described data access capabilities as follows (mobile components and Oracle JET are not in scope of this document, as we focus on the ICS part here):
- MCS connects to ICS via a connector or generic REST interface
- EBS data will be processed and cached in MCS
- Mobile devices communicate with MCS via REST to render the EBS data for visualization and user interaction
In the remainder of this article we will focus purely on the ICS and DB Adapter integration and leave the mobile features out of scope. The technical details of the ICS and DB Adapter implementation itself won't be covered here either, as they are the main content of another blog. Instead we will show how the implementation features can be used from an Application Integration Developer's perspective.
ICS Configuration Overview
At the beginning of an ICS-based integration there are some configuration activities to be done, like the creation of connections. This is a one-time, or rather first-time, task to make ICS ready for the creation of integration flows. It is usually not an Application Developer's activity; in most cases a dedicated ICS Administrator will perform the following actions.
At least two connections must be set up for this EBS integration via database communication:
- Database Adapter pointing to the EBS database – the database connection parameters will be used by the ICS Agent running on-premise in the customer's data center
- REST Adapter to provide a REST interface for external communication
The screenshot below shows a sample configuration page for a DB Adapter connected to an EBS instance. The main parameters describe a local connection from the ICS Agent to the database: hostname, port, SID.
On this configuration page, a local ICS Agent must also be assigned to this DB Adapter.
In most cases it makes sense to use the EBS database user APPS for this connection, as this credential provides the most universal and context-sensitive access to the EBS data model.
The other connection to set up is a REST interface (referred to as ICS LocalRest in this article) used for inbound requests and outbound responses. As shown in the screenshot below, this is a quite straightforward task without extensive configuration in our case. Variations are possible – especially for Security Policies, Username etc.:
- Connection Type: REST API Base URL
- Connection URL: https://<hostname>:<port>/ics
- Security Policy: Basic Authentication
- Username: <Weblogic_User>
- Password: <password>
- Confirm Password: <password>
After setting up these two connections we are ready to create an integration between the EBS database and any other system connected via REST.
DB Adapter based Integration with EBS
During our activities we established some good practices that are worth sharing here. In general we had good experience with a top-down approach that looks as follows for the creation of an integration flow:
- Identify the parameters in the REST call that will become part of the JSON payload (the functionality of this integration point) for the external interface
- Identify the EBS database objects being involved (tables, views, packages, procedures etc)
- Create a JSON sample message for inbound and another one for outbound
- Design the data mapping between inbound/outbound parameters and SQL statement or PLSQL call
- Create an EBS DB integration endpoint, enter the SQL statement or the PLSQL procedure/function call dedicated to perform the database activity
- Create a local REST integration endpoint to manage the external communication
- Assign the previously created inbound and outbound sample JSON messages to the request and response action
- Create a message mapping for inbound parameters to SQL/PLSQL parameters
- Do the same for outbound parameters
- Add a tracking activity, save the integration and activate it for an external usage
The DB Adapter is able to handle complex database types for mapping to record and array structures in JSON. This means there are no obvious limitations on passing nested data structures to PLSQL packages via JSON.
Here is a sample. In PLSQL we define data types as follows:
TYPE timeCard IS RECORD (
  startTime    VARCHAR2(20),
  stopTime     VARCHAR2(20),
  tcComment    VARCHAR2(100),
  tcCategoryID VARCHAR2(40));

TYPE timeCardRec IS VARRAY(20) OF timeCard;
The parameter list of the procedure embeds this datatype in addition to plain type parameters:
procedure createTimecard(
  userName   in  varchar2,
  tcRecord   in  timeCardRec,
  timecardID out NUMBER,
  status     out varchar2,
  message    out varchar2);
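To illustrate how such a procedure populates its OUT parameters, here is a minimal sketch of a possible package body. The table XX_TIMECARDS and the sequence XX_TIMECARD_S are hypothetical names used for illustration only; they are not part of the EBS data model:

```sql
procedure createTimecard(
  userName   in  varchar2,
  tcRecord   in  timeCardRec,
  timecardID out NUMBER,
  status     out varchar2,
  message    out varchar2) is
begin
  -- hypothetical sequence providing the new timecard ID
  select XX_TIMECARD_S.nextval into timecardID from dual;

  -- loop over the VARRAY and insert one row per time entry
  for i in 1 .. tcRecord.count loop
    insert into XX_TIMECARDS
      (timecard_id, user_name, start_time, stop_time, tc_comment, tc_category_id)
    values
      (timecardID, userName, tcRecord(i).startTime, tcRecord(i).stopTime,
       tcRecord(i).tcComment, tcRecord(i).tcCategoryID);
  end loop;

  -- the OUT parameters carry the result back into the JSON response
  status  := 'Success';
  message := 'Timecard with ID ' || timecardID || ' created for User ' || userName;
exception
  when others then
    status  := 'Error';
    message := substr(sqlerrm, 1, 100);
end createTimecard;
```

The exception handler returns the error text via the OUT parameters instead of raising, so the caller always receives a well-formed JSON response.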
The JSON sample payload for the IN parameters would look like this:
{
  "EBSTimecardCreationCollection": {
    "EBSTimecardCreationInput": {
      "userName": "GEVANS",
      "timeEntries": [
        { "startTime": "2015-08-17 07:30:00", "stopTime": "2015-08-17 16:00:00", "timecardComment": "Regular work", "timecardCategoryID": "31" },
        { "startTime": "2015-08-18 09:00:00", "stopTime": "2015-08-18 17:30:00", "timecardComment": "", "timecardCategoryID": "31" },
        { "startTime": "2015-08-19 08:00:00", "stopTime": "2015-08-19 16:00:00", "timecardComment": "Product Bugs Fixing", "timecardCategoryID": "31" },
        { "startTime": "2015-08-20 08:30:00", "stopTime": "2015-08-20 17:30:00", "timecardComment": "Customers Demo Preparation", "timecardCategoryID": "31" },
        { "startTime": "2015-08-21 09:00:00", "stopTime": "2015-08-21 17:00:00", "timecardComment": "Holiday taken", "timecardCategoryID": "33" }
      ]
    }
  }
}
The JSON sample below carries the output information from the PLSQL package back inside the response message:
{
  "EBSTimecardCreationOutput": {
    "timecardID": "6232",
    "status": "Success",
    "message": "Timecard with ID 6232 created for User GEVANS"
  }
}
As shown, we can use complex types in the EBS database and create a corresponding JSON structure that can be mapped 1:1 for request and response parameters.
Creating an EBS Integration
To start with the creation of an EBS integration, an Application Developer must log in to the assigned Integration Cloud Service instance with the username and password as provided.
The entry screen after login shows the available activities:
- Connections
- Integrations
- Dashboard
As an Application Developer we will choose Integrations to create, modify or activate integration flows. Connections handling was shown earlier in this article, and Dashboard is the option to monitor runtime information.
To create a new integration flow choose Create New Integration and Map My Data. This will create an empty integration where you have the opportunity to connect to adapters/endpoints and to create data mappings.
Enter the following information:
- Integration Name : Visible Integration name, can be changed
- Identifier : Internal Identifier, not changeable once created
- Version : Version number to start with
- Package Name (optional) : Enter name if integration belongs to a package
- Description (optional) : Additional explanatory information about integration
The screenshot below shows an integration that is 100% complete and ready for activation. When creating a new integration, both the source and target sides will be empty. The suggestion is to start by creating a source, as marked on the left side in the figure below.
As mentioned before, it is good practice to follow a top-down approach. In this case the payload for the REST service is already defined and exists in the form of a JSON sample.
The following information will be requested when running the Oracle REST Endpoint configuration wizard:
- Name of the endpoint (what do you want to call your endpoint?)
- Description of this endpoint
- Relative path of this endpoint like /employee/timecard/create in our sample
- Action for this endpoint like GET, POST, PUT, DELETE
- Options to be configured, like:
- Add and review parameters for this endpoint
- Configuration of a request payload
- Configure this endpoint to receive the response
The sample screenshot below shows a configuration where a POST operation is handled by this REST endpoint, including the request and response.
The next dialog window configures the request parameters, and the JSON sample is taken as the payload file. The payload content will appear later in the mapping dialog as the input structure.
The response payload is configured similarly to the request payload. As mentioned, the input/output parameters are supposed to be defined in a top-down approach for this endpoint. In the response payload dialog we assign the sample JSON payload structure as defined for the output of this REST service.
Finally the summary dialog window appears and we can confirm and close this configuration wizard.
Next action is a similar configuration for target – in our sample the DB adapter connected to EBS database.
The DB Adapter configuration wizard starts with a Basic Information page where the name of this endpoint is requested and a general decision has to be made whether the service will use a SQL statement or make a PLSQL procedure/function call.
As shown in the screenshot below, the further dialog for a PLSQL-based database access basically starts by choosing the schema, package and procedure/function to be used. For EBS databases the schema name for PLSQL packages and procedures is usually APPS.
After making this choice the configuration is done. Any in/out parameters and return values of the chosen procedure/function become part of the request/response payload and appear in the message mapping dialog later.
In case the endpoint will run a plain SQL statement, just choose Run a SQL Statement in the Basic Information dialog window.
A different dialog window will appear which allows entering a SQL statement that might be a query or even a DML operation. Parameters must be passed in JCA notation with a preceding hash mark (#).
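As a small illustration of this notation, a query against the standard FND_USER table might look as follows; the bind name userName is an assumption for this sketch and will later surface as a request parameter in the mapping dialog:

```sql
SELECT user_name, user_id, email_address
  FROM fnd_user
 WHERE user_name = #userName
```

At runtime the adapter substitutes the #userName placeholder with the value mapped from the inbound JSON payload.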
After entering the SQL statement it must be validated by clicking the Validate SQL Query button. As long as validation error messages appear, they must be corrected in order to finalize this configuration step. Once the statement has been successfully validated, a schema file is generated.
Clicking on the schema file URL opens a dialog window that shows the generated structure, as shown below. The elements of this structure have to be mapped in the transformation component later, once the endpoint configuration is finished.
After the endpoint configuration has been finished, the newly created integration contains two transformations – one for request/inbound and another one for response/outbound mappings.
The mapping component itself follows the same principles as the comparable XSLT mapping tools in Fusion Middleware's integration products. As shown in the screenshot below, the mapped fields are marked with a green check mark. The sample shows an input structure with a single field (here: userName) and a collection of records.
The sample below shows the outbound message mapping. In the corresponding PLSQL procedure three parameters are marked as type OUT and carry the return information in the JSON output message.
Once finished with the message mappings, the final step for integration flow completion is the addition of at least one tracking field (see link on top of page). This means one field in the message payload has to be identified for monitoring purposes. The completion level will change to 100% afterwards. The integration must be saved, and the Application Developer can return to the integration overview page.
The last step is the activation of the integration flow – supposed to be a straightforward task. Once the completion level of 100% has been reached, the integration flow is ready to be activated.
After clicking the Activate button, a confirmation dialog appears asking whether this flow should be traced or not.
Once activated the REST endpoint for this integration is enabled and ready for invocation.
Entering the following URL in a browser window will test the REST interface and return a sample:
- https://<hostname>:<port>/integration/flowapi/rest/<Integration Identifier>/v<version>/metadata
Testing the REST integration workflow requires a tool like SoapUI to post a JSON message to the REST service. In this case the URL above changes by adding the resource path as configured in the REST endpoint wizard:
- https://<hostname>:<port>/integration/flowapi/rest/<Integration Identifier>/v<version>/employee/timecard/create
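Alternatively, a quick test can be run from the command line. The following curl sketch assumes the payload from the sample above has been saved to a local file; the file name timecard_request.json is an assumption, and hostname, port and credentials are placeholders to be replaced with your own values:

```shell
curl -u <Weblogic_User>:<password> \
     -H "Content-Type: application/json" \
     -X POST \
     -d @timecard_request.json \
     "https://<hostname>:<port>/integration/flowapi/rest/<Integration Identifier>/v<version>/employee/timecard/create"
```

A successful call returns the JSON response message with the OUT parameter values of the PLSQL procedure.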
Security Considerations
Earlier in this document we discussed the creation of a DB endpoint in EBS and the authentication as the APPS user. In general it is possible to use other DB users alternatively. The use of a higher-privileged user like SYSTEM is probably not required and also not recommended, due to the impact should this connection be compromised.
There are multiple factors having an influence on the security setup tasks to be done:
- What are the security requirements in terms of accessed data via this connection?
- Gathering of non-sensitive information vs running business-critical transactions
- Common data access like reading various EBS configuration information vs user specific and classified data
- Does this connection have to provide access to all EBS objects in database (packages, views across all modules) or can it be restricted to a minimum of objects being accessed?
- Is the session running in a specific user context or is it sufficient to load data as a feeder user into interface tables?
Depending on the identified integration purpose, the security requirements might range from moderate to extremely high. To restrict user access to a maximum, it is possible to create a user with limited access to only a few objects, like APPLSYSPUB. Access to PLSQL packages would be granted on demand.
If database access is required to run in a specific context, the existing EBS DB features to put a session into a dedicated user or business org context – FND_GLOBAL.APPS_INITIALIZE or MO_GLOBAL.INIT (R12 onward) – must be used. That will probably have an impact on the choice between running a plain SQL statement and a PLSQL procedure: with the requirement to perform a preceding call of FND_GLOBAL, even a SELECT statement has to run inside a procedure, and the result values must be declared as OUT parameters as shown previously.
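As a hedged sketch of this pattern, such a wrapper procedure might look as follows. The procedure name, the parameter names and the query against PER_ALL_PEOPLE_F are illustrative assumptions only; FND_GLOBAL.APPS_INITIALIZE itself is the standard EBS API for initializing the session context:

```sql
procedure getEmployeeName(
  userId     in  number,
  respId     in  number,
  respApplId in  number,
  personId   in  number,
  fullName   out varchar2) is
begin
  -- put the session into the desired EBS user/responsibility context first
  FND_GLOBAL.APPS_INITIALIZE(user_id      => userId,
                             resp_id      => respId,
                             resp_appl_id => respApplId);

  -- the SELECT runs inside the procedure; the result comes back via an OUT parameter
  select full_name
    into fullName
    from per_all_people_f
   where person_id = personId
     and trunc(sysdate) between effective_start_date and effective_end_date;
end getEmployeeName;
```

In this shape the statement can be exposed through the DB Adapter as a PLSQL call, with the context initialization guaranteed to happen before the query.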
In general, the requirement to perform user authentication is outside the scope of this (EBS) database adapter. In practice, the layer on top of ICS must ensure that no unauthorized user access is given. While connection encryption via SSL is supposed to be the standard, there could obviously be a need to create a full logical session management for end-user access, including user identification, authentication and session expiration.
Such a deep-dive security discussion is out of scope for this blog and should be handled in another article.
For non-EBS databases similar considerations will obviously apply.
Contribution and Conclusion
This blog posting was dedicated to giving an overview of the quite new DB Adapter in ICS. While recent EBS releases benefit from integrating via the EBS adapter or built-in tools, the older versions probably won't. Using the DB Adapter will possibly be the preferred method to create cloud-based access to a legacy on-premise EBS database.
At this point I'd like to thank my teammate Greg Mally for his great contribution! We worked, and still work, closely together in our efforts to provide good practices for ICS adoption by our customers. Greg has recently published a great blog of his own giving a deeper technical look behind the scenes of ICS and the DB Adapter, so it is well worth reading too!
All content listed on this page is the property of Oracle Corp. Redistribution not allowed without written permission