SI_Documents

Sterling Integrator related documents.

XAPI call by Java

How to call XAPI functions directly from Java:

Copy the following JARs from the SI jar directory into your Java project:
asset.jar
b2b_aee.jar
b2b_base.jar
b2b_oba.jar
ebics.jar
entities.jar
gis.jar
install_foundation.jar
mailbox.jar
maverick-all.jar
perimeter.jar
platform_activemq.jar
platform_activity.jar
platform_afc.jar
platform_afc_security.jar
platform_aop.jar
platform_asi.jar
platform_baseutils.jar
platform_dv.jar
platform_ifcbase.jar
platform_ifcui.jar
platform_osgi.jar
platform_security.jar
platform_services.jar
resources.jar
soap.jar
standards.jar
translator.jar
 
Add these JARs to your project build path for compilation, but do not export them with your binary (they are already loaded by the SI runtime, so there is no need to package them).
 
As for the XAPI call itself, here is a sample:
 
import com.yantra.interop.japi.YIFApi;
import com.yantra.interop.japi.YIFClientFactory;
import com.yantra.interop.client.InteropEnvStub;
import com.yantra.yfc.dom.YFCDocument;

// Obtain a local handle on the XAPI layer (this code runs inside the SI JVM)
YIFApi api = YIFClientFactory.getInstance().getLocalApi();
// Environment stub: SI user ID plus a free-text label for this interface
InteropEnvStub envStub = new InteropEnvStub("admin", "My XAPI Interface");
// Invoke the XAPI by name, passing the input as an org.w3c.dom.Document
org.w3c.dom.Document xapiOutputDoc = api.invoke(envStub, xApiName, xapiInputDoc);

Note: xapiInputDoc must be an org.w3c.dom.Document containing the API input document.

Thanks to fabiengb for this code! 
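
To complete the picture, the input document can be built with the standard JDK DOM API. Here is a minimal sketch of that step; the API name and input XML are placeholders rather than part of the original post, and the fragment assumes a surrounding method that declares throws Exception:

import javax.xml.parsers.DocumentBuilder;
import javax.xml.parsers.DocumentBuilderFactory;
import java.io.ByteArrayInputStream;
import java.nio.charset.StandardCharsets;
import org.w3c.dom.Document;

// Placeholder API name and input XML; replace with the XAPI you actually want to call.
String xApiName = "getUserList";
String inputXml = "<GetUserList MaximumRecords=\"10\"/>";

// Parse the XML string into the org.w3c.dom.Document that api.invoke(...) expects
DocumentBuilder builder = DocumentBuilderFactory.newInstance().newDocumentBuilder();
Document xapiInputDoc = builder.parse(new ByteArrayInputStream(inputXml.getBytes(StandardCharsets.UTF_8)));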


Posted by Debashis Ray

QSS - Sterling Quick Start Services

Sterling Quick Start Services

 

Purpose of this Document:


          This document describes my experience working with Sterling Quick Start Services, commonly known as QSS. It includes some tips and tricks, a known bug of this product and its workaround, and my general impressions of QSS. IBM has since stopped supporting this product, so this document may help those who still work with QSS in their organization, as well as anyone who enjoys reading about hands-on experience with it. I have not included the configuration or migration procedures, as they can be found in IBM's documentation for this product.

 

Concept:


          As the name suggests, QSS was developed and marketed by IBM as an extension of Sterling B2B Integrator to quickly onboard and integrate partners into the system. To me it feels like a framework inside SI that provides additional help in building the EDI and non-EDI dataflow architecture of a company. Perhaps this is why all the backend tables and the ready-made BPs follow the naming pattern "SIFRAMEWORK_".

 

How this product will Help:


          This product will definitely help to set up a new partner quickly, but only if the requirements match those of an existing partner or follow existing processes (in QSS these are referred to as Rules and Rule Groups). If the requirement needs more customization, we have to create new Rules, which in turn leads to creating a new set of BPs in SI. In other words, the quickness of Sterling Quick Start Services depends on the diversity of the requirements, so sometimes we may find it less useful.

 

Components of QSS:


          There are four components: Trading Partner setup (this is not the same as the SFG partner setup; we may call it the QSS TP setup), Rules, Rule Group creation, and multi-destination setup.

A Rule is nothing but an individual BP. Say, for example, we have developed a BP that translates the primary document; we can then create a rule called "translation" and assign it to that BP. Similarly, we can create rules like "mailbox_add", "FTP", etc.

A Rule Group is a combination of up to six rules. Say the new partner's data needs to be translated first and then sent by FTP; we can create a rule group containing the translation and FTP rules and assign it to the partner while creating the Trading Partner setup in QSS.

 

I have not explored the multi-destination setup, but it is used to send a file to different locations for a partner.

Now, to set up a QSS partner, we have to create a QSS TP and the corresponding QSS TP documents; below is a sample screenshot. The parameters are passed as XML tags into the ProcessData of the BPs assigned in that rule group.

So basically, while onboarding a new partner, we have to create the QSS TP, and the TP documents, and assign the correct rule group to it.

 

        

 

How it works:

          If we look closely at the backend tables and the built-in BPs provided by this product, we can see that it basically stores the parameters (QSS TP, TP docs, Rules, etc.) in backend tables such as SIFRAMEWORK_TP and SIFRAMEWORK_DOCUMENTS, and simply fires a Lightweight JDBC adapter inside the BP to retrieve them and call the desired BPs inline to process the data.



Tips & Tricks:

 

1)   How to create several partners in QSS at once:
Just form INSERT statements like the ones below and execute them in the SI backend database; similarly, we can make the corresponding entries in SIFRAMEWORK_DOCUMENTS.
insert into SIFRAMEWORK_TP (TP_NAME) VALUES ('PARTNER1');
insert into SIFRAMEWORK_TP (TP_NAME) VALUES ('PARTNER2');
insert into SIFRAMEWORK_TP (TP_NAME) VALUES ('PARTNER3');
commit;



2)   How to use more than six rules for a specific partner:
There are two ways to do this. Either create a new BP that combines two or more of the BPs behind the rules you want to apply to the partner, and then create a new rule and rule group for it; or create a second TP document and pass the data from the first TP document to the second one, assigning the first six rules (via a rule group) to the first TP document and the remaining rules to the second. The output of the last TP document is then the desired output for that partner.

One known BUG and its workaround:

          During migration of QSS partners, the map name does not get migrated into the destination environment (or SI instance), so every time we have to manually update the map name in the QSS TP document. This is a known bug in QSS, and the process below can be used as a workaround.

 

For context, let me describe the migration process briefly: first we export the TP data and the TP Docs data from QSS, and then in the target environment we execute the SIFramework_MigrationUtil_ImportData BP, with the TP data first and then with the TP Docs.

The migration process itself remains the same. We modify the SIFramework_MigrationUtil_ImportData BP to take the map value from the source environment and update it directly in the destination environment using a JDBC adapter. To fetch the map value from the source environment, we can use the Java code below in a Java Task service.

 Java Code:


import java.sql.*;

// Connection details for the source SI database (Oracle)
String url = "jdbc:oracle:thin:@edidbdv00:1521:";
String dbName = "edidev";
String driver = "oracle.jdbc.OracleDriver";
String userName = "sbidb";
String password = "passwd";

// Current TP and TP document names, taken from ProcessData by the Java Task service
String ctp = (String) wfc.getWFContent("ctp");
String ctpd = (String) wfc.getWFContent("ctpd");

// Look up the map name configured for this TP document in the source environment
String sql = "select map_name from SIFRAMEWORK_DOCUMENTS where tp_name = '" + ctp + "' and document_name='" + ctpd + "'";

try {
    Class.forName(driver).newInstance();
    // Debug output (note: this prints the credentials to the log)
    System.out.println(url + dbName + userName + password);
    Connection conn = DriverManager.getConnection(url + dbName, userName, password);
    Statement stmt = conn.createStatement();
    ResultSet rs = stmt.executeQuery(sql);
    while (rs.next()) {
        // Put the map name into ProcessData as <mapn> for the update step that follows
        wfc.setWFContent("mapn", rs.getString("MAP_NAME"), true);
    }
    conn.close();
} catch (Exception e) {
    e.printStackTrace();
}
return "";

 -       where edidbdv00 is the source SI database host, the schema name is sbidb, and it is an Oracle database.

 Modified BP:

 
 

 

I added the following portion to the original BP:

<rule name="Loopenter">
                    <condition>count(/ProcessData/test/document) &gt;= /ProcessData/docctr/text()</condition>
          </rule>
          <assign to="docctr">1</assign>
          <choice name="LoopSQCHC">
                    <select>
                              <case ref="Loopenter" activity="LoopSQ"/>
                    </select>
                    <sequence name="LoopSQ">
                              <assign name="Assign" to="ctp" from="/ProcessData/test/document[position()=/ProcessData/docctr/text()]/TPName/text()"/>
                              <assign name="Assign" to="ctpd" from="/ProcessData/test/document[position()=/ProcessData/docctr/text()]/DocumentName/text()"/>
                              <operation name="JavaTask Service">
                                       <participant name="BDimportqss"/>
                                       <output message="JavaTaskInputMessage">
                                                 <assign to="." from="*"/>
                                       </output>
                                       <input message="inmsg">
                                                 <assign to="." from="*"/>
                                       </input>
                              </operation>
                              <assign name="Assign" to="sql" from="concat(&quot;update SIFRAMEWORK_DOCUMENTS set map_name = &apos;&quot;,mapn,&quot;&apos; where tp_name = &apos;&quot;,/ProcessData/ctp,&quot;&apos; and document_name=&apos;&quot;,/ProcessData/ctpd,&quot;&apos;&quot;)"/>
                              <operation name="FirstLookup">
                                       <participant name="CVI_Lightweight_JDBC"/>
                                       <output message="LightweightJDBCAdapterTypeInputMessage">
                                                 <assign to="query_type">UPDATE</assign>
                                                 <assign to="pool">oraclePool</assign>
                                                 <assign to="result_name">Result1</assign>
                                                 <assign to="row_name">row</assign>
                                                 <assign to="sql" from="sql/text()"/>
                                                 <assign to="." from="*"/>
                                       </output>
                                       <input message="inmsg">
                                                 <assign to="JDBC_Lookup" from="DocToDOM(PrimaryDocument)" append="true"/>
                                       </input>
                              </operation>
                              <assign to="docctr" from="docctr+1"/>
                              <operation name="Release Service">
                                       <participant name="ReleaseService"/>
                                       <output message="ReleaseServiceTypeInputMessage">
                                                 <assign to="." from="*"/>
                                                 <assign to="TARGET">sql | mapn | ctp | ctpd</assign>
                                       </output>
                                       <input message="inmsg">
                                                 <assign to="." from="*"/>
                                       </input>
                              </operation>
                              <repeat name="LoopSQCHCrepeat" ref="LoopSQCHC"/>
                    </sequence>
          </choice>

 

 

 --- With Warm Wishes --- Debashis – Dt: August 12 , 2015

 

 

 

Posted by Mirjana

XAPI inputs for Replay and Redeliver in File Gateway

Here are the input XMLs for Replay and Redeliver in SFG:
 

<replayFgArrivedFile ArrivedFileKey="xxx" Comment="BulkReplay"/>

<redeliverFgConsumerFile DeliveryKey="xxx" Comment="BulkReplay"/>
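
These inputs can also be submitted from Java using the local XAPI pattern shown in the first section of this page. A minimal sketch, assuming the XAPI name matches the input root element and that the code lives in a method declaring throws Exception (the ArrivedFileKey is a placeholder):

import com.yantra.interop.japi.YIFApi;
import com.yantra.interop.japi.YIFClientFactory;
import com.yantra.interop.client.InteropEnvStub;
import javax.xml.parsers.DocumentBuilderFactory;
import java.io.ByteArrayInputStream;
import java.nio.charset.StandardCharsets;
import org.w3c.dom.Document;

// Placeholder key; use the ArrivedFileKey of the file you want to replay
String inputXml = "<replayFgArrivedFile ArrivedFileKey=\"xxx\" Comment=\"BulkReplay\"/>";
Document inputDoc = DocumentBuilderFactory.newInstance().newDocumentBuilder()
        .parse(new ByteArrayInputStream(inputXml.getBytes(StandardCharsets.UTF_8)));

YIFApi api = YIFClientFactory.getInstance().getLocalApi();
InteropEnvStub envStub = new InteropEnvStub("admin", "Bulk replay");
// XAPI name assumed to match the input root element shown above; verify against your SFG version
Document outputDoc = api.invoke(envStub, "replayFgArrivedFile", inputDoc);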

 

 

Posted by Mirjana

Custom property file in location different than default

Use a custom properties file that is defined in a folder other than the default one (the default is installFolder/properties).
 
If we want to define a property file in any folder and use its values via the sci-get-property() function, do the following:
 
  1. Create the property file, e.g. installFolder/MIRJANA/mirjana.properties
 
 
  2. The content of mirjana.properties is:
 
propName = propValue
 
  3. Add the following line to installFolder/properties/servers.properties; it maps the property file name to its location in the file system:
 
myCustomProp=C:/SterlingCommerce/SI50/install/MIRJANA/mirjana.properties
 
  4. Run setupfiles.sh and restart the system.
 
  5. Create the following BP to refresh the cache with the custom property values and read a property with the sci-get-property() function:
 
<process name="default">
      <sequence>
            <operation name="Cache Refresh Service">
                  <participant name="CacheRefreshService"/>
                  <output message="CacheRefreshServiceTypeInputMessage">
                        <assign to="." from="*"/>
                        <assign to="cache_name">myCustomProp</assign>
                        <assign to="cache_type">properties</assign>
                  </output>
                  <input message="inmsg">
                        <assign to="." from="*"/>
                  </input>
            </operation>
            <assign to="VAR" from="sci-get-property('myCustomProp','propName')"/>
      </sequence>
</process>
 
 
  6. The result in the process data is:
 
<?xml version="1.0" encoding="UTF-8"?>
<ProcessData>
      <VAR>propValue</VAR>
</ProcessData>
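
For reference, mirjana.properties is an ordinary Java properties file, so outside of SI the same value can be read directly with java.util.Properties. A minimal sketch, using the path registered in step 3:

import java.io.FileInputStream;
import java.util.Properties;

// Path taken from step 3 above; adjust to your installation
Properties props = new Properties();
try (FileInputStream in = new FileInputStream("C:/SterlingCommerce/SI50/install/MIRJANA/mirjana.properties")) {
    props.load(in);
} catch (Exception e) {
    e.printStackTrace();
}
// Prints "propValue" - the same value sci-get-property('myCustomProp','propName') returns in the BP
System.out.println(props.getProperty("propName"));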
 
 
 
Posted by Mirjana

Encrypt – Decrypt password in a translation map

 
  • To encrypt a field in a translation map, we can write the following code in an extended rule:
 
string[255] password;
object ob;
 
ob = new("com.sterlingcommerce.security.opsutilities.DBPasswords");
password = ob.encrypt($COLL_FTP_password.#COLL_FTP_password);
 
  • To decrypt a field in a translation map, the extended rule is:
 
object ob;
ob = new("com.sterlingcommerce.security.opsutilities.DBPasswords");
#COLL_FTP_password = ob.decrypt(#COLL_FTP_password);
 
I found that this cannot be done on the output side when the output format is SQL; it works only on the input side, or at any extended-rule level where it is possible, either directly on an input field or by taking the field value into a variable and running the method (encrypt or decrypt) on that variable.
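
The same class can also be called from Java code running inside SI (for example in a Java Task service). A minimal sketch, assuming DBPasswords has a public no-argument constructor and String encrypt/decrypt methods, as the extended-rule usage above suggests; verify the exact signatures against your SI version:

import com.sterlingcommerce.security.opsutilities.DBPasswords;

// Assumption: no-arg constructor plus encrypt(String)/decrypt(String), mirroring the extended rules above
DBPasswords passwords = new DBPasswords();
String encrypted = passwords.encrypt("myFtpPassword");   // value to store
String decrypted = passwords.decrypt(encrypted);         // recovers "myFtpPassword"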
 
Posted by Mirjana

Sort function in translation map

Sorting data in translation map
 
Input data:
 
1,123,def
3,345,xxx     
2,234,bcd
 
Simple CSV map is:
 
 
 
  • If we want to sort by ID, the extended rule in the INPUT (root) record, On End, must be:
 
sort($record,#ID);
 
Screenshot follows:
 
 
And the result/output is:
 
1,123,def
2,234,bcd
3,345,xxx 
 
  • The extended rule for sorting by stringField, in ascending order, is:
 
sort($record,#stringField);
... or ...
sort($record,#stringField asc);
 
... and the result is:
 
2,234,bcd
1,123,def
3,345,xxx   
 
 
  • The extended rule for sorting by stringField, in descending order, is:
 
sort($record,#stringField desc);
 
... and the result is:
 
3,345,xxx     
1,123,def
2,234,bcd
Posted by Somasekhar

SI Cluster - Service Group

Service Groups in GIS

 

The FTP Client adapter is considered a non-clusterable adapter and should instead be set up in a service group.
Non-clusterable adapters can be grouped together using service groups and accessed as a single entity. These adapters include:

  • FTP server adapter
  • HTTP server adapter
  • FTP client adapter
  • HTTP client adapter
  • Connect:Direct adapter
  • Connect:Enterprise adapter
  • B2B Communication adapter

Here is how to create a service group for protocol adapters

 

Creating a Service Configuration

 

In some cases, you must create a configuration of the service you want to use. You can create many configurations from one service type.

Note: If you are using a MySQL database, do not create service configurations whose names begin with an accented character.

You can save your configuration at any point in the configuration process by clicking Save. You can then return to it later.

To create a service configuration:

  1. From the Deployment menu, select Services > Configuration.
  2. In the Services Configuration page, next to Create New Service, click Go!
  3. Use the Tree View or List View to locate and select a service to configure, or type the full name of the service in the Service Type field. Click Next.
  4. Type a unique and meaningful name and description for your configuration in the appropriate fields and click Next.
  5. Select or create a service group for this service configuration, as needed. Selections are:
    • None - You do not want to include this configuration in a group at this time.
    • Create New Group - You can enter a name for a new group in this field, which will then be created along with this configuration.
    • Select Group - If you have already created one or more groups for this service type, they are displayed in the list. Select a group from the list.

For more information, see Using Service Groups.

  6. Complete the fields specific to the service. See the documentation for a specific service for more information.
  7. To enable this configuration for business processes, verify that Enable for Business Processes is selected.
  8. In the Confirm page, verify the information about the service configuration you created, and then click Finish to add it to Gentran Integration Suite.
  9. To determine if additional configuration is necessary, see the documentation for the specific service. You may need to specify additional parameters for some services using the Service Editor in the GPM.

 

Using Service Groups

 

In Gentran Integration Suite, you can create groups of service configurations. A group can only contain service configurations of one service type. A service group is a set of service configurations of the same service type that can act as peers (can be configured to perform the same activity in the same setting). In some situations, using a service group instead of a single service configuration can be an effective way to do load balancing and failover processing. For example, you might have many trading partners submitting data that gets collected by a File System adapter on your Gentran Integration Suite system. Because this might cause a bottleneck when processing the incoming data, you could create a group of File System adapters and use the group in your business process, instead of just one File System adapter. As a result, Gentran Integration Suite can balance the incoming data load across all the File System adapters in the group. This can reduce wait times, free up system resources, and provide failover capability - if one File System adapter fails, the other File System adapters in the group can still process requests.

Load balancing and failover in a cluster can be helped by service groups, which are groups of services or adapters of the same type that can act as peers and share the load. More than one service instance of the same service/adapter type can be configured and placed within a service group. Thus, all adapters in a service group are viewed as a single entity and a service group name (instead of the service instance name) is referenced in the business process.

In a cluster environment, the service group does a round robin of all adapters on a per node basis. Each node handles its own load balancing across all adapters belonging to the service group. If a service group detects failure of a specific adapter, it goes out to get a good one, thus providing not only load balancing but also failover.

Adapters that cannot be configured on every cluster node (because of resource constraints or connectivity to external systems) can be deployed one per node and placed on a service group. The business process can be configured to use a service group instead of a service instance, thus attaining clusterable adapters deployed on all nodes, facilitating failover and load balancing.

Here are some important concepts about service groups:

  • A service group can include only one service type. For example, a group can contain multiple configurations of the HTTP Server adapter, but cannot contain an HTTP Server adapter configuration and an FTP Server adapter configuration.
  • Groups do not exist as actual entities in the system; groups are only stored as parameters in service configurations.
  • The only way to create or edit a group is by creating or editing a service configuration.
  • A service group cannot have the same name as an individual service or as another group, even if the group is for a different service type.
  • To remove a service group from Gentran Integration Suite, simply remove all the service configurations from the group. Also, if you delete all the service configurations that are part of the group, the group no longer exists.
  • Once a group no longer exists, you can reuse its name for a new service configuration or group.
  • When writing BPML, to use a service group, refer to it in the same way that you would an individual service configuration. For example, if you had a service group named MyHTTPServers, you could use the statement <participant name="MyHTTPServers"/> in a business process.
  • <assign to="FTPClientAdapter">FTPServiceGroup</assign> (by default the assign references the FTPClientAdapter instance; change the value to the service group you created)
  • <assign to="HTTPClientAdapter">HTTPServiceGroup</assign>
  • In the GPM, you can see and select service groups from the config list, the same way that you would select one individual service configuration.

 

Bullet Points:

 

1) In the config list, service groups are not distinguished from individual service configurations

2) All adapters that use perimeter services are non-clusterable, so in our case the FTP client, FTP server, HTTP server, HTTP client, and SFTP adapters need to point to a specific node attached to the perimeter server to which they are connected. These can be grouped together using service groups to achieve load balancing/failover.

 

3) The other adapters, which do not use perimeter services, can point to "All" under a service group.

 

4) We can create multiple File System adapters (FSA) and point them to Environment "All" only when we use a shared file system.

 

5) The SAP adapter also does not use perimeter services, so we can point it to "All", but note this line from the documentation:

   Licensing restrictions: For example, the SAP and Web Methods Enterprise adapters might not be fully clustered across all server instances because of the licensing agreement.

 

 

Documentation Links :

 

http://www.sterlingcommerce.com/Documentation/Plat_IFC10/Content/Clustering/cluster_considerations.html

http://www.sterlingcommerce.com/Documentation/Plat_IFC10/Content/Clustering/cluster_architecture.html

http://www.sterlingcommerce.com/Documentation/Plat_IFC10/Content/Clustering/cluster_questions.html

 

  

Example screenshots

 

 

 

Posted by Mirjana

Install CLA2 as Windows Service

CLA2 Installation on a remote server – as a windows service

Copy the following files on a remote server:

-          copy the file JavaService.exe (you can find it in folder /SI_install_dir/install/client/msmq/installJavaService)

-          copy the file CLA2Client.jar (you can find it in folder /SI_install_dir/install/client/cmdline2)

-          write the script InstallCLA2.cmd so it fits your installation:

_______________________________________________________________

 

REM *** INSTALLS A CLA2 Client as Windows Service ***

 

REM *** CHANGE TO MATCH YOUR INSTALL ***

 

set ListenPort=13052

set ClientJar=C:\CLA2\CLA2Client.jar

set JVM="C:\Program Files\Java\jre1.5.0\bin\client\jvm.dll"

set StdOut="C:\CLA2\serviceOutput.txt"

set ErrorOut="C:\CLA2\serviceError.txt"

 

REM *** DO NOT HAVE TO CHANGE ***

 

set ServiceName=SI_CLA2Adapter_%ListenPort%

set StartClass=com.sterlingcommerce.woodstock.services.cmdline2.CmdLine2RemoteImpl

 

REM *** CHANGE TO MATCH YOUR INSTALL ***

 

C:\CLA2\JavaService -install "%ServiceName%" %JVM% -Djava.class.path=%ClientJar% -start %StartClass% -params %ListenPort% -out %StdOut% -err %ErrorOut%

_______________________________________________________________

 

 

The Java version on the remote system must be the same as the one used by IBM Sterling Integrator.

 

-          run the script

-          after installation go to Start → Run → regedit

-          find HKEY_LOCAL_MACHINE/SYSTEM/CurrentControlSet/Services/SI_CLA2Adapter_16052* (*the name of the service)

-          choose Parameters → JVM Option Number 0 and set the correct path to the CLA2Client.jar file.

 

This part is only for CLA2 used in PGP encryption

 

-          Start → Control Panel → Administrative Tools → Services

-          Choose the CLA2 service and click Properties → Log On. Select "This account" and set the user that generated the keys.

 

  *** OR ***

nssm install "Command Line Adapter" "c:\Program Files\Java\jre6\bin\java.exe" "-jar c:\cla2\CLA2Client.jar 12699"

Posted by Mirjana

Create Schedule by XAPI function

Input XML data examples for createSchedule XAPI function
 
The XAPI function createSchedule can create a schedule based on XML input data. Below are four examples, for TIMER, DAILY, WEEKLY, and MONTHLY schedules. The input data was taken from an exported schedule, which looks like this:
 
<?xml version="1.0" encoding="UTF-8"?>
<SI_RESOURCES xmlns="http://www.stercomm.com/SI/SI_IE_Resources" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" GISVersion="3207" FrameworkVersion="2">
          <SCHEDULES>
                   <SCHEDULE>
                             <SCHEDULE_TYPE>2</SCHEDULE_TYPE>
                             <ASSOCIATED_BP_NAME>Alert</ASSOCIATED_BP_NAME>
                             <ASSOCIATED_SERVICE_NAME>Alert</ASSOCIATED_SERVICE_NAME>
                             <SCHEDULE_PARAMS/>
                             <SCHEDULE_ONSTARTUP>1</SCHEDULE_ONSTARTUP>
                             <SCHEDULE_EXECUTION_COUNT>-1</SCHEDULE_EXECUTION_COUNT>
                             <SCHEDULE_EXECUTION_CURRENT_COUNT>0</SCHEDULE_EXECUTION_CURRENT_COUNT>
                             <SCHEDULE_EXECUTION_STATUS>WAIT</SCHEDULE_EXECUTION_STATUS>
                             <SCHEDULE_STATUS>ACTIVE</SCHEDULE_STATUS>
                             <SCHEDULE_SYSTEMNAME>node1</SCHEDULE_SYSTEMNAME>
                             <SCHEDULE_USERID>admin</SCHEDULE_USERID>
                             <SCHEDULE_TIMINGXML><![CDATA[<timingxml><days><day ofMonth="1"><times><time>2200</time></times></day><day ofMonth="3"><times><time>2300</time></times></day><day ofMonth="10"><times><time>0000</time></times></day><day ofMonth="12"><times><time>1200</time></times></day><day ofMonth="24"><times><time>0100</time></times></day></days><excludedDates></excludedDates></timingxml>]]></SCHEDULE_TIMINGXML>
                        </SCHEDULE>
          </SCHEDULES>
          <BPDEFS>
                   <BPDEF>
                             <LangResource>SIB64ENCODEPHByb2Nlc3MgbmFtZT0iQWxlcnQiPgoJPHNlcXVlbmNlIG5hbWU9IkFsZXJ0TWFpbiI+CgkJPG9wZXJhdGlvbiBuYW1lPSJBbGVydExvY2siPgoJCQk8cGFydGljaXBhbnQgbmFtZT0iU3lzdGVtTG9ja1NlcnZpY2UiLz4KCQkJPG91dHB1dCBtZXNzYWdlPSJYb3V0Ij4KCQkJCTxhc3NpZ24gdG89IkxPQ0tfS0VZIj5BTEVSVExPQ0s8L2Fzc2lnbj4KCQkJCTxhc3NpZ24gdG89IkRVUkFUSU9OIj44NjQwMDAwMDwvYXNzaWduPgoJCQkJPGFzc2lnbiB0bz0iQ0xFQVJfT05fU1RBUlRfVVAiPnRydWU8L2Fzc2lnbj4KICAgICAgICAgICAgICAgICAgICAgICAgICAgICAgICA8YXNzaWduIHRvPSJVU0VSIj5TeXN0ZW1Mb2NrU2VydmljZTwvYXNzaWduPgoJCQkJPGFzc2lnbiB0bz0iLiIgZnJvbT0iKiI+PC9hc3NpZ24+CgkJCTwvb3V0cHV0PgoJCQk8aW5wdXQgbWVzc2FnZT0iWGluIj4KCQkJCTxhc3NpZ24gdG89Ii4iIGZyb209IioiPjwvYXNzaWduPgoJCQk8L2lucHV0PgoJCTwvb3BlcmF0aW9uPgoJCgkJPG9wZXJhdGlvbiBuYW1lPSJBbGVydCBQcm9jZXNzIFJlcG9ydCI+CgkJCTxwYXJ0aWNpcGFudCBuYW1lPSJBbGVydFNlcnZpY2UiLz4KCQkJPG91dHB1dCBtZXNzYWdlPSJBbGVydFNlcnZpY2VUeXBlSW5wdXRNZXNzYWdlIj4KCQkJPC9vdXRwdXQ+CgkJCTxpbnB1dCBtZXNzYWdlPSJpbm1zZyI+CgkJCTwvaW5wdXQ+CgkJPC9vcGVyYXRpb24+CgkJCgkJPG9wZXJhdGlvbiBuYW1lPSJBbGVydFVuTG9jayI+CgkJPHBhcnRpY2lwYW50IG5hbWU9IlN5c3RlbUxvY2tTZXJ2aWNlIi8+CgkJPG91dHB1dCBtZXNzYWdlPSJYb3V0Ij4KCQkJPGFzc2lnbiB0bz0iQUNUSU9OIj51bmxvY2s8L2Fzc2lnbj4KCQkJPGFzc2lnbiB0bz0iTE9DS19LRVkiPkFMRVJUTE9DSzwvYXNzaWduPgogICAgICAgICAgICAgICAgICAgICAgICA8YXNzaWduIHRvPSJVU0VSIj5TeXN0ZW1Mb2NrU2VydmljZTwvYXNzaWduPgoJCQk8YXNzaWduIHRvPSIuIiBmcm9tPSIqIj48L2Fzc2lnbj4KCQk8L291dHB1dD4KCQk8aW5wdXQgbWVzc2FnZT0iWGluIj4KCQkJPGFzc2lnbiB0bz0iLiIgZnJvbT0iKiI+PC9hc3NpZ24+CgkJPC9pbnB1dD4KCQk8L29wZXJhdGlvbj4KCQkKCQk8b25GYXVsdD4KCQkJPG9wZXJhdGlvbj4KCQkJPHBhcnRpY2lwYW50IG5hbWU9IlN5c3RlbUxvY2tTZXJ2aWNlIi8+CgkJCTxvdXRwdXQgbWVzc2FnZT0iWG91dCI+CgkJCQk8YXNzaWduIHRvPSJBQ1RJT04iPnVubG9jazwvYXNzaWduPgoJCQkJPGFzc2lnbiB0bz0iTE9DS19LRVkiPkFMRVJUTE9DSzwvYXNzaWduPgogICAgICAgICAgICAgICAgICAgICAgICAgICAgICAgIDxhc3NpZ24gdG89IlVTRVIiPlN5c3RlbUxvY2tTZXJ2aWNlPC9hc3NpZ24+CgkJCQk8YXNzaWduIHRvPSIuIiBmcm9tPSIqIj48L2Fzc2lnbj4KCQkJPC9vdXRwdXQ+CgkJCTxpbnB1dCBtZXNzYWdlPSJYaW4iPgoJCQkJPGFzc2lnbiB0bz0iLiIgZnJvbT0iKiI+PC9hc3NpZ24+CgkJCTwvaW5wdXQ+CgkJCTwvb3BlcmF0aW9uPgoJCTwvb25GYXVsdD4KIAk8L3NlcXVlbmNlPgo8L3Byb2Nlc3M+CgoK</LangResource>
                             <ConfigResource>
                                      <ConfDescription>created via command line</ConfDescription>
                                      <ConfProcessName>Alert</ConfProcessName>
                                      <ConfWFDID>645</ConfWFDID>
                                      <ConfWFDVersion>1</ConfWFDVersion>
                                      <OBJECT_VERSION>1</OBJECT_VERSION>
                                      <SIResourceDefaultVersion>true</SIResourceDefaultVersion>
                                      <ConfPersist>0</ConfPersist>
                                      <ConfLifeSpan>-1</ConfLifeSpan>
                                      <ConfRemoval>0</ConfRemoval>
                                      <ConfDocStorage>4</ConfDocStorage>
                                      <ConfPriority>4</ConfPriority>
                                      <ConfRecoveryLevel>3</ConfRecoveryLevel>
                                      <ConfOnfaultFlag>false</ConfOnfaultFlag>
                                      <ConfStatus>1</ConfStatus>
                                      <ConfLastUsed>Empty</ConfLastUsed>
                                      <ConfEncoding>None</ConfEncoding>
                                      <ConfType>1</ConfType>
                                      <ConfDocTracking>false</ConfDocTracking>
                                       <ConfDeadLineInterval>-1</ConfDeadLineInterval>
                                      <ConfFirstNotifyInterval>-1</ConfFirstNotifyInterval>
                                      <ConfSecondNotifyInterval>-1</ConfSecondNotifyInterval>
                                      <ConfEventLevel>2</ConfEventLevel>
                                      <ConfCategory/>
                                      <ConfPreferredNode/>
                                      <ConfMandatoryNode/>
                                      <ConfTransaction>false</ConfTransaction>
                                      <ConfCommitOnError>true</ConfCommitOnError>
                             </ConfigResource>
                   </BPDEF>
          </BPDEFS>
</SI_RESOURCES>
 
 
In the SCHEDULE_TIMINGXML element we can find the timing input for the XAPI function.
 
  • Example for Schedule based on Timer:
 
 
<Schedule
ExecutionTimer="TIMER"
ScheduleType="SCHEDULE_WORKFLOW"
Status="ACTIVE"
SystemName="node1"
ServiceName="UniqueScheduleName1"
WorkFlowName="TEST_assign"
UserID="admin">
          <TimingXML>
                   <days>
                             <day ofWeek="-1">
                                      <times>
                                                <timeRange>
                                                          <range>0000-2359</range>
                                                          <interval>120</interval>
                                                          <onMinute>0</onMinute>
                                                </timeRange>
                                      </times>
                             </day>
                   </days>
                   <excludedDates/>
          </TimingXML>
</Schedule>
 
 
  • Example for Daily Schedule:
 
 
<Schedule
ExecutionTimer="DAILY_WEEKLY_MONTHLY"
ScheduleType="SCHEDULE_WORKFLOW"
Status="ACTIVE"
SystemName="node1"
ServiceName="UniqueScheduleName1"
WorkFlowName="TEST_assign"
UserID="admin">
          <TimingXML>
                   <days>
                             <day ofWeek="-1">
                                      <times>
                                                <time>2200</time>
                                                <time>2300</time>
                                                <time>1000</time>
                                                <time>1100</time>
                                      </times>
                             </day>
                   </days>
                    <excludedDates/>
          </TimingXML>
</Schedule>
 
  • Example for Weekly Schedule:
 
 
<Schedule
ExecutionTimer="DAILY_WEEKLY_MONTHLY"
ScheduleType="SCHEDULE_WORKFLOW"
Status="ACTIVE"
SystemName="node1"
ServiceName="CollectionCode5"
WorkFlowName="TEST_assign"
UserID="admin">
          <TimingXML>
                   <days>
                             <day ofWeek="2">
                                      <times>
                                                <time>2200</time>
                                      </times>
                             </day>
                             <day ofWeek="3">
                                      <times>
                                                <time>2300</time>
                                      </times>
                             </day>
                             <day ofWeek="4">
                                      <times>
                                                <time>1200</time>
                                      </times>
                             </day>
                             <day ofWeek="5">
                                      <times>
                                                <time>0000</time>
                                      </times>
                             </day>
                             <day ofWeek="1">
                                      <times>
                                                <time>1400</time>
                                      </times>
                             </day>
                   </days>
                   <excludedDates/>
          </TimingXML>
</Schedule>
 
  • Example for Monthly Schedule:
 
 
<Schedule
ExecutionTimer="DAILY_WEEKLY_MONTHLY"
ScheduleType="SCHEDULE_WORKFLOW"
Status="ACTIVE"
SystemName="node1"
ServiceName="CollectionCode5"
WorkFlowName="TEST_assign"
UserID="admin">
          <TimingXML>
                   <days>
                   <day ofMonth="1">
                             <times>
                                      <time>2200</time>
                             </times>
                   </day>
                   <day ofMonth="3">
                             <times>
                                      <time>2300</time>
                             </times>
                   </day>
                   <day ofMonth="10">
                             <times>
                                      <time>0000</time>
                             </times>
                   </day>
                   <day ofMonth="12">
                             <times>
                                      <time>1200</time>
                             </times>
                   </day>
                   <day ofMonth="24">
                             <times>
                                      <time>0100</time>
                             </times>
                   </day>
          </days>
          <excludedDates/>
          </TimingXML>
</Schedule>
 
  • BP that will run XAPI service to create a schedule:
 
<process name="default">
 <sequence>
    <operation name="XAPI Service">
      <participant name="XAPIService"/>
      <output message="XAPIServiceTypeInputMessage">
        <assign to="." from="*"></assign>
        <assign to="api">createSchedule</assign>
      </output>
      <input message="inmsg">
        <assign to="." from="*"></assign>
      </input>
    </operation>
 
 </sequence>
</process>
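
The same call can also be made from Java using the local XAPI pattern from the first section of this page. A minimal sketch, assuming the TIMER example above has been saved to a file (the path is illustrative only) and that the code runs in a method declaring throws Exception:

import com.yantra.interop.japi.YIFApi;
import com.yantra.interop.japi.YIFClientFactory;
import com.yantra.interop.client.InteropEnvStub;
import javax.xml.parsers.DocumentBuilderFactory;
import java.io.File;
import org.w3c.dom.Document;

// Load one of the <Schedule> input documents shown above (the path is illustrative)
Document scheduleDoc = DocumentBuilderFactory.newInstance().newDocumentBuilder()
        .parse(new File("/tmp/timer_schedule.xml"));

YIFApi api = YIFClientFactory.getInstance().getLocalApi();
InteropEnvStub envStub = new InteropEnvStub("admin", "Create schedule");
Document result = api.invoke(envStub, "createSchedule", scheduleDoc);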