Saturday, December 28, 2013

Migrating WSO2 API Manager 1.5.0 to 1.6.0

WSO2 API Manager 1.6.0 was released recently. This post provides instructions on how to migrate your data from API Manager 1.5.0 to 1.6.0. If you have been using API Manager 1.5.0 and need to move your data to the latest version, follow the steps below.

You can download WSO2 API Manager 1.6.0 from here.


You can download the latest migration scripts from

https://svn.wso2.org/repos/wso2/carbon/platform/branches/turing/products/apimgt/1.6.0/modules/distribution/resources/migration-1.5.0_to_1.6.0


Check out the migration kit by executing the command below.

%> svn co https://svn.wso2.org/repos/wso2/carbon/platform/branches/turing/products/apimgt/1.6.0/modules/distribution/resources/migration-1.5.0_to_1.6.0


  1. Shut down AM 1.5.0 if it is running.
  2. Back up the API Manager databases of your AM 1.5.0 instance.
  3. Execute the relevant SQL script in migration-1.5.0_to_1.6.0 against your API Manager database. (e.g. if your database is MySQL, run mysql.sql against your APIM MySQL database.)
  4. Point AM 1.6.0 to the same WSO2 Carbon database (user store and registry) and API Manager databases used by your AM 1.5.0 instance. (Configure AM_1.6.0/repository/conf/datasources/master-datasources.xml to point to the same databases configured in AM 1.5.0.)
  5. Move all your Synapse configurations to APIM 1.6.0. For that, copy and replace the APIM_1.5.0/repository/deployment/server/synapse-config/default directory into APIM_1.6.0/repository/deployment/server/synapse-config/default. [If tenants had created APIs, copy the tenant configurations, found under repository/tenants, as well.]
  6. Start AM 1.6.0 and log in.
  7. Replace the registry extension file [.rxt file] with the new one [found at /rxt/api.rxt in the migration kit]. To do that, go to the API Manager Management Console, navigate to 'Home -> Extensions -> Configure -> Artifact Types', click the 'View/Edit' link, replace the content with the new api.rxt, and save.
  8. Configure endpoint-migration/build.xml with values for the properties below.
    • registry.home= Path to the AM pack location [in a distributed setup, give the Publisher node path]
    • username= Username for the AM server
    • password= Password for the AM server
    • host= IP of the running AM server [in a distributed setup, give the host of the Publisher node]
    • port= Port of the running AM server [in a distributed setup, give the port of the Publisher node]
    • version= Version of the AM server
  9. Go inside endpoint-migration/ and execute "ant run". You should get a "BUILD SUCCESSFUL" message if it ran correctly.
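As a reference for step 4, a master-datasources.xml datasource entry pointing AM 1.6.0 at an existing MySQL API Manager database might look like the following sketch (the URL, database name, and credentials are placeholders for your own values):

```xml
<datasource>
    <name>WSO2AM_DB</name>
    <description>The datasource used for the API Manager database</description>
    <jndiConfig>
        <name>jdbc/WSO2AM_DB</name>
    </jndiConfig>
    <definition type="RDBMS">
        <configuration>
            <!-- Same database your AM 1.5.0 instance was using -->
            <url>jdbc:mysql://localhost:3306/apimgtdb</url>
            <username>apimuser</username>
            <password>password</password>
            <driverClassName>com.mysql.jdbc.Driver</driverClassName>
        </configuration>
    </definition>
</datasource>
```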

Hope these instructions help you migrate your data from API Manager 1.5.0 to 1.6.0. Questions are welcome if you face any issues with the migration.



Wednesday, December 4, 2013

WSO2 API Manager- Customizing Store User Sign-up

The WSO2 API Manager Store sign-up page can be customized to your requirements by adding new fields or modifying existing ones. These fields are known as user claims.

By default, the API Store sign-up page looks as below.



Let's say you want to add a new field called 'Organization' to the Store sign-up page. This post provides step-by-step instructions on how to achieve this.

1. Start API Manager 1.5.0 and go to Management Console (https://localhost:9443/carbon/)

2. Go to Configure -> Claim Management


3. Click on the first claim dialect, displayed as 'http://schemas.xmlsoap.org/ws/2005/05/identity'. This lists all the existing claims under the selected dialect.

4. Click on 'Add New Claim Mapping'. 

5. Enter the below values for new claim.


Display Name : Organization
Description : Organization
Claim Uri : http://schemas.xmlsoap.org/ws/2005/05/identity/claims/organization
Mapped Attribute : organization
Supported by Default : checked

Note that only claims with 'Supported by Default' set to true are displayed on the sign-up page. Therefore, when adding new claims, make sure to check the 'Supported by Default' checkbox.



If you need this claim to be a required field [a mandatory field at sign-up], make sure to check the 'Required' checkbox.

6. Open wso2am-1.5.0/repository/deployment/server/jaggeryapps/store/site/conf/locales/jaggery/locale_default.json and add a new entry "Organization" : "Organization". Then save.

7. Go to API Store Sign-up page and refresh. You can see the newly added field.



Modifying Existing Claims

Let's say you now want to make the Organization field required, and also change the field display order.

1. Click on the Organization claim and check 'Required' as below. Set the Display Order to 4.

Edit Claim

Check Required

I have also changed the display order of the fields below by entering the given number for each claim's 'Display Order':

First Name : 1
Last Name : 2
Email : 3
Organization : 4



2. Now access the API Store sign-up page. You will see the modified display order, and the Organization field marked as required.




Thursday, November 7, 2013

Migrating WSO2 API Manager 1.4.0 to 1.5.0


WSO2 API Manager 1.5.0 was released recently. This post provides instructions on how to migrate your data from API Manager 1.4.0 to 1.5.0. If you have been using API Manager 1.4.0 and need to move your data to the latest version, follow the steps below.

You can download WSO2 API Manager 1.5.0 from here.

You can download the latest migration scripts from https://svn.wso2.org/repos/wso2/carbon/platform/branches/turing/products/apimgt/1.6.0/modules/distribution/resources/migration-1.4.0_to_1.5.0/

Check out the migration kit by executing the command below.

%> svn co https://svn.wso2.org/repos/wso2/carbon/platform/branches/turing/products/apimgt/1.6.0/modules/distribution/resources/migration-1.4.0_to_1.5.0/



1. Shut down APIM 1.4.0 if it is running.

2. Back up the WSO2 Carbon database (user store and registry) and the API Manager databases of your APIM 1.4.0 instance.

3. Execute the relevant SQL script in the 'migration-1.4.0_to_1.5.0/userstore_db' directory against your WSO2 Carbon database. This migrates the tables and data in your JDBC user store.

4. Execute the relevant SQL script in the 'migration-1.4.0_to_1.5.0/apimgt_db' directory against your API Manager database.

5. Point AM 1.5.0 to the same WSO2 Carbon database (user store and registry) and API Manager databases used by your AM 1.4.0 instance.
(Configure AM_1.5.0/repository/conf/datasources/master-datasources.xml to point to the same databases configured in AM 1.4.0.)

6. Open AM_1.5.0/repository/conf/user-mgt.xml and add the property <Property name="GetAllRolesOfUserEnabled">true</Property> to the existing 'AuthorizationManager' configuration.

ex:

<AuthorizationManager class="org.wso2.carbon.user.core.authorization.JDBCAuthorizationManager">
    <Property name="AdminRoleManagementPermissions">/permission</Property>
    <Property name="AuthorizationCacheEnabled">true</Property>
    <Property name="GetAllRolesOfUserEnabled">true</Property>
</AuthorizationManager>

7. Move all your Synapse configurations to APIM 1.5.0. For that, copy and replace the APIM_1.4.0/repository/deployment/server/synapse-config/default directory into APIM_1.5.0/repository/deployment/server/synapse-config/default.

8. Start APIM 1.5.0


Wednesday, October 2, 2013

How to add additional headers to WSO2 API Manager Swagger Console

WSO2 API Manager has integrated Swagger to let API consumers explore APIs through an interactive console.

When integrating Swagger with WSO2 API Manager, we had to support CORS between the API Store and the API Gateway. So, in order to send any header from Swagger, that header must be listed in the 'Access-Control-Allow-Headers' header of the response coming from the API Gateway. By default, the following headers are allowed to be sent from Swagger.

authorization,Access-Control-Allow-Origin,Content-Type

So if you need to send any additional header from the Swagger UI, you should add that header to the 'Access-Control-Allow-Headers' list.

There are two options for modifying 'Access-Control-Allow-Headers'.

1. Modify the Synapse configuration of APIs and add the modified set of headers to the outSequence.

If you choose this method, you have to modify the API configuration of each API and add the 'Access-Control-Allow-Headers' property. For that, go to the Management Console of API Manager, open the Source View, and modify the outSequence, setting the required headers as the value of the Access-Control-Allow-Headers property.

For example, let's say the additional header we need to add is 'Action'; then the outSequence should be modified as below.

<outSequence>
   <property name="Access-Control-Allow-Headers"
             value="authorization,Access-Control-Allow-Origin,Content-Type,Action"
             scope="transport"/>
   <send/>
</outSequence>

2. Modify API templates.

Inside the API Manager distribution you can find templates for APIs at APIM_HOME/repository/resources/api_templates. By modifying the outSequences of the template files below, all APIs created thereafter will get the modified outSequence.

api_templates_complex_resource.xml
api_templates_complex_resource_with_jwt.xml
api_templates_resource.xml
api_templates_resource_with_jwt.xml


In each template file, add the additional header to the 'Access-Control-Allow-Headers' property in the outSequence, e.g.:

<property name="Access-Control-Allow-Headers"
          value="authorization,Access-Control-Allow-Origin,Content-Type,Action"
          scope="transport"/>


Once you have followed one of the above approaches, you will be able to send those headers through the Swagger UI. The only thing left to do is to add the header parameter to the API definition. For that, go to the API Publisher and select the required API. Go to the 'Docs' tab and click 'Edit Content' for the API definition.


Then add the required header parameter to the parameter list of the required operation(s).

ex:

{
    "name": "Action",
    "description": "SoapAction",
    "paramType": "header",
    "required": false,
    "allowMultiple": false,
    "dataType": "String"
}

Once you save this, you will be able to see the added header parameter in the Swagger UI.




Friday, August 30, 2013

CSV to XML transformation with WSO2 ESB Smooks Mediator

This post provides a sample CSV to XML transformation with WSO2 ESB. WSO2 ESB supports executing Smooks features through 'Smooks Mediator'. 

Latest ESB can be downloaded from here


We are going to transform the below CSV into an XML message.

Lakmali,Erandi,Female,20,SriLanka
Lakmali,Baminiwatta,Female,20,SriLanka

This is the format of the XML output message.

<people>
 <person>
  <firstname>Lakmali</firstname>
  <lastname>Erandi</lastname>
  <gender>Female</gender>
  <age>20</age>
  <country>SriLanka</country>
 </person>
 <person>
  <firstname>Lakmali</firstname>
  <lastname>Baminiwatta</lastname>
  <gender>Female</gender>
  <age>20</age>
  <country>SriLanka</country>
 </person>
</people>
First, let's write the Smooks configuration to transform the above CSV into the given XML message (smooks-csv.xml).


<smooks-resource-list
 xmlns="http://www.milyn.org/xsd/smooks-1.1.xsd"
 xmlns:csv="http://www.milyn.org/xsd/smooks/csv-1.2.xsd">

   <resource-config selector="org.xml.sax.driver">
  <resource>org.milyn.csv.CSVReader</resource>
  <param name="fields">firstname,lastname,gender,age,country</param>
  <param name="rootElementName">people
  </param>
  <param name="recordElementName">person
   </param>
    </resource-config>
</smooks-resource-list>
Now let's write a simple proxy service that takes the CSV file as the input message and processes it through the Smooks mediator. For that, you first need to enable the VFS transport sender and receiver.
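Enabling VFS is a matter of uncommenting (or adding) the VFS transport receiver and sender entries in ESB_HOME/repository/conf/axis2.xml:

```xml
<transportReceiver name="vfs" class="org.apache.synapse.transport.vfs.VFSTransportListener"/>
<transportSender name="vfs" class="org.apache.synapse.transport.vfs.VFSTransportSender"/>
```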

Below is the proxy service Synapse configuration. Make sure to change the following parameters according to your file system. You can find more information about these parameters from here.

  •   transport.vfs.FileURI
  •   transport.vfs.MoveAfterProcess
  •   transport.vfs.ActionAfterFailure



<proxy xmlns="http://ws.apache.org/ns/synapse"
       name="CSVSmooks"
       transports="https,http,vfs"
       statistics="disable"
       trace="disable"
       startOnLoad="true">
   <target>
      <inSequence>
         <smooks config-key="smooks-csv">
            <input type="text"/>
            <output type="xml"/>
         </smooks>
         <log level="full"/>
      </inSequence>
   </target>
   <parameter name="transport.PollInterval">5</parameter>
   <parameter name="transport.vfs.ActionAfterProcess">MOVE</parameter>
   <parameter name="transport.vfs.FileURI">file:///home/lakmali/dev/test/smooks/in</parameter>
   <parameter name="transport.vfs.MoveAfterProcess">file:///home/lakmali/dev/test/smooks/original</parameter>
   <parameter name="transport.vfs.MoveAfterFailure">file:///home/lakmali/dev/test/smooks/original</parameter>
   <parameter name="transport.vfs.FileNamePattern">.*.csv</parameter>
   <parameter name="transport.vfs.ContentType">text/plain</parameter>
   <parameter name="transport.vfs.ActionAfterFailure">MOVE</parameter>
   <description/>
</proxy>

You have to create an ESB local entry with the key 'smooks-csv', giving the path to the smooks-csv.xml we created above. The Smooks mediator above then loads the Smooks config through that local entry key (smooks-csv).
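For example, a source-URL local entry could be defined as follows (the src path below is illustrative; use wherever you saved smooks-csv.xml):

```xml
<localEntry key="smooks-csv" src="file:repository/samples/resources/smooks/smooks-csv.xml"/>
```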

To perform the transformation, drop the input message file into the transport.vfs.FileURI location. In the log you can see the transformed message in XML! Now you have the CSV message as XML in your Synapse sequence, so you can perform any further mediation on it, such as sending it to an endpoint, database, or file.


Monday, March 25, 2013

Huge Message Processing with WSO2 ESB Smooks Mediator


Smooks is a powerful framework for processing, manipulating and transforming XML and non-XML data. WSO2 ESB supports executing Smooks features through the 'Smooks mediator'.

One of the main features introduced in Smooks v1.0 is the ability to process huge messages (GBs in size) [1]. With the WSO2 ESB 4.5.0 release (and later), the huge message processing feature is supported through the Smooks mediator!

Smooks supports three types of processing for huge messages:
1. One-to-one transformation
2. Splitting and routing
3. Persistence

This post shows how to process large input messages using the splitting and routing approach.

Step 1: Create a sample huge input file.

This post assumes the input message is in the following format (the quantity values are illustrative).

<order id="332">
 <header>
  <customer number="123">Joe</customer>
 </header>
 <order-items>
  <order-item id="1">
   <product>Pen</product>
   <quantity>8</quantity>
   <price>8.80</price>
  </order-item>
  <order-item id="2">
   <product>Book</product>
   <quantity>8</quantity>
   <price>8.80</price>
  </order-item>
  <order-item id="3">
   <product>Bottle</product>
   <quantity>8</quantity>
   <price>8.80</price>
  </order-item>
  <order-item id="4">
   <product>Note Book</product>
   <quantity>8</quantity>
   <price>8.80</price>
  </order-item>
 </order-items>
</order>

You can write a simple Java program to generate a file with a large number of entries.

FileWriter fw = new FileWriter("input-message.xml");
PrintWriter pw = new PrintWriter(fw);

/* XML prolog: order header */
pw.print("<order id=\"332\">\n <header>\n  <customer number=\"123\">Joe</customer>\n </header>\n <order-items>\n");

/* a huge number of order items */
for (int i = 0; i <= 2000000; i++) {
    pw.print("\t<order-item id=\"" + i + "\">\n\t\t<product>Pen</product>\n\t\t<quantity>8</quantity>\n\t\t<price>8.80</price>\n\t</order-item>\n");
}

pw.write(" </order-items>\n</order>");
pw.close();

Step 2: Smooks Configuration 

Let's write the Smooks configuration to split and route the above message. When processing huge messages with Smooks, we must make sure to use the SAX filter.
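The SAX filter is selected through a global parameter at the top of the Smooks configuration:

```xml
<params>
    <param name="stream.filter.type">SAX</param>
</params>
```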

The basic steps of this Smooks process are:
1. Java binding - bind the input message to Java beans
2. Templating - apply a template representing the split message to the input message elements
3. Routing - route each split message

So for each of the above steps we need to use the relevant Smooks cartridge.

1. Java Binding

The Smooks JavaBean cartridge allows you to create and populate Java objects from your message data [2]. We can map input message elements to real Java objects by writing bean classes, or to virtual objects, which are Maps and Lists. Here we will bind to virtual objects; that way we can build a complete object model without writing our own business classes.

Let's assume that we are going to split the input message such that one split message contains a single order item's information (item-id, product, quantity, price) along with the order information (order-id, customer-id, customer-name).

So we can define two beans in our Smooks configuration: order and orderItem.

<smooks-resource-list xmlns="http://www.milyn.org/xsd/smooks-1.1.xsd"
 xmlns:jb="http://www.milyn.org/xsd/smooks/javabean-1.2.xsd">

   <params>
      <param name="stream.filter.type">SAX</param>
   </params>

   <jb:bean beanId="order" class="java.util.HashMap" createOnElement="order">
      <jb:value property="orderId" data="order/@id"/>
      <jb:value property="customerNumber" decoder="Long" data="header/customer/@number"/>
      <jb:value property="customerName" data="header/customer"/>
      <jb:wiring property="orderItem" beanIdRef="orderItem"/>
   </jb:bean>

   <jb:bean beanId="orderItem" class="java.util.HashMap" createOnElement="order-item">
      <jb:value property="itemId" data="order-item/@id"/>
      <jb:value property="product" data="order-item/product"/>
      <jb:value property="quantity" decoder="Integer" data="order-item/quantity"/>
      <jb:value property="price" decoder="BigDecimal" data="order-item/price"/>
   </jb:bean>

</smooks-resource-list>

2. Templating

Smooks templating allows fragment-level templating using different templating solutions; the supported technologies are FreeMarker and XSL. Here we are going to use the FreeMarker templating solution.

FreeMarker templates are configured in Smooks through the http://www.milyn.org/xsd/smooks/freemarker-1.1.xsd configuration namespace. We can refer to the message content in the template definition through the Java beans we defined in the previous step.

There are two methods of FreeMarker template definition: inline, and external template reference. In this example let's use inline templating.
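For comparison, an external template reference points <ftl:template> at a .ftl file instead of inlining the markup. The sketch below assumes a hypothetical template path:

```xml
<ftl:freemarker applyOnElement="order-item">
    <ftl:template>/templates/order-item-split.ftl</ftl:template>
</ftl:freemarker>
```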

First we need to decide the format of a single split message. Since we are going to split the input message such that one split message contains a single order-item's information (item-id, product, quantity, price) along with the order information (order-id, customer-id, customer-name), it will look as follows. The Java object model we populated above is used in the template definition.

<orderitem id="${order.orderItem.itemId}" order="${order.orderId}">
   <customer>
      <name>${order.customerName}</name>
      <number>${order.customerNumber?c}</number>
   </customer>
   <details>
      <product>${order.orderItem.product}</product>
      <quantity>${order.orderItem.quantity}</quantity>
      <price>${order.orderItem.price}</price>
   </details>
</orderitem>

Let's add the templating configuration to our Smooks configuration.

<smooks-resource-list xmlns="http://www.milyn.org/xsd/smooks-1.1.xsd"
 xmlns:jb="http://www.milyn.org/xsd/smooks/javabean-1.2.xsd"
 xmlns:ftl="http://www.milyn.org/xsd/smooks/freemarker-1.1.xsd">

   <params>
      <param name="stream.filter.type">SAX</param>
   </params>

   <jb:bean beanId="order" class="java.util.HashMap" createOnElement="order">
      <jb:value property="orderId" data="order/@id"/>
      <jb:value property="customerNumber" decoder="Long" data="header/customer/@number"/>
      <jb:value property="customerName" data="header/customer"/>
      <jb:wiring property="orderItem" beanIdRef="orderItem"/>
   </jb:bean>

   <jb:bean beanId="orderItem" class="java.util.HashMap" createOnElement="order-item">
      <jb:value property="itemId" data="order-item/@id"/>
      <jb:value property="product" data="order-item/product"/>
      <jb:value property="quantity" decoder="Integer" data="order-item/quantity"/>
      <jb:value property="price" decoder="BigDecimal" data="order-item/price"/>
   </jb:bean>

   <ftl:freemarker applyOnElement="order-item">
      <ftl:template><!--<orderitem id="${order.orderItem.itemId}" order="${order.orderId}">
   <customer>
      <name>${order.customerName}</name>
      <number>${order.customerNumber?c}</number>
   </customer>
   <details>
      <product>${order.orderItem.product}</product>
      <quantity>${order.orderItem.quantity}</quantity>
      <price>${order.orderItem.price}</price>
   </details>
</orderitem>--></ftl:template>
      <ftl:use>
         <ftl:outputTo outputStreamResource="orderItemSplitStream"/>
      </ftl:use>
   </ftl:freemarker>

</smooks-resource-list>

Please note that using <ftl:outputTo>, you can direct Smooks to write the templating result directly to an OutputStreamResource.

3. Routing

So far we have defined the bean model of the message and the template of a single split message. Now we have to extend the Smooks configuration to route each message fragment to an endpoint. These endpoints can be file, database, or JMS endpoints.

In this sample, let's route the message fragments to file locations. Since in the previous step we pointed outputTo at the orderItemSplitStream resource, let's add a file outputStream named orderItemSplitStream to our Smooks configuration.

We need to define the following when configuring the outputStream.

fileNamePattern

Composed by referring to the Java object model we created. The composed name should be unique for each message fragment.

destinationDirectoryPattern

The destination directory where the files should be created.

highWaterMark

The maximum number of files that can be created in the directory. This should be increased according to the input message size.

<smooks-resource-list xmlns="http://www.milyn.org/xsd/smooks-1.1.xsd"
 xmlns:jb="http://www.milyn.org/xsd/smooks/javabean-1.2.xsd"
 xmlns:ftl="http://www.milyn.org/xsd/smooks/freemarker-1.1.xsd"
 xmlns:file="http://www.milyn.org/xsd/smooks/file-routing-1.1.xsd">

   <params>
      <param name="stream.filter.type">SAX</param>
   </params>

   <jb:bean beanId="order" class="java.util.HashMap" createOnElement="order">
      <jb:value property="orderId" data="order/@id"/>
      <jb:value property="customerNumber" decoder="Long" data="header/customer/@number"/>
      <jb:value property="customerName" data="header/customer"/>
      <jb:wiring property="orderItem" beanIdRef="orderItem"/>
   </jb:bean>

   <jb:bean beanId="orderItem" class="java.util.HashMap" createOnElement="order-item">
      <jb:value property="itemId" data="order-item/@id"/>
      <jb:value property="product" data="order-item/product"/>
      <jb:value property="quantity" decoder="Integer" data="order-item/quantity"/>
      <jb:value property="price" decoder="BigDecimal" data="order-item/price"/>
   </jb:bean>

   <ftl:freemarker applyOnElement="order-item">
      <ftl:template><!--<orderitem id="${order.orderItem.itemId}" order="${order.orderId}">
   <customer>
      <name>${order.customerName}</name>
      <number>${order.customerNumber?c}</number>
   </customer>
   <details>
      <product>${order.orderItem.product}</product>
      <quantity>${order.orderItem.quantity}</quantity>
      <price>${order.orderItem.price}</price>
   </details>
</orderitem>--></ftl:template>
      <ftl:use>
         <ftl:outputTo outputStreamResource="orderItemSplitStream"/>
      </ftl:use>
   </ftl:freemarker>

   <file:outputStream openOnElement="order-item" resourceName="orderItemSplitStream">
      <file:fileNamePattern>order-${order.orderId}-${order.orderItem.itemId}.xml</file:fileNamePattern>
      <file:destinationDirectoryPattern>/home/lakmali/dev/test/smooks/orders</file:destinationDirectoryPattern>
      <!-- allow enough files for all split fragments -->
      <file:highWaterMark mark="3000000"/>
   </file:outputStream>

</smooks-resource-list>

Step 3: Process with WSO2 ESB Smooks Mediator

Now we have finished writing the Smooks configuration that will split and route an incoming message. Next, we need to execute it against our huge message. The WSO2 ESB Smooks mediator, which integrates Smooks features with WSO2 ESB, is the solution for this.

So our next step is writing a Synapse configuration that fetches the file containing the incoming message through the VFS transport and mediates it through the Smooks mediator to get our task done.

Here is the Synapse configuration.
<definitions xmlns="http://ws.apache.org/ns/synapse">
   <proxy name="SmooksSample" startOnLoad="true" transports="vfs">
      <target>
         <inSequence>
            <smooks config-key="smooks-key">
               <input type="xml"/>
               <output type="xml"/>
            </smooks>
         </inSequence>
      </target>
      <parameter name="transport.vfs.ActionAfterProcess">MOVE</parameter>
      <parameter name="transport.PollInterval">5</parameter>
      <parameter name="transport.vfs.MoveAfterProcess">file:///home/lakmali/dev/test/smooks/original</parameter>
      <parameter name="transport.vfs.FileURI">file:///home/lakmali/dev/test/smooks/in</parameter>
      <parameter name="transport.vfs.MoveAfterFailure">file:///home/lakmali/dev/test/smooks/original</parameter>
      <parameter name="transport.vfs.FileNamePattern">.*\.xml</parameter>
      <parameter name="transport.vfs.ContentType">application/xml</parameter>
      <parameter name="transport.vfs.ActionAfterFailure">MOVE</parameter>
   </proxy>
   <localEntry key="smooks-key" src="file:repository/samples/resources/smooks/smooks-config-658.xml"/>
   <sequence name="fault">
         <log level="full"/>
         <property name="MESSAGE" value="Executing default fault sequence"/>
         <property expression="get-property('ERROR_CODE')" name="ERROR_CODE"/>
         <property expression="get-property('ERROR_MESSAGE')" name="ERROR_MESSAGE"/>
         <drop/>
   </sequence>
   <sequence name="main">
      <log/>
      <drop/>
   </sequence>
</definitions>
Make sure to change the VFS transport configuration parameters:

transport.vfs.MoveAfterProcess - move the input file to this location after processing
transport.vfs.FileURI - input file location
transport.vfs.MoveAfterFailure - move the input file to this location after a failure

Create a proxy service with the given Synapse configuration. There is an ESB sample with this configuration, which you can run by executing the following command.

Go to ESB_HOME/bin and run:
./wso2esb-samples.sh -sn 658

Now drop the sample huge input file into the transport.vfs.FileURI location.

Then check the destinationDirectoryPattern location, where you will find the split-file results of the huge file.

[1] http://www.smooks.org/mediawiki/index.php?title=V1.5:Smooks_v1.5_User_Guide#Processing_Huge_Messages_.28GBs.29
[2] http://www.smooks.org/mediawiki/index.php?title=V1.5:Smooks_v1.5_User_Guide#Java_Binding