
Feb 27, 2013

JavaScript Coding Questions and Answers



Q. Can you write JavaScript Utils functions for the given scenarios?

Scenario 1: Generate a cache-breaking value for RESTful service calls. For example, the value can be added as a query parameter so that each request bypasses the cache.


function Utils() {

    this.cacheBreaker = function () {
            return ((new Date()).getTime() + '').substr(2,8);   // A string that is different every second
    };
}


Scenario 2: Convert a date from the "YYYYMMDD" format to "DD/MM/YYYY" and back. The following examples use regular expressions.

function Utils() {

    //For example, from JSON data format YYYYMMDD to GUI display format DD/MM/YYYY. 
    this.dateConvertRestToUser = function (dateInput) {
        var m = dateInput.match(/^([1-9][0-9]{3})([0-9]{2})([0-9]{2})$/);
        if (m) {
            return m[3] + '/' + m[2] + '/' + m[1];
        }
      return null;
    };
 
 //For example, from GUI display format DD/MM/YYYY to JSON data format YYYYMMDD
 this.dateConvertUserToRest = function (dateInput) {
        var m = dateInput.match(/^ *([0-3]?[0-9])\/([0-1]?[0-9])\/([1-2][0-9][0-9][0-9]) *$/);
        if (m) {
            var day = m[1];
            var month = m[2];
            var year = m[3];

            if (day.length == 1) {
                day = '0' + day;
            }
            if (month.length == 1) {
                month = '0' + month;
            }
            return year + month + day;
        }
        return null;
    };
}



Scenario 3: Get the attribute value for a given HTML element and attribute.

function Utils() {

    this.getAttrString = function (attr, element) {
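         // "element" is assumed to be a jQuery-wrapped element, so .attr() is available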
         var attrValue = element.attr(attr);
         if (attrValue && attrValue.length > 0) {
            return attrValue;
         }
        return null;
    };
}


Scenario 4: Extract the HTML value from a given tag.

function Utils() {

    this.extractHTMLValue = function (html, tag) {
        var htmlValue = "";

        if (tag != null && html != null) {
            var beginTag = "<" + tag.toUpperCase() + ">";
            var endTag = "</" + tag.toUpperCase() + ">";

            // search case-insensitively, but extract from the original string
            // so the returned value keeps its original case
            var upperHtml = html.toUpperCase();

            var beginIndex = upperHtml.indexOf(beginTag);
            var endIndex = upperHtml.indexOf(endTag);

            if (beginIndex > -1 && endIndex > -1 && endIndex > beginIndex) {
                htmlValue = html.substring(beginIndex + beginTag.length, endIndex);
            }
        }

        return htmlValue;
    };
 
}



Scenario 5: Logging from your JavaScript at different log levels like debug, info, warn, and error.

function PortalLog(logLevel) {
     
        if (typeof console != 'object' || (typeof console.log != 'function' && typeof console.log != 'object')) {
            logLevel = 4;
        }

        // Build "console.<type>(args[0],args[1],...)" and eval it, because in
        // older IE versions console.log is not a real Function and has no apply().
        function evalLog(type, args) {
            var argList = '';
            for (var i = 0; i < args.length; i++) {
                if (i > 0) {
                    argList += ',';
                }
                argList += 'args[' + i + ']';
            }
            return eval('console.' + type + '(' + argList + ');');
        }

        if (logLevel <= 0) {
            this.log = function () {
                return evalLog('log', arguments);
            }
            this.debug = this.log;
        } else {
            this.log = function () {};
            this.debug = function () {};
        }

        if (logLevel <= 1) {
            this.info = function () {
                return evalLog('info', arguments);
            }
        } else {
            this.info = function () {};
        }

        if (logLevel <= 2) {
            this.warn = function () {
                return evalLog('warn', arguments);
            }
        } else {
            this.warn = function () {};
        }

        if (logLevel <= 3) {
            this.error = function () {
                return evalLog('error', arguments);
            }
        } else {
            this.error = function () {};
        }

}



Q. What’s the difference between these two statements?

var x = 3;

x = 3;


A. The first statement declares x with var, so it is scoped to the enclosing function (or to the global scope if it appears at the top level). The second statement assigns to an undeclared identifier, which implicitly creates a global variable (and throws a ReferenceError in strict mode). Globals can collide with other variables of the same name, so the var keyword should always be used when defining variables, and an anonymous function can be used as a closure where needed, encapsulating several functions that share access to the same set of variables. That keeps the variables sandboxed, accessible only to the functions that need them.

Q. What is the difference between the following 2 statements?

!!(obj1 && obj2);

(obj1 && obj2);



A. The first expression returns a "real" boolean value, because the inner result is negated and then immediately negated again ("not not"), coercing it to true or false. The second expression simply evaluates obj1 && obj2 and returns the last operand evaluated, which is merely truthy or falsy rather than a real boolean. This can be problematic, because a falsy value can be the number 0, an empty string, null, undefined, or NaN, while simple existence of a value is merely truthy. A real boolean will only ever be true or false.


Feb 22, 2013

Handling Concurrent modifications in Java



There are scenarios where you need to deal with concurrent modifications in Java. Here are 2 scenarios that I can currently think of.

Scenario 1: Looping through a list of items and removing an item in the list could lead to "ConcurrentModificationException". Here is an example.

Code that throws an Exception:


 private void removeDetailSummaryRecordsWithAllZeroAmounts(CashForecastSummaryVO cfVo)
    {
        List<CashForecastSummaryAccountVO> accounts = cfVo.getAccounts();
        for (CashForecastSummaryAccountVO cfAcctVO : accounts)
        {
            List<CashForecastSummaryRecordVO> summaryRecords = cfAcctVO.getSummaryRecords();
          
            for (CashForecastSummaryRecordVO recordVO:summaryRecords)
            {
                if (recordVO.getRecordtype() == RecordType.DETAILS)
                {
                    List<BigDecimal> amounts = recordVO.getAmounts();
                    boolean foundNonZero = false;
                    for (BigDecimal amount : amounts)
                    {
                        if (BigDecimal.ZERO.compareTo(amount) != 0)
                        {
                            foundNonZero = true;
                        }
                    }
                    
                    if (!foundNonZero)
                    {
                        summaryRecords.remove(recordVO); // throws a ConcurrentModificationException
                    }
                }
            }
        }
    }


Code that fixes the above issue: Using an iterator and remove from the iterator to prevent the exception.


 private void removeDetailSummaryRecordsWithAllZeroAmounts(CashForecastSummaryVO cfVo)
    {
        List<CashForecastSummaryAccountVO> accounts = cfVo.getAccounts();
        for (CashForecastSummaryAccountVO cfAcctVO : accounts)
        {
            List<CashForecastSummaryRecordVO> summaryRecords = cfAcctVO.getSummaryRecords();
            Iterator<CashForecastSummaryRecordVO> it = summaryRecords.iterator();  // get the iterator
            CashForecastSummaryRecordVO recordVO = null;
            while (it.hasNext())
            {
                recordVO = it.next();
                if (recordVO.getRecordtype() == RecordType.DETAILS)
                {
                    List<BigDecimal> amounts = recordVO.getAmounts();
                    boolean foundNonZero = false;
                    for (BigDecimal amount : amounts)
                    {
                        if (BigDecimal.ZERO.compareTo(amount) != 0)
                        {
                            foundNonZero = true;
                        }
                    }
                    
                    if (!foundNonZero)
                    {
                        it.remove();  // an iterator is used
                    }
                }
            }
        }
    }
 

Scenario 2: Two users try to modify the same record in the database. In this scenario, you want one modification to go through and the other to fail with a message to the user, as shown below.

"This record was not updated as the record you are trying to update has been updated by another user. Try refreshing your data, and update again."

This will require a number of steps.

Step 1: You would require a "version number" or a "timestamp"  column in the database table to detect concurrent modifications.

Step 2: When a record is initially read, the timestamp or version number is also read along with it.


    @Override
    public List<AdjustmentDetail> getAdjustmentRecords(final AdjustmentCriteria criteria)
    {
        String sql = "select a.detailid, a.portfolioCd, a.accountCd, a.PositionIndicator, a.cashValue, TmStamp = convert(int,substring(a.Timestamp,5,4))" +
                "from AdjustmentDetail a " +
                "Where a.portfoliocd = ? " +
                "and   a.valuationDttm = ? " +
                "and   a.inactiveFlag = 'N' ";
        
        List<Object> parametersList = new ArrayList<Object>();
        parametersList.add(criteria.getPortfolioCode());
        parametersList.add(criteria.getValuationDate());
        
        Object[] parameters = parametersList.toArray(new Object[parametersList.size()]);
        
        List<AdjustmentDetail> adjustments = jdbcTemplateSybase.query(sql, parameters,
                new RowMapper<AdjustmentDetail>()
                {
                    public AdjustmentDetail mapRow(ResultSet rs, int rowNum) throws SQLException
                    {
                        AdjustmentDetail record = new AdjustmentDetail();
                        record.setDetailId(BigInteger.valueOf(rs.getLong("DetailId")));
                        record.setPortfolioCode(criteria.getPortfolioCode());
                        record.setAccountcd(rs.getString("accountCd"));
                        record.setAmount(rs.getBigDecimal("cashValue"));
                        record.setPositionIndicator(rs.getString("PositionIndicator"));
                        record.setTimestamp(rs.getInt("TmStamp"));  // timestamp to detect any later modifications

                        return record;

                    }
                });

        return adjustments;
        
    } 


Step 3: After the record has been modified and you are ready to update it, first run a select query to read the timestamp or version number of the same record and make sure it has not changed. If the "timestamp" or "version number" has changed, throw the exception above and abort the update, because the record has been modified by another user.


@Override
    public AdjustmentDetail modifyAdjustment(AdjustmentDetail adjDetail)
    {
        if (adjDetail == null)
        {
            throw new RuntimeException("adjDetail is null");
        }
        
        int noOfRecords = 0;
        
        try
        {
            //check whether the record has been modified by comparing its current
            //timestamp with the one read when the record was originally loaded.
            Integer adjustmentModifiedTimestamp = getAdjustmentModifiedTimestamp(adjDetail.getDetailId());

            //logic to modify the adjustment goes here; every time the record is
            //modified, the timestamp or version number is incremented.
        }
        catch (Exception e)
        {
            logger.error("Error updating adjustment  detail: ", e);
        }
        
        if (noOfRecords == 0) throw new ValidationException("The adjustment was not updated. It may be the record you are trying to update has been updated by another user. Try refreshing your data and update again.");
        
        
        logger.info("No of adjustment details updated = " + noOfRecords);
        
        return adjDetail;
    }
 

And here is the sample method that retrieves the timestamp:

 //retrieve the timestamp value for the given detailId to detect whether it has been modified.
 private Integer getAdjustmentModifiedTimestamp(BigInteger adjustmentDetailId) {

  String sql = "SELECT TmStamp = convert(int,substring(Timestamp,5,4)) from AdjustmentDetail where DetailId = ?";

  List<Object> parametersList = new ArrayList<Object>();
  parametersList.add(adjustmentDetailId.intValue());

  Object[] parameters = parametersList.toArray(new Object[parametersList.size()]);

  List<Integer> ts = jdbcTemplateSybase.query(sql, parameters, new RowMapper<Integer>() {
   public Integer mapRow(ResultSet rs, int rowNum) throws SQLException {
    Integer tsValue = rs.getInt("TmStamp");
    return tsValue;

   }
  });
  return ts.get(0);
 }
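
To complete the picture, here is a minimal sketch (not production code) of how the update itself could include the previously read timestamp in the WHERE clause, so that the update count ("noOfRecords" above) comes back as 0 when another user has modified the row in the meantime. The column list and the getters on AdjustmentDetail are assumptions that mirror the setters used earlier.


 //Sketch only: optimistic update guarded by the timestamp read when the record was first loaded.
 private int updateAdjustmentIfUnchanged(AdjustmentDetail adjDetail)
 {
  String sql = "update AdjustmentDetail " +
               "set cashValue = ?, PositionIndicator = ? " +                // columns are illustrative
               "where DetailId = ? " +
               "and   convert(int,substring(Timestamp,5,4)) = ? ";          // timestamp captured at read time

  //returns the number of rows updated; 0 means the row was changed by another user
  return jdbcTemplateSybase.update(sql,
          adjDetail.getAmount(),
          adjDetail.getPositionIndicator(),
          adjDetail.getDetailId().longValue(),
          adjDetail.getTimestamp());
 }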
 


Feb 19, 2013

Spring JDBC Template examples -- calling stored proc, simple select, and insert with returning the generated key


Q24. How will you go about invoking stored procedures with Spring JDBC?  
A24. This post covers three typical scenarios of using the Spring JDBC template.   

1. Invoking a stored procedure to retrieve some results. This uses the JDBC Callable statement.
2. Retrieving the data from the database via a simple "SELECT" query.
3. Inserting a new record into a table and then returning the generated primary key.

Here is the sample code snippet to achieve the above requirements using the Spring framework.


package com.myapp.repository.impl;

import java.math.BigInteger;
import java.sql.ResultSet;
import java.sql.SQLException;
import java.sql.Types;
import java.text.SimpleDateFormat;
import java.util.ArrayList;
import java.util.Calendar;
import java.util.Date;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

import javax.annotation.Resource;

import org.apache.commons.lang.StringUtils;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.jdbc.core.JdbcTemplate;
import org.springframework.jdbc.core.RowMapper;
import org.springframework.jdbc.core.namedparam.MapSqlParameterSource;
import org.springframework.jdbc.core.simple.SimpleJdbcCall;
import org.springframework.jdbc.core.simple.SimpleJdbcInsert;
import org.springframework.stereotype.Repository;

//...other imports

@Repository(value = "myAppDao")
public class MyAppDaoImpl implements MyAppDao {

 private static Logger logger = LoggerFactory.getLogger(MyAppDaoImpl.class);
 

 @Resource(name = "jdbcBasicTemplateSybase")
 private JdbcTemplate jdbcTemplateSybase;

 // ************ Retrieve data from a stored procedure  *******************
 
 @Override
 public List<MyAppFeedResult> getMyAppFeedData(final MyAppFeedCriteria criteria) {
  SimpleJdbcCall call  = new SimpleJdbcCall(jdbcTemplateSybase)
                          .withProcedureName("ProcGetMyAppFeed");
  
  call = call.returningResultSet("my_app_proc_result", new RowMapper<MyAppFeedResult>() {
   public MyAppFeedResult mapRow(ResultSet rs, int rowNum) throws SQLException {
    MyAppFeedResult  record = new MyAppFeedResult();

    record.setPortfolioCode(criteria.getPortfolioCode());
    record.setValuationDate(criteria.getValuationDate());
    record.setAccountcd(rs.getString("accountCd"));
    record.setPositionIndicator(rs.getString("PositionIndicator"));
    record.setAmount(rs.getBigDecimal("amount"));
    record.setSecurityIdentifier(rs.getString("securityIdentifier"));
    record.setCurrencyCode(rs.getString("currencyCd"));
    record.setUnitCost(rs.getBigDecimal("unitCost"));
    return record;
   }
  });
                          

  //construct the stored proc input parameters      
  java.sql.Date valDate = new java.sql.Date(criteria.getValuationDate().getTime());
  java.sql.Date foreCastDateAsAtEndOf = null;
  java.sql.Date foreCastDate = null; 

  if (criteria.getForeCastAsAtEndOf() != null) foreCastDateAsAtEndOf = new java.sql.Date(criteria.getForeCastAsAtEndOf().getTime());
  if (criteria.getForeCastDate() != null) foreCastDate = new java.sql.Date(criteria.getForeCastDate().getTime());
  
  final MapSqlParameterSource params = new MapSqlParameterSource();
  params.addValue("PortfolioCd", criteria.getPortfolioCode());
  params.addValue("ValuationDttm",valDate);
  params.addValue("ForeCastAsAtEndOf",foreCastDateAsAtEndOf);
  params.addValue("AccountCd",criteria.getAccountCode());
  params.addValue("ForecastDate", foreCastDate);
  params.addValue("TranTypeDesc", criteria.getTranTypeDesc());
  params.addValue("Debug", "N");
  
  //execute the stored proc with the input parameters
  Map<String, Object> results = call.execute(params);
  
  //get the results
  List<MyAppFeedResult> resultList = (List<MyAppFeedResult>)results.get("my_app_proc_result");
  
  return resultList;
 }
 
 @Override
 /** Simple select query **/
 public List<MyAppAccount> getMyAppAccountRecords(ReconciliationCriteria criteria) 
 {
  String sql = "Select MyAppId, PortfolioCd, AccountCd, CurrencyCd, ValuationDttm" +
                  "From MyApp " +
         "Where PortfolioCd = ? " +
         "And   InactiveFlag = 'N' " +
                  "Order by CurrencyCd, AccountCd";
  
  List<Object> parametersList = new ArrayList<Object>();
  parametersList.add(criteria.getPortfolioCode());
  parametersList.add(criteria.getValuationDate());

  Object[] parameters = parametersList.toArray(new Object[parametersList.size()]);

  List<MyAppAccount> parentList = jdbcTemplateSybase.query(sql, parameters, new RowMapper<MyAppAccount>() {
   public MyAppAccount mapRow(ResultSet rs, int rowNum) throws SQLException {
    MyAppAccount record = new MyAppAccount();

    record.setMyAppId(rs.getLong("MyAppId"));
    record.setPortfolioCode(rs.getString("portfolioCd"));
    record.setAccountCd(rs.getString("AccountCd"));
    record.setCurrencyCd(rs.getString("CurrencyCd"));
    record.setValuationDate(rs.getDate("ValuationDttm"));   
    return record;
   }
  });
  
  return parentList;
 }
 

 
 @Override
 /** insert a new record and get the generated primary key id**/
 public MyAppDetail addOrModifyAdjustment(MyAppDetail adjDetail) {
  if (adjDetail == null) {
   throw new RuntimeException("adjDetail is null");
  }

  try {
   SimpleJdbcInsert jdbcInsert = new SimpleJdbcInsert(jdbcTemplateSybase).withTableName("MyAppdetail").usingGeneratedKeyColumns("MyAppDetailid");
   Map<String, Object> lParameters = new HashMap<String, Object>(20);
      lParameters.put("MyAppId", adjDetail.getMyAppId().longValue());
      lParameters.put("TranCd",  adjDetail.getTxnCd());
      lParameters.put("TranTypeCd", Integer.valueOf(adjDetail.getTxnTypeCd()));
      lParameters.put("TranTypeDesc",  adjDetail.getTxnTypeDesc());
          
      
   Number generatedKey = jdbcInsert.executeAndReturnKey(lParameters);
   logger.info("adjustment detail added with id = " + generatedKey.longValue());
   
   adjDetail.setMyAppId(generatedKey.longValue());
  
   
  } catch (Exception e) {
   logger.error("Error saving MyApp transaction detail: ", e); 
   throw new RuntimeException(e);
  }
  

  return adjDetail;
 }

 //setter for the jdbcTemplate
 public void setJdbcTemplateSybase(JdbcTemplate jdbcTemplateSybase) {
  this.jdbcTemplateSybase = jdbcTemplateSybase;
 }

}


Q. How will you process the results and return them as a Map?
A. Use Spring's ResultSetExtractor (an interface in org.springframework.jdbc.core) to process the whole ResultSet and build the map.

    @Override
 public Map<String, BigDecimal> getAccountPVClosingBalances(PortfolioCriteria criteria) {
  String sql = "select accountcd, LiquidityLocal from portfolio p where p.portfoliocd = ? and   p.valuationdttm = ?  ";
    
  List<Object> parametersList = new ArrayList<Object>();
  parametersList.add(criteria.getPortfolioCd()); 
  parametersList.add(criteria.getValuationDtTm());

  //where clause prepared statement parameters
  Object[] parameters = parametersList.toArray(new Object[parametersList.size()]);

  //store results in a map
  Map<String, BigDecimal> results = jdbcTemplateSybase.query(sql, parameters, new ResultSetExtractor<Map<String, BigDecimal>>() {
   public Map<String, BigDecimal> extractData(ResultSet rs) throws SQLException {
    Map<String, BigDecimal> mapOfPortfolioBalances = new HashMap<String, BigDecimal>(100);
    while (rs.next()) {
     String accountCd = rs.getString("accountcd");
     BigDecimal portfolioBalance = rs.getBigDecimal("LiquidityLocal");
     mapOfPortfolioBalances.put(accountCd, portfolioBalance);
    }
    return mapOfPortfolioBalances;
   }
  });
  
  return results;
 }
  


The "jdbcTemplateSybase" is configured and injected via the Spring dependency injection.





Feb 13, 2013

JMS versus AMQP, Enterprise Integration Patterns (EIP), and Spring Integration versus Apache Camel

Q. Why do you need AMQP when there is JMS?
A. AMQP stands for Advanced Message Queuing Protocol. It was developed to address interoperability by creating a standard for how messages should be structured and transmitted between platforms, in the same way that SMTP, HTTP, FTP, etc. have created interoperable systems. This standard binary wire-level protocol for messaging allows heterogeneous, disparate systems within and between companies to exchange messages regardless of the message broker vendor or platform.


RabbitMQ, Apache Qpid, StormMQ, etc. are open source message brokers (i.e. MOM - message-oriented middleware) that implement the Advanced Message Queuing Protocol (AMQP).

Q. How does AMQP differ from JMS?
A. JMS is a standard messaging API for the Java platform. It provides a level of abstraction that frees developers from having to worry about a specific implementation and wire protocol, much like the JDBC API lets you switch databases easily. With JMS, you can swap one JMS-compliant message broker (e.g. webMethods) for another (e.g. MQSeries/WebSphere MQ) with little or no change to your source code. It also provides interoperability with other JVM-based languages like Scala and Groovy. Although JMS brokers can be used from .NET applications, the JMS specification does not guarantee interoperability, and integration between Java and .NET or Java and Ruby is proprietary and can be quite tricky. In scenarios where you want to send a message from a Java-based producer to a .NET-based consumer, you need message-level, cross-platform interoperability, which is exactly what AMQP provides. With AMQP, you can use any AMQP-compliant client library together with any AMQP-compliant message broker.
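
For example, a Java producer can publish to an AMQP broker such as RabbitMQ, and a .NET or Ruby consumer can read the same message with its own AMQP client library. Here is a minimal sketch using the RabbitMQ Java client; the host, queue name, and payload are purely illustrative.


import com.rabbitmq.client.Channel;
import com.rabbitmq.client.Connection;
import com.rabbitmq.client.ConnectionFactory;

public class AmqpProducerExample {

    public static void main(String[] args) throws Exception {
        ConnectionFactory factory = new ConnectionFactory();
        factory.setHost("localhost");                 // AMQP broker host (illustrative)

        Connection connection = factory.newConnection();
        Channel channel = connection.createChannel();
        try {
            // declare a durable queue; any AMQP-compliant consumer can read from it
            channel.queueDeclare("orders", true, false, false, null);

            String payload = "{\"orderId\": 42, \"amount\": 99.95}";
            channel.basicPublish("", "orders", null, payload.getBytes("UTF-8"));
        } finally {
            channel.close();
            connection.close();
        }
    }
}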


Q. What are the different alternatives to integrate various enterprise systems?
A.

Alternative 1: Custom solution. Implement an individual solution that works for your problem without separating the problem into little pieces. For example, use Apache CXF for web services, overnight batch jobs to load data feeds, JMS for messaging, etc. This suits small use cases, but it requires more maintenance and developer effort: the developer has to handle errors, service retries, transaction management, and so on. It is appropriate if you only want to integrate one or two applications using one or two protocols.


Alternative 2: Use an (open source) integration framework like Spring Integration or Apache Camel. This helps you integrate systems in a standardised way, adhering to the Enterprise Integration Patterns (EIP). Apache Camel is a lightweight integration framework that lets you use HTTP, FTP, JMS, EJB, JPA, RMI, JMX, LDAP, and Netty, to name a few, and you use the same concepts to integrate the various protocols. This improves maintainability and reduces developer effort. It is more suited if you want to integrate several applications with different protocols.


Alternative 3: Use an ESB (Enterprise Service Bus) to integrate your applications, for example Oracle Service Bus, TIBCO ESB, webMethods, etc. Under the hood an ESB also uses an integration framework, but it provides additional services and management functionality such as monitoring, high availability, clustering, and a graphical user interface for routing and configuration. An ESB is usually a complex and powerful product with a steeper learning curve, suited to very large integration projects and to projects requiring BPM (Business Process Management) integration and other integrated services like monitoring and clustering.



Q. What is an architecture that enables separate applications to work together, but in a de-coupled fashion such that applications can be easily added or removed without affecting the others?
A. This is achieved via Message Oriented Middleware (aka a message bus).



Q. How can the caller be sure that exactly one receiver will receive the document or perform the call?
A. Use the point-to-point channel


Q. How can the sender broadcast an event to all interested receivers?
A. Use the publish subscribe channel.

Q. What will the messaging system do with a message it cannot deliver?
A. Put it on the dead letter channel.

Q. How can the sender make sure that a message will be delivered, even if the messaging system fails?
A. Use the "guaranteed delivery" mechanism.


Q. What are the different ways to route messages?
A. EIP (Enterprise Integration Patterns) defines different types of rule-based routing to solve common enterprise integration problems. Like the GoF design patterns, EIP gives integration architects and designers a common vocabulary; a short Apache Camel route after the list below illustrates a couple of these patterns.

  1. Content based routing uses predicates (e.g. XPath expressions) to route messages based on the message content. A content enricher supplements the original message with additional relevant information received from other sources.
  2. A splitter breaks a composite message into a series of individual message parts, for example based on an XPath expression.
  3. An aggregator is used to collect and store individual message parts until a complete set of correlated parts has been received; once all the related parts have arrived, they are aggregated back into a single message.
  4. Static recipient list based routing inspects an incoming message and, depending on the recipients mentioned in the list, forwards the message to all channels associated with the "recipient list".
  5. A resequencer is used to get a stream of related but out-of-sequence messages back into the correct order. Because individual messages may follow different routes, some messages are likely to pass through the processing steps sooner than others, leaving the stream out of order. A resequencer usually does not modify the message contents.
  6. A message filter is a processor that eliminates undesired messages based on specific criteria. Filtering is controlled by specifying a predicate on the filter: when the predicate is true, the incoming message is allowed to pass; otherwise, it is blocked. A message filter usually does not modify the message contents.
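
For instance, here is a minimal sketch of a content-based router and a message filter expressed in the Apache Camel Java DSL; the endpoint URIs and XPath expressions are made up for illustration.


import org.apache.camel.builder.RouteBuilder;

public class OrderRoutingRoute extends RouteBuilder {

    @Override
    public void configure() throws Exception {

        // Content based router: inspect the message body and route accordingly
        from("jms:queue:newOrders")
            .choice()
                .when(xpath("/order/@type = 'book'"))
                    .to("jms:queue:bookOrders")
                .when(xpath("/order/@type = 'cd'"))
                    .to("jms:queue:cdOrders")
                .otherwise()
                    .to("jms:queue:otherOrders");

        // Message filter: only orders above a threshold are passed on
        from("jms:queue:bookOrders")
            .filter(xpath("/order/amount > 100"))
            .to("jms:queue:largeBookOrders");
    }
}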


Q. How would you deal with large message volumes?
A.
  • You can reduce the data volume with the "Claim Check" pattern, which lets you replace the message content with a claim check (a unique key) that can be used to retrieve the content at a later time. The content is stored temporarily in a persistent store such as a database or file system. This pattern is very useful when the message content is very large and not all components require all of the information; a minimal sketch of the idea follows this list.
  • A Content Filter can be used to remove unwanted data elements from a message, simplifying its structure. Very often, messages are represented as tree structures containing many levels of nested, repeating groups because they are modeled after generic, normalized database structures. This level of nesting is frequently superfluous, and a Content Filter can be used to 'flatten' the hierarchy into a simple list of elements that can be more easily understood and processed by other systems.
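
Here is a framework-agnostic sketch of the Claim Check idea; the class name and the in-memory store are purely illustrative (integration frameworks and real systems would use a database or file store instead).


import java.util.Map;
import java.util.UUID;
import java.util.concurrent.ConcurrentHashMap;

public class ClaimCheckStore {

    private final Map<String, byte[]> store = new ConcurrentHashMap<String, byte[]>();

    // Replace the large payload with a claim check (a unique key).
    public String checkIn(byte[] largePayload) {
        String claimCheck = UUID.randomUUID().toString();
        store.put(claimCheck, largePayload);
        return claimCheck;           // only this key travels with the message
    }

    // Retrieve the original content later, when a component actually needs it.
    public byte[] checkOut(String claimCheck) {
        return store.remove(claimCheck);
    }
}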


Q. How would you go about choosing between Spring Integration and Apache Camel to solve common integration problems?
A. 

Spring Integration provides an extension of the Spring programming model to support the well-known Enterprise Integration Patterns, building on the Spring Framework's existing support for enterprise integration. Spring Integration is more suited if you already have a Spring project and need to add some integration to it; it requires almost no additional learning effort if you know Spring itself. Nevertheless, Spring Integration offers only fairly rudimentary support for technologies such as AMQP, Spring Application Events, Feeds (e.g. RSS/ATOM), File, FTP, FTPS, SFTP, GemFire, Groovy, HTTP (REST), TCP/IP, JDBC, JMS, JMX, Mail (IMAP/IDLE/POP3), MongoDB, Redis, RMI, Twitter, Web Services (SOAP), and XMPP. Integrations are implemented with a lot of XML configuration (there is no real DSL - Domain Specific Language). Spring Integration can be thought of as playing catch-up with Apache Camel, much as Java EE played catch-up with Spring.

Apache Camel is a powerful open source integration framework based on the known Enterprise Integration Patterns, with strong support for integration with core Spring. Apache Camel is almost identical to Mule ESB and offers many components (even more than Spring Integration) for almost every technology you could think of. If no component is available, you can create your own very easily, starting from a Maven archetype. Camel supports a Spring-based XML configuration as well as a "DSL" for Java, Groovy, and Scala. The benefit of the Java DSL is that your IDE can auto-complete your code as you start typing, rather than you having to wrestle with buckets of XML. The Java DSL is also very expressive, because you can mix and match your own code within the language for Expression or Predicate evaluations, so it has better readability; and there are commercial tools like "Fuse IDE" for generating the XML based DSL.

Mule ESB is another choice and, as the name suggests, it is an ESB with additional bells and whistles. It can be compared to "Apache ServiceMix", which builds on Apache Camel. Mule also offers only an XML based DSL, with Mule Studio as a visual designer, but it does provide proprietary connector support for systems like SAP, TIBCO Rendezvous, PayPal, Siebel CRM, IBM's CICS, etc.

So, the decision is not clear cut and it depends on your needs.


