Velocity If Statement is Type Sensitive
I discovered the hard way that comparisons in Velocity #if statements are type-sensitive. This can cause subtle bugs that are difficult to track down, especially when working with HTML form data.
The Problem
I was trying to mark the 'selected' option in an HTML <SELECT> element. The list of possible values is $list; each entry is a simple 'NameValuePair' value object { String name, String value }. The object I am editing in this form is $obj, and the field I am testing against, $obj.typeId, is defined as a long.
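For reference, the NameValuePair value object described above can be sketched as a minimal Java class. This is an illustrative reconstruction based on the { String name, String value } description, not the exact class from my project:

```java
// Minimal sketch of the NameValuePair value object: both fields are
// Strings, which is what sets up the type mismatch in the template.
public class NameValuePair {
    private final String name;
    private final String value;

    public NameValuePair(String name, String value) {
        this.name = name;
        this.value = value;
    }

    public String getName()  { return name; }   // shown as $item.name in the template
    public String getValue() { return value; }  // shown as $item.value in the template
}
```

Because getValue() returns a String, every $item.value the template sees is a String, regardless of what the underlying field represents.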
#foreach ( $item in $list )
#if ( $item.value == $obj.typeId )
<option value="$item.value" selected>$item.name</option>
#else
<option value="$item.value">$item.name</option>
#end
#end
However, the #if directive is actually comparing a String to a long, which are of course never going to be equal.
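The exact semantics of == have varied across Velocity versions, but in the version I was using it behaved like Java's equals() for references of different types, which this standalone Java snippet illustrates:

```java
// Why the template comparison fails: a String is never equal() to a
// Long, so $item.value == $obj.typeId is always false in the template.
public class TypeMismatch {
    public static void main(String[] args) {
        String itemValue = "42";   // what $item.value holds
        Long typeId = 42L;         // $obj.typeId, autoboxed to a Long

        System.out.println(itemValue.equals(typeId));                  // false: different types
        System.out.println(itemValue.equals(String.valueOf(typeId)));  // true once both are Strings
    }
}
```

The second comparison is exactly what the workaround below achieves inside the template.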
The Solution
The easiest workaround in this case was to convert $obj.typeId into a String before the comparison.
#set ( $typeId = "$!{obj.typeId}" )
#foreach ( $item in $list )
#if ( $item.value == $typeId )
<option value="$item.value" selected>$item.name</option>
#else
<option value="$item.value">$item.name</option>
#end
#end
Wrapping the reference in double quotes forces Velocity to render it as a String, so the comparison is now String against String. Using the quiet-reference form $!{obj.typeId} inside the quotes also means a null typeId becomes an empty string rather than the literal text of the reference.
Key Takeaway
Always be mindful of data types when performing comparisons in Velocity templates. Unlike some other template engines, Velocity doesn't perform automatic type coercion, so explicit type conversion is necessary when comparing values of different types.
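An alternative to converting inside the template is to convert on the Java side before the value ever reaches Velocity. This is a hypothetical controller-side sketch: a plain Map stands in for org.apache.velocity.VelocityContext, which has the same put(String, Object) shape:

```java
import java.util.HashMap;
import java.util.Map;

public class ContextSetup {
    public static void main(String[] args) {
        long typeId = 42L;  // stand-in for the edited object's typeId field

        // Put the String form into the context so the template only ever
        // compares String to String; no #set conversion is needed in VTL.
        Map<String, Object> context = new HashMap<>();
        context.put("typeId", String.valueOf(typeId));

        System.out.println(context.get("typeId"));  // prints 42, as a String
    }
}
```

Either approach works; doing the conversion in Java keeps the template free of type-juggling, at the cost of one more line in the controller.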
Any comments / feedback welcomed ;)