Bug Tracker

#444 closed enhancement (fixed)

Opened November 28, 2006 12:04PM UTC

Closed December 20, 2006 03:25AM UTC

Last modified June 20, 2007 01:54AM UTC

jQuery.merge speedup

Reported by: jquery.neytema@gmail
Owned by:
Priority: major
Milestone:
Component: core
Version: 1.1a
Keywords:
Cc:
Blocked by:
Blocking:
Index: jquery.js
--- jquery.js	(revision 642)
+++ jquery.js	(working copy)
@@ -1740,15 +1740,12 @@

 		// Now check for duplicates between a and b and only
 		// add the unique items
 		for ( var i = 0; i < second.length; i++ ) {
-			var noCollision = true;
 			// The collision-checking process
-			for ( var j = 0; j < first.length; j++ )
-				if ( second[i] == first[j] )
-					noCollision = false;
+			for ( var j = 0; j < first.length
+				&& second[i] != first[j]; j++ );
 			// If the item is unique, add it
-			if ( noCollision )
+			if ( j == first.length )
 				result.push( second[i] );
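
Pulled out of the diff context, the patched loop can be sketched as a standalone function. `mergeUnique` is a hypothetical name for illustration, not jQuery's actual API; the sketch assumes plain arrays of comparable values:

```javascript
// Sketch of the patched jQuery.merge inner loop; "mergeUnique"
// is a hypothetical name, not part of jQuery's API.
function mergeUnique( first, second ) {
	var result = first.slice();
	// Check for duplicates between first and second and
	// only add the unique items
	for ( var i = 0; i < second.length; i++ ) {
		// Scan first, stopping early at the first collision
		for ( var j = 0; j < first.length
			&& second[i] != first[j]; j++ );
		// j only reaches first.length if no collision occurred
		if ( j == first.length )
			result.push( second[i] );
	}
	return result;
}
```

The trick is that the empty-bodied inner loop doubles as the collision test: its exit condition encodes both "ran off the end" and "found a match", so the `noCollision` flag becomes unnecessary.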
Attachments (0)
Change History (7)

Changed November 29, 2006 08:01PM UTC by joern comment:1

version: → 1.1

Changed November 29, 2006 09:16PM UTC by Dave comment:2

I have an alternate suggested fix that avoids going completely through first each time:

		// Now check for duplicates between a and b
		// and only add the unique items
		DupCheck:
		for ( var i = 0; i < second.length; i++ ) {
			for ( var j = 0; j < first.length; j++ )
				if ( second[i] == first[j] )
					continue DupCheck;
			// The item is unique, add it
			result.push( second[i] );
		}
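
Wrapped in a self-contained function, this labeled-continue variant behaves like the following sketch (`mergeLabeled` is a hypothetical name for illustration):

```javascript
// Sketch of the labeled-continue variant; "mergeLabeled" is a
// hypothetical name, not jQuery's API.
function mergeLabeled( first, second ) {
	var result = first.slice();
	// Check for duplicates between first and second,
	// only adding the unique items
	DupCheck:
	for ( var i = 0; i < second.length; i++ ) {
		for ( var j = 0; j < first.length; j++ )
			if ( second[i] == first[j] )
				// Collision: skip straight to the next item of second
				continue DupCheck;
		// The item is unique, add it
		result.push( second[i] );
	}
	return result;
}
```

On a collision the labeled `continue` jumps directly to the next iteration of the outer loop, so neither a flag nor a post-loop index check is needed.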

Changed November 30, 2006 02:48AM UTC by andrea ercol comment:3

merge is called many times even for a simple selection, so every millisecond adds up quickly.

I made some tests myself with very similar code (by the way, a simple break in the inner loop would be easier), but I could not get noticeably better results than what is currently implemented in jQuery.

I still have to test the following version, based on mergesort:

function merge( first, second ) {
    // let first be the longer of the two arrays
    if ( second.length == 0 && first.constructor == Array )
        // sort for consistency with the next case
        return mergesort( first.concat() );
    if ( second.constructor == Array && first.constructor == Array )
        // mergesort is fast and eliminates duplicates
        return mergesort( first.concat( second ) );
    return merge( makeArrayFrom( first ), makeArrayFrom( second ) );
}
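
A duplicate-eliminating mergesort along these lines could look like the sketch below. It assumes values orderable with `<` (numbers here); DOM nodes would need a document-order comparison instead, which is the hard part this sketch glosses over:

```javascript
// Sketch of mergesort with duplicate elimination in the merge
// step; assumes values comparable with "<" (an assumption --
// DOM nodes would need a document-order comparator).
function mergesortUnique( list ) {
	if ( list.length <= 1 ) return list.slice();
	var mid = list.length >> 1;
	var left = mergesortUnique( list.slice( 0, mid ) );
	var right = mergesortUnique( list.slice( mid ) );
	var result = [], i = 0, j = 0;
	// Merge the sorted halves, keeping only one copy of equals
	while ( i < left.length && j < right.length ) {
		if ( left[i] < right[j] ) result.push( left[i++] );
		else if ( right[j] < left[i] ) result.push( right[j++] );
		else { result.push( left[i++] ); j++; } // equal: keep one
	}
	while ( i < left.length ) result.push( left[i++] );
	while ( j < right.length ) result.push( right[j++] );
	return result;
}
```

This turns the O(n·m) pairwise scan into O((n+m) log(n+m)), at the cost of reordering the result, which is why the comment above notes sorting both branches "for consistency".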


Changed November 30, 2006 04:19AM UTC by Dave comment:4

I am not even sure that merge needs to be used to check for duplicates so often. Isn't the comma the only selector operator that can introduce duplicate elements?

Changed December 13, 2006 11:39AM UTC by henrah comment:5

''Isn't the comma the only selector operator that can introduce duplicate elements?''

No -- say you have a structure like this:

    <div>
        <div>
            <a> something </a>
        </div>
    </div>
...and a selector like div a.

The div selector will select both the inner and outer divs, then the a descendant selector will select the same link element twice, because it descends from both of them. This is just one example; many are less obvious.
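
The double match can be reproduced with a tiny model of that tree. Plain objects stand in for DOM nodes here (an assumption for illustration; this is not jQuery's selector code):

```javascript
// Minimal model of the nested-div example above; plain objects
// stand in for DOM nodes (an assumption for illustration).
var link = { tag: "a", children: [] };
var innerDiv = { tag: "div", children: [ link ] };
var outerDiv = { tag: "div", children: [ innerDiv ] };

// Collect all descendants with a given tag, the naive way
function descendants( node, tag ) {
	var found = [];
	for ( var i = 0; i < node.children.length; i++ ) {
		var child = node.children[i];
		if ( child.tag == tag ) found.push( child );
		found = found.concat( descendants( child, tag ) );
	}
	return found;
}

// "div a": run the descendant step once per matched div
var matches = [];
[ outerDiv, innerDiv ].forEach(function( div ) {
	matches = matches.concat( descendants( div, "a" ) );
});
// matches now contains the same link object twice,
// once per ancestor div -- hence the need for merge to dedupe
```
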

Changed December 13, 2006 11:49AM UTC by henrah comment:6

If we are interested in a duplicate-removing function which only deals with whole objects and is thus checking for ''identity'', rather than simple value, maybe something like this would be appropriate:

jQuery.elementUid = 1;
jQuery.uniquify = function (list){
    var seen = {}, key = 'jq_uid', result = [], item, i = 0;
    while (item = list[i++]){
      if (! item[key]) item[key] = jQuery.elementUid++;
      else if (seen[item[key]]) continue;
      seen[item[key]] = true;
      result.push(item);
    }
    return result;
};

One potential flaw with this idea is that it leaves new properties on the elements of the array. You could make a second pass over the original list and delete the jq_uid property, but assuming we are dealing with the special case of DOM nodes -- which are unlikely to be used as JS hashes, for instance -- it is probably safe to leave a UID attached to them. It also means that subsequent applications of the uniquify function will skip re-applying the property, which suggests a (probably trivial) further optimization.
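
The identity-based approach can be exercised standalone. In this sketch a plain namespace object `ns` stands in for `jQuery` (an assumption for illustration):

```javascript
// Standalone version of the identity-based uniquify above;
// "ns" stands in for the jQuery object (an assumption).
var ns = { elementUid: 1 };
ns.uniquify = function ( list ) {
	var seen = {}, key = "jq_uid", result = [], item, i = 0;
	while ( item = list[i++] ) {
		// Tag each object with a persistent unique id on first sight
		if ( !item[key] ) item[key] = ns.elementUid++;
		else if ( seen[item[key]] ) continue; // already collected
		seen[item[key]] = true;
		result.push( item );
	}
	return result;
};
```

Because the check is on the tagged uid rather than on value equality, two distinct objects with identical contents are kept, while the same object appearing twice is collected only once.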

Changed December 20, 2006 03:25AM UTC by brandon comment:7

resolution: → fixed
status: new → closed

Fixed in REV 731 by Dave