JavaScript's XOR result is different from Java's


Solution

I had a bug in my internal conversion logic.
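For reference, here is a minimal sketch of a conversion that mirrors Java's `ByteBuffer.allocate(4).putInt(...)` (big-endian order, signed bytes). The function name `intToJavaBytes` is mine, not from the original code:

```javascript
// Convert a 32-bit integer into four signed bytes, big-endian,
// mirroring Java's ByteBuffer.allocate(4).putInt(value).array().
// The left shift moves byte i into the top 8 bits; the arithmetic
// right shift (>>) then sign-extends it, just like a Java byte.
function intToJavaBytes(value) {
    var bytes = [];
    for (var i = 0; i < 4; i++) {
        bytes.push((value << (8 * i)) >> 24);
    }
    return bytes;
}

// 795103 === 0x000C21DF, the input that produces the array from the question
console.log(intToJavaBytes(795103)); // [0, 12, 33, -33]
```

Using `>>` instead of `>>>` is what makes the last byte come out as -33 rather than 223.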

Original question

I need to implement an algorithm in both Java and JavaScript, with the Java implementation and its calculation results serving as the reference. However, invoking the XOR operator on a "negative" value (I know that Java and JavaScript both use two's complement) causes Java's result to be positive, whereas JavaScript's result is negative, as shown in the output below:

Java output:
    hash a: 16777619
    hash b: 637696617
    hash a: 637696613
    hash b: 988196095
    hash a: 988196062
    hash b: -1759370886
    hash a: 1759370917    <-- here the JavaScript implementation behaves differently
    hash b: -1169850945

JavaScript output:
    hash a: 16777619
    hash b: 637696617
    hash a: 637696613
    hash b: 988196095
    hash a: 988196062
    hash b: -1759370886
    hash a: -1759370843   <-- the result should equal the Java result
    hash b: -1883572545

Below you can see the Java source code:

private static final int FNV_PRIME = 0x1000193;
private static final int FNV_COMPRESS = 0xFFFF;
...

public long getHash(int inputNumber) {
    int hash = FnvCalculator.FNV_PRIME;
    ByteBuffer intToByteArrayConverter = ByteBuffer.allocate(4);
    intToByteArrayConverter.putInt(inputNumber);
    byte[] inputValues = intToByteArrayConverter.array();

    // inputValues.length is equal to 4
    for (byte processCounter = (byte) 0; processCounter < inputValues.length; processCounter++)
    {
        hash ^= inputValues[processCounter];
        System.out.println("hash a: " + hash);
        hash *= FNV_PRIME;
        System.out.println("hash b: " + hash);
    }
    return (hash & FnvCalculator.FNV_COMPRESS);
}

The following snippet shows the JavaScript code:

var constants = {
    FNV_PRIME: parseInt("1000193", 16),
    FNV_COMPRESS: parseInt("ffff", 16),
    BYTE_ARRAY_LENGTH: 4,
    ...
};
Object.freeze(constants);

var hash = constants.FNV_PRIME;

for (var counter = 0; counter < constants.BYTE_ARRAY_LENGTH; counter++) {
    hash ^= inputNumberArray[counter];
    console.log("hash a: " + hash);
    // multiply hash by the 32 bit FNV prime number: 2^24 + 2^8 + 0x93
    // source: https://github.com/wiedi/node-fnv/blob/master/fnv.js
    hash += ((hash << 24) + (hash << 8) + (hash << 7) + (hash << 4) + (hash << 1));
    hash |= 0;
    console.log("hash b: " + hash);
}

return (hash & constants.FNV_COMPRESS);
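The shift-and-add line emulates Java's wrapping 32-bit multiplication by the FNV prime 0x1000193 = 2^24 + 2^8 + 2^7 + 2^4 + 2^1 + 1. On engines that support ES2015, `Math.imul` does the same thing more directly; a sketch for comparison (the helper names are mine):

```javascript
var FNV_PRIME = 0x1000193;

// The shift-and-add variant used in the snippet above. Each shift is
// already truncated to 32 bits; the final | 0 folds the sum back into
// a signed 32-bit integer.
function mulShiftAdd(hash) {
    hash += ((hash << 24) + (hash << 8) + (hash << 7) + (hash << 4) + (hash << 1));
    return hash | 0;
}

// Math.imul (ES2015) performs C-style 32-bit integer multiplication.
function mulImul(hash) {
    return Math.imul(hash, FNV_PRIME);
}

// Both reproduce Java's `hash *= FNV_PRIME` for a value from the log:
console.log(mulShiftAdd(988196062)); // -1759370886
console.log(mulImul(988196062));     // -1759370886
```

A plain `hash * FNV_PRIME` would not work here, because the floating-point product exceeds 2^53 and loses the low bits before any truncation could happen.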

The array numbers are equal in the Java and the JavaScript version, as shown below (all numbers are decimal):

Java version:
    inputValues[0]: 0
    inputValues[1]: 12
    inputValues[2]: 33
    inputValues[3]: -33

JavaScript version:
    inputNumberArray[0]: 0
    inputNumberArray[1]: 12
    inputNumberArray[2]: 33
    inputNumberArray[3]: -33

I have tried replacing the byte array with an integer array, but it has not helped. I'm using WebKit's JavaScriptCore engine.

Looking at the values, I suspect that Java is sign-extending 223 when converting the series of bytes, and JavaScript isn't. 223 = 0xDF = 0xFFFFFFDF when sign-extended...
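The log output is consistent with that diagnosis: XORing the previous hash with the signed byte -33 reproduces Java's value, while XORing with the unsigned 223 reproduces the JavaScript value. A quick check using the numbers from the log above:

```javascript
// hash before the fourth iteration, taken from the log output
var hash = -1759370886;

// XOR with the sign-extended byte (-33 is 0xFFFFFFDF as a 32-bit int)
console.log(hash ^ -33); // 1759370917  -> matches the Java output

// XOR with the unsigned byte value (223 is 0xDF)
console.log(hash ^ 223); // -1759370843 -> matches the JavaScript output
```

So whatever the array is supposed to contain, the value actually reaching the XOR in the JavaScript run must have been 223 rather than -33.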

Things to be careful about when porting between Java and JavaScript:

Bitwise shift operators operate on 32-bit values in JavaScript. JavaScript internally represents all numbers as 64-bit floats and does not distinguish between floats and ints the way Java does. What's more, JavaScript does not have fixed int or float sizes, e.g. there are no byte, int or long types.
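A few checks illustrating these points, all standard ECMAScript behavior:

```javascript
// Ordinary arithmetic uses IEEE 754 doubles and does not wrap at
// 32 bits the way Java int arithmetic does.
console.log(2147483647 + 1);          // 2147483648 (no overflow)

// Bitwise operators first coerce their operands to signed 32-bit
// integers, so the same value wraps once it passes through | 0.
console.log((2147483647 + 1) | 0);    // -2147483648

// Shifts are also 32-bit: shifting 1 into the sign bit goes negative.
console.log(1 << 31);                 // -2147483648

// Integers are only exact up to 2^53 - 1 in a double.
console.log(Number.MAX_SAFE_INTEGER); // 9007199254740991
```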

There is a risk of coming unstuck because of the above points and the differences in the way the two languages represent numbers.

