javascript - Why is my function not returning an accurate count?


Expanding on code I've been working on for a supplement tracker: the current function is not returning an accurate count of the numbers greater than the average ('mean'), nor a count of the integers below the mean. I've also commented two questions within the code, because I don't quite understand why the array is set to index [0] there. I've learned a lot from comments and from searching for answers here; thankful this site exists! Looking to learn a bit more with this question.

```javascript
function suppArray() {
    var nums = new Array(); // create array
    var sum = 0;  // variable to hold the sum of the integers in the array
    var avg = 0;  // variable to hold the average
    var i;
    var count = 0;
    var count2 = 0;
    var contents = ''; // variable to hold the contents for output

    var dataPrompt = prompt("How many numbers do you want to enter?", "");
    dataPrompt = parseInt(dataPrompt);

    for(i = 0; i <= dataPrompt - 1; i++) { // loop to fill the array with numbers
        nums[i] = prompt("Enter a number", "");
        nums[i] = parseInt(nums[i]);
        contents += nums[i] + " "; // variable called to display contents
        sum = sum + nums[i];
    }
    avg = sum / nums.length;

    for(i = 0; i < nums.length; i++) { // loop to find the largest number
        var biggest = nums[0]; // why does this have index 0 and not 'i'?
        if(nums[i] > biggest)
            biggest = nums[i]; // largest integer in the array
    }
    for(i = 0; i < nums.length; i++) { // loop to find the smallest integer
        var smallest = nums[0]; // why does this have index 0 and not 'i'??
        if(nums[i] < smallest)
            smallest = nums[i]; // smallest integer in the array
    }
    for(count = 0; count < nums.length; count++) { // count of numbers higher than the average
        if(nums[i] > avg)
            count = nums[i];
    }
    for(count2 = 0; count2 < nums.length; count2++) { // count of numbers lower than the average
        if(nums[i] < avg)
            count2 = nums[i];
    }
}
```

Your function isn't returning the right values because you are assigning to count and count2 incorrectly. If you run through the code, at the end count and count2 will equal nums.length, because you are using them as the counters in the for loops. Inside those loops you also reference i, which (I believe) is equal to nums.length at that point.

I think you want something like this:

```javascript
count = 0;
count2 = 0;

for(i = 0; i < nums.length; i++) {
    if(nums[i] > avg) {
        count++;  // increase the count of numbers above the average
    }
    else if(nums[i] < avg) {
        count2++; // increase the count of numbers below the average
    }
}
```
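As a quick sanity check, the same counting loop applied to a hard-coded array (illustrative values of my own, standing in for the prompt() input) behaves like this:

```javascript
// Illustrative values, not from the original question
var nums = [2, 4, 6, 8, 10];

var sum = 0;
for (var i = 0; i < nums.length; i++) {
    sum += nums[i];
}
var avg = sum / nums.length; // 30 / 5 = 6

var count = 0;
var count2 = 0;
for (var i = 0; i < nums.length; i++) {
    if (nums[i] > avg) {
        count++;   // numbers above the average: 8 and 10
    } else if (nums[i] < avg) {
        count2++;  // numbers below the average: 2 and 4
    }
}
// count is 2, count2 is 2; 6 is not counted because it equals the average
```

Note that a value exactly equal to the average falls into neither count.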

You may want to do some reading on scope and the for loop, as you seem a little confused about them.
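To answer the question commented in the posted code: `biggest` starts at `nums[0]` so there is a real array value to compare against, but that assignment belongs once, before the loop. Putting it inside the loop body resets `biggest` to `nums[0]` on every iteration, throwing away any larger value found earlier. A small sketch with made-up values:

```javascript
var nums = [5, 9, 3]; // illustrative values

// Wrong: reset to nums[0] on every pass
var wrong;
for (var i = 0; i < nums.length; i++) {
    wrong = nums[0];               // back to 5 each time through
    if (nums[i] > wrong) wrong = nums[i];
}
// After the final pass (value 3), wrong has been reset to 5, not 9

// Right: initialize once, before the loop
var biggest = nums[0];
for (var i = 0; i < nums.length; i++) {
    if (nums[i] > biggest) biggest = nums[i];
}
// biggest is 9
```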

EDIT

If you want the biggest and smallest values in the array you can do this:

```javascript
// assign the values of the first element as the defaults
var biggest = nums[0];
var smallest = nums[0];

for(var i = 1; i < nums.length; i++) {
    // set biggest to the larger number, either biggest or the current number
    biggest = Math.max(biggest, nums[i]);
    // set smallest to the smaller number, either smallest or the current number
    smallest = Math.min(smallest, nums[i]);
}
```

Note: this assumes you have at least 1 value in the array.
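Putting the pieces together, a prompt-free sketch of the whole routine (the function name `analyzeNums` and the array argument are my own, so it can be tested without user input) might look like:

```javascript
// Hypothetical helper: takes an array instead of reading prompt() input
function analyzeNums(nums) {
    var sum = 0;
    for (var i = 0; i < nums.length; i++) {
        sum += nums[i];
    }
    var avg = sum / nums.length;

    // initialize once before the loop, then scan the rest of the array
    var biggest = nums[0];
    var smallest = nums[0];
    for (var i = 1; i < nums.length; i++) {
        biggest = Math.max(biggest, nums[i]);
        smallest = Math.min(smallest, nums[i]);
    }

    var count = 0;  // numbers above the average
    var count2 = 0; // numbers below the average
    for (var i = 0; i < nums.length; i++) {
        if (nums[i] > avg) count++;
        else if (nums[i] < avg) count2++;
    }

    return { avg: avg, biggest: biggest, smallest: smallest,
             above: count, below: count2 };
}
```

For example, `analyzeNums([1, 2, 3, 4, 10])` gives an average of 4, biggest 10, smallest 1, one number above the average and three below it (4 equals the average, so it is in neither count).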

