I recently posted a question on Stack Exchange, Calculate Hash Rate in iOS, hoping to work out how the hash rate is calculated when benchmarking performance. I wanted to do this so I could compare my results against those presented in Is it possible to mine crypto-currencies with an iPhone? After days of searching the Internet, I came to the conclusion that calculating the hash rate is not as simple as it seems.
I was under the impression, especially after reading How to calculate the number of hashes generated per second?, Explanation of the hash rate or hash power in crypto-currencies, and How to calculate the hash rate of mining equipment, that the hash rate is calculated as follows:
1. Take a collection of x messages (M), each 80 bytes in size (the size of a Bitcoin block header).
2. Iterate over the messages and compute sha256(sha256(M)) for each.
3. Divide the number of hashes computed by the elapsed time; since the elapsed time is in seconds, the result is a hash rate in hashes per second (H/s).
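The steps above can be sketched in a few lines. This is a minimal, platform-neutral Python sketch of the benchmark I describe (function names are my own, not from any of the linked posts): it double-hashes a batch of random 80-byte messages and divides the count by the elapsed time.

```python
import hashlib
import os
import time

def double_sha256(message: bytes) -> bytes:
    """Bitcoin-style double hash: SHA-256 applied to the SHA-256 digest."""
    return hashlib.sha256(hashlib.sha256(message).digest()).digest()

def measure_hash_rate(num_messages: int = 100_000) -> float:
    """Hash num_messages random 80-byte messages; return hashes per second."""
    messages = [os.urandom(80) for _ in range(num_messages)]
    start = time.perf_counter()
    for m in messages:
        double_sha256(m)
    elapsed = time.perf_counter() - start
    return num_messages / elapsed

rate = measure_hash_rate()
print(f"{rate:,.0f} H/s ({rate / 1000:.1f} kH/s)")
```

On a laptop this kind of loop easily reaches hundreds of thousands of hashes per second, which is exactly why I am suspicious of my own 24 kH/s figure compared to the 25-65 H/s reported for iPhones.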
With that approach, I see results of up to 24,000 hashes per second, or 24 kH/s. Obviously this is alarming, because the reference article I am trying to compare against reports the following results:
iPhone 6: 25–30 H/s
iPhone 8: 55 H/s
iPhone X (iPhone 10): 65 H/s
I then continued my hunt and came across the Stack Exchange question How to calculate the hash rate of your platform?, whose accepted answer says that the hash rate is determined empirically. Although I do not doubt the author, I now have trouble understanding what the hash rate really is. Does the nonce have an impact on the calculation of the hash rate? That is, is the standard hash rate determined by counting how many times per second my machine can produce a hash that starts with 3 (or 5, or 7) leading zeros?
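To make my nonce question concrete, here is a hedged Python sketch of my current understanding (the 76-byte header prefix and function names are hypothetical, purely for illustration): mining iterates over nonces and checks for leading zeros, but every attempt still costs exactly one double SHA-256, so the hash rate would count attempts per second, not successful hashes per second.

```python
import hashlib
import time

def double_sha256(data: bytes) -> bytes:
    """SHA-256 applied twice, as in Bitcoin's proof of work."""
    return hashlib.sha256(hashlib.sha256(data).digest()).digest()

def mine(header_base: bytes, zero_nibbles: int, max_nonce: int = 2**20):
    """Try successive nonces until the hex digest starts with the
    requested number of zero hex digits.
    Returns (winning_nonce_or_None, attempts, elapsed_seconds)."""
    target = "0" * zero_nibbles
    start = time.perf_counter()
    for nonce in range(max_nonce):
        candidate = header_base + nonce.to_bytes(4, "little")
        if double_sha256(candidate).hex().startswith(target):
            return nonce, nonce + 1, time.perf_counter() - start
    return None, max_nonce, time.perf_counter() - start

# Hypothetical 76-byte header prefix; the 4-byte nonce pads it to 80 bytes.
header = b"\x00" * 76
nonce, attempts, elapsed = mine(header, zero_nibbles=4)
print(f"hash rate = {attempts / elapsed:,.0f} H/s (attempts, not successes)")
```

If this reading is right, the difficulty target changes how long it takes to find a valid hash, but not the H/s figure itself. Is that correct?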
I want to calculate a hash rate for my iOS device. Which algorithm and which formula should I use to calculate the hash rate, so that I can compare my results with those of other devices, such as laptops and graphics cards?