How to Lose $13 of Users’ Funds (As a Blockchain Developer)

The government says we are not in a recession, but at the same time we hear about skyrocketing inflation, interest rate hikes, and layoffs in almost every sector of the economy.

Even though crypto and TradFi have been hit the hardest, many companies are still building their tokens, protocols, and DeFi products. Are you one of them?

Today I will be talking about data types, and hold on: I have something very important to say. You might picture me as a 60-something professor from MIT torturing students with lectures about subjects that no longer matter. But that’s not true.

Data types are still important, and neglecting them leads to serious consequences. I will briefly go through the potential issues and how to address them, so you don’t find the 8 minutes you spend reading this article wasted.

Modern languages like JavaScript and Python use “duck typing” to determine the type of a variable. If we assign a formula such as a = 2 + 2 to a variable, the interpreter knows it is dealing with numbers, and it will perform mathematical operations on those literals.

Duck typing can be summed up by the sentence: “If it walks like a duck and it quacks like a duck, then it must be a duck”. When you look closer at its meaning, it makes perfect sense. If a literal contains letters and numbers, it must be a string, and that’s clear. But what if it contains only numbers?

Is it a boolean, an integer, a decimal, a float, or a date? In most cases, your language interpreter will handle types correctly. The problems start when your program needs to run (even simple) calculations on large numbers.
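Here is a tiny JavaScript illustration of the interpreter guessing types for us (the values are made up purely for demonstration):

const a = 2 + 2       // both literals look like numbers, so the interpreter adds them
console.log(typeof a) // "number"
console.log(a)        // 4

const b = "2" + 2     // one operand looks like a string, so the interpreter concatenates
console.log(typeof b) // "string"
console.log(b)        // "24"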

“If it walks like a duck and it quacks like a duck, then it must be a duck”, right? Actually, it’s not.

Ethereum Denominations in a Nutshell

In the next paragraphs I refer to Ethereum’s common denominations: wei and gwei. Let me briefly introduce them so that we speak a common language.

The smallest denomination is 1 wei, and 1 ether equals 1,000,000,000,000,000,000 wei (18 zeros). I repeat: 18 zeros. It is hard to wrap our minds around such large numbers, but they matter a lot.

The next common denomination is 1 gwei. 1 ether equals 1,000,000,000 gwei (9 zeros). Gwei is more bearable for humans - in the end, everybody wants to be a millionaire, right? (wink, wink)

Let’s sum it up - 1 ether equals:

  • 1,000,000,000 gwei (9 zeros)

  • 1,000,000,000,000,000,000 wei (18 zeros)

Technical note: Ethereum has two layers - the execution layer and the consensus layer. The execution layer uses wei to represent ether values, while the consensus layer uses gwei. If you are a blockchain developer, you need to learn to interact with both.
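If you prefer code to prose, here is a minimal sketch of converting between these units, assuming ethers.js v5 and its standard unit helpers:

const { utils } = require("ethers") // ethers v5

const oneEther = utils.parseEther("1.0")         // BigNumber holding the value in wei
console.log(oneEther.toString())                 // "1000000000000000000" -> wei (18 zeros)
console.log(utils.formatUnits(oneEther, "gwei")) // "1000000000.0"        -> gwei (9 zeros)
console.log(utils.formatEther(oneEther))         // "1.0"                 -> ether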

Real-World Example: stakefish’s Tip and MEV Pool

I am a software engineer at stakefish. I’m responsible for building our palette of DeFi products, and one of the most recent is our tip and MEV pool for Ethereum.

Since September 15, 2022 (the Merge), all validators have been eligible for transaction tips and can participate in MEV to earn additional rewards. Tips and MEV rewards are earned when a validator proposes a new block.

We decided to build a smart contract that collects all of these rewards into a common vault and allows users to claim their share from it. I am not trying to advertise our product, but I need to set the context for this article.

If you are interested in this product, you can read more here. I’m not selling you anything besides my experience.

As I mentioned, we have a smart contract that receives the transaction tips and MEV rewards earned by validators. This means our smart contract holds a fairly large balance: over 963 ether at the time of writing (about $1.1M), with 8,671 validators contributing to it.

The critical part responsible for synchronization between Ethereum’s execution and consensus layers is the oracle. It is a very important system that allows us to determine which validators are contributing to the pool.

The oracle is written in Python, but it could just as well be written in JavaScript - the problem remains the same, and I will prove it shortly.

Let’s dive deep into the code!

Why Data Types Matter

The balance of the smart contract currently equals 963,135,554,442,603,402,422 wei (963 ether). This number isn’t only hard for humans to grasp, but also for computers (language interpreters, to be exact). Let’s check JavaScript:

const vault_balance = parseInt("963135554442603402422")
console.log(vault_balance) 
// 963135554442603400000 (lost 2422 wei in total)

I only cast the balance from a string to an integer, and I am already 2422 wei short. We haven’t run a single calculation yet.

The smart contract’s balance is so high thanks to the many validators contributing to it. Now, let’s calculate the average validator’s share of the contract’s balance:

const vault_balance = parseInt("963135554442603402422")
const validator_count = 8671

const avg_validator_contribution = vault_balance / validator_count
// 111075487768723730 (lost 7 wei per validator)

The average share is 0.111 ether. But this amount isn’t correct - we are actually off by 7 wei per validator, which is 60,697 wei in total (7 wei times 8,671 validators). I will show the correct number later on.

Proceeding further down the rabbit hole of losses, let’s calculate the total amount of rewards for a given validator. Keep in mind that the user needed to deposit 32 ether to start a validator, so I will deduct it from the validator’s balance.

As an example, I will take one random validator contributing to the smart contract, with a balance of 32.779 ether.

const vault_balance = parseInt("963135554442603402422") // (lost 2422 wei)
const validator_count = 8671
const avg_validator_contribution = vault_balance / validator_count // (lost 7 wei)

const initial_deposit = parseInt("32000000000000000000")
const validator_balance = parseInt("32779333896000000000")

const total_validator_rewards = validator_balance - initial_deposit + avg_validator_contribution
// 890409383768723700 (lost 23 wei per validator)

The total rewards earned by this validator equal 0.8904 ether, but this value isn’t accurate either. At this point, we have miscounted by 199,433 wei in total (23 wei times 8,671 validators). As you can see, this way of calculating numbers isn’t sustainable.

What Goes Wrong?

There are two problems with the code above:

  • In JavaScript, the maximum safe integer is only 2^53 - 1 (Number.MAX_SAFE_INTEGER). It means plain numbers can precisely handle at most 9,007,199,254,740,991 wei (about 0.009 ether)

  • Technically, we could have used BigInt, but we would have problems with division: BigInt division simply truncates the remainder, and as soon as we convert back to a regular number to keep the fractional part, we are dealing with floats again. Floats are the root of all evil in finance because they are approximate - they lose precision. We need decimals instead. (The main difference between a decimal and a float is that a decimal stores the exact value while a float approximates it.) Both issues are illustrated in the sketch below.
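A minimal sketch of both limitations in plain JavaScript, reusing the vault balance from this article:

// Problem 1: plain numbers are only safe up to 2^53 - 1
console.log(Number.MAX_SAFE_INTEGER)                     // 9007199254740991
console.log(Number.isSafeInteger(963135554442603402422)) // false - our balance is far beyond that

// Problem 2: BigInt keeps every digit, but its division truncates the remainder
const balance = 963135554442603402422n
const validators = 8671n
console.log((balance / validators).toString())           // "111075487768723723" - the 289 wei remainder is dropped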

If you have ever done any Ethereum-related coding in JavaScript, you must have heard about ethers.js. This library contains all the necessary utilities to interact with the blockchain. To fix the issue above, we will use one of its tools, BigNumber, which supports extremely large integers and keeps our wei amounts exact.

Let’s do it!

const { BigNumber } = require("ethers") // ethers v5

const vault_balance = BigNumber.from("963135554442603402422") // no loss
const validator_count = BigNumber.from(8671)
const avg_validator_contribution = vault_balance.div(validator_count) // integer division, no rounding error (the 289 wei remainder stays in the vault)
// 111075487768723723

const initial_deposit = BigNumber.from("32000000000000000000")
const validator_balance = BigNumber.from("32779333896000000000")

const total_validator_rewards = validator_balance.sub(initial_deposit).add(avg_validator_contribution)
// 890409383768723723

As you can see, we now end up with the accurate number. How do I know this is the right one? I will repeat the same exercise in Python to prove it.
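Before switching languages, a quick sanity check with native BigInt arithmetic (integer-only, so the division is floored just like BigNumber.div) arrives at the same value:

const vault_balance = 963135554442603402422n
const validator_count = 8671n
const avg_validator_contribution = vault_balance / validator_count // 111075487768723723n

const initial_deposit = 32000000000000000000n
const validator_balance = 32779333896000000000n

const total_validator_rewards = validator_balance - initial_deposit + avg_validator_contribution
console.log(total_validator_rewards.toString()) // "890409383768723723"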

Let’s Try It in Python

Python supports arbitrarily large integers, so values will not suddenly get cut off as we saw in JavaScript. Unfortunately, its division operator still produces floats by default:

vault_balance = int("963135554442603402422") # no loss
validator_count = 8671
avg_validator_contribution = vault_balance / validator_count
# 111075487768723728 (5 wei too much)

initial_deposit = int("32000000000000000000")
validator_balance = int("32779333896000000000")

total_validator_rewards = validator_balance - initial_deposit + avg_validator_contribution
# 890409383768723712 (lost 11 wei)

Wondering where exactly it lost precision? The division cast avg_validator_contribution to a float instead of an exact decimal. The correct snippet looks like this:

from decimal import Decimal

vault_balance = Decimal("963135554442603402422")
validator_count = Decimal(8671)
avg_validator_contribution = vault_balance / validator_count
# 111075487768723723

initial_deposit = Decimal("32000000000000000000")
validator_balance = Decimal("32779333896000000000")

total_validator_rewards = validator_balance - initial_deposit + avg_validator_contribution
# 890409383768723723

Now the values returned by Python and JavaScript match exactly. Check it on your own!

These sorts of losses are marginal and easy to miss. Often, we only find out about them once they compound over time and grow into significant amounts.

Situations like this cause headaches, not only for developers but also for other departments such as finance or legal. You should always test your formulas, and never use nice round numbers to do so - round inputs can hide exactly this kind of rounding error. A sketch of such a test follows below.
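For illustration, here is a minimal sketch of such a test in JavaScript, using Node’s built-in assert module and the vault numbers from this article:

const assert = require("assert")
const { BigNumber } = require("ethers") // ethers v5

// deliberately not a round number - rounding bugs hide behind round inputs
const vault_balance = BigNumber.from("963135554442603402422")
const avg_validator_contribution = vault_balance.div(8671)

assert.strictEqual(avg_validator_contribution.toString(), "111075487768723723")
console.log("exact wei math: test passed")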


I hope this article was useful to you and you learned something new today. Subscribe to my newsletter for more articles and guides on Ethereum and MEV bots. If you have any feedback, feel free to reach out to me via Twitter.

It will mean the world to me if you share this article on your social media.

Thank you!