Discussion: reducing BLS keys

Thank you for the suggestions @sophoah. Your insight into the calculation from a coding perspective is appreciated. I think we’ll hold off and see where the other discussions on this thread go before we create a separate proposal.

Good discussions everyONE!

2 Likes

Hopefully we don’t wait too long. Many are seeing the problems with the foundation of the Harmony blockchain, and addressing this in a timely manner is important. I say this because I read the ideas and discussions on these forums, but months later the same problems still exist. Not trying to be rude, just stressing the importance of what we’re trying to do here.

5 Likes

Hello everyone, thanks for all the input so far!

Let me first mention that while I never thought slashing on the lower bound (or penalizing, etc.) made any sense protocol-wise, I still acknowledge we need to do something about that as well. That said, I think it should be a separate proposal, as this one is more or less limited to BLS keys and how to control us big validators with some rules we would all agree on.

As for BLS keys, Harmony’s original idea at the very beginning, when we were still testing prior to the mainnet release, was 1 node, 1 BLS key, and that was the best option we could have had. Sadly, that one is now out the window due to how long we have had a maximum of 106 BLS keys. That number is way too high in my opinion, especially given that any of the big validators can simply deploy max keys within one shard and “hijack” it (take it over). So this problem is twofold:

  1. absolute number of BLS keys - each key we big validators deploy is one less slot for a small validator
  2. BLS keys within a shard - with too many, a validator (or group of validators) can take the shard over for their own benefit

I think this is still the simplest and best solution, as opposed to limiting delegation stake or anything else I have read here. Since this is a core protocol component, we need to tread carefully, so I would not advise going into too much complexity; let’s just try to find a simple solution. We can always upgrade it if we see it’s necessary, so I would rather start with that and build up if needed than start with a complex solution to what is, in my eyes, not a very complex problem to solve.
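To make the two rules concrete, here is a minimal sketch of how both caps could be enforced together whenever a validator adds a key. The cap values, and the function name, are invented for illustration, not actual protocol parameters:

```python
# Illustrative caps only; not real Harmony protocol values.
MAX_KEYS_TOTAL = 16
MAX_KEYS_PER_SHARD = 4

def can_add_key(keys_by_shard: dict[int, int], shard: int) -> bool:
    """Reject a new BLS key if it would break either cap."""
    total_keys = sum(keys_by_shard.values())
    keys_in_shard = keys_by_shard.get(shard, 0)
    return total_keys < MAX_KEYS_TOTAL and keys_in_shard < MAX_KEYS_PER_SHARD
```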

4 Likes

“The randomness and the blind election of keys assignment to all eligible validator is a bad idea since protocol has no sure way to say the node will be running smoothly to help the network.”

In what way does the network NOW know that a validator will be running smoothly, other than that it has x keys and x stake?

It has happened many times, and it is indeed part of the protocol, that a poorly performing node gets booted from election. This happened repeatedly with Kucoin, to the tune of 40+ keys, and the network handled it fine.

Indeed this was cited by Harmony as justification for moving towards decentralisation and having the community run more nodes than Harmony… Has this changed?

Right now we have a node with 29 keys about to get knocked out… This is most likely due to poor management, and the protocol has been proven able to handle it.

I do propose getting rid of key bidding and allowing automatic assignment… Using a round robin would spread keys out, which would minimise the risk and make the action of ‘spinning up more nodes’ less viable because it would make no difference.

It could be that some sort of extra requirement is considered for random assignment, such as uptime or being synced to within a certain range - information that is already known and served by Harmony RPCs.

Maybe a hybrid solution: a limit on keys per shard per node, and THEN random assignment based on that… combined with the above uptime / syncing requirements.

Example:

4 keys per node per shard
IF uptime > 90% AND db is synced to within 100 blocks
THEN allow that node to be assigned
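Expanding that pseudocode into a runnable sketch, with the thresholds from the example above; the Node fields and helper names are my assumptions about what Harmony RPC data could feed in:

```python
import itertools
from dataclasses import dataclass

MIN_UPTIME = 0.90          # "uptime > 90%" from the pseudocode above
MAX_BLOCKS_BEHIND = 100    # "db is synced to within 100 blocks"
KEYS_PER_NODE = 4          # "4 keys per node per shard", simplified

@dataclass
class Node:
    name: str
    uptime: float          # fraction of recent blocks signed
    blocks_behind: int     # chain head height minus node's synced height

def eligible(node: Node) -> bool:
    """Only well-run, synced nodes may enter the assignment pool."""
    return node.uptime > MIN_UPTIME and node.blocks_behind <= MAX_BLOCKS_BEHIND

def assign_round_robin(nodes: list[Node], shards: list[int]) -> dict[int, list[str]]:
    """Rotate through shards so each eligible node's keys are spread out."""
    assignments: dict[int, list[str]] = {s: [] for s in shards}
    shard_cycle = itertools.cycle(shards)
    for node in filter(eligible, nodes):
        for _ in range(KEYS_PER_NODE):
            assignments[next(shard_cycle)].append(node.name)
    return assignments
```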

Maybe the focus should be, as you noted, geared more towards infrastructure than stake.

9 Likes

I agree wholeheartedly with this proposal.

1 Like
  1. If you have more than 10 keys, the system won’t let you bid past slot 600, or the 3/4 line if the validator slots expand, so the number stays flexible.

  2. 0% fee for 50 epochs
    5-100 million: mandatory 5% fee
    100-200 million: mandatory 6% fee
    200 million and above: mandatory 7% fee

  3. Eliminate lower bound

  4. Change the undelegation period from 7 epochs back to 1 (liquidity is king)
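Reading those tiers as total stake in ONE (my interpretation; the function and behaviour below the 5 million tier are assumptions), the fee schedule would look something like this sketch:

```python
def mandatory_min_fee(total_stake_one: float, epochs_active: int) -> float:
    """Sketch of the tiers above; stake interpreted as ONE staked."""
    if epochs_active <= 50:
        return 0.00                    # 0% fee window for the first 50 epochs
    millions = total_stake_one / 1_000_000
    if millions < 5:
        return 0.00                    # no tier stated below 5 million
    if millions <= 100:
        return 0.05
    if millions <= 200:
        return 0.06
    return 0.07
```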

2 Likes

I also think it would work best if we remove bidding by validators and have the protocol decide how many keys a validator gets, based on their stake relative to the stake of other validators, with a maximum number of keys determined by the number of nodes they are running and a much reduced number of keys allowed per node (I support the 4-keys-per-node option). Similar to autobidder, but optimised for fewer keys rather than a target slot?

Remove bidding completely from the validators’ control. Slot allocation can then be optimised, and no more problems with validators taking too many slots.

Small validators can still potentially benefit from the boost below lower bound, and larger validators will be penalised for going above the upper bound if they don’t provide enough server resources to support their stake. EPoS stays intact.
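To make the allocation idea concrete, a rough sketch; the sizing rule (target each slot near the median) and all names are my assumptions of one way it could work:

```python
KEYS_PER_NODE = 4  # the 4-keys-per-node option mentioned above

def allocate_keys(validator_stake: float, median_slot_stake: float,
                  nodes_run: int) -> int:
    """Protocol-side key count: enough keys that each slot's effective
    stake sits near the median, hard-capped by the hardware provided."""
    wanted = max(1, round(validator_stake / median_slot_stake))
    return min(wanted, KEYS_PER_NODE * nodes_run)
```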

Is this feasible?

3 Likes

As a delegator, this is sad to hear. We coin holders want to see the project succeed overall, and reading this basically tells me that the coin’s success hinges on a few validators who can shake the returns of other validators. That then trickles down to the delegators, who will either accept lower returns or leave that validator. Neither outcome is good, because one less validator is one less person to vote, to help secure the protocol, and to potentially bring the project to a new audience. This also hinders Harmony’s long-term goal of growth and adoption.

A non validator speaking.

6 Likes

This doesn’t make sense:

“ The reason we want to do this is because the median stake is high and due to it, many validators bid around it or below it…”

Limiting the number of keys a validator can have will INCREASE the effective median stake (EMS), which will raise the lower bound… and the unintended consequence is that middle and large validators are MORE likely to try to get some of those sweet rewards below the lower bound.
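To see the mechanics with invented numbers: a 36M whale split over 12 keys bids 3M per slot and fills the slots below the 6M small validators, dragging the median down; capped at 4 keys, each of its slots carries 9M and the median slot becomes a small validator’s bid. A toy demonstration, using a greedy top-N stand-in for the election, purely illustrative:

```python
import statistics

SLOTS = 11
small_bids = [6e6] * 5                     # five 6M single-key validators

def whale_bids(stake: float, keys: int) -> list[float]:
    return [stake / keys] * keys           # stake split evenly per key

def elect(bids: list[float]) -> list[float]:
    return sorted(bids, reverse=True)[:SLOTS]   # greedy stand-in for EPoS

for keys in (12, 4):                       # a 36M whale: 12 keys vs capped at 4
    ems = statistics.median(elect(small_bids + whale_bids(36e6, keys)))
    print(f"{keys} keys -> median slot stake {ems / 1e6:.0f}M")
# 12 keys -> median slot stake 3M
# 4 keys -> median slot stake 6M
```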

While I have supported limiting the number of keys since the beginning, some people opposed it at first and it didn’t get any traction.

I think it is a good idea to see if the upcoming implementation of a 5% fee will bring some delegations from the large Validators to the smaller Validators… and address this then if it is still an issue.

Regardless, it’s been mostly mid-sized validators who have been diving low and adding excessive keys to keep people out recently. I took this screenshot just now.

2 Likes

I think it should be around 50M or 100M.

1 Like

I am not sure I understand this randomization correctly, but it doesn’t sound meritocratic to me. Would this mean that an attractive validator with stable and relatively high rewards (good uptime, reasonable fees, etc.) has the same probability of being elected as another, less attractive node?
I am probably missing something, so I apologize.

Is there an example of this randomized system?

1 Like

Based on how this discussion has gone, it really feels that the only solutions that have any hope of passing are ones proposed by the largest validators. That’s the state of our governance and decentralization which sucks, but it is what it is. It’s almost as if having the most stake somehow makes you a domain expert simply because you hold a lot of power.

Without going into a lot of detail, because I don’t want to waste my time typing up an explanation that will just be ignored: you can very easily define multiple distributions and combine them trivially to come up with a final probability of election. As a trivial example, you can define one distribution that rewards validators who put in a larger bid, but say that this is NOT the thing we feel is most important, so we give this distribution a low weight. The community can vote and say that what matters most is onboarding new validators, so we define a second distribution that rewards “new” validators and levels off once you have been elected “many” times (new and many are just parameters of this distribution, and can be counted in epochs)… and since this was voted on and agreed to be most important, we give it a higher weight. We can also define another distribution that penalizes validators with low uptime, levels out, and then gives a small bump to validators with >99.5% uptime, and give this one a medium weight. And we can define more (or fewer) distributions to emphasize the properties the community feels are most important at any given time.

The main point is, it is not hard to create these distributions, and while there are different ways of combining them, combining them is not hard either; we’d just need to pick a strategy. As our needs change, all we have to do is change a distribution, remove one, add one, or tweak the weights, and the net outcome is that election probabilities update accordingly to move things in the direction the community desires. These updates can very easily be voted on and implemented.
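For what it’s worth, here is roughly what that looks like mechanically. Everything below (the score functions, cutoffs, and weights) is made up purely to show how trivially the pieces combine, not a proposal for actual values:

```python
def bid_score(v) -> float:
    return min(v["bid"] / 10_000_000, 1.0)             # rewards larger bids

def newness_score(v) -> float:
    return max(0.0, 1 - v["times_elected"] / 50)       # levels off with age

def uptime_score(v) -> float:
    if v["uptime"] < 0.90:
        return 0.2                                     # penalize low uptime
    return 1.0 if v["uptime"] > 0.995 else 0.8         # small bump above 99.5%

WEIGHTS = {"bid": 0.2, "newness": 0.5, "uptime": 0.3}  # voted-on priorities

def election_weight(v) -> float:
    return (WEIGHTS["bid"] * bid_score(v)
            + WEIGHTS["newness"] * newness_score(v)
            + WEIGHTS["uptime"] * uptime_score(v))

def election_probabilities(validators) -> dict:
    total = sum(election_weight(v) for v in validators)
    return {v["name"]: election_weight(v) / total for v in validators}
```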

As I pointed out before, and it seems @DKValidator has noticed as well, the key is to remove validators’ ability to control their own destiny. The manipulation this has enabled is completely unfair. All of the proposals I see are clunky, inelegant, and overly convoluted, and will do nothing to fix that root cause, which is why we are in this perpetual mess.

I get that this may or may not make sense, but I think anyone with experience building probabilistic models would agree that this is very straightforward to do. For something like this, the real discussion points would be which distributions to use and what weights to assign. It just feels very much like the consensus among the validators is that being an early validator or a mod of a subreddit is what qualifies you to be heard, and smaller voices who may have relevant expertise IRL just get ignored. Oh well, I’m just going to keep watching and praying that the validators with a lot of stake don’t drive this project into the ground.

5 Likes

Thanks a lot for the reply; now I see what you mean. I agree with everything you said about the power of the big validators. I have the sense that most of the proposals are not trying to implement the best, most decentralised and fair system possible, but instead just improve things a bit with something that pleases the big validators and won’t be shut down immediately.

That weighted system sounds really interesting, and as you say, for an expert in statistics it would probably be very easy to figure out. The concept of adjusting the weights when necessary sounds really good; it would have to be voted on as well, but by that point the system would be distributed enough that proposals would be easier to be heard and to pass.
Which again states the issue at the moment: the protocol is held hostage by the big players, and it feels like a totalitarian kingdom rather than a democracy.

As far as I have read, there are three places where we could tweak:

  • Validators: limiting slots/stakes
  • Delegators: limiting the amount of stake on a single validator
  • Statistical randomization: the system intervenes and assigns slots

And here is my thought: couldn’t all of these be implemented together, since each of them introduces benefits and negative side effects when implemented alone? Or is the randomized system so good that the other two wouldn’t be necessary anymore? On the other hand, making so many changes at once makes the outcome too complex to foresee.

I don’t want to steal more of your time; I’m just throwing out some thoughts that will probably get lost in this forest of ideas anyway.

This is a real mess and I hope we manage to fix it.

1 Like

@Manumix I agree. I think the first step is coming to a collective agreement that something is broken at a fundamental level; without that, I don’t see how there will be the will to implement, or even consider, larger change. I think a good starting point is the Medium article (linked below) that outlines EPoS and specifically the issues the design was intended to avoid or resolve. If we look at the current state of things, with two years of hindsight, I see no conclusion other than that EPoS has failed to accomplish its intended goal. That doesn’t mean there is no merit to EPoS, or that no aspects of it work, but taken as a whole it was, and continues to be, a failure.

The biggest piece of evidence for this is the massive imbalance of power among the validators. Decentralization is critical for large-scale adoption. EPoS failed because it hasn’t allowed us to decentralize away from the validators that were part of Pangea, aside from a few exceptions. EPoS does not “just work”, so to speak; validators should not need to join private Telegram groups to coordinate their behavior so that things are “fair”. The model should be built to be inherently fair. Any model that relies on this type of coordination is clearly not scalable, not fair, and cannot be decentralized. A reckoning is coming if this is not fixed; it’s just a matter of when it will arrive.

If we are already so centralized around validators that cannot give up some power in order to achieve this goal, then it is almost certainly going to be impossible to convince large swaths of people to invest their hard-earned money in Harmony. The price will go up from here, and we will get pumps, but until this issue is resolved Harmony will never be a top-tier layer 1, regardless of how good the tech is. With what is going on with Solana and the outlook for Cardano still up in the air, there is an opening for Harmony to sneak in. But there are so many other players that we will quickly be passed up; this has actually mostly already happened, but it isn’t too late to make up ground.

There is a saying that a rising tide lifts all ships. These validators can hold onto their power and remain oversized fish in a small pond, sucking up all the oxygen and ultimately causing the slow death of every other fish in that pond, and then their own. Or they can give up some of their power to let the pond come back into equilibrium: the ecosystem will be healthy, everyone will thrive, and they will certainly grow with the rest of us. If not, all I see in store for Harmony is becoming the textbook case of how failed self-governance destroyed otherwise very promising tech.

5 Likes

To respond to your points.

Validators: limiting slots/stakes. I really think the whole issue with keys and slots is a red herring; it is an indicator of a much more insidious problem, but not a problem in itself. The slot issue can be broken in two: one part related to elections and fairness, and the other related to network security and stability. The fairness part can be resolved completely via an updated election process (i.e. randomization). The definition of what is fair can be voted on, and can change to mirror changing priorities, but whatever the community collectively decides is fair can be implemented via a randomized election process. For stability, it is possible that the distribution of keys across shards can be managed via randomization as well, but that would depend on technical details about the blockchain that I’m not familiar with. Finally, for concerns about a key-related attack: limiting keys or anything like that will have zero impact. If you intend to attack, and you find yourself in a position to do so, then buying a server and setting up a bunch of VMs to run multiple validators is not going to stop you.

Delegators: Under EPoS there are so many issues that everyone has their own ideas on how to tweak and fix things. Again, I do not see delegator stake as an issue at any fundamental level; it is merely a superficial band-aid on top of a much worse problem. As validators, do we really want to spend all of our time putting out fires, or do we just want to go arrest the arsonist? With randomization, any validator that gets disproportionately powerful can be dealt with in many ways. One option, which may be unneeded depending on the parameters used for election, is to reduce the election probability of validators that get too big. This way it’s hands-off, it “just works”: a validator gets too big relative to all others, and their election probability begins to drop. If this does not limit their growth enough, it drops further, and this continues until either their growth stalls long enough for others to catch up, or their election probability is low enough that delegators begin to move their stake elsewhere.
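One hypothetical shape for that hands-off damping; the 5% cap and the decay curve are invented for illustration:

```python
def size_damping(stake_share: float, cap: float = 0.05) -> float:
    """Multiplier on a validator's election probability: 1.0 while the
    validator holds at most `cap` of total network stake, decaying
    toward zero beyond it (at 2x the cap the probability halves)."""
    return 1.0 if stake_share <= cap else cap / stake_share
```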

Randomization: This solves all the issues I am aware of. It can be made as simple or as sophisticated as desired, but realistically a pretty basic model is probably good enough (total stake, bid, uptime). A critical distinction: in the current EPoS, as long as you are not at the 800th slot, you can add or remove keys with 100% certainty you will be elected; this almost complete control of one’s own destiny is the heart of the problem. The collective action of 100+ validators making small tweaks means that new validators, who do not control their own destiny, are at the mercy of those actions. With randomization, each slot is bid independently, so if a validator wants to create 100 keys they can, but there is no way they will win all of those bids. The more keys they create, the lower each bid and therefore the probability of election, so they need to find the sweet spot where they balance the risk of losing a bid against the reward of a higher return on the bids they win. Any individual validator can play this game and decide where in this risk/reward space to park themselves, but collectively the system will behave according to the parameters voted on and used in the election process.
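A sketch of that per-slot randomness, using weighted sampling without replacement (the Efraimidis-Spirakis method). The weights here are raw bids, but they could just as well be the combined scores sketched earlier; all names are hypothetical:

```python
import random

def elect_slots(bids: dict[str, float], slots: int) -> list[str]:
    """Weighted sampling without replacement: each key draws
    u ** (1 / weight) and the top `slots` draws win, so a higher bid
    raises the chance of winning but guarantees nothing."""
    draws = {key: random.random() ** (1.0 / bid) for key, bid in bids.items()}
    return sorted(draws, key=draws.get, reverse=True)[:slots]

# A validator splitting 10M over five keys enters five independent 2M bids;
# it may win some of the three slots below, but never all with certainty.
candidates = {f"whale-key{i}": 2e6 for i in range(5)}
candidates["solo"] = 8e6
print(elect_slots(candidates, slots=3))
```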

This idea (or similar ones) is dead in the water if randomization cannot occur on Harmony. It has been suggested that it cannot, but no technical details about why have been given. It feels very much like the limitation isn’t technical, but rather a lack of desire to go down this path. Any technical details about why randomization cannot be done would be useful to hear.

2 Likes

Thanks again for both of your answers. I am learning quite a lot, and I am glad to see there are members of the community who see this as being as critical as you do. The protocol is broken as it is, and it already feels like a dinosaur too big to be changed; however, we are still young and small, and changes like this shouldn’t be that hard to implement. The protocol is small, but the power is unevenly distributed, and hence here we are.

I see that you are much closer operationally to the whole process than I am. I agree: the responsibility of keeping the protocol decentralised cannot fall on the delegator community, and it’s not their/our job to do the math and distribute stake among different validators. Delegators should pursue profit, and by doing so the system should stay fair and secure.

The same should apply to validators: validators should pursue profit and, by doing so, keep the protocol fair and secure. But here is the problem: by doing so, the protocol shifts towards unfairness, and that leads to a less secure protocol (fewer validators, less security). This is the big issue.

So if fairness and security of the protocol can’t be achieved at this point through the human decisions of delegators and validators, then a machine should do the math. The randomization mechanism seems to be the way to go.

On the “at this point” part: I guess the protocol’s issue was the validator onboarding process. If 1,000 validators had joined on the same day and competition had started under equal conditions, I guess we would have a pretty decentralised network. But who knows; maybe over time it would have shifted towards a situation similar to today’s, due to the implicit design issues.

So now, how do we get out of this mess?

3 Likes

There’s not much that can be done until the largest validators understand the implications of wielding that much power, and understand that having power does not mean you have the answer to every issue. Leadership means understanding your own limitations and stepping out of the way for others when you hit your personal wall, and overall it means not acting so irresponsibly with power that was unfairly obtained.

5 Likes

I have been following the messages and news from the SEC in the US these past weeks, and it looks like an attack on specific protocols and stablecoins is in the making. There seems to be a plan to declare tons of projects securities based, “among other criteria”, on the four points of the Howey Test.

Not sure how we could be affected by this, but it could make expansion into the US market a lot more risky/difficult (being listed on certain exchanges, etc.).

As I understand it, most protocols fall under the definition of a security based on the first three points of the Howey Test, but the fourth point leaves some room depending on how decentralised a given protocol is.

I don’t want to give the impression that I am constantly looking for new reasons to justify the need for more decentralisation, but if the centralisation we have been discussing here is becoming a risk to the protocol itself due to increased regulation, I think this is a very valid point to address. Two options:

a) We are already at risk
b) We are not at risk but it hinders further growth

This may be a key aspect to discuss and could be an anchor point for the whole validator/election discussion. Maybe I am wrong and misinterpreting this.

1 Like

I also think the solution lies in a randomness algorithm on the protocol side.
It should be done as the next evolutionary step for the Harmony protocol; otherwise there is no chance of having 1,000 elected validators.
Small validators are just struggling to get the delegations needed to be elected…

Maybe some kind of BLS KEY POOL (10%, 20%, or 30% of all available slots) that participates in the committee election via the randomness algorithm, reserved for keys with effective stake below the median stake.
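A sketch of how such a pool could slot into the election; the 20% fraction and the uniform draw are my assumptions about the mechanics:

```python
import random

def elect_with_pool(bids: dict[str, float], slots: int,
                    pool_frac: float = 0.2) -> list[str]:
    """Reserve a fraction of slots for a uniform random draw among
    below-median bidders; fill the rest by bid as usual."""
    median = sorted(bids.values())[len(bids) // 2]
    below = [k for k, b in bids.items() if b < median]
    reserved = min(int(slots * pool_frac), len(below))
    lucky = random.sample(below, reserved)
    by_bid = sorted(bids, key=bids.get, reverse=True)
    rest = [k for k in by_bid if k not in lucky][:slots - reserved]
    return lucky + rest
```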

3 Likes

I agree with your point; it would be a pity to lose the stability of the protocol, as the bigger, stable, long-running validators are the ones ensuring it.

IMHO it would be great to see some kind of hybrid version of randomness; check my other comment below:

What do you think? Would it work?

2 Likes