CHIP 2021-05 Minimum Fee Rate Voting Via Versionbits

Consider: when 10% of the hashpower has significantly lower costs, they will never be able to pass those savings on to end users, and market disruption by better innovators will at best be delayed.

Hmm, they can do what they could already do on today’s network, no?

Advertise their nodes where people could submit cheaper transactions, which their pool would mine. Since these transactions would not propagate well to other nodes but would still be valid, their pool would essentially be capturing a slice of the traffic for itself.

Granted, this might come with some unhealthy side effects, which I guess this CHIP is intended to avoid.

But the free market is still at work.

Users / applications would (in theory) gravitate to the cheaper service provider.

They would end up mining more transactions than other pools, collecting a bigger slice of the fees. This either forces the network fees down as other pools react, or the cheaper pool gets more income and is able to attract more hashpower (because they can incentivize miners better with some portion of the extra money they collect from transactions).

Either way over time the cheaper pool causes a market reaction by the other pools who want to stay competitive and not lose hashrate.

Maybe my view is a little simplistic here, but I think a point I’d like to make is this CHIP is a compromise solution which is purely voluntary (not consensus), and anyone who feels that the minimum fee set by it is too high and they can provide service cheaper, is not prevented from doing so!

Thanks for the reading suggestion, it sounds like a great topic and I’ll give it a read!


@tom , agree with your concerns and principles but not so certain about the magnitude of the issue. It clicks on my rational brain but not my empiricist one.

We really need to address how overloaded the term “centralized” is in our conversations about consensus - especially by better clarifying magnitude and scope. It’s often just used as a “dirty” word that mostly means bad - typically justified when real practical centralization occurs, but sometimes not, when it is really a theoretical notion of direction rather than destination. It would be nice if we could come up with new terms that better delineated this dimension and spectrum, so the term could become more nuanced and useful rather than resulting in the end of thought as it often does (intentional or otherwise).

If we believe that PoW mining is a decentralized mechanism, then how can providing one lever to signal a position on the value/cost of the service miners provide be considered centralization, or a move toward it, on any practical basis? It seems that before this lever could actually manifest any negative centralization side effects, actual majority control of our mining pools would have far more severe consequences. Am I missing some other aspect of it possibly leading to more centralization than that? So long as miners are independent of the transactions they add to blocks, PoW is the strongest known decentralized consensus system there is. I don’t see this CHIP altering that relationship.

To your example - BCH transactors could benefit from a 10% hashpower advantage by those miners A) going ahead and mining bigger blocks (assuming frequent demand, so the mempool is often not cleared each block). B) Other miners would then have an incentive to reduce per-tx prices in order to increase the quantity of txs, so there would be more for them when they win blocks. A would benefit all users regardless of whether or not B manifests. I believe the conditions for A are present today and unchanged by this CHIP, whereas this CHIP enables B, which is not an option today.

What becomes the alternative solution you propose that provides an additional benefit otherwise if one set of miners suddenly becomes more competitive? Is that where “days-burned/destroyed” comes into play? That’s also an interesting concept to me but one that I don’t think is necessarily conflicting with this option.


Note that this merely provides a system to coordinate. There is nothing “enforcing” or “controlling” it, consensus or otherwise, with the possible exception that miners should have at least mild incentives to see the network succeed by not lying. Given enough support for whatever alternative, “innovator”-facilitating system there is, nothing really stops the network from abandoning this at any time, wholesale or piecemeal.


Interesting idea.

I think it’s risky if 0-conf policy is not aligned with the mining majority, so I’m inclined to think the 75% threshold is too high - a simple majority should probably control the fee rate.

Also I would suggest using a predefined array of fee levels, e.g. [ …, 0.1, 0.2, 0.5, 1, 2, 5, 10, 20, 50, … ] (but you most likely want a more fine-grained array), and then step one element up or down the array after a successful vote. This way the fee rate can always remain a nice round number that is easy to communicate.
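For illustration, stepping one element up or down such a predefined array could look like the sketch below. The array values and the `step()` interface are my own assumptions, not part of any proposal:

```python
# Illustrative sketch: move one element up or down a predefined fee-level
# array after a successful vote, clamping at both ends of the table.
# FEE_LEVELS uses the example values from the post above (truncated there
# with "…" at both ends); a real table would be more fine-grained.
FEE_LEVELS = [0.1, 0.2, 0.5, 1, 2, 5, 10, 20, 50]

def step(current_index: int, vote: str) -> int:
    """Return the new array index after a successful 'up'/'down' vote."""
    if vote == "up":
        return min(current_index + 1, len(FEE_LEVELS) - 1)
    if vote == "down":
        return max(current_index - 1, 0)
    return current_index  # "no change"

idx = FEE_LEVELS.index(1)   # start at fee level 1
idx = step(idx, "down")     # one successful "down" vote
print(FEE_LEVELS[idx])      # -> 0.5
```

Because the fee rate is always looked up from the table, it stays a round, easy-to-communicate number no matter how many votes have occurred.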


I agree with the initial statement, but I’m confused about what follows. If the fees are adjusted by a mining supermajority, that would imply they are aligned with the mining majority, no? (can’t have a supermajority without having a majority)

But maybe I’m not understanding exactly what you meant by “… if 0-conf policy is not aligned with mining majority”.

Yes, this part is attractive to me since fees are a frequent topic of communication. I did consider precomputed tables, but not under that aspect (I only mention them briefly in point (1) of ‘Tradeoffs’).

I would welcome someone doing a counter-CHIP that specifies such a scheme.

They could just take this CHIP and change a few areas, as it would not be fundamentally different.

The problem occurs when a 70% majority want to increase/decrease the fee but a 30% minority successfully blocks it. In this example the resulting fee level does not have majority support. More precisely, it previously had majority support, but now lost it, yet it continues to exist.


A simpler suggestion than a full array is to use 25% increases and 20% decreases. These are equal in size on a logarithmic scale, i.e. one 25% increase and one 20% decrease always exactly undo each other, regardless of order.
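The log-scale symmetry can be checked exactly with rational arithmetic (a quick illustrative sketch, not part of any proposal):

```python
from fractions import Fraction

# A 25% increase (x5/4) and a 20% decrease (x4/5) are exact inverses,
# so one of each always cancels out, regardless of order.
fee = Fraction(1000)           # example starting fee rate
up_factor = Fraction(5, 4)     # +25%
down_factor = Fraction(4, 5)   # -20%

assert fee * up_factor * down_factor == fee   # up then down: unchanged
assert fee * down_factor * up_factor == fee   # down then up: unchanged
print(fee * up_factor)  # -> 1250
```

Using exact fractions here avoids floating-point rounding; the same cancellation holds symbolically because (5/4) x (4/5) = 1.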


since reducing minimum relay fees is not a consensus change, miners can reduce their minimum relay fee right now and accept/mine transactions with lower than 1000 sat/kb. once those transactions are mined, all nodes see them. before those transactions are mined, most nodes will not see them (because of current relay rules). as it stands this degrades zero-conf.

the voting proposal appears to be democratic (two wolves and a sheep deciding what to have for dinner). it does not appear to display the core tenets of voluntarism. that is, malicious actors can create a situation where they force others to behave in a certain way. assuming as a given that high transaction fees degrade a network, a malicious entity can socially engineer miners and other nodes to keep fees high.

in my opinion, a demonstration of voluntarism would be to release a custom drop-in client with a low-enough minimum relay fee set that activates upon a certain block number, that miners can download and use immediately. SPV wallets, nodes, exchanges, etc, can be advised that minimum relay fees may be reduced by that time, and they should update their software, before that happens, so that zero-conf is not degraded.

on another note, i think that BCH will not capture as many potential new users and as much new traffic if fees are not reduced within the next 120 days, and may 2022 will unfortunately be too late.

its my opinion that the only objectively correct move is to reduce transaction fees, and the only considerations should be the technical limitations of a rollout as it pertains to a decentralized system (i.e. reduction in possibility of degraded zero-conf would be a good example of that)

btw, average transaction size is about 500 bytes, not ~212: Transaction Size | Bitcoin.com Charts (Appendix A - Fee Projection table). if transaction size is being used to determine fee projections, it should probably be the case that current/real transaction sizes are used for these projections, instead of the minimum possible transaction size


Some feedback on the proposal:

  • I think that voting up or down is too stateful and complicated.
  • IF we were to go with a scheme like this, I would prefer we use 3 bits at least and just have a fixed array of 7 or 8 values for fee rates such as: 1 sat/kB, 5, 10, 50, 100, 500, 1000, …
    • This covers just about anything anybody would want to do and is simpler to evaluate (all you need is the last 1008 headers).
  • An up/down mechanism requires you to evaluate every fee rate that ever existed – meaning you need to download headers since the beginning of this scheme’s deployment if you wanted to evaluate the fee rate trustlessly. (After a year of deployment that would be over 50k headers!)
  • If you just have an absolute array you just need 1008 headers. This is much easier to evaluate for an SPV wallet. Also is less bugprone for devs.
  • Having a fixed array of values makes it very clear what the actual relay fee is without rounding errors and other things accumulating.
  • It’s easier to configure as well. Just set your node to a target and fire and forget.
  • How do you configure 50 sats/kB?
    • What happens when the fee rate is 44 sats/kB? Does your node vote up?
    • If it does it will overshoot. (To 55 sats/kB).
    • If it votes “no change” it will forever be unhappy with the undershoot.
  • A fixed array doesn’t have this problem. The user has a set of choices he can make and he will know what to expect at the end based on the choice he made.
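For illustration, tallying such fixed-array votes over the last 1008 headers could look roughly like this. The table contents, window size, threshold, and interface are all assumptions sketched from the bullets above, not a specification:

```python
from collections import Counter

# Hypothetical fixed fee-rate table, addressed by a 3-bit field in the
# block header; values in sat/kB, taken from the example in the post.
FEE_TABLE = [1, 5, 10, 50, 100, 500, 1000]

WINDOW = 1008                 # evaluate over the last 1008 headers
THRESHOLD = WINDOW // 2 + 1   # simple majority, for illustration

def evaluate(vote_bits):
    """Tally the 3-bit votes from the last WINDOW headers.
    Returns the winning fee rate, or None if no single value reaches
    the threshold (i.e. the fee rate stays unchanged)."""
    assert len(vote_bits) == WINDOW
    index, count = Counter(vote_bits).most_common(1)[0]
    return FEE_TABLE[index] if count >= THRESHOLD else None

print(evaluate([2] * 600 + [3] * 408))  # -> 10 (index 2 has a majority)
```

The key property is that `evaluate` only ever needs the most recent window of headers; no history of prior fee states has to be replayed.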

Also, I think a simple majority might be beneficial, for the reasons @BigBlockIfTrue mentioned: you can have a situation where a previous majority voted for something and now only a minority supports it but we’re stuck with it.


As for @alex_asicseer 's thoughts: I am all-for just doing it immediately or soonish without all of this, FWIW. I actually think we are overshooting the fee rate drastically now and we can just agree as a network to lower it to say 100 sats/kB or whatever right now and nobody would even get mad.

My two cents.


Thanks for the feedback, @cculianu .
I am almost persuaded by the “use a simple majority argument”, and it’s very likely I will modify the CHIP in that regard unless I receive strong arguments against it.

I think that voting up or down is too stateful and complicated.

We will have to disagree there, because I think the alternatives, except for a one-time reduction, still involve voting up or down and are, in that regard, not significantly less stateful.

IF we were to go with a scheme like this, I would prefer we use 3 bits at least and just have a fixed array of 7 or 8 values for fee rates such as: 1 sat/kB, 5, 10, 50, 100, 500, 1000, … This covers just about anything anybody would want to do and is simpler to evaluate (all you need is the last 1008 headers).

You may need more than 1008 headers anyway, since you might be into the start of a new period, and then you’d need not the last 1008 but (1008 + however many headers past the last vote evaluation you’re at).

Practically I don’t think it makes much difference. As I explained in the CHIP, SPV wallets could cache the last vote result and come with checkpoints (e.g. “at checkpointed block X, the minfee was Y”), which means they don’t need to evaluate all the fee changes since this CHIP is put into effect, but only since they’ve last been updated.

And up/down mechanism requires you to evaluate every fee rate that ever existed – meaning you need to download headers since the beginning of this scheme’s deployment if you wanted to evaluate the fee rate trustlessly. (After a year of deployment that would be over 50k headers!).

No, in most cases not, for the reason of caching/checkpointing above.

Even for 50,000 headers: a “whopping” 4MB or less than 3 floppy disks (anyone remember the 1.44MB ones?)
This is not even a concern to me. Such software needs to obtain the headers anyway to decide on its chain to follow.
This data just falls out as a byproduct, and the calculation is dirt cheap and fast.

If you just have an absolute array you just need 1008 headers. This is much easier to evaluate for an SPV wallet. Also is less bugprone for devs.

That is an advantage of signaling for values in an absolute array, yes.

I think the complexity is not much less than simple up/down voting and therefore not really less bug prone.
But I may be convinced otherwise if I see a CHIP for it that has dramatically simpler proposed code etc.

Having a fixed array of values makes it very clear what the actual relay fee is without rounding errors and other things accumulating.

Yes, that is an advantage of a fixed fee table, you can just look up the new fee value without math.

This could really be a successful alternative to this CHIP. I invite anyone to specify it as such.

It’s easier to configure as well. Just set your node to a target and fire and forget.

Yet it also does not solve the undershoot/overshoot problem.

How do you configure 50 sats/kB?

You could configure it as a target, as per @im_uname’s suggestion, so that your node always votes “in the direction of 50 sat/kB”. Logically your node’s votes would oscillate around the target for as long as it is not hit exactly, unless the implementation chooses to say “closer than X to the target value is good enough for me to vote ‘no change’”.
I view a target fee setting via parameter, as proposed by @im_uname, as a good interface suggestion for implementors of this CHIP.
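A rough sketch of how such a target parameter could drive a node’s vote follows; the function name and the optional tolerance are my own illustration, not part of the CHIP:

```python
def vote_for_target(current: float, target: float, tolerance: float = 0.0) -> str:
    """Decide a node's vote given its operator's target fee rate.
    With tolerance == 0 the node keeps nudging the rate toward the
    target and will oscillate around it if the exact value is never
    reachable; a nonzero tolerance implements "close enough is fine"."""
    if abs(current - target) <= tolerance:
        return "no change"
    return "up" if current < target else "down"

print(vote_for_target(44, 50))      # -> up   (a +25% step overshoots to 55)
print(vote_for_target(55, 50))      # -> down
print(vote_for_target(48, 50, 5))   # -> no change (within tolerance)
```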

What happens when the fee rate is 44 sats/kB? Does your node vote up?

Yes.

If it does it will overshoot. (To 55 sats/kB).

Yes.

If it votes “no change” it will forever be unhappy with the undershoot.

Maybe not. There can still be one or more fee walks that reach exactly 50 sat/kB and make your node supremely happy.
How long that may take depends on the choices made by other network participants, but it is not guaranteed to take forever.

A fixed array doesn’t have this problem.

Disagree. Taking your example array, if the user wants 200 or 250 sat/kB, that’s simply not a value they will ever be able to set or get.

The user has a set of choices he can make and he will know what to expect at the end based on the choice he made.

Half agree. Yes, they can make a choice, but I think an individual “voter” will have greater uncertainty about the outcome, since it can be any of the values in the array, and not simply one of three choices (current fee or one of the new values computed for “up” or “down”).
As I understand your proposal with the fixed array, the outcome still depends on the choices of all others who generate blocks. And they can vote for any value in the array.

Overall it’s not a big problem as people will be able to see what votes come in over the time of the voting window, and can set their expectations accordingly.

So I’m not saying a fixed array scheme couldn’t work. Only that I think it makes its own set of tradeoffs, beginning with granularity and choice of values. And we can discuss those in details once someone makes a concrete proposal to that effect.


No, they can not.

Your step by step is mostly accurate, except for the fact that we still have a min-relay-fee property that is essentially putting up a wall between wallets and cheaper-miner.

It is like you advertise for cheaper food, but it is not available in any supermarket.

The level of relay-fee is network-wide, and your CHIP is also network-wide. Anyone wanting to go under that can only negotiate (a “2 foxes and a sheep voting on what to eat” setup, as posted elsewhere)…

The problem I have with it is that you (as a BCHN core member) have the power to relax the network fees without hurting the network (we have 5 years of empirical data on that) using the strategy posted in this reply of mine.
Yet you choose to add an extra layer of control that will affect the entire network, because any transaction lower than X fee will simply not be relayed to miners.

I get what you’re saying @tom .

Can I summarize your position as “there is no need to have a coordination scheme for such a minimum, just let market participants set what they like” ?

Therefore you see this CHIP as an unnecessary instrument of control which does not fully meet your definition of a free market.

My position is that participants, if unhappy with the minimum, will either introduce CHIPs to lower the fee floor (perhaps right down to its absolute floor of zero) or just not implement the CHIP and carry on using the existing configuration options to set what they like.

Right now, it seems people value the fact that transactions can simply adhere to some minimum fee rate and be sure to propagate well.

Those who go below this rate have been informed that they are at risk of double spend attacks against them, and with more rollout of awareness features for double spend, that may become a safer option.

I raised this CHIP because there was pressure to reduce the fees, and I’d prefer to see it happen in a coordinated (although decentralized) manner that reduces this fragmentation risk as long as Double Spend Proofs are still in beta, implementations still do not cover as many transaction types as they could, and wallet support is minimal.
This may all change, but even then I think it will take time, and some people think a “High Fee Event” may occur on BCH sooner than many would think.


Well, I’m not directly disagreeing, but I think your simple statement has the issue that it is dismissing how the free market works. Innovation has to happen and markets have to be able to react. The point of free markets is that the devs do not put up any boundaries to the free trade. Any imposition is limiting the power of free trade.

I believe that if you do basic things to protect the full node, you can let the market decide on the fees.

Your CHIP is not voluntary, even if you write it to be so, as a minority cannot escape it.

Ok, if we go back to the beginnings of Bitcoin, well before Core started adding misfeatures to create a fee-market, this was the default. At that point the market was completely free. Transactions propagated without issues.

If you open BCHN and go to the input-selection dialog (enable it in options if you didn’t already), you see that inputs have a priority. All-high-priority inputs create a transaction that is free from rate-limiting even with zero fees. Back then you could know a transaction would propagate well. But if you had fears, you just added a bit of fee.

If your worry is that you want people to have this peace of mind, be sure that this will not change. As long as there are miners mining low-fee transactions, people can expect low-fee transactions to get mined.

Ehm, what are you talking about? Can you point to such a past informative message?
The current state is that if you go below this level, you will get a message from your nearest full node that your transaction is not acceptable. Most wallets do not even allow you to lower the fees because of this.

Ok, economics-wise this is equivalent to you proposing that wives make a proposal to lower the price of baby food.
The way the market works, if it is healthy and open, is that someone actually capable of producing at this lower rate puts that on the market, and the wives vote with their money by buying those products instead of those from the competitor.

Your proposal would dictate how all non-mining nodes LIMIT the flow of transactions, at the behest of a majority of miners.
Miners that are more capable of lowering fees do not have the option of doing so because all the non-mining nodes are still tethered to the vote of the majority.

Your position is thus not realistic and not making economic sense.

Generally agreed. I think that some of the current crop of client software uses the min relay fee to protect itself to some extent, and that may just be popular because it is easy to understand and can be used to establish a common policy on the network which hasn’t diverged much from what miners consider acceptable.

I’d like to hear feedback from relevant mining pools whether they would be ok with less reliance on such a min relay fee (perhaps to the point of removing it completely) and instead more using priority as a tool to protect their nodes. However, that is considered outside the scope of this CHIP (as pointed out in the CHIP), even though still worth following up on.

I think min relay fee is a basic form of those protections, and the market can already decide it (by each node operator configuring their node as they see fit, although I think it is currently limited down to 1 sat/kB and no further - but principally this is an implementation concern; that floor could be lowered, as the CHIP already acknowledges but considers out of scope).

A min relay fee > 0 is just the node operator telling the rest of the market that they’ll only consider transactions at or above that fee. That’s also still part of the functioning of the free market.

I have tried to describe in my previous comment that this is not the case, a minority could escape it by operating nodes which accept a lower fee, and mining those transactions. They would be opening the door to any users wishing to pay a lower fee. So I struggle a bit to see how you conclude that “a minority can not escape it”.
They do not have to adhere to the votes of those implementing this CHIP.

In BCH everyone remains free to choose which policy to run on their node(s), which transactions to mine into their blocks, … none of that is changed by this CHIP.
As @im_uname has pointed out, it is only a coordination tool.

Now, I have taken your point about such a minimum fee possibly not being needed at all if other protections against flooding are adequate. Maybe that is not the only concern or reason for their existence. I would like to hear more opinions from actual miners & pools.

Agree that Core’s fee market was (and is) disastrous, but it’s false to claim that transactions always propagated without issues before Core.

In fact, the 1MB limit was set (temporarily) precisely as an emergency measure against a spam flood, which happened because flooding the network was ultra cheap. Core did not introduce that limit, so we should not pretend that everything was fine before them.

Seriously, all of this discussion is off-topic for this CHIP as measures to better protect a node against flooding are out of scope.

Pretty sure this was removed by ABC; I see no such priority displayed for inputs, nor a means of selecting it (I am looking at BCHN v23.0.0). Since I don’t remember BCHN removing it, it must have already happened before February 2020.

If I’m just being dense and not seeing a field, please post a screenshot of what you see and in which BCHN version.

It would be best we continue this in a more suitable thread on re-introduction of rate-limiting through priority.

I am talking about the double spend risks that those incur who lower their fee rate below the currently established policy of 1sat/byte.

This Reddit comment by @emergent_reasons contains a brief discussion of such a situation:

emergent_reasons comments on It is already time to implement fractional satoshis per byte if we want to be money for the world.

I base my “have been informed” on this information being generally available in the public sphere, in discussion threads on this topic.

Again, this proposal does not dictate. If you read the fine print, it says “SHOULD” in the places where it mentions what nodes do with the new relay min fee value.
There isn’t a requirement that they MUST adhere to it.
They are still free to operate their own policies and bear the responsibility of doing so.

Yes, I will point out again that they DO have the option, despite your claims to the contrary.
They can break free of the majority and run their own lower fees pool.

This is not a consensus rule.

I like the idea of not imposing unnecessary constraints on the system too.
However, I would not say that this proposal makes no economic sense - I leave that up to the economic actors in the system to decide.

You should check your facts before writing them here.

You say that someone able to operate more cheaply should invest in duplicating the full-node network in order to allow users to maybe find such a place to buy cheaper transactions.

There definitely is something wrong with the economics… :frowning:

You are right, it was preventative instead of an emergency measure.

Why “invest in duplicating the full-node network”?

They are already running nodes. They just set their minfee to lower than others and run like that.

Advertising the existence of your service is nothing new. It does not require duplication of the entire node network. Fallacious argument there.

Yes and no. Miners have their private network of nodes (count depending on size of mining operation, typically pools have a series of them all over the world).
But those mining nodes are not reachable from the Internet. Basic security, you understand. And thus could not be used for the purpose that you state.

If you use 25% increases and 20% decreases, then it can be made equivalent to a fixed fee table. Instead of recursively calculating the fee level, it can be calculated directly without intermediate rounding:

fee rate = 1 sat/B * 1.25 ^ (total number of increases - total number of decreases)

For convenient implementation the exact values of 1.25^n can be stored in a look-up table.
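A minimal sketch of that direct calculation with a precomputed look-up table follows; the table bounds and function interface are illustrative choices, not part of the proposal:

```python
# Fee rate derived directly from the net vote count, per the formula
# above: fee = 1 sat/B * 1.25 ** n, with n = increases - decreases.
# Precomputing 1.25**n avoids accumulating intermediate rounding;
# the table range of -20..20 is an arbitrary illustrative bound.
POWERS = {n: 1.25 ** n for n in range(-20, 21)}

def fee_rate(increases: int, decreases: int) -> float:
    """Return the fee rate in sat/B after the given net vote history."""
    n = increases - decreases
    return 1.0 * POWERS[n]

print(fee_rate(3, 0))  # -> 1.953125 (three +25% steps: 1.25^3)
print(fee_rate(3, 3))  # -> 1.0 (increases and decreases cancel exactly)
```

This makes the up/down scheme equivalent to a fixed table indexed by the net count, as the comment above observes.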


Thanks for the bitcoin.com charts link.

While I didn’t verify the accuracy of their raw data (still something on my todo list but it will take longer) I did download their CSV data and ran it through some stats software at face value.

| timeframe | min | max | mean | sstdev | q1 | q3 | iqr | median |
|---|---|---|---|---|---|---|---|---|
| 2009-01-10 to now | 135 | 10223 | 521.04 | 402.93 | 383 | 615 | 232 | 497 |
| 2017-08-01 to now | 235 | 10223 | 600.66 | 549.30 | 388 | 600.5 | 212.5 | 481 |
| Last 12 months to 2021-06-04 | 278 | 1682 | 681.89 | 316.84 | 382.75 | 940.75 | 558 | 627 |
| Last 6 months to 2021-06-04 | 278 | 1163 | 421.77 | 118.38 | 342.5 | 478 | 135.5 | 382 |
| Last 3 months to 2021-06-04 | 298 | 830 | 419.32 | 104.92 | 344 | 481 | 137 | 378 |

iqr is the inter-quartile range (third quartile value minus first quartile value)

The mean and median of this data are higher over the entire BCH lifetime than in the last 6 or 3 months, with a tendency toward smaller average transaction sizes recently.

Over the whole BCH lifetime the mean of the daily averages (I suppose this is what the bitcoin.com data contains) is closer to 600 bytes, but more recently closer to 400.

500 bytes as you suggested seems a decent middle ground for an overall average based on this data (very roughly).

I have updated the fee projection table in v0.2 of this CHIP.


If you use 25% increases and 20% decreases, […]

Thanks, I think this is a good simplification (hopefully), I’m planning to adopt it in v0.3.
Will inform when that is ready for review.
