Asymmetric Moving Maxblocksize Based On Median

To clarify: this is not what I meant (edited above). I wanted to say: a moving median with a 365-day window, where every day you update the floor with something like if (today_365median > floor_cap) floor_cap = today_365median;
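For illustration, here's a minimal Python sketch of that ratchet, fed with per-day blocksizes (all names illustrative, not from any spec):

```python
from collections import deque
from statistics import median

def ratchet_floor(daily_sizes, window=365):
    """Track a floor_cap that only moves up: each day, take the moving
    median over the last `window` days and raise the floor if the
    median exceeds it. The floor never drops."""
    recent = deque(maxlen=window)
    floor_cap = 0
    for size in daily_sizes:
        recent.append(size)
        today_median = median(recent)
        if today_median > floor_cap:
            floor_cap = today_median
    return floor_cap
```

A year of 8MB days followed by a year of 1MB days leaves the floor at 8: the window median falls back to 1, but the floor never follows it down.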

I see, won’t that make it effectively the same as rule 2?

Yes, except that it can’t adjust back down. Once it moves the bar up, the bar stays there. We could have a whole year of low activity where the median would drop but the bar would stay in place until activity ramps up again enough to move it up more. Rationale being: if miners could have set the bar to X at some past period of 365 days, then that is proof that X can be sustained, so why would we drop the cap below X?

I agree with this as long as the multiple used is low enough. With a high enough multiplier you can end up with a floor based on a false claim of sustainability.

hmm, if “the limit should basically never go down” is a desired trait, we can simply remove rule 2 and hard-specify “the limit never goes down”, or otherwise modify rule 2 to be a higher percentile (say, 90th percentile instead of median), no? Both of these have much less complexity than your proposed rule, and should have similar effects.

That is a very interesting take, one whose result is to take power back from the miners and put it into the hands of developers. And it is also a false comparison. A slightly selfish one, I might add.

The proper way of looking at this is asking which group of people can change the blocksize. Because the false idea of anyone being able to do this alone is, like Imaginary wrote, a silly idea. Not being able to move the blocksize alone is not an indication of anything.

The proper question is: which part of our network can change the blocksize today?

  1. Is it the miners? Can enough hashpower change the blocksize without needing to ask permission?
  2. Or is it the people who mostly use this forum to decide things for the network, who need to design some new rules before the blocksize can be raised?

If the answer is 1, the miners can decide this alone, then activating the “Asymmetric Moving Maxblocksize etc” is literally taking power away from the market and from the miners.

Taking power to set the blocksize away from the miners may be the way to go, but be bloody honest about your intention and explain why this is a net-gain.

With a 90th percentile, and if we really hit some low activity period that could span a year, then on day 328 miners would be forced to start gaming it to preserve the cap, by padding blocks with dummy data for 37 days. Allowing it to drop would require them to build it up again from a potentially much lower base, requiring them to work their way up to prove again what has already been proven.

“the limit never goes down” could be dangerous if applied to both 1. and 2., because then it would take only 45 days of consistently bigger blocks to lift it forever.
If applied to only 2., then it translates into floor=max(previously observed 365-medians), and it would require 183 days of bigger blocks to lift it forever.
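The 183-day figure checks out: a 365-day moving median flips once the bigger values occupy a majority of the window, i.e. after floor(365/2) + 1 = 183 days. A quick sketch (sizes illustrative):

```python
from collections import deque
from statistics import median

def days_to_lift(old=1, new=10, window=365):
    """Count how many consecutive days of `new`-sized blocks it takes
    before the moving median over `window` days rises above `old`."""
    recent = deque([old] * window, maxlen=window)
    days = 0
    while median(recent) <= old:
        recent.append(new)
        days += 1
    return days
```

With the default 365-day window this returns 183, matching the figure above.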

Apologies to the group, but the proposal is a general repeat of the 2010 blocksize limit, where the limit is moved from a free market to something that first needs “permission” from developers. It can be argued that that decision was by far the most damaging thing to crypto, ever. And this proposal is repeating it.

Yeah, that upsets me. Especially when it’s brushed aside thoughtlessly by someone that really should know better. But, yeah, apologies as to how that appears on screen.

It’s been 8 years since I got my first code into the satoshi client, and while I’m absolutely not claiming to be an expert in everything, it doesn’t seem fair to get brushed away like that. I’m good at what I love to do and I’ve done it longer than most of you.

Also, no, I’m not gone. Just prefer to not hang out on chats all the time. Want to reach me? Please use email. <tom [AT] flowee [DOT] org>

It doesn’t need permission from developers; the whole ecosystem could get together and change the parameter to 33, but it’d still be a hard fork. If I understand right, it would be one that doesn’t require changing a single line of code, but still a hard fork, because a mismatch in node configuration would fork you off the network. The problem here is one of coordination, not of permission.

Consider this: if all miners have their limit set at 32 and someone mines a 33MB block, he’d fork himself out. If all miners set their limits at 33 but other nodes (exchanges, block explorers, etc.) don’t, then from the point of view of such nodes the network would halt, because they’d be rejecting 33MB blocks while miners would be extending a chain not recognized by anyone.

The proposal would give miners an interface where they could control the limit in a coordinated way. It would only place a sane limit on the speed, and on the amount of PoW needed to adjust the limit, so that everyone has time to adjust and can see an increase coming from a mile away: if you start seeing consistently bigger blocks, the limit WILL be moved upwards at a predictable pace and range, entirely controlled by miners. It’s really similar to the DAA.

Right, so let’s understand this clearly. The power balance is key here. That is the focus point.

The market needs to decide, in today’s setup, that they want to increase the block size. This has worked just fine many times in the past (from 250kB blocks back then), and no points have been brought forward as to why this would not work in the future. I would like to hear why people think this would suddenly start failing.

Yes, the majority of miners (hashpower, really) need to coordinate a blocksize increase, while the ecosystem needs to be on board too. This is equally true for the 8MB limit as for the 32MB limit and any other. Even this proposal would not change that simple basic fact. This is the market needing tools to coordinate. Here, too, I’d love to hear whether the current tools are not sufficient for the job. Last I met Jihan, he made it clear that they simply pick up the phone; I’m sure this evolves over time and stuff moved to WeChat or something. Is there really a need to have our attention on that?

The super important part to remember is that for the health of Bitcoin Cash we need to ensure that no Bitcoin Cash implementation that the majority of the network uses can block a change, limit a change, or force a blocksize-limit change.
The 1MB limit is a simple technical limit to remove, but because it could not be removed by the market, only by “consensus” of some core developers, the entire protocol could get hijacked. Which it did. History we should avoid repeating, no?

The market-driven part of our ecosystem already has that power; they can pick any of the various ways they want to do this, from miners signalling EB in coinbases to people picking up the phone.

This leads to the inevitable:
What actual problem is being solved?
Given the basic fact that the market today could simply start mining 50MB blocks without any of the devs being able to stop it, I think the power balance is proper. There is no problem to solve. If anyone disagrees, please make your case.

To make my point clear: any protocol change (hard fork) that moves the decision on blocksize out of the hands of the market (where it is today; this is not a debated point) and into the software moves power away from the market and into the control of software people. Software people can get corrupted, as we have seen in BTC. As such, any such movement of power is inherently dangerous, enables capture, and can as a result lead to the death of our coin.

The miners I talked to seem to be quite OK with the EB message in the coinbase that does this job today, improvements can be made without consensus changes. The size limit is today entirely controlled by the market (which the quote calls miners).

As a closing word: not sure how many people here know their economics, but it’s fun to realize that the push here for more software-based control (as opposed to free-market control) over the means and quantity of producing some product into the far future is very much akin to interest-group control over the production of things like oil. With that in mind, it’s fun to dig in on the real reasons why the gas price is skyrocketing.

Anyway;
Austrian economics is very clear about the choice to intervene when the option exists to let the market figure it out. The market is healthier when interventions are avoided.

And how does the ecosystem come on-board? By running software that is ready to accept blocks of size up to X, so it doesn’t end up on the wrong fork by rejecting chains that contain blocks >X. Simplest would be for all non-producing nodes to completely remove the blocksize limit and let producers soft-cap it to whatever they want, but then you need two versions of node software, or two configs, and people more often don’t RTFM than they do, so it’s not hard to imagine how it could go sour thanks to sticky defaults. The problem with this is coordination: what if miners can churn out 1GB blocks without trouble and decide overnight to just do it, and then block explorers, exchanges etc. are caught off-guard and their software starts breaking? Or we find that not all non-producers configured their software to remove the limit, so they get stuck on a chain with no block production.

With this proposal, the whole ecosystem would agree to use a formula (parameters of which are in sole control of miners) to make these adjustments happen in a predictable way. Why don’t miners phone each other to decide the next difficulty? Because we feel more comfortable leaving it to the DAA. “Oh no, the blocks are coming in too fast, quick, vote with coinbase to adjust it!!”

Note that DAA was upgraded successfully to something rather good, and it required ALL nodes to be on-board. I don’t see why a similar approach to blocksize cap couldn’t be another success.

The Bitcoin Cash ecosystem has been aware of how to do upgrades properly for around a decade, and this is a solved problem. It is naturally true that someone may be so foolish as to try to push a 1GB block to mainnet, but luckily this is quickly caught and nothing bad happens.

But it is good to see actual thinking of problems that need solving before we come with a solution. That is a step in the right direction.

As far as I understand, the topic here is just a discussion topic / suggestion and not a decision. At some point during this discussion someone might want to make a decision, at which time I will respond with: “Show me the stakeholders, impact analysis, clear specification, etc.” That is: show me the CHIP.

Then, such a person would need to either convince those in power to do something different, or accept that, as part of the CHIP process, they need to reach out to and get agreement from “the market”: the people who are impacted by the change. And even after doing so, and even if they end up with full absolute consensus among the stakeholders they reach out to, it still comes down to the miners accepting or not accepting the outcome.

This discussion though, is a first step to something, that may or may not be a good idea. I say we let the market decide on that though, and encourage the participants to make a CHIP, identify the stakeholders, contact them and get their feedback, iterate on their proposal until it becomes clear that it will or won’t work out.

Talk is cheap, and good - even when controversial - as long as we have high social expectations on agreeing on where to go.

@im_uname One more thing to ponder. Medians are good when you want to filter out extremes that would break averages. But here the extremes would be bounded by 10x the median, so maybe a simple moving average would be better, because it would be smoother while the “pull” of extremes would have less impact. Medians can still produce a shock… imagine mining 1,1,1,1,10,10,10,10,10: the max stays at 10 until it suddenly jumps to 100 once the 10s outnumber the 1s, and then you start mining 100s to make another 10x jump. An average would’ve lifted it to 90, and it would’ve arrived there smoothly.
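The shock is easy to reproduce. Below, a 10x-median cap is compared with a 10x-mean cap over that exact sequence (Python; median_low picks the lower middle element so even-length prefixes don’t average the two middles):

```python
from statistics import mean, median_low

sizes = [1, 1, 1, 1, 10, 10, 10, 10, 10]

# Cap after each block = 10x the median (or mean) of all blocks so far.
median_caps = [10 * median_low(sizes[:i + 1]) for i in range(len(sizes))]
mean_caps = [round(10 * mean(sizes[:i + 1]), 1) for i in range(len(sizes))]
```

The median-based cap sits at 10 for eight blocks and then jumps straight to 100, while the mean-based cap climbs in steps (28, 40, 48.6, 55, 60) instead.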

Another point is that averages can be accumulated, while medians require a bigger state to track and more operations to calculate. To update the average you can do something like new_average = (old_average * (CONSTANT - 1) + new_value) / CONSTANT, and it can be recorded in the coinbase script so you don’t need old blocks to calculate the next average, and so on. So really I think we should look into past work on the DAA to find the ideal function, and here we can start talking about what we want from it, like:

  • It should rate-limit the max. increase of the cap over the course of a year, 10x / year? This would allow node operators to see increases from a mile away and prepare for it (buy more storage, CPUs, etc…)
  • It should remember old maximums, so we don’t have a situation where we fall down from 10 to 1, and then have to work for a year just to bring it back up to 10 when we already know that the network gave the signal that it can handle 10s.
  • ??
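For illustration, the accumulator update described above in minimal Python (CONSTANT is an illustrative smoothing window, not a value from any proposal):

```python
CONSTANT = 144  # illustrative: weight roughly equivalent to a day of blocks

def update_average(old_average, new_value, constant=CONSTANT):
    """One step of the exponential moving average described above.
    Only the previous average needs carrying forward (e.g. recorded
    in the coinbase); no window of old block sizes is required."""
    return (old_average * (constant - 1) + new_value) / constant
```

Feeding it a constant stream converges smoothly toward that value, which is the kind of shock-free “pull” discussed above.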

Here’s a shot at finding the function

Algorithm Description

Every mined block will record the blocksize_limit state, which will limit the maximum block size of the next block.
First post-activation block’s blocksize_limit will have the pre-activation limit of 32MB.
Every subsequent block must update the blocksize_limit to either:

  1. previous_blocksize_limit or
  2. previous_blocksize_limit * GROWTH_FACTOR,

where the case will be decided by the actual size of the mined block.
If the blocksize is above some threshold, then the limit MUST be increased.
The threshold is defined as threshold_blocksize = previous_blocksize_limit / HEADROOM_FACTOR.

Proposed Constants

HEADROOM_FACTOR = 4
GROWTH_FACTOR = 1.00001317885378

The growth factor is chosen to rate-limit maximum growth to 2x/year and, to avoid floating-point math, is rounded to the maximum-precision fraction of two 32-bit integers (4294967295/4294910693). The proposed GROWTH_FACTOR to the power of (365.25 * 144) is 2.00000649029968.
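Both numerical claims can be sanity-checked in a few lines of Python (values copied from above):

```python
GROWTH_FACTOR = 1.00001317885378
BLOCKS_PER_YEAR = 365.25 * 144  # 52596 blocks

# Compounding the per-block factor over a year lands just above 2x.
yearly = GROWTH_FACTOR ** BLOCKS_PER_YEAR

# The 32-bit integer fraction approximates the factor very closely.
fraction = 4294967295 / 4294910693
```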

Pseudo-code

if (this_block.height <= ACTIVATION_HEIGHT) {
    if (this_block.size > 32MB)
        fail();
    if (this_block.height == ACTIVATION_HEIGHT)
        if (this_block.blocksize_limit != 32MB) // verifies initialization of the new coinbase field
            fail();
}
else {
    if (this_block.size > previous_block.blocksize_limit)
        fail();
    threshold_block_size = previous_block.blocksize_limit / HEADROOM_FACTOR;
    if (this_block.size > threshold_block_size)
        this_block.blocksize_limit = previous_block.blocksize_limit * GROWTH_FACTOR;
    else
        this_block.blocksize_limit = previous_block.blocksize_limit;
}
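The same rule as runnable Python, for experimentation (sizes in bytes; 32MB is taken as 32,000,000, an assumption since the pseudo-code doesn’t pin the byte value):

```python
HEADROOM_FACTOR = 4
GROWTH_FACTOR = 1.00001317885378

def next_limit(prev_limit, block_size):
    """Validate a block against the previous limit and return the new
    blocksize_limit: unchanged if the block is at or below the headroom
    threshold (prev_limit / HEADROOM_FACTOR), bumped by GROWTH_FACTOR
    if it is above it."""
    if block_size > prev_limit:
        raise ValueError("block exceeds previous blocksize_limit")
    threshold = prev_limit / HEADROOM_FACTOR
    if block_size > threshold:
        return prev_limit * GROWTH_FACTOR
    return prev_limit
```

Starting from 32,000,000 bytes, a block of 8,000,100 bytes bumps the limit to about 32,000,422, i.e. the 32 to 32.0004MB step described under “Effect”.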

Effect

The above means that mining a block of 8.0001MB is enough to permanently increase the limit from 32MB to 32.0004MB; next, mining a block of 8.0002MB is enough to permanently increase it to 32.0008MB, and so on. If a block of 7.9999MB (or an empty block) is mined, the limit stays the same, so miners can soft-cap their blocksizes below the threshold to prevent the limit from rising. On the other hand, if all blocks are mined at any size above the threshold and below the limit, then the limit increases at the maximum rate, capped at 2x/year.

The blocksize limit will then be a function of how many blocks were mined above threshold since activation:

32MB * power(GROWTH_FACTOR, number_above_threshold).

YOY increase is then given by:

power(GROWTH_FACTOR, proportion_above_threshold * (365.25 * 144)).
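A quick check of this formula against a concrete case: with 10% of blocks above threshold, YOY growth is about 2^0.1, i.e. 7.18%:

```python
GROWTH_FACTOR = 1.00001317885378
BLOCKS_PER_YEAR = 365.25 * 144

def yoy_factor(proportion_above_threshold):
    """Year-over-year growth factor of the limit when the given
    proportion of a year's blocks lands above the threshold."""
    return GROWTH_FACTOR ** (proportion_above_threshold * BLOCKS_PER_YEAR)
```

yoy_factor(0.10) comes out at about 1.0718 (a 7.18% increase), and yoy_factor(1.0) gives the full ~2x.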

Example scenario

| Year | Year Open Threshold Blocksize (MB) | Year Open Blocksize Limit (MB) | % of Blocks With Blocksize Above Threshold | Year Close Threshold Blocksize (MB) | Year Close Blocksize Limit (MB) | YOY Increase |
|------|-----------------------------------:|-------------------------------:|-------------------------------------------:|------------------------------------:|--------------------------------:|-------------:|
| 2024 | 8.00 | 32.00 | 0.00% | 8.00 | 32.00 | 0.00% |
| 2025 | 8.00 | 32.00 | 10.00% | 8.57 | 34.30 | 7.18% |
| 2026 | 8.57 | 34.30 | 20.00% | 9.85 | 39.40 | 14.87% |
| 2027 | 9.85 | 39.40 | 30.00% | 12.13 | 48.50 | 23.11% |
| 2028 | 12.13 | 48.50 | 40.00% | 16.00 | 64.00 | 31.95% |
| 2029 | 16.00 | 64.00 | 50.00% | 22.63 | 90.51 | 41.42% |
| 2030 | 22.63 | 90.51 | 60.00% | 34.30 | 137.19 | 51.57% |
| 2031 | 34.30 | 137.19 | 70.00% | 55.72 | 222.86 | 62.45% |
| 2032 | 55.72 | 222.86 | 80.00% | 97.01 | 388.03 | 74.11% |
| 2033 | 97.01 | 388.03 | 90.00% | 181.02 | 724.09 | 86.61% |
| 2034 | 181.02 | 724.09 | 100.00% | 362.05 | 1,448.18 | 100.00% |
| 2035 | 362.05 | 1,448.18 | 100.00% | 724.09 | 2,896.37 | 100.00% |
| 2036 | 724.09 | 2,896.37 | 0.00% | 724.09 | 2,896.37 | 0.00% |
| 2037 | 724.09 | 2,896.37 | 0.00% | 724.09 | 2,896.37 | 0.00% |
| 2038 | 724.09 | 2,896.37 | 100.00% | 1,448.19 | 5,792.76 | 100.00% |
| 2039 | 1,448.19 | 5,792.76 | 0.00% | 1,448.19 | 5,792.76 | 0.00% |
| 2040 | 1,448.19 | 5,792.76 | 0.00% | 1,448.19 | 5,792.76 | 0.00% |
| 2041 | 1,448.19 | 5,792.76 | 25.00% | 1,722.20 | 6,888.80 | 18.92% |
| 2042 | 1,722.20 | 6,888.80 | 0.00% | 1,722.20 | 6,888.80 | 0.00% |

“This proposal is basically BCH’s ‘we shalt never let blocks get consistently full’ mission engraved into the most conservative possible automation regime, and assumes ‘if we move slow enough, we can make software catch up’. It further makes the assumption that ‘if we cannot do that, BCH has failed’, so assuming software will catch up to a slow enough expansion is a reasonable assumption. Whether that is actually reasonable is also subjective.”

I think it’s perfectly reasonable and would create a minimal impact to the network, building confidence in Bitcoin Cash as a serious currency for the world stage.

This is your friendly regular reminder that the blocksize limit was removed from the consensus rules in August 2017, with the following statement in the original BCH spec:

the client shall enforce that the “fork EB” is configured to at least 8,000,000 (bytes)

Notice the word configured here: this is a user-configured size, very different from a consensus rule. The term EB backs that up, because this is the BU-invented abbreviation for “Excessive Block”: the user-configurable size above which blocks are deemed too big (excessive).

(Notice the same “EB” you can find in any network crawler; in a later upgrade that 8MB default setting was changed to 32MB.)


People sometimes still talk about us needing a protocol upgrade for block-size limits in one form or another. Please realize that any rule we add to manage the block-size limit makes Bitcoin Cash less free. It makes the block size limit again open to consensus discussions, and such proposals can lead to chain forks based on such discussions.

Bitcoin Cash has a block-size limit that is completely open to the free market with size limits being imposed by technical capabilities and actual usage only.

I vastly prefer the open market deciding the limits, which is what we have today. I hope this is enough for others too.

I’m going to paraphrase what you said to clarify my understanding of what you wrote, especially since you took the time to write a detailed opinion and I’ve always valued your input.

Are you saying that block producers (from any source) should coordinate with one another on configurations, and/or make uncoordinated adjustments to their blocksize configuration?

It’s common knowledge that I’m a miner, and this influences a lot of my opinions. One of them is that I do not want some “miner council” or some summit or whatever to come up with configurations. Generally, it is good to keep mining decentralized and not have any cartels or whatever. This is sort of an external factor, outside the protocol, that guarantees security.

I’m not totally against what you are saying (as I understand it; feel free to correct me), but I’m not exactly convinced. However, I do share the sentiment to not touch the protocol. It does seem like there is always a change to the protocol; this is one of those changes I could actually go for as of now.

Best Regards

There are many ways to ensure a steady block size increase. We should distinguish between two ways of coordinating that, though. One is any method that requires asking the software developers in order to make changes. The other is any method that does not require consensus from the software developers on such decisions at all.

The initial 1MB rule was a rule that the software developers needed consensus on, and a great demonstration of lock-in. I think we were lucky that this was a simple hard rule: we either hit the boundary, or we don’t. We were lucky because it showed the special interest and leverage created for developers instantly and clearly.
Rules that give the miners mostly what they want (but at a cost) make it harder to see the special interests created for developers, but in such cases the leverage still is there, and it makes the coin less free.

I think there is a challenge available: get a good blocksize limit set by the market while keeping the cost low. The original idea from BU was to just bluntly create bigger blocks (multiple in a row) and hope they would not get orphaned, and the cost of this was seen as too high by the market. So while this was a method that does not require software developers to create consensus, it was not acceptable either.

The problem should not be too hard to solve, but it is unlikely that miners would be putting a lot of time and effort into it while their hardware lifetime is probably longer than the lifetime of the current block-size-limit.

Here are some quick ideas that avoid the lock-in problem by not requiring developers to agree:

  • The idea of this topic (asymmetric moving maxblocksize based on median) can be implemented in a completely advisory manner, where the miners adjust their max-block-size (EB) on such advice, and a simple check of the block headers can show that it is safe to increase the created blocksize.

  • Miners can use the block headers to ‘vote’ for wanting a bigger size, or for limiting the blocksize via the ‘EB’ parameter, giving a relatively pure communication channel.

  • Software devs could increase the default EB in their software as it gets better and miners could just ignore this setting and realize at one point that they can safely increase the blocksize.

  • Miners could coordinate and schedule a block-size-increase by picking up the phone and announcing the changes on major channels. Much like the software devs plan the yearly upgrade.

I’m sure some way can be found that allows the blocksize to be safely increased on a regular basis, possibly via ideas I have not thought of.

The important part is that we have two variables: the accepted blocksize and the created blocksize. All that needs to happen is for the accepted size to be increased by all miners well ahead of any single miner increasing its created size, in order for the cost of the upgrade to be basically zero.
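To illustrate the two knobs concretely: in BU-derived node software the accepted size and the created size are already separate settings. Roughly (option names as used by BU/BCHN-style clients; values purely illustrative):

```ini
# bitcoin.conf (illustrative)
# Accepted size (EB): reject blocks larger than this.
excessiveblocksize=256000000
# Created size: never produce a block larger than this.
blockmaxsize=8000000
```

Raising the accepted size across the network first, and the created size only later, is exactly the zero-cost ordering described above.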

Okay, Tom, let me get back to you.