Rolling minimum fee mempool policy with decay

If we don’t create any tools, obviously miners will create them.

But the tools miners create will be

  • Hidden
  • Not open source
  • Different for each pool
  • Not standardized

Which is generally bad: it tends to create chaos in the ecosystem when every miner uses a completely different set of tools to determine how to make his profit.

For a solution, I particularly like your idea with 2 separate mempools (1: “master” mempool for relaying and 2: “slave” mempool for including in blocks) and some algorithm that decides which transactions get moved from 1 to 2. It would make sure that even if a transaction does not get mined, it gets relayed to another miner that may want to mine it.

An idea is, the rules for moving transactions between 1 and 2 could be defined in some commonly used scripting language, Lua maybe, and be a separate “engine” of sorts, so that:

  • It gets easier for miners to fine-tune their inclusion policies
  • It makes it easier for developers to create a few standard types of rules and distribute them as Lua, or
  • It makes it easier for developers to make a few sets of “default” choices for miners, depending on what transaction pricing schemes they find desirable
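
The policy-engine idea above can be sketched in a few lines. This is a hypothetical illustration (the `Tx` type, function names, and rates are all made up, and Python stands in for Lua): a pluggable predicate decides which transactions move from the relay (“master”) mempool to the block-building (“slave”) mempool.

```python
# Hypothetical sketch of a pluggable "mempool 1 -> mempool 2" policy,
# the kind of rule a miner could ship as a small script.
# All names (Tx, block_inclusion_policy, ...) are illustrative,
# not an existing node API.
from dataclasses import dataclass

@dataclass
class Tx:
    size: int  # bytes
    fee: int   # satoshis

def block_inclusion_policy(tx: Tx, min_fee_rate: float = 1.0) -> bool:
    """Move tx from the relay ("master") mempool to the
    block-building ("slave") mempool iff it pays the floor rate."""
    return tx.fee >= tx.size * min_fee_rate

# A pool could swap in a different policy, e.g. a discounted floor:
def discounted_policy(tx: Tx) -> bool:
    return block_inclusion_policy(tx, min_fee_rate=0.5)

print(block_inclusion_policy(Tx(size=250, fee=250)))  # True: pays 1 sat/byte
print(block_inclusion_policy(Tx(size=250, fee=100)))  # False: below the floor
```

A pool could distribute any such predicate as a small script and swap it out without touching node code.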

For now, but at scale and in aggregate it would not be. At 100 MB blocks those fees would add up to 1 BCH per block, which ought to be enough for the security budget, especially since at that level of utilization the price could be $100k.
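
The arithmetic behind the 1 BCH figure, assuming 100 MB blocks fully paying the 1 satoshi / byte floor:

```python
# Fees per block for 100 MB of transactions at the 1 sat/byte floor.
SAT_PER_BCH = 100_000_000
block_bytes = 100 * 1_000_000
fee_rate = 1  # sat/byte

fees_bch = block_bytes * fee_rate / SAT_PER_BCH
print(fees_bch)  # 1.0 BCH per block
```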

Ok, but why would Travala accept BCH if they couldn’t just as easily split it and use it to pay some of their smaller bills? It’s all connected; if you try to pick out this or that case as more valuable, you’re guaranteed to be wrong, because you’re just 1 mind and the market is a hivemind made of millions.

Infrastructure cost of any 1 TX is a tiny tiny cost on the whole. A rounding error. See what I did there? :smiley:


100MB blocks filled with economic-activity transactions are sure as [censored] going to be moving more value than the fees. They’d better, otherwise the people will move elsewhere.

I feel like you’re not understanding the difference between BCH the chain and BCH the token.

The token is the one that gains value with usage. The chain is just supporting that.

See BSV: immense usage of the chain. If you followed your logic, it would be immensely valuable.
Instead, BSV the token has no real-world usage, therefore the whole is not valued highly.

On BCH you can have 20 MB of memo (the messaging service) transactions and 1 MB of p2pkh transactions (that are spending older coins).

The ONLY value of the whole comes from the latter.

Edit: let’s go into detail about WHY.
Let’s assume those memo messages are paying massive fees and thus miners like mining them, dropping all economic transactions.
But those miners get paid in BCH the token. They can’t actually benefit from the massive fees if there is no utility to BCH the token; nobody will pay much for it. See BSV again.

We’re not to be confused with VISA, which charges a percentage of the transferred amount; they are a middleman that makes money. They gain value that way.
The possible confusion is twofold:

  1. BitcoinCash gives away, for next to nothing (the fees), the infrastructure that VISA charges for.
  2. BCH the token gains value from usage, but VISA doesn’t have such a coin. It just supports fiat currencies.

Not really… If bored, read some Rothbard.

PS.

This just got to me guys:

If we create standardized sets of Lua code that determine the TX fee rules used by miners, these rules could potentially be used by wallet creators too, ensuring the ecosystem stays in sync and everybody is on the same page.

This would be great for synchronicity, achieving “fee consensus” and removing many errors and misunderstandings.

What’s more, miners could also publish their fee policies globally as Lua to make it clear to wallet creators what they expect.

In turn, wallet devs can use these publicly accessible sets of rules to determine their TX sending fee policies.


I think y’all are overthinking this. No rube-goldberg policy can solve this problem:

And the problem is there because we can’t have infinite mempool and infinite blocksize. It can always happen that it starts overflowing and people need to adjust:

  • Wallets should get smarter about their fees, like start using the Electrum get_fee_histogram API to alert the user in case the mempool situation is not what they’re used to.
  • What @tom has been saying: wallets should take some responsibility and track the “last mile delivery” of a TX, don’t just broadcast and forget. Keep watching until it gets 1-conf, and re-broadcast if/when needed.
  • Recipients can do the same: cache inbound TXs and rebroadcast them if they see the TX is not getting confirmed when it should’ve been.
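
The first bullet can be made concrete. The Electrum protocol’s mempool.get_fee_histogram returns a list of [fee_rate, vsize] pairs sorted from highest to lowest fee rate; the helper and sample histogram below are illustrative, not part of any wallet’s actual code.

```python
# Sketch of interpreting a mempool.get_fee_histogram result:
# [fee_rate (sat/byte), vsize (bytes)] pairs, highest fee rate first.
def fee_to_be_within(histogram, target_bytes):
    """Lowest fee rate (sat/byte) that places a tx within the
    first `target_bytes` of the fee-sorted mempool."""
    total = 0
    rate = histogram[0][0] if histogram else 1.0  # fall back to the floor
    for fee_rate, vsize in histogram:
        total += vsize
        rate = fee_rate
        if total >= target_bytes:
            break
    return rate

# Made-up sample histogram: 200 kB at 4 sat/byte, 500 kB at 2, 9 MB at 1.
sample = [[4.0, 200_000], [2.0, 500_000], [1.0, 9_000_000]]
# To be within the first 600 kB of a block template:
print(fee_to_be_within(sample, 600_000))  # 2.0
```

A wallet could compare the result against its usual default fee and alert the user when the mempool is busier than normal.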

Also, isn’t min. fee supposed to be like training wheels anyway? Once blocks get big enough for orphan rates to start mattering, miners will implement their own fee floors.

I touched on this in ABLA CHIP:

When block capacity is underutilized then the opportunity cost of mining “spam” is less than when blocks are more utilized. We will here define “spam” as transactions of extremely low economic value, but how do we establish value when value is subjective? The fee paid is a good indicator of economic value: if someone is willing to pay 1 satoshi / byte in fees, then it is a proof that the transaction has economic value (to that someone, subjective value) greater than the fee paid. Consider the current state of the network: the limit is 32 MB while only a few 100 kBs are actually used. The current network relay minimum fee is 1 satoshi / byte, but some mining pool could ignore it and allow someone to fill the rest with 31.8 MB of 0 fee or heavily discounted transactions. The pool would only have increased reorg risk, while the entire network would have to bear the cost of processing these transactions.

The limit can therefore also be thought of as minimum hardware requirements, or minimum cost of participation in the network. Some selfish pool could increase the cost of participation for everyone - much before the network reaches “escape velocity” of utilization, and in doing so reduce chances of network success by increasing adoption friction.

What do we mean by “escape velocity”? It would be a situation where many businesses would be running their own infrastructure in order to securely participate in a high-value blockchain network, and it would just be a business cost easily offset by their earnings. If the network would succeed to attract, say, 20 MB worth of economic utility, then it is expected that a larger number of network participants would have enough economic capacity to bear the infrastructure costs. Also, if there was consistent demand of 1 satoshi / byte transactions, e.g. such that they would be enough to fill 20 MB blocks, then there would only be room for 12 MB worth of “spam”, and a pool choosing 0 fee over 1 satoshi / byte transactions would have an opportunity cost in addition to reorg risk.
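
As a sanity check on the numbers in the quoted passage (32 MB limit, 20 MB of 1 satoshi / byte demand):

```python
# Back-of-envelope numbers behind the quoted opportunity-cost argument,
# assuming a 32 MB limit and 20 MB of 1 sat/byte paying demand.
SAT_PER_BCH = 100_000_000

limit_mb = 32
paying_demand_mb = 20                        # hypothetical paying demand
spam_room_mb = limit_mb - paying_demand_mb   # room left for "spam"

# Fees forgone if a pool fills paying space with 0-fee txs instead:
forgone_sat = paying_demand_mb * 1_000_000 * 1  # 1 sat/byte
print(spam_room_mb)                   # 12 MB of room for "spam"
print(forgone_sat / SAT_PER_BCH)      # 0.2 BCH opportunity cost per block
```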


It is good that nobody was talking about a full mempool problem, then.

The rest of us were talking about full block problems. (you can fit a lot of blocks in one mempool)

Why are you writing weird, off-topic long posts (3 today alone) that are frankly all over the place, while ignoring answers to your points?

In other words, putting fingers in ears and shouting LA LA LA.

WTF man. I was starting to like you.

First you say we are overthinking this, and then you say:

Basically, none of what you said has any conflict whatsoever with what I have said.

Both can be done.

But I am unsure whether what you are saying, that “wallets should take some responsibility”, is not much more troublesome than what I propose.

So, you’re asking all wallets in a decentralized ecosystem with no central planning to do “this” and “this”, “somehow”. You’re saying “wallet creators, be wiser”. This is too generic and ambiguous.

Well, I am not sure this is going to work out better than just having a set of public Lua fee-inclusion policies. Wallets can just download them and use them “as is”.

The problem here is that every wallet will figure out on its own what works - meaning it will be chaos.

Having a set of rules standardized and publicized will create more clarity and less chaos.


Wallets can just query the already available get_fee_histogram API and use median “as is” as the default fee.

There’s already a set of rules standardized and publicized:

  • (1) 1 sat / byte is the fee floor, nothing below will get accepted by most (if not all) nodes
  • (2) 1st seen rule
  • (3) dynamic fee floor & mempool eviction, if the mempool grows bigger than a few blocks’ worth - admittedly this one wasn’t publicized well; I learned about it the other day, around the same time as @_minisatoshi, who wanted to open a topic about it, and then I found this existing one and directed him to it

Instead of coming up with even more rules and overhauling the whole system (what problem would that solve?), how about first figuring out the implications of current rules? Maybe all that’s needed is more standardization and publicization of existing rules, and a tweak here or there.

Many people don’t know about (3), so we’re doing the publicizing now.
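
For readers discovering rule (3): here is a rough model of a rolling minimum fee with decay, loosely based on the behavior of Bitcoin Core-derived nodes, where the floor jumps on eviction and then halves every 12 hours. The constants and names are illustrative, not the exact node defaults.

```python
import math

# Rough sketch of the dynamic fee floor in rule (3): after an
# eviction the floor is set above the evicted rate, then decays
# exponentially, halving every HALFLIFE seconds.
HALFLIFE = 12 * 60 * 60  # 12 hours, in seconds (illustrative)

def rolling_fee_floor(floor_at_eviction: float, seconds_since: float) -> float:
    """Fee floor (sat/byte) `seconds_since` seconds after the last eviction."""
    return floor_at_eviction * math.exp(-math.log(2) * seconds_since / HALFLIFE)

print(rolling_fee_floor(4.0, 0))         # 4.0 right after eviction
print(rolling_fee_floor(4.0, HALFLIFE))  # ~2.0 one half-life later
```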

Ideas for some tweaks to existing rules:

  • Tweak to (1): reduce min. fee to 0.5 sat / byte now, and commit to halving the min. fee with block rewards halvings, so next halving we’d reduce to 0.25 sat / byte.
  • Tweak to (3): not really a policy change but rather an implementation change: when TXs are evicted from the mempool, dump them to disk and store them for later. When the dynamic fee floor decays, try to load them back into the mempool (if they’re still valid) and rebroadcast. This would help users whose TX got evicted during whatever rush hour.
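
The proposed tweak to (1) is simple enough to write down; the function name and schedule below are just a sketch of the proposal, not anyone’s implementation:

```python
# Sketch of the proposed tweak to rule (1): start at 0.5 sat/byte and
# halve the relay minimum fee at each block-reward halving.
def min_relay_fee(halvings_elapsed: int, start: float = 0.5) -> float:
    """Minimum relay fee (sat/byte) after a number of further halvings."""
    return start / (2 ** halvings_elapsed)

print(min_relay_fee(0))  # 0.5 now
print(min_relay_fee(1))  # 0.25 after the next halving
print(min_relay_fee(2))  # 0.125 after the one following
```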

Implication of dynamic fee floor is that it’d be possible to attempt double spend of low fee TXs by:

  1. Broadcasting some TX with min. fee
  2. Flooding the mempool to get it evicted
  3. Broadcasting an alternative version of the TX, which would not be rejected by rule (2) because the 1st TX will have been evicted already

Note that the attempt would still cost the attacker at least 0.32 BCH / attempt, so min. fee 0-conf could still be OK for payments smaller than that. Strategies relying on DSPs could take fee into account when evaluating risk.
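
Where the 0.32 BCH / attempt figure comes from, assuming the attacker must fill a ~32 MB mempool with 1 satoshi / byte transactions to trigger eviction of the target TX:

```python
# Minimum cost of the eviction-based double-spend attempt above,
# assuming a ~32 MB mempool and the 1 sat/byte relay floor.
SAT_PER_BCH = 100_000_000
mempool_bytes = 32 * 1_000_000
fee_rate = 1  # sat/byte

attack_cost_bch = mempool_bytes * fee_rate / SAT_PER_BCH
print(attack_cost_bch)  # 0.32
```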

You can talk about adding more rules, taking into consideration things like coindays or number of UTXOs destroyed / created, but wouldn’t that only make it harder for everyone to evaluate risk and know what to expect in case of some burst of TXs filling the mempool?

It’s literally the topic here, recall the opening post:

OK I’m sorry for calling it rube-goldberg, no need to get all touchy about it.


This isn’t entirely accurate. There was no minimum fee beforehand; we put that minimum fee in, so calling it a discount isn’t really accurate. And I doubt you’ll find any (at least OG) miner that would prefer a few txs with high fees to many txs with low fees.

Overall – I don’t think we should be implementing any real filters on transactions. At most, obvious 0 economic activity transactions (like respending the same UTXOs over and over and over thousands of times per block) could be filtered, but again, leave that to the miners. They are more than capable of deciding if they want to filter that out themselves. We don’t need to do it for them.

My point here is that there is no way to predict what is an economic transaction versus what is not. Yes, we can make very accurate guesses right now, but there is no way we can see the future with certainty.

As I explained in my other post about exploring the future of tx fees replacing the block reward, this can be SIGNIFICANT. Who cares about right now? Any additional fees to miners are good. Even if they are small.

If we think this is an issue, then Bitcoin has largely failed as an idea, imo. Bitcoin runs primarily off of incentives. It was devs with powerful backers that fucked up BTC. But in its natural state, Bitcoin and all parties are held accountable and will do what is in their best economic interest. Who cares if these tools are different. Miners can choose to do as they please, as always has been the point.


This reads like we’re actually fully in agreement, but you seem to have some confusion.

Nobody is filtering. No filtering is proposed.

The entire idea is to create tools for the miners: decentralized decision-making and free-market choice. Nobody is filtering, and miners are the ones who decide how they want to sell their blockspace. They decide. “We’re” not doing any filtering or ranking or anything.

I hope that clears up the page I posted; you might want to take a look at it again. The entire premise (as detailed in the first part) is to move over to the free market what is today really centrally planned (due to a lack of tooling, I might add).