Shooting Gallery: Inject SLP into consensus, 100% compatibility with current ecosystem

There is no question that there are alternatives. The question is whether allowing apps to continue with 100% compatibility, without fucking around with their network effect, is possible, and how intolerable the pain of making that possible would be.

This is an infant version of the questions that would have to be asked when there is larger adoption. The BCH ecosystem needs to become much better at steel-manning its proposals. At the risk of being rude, I would roughly summarize the current approach as “it’s easy/cheap to change at the node level, therefore it’s easy/cheap to change”.

1 Like

This is outside the scope of the discussion. I respectfully request that you start a different thread if you would like to discuss alternatives. This post is attempting to concretely explore the pain of a 100% SLP compatible consensus upgrade.

As far as I can tell there are three arguments against an SLP-consensus. One of them is quantitative and should be weighed with nuance against the benefits of keeping the ecosystem intact with no additional effort. One is qualitative and should be evaluated on whether it is a dealbreaker by itself. The last one is psychological/marketing.

  1. Quantitative: SLP is inefficient in terms of size (bandwidth, storage, scripting, fees) and implementation/maintenance/security complexity (from a full-node perspective) compared to a system of any kind that is “designed for consensus”. Token-UTXO-related OP_RETURNs may need to be marked unprunable (and only become prunable once all related token UTXOs are spent), for example; not to mention that processing quantities requires OP_REVERSEBYTES. Note that there are no qualitative scaling differences compared to other “encumber a set of UTXOs” schemes: SPV clients can still fetch a tx and its merkle proof to prove validity without the whole graph, as long as said tx comes after SLP rules start to be enforced.

  2. Qualitative: An SLP-consensus is inflexible and may need further consensus changes every time someone wants a new token-related feature. Note that this is a trait shared with other specialized schemes like GROUP, and unlike generalized schemes e.g. PMv3 or the current nonconsensus status.

  3. Psychological: Nonconsensus SLP, while having persisted for a while with reasonable popularity, suffered from poorly implemented and maintained software for a long time. Badger, SLPDB, and EC-SLP all had their share of notorious troubles, contributing to the perception that “SLP is insufferable and we must fix it by ditching it” - when in fact most of those troubles were due to software quality rather than anything fundamental in SLP’s design. The perception is real, though, and many may rightly think a “fresh start” or “v2” of some sort will fare better in the minds of people.
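To make the byte-order point in (1) concrete, here is a minimal Python sketch of parsing a simplified SLP v1 SEND OP_RETURN. It is an illustration under stated assumptions, not a validator: it assumes every field uses a direct single-byte push opcode and skips OP_PUSHDATA variants and SLP’s strict push-rule validation. The key detail is that quantities are 8-byte big-endian integers, whereas Script’s number encoding is little-endian, which is why in-Script handling of them would lean on OP_REVERSEBYTES.

```python
import struct

def parse_slp_send(script: bytes):
    """Parse a simplified SLP v1 SEND OP_RETURN script.

    Sketch only: assumes every field uses a direct single-byte push
    opcode (0x01-0x4b) and skips OP_PUSHDATA variants and SLP's
    strict push-rule validation.
    """
    if not script or script[0] != 0x6A:            # must start with OP_RETURN
        return None
    pushes, i = [], 1
    while i < len(script):
        n = script[i]
        if not 0x01 <= n <= 0x4B:                  # only direct pushes handled
            return None
        pushes.append(script[i + 1:i + 1 + n])
        i += 1 + n
    # Expected fields: lokad id, token type, tx type, token id, amounts...
    if len(pushes) < 5 or pushes[0] != b"SLP\x00" or pushes[2] != b"SEND":
        return None
    token_id = pushes[3]
    # Quantities are 8-byte BIG-endian integers; Script numbers are
    # little-endian, which is why in-Script validation of them would
    # need OP_REVERSEBYTES before treating them as numbers.
    amounts = [struct.unpack(">Q", p)[0] for p in pushes[4:]]
    return token_id, amounts
```

A full-node implementation would of course have to enforce all of the spec’s push rules, which is part of the complexity cost described above.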

3 Likes

Thanks! These are definitely along the lines of thought that we need to explore more.

1. (long-term inefficiency) and 2. (long-term complexity) are IMO the biggest points I have seen. For better or worse, the short-term cost of supporting apps in switching over to something else is almost certainly less than the long-term cost to the whole network.

2. (long-term complexity) is IMO generally a point against all token schemes, including SLP-consensus, that effectively turn BCH into a chain-of-chains system and may dilute other critical efforts such as scaling.

1 Like

Are you absolutely sure?

Aren’t Group Tokens what “in consensus” means? They are mineable by normal miners and SPV-processable, and they require a network-wide consensus change, after all.

I mean I could totally be wrong because I am not an active developer so I may be misunderstanding, but AFAIK converting SLP tokens into GROUP tokens would be exactly that: “Injecting SLP into consensus”, but done in a different way.

So, if we consider that “injecting SLP into consensus” may mean something entirely different, this could change the reach of your question and the outcome of this entire discussion.

I think you misunderstood the goal of this specific topic. Where you are looking for solutions to a technical problem, we are instead trying to find out what the wider (ecosystem-wide, actually) effects are of keeping the current solution and just making it validated.

As I wrote above, the devs coming up with some new solution will seem to many like us moving on to the next shiny thing. And that is honestly something we should consider, because they would be right.

As a great author wrote, it’s easy to write the start of a novel: you can open all these great new ideas and threads. But the more threads you start, the harder it is to end the novel in a coherent way that ties them all together.

1 Like

You are probably right here; I always try to widen the view and explore all possibilities, which perhaps causes me to break out of context.

I did not mean to intrude and break your discussion, this is just how my brain is wired.

Perhaps I should create another topic.

There’s an undertone to these concerns as if we fear success. This may require a discussion of its own: do we even want consensus tokens? I disagree that consensus tokens could put BCH at scaling risk, and I believe that NOT having them is an existential risk for BCH - the risk of becoming irrelevant in the grand scheme of things. Sure, we can link BCH to tokens and smart contracts through side-chains like Moeing, but if we’re doing cash, then why shouldn’t we enable tokens to be brought back into the BCH chain when they’re to be used as cash? Complex smart contracts can be left to side-chains etc., but the BCH chain is perfect for cash-like uses. Bitcoin: a peer-to-peer electronic cash system.

1 Like

My comment was directed towards the non-functional aspects or infrastructure of the SLP software - not its features. I am a proponent of and have commercially adopted SLP. I support its capabilities being moved into the core protocol but not its implementation.

You just restated my very case using different words. I was pre-empting an argument that we should make SLP part of the core protocol because “we already have so much software” when, in fact, because of its poor situation in the non-functional realm, it would likely have to be abandoned and re-written anyway.

Sorry, that’s not what I am saying. I am talking about using the SLP spec directly, as-is. But I think we agree it would be some pretty gnarly technical debt to make that happen. Still, it would be better if we could find specific qualitative or quantitative descriptions of that debt, and that’s what I am looking for here.

1 Like

Here’s one: enforcing the SLP OP_RETURN pattern with consensus would break the assumption that any random byte pattern is allowed after the OP_RETURN. Imagine someone else storing document hashes in OP_RETURN. What are the odds that some hash happens to match some SLP pattern, triggering SLP consensus checking, which then fails a TX that should be perfectly valid from the document hasher’s point of view?

Maybe nobody is using it like that, so this would be purely theoretical - but how would we fix this problem if we considered it worth fixing?

Either all OP_RETURN users would have to add logic to avoid conflict with SLP consensus logic, like adding some non-conflicting prefix: OP_RETURN <prefix> <random data>

or SLP would have to use another opcode: one that would behave the same as OP_RETURN but would also trigger the SLP consensus logic, like OP_RSLP. This would require SLP users to change how they process TXes, so with this small change we already break out of “as-is”.

So, someone would have to change something down the line, and we could only choose who it would be – OP_RETURN users or SLP users. As you said, gnarly.
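For illustration, a prefix-only trigger could look like the Python sketch below. This is a hypothetical rule, not anyone’s proposal: the 6-byte prefix is OP_RETURN (0x6a), a 4-byte push opcode (0x04), and SLP’s Lokad ID b"SLP\x00". Any output whose script happens to start with those bytes would be dragged into SLP consensus validation.

```python
# Hypothetical prefix-only consensus trigger (sketch, not a real rule).
# Prefix = OP_RETURN (0x6a), push-4 (0x04), then SLP's Lokad ID "SLP\x00".
SLP_TRIGGER_PREFIX = b"\x6a\x04SLP\x00"

def triggers_slp_validation(output_script: bytes) -> bool:
    """Would this output be hard-identified as an SLP attempt?"""
    return output_script.startswith(SLP_TRIGGER_PREFIX)

# A document-hash user who pushes a raw 32-byte hash will normally have
# a length opcode (0x20) right after OP_RETURN, so it cannot collide.
# A collision needs the 5 bytes after OP_RETURN to equal 0x04 "SLP\x00":
# roughly 2**-40 odds for uniformly random data. Tiny, but nonzero.
```

Under such a rule the odds of an accidental match are vanishingly small for random payloads, but any OP_RETURN user whose format can produce those leading bytes would still have to engineer around it.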

3 Likes

Not sure if this is a reason against SLP-in-consensus, just asking in the same spirit: what would be the benefit of adding SLP to consensus over switching tokens to something like Tokeda?

TL;DR: Tokeda is so completely broken and poorly specified that it’s just fantasy land. We should not drag it in until major flaws are addressed and significant gaps are filled in.

It also breaks the principle that the spendability of a coin is determined by combining the unlocking script of the input with the locking script of the output: suddenly you also have to check not only another part of the child transaction, but even another otherwise unrelated output of the parent transaction. A separate OP_RETURN output is just not a logical place to store this data, and the use of this data has no straightforward interaction with the locking/unlocking scripts.
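That layering point can be sketched by contrasting the two validation shapes in toy Python. Everything here is a stand-in: `run_script` and `validate_slp_metadata` are placeholders for the real Script interpreter and SLP metadata checks, not any node’s actual API.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class Output:
    locking: bytes

@dataclass
class Tx:
    outputs: List[Output]

def run_script(unlocking: bytes, locking: bytes) -> bool:
    # Toy placeholder for full Script evaluation: "scripts matched".
    return unlocking == locking

def validate_slp_metadata(op_return: Output) -> bool:
    # Toy placeholder for SLP OP_RETURN validation.
    return op_return.locking.startswith(b"\x6a\x04SLP\x00")

# Plain Bitcoin model: an input's validity depends only on the spent
# output's locking script plus this input's unlocking script.
def input_valid(parent: Tx, vout: int, unlocking: bytes) -> bool:
    return run_script(unlocking, parent.outputs[vout].locking)

# SLP-in-consensus model: the same input's token validity additionally
# reaches into a *different* output of the parent transaction (the
# OP_RETURN at index 0), which neither script ever references.
def slp_input_valid(parent: Tx, vout: int, unlocking: bytes) -> bool:
    return (run_script(unlocking, parent.outputs[vout].locking)
            and validate_slp_metadata(parent.outputs[0]))
```

The second signature is the problem in a nutshell: spend validity stops being a pure function of the two scripts and starts depending on sibling-output metadata.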

5 Likes

Specific and great point!

Another nice and specific point. I have also thought about this one - what exactly would be the line at which a transaction gets hard-identified as an SLP attempt? Perhaps it is the prefix alone. It could be more sophisticated than that, but I could not find a way for it not to be a heuristic.

1 Like

The part of the scope that it violates is “100% compatibility with current ecosystem” which I tried to specify in slightly more detail in the description. If my SLP app today wouldn’t work 99% as-is on the consensus version of this idea, then it is out of scope.

Thanks for the info, my reply was unwarranted then. I will start a new topic.

I completely agree here. The big point for me is that changing the middleware without consideration of the current app-layer, as you described, adds significant cost to projects already using SLP.

This is easy to disregard if you are just focused on drawing-board projects that may be possible after SLP v2 (or whatever else) reaches consensus and launches next year, after May 2022, or if you don’t consider the fact that businesses have to spend/hire dev time to write new code whenever the backend requirements change significantly.

Imagine having to get Tether to reimplement the whole thing again on a new system, completely changing the APIs and libraries. Imagine not supporting SLP v1 and just deprecating it without a migration plan.

What would that say to businesses and users who have already invested money and time in SLP? Disregarding this is easy if you are excited by the new shiny thing, and there is no cost to you to drop legacy support for existing users.

The key certainly is to not affect the app-layer at all, ideally at almost zero cost to current SLP projects, and not to do this by breaking all the wallets that currently rely on SLP v1 middleware. A community-supported migration plan is a must.

It is also unreasonable to drop support for SLP v1 simply because something is coming in May 2022, which frankly, to businesses, is very far away right now. Unless you want to incentivize them to leave in the meantime.

(There is also a small chance that no superior token solution will reach production or get adopted - let’s say smartbch is riddled with problems - in which case you still want current SLP to fall back on. For all the challenges and issues it has at the moment, it is still a solution for token issuers. Therefore, it makes zero sense to me why advocates of successor proposals constantly want SLP v1 support to be dropped/deprecated immediately.)

I think I should also state the pragmatic reality we all know to be true here - that BCH devs are not exactly teeming with extra time to rewrite their software from the ground up.

Nobody is proposing this, AFAIK. SLP v1 is free to exist as long as it wants, and people are starting to think about migration strategies. It’s important to highlight that nothing is forced. Nobody needs to rewrite anything before they’re ready. The consensus layer is just an enabler; it provides building blocks - you don’t have to use the new ones if you already have something built with the old ones. However, we’re hoping people will want to use the new ones and bring more utility to BCH with their future solutions.

Make the new building blocks so good that even those who have already built something with the old ones will be tempted to use the new ones - because they want to, not because they’re forced. The sooner the new building blocks become a certainty, the sooner people down the line can start drawing up solutions that use them.

It would be a waste of time to have to wait for activation to be certain before starting to build. Things can be done in parallel if there’s good coordination and agreement.

There are better solutions than SLP, but on other blockchains. If they haven’t left for those by now, that tells me they want to stick with BCH, so they certainly won’t leave just because they’re getting some new building blocks, which will be optional to use.

3 Likes

Great to hear! Just one point:

I’ve actually heard plenty of influential people on BCH say that continuing to maintain SLP v1 is a waste of time. They want to feel vindicated that they were right about SLP when xyz on-chain solution didn’t go through back in the day, and they don’t run any business that will be impacted by the lack of a token solution in the meantime.

I don’t want to name those people, but they know who they are, and if they are reading - I hope they follow my logic and do not deprecate or refuse to support SLP v1.

And if they are so “disgusted” by the state of current SLP, they should be reminded that nothing is perfect. Current SLP is a product of its time, when Amaury was anti-tokens; it’s the best stop-gap we have right now, it works fine most of the time, and we need it to tide us over to the next phase of BCH.

Edit: Businesses have already suffered, and many left last year over the IFP. BCH loyalty is not eternal. If there is no community consensus on a path forward (new building blocks, like you say), they will leave.

3 Likes