Author: Mark Friedenbach
Published on: 2015-11-04T22:47:35+00:00
During the first Scaling Bitcoin workshop in Montreal, a presentation was given on the topic of "bad blocks" that take too long to validate. The design parameters of the system assume that validation cost scales linearly with transaction or block size when, in fact, certain kinds of transactions have validation costs that scale quadratically (sketched below). Partial solutions exist, such as Gavin's code that tracks the number of bytes hashed and enforces a separate per-block limit on that aggregate, but they still leave a gap between average-case and worst-case validation costs, and transaction selection and fee determination would become much more complicated under multiple independent limits.

The presenter instead suggests determining the "size" of a transaction as a single linear function of the various validation costs involved. Others have suggested a "net-UTXO" metric to replace or supplement a validation-cost metric: by charging extra fee for creating unspent outputs, it makes dust outputs considerably more expensive than regular spends, though it also widens the gap between average-case and worst-case block validation times. Sketches of both metrics follow below.

The presenter will be submitting a talk proposal for Scaling Bitcoin on this topic and is open to feedback from the developer community.
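The quadratic behaviour comes from legacy signature hashing: under SIGHASH_ALL, each input's signature hash covers a near-complete copy of the transaction, so the total bytes hashed grow with the number of inputs times the transaction size. A minimal back-of-the-envelope sketch (the byte sizes are rough assumptions, not consensus constants):

```python
# Illustrative sketch, not Bitcoin Core code: why legacy signature
# validation scales quadratically. Each of a transaction's n inputs
# requires hashing (a slightly modified copy of) the whole transaction.

INPUT_SIZE = 148   # rough size in bytes of a typical P2PKH input (assumed)
OUTPUT_SIZE = 34   # rough size in bytes of a typical P2PKH output (assumed)

def bytes_hashed(n_inputs: int, n_outputs: int = 1) -> int:
    """Approximate total bytes hashed to verify all signatures."""
    tx_size = 10 + n_inputs * INPUT_SIZE + n_outputs * OUTPUT_SIZE
    return n_inputs * tx_size  # one near-full-transaction hash per input

for n in (10, 100, 1000):
    print(f"{n:>5} inputs -> ~{bytes_hashed(n):,} bytes hashed")
# 10x more inputs means ~100x more hashing: quadratic, not linear.
```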
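A linear validation-cost metric of the kind the presenter describes would combine the resources a transaction consumes into one aggregate number, usable both for block limits and for fee determination. The coefficients and parameter names below are illustrative assumptions, not values from the post:

```python
def validation_cost(size_bytes: int, sigops: int, bytes_hashed: int,
                    a: float = 1.0, b: float = 50.0, c: float = 0.005) -> float:
    """Hypothetical aggregate 'size' of a transaction.

    Each resource is weighted so that a block saturating any one of
    them stays within a single aggregate limit, narrowing the gap
    between average-case and worst-case validation time.
    """
    return a * size_bytes + b * sigops + c * bytes_hashed
```

A single scalar like this keeps transaction selection simple: miners can sort by fee per unit of cost exactly as they sort by fee per byte today.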
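The net-UTXO idea can be sketched similarly: a transaction's effective size is padded by the growth it causes in the UTXO set, so creating outputs (including dust) carries an extra fee cost while spending existing outputs is comparatively cheap. The surcharge constant here is an assumption for illustration:

```python
UTXO_SURCHARGE = 40  # assumed per-output padding in bytes, for illustration

def net_utxo_size(size_bytes: int, outputs_created: int,
                  outputs_spent: int) -> int:
    """Effective size under a hypothetical net-UTXO metric."""
    delta = outputs_created - outputs_spent  # net change in UTXO set size
    return max(size_bytes + UTXO_SURCHARGE * delta, 0)
```

Note that nothing in this formula bounds hashing or signature-checking work, which is why, unlike a validation-cost metric, it widens rather than narrows the gap between average-case and worst-case block validation times.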