Bitcoin Block Size Chart - BitInfoCharts

BCH needs to be able to process 6.7 GB blocks in order to collect the same fees as BTC on average while guaranteeing that 0-conf continues to function during the biggest shopping days.

  1. We assume that 0-conf is the method for fast transactions.
  2. For 0-conf to function well, transactions must almost always be included in the next block. If they are not, a fee market develops, making 0-conf too expensive.
  3. In order for BCH to generate as much fee revenue for miners as BTC, BCH blocks need to be 850 times bigger than BTC blocks, because BTC transactions are about 850 times more expensive than BCH transactions. This number was taken from coin.dance just now.
  4. BTC blocks average 1.21 MB in size. This number was also taken from coin.dance just now.
  5. VISA currently averages about 1,700 tps. In 2011, VISA's peak load was about 11,000 tps. This compares a 2019 average with a 2011 peak; the peak is likely higher now, but these are the numbers I could find. This gives a conservatively estimated max/average ratio of 11,000 / 1,700 ≈ 6.5.
Now we can make a few calculations.
a. The average BCH block size needs to be 1.21 MB × 850 ≈ 1,028 MB to collect the same fees that BTC is collecting today.
b. In order for 0-conf to work reliably, the max block size needs to be 6.5 times bigger than the average.
This means that BCH needs to be able to process 1,028 MB × 6.5 ≈ 6.7 GB blocks in order to collect the same fees as BTC on average while guaranteeing that 0-conf would function during the biggest shopping days.
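A quick back-of-the-envelope check of the numbers above, sketched in Python (all inputs are the post's own figures, not authoritative market data):

```python
# All inputs below are the post's own figures, not authoritative market data.
btc_avg_block_mb = 1.21       # average BTC block size (coin.dance)
fee_ratio = 850               # BTC txns ~850x more expensive than BCH
peak_to_avg = 11_000 / 1_700  # VISA 2011 peak tps / 2019 average tps (~6.5)

avg_bch_block_mb = btc_avg_block_mb * fee_ratio            # ~1,028 MB
peak_bch_block_gb = avg_bch_block_mb * peak_to_avg / 1000  # ~6.7 GB

print(round(avg_bch_block_mb), round(peak_bch_block_gb, 1))
```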
Please note, this is reasoning about profitability and function, not about how much transaction capacity is needed.
submitted by N0tMyRealAcct to btc

Gridcoin 5.0.0.0-Mandatory "Fern" Release

https://github.com/gridcoin-community/Gridcoin-Research/releases/tag/5.0.0.0
Finally! After over ten months of development and testing, "Fern" has arrived! This is a whopper. 240 pull requests merged. Essentially a complete rewrite that was started with the scraper (the "neural net" rewrite) in "Denise" has now been completed. Practically the ENTIRE Gridcoin-specific codebase resting on top of the vanilla Bitcoin/Peercoin/Blackcoin PoS code has been rewritten. This removes the team requirement at last (see below), although there are many other important improvements besides that.
Fern was a monumental undertaking. We had to encode all of the old rules active for the v10 block protocol in new code and ensure that the new code was 100% compatible. This had to be done in such a way as to clear out all of the old spaghetti and ring-fence it with tightly controlled class implementations. We then wrote an entirely new, simplified ruleset for research rewards and reengineered contracts (which includes beacon management, polls, and voting) using properly classed code. The fundamentals of Gridcoin with this release are now on a very sound and maintainable footing, and the developers believe the codebase as updated here will serve as the fundamental basis for Gridcoin's future roadmap.
We have been testing this for MONTHS on testnet in various stages. The v10 (legacy) compatibility code has been running on testnet continuously as it was developed to ensure compatibility with existing nodes. During the last few months, we have done two private testnet forks and then the full public testnet testing for v11 code (the new protocol which is what Fern implements). The developers have also been running non-staking "sentinel" nodes on mainnet with this code to verify that the consensus rules are problem-free for the legacy compatibility code on the broader mainnet. We believe this amount of testing is going to result in a smooth rollout.
Given the amount of changes in Fern, I am presenting TWO changelogs below. One is high level, which summarizes the most significant changes in the protocol. The second changelog is the detailed one in the usual format, and gives you an inkling of the size of this release.

Highlights

Protocol

Note that the protocol changes will not become active until we cross the hard-fork transition height to v11, which has been set at 2053000. Given current average block spacing, this should happen around October 4, about one month from now.
Note that to get all of the beacons in the network on the new protocol, we are requiring ALL beacons to be validated. A two week (14 day) grace period is provided by the code, starting at the time of the transition height, for people currently holding a beacon to validate the beacon and prevent it from expiring.

That means that EVERY CRUNCHER must advertise and validate their beacon AFTER the v11 transition (around Oct 4th) and BEFORE October 18th (or more precisely, 14 days from the actual date of the v11 transition). If you do not advertise and validate your beacon by this time, your beacon will expire and you will stop earning research rewards until you advertise and validate a new beacon. This process has been made much easier by a brand new beacon "wizard" that helps manage beacon advertisements and renewals.

Once a beacon has been validated and is a v11 protocol beacon, the normal 180 day expiration rules apply. Note, however, that the 180 day expiration on research rewards has been removed with the Fern update. This means that while your beacon might expire after 180 days, your earned research rewards will be retained and can be claimed by advertising a beacon with the same CPID and going through the validation process again. In other words, you do not lose any earned research rewards if you do not stake a block within 180 days and keep your beacon up-to-date.
The transition height is also when the team requirement will be relaxed for the network.
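The beacon timeline above is simple enough to sanity-check in code. A minimal sketch (the October 4 transition date is the post's estimate for block height 2053000, not a confirmed date; the real deadline is 14 days after the actual fork):

```python
from datetime import date, timedelta

# The transition date below is the post's estimate for block height 2053000,
# not a confirmed date; the real deadline is 14 days after the actual fork.
estimated_v11_transition = date(2020, 10, 4)
grace_period = timedelta(days=14)

beacon_deadline = estimated_v11_transition + grace_period
print(beacon_deadline)  # 2020-10-18, matching the October 18th date above
```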

GUI

Besides the beacon wizard, there are a number of improvements to the GUI, including new UI transaction types (and icons) for staking the superblock, sidestake sends, beacon advertisement, voting, poll creation, and transactions with a message. The main screen has been revamped with a better summary section, and better status icons. Several changes under the hood have improved GUI performance. And finally, the diagnostics have been revamped.

Blockchain

The wallet sync speed has been DRASTICALLY improved. A decent machine with a good network connection should be able to sync the entire mainnet blockchain in less than 4 hours. A fast machine with a really fast network connection and a good SSD can do it in about 2.5 hours. One of our goals was to reduce or eliminate the reliance on snapshots for mainnet, and I think we have accomplished that goal with the new sync speed. We have also streamlined the in-memory structures for the blockchain which shaves some memory use.
There are so many goodies here it is hard to summarize them all.
I would like to thank all of the contributors to this release, but especially thank @cyrossignol, whose incredible contributions formed the backbone of this release. I would also like to pay special thanks to @barton2526, @caraka, and @Quezacoatl1, who tirelessly helped during the testing and polishing phase on testnet with testing and repeated builds for all architectures.
The developers are proud to present this release to the community and we believe this represents the starting point for a true renaissance for Gridcoin!

Summary Changelog

Accrual

Changed

Most significantly, nodes calculate research rewards directly from the magnitudes in EACH superblock between stakes instead of using a two- or three-point average based on a CPID's current magnitude and the magnitude for the CPID when it last staked. For those long-timers in the community, this has been referred to as "Superblock Windows," and was first done in proof-of-concept form by @denravonska.

Removed

Beacons

Added

Changed

Removed

Unaltered

As a reminder:

Superblocks

Added

Changed

Removed

Voting

Added

Changed

Removed

Detailed Changelog

[5.0.0.0] 2020-09-03, mandatory, "Fern"

Added

Changed

Removed

Fixed

submitted by jamescowens to gridcoin

Dragonchain Great Reddit Scaling Bake-Off Public Proposal

Dragonchain Great Reddit Scaling Bake-Off Public Proposal

Dragonchain Public Proposal TL;DR:

Dragonchain has demonstrated twice Reddit's entire daily volume (votes, comments, and posts, per Reddit's 2019 Year in Review) in a 24-hour demo on an operational network. Every single transaction on Dragonchain is decentralized immediately through 5 levels of Dragon Net, and then secured with combined proof on Bitcoin, Ethereum, Ethereum Classic, and Binance Chain, via Interchain. At the time, in January 2020, the entire cost of the demo was approximately $25K on a single system (transaction fees locked at $0.0001/txn). With current fees (lowest fee $0.0000025/txn), this would cost as little as $625.
Watch Joe walk through the entire proposal and answer questions on YouTube.
This proposal is also available on the Dragonchain blog.

Hello Reddit and Ethereum community!

I’m Joe Roets, Founder & CEO of Dragonchain. When the team and I first heard about The Great Reddit Scaling Bake-Off we were intrigued. We believe we have the solutions Reddit seeks for its community points system and we have them at scale.
For your consideration, we have submitted our proposal below. The team at Dragonchain and I welcome and look forward to your technical questions, philosophical feedback, and fair criticism, to build a scaling solution for Reddit that will empower its users. Because our architecture is unlike other blockchain platforms out there today, we expect to receive many questions while people try to grasp our project. I will answer all questions here in this thread on Reddit, and I've answered some questions in the stream on YouTube.
We have seen good discussions so far in the competition. We hope that Reddit’s scaling solution will emerge from The Great Reddit Scaling Bake-Off and that Reddit will have great success with the implementation.

Executive summary

Dragonchain is a robust open source hybrid blockchain platform that has proven to withstand the passing of time since our inception in 2014. We have continued to evolve to harness the scalability of private nodes, yet take full advantage of the security of public decentralized networks, like Ethereum. We have a live, operational, and fully functional Interchain network integrating Bitcoin, Ethereum, Ethereum Classic, and ~700 independent Dragonchain nodes. Every transaction is secured to Ethereum, Bitcoin, and Ethereum Classic. Transactions are immediately usable on chain, and the first decentralization is seen within 20 seconds on Dragon Net. Security increases further to public networks ETH, BTC, and ETC within 10 minutes to 2 hours. Smart contracts can be written in any executable language, offering full freedom to existing developers. We invite any developer to watch the demo, play with our SDKs, review open source code, and to help us move forward. Dragonchain specializes in scalable loyalty & rewards solutions and has built a decentralized social network on chain, with very affordable transaction costs. This experience can be combined with the insights Reddit and the Ethereum community have gained in the past couple of months to roll out the solution at a rapid pace.

Response and PoC

In The Great Reddit Scaling Bake-Off post, Reddit has asked for a series of demonstrations, requirements, and other considerations. In this section, we will attempt to answer all of these requests.

Live Demo

A live proof of concept showing hundreds of thousands of transactions
On Jan 7, 2020, Dragonchain hosted a 24-hour live demonstration during which a quarter of a billion (250 million+) transactions executed fully on an operational network. Every single transaction on Dragonchain is decentralized immediately through 5 levels of Dragon Net, and then secured with combined proof on Bitcoin, Ethereum, Ethereum Classic, and Binance Chain, via Interchain. This means that every single transaction is secured by, and traceable to these networks. An attack on this system would require a simultaneous attack on all of the Interchained networks.
24 hours in 4 minutes (YouTube)
The demonstration was of a single business system, and any user is able to scale this further, by running multiple systems simultaneously. Our goals for the event were to demonstrate a consistent capacity greater than that of Visa over an extended time period.
Tooling to reproduce our demo is available here:
https://github.com/dragonchain/spirit-bomb

Source Code

Source code (for on & off-chain components as well tooling used for the PoC). The source code does not have to be shared publicly, but if Reddit decides to use a particular solution it will need to be shared with Reddit at some point.

Scaling

How it works & scales

Architectural Scaling

Dragonchain’s architecture attacks the scalability issue from multiple angles. Dragonchain is a hybrid blockchain platform, wherein every transaction is protected on a business node to the requirements of that business or purpose. A business node may be held completely private or may be exposed or replicated to any level of exposure desired.
Every node has its own blockchain and is independently scalable. Dragonchain established Context Based Verification as its consensus model. Every transaction is immediately usable on a trust basis, and in time is provable to an increasing level of decentralized consensus. A transaction will have a level of decentralization to independently owned and deployed Dragonchain nodes (~700 nodes) within seconds, and full decentralization to BTC and ETH within minutes or hours. Level 5 nodes (Interchain nodes) function to secure all transactions to public or otherwise external chains such as Bitcoin and Ethereum. These nodes scale the system by aggregating multiple blocks into a single Interchain transaction on a cadence. This timing is configurable based upon average fees for each respective chain. For detailed information about Dragonchain’s architecture, and Context Based Verification, please refer to the Dragonchain Architecture Document.

Economic Scaling

An interesting feature of Dragonchain’s network consensus is its economics and scarcity model. Since Dragon Net nodes (L2-L4) are independent staking nodes, deployment to cloud platforms would allow any of these nodes to scale to take on a large percentage of the verification work. This is great for scalability, but not good for the economy, because there is no scarcity, and pricing would develop a downward spiral and result in fewer verification nodes. For this reason, Dragonchain uses TIME as scarcity.
TIME is calculated as the number of Dragons held, multiplied by the number of days held. TIME influences the user’s access to features within the Dragonchain ecosystem. It takes into account both the Dragon balance and length of time each Dragon is held. TIME is staked by users against every verification node and dictates how much of the transaction fees are awarded to each participating node for every block.
TIME also dictates the transaction fee itself for the business node. TIME is staked against a business node to set a deterministic transaction fee level (see transaction fee table below in Cost section). This is very interesting in a discussion about scaling because it guarantees independence for business implementation. No matter how much traffic appears on the entire network, a business is guaranteed to not see an increased transaction fee rate.
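As a rough illustration of the TIME metric described above (the function name and sample holdings are hypothetical; only the dragons-times-days formula comes from the text):

```python
def time_score(holdings):
    """TIME = sum of (Dragons held x days held) across holdings.

    holdings: list of (dragon_amount, days_held) tuples. Hypothetical
    illustration of the scarcity formula described in the proposal.
    """
    return sum(amount * days for amount, days in holdings)

# e.g. 10,000 Dragons held for 30 days plus 5,000 Dragons held for 90 days
print(time_score([(10_000, 30), (5_000, 90)]))  # 750000
```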

Scaled Deployment

Dragonchain uses Docker and Kubernetes to allow the use of best-practice traditional system scaling. Dragonchain offers managed nodes with an easy to use web based console interface. The user may also deploy a Dragonchain node within their own datacenter or favorite cloud platform. Users have deployed Dragonchain nodes on-prem and on Amazon AWS, Google Cloud, MS Azure, and other hosting platforms around the world. Any executable code, anything you can write, can be written into a smart contract. This flexibility is what allows us to say that developers with no blockchain experience can use any code language to access the benefits of blockchain. Customers have used NodeJS, Python, Java, and even BASH shell script to write smart contracts on Dragonchain.
With Docker containers, we achieve better separation of concerns, faster deployment, higher reliability, and lower response times.
We chose Kubernetes for its self-healing features, ability to run multiple services on one server, and its large and thriving development community. It is resilient, scalable, and automated. OpenFaaS allows us to package smart contracts as Docker images for easy deployment.
Contract deployment time is now bounded only by the size of the Docker image being deployed but remains fast even for reasonably large images. We also take advantage of Docker’s flexibility and its ability to support any language that can run on x86 architecture. Any image, public or private, can be run as a smart contract using Dragonchain.

Flexibility in Scaling

Dragonchain’s architecture considers interoperability and integration as key features. From inception, we had a goal to increase adoption via integration with real business use cases and traditional systems.
We envision the ability for Reddit, in the future, to be able to integrate alternate content storage platforms or other financial services along with the token.
  • LBRY - to allow users to deploy content natively to LBRY
  • MakerDAO - to allow users to lend small amounts backed by their Reddit community points
  • STORJ/SIA - to allow decentralized on-chain storage of portions of content
These or any other integrations are relatively straightforward to build on Dragonchain with an Interchain implementation.

Cost

Cost estimates (on-chain and off-chain)
For the purpose of this proposal, we assume that all transactions are on chain (posts, replies, and votes).
On the Dragonchain network, transaction costs are deterministic/predictable. By staking TIME on the business node (as described above) Reddit can reduce transaction costs to as low as $0.0000025 per transaction.
Dragonchain Fees Table

Getting Started

How to run it
Building on Dragonchain is simple and requires no blockchain experience. Spin up a business node (L1) in our managed environment (AWS), run it in your own cloud environment, or on-prem in your own datacenter. Clear documentation will walk you through the steps of spinning up your first Dragonchain Level 1 Business node.
Getting started is easy...
  1. Download Dragonchain’s dctl
  2. Input three commands into a terminal
  3. Build an image
  4. Run it
More information can be found in our Get started documents.

Architecture
Dragonchain is an open source hybrid platform. Through Dragon Net, each chain combines the power of a public blockchain (like Ethereum) with the privacy of a private blockchain.
Dragonchain organizes its network into five separate levels. A Level 1, or business node, is a totally private blockchain only accessible through the use of public/private keypairs. All business logic, including smart contracts, can be executed on this node directly and added to the chain.
After creating a block, the Level 1 business node broadcasts a version stripped of sensitive private data to Dragon Net. Three Level 2 Validating nodes validate the transaction based on guidelines determined from the business. A Level 3 Diversity node checks that the level 2 nodes are from a diverse array of locations. A Level 4 Notary node, hosted by a KYC partner, then signs the validation record received from the Level 3 node. The transaction hash is ledgered to the Level 5 public chain to take advantage of the hash power of massive public networks.
Dragon Net can be thought of as a “blockchain of blockchains”, where every level is a complete private blockchain. Because an L1 can send to multiple nodes on a single level, proof of existence is distributed among many places in the network. Eventually, proof of existence reaches level 5 and is published on a public network.
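The five-level flow described above can be sketched very loosely in Python. Everything here is invented for illustration; real Dragon Net nodes exchange signed validation records, and all names and fields below are hypothetical:

```python
import hashlib

# All names below are invented for illustration; real Dragonchain nodes
# exchange signed validation records rather than plain dicts.

def strip_private(block):
    """L1 broadcasts a copy of the block without sensitive payloads."""
    return {k: v for k, v in block.items() if k != "private_payload"}

def record_hash(record):
    """Stand-in for hashing a validation record."""
    return hashlib.sha256(repr(sorted(record.items())).encode()).hexdigest()

block = {"id": 1, "txn_count": 3, "private_payload": "sensitive business data"}

public_view = strip_private(block)             # L1 -> Dragon Net
l2_votes = [record_hash(public_view)] * 3      # three L2 validating nodes
l3_diversity_ok = len(l2_votes) == 3           # L3: toy geographic-diversity check
l4_notarization = record_hash(public_view)     # L4 notary signs the record
l5_anchor = l4_notarization                    # L5 ledgers the hash to BTC/ETH

print("private_payload" in public_view)  # False: private data never leaves L1
```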

API Documentation

APIs (on chain & off)

SDK Source

Nobody’s Perfect

Known issues or tradeoffs
  • Dragonchain is open source and even though the platform is easy enough for developers to code in any language they are comfortable with, we do not have as large a developer community as Ethereum. We would like to see the Ethereum developer community (and any other communities) become familiar with our SDKs, our solutions, and our platform, to unlock the full potential of our Ethereum Interchain. Long ago we decided to prioritize both Bitcoin and Ethereum Interchains. We envision an ecosystem that encompasses different projects to give developers the ability to take full advantage of all the opportunities blockchain offers to create decentralized solutions not only for Reddit but for all of our current platforms and systems. We believe that together we will take the adoption of blockchain further. We currently have an additional Interchain with Ethereum Classic. We look forward to Interchain with other blockchains in the future. We invite all blockchain projects who believe in decentralization and security to Interchain with Dragonchain.
  • While we have only ~700 nodes, compared to roughly 8,000 Ethereum and 10,000 Bitcoin nodes, we harness those 18,000 nodes to scale to extremely high levels of security. See Dragonchain metrics.
  • Some may consider the centralization of Dragonchain’s business nodes as an issue at first glance, however, the model is by design to protect business data. We do not consider this a drawback as these nodes can make any, none, or all data public. Depending upon the implementation, every subreddit could have control of its own business node, for potential business and enterprise offerings, bringing new alternative revenue streams to Reddit.

Costs and resources

Summary of cost & resource information for both on-chain & off-chain components used in the PoC, as well as cost & resource estimates for further scaling. If your PoC is not on mainnet, make note of any mainnet caveats (such as congestion issues).
Every transaction on the PoC system had a transaction fee of $0.0001 (one-hundredth of a cent USD). At 256MM transactions, the demo cost $25,600. With current operational fees, the same demonstration would cost $640 USD.
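The demo economics above reduce to simple arithmetic, reproduced here from the post's own figures:

```python
# The demo economics, reproduced from the post's own figures.
txns = 256_000_000       # ~256MM transactions in the 24-hour demo
demo_fee = 0.0001        # USD per transaction at demo time (Jan 2020)
current_fee = 0.0000025  # lowest current USD fee per transaction

print(round(txns * demo_fee))     # ~25600 USD for the original demo
print(round(txns * current_fee))  # ~640 USD at current fees
```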
For the demonstration, to achieve throughput to mimic a worldwide payments network, we modeled several clients in AWS and 4-5 business nodes to handle the traffic. The business nodes were tuned to handle higher throughput by adjusting memory and machine footprint on AWS. This flexibility is valuable to implementing a system such as envisioned by Reddit. Given that Reddit’s daily traffic (posts, replies, and votes) is less than half that of our demo, we would expect that the entire Reddit system could be handled on 2-5 business nodes using right-sized containers on AWS or similar environments.
Verification was accomplished on the operational Dragon Net network with over 700 independently owned verification nodes running around the world at no cost to the business other than paid transaction fees.

Requirements

Scaling

This PoC should scale to the numbers below with minimal costs (both on & off-chain). There should also be a clear path to supporting hundreds of millions of users.
Over a 5 day period, your scaling PoC should be able to handle:
  • 100,000 point claims (minting & distributing points)
  • 25,000 subscriptions
  • 75,000 one-off points burning
  • 100,000 transfers
During Dragonchain’s 24 hour demo, the above required numbers were reached within the first few minutes.
Reddit’s total activity is 9000% more than Ethereum’s total transaction level. Even if you do not include votes, it is still 700% more than Ethereum’s current volume. Dragonchain has demonstrated that it can handle 250 million transactions a day, and its architecture allows for multiple systems to work at that level simultaneously. In our PoC, we demonstrated double the full capacity of Reddit, and every transaction was proven all the way to Bitcoin and Ethereum.
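For context, the claimed demo volume works out to a sustained rate well above the VISA average cited earlier in this thread:

```python
# Sustained throughput implied by the demo figures quoted above.
demo_txns = 250_000_000        # transactions executed in the 24-hour demo
seconds_per_day = 24 * 60 * 60

demo_tps = demo_txns / seconds_per_day
print(round(demo_tps))  # ~2894 tps sustained across the whole day
```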
Reddit Scaling on Ethereum

Decentralization

Solutions should not depend on any single third-party provider. We prefer solutions that do not depend on specific entities such as Reddit or another provider, and solutions with no single point of control or failure in off-chain components, but recognize there are numerous trade-offs to consider.
Dragonchain’s architecture calls for a hybrid approach. Private business nodes hold the sensitive data while the validation and verification of transactions for the business are decentralized within seconds and secured to public blockchains within 10 minutes to 2 hours. Nodes could potentially be controlled by owners of individual subreddits for more organic decentralization.
  • Billing is currently centralized - there is a path to federation and decentralization of a scaled billing solution.
  • Operational multi-cloud
  • Operational on-premises capabilities
  • Operational deployment to any datacenter
  • Over 700 independent Community Verification Nodes with proof of ownership
  • Operational Interchain (Interoperable to Bitcoin, Ethereum, and Ethereum Classic, open to more)

Usability

Scaling solutions should have a simple end user experience.

Users shouldn't have to maintain any extra state/proofs, regularly monitor activity, keep track of extra keys, or sign anything other than their normal transactions
Dragonchain and its customers have demonstrated extraordinary usability as a feature in many applications, where users do not need to know that the system is backed by a live blockchain. Lyceum is one of these examples, where the progress of academy courses is being tracked, and successful completion of courses is rewarded with certificates on chain. Our @Save_The_Tweet bot is popular on Twitter. When used with one of the following hashtags - #please, #blockchain, #ThankYou, or #eternalize the tweet is saved through Eternal to multiple blockchains. A proof report is available for future reference. Other examples in use are DEN, our decentralized social media platform, and our console, where users can track their node rewards, view their TIME, and operate a business node.
Examples:

Transactions complete in a reasonable amount of time (seconds or minutes, not hours or days)
All transactions are immediately usable on chain by the system. A transaction begins the path to decentralization at the conclusion of a 5-second block when it gets distributed across 5 separate community run nodes. Full decentralization occurs within 10 minutes to 2 hours depending on which interchain (Bitcoin, Ethereum, or Ethereum Classic) the transaction hits first. Within approximately 2 hours, the combined hash power of all interchained blockchains secures the transaction.

Free to use for end users (no gas fees, or fixed/minimal fees that Reddit can pay on their behalf)
With transaction pricing as low as $0.0000025 per transaction, it may be considered reasonable for Reddit to cover transaction fees for users.
All of Reddit's Transactions on Blockchain (month)
Community points can be earned by users and distributed directly to their Reddit account in batch (as per Reddit's minting plan), and users can withdraw rewards to their Ethereum wallet whenever they wish. Withdrawal fees can be paid by either the user or Reddit. This model has been operating inside the Dragonchain system since 2018, and many security and financial compliance features can be optionally added. We feel that this capability greatly enhances the user experience because it is seamless to a regular user without cryptocurrency experience, yet flexible to a tech-savvy user. With regard to currency or token transactions, these would occur on the Reddit network, verified to BTC and ETH, and would incur the $0.0000025 transaction fee. To estimate this fee, we use the monthly active Reddit user count from Statista with a 60% adoption rate and an estimated 10 transactions per user per month, resulting in an approximate $720 monthly cost across the system. Reddit could feasibly incur all associated internal network charges (mining/minting, transfer, burn), as these are very low and controllable fees.
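Reconstructing the ~$720/month figure above (the post does not state the MAU number it used; ~48 million is the value implied by its arithmetic, so treat it as an assumption):

```python
# All parameters are the post's assumptions; the MAU value is the number
# implied by its $720 result, not an independently sourced figure.
implied_mau = 48_000_000  # assumed monthly active users (implied by $720)
adoption_rate = 0.60      # assumed adoption rate from the post
txns_per_user = 10        # estimated transactions per user per month
fee_usd = 0.0000025       # USD per internal transaction

monthly_cost = implied_mau * adoption_rate * txns_per_user * fee_usd
print(round(monthly_cost))  # ~720 USD per month across the system
```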
Reddit Internal Token Transaction Fees

Reddit Ethereum Token Transaction Fees
When we consider further the Ethereum fees that might be incurred, we have a few choices for a solution.
  1. Offload all Ethereum transaction fees (user withdrawals) to interested users as they wish to withdraw tokens for external use or sale.
  2. Cover Ethereum transaction fees by aggregating them on a timed schedule. Users would request withdrawal (from Reddit or individual subreddits), and they would be transacted on the Ethereum network every hour (or some other schedule).
  3. In a combination of the above, customers could cover aggregated fees.
  4. Integrate with alternate Ethereum roll up solutions or other proposals to aggregate minting and distribution transactions onto Ethereum.

Bonus Points

Users should be able to view their balances & transactions via a blockchain explorer-style interface
From interfaces for users who have no knowledge of blockchain technology to users who are well versed in blockchain terms such as those present in a typical block explorer, a system powered by Dragonchain has flexibility on how to provide balances and transaction data to users. Transactions can be made viewable in an Eternal Proof Report, which displays raw data along with TIME staking information and traceability all the way to Bitcoin, Ethereum, and every other Interchained network. The report shows fields such as transaction ID, timestamp, block ID, multiple verifications, and Interchain proof. See example here.
Node payouts within the Dragonchain console are listed in chronological order and can be further seen in either Dragons or USD. See example here.
In our social media platform, Dragon Den, users can see, in real-time, their NRG and MTR balances. See example here.
A new influencer app powered by Dragonchain, Raiinmaker, breaks down data into a user friendly interface that shows coin portfolio, redeemed rewards, and social scores per campaign. See example here.

Exiting is fast & simple
Withdrawing funds on Dragonchain’s console requires three clicks, however, withdrawal scenarios with more enhanced security features per Reddit’s discretion are obtainable.

Interoperability

Compatibility with third party apps (wallets/contracts/etc.) is necessary.
Proven interoperability at scale that surpasses the required specifications. Our entire platform consists of interoperable blockchains connected to each other and traditional systems. APIs are well documented. Third party permissions are possible with a simple smart contract without the end user being aware. No need to learn any specialized proprietary language. Any code base (not subsets) is usable within a Docker container. Interoperable with any blockchain or traditional APIs. We’ve witnessed relatively complex systems built by engineers with no blockchain or cryptocurrency experience. We’ve also demonstrated the creation of smart contracts within minutes built with BASH shell and Node.js. Please see our source code and API documentation.

Scaling solutions should be extensible and allow third parties to build on top of it

Open source and extensible.
APIs should be well documented and stable

Documentation should be clear and complete
For full documentation, explore our docs, SDKs, GitHub repos, architecture documents, original Disney documentation, and other links or resources provided in this proposal.

Third-party permissionless integrations should be possible & straightforward

Smart contracts are Docker based, can be written in any language, use full languages (not subsets), and can therefore be integrated with any system, including traditional system APIs. Simple is better. Learning an uncommon or proprietary language should not be necessary.
Advanced knowledge of mathematics, cryptography, or L2 scaling should not be required. Compatibility with common utilities & toolchains is expected.
Dragonchain business nodes and smart contracts leverage Docker to allow the use of literally any language or executable code. No proprietary language is necessary. We’ve witnessed relatively complex systems built by engineers with no blockchain or cryptocurrency experience. We’ve also demonstrated the creation of smart contracts within minutes built with BASH shell and Node.js.

Bonus

Bonus Points: Show us how it works. Do you have an idea for a cool new use case for Community Points? Build it!

TIME

Community points could also be awarded based upon TIME: the longer someone has been part of a subreddit, the more community points they naturally accrue, even without actively commenting or sharing new posts. A daily login could be required for these points to be credited. This rewards readers too and incentivizes readers to create an account on Reddit if they browse the website often. The same concept could be leveraged to provide some level of reputation based upon the duration and consistency of contribution to a community subreddit.
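One minimal way to sketch this accrual rule (the function name, the flat daily rate, and the login requirement below are hypothetical illustrations, not an existing Reddit or Dragonchain API):

```python
from datetime import date

def time_points(join_date, login_dates, daily_rate=1):
    """Points accrued for tenure in a subreddit, credited only on days
    the user actually logs in. All rates here are placeholders."""
    return sum(daily_rate for d in login_dates if d >= join_date)

# A user who joined July 2 and logged in on July 1, 2, 5, and 9
logins = [date(2020, 7, d) for d in (1, 2, 5, 9)]
print(time_points(date(2020, 7, 2), logins))  # 3 qualifying logins
```

A reputation variant could weight `daily_rate` by consistency (e.g. streak length) rather than using a flat rate.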

Dragon Den

Dragonchain has already built a social media platform that harnesses community involvement. Dragon Den is a decentralized community built on the Dragonchain blockchain platform. Dragon Den is Dragonchain’s answer to fake news, trolling, and censorship. It incentivizes the creation and evaluation of quality content within communities. It could be described as being a shareholder of a subreddit or Reddit in its entirety. The more your subreddit is thriving, the more rewarding it will be. Den is currently in a public beta and in active development, though the real token economy is not live yet. There are different tokens for various purposes. Two tokens are Lair Ownership Rights (LOR) and Lair Ownership Tokens (LOT). LOT is a non-fungible token for ownership of a specific Lair. LOT will only be created and converted from LOR.
Energy (NRG) and Matter (MTR) work jointly. Your MTR determines how much NRG you receive in a 24-hour period. Providing quality content, or evaluating content will earn MTR.

Security. Users have full ownership & control of their points.
All community points awarded based upon any type of activity or gift, are secured and provable to all Interchain networks (currently BTC, ETH, ETC). Users are free to spend and withdraw their points as they please, depending on the features Reddit wants to bring into production.

Balances and transactions cannot be forged, manipulated, or blocked by Reddit or anyone else
Users can withdraw their balance to their ERC20 wallet, directly through Reddit. Reddit can cover the fees on their behalf, or the user covers this with a portion of their balance.

Users should own their points and be able to get on-chain ERC20 tokens without permission from anyone else
Through our console, users can withdraw their ERC20 rewards. This can be achieved on Reddit too. Here is a walkthrough of our console, though it does not show the quick-withdrawal functionality; a user can withdraw at any time. https://www.youtube.com/watch?v=aNlTMxnfVHw

Points should be recoverable to on-chain ERC20 tokens even if all third-parties involved go offline
If necessary, signed transactions from the Reddit system (e.g. Reddit + Subreddit) can be sent to the Ethereum smart contract for minting.

A public, third-party review attesting to the soundness of the design should be available
To our knowledge, at least two large corporations, including a top 3 accounting firm, have conducted positive reviews. These reviews have never been made public, as Dragonchain did not pay or contract for these studies to be released.

Bonus points
Public, third-party implementation review available or in progress
See above

Compatibility with HSMs & hardware wallets
For the purpose of this proposal, all tokenization would be on the Ethereum network using standard token contracts and as such, would be able to leverage all hardware wallet and Ethereum ecosystem services.

Other Considerations

Minting/distributing tokens is not performed by Reddit directly
This operation can be automated by a smart contract on Ethereum. Subreddits can, if desired, play a role.

One off point burning, as well as recurring, non-interactive point burning (for subreddit memberships) should be possible and scalable
This is possible and scalable with interaction between Dragonchain Reddit system and Ethereum token contract(s).

Fully open-source solutions are strongly preferred
Dragonchain is fully open source (see section on Disney release after conclusion).

Conclusion

Whether it is today or in the future, we would like to work together to bring secure flexibility to the highest standards. It is our hope to be considered by Ethereum, Reddit, and other integrative solutions so we may further discuss the possibilities of implementation. In our public demonstration, 256 million transactions were handled on chain in our operational network in 24 hours for a cost of $25K; run today, it would cost $625. Dragonchain’s interoperable foundation provides the atmosphere necessary to implement a frictionless community points system. Thank you for your consideration of our proposal. We look forward to working with the community to make something great!

Disney Releases Blockchain Platform as Open Source

The team at Disney created the Disney Private Blockchain Platform. The system was a hybrid interoperable blockchain platform for ledgering and smart contract development geared toward solving problems with blockchain adoption and usability. All objective evaluation would consider the team’s output a success. We released a list of use cases that we explored in some capacity at Disney, and our input on blockchain standardization as part of our participation in the W3C Blockchain Community Group.
https://lists.w3.org/Archives/Public/public-blockchain/2016May/0052.html

Open Source

In 2016, Roets proposed to release the platform as open source to spread the technology outside of Disney, as others within the W3C group were interested in the solutions that had been created inside of Disney.
Following a long process, step by step, the team met requirements for release. Among the requirements, the team had to:
  • Obtain VP support and approval for the release
  • Verify ownership of the software to be released
  • Verify that no proprietary content would be released
  • Convince the organization that there was a value to the open source community
  • Convince the organization that there was a value to Disney
  • Offer the plan for ongoing maintenance of the project outside of Disney
  • Itemize competing projects
  • Verify no conflict of interest
  • Preferred license
  • Change the project name to not use the name Disney, any Disney character, or any other associated IP - proposed Dragonchain - approved
  • Obtain legal approval
  • Approval from corporate, parks, and other business units
  • Approval from multiple Disney patent groups
  • Copyright holder defined by Disney (Disney Connected and Advanced Technologies)
  • Trademark searches conducted for the selected name Dragonchain
  • Obtain IT security approval
  • Manual review of OSS components conducted
  • OWASP Dependency and Vulnerability Check Conducted
  • Obtain technical (software) approval
  • Offer management, process, and financial plans for the maintenance of the project.
  • Meet list of items to be addressed before release
  • Remove all Disney project references and scripts
  • Create a public distribution list for email communications
  • Remove Roets’ direct and internal contact information
  • Create public Slack channel and move from Disney slack channels
  • Create proper labels for issue tracking
  • Rename internal private Github repository
  • Add informative description to Github page
  • Expand README.md with more specific information
  • Add information beyond current “Blockchains are Magic”
  • Add getting started sections and info on cloning/forking the project
  • Add installation details
  • Add uninstall process
  • Add unit, functional, and integration test information
  • Detail how to contribute and get involved
  • Describe the git workflow that the project will use
  • Move to public, non-Disney git repository (Github or Bitbucket)
  • Obtain Disney Open Source Committee approval for release
On top of meeting the above criteria, as part of the process, the maintainer of the project had to receive the codebase on their own personal email and create accounts for maintenance (e.g. Github) with non-Disney accounts. Given the fact that the project spanned multiple business units, Roets was individually responsible for its ongoing maintenance. Because of this, he proposed in the open source application to create a non-profit organization to hold the IP and maintain the project. This was approved by Disney.
The Disney Open Source Committee approved the application known as OSSRELEASE-10, and the code was released on October 2, 2016. Disney decided to not issue a press release.
Original OSSRELEASE-10 document

Dragonchain Foundation

The Dragonchain Foundation was created on January 17, 2017. https://den.social/l/Dragonchain/24130078352e485d96d2125082151cf0/dragonchain-and-disney/
submitted by j0j0r0 to ethereum [link] [comments]

Scaling Reddit Community Points with Arbitrum Rollup: a piece of cake

Scaling Reddit Community Points with Arbitrum Rollup: a piece of cake
https://preview.redd.it/b80c05tnb9e51.jpg?width=2550&format=pjpg&auto=webp&s=850282c1a3962466ed44f73886dae1c8872d0f31
Submitted for consideration to The Great Reddit Scaling Bake-Off
Baked by the pastry chefs at Offchain Labs
Please send questions or comments to [email protected]
1. Overview
We're excited to submit Arbitrum Rollup for consideration to The Great Reddit Scaling Bake-Off. Arbitrum Rollup is the only Ethereum scaling solution that supports arbitrary smart contracts without compromising on Ethereum's security or adding points of centralization. For Reddit, this means that Arbitrum can not only scale the minting and transfer of Community Points, but it can foster a creative ecosystem built around Reddit Community Points enabling points to be used in a wide variety of third party applications. That's right -- you can have your cake and eat it too!
Arbitrum Rollup isn't just Ethereum-style. Its Layer 2 transactions are byte-for-byte identical to Ethereum, which means Ethereum users can continue to use their existing addresses and wallets, and Ethereum developers can continue to use their favorite toolchains and development environments out-of-the-box with Arbitrum. Coupling Arbitrum’s tooling-compatibility with its trustless asset interoperability, Reddit not only can scale but can onboard the entire Ethereum community at no cost by giving them the same experience they already know and love (well, certainly know).
To benchmark how Arbitrum can scale Reddit Community Points, we launched the Reddit contracts on an Arbitrum Rollup chain. Since Arbitrum provides full Solidity support, we didn't have to rewrite the Reddit contracts or try to mimic their functionality using an unfamiliar paradigm. Nope, none of that. We launched the Reddit contracts unmodified on Arbitrum Rollup complete with support for minting and distributing points. Like every Arbitrum Rollup chain, the chain included a bridge interface in which users can transfer Community Points or any other asset between the L1 and L2 chains. Arbitrum Rollup chains also support dynamic contract loading, which would allow third-party developers to launch custom ecosystem apps that integrate with Community Points on the very same chain that runs the Reddit contracts.
1.1 Why Ethereum
Perhaps the most exciting benefit of distributing Community Points using a blockchain is the ability to seamlessly port points to other applications and use them in a wide variety of contexts. Applications may include simple transfers such as a restaurant that allows Redditors to spend points on drinks. Or it may include complex smart contracts -- such as placing Community Points as a wager for a multiparty game or as collateral in a financial contract.
The common denominator between all of the fun uses of Reddit points is that it needs a thriving ecosystem of both users and developers, and the Ethereum blockchain is perhaps the only smart contract platform with significant adoption today. While many Layer 1 blockchains boast lower cost or higher throughput than the Ethereum blockchain, more often than not, these attributes mask the reality of little usage, weaker security, or both.
Perhaps another platform with significant usage will rise in the future. But today, Ethereum captures the mindshare of the blockchain community, and for Community Points to provide the most utility, the Ethereum blockchain is the natural choice.
1.2 Why Arbitrum
While Ethereum's ecosystem is unmatched, the reality is that fees are high and capacity is too low to support the scale of Reddit Community Points. Enter Arbitrum. Arbitrum Rollup provides all of the ecosystem benefits of Ethereum, but with orders of magnitude more capacity and at a fraction of the cost of native Ethereum smart contracts. And most of all, we don't change the experience from users. They continue to use the same wallets, addresses, languages, and tools.
Arbitrum Rollup is not the only solution that can scale payments, but it is the only developed solution that can scale both payments and arbitrary smart contracts trustlessly, which means that third party users can build highly scalable add-on apps that can be used without withdrawing money from the Rollup chain. If you believe that Reddit users will want to use their Community Points in smart contracts--and we believe they will--then it makes the most sense to choose a single scaling solution that can support the entire ecosystem, eliminating friction for users.
We view being able to run smart contracts in the same scaling solution as fundamentally critical since if there's significant demand in running smart contracts from Reddit's ecosystem, this would be a load on Ethereum and would itself require a scaling solution. Moreover, having different scaling solutions for the minting/distribution/spending of points and for third party apps would be burdensome for users as they'd have to constantly shuffle their Points back and forth.
2. Arbitrum at a glance
Arbitrum Rollup has a unique value proposition as it offers a combination of features that no other scaling solution achieves. Here we highlight its core attributes.
Decentralized. Arbitrum Rollup is as decentralized as Ethereum. Unlike some other Layer 2 scaling projects, Arbitrum Rollup doesn't have any centralized components or centralized operators who can censor users or delay transactions. Even in non-custodial systems, centralized components provide a risk as the operators are generally incentivized to increase their profit by extracting rent from users often in ways that severely degrade user experience. Even if centralized operators are altruistic, centralized components are subject to hacking, coercion, and potential liability.
Massive Scaling. Arbitrum achieves order-of-magnitude scaling over Ethereum's L1 smart contracts. Our software currently supports 453 transactions-per-second for basic transactions (at 1616 Ethereum gas per tx). We have a lot of room left to optimize (e.g. aggregating signatures), and over the next several months capacity will increase significantly. As described in detail below, Arbitrum can easily support and surpass Reddit's anticipated initial load, and its capacity will continue to improve as Reddit's capacity needs grow.
Low cost. The cost of running Arbitrum Rollup is quite low compared to L1 Ethereum and other scaling solutions such as those based on zero-knowledge proofs. Layer 2 fees are low, fixed, and predictable and should not be overly burdensome for Reddit to cover. Nobody needs to use special equipment or high-end machines. Arbitrum requires validators, which is a permissionless role that can be run on any reasonable on-line machine. Although anybody can act as a validator, in order to protect against a “tragedy of the commons” and make sure reputable validators are participating, we support a notion of “invited validators” that are compensated for their costs. In general, users pay (low) fees to cover the invited validators’ costs, but we imagine that Reddit may cover this cost for its users. See more on the costs and validator options below.
Ethereum Developer Experience. Not only does Arbitrum support EVM smart contracts, but the developer experience is identical to that of L1 Ethereum contracts and fully compatible with Ethereum tooling. Developers can port existing Solidity apps or write new ones using their favorite and familiar toolchains (e.g. Truffle, Buidler). There are no new languages or coding paradigms to learn.
Ethereum wallet compatibility. Just as in Ethereum, Arbitrum users need only hold keys, but do not have to store any coin history or additional data to protect or access their funds. Since Arbitrum transactions are semantically identical to Ethereum L1 transactions, existing Ethereum users can use their existing Ethereum keys with their existing wallet software such as Metamask.
Token interoperability. Users can easily transfer their ETH, ERC-20 and ERC-721 tokens between Ethereum and the Arbitrum Rollup chain. As we explain in detail below, it is possible to mint tokens in L2 that can subsequently be withdrawn and recognized by the L1 token contract.
Fast finality. Transactions complete with the same finality time as Ethereum L1 (and it's possible to get faster finality guarantees by trading away trust assumptions; see the Arbitrum Rollup whitepaper for details).
Non-custodial. Arbitrum Rollup is a non-custodial scaling solution, so users control their funds/points and neither Reddit nor anyone else can ever access or revoke points held by users.
Censorship Resistant. Since it's completely decentralized, and the Arbitrum protocol guarantees progress trustlessly, Arbitrum Rollup is just as censorship-proof as Ethereum.
Block explorer. The Arbitrum Rollup block explorer allows users to view and analyze transactions on the Rollup chain.
Limitations
Although this is a bake-off, we're not going to sugar coat anything. Arbitrum Rollup, like any Optimistic Rollup protocol, does have one limitation, and that's the delay on withdrawals.
As for the concrete length of the delay, we've done a good deal of internal modeling and have blogged about this as well. Our current modeling suggests a 3-hour delay is sufficient (but as discussed in the linked post there is a tradeoff space between the length of the challenge period and the size of the validators’ deposit).
Note that this doesn't mean that the chain is delayed for three hours. Arbitrum Rollup supports pipelining of execution, which means that validators can keep building new states even while previous ones are “in the pipeline” for confirmation. As the challenge delays expire for each update, a new state will be confirmed (read more about this here).
So activity and progress on the chain are not delayed by the challenge period. The only thing that's delayed is the consummation of withdrawals. Recall though that any single honest validator knows immediately (at the speed of L1 finality) which state updates are correct and can guarantee that they will eventually be confirmed, so once a valid withdrawal has been requested on-chain, every honest party knows that the withdrawal will definitely happen. There's a natural place here for a liquidity market in which a validator (or someone who trusts a validator) can provide withdrawal loans for a small interest fee. This is a no-risk business for them as they know which withdrawals will be confirmed (and can force their confirmation trustlessly no matter what anyone else does) but are just waiting for on-chain finality.
3. The recipe: How Arbitrum Rollup works
For a description of the technical components of Arbitrum Rollup and how they interact to create a highly scalable protocol with a developer experience that is identical to Ethereum, please refer to the following documents:
Arbitrum Rollup Whitepaper
Arbitrum academic paper (describes a previous version of Arbitrum)
4. Developer docs and APIs
For full details about how to set up and interact with an Arbitrum Rollup chain or validator, please refer to our developer docs, which can be found at https://developer.offchainlabs.com/.
Note that the Arbitrum version described on that site is older and will soon be replaced by the version we are entering in Reddit Bake-Off, which is still undergoing internal testing before public release.
5. Who are the validators?
As with any Layer 2 protocol, advancing the protocol correctly requires at least one validator (sometimes called block producers) that is honest and available. A natural question is: who are the validators?
Recall that the validator set for an Arbitrum chain is open and permissionless; anyone can start or stop validating at will. (A useful analogy is to full nodes on an L1 chain.) But we understand that even though anyone can participate, Reddit may want to guarantee that highly reputable nodes are validating their chain. Reddit may choose to validate the chain themselves and/or hire third-party validators. To this end, we have begun building a marketplace for validator-for-hire services so that dapp developers can outsource validation services to reputable nodes with high up-time. We've announced a partnership in which Chainlink nodes will provide Arbitrum validation services, and we expect to announce more partnerships shortly with other blockchain infrastructure providers.
Although there is no requirement that validators are paid, Arbitrum’s economic model tracks validators’ costs (e.g. amount of computation and storage) and can charge small fees on user transactions, using a gas-type system, to cover those costs. Alternatively, a single party such as Reddit can agree to cover the costs of invited validators.
6. Reddit Contract Support
Since Arbitrum contracts and transactions are byte-for-byte compatible with Ethereum, supporting the Reddit contracts is as simple as launching them on an Arbitrum chain.
Minting. Arbitrum Rollup supports hybrid L1/L2 tokens which can be minted in L2 and then withdrawn onto the L1. An L1 contract at address A can make a special call to the EthBridge which deploys a "buddy contract" to the same address A on an Arbitrum chain. Since it's deployed at the same address, users can know that the L2 contract is the authorized "buddy" of the L1 contract on the Arbitrum chain.
For minting, the L1 contract is a standard ERC-20 contract which mints and burns tokens when requested by the L2 contract. It is paired with an ERC-20 contract in L2 which mints tokens based on whatever programmer provided minting facility is desired and burns tokens when they are withdrawn from the rollup chain. Given this base infrastructure, Arbitrum can support any smart contract based method for minting tokens in L2, and indeed we directly support Reddit's signature/claim based minting in L2.
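As a rough illustration of the pairing described above (the real contracts are Solidity on Ethereum and Arbitrum; this toy Python model only shows the invariant that tokens withdrawn to L1 are burned on L2 first):

```python
class HybridToken:
    """Toy model of an L2-minted token paired with an L1 ERC-20 'buddy'."""
    def __init__(self):
        self.l2 = {}   # balances minted on the Arbitrum chain
        self.l1 = {}   # balances recognized by the L1 ERC-20 contract

    def mint_l2(self, user, amount):
        # Minting happens on L2, by whatever facility the programmer provides
        self.l2[user] = self.l2.get(user, 0) + amount

    def withdraw(self, user, amount):
        # Withdrawal burns on L2 and mints on L1 via the buddy pairing
        assert self.l2.get(user, 0) >= amount, "insufficient L2 balance"
        self.l2[user] -= amount
        self.l1[user] = self.l1.get(user, 0) + amount

t = HybridToken()
t.mint_l2("alice", 50)
t.withdraw("alice", 20)
print(t.l2["alice"], t.l1["alice"])  # 30 20
```

In the real system the L1 side is a standard ERC-20 contract and the burn/mint handoff is enforced by the EthBridge rather than a shared object.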
Batch minting. What's better than a mint cookie? A whole batch! In addition to supporting Reddit’s current minting/claiming scheme, we built a second minting design, which we believe outperforms the signature/claim system in many scenarios.
In the current system, Reddit periodically issues signed statements to users, who then take those statements to the blockchain to claim their tokens. An alternative approach would have Reddit directly submit the list of users/amounts to the blockchain and distribute the tokens to the users without the signature/claim process.
To optimize the cost efficiency of this approach, we designed an application-specific compression scheme to minimize the size of the batch distribution list. We analyzed the data from Reddit's previous distributions and found that the data is highly compressible since token amounts are small and repeated, and addresses appear multiple times. Our function groups transactions by size, and replaces previously-seen addresses with a shorter index value. We wrote client code to compress the data, wrote a Solidity decompressing function, and integrated that function into Reddit’s contract running on Arbitrum.
When we ran the compression function on the previous Reddit distribution data, we found that we could compress batched minting data down to 11.8 bytes per minting event (averaged over a 6-month trace of Reddit’s historical token grants), compared with roughly 174 bytes of on-chain data needed for the signature/claim approach to minting (roughly 43 for an RLP-encoded null transaction + 65 for Reddit's signature + 65 for the user's signature + roughly 8 for the number of Points).
The relative benefit of the two approaches with respect to on-chain call data cost depends on the percentage of users that will actually claim their tokens on chain. With the above figures, batch minting will be cheaper if roughly 5% of users redeem their claims. We stress that our compression scheme is not Arbitrum-specific and would be beneficial in any general-purpose smart contract platform.
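To give a feel for the kind of application-specific compression described (the byte layout below is an invented illustration, not Offchain Labs' actual encoding), here is a minimal sketch that replaces previously-seen addresses with a short index:

```python
def compress_mints(mints):
    """Compress a list of (address, amount) minting events.

    First occurrences send the full 20-byte address; repeats send a
    3-byte index into a dictionary built on the fly. The tag bytes and
    field widths here are illustrative placeholders only.
    """
    seen = {}          # address -> index of first occurrence
    out = bytearray()
    for addr, amount in mints:
        if addr in seen:
            out += b"\x01" + seen[addr].to_bytes(3, "big")  # 1 + 3 bytes
        else:
            seen[addr] = len(seen)
            out += b"\x00" + bytes.fromhex(addr[2:])        # 1 + 20 bytes
        out += amount.to_bytes(4, "big")                    # amounts are small
    return bytes(out)

mints = [
    ("0x" + "aa" * 20, 100),
    ("0x" + "bb" * 20, 250),
    ("0x" + "aa" * 20, 100),   # repeated address: 8 bytes instead of 25
]
blob = compress_mints(mints)
print(len(blob))  # 25 + 25 + 8 = 58 bytes
```

The real scheme additionally groups transactions by size, which this sketch omits.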
8. Benchmarks and costs
In this section, we give the full costs of operating the Reddit contracts on an Arbitrum Rollup chain including the L1 gas costs for the Rollup chain, the costs of computation and storage for the L2 validators as well as the capital lockup requirements for staking.
Arbitrum Rollup is still on testnet, so we did not run mainnet benchmarks. Instead, we measured the L1 gas cost and L2 workload for Reddit operations on Arbitrum and calculated the total cost assuming current Ethereum gas prices. As noted below in detail, our measurements do not assume that Arbitrum is consuming the entire capacity of Ethereum. We will present the details of our model now, but for full transparency you can also play around with it yourself and adjust the parameters by copying the spreadsheet found here.
Our cost model is based on measurements of Reddit’s contracts, running unmodified (except for the addition of a batch minting function) on Arbitrum Rollup on top of Ethereum.
On the distribution of transactions and frequency of assertions. Reddit's instructions specify the following minimum parameters that submissions should support:
Over a 5 day period, your scaling PoC should be able to handle:
  • 100,000 point claims (minting & distributing points)
  • 25,000 subscriptions
  • 75,000 one-off points burning
  • 100,000 transfers
We provide the full costs of operating an Arbitrum Rollup chain with this usage under the assumption that tokens are minted or granted to users in batches, but other transactions are uniformly distributed over the 5 day period. Unlike some other submissions, we do not make unrealistic assumptions that all operations can be submitted in enormous batches. We assume that batch minting is done in batches that use only a few percent on an L1 block’s gas, and that other operations come in evenly over time and are submitted in batches, with one batch every five minutes to keep latency reasonable. (Users are probably already waiting for L1 finality, which takes at least that long to achieve.)
We note that assuming that there are only 300,000 transactions that arrive uniformly over the 5 day period will make our benchmark numbers lower, but we believe that this will reflect the true cost of running the system. To see why, say that batches are submitted every five minutes (20 L1 blocks) and there's a fixed overhead of c bytes of calldata per batch, the cost of which will get amortized over all transactions executed in that batch. Assume that each individual transaction adds a marginal cost of t. Lastly assume the capacity of the scaling system is high enough that it can support all of Reddit's 300,000 transactions within a single 20-block batch (i.e. that there is more than c + 300,000*t bytes of calldata available in 20 blocks).
Consider what happens if c, the per-batch overhead, is large (which it is in some systems, but not in Arbitrum). In the scenario that transactions actually arrive at the system's capacity and each batch is full, then c gets amortized over 300,000 transactions. But if we assume that the system is not running at capacity--and only receives 300,000 transactions arriving uniformly over 5 days-- then each 20-block assertion will contain about 200 transactions, and thus each transaction will pay a nontrivial cost due to c.
We are aware that other proposals presented scaling numbers assuming that 300,000 transactions arrived at maximum capacity and were executed in a single mega-transaction, but according to our estimates, for at least one such report, this led to a reported gas price that was 2-3 orders of magnitude lower than it would have been assuming uniform arrival. We make more realistic batching assumptions, and we believe Arbitrum compares well when batch sizes are realistic.
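The effect of the arrival assumption is easy to check with back-of-the-envelope arithmetic; the fixed per-batch overhead c below is an arbitrary placeholder, not a measured Arbitrum figure:

```python
def per_tx_overhead(c_bytes, total_txs, batches):
    """Per-transaction share of a fixed per-batch calldata overhead."""
    txs_per_batch = total_txs / batches
    return c_bytes / txs_per_batch

TOTAL_TXS = 300_000
BATCHES = 5 * 24 * 60 // 5   # one batch every five minutes over 5 days = 1440
C = 1_000                    # assumed fixed overhead per batch, in bytes

# Uniform arrival: ~208 transactions per batch, so each pays a visible share
uniform = per_tx_overhead(C, TOTAL_TXS, BATCHES)

# Max-capacity assumption: one giant batch makes c nearly vanish per tx
single = per_tx_overhead(C, TOTAL_TXS, 1)
print(round(uniform, 2), round(single, 4))
```

With a large c, the uniform-arrival figure dwarfs the single-mega-batch figure, which is exactly why full-capacity benchmarks understate real-world cost.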
Our model. Our cost model includes several sources of cost:
  • L1 gas costs: This is the cost of posting transactions as calldata on the L1 chain, as well as the overhead associated with each batch of transactions, and the L1 cost of settling transactions in the Arbitrum protocol.
  • Validator’s staking costs: In normal operation, one validator will need to be staked. The stake is assumed to be 0.2% of the total value of the chain (which is assumed to be $1 per user who is eligible to claim points). The cost of staking is the interest that could be earned on the money if it were not staked.
  • Validator computation and storage: Every validator must do computation to track the chain’s processing of transactions, and must maintain storage to keep track of the contracts’ EVM storage. The cost of computation and storage are estimated based on measurements, with the dollar cost of resources based on Amazon Web Services pricing.
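A toy version of the staking line item, using the figures stated in the bullet above (the 5% annual interest rate is an assumed placeholder, not part of our model):

```python
def staking_cost(users, value_per_user, stake_frac, annual_rate, days):
    """Opportunity cost of locking the validator's stake for `days` days."""
    chain_value = users * value_per_user      # $1 per eligible user
    stake = chain_value * stake_frac          # 0.2% of chain value
    return stake * annual_rate * days / 365   # forgone interest

# 100,000 eligible claimants, 0.2% stake, assumed 5% rate, 5-day window
cost = staking_cost(100_000, 1.0, 0.002, 0.05, 5)
print(round(cost, 2))
```

Even for the full 5-day benchmark the staking opportunity cost is on the order of cents, consistent with calldata dominating the model.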
It’s clear from our modeling that the predominant cost is for L1 calldata. This will probably be true for any plausible rollup-based system.
Our model also shows that Arbitrum can scale to workloads much larger than Reddit’s nominal workload, without exhausting L1 or L2 resources. The scaling bottleneck will ultimately be calldata on the L1 chain. We believe that cost could be reduced substantially if necessary by clever encoding of data. (In our design any compression / decompression of L2 transaction calldata would be done by client software and L2 programs, never by an L1 contract.)
9. Status of Arbitrum Rollup
Arbitrum Rollup is live on Ethereum testnet. All of the code written to date including everything included in the Reddit demo is open source and permissively licensed under the Apache V2 license. The first testnet version of Arbitrum Rollup was released on testnet in February. Our current internal version, which we used to benchmark the Reddit contracts, will be released soon and will be a major upgrade.
Both the Arbitrum design as well as the implementation are heavily audited by independent third parties. The Arbitrum academic paper was published at USENIX Security, a top-tier peer-reviewed academic venue. For the Arbitrum software, we have engaged Trail of Bits for a security audit, which is currently ongoing, and we are committed to have a clean report before launching on Ethereum mainnet.
10. Reddit Universe Arbitrum Rollup Chain
The benchmarks described in this document were all measured using the latest internal build of our software. When we release the new software upgrade publicly we will launch a Reddit Universe Arbitrum Rollup chain as a public demo, which will contain the Reddit contracts as well as a Uniswap instance and a Connext Hub, demonstrating how Community Points can be integrated into third party apps. We will also allow members of the public to dynamically launch ecosystem contracts. We at Offchain Labs will cover the validating costs for the Reddit Universe public demo.
If the folks at Reddit would like to evaluate our software prior to our public demo, please email us at [email protected] and we'd be more than happy to provide early access.
11. Even more scaling: Arbitrum Sidechains
Rollups are an excellent approach to scaling, and we are excited about Arbitrum Rollup which far surpasses Reddit's scaling needs. But looking forward to Reddit's eventual goal of supporting hundreds of millions of users, there will likely come a time when Reddit needs more scaling than any Rollup protocol can provide.
While Rollups greatly reduce costs, they don't break the linear barrier. That is, all transactions have an on-chain footprint (because all calldata must be posted on-chain), albeit a far smaller one than on native Ethereum, and the L1 limitations end up being the bottleneck for capacity and cost. Since Ethereum has limited capacity, this linear use of on-chain resources means that costs will eventually increase superlinearly with traffic.
The good news is that we at Offchain Labs have a solution in our roadmap that can satisfy this extreme-scaling setting as well: Arbitrum AnyTrust Sidechains. Arbitrum Sidechains are similar to Arbitrum Rollup, but deviate in that they name a permissioned set of validators. When a chain’s validators agree off-chain, they can greatly reduce the on-chain footprint of the protocol and require almost no data to be put on-chain. When validators can't reach unanimous agreement off-chain, the protocol reverts to Arbitrum Rollup. Technically, Arbitrum Sidechains can be viewed as a hybrid between state channels and Rollup, switching back and forth as necessary, and combining the performance and cost that state channels can achieve in the optimistic case, with the robustness of Rollup in other cases. The core technical challenge is how to switch seamlessly between modes and how to guarantee that security is maintained throughout.
Arbitrum Sidechains break through this linear barrier, while still maintaining a high level of security and decentralization. Arbitrum Sidechains provide the AnyTrust guarantee, which says that as long as any one validator is honest and available (even if you don't know which one will be), the L2 chain is guaranteed to execute correctly according to its code and guaranteed to make progress. Unlike in a state channel, offchain progress does not require unanimous consent, and liveness is preserved as long as there is a single honest validator.
Note that the trust model for Arbitrum Sidechains is much stronger than for typical BFT-style chains, which introduce consensus "voting" protocols among a small permissioned group of validators. BFT-based protocols require a supermajority (more than 2/3) of validators to agree. In Arbitrum Sidechains, by contrast, all you need is a single honest validator to achieve guaranteed correctness and progress. Notice that in Arbitrum adding validators strictly increases security since the AnyTrust guarantee provides correctness as long as any one validator is honest and available. By contrast, in BFT-style protocols, adding nodes can be dangerous as a coalition of dishonest nodes can break the protocol.
Like Arbitrum Rollup, the developer and user experiences for Arbitrum Sidechains will be identical to that of Ethereum. Reddit would be able to choose a large and diverse set of validators, and all that they would need to guarantee to break through the scaling barrier is that a single one of them will remain honest.
We hope to have Arbitrum Sidechains in production in early 2021, and thus when Reddit reaches the scale that surpasses the capacity of Rollups, Arbitrum Sidechains will be waiting and ready to help.
While the idea to switch between channels and Rollup to get the best of both worlds is conceptually simple, getting the details right and making sure that the switch does not introduce any attack vectors is highly non-trivial and has been the subject of years of our research (indeed, we were working on this design for years before the term Rollup was even coined).
12. How Arbitrum compares
We include a comparison to several other categories, as well as specific projects when appropriate, and explain why we believe that Arbitrum is best suited for Reddit's purposes. We focus our attention on other Ethereum projects.
Payment only Rollups. Compared to Arbitrum Rollup, ZK-Rollups and other Rollups that only support token transfers have several disadvantages:
  • As outlined throughout the proposal, we believe that the entire draw of Ethereum is in its rich smart contracts support which is simply not achievable with today's zero-knowledge proof technology. Indeed, scaling with a ZK-Rollup will add friction to the deployment of smart contracts that interact with Community Points as users will have to withdraw their coins from the ZK-Rollup and transfer them to a smart contract system (like Arbitrum). The community will be best served if Reddit builds on a platform that has built-in, frictionless smart-contract support.
  • All other Rollup protocols of which we are aware employ a centralized operator. While it's true that users retain custody of their coins, the centralized operator can often profit from censoring, reordering, or delaying transactions. A common misconception is that since they're non-custodial protocols, a centralized sequencer does not pose a risk, but this is incorrect: the sequencer can wreak havoc or shake down users for side payments without directly stealing funds.
  • Sidechain type protocols can eliminate some of these issues, but they are not trustless. Instead, they require trust in some quorum of a committee, often requiring two-thirds of the committee to be honest, compared to rollup protocols like Arbitrum that require only a single honest party. In addition, not all sidechain type protocols have committees that are diverse, or even non-centralized, in practice.
  • Plasma-style protocols have a centralized operator and do not support general smart contracts.
13. Concluding Remarks
While it's ultimately up to the judges’ palate, we believe that Arbitrum Rollup is the bakeoff choice that Reddit kneads. We far surpass Reddit's specified workload requirement at present, have much room to optimize Arbitrum Rollup in the near term, and have a clear path to get Reddit to hundreds of millions of users. Furthermore, we are the only project that gives developers and users the identical interface as the Ethereum blockchain and is fully interoperable and tooling-compatible, and we do this all without any new trust assumptions or centralized components.
But no matter how the cookie crumbles, we're glad to have participated in this bake-off and we thank you for your consideration.
About Offchain Labs
Offchain Labs, Inc. is a venture-funded New York company that spun out of Princeton University research, and is building the Arbitrum platform to usher in the next generation of scalable, interoperable, and compatible smart contracts. Offchain Labs is backed by Pantera Capital, Compound VC, Coinbase Ventures, and others.
Leadership Team
Ed Felten
Ed Felten is Co-founder and Chief Scientist at Offchain Labs. He is on leave from Princeton University, where he is the Robert E. Kahn Professor of Computer Science and Public Affairs. From 2015 to 2017 he served at the White House as Deputy United States Chief Technology Officer and senior advisor to the President. He is an ACM Fellow and member of the National Academy of Engineering. Outside of work, he is an avid runner, cook, and L.A. Dodgers fan.
Steven Goldfeder
Steven Goldfeder is Co-founder and Chief Executive Officer at Offchain Labs. He holds a PhD from Princeton University, where he worked at the intersection of cryptography and cryptocurrencies including threshold cryptography, zero-knowledge proof systems, and post-quantum signatures. He is a co-author of Bitcoin and Cryptocurrency Technologies, the leading textbook on cryptocurrencies, and he has previously worked at Google and Microsoft Research, where he co-invented the Picnic signature algorithm. When not working, you can find Steven spending time with his family, taking a nature walk, or twisting balloons.
Harry Kalodner
Harry Kalodner is Co-founder and Chief Technology Officer at Offchain Labs where he leads the engineering team. Before the company he attended Princeton as a Ph.D candidate where his research explored economics, anonymity, and incentive compatibility of cryptocurrencies, and he also has worked at Apple. When not up at 3:00am writing code, Harry occasionally sleeps.
submitted by hkalodner to ethereum [link] [comments]

Two reasons why a stable 10-minute block time average is more important than a rather small long-term schedule drift.

Before I open my two reasons, two preliminary facts:
  1. Current schedule drift is less than 1 year over the lifetime of Bitcoin Cash so far.
  2. If we correct the oscillation and implement a stable new difficulty algorithm that is absolutely scheduled, we stop the future drift. We are left only with the drift accumulated in the past, measured from wherever we base our absolute scheduling, which may mean we do not correct for past drift at all. To consider whether we should, I want to put up two reasons why I think a stable 10-minute block time average is more important than a small long-term schedule drift.
1. The average time will be a factor in dimensioning computer systems that build on Bitcoin Cash.
Right now, one might think the 10 minute block average leaves plenty of headroom, because the network is not being heavily used and there is a quasi-consensus rule which implies most nodes will not accept blocks > 32 MB.
But 32mb is far from the end goal of Bitcoin Cash on chain scaling. For Bitcoin Cash to succeed, we hope blocks will become MUCH bigger. This means the performance of relaying and processing blocks will become an important factor for node implementations.
Software needs hardware and hardware systems need to be dimensioned for the expected workload.
A stable 10-minute average allows easier dimensioning of the server hardware needed to deal with large-sized blocks.
As Gavin Andresen said: "design for success".
Therefore we should think about the case where blocks are large.
Now, what happens if we implement an algorithm where blocks can take longer for, let's say, the next 5 years, but then the difficulty suddenly drops a bit so that they effectively arrive faster?
Then your system which you dimensioned for e.g. 11.25 minutes / block would suddenly need to process more per time interval and might be under-dimensioned.
We could say: no problem - technology is going to improve anyway. The consensus around which block sizes are allowed needs to move toward higher block sizes anyway, which will make current systems under-dimensioned, perhaps much faster.
But is it really necessary to add another complicating factor to this already complex calculation by implementing a diminishing block time average?
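To make the dimensioning concern concrete, here is a minimal sketch; the 1 GB block size and the 11.25-minute average are illustrative assumptions, not proposals:

```python
def required_throughput(block_size_bytes: int, avg_interval_s: float) -> float:
    """Bytes per second a node must sustain to keep up with the chain."""
    return block_size_bytes / avg_interval_s

GB = 10**9
# Hardware dimensioned for 1 GB blocks arriving every 11.25 minutes on average...
slow = required_throughput(1 * GB, 11.25 * 60)
# ...must handle 12.5% more data per second once blocks average 10 minutes again.
fast = required_throughput(1 * GB, 10 * 60)
print(f"{slow:,.0f} B/s -> {fast:,.0f} B/s (+{fast / slow - 1:.1%})")
```

The absolute numbers matter less than the ratio: any diminishing block-time average silently raises the peak workload that node hardware was dimensioned for.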
2. Enabling changes to monetary policy, such as influencing the emission schedule, opens the Overton window to less well-considered monetary policy changes.
In the less harmful scenario, something like drift correction would be well motivated, its benefits and risks laid out clearly, discussed and eventually accepted by Bitcoin Cash (stake)holders.
In the more harmful scenario, such changes would be pushed through without much discussion by a small group, demonstrating that the protocol is easy to change in ways that are neither well motivated nor largely risk-free, and that may even disproportionately benefit a certain subset of stakeholders. This would not be a good selling point for Bitcoin Cash.
Those are the two reasons I want to bring up for considering the stability of the "10 minute block time average" as a more important point than correcting for a drift which, compared to long-term emission (> 100 years), is less than 1%.
submitted by Pablo_Picasho to btc [link] [comments]

How Bitcoin Mining Works

When you hear about bitcoin “mining,” you envisage coins being dug out of the ground. But bitcoin isn’t physical, so why do we call it mining?
Similar to gold mining, bitcoins exist in the protocol’s design just as the gold exists underground, but they haven’t been brought out into the light yet, just as the gold hasn’t yet been dug up.
The bitcoin protocol stipulates that a maximum of 21 million bitcoins will exist at some point. What miners do is bring them out into the light, a few at a time. Once miners finish mining all these coins, there won't be more coins rolling out unless the bitcoin protocol changes to allow for a larger supply. Miners get paid in newly created bitcoin, plus transaction fees, for creating blocks of validated transactions and including them in the blockchain.
To understand how bitcoin mining works, let's backtrack a little bit and talk about nodes. A node is a powerful computer that runs the bitcoin software and fully validates transactions and blocks. Since the bitcoin network is decentralized, these nodes are collectively responsible for confirming pending transactions.
Anyone can run a node—you just download the free bitcoin software. The drawback is that it consumes energy and storage space – the network at time of writing takes hundreds of gigabytes of data. Nodes spread bitcoin transactions around the network. One node will send information to a few nodes that it knows, who will relay the information to nodes that they know, etc. That way, the pending transaction ends up getting around the whole network pretty quickly.
Some nodes are mining nodes, usually referred to as miners. These group outstanding transactions into blocks and add them to the blockchain. How do they do this? By solving a complex mathematical puzzle that is part of the bitcoin program, and including the answer in the block.
The puzzle that needs solving is to find a number that, when combined with the data in the block and passed through a hash function (which converts input data of any size into output data of a fixed length), produces a result that is within a certain range.
For trivia lovers, this number is called a "nonce", which is an abbreviation of "number used once." In the blockchain, the nonce is an integer between 0 and 4,294,967,295.
How do they find this number? By guessing at random. The hash function makes it impossible to predict what the output will be. So, miners guess the mystery number and apply the hash function to the combination of that guessed number and the data in the block. A valid resulting hash must start with a certain number of zeroes. There's no way of knowing which number will work, because two consecutive integers will give wildly varying results. What's more, there may be several nonces that produce the desired result, or there may be none. In that case, the miners keep trying, but with a different block configuration.
The difficulty of the calculation (the required number of zeros at the beginning of the hash string) is adjusted frequently, so that it takes on average about 10 minutes to process a block.
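A toy version of this guessing loop can be sketched in a few lines of Python. Note the simplifications: real Bitcoin applies SHA-256 twice to an 80-byte block header and compares against a 256-bit numeric target, and the four leading hex zeroes here are an arbitrary choice to keep the search fast:

```python
import hashlib

def mine(block_data: bytes, difficulty: int) -> int:
    """Try nonces 0, 1, 2, ... until the hash starts with `difficulty` hex zeroes."""
    target = "0" * difficulty
    nonce = 0
    while True:
        digest = hashlib.sha256(block_data + nonce.to_bytes(4, "little")).hexdigest()
        if digest.startswith(target):
            return nonce
        nonce += 1

nonce = mine(b"toy block header", 4)
print(nonce)  # each extra required zero multiplies the expected work by 16
```

There is no shortcut here: the loop really does try candidate numbers one after another, which is exactly why mining reduces to raw guessing throughput.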
Why 10 minutes? That is the amount of time that the bitcoin developers think is necessary for a steady and diminishing flow of new coins until the maximum number of 21 million is reached (expected some time in 2140).
The first miner to get a resulting hash within the desired range announces its victory to the rest of the network. All the other miners immediately stop work on that block and start trying to figure out the mystery number for the next one. As a reward for its work, the victorious miner gets some new bitcoin.
At the time of writing, the reward is 6.25 bitcoins per block, which is worth around $56,000 in June 2020.
However, it’s not nearly as cushy a deal as it sounds. There are a lot of mining nodes competing for that reward, and the more computing power you have and the more guessing calculations you can perform, the luckier you are.
Also, the costs of being a mining node are considerable, not only because of the powerful hardware needed, but also because of the large amounts of electricity consumed by these processors.
And, the number of bitcoins awarded as a reward for solving the puzzle will decrease. It’s 6.25 now, but it halves every four years or so (the next one is expected in 2024). The value of bitcoin relative to cost of electricity and hardware could go up over the next few years to partially compensate for this reduction, but it’s not certain.
If you’ve made it this far, then congratulations! There is still so much more to explain about the system, but at least now you have an idea of the broad outline of the genius of the programming and the concept. For the first time we have a system that allows for convenient digital transfers in a decentralized, trust-free and tamper-proof way.
submitted by hackatoshi to u/hackatoshi [link] [comments]

Themis (MIS) Launches Pledge Mining Platform, New Opportunity Occurs to Grow Wealth

With the development of blockchain technology, obtaining on-chain data alone is no longer sufficient, and bridging the real world and the blockchain world has always been the direction of technological breakthrough. Against this background, the Oracle Machine came to our attention. In particular, with the popularity of the DeFi concept, the industry has started to witness a boom in the application of Oracle Machines in financial derivatives, trading platforms, gambling games, and prediction markets.
At present, Oracle Machine projects represented by Themis are developing fast with good momentum, leading the trend of Oracle Machine development and continuing to consolidate the basic technical support for the DeFi revolution. Themis' mining system has been launched in the market, which is refreshing and appealing (see https://themisoracle.com/#/credit for details on the Themis mining).
90% of MIS, the native token of Themis, will be used for mining output. The entire mining mechanism runs through a distributed oracle protocol, which sets up three roles: data provider, data validator, and arbitration node. Reward and punishment mechanisms are applied to ensure the smooth ecological operation.
How does Themis mining work? Is it a new way to become wealthy? What are the characteristics? To answer these questions, we need to analyse the distribution mechanism, mining mechanism, and token value of Themis.
With a fairer mining mechanism, small and medium-sized miners can enjoy better benefits
One of the core values of blockchain is fairness and justice, and allowing everyone in the network to play a role in the system without permission. However, Bitcoin mining is now monopolized by several mining machine vendors such as Bitmain, leaving little space for other miners to participate. If those old PoW public chains, such as Bitcoin, has formed the head effect in mining, what about those new projects? Let's take Cosmos as an example. Since Binance joined its validator node, it has instantly ranked top with the strong financial strength and user base of the top exchange, making the small and medium nodes hard to participate.
After comparison, we can find that the mining mechanism of MIS is very friendly to ordinary users. Assuming that there are 12 mining transactions in a block, the ranking according to the MIS pledged by each transaction would be as follows:
https://preview.redd.it/1kfccgps2pg51.png?width=832&format=png&auto=webp&s=bf6c7f614c600826006bc2bf8a6026292c3b328c
The pledge ranking is based on the jump ranking weighting algorithm rather than the weighted average of the user pledge amount, which can prevent MIS from being controlled by a small number of people, avoid monopoly, creating a win-win situation in the Themis community.

https://preview.redd.it/pme9tcd62pg51.png?width=832&format=png&auto=webp&s=049f899d2a5ee3ce64007d5cc0ae3ed6167c2b3a
Compared with other mining projects, Themis has introduced a unique pledge ranking method in the mining design. Users in the best ranking area will get the most benefits, which is a good mechanism guarantee for attracting more users to participate in mining. At the same time, it can lead to the decentralization of data providers, ensuring the decentralization of the oracle system and the positive development of the community.
How can miners join in Themis mining? The answer is to become a part of the ecology by playing the role of either data provider, data validator, or arbitration node.
The data provider is mainly responsible for providing various types of data, and the data validator verifies and challenges the data offered by the data provider and provides new data. The arbitration node arbitrates the query raised by the data validator and come up with the final result.
Both the data providers and validators of Themis need to pledge MIS to obtain the qualifications, and the caller of external data also needs to pay MIS assets when accessing the data of Themis oracles. If the data has been verified as correct, data providers and validators will receive mining rewards, and the more they pledge, the more rewards they will receive.
In the mining design of Themis, miners can acquire MIS by providing verifiable random numbers or offering the price of on-chain assets. Whenever miners call mining contracts, the system charges no service fee (excluding the ETH network fee). In addition, if no mining transaction occurs within a certain period of time, the first newly-emerging block containing mining transactions will acquire all the accumulated MIS rewards. In this way, miners are encouraged to continue mining and maintain the ecological stability of Themis.
The number of MIS mining for each mining transaction of miners is calculated as follows:
First, calculate the number of MIS mining rewards N contained in the block of the packaged mining transaction. If the height difference between the block and the previous block containing the mining transaction is y, then N = y * 20.
The MIS mining quantity of this mining transaction is M, where M = X_i / (Σ_j X_j) × N. Here X is the ranking of the MIS pledge amount in the block, and those who pledge the same amount of MIS have the same ranking.
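As a sketch of how such a rank-weighted split could work: the denominator of the formula did not survive in the source text, so taking it to be the sum of all pledge rankings in the block is an assumption, as are the example ranks:

```python
def mis_rewards(pledge_ranks: list, block_gap: int) -> list:
    """Split a block's MIS reward, N = block_gap * 20, in proportion to rank X_i.

    Assumption: each transaction's share is X_i divided by the sum of all
    ranks in the block, so the shares always sum to the full reward N.
    """
    n = block_gap * 20  # 20 MIS per block, accumulated over skipped blocks
    total = sum(pledge_ranks)
    return [n * x / total for x in pledge_ranks]

print(mis_rewards([1, 2, 3], block_gap=1))  # shares sum to the full 20 MIS
```

Whatever the exact denominator, the key property is that the shares are set by rank rather than raw pledge amount, which is what blunts the advantage of a single whale pledge.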
Little official pre-mining: 90% belongs to the community
Based on the official announcement, the distribution of MIS is:
The total amount of MIS is 1 billion: 10% is reserved for early project promotion, and the remaining 90% is produced by mining, of which 75% is directly awarded to data providers, 10% to developers, and 5% serves as the reward for arbitration nodes and ecological incentives. Mining output decreases progressively and is released along with ETH blocks. In some currently popular VC-backed projects, institutions hold more than half of the tokens and unlock them every month, which puts huge pressure on ordinary pledging users. Many projects have also gone wrong because institutional investors did not abide by the rules. Because MIS has little official pre-mining, the selling pressure will be smaller, which is more in line with the values of blockchain.
The release plan of developer and arbitration node and ecological incentive is as follows:

https://preview.redd.it/nld8k8gb2pg51.jpg?width=926&format=pjpg&auto=webp&s=8c2435c993cf86b2bf6b0c4d2a1935708734de97
The release plan of data provider incentive is as follows:

https://preview.redd.it/kgia5n6d2pg51.jpg?width=982&format=pjpg&auto=webp&s=ad1bfe7796fdaec58f3caede2f2a2083c0a07724
The MIS awarded per block reduces by 10% every 4 million blocks; the reward per block at present is 20 MIS.
We can see that the allocation of MIS follows the following principles.
First of all, as MIS is the platform certificate of Themis, it is very reasonable to reserve 10% of MIS for early project promotion.
Secondly, 90% of MIS is produced through sustainable mining. This proportion can motivate contract users and miners to conduct contract mining, truly implementing the spirit of win-win community and token economy.
Finally, among the 90% of MIS, better incentive mechanisms have been adapted, mining reward ratios are subdivided, which can attract more investors to participate in mining.
Reasonable mining mechanism highlights the project value of Themis
Themis, as a public chain that provides a mechanism to solve the problems in Oracle Machine, has a unique charm in the value of MIS.
From the perspective of the number of tokens, the total amount of MIS is 1 billion, and the total mining pool is 900 million. 90% of the tokens are generated by mining, and the mining output is gradually reduced as Ethereum blocks are produced, giving it strong potential for future appreciation. The earlier you participate in mining, the more profit you can gain.
From the perspective of Themis's ecological design, Themis is committed to the original intention of building a price oracle. The data provider pays on-chain fees and pledges a certain amount of MIS, and the income obtained is determined according to the scale of the pledge; the validator can profit from challenging the data. Also, any smart contract developer or user needs to pay the corresponding fee when calling Themis, and this part of the profit will be distributed to the data providers in proportion. Through this design, a logical closed loop is completed to ensure the healthy operation of the entire ecology and achieve the goal of mutual benefit. In Themis, all parties in the ecology can work together to grow more wealth.
In all, MIS has a huge potential for future development and arbitrage, and of course, a great profit potential as well.
Today, public chains like Themis are not just a technology platform, but also a symbol of future economic operation mode which connect between the blockchain and the real world. Themis, with a fair, justice and open network through mining, is building a strong token ecology, connecting external chain data and the systems, realising data interaction between blockchain and the real world, and more importantly, creating a new mode of token economy.
submitted by ThemisOracle to u/ThemisOracle [link] [comments]

Doing the Math on the S4E10 eCoin Transaction...

In this week's Mr Robot episode, Darlene sits on a park bench with Dom, and distributes the money she stole from the Deus Group to everybody, evenly. I timed the transaction as it happened in the show. It was 24 seconds, between her hitting return and seeing the following message on her screen: "*Transfers Complete. All Wallets Updated*" This processing time includes a message that says, "cleaning coins through crypto tumbler". It took 1 minute and 16 seconds for the transaction to tumble, process, and for the recipients to begin to get notices that they received money in their accounts.
If you have worked with bitcoin, you know that cryptocurrency does not work like this. Transferring money is a slow and sometimes expensive process, as transaction fees eat into every transaction. I know that eCoin isn't necissarily bitcoin, because it's controlled by eCorp, but it's fun to think about what happens if eCoin works like bitcoin does today...
How much money was transferred?
According to Forbes, the most wealthy people in the world are worth a combined $8.7 trillion, or $2.7 trillion. It depends on which Forbes list you are looking at. On the actual Forbes web site, they say the richest people in the world are worth $8.7 trillion, but they do not state how many of the richest people in the world are worth that much. If you look at sites like Victor Media, they publish a table of the 100 most wealthy people, and say they got the list from Forbes. They probably did purchase the list from Forbes. If I put the Victor Media list into excel, and add all the values in the net worth column, that number comes out to $2.7 trillion. So Forbes might be talking about a list that is more than the top 100 people, and sell the top 100 people list to sites like Victor Media? I don't know.
Either way, we are talking about somewhere between $2.7 and $8.7 trillion.
How many people did the money go to?
That's complicated. There was no global montage showing people celebrating all over the world (which I found a little surprising, even though I still love how this episode was shot). The only indication of a truly global transfer, to every individual in the world, is a TV screen in the airport saying that, "Global eCoin Payout... Deus group collapses as wealth spreads around the world." So Darlene could have sent the money to every individual with an eCoin wallet in the world, or she could be sending them to every American, or to everybody in the developed world. I doubt the average rice farmer in Indonesia is really using eCoin, but it's possible. If she only sent it to every American, our wealth tends to spread around the globe pretty fast, so that's possible, too.
Lets work with World Bank population numbers for all three of these possibilities...
World Population: 7.6 billion people
Global North (AKA the developed world): 1.21 billion people
United States: 327 million people
So we have 6 possibilities for how much money was sent to each person...
| People | Total Money | Money Per Capita | Satoshis |
|---|---|---|---|
| 7.6 billion | $2.7 trillion | $355.53 | 4,739,471 |
| 1.21 billion | $2.7 trillion | $2,230.82 | 29,741,808 |
| 327 million | $2.7 trillion | $8,252.65 | 110,079,512 |
| 7.6 billion | $8.7 trillion | $1,145.60 | 15,279,332 |
| 1.21 billion | $8.7 trillion | $7,188.22 | 95,883,716 |
| 327 million | $8.7 trillion | $26,591.89 | 355,275,242 |
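Each of these rows reduces to one division; the ~$7,500/BTC price used for the satoshi column is an assumption inferred from the table itself:

```python
def per_capita(total_usd: float, population: int, btc_usd: float = 7_500) -> tuple:
    """Return (USD per person, satoshis per person) at an assumed BTC price."""
    usd = total_usd / population
    sats = round(usd / btc_usd * 100_000_000)
    return usd, sats

usd, sats = per_capita(2.7e12, 7_600_000_000)
print(f"${usd:.2f} per person, {sats:,} sats")
```

The small differences from the table come from rounding in the assumed price and population figures.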
How much would this transaction cost with bitcoin?
Aside from the fact that eCoin probably functions differently than bitcoin, this is a very complex question. I'm definitely not as sure about these numbers as the other numbers I have, but I'll do my best to come up with useful, realistic numbers. If you are more familiar with the block chain than me, please correct me.
The coins were taken from 100 different Deus Group accounts. Let's say each transaction launders through a bitcoin tumbler 100 times. I'm going to ignore transaction fees for the tumbling process, because I don't fully understand the details of tumbling, but 100 rounds seems reasonable to me.
That means that there are 100 x 100 = 10,000 inputs in any transaction that spends all the money from the Deus group.
For outputs... for simplicity's sake, I will make the conservative assumption that everybody has one eCoin wallet. That means somewhere between 327 million and 7.6 billion outputs. Accounting for everybody having multiple wallets would make the transaction even bigger, but this is a good starting point to get a feel for what this transaction would look like, in the real world.
How long will this transaction take to process?
There is a bidding process and a bit of politics involved in processing a cryptocurrency transaction. For simplicity, I'll assume we bid enough that this transaction gets priority treatment from the bitcoin miners.
According to blockchain.com, transactions happen on the block chain at a rate of roughly 3.5 transactions per second. At that rate, the tumbling would take roughly 48 minutes, rather than the few seconds it took for Darlene to tumble this money.
According to buybitcoinworldwide.com's fee calculator, here are the transaction sizes, the transaction fees involved (in US Dollars), and the time it would take at 3.5 transactions per second...
| Inputs | Outputs | Size | Cost | Time |
|---|---|---|---|---|
| 10,000 | 7.6 billion | 240.4737 GB | $38,884,280.55 | 68.85 years |
| 10,000 | 1.21 billion | 38.32587 GB | $6,192,571.09 | 10.96 years |
| 10,000 | 327 million | 10.35582 GB | $1,673,260.46 | 2.96 years |
So this transaction would take years to go through, and it pays Evil Corp somewhere between $1.6 and $38 million. In the real world, most of that money would go to Chinese bitcoin miners.
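These figures can be roughly reconstructed by hand; the ~148 bytes per input and ~34 bytes per output are rule-of-thumb sizes for legacy transactions, and assumptions on my part, so they land in the same ballpark as the fee calculator rather than matching it exactly:

```python
def tx_size_gb(n_inputs: int, n_outputs: int) -> float:
    """Rule-of-thumb legacy tx size: ~148 B per input, ~34 B per output."""
    return (n_inputs * 148 + n_outputs * 34 + 10) / 1e9

def years_to_process(n_outputs: int, tps: float = 3.5) -> float:
    """Treat each recipient as one payout at the network's ~3.5 tx/s rate."""
    return n_outputs / tps / (60 * 60 * 24 * 365)

print(f"{tx_size_gb(10_000, 7_600_000_000):.1f} GB, "
      f"{years_to_process(7_600_000_000):.2f} years")
```

The output count dominates both the size and the time, which is why paying every wallet on Earth is the expensive part, not emptying the 100 Deus accounts.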
What would the impact be?
A one-time windfall of roughly $356 per capita would probably not trigger hyperinflation in America. The largest payout we calculated was $26.5k, and I doubt that would cause hyperinflation, either. Regular inflation? Yes. Hyperinflation? Probably not.
It might lead to hyperinflation in other countries, though, because of differences in purchasing power.
Purchasing power parity is a number that describes differences in the cost of goods and services around the world. $5 in America will buy you a big mac, but if you go to, say, Indonesia, you can buy a lot more with that $5, because typical incomes there are a tiny fraction of American ones.
OECD.org publishes PPP (purchasing power parity) numbers for countries all around the world. If you want to know how far your dollar will stretch, on average, in a foreign country, consult this list. If you have $100 in America, you can expect it to be worth $100 worth of American goods and services, so on the OECD table, it has a PPP of 1.0. If you take that $100 to, say, the UK, where the PPP is 0.7, you can expect that $100 to be worth $70 worth of goods and services. If you take that $100 to Australia, where the PPP is 1.48, you can expect that $100 to buy roughly $148 worth of goods and services.
If Elliot and Darlene were genius economists, I might expect them to account for PPP in their payout. They would have to be geniuses, to predict what PPP is doing after events like the 5/9 hack, because their best data would be out of date, so they would have to use all kinds of fancy regressions and tricks to figure out how that would work in such a volatile world economy. They definitely aren't economists, though, so I'll assume they sent the same nominal amount to everybody.
So what's the range on how much purchasing power this transaction gives people around the world? In 2018, the highest PPP number on the OECD list is Indonesia, with a PPP of 4,245.613140. The lowest PPP on the list is Lithuania, with a PPP of 0.457582. Let's see how this shakes out in each of these countries...
$ Per Capita Lithuania (0.46) Indonesia (4,245.61)
$355.53 $162.68 $1,509,442.84
$2,230.82 $1,020.78 $9,471,198.70
$8,252.65 $3,776.26 $35,037,559.28
$1,145.60 $524.20 $4,863,774.41
$7,188.22 $3,289.20 $30,518,401.29
$26,591.89 $12,167.97 $112,898,877.60
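The table above is just each per-capita payout multiplied by the country's OECD PPP figure. A quick sketch reproducing the two extreme corners:

```python
# Multiply a per-capita payout (USD) by a country's PPP factor (OECD, 2018).
ppp = {"Lithuania": 0.457582, "Indonesia": 4245.61314}

def adjusted(amount_usd, country):
    return round(amount_usd * ppp[country], 2)

print(adjusted(355.53, "Lithuania"))    # smallest payout, lowest PPP
print(adjusted(26591.89, "Indonesia"))  # largest payout, highest PPP
```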
What would this cause? People might predict a lot of different things, and the Yang gang people probably have strong opinions on this. I have a bachelor's degree in economics, so I believe most mainstream economists would predict the following...
In Lithuania, when they get a few hundred to a few thousand dollars, they probably raise a pint to F Society, then put the rest towards a house or car payment, or buy themselves something nice. Minor inflation would happen, probably starting at the pubs, and that would worry financial types, but it would not cause any kind of major economic catastrophe.
In Indonesia, where everybody becomes an asset millionaire overnight, they will probably have hyperinflation, mass social upheaval, and violence.
In conclusion...
TL;DR: What Darlene did last night with eCoin isn't actually possible with bitcoin, and the impact in America might not be as great as you think, but the impact would be much bigger in poorer parts of the world.
submitted by bubblesort to MrRobot [link] [comments]

Addressing Common Arguments For Limiting BTC's Block Size

For a while, I've seen many BTC maximalists bring up arguments about why the block size for Bitcoin should be limited to 1 MB. I have made this post to address most of these arguments. If you disagree, feel free to make your point in the comments!

Limiting block size is what helps keep nodes cheap, and helps decentralize Bitcoin.
Let's do some math here...
With the block size of BTC being 1.00 MB, and having ~144 blocks a day, 365 days a year, there are roughly 52,560 blocks in a year. Using this data, 52.5 GB of storage will be used up in an entire year (we'll make the assumption that someone running a node buys 1 hard drive a year to store all this data).
Looking at Amazon, the average cost for 64.0 GB of storage capacity for a flash drive is roughly $10.00. This means on average, someone running a node is paying roughly 80 cents per month for storage. Okay, now let's look at the internet aspect of things. The average internet speed globally is around ~75 Mbps (which is more than enough for both BTC and BCH) and will likely run for around ~$40 a month (this is a rough figure, and slightly pessimistic, but let's take it). Therefore, doing some math:
($40.00/month + $0.80/month) x 12 months = ~$490.00/year
Okay, so it roughly costs $490.00 a year which is just a little over $1/day for running a node. Let's see how much more expensive BCH is when running the same type of node:
For BCH, everything stays the same, except for storage costs. Since the block size is 32 times bigger than BTC, doing the math, BCH will take up roughly 1.7 TB of data. For a 2 TB hard drive, the cost is roughly $60. For an entire year, that will cost about $5 per month for storage.
Taking this into consideration, we can calculate how much it will cost to run a BCH node for storage and internet:
($40.00/month + $5.00/month) x 12 months = ~$540.00/year
So in conclusion:
BTC Node BCH Node
Price (yearly) $490.00 $540.00
Price (monthly) ~$40.80 ~$45.00
As we can see, it really isn't that much more expensive, and this isn't even factoring in how much cheaper digital storage will become over time. As digital storage gets cheaper and denser, we can also expand the block size without having to worry about centralization.
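The comparison above can be sketched directly. All the dollar figures here ($40/month internet, a $10 64 GB flash drive per year for BTC, a $60 2 TB hard drive per year for BCH) are the post's own assumptions:

```python
# Yearly cost of running a node: internet plus one storage purchase per year.
INTERNET = 40.00  # $/month, the post's rough global-average figure

blocks_per_year = 144 * 365                       # ~52,560 blocks
btc_storage_gb = blocks_per_year * 1.0 / 1000     # 1 MB blocks  -> ~52.6 GB
bch_storage_gb = blocks_per_year * 32.0 / 1000    # 32 MB blocks -> ~1,682 GB

btc_yearly = INTERNET * 12 + 10.00  # one $10 64 GB flash drive per year
bch_yearly = INTERNET * 12 + 60.00  # one $60 2 TB hard drive per year
print(f"BTC: ~{btc_storage_gb:.0f} GB/yr, ${btc_yearly:.2f}/yr")
print(f"BCH: ~{bch_storage_gb:.0f} GB/yr, ${bch_yearly:.2f}/yr")
```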

The market has decided that BTC is better, therefore BCH is not Bitcoin.
While yes, based on hashing power, this is true, Bitcoin being Bitcoin is not about hashing power. It is about what Bitcoin was intended to do. Bitcoin was created by Satoshi as a peer-to-peer electronic cash system. Even judged by its own whitepaper, BTC is not working the way it was intended to. From the whitepaper:

The cost of mediation increases transaction costs, limiting the minimum practical transaction size and cutting off the possibility for small casual transactions.
It says it right here: one of the issues with existing forms of electronic payment is high transaction fees, which make small, casual purchases expensive and therefore impractical for everyday use.
Currently, looking at the fees, BTC costs roughly $0.50 for every transaction (fees vary every single block, but this is the current average), regardless of the transaction amount. That means if I'm making a purchase at a coffee shop for $2.00, it is going to cost me $2.50 effectively for the coffee. That means that I am paying 25% of my transaction value just to transfer my own money. What incentive would I have to make that purchase, especially when I could just use normal cash, and not pay ridiculously high fees for a normal transaction?
Let's compare this to BCH. Right now, the average fee for BCH is about $0.0025 for every transaction. When comparing that even to a $2 purchase, the fee is negligible and makes effectively no difference to the transaction amount. As we can see, BCH is far cheaper for everyday normal transactions, a.k.a. electronic cash.
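The fee burden on a small purchase is just fee divided by purchase amount. A quick comparison using the fees quoted above:

```python
# Fee as a percentage of a small purchase amount.
def fee_pct(fee, amount):
    return 100 * fee / amount

print(f"BTC on a $2 coffee: {fee_pct(0.50, 2.00):.1f}%")    # 25.0%
print(f"BCH on a $2 coffee: {fee_pct(0.0025, 2.00):.3f}%")  # 0.125%
```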

Bitcoin only has high transaction fees because of the higher transaction volume, and Bitmain has spammed transactions to make BTC look bad.
I don't recall Bitmain ever spamming transactions on the network (I could be wrong on this; if someone has evidence, I will gladly retract). As for transaction volume (number of transactions), we can compare numbers from periods when both BCH and BTC were handling extremely high volume:
Date (DD/MM/YYYY) No. of transactions (BTC) No. of transactions (BCH) Average Transaction Fee (BTC) Average Transaction Fee (BCH)
19/08/2018 167k 129k $0.65 $0.0033
15/11/2018 241k 688k $0.74 $0.0018
19/11/2018 268k 283k $0.79 $0.0007
20/11/2018 288k 329k $1.11 $0.0006
01/09/2019 285k 575k $0.68 $0.0007
Note: The peak fees for the two blockchains were $52.00 for BTC and $0.90 for BCH (which is still bad. The difference is that BCH has taken steps to ensure that kind of transaction fee never happens again, even when faced with the same amount of traffic on the network.)

The Lightning Network (an off-chain solution) is a better solution to Bitcoin's current problem than increasing the block size (an on-chain solution), and has a much higher transactions per second capability than BCH.
Yes, the Lightning Network may have a higher transaction per second capability when compared to BCH, but it comes at a cost: centralization.
The aim of Bitcoin was to make a peer-to-peer electronic cash system with high transaction throughput, but it is also supposed to have 3 distinct properties. Bitcoin should be:
  1. Cheap (fees should be negligible, no matter how low the transaction amount)
  2. Decentralized
  3. Secure
When you take away any one of these characteristics, it becomes A LOT easier to make a currency with higher transaction throughput, but it ignores the goal of what Bitcoin is supposed to be. For example, consider a system of cash that is:

Cheap and secure, but not decentralized:
XRP (Ripple)
Credit Cards
Paypal
Lightning Network

Cheap and decentralized, but not secure:
LTC (Litecoin)
(DOGE) Dogecoin
Plenty of other low-use altcoins

Secure and Decentralized, but not cheap:
BTC (Bitcoin)
XMR (Monero)

BCH manages to have all 3 characteristics, all while having a capacity of more than 200 transactions per second. Not to mention that setting up a node on the Lightning Network is a complicated, tedious, and painful process, just to put your funds somewhere they aren't safe (you can lose your funds pretty easily, especially if you're an everyday person without much technical knowledge). On top of that, Lightning funds eventually have to be settled on the blockchain, and as adoption increases the fees will increase as well, meaning you will be charged a ridiculously high amount just to withdraw your own money.
To add to this, nodes run by people with more resources will eventually become Lightning Hubs, the few intermediaries you must route through to send a transaction to whoever you want. This makes Lightning Hubs the new middlemen of financial transactions. Does this all sound familiar? It is literally banking as it exists today, with the name 'Bitcoin' slapped on top of it.

Anyway, these are all the arguments I have heard from BTC maximalists. If you have any more arguments, feel free to comment them below, and I'm willing to change my mind if you make a good point.
submitted by 1MightBeAPenguin to btc [link] [comments]

A practical proposition to naturally make bitcoin better.

So I had an idea: if Bitcoin brought the concept of mining and exchanging gold to the internet, how about also bringing the concept of minting coins with a defined weight and purity? With gold or silver, we weigh the mined raw metal in grams or ounces, and each nugget has a different purity. That raw metal has value, but it is hard to transact with (I believe Bitcoin is in the same situation: we mine satoshis while fees and price fluctuate constantly). To make exchange easy, we mint the raw metal into coins of a standard size, weight, and purity.
Take the 1 oz 22-karat American Eagle gold coin as an example: 22 karats means the coin is 91.67% pure gold and 8.33% impurity. How about the equivalent concept for Bitcoin: a virtual coin with a defined purity and weight? Suppose we agree that a virtual coin unit, call it vBTC, always carries a 0.2% average fee, and from that we calculate the weight of 1 vBTC in satoshis.
At the time of writing, the price of bitcoin is $8,915.00 and the average fee is around $2.41/tx, or roughly 27,500 satoshis/tx, so 1 vBTC has a weight of 27,500 / 0.002 = 13,750,000 satoshis, worth about $1,223.09. Each day (or block), that weight goes up or down with the network load. (I chose a 0.2% fee so that sub-units still work: a 0.01 vBTC payment carries a 20% fee if we want it in the very next block, or we can keep it at 0.2% if any block will do.) When I send you 1 vBTC, I don't worry whether fees are high or low, or whether my transaction will make it into the block: 1 vBTC is always in the next block. If a seller prices in vBTC, say a piece of software at 2.5 vBTC, it doesn't matter whether the network is loaded or whether the price is up or down. When I receive 2.5 vBTC, I know I own 99.8% of it, and I will receive it within the next ~10 minutes.
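The weight calculation in this paragraph can be sketched as follows, using the post's rounded figure of 27,500 sat/tx (the resulting dollar value comes out near $1,226 at these rounded inputs, in the neighborhood of the post's $1,223.09):

```python
# Weight of 1 vBTC in satoshis: average per-tx fee divided by the target
# fee fraction (0.2%), so that one transaction fee is 0.2% of one vBTC.
avg_fee_sat = 27_500  # the post's rounded figure (~$2.41 at $8,915/BTC)
target_fee = 0.002    # 0.2%

weight_sat = round(avg_fee_sat / target_fee)
usd_value = weight_sat / 1e8 * 8_915.00
print(f"1 vBTC = {weight_sat:,} sat (~${usd_value:,.2f})")
```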
We can do this without changing anything in the Bitcoin protocol; it's just a virtual layer of convention between us Bitcoin users. This way we can break the link between bitcoin, USD, and all the price speculators on centralized exchange platforms, and use bitcoin as a medium of exchange rather than an asset we just hold while waiting for the day it hits the moon.
There is another striking thing we can do with vBTC: it can address Bitcoin's price-fluctuation nightmare.
At the time of writing, 1 vBTC is 13,750,000 satoshis, worth $1,223.09, while a 1 oz 22K American Eagle gold coin is around $1,766.50, roughly the same order of value. What if we said that 1 vBTC carries the same intrinsic value as a 1 oz 22K gold coin? That anywhere and anytime, 1 vBTC can always be exchanged for 1 oz of 22K gold, and we transact 1 vBTC online as if it were a real 1 oz 22K gold coin?
It's hard to grasp at first glance, but allow me to explain. Right now, the price of normal BTC is set by supply and demand across multiple centralized exchange platforms: more buyers than sellers and the price goes up; more sellers than buyers and it goes down. But that supply-and-demand dynamic is already present in vBTC. If there is more demand (load) on the network, average fees go up, and the weight of 1 vBTC in satoshis goes up too. If there is less, fees go down, and the weight of 1 vBTC goes down too. So if we simply agree that 1 vBTC = 1 oz of 22K gold, the Bitcoin network will by itself do all the work of converging to the right weight to maintain that value equality at any given time, without letting speculators set the price in a fiat currency on centralized exchanges.
This would be revolutionary. There would be no speculation whatsoever, no pricing in a fiat currency, and no anxious holding until it hits the moon. We could treat bitcoin as simple coins and price our goods and services the way people did when minted gold coins were first introduced. And it requires no change to the Bitcoin protocol: all it takes is putting the vBTC denomination into wallets and all of us Bitcoin users sharing the convention that 1 vBTC = 1 oz of 22K gold (or thereabouts), and the Bitcoin network will do the magic by itself.
Furthermore, this opens the door to minting real physical 1 vBTC coins in actual 1 oz 22K gold, and exchanging them as if they were bitcoins we could touch with our hands.
My first step will be to run some Excel spreadsheet calculations of the above: the weight variation of vBTC (at 0.2% fees) since 2011, and the price of 1 vBTC in ounces of gold since 2011. I will then share the results here.
submitted by babyass to Bitcoin [link] [comments]

Proof Of Work Explained

Proof Of Work Explained
A proof-of-work (PoW) system (or protocol, or function) is a consensus mechanism first invented by Cynthia Dwork and Moni Naor, presented in a 1993 journal article. In 1999, Markus Jakobsson and Ari Juels formalized the idea in a paper and coined the term "proof of work".
It was developed as a way to prevent denial-of-service attacks and other service abuse (such as spam on a network). It is now the most widely used consensus algorithm, employed by many cryptocurrencies such as Bitcoin and Ethereum.
How does it work?
In this method, a group of users competes to find the solution to a computationally hard puzzle. Whoever finds a solution first broadcasts the block to the network for verification. Once the other users have verified the solution, the block moves to the confirmed state.
The blockchain network consists of numerous decentralized nodes. Some of these nodes act as miners, responsible for adding new blocks to the blockchain. A miner repeatedly picks a random number (a nonce) and combines it with the data present in the block. To find a correct solution, the miner must land on a nonce that makes the newly generated block valid so it can be added to the main chain. The network pays the miner a reward for finding the solution.
The block is then passed through a hash function to generate an output that satisfies the difficulty criteria. Once a result is found, the other nodes in the network verify and validate the outcome. Every new block holds the hash of the preceding block, forming a chain of blocks that together store the information in the network. Changing a block would require a new block with the same predecessor, and it is practically impossible to regenerate all its successors and change their data. This protects the blockchain from tampering.
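The mining loop described above can be sketched in a few lines: keep trying nonces until the block's hash meets a difficulty target. This is a toy illustration, with difficulty expressed as a required number of leading zero hex digits over a single SHA-256; real Bitcoin uses double SHA-256 against a 256-bit numeric target.

```python
import hashlib

def mine(block_data: str, difficulty: int = 4):
    """Try nonces until the block's SHA-256 hash has `difficulty` leading zeros."""
    prefix = "0" * difficulty
    nonce = 0
    while True:
        digest = hashlib.sha256(f"{block_data}{nonce}".encode()).hexdigest()
        if digest.startswith(prefix):
            return nonce, digest  # hard to find, but cheap for anyone to verify
        nonce += 1

nonce, digest = mine("height=1; prev=0000abcd; txs=[Alice->Bob: 5]")
print(nonce, digest)
```

Note the asymmetry that makes PoW work: finding the nonce takes many thousands of attempts on average, while verifying it takes a single hash.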
What is a Hash Function?
A hash function is a function that maps data of any length to fixed-size values. The outputs of a hash function are known as hash values, hash codes, digests, or simply hashes.
The hash method is quite secure: any slight change in the input results in a completely different output, which network participants will then discard. A hash function generates a fixed-length output regardless of the length of the input. It is a one-way function, i.e. it cannot be reversed to recover the original data; one can only recompute the hash of the original data and check it against a claimed value.
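Both properties, fixed-length output and the avalanche effect of a tiny input change, are easy to see with SHA-256:

```python
import hashlib

h1 = hashlib.sha256(b"proof of work").hexdigest()
h2 = hashlib.sha256(b"proof of work!").hexdigest()  # one character added

print(len(h1), len(h2))  # both 64 hex digits, regardless of input length
print(h1 == h2)          # False: tiny input change, unrelated output
```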
Implementations
Nowadays, proof-of-work is used in a lot of cryptocurrencies. Its first large-scale implementation was Bitcoin, after which it became so popular that several other cryptocurrencies adopted it. Bitcoin uses a Hashcash-style puzzle whose difficulty is adjusted based on the total power of the network, so that on average a block is formed approximately every 10 minutes. Litecoin, a Bitcoin-derived cryptocurrency, has a similar system, and Ethereum also adopted a proof-of-work protocol (Ethash).
Types of PoW
Proof-of-work protocols can be categorized into two types:
· Challenge-response
This protocol creates a direct link between the requester (client) and the provider (server).
In this method, the requester needs to find the solution to a challenge that the server has given. The solution is then validated by the provider for authentication.
Since the provider chooses the challenge on the spot, its difficulty can be adapted to the provider's current load. If the challenge has a known solution, or the solution is known to exist within a bounded search space, the work on the requester's side may be bounded.
(Diagram omitted: challenge-response protocol flow. Source: Wikipedia)
· Solution–verification
These protocols do not require any such prior link between sender and receiver. The client imposes a problem on itself and solves it, then sends the solution to the server, which checks both the problem choice and the outcome. Like Hashcash, these schemes are based on unbounded probabilistic iterative procedures.
(Diagram omitted: solution-verification protocol flow. Source: Wikipedia)
These two methods are generally based on one of the following three techniques:
CPU-bound
This technique depends on the speed of the processor: the more processing power, the faster the computation.
Memory-bound
This technique ties computation speed to main-memory accesses (either their latency or their bandwidth) rather than raw processor speed.
Network-bound
In this technique, the client must perform a few computations and wait to receive some tokens from remote servers.
List of proof-of-work functions
Here is a list of known proof-of-work functions:-
o Integer square root modulo a large prime
o Weakened Fiat–Shamir signatures
o Ong–Schnorr–Shamir signature is broken by Pollard
o Partial hash inversion
o Hash sequences
o Puzzles
o Diffie–Hellman–based puzzle
o Moderate
o Mbound
o Hokkaido
o Cuckoo Cycle
o Merkle tree-based
o Guided tour puzzle protocol
A successful attack on a blockchain network requires a lot of computational power and a lot of time to do the calculations. Proof of Work makes hacks inefficient since the cost incurred would be greater than the potential rewards for attacking the network. Miners are also incentivized not to cheat.
It is still considered one of the most popular methods of reaching consensus in blockchains. It may not be the most efficient solution due to its extensive energy usage, but that cost is precisely what guarantees the security of the network.
Thanks to proof of work, it is practically impossible to alter any part of the blockchain, since any such change would require re-mining all subsequent blocks. It is also difficult for a single user to take control of the network's computing power, since doing so would require enormous amounts of energy, making such an attack prohibitively expensive.
submitted by RumaDas to u/RumaDas [link] [comments]
