ENCRYPTOPEDIA - Scientific Research and White Papers on Bitcoin, Blockchain Technology, Cryptography, Algorithms and Cryptocurrencies
There is a wide choice of research and academic archives on the web, and some boast a sizable amount of cryptography and/or cryptocurrency material. There are also a number of known lists of bitcoin and cryptocurrency-related research papers, but those mostly appear in blog posts, media articles and shared documents. We felt that an easy-to-navigate, searchable archive of all things crypto was missing: http://encryptopedia.com

Currently exposing over three hundred papers, and constantly updated with new research as soon as it becomes publicly available, Encryptopedia aims to be the central hub for cryptocurrency-related research. It covers a wide range of content, from ground-breaking baseline research on economics, networks and cryptography to recent altcoin and crypto 2.0 technologies. Publications from major corporations, foundations, banks and/or government bodies are also included if they contain references to cryptocurrencies and/or related technologies. We are always crawling the web for important papers that we might have missed, while combing the references of papers already in the archive. Please help expand the size and scope of Encryptopedia by suggesting a relevant paper, through the contact form on the website or via email at [email protected]. We welcome your non-content suggestions at [email protected]

Navigation options: searching by keywords is encouraged, but one can also browse by tags or dates. There is also an alphabetical list of all available papers on the dashboard.

Ratings/comments: discussion of the papers is beyond the scope of this website - the bitcoin forum, reddit, stackexchange and many others are great places for that. So, in a conscious decision to keep things as clean and simple as possible, rankings and comments are disabled.

Post dates: the date of publication of each paper is used. Whenever the actual day of publication is not known, the first day of the month is used.

Copyrights: Encryptopedia simply aggregates links; no content is held on our servers. All copyrights remain with the respective authors of each paper. Please read our formal disclaimer for more information. http://encryptopedia.com
Ultimate glossary of crypto currency terms, acronyms and abbreviations
Security/privacy researcher here. Time for a nitpicking post! I just watched the last episode and, while I did not watch the show for its technical accuracy (after all, the whole show is based on the idea that Richard breaks a fundamental limit on compression), I felt particularly aghast at how their discovery is treated. First of all, let me reassure you: an AI suddenly learning by itself to handle new tasks is extremely far from the state of the art. Current "AIs" are task-specific and will always run within the limits of the (simple) system they've been built for. But that's not the problematic part. Apparently, their AI learns to solve NP-hard problems in polynomial time. As depicted by the show, that is a HUGE deal. But the rest of the depiction is wrong. Let's sum up. They basically learn that:
1. their AI solves an NP-hard problem in polynomial time
2. they have a proof of it
3. this proof is a working implementation
And their conclusion is that this is a bad thing and that they should hide it because it would "be the end of privacy". It's a bit more complicated than that. They just reached a huge milestone in theoretical CS which has deep implications. The fact that it breaks all current implementations of cryptography is not a problem in itself. In fact, if they managed to prove that modern cryptography is based on a false premise (and not to scare you, but in reality we actually don't know whether this premise is true or not), the problem is not the proof. No, they did not "build a monster". They proved that cryptography is built on quicksand. Hiding the truth won't change that, because it will eventually come out anyway. Instead, an ethical way to handle this situation would be to work on fixing the problem. Their release being shipped does not mean that anyone will notice that their system breaks ciphers on the fly. Richard had to dig into the system's logs, which nobody outside the company has access to, to notice and understand what was happening. In fact, it could take years before anyone notices. Cryptosystems that address the threat of a sudden computational leap breaking ciphers in polynomial time do exist, e.g. quantum cryptography. They may still be experimental, but hey, Pied Piper now has a huge load of money, and they have time to work on the issue since they're the only ones who know about it, so spending the company's money to move to a new form of cryptography would be the most ethical way to handle this situation. But instead of that, they just bury their heads in the sand. Ironically, it is hinted that 3 of them would work for the NSA, which is pretty much the least ethical thing to do with this knowledge, especially if your concern is privacy. Moreover, having a working implementation that solves an NP-hard problem in polynomial time would be a huge deal outside of cryptography!
I can't even begin to imagine the implications for different fields of science (climate modeling, chemistry, biology...). Hell, they could even take down the Bitcoin blockchain, ending the ecological disaster that it is. But maybe helping to model climate change or developing new medicines does not satisfy their moral stances? (edited for formatting) (edit 2: I got the title wrong. What does not make sense is the characters' reaction more than the technical accuracy)
d down, k up, everybody's a game theorist, titcoin, build wiki on Cardano, (e-)voting, competitive marketing analysis, Goguen product update, Alexa likes Charles, David hates all, Adam in and bros in arms with the scientific counterparts of the major cryptocurrency groups, the latest AMA for all!
Decreasing d parameter

Just signed the latest change management document; I was the last in the chain, so I signed it today for changing the d parameter from 0.52 to 0.5. That means we are just about to cross the threshold: in a little bit d will fall below 0.5, which means more than half of all blocks will be made by the community and not the OBFT nodes. That's a major milestone, and at the current rate of velocity it looks like d will decrement to zero around March, so lots to do, lots to talk about. Product update two days from now; we'll talk about that then, but it crossed my desk today and I was really happy and excited about it. It seems like yesterday that d was equal to one and people were complaining that we delayed it by an epoch, and now we're almost at 50 percent. For those of you who want parameter-level changes, k-level changes, they are coming, and there's an enormous internal conversation about it; we've written up a powerpoint presentation and a philosophy document about why things were designed the way they're designed.

Increasing k parameter, upcoming security video, and everybody's a game theorist

My chief scientist has put an enormous amount of time into this. Aggelos is very passionate about this particular topic, and what I'm going to do is similar to the security video I did, where I gave an hour-and-a-half discussion of best practices for security. I'm going to do a screencasted video where I talk about this philosophy document; I'm going to read the entire document with annotations with you guys and talk through it. It might end up being quite a long video, it could be several hours long, but I think it's really important to talk through the design philosophy. It's kind of funny: when people see a cryptographic paper or a math paper, they tend to just say, okay, you guys figure that out.
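As a rough illustration of what the d parameter governs (a sketch only, not IOHK's actual code; the per-epoch block count below is an approximation): under Ouroboros, federated OBFT nodes produce about a d fraction of the blocks, and community stake pools produce the rest, so d = 0.5 is exactly the 50/50 crossover described above.

```python
def expected_block_split(total_blocks, d):
    """Illustrative split of block production under Ouroboros' d parameter:
    federated OBFT nodes make ~d of the blocks, community pools the rest."""
    obft = total_blocks * d
    community = total_blocks * (1 - d)
    return obft, community

# At d = 0.5 the community crosses the 50% threshold discussed above
# (~21600 blocks per Cardano epoch is an approximation).
obft, community = expected_block_split(21600, 0.5)
```

As d decrements toward zero, the OBFT share shrinks linearly until the community makes every block.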
No one's an expert in cryptography or math, so you don't really get strong opinions about it; but with game theory, despite the fact that the topic is as complex and in some cases more complex, you tend to get a lot of opinions: everybody's a game theorist. So, an enormous amount of thought went into the design of the system and its parameters, everything from the reward functions to other things, and it's very important that we explain that thought process in as detailed a way as possible, or at least the philosophy behind it; then I feel the community will be in a really good position to start working on the change management. It is my position that I'd love to see k largely increased. I do think the software needs some improvements to get there, especially partial delegation, delegation portfolios and some enhancements to the operation of staking.

E-voting

I'd love to see the existence of hybrid wallets where you have a cold part and a hot part; we've had a lot of conversations about that and we will present some of the progress in that matter at the product updates, if not this October then certainly in November. A lot of commercialization going on, a lot of things flowing around, and, you know, commercial teams working hard. As I mentioned, we have a lot of deals in the pipeline. The Wyoming event was half political, half sales. We were really looking into e-voting and we had very productive conversations along those lines. It is my goal that Cardano e-voting software is used in political primaries, and my hope is for it eventually to be used in municipal, state and federal elections, and then in national elections for countries like Ethiopia, Mongolia and other places. Now, there is a long, long road to get there and many little victories that have to come first, but this event.
Wyoming was kind of the opener into that conversation. There were seven independent parties at the independent national convention and we had a chance to talk to the leadership of many of them. We will also engage in conversation with the libertarian party leadership, and at the very least we could talk about e-voting and blockchain-based voting for primaries; that would be a great start, and we'll also look into the state of Wyoming for that as well. We'll tell you guys about that in time. We've already gotten a lot of inquiries about e-voting software. We tend to get them along with the (Atala) Prism inquiries. It's actually quite easy to start conversations, but there are a lot of security properties that are very important, like end-to-end verifiability, hybrid ballots where you have both a digital and a paper ballot, delegation mechanics, as well as privacy mechanics that are interesting on a case-by-case basis.

Goguen, voting, future fund3, competitive marketing analysis of Ouroboros vs. EOS, Tezos, Algorand, ETH2 and Polkadot, new creative director

We'll keep chipping away at that. A lot of Goguen stuff to talk about, but I'm going to reserve all of that for the product update two days from now. We're right in the middle of it: Goguen metadata was the very first part, we already have some commercialization as a result of metadata, more to come, and then obviously lots of smart contract stuff to come. This update and the November update are going to be very Goguen-focused, and cover a lot of alternatives as well. We're still on schedule for an HFC event in, I think, November or December, I can't remember, but that's going to carry a lot of things related to multisig and token locking. There are some ledger rule changes, so it has to be an HFC event, and that opens up a lot of the windows for Goguen foundations as well as voting on chain, so fund3 will benefit very heavily from that.
We're right in the guts of Daedalus right now, building the voting center, the identity center, QR-code work. It's a lot of stuff; you know, the cell phone app was released last week, kind of an early beta, and it'll go through a lot of rapid iterations every few weeks. We'll update it; google play is a great foundation to launch things on because it's so easy to push updates to people automatically, so you can rapidly iterate and be very agile in that framework. We've already had 3500 people involved heavily in the innovation management platform ideascale, and we've got numerous bids on everything, from John Buck and the sociocracy movement to others. A lot of people want to help us improve that, and we're going to see steady and systematic growth there. We're still chipping away at product marketing. Liza (Horowitz) is doing a good job; I meet with her two or three times a week, and right now it's Ouroboros, Ouroboros, Ouroboros... We're doing competitive analysis of Ouroboros versus EOS, Tezos, Algorand, ETH2 and Polkadot. We think that's a good set, and we think we have a really good way of explaining it. David (David Likes Crypto, now at IOHK) has already made some great content. We're going to release that soon alongside some other content, and we'll keep chipping away at that. We also just hired a creative director for IO Global. His name's Adam, an incredibly experienced creative director; he's worked for Mercedes-Benz and dozens of other companies. He does very good work and he's been doing this for well over 20 years, so the very first set of things he's going to do is work with commercial and marketing on product marketing, in addition to building great content, where the hope is to make that content as pretty as possible. We have Rod heavily involved in that as well, to talk about distribution channels and see if we can amplify the distribution message and really get a lot of stuff done. Last thing to mention, oh yeah, iOS for catalyst.
We're working on that; we submitted it to the apple store, the iOS store, but it takes a little longer to get approval there than it does with google play. It's been submitted, and now it's just a matter of when (or whether) apple approves it. Takes a little longer for cryptocurrency stuff.

Wiki shizzle and battle for crypto, make crypto articles on wiki great again, Alexa knows Charles, Everpedia meets Charles podcast, holy-grail land of Cardano, wiki on Cardano, titcoin

Wikipedia... kind of rattled the cage a little bit. Through an intermediary we got in contact with Jimmy Wales. Larry Sanger, the other co-founder, also reached out to me, and the everpedia guys reached out to me. Here's where we stand: we have an article, it has solidified, and it's currently labeled as unreliable, meaning you should not believe the things said in it, which is David Gerard's work if you look at the edits. We will work with the community and try to get that article to a fair and balanced representation of Cardano, especially after the product marketing comes through. Once we clearly explain the product, I think the Cardano article can be massively strengthened. I've told Rod to work with some specialized people to try to get that done, but we are also going to work very hard at a systematic improvement campaign for all of the scientific articles related to blockchain technology in the cryptocurrency space. They're just terrible. If you go to the proof of work article, the proof of stake article or any of these things, they're just terrible: they're not well written, they're out of date and they don't reflect an adequate sampling of the science.
I did talk to my chief scientist Aggelos, and what we're gonna do is reach out to the scientific counterparts at most of the major cryptocurrency groups that are doing research and see if they want to work with us in an industry-wide effort to systematically improve the scientific articles in our industry, so that there is a fair and balanced representation of what the current state of the art is, the criticisms, the trade-offs, as well as the reference space. Obviously we'll do quite well in that respect because we've done the science; we're the inheritors of it. It's a shame, because when people search proof of stake on google, the wikipedia results are usually highly biased. We care about wikipedia because google cares about wikipedia, amazon cares about wikipedia. If you ask Alexa who Charles Hoskinson is, the reason Alexa knows is because it's reading directly from the wikipedia page. If I didn't have a wikipedia page, Alexa wouldn't know. So if somebody says, Alexa, what is Cardano, it's going to read directly from the wikipedia page. We can either pretend that reality doesn't exist, or we can accept it, and we as a community, working with partners in the broader cryptocurrency community, can universally improve the quality of cryptocurrency pages. There's been a pattern of commercial censorship on wikipedia for cryptocurrencies in general since bitcoin itself. In fact, I think the bitcoin article was actually taken down once back in, might have been, 2010 or 2009. Basically, wikipedia has not been a friend of cryptocurrencies. That's why everpedia exists, and actually their founders reached out to me; I talked to them over twitter through PMs and we agreed to do a podcast. I'm going to do a streamyard stream with these guys; they'll come on and talk all about everpedia, what they do and who they are, and we'll kind of go through the challenges they've encountered.
How their platform works, and so forth. Obviously, if they ever want to leave that terrible ecosystem EOS and come to the holy-grail land of Cardano, we'd be there to help them out. At the very least they can tell the world how amazing their product is, and also the challenges they're having to overcome. We've also been in great contact with Larry Sanger. He's going to do an internal seminar at some point with us and talk about some protocols he's been developing since he left wikipedia, specifically to decentralize knowledge management and build a truly decentralized encyclopedia. I'm really looking forward to that, and I hope that presentation gives us, as an ecosystem, some inspiration about things we can do. That's a great piece of infrastructure regardless. After we learn a lot more about it and talk to a lot of people in the ecosystem, if we can't get people to move on over, it would be really good to see people utilize the dc fund, through ideascale and the innovation management platform, to build their own variant of wikipedia on Cardano. In the coming months there will certainly be funding available. If you guys are so passionate about this particular problem that you want to go solve it, then I'd be happy to play Elon Musk with the hyperloop: write a white paper on a protocol design and give it a good first start, and then you guys can go and try to commercialize that technology as Cardano native assets and Plutus smart contracts, in addition to other pieces of technology that have to be brought in to make it practical. Right now we're just in the let's-talk-to-everybody phase: we'll talk to the everpedia guys, we're going to talk to Larry, and we're going to see whoever else is in this game, and of course we have to accept the incumbency as it is.
So, we're working with the wikipedia side to improve the quality of not only our article but all of the articles on the scientific side of things, so that there's a fair and accurate representation of information. One of the reasons I'm so concerned about this is that I am very worried that Cardano projects will get commercially censored like we were commercially censored. Yes, we do have a page, but it took five years to get there, and we're a multi-billion dollar project with hundreds of thousands of people. If you guys are doing cutting-edge, novel, interesting stuff, I don't want your experience to be the same as ours, where you have to wait five years for your project to get a page, even after governments have adopted it. That's absurd; no one should be censored, ever. This is very much a fight for the entire ecosystem, the entire community, not just Cardano but all cryptocurrencies: bitcoin, ethereum and Cardano have all faced commercial censorship and article deletions during their tenure, and I don't want you guys to go through that. I'm hoping we can improve that situation, but you don't put all your eggs in one basket, and frankly the time has come for wikipedia to be fully decentralized and liberated from a centralized organization and the massively variable quality of its editor base. If Legends of Valor has a page but Cardano didn't have one until recently, if titcoin, a pornography coin from 2015 that's deprecated and no one uses, has a page but Cardano couldn't get one, then there's something seriously wrong with the quality control mechanism, and we need to improve it, so it'll get done.
Hi Monero community! Two months ago I posted a CCS for continuing my research on Monero Atomic Swaps. That research is now complete and I'm happy to present my results. This post will be a summary of my research, but you can also find the whitepaper that describes the full protocol and all the details here.
Shiny BTC/XMR Atomic Swap Protocol!
We found it! With the help of the MRL, my colleagues, and the community, we created the first (to our knowledge) protocol to atomically swap bitcoin and monero. And this resulting protocol is implementable today - no more obscure crypto!
Why now? What changed?
When I started studying Monero for a Bitcoin/Monero atomic swap three and a half years ago, most swap protocols were based on 'Hash Time Locked Contracts' (HTLCs), something that we all know is non-existent on Monero. So the goal at the beginning of the project was to create an atomic swap where all the logic (timeouts, possible sequences of operation, secret disclosures, etc.) is managed on the other chain: the Bitcoin chain. The second difficulty with Monero and Bitcoin is their respective underlying cryptographic parameters: they don't share the same elliptic curve, they don't share the same signing algorithm; they have nothing in common! This makes the pair a bad candidate for other types of atomic swap that don't (solely) rely on HTLCs. In November 2018 we came up with a draft protocol that respects the above constraints. However, the protocol requires a specific type of zero-knowledge proof to be trustless: a hash pre-image zero-knowledge proof. This type of zkp is not widely used in practice, if at all. So the protocol worked in theory, but with some obscure crypto, making it a bad candidate for an implementation. In early 2020, after presenting the draft protocol at 36C3 in December 2019, I discovered, by reference from Sarang Noether (MRL), Andrew Poelstra's idea of a cross-group discrete logarithm equality zero-knowledge proof of knowledge (MRL-0010), meaning that we can prove relations between elements in two different groups (two curves, to simplify), and the paper by Lloyd Fournier on One-Time Verifiably Encrypted Signatures, allowing secret disclosure with ECDSA. With these two new (to me) cryptographic primitives, we were able to replace the previous zero-knowledge proof with a combination of the latter, making the protocol complete and practically feasible.
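The "nothing in common" point can be made concrete with the two curves' group orders (the constants below are the published orders of secp256k1 and ed25519; the check is only a sketch of the cross-group constraint, not the zkp itself):

```python
import secrets

# Published group orders: secp256k1 (Bitcoin's curve) and ed25519 (Monero's curve)
SECP256K1_N = 0xFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFEBAAEDCE6AF48A03BBFD25E8CD0364141
ED25519_L = 2**252 + 27742317777372353535851937790883648493

# The orders differ, so a scalar valid on one curve isn't automatically
# valid on the other. A shared secret must be sampled below the smaller
# order, and the cross-group DLEQ proof (MRL-0010) is what convinces each
# party that the *same* secret sits behind the points on both curves.
assert SECP256K1_N != ED25519_L
x = secrets.randbelow(min(SECP256K1_N, ED25519_L))
assert x < SECP256K1_N and x < ED25519_L
```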
How it works
As a broad overview (and simplified), the protocol works as follows:
The monero funds are locked in an address generated by both participants
At the beginning, neither participant has full control over the address; each holds only half of the private key
With the cross group discrete logarithm equality zkp, both participants prove to each other that the address on the Bitcoin chain is related to the address on the Monero chain
By means of Bitcoin scripts and ECDSA one-time verifiably encrypted signatures, one participant reveals to the other her partial private key by taking the bitcoin, allowing the other to take control over the monero
If the swap succeeds, A reveals to B, and if the swap is cancelled, B reveals to A. (We have a third scenario explained in the paper to force reaction and avoid deadlock.)
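The key-splitting idea in the steps above can be sketched with plain scalar arithmetic (an illustration under simplifying assumptions; real Monero spend keys involve ed25519 point operations, and the disclosure happens through the protocol's adaptor-style signatures, not in the clear):

```python
import secrets

# ed25519 scalar group order (Monero spend keys are scalars mod L)
L = 2**252 + 27742317777372353535851937790883648493

a = secrets.randbelow(L)   # A's half of the joint spend key
b = secrets.randbelow(L)   # B's half
x = (a + b) % L            # joint key controlling the locked monero

# Redeeming the bitcoin leaks one half via the one-time verifiably
# encrypted signature; the counterparty then reconstructs the full key:
recovered = (a + b) % L
assert recovered == x
```

Until one half leaks, neither side can move the monero alone, which is exactly what makes the swap atomic.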
The obvious next step would be to have a working implementation on mainnet, but a ready-to-use implementation that is also robust and safe to use requires a lot of engineering work. Furthermore, even though the cryptography is no longer too obscure, most of it still lacks an implementation. If the community wants it, I'll soon post a CCS proposal for my team and me to work on implementing this protocol, step by step, with the end goal of creating a working client/daemon for swapping Bitcoin and Monero. It would be very exciting to build that!
Thanks to the MRL and its researchers for their help, the CCS team, and the community for its support! I hope I fulfilled the community's expectations for my first CCS - all feedback is appreciated.
Hopefully any questions you have will be answered by the resources below, but if you have additional questions feel free to ask them in the comments. If you're quite technically-minded, the Zano whitepaper gives a thorough overview of Zano's design and its main features.

So, what is Zano? In brief, Zano is a project started by the original developers of CryptoNote. Coins with market caps totalling well over a billion dollars (Monero, Haven, Loki and countless others) run upon the codebase they created. Zano is a continuation of their efforts to create the "perfect money", and brings a wealth of enhancements to their original CryptoNote code. Development happens at a lightning pace, as the Github activity shows, but Zano is still very much a work-in-progress. Let's cut right to it: here's why you should pay attention to Zano over the next 12-18 months. Quoting from a recent update:
Anton Sokolov has recently joined the Zano team. ... For the last months Anton has been working on theoretical work dedicated to log-size ring signatures. These signatures theoretically allows for a logarithmic relationship between the number of decoys and the size/performance of transactions. This means that we can set mixins at a level from up to 1000, keeping the reasonable size and processing speed of transactions. This will take Zano’s privacy to a whole new level, and we believe this technology will turn out to be groundbreaking!
If successful, this scheme will make Zano the most private, powerful and performant CryptoNote implementation on the planet. Bar none. A quantum leap in privacy with a minimal increase in resource usage. And if there's one team capable of pulling it off, it's this one.
What else makes Zano special?
You mean aside from having "the Godfather of CryptoNote" as the project lead? ;) Actually, the calibre of the developers/researchers at Zano probably is the project's single greatest strength. Drawing on years of experience, they've made careful design choices, optimizing performance with an asynchronous core architecture, and flexibility and extensibility with a modular code structure. This means that the developers are able to build and iterate fast, refining features and adding new ones at a rate that makes bigger and better-funded teams look sluggish at best. Zano also has some unique features that set it apart from similar projects:

Privacy

Firstly, if you're familiar with CryptoNote you won't be surprised that Zano transactions are private. The perfect money is fungible, and therefore must be untraceable. Bitcoin, for the most part, does little to hide your transaction data from unscrupulous observers. With Zano, privacy is the default. The untraceability and unlinkability of Zano transactions come from its use of ring signatures and stealth addresses. What this means is that no outside observer can tell whether two transactions were sent to the same address, and for each transaction there is a set of possible senders that makes it impossible to determine who the real sender is.

Hybrid PoW-PoS consensus mechanism

Zano achieves a high level of security by utilizing both Proof of Work and Proof of Stake for consensus. By combining the two systems, it mitigates their individual vulnerabilities (see the 51% attack and the "nothing at stake" problem). For an attack on Zano to have even a remote chance of success, the attacker would have to obtain not only a majority of hashing power, but also a majority of the coins involved in staking. The system and its design considerations are discussed at length in the whitepaper.

Aliases

Here's a stealth address: ZxDdULdxC7NRFYhCGdxkcTZoEGQoqvbZqcDHj5a7Gad8Y8wZKAGZZmVCUf9AvSPNMK68L8r8JfAfxP4z1GcFQVCS2Jb9wVzoe.
I have a hard enough time remembering my phone number. Fortunately, Zano has an alias system that lets you register an address to a human-readable name. (@orsonj if you want to anonymously buy me a coffee)

Multisig

Multisignature (multisig) refers to requiring multiple keys to authorize a Zano transaction. It has a number of applications, such as dividing up responsibility for a single Zano wallet among multiple parties, or creating backups where loss of a single seed doesn't lead to loss of the wallet. Multisig and escrow are key components of the planned Decentralized Marketplace (see below), so consideration was given to each of them from the design stages. Thus Zano's multisig, rather than being tacked on at the wallet level as an afterthought, is part of its core architecture, incorporated at the protocol level. This base-layer integration means months won't be spent in the future on complicated refactoring efforts to integrate multisig into a codebase that wasn't designed for it. Plus, it makes it far easier for third-party developers to include multisig (implemented correctly) in any Zano wallets and applications they create in the future.

(Double Deposit MAD) Escrow

With Zano's escrow service you can create fully customizable p2p contracts that are designed, once signed by participants, to enforce adherence to their conditions in such a way that no trusted third-party escrow agent is required. The Particl project, aside from a couple of minor differences, uses an escrow scheme that works the same way, so I've borrowed the term they coined ("Double Deposit MAD Escrow") as I think it describes the scheme perfectly. The system requires participants to make additional deposits, which they forfeit if they attempt to act in a way that breaches the terms of the contract.
Full details can be found in the Escrow section of the whitepaper. The usefulness of multisig and the escrow system may not seem obvious at first, but as mentioned before they'll form the backbone of Zano's Decentralized Marketplace service (described in the next section).
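The deposit mechanics can be sketched as a toy payoff calculation (hypothetical numbers and function names; actual deposit amounts and contract terms are configurable, per the whitepaper):

```python
def buyer_payoff(goods_value, price, deposit, contract_honored):
    """Double Deposit MAD escrow sketch: on success the buyer pays the price,
    receives the goods, and gets the deposit refunded; on breach the goods
    never arrive and the deposit is burned."""
    return goods_value - price if contract_honored else -deposit

def seller_payoff(price, deposit, contract_honored):
    """On success the seller receives the price and the deposit back;
    on breach there's no payment and the deposit is burned."""
    return price if contract_honored else -deposit

# With positive deposits, breaching is strictly worse for both sides,
# so neither party needs a trusted third party to keep the other honest.
assert buyer_payoff(120, 100, 20, True) > buyer_payoff(120, 100, 20, False)
assert seller_payoff(100, 20, True) > seller_payoff(100, 20, False)
```

The "mutually assured destruction" in the name is this payoff structure: defection burns the defector's own deposit along with the counterparty's.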
What does the future hold for Zano?
The planned upgrade to Zano's privacy, mentioned at the start, is obviously one of the most exciting things the team is working on, but it's not the only thing. Zano Roadmap

Decentralized Marketplace

From the beginning, the Zano team's goal has been to create the perfect money. And money can't just be some vehicle for speculative investment; money must be used. To that end, the team have created a set of tools to make it as simple as possible for Zano to be integrated into e-commerce platforms. Zano's APIs and plugins are easy to use, allowing even those with very little coding experience to use them in their e-commerce-related ventures. The culmination of this effort will be a full Decentralized Anonymous Marketplace built on top of the Zano blockchain. Rather than being accessed via the wallet, it will act more as a service - Marketplace as a Service (MaaS) - for anyone who wishes to use it. The inclusion of a simple "snippet" of code into a website is all that's needed to become part of a global decentralized, trustless and private e-commerce network.

Atomic Swaps

Just as Zano's marketplace will allow you to transact without needing to trust your counterparty, atomic swaps will let you easily convert between Zano and other cryptocurrencies without having to trust a third-party service such as a centralized exchange. On top of that, it will also open the way to Zano's inclusion in the many decentralized exchange (DEX) services that have emerged in recent years.
Where can I buy Zano?
Zano is currently listed on the following exchanges: https://coinmarketcap.com/currencies/zano/markets/ It goes without saying that neither I nor the Zano team work for any of these exchanges or can vouch for their reliability. Use them at your own risk and never leave coins on a centralized exchange for longer than necessary. Your keys, your coins! If you have any old graphics cards lying around (both AMD and NVIDIA work), Zano is also mineable through its unique ProgPowZ algorithm. Here's a guide on how to get started. Once you have some Zano, you can safely store it in one of the desktop or mobile wallets (available for all major platforms).
How can I support Zano?
Zano has no marketing department, which is why this post has been written by some guy and not the "Chief Growth Engineer @ Zano Enterprises". The hard part is already done: there's a team of world-class developers and researchers gathered here. But, at least at current prices, the team's funds are enough to cover the cost of development and little more. So the job of publicizing the project falls to the community. If you have experience in community building or growth hacking at another cryptocurrency or open-source project, or if you're a Zano holder who would like to help ensure the project's long-term success by spreading the word, then send me a PM. We need to get organized. Researchers and developers are also very welcome. Working at the cutting edge of mathematics and cryptography, Zano provides challenging and rewarding work for anyone in those fields. Please contact the project's Community Manager u/Jed_T if you're interested in joining the team.

Social links: Twitter Discord Server Telegram Group Medium blog

I'll do my best to keep this post accurate and up to date. Please message me with any suggested improvements, and leave any questions you have below. Welcome to the Zano community and the new decentralized private economy!
Before jumping to conclusions about this post, know that I am not looking to spread FUD; rather, I am trying to understand a forthcoming risk and its potential solutions from an unbiased standpoint. My research has not yielded any definitive answer, so I am turning here to seek direction from those more knowledgeable than me. -- When it comes to predicting quantum computing's ability to break Bitcoin's cryptography, I've seen estimates as short as two years and as long as 25 years. Either way, it is easily conceivable that quantum processors will improve to the point of threatening Bitcoin as a reliable form of currency and store of value. One way to avoid quantum vulnerability is to store Bitcoin in an address that has only ever received Bitcoin and never sent it, since the public key is only revealed on spending. This is an unrealistic mitigant, however, for an asset that is intended to be bought and sold, because all trust in the network will be lost once quantum computers become powerful enough to break Bitcoin's signatures. Nobody will place value in a currency that can be hacked by sending a transaction. Another argument I've seen is that once quantum computing is strong enough to break Bitcoin's cryptography, Bitcoin will be a non-factor compared to the other digital security breakdowns that will have transpired - nuclear codes, bank accounts, digital privacy, and so on. However, those centralized systems can preemptively update their internal security to the standard required in a quantum-computing world. In a similar manner, cryptocurrency and blockchain as a whole should survive the transition via improved cryptography. But when it comes to Bitcoin specifically, will it be possible to generate consensus among the miners to switch to a quantum-resistant protocol?
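For readers wondering why an address that has never spent is (temporarily) quantum-resistant: a standard pay-to-pubkey-hash address publishes only a hash of the public key, and Shor's algorithm needs the public key itself. The toy sketch below illustrates the idea; note that real Bitcoin uses RIPEMD-160(SHA-256(pubkey)) plus Base58Check/bech32 encoding, and the plain SHA-256 and placeholder key here are simplifications for illustration only.

```python
# Toy illustration: why an unspent pay-to-pubkey-hash address resists a
# quantum attacker. Only a hash of the public key appears on-chain until
# the first spend. (Real Bitcoin uses RIPEMD-160(SHA-256(pubkey)); plain
# SHA-256 stands in here for simplicity.)
import hashlib


def address_hash(pubkey: bytes) -> bytes:
    # Stand-in for Bitcoin's HASH160 address derivation.
    return hashlib.sha256(pubkey).digest()


public_key = bytes.fromhex("02" + "11" * 32)  # placeholder compressed pubkey
address_payload = address_hash(public_key)

# Receiving funds publishes only address_payload on-chain. Shor's algorithm
# attacks the discrete log of a *known* public key; recovering a key from
# its hash would require a preimage attack, which quantum computers are not
# known to make feasible. Spending reveals public_key in the scriptSig, and
# only then does the quantum exposure begin.
print(address_payload.hex())
```

This is also why the mitigation fails in practice, as noted above: the moment you spend, your key is exposed, so an economy of actively circulating coins cannot rely on it.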
My research has found conflicting perspectives. One side holds that upgrading Bitcoin's security would require every user to manually move their coins to a new address, with coins that did not move after a "sufficient" amount of time being burned. Burning people's assets would undoubtedly not hold up in a court of law. Even if we are still several years away, an unsolvable existential threat on the horizon would be priced into the value of Bitcoin and drive it toward zero. With that being said, are there any feasible solutions for bringing Bitcoin to quantum resistance? How can Bitcoin survive this threat in the long run? What is currently being done to resolve this problem?
Decentr ($DEC) - foundational cross-chain and cross-platform DeFi protocol
Decentr is a protocol designed to make blockchain/DLT mainstream by allowing DeFi applications built on various blockchains to "talk to each other". Decentr is a 100% secure and decentralised Web 3.0 protocol where users can apply PDV (personal data value) to increase the APR on the $DEC they loan out as part of our DeFi dLoan features, as well as applying it at PoS when paying for things online. Decentr is also building a BAT-competitor browser and Chrome/Firefox extension that acts as a gateway to a 100% decentralised Web 3.0.
It allows DeFi dApps to access all of Decentr's dFintech features, including dLoan and dPay. The key innovation is that the protocol is based on a user's ability to leverage the value of their data as exchangeable "currency".
Decentr is building foundational chain-agnostic protocols that will support "true" 100% DeFi dApps - a 100% secure and decentralised, user-centric alt economy. DeFi dApps interconnected by Decentr can talk to each other and share the PDV (personal data value) of their users. PDV is best described as a personalised "exchange rate" between currencies (in a sense, a social reputation: more effort leads to more rewards, NOT more capital to more rewards) that users apply at point-of-sale to make goods and services cheaper online. PDV is applied to the APR users earn on the $DEC (native token) they hold and loan out as part of the investing pool. PDV will also enable uncollateralised loans on their dLoan platform, as well as on platforms like Aave and Compound.
Decentr will implement zkSync to get extremely cheap and fast transactions across the ETH network. It is also working with Holochain and TomoChain to connect their DeFi ecosystems to the Ethereum DeFi ecosystem. Decentr has deep ties and a partnership with Holochain: https://medium.com/@DecentrNet/decentr-holochain-ama-29d662caed03
Decentr is also building a browser and Chrome/Firefox extension - a gateway that "transitions" Web 2.0 into a 100% decentralised Web 3.0 via their suite of decentralised dFintech and dCommunications features. The browser adds a 100% decentralised "user layer" to current blockchain protocols so that applications built on blockchain can actually "talk to each other". The browser uses always-on encryption and the power of blockchain to keep private keys safe. It will offer a more robust and innovative type of blockchain storage and caching that is much faster than a VPN or Tor, and it will allow surfing .onion addresses as well as regular ones. >>BAT browser 400m marketcap, DEC marketcap 4m<<
Decentr is researching a hardware application, powered by Decentr software, that would greatly enhance current IoT networks. It’s called a “Smart Chip Node” (SCN) and will adhere to 4G LTE standards (with in-built 5G capability), which means connectivity between devices will match or exceed current speed and connectivity, dramatically improving stability and coverage of standalone devices, such as a laptop or tablet, as well as IoT devices, such as home routers and modems.
Decentr uses the Coinbase API to optimise the integrated implementation of the user layer, and Blockchain as a Service (BaaS) to let users leverage cloud-based solutions to build, host and use their own blockchain apps. Tierion's technological infrastructure, the Chainpoint proof protocol, comes into play whenever a user adds something to Tierion's data store. Hyperledger Fabric and R3 Corda private blockchains are used as an immutable transaction database for data transfers; the stack includes R3 Corda, Hyperledger Fabric, Ansible, Bitbucket Pipelines, AWS, Node.js, Go, Kotlin and CouchDB.
It implements a system of layered security protocols based on a radically new software architecture that combines Elliptic Curve Cryptography (ECC) and Sobol sequencing with an n-dimensional chain as part of an AI-enhanced, platform-wide community consensus mechanism - a mechanism that assigns mutually agreed value to data and to user security protocol upgrades (further encouraging enhanced data integrity) by deploying a Delegated Proof of Stake (DPoS) protocol.
The Bank of England has reached out to Decentr to discuss the potential of a UK CBDC after hearing about the potential of their tech. This is consistent with Decentr's own R&D into a "dGBP", and they requested a top-level document for review >> Decentr created this proposal: https://decentr.net/files/Decentr_Consultancy_Doc_UK_CBDC.pdf
A fee is charged for every transaction using dPay whereby an exchange takes place between money (fiat and digital) and data, and vice versa, either as part of DeFi features or via a dApp built on Decentr. They are launching pilot programmes in the following industries:
Banking/PSP industry: On product launch, thanks to Decentr's powerful PSP connections (including the world's #2 PSP by volume), a medium-scale pilot programme will be launched that seeds the network with 150,000 PSP customers, primarily in the Spanish/LAC markets, generating revenue from day one.
“Bricks and Mortar” Supermarket/Grocery Industry: Decentr aims to ensure the long-term competitiveness of “bricks and mortar” supermarkets against online-only grocery retailers, such as Amazon, by a) building secure tech that allows supermarkets to digitise every aspect of their supply chains and operational functions, while b) allowing supermarkets to leverage this incredibly valuable data as a liquid asset class. Expected revenue by Year 5: $114Mn per year.
Online advertising industry: Decentr's 100% decentralised platform credits users' secure data with payable value, in the form of PDV, for engaging with ads. For comparison, the Brave browser has reached over 12 million monthly active users, with as many as 4.3 million daily active users.
TOKEN $DEC AND SALE
Decentr recently completed their token sale on a purchase portal powered by Dolomite, where they raised $974,000 in 10 minutes toward a total sale hardcap of $1.25M. The $DEC token is actively trading on multiple exchanges, including Uniswap and IDEX. It is listed for free on IDEX, Hotbit, Hoo, CoinW, Tidex and BKEX; listed on CoinGecko and CoinMarketCap; and listed on the Delta and Blockfolio apps. ➡️ Circulating supply: 61m $DEC. ➡️ Release schedule and token distribution LINK -> NO RELEASE UNTIL 2021.
- A tradeable unit of value that is both internal and external to the Decentr platform.
- A unit of conversion between fiat entering and exiting the Decentr ecosystem.
- A way to capture the value of user data, combining the activity of every participant performing payments (dPay) or lending and borrowing (dLoan) - i.e. a way to peg PDV to tangible/actionable value.
- A method of payment in the Decentr ecosystem.
- A method of internally underwriting the "Deconomy".
Scaling Reddit Community Points with Arbitrum Rollup: a piece of cake
https://preview.redd.it/b80c05tnb9e51.jpg?width=2550&format=pjpg&auto=webp&s=850282c1a3962466ed44f73886dae1c8872d0f31 Submitted for consideration to The Great Reddit Scaling Bake-Off. Baked by the pastry chefs at Offchain Labs. Please send questions or comments to [email protected]

1. Overview

We're excited to submit Arbitrum Rollup for consideration to The Great Reddit Scaling Bake-Off. Arbitrum Rollup is the only Ethereum scaling solution that supports arbitrary smart contracts without compromising on Ethereum's security or adding points of centralization. For Reddit, this means that Arbitrum can not only scale the minting and transfer of Community Points, but can also foster a creative ecosystem built around Reddit Community Points, enabling points to be used in a wide variety of third-party applications. That's right -- you can have your cake and eat it too! Arbitrum Rollup isn't just Ethereum-style: its Layer 2 transactions are byte-for-byte identical to Ethereum's, which means Ethereum users can continue to use their existing addresses and wallets, and Ethereum developers can continue to use their favorite toolchains and development environments out-of-the-box with Arbitrum. Coupling Arbitrum's tooling compatibility with its trustless asset interoperability, Reddit can not only scale but can onboard the entire Ethereum community at no cost by giving them the same experience they already know and love (well, certainly know). To benchmark how Arbitrum can scale Reddit Community Points, we launched the Reddit contracts on an Arbitrum Rollup chain. Since Arbitrum provides full Solidity support, we didn't have to rewrite the Reddit contracts or try to mimic their functionality using an unfamiliar paradigm. Nope, none of that. We launched the Reddit contracts unmodified on Arbitrum Rollup, complete with support for minting and distributing points.
Like every Arbitrum Rollup chain, the chain includes a bridge interface through which users can transfer Community Points or any other asset between the L1 and L2 chains. Arbitrum Rollup chains also support dynamic contract loading, which would allow third-party developers to launch custom ecosystem apps that integrate with Community Points on the very same chain that runs the Reddit contracts.

1.1 Why Ethereum

Perhaps the most exciting benefit of distributing Community Points using a blockchain is the ability to seamlessly port points to other applications and use them in a wide variety of contexts. Applications may involve simple transfers, such as a restaurant that allows Redditors to spend points on drinks, or complex smart contracts, such as placing Community Points as a wager in a multiparty game or as collateral in a financial contract. The common denominator among all of the fun uses of Reddit points is that they need a thriving ecosystem of both users and developers, and the Ethereum blockchain is perhaps the only smart contract platform with significant adoption today. While many Layer 1 blockchains boast lower cost or higher throughput than the Ethereum blockchain, more often than not these attributes mask the reality of little usage, weaker security, or both. Perhaps another platform with significant usage will rise in the future. But today, Ethereum captures the mindshare of the blockchain community, and for Community Points to provide the most utility, the Ethereum blockchain is the natural choice.

1.2 Why Arbitrum

While Ethereum's ecosystem is unmatched, the reality is that fees are high and capacity is too low to support the scale of Reddit Community Points. Enter Arbitrum. Arbitrum Rollup provides all of the ecosystem benefits of Ethereum, but with orders of magnitude more capacity and at a fraction of the cost of native Ethereum smart contracts. And best of all, we don't change the experience for users.
They continue to use the same wallets, addresses, languages, and tools. Arbitrum Rollup is not the only solution that can scale payments, but it is the only developed solution that can scale both payments and arbitrary smart contracts trustlessly, which means that third parties can build highly scalable add-on apps that can be used without withdrawing money from the Rollup chain. If you believe that Reddit users will want to use their Community Points in smart contracts -- and we believe they will -- then it makes the most sense to choose a single scaling solution that can support the entire ecosystem, eliminating friction for users. We view being able to run smart contracts in the same scaling solution as fundamentally critical, since if there's significant demand for running smart contracts from Reddit's ecosystem, this would be a load on Ethereum and would itself require a scaling solution. Moreover, having different scaling solutions for the minting/distribution/spending of points and for third-party apps would be burdensome for users, as they'd have to constantly shuffle their Points back and forth.

2. Arbitrum at a glance

Arbitrum Rollup has a unique value proposition, as it offers a combination of features that no other scaling solution achieves. Here we highlight its core attributes. Decentralized. Arbitrum Rollup is as decentralized as Ethereum. Unlike some other Layer 2 scaling projects, Arbitrum Rollup doesn't have any centralized components or centralized operators who can censor users or delay transactions. Even in non-custodial systems, centralized components pose a risk, as the operators are generally incentivized to increase their profit by extracting rent from users, often in ways that severely degrade the user experience. Even if centralized operators are altruistic, centralized components are subject to hacking, coercion, and potential liability. Massive scaling. Arbitrum achieves order-of-magnitude scaling over Ethereum's L1 smart contracts.
Our software currently supports 453 transactions per second for basic transactions (at 1616 Ethereum gas per tx). We have a lot of room left to optimize (e.g. aggregating signatures), and over the next several months capacity will increase significantly. As described in detail below, Arbitrum can easily support and surpass Reddit's anticipated initial load, and its capacity will continue to improve as Reddit's needs grow. Low cost. The cost of running Arbitrum Rollup is quite low compared to L1 Ethereum and to other scaling solutions such as those based on zero-knowledge proofs. Layer 2 fees are low, fixed, and predictable, and should not be overly burdensome for Reddit to cover. Nobody needs special equipment or high-end machines: Arbitrum requires validators, which is a permissionless role that can be run on any reasonable online machine. Although anybody can act as a validator, in order to protect against a "tragedy of the commons" and make sure reputable validators are participating, we support a notion of "invited validators" who are compensated for their costs. In general, users pay (low) fees to cover the invited validators' costs, but we imagine that Reddit may cover this cost for its users. See more on the costs and validator options below. Ethereum developer experience. Not only does Arbitrum support EVM smart contracts, but the developer experience is identical to that of L1 Ethereum contracts and fully compatible with Ethereum tooling. Developers can port existing Solidity apps or write new ones using their favorite and familiar toolchains (e.g. Truffle, Buidler). There are no new languages or coding paradigms to learn. Ethereum wallet compatibility. Just as in Ethereum, Arbitrum users need only hold keys; they do not have to store any coin history or additional data to protect or access their funds.
Since Arbitrum transactions are semantically identical to Ethereum L1 transactions, existing Ethereum users can use their existing Ethereum keys with their existing wallet software, such as MetaMask. Token interoperability. Users can easily transfer their ETH, ERC-20 and ERC-721 tokens between Ethereum and the Arbitrum Rollup chain. As we explain in detail below, it is possible to mint tokens in L2 that can subsequently be withdrawn and recognized by the L1 token contract. Fast finality. Transactions complete with the same finality time as Ethereum L1 (and it's possible to get faster finality guarantees by trading away trust assumptions; see the Arbitrum Rollup whitepaper for details). Non-custodial. Arbitrum Rollup is a non-custodial scaling solution, so users control their funds/points, and neither Reddit nor anyone else can ever access or revoke points held by users. Censorship resistant. Since it's completely decentralized and the Arbitrum protocol guarantees progress trustlessly, Arbitrum Rollup is just as censorship-proof as Ethereum. Block explorer. The Arbitrum Rollup block explorer allows users to view and analyze transactions on the Rollup chain.

Limitations

Although this is a bake-off, we're not going to sugarcoat anything. Arbitrum Rollup, like any Optimistic Rollup protocol, does have one limitation: the delay on withdrawals. As for the concrete length of the delay, we've done a good deal of internal modeling and have blogged about it as well. Our current modeling suggests that a 3-hour delay is sufficient (though, as discussed in the linked post, there is a tradeoff between the length of the challenge period and the size of the validators' deposit). Note that this doesn't mean the chain is delayed for three hours. Arbitrum Rollup supports pipelined execution, which means that validators can keep building new states even while previous ones are "in the pipeline" for confirmation.
As the challenge delay expires for each update, a new state is confirmed (read more about this here). So activity and progress on the chain are not delayed by the challenge period; the only thing that's delayed is the consummation of withdrawals. Recall, though, that any single honest validator knows immediately (at the speed of L1 finality) which state updates are correct and can guarantee that they will eventually be confirmed, so once a valid withdrawal has been requested on-chain, every honest party knows that the withdrawal will definitely happen. There's a natural place here for a liquidity market in which a validator (or someone who trusts a validator) provides withdrawal loans for a small interest fee. This is a no-risk business for them, as they know which withdrawals will be confirmed (and can force their confirmation trustlessly no matter what anyone else does) but are just waiting for on-chain finality.

3. The recipe: how Arbitrum Rollup works

For a description of the technical components of Arbitrum Rollup and how they interact to create a highly scalable protocol with a developer experience identical to Ethereum's, please refer to the following documents: the Arbitrum Rollup whitepaper, and the Arbitrum academic paper (which describes a previous version of Arbitrum).

4. Developer docs and APIs

For full details about how to set up and interact with an Arbitrum Rollup chain or validator, please refer to our developer docs at https://developer.offchainlabs.com/. Note that the Arbitrum version described on that site is older and will soon be replaced by the version we are entering in the Reddit Bake-Off, which is still undergoing internal testing before public release.

5. Who are the validators?

As with any Layer 2 protocol, advancing the protocol correctly requires at least one validator (sometimes called a block producer) that is honest and available. A natural question is: who are the validators?
Recall that the validator set for an Arbitrum chain is open and permissionless; anyone can start or stop validating at will. (A useful analogy is to full nodes on an L1 chain.) But we understand that even though anyone can participate, Reddit may want to guarantee that highly reputable nodes are validating their chain. Reddit may choose to validate the chain themselves and/or hire third-party validators. To this end, we have begun building a marketplace for validator-for-hire services so that dapp developers can outsource validation to reputable nodes with high uptime. We've announced a partnership in which Chainlink nodes will provide Arbitrum validation services, and we expect to announce more partnerships shortly with other blockchain infrastructure providers. Although there is no requirement that validators be paid, Arbitrum's economic model tracks validators' costs (e.g. amount of computation and storage) and can charge small fees on user transactions, using a gas-type system, to cover those costs. Alternatively, a single party such as Reddit can agree to cover the costs of invited validators.

6. Reddit contract support

Since Arbitrum contracts and transactions are byte-for-byte compatible with Ethereum, supporting the Reddit contracts is as simple as launching them on an Arbitrum chain. Minting. Arbitrum Rollup supports hybrid L1/L2 tokens which can be minted in L2 and then withdrawn onto the L1. An L1 contract at address A can make a special call to the EthBridge, which deploys a "buddy contract" at the same address A on an Arbitrum chain. Since it's deployed at the same address, users know that the L2 contract is the authorized "buddy" of the L1 contract on the Arbitrum chain. For minting, the L1 contract is a standard ERC-20 contract which mints and burns tokens when requested by the L2 contract.
It is paired with an ERC-20 contract in L2 which mints tokens based on whatever programmer-provided minting facility is desired, and burns tokens when they are withdrawn from the rollup chain. Given this base infrastructure, Arbitrum can support any smart-contract-based method for minting tokens in L2, and indeed we directly support Reddit's signature/claim-based minting in L2. Batch minting. What's better than a mint cookie? A whole batch! In addition to supporting Reddit's current minting/claiming scheme, we built a second minting design, which we believe outperforms the signature/claim system in many scenarios. In the current system, Reddit periodically issues signed statements to users, who then take those statements to the blockchain to claim their tokens. An alternative approach would have Reddit directly submit the list of users/amounts to the blockchain and distribute the tokens to the users without the signature/claim process. To optimize the cost efficiency of this approach, we designed an application-specific compression scheme to minimize the size of the batch distribution list. We analyzed the data from Reddit's previous distributions and found that it is highly compressible, since token amounts are small and repeated, and addresses appear multiple times. Our function groups transactions by size and replaces previously-seen addresses with a shorter index value. We wrote client code to compress the data, wrote a Solidity decompression function, and integrated that function into Reddit's contract running on Arbitrum.
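The compression idea described above (group grants by amount, replace repeat addresses with short back-references) can be sketched as follows. This is an illustrative reconstruction of the scheme as described, not Offchain Labs' actual byte-level encoding; the data and function names are invented for the example.

```python
# Illustrative sketch of the batch-minting compression idea: group grants
# by amount and replace previously-seen addresses with short index values.
# Not the actual Arbitrum encoding, which has its own byte layout.
from collections import defaultdict


def compress(grants):
    """grants: list of (address, amount) pairs from a distribution round."""
    seen = {}                   # address -> index of its first appearance
    groups = defaultdict(list)  # amount -> encoded recipients
    for addr, amount in grants:
        if addr in seen:
            groups[amount].append(("ref", seen[addr]))  # short back-reference
        else:
            seen[addr] = len(seen)
            groups[amount].append(("addr", addr))       # full address, once
    # Each amount is emitted once per group, so repeated small grants cost
    # only a back-reference each.
    return dict(groups)


batch = [("0xaa", 5), ("0xbb", 5), ("0xaa", 10), ("0xcc", 5)]
print(compress(batch))

# Rough break-even check using the byte counts reported below (~11.8 bytes
# per batched mint vs ~174 bytes per on-chain signature claim): by raw byte
# count, batching wins once roughly 7% of users would otherwise claim,
# the same ballpark as the report's ~5% figure (which also weighs gas
# pricing per byte).
print(11.8 / 174)
```

A Solidity decompressor on the chain side would walk the same structure in reverse, which is what makes the scheme portable to any general-purpose smart contract platform.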
When we ran the compression function on the previous Reddit distribution data, we found that we could compress batched minting data down to 11.8 bytes per minting event (averaged over a 6-month trace of Reddit's historical token grants), compared with roughly 174 bytes of on-chain data needed for the signature/claim approach to minting (roughly 43 for an RLP-encoded null transaction + 65 for Reddit's signature + 65 for the user's signature + roughly 8 for the number of Points). The relative benefit of the two approaches with respect to on-chain calldata cost depends on the percentage of users who will actually claim their tokens on-chain. With the above figures, batch minting will be cheaper if roughly 5% or more of users redeem their claims. We stress that our compression scheme is not Arbitrum-specific and would be beneficial in any general-purpose smart contract platform.

8. Benchmarks and costs

In this section, we give the full costs of operating the Reddit contracts on an Arbitrum Rollup chain, including the L1 gas costs for the Rollup chain, the costs of computation and storage for the L2 validators, and the capital lockup requirements for staking. Arbitrum Rollup is still on testnet, so we did not run mainnet benchmarks. Instead, we measured the L1 gas cost and L2 workload for Reddit operations on Arbitrum and calculated the total cost assuming current Ethereum gas prices. As noted below in detail, our measurements do not assume that Arbitrum is consuming the entire capacity of Ethereum. We will present the details of our model now, but for full transparency you can also play around with it yourself and adjust the parameters, by copying the spreadsheet found here. Our cost model is based on measurements of Reddit's contracts, running unmodified (except for the addition of a batch minting function) on Arbitrum Rollup on top of Ethereum. On the distribution of transactions and frequency of assertions:
Reddit's instructions specify the following minimum parameters that submissions should support: Over a 5 day period, your scaling PoC should be able to handle:
100,000 point claims (minting & distributing points)
75,000 one-off points burning
We provide the full costs of operating an Arbitrum Rollup chain with this usage under the assumption that tokens are minted or granted to users in batches, but other transactions are uniformly distributed over the 5-day period. Unlike some other submissions, we do not make the unrealistic assumption that all operations can be submitted in enormous batches. We assume that batch minting is done in batches that use only a few percent of an L1 block's gas, and that other operations come in evenly over time and are submitted in batches, with one batch every five minutes to keep latency reasonable. (Users are probably already waiting for L1 finality, which takes at least that long to achieve.) We note that assuming that there are only 300,000 transactions arriving uniformly over the 5-day period makes our benchmark numbers lower, but we believe this reflects the true cost of running the system. To see why, say that batches are submitted every five minutes (20 L1 blocks) and there's a fixed overhead of c bytes of calldata per batch, the cost of which gets amortized over all transactions executed in that batch. Assume that each individual transaction adds a marginal cost of t. Lastly, assume the capacity of the scaling system is high enough that it could support all of Reddit's 300,000 transactions within a single 20-block batch (i.e. there are more than c + 300,000*t bytes of calldata available in 20 blocks). Consider what happens if c, the per-batch overhead, is large (which it is in some systems, but not in Arbitrum). If transactions actually arrive at the system's capacity and each batch is full, then c gets amortized over 300,000 transactions. But if we assume that the system is not running at capacity -- and only receives 300,000 transactions arriving uniformly over 5 days -- then each 20-block assertion will contain about 200 transactions, and each transaction will pay a nontrivial cost due to c.
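The amortization argument above can be made concrete with a few lines of arithmetic. The values of c and t below are hypothetical, chosen only to show the effect; the batch count and per-batch transaction figure follow directly from the stated assumptions (one batch every five minutes for five days, 300,000 transactions total).

```python
# Sketch of the per-batch amortization argument: with uniform arrival,
# each five-minute batch holds only ~200 of the 300,000 transactions,
# so the fixed per-batch overhead c is spread far more thinly than when
# everything lands in one full-capacity mega-batch.
def per_tx_bytes(total_tx, batches, c, t):
    # c = fixed calldata overhead per batch (bytes); t = marginal bytes/tx
    per_batch = total_tx / batches
    return t + c / per_batch


BATCHES = 5 * 24 * 12  # one batch every 5 minutes for 5 days = 1440 batches
print(300_000 / BATCHES)  # ~208 transactions per batch

# Hypothetical c and t, purely to illustrate the effect of a large c:
print(per_tx_bytes(300_000, 1, c=5_000, t=12))        # one mega-batch
print(per_tx_bytes(300_000, BATCHES, c=5_000, t=12))  # uniform arrival
```

With a large c the uniform-arrival cost per transaction is markedly higher than the mega-batch figure, which is exactly why reporting only full-capacity numbers understates real-world cost.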
We are aware that other proposals presented scaling numbers assuming that 300,000 transactions arrived at maximum capacity and were executed in a single mega-transaction, but according to our estimates, for at least one such report, this led to a reported gas price that was 2-3 orders of magnitude lower than it would have been assuming uniform arrival. We make more realistic batching assumptions, and we believe Arbitrum compares well when batch sizes are realistic. Our model. Our cost model includes several sources of cost:
L1 gas costs: This is the cost of posting transactions as calldata on the L1 chain, as well as the overhead associated with each batch of transactions, and the L1 cost of settling transactions in the Arbitrum protocol.
Validator’s staking costs: In normal operation, one validator will need to be staked. The stake is assumed to be 0.2% of the total value of the chain (which is assumed to be $1 per user who is eligible to claim points). The cost of staking is the interest that could be earned on the money if it were not staked.
Validator computation and storage: Every validator must do computation to track the chain’s processing of transactions, and must maintain storage to keep track of the contracts’ EVM storage. The cost of computation and storage are estimated based on measurements, with the dollar cost of resources based on Amazon Web Services pricing.
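The staking line item above can be sketched numerically. The 0.2% stake and the $1-per-eligible-user valuation come from the model as stated; the 5% annual interest rate and the 100,000-user figure below are placeholder assumptions for illustration, not numbers from the report.

```python
# Rough sketch of the staking-cost line item: stake 0.2% of chain value
# (valued at $1 per eligible user, per the model above), costed as forgone
# interest. The 5% APR here is a placeholder assumption.
def staking_cost_usd(eligible_users, days, annual_rate=0.05):
    chain_value = eligible_users * 1.0   # $1 per eligible user
    stake = 0.002 * chain_value          # 0.2% of chain value
    return stake * annual_rate * days / 365


# e.g. 100,000 eligible claimants over the 5-day benchmark window:
print(round(staking_cost_usd(100_000, days=5), 2))  # 0.14
```

Even under generous assumptions the forgone interest is a rounding error next to L1 calldata, which matches the modeling conclusion in the next paragraph.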
It's clear from our modeling that the predominant cost is L1 calldata. This will probably be true for any plausible rollup-based system. Our model also shows that Arbitrum can scale to workloads much larger than Reddit's nominal workload without exhausting L1 or L2 resources. The scaling bottleneck will ultimately be calldata on the L1 chain. We believe that cost could be reduced substantially, if necessary, by clever encoding of data. (In our design, any compression/decompression of L2 transaction calldata would be done by client software and L2 programs, never by an L1 contract.)

9. Status of Arbitrum Rollup

Arbitrum Rollup is live on the Ethereum testnet. All of the code written to date, including everything in the Reddit demo, is open source and permissively licensed under the Apache V2 license. The first testnet version of Arbitrum Rollup was released in February. Our current internal version, which we used to benchmark the Reddit contracts, will be released soon and will be a major upgrade. Both the Arbitrum design and its implementation have been heavily audited by independent third parties. The Arbitrum academic paper was published at USENIX Security, a top-tier peer-reviewed academic venue. For the Arbitrum software, we have engaged Trail of Bits for a security audit, which is currently ongoing, and we are committed to having a clean report before launching on Ethereum mainnet.

10. Reddit Universe Arbitrum Rollup chain

The benchmarks described in this document were all measured using the latest internal build of our software. When we release the new software upgrade publicly, we will launch a Reddit Universe Arbitrum Rollup chain as a public demo, which will contain the Reddit contracts as well as a Uniswap instance and a Connext Hub, demonstrating how Community Points can be integrated into third-party apps. We will also allow members of the public to dynamically launch ecosystem contracts.
We at Offchain Labs will cover the validating costs for the Reddit Universe public demo. If the folks at Reddit would like to evaluate our software prior to the public demo, please email us at [email protected] and we'd be more than happy to provide early access.

11. Even more scaling: Arbitrum Sidechains

Rollups are an excellent approach to scaling, and we are excited about Arbitrum Rollup, which far surpasses Reddit's scaling needs. But looking forward to Reddit's eventual goal of supporting hundreds of millions of users, there will likely come a time when Reddit needs more scaling than any Rollup protocol can provide. While Rollups greatly reduce costs, they don't break the linear barrier: every transaction has an on-chain footprint (because all calldata must be posted on-chain), albeit a far smaller one than on native Ethereum, and the L1's limitations end up being the bottleneck for capacity and cost. Since Ethereum has limited capacity, this linear use of on-chain resources means that costs will eventually increase superlinearly with traffic.

The good news is that we at Offchain Labs have a solution in our roadmap that can satisfy this extreme-scaling setting as well: Arbitrum AnyTrust Sidechains. Arbitrum Sidechains are similar to Arbitrum Rollup, but differ in that they name a permissioned set of validators. When a chain’s validators agree off-chain, they can greatly reduce the on-chain footprint of the protocol, requiring almost no data to be put on-chain. When the validators can't reach unanimous agreement off-chain, the protocol reverts to Arbitrum Rollup. Technically, Arbitrum Sidechains can be viewed as a hybrid between state channels and Rollup, switching back and forth as necessary, combining the performance and cost that state channels achieve in the optimistic case with the robustness of Rollup in other cases.
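The hybrid behavior described above can be caricatured as a tiny state machine. This is purely a conceptual sketch: the class, state names, and method are invented here for illustration and are not the actual protocol logic.

```python
from enum import Enum

class Mode(Enum):
    ANYTRUST = "anytrust"   # validators unanimous off-chain; almost no on-chain data
    ROLLUP = "rollup"       # fallback: full calldata posted on-chain

class SidechainSketch:
    """Toy model of switching between unanimous off-chain mode and Rollup."""

    def __init__(self, validators):
        self.validators = set(validators)
        self.mode = Mode.ANYTRUST

    def process_batch(self, signatures) -> Mode:
        # Optimistic case: every validator signed the batch off-chain, so only
        # a small unanimity certificate needs to touch the L1 chain.
        if set(signatures) == self.validators:
            self.mode = Mode.ANYTRUST
        else:
            # Any missing signature (dissent or downtime): revert to Rollup,
            # which stays safe and live with even a single honest validator.
            self.mode = Mode.ROLLUP
        return self.mode

chain = SidechainSketch(["a", "b", "c"])
chain.process_batch(["a", "b", "c"])  # unanimous -> stays in AnyTrust mode
chain.process_batch(["a", "b"])       # one validator missing -> falls back to Rollup
```

The hard part, as the next section notes, is not the switch itself but proving that no attack vector opens up during the transition between modes.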
The core technical challenge is how to switch seamlessly between modes and how to guarantee that security is maintained throughout. Arbitrum Sidechains break through the linear barrier while still maintaining a high level of security and decentralization. They provide the AnyTrust guarantee: as long as any one validator is honest and available (even if you don't know in advance which one it will be), the L2 chain is guaranteed to execute correctly according to its code and to make progress. Unlike in a state channel, off-chain progress does not require unanimous consent, and liveness is preserved as long as there is a single honest validator.

Note that the trust model for Arbitrum Sidechains is much stronger than for typical BFT-style chains, which introduce a consensus "voting" protocol among a small permissioned group of validators. BFT-based protocols require a supermajority (more than 2/3) of validators to agree. In Arbitrum Sidechains, by contrast, all you need is a single honest validator to achieve guaranteed correctness and progress. Notice that in Arbitrum, adding validators strictly increases security, since the AnyTrust guarantee provides correctness as long as any one validator is honest and available. By contrast, in BFT-style protocols, adding nodes can be dangerous, as a coalition of dishonest nodes can break the protocol.

Like Arbitrum Rollup, the developer and user experiences for Arbitrum Sidechains will be identical to those of Ethereum. Reddit would be able to choose a large and diverse set of validators; to break through the scaling barrier, all they would need to guarantee is that a single one of them remains honest. We hope to have Arbitrum Sidechains in production in early 2021, so when Reddit reaches a scale that surpasses the capacity of Rollups, Arbitrum Sidechains will be waiting and ready to help.
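The claim that adding validators strictly increases security under AnyTrust can be illustrated numerically. Assume, purely for illustration, that each validator is independently dishonest with probability p: AnyTrust fails only if all n validators are dishonest, while a 2/3-threshold BFT protocol can be broken once more than a third of the nodes are dishonest.

```python
from math import comb

def anytrust_failure(p: float, n: int) -> float:
    """AnyTrust fails only if every one of the n validators is dishonest."""
    return p ** n

def bft_failure(p: float, n: int) -> float:
    """A 2/3-threshold BFT protocol is broken once more than n/3 nodes are dishonest."""
    threshold = n // 3  # safe while at most floor(n/3) nodes are dishonest
    return sum(comb(n, k) * p**k * (1 - p)**(n - k)
               for k in range(threshold + 1, n + 1))

# With p = 0.1, growing the validator set from 4 to 16 drives the AnyTrust
# failure probability from 1e-4 down toward 1e-16, since every added
# validator multiplies it by p. The BFT failure probability, which needs a
# coalition of more than n/3, does not enjoy that exponential improvement.
```

This toy model assumes independence, which real coalitions violate; the qualitative point, that AnyTrust failure shrinks exponentially with each added validator, does not depend on that assumption.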
While the idea of switching between channels and Rollup to get the best of both worlds is conceptually simple, getting the details right and making sure that the switch does not introduce any attack vectors is highly non-trivial, and it has been the subject of years of our research (indeed, we were working on this design for years before the term Rollup was even coined).

12. How Arbitrum compares

We include a comparison to several other categories, as well as to specific projects where appropriate, and explain why we believe that Arbitrum is best suited for Reddit's purposes. We focus our attention on other Ethereum projects.

Payment-only Rollups. Compared to Arbitrum Rollup, ZK-Rollups and other Rollups that only support token transfers have several disadvantages:
As outlined throughout this proposal, we believe that the entire draw of Ethereum is its rich smart-contract support, which is simply not achievable with today's zero-knowledge proof technology. Indeed, scaling with a ZK-Rollup will add friction to the deployment of smart contracts that interact with Community Points, as users would have to withdraw their coins from the ZK-Rollup and transfer them to a smart-contract system (like Arbitrum). The community will be best served if Reddit builds on a platform with built-in, frictionless smart-contract support.
All other Rollup protocols of which we are aware employ a centralized operator. While it's true that users retain custody of their coins, the centralized operator can profit from censoring, reordering, or delaying transactions. A common misconception is that, because these are non-custodial protocols, a centralized sequencer poses no risk; this is incorrect, as the sequencer can wreak havoc or shake down users for side payments without directly stealing funds.
Sidechain-type protocols can eliminate some of these issues, but they are not trustless. Instead, they require trust in some quorum of a committee, often requiring two-thirds of the committee to be honest, compared to rollup protocols like Arbitrum that require only a single honest party. In addition, not all sidechain-type protocols have committees that are diverse, or even non-centralized, in practice.
Plasma-style protocols have a centralized operator and do not support general smart contracts.
13. Concluding Remarks

While it's ultimately up to the judges’ palate, we believe that Arbitrum Rollup is the bake-off choice that Reddit kneads. We far surpass Reddit's specified workload requirement today, have much room to optimize Arbitrum Rollup in the near term, and have a clear path to take Reddit to hundreds of millions of users. Furthermore, we are the only project that gives developers and users an interface identical to the Ethereum blockchain, fully interoperable and tooling-compatible, and we do all this without any new trust assumptions or centralized components. But no matter how the cookie crumbles, we're glad to have participated in this bake-off, and we thank you for your consideration.

About Offchain Labs

Offchain Labs, Inc. is a venture-funded New York company that spun out of Princeton University research and is building the Arbitrum platform to usher in the next generation of scalable, interoperable, and compatible smart contracts. Offchain Labs is backed by Pantera Capital, Compound VC, Coinbase Ventures, and others.

Leadership Team

Ed Felten

Ed Felten is Co-founder and Chief Scientist at Offchain Labs. He is on leave from Princeton University, where he is the Robert E. Kahn Professor of Computer Science and Public Affairs. From 2015 to 2017 he served at the White House as Deputy United States Chief Technology Officer and senior advisor to the President. He is an ACM Fellow and a member of the National Academy of Engineering. Outside of work, he is an avid runner, cook, and L.A. Dodgers fan.

Steven Goldfeder

Steven Goldfeder is Co-founder and Chief Executive Officer at Offchain Labs. He holds a PhD from Princeton University, where he worked at the intersection of cryptography and cryptocurrencies, including threshold cryptography, zero-knowledge proof systems, and post-quantum signatures.
He is a co-author of Bitcoin and Cryptocurrency Technologies, the leading textbook on cryptocurrencies, and he has previously worked at Google and Microsoft Research, where he co-invented the Picnic signature algorithm. When not working, you can find Steven spending time with his family, taking a nature walk, or twisting balloons.

Harry Kalodner

Harry Kalodner is Co-founder and Chief Technology Officer at Offchain Labs, where he leads the engineering team. Before the company, he attended Princeton as a Ph.D. candidate, where his research explored the economics, anonymity, and incentive compatibility of cryptocurrencies; he has also worked at Apple. When not up at 3:00am writing code, Harry occasionally sleeps.