Knowledge and specialization

Now that Thanksgiving is over, I’m trying to get back in the swing of things and stick to my habits. I’ve been terribly bad lately, staying up too late, not getting enough exercise, and spending too much money on beer. I’m sure a lot of it was exuberance with the bitcoin price action; I was way too excited about my prospects. This Thanksgiving pullback seems to have deflated my mania. I also deleted TradingView and my cointracking apps off of my phone.

I spent a lot of time playing Elite: Dangerous the past week after it was offered for free on Epic Games. It’s a massive timesuck. I played six hours after dinner on Thursday, trying to complete a mission that may have been bugged. It’s a great game, but I feel like I need to delete it from my PC. Otherwise I may wind up pulling the Saitek flightstick and Oculus out of the closet. May as well buy some adult diapers while I’m at it.

Last night I spent the evening working on a bug fix pull request for a Hardhat issue that no one had been working on. I ran into it while working on my Ether Auction unit tests; two other people seem to have discovered it as well, so I decided to take a look at it. It was actually pretty easy to fix. The Hardhat library explicitly imports many of the matchers in the Waffle library; these balance change functions were just left out. I tested one of them within my actual repo and it worked fine, but I’m not sure of the most efficient way to design a unit test around every one of these functions. I may take another crack at it later.

I still get a kick out of contributing to projects, knowing that my contributions are forever going to be associated with the project history, even if I’m just doing minor stuff like fixing a typo or adding a couple of imports to a file. It’s like every commit or PR gives me dev points, and each one still gives me a little dopamine hit.

There was a quote that’s had my attention recently: “the more knowledgeable you become, the more you tend to specialize.” I can’t recall the exact wording, but the gist is that as one gains knowledge in a field, one tends toward specialization in a particular niche. On the one hand, my professional knowledge has tended to spike toward a topic for a period of time before moving on in a different direction: Jack of all trades, master of none, if you will. I might be overly harsh on myself. On the other hand, there is an immense freedom in this, in that I’m not trapped in a particular niche. Still, it may be too much freedom, and it has prevented me from truly mastering particular domains. I usually focus on what I need to get things done, then it’s on to the next thing.

When I was younger, I used to read entire documentation manuals from front to back. I remember the Windows 2000 Server administrator guides, 1700-page monstrosities that I would flip through with great curiosity. It got me pretty far in my career, and still serves me well as far as Windows server administration goes. I’m just trying hard to get away from that career.

Now that I’m in my forties, there’s a part of me that wonders whether I can still absorb information and pick up new skills like I used to. My brain is a sponge, but is its absorption rate still what it used to be? I don’t know whether it’s just doubt, impostor syndrome, or too much time to think, but part of me wonders whether I have what it takes to succeed in these endeavors I’m undertaking.

As far as my day job goes, it seems that I’ve already made a semi-conscious effort to stop keeping up with the changes. There’s so much consolidation in the managed services space, with remote management and process service vendors merging left and right. New offerings are popping up all the time. Every few days Bossmang will send me an email about some online conference for this or that, SonicWall’s new OS offering, say, and ask me if I’ve registered for it.

“Nope.” Ain’t gonna.

I’m simply focused on maintaining our existing customer networks; we haven’t had a new client in eighteen months. Right now my priority is completing the Ether Auction. Then I’ll have my first Ethereum project under my belt, I’ll feel comfortable calling myself an Ethereum developer, and that will open a lot of doors. I know I can get the first version done before a COVID vaccine is widely available, and then I should be able to go anywhere.

I’ve always loved casting a wide net and going wherever my fancy took me. Unfortunately, this results in a bit of decision paralysis. I must get this from my mom. My dad called it “flightiness”; she always had a new hobby, whether it be woodcrafts, painting, stained glass, or hiking. I tend to take after her in that respect, in what Missus calls my six-month projects.

I’m more focused on seeing things through to the end these days, but I’ve always lived by the maxim that works of art are never completed, merely abandoned. Maybe it’s a matter of specification, the difference between building to requirements versus a creative project based more on exploration.

Focusing on the process rather than the outcome seems to help. Sit down, code, make progress toward the next thing. Resist the urge to start something new. Say “no”. Have constraints. Reduce the possible futures to the set of those where success (or at least completion) is inevitable. Collapse the superposition to a finite state, hit that commit button, deploy that contract, so you can say it.

“I’m done.”

Changing strategies

Another day, another 2020 ATH. $BTC broke $18K before I went to bed last night, and despite a $1000 drop that was quickly eaten up, we’re back at $18,200, ready to continue our journey.

I’ve been seeing something interesting play out with Bitcoin-related equities like $RIOT, $MARA, and $BTBT; they’ve been running faster than BTC and $GBTC. I’m not sure why; I guess investors think they’re undervalued. The main question is whether I’m over-allocated in GBTC right now.

It’s 43%, so yeah. I figured it was the best play on a BTC run, but with the others outperforming, I may need to reallocate. I’m almost out of cash, though, so I’ll need to sell some to do it. My last GBTC purchase was on July 15, at $13.60. The current price is over $20, so I should probably sell enough to cover a week’s worth of my value averaging protocols for the big gainers. My two percent targets have increased, so I’ll have to update my value averaging script and make sure I have enough capital allocated to let it run.
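The script itself has more moving parts, but the core of value averaging is simple: each period the target value grows by a set rate, and the trade is whatever closes the gap between the target and the actual value. A minimal sketch, where the function name and every number are made up for illustration rather than taken from my actual script:

```python
# Minimal value-averaging sketch -- the growth rate and dollar figures
# below are illustrative placeholders, not my real targets.

def va_trade(current_value, prev_target, growth_rate):
    """Return (new_target, trade): positive trade means buy, negative means sell."""
    new_target = prev_target * (1 + growth_rate)
    return new_target, new_target - current_value

# Position was targeted at $1000 last period and grew to $1050 on its own;
# with a 2% target growth the new target is $1020, so we'd trim about $30.
target, trade = va_trade(1050.0, 1000.0, 0.02)
```

The nice property of this over dollar cost averaging is that it automatically sells into strength: when the position outruns the target, the trade comes back negative.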

Done. Now to see how it plays out.

Progress continues on the Ether Auction. I figured out some problems with the Hardhat/Waffle/Chai tests that were failing, and added some functionality to allow claiming the proceeds. Writing the tests out really forces one to clarify all the possibilities. I also found a big bug with the time lock parts of the contracts that made them end seven seconds after they started, not seven days.

I’ve got some additional test cases that I need to write to finish the auction contract itself, then I need to figure out how to deploy it to a test network and start working on the UI. I’ve started going through the React course on FreeCodeCamp, and will deploy a create-eth-app once I have the contract deployed.

I want the website up and running before I deploy the first contract. I’m debating whether I need the autodeployer setup, or whether I just want to manage the first one manually as a test.

There are also some interesting things in the OpenZeppelin upgradeable contracts that I may want to add before all is said and done. I’ve implemented a self-destruct function that can be called three days after the auction ends, but using one of the proxy contracts to hold state might allow me to reset the auctions for subsequent rounds.

We’ll see how things develop.

More Ethereum work

I haven’t been keeping up with my habits lately, and haven’t written in three days. I also haven’t been working out much, instead drinking and staying up too late. I could make excuses about how work, home, and the kids have been stressing me out, but that’s not it. I’ve just got to find other things to do.

I have been getting a lot done on the Ether Auction smart contracts. I’ve been using Hardhat, which is probably a bit too new and buggy for me, given my experience level with Solidity development, but I’m slowly moving forward with TDD. There are a couple of bugs that are preventing me from putting the tests together like I would want: the Chai methods for ensuring wallet balances change between bid and withdrawal calls seem to be broken, and I can’t seem to retrieve the getters from a public mapping in the contract.

I’ve been reading up on Solidity best practices, fussing over the withdraw function on the app to make sure the withdrawals can’t be exploited. I want a really solid testing suite before I worry about deploying this thing. I’ve got a lot to figure out in the meantime. Just getting the testing suite working has been difficult enough, but I’ve still got to finish building the auction instance contracts and the deployer contract, then figure out how to deploy, and get the web3 UI up and running. I have no idea what I’m doing, and will have to learn it all on the fly.

Thankfully the Hardhat community Discord has several helpful people, although the team seems to have quite the backlog on the GitHub issues page.

I’m trying to just take it slow and break when I get frustrated. I’ve been working through the Javascript ES tutorials on FreeCodeCamp to try and fill the gaps in my knowledge, since I’m going to need it more and more.

I’m not happy with the pace of progress that I’m making, but as long as I make some progress every day I should be happy.

Continued optimism

Nose to the grindstone.

So I actually got quite a bit done yesterday, since I wasn’t obsessing over $BTC price action. I spent most of my time working in Hardhat, trying to figure out how to make tests work using the Waffle/Chai suite. I’m having a hard time wrapping my head around all the different dependencies. It’s a lot to take in, even for me, so I just had to turn in early last night and give my brain a rest.

I’ve been reading Kurt Vonnegut’s Player Piano for the past week. I finished Slaughterhouse-Five earlier last month (it’s a short read); Player Piano is much more like a regular novel. I’ve only gotten through the first fifth of it, but it’s quite amazing from a futurist standpoint. The novel deals with the economic and class consequences of automation and computerization, and even touches on things like standardized test scores determining one’s algorithmic destiny. It’s really making me think about the kids’ education.

Elder is really spending a lot of her day working on schoolwork. I know it’s not a lot compared to how much time she would be spending in class if they were in person, but it just seems like a lot of work for a third-grader. I find she’s often not paying attention to what the teacher is doing, doodling or reading something she’s not supposed to, and I feel like a hardass constantly telling her to pay attention. She gets frustrated by the homework, having to type everything up; I’ve been trying to reinforce her touch typing, but she often falls back to two fingers when she’s working.

And I’m pushing Younger with her reading. We’ve been doing IXL every day for the most part, and I’m working with her on language arts as much as I can. It’s stressful, because she gets frustrated easily, so we have to take it in short increments: a few questions, a TV show, a few questions, another show.

And trying to fit all this in while “working”…

Zombie, LLC’s home franchise was having their virtual convention yesterday, and I spent half of my workday trying unsuccessfully to get sound working in the Windows 10 VM that I use for work. I don’t know if it’s a problem with QEMU or the PulseAudio subsystem, but I tried converting my QEMU image over to a VirtualBox image and ran out of space. I tried watching the Zoom meeting on my host, but I’m stuck on wifi (another problem with the ethernet card), and the meeting was pretty much unwatchable. I also tried using the Azure VM that I use for the meeting, but the throughput on that was pretty horrible.

I really don’t know what to do about the networking issue other than put my nose to the grindstone and figure out what the hell is going on. I’m not sure if it’s a driver issue with the card itself or some sort of NetworkManager/Netplan issue that I messed up. I’m just not getting an IP address unless I run dhclient directly, and that only works for a few minutes. I’m really not looking forward to debugging the entire Ubuntu network stack.

I did have some small wins over the past few days. Lambo1, my six-GPU mining rig, had been acting up, so I wound up disconnecting the rig, pulling out every card one by one, and spraying them off with air. It looks like one of the power cables stopped working, but it took an hour of swapping and restarting to figure it out. The riser support was slipping down as well, which may have contributed. I also managed to finally figure out how ssh-agent and ssh-add work together with ssh to allow automatic login. It had always been one of those things that I managed to cobble together once in a blue moon, but I had to redo my GitLab and GitHub keys on both my development workstations, and now I’ve got it figured out. It’s so nice to be able to clone my repos and push without having to look up passwords.

I think my BTC bullishness may have caught on with the Missus. She’s sitting on a lot of cash right now and just opened a Vanguard account, per her FIRE peeps. I bought a small amount for her during the 2017 run-up, and it’s now worth three times what she paid for it. We were comparing notes on portfolio performance and she said, “OK, I’ll buy some more.” I’ve been trying to get her to set up a BlockFi account, but she’s had other things on her mind. I’ll probably just have her set the account up with some cash, and we’ll feed the interest into BTC. Maybe I’ll add some dollar cost averaging into the mix if she wants to fund it further.

Ether auction development

So I actually started programming the (Evil) Ether Auction that I’ve been thinking about for several weeks. I put the repo up on GitLab while I work through it, so that I can get some feedback on it before deploying it.

I’m still working on the actual auction portion of it. There are several Solidity auction tutorials out there, so I used those as a starting point while I refine the requirements for the app. The auction contract is deployed with a bid time parameter, and the pot is seeded via a separate transaction. The first bid sets the end time of the auction.

We keep track of the winner and first loser; once the auction is complete, all other bidders will be able to withdraw their funds from the contract. When the winner claims the pot, both the winner’s and first loser’s balances become property of the owner.
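Stripped of Solidity and gas concerns, that bookkeeping can be modeled in a few lines of Python. This is just an illustrative sketch of the rules as described; the class and method names are mine, not the contract’s, and bid validation, seeding, and timing are all omitted:

```python
# Python model of the auction rules above -- illustrative only; the real
# contract is Solidity, and these names are made up for the sketch.

class AuctionModel:
    def __init__(self, owner):
        self.owner = owner
        self.balances = {}       # total amount each address has bid
        self.winner = None       # current highest bidder
        self.first_loser = None  # previously highest bidder

    def bid(self, bidder, amount):
        # assume callers only bid when outbidding the current winner
        self.balances[bidder] = self.balances.get(bidder, 0) + amount
        if bidder != self.winner:
            self.first_loser, self.winner = self.winner, bidder

    def withdraw(self, bidder):
        # after the auction, everyone except the winner and first loser
        # can reclaim their bids
        if bidder in (self.winner, self.first_loser):
            return 0
        return self.balances.pop(bidder, 0)

    def claim(self):
        # when the winner claims the pot, the winner's and first loser's
        # balances are forfeited to the owner
        forfeited = self.balances.pop(self.winner, 0)
        forfeited += self.balances.pop(self.first_loser, 0)
        return forfeited
```

Writing it out this way makes the interesting property obvious: the contract’s owner collects the top two bidders’ entire balances, which is what makes the game evil.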

That’s the gist, anyways. I’m working out the details on a deployer contract that will keep the game running indefinitely, or until a set limit. I actually want the next round to be triggered by the winner claiming funds, some sort of callback to the deployer that takes the winnings from the previous auction and uses it to create a new one. I’ll probably add some sort of dev fee, and checks to make sure that the proceeds from the previous auction are more than the starting pot. I don’t think there’s any reason this can’t be done; I’ll have to do some gas tests to make sure claiming the pot doesn’t cost too much for a first round.

I’m planning on seeding the first round with one ETH, and letting the contract run until the last round is greater than 32 ETH. That’s actually pretty small change for some Ethereum whales, and there’s no reason I couldn’t make this work for specific ERC20 tokens.

I’m using the Hardhat framework to code this up right now, instead of Truffle and Ganache, and I’m not sure whether I’m going to stick with Solidity for this contract or change over to Vyper. I’ve got the framework up and running and have started writing tests, but I’m unfamiliar with Chai and having problems wrapping my head around how to structure the tests.

The last piece I want in place is some sort of web interface setup, something simple that will list the auction details and allow users to place bids or reclaim bids from previous auctions.

That’s what I’m envisioning, so we’ll see how things go as development continues.

Numbers in Ethereum and Javascript

So I feel like I made some significant progress today after going through this piece on numbers in Ethereum and Javascript. It’s quite a bit of trouble, especially because of the way that numbers and storage are returned from the web3 call() and getStorageAt() functions.

Case in point: I’m trying to compute an Ethereum amount multiplied by a percentage. Both values are stored in wei, which has eighteen decimal places. If you simply multiply the two together, you get a result that is off by a factor of 10^18, so the product needs to be divided by this factor before it is returned.

The web3 library in Javascript relies on BN.js, which stands for “big number”. It doesn’t work with decimals, and large values have to be passed as strings. So I can’t just pass 10**18 to make a big number; I have to convert it to a string first.

let BN = web3.utils.BN;
let decimal = new BN((10**18).toString())
let balance = new BN("196144358288748402370");
let rate = new BN("2500000000000000");
let result = balance.mul(rate).div(decimal);
console.log("Result: " + web3.utils.fromWei(result));
> Result: 0.490360895721871005

Performing this percentage calculation in Vyper is simple arithmetic, where all the numbers are uint256.

res: uint256 = (balance * pct) / 10 ** 18
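Python’s arbitrary-precision integers make the same fixed-point arithmetic just as painless, which is handy for sanity-checking the BN.js result above (the numbers here are the same ones from the Javascript example):

```python
# Same fixed-point percentage calculation as the BN.js example above,
# using Python's native big integers.
balance = 196144358288748402370  # wei
rate = 2500000000000000          # 0.25%, as an 18-decimal fixed-point value

# multiply, then scale back down by 10**18 (integer division, like BN.div)
result = balance * rate // 10**18

# convert wei to an ether string without touching floating point,
# mirroring what web3.utils.fromWei does
whole, frac = divmod(result, 10**18)
ether = f"{whole}.{frac:018d}"
```

This produces the string "0.490360895721871005", matching the BN.js output, with no string round-tripping required to build the operands.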

Calling Ethereum contract methods in Javascript

I am still trying to wrap my head around Javascript’s Promises and how to use them properly. I did make some progress today and managed to get some data off of the PRIA contract using web3.

A lot of my confusion stems from the behavior of the JS arrow function. I now understand it in relation to a Python lambda function. And now that I’ve figured out how to create an anonymous async function in Javascript, I think I’ve opened another door in my mind that is going to make this a bit easier to work with. I feel like I’ve leveled up.

I’ve created a small class that I can use to pull the data from the chain. I’m using Alchemy’s API to do this; my key is stored in a .env file.

// pria.js


require("dotenv").config(); // load ALCHEMY_URL from .env (assumes the dotenv package)
const { createAlchemyWeb3 } = require("@alch/alchemy-web3");
const web3 = createAlchemyWeb3(process.env.ALCHEMY_URL);
const priaContract = "0xb9871cb10738eada636432e86fc0cb920dc3de24";
const PRIA_ABI = [{"OMITTED FOR BREVITY"}] // copied from Remix or Etherscan

class Pria {
    constructor() {
        this.contract = new web3.eth.Contract(PRIA_ABI, priaContract);
        this.airdrop_threshold = "2500000000000000";

        // immediately invoked async function kicks off initialization
        (async () => {
            await this.init();
        })();
    }

    async init() {
        // pull the public state variables off the contract
        this.airdropAddressCount = await this.contract.methods.airdropAddressCount().call();
        this.burnRate = await this.contract.methods.burnRate().call();
        this.minimumForAirdrop = await this.contract.methods.minimum_for_airdrop().call();
        this.totalSupply = await this.contract.methods.totalSupply().call();
        this.tx_n = await this.contract.methods.tx_n().call();
        return true;
    }
}
There are various ways to call async functions from a class constructor. I’m probably not doing it very elegantly, but I think I understand how to use immediately invoked function expressions now. These self-executing anonymous functions remind me a lot of Lisp and the lambda calculus.

I’ll be using the Pria class within additional code to calculate the cost of spamming the PRIA airdrop list as I described earlier. There are additional state variables that I need to retrieve from the contract. These aren’t public, so I’ll have to do some more hacking to pull them directly from storage, after I figure out where to look.

Once that’s done, I’ll have my application logic compute the cost of two hundred transactions and the price to dump all the gained tokens on Uniswap. Then the exciting part will be waiting for it to become profitable.

Spamming the PRIA airdrop list for fun and profit

I spent most of the day working on $PRIA related things today, mostly trying to figure out how to read data from the smart contract using Alchemy. I learned a lot.

I bought a few more tokens this morning to try and get back on the airdrop list, but I miscalculated the threshold and messed up. So I’ve decided to codify the calculations to figure out whether initiating a transfer will work as a way to cheaply accumulate the tokens. A transfer costs more gas than a standard Ethereum transfer call because of the airdrop code called by the function.

By checking the balance of the airdrop address each time it changes, one can estimate the airdrop amount. It’s roughly 1/200th of the total amount, and changes depending on the ratio of the airdrop wallet balance and the total market cap.

The next step involves estimating the gas needed for a transfer. I haven’t gotten this far, but it’s part of the web3 framework. I could also run a test transaction myself. Most of the transactions happening right now are interacting with the Uniswap router, though, so their gas usage isn’t representative. Right now the floor seems to be about 400,000 gas, or about $3.51 in ETH.

In fact, as I write this, with the cost of PRIA just over two dollars, I could transfer the amount to myself, and potentially get back $0.50 worth of PRIA as a reward.

Of course, the tokenomics come into play here. I’d lose some PRIA on each transfer, increasing the burn amount with each transaction. Then I’d have to wait for another two hundred transactions to come through before I get my reward and can sell it. And then there’s the question of selling itself.

I was able to put something together in a spreadsheet to figure things out. With the current burn rate, one can self-transfer the minimum amount of PRIA needed to qualify for the airdrop and spam one’s address onto the payout list, taking up all 200 spots. At the current cycle, with the burn rate at 2.6% and about 600 PRIA in the airdrop pool, it will cost you less than six PRIA and drain the airdrop balance by roughly half. At this point one sits and waits for the next two hundred transactions, which will pay back at least 155 PRIA. This number assumes an additional 200 minimum qualifying transactions, which will continue to drain the airdrop balance.

The big problem, though, is the gas fees. Spamming two hundred transactions will cost a lot; I calculated it as about twice as much Ether as the expected payout.

Of course, this is a dynamic system. Prices change, and every time the system cycles through the airdrop list, the burn rate changes. So I’m in the process of building a script that can pull this data in real time and do the computations. Here’s the basic outline:

  • Monitor the airdrop address for balance changes. This can be done with Alchemy’s Notify webhooks.
  • Get the burn rate from the contract.
  • Get the current gas cost.
  • Get the PRIA price, either from Uniswap directly or via the CoinGecko API.
  • Calculate the wash trade costs (in PRIA) and the expected payout.
  • Estimate gas usage for the transfer function.
  • Compare gas fees to the expected minimum payout.
  • If profitable, execute 200 self-transfers.
  • Wait for the payout, then execute the Uniswap exchange.
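The profitability check at the heart of that outline boils down to a little arithmetic once the inputs are gathered. A rough sketch, where the function name and every input value are hypothetical placeholders rather than live chain data:

```python
# Rough PRIA spam break-even check -- all inputs here are hypothetical
# placeholders; the real script would pull them from the chain.

def spam_profit_eth(payout_pria, burn_cost_pria, pria_price_eth,
                    gas_per_transfer, gas_price_gwei, n_transfers=200):
    """Expected profit (in ETH) of spamming n self-transfers."""
    gas_cost_eth = n_transfers * gas_per_transfer * gas_price_gwei * 1e-9
    revenue_eth = (payout_pria - burn_cost_pria) * pria_price_eth
    return revenue_eth - gas_cost_eth

# e.g. 155 PRIA expected payout, 6 PRIA burned, PRIA at 0.005 ETH,
# 400k gas per transfer at 20 gwei -> gas alone is 1.6 ETH
profit = spam_profit_eth(155, 6, 0.005, 400_000, 20)
```

With placeholder numbers in that ballpark the gas bill is roughly twice the payout, which matches what I saw in the spreadsheet; the trade only flips positive if the PRIA price rises or gas falls substantially.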

There are several risks here.

First off, the calculation for the self-transfer needs to be perfect. If it’s too low, the payment won’t qualify for the airdrop list. If it’s too high, you’ll lose more than needed to the burn function.

Next, the gas calculations are tricky. PRIA transfers require much more gas than standard ERC20 token transfers, due to the airdrop system itself. Additionally, there are rate-adjustment functions that are triggered on turn one of the airdrop cycle, and further functions that are called at the end of each turn, when PRIA hits a floor or ceiling and swaps from burn to mint. I might be able to predict the fees more accurately by pulling the gas costs from previous airdrop payout transactions, but I’m not sure I could filter out all the Uniswap interactions from those. More likely, I’ll need to deploy PRIA on an internal testnet and spam the list for a couple of cycles to take an average.

Then there’s the risk that the price dumps before the next cycle completes. One could mitigate this by cycling the airdrop list for two whole cycles. That would double the gas costs, plus an additional percentage that I haven’t calculated yet. Theoretically, though, there is a point where the price of PRIA gets high enough, and gas prices low enough, that one could pay for such a double airdrop spam cycle, cover the cost by selling, then sit back and wait for the next two hundred transactions to trigger airdrop rewards at a profit.

That said, I have no idea how to do any of this yet. I’m trying to learn Javascript and web3 at the same time right now. I could use the Python library, but a basic knowledge of how JS promises and async functions work is something I need. I’ve been able to pull data from the blockchain and have done some quick models in a spreadsheet, but there’s so much left to figure out from a design standpoint, not to mention all the testing and data modeling I can do from here.

And who’s to say, even if I build it, that we’ll ever reach the point where this will work? PRIA’s less than a week old at this point, and I’m not sure whether it’ll get there or just peter out and die. All the work I’m doing will be useful either way, though, as the skills I’m picking up will make me a better engineer.

What if I could build something that could watch the blockchain and, when the moment’s right, fire off four hundred self-transfers, take the resulting income, and Uniswap it in one block? Wouldn’t that be glorious?

It might just work. Unless it gets frontrun.

Web3 development

Yesterday we had a party for Elder, since we were out of town for her birthday. Our quaranteam came over, so all the kids were running around the backyard while we ate pizza and wings and drank the latest batch of my homebrew. Missus even got me to break out my guitar, and I spent a good hour playing and singing at the top of my lungs. It was good times.

I kept partying after the kids went to bed, and played video games until well after midnight. I paid for it this morning. There wasn’t much cleanup left to do from the party, but I wasn’t productive in the AM. I didn’t get much work done for Zombie; I pretty much just checked in with Boss and spent the rest of the morning looking at markets and reading.

I wound up buying an IXL subscription for Younger, and we spent some time working through some of that.

I decided to take a look at Flutter, and worked through the entire tutorial. It’s a very interesting project, like React, that allows you to create one project that renders on iOS, Android, and the web. It’s pretty neat. I’m not sure how I feel about Dart yet, but I’ll probably dive into it more later. The whole ecosystem is pretty interesting; Material probably deserves a closer look at some point as well.

I’ve got to be careful, though, cause I feel like my backlog is gonna get swamped at the rate I’m going. I also ran across Alchemy, a development platform for Ethereum, and I feel like I’m going to be spending a lot of time with it very soon.

It seems like there’s a whole lot to be built in the DeFi ecosystem, so having a good understanding of how to interact with smart contracts is going to be instrumental, as is being able to build dashboards like what Zapper and others are doing. I really have my work cut out for me. I love Python, but I’m going to have to work with some of these other languages if I want to be a world-class developer.

Tonight I think I’m going to spend some time working on Ethernaut. I need to make some adjustments to my Solidity workflow to make things work a little better, and it should give me a better understanding of how to interact with Ethereum programmatically.

Evening notes

Trade plan programming

I’ve been working on my trade-planning Python module the last couple of days, and already the project is becoming rather complex. I say it’s a trade plan module, but really it’s a capital preservation “brake,” if you will.

The basic idea behind the module is like this:

  • Get balance list and filter empty ones.
  • Get last symbol/BTC market price.
  • Calculate total BTC value of all holdings.
  • Get open orders for each market. For each, look for limit orders, and calculate the covered/uncovered amount in BTC.
  • Make sure that no uncovered position accounts for more than two percent of total portfolio value, and that no more than six percent of the portfolio value is uncovered. If either limit is breached, do not allow any additional buys.
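The final check in that list reduces to a couple of comparisons once the uncovered amounts are known. A minimal sketch of the brake itself, with a function name of my own invention standing in for the real module’s exchange plumbing:

```python
# Sketch of the capital-preservation brake described above. `uncovered`
# maps each position to its uncovered value in BTC; the real module
# derives these from balances and open limit orders on the exchange.

def buys_allowed(uncovered, total_btc,
                 per_position_limit=0.02, portfolio_limit=0.06):
    """Return False if any uncovered position exceeds 2% of the
    portfolio, or total uncovered exposure exceeds 6%."""
    if any(v / total_btc > per_position_limit for v in uncovered.values()):
        return False
    return sum(uncovered.values()) / total_btc <= portfolio_limit

# 1 BTC portfolio: the ETH position is 2.5% uncovered, so no new buys
allowed = buys_allowed({"ETH": 0.025, "LINK": 0.01}, 1.0)
```

Everything else in the module, the balance fetching, price lookups, and order parsing, exists to produce those two numbers.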

The last couple days I’ve been slowly working through everything, following a strict TDD methodology to make sure the code is covered, monkeypatching and mocking calls and creating fixtures for the exchange data. Now I’m getting to the point where I don’t know how to proceed, and I’m getting frustrated.

I don’t know where the problem arises in times like these, but I have a feeling it comes from a lack of proper planning. I start out with a few procedural calls, then I get to a certain point of complexity where I have to start refactoring into classes. Or I don’t know what to do next, so I cobble some code together without writing a unit test first, and break my flow.

All I can do at times like this is take a break.

Binance token mooning

Binance token has been on a bit of a tear the last few days. Apparently they’ve launched their own EVM compatible Binance Smart Chain, and are hoping to go after the DeFi space. Good luck to them.

I took a look at the validator instructions earlier to price out the cost of being one. It costs 10,000 BNB tokens, or about 300 BTC ($3 million), and about $244/month in AWS costs. That’s still an order of magnitude cheaper than running a $30 million Serum DEX node, but it shows the type of centralization we’re going to be seeing with these projects. I’ll keep running my puny IDEX node, and work toward my 32 ETH so I can run an Ethereum 2.0 node.

I’ve been holding my BNB tokens for two years, and they just touched my cost basis after spending so much time underwater. Since I’m actually trying to follow my capital preservation rules, I’ve had to put a tight stop on this latest run. I’ll have to figure out how to account for entry cost in my trade plan program, as right now I’m just looking at the percentage of total. That may not work well when things start mooning and I have to recalculate on the run-up.

Jumping into the DeFi deep-end

I’ve decided that the opportunity cost of keeping my funds in BlockFi is just too great, and I’ve initiated some withdrawals. I’ll be putting the entirety of the funds set aside for my kids into the sBTC vault later this week, for a modest 40% APY. I must have stared at the withdrawal screen for five minutes before I could push the submit button, reading the wallet address over and over, three or four times, to make sure it was right.

It’s stressful, being your own bank.

Anyways, I’ve still made no decision on my cold storage funds. I’m risking way more than two percent on this vault, and any more would be irresponsible.

Famous last words.