Evening notes

Trade plan programming

I’ve been working on my trade planning Python module the last couple days, and already the project is becoming rather complex. I say it’s a trade plan module, but really it’s a capital preservation ‘brake’, if you will.

The basic idea behind the module is like this:

  • Get balance list and filter empty ones.
  • Get last symbol/BTC market price.
  • Calculate total BTC value of all holdings.
  • Get open orders for each market. For each, look for limit orders, and calculate the covered/uncovered amount in BTC.
  • Make sure that no uncovered position accounts for more than two percent of total portfolio value, and that no more than six percent of the portfolio value is uncovered in total. If either limit is breached, do not allow any additional buys. (A rough sketch of this check follows the list.)
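
In code, the heart of that last check looks roughly like this. It's just a sketch with made-up names and simplified data structures; the real module has to build the position list from exchange balances, prices, and open orders first:

# Sketch of the exposure check. Names and data structures are simplified;
# the real module builds `positions` from exchange balances and open orders.

MAX_POSITION_RISK = 0.02   # no single uncovered position over 2% of the portfolio
MAX_TOTAL_RISK = 0.06      # no more than 6% of the portfolio uncovered in total

def allow_new_buys(positions, total_btc_value):
    """positions: list of dicts with 'btc_value' and 'covered_btc' keys.

    The uncovered amount is whatever part of a position isn't protected by a
    resting limit (stop) order. Returns False if either risk limit is breached.
    """
    total_uncovered = 0.0
    for pos in positions:
        uncovered = max(pos["btc_value"] - pos["covered_btc"], 0.0)
        if uncovered / total_btc_value > MAX_POSITION_RISK:
            return False
        total_uncovered += uncovered
    return total_uncovered / total_btc_value <= MAX_TOTAL_RISK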

The last couple days I’ve been slowly working through everything, following a strict TDD methodology to make sure the code is covered, monkeypatching and mocking calls and creating fixtures for the exchange data. Now I’m getting to the point where I don’t know how to proceed, and I’m getting frustrated.

I don’t know where the problem arises in times like these, but I have a feeling it comes from a lack of proper planning. I start out with a few procedural calls, then I get to a certain point of complexity where I have to start refactoring into classes. Or I don’t know what to do next, so I cobble some code together without writing a unit test first, and break my flow.

All I can do at times like this is take a break.

Binance token mooning

Binance token has been on a bit of a tear the last few days. Apparently they’ve launched their own EVM-compatible Binance Smart Chain, and are hoping to go after the DeFi space. Good luck to them.

I took a look at the validator instructions earlier to price out the cost of being one. It costs 10,000 BNB tokens, or about 300 BTC ($3 million), and about $244/month in AWS costs. That’s still an order of magnitude cheaper than running a $30 million Serum DEX node, but it shows the type of centralization that we’re going to be seeing with these projects. I’ll keep running my puny IDEX node, and work toward my 32 ETH so I can run an Ethereum 2.0 node.

I’ve been holding my BNB tokens for two years, and they just touched my cost basis after spending so much time underwater. Since I’m actually trying to follow my capital preservation rules, I’ve had to put a tight stop on this latest run. I’ll have to figure out how to account for entry cost in my trade plan program, since right now I’m just looking at each position’s percentage of the total portfolio. That may not work well when things start mooning and I have to recalculate on the run-up.
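
Something like this is the distinction I mean, with hypothetical numbers:

# Hypothetical numbers. Once the trailing stop sits above the cost basis, the
# capital actually at risk is zero, even though the position's share of the
# portfolio keeps growing on the run-up.
cost_basis = 0.0020   # BTC per BNB at entry
stop_price = 0.0023   # trailing stop after the run-up
quantity = 150

amount_at_risk = max(cost_basis - stop_price, 0.0) * quantity   # 0.0 here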

Jumping into the DeFi deep-end

I’ve decided that the opportunity cost of keeping my funds in BlockFi is just too great, and I’ve initiated some withdrawals. I’ll be putting the entirety of the funds set aside for my kids into the sBTC vault later this week, for a modest 40% APY. I must have stared at the withdrawal screen for five minutes before I could push the submit button, and read the wallet address over three or four times to make sure it was right.

It’s stressful, being your own bank.

Anyways, I’ve still made no decision on my cold storage funds. I’m already risking way more than two percent on this vault, and any more would be irresponsible.

Famous last words.

Blood in the streets

Markets continued to tank today. So, we learn.

Seems like everything was in the red. I didn’t spend a lot of time looking at the markets today, but everything was down significantly. $BTC broke below $10K and took everything with it, it seems.

The positions that got stopped out in the last week or two, like $BAT and $CVC, actually turned out to be good in that they preserved some capital. And since I only deployed two percent on the other orders I placed, I’ve managed to avoid taking losses on $ZEN and $SOL. I’m still waiting for my Cosmos ($ATOM) order to hit, and if Aave’s $LEND token has another flat day I may pick up some of that as well when it hits a nine on the TD Sequential.

I’m also keeping an eye on the IDEX-ETH price. If it spikes I may trade some; I’m still open to the possibility of providing liquidity on Uniswap. We’ll see.

I read a lot of Mastering Ethereum. The cryptography chapter managed to put me back to sleep this morning, but I’m getting to the smart contract part and am looking forward to trying my hand at Ethernaut right after I finish this. I can’t wait to write some programs to watch what happens with these liquidity pools.

Should be fun.

Slow day

Markets were down all day, so today was a chance to catch a breather.

So the Yearn.Finance ETH vault went live earlier today, and I managed to stake a small position, plus another twenty dollars in gas fees. Apparently they deposit the funds in Maker, use them as collateral for a DAI loan, then deposit the DAI on Curve to earn CRV tokens, which get sold and fed back into the ETH pool.

I’m comfortable with these little 2% experiments and consider them a sort of tuition. Unchained covered why DEXs are taking off in its latest episode, and they really break things down quite well. It’s worth a listen.

I’ll admit I didn’t get a lot done today. I stayed up too late last night and got woken up too early by the kids. They were quite a handful today, and it was tough trying to get work done while nursing the lack of sleep.

About the only thing I did do was re-open a position in $ZRX.

This isn’t ideal, considering that I just stopped out of this position less than a week ago, but I want to explore picking up positions on the TD #9, before it closes. I’m still trying to flesh out the “rules” for how I’m going to set stops on these positions. Damn Binance and its lack of decent trailing mechanisms. I’ve got a lot of work to do to code these things up.

Using my Google Sheets trade calculators has become very cumbersome now that the CryptoFinance module I had been using for price feeds no longer works. I had built custom API lookup scripts for some of the smaller markets in my mining portfolio, but it’s just too much work keeping those in sync between various sheets. So I think the time is nigh to convert them over to some sort of Python program, maybe with a web front end.
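
The price-feed part, at least, should be straightforward with CCXT, which I’m already planning to use elsewhere. A rough sketch, with example symbols:

# Rough sketch of replacing the spreadsheet price feeds with CCXT.
# The exchange and symbols here are just examples.
import ccxt

def fetch_last_prices(symbols, exchange=None):
    """Return {symbol: last traded price}, e.g. for 'ZRX/BTC'."""
    exchange = exchange or ccxt.binance()
    return {symbol: exchange.fetch_ticker(symbol)["last"] for symbol in symbols}

print(fetch_last_prices(["ZRX/BTC", "ATOM/BTC"]))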

I’m not sure how much work that’s going to be, of course.

All of this DeFi madness did get me to pick up Mastering Ethereum, which I started reading through today. Things seem to have come quite a ways since I last experimented; now there’s an online IDE that lets you compile, deploy, and test smart contracts on a local JS node. The whole thing is pretty damn handy.

Cryptocurrency mining auto-exchanger

How to convert mining pool proceeds to bitcoin

I haven’t done much programming lately; or rather, I haven’t written much code lately. I still rely on some things that I’ve written; the value averaging programs for my self-directed IRA are something I use every day. But I’ve been so tied up with other projects lately that yesterday was the first time I’d sat down and started writing a program from scratch in a very long time.

It’s amazing how much I’d forgotten.

I’m writing a new Python program, this one to do some automatic selling of cryptocurrency mining proceeds. My little six-GPU mining rig has been chugging along for some two years now and has yet to turn a profit, so I’ve decided that if I’m going to keep doing this, I need to at least cover my electricity costs. I’ll be exchanging the proceeds for bitcoin, not fiat, of course. Plus it seems that market pressure is building toward another parabolic run, so I think it prudent to start converting some percentage of the proceeds I’ve mined over the years so that I don’t wind up holding another dead bag. I guess I’ve moved out of the spec mining phase for the time being, and need to start making some real money.

So I sat down last night to start designing a system. I had already set up my environment during a previous session, so I got started with my usual project flow: dotenv files, a new git repo on my home lab, TDD with pytest. It was slow going, and it felt like I had to do a Google search for every line of code I wrote. I caught myself dealing with premature optimization several times and had to stop myself from overcomplicating things. I was just building an API call using requests.

All I managed to do last night was get the rewards statistics from the mining pool that I’m using. It’s a list of dicts, and I’m not sure how I want to do my calculations: a percentage of rewards, or something based off of power consumption. I need to at least cover my rig’s 0.9 kW power draw. I can’t get that from Simplemining since they don’t seem to have an API, so for now I’ll just have to go off that estimate, or plug in my Kill A Watt. I killed the last one with a power surge over eleven hundred watts, so I can either run a test or just use SMOS’s estimate.

Either way, I can pull my mining rewards for the preferred interval, get price data off Binance using the CCXT library, and then send a transaction to the exchange using the wallet API. Then I’ll have another call to the exchange, after the deposit has cleared, to initiate a market order. And then a final call to move the proceeds to my hardware wallet.
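
A rough sketch of the exchange-side leg, just to make the steps concrete. The pool lookup and the wallet send are deliberately left out since those pieces aren’t settled yet, and the rates and addresses are placeholders; the CCXT calls (fetch_ticker, create_market_sell_order, fetch_balance, withdraw) are the parts I know exist:

# Sketch of the exchange-side leg only; the pool API call and the wallet send
# that feed it are still TBD. Rates and addresses are placeholders.
import ccxt

RIG_KW = 0.9          # SMOS's estimate of the rig's draw
USD_PER_KWH = 0.12    # made-up electricity rate

def daily_power_cost_usd():
    """The electricity cost a day's sales need to cover."""
    return RIG_KW * 24 * USD_PER_KWH

def coin_amount_to_cover_costs(exchange, coin, btc_usd):
    """How much of `coin` to sell so the BTC proceeds cover a day of power."""
    coin_btc = exchange.fetch_ticker(f"{coin}/BTC")["last"]
    return daily_power_cost_usd() / (coin_btc * btc_usd)

def sell_and_sweep(exchange, coin, amount, hard_wallet_address):
    """Assumes `amount` of `coin` has already been deposited and cleared."""
    order = exchange.create_market_sell_order(f"{coin}/BTC", amount)
    free_btc = exchange.fetch_balance()["BTC"]["free"]
    exchange.withdraw("BTC", free_btc, hard_wallet_address)
    return order

# usage sketch:
# exchange = ccxt.binance({"apiKey": "...", "secret": "..."})
# amount = coin_amount_to_cover_costs(exchange, "RVN", btc_usd=10_000)
# sell_and_sweep(exchange, "RVN", amount, "bc1...")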

Several of these components are going to be rather straightforward, but putting them all together is going to be a bit of a challenge. I could run this as a simple script in a cron job, or I could leave it running as a service, with various components running in different threads. I’ve obviously got some design decisions to make.

I have decided that I want to move my blockchain nodes out of the house. I’ve got several that I need to keep online, including ARW, XHV, RVN, and BEAM. Running all of them at the same time is a bit too taxing for the machines I have scattered around the house, so I’m going to try to put my AWS training to work and set up the various nodes in the cloud.

My XDNA node, currently running in an EC2 instance, is using less than 5% CPU now that all of the Ethereum work has been handed off to Infura. When I tried to put the BEAM node on the same instance it couldn’t handle it, so I’ve got to figure out what the actual requirements are. I’m hoping it’s not going to be cost prohibitive.

I’m envisioning several instances, with the various blockchains stored on S3 volumes and nodes running on their own small instances as needed. My wallet files can either remain at my home lab or be consolidated in another, secured instance. Cost will be a factor of course, but maintaining copies of several different blockchains locally is proving difficult to manage. And several of my wallets have sat untouched for a year or more.

I’m hoping I can build everything out with config files, Docker builds for the nodes, and the AWS configuration itself set up using Elastic files. I envision that some of the programs, like the one I’ve described here, could even run as Lambda functions.

Of course, a lot of this sounds like premature optimization. We shall see.

Morning pages

Yesterday I spent most of my time trying to migrate a production WordPress site to my development environment. Normally I’ve used Infinite WordPress’s site migration tools to move sites; they do the trick of moving all the files and updating the database references to the site URL, but I don’t think they work when the site isn’t public. I’m doing a lot of hacks with my Docker setup, importing databases, messing with file permissions, and duplicating a lot of my work, since I like to sit downstairs at my desk during the day and upstairs at night. So part of my challenge is finding a setup that works well for me.

I might have to make some sacrifices. JetBrains IDEs don’t like working over the network, so I’ll have to add a directory sync if I want to keep the files on my network server and work from both workstations. At least I can run Docker from the remote machine, but that’s not supported by the IDE, so I’ll have to figure out how to fit it into my workflow.

The girls were good. Elder did everything I asked her to and got her extra screen time. She did typing, piano, and two sessions of math on Khan’s, and we managed to keep the house tidy. So that’s a big parenting win. She’s already up and on her laptop right now, ostensibly doing typing, but I don’t hear too much of it going on over there. Maybe she’s doing math.

We had a bit of excitement yesterday when bitcoin went on a little tear. I noticed it shoot up to touch ten thousand and got excited. Elder came over and said, “it went UP!” We watched the fight for a few minutes before it dumped, and then went on a bike ride.

I moved my entire Ethereum stash over to BlockFi. There’s always a moment of horror after publishing a large transaction to the blockchain when the doubt sets in: wondering whether the address was copied correctly, or whether my opsec failed and some hacker swapped the receiving address while it was in my clipboard. Did I check the address? I usually do a small test transaction before sending over the big one, but it still makes me nervous. Especially after reading about the mining firm that said someone sent a $144 ETH transaction with $131 million in gas fees.

So now I have a fair chunk of my assets up on BlockFi. I haven’t touched but a fraction of my BTC; it’s just too much risk for me to do that. I’ve got a roughly even split on there between BTC, ETH, and USD stablecoins, and I’m considering whether to put more USD there. I’ve still got the girls’ BTC accounts, but I don’t want to mix them with mine, and I’m not yet sure if I can open an account in their names or if I’ll have to do like I did for Lending Club and make multiple accounts in my own name.

Due to the coronavirus, the IRS is allowing 2019 IRA contributions up until July 15. I’m considering whether I want to do this, or throw some more cash into BlockFi. My IRA is on fire right now; I calculated 70% realized gains off of this market rally, and my unrealized gains for the year are much higher than when I calculated them a couple of weeks ago. I’ve still got active value average positions in play, and I’m probably going to be short on cash before they complete, so I need some dry powder. I just don’t know whether I should sell some of my other positions or put more cash into play. All of my current plays are under risk-adjusted position sizes, but my long term holdings are just sitting without any stops on them. With everyone going crazy on Robinhood these days, I should probably put some protections in place in case there’s another lockdown-related pullback.

Yesterday, a client’s laptop failed, and I’m waiting on a vendor to go out there and swap the motherboard or something. The drive is encrypted, and while I’m certain I have the keys, I felt a shot of adrenaline course through my body when I remembered that I had neglected to reinstall the backup program on her machine after replacing it. So I know what I’m doing today. What I don’t know is what I’m posting tomorrow for my newsletter. This post has been the type of rambling morning-pages post that’s of no use to anyone but myself, and it’s not the type of quality content I want to be sending out to my LinkedIn network, or to the email list I just salvaged from an old CSV file.

I’m going to let that one stew in my head today.

Genius dad

So this is a late post for me today. I woke up at the same time as the kids, forgot to turn my phone to DND before I started meditating, and got a text in the middle of it about an outage at one of Zombie, Inc.’s cornerstone clients. I felt obliged to take it, and the morning was just shot from there. The day actually improved after that, even though I wasn’t as productive as I wanted to be. So here I am, trying to finish what is, for me, one of the most important parts of my day. I finished meditating after I put the kids to bed, and I want to put down some thoughts before I get to work on coding. No TV today.

I had a good day with the kids. One of their friends came and knocked on the door. They hadn’t seen her in several weeks, so I let them out for some socially distanced bike riding, and I chaperoned. Then after lunch, Younger and I took a ride to the pier nearby. She’s totally comfortable on her new pedal bike; it only took four or five days. I’m so proud. She’s not even four yet and riding a pedal bike, and she didn’t even need training wheels. I feel like genius dad.

Elder had a good day also. She had a nine-thirty call with the gifted teacher, which probably broke up our routine for the better. She has this idea that she’s been bringing up for the past couple of days about turning the house into a hair salon. I’m trying to humor her while explaining the reality of what that would actually mean. We also discussed writing a book. She came up with an idea for a story called “Cave of Gold” or “Treasure of Gold”. Her description of it sounds like The Goonies, which we watched over the weekend. I told her the most important thing about making it happen was getting it out of her head and into the real world. We discussed typing it, writing it by hand, and I even showed her some voice dictation options, both on my iPhone and on an electronic voice recorder that I have. She wound up writing a scene just before bed. It was a dialog between a mom and an older sibling being asked to take care of their little sibling. It sounded like something right out of our house. Seems like she’s already learned the rule: “write what you know.”

Getting work done during the day is hard, though. The distractions from the kids make most deep work impossible, and by the time I actually have time in the afternoon, my energy is dead. I moved the needle on a few small tasks: ongoing domain migration woes from a crappy reseller, and getting a copy of my resume added to my CV site, with a few edits. I’ve got no excuses not to start applying now.

I’ve started refactoring my value averaging code. The main function is a hundred lines long and there are no tests, so I’m going to spend some more time on that again today. I run it every day when the market opens, and I’m having some problems with it. I give it a list of positions to process, and it takes each one through several steps of calculations before sending a buy or sell order to the exchange. Some of the positions are failing and I’m not sure why, so I’ve got to decouple several of the functions so that I can debug it better. After that I need to pull it out of the package it’s in and make it a separate library. Right now it lives in my trade plan library, which has turned into a bit of a junk drawer over the past year or so. It’s also tightly coupled to the TDAmeritrade brokerage, and that needs to be abstracted out at some point. I’m getting ahead of myself, though.
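
The shape I’m aiming for is roughly this, with hypothetical names, just to show each step pulled into its own function so a failing position can be isolated and tested on its own. The value path here is the simplest form, ignoring expected growth:

# Hypothetical decomposition of the 100-line main(); names are placeholders,
# not the current code.
def process_positions(positions, broker, today):
    results = []
    for position in positions:
        try:
            results.append(process_position(position, broker, today))
        except Exception as exc:
            # one bad position no longer kills the whole run
            results.append({"symbol": position["symbol"], "error": str(exc)})
    return results

def process_position(position, broker, today):
    price = broker.get_quote(position["symbol"])          # broker is injected
    delta = target_value(position, today) - price * position["shares"]
    order = build_order(position["symbol"], delta, price)
    if order:
        broker.place_order(order)
    return {"symbol": position["symbol"], "order": order}

def target_value(position, today):
    """Dollar value the plan says this position should have reached by today."""
    periods = (today - position["start_date"]).days // position["period_days"]
    return position["installment"] * (periods + 1)

def build_order(symbol, delta, price):
    shares = round(delta / price)
    if shares == 0:
        return None
    return {"symbol": symbol,
            "side": "buy" if shares > 0 else "sell",
            "quantity": abs(shares)}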

Tomorrow, I want to get up by six so I can get my meditation and writing done. After the kids are settled in and I’ve done all my morning checks for Zombie, I’m going to focus on the software design pilot project I’m working on there. Then in the afternoon, I want to find the best job posted on LinkedIn and apply for it. We’re going to make this happen.

Templates, makefiles, and YAML, oh my!

When I first started programming, it was simple to just fire up an editor and start typing away. Scripts usually wound up as large procedural monstrosities, and if I managed to get anything working it was usually such a mess that it quickly became unmanageable. Nowadays there’s so much setup that needs to be done before I can even get to work: setting up a git repo and a Python virtual environment, external repos, databases, configuring my IDE. I suppose it must be indicative of the progress I’ve made as a programmer.

One of my final classes is a multi-semester group project. We spent last semester building out the design docs, and are spending the first few weeks of this one refining those docs individually before coming back together and deploying a prototype. I’m the old man on the team, about twice as old as the rest, and I’ve been doing this long enough to have very strong opinions about a lot of things, so I’ve been trying to guide the team toward certain standards.

I’m not going to get into the use case for our app yet, but I convinced the team to use Django for the backend. Now, while we could use it for the front end as well, I figured that since Django was giving us most of what we needed for the core functionality, we could spend some resources trying out cutting-edge tech that would give the team experience with GraphQL and React Native. I’ve got no idea whether that will make it into the final approach. Even though we’ve got a team of six people and I’m handling all of the infrastructure work, I’m starting to wonder whether the others are going to be able to implement those new features in time.

I’ve got a few more passes to make through my individual paper, then I’ll start focusing on the prototype presentation. My professor commented during last week’s recitation that these applications did not have ‘cookie cutter’ approaches, and I almost laughed out loud because we’re literally using Cookiecutter Django as the basis of our project. I’m debating whether I want to try a live demo of deploying one, or do it offline and record screenshots or something.

Being able to use something like Cookiecutter to set up a Python package, with all the unit testing, CI, and documentation wired up via make commands out of the box, is amazing once you understand what all of that stuff actually does. It can lead to a bit of choice paralysis at first, trying to figure out testing frameworks, code coverage tools, linters and all that. I’m still getting there. But once you’ve found your footing, it makes rapid prototyping easy.

It’s almost maddening thinking about how many different ways there are to set up your workflow. I’m currently using Pipenv as my tool of choice, but I recently read about Poetry, which seems to be a step up in many ways. For now, though, I’m not chasing it down. Instead, I’m going to focus on delivering something using the tools I already have instead of getting caught up in what’s new. It’s a lesson that becomes more and more relevant as I mature in my abilities.

Gaussian Elimination with TDD C++, Part 1

I’m pretty pleased with myself. I managed to pull off an epic coding session Friday night and met the deadline on a school assignment. I almost used the word “finished” there, but I merely got it working. As Uncle Bob says, getting it working is the first step to completion; refactoring and cleaning it up is the next.

The purpose of the assignment was to implement a Gaussian Elimination function in C++. The professor, an old Fortran/C++ veteran who had done a lot of scientific matrix work back in the day, wanted us to use pointers to pointers for the matrix rows, to make swapping rows faster. They gave us the following specification of how the matrix would be represented in a data file:

3       // int N representing the size of the matrix A
1 1 1   // values of A[row 0]
0 1 1   // A[row 1]
0 0 1   // A[row 2]
1 1 1   // right hand side

The professor then went through the algorithm for solving such a matrix on the board. Later they showed us how to generate datafiles with solvable problems for testing, but we’ll skip over that for now.

The example that the professor did in class was a bit of a mess, so I went looking for better ones. Rosetta Code has examples of Gaussian Elimination in many different programming languages. The C version is pretty close to what we need, but even looking at the gauss_eliminate function there, we can see that it’s doing a lot and could be broken down further into smaller functions.

void gauss_eliminate(double *a, double *b, double *x, int n)
{
/* mat_elem() and swap_row() come from the full Rosetta Code listing,
   and fabs() needs <math.h>. */
#define A(y, x) (*mat_elem(a, y, x, n))
    int i, j, col, row, max_row, dia;
    double max, tmp;

    for (dia = 0; dia < n; dia++) {
        /* partial pivoting: find the largest value in this column... */
        max_row = dia, max = A(dia, dia);

        for (row = dia + 1; row < n; row++)
            if ((tmp = fabs(A(row, dia))) > max)
                max_row = row, max = tmp;

        /* ...and swap that row up to the diagonal */
        swap_row(a, b, dia, max_row, n);

        /* forward elimination: zero out everything below the pivot */
        for (row = dia + 1; row < n; row++) {
            tmp = A(row, dia) / A(dia, dia);
            for (col = dia + 1; col < n; col++)
                A(row, col) -= tmp * A(dia, col);
            A(row, dia) = 0;
            b[row] -= tmp * b[dia];
        }
    }
    /* back substitution: solve for x from the bottom row up */
    for (row = n - 1; row >= 0; row--) {
        tmp = b[row];
        for (j = n - 1; j > row; j--)
            tmp -= x[j] * A(row, j);
        x[row] = tmp / A(row, row);
    }
#undef A
}

My experience with C++ has been limited, mostly schoolwork with CodeBlocks and Eclipse; I prefer JetBrains these days. And I’ve never written tests in it, so after I set up a new repo the first thing I did was spend some time figuring out Google Test before I wrote my first line of code. I started by making sure I could load files, then moved on to output helpers, overloading the ostream operator and creating a print() function.

Let me say: Test Driven Development is HARD. It requires a lot of thought up front about what it is that you are trying to do. I started off with a todo list:

- call GaussianElimination function
- read file from file system
- get size from file
- create matrix(size)
- load vector data from file
- create 2d vector array size N
- initialize matrix with values 

and started working through each of them, going through the red light/green light cycle: writing a test that would fail, then implementing the code that would make that test pass, and NOTHING MORE. Like any discipline, it’s hard. But the effect is amazing. Having a magic button that lets you change code and get [ PASSED ] back afterwards is exhilarating.

I’ll admit that I cheated a bit as the deadline approached. I hadn’t implemented proper comparison operators for the Matrix class, so I was checking everything by eyeball before I got to the point where the code worked and I could submit it for credit. The result was still a far cry from the way I usually operate, with a bunch of manually entered code.

I’ll share more in a further post.

Learning to fly

I’ve been on a bit of a kick with Robert C. Martin’s work lately. Martin, AKA “Uncle Bob”, has written several books on coding, including a couple of classics in the software development field. I’ve watched several of his lectures on YouTube recently, and have been reading through Clean Code the last couple of days. It’s really making me realize how garbage the things I’ve been writing lately are, and I’m fighting an immense urge to go back and completely refactor everything I’ve worked on in the past few weeks.

Of course, having a robust integration test suite is absolutely necessary for any kind of refactoring, which is not something I’ve been terribly disciplined about recently. I’m proud to say that I am taking a strict TDD approach to my latest class assignment in C++, although it has slowed me down a great deal. The hardest part is figuring out how to write tests. Sure, I could go and write a massive 200-line function that takes input and performs the Gaussian Elimination on it, but since this is part of a larger test suite that we’ll use for our final exams, I want to make the code more modular. For example, see the difference between this big 75-line single main statement and this one. The latter could still be broken out into smaller functions, according to Uncle Bob, but it’s a step in the right direction.

There were two reasons that I went back to school to finish my degree. The first was that I thought I needed a BS after my name in order to get my resume past the gatekeeping algorithms at some firms. I’ve since come to the realization that I have no desire to work at any large enterprise or other organization where that would be a factor, six figures be damned. The second was that I felt like I was running into roadblocks with my own development projects. They were basically huge, convoluted, procedural things. Even when I tried to adopt OOP principles, they were still a mess. I felt like I needed to go back to school and go through the curriculum to get where I needed to be.

I don’t think it’s quite worked out the way I wanted it to. Now, don’t get me wrong, I think earning a degree in ‘Computer Science’ has been valuable, but it’s not quite what I expected. I think one of the intro Unix classes really broke my block when it comes to working with Linux, and that’s a skill that I have definitely appreciated. But I think the focus on Java and C++ is behind the times.

I recently had a conversation with one of my professors about why I was surprised that there hadn’t been any focus on software design patterns. (I’m still working my way through the Gang of Four.) He told me there was a bit of disagreement within the department between those who wanted to focus on theory and those who wanted more actual engineering and development. So far the balance of power lies with the theoretical side, which is why the focus is on the math, big-O notation, data structures, and finite automata.

Even so, I’m still surprised that I feel like I’ve gotten more out of a couple of 30-year-old videos on Lisp than I have out of the classes that I’m going $20K+ into debt for. All I wanted was to write better code, so that I can make programs do what I want them to do. The ideas I’ve had for things were beyond my ability to complete, and I was looking for ways to increase my knowledge. I’m probably being unfair to the university, since some of the more business-end document writing (requirements, software specification documents, use cases, &c.) has already helped me in some of my professional interactions.

At the end of the day, it’s about sitting down with an IDE and writing those magic lines of code that make the computer do what I want.

Programmer Discipline

So my productivity has been shot to hell the last two days while I try to familiarize myself with and set up not one, but two new programming environments: JavaScript for the CCXT/Safe.Trade library, and a C++ module I was just assigned for one of my classes.

I have a somewhat convoluted setup. I like to work from one of two machines: my desktop is for gaming and personal or school projects, and my laptop has a Windows VM that I use for my day job. I also have an Ubuntu server running a file share and other services. It’s got Docker running over SSH, but I was pounding my head today trying to figure out how to get IntelliJ to talk to it so I could use the integrated run tools instead of the copy/paste garbage I’ve been dealing with as I try to catch up on 20 years of JavaScript changes and Node.

For one of my final classes I’ve got to implement Gaussian Elimination in C++ as part of a larger library that will be part of my final grade. I said goodbye to CodeBlocks and Eclipse a while back, but I haven’t started a project in C++ in years; the only time I’ve looked at it at all has been for the PennyKoin updates. I’ve never spent the time to understand CMake lists and linking, so I just spent a painful hour trying to get Googletest integrated with this new project. Because of course I’m going to write a test before I put down anything more complicated than ‘hello world’.

Of course I am.

I’ve spent the last week going over a series of videos on Clean Code by Robert “Uncle Bob” Martin. It’s a good series that I really enjoyed. Martin is really good up on stage, and funny, and I was disappointed when I finished the last one and realized that there weren’t any more. There’s much more for sale on his CleanCoders site that I might dive into, but I want to read his Clean Code and Clean Architecture books first.

Highly recommended if you have several hours to spare.

I came to realize that the tests I wrote for the GBTC Estimator were too tightly coupled to the module code, and that the module code was coupled to its input (IEX, via the pandas-datareader DataReader class). So I’ve been trying to decouple it so that it works with a dataframe from my broker’s API. I’m taking some hints from a mocking talk I saw that made me realize I need to break out dependencies even more.
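
The gist of the change is to give the estimator a dataframe instead of letting it fetch one, and to push the data source (IEX, the broker API, or a test fixture) out to the edge. A sketch, with stand-in logic and names of my own:

# Sketch of the seam I'm after: the estimator becomes a pure function over a
# DataFrame, and the fetching happens at the edge. The rolling-mean logic is a
# stand-in, not the real estimator.
import pandas as pd

def estimate(prices: pd.DataFrame) -> pd.Series:
    """Works on any DataFrame with a 'close' column; no network calls here."""
    return prices["close"].rolling(20).mean()

def load_prices_from_broker(client, symbol, start, end) -> pd.DataFrame:
    """Edge code: swap this out for pandas-datareader, the broker API, or a
    fixture in tests. `client.get_daily_bars` is a hypothetical method."""
    bars = client.get_daily_bars(symbol, start, end)
    return pd.DataFrame({"close": bars["close"]}, index=bars["date"])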