Getting your money’s worth

Or, make sure you back up your 2FA codes…

I’ve been using the same iPhone 7 Plus for over three years now. In fact, I just finished paying my carrier the last installment on it a few months ago. Since my wife and I are focused on becoming financially independent, having the $40-50 a month back has given me some satisfaction, and I had no desire to upgrade to a new device. Part of my mindfulness practice has been to spend less time on the phone, so I’ve taken a few steps to make it less appealing to use. In addition to keeping it in Do Not Disturb mode most of the time, I’ve turned the screen to greyscale via Accessibility Options -> Color Filters. There’s also an Accessibility Shortcut to toggle this with a triple-click of the home button, but most of the time when I see my icons in full color the vividness is almost unbearable.

Now I’ve had no shortage of accidents with my phone over the years. I dropped it while canvassing back in 2017, and rather than breaking the screen, the impact did something to the LCD substrate that caused black lines to creep in from the edges. They went away for a while before coming back, and I dealt with it for several months until they became so large that I couldn’t read things like the battery meter or other elements that I couldn’t scroll to a readable part of the screen. So I reluctantly brought the phone to the local cell phone repair store and had it fixed for a hundred bucks. What took me so long, I asked myself, looking at my fully readable screen.

The next time I broke it I decided to attempt my own repair. I ordered a kit for forty bucks off of Amazon, and was pleased with myself for fixing the phone in less than an hour. The repair didn’t last much longer than that, as I dropped the phone and broke the screen again. The next DIY repair wasn’t so lucky.

The encryption for Touch ID on iPhones is controlled by a small integrated circuit soldered onto the home button’s ribbon cable. This chip is paired to the iPhone’s motherboard and contains hardware keys that are part of the encoded fingerprint data. In the past, Apple maintained tight control over replacing a device’s Touch ID button, and it was not something that could be done outside of an Apple store. In fact, a whole market around repairing Touch ID flex cables appears to have sprung up in Asia, as there are several how-to guides on repairing these cables that involve sophisticated soldering techniques.

This was obviously beyond my ability and resources. My other option was to have Apple replace the screen for $160 or so. It seems that Apple just swaps the screen with the home button attached, to mitigate the risk of damaging the flex cable during removal, as I had done. That was still a bit much for me, so I relied on the on-screen AssistiveTouch button for several more weeks. The biggest downside of losing Touch ID was that LastPass forced me to type in my full password, but it wasn’t the end of the world.

And then the final straw: the microphone died. I’m still not sure what happened, but at some point I discovered that I couldn’t complete calls. I couldn’t hear anyone, and they couldn’t hear me. But it was strange. Certain videos on Twitter would play, but not others. I couldn’t record sound in the memo app or in videos. I could complete calls via my car’s Bluetooth or my iPad, but not using my Bluetooth headset. At first this was fine; I have a VoIP phone for work that I can use for important calls, and everything else went to voicemail. It was actually kind of relaxing for a while. At some point last week I decided enough was enough. I did an iTunes backup, wiped the phone to factory settings, and checked again. Still broken. I restored the backup to make sure I still had access to my 2FA accounts, and made plans to go to the Apple store for an assessment.

The short story is that the Apple tech couldn’t run the diagnostics. The Lightning dock diagnostic device that they use returned CATASTROPHIC FAILURE, and the tech just shook their head. Since I was adamant about avoiding another $500-800 phone purchase, I filed an insurance claim with my carrier, agreed to pay $120 for a replacement device, and had a new phone via FedEx the next day. I ran a backup on my old phone, loaded the new one, swapped the SIM card and laughed to myself. What took me so long, I asked myself again.

I wiped the old phone, threw it in the box, slapped on the return shipping label and stuck it in the mail. Then today I went to open Google Authenticator and stared in shock at a blank screen. What followed was an hour of mild panic as I cataloged and prioritized what I had lost. Work accounts were the least of my worries, of course. I have crypto accounts. Thankfully Coinbase and Gemini were in Authy, which seems to use cloud-based storage protected by a key that gets backed up by iTunes. After some frantic Googling, I was able to recover one of my Google Authenticator account keys for another exchange from a note in my password manager, and started a two-week cool-off period on another. I provided some personal details in a ticket to a portfolio tracking site, and they had the account unlocked in less than 15 minutes.

Everything that I reactivated went into Authy. I’ve still got a few vendors that I’m going to have to contact to regain access, but nothing critical that is causing me any stress. I’ve learned a valuable lesson about 2FA.

And next time my screen breaks or my phone gets damaged, I’m not going to wait months for repairs. I’m just going to file the damn insurance and be done with it.

“If you think you’re enlightened, spend some time with your family”

Last night, I spent a couple hours on a team project, trying to get our development environment set up. I was using a Django Docker container that I had generated with Cookiecutter Django, but I had forgotten to select the setting that checks in the local .env files. I had already deleted my local copy of the repo to test cloning it, so I had to start from scratch.

I wanted to have an answer file in case I needed to regenerate the project with Celery support or whatever, so that meant learning about Cookiecutter’s replay capability. After figuring that out, I determined that the Git repo wasn’t set up properly, so I moved the old one aside on our uni’s GitLab and ran the whole procedure one more time, pushing and re-cloning the entire repo, deleting my Docker images, and building the whole thing again to make sure my teammates wouldn’t have any problems running it themselves.
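
The replay mechanism turns out to be simple: Cookiecutter saves your answers under ~/.cookiecutter_replay/ and can feed them back in on the next run. A minimal sketch via the Python API (the CLI equivalent is cookiecutter --replay):

# Regenerate the project from the saved answers instead of re-prompting.
from cookiecutter.main import cookiecutter

cookiecutter(
    "https://github.com/pydanny/cookiecutter-django",
    replay=True,                 # reuse answers from ~/.cookiecutter_replay/
    overwrite_if_exists=True,    # clobber the previous attempt
)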

I popped into our team Discord channel to let everyone know, and found myself in a discussion about dropping Django and pivoting to Android Studio.

I was furious.

Not like annoyed, but actual rage. I felt my blood pressure go up and I had to try very hard to maintain my composure while I responded.

I’ve never been what I would describe as an angry person; in fact, for most of my adult life I’ve been pretty good about not getting mad at people. Computers, on the other hand… I’ve generally been able to deal with things well, and I’ve never been one of those guys who loses his cool, gets in fights, or screams at people. The last few months, though, I’ve been more willing to allow myself to get mad, and I’m not sure whether it’s a good thing.

When I was younger, I did a great deal of holding things in. I had a lot of unrequited crushes; mostly they were unrequited cause I never let the other person know. But outside of issues around my sexuality, I don’t think things were ever a problem. Now I don’t know if my meditation practice has anything to do with it, or whether part of it is being a father and having young children, but I am definitely way more willing to let people know when I’m mad or upset. I’m sure part of it has to do with how I was raised and disciplined. I’ve probably internalized a lot of abuse — for lack of a better term — and have been struggling with how I deal with that and how I discipline my children.

While I was responding to the team, I made sure to voice my frustration without resorting to personal attacks. I spent ten or fifteen minutes responding to a few points and finishing what I had come there to do: letting them know that I had redone the development repo. I said straight up that I was mad at the suggestion that we drop our entire software architecture plan six weeks out from the end of the semester. I tried not to be snippy and to sign off with some useful information, so I told them to let me know what they decide tomorrow.

I haven’t really been able to stop fuming about it since I signed off, though. I woke up an hour after going to bed and tried to read for a bit to get back to sleep, and I still woke up an hour before my normal time. My sleep patterns have been a bit fucked and my wife is out of town, but this is still unusual for me.

I don’t want to get into any more of the details, but I want to close with a quote from the Dalai Lama that comes to mind: “worrying is prayer in reverse”. Meditating on this incident and writing about it has already reduced some of the anxiety and stress I was feeling. To circle back to what I said earlier about being more willing to experience anger: it’s important that the anger doesn’t get bottled up inside, and that it flows out before it can do real harm.

I’m going to post this and move on with my day. Peace.

The Peripheral, by William Gibson

I am a huge sci-fi lover, especially stuff by Neal Stephenson, Charlie Stross, and China Miéville. So when I heard that William Gibson had a new novel out, a sequel to one of his other works that I hadn’t read yet, I immediately added it to my library list. The Peripheral is a novel based around a concept that Gibson describes in his acknowledgments as “third-worlding alternate timelines” via remotely controlled avatars and drones.

The novel’s short chapters flip-flop between its two main characters. Flynne Fisher lives in our near future in an unnamed place referred to as the county, an Appalachian community beset by job loss and drugs. Wilf Netherton is a publicist in Flynne’s future, where eighty percent of the population was wiped out by climate disaster and other calamities. Those who survived this period, referred to as the Jackpot, are lucky indeed: they have access to uber-wealth and amazing technology, including nano-assemblers that can build (or destroy) anything, and peripheral technology, which lets them control biological android avatars by remote. What Wilf and his kleptocratic friends also have access to is a stub, a way to reach back to Flynne’s time and communicate with people there.

Flynne’s brother Burton, a former special forces soldier, is hired by Wilf’s associates to run security in their future. Thinking it a video game, Flynne covers for her brother and witnesses a murder, and from there the novel takes off as Flynne’s world, in fact her entire timeline, gets turned upside down. As Wilf and his friends try to uncover the mystery in their own timeline, their adversaries find a way into Flynne’s time as well, and the two sides engage in economic warfare, using AI to manipulate the markets and the resulting cash to buy up every corporation, crime boss, and politician that gets in their way.

Flynne, her brother, and his combat buddies are soon given directions on how to build tech from Wilf’s time, and they spend a good deal of time in peripherals in what would have been their future. Because it has been stubbed, Flynne’s timeline has diverged from Wilf’s, and they start taking steps to prevent the Jackpot from occurring.

I enjoyed the book, but I almost gave up at the beginning because Gibson doesn’t explain much of Wilf’s future London from Wilf’s point of view. Things become clearer once we experience them through Flynne’s eyes, but for the first fifty pages I could barely understand what I was reading, and I had to step away for a day or two before I could bring myself to come back to it. That said, Gibson is a great storyteller and futurist, and I’m looking forward to getting my hands on the sequel.

Not surprisingly, Amazon is working on a television adaptation of the series. Like Altered Carbon and Dollhouse before it, the idea of consciousness transfer or remote control seems to be in the zeitgeist these past few years, and I wonder if this tells us something deeper about who we are today. Isn’t that the point of science fiction in the first place?

Single dad weekend

So my wife’s job in union leadership has led her out of town again, this time by train to DC, where she’ll be lobbying our local congressional representatives. I dropped her off at the train station early yesterday and spent the rest of the day with the kids. Last night we broke open a copy of Shadows in the Forest, which wasn’t quite what I was hoping for. My oldest enjoyed it, but I was mostly bored by the second game. We’ll try again in a few weeks and see if I warm up to it. The rest of the evening was spent watching various live performance videos on YouTube. I think I may have overdone the head-banging to Enter Sandman, as my neck has been bothering me all day. I thought I had slept on it wrong before I remembered what happened last night.

After the kids went to bed I spent the rest of the night watching the final episodes of Bojack Horseman while refreshing TradingView and Twitter on my phone. Of course the big news is that Bitcoin crossed $10,000, and of course CryptoTwitter lost their shit. There weren’t any big price moves like I was expecting though, just a slow and steady slog up on the 1-minute candles, bouncing in a narrow range until it finally made it and everyone started congratulating themselves for making it through the bear market.

I did see mentions that this time feels a lot different. Last time we were here, in November of 2017, there was so much excitement, and everyone (myself included) seemed to be going crazy. None of that this time around. I actually tried to talk about Bitcoin with one of my neighbors earlier today, but that aside, I’ve barely mentioned it to anyone in real life these past few months. I’ve brought it up a few times this past week, but no one has asked me about it. Perhaps it’s good to keep it that way for now.

I’ve been taking it easy this weekend, trying to keep myself sane for the next few days. I’ve got a busy week at work lined up, so for tonight I’m going to lie on the couch with The C++ Programming Language. I’m only a couple chapters in, but I’ve already learned a good deal that is going to come in handy in class. For an old nerd like me, there’s no way I’d rather spend my time.

Selling in a bull market

We’re already a week into February; this year seems to be flying along. And what a crazy one it is. I’ve been spending a lot of time watching the price of Bitcoin; it seems the bull market is here and ready to fulfill my dreams of wealth. Also, Sanders seems to be in position to take the Democratic presidential nomination. On the other hand, Trump just got acquitted by the Senate, and the Democratic party seems to be doing everything it can to fuck things up.

I’ve been very low-key about crypto lately. I don’t talk to people much about it in real life. I have fun with it on Twitter, but the fact is that if things go right, I don’t want people to know how much I’m involved with it. Being your own bank is all fun and games until someone gets kidnapped. Someone on Twitter was bragging about being a member of the 10BTC club, and I warned them about OPSEC. They took the tweet down afterward.

I’ve done my best to protect my holdings. I’ve got redundant hardware wallets plus the private keys secured, but it’s starting to get to the point where I don’t feel entirely safe. I could literally make more from hodling this year than I do at my day job. That’s insane. Many months ago, during the depths of the bear market, I set some dynamic price targets to sell some of my holdings once things took back off, using the Mayer Multiple (MM): the price of BTC as a multiple of its 200-day moving average. I’ve also posted the current MM chart and the TradingView Pine Script I used to create it.

//@version=1
study("Mayer Multiple", overlay=false)

// Lengths for the price SMA and for smoothing the multiple itself
psma_length = input(100, title="Price SMA Length")
msma_length = input(250, title="Multiple SMA Length")

// The multiple: closing price divided by its simple moving average
ma = sma(close, psma_length)
multiple = close / ma

// A long SMA of the multiple, as a baseline to compare against
mma = sma(multiple, msma_length)

plot(multiple, title='Mayer Multiple', color=#891A0D, linewidth=3)
plot(mma, title='Multiple SMA', color=orange, linewidth=2)

Looking at the above chart, one can see that the price of Bitcoin has usually peaked when the MM hits 1.9. The winter 2017 bull run peaked just under 2.9x. So a possible strategy is to start selling as the MM approaches those numbers. I won’t be dumping my holdings at those points; rather, I’ll probably start scaling out gradually. I’ve been dollar-cost averaging in every week, so I think I may start selling in the same increments as the price reaches 1.55-1.60x, which is currently about $12,900. I have also decided to sell a significant portion of my holdings if we reach 2.88, like we did at the end of the last bull run; that would be just under $24,000. Of course those numbers are dynamic and will likely be much bigger if we take our time getting there. Otherwise, I assume we’ll have some sort of blow-off top with an opportunity to buy back in later.
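
The arithmetic behind those targets is easy enough to script. A back-of-the-envelope sketch (the moving-average value here is a hypothetical snapshot implied by the prices above, not live data):

# Translate Mayer Multiple thresholds into sell-target prices.
ma_200d = 8_300.00    # hypothetical 200-day moving average, in USD

targets = {
    "start scaling out": 1.55,
    "2017-style blow-off": 2.88,
}

for label, multiple in targets.items():
    print(f"{label}: MM {multiple} -> ${multiple * ma_200d:,.0f}")

# start scaling out: MM 1.55 -> $12,865
# 2017-style blow-off: MM 2.88 -> $23,904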

I truly believe that Bitcoin represents the greatest financial opportunity I’ve seen in my lifetime, and one of the main difficulties I’m struggling with is how to balance my risk. I’ve already got a majority of my net worth in crypto, and the temptation to go even further is strong. I’ve written about GBTC in the past; in the next week I’ll complete a 20-week value-averaging plan that I’ve been executing. It’s just hit its max payout target for the first time, and we are fully in the black. More about that next week.

That said, it’s hard to settle on a sell strategy. The important thing is to have a plan and the discipline to execute it. My hope is that I can use some of the longer-term trend indicators to build a cash reserve that I can redeploy during the next bear market. If we’re setting up for another multi-year parabolic bull run, then I want to make sure I take profits, and do so slowly enough that I don’t miss too much of the top.

The Mastermind

Last night was the Super Bowl, and since I haven’t watched a single game since the last Super Bowl, and none for years before that, I figured why start now. Instead the wife and I decided to wrap up the last few episodes of the final season of Mr. Robot. Spoilers ahead, obviously.

Putting a bow on any work of art and calling it finished is always a challenge. It’s difficult enough for a song or a paper, which is the extent of my creative experience; I can only imagine how hard it is for a book or a multi-season television series. It’s impossible to please everyone, as Lost and The Sopranos demonstrate. With Lost, it was apparent that the showrunners had no idea what they were doing, and The Sopranos may have been more a case of the creator being deliberately vague about what had happened; there may have been too much credit given to the audience in that case. And perhaps no show has destroyed its fanbase more brilliantly than Game of Thrones, which over the course of its run went from cultural touchstone to something that disappeared from public consciousness mere months after its conclusion.

FX’s Legion, based on the Marvel comics, was the last show I wrapped up. It was mostly satisfying, although its final scene, with the (anti)hero and heroine fading out of existence after changing the past, had me saying ‘really?’ to the TV. The Man in the High Castle had a satisfying ending that tied up the character arcs and left all the American Nazis dead, although John Smith’s downfall and suicide at the end seemed a bit out of character for him.

And I was definitely thinking of The Man in the High Castle near the end of Mr. Robot, as we watched the alternate-universe Elliot go about his day in a world where everything was ‘too perfect’. The parallels (pun intended) between this part of the show and the alternate world of MITHC were striking. But the way it all fell apart at the end of Mr. Robot was a bit confusing as it was revealed. I had to chuckle as the fourth wall was obliterated and a manifestation of Elliot’s psychiatrist (or was it the Architect from The Matrix?) looked into the camera and told us viewers that we too needed to let go.

The ending was a bit messy. With the truth revealed about who the Mastermind was, and the show back in the real-world hospital bed, I found myself wondering what it all meant for Whiterose. When she said she wanted to show Elliot what she had shown Angela, did that mean she had killed herself beforehand?

I had known as soon as the two Elliots confronted each other that our Elliot would kill the other. In a show as paranoid as Mr. Robot, it seemed the only way out. But the escape back to the real world felt anticlimactic. The final scenes were emotional, as the various aspects of Mr. Alderson’s dissociated personalities came together, a la Inside Out, before a tunnel ride that borrowed heavily from 2001‘s star-child sequence.

Overall I was happy with the conclusion, and Mr. Robot stands up as one of my favorite shows, even if its depictions of hacking were just realistic enough to make the outcomes seem completely absurd. It was still a great show.

Templates, makefiles, and YAML, oh my!

When I first started programming, it was simple to just fire up an editor and start typing away. Scripts usually wound up as large procedural monstrosities, and if I managed to get anything working, it was usually such a mess that it quickly became unmanageable. Nowadays there’s so much setup to do before I can even get to work: creating a Git repo and a Python virtual environment, external repos, databases, configuring my IDE. I suppose that’s indicative of the progress I’ve made as a programmer.

One of my final classes is a multi-semester group project. We spent last semester building out the design docs, and we’re spending the first few weeks of this one refining those docs individually before coming back together and deploying a prototype. I’m the old man on the team, about twice as old as the rest, and I’ve been doing this long enough to have very strong opinions about a lot of things, so I’ve been trying to guide the team toward certain standards.

I’m not going to get into the use case for our app yet, but I convinced the team to use Django for the backend. Now, while we could use it for the front end as well, I figured that since Django was giving us most of what we needed for the core functionality, we could spend some resources trying out cutting-edge tech that would give the team experience with GraphQL and React Native. I’ve got no idea whether that will make it into the final approach. Even though we’ve got a team of six people and I’m handling all of the infrastructure work, I’m starting to wonder whether the others will be able to implement those new features in time.

I’ve got a few more passes through my individual paper to make, then I’ll start focusing on the prototype presentation. My professor commented during last week’s recitation that these applications did not have ‘cookie cutter’ approaches, and I almost laughed out loud, cause we’re literally using Cookiecutter Django as the basis of our project. I’m debating whether I want to try a live demo of deploying one, or do it offline and record screenshots or something.

Being able to use something like Cookiecutter to set up a Python package, with all the unit testing, CI, and documentation wired up via make commands out of the box, is amazing once you understand what all of that stuff actually does. It can lead to a bit of choice paralysis at first, trying to figure out testing frameworks, code coverage tools, linters and all that; I’m still getting there. But once you’re past that, it makes rapid prototyping easy.

It’s almost maddening thinking about how many different ways there are to set up your workflow. I’m currently using Pipenv as my tool of choice, but I recently read about Poetry, which seems to be a step up in many ways. For now, though, I’m not chasing it down. Instead, I’m going to focus on delivering something using the tools I already have instead of getting caught up in what’s new. It’s a lesson that becomes more and more relevant as I mature in my abilities.

February Focus

Well, February is shaping up to be a very busy month for me. Besides the two courses that finish my degree, I’ve got one credit hour left to earn, so I’ll be taking a two-day bootcamp on R near the end of the month. That same week, I’ve got to take my college exit exam. In addition, PennyKoin has been revived, so the team is debugging a wallet bug that is burning payments, and also forking Monero as the base for a new chain that we’ll be swapping to. And if that wasn’t enough, I took on digital director duties for a school board campaign. I must be crazy.

But wait, there’s more. I’m setting up a WordPress site for the local Democratic Party. I was Sergeant-at-Arms previously, but had to drop that. I’ll be working with a steering committee, doing the technical work to guide them through customizing a theme I picked out and setting up operations on it. I also promised I’d submit a merge request to the Python TD Ameritrade API library.

I’m really having to dig into C++ between my course on scientific computing and PennyKoin. It’s so different coming back to it after spending so much time in Python. I’m still reading through Clean Code and the Gang of Four’s Design Patterns book to figure out how to abstract my code and get rid of duplication. My other class, the group development project, is basically going to be a social networking app running on a Django setup. I’m going to be demoing Cookiecutter Django and a Docker setup for the prototype presentation on Thursday, after I finish the first revision of my software design spec.

My boss had a lot of questions for me earlier this week about my post-graduation plans. Our company still isn’t making any money (or so he claims), and I was honest with him that I can expect a six-figure salary after graduation. I told him that I wasn’t going to abandon him, but that I thought our business model was a dead end. I suggested a pivot to business process automation, and sketched some steps we could take to open up a web-page maintenance business. We’ve been turning away any web development requests our clients have had, and it’s probably been the wrong move. We’ve got a pretty big marketing funnel, so I suggested we scrape all of the web domains in our database and run them through a WordPress vulnerability scanner and see what we get. Low-hanging fruit and all that.
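
The first pass could be trivial. A sketch of the kind of thing I have in mind (the function name is mine; the real scan would hand the hits off to a proper scanner like WPScan):

# Which client domains even run WordPress? Check for the REST API root.
import requests

def looks_like_wordpress(domain: str) -> bool:
    try:
        r = requests.get(f"https://{domain}/wp-json/", timeout=5)
        return r.ok and "json" in r.headers.get("Content-Type", "")
    except requests.RequestException:
        return False

candidates = [d for d in ["example.com"] if looks_like_wordpress(d)]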

Lastly, I can’t wait to listen to this episode of Unconfirmed about employment in the blockchain industry. In some sense, that’s where all the work I’ve been doing recently has been heading. It seems so obvious right now. Between the stuff I’ve been doing with PennyKoin, the trading algorithms I wrote against my brokerage, and the infrastructure work, it’s obvious that’s where I want to go. I’ve also got a connection that I’m going to pursue.

Right after I knock some other things off my todo list.

Windows 10 Provisioning packages FTW

My company has spent the last few weeks trying to prep customers for the death of Windows 7, which officially happened earlier this month. I’ve been less than happy with the amount of buy-in we were able to get from clients, most of whom are either too broke or too stubborn to deal with the expense. We’ve had a few who are taking things seriously and upgrading their machines, so I’m preparing to do a lot of installs.

In the past, when I worked in the enterprise space, we built images for the various models of desktops and laptops that we deployed. It made sense because of the scale. My work in the SMB space doesn’t necessitate that type of operation, since deployments are sporadic and smaller in quantity. Plus, there’s so much churn in the OEM hardware market that it just doesn’t make sense.

As a managed service provider (MSP) with dozens of clients, I’ve been trying to standardize our operations as much as possible, but it doesn’t scale very well. We have a remote monitoring and management (RMM) tool that we deploy to all our endpoints to install our remote access and security tools, but we wind up with a different installer for each site. We can create scripts to deploy some software, but it’s clunky and I don’t like using it. I’ve had some success deploying things like Chrome and Acrobat Reader, but it’s useless for some of the more obscure line-of-business vendors that haven’t packaged their installers for silent installation.

A majority of our clients are either on Windows Active Directory domains or on Office 365, which uses Azure AD, so I’ve managed to write and collect a number of PowerShell scripts for common tasks. But even after seven years I haven’t reached the level of automation I’d like. I’ve written about my attempts to integrate some of our platform APIs, but doing things at the user level is really difficult when you’re dealing with small sites of five or ten users.

Recently I extracted the installer from our RMM provider and found that the package is just another executable with a settings file. One of the lines in this settings file contains the client’s unique ID in their system, and I discovered I could use this as the basis for a universal installer. I wrote a PowerShell script to search and replace this string based on a hash map, and I even added a little popup selection box to run it. It wasn’t anything fancy, but it made my life just a little bit easier.
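
The actual script is PowerShell, but the idea fits in a few lines of anything. Here it is sketched in Python, with made-up names for the client map and settings file:

# Swap the placeholder client ID in the extracted settings file for the
# right site's ID. All names here are hypothetical.
from pathlib import Path

CLIENT_IDS = {"acme": "a1b2c3", "globex": "d4e5f6"}   # site -> RMM client ID
PLACEHOLDER = "CLIENT_ID_GOES_HERE"

def build_settings(site: str, template: Path = Path("settings.ini")) -> Path:
    text = template.read_text().replace(PLACEHOLDER, CLIENT_IDS[site])
    out = Path(f"settings-{site}.ini")
    out.write_text(text)
    return out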

One of the things that’s always been a pain in the ass is dealing with the ‘out-of-the-box experience’ (OOBE) that Windows presents the first time you turn a machine on. We’ve got a standard operating procedure for the default admin account name, password, and machine names, but for some reason it still gets screwed up. So I wrote another small script that I can run in Audit Mode that imports an XML file to skip OOBE, create the account, and install the RMM tool. That made life easier, but it was still buggy.

Lately I’ve been playing with Windows Configuration Designer (WCD). It creates provisioning packages (PPKG) that even end users can apply to handle a lot of this: device naming, AD or Azure AD enrollment, local admin account creation, and application installs. You can even specify WLAN authentication for wireless deployment. Unfortunately, it’s not a panacea, as debugging packages (especially application installs) is a pain. For one, WCD itself is buggy. The advanced editor started misbehaving when I changed the application install order, garbling friendly names and file names in the XML and finally throwing errors at compile time that forced me to start from scratch. And if the package installation fails, it can’t simply be run again.

I made the mistake of trying to wing the installation of some of the applications. WCD is really good with MSI packages, but you’d better have your command switches right for everything else. I kept running into issues with Acrobat Reader: apparently it was still throwing up the GUI and waiting for a click on Finish, which caused the PPKG to stall. Running the PPKG again after restarting the machine throws an obscure error message that is not well documented. And don’t even think about running the package in Audit Mode; it won’t skip OOBE, and it seemed to undo the Azure AD join that I had done.

I wound up splitting the device and account setup into a separate package that I could rerun while I troubleshot the application installs in the main package. Eventually I started using a VM workstation that I could restore from a snapshot, but it was only seven laptops that I needed to deploy. I finally had a working package by the time I got to the seventh!

I’m starting to see the larger picture: a program that edits the XML to dynamically generate provisioning packages for all of our clients. Grabbing the Azure token might take more time to work out, but otherwise I just need to swap out a few variables for the device name and local admin, and maybe select applications to install individually.
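
A minimal sketch of the generation step, assuming I keep a customizations XML exported from WCD as a template (the %TOKEN% placeholders and file names are my own invention, not part of the PPKG schema):

# Render per-client provisioning XML from a hand-made template.
from pathlib import Path

def render_customizations(template: Path, device_name: str, admin_user: str) -> str:
    xml = template.read_text()
    return (xml.replace("%DEVICE_NAME%", device_name)
               .replace("%ADMIN_USER%", admin_user))

xml = render_customizations(Path("customizations-template.xml"),
                            device_name="ACME-LAPTOP",
                            admin_user="localadmin")
Path("customizations-acme.xml").write_text(xml)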

One last thing about Azure AD: apparently joining a Windows 10 device to Azure AD causes BitLocker to be enabled, with the recovery keys synced to the cloud. That’s a nice feature.

Gaussian Elimination with TDD C++, Part 1

I’m pretty pleased with myself. I managed to pull off an epic coding session Friday night and met the deadline on a school assignment. I almost used the word finished there, but I merely got it working. As Uncle Bob says, getting it working is the first step to completion; refactoring and cleaning it up is the next.

The purpose of the assignment was to implement a Gaussian elimination function in C++. The professor, an old Fortran/C++ veteran who had done a lot of scientific matrix work back in the day, wanted us to use pointers to pointers for the matrix rows, to make swapping rows faster. They gave us the following specification of how the matrix would be represented in a data file:

3 // int N, the size of the matrix A
1 1 1 // values of A[0], the first row
0 1 1 // A[1]
0 0 1 // A[2], the last row (A[N-1])
1 1 1 // right-hand side vector b

The professor then went through the algorithm for solving such a system on the board. Later they showed us how to generate data files with solvable problems for testing, but we’ll skip over that for now.

The example the professor did in class was a bit of a mess, so I went looking for better ones. Rosetta Code has examples of Gaussian elimination in many different programming languages. The C version is pretty close to what we need, but even looking at the gauss_eliminate function alone, we can see that it’s doing a lot and could be broken down into smaller functions.

#include <math.h> /* for fabs */

/* Helpers from the same Rosetta Code example: index into the flat,
   row-major matrix, and swap two rows of A along with b. */
double *mat_elem(double *a, int y, int x, int n)
{
    return a + y * n + x;
}

void swap_row(double *a, double *b, int r1, int r2, int n)
{
    double tmp, *p1, *p2;
    int i;

    if (r1 == r2) return;
    for (i = 0; i < n; i++) {
        p1 = mat_elem(a, r1, i, n);
        p2 = mat_elem(a, r2, i, n);
        tmp = *p1, *p1 = *p2, *p2 = tmp;
    }
    tmp = b[r1], b[r1] = b[r2], b[r2] = tmp;
}

void gauss_eliminate(double *a, double *b, double *x, int n)
{
#define A(y, x) (*mat_elem(a, y, x, n))
    int j, col, row, max_row, dia;
    double max, tmp;

    for (dia = 0; dia < n; dia++) {
        /* Partial pivoting: find the row at or below dia with the
           largest absolute value in column dia, and swap it up. */
        max_row = dia, max = fabs(A(dia, dia));
        for (row = dia + 1; row < n; row++)
            if ((tmp = fabs(A(row, dia))) > max)
                max_row = row, max = tmp;
        swap_row(a, b, dia, max_row, n);

        /* Forward elimination: zero out column dia below the pivot. */
        for (row = dia + 1; row < n; row++) {
            tmp = A(row, dia) / A(dia, dia);
            for (col = dia + 1; col < n; col++)
                A(row, col) -= tmp * A(dia, col);
            A(row, dia) = 0;
            b[row] -= tmp * b[dia];
        }
    }

    /* Back substitution: solve the upper-triangular system from the
       bottom row up. */
    for (row = n - 1; row >= 0; row--) {
        tmp = b[row];
        for (j = n - 1; j > row; j--)
            tmp -= x[j] * A(row, j);
        x[row] = tmp / A(row, row);
    }
#undef A
}
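
Here’s roughly the decomposition I have in mind, sketched in Python because it’s quicker to read (the function names are mine, not from the Rosetta Code page):

# The same algorithm split into the steps the C version runs together.

def pivot_row(a, dia):
    """Row index at or below dia with the largest |value| in column dia."""
    return max(range(dia, len(a)), key=lambda r: abs(a[r][dia]))

def eliminate_below(a, b, dia):
    """Zero out column dia in every row below the pivot row."""
    for row in range(dia + 1, len(a)):
        factor = a[row][dia] / a[dia][dia]
        for col in range(dia, len(a)):
            a[row][col] -= factor * a[dia][col]
        b[row] -= factor * b[dia]

def back_substitute(a, b):
    """Solve the upper-triangular system from the bottom row up."""
    n = len(a)
    x = [0.0] * n
    for row in range(n - 1, -1, -1):
        x[row] = (b[row] - sum(a[row][j] * x[j] for j in range(row + 1, n))) / a[row][row]
    return x

def gauss_eliminate(a, b):
    for dia in range(len(a)):
        p = pivot_row(a, dia)
        a[dia], a[p] = a[p], a[dia]    # with row references, a swap is cheap
        b[dia], b[p] = b[p], b[dia]
        eliminate_below(a, b, dia)
    return back_substitute(a, b)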

My experience with C++ has been limited, mostly schoolwork with Code::Blocks and Eclipse; I prefer JetBrains IDEs these days. And I had never written tests in it, so after I set up a new repo, the first thing I did was spend some time figuring out Google Test before I wrote my first line of code. I started with making sure I could load files, then moved on to output helpers, overloading the ostream operator and creating a print() function.

Let me say: Test Driven Development is HARD. It requires a lot of thought up front about what it is that you are trying to do. I started off with a todo list:

- call GaussianElimination function
- read file from file system
- get size from file
- create matrix(size)
- load vector data from file
- create 2d vector array size N
- initialize matrix with values 

and started working through each of them, going through the red-light/green-light cycle: writing a test that would fail, then implementing the code that would make that test pass — and NOTHING MORE. Like any discipline, it’s hard. But the effect is amazing. Having a magic button that lets you change code and get [ PASSED ] back afterward is exhilarating.
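
To show the rhythm, here’s a sketch in Python with pytest, just because it’s compact; my actual suite was Google Test in C++, and every name below is hypothetical. The first ‘red’ test for the loader looked something like this:

# Red: this fails until Matrix.from_file() exists and parses the format above.
from matrix import Matrix

def test_reads_size_from_file(tmp_path):
    data = tmp_path / "system.dat"
    data.write_text("3\n1 1 1\n0 1 1\n0 0 1\n1 1 1\n")
    assert Matrix.from_file(data).size == 3

Then green: write just enough of Matrix.from_file() to make it pass, and nothing more.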

I’ll admit that I cheated a bit as the deadline approached. I hadn’t implemented proper comparison operators for the Matrix class, so I was checking everything by eyeball by the time the code worked and I could submit it for credit. The result was still a far cry from my usual way of operating, with its piles of manually entered code.

I’ll share more in a future post.