Windows 10 Provisioning packages FTW

My company has spent the last few weeks trying to prep customers for the death of Windows 7, which officially happened earlier this month. I’ve been less than happy with the amount of buy-in that we were able to get from clients, most of whom are either too broke or too stubborn to deal with the expense. We’ve had a few that are taking things seriously and are upgrading their machines, so I’m preparing to do a lot of installs.

In the past, when I worked in the enterprise space, we would build images for the various models of desktops and laptops that we deployed. It made sense because of the scale. My work in the SMB space doesn’t necessitate this type of operation, since deployments are sporadic and smaller in quantity. Plus there’s so much churn in the OEM hardware market that it just doesn’t make sense.

As a managed service provider (MSP) with dozens of clients, I’ve been trying to standardize our operations as much as possible, but it doesn’t scale very well. We have a Remote Monitoring & Management (RMM) tool that we deploy to all our endpoints that installs our remote access and security tools, but we wind up with different installers for each site. We can create scripts to deploy some software, but it’s clunky and I don’t like using it. I’ve had some success deploying things like Chrome and Acrobat Reader, but it’s useless for some of the more obscure line-of-business vendors that haven’t packaged their installers for silent installation.

A majority of our clients are either on Windows Active Directory domains or Office 365, which uses Azure AD, so I’ve managed to write and collect a number of PowerShell scripts to repeat common tasks, but even after seven years I haven’t been able to reach the level of automation that I’d like to be at. I’ve written about my attempts to integrate some of our platform APIs, but doing things at the user level is really difficult when you’re dealing with small sites of five or ten users.

Recently I extracted the installer from our RMM provider, and found that the package is just another executable with a settings file. One of the lines in this settings file contains our client’s unique ID in their system, and I discovered I could use this as the base for a universal installer. I wrote a PS script to search and replace this string based on a hash map, and I even added a little popup selection box to run it. It wasn’t anything fancy, but it made my life just a little bit easier.
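The script itself is just string surgery. Here’s a minimal sketch of the same idea in Python (the real script is PowerShell; the placeholder string and client IDs below are hypothetical):

```python
# Map of client names to their unique IDs in the RMM system (made-up values).
CLIENT_IDS = {
    "Acme Dental": "a1b2c3d4",
    "Smith & Co": "e5f6a7b8",
}

def rebrand_settings(settings_text: str, client_name: str,
                     placeholder: str = "TEMPLATE_ID") -> str:
    """Swap the template ID in the extracted settings file for the
    chosen client's unique ID."""
    return settings_text.replace(placeholder, CLIENT_IDS[client_name])
```

The popup selection box just feeds a client name into this function before the installer runs.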

One of the things that’s always been a pain in the ass is dealing with the ‘out of the box experience’ (OOBE) that Windows presents the first time you turn it on. We’ve got a standard operating procedure for the default admin account name and password and for machine names, but for some reason it still gets screwed up. So I wrote another small script that I can run in Audit mode that imports an XML file to skip OOBE, create the account, and install the RMM tool. It made life easier, but it’s still buggy.

Lately I’ve been playing with Windows Configuration Designer. It creates provisioning packages (PPKG), which can be used by end users to do a lot of these things. It’s got some useful features: device naming; enrolling in AD or Azure AD; creating a local admin account; and adding applications. You can even specify WLAN authentication for wireless deployment. Unfortunately, it’s not a panacea, as debugging packages (especially application installs) is a pain. For one, WCD itself is buggy. The advanced editor started misbehaving when I changed the application install order, garbling friendly names and file names in the XML and finally throwing errors at compile time that forced me to start from scratch. And if the package installation fails, it can’t be run again.

I made the mistake of trying to wing the installation of some of the applications. WCD is really good with MSI packages, but you’d better have your command-line switches right for everything else. I kept running into issues with Acrobat Reader: apparently it was still throwing up the GUI and waiting for user input, which caused the PPKG to stall. Restarting the machine and trying to run the PPKG again throws an obscure error message that is not well documented. And don’t even think about running the package in Audit mode: it won’t skip OOBE, and it seemed to undo the Azure AD join that I had done.

I wound up splitting the device and account setup into a separate package that I could rerun while I troubleshot the application installations through the main package. Eventually I started using a VM workstation that I could restore from a snapshot, though it was only seven laptops that I needed to deploy. I finally had a working package by the time I got to the seventh!

I’m starting to see the larger picture: a program that can edit the XML to dynamically generate provisioning packages for all of our clients. Grabbing the Azure token might take more time to work out, but otherwise I just need to swap out a few variables for the device name and local admin account, and individually select which applications to install.
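The variable-swapping part is simple templating. A rough sketch in Python, with the caveat that the element names below are placeholders and not the real WCD customizations schema:

```python
from string import Template

# Hypothetical fragment of a provisioning customizations file; the real
# schema that WCD emits uses different element names.
PPKG_TEMPLATE = Template("""\
<Customizations>
  <DeviceName>$device_name</DeviceName>
  <LocalAdmin name="$admin_user" password="$admin_pass" />
</Customizations>
""")

def render_ppkg_xml(device_name: str, admin_user: str, admin_pass: str) -> str:
    """Fill in the per-client variables before handing the XML to the
    package build step."""
    return PPKG_TEMPLATE.substitute(
        device_name=device_name, admin_user=admin_user, admin_pass=admin_pass)
```

Per-client application lists could hang off the same mapping that drives the universal RMM installer.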

One last thing about Azure AD: apparently, joining a Windows 10 device to Azure AD causes BitLocker to be enabled. The recovery keys are synced to the cloud, which is a nice feature.

Gaussian Elimination with TDD C++, Part 1

I’m pretty pleased with myself. I managed to pull an epic coding session Friday night and met the deadline on a school assignment. I almost used the word finished there, but I merely got it working. As Uncle Bob says, getting it working is the first step to completion; refactoring and cleaning it up is the next.

The purpose of the assignment was to implement a Gaussian elimination function in C++. The professor, an old Fortran/C++ veteran who had done a lot of scientific matrix work back in the day, wanted us to use pointers to pointers for the matrix rows, to make swapping rows faster. They gave us the following specification of how the matrix would be represented in a data file:

3      // int N representing the size of the matrix A
1 1 1  // values of A[row 0]
0 1 1  // A[row 1]
0 0 1  // A[row 2]
1 1 1  // right-hand side b
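Parsing this format is straightforward. As a sketch (in Python rather than the C++ the assignment calls for), assuming whitespace-separated values and `//` comments:

```python
def parse_system(text: str):
    """Parse N, the N x N matrix A, and the right-hand side b from the
    data file format above. Anything after '//' is a comment."""
    rows = [line.split("//")[0].split() for line in text.splitlines()]
    rows = [r for r in rows if r]  # drop blank lines
    n = int(rows[0][0])
    a = [[float(v) for v in r] for r in rows[1:1 + n]]
    b = [float(v) for v in rows[1 + n]]
    return n, a, b
```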

The professor then went through the algorithm for solving such a matrix on the board. Later they showed us how to generate datafiles with solvable problems for testing, but we’ll skip over that for now.

The example that the professor did in class was a bit of a mess, so I went looking for better examples. Rosetta Code has examples of Gaussian Elimination in many different programming languages. The C version is pretty close, but even looking at the gauss_eliminate function here, we can see that it’s doing a lot and can further be broken down into smaller functions.

void gauss_eliminate(double *a, double *b, double *x, int n)
{
#define A(y, x) (*mat_elem(a, y, x, n))
    int j, col, row, max_row, dia;
    double max, tmp;

    for (dia = 0; dia < n; dia++) {
        /* partial pivoting: find the row with the largest value in this column */
        max_row = dia, max = A(dia, dia);
        for (row = dia + 1; row < n; row++)
            if ((tmp = fabs(A(row, dia))) > max)
                max_row = row, max = tmp;
        swap_row(a, b, dia, max_row, n);

        /* eliminate entries below the diagonal */
        for (row = dia + 1; row < n; row++) {
            tmp = A(row, dia) / A(dia, dia);
            for (col = dia + 1; col < n; col++)
                A(row, col) -= tmp * A(dia, col);
            A(row, dia) = 0;
            b[row] -= tmp * b[dia];
        }
    }

    /* back substitution */
    for (row = n - 1; row >= 0; row--) {
        tmp = b[row];
        for (j = n - 1; j > row; j--)
            tmp -= x[j] * A(row, j);
        x[row] = tmp / A(row, row);
    }
#undef A
}
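To illustrate the decomposition, here’s a sketch in Python (not the assignment’s C++) of how the same algorithm breaks into three smaller pieces: pivot selection, elimination, and back substitution.

```python
def pivot_row(a, dia):
    """Index of the row at or below `dia` with the largest |a[row][dia]|."""
    return max(range(dia, len(a)), key=lambda r: abs(a[r][dia]))

def eliminate_below(a, b, dia):
    """Zero out column `dia` below the diagonal (in place)."""
    for row in range(dia + 1, len(a)):
        factor = a[row][dia] / a[dia][dia]
        for col in range(dia, len(a)):
            a[row][col] -= factor * a[dia][col]
        b[row] -= factor * b[dia]

def back_substitute(a, b):
    """Solve the upper-triangular system from the bottom row up."""
    n = len(a)
    x = [0.0] * n
    for row in range(n - 1, -1, -1):
        x[row] = (b[row] - sum(a[row][j] * x[j]
                               for j in range(row + 1, n))) / a[row][row]
    return x

def gauss_eliminate(a, b):
    for dia in range(len(a)):
        p = pivot_row(a, dia)
        a[dia], a[p] = a[p], a[dia]  # row swap; with row pointers this is
        b[dia], b[p] = b[p], b[dia]  # just exchanging two pointers
        eliminate_below(a, b, dia)
    return back_substitute(a, b)
```

Each function is now small enough to test on its own, which matters for the TDD approach below.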

My experience with C++ has been limited, mostly schoolwork with CodeBlocks and Eclipse; I prefer using JetBrains these days. And I’ve never done tests in it, so after I set up a new repo the first thing I did was spend some time figuring out Google Test before I wrote my first line of code. I started with making sure I could load files, then started writing output helpers, overloading the ostream operator and creating a print() function.

Let me say: Test Driven Development is HARD. It requires a lot of thought up front about what it is that you are trying to do. I started off with a todo list:

- call GaussianElimination function
- read file from file system
- get size from file
- create matrix(size)
- load vector data from file
- create 2d vector array size N
- initialize matrix with values 

and started working through each of them, going through the red/green cycle: writing a test that would fail, then implementing the code that would make that test pass, and NOTHING MORE. Like any discipline, it’s hard. But the effect is amazing. Having a magic button that lets you change code and get [ PASSED ] back after is exhilarating.
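As an illustration of a single cycle (in Python for brevity; the actual assignment used C++ with Google Test), the shape of it looks like this, with a hypothetical Matrix class from the todo list above:

```python
# Red: write the failing test first. Running this before Matrix exists
# raises a NameError, which counts as the failing test.
def test_create_matrix_with_size():
    m = Matrix(3)
    assert m.size == 3

# Green: write the minimum code that makes the test pass, and nothing more.
class Matrix:
    def __init__(self, size):
        self.size = size
```

Then refactor if needed, rerun the test, and move to the next item on the list.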

I’ll admit that I cheated a bit as the deadline approached. I hadn’t implemented proper comparison operators for the Matrix class, so I was doing everything by eyeball before I got to the point where the code worked and I could submit for credit. The result was still a far cry from the way I usually operate, with a bunch of manually entered code.

I’ll share more in a further post.

A full plate

Earlier today I finished a fascinating interview that Tim Ferriss did with Penn Jillette. One of the things that I found most interesting was his journaling habit, which he’s kept daily for over thirty years. He begins each morning spending about half an hour writing about the conversations that he had the day before and some of the things that he did. He said he writes about five hundred to a thousand words, then finishes up by reading the diary entries from one, five, ten, and so on years ago. It seems to me that it’s quite a way to keep track of one’s life, find out what one was doing, and see how one’s grown.

I think ideally I’m still trying to make this blog a diary of sorts. Having a daily writing habit is good practice and keeps my mind sharp, but at the same time I don’t see that the content here would be of much use to anyone other than myself in some regards. I don’t want to get into the minutiae of who I talked to and so forth, but I aim to say something true each day, and hope that I’ll be struck by some sort of creative impulse that will mean something more than my particular day. I tell myself if I just keep writing, things will develop on their own. We’ll see.

I’ve been trying to maintain my routine, getting up early enough that I don’t have to rush, trying to maintain a balance between early morning productivity and my late-night tendencies. Trying to juggle all the responsibilities that I’ve somehow managed to saddle myself with, and still find time to do what I want to do. Being a parent is probably the most important job that I have, and it’s made harder by the fact that I have less than three hours with my children each weekday. We recently picked up the idea of “special time” from one of the discipline books that I picked up to help deal with some defiance issues that my eldest is having. It’s fifteen minutes a day where the kids are in charge. Right now all they want to do during that time is for me to give them horsey rides, or toss them around with my feet, doing front and backflips onto the floor or couch cushions. We’ve been experimenting with me holding their feet in my hands while they stand and I push them up in the air above me. Needless to say I have been sore and feeling beat up for several days.

I’ve had some physical discomfort in my shoulder the past few days. I can’t tell if it’s due to the aforementioned horse-play, or a repetitive stress issue due to leaning on my elbow while at my desk. Or it could be strain from the increased amount of piano practice that I’ve been doing. I’ve got a couple short classical pieces memorized that I’m polishing up and hope to have a video up soon. I’ve been spending a good deal of time throughout the day at the keys, and have been slowly improving over the past few weeks. Emphasis on slowly; just earlier I was able to complete a short Bach minuet without any flubs. I’ve probably been practicing it for weeks.

I’m also fighting the nagging feeling that I may have taken on too much and that I’m going to be very, very busy soon. I’m maintaining right now, but a meeting last week looks like it may lead to a partnership that could lead to a lot of opportunity. I just have to balance my past obligations, which are so low on the backburner right now that they may wind up burning me if I don’t stay ahead of them.

Other than that, it’s just breathe in, breathe out. I’m doing fine.

Learning to fly

I’ve been on a bit of a kick on Robert C. Martin’s work lately. Martin, AKA “Uncle Bob”, has written several books on coding, including a couple of classics in the software development field. I’ve watched several of his lectures on YouTube recently, and have been reading through Clean Code the last couple days. It’s really making me realize how garbage the things I’ve been writing lately are, and I feel an immense urge to go back and completely refactor everything that I’ve been working on the past few weeks.

Of course, having a robust integration test suite is absolutely necessary for any kind of refactoring, which is not something I’ve been terribly disciplined about recently. I’m proud to say that I am taking a strict TDD approach to my latest class assignment in C++, although it has slowed me a great deal. The hardest part is determining how to write tests. Sure, I could go and write a massive 200-line function that would take input and perform the Gaussian elimination on it, but since this is part of a larger test suite that we’ll use for our final exams, I want to make the code more modular. For example, see the difference between this big 75-line single main statement, and this one. The latter could still be broken out into smaller functions, according to Uncle Bob, but it’s a step in the right direction.

There were two reasons that I went back to school to finish my degree. The first was that I thought I needed a BS after my name in order to get my resume past some of the gatekeeping algorithms at some firms. I’ve since come to the realization that I have no desire to go to work at any large enterprise or other organization where this would be a factor — six figures be damned. The second was that I felt like I was running into roadblocks with my own development projects. They were basically huge convoluted procedural things. Even when I tried to adopt OOP principles, they were still a mess. I felt like I needed to go back to school and go through the curriculum to get where I needed to be.

I don’t think it’s quite worked out the way I wanted it to. Now, don’t get me wrong, I think earning a degree in ‘Computer Science’ has been valuable, but it’s not quite what I expected. I think one of the intro Unix classes really broke my block when it comes to working with Linux, and that’s a skill that I have definitely appreciated. But I think the focus on Java and C++ is behind the times.

I recently had a conversation with one of my professors about why there hadn’t been any focus on software design patterns, which surprised me. (I’m still working my way through the Gang of Four.) He told me that there was a bit of disagreement within the department between those who wanted to focus on theory and those who wanted more actual engineering and development. So far, the balance of power lies with the theoretical side, which is why the focus is on the math: big-O notation, data structures, and discrete finite automata.

Even so, I’m still surprised that I feel like I’ve taken more out of a couple of 30-year-old videos on Lisp than I have out of the classes that I’m going $20K+ in debt for. All I wanted was to write better code, so that I can make programs do what I want them to do. The ideas that I’ve had were beyond my ability to complete, and I was looking for ways to increase my knowledge. I’m probably being unfair to the university, since some of the more business-end document writing (requirements, software specification documents, use cases, etc.) has already helped me in some of my professional interactions.

At the end of the day, it’s about sitting down with an IDE and writing those magic lines of code that make the computer do what I want.

Not getting fired isn’t enough

Last week was the first day of my last semester before I get my degree, and I’ve already managed to miss two days of classes, one because I had the time wrong, and the second because my youngest was sick. The New Year has seen little activity at my day job, save for some scrambling around the end of life for Windows 7. And despite my best efforts, I’ve still managed to pick up a few projects to add to my already overloaded schedule.

Most of my side work right now is around website hosting. Maintaining domains, SSL certs, installing WordPress; nothing too complicated. I’m using Infinite WordPress to keep an eye on security updates and backups, and have a year’s membership with Envato to access all their premium themes and other assets. It’s low-hanging fruit, I’ll admit.

I’m also close to closing a deal to set up digital infrastructure for a local school board candidate. One of the perks of having run for office myself is the ability to get paid to do the same work for others.

Each day that goes by brings home the realization that I’m all but checked out from my day job. The last few months of 2019 I was furiously trying to find ways to automate operations as much as possible. Not now. There’s no urgency. I’ve spent a lot of time thinking about core values, trying to figure out what kind of culture we have at my job, and the answers I’ve found have been lacking. I take a lot of the blame for the way I’ve acted, which has contributed to a toxic work environment and caused any efforts to enforce discipline to fail.

Perhaps part of it is going several years now without a boost in compensation. I’ve been working at the same salary for about five years, which is crazy. I’ve told myself that I’m comfortable with this because of the freedom I have: I’m able to work from home and don’t have to worry about how many hours I actually work. It’s well under thirty-five a week. Even the sparse goals that I’ve set for myself for billable time have gone unmet.

There is an old adage: employees work only hard enough not to get fired, and employers pay only enough to keep them from quitting. It’s not enough for me anymore. My main concern right now is doubling my income before the end of the year. Wish me luck.

Programmer Discipline

So my productivity has been shot to hell the last two days while I try to familiarize myself with and set up not one but two new programming environments. I’ve got JavaScript for the CCXT/Safe.Trade library, and I was just assigned a C++ module for one of my classes.

I have a somewhat convoluted setup. I like to work from one of two machines. My desktop is for gaming and personal or school projects, and my laptop has a Windows VM that I use for my day job. I also have an Ubuntu server that I’m running a file share and other services on. It’s got Docker running over SSH, but I was pounding my head today trying to figure out how to get IntelliJ to talk to it so I could use the integrated run tools instead of the copy/paste garbage I’ve been dealing with as I try to catch up on 20 years of JavaScript changes and Node.

For one of my final classes I’ve got to implement Gaussian elimination in C++ as part of a larger library that will be part of my final grade. I said goodbye to CodeBlocks and Eclipse a while back, but I haven’t started a project in C++ in years. The only time I’ve looked at it at all has been for the PennyKoin updates. I’ve never spent the time to understand CMake lists and linking, so I just spent a painful hour trying to get Google Test integrated with this new project. Because of course I’m going to write a test before I put down anything more complicated than ‘hello world’.

Of course I am.

I’ve spent the last week going over a series of videos on Clean Code by Robert “Uncle Bob” Martin. It’s a good series that I really enjoyed. Martin is really good up on stage — and funny — and I was disappointed when I finished the last one and realized that there weren’t any more. There’s much more for sale on his CleanCoder site that I might dive into, but I want to read his Clean Code and Clean Architecture books first.

Highly recommended if you have several hours to spare.

I came to realize that the tests that I wrote for the GBTC Estimator were too tightly coupled to the module code, and that the module code was coupled to the input source (IEX, via the pandas DataReader class). So I’ve been trying to decouple it so that it works with a dataframe from my broker’s API. I’m taking some hints from a mocking talk I saw that made me realize that I need to break out dependencies even more.

Safe.Trade integration with CCXT

Cryptocurrency exchange automation library

I’m still operating a two-year-old crypto mining rig here at the house. For the past couple of months I’ve had it mining Arrow, a Zcash clone that has all of the non-private transactions turned off. I’ve accumulated quite a bit of it, and found out this past week that it was listed on the Safe.Trade exchange. Me being me, I immediately went looking for the API docs to see what was available.

I have yet to sell any of the accumulated tokens since I turned the rig on, but I feel like I have enough of a stockpile that it’s time for me to start selling some of it for Bitcoin. So what I would like to do is write a program that will interface with my Arrow mining wallet, see how much has been deposited that day, and transfer that amount over to the exchange. From there, place a market order, and transfer the proceeds to my Bitcoin hardware wallet.

Usually I would just open up Pycharm and start building an API wrapper for the exchange, but I’ve been using the excellent CCXT cryptocurrency exchange library, and wanted to try my hand at adding an exchange to it. The library is very well designed: exchanges are added via a single JavaScript file that maps authentication and API calls to CCXT’s unified specification. It seems simple enough, but I haven’t done JS development in fifteen years.
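On the consumer side, once the exchange is in the library, the daily flow I described above might look like this in Python against CCXT’s unified API. The ARRW/BTC symbol, the reserve amount, and the address are all assumptions on my part:

```python
def amount_to_sell(wallet_balance: float, reserve: float = 100.0) -> float:
    """Sell everything above a reserve kept back in the mining wallet
    (the reserve figure is hypothetical)."""
    return max(wallet_balance - reserve, 0.0)

def sell_daily_yield(exchange, wallet_balance: float,
                     symbol: str = "ARRW/BTC",
                     btc_address: str = "bc1-my-hardware-wallet"):
    """Place a market sell and sweep the proceeds out, using CCXT's
    unified methods. `exchange` would be a ccxt exchange instance once
    Safe.Trade support is merged into the library."""
    amount = amount_to_sell(wallet_balance)
    if amount <= 0:
        return None  # nothing worth selling today
    order = exchange.create_market_sell_order(symbol, amount)
    balance = exchange.fetch_balance()
    exchange.withdraw("BTC", balance["BTC"]["free"], btc_address)
    return order
```

The unified methods (`create_market_sell_order`, `fetch_balance`, `withdraw`) are the whole appeal: the same script would work against any exchange the library supports.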

I managed to download the CCXT Docker image and run the tests, but figuring out how to do test driven development in Node is going to be a bit more than I had originally bargained for. I’m going to have to spend a few days figuring out how to set things up and get in the flow.

Of course, yesterday was also the first day of school, so it’s going to be interesting figuring out how to fit all this in. I’m also still doing work with the Value Average and GBTC Estimators, so I’ll have to balance doing all that as well. Still, having a commit in the CCXT library would be like a badge of honor, so I’m going to give it a shot.

We will keep you posted.

Estimating GBTC price from BTC after-hours activity

Grayscale Bitcoin Trust (GBTC) is a publicly traded investment product listed on the OTC markets. It’s a way for US investors to take a position in Bitcoin through brokerage and retirement accounts like IRAs. A lot of OG crypto-types scoff at the prospect of purchasing such an asset, since you don’t actually control the BTC or the private keys, but for some this is an attractive option, or the only one. I’ve been personally taking positions in GBTC over the past three or so years through my retirement IRA. One of the most overlooked qualities of holding GBTC in an IRA is that all transactions are tax-free. I can take profits in my IRA at any time without worrying about tax liability, which is not something I can say for my actual crypto holdings.

There are two downsides to GBTC. The first is that Grayscale takes a two percent management fee; this isn’t a big deal to me because of the expected gains in a bull run. The other is that GBTC trades at a premium over the underlying asset value: each share represents .00096884 Bitcoin, but GBTC’s price is usually 10-30% higher than the value of that underlying Bitcoin.

One of the main differences between the equities and crypto markets is the fact that crypto trades 24/7. Often, during times when BTC has made a big price movement, I’ve wondered what the corresponding change in the price of GBTC would be (and in my portfolio!). So I have written a small Python package to calculate this, which I call GBTC Estimator.

I have it set up to get public BTC prices from Gemini (via the excellent CCXT package). Right now it uses IEX’s daily GBTC data (and requires an IEX API key), so it only has access to daily OHLCV (open, high, low, close, volume) data. We take the close price of GBTC and divide it by the price of BTC at the same time (4 PM EST) to come up with the implied BTC per share. This number is then multiplied by the current BTC price to come up with the estimated GBTC value.
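The arithmetic itself fits in a few lines (the prices below are made up for illustration):

```python
def btc_per_share(gbtc_close: float, btc_close: float) -> float:
    """Implied BTC per GBTC share at the 4 PM close (premium included)."""
    return gbtc_close / btc_close

def estimate_gbtc(gbtc_close: float, btc_close: float, btc_now: float) -> float:
    """Estimate GBTC's current price from the live BTC price, assuming
    the premium observed at the close still holds."""
    return btc_per_share(gbtc_close, btc_close) * btc_now
```

Note the assumption baked in: the premium is treated as constant between closes, which is exactly what the backtesting mentioned below should check.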

This current version is run from the command line and returns the estimated price as well as the difference from the last close in dollars and percentage. I have plans to put this up as a website that updates automatically, but first I think I’m going to do some backtesting to see how accurate this is. I think there may be some arbitrage opportunities to be found here. I’ve already started refactoring and will have more updates to follow.

Nation Magazine: September 9/16 2019

Yeah, I know, we’ve got a stack of magazines from last fall that we’ve been procrastinating on, and it keeps getting bigger and bigger. It’s obviously too much to keep up with. The Nation’s publishing schedule is pretty prolific, and their subject matter is quite a step up from Time Magazine. A couple of interesting articles in this one:

INDIVISIBLE, by Joan Walsh: Covers the post-Trump activist org of the same name, and schisms between their national leadership and grassroots organizers. This seems to be a recurring theme with liberal organizations; I wonder if Conservatives have the same problems?

Indivisible’s work has earned it enormous political capital; now its national leaders want to figure out how to use it. But since so much of that capital has been earned at the local level, the leadership has to be careful about spending it—and whether it is theirs to spend at all.

I’ve got nothing but respect for the work that Indivisible has done; the local chapter here has done political work for my causes in the past, and they’re a great group of committed activists. The issue here seems to be with how the national leadership wants to leverage Indivisible’s political capital as part of a 2020 presidential endorsement process. The issue with some of the larger cohorts is that an endorsement will likely alienate some members.

Interestingly, after this issue was released, Indivisible released their 2020 Candidate Scorecard. Warren and Bernie take the top two spots, with Biden dead last in the rankings. Apparently Biden declined to participate, so their ranking is based on ‘research into his public record.’ Oops.

To Stay Or Go, by Mara Kardas-Nelson: Environmental racism is the focus here, as the author details the battles that communities in Cancer Alley wage against their own elected officials and the corporations that are poisoning the water and air with the highest concentration of petrochemical plants in the United States. The prevalence of cancer and other sickness in this region has led to a flight of citizens out of the area, leaving most of the rest with few options. As the population has fled, those remaining have fewer chances to sell their homes, and communities see their young people and entrepreneurs dwindle.

Of course people shouldn’t have to flee the homes that they’ve made, but the people of places like St. James Parish seem to be fighting a losing battle. While some activists are trying to fight back via court challenges and electoral battles, the current situation for these communities is quite dire. It’s a situation that we’ve seen play out in other places around the world.

Our Shared Fate, by Suzy Hansen: Review of What You Have Heard Is True, by Carolyn Forche: You may remember this exchange between Ilhan Omar and the Trump administration’s special envoy to Venezuela, where she pressed him over his involvement in the Iran-Contra Affair and in El Salvador’s civil war.

The massacre that isn’t mentioned by name in the above clip is the 1981 El Mozote massacre, an event during the El Salvador civil war in which some 800 men, women, and children in a village were raped and murdered by a US-backed Salvadoran army battalion.

What You Have Heard is True is the memoir of American poet Carolyn Forche, who spent several years in El Salvador during this period. It is one that more Americans need to be aware of, given our complicity in the events there. It puts a different perspective on the immigrants who are fleeing from there to this day, trying to enter the United States and being caught up in family separation.

Fair Open Source

Last night I had the pleasure of meeting Travis Oliphant, one of the primary creators of NumPy and the founder of Anaconda. He’s currently the CEO of OpenTeams, a company attempting to change the relationship between open source software and the companies that build on top of it. I found out about the lecture and was interested in it because of an article I had read in Wired about technology’s free rider problem, and went to the event without knowing much at all about Mr. Oliphant. I soon found out who he was and was very grateful that I had come. I’ve spent a lot of time using NumPy, and I’ll admit I was a bit starstruck.

Travis’s lecture spawned from his experience working on NumPy. He basically gave up a tenure track position at Brigham Young University to work on it, and had to find other ways to support his family for the two years that he was working on the initial release. As has been noted elsewhere, much of the tech boom over the past 20 years has been built on top of the contributions of FOSS developers like Travis and others. He’s a big believer in profit, and thinks that the lack of financial incentives in the FOSS space has caused several problems, including developer burnout, leading to a lack of proper maintenance of these projects. Many of these projects, like NumPy, have become crucially important to the scientific and business communities.

Travis Oliphant’s PyCon 2019 Lightning Talk about Quansight

Oliphant’s goal is to make open source sustainable. Quansight is a venture fund for companies that rely on OSS; one of the companies they’ve funded is a public benefit corporation called FairOSS, which hopes to support OSS developers through contributions from companies that use open source. He’s also doing something similar with OpenTeams, hoping to follow Red Hat’s model of supporting open source by providing support contracts for various projects.

These are all very worthy goals, and I was both impressed and inspired by his talk. It’s opened up some interesting career possibilities. I recently took my first developer payment through Gitcoin, and it was a bit of a rush. Getting paid to work on open source software seems like an awesome opportunity, and I’ll be keeping an eye on this for potential post-graduate plans.