Cryptoasset portfolio tools

I spent some time Sunday cleaning up all the old coding projects on one of my computers and uploading them to my GitLab page. Most of the repos are private, but I'll talk about cryptomarketbot shortly. I had to go through everything to make sure all tokens and other identifiable information were moved out. I used the wonderful python_dotenv package for that.

Most of the repos relate to campaign finance data, which will likely stay private until I'm ready to dox myself, or to crypto and equities. There were a few commercial projects parked on Bitbucket that have also been moved over. Now that I've inventoried what I have, I can start fleshing out the useful stuff a bit more and refining it into something worth using, for myself and others.

Cryptomarket Bot

Cryptomarket Bot, as I call it, is not particularly useful. I wanted to track the advances and declines in the top cryptoassets by market value, so I built a small function that queries the CoinMarketCap API, inventories the top x coins, and counts the number that have gone up or down. I think I had the idea while reading Alexander Elder's book Trading for a Living, and it seemed easy enough to implement. I bundled that as a Twitter bot, but I haven't been very motivated to maintain it. I suppose I should figure out a way to park it in a Docker container where I can keep it running in-house, or push it to a Heroku hobby instance and leave it there. Maybe there are additional analyses that could be run, and the library could be triggered as a function call via a scheduler, cron, or Celery, instead of a never-ending Python script.
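The core of it is simple enough. Here's a minimal sketch of the advance/decline count, assuming CoinMarketCap's listings/latest endpoint and field names, and an API key stored in a .env file; the bot's actual code differs in the details:

import os

import requests
from dotenv import load_dotenv

load_dotenv()  # pull CMC_API_KEY from .env so the token never lands in the repo

# Assumed endpoint and field names from CoinMarketCap's v1 listings API
URL = "https://pro-api.coinmarketcap.com/v1/cryptocurrency/listings/latest"

def advance_decline(top_n=100):
    """Count how many of the top_n coins by market cap are up or down over 24h."""
    resp = requests.get(
        URL,
        headers={"X-CMC_PRO_API_KEY": os.environ["CMC_API_KEY"]},
        params={"limit": top_n, "convert": "USD"},
        timeout=30,
    )
    resp.raise_for_status()
    changes = [c["quote"]["USD"]["percent_change_24h"] for c in resp.json()["data"]]
    return sum(1 for pct in changes if pct > 0), sum(1 for pct in changes if pct < 0)

if __name__ == "__main__":
    up, down = advance_decline(100)
    print(f"Top 100: {up} advancing, {down} declining")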

Finance libraries

I have a number of spreadsheets that I use to track my cryptoasset holdings. I have one for mining and masternodes, and a series of others that I use to track investments in alts, which I've also started using to plan equities trades. The general idea follows Elder's two percent rule: no trade should expose you to more than a two percent loss of total portfolio value. Calculating that number for a brokerage account is pretty simple, since everything is in cash or equities, but crypto is a whole other story. One has to decide whether portfolio value is pegged to the dollar or denominated in BTC (I prefer the latter), and many assets may have no direct pairings, such as new shitcoins that aren't listed on exchanges, or ERC20 assets that are only paired with Ether.
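For the simple case, the math is just position sizing off a stop. A minimal sketch, with everything denominated in BTC and every number below hypothetical:

def max_position_size(portfolio_value_btc, entry, stop, risk_pct=0.02):
    """Two percent rule: size a position so that a stop-out loses at most
    risk_pct of total portfolio value. Prices are quoted in BTC here, since
    I prefer pegging portfolio value to BTC rather than USD."""
    risk_budget = portfolio_value_btc * risk_pct  # max BTC I'm willing to lose
    risk_per_unit = entry - stop                  # BTC lost per unit if stopped out
    if risk_per_unit <= 0:
        raise ValueError("stop must be below entry for a long position")
    return risk_budget / risk_per_unit            # units of the asset to buy

# Hypothetical: 1.5 BTC portfolio, buying ETH at 0.020 BTC with a stop at 0.018
# -> 0.03 BTC at risk / 0.002 BTC per ETH = 15 ETH max
print(max_position_size(1.5, 0.020, 0.018))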

So while figuring out my risk profile for an ETH/BTC trade may be simple enough, determining that for something like IDEX staking is a bit more difficult. It's also hard to separate long-term buys (dollar-cost averaging BTC) from more speculative plays like trying to swing-trade PIVX or something. I'll be spending more time in this space walking through that decision-making process as I figure out ways to model my portfolio.

Tomorrow marks the start of the last quarter of 2019, and I'll use the date to take a new snapshot of my holdings, figure out my strategy for the quarter, and walk through the trades as I plan them out, execute them, and track them. Stay tuned.

Continuous integration/deployment for static web sites

I spent most of my day yesterday getting Jekyll set up on my local workstation. The group project that I'm working on at school requires us to publish a web page, but the course assignment requires that we use only static pages, so we can't run WordPress or other CMS systems. The pages have to be easily transferable to a new location for posterity once the term ends, it's questionable whether PHP is even running on the web server, and getting a MySQL database is a special request.

Anyways, there's only one other person on our team besides me who knows anything about HTML. One of my teammates threw a site together on Silex.me, a visual editor that lets you download static files that can be dropped into the web server's directory. Editing them isn't much fun, though. It requires downloading the files from the web server, uploading them to Silex via Dropbox, and then uploading the files back to the web server. Not efficient, and there's no history.

Now, our CS department does have a local instance of GitLab set up, which raises the possibility of adding some continuous integration/continuous deployment automation to the process. (No Pages, though. Alas.) After much debugging, we were able to create a job that copies the HTML files from our repo's public directory to the web server's secure_html location, where the page is served from:

image: alpine:latest

before_script:
  - apk update && apk add sshpass openssh-client rsync bash

secure_html:
  stage: deploy
  script:
    - eval $(ssh-agent -s)
    - bash -c 'ssh-add <(echo "$SSH_PRIVATE_KEY")'
    - mkdir "${HOME}/.ssh"
    - echo "${SSH_HOST_KEY}" > "${HOME}/.ssh/known_hosts"
    - rsync -auvz public/* user@hostname:/home/user/secure_html/
  artifacts:
    paths:
      - public
  only:
    - master

There are a couple of variables that we had to specify in our CI settings. SSH_PRIVATE_KEY is, of course, the key that we created and uploaded to the server to connect without a password. There should be no passphrase on the key itself, as we could not determine a way to provide it within the script, and it's probably redundant anyways. The thing that caused us a lot of issues was figuring out that we needed SSH_HOST_KEY to prevent a host key verification error when running the rsync command.

This way, whenever someone modifies the HTML files in the repo, those files get deployed to the web server without any intervention. It also gives us change history, which is crucial if someone messes up. Another benefit is that we have a record of commits from the various team members, so we can tell who is contributing.

Now, since hand-editing HTML still requires a bit of finesse, we've been converting our site over to Jekyll, which should allow us to use either Liquid templates or Markdown for our content. I was able to get a local development environment up and running and generate a basic template for our site, so the next step involves a CI job that will build the site and then push the generated static files to our web server. We'll cover that later.

Scaling a managed IT service provider

The company that I work at is coming up on seven years old this winter. We're a small managed service provider with about 4 employees and 25 or so clients, providing IT support and project implementation services for small professional and service companies. We've been stagnant, growth-wise, for the past three years or so, and my main focus, in addition to taking care of our clients, is refining our business processes so that we can scale to the next level. What we've been doing has brought us success, but it's not enough to get us to where we want to be.

We’re part of a franchise system of independent operators all over the U.S. The home office is supposed to provide us with best practices and partner relationships, and the franchisees pool their purchasing power to get best deals with the partners. That’s how it’s supposed to work, anyways. What’s happened in practice is that the home office basically provides new franchise owners with a vendor for this, a vendor for that, and so on, and basically leaves the franchisees to themselves to figure out how to implement it. It’s completely inefficient. I can’t even begin to tell you how much time we’ve spent managing our RMM and PSA tools, or how much of my day to day is refining these various systems (some of which don’t have any API for automation control) to talk to each other.

Instead of pooling human resources, say, to have a team of engineers that specializes in setting up firewall systems, each location pretty much has its own team. We rely on outside NOC and helpdesk partners to deal with first-line issues, and the local teams are supposed to be escalation support. But providing information to these various entities can be very difficult (ITGlue has helped tremendously!), and having a remote helpdesk is very frustrating for customers who expect some sort of continuity.

Unfortunately, we're just not able to provide that level of service for what clients are willing to pay, especially the smaller clients. MSPs use per-month contract billing, with rates for servers, workstations, and other IT resources, but that usually just covers keeping things running remotely; on-site and project work is billed separately.

Things can really add up for clients, especially when they don't follow our recommendations and shit goes south. Most of them are weighing our cost against having their own in-house IT resource, but hardware, software, and personnel costs add up quickly either way. This is even more true when you consider regulatory and compliance requirements. It's really hard.

And companies that skimp on these costs always pay for it. Always. I've had my fair share of ransomware breaches, but one that I saw this week really took the cake. A firm we have done business with in the past, and that we've been under a limited engagement with, had a really bad attack that took down their entire Windows domain: three servers, including AD, Exchange, SQL, file services, and a custom database application. We stopped doing business with them three years ago because it was always a challenge to justify what needed doing over there, and things were usually such a matter of urgency that we were forced to do whatever kept them running. And then we would spend weeks pulling teeth to get paid. We finally said enough is enough and just walked away.

So we got a call from them a few weeks ago. Turns out they had pissed off another MSP, and needed help. They had been through several in-house IT resources, but they needed the RMM monitoring, AV, and patch management that we could provide. But because they were in a dispute with the old IT company, we weren't able to get access to their backup and data continuity appliance.

Long story short, they got hit earlier this week and didn’t have backups for half their shit. I had convinced their in-house person that they really needed to get some sort of local backup, and thankfully they followed my advice. But it was really too little, and they’ve spent the last 72 hours trying to recover. And let me tell you, it was the most stress-free disaster recovery that I’ve ever dealt with. I’ve damn near had panic attacks and probably lost years off my life from the stress of dealing with my own share of these disasters. Sometimes they were self-inflicted, other times not. But since I wasn’t the one holding the bag, I was chill as fuck.

I’ve saw the writing on the wall for MSPs some time ago. I don’t know if it will be ten years or when, but the business model is going to approach a race to the bottom. And our local market is already saturated with 4 or 5 decent competitors, and many more not so decent. Internal conversations around the future of our firm talked a lot about compliance auditing for DOD/NIST, and the question we’re struggling with now is whether we want to be an MSP that does compliance, or a compliance firm that does MSP. My gut tells me to go where others aren’t. Which is why I’m focusing my time on process automation, combining applications via API.

I was able to add several things to our "no" list, things we've done before that have gotten us into trouble. That means setting boundaries for the businesses we deal with, which will likely involve cutting some of our clients who aren't growing with us or don't see the value of the service we provide. It means converting our services into product offerings in order to differentiate ourselves from the competition. And it means automating our processes so we're not making the same decision over and over again.

Sixty for sixty challenge completion

So yesterday marked the end of the Sixty for Sixty meditation challenge. I originally started it following a comment Naval Ravikant made on Joe Rogan's podcast. I'm not really sure how I'd quantify the experience.

I started using the Waking Up app in November of last year, so I have mindful minutes on my iPhone going back to that. It's mostly the ten-minute guided lessons with some twenty- or thirty-minute bursts, then around late June there are a couple of thirty-minute sessions before I quit drinking, then my first one-hour session on the 19th. I had originally tried to scale up from thirty to sixty minutes in five-minute increments, but I gave that up after four days and just went straight to sixty.

I was actually very disciplined about it, only missing one session in August, and even then I still got thirty minutes in. In September, I really seemed to lose steam, and the totals started dropping to 40 minutes, then down to 20. I think the main reason for the drop has been time. School started back up, and as a result I stayed up later and started sleeping in. When I started the challenge, I was waking up around 5AM before everyone else in the house, getting my hour in and making a cup of tea before my youngest would wake up. Other times I might do it first thing in the morning or in the early afternoon before the girls got home.

Now, it's all I can do to get to bed before 11PM, and I get woken up several times during the night by the baby. By the time we get out of bed in the morning, it's time to go, go, go. I've tried making up for it with a 20-minute guided lesson for an early session, followed by another 20 unguided later in the day, but I've fallen off.

One thing that was almost unbearable in the beginning of doing the hour long sessions was the physical pain. I would sit on the ground on top of a couple of cushions, and my back would develop these horrible aches that I would have to stretch out every five minutes near the end of the session. And just getting the feet right for that long took some getting used to as well. But then one day, following a workout, I found that the back pain was gone. So now it’s not the physical pain that is the biggest impediment to my practice, but mental ones. Getting started is the hardest part.

One of the things that isn't so clear to me is what type of practice I'm doing. Since I was coming from the Waking Up course, Sam Harris's version of mindfulness is what I was used to: focusing on the breath, sounds, the visual field, noticing thoughts, and just being aware of the whole of conscious experience. But Naval had urged people just to sit, without any goal. As a result, I found myself thinking through whatever was going on, challenges, ideas about whatever. It was a much more creative, effortless practice.

I’m going to have more to say about habits in a later post, all I want to mention now with regard to meditation practice is that the amount of time I spent meditating daily is a pretty good indicator as to how well I’m sticking to my healthy habits. I don’t have any plans to start drinking alcohol, but I have been drinking a lot of caffeine-laced energy drinks lately. And while I don’t think the two are directly related, I think there’s probably some underlying factor, probably stress, that I’m not dealing with elsewhere else.

And I’m probably not the best person to assess whether my practice has affected my interpersonal behaviors. Of course, the goal of meditation is not what happens during practice, but how you carry that practice into the real world. Being able to recognize and interrupt unhealthy behaviors or responses to stress throughout the day is one of the reasons I took it up. I’m not really sure how that’s turned out. I do find myself more aware at times, but on the other hand I think I’ve been quicker to temper, especially with my kids. Part of it may be no alcohol. But the temper doesn’t linger, and less likely to beat myself up about negative behaviors.

And one more point, about clock watching. I've tried to refrain from keeping a visual or auditory timer or any other indicator of how much time is passing when I meditate. Earlier apps that I used had a soundtrack or a bell to mark intervals, but I found those too distracting. I would hear the loops in Calm's bird-chirping background and notice it every time I heard the same pattern of tweets. So when I first started doing the longer sessions, I'd just set the starting gong and sit there until I heard the next one, which would mark the end of my session.

I always had my iPhone in front of me, and while I would use it as a visual fixture sometimes, I found I had to put it in airplane mode after a few notifications interrupted the app's timer. A few times I found myself checking the timer only to see that I had been sitting for an indeterminate amount of time. Anyways, the last minutes of a session, whether it's a sixty-minute or a twenty-minute one, are still challenging. A part of my brain is sitting there, ready to get up and go, go, go, and it's hard to sit still without checking the clock to make sure the timer is still running. And no matter whether I'm doing a twenty-, forty-, or sixty-minute session, a part of my brain knows that time is winding down, and gets anxious about getting up and getting on with my day. There was only one time I can remember being surprised by the closing bell and thinking, "wow, it's over already."

Conscious realism

Donald Hoffman has been popping up a lot recently. He's the originator of the theory of conscious realism, a new attempt to resolve the mind-body problem, also known as the hard problem of consciousness: how does the experience of consciousness arise from the physical body? Religion's answer has pointed to the soul, but non-theists have been trying to come up with an answer that offers a more testable hypothesis. Quantum physics has shown us that the classical model of Newtonian physics (cause and effect) is not quite correct, and scientists have been trying to reconcile the two for several decades. Hoffman's theory is an inversion of the physicalist interpretation that the mind is an emergent behavior of the brain: it holds instead that the fundamental constant of the universe is consciousness itself, and that the physical world as we know it is but an approximation of the underlying reality, as interpreted by our biological system.

I realize that I'm blowing the interpretation, and that this all may sound a lot like the old adage that we are not physical beings living in a spiritual world, but spiritual beings living in a physical world. Hoffman takes a couple of steps to build to this conclusion. The first seems to be based on some evolutionary mathematics he developed showing that perception of true reality is antithetical to fitness selection in evolution. Hoffman built a computer simulation of a reality, with creatures that either perceived an accurate representation of that world, or perceived only what was necessary for fitness, survival, and reproduction. In all of his models, the creatures that saw an accurate representation of reality went extinct.

So the first part of this theory is what Hoffman calls multimodal user interface theory, which is a way of saying that humans, and all creatures, have evolved a species-specific interface for perceiving a limited version of reality. This is driven by natural selection, and our reality is different from that of other species. This is easily apparent when one considers variations within humans, such as color blindness or synesthesia, or between species, such as the perception of different wavelengths of light.

This idea of the mind as a reality-filter is probably well known to anyone who has partaken in psychedelics, as it becomes apparent that having the entirety of subconscious awareness rushing into consciousness is very detrimental to normal functioning. There's a school of thought in Buddhism called mind-only that has a similar take: only the mind is real, and the physical world is created from it.

Two men were arguing about a flag flapping in the wind. “It’s the wind that is really moving,” stated the first one. “No, it is the flag that is moving,” contended the second. A Zen master, who happened to be walking by, overheard the debate and interrupted them. “Neither the flag nor the wind is moving,” he said, “It is MIND that moves.”

Most materialist theories of consciousness get to a certain point with the structure of the brain, the activity of neurons and neurotransmitters, and posit that if you add enough of these dendritic connections then, poof, consciousness. It's hard to avoid hand waving or magic. Alternatively, Hoffman proposes that "the objective world consists of conscious agents and their experience." Now, this part may be hard to distinguish from panpsychism, which holds that all matter is in fact conscious, and that consciousness is the fundamental building block of reality. (Annaka Harris is a reluctant fan.)

Hoffman's theory is interesting because he's attempting to create a framework for testing these hypotheses with math. These questions have historically been philosophical ones, and it's good to see progress toward framing them in a way that may one day be experimentally testable.

Music like breathing

I spent over two hours playing piano today. I must have been in a flow state for some time. I was learning a simple version of Canon in D, only about 40 or 50 bars, but I managed to get the music and proper fingering memorized, and really speed-drilled it into my head today.

It worked a bit too much, because I found the song playing in my head for most of the day when I wasn't at the piano, and I knew I was really deep in it when I started singing Blues Traveler over it. I just did a 30-minute meditation session, and it was all I could do to concentrate on my breath or on the crickets outside. But even my breath was betraying me, as I found myself keeping time with my inhalations and exhalations. I can still hear it in my brain as I type these words, and I know it will be driving me crazy tonight as I try to go to sleep.

A few years ago I got my hands on a copy of Chuan C. Chang's Fundamentals of Piano Practice, and it got me to the point where I could actually sit down and teach myself how to play Für Elise all the way through. The premise behind the method is that it turns the standard practice regimen on its head. Instead of practicing scales and exercises, Chang recommends going straight into playing an actual piece of music. Granted, there are certain fundamentals one must understand (reading notation!), but there is more to be learned from practicing an actual score than from "exercises". And learning the performance aspect of playing is important.

I took a similar approach when I was learning to play guitar decades ago. I didn't do lessons or study books; I just picked up a copy of Guitar For the Practicing Musician from 7-Eleven and spent hours up in my room learning to play whatever random rock tracks they had for that month. I had a collection of tab books that I would study and play through for hours on end.

Piano has been frustrating for me because tablature is so much easier to read than notation, and I guess I lacked the patience to internalize the mapping between the notes and the keys. It's not any easier now; it just takes practice.

Another of the things I took away from Chang's method is the focus on practicing the hardest parts first. And practicing hands separately. I don't know if the latter is really revolutionary, but you can really swap back and forth with the hands-separate method, drilling the hardest turns over and over at twice playing speed until one hand gets tired, then switching to the other.

Chang's book is very long. I'll admit that I haven't read it. But his site seems like a trove of resources for anyone wanting to learn piano.

Goodbye GitHub

A good version control system is critical to any software development project. I haven't been serious enough in the field to have ever messed with Subversion, but git has been part of my daily workflow for a while now. GitHub has been instrumental in the advancement of open source software, and there are still tons of projects out there relying on it, and I'll continue using it as far as I need to in order to participate in those projects. But moving forward, I'm using GitLab for all my new projects, and it's what I'll be recommending to others.

I’ll admit that I’m a cheapskate, and have never shelled out the $7/month to enable private repos on GitHub. And then Microsoft bought them. I haven’t noticed any changes as a result of that buyout, so I can’t say that there’s anything that troubles me. I understand the ban on embargoed countries that they had to implement, but that bothers me from an imperialist standpoint than anything I hold GitHub responsible for. Their hands are tied.

I attempted to set up git servers on some of my local and hosted machines, but nothing beats the convenience of software as a service. However, when I started pitching what would hopefully be a commercial project, I wasn't about to put things up in a public repo. I had originally started using Bitbucket as an alternative, but recent experiences with other people who have used it have been problematic. (Issues following their buyout by Atlassian left many users unable to access their team accounts…)

My university is currently running an (older) version of GitLab internally, and I've been working with it extensively today as part of a new group project. One of the things we're doing is setting up project repos for our website, and eventually for the other deliverables we'll be generating as part of the course. I wanted to avoid using Google Drive, so I set something up to push our repo to the computer science department's web servers. Unfortunately, they're running a newer version of Kubernetes that is preventing the continuous integration runner from working, so we're pretty much stuck for now. But it's got me looking at options for static content management, which is cool. The idea is to let people easily edit the repo pages using Markdown, and then have Jekyll or whatever package push the generated HTML to the project site. Today has mostly been about setting up scheduling tools and a Discord instance.

But hopping back to the public GitLab site, I've been pretty impressed with things like GitLab Pages and the other features built into the service. So for now, GitLab will be my go-to for all new code repos.

Housekeeping

I don’t really have a lot to talk about today. Yesterday’s post was about three thousand words, and that’s by far the longest post I’ve written in months, especially for a daily. So far the reaction from the Pennykoin community has been positive. We’ll see how things go.

I’ve had family stuff going on this weekend, so was tied up with all that going on today. Next week I’ll probably have more ideas about a new machine learning project that I’ll be working on at school. I’ll probably have to doxx myself to talk about it, but I think it’s worth it, so we’ll see how that goes.

Speaking of school, the other class that I've been taking has been about programming languages, the theory and history of FORTRAN and all that. It got me looking at LISP, and I've been reading about that, and have been watching these old MIT videos from 1986 that deal with it.

Seriously, what is going on with Harold Abelson’s hair in this video?

One of the things that I've been struggling to grapple with is functional programming. I first heard about it through Cardano. That team decided to use Haskell, as it allows code to be formally verified using math. This is important in the smart contract space, as we saw with the DAO hack. So the MIT vids are actually still pretty relevant today, and I've been learning a lot from them. I told my wife that between these classes and my recent progress learning piano and sight reading, I can feel my brain changing. I was only half joking. This type of meta-cognition is an important part of why I meditate, and all of the stuff that I'm learning is definitely having an effect on the way I'm thinking.

But right now I’m just beat from hosting 20 kids for a birthday party and just want to veg out with some World of Warcraft.

Peace.

The Rise and Fall of Pennykoin

Pennykoin is dead! Long live Pennykoin!

About two weeks ago, Jerry Howell, the main developer of Pennykoin (Pk), a Cryptonote privacy coin, publicly abandoned the project, citing personal health and financial reasons. As someone who was heavily involved with Pk during its early days, (eighteen months ago!), and probably the only person besides Jerry that has looked under the hood of Pk, I’ve been asked to take up maintenance of the project.

I feel it is necessary to provide some details that should be shared with the Pk community, my assessment of the current state of the project, and why I think Jerry was right to abandon it.

Beginnings

I don’t have much to say about Pk’s origins. Jerry posted an [ANN] link on May 14, 2018, which was the day after he had the release version of the wallet software released. There isn’t really much to say about Pk as a product. Cryptonote is a privacy coin framework. It was developed to allow people to create their own Proof of Work (PoW) coin by manipulating things such as block time and emissions rate. Bytecoin was the first coin made from it, (and is probably a scam), and Monero is based off of it as well.

Pk first came to my attention over the summer through a tweet from a noted shitcoiner I followed, and there seemed to be a nice community developing around Pk on Twitter, so I threw some hashrate at it and started mining. Jerry and I started talking, and pretty soon I was heavily involved, setting up the official mining pool and block explorer for Jerry while he worked on the code.

I found Jerry’s backstory very interesting. He worked in the service industry, I want to say it was food-related, but my memory is hazy. He had been sidelined due to health issues and was off his feet, so he started learning to program as a way to pass the time. I was actually impressed with what he had managed to accomplish. He had a vision for Pk, and I was happy to be helping build something, instead of passively investing in a project as I had been up until then.

Signs of trouble

At some point during the summer, it became clear to me that there were some serious issues that were going to hamper Pk in the long run.

Bootstrap fail

The first was a problem with bootstrapping, the process by which a fresh copy of the Pk software downloads the historical blockchain data from other nodes. In short, it didn't work. Part of the point of a blockchain is the validation process, in which a node verifies that each block is valid and meets the parameters of the chain that have been specified. Now, Cryptonote is built to allow these parameters to be changed. So if one initially specifies a mining reward of 10, and later decides to change it to 6, there is a way to specify versions of the blockchain that the node can use to validate blocks against the rules in effect when they were mined.

Unfortunately, Jerry made several changes to the Pk code in the first few weeks that didn't follow this pattern, and as a result, nodes were unable to download the complete chaindata and would fail to sync when connecting to the network. Jerry's response was to provide the chaindata as a separate download that needed to be dropped into the application data folder on each machine.

I attempted to help fix this, but my knowledge of CN and C++ programming wasn't up to the task, so I tried to hack our way around it by disabling validation checks on certain blocks. I don't believe these workarounds made it into any surviving releases of the Pk source, but the bootstrap issue has only gotten worse and worse, which is why new installs now require a 100+ megabyte download to bootstrap clients to block 130,000 or so, which corresponds roughly to Jerry's last update at the end of August.

Chain fail #1

Pk started to get noticed around late summer, and we started to see more and more hashpower being added to the network. I forget exactly what algorithm Jerry was using for PoW, but it was GPU-friendly, and available on rental services like NiceHash. Pk wasn't available on any exchanges yet, but we had a healthy over-the-counter (OTC) market going on. (Disclaimer: I was providing escrow services for a fee.) Now, I can't really say for certain whether what happened next was malicious or not, but we experienced a chain failure due to what I refer to as a difficulty attack.

Without getting too technical, blockchains have a target block time, the average time in which a block should be mined. In the case of Bitcoin, it's ten minutes; in Pk, it was two. There is an adjustment built into the blockchain algorithms that determines whether this target block time is being met, and adjusts the PoW difficulty target accordingly. To explain this in plain language, think of the PoW game as a requirement that n coins must be flipped heads in a row to win the block reward. As more miners join the network, that number increases, from 10 to 100 to 1000 and so on. As miners leave the network, that number decreases.
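To make the mechanism concrete, here's a toy retargeting rule in the spirit of Bitcoin's periodic adjustment; Cryptonote's real algorithm works over a rolling window of recent blocks, so treat this purely as an illustration:

def retarget(current_difficulty, actual_timespan, target_timespan):
    """Toy difficulty adjustment: if blocks came in faster than the target
    block time, difficulty goes up; if slower, it comes down. (Cryptonote
    uses a rolling window of recent blocks; this only shows the principle.)"""
    ratio = target_timespan / actual_timespan
    # clamp the adjustment so one period can't swing difficulty wildly
    ratio = max(0.25, min(ratio, 4.0))
    return current_difficulty * ratio

# 720 blocks at a 2-minute target should take 1440 minutes; if rented hash
# power mines them in 360 minutes, difficulty quadruples (hitting the clamp)...
print(retarget(1_000_000, 360, 1440))   # 4,000,000
# ...and once that hash power leaves, the remaining miners are grinding
# against a difficulty set for a network several times their size.
print(retarget(4_000_000, 5760, 1440))  # back to 1,000,000, but only after
                                        # enough blocks are actually found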

What happened with Pk that fall was that so much hash power was thrown at the network that some individual or individuals were able to mine an immense amount of Pk at a faster-than-normal rate. This drove the target difficulty up immensely, at which point that power was removed from the network. The end result was that the remaining Pk miners did not have enough power to mine any blocks, or at least not at any rate that would have kept the chain running. Miners, not getting the expected rewards, left the network, making the problem even worse.

Now, I give Jerry immense credit for getting things running again. He basically implemented a difficulty adjustment fix in the code base and swapped the mining algorithm out for one that let us deploy much more powerful ASIC miners. He pushed the changes out after a week, the community threw some hashpower at it, and we were off and running again.

Note that I am being a bit speculative about what happened; without running metrics on block times and other chaindata, one cannot know whether this was a simple difficulty attack or something more nefarious.

You don’t know what you don’t know

In spite of the heroic efforts on the part of Jerry and others to keep Pk moving after this, there were problems with his development style, owing to his inexperience, that have ultimately hobbled the project. Now, let me be clear: I have nothing but respect for Jerry and am saying none of this to impugn his character. That said, I think there are things that any potential Pk investor needs to know, and that's why I want to put this on the public record. I am by no means a professional software developer, so while I was ultimately able to spot these problems, I was in no position to correct them myself.

First off, Jerry never got the hang of version control. When I first started working with him, he had a horrible habit of committing to master, and when he needed to change something, or hit a wall, he would just delete everything and start from scratch, which breaks the commit chain and makes tracking changes very difficult. For example, here's what the early releases of the PKNode software look like:

Orphan commits are not good.

Now, someone with more experience than I have could probably rebase these to link them together, but there are dozens and dozens of commits and branches that go nowhere, and since I stepped away from Pk at the end of fall 2018, I don't really know what he was doing. There are a few blog updates over the past year that talk about various initiatives, but there are just too many changes for me to try to reverse engineer.

Technical debt must be paid

Unconfirmed transactions or ill-gotten gains

There are a number of problems with Pk as it stands now. First off, there is a huge bug that has locked up about seven percent of all of the coins currently in circulation. For a quick technical explanation, I need to talk about the unlock_time parameter of a CN transaction. This unlock time is primarily meant as a way to prevent newly mined coins from being spent immediately; I believe it's meant to discourage forking attacks or something similar. With Pk, this unlock time is normally set to 10 blocks (20 minutes) for block rewards. For standard transactions, it is zero. Normally, anyways.

After Jerry’s disappearance, following the latest release, we started getting reports from people with unconfirmed transactions. According to the information that I’ve gathered, these people withdrew these amounts from the Graviex exchange, and their wallets were not making them available to be sent elsewhere. I was able to get my hands on one of the individual wallets, and confirm the issue.

A number of these transactions were mined to the blockchain with a large unlock time. So large, in fact, that at Pk's target block time, it will be one hundred and twenty years before these funds are available to their owners. I wrote a program to examine the entire Pk blockchain, and I believe that these transactions were all limited to the period between November 2 and March 31, which coincides with the release of several versions of the Pk software. I do not believe this bug is currently in circulation. However, I have been unable, or more accurately, unwilling, to contact Graviex to see what was going on on their end in order to properly establish a root cause. In fact, I don't think it's likely that I will.
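For the curious, the scan boiled down to something like this. This is a simplified sketch: the real program walked the raw chaindata, so the field names here are only illustrative:

BLOCK_TIME_SECONDS = 120    # Pk's two-minute target block time
NORMAL_UNLOCK_WINDOW = 10   # 10 blocks (20 minutes), the usual coinbase unlock

def find_stuck_transactions(transactions):
    """Flag transactions whose unlock_time (treated here as a block height)
    sits far beyond the normal 10-block window. Assumes each tx is a dict
    with illustrative "hash", "height", and "unlock_time" keys."""
    stuck = []
    for tx in transactions:
        excess_blocks = tx["unlock_time"] - tx["height"] - NORMAL_UNLOCK_WINDOW
        if excess_blocks > 0:
            years_locked = excess_blocks * BLOCK_TIME_SECONDS / (3600 * 24 * 365)
            stuck.append((tx["hash"], years_locked))
    return stuck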

Et cetera…

There are other issues with the current code base that I won't go into in much detail. There seem to be discrepancies between the CLI and GUI software, which I believe stem from the core CN/Pk libraries existing separately in each. Wallet balances differ between the two for some of my older wallets, and I've documented at least one transaction that is present in the CLI but is not displayed at all in the GUI. From what I can determine, this is not related to the deposit functionality, which is not implemented in the CLI at all.

Given the vulnerabilities and bugs that have recently been exposed in Cryptonote, I cannot rule out the possibility that this was intentional, whether malicious or more benign. I am not accusing anyone specific of any ill behavior; I am just stating that I do not know enough to make that determination at this time.

The Jerry-sized hole

Up until now I've been discussing mostly technical challenges that can be overcome with enough time and effort. There is one issue that cannot be, and that is the absence of Jerry Howell.

I’m passing no judgement here. Jerry had to chose between health and keeping a roof over his head, and I’m not going to second-guess his decision. Unfortunately, he couldn’t eke out a living from Pk, and priorities is priorities. He did what he had to do. For the future of Pk, that presents us with some problems.

First off, because Jerry still holds the keys to the kingdom, so to speak, we would have to recreate much of the public-facing infrastructure: two sets of GitHub repos, social media accounts, various Discord instances, &c. Thankfully, members of the governance committee have access to the web domains and hosting services that are running the main pool. But in order to modify the code, I would have to create new Pk repos, which would present us with a different type of 'fork' that could become problematic. The last thing we want is a non-authoritative code source online, and what would happen if we took steps to publish new code to maintain Pk, and Jerry changed his mind and decided to go in a different direction?

To that end, members of the governance committee have decided that it would be best to wait till the end of the month to see if he resurfaces. There is no doubt that he has been active; we noticed a new GitHub repo for an unrelated project that he created a few days after he disappeared. The best thing for Pk would be for him to pop up long enough to properly hand off control of these resources so that development can continue without him.

If not, it will likely be the end of ‘Pennykoin’ altogether. The risk of associating with that name is too great, and at a minimum, the community will have to decide on a new name and rebrand.

The future of Pk

Even if Jerry does come back and hands things off, it doesn't mean that we'll be out of the woods. I've mentioned the technical debt that has to be paid off, and I can tell you right now that I am not the one to lead Pk out of the desert, so to speak. My personal investment in this project is not substantial, and the opportunity cost of spending the time needed to figure out how to fix what has happened is too great a risk. Unfortunately, there does not seem to be another individual within the community right now who has the technical skills needed to manage a project of this complexity. I simply do not have the time.

A quick calculation of the current Pk supply gives us Pennykoin's market cap: 1.6 billion Pk minted over 150k blocks. The 'price' as I write this is less than one sat, more like 4/10 of one, which means we're looking at a total cap of less than 7 BTC, or about $70,000 USD. That's impressive for a project like this, and demonstrates the promise of crypto-based assets, but as Jerry has found, one can't pay the rent with those kinds of numbers. I just can't commit to that, and am not willing to take responsibility as the one to fix the numerous problems that exist today. I'm not even sure I know how to fix them.
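Spelled out, with the BTC price being the roughly $10,000 implied by the figures above:

supply_pk = 1.6e9         # roughly 1.6 billion Pk minted over ~150k blocks
price_sats = 0.4          # about 4/10 of one satoshi per Pk
btc_usd = 10_000          # rough BTC price implied by the $70k figure

cap_btc = supply_pk * price_sats / 1e8   # satoshis -> BTC
cap_usd = cap_btc * btc_usd
print(cap_btc, cap_usd)   # ~6.4 BTC, ~$64,000, i.e. less than 7 BTC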

Stats on the PK block explorer

My personal and professional opinion is that the entire codebase should be redeployed from the ground up using test-driven development principles, with features selected through a proper governance process. I also think that, inevitably, the existing chain will need to be abandoned completely. I'm not familiar enough with blockchain engineering to figure out a way to hard fork the stuck funds back into circulation; it would be something on par with Ethereum undoing the DAO hack, and I'll admit I'm not up for it. I have already been thinking about protocols for an automated chain swap that would allow us to provide current Pk holders with equity in the new chain.

Moving forward

I realize that this post may not be what anyone wants to hear, but I think it is an honest assessment of where Pk stands. I haven't given much thought to what the fallout of this will be, but I assume that most people will be disappointed, and maybe even angry at me for publishing this. So be it. I've been in the cryptospace for almost five years, and I think this is one of the greatest opportunities that our generation will see in our lifetimes. Many of you no doubt agree, else you would not be here at the end of this brutal, brutal bear market. That said, I believe that there is no such thing as failure so long as knowledge is gained. I know I have learned an immense amount about blockchains from working on this, and many others have learned as well, stepping up to run nodes, websites, and other parts of the Pk infrastructure.

So there is a way for us all to move forward, utilizing the connections that we have. It may not look like Pk does today, but there's no reason that the connections and network we've built should go to waste. This space is moving fast, and there are lots of opportunities to form a new type of organization that can help us move forward and build a future we can all be proud of. I am personally very interested in the work being done around Decentralized Autonomous Organizations (DAOs) and Decentralized Finance (DeFi), but others may have their own suggestions.

What’s important to me is that the future that the Pk community decides on, whatever the course, is decided in a democratic manner. That is our first challenge, ensuring that what we build is more robust, and can’t be destroyed by the decision of one person to disappear. I have already been in discussions with others on the governance committee about forming a formal, legal entity to help manage this responsibility.

I realize the community is going to need some time to process what I’ve written, and I know there will be lots of discussion around this in the days to come. I hope that enough people will feel it worth their time to stick around, and help decide on next steps, together. There is vast opportunity in the crypto space, and I have already met many people that I would be honored to work with to build the next iteration of blockchain based products.

Regards,
el

Stablecoin lending interest with DeFi

So there's been a bit of life in the crypto markets the past day or two. Bitcoin has been trading in a range, and cryptotwitter is debating whether it's a descending triangle or a wedge, trying to predict whether a breakout up or down is coming. It looks to me like it's in a consolidation zone. I have been holding off on converting much fiat to BTC, since I have other financial responsibilities taking precedence. Plus I have too much exposure in general.

I have begun plans to phase out my use of Lending Club for investment purposes. I had started separate accounts for both of my kids, and was happy with the $25-50/month that I had been putting in there for them, earning three to five percent interest. But around the time of the bull run, October 2017, I decided to start putting those funds into BTC instead, giving both of them their own wallets. I let Lending Club continue to reinvest the returned payments. Until recently.

The big talk in the cryptoasset space right now is decentralized finance, or DeFi. Most of the major apps in the space rely on Ethereum smart contracts, with stablecoins like Dai being the most prominent. I became aware of platforms like Compound, which allow lending and borrowing of several assets, like Dai, Ether, and others. The basic premise behind Compound is that people deposit their assets with the smart contract, and can then use those assets as collateral against which they can borrow other assets. Why people borrow is something I can't really explain; I assume most of it is speculative trading, and it's a bit too risky for me given the borrower APR.

Now, Dai, which tries to maintain 1:1 parity with USD, has had a 20% stability fee assessed against it, which is why it had a nearly twelve percent lending interest rate on Compound a few weeks ago. I had to try it out. I had some change on Coinbase, so I bought twenty bucks worth of Dai, transferred it to a MetaMask wallet, and had it deposited at Compound in no time.

Supply (lending) interest rates on Compound for Dai and USD coin are much higher than traditional finance (for now).

Now, this is not financial advice, and there is risk with DeFi and smart contracts. There is the possibility of a flaw in either the Compound or Dai contracts, and something could go horribly wrong. But I've decided to stop reinvesting the kids' funds on Lending Club, and will start moving them over to Compound as the loans are paid out. There's no sense in lending USD at less than three percent, given that it's hardly better than inflation. The rates on Compound and other DeFi applications can fluctuate daily as well, so I'll need to keep an eye on things and make sure nothing crazy happens.
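To put the rate gap in perspective, here's a rough compounding comparison of $25/month at roughly 3% versus roughly 12%, the approximate figures above; rates move constantly, so this is an illustration, not a projection:

def future_value(monthly_deposit, annual_rate, years):
    """Value of monthly deposits compounded monthly at a fixed annual rate."""
    balance = 0.0
    for _ in range(years * 12):
        balance = balance * (1 + annual_rate / 12) + monthly_deposit
    return balance

for rate in (0.03, 0.12):
    print(f"{rate:.0%}: ${future_value(25, rate, 10):,.2f} after 10 years")
# roughly $3,500 at 3% versus $5,750 at 12%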

Given that I want to take advantage of this new opportunity without increasing my exposure to BTC directly, earning high interest on stablecoins pegged to USD seems like a no-brainer.