Unveiling DPHA: My Journey in Creating a GPT-Based Developer Assistant

Header image: a futuristic AI assistant rendered as an abstract, luminous digital construct of glowing neon lines, set against a network of interconnected digital nodes.

Harnessing LinkedIn’s Open-Source Framework for Enhanced Developer Productivity

I’m excited to share with you a fascinating project I’ve been working on recently – the creation of a GPT Assistant, which I’ve affectionately named the Developer Productivity and Happiness Assistant (DPHA).

LinkedIn’s Open-Source Framework: A Catalyst for Innovation

Recently, LinkedIn made headlines by sharing its Developer Productivity Framework (The New Stack article) and opening its doors to the broader tech community through their GitHub repository. Inspired by this, I embarked on an intriguing journey to transform this framework into a practical AI tool.

The Birth of DPHA: From Concept to Reality in Fifteen Minutes

Believe it or not, converting this repository into a GPT Assistant took about fifteen minutes. You can check out DPHA here. That quick turnaround was an eye-opener, showcasing the immense potential and efficiency of AI in modern software development.

Exploring the GPT Store: A New Avenue for Monetization

As someone keenly interested in the evolving landscape of AI, I’ve been exploring the idea of creating GPTs as a potential revenue stream, especially with the emergence of OpenAI’s app store (Forbes article). DPHA is my first foray into this domain.

Spreading the Word: LinkedIn and GitHub Discussions

To gauge interest and gather feedback, I’ve shared DPHA with my LinkedIn network and also posted a link to the GitHub discussion for the repository (GitHub Discussion). These platforms are excellent for attracting traffic and engaging with potential users, especially developers who might benefit from this tool.

Test Driving DPHA: A Glimpse into the AI’s Capabilities

My first interaction with DPHA was simple yet revealing. I started with a “let’s get started” prompt and followed the assistant’s lead. The conversation was insightful, with DPHA adeptly handling queries and offering valuable suggestions based on the Developer Productivity Framework. Here’s a snippet of our conversation, tailored to my context.


Business Context:

  • DAO Operating Model: Unique collaborative and decision-making processes.
  • Focus on LLM-Based Tools: High engagement with cutting-edge AI technologies.
  • Participation in Tech Sprint: Opportunity for recognition and scaling.

Team and Tools:

  • Small Team with Diverse Experience: Your extensive tech experience combined with Laren’s current academic perspective.
  • Current Tools: GitHub, IntelliJ/PyCharm, OpenAI, various FOSS projects.
  • Pain Point: Transitioning from Replit to GCS with proper CI/CD pipelines using GitHub Actions.

Goals and Challenges:

  • 2024 Goals: Enhancing personal automation projects, succeeding in the challenge.org tech sprint, launching GPT assistants, and building tools for personal branding.
  • Pain Points: Deployment issues, desire for smoother CI/CD processes.
  • Metrics and Feedback: Basic time tracking, concerns about code quality and build integrity.

Recommendations:

Workflow Optimization:

    • Transition to GCS: Plan a phased migration to Google Cloud Services, ensuring minimal disruption. Start with less critical components to gain familiarity.
    • CI/CD Pipeline: Implement GitHub Actions for continuous integration and delivery. Start with basic build and test automation and gradually incorporate more complex workflows.

    Metrics and Feedback Systems:

    • Code Quality Metrics: Integrate tools like SonarQube or CodeClimate with your GitHub repository to automatically review code quality and maintain clean code.
    • Build and Deployment Metrics: Track build success rates, deployment frequency, and time-to-recovery in case of failures.
    • Time Tracking Review: Schedule regular reviews of time tracking data to identify bottlenecks or inefficiencies in your development process.

    Team Collaboration and Growth:

    • Regular Check-ins: Have weekly sync-ups with your team to discuss progress, challenges, and brainstorm solutions.
    • Learning and Development: Encourage Laren to bring in new ideas from her academic background and allocate time for learning new technologies and best practices.

    Project Management:

    • Agile Practices: Implement agile methodologies like Scrum or Kanban to manage your projects. This can help in better handling your diverse project portfolio.
    • Task Management Tool: Use tools like Trello or Jira for better task visibility and prioritization.

    Risk Management:

    • Version Control Best Practices: Ensure that you’re using feature branches, pull requests, and code reviews to maintain code quality and prevent breaking the build.
    • Backup and Recovery Strategies: Regularly back up your code and have a disaster recovery plan, especially important when working with cloud services.

    Innovation and Experimentation:

    • Blockchain and AI Integration: Keep experimenting with blockchain and AI in your personal projects, as this aligns with your interest and business model.

    Participation in Challenge.org Tech Sprint:

    • Focus on MVP: Concentrate on building a Minimum Viable Product (MVP) for the tech sprint, leveraging your strengths in AI and blockchain.

    Building Personal Brand:

    • GPT Assistants Deployment: Progress with deploying GPT assistants on the OAI store, potentially using this as a case study or showcase for your brand.

    Navigating the Evolving Landscape of LLM Apps and Agent Frameworks

    Exploring the Convergence of Workflow Management and AI in the Age of Automation

    In the dynamic and rapidly evolving field of LLM applications and agent frameworks, staying ahead of the curve is a necessity. This blog post delves into our journey and insights as we navigate through various platforms and technologies, aiming to integrate advanced AI capabilities into our workflow and products.

    Steamship to Super Agent: A Journey of Exploration

    Our exploration began with the Steamship framework, offering a comprehensive environment for single-agent application development. However, as our needs evolved, we found ourselves seeking more control over the systems we run. This led us to Super Agent, a self-hosted solution offering a similar range of capabilities but with greater autonomy.

    Integrating OpenAI’s Latest Innovations

    A significant focus has been on integrating OpenAI’s latest offerings. We’re currently working on incorporating the Whisper model into Discord voice channels, aiming to enable the model to transcribe and respond using text-to-speech modules. This integration represents a step towards a fully multimodal business intelligence system.
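
    To give a sense of the building blocks involved, here is a sketch (not our Discord integration) that assumes the openai Python SDK v1.x, an API key in the environment, and placeholder file names:

    # Sketch: transcribe a captured voice-channel clip and synthesize a spoken reply.
    # "clip.wav" and "reply.mp3" are placeholder file names; the Discord capture and
    # playback layers are assumed to exist elsewhere.
    from openai import OpenAI

    client = OpenAI()

    # Speech-to-text with Whisper
    with open("clip.wav", "rb") as audio_file:
        transcript = client.audio.transcriptions.create(
            model="whisper-1",
            file=audio_file,
        )
    print(transcript.text)

    # Text-to-speech for the assistant's reply
    speech = client.audio.speech.create(
        model="tts-1",
        voice="alloy",
        input=f"You said: {transcript.text}",
    )
    speech.stream_to_file("reply.mp3")  # newer SDK versions prefer the streaming-response helper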

    The Promise of ChainLit and TaskWeaver

    ChainLit has emerged as a compelling option for building Python LLM applications, akin to StreamLit but tailored for LLM apps. Simultaneously, we’re examining Microsoft’s TaskWeaver, a code-first agent framework that appears to merge workflow management with LLM capabilities seamlessly.
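
    Part of ChainLit’s appeal is how little code a working chat UI takes. A minimal sketch, based on its basic message-handler pattern (exact signatures vary between versions):

    # app.py: minimal ChainLit sketch that echoes each user message.
    # Run with: chainlit run app.py
    import chainlit as cl

    @cl.on_message
    async def handle_message(message: cl.Message):
        # A real app would call an LLM here instead of echoing.
        await cl.Message(content=f"You said: {message.content}").send()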

    Navigating the Challenges of Observability and Management

    Observability remains a significant challenge, especially given the limitations of OpenAI’s organizational structure. Tools like Pezzo Labs and Langfuse offer promising solutions, but the decision on which platform to commit to remains open.

    Looking Forward: An Eye on the Future

    Our journey is characterized by continuous learning and adaptation. We recognize the importance of not committing prematurely to any single framework, keeping our options open as the landscape of LLM apps and agent frameworks continues to evolve.



    Building the Nebuchadnezzar

    Dreams you can’t remember but keep searching for an answer

    I’m not sure exactly why I’ve been having problems with my sleep schedule lately. Waking up at 3 AM has never been my thing, and I’m not particularly enjoying it. It doesn’t help that I get woken up by Younger, or that the cats like to stampede through the house, making me think we’re getting robbed and spiking my adrenaline. I’ve always been a light sleeper, ever since I was a teen. I could never sleep as long as my mom and dad were awake and I could hear them watching TV or talking and getting ready for bed. I suppose that’s one of the reasons I’ve always gravitated toward being a night owl. I’ve heard that some people with ADHD tend to be night owls as well, getting most of their work done in the hours when everyone else is quiet. It certainly works for me in the morning.

    As my tinnitus has gotten worse over the years, I’ve taken to keeping a fan on, and more recently a noise machine. My wife has convinced me to make a habit of going to bed at 10 PM, and I’ll read a book on my iPad or on dead trees for a bit before sleeping. Sometimes I’ll take a melatonin, but I’m wary about overusing them because of the tolerance, and because lately when I take one I wake up at three in the morning and can’t go back to sleep.

    I am one of those people who seem to get by with less sleep than most people. Certainly less than my wife, who would be happy to sleep twelve hours a day if circumstances permitted. I think it’s one of the reasons that our relationship has worked so well over the years. She liked to sleep, and I got to stay up late playing video games. At least before the kids came along, anyways.

    So this morning I woke up from a dream — something about an alien invasion and trying to fight back against the oppressors. We were on the run and came to a dead end in a canyon. We were trapped — and I woke up. I tried to go back to sleep, but then the cats knocked something over or made a noise, or maybe it actually came from outside, and a part of my brain yelled HOME INVASION THEY’RE COMING FOR YOUR CRYPTO. My adrenaline spiked and my mind started running through response scenarios.

    Meditation has been helpful in quieting things down, but most of the time I fail to fall asleep, so I lie in bed until there’s some thought that won’t let go and I finally give up. Of course this morning it was related to the dao proposal that I’ve been working on for two days now, trying to come up with a proper incentive scheme for early participation. So I got up, boiled water for my tea, meditated, had some ideas, gave sleep one more shot on the couch, and now I’m up for the day. It is twenty after six. Such is my life these days.

    So I’m very close to finishing the dao proposal. I have some more writing to do, then I’ll be ready for some more community feedback. Yesterday I spoke to a computer science student in Singapore. They “manage equities” for their family and said they were looking at a minimum investment of fifty thousand dollars. We talked for an hour, and I was able to secure a promise to move forward with the broad plan that I described. So I just need to finish writing out those broad strokes so that I can give everyone a day to join before the sale launches.

    All this work with DaoHaus on xDAI and with the Gnosis safes has really given me some great ideas. Even beyond the NFT sale, this may be a great way to organize the family business. xDAI would mean that the wallets would be easy to manage, share-wise, and I could just run everything from a Gnosis safe. I could set up my dad and brother as owners and we could have a one-of-three multisig on it. Might work out beautifully, depending on gas costs. Might get a bit more complicated later on when we start, but I think it’s the solution that I’ve been looking for.

    Side note: I ran across this firm, Korporatio, that sets up offshore entities in Panama and other locales. The created entities are managed on Ethereum through an interface that looks strikingly like DaoHaus. Too expensive for me to consider right now, though.

    I’m going to be very happy with myself after the NFT dao is launched. I’m still excited about Star Atlas itself; my conversation with the student from Singapore touched on Solana a bit. Since the game will run on Solana, and most of the game’s underlying infrastructure will be publicly accessible, we’ll be able to write our own programs to interface with it. It’ll be like hacking into the Matrix from onboard the Nebuchadnezzar. We should be able to access the game’s API and manage things from outside of the official game client. Fun. I was skimming the Solana docs yesterday, and it looks like it’s going to be a long road to that one.

    From what I can tell, there are so far no dao projects active on Solana, so hacking together something that resembles MolochDao looks like it’s going to be our first goal if we are going to bring SAIA Dao to fruition.

    Web3 development

    Yesterday we had a party for Elder, since we were out of town for her birthday. Our quaranteam came over, so all the kids were running around the backyard while we ate pizza and wings and drank the latest batch of my homebrew. Missus even got me to break out my guitar, and I spent a good hour playing and singing at the top of my lungs. It was good times.

    I kept partying after the kids went to bed and played video games until well after midnight. I paid for it this morning. There wasn’t much cleanup left to do from the party, but I wasn’t productive in the AM. I didn’t get much work done for Zombie; I pretty much just checked in with Boss and spent the rest of the morning looking at markets and reading.

    I wound up buying an IXL subscription for Younger, and we spent some time working through some of that.

    I decided to take a look at Flutter and worked through the entire tutorial. It’s a very interesting project, similar in spirit to React Native, that lets you build one codebase that renders on iOS, Android, and the web. It’s pretty neat. I’m not sure how I feel about Dart yet, but I’ll probably dive into it more later. The whole ecosystem is pretty interesting; Material probably deserves a closer look at some point as well.

    I’ve got to be careful though, cause I feel like my backlog is gonna get swamped quickly at the rate I’m going. I also ran across Alchemy, which is a development platform for Ethereum, and I feel like I’m going to be spending a lot of time with that very soon.

    It seems like there are a whole lot of things to be built in the DeFi ecosystem, so having a good understanding of how to interact with smart contracts is going to be instrumental, as is being able to build dashboards like what Zapper and others are doing. I really have my work cut out for me. I love Python, but I’m going to have to work with some of these other languages if I want to be a world-class developer.

    Tonight I think I’m going to spend some time working on Ethernauts. I need to make some adjustments to my Solidity workflow to make things work a little better, and it should give me a better understanding of how to interact with Ethereum programmatically.

    Evening notes

    Trade plan programming

    I’ve been working on my trade planning Python module the last couple days, and already the project is becoming rather complex. I say it’s a trade plan module, but really it’s a capital preservation ‘brake’, if you will.

    The basic idea behind the module goes like this (a rough sketch in code follows the list):

    • Get balance list and filter empty ones.
    • Get last symbol/BTC market price.
    • Calculate total BTC value of all holdings.
    • Get open orders for each market. For each, look for limit orders, and calculate the covered/uncovered amount in BTC.
    • Make sure that no uncovered position accounts for more than two percent of total portfolio value, and that no more than six percent of the portfolio value is uncovered. If either limit is breached, do not allow any additional buys.
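
    Here is roughly how that brake could look in code. This is only a sketch: the function name and data shapes are simplifications of what the real module does, and fetching balances and orders from the exchange is assumed to happen elsewhere.

    # Sketch of the capital-preservation 'brake': block new buys if uncovered risk
    # exceeds 2% per position or 6% of the whole portfolio (all values in BTC).
    MAX_POSITION_RISK = 0.02   # no single uncovered position over 2% of portfolio
    MAX_PORTFOLIO_RISK = 0.06  # no more than 6% of the portfolio uncovered in total


    def allow_new_buys(balances, prices, open_orders):
        """balances: {symbol: amount}, prices: {symbol: BTC price for every held symbol},
        open_orders: {symbol: amount covered by protective limit orders}."""
        # Filter out empty balances and value everything in BTC.
        positions = {s: amt * prices[s] for s, amt in balances.items() if amt > 0}
        total_value = sum(positions.values())
        if total_value == 0:
            return True

        uncovered_total = 0.0
        for symbol, value in positions.items():
            covered = open_orders.get(symbol, 0.0) * prices[symbol]
            uncovered = max(value - covered, 0.0)
            if uncovered / total_value > MAX_POSITION_RISK:
                return False  # one position carries too much uncovered risk
            uncovered_total += uncovered

        return uncovered_total / total_value <= MAX_PORTFOLIO_RISK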

    The last couple days I’ve been slowly working through everything, following a strict TDD methodology to make sure the code is covered, monkeypatching and mocking calls and creating fixtures for the exchange data. Now I’m getting to the point where I don’t know how to proceed, and I’m getting frustrated.
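
    For what it’s worth, the test style looks roughly like this: a simplified sketch that reuses the allow_new_buys function from the sketch above. ExchangeClient is a stand-in for whatever client the module actually wraps, and the numbers are invented.

    # Sketch of the TDD setup: a pytest fixture supplies canned exchange data and
    # monkeypatch swaps out the live API call so tests never hit the exchange.
    import pytest

    class ExchangeClient:
        def fetch(self):
            raise RuntimeError("tests should never hit the live exchange")

    @pytest.fixture
    def exchange_data():
        return {
            "balances": {"BNB": 10.0, "IDEX": 1000.0},
            "prices": {"BNB": 0.003, "IDEX": 0.0000045},
            "open_orders": {},  # nothing covered by protective limit orders
        }

    def test_blocks_buys_when_uncovered(exchange_data, monkeypatch):
        client = ExchangeClient()
        # Replace the live call with the fixture data.
        monkeypatch.setattr(client, "fetch", lambda: exchange_data)
        data = client.fetch()
        assert allow_new_buys(data["balances"], data["prices"], data["open_orders"]) is False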

    I don’t know where the problem arises in times like these, but I have a feeling it comes from a lack of proper planning. I start out with a few procedural calls, then I get to a certain point of complexity where I have to start refactoring into classes. Or I don’t know what to do next, so I cobble some code together without writing a unit test first and break my flow.

    All I can do at times like this is take a break.

    Binance token mooning

    The Binance token has been on a bit of a tear the last few days. Apparently they’ve launched their own EVM-compatible Binance Smart Chain and are hoping to go after the DeFi space. Good luck to them.

    I took a look at the validator instructions earlier to price out the cost of being one. It costs 10,000 BNB tokens, or about 300 BTC ($3 million), plus about $244/month in AWS costs. That’s still an order of magnitude cheaper than running a $30 million Serum DEX node, but it shows the type of centralization that we’re going to be seeing with these projects. I’ll keep running my puny IDEX node and work toward my 32 ETH so I can run an Ethereum 2.0 node.

    I’ve been holding my BNB tokens for two years, and they just touched my cost basis after spending so much time underwater. Since I’m actually trying to follow my capital preservation rules, I’ve had to put a tight stop on this latest run. I’ll have to figure out how to account for entry cost in my trade plan program, since right now I’m just looking at the percentage of the total. That may not work well when things start mooning and I have to recalculate on the run-up.
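
    One idea I’m toying with, just a sketch and nothing settled: measure a position’s risk by its distance to the stop rather than by its full market value, so a position sitting on a tight stop doesn’t count the same as a fully uncovered one. The numbers below are made up.

    # Sketch: fraction of the portfolio actually lost if the stop is hit.
    def position_risk_fraction(quantity, price, stop_price, portfolio_value):
        at_risk = max(price - stop_price, 0.0) * quantity
        return at_risk / portfolio_value

    # Example: a position with a tight stop after a run-up (invented figures, in BTC).
    print(position_risk_fraction(quantity=10, price=0.0030, stop_price=0.0027,
                                 portfolio_value=0.5))   # 0.006 -> 0.6% of the portfolio at risk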

    Jumping into the DeFi deep-end

    I’ve decided that the opportunity cost of keeping my funds in BlockFi is just too great, and I’ve initiated some withdrawals. I’ll be putting the entirety of the funds set aside for my kids into the sBTC vault later this week, for a modest 40% APY. I must have stared at the withdrawal screen for five minutes before I could push the submit button, and I read the wallet address over and over, three or four times, to make sure it was right.

    It’s stressful, being your own bank.

    Anyways, I’ve still made no decision on my cold storage funds. I’m risking way more than two percent on this vault, and any more would be irresponsible.

    Famous last words.

    No rest on Labor Day

    Today is the first day of school for Elder, and one that will be entirely online. She’s got afternoon sessions, which means that Younger will be looking to me as a playmate. I don’t think I’ll be getting much, if any, work done at all. We’re lucky she got the afternoon spot though, cause the eight-to-noon spot would be even worse.

    Yesterday morning I produced a Labor Day Breakfast for the local political party. It was on Zoom, and they called me about three days out to put it together. I used OBS as a virtual camera and queued up a half dozen videos in it so that we weren’t struggling with Zoom’s screen-sharing. I just set the scene in OBS, put the spotlight on my video stream, and hit transition. Voila!

    It went really well, but it was harrowing. There were probably close to two dozen speakers, including our Senator, several Representatives, and numerous state officials. I managed to keep it on track, except for some of the speeches; by the end I was playing people off, improv-style. The whole thing ran about two hours. One attendee remarked that it was the best-run Zoom meeting they had ever seen, and there were over one hundred people on it at one point.

    Then as soon as I was done with that I had to get ready for a pool party, which meant I had to run to the grocery store for last-minute shopping. I only picked up a few things, but the store was crowded. As soon as I got back Missus loaded the kids up and we went to pick up her mom and headed off to the party.

    I didn’t get to enjoy it for long.

    About an hour after we got there, I got a message from a political committee member that the website had been “hacked”. I pulled it up and was met with the bare directory listing for the WordPress site. Index.php was missing, so my first thought was that it was a failed upgrade. I tried to pull up my management console on my phone to restore a backup, but the most recent one was months old. Oops. So much for relaxing.

    I’m not going to get into the details of the hack or the recovery, I think this was a simple case of credentials being leaked. There were too many people that had access to it, and the committee secretary was given the site admin credentials to use to post on the page. Yikes!

    I cleared that situation up with them and urged everyone to check their antivirus. One of the committee chairs was running a Mac with nothing but Malwarebytes on it. I swear. I locked the site down as much as I could with free versions of Sucuri and Ninja Scanner, so I think things are cool for now. This is the second time this site has been hacked, though, so I’ll have to keep an eye on it.

    I just checked the login attempts on the site, two attempts from Rio De Janeiro overnight.

    Liquidity

    Spent some time today delving into Uniswap. Here’s a couple of posts that have some good information:

    Understanding Uniswap Returns

    An Introduction to Automated Market Makers

    I had a bit of a flash this morning that I should probably start exiting my IDEX position into ETH, specifically the yETH pool, but it turns out that Yearn has halted deposits on the pool. I’m glad I got my little test deposits in when I did.

    Still, I was looking at the best way to exchange my tokens. On IDEX, obviously, but I have never actually used them since they implemented accounts, so I can’t trade there as of now. Binance has trade pairs to BTC, but that would involve another trade. Then of course, there’s Uniswap, so I took a look and found an IDEX-ETH trading pool.

    The liquidity here is not very impressive, and I see an opportunity to provide some, although I still don’t understand how the assets in the pool are staked together. I would assume that the pool would need to be 1:1 in value between the pairs, but it actually looks to be about 1:2 in terms of the USD value of IDEX to ETH. I’m not going to put any more capital at risk until I understand what this “divergence loss” is and how I can keep from being affected by it.
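
    For future reference, the commonly quoted formula for divergence loss in a 50/50 constant-product pool (standard AMM math, not anything specific to this pool), where r is the ratio of the token price at withdrawal to the price at deposit:

    # Divergence ("impermanent") loss relative to simply holding the two assets.
    from math import sqrt

    def divergence_loss(r):
        return 2 * sqrt(r) / (1 + r) - 1

    for r in (1.25, 1.5, 2, 4):
        print(f"price ratio {r}x -> {divergence_loss(r):.2%}")
    # 1.25x -> -0.62%, 1.5x -> -2.02%, 2x -> -5.72%, 4x -> -20.00%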

    I also spent some time looking for arbitrage opportunities. There was a bit of a price divergence between the IDEX exchange price and the Uniswap price, but the liquidity is so low that trying to fill a large order would push the prices back to par, and dealing with small amounts would see any profits eaten up by gas fees.
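
    A quick back-of-the-envelope check of why the small trades don’t work (the prices, order size, and gas figure below are invented for illustration):

    # Toy arbitrage check: gross spread on a small order versus fixed gas costs.
    # Every number here is made up; slippage on the thin pool would only make it worse.
    order_size = 3_000                 # IDEX tokens
    price_idex_exchange = 0.000105     # ETH per IDEX on the order-book exchange
    price_uniswap = 0.000100           # ETH per IDEX on Uniswap
    gas_cost_eth = 0.03                # two swaps' worth of gas, roughly

    gross_spread = order_size * (price_idex_exchange - price_uniswap)
    net = gross_spread - gas_cost_eth
    print(f"gross {gross_spread:.4f} ETH, net {net:.4f} ETH")   # gross 0.0150, net -0.0150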

    So for now, I’ll take no action while I wait for a bit of a price recovery on IDEX and explore other opportunities. I’ve given up trying to get the Monero blockchain running locally and have it syncing on a cloud server. What was taking over a week with my SATA stripe array looks like it’ll take a few hours in the cloud.

    Other than that, I’ll be working through Mastering Ethereum, trying to understand how smart contracts work, how to design my own, and how to build programs that interact with them.

    Generating spelling flash cards with RemNote

    Making alphabet and spelling flash cards with a little help from regex

    I’ve been getting used to RemNote for a little over a week now. I haven’t really gotten too far into it yet, just taking notes and trying to link things up. I haven’t played with the spaced repetition features; I used Anki to get through an accounting class a few years ago, but I haven’t really felt the need for it with anything I’ve been dealing with lately. I may start using it for certain CLI commands at some point. We’ll see.

    I did start trying to use it for Younger and Elder, though. I set up a document for the alphabet and filled it out like so:

    A:: A
    B:: B
    C:: C

    And so on. It doesn’t look like there’s a way to create these cards without having something on either side of the double colons, so I just filled it in with the letters on each side. Of course, Younger can’t do these by herself, so I have to sit there with her and push the answer buttons for her. It’s been working okay so far; it takes a couple of minutes, and the app makes a nice little fireworks display when you hit your daily goal. She loves it. It of course makes her big sister a little jealous, so I had to find a way to do one for her as well. We settled on third-grade vocabulary words.

    I found a couple lists online, but I wasn’t trying to copy and paste two hundred words into the proper format, so I did what any programmer worth their salt would do: regex.

    Take a list like the following:

    additional	event	region
    agreeable	examine	repair
    argue	example	ridiculous

    We want to separate the non-whitespace \S from the whitespace \s, capturing each in its own ( ) group: (\S+)(\s*). Then we can substitute, using \1 as shorthand for the first group: \1::\1\n. This gives us the following output, which exports perfectly into RemNote (a scripted version in Python follows the output):

    additional::additional
    event::event
    region::region
    agreeable::agreeable
    examine::examine
    repair::repair
    argue::argue
    example::example
    ridiculous::ridiculous
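
    The same substitution can be scripted in Python if the list ever needs regenerating (a quick sketch; the input filename is just an example):

    # Turn a whitespace-separated word list into RemNote "word::word" lines.
    import re

    with open("third_grade_words.txt") as f:     # example filename
        text = f.read()

    cards = re.sub(r"(\S+)(\s*)", r"\1::\1\n", text)
    print(cards)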

    Now, while this works fine from a technical perspective, it’s a bit flawed in execution. Elder can’t see the words that she’s trying to spell, obviously, so I have to read them to her while she sits across the room from me. That causes her to miss the reward, the fireworks, and it caused a bit of distress on her part.

    So here I am now, brainstorming ways to generate audio files for these words so that I can put them in with the cards. Do I read a list of 200 words and then go through the editing process to separate them into individual files and attach each one to the proper card, or is there a way to program and automate all of this?

    Of course there is. There’s a Python module for the Google Text-to-Speech library, so I could generate the files in a few minutes. Then it’s just a question of importing them into RemNote. Unfortunately, RemNote doesn’t seem to support uploaded or local audio files, so I would have to either host them somewhere like an AWS bucket or just use something like Anki, which supports audio within the card decks themselves. We shall see.
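
    A sketch of what that generation step could look like with the gTTS package (assuming gTTS is what I end up using; the word list and output folder here are placeholders):

    # Generate one MP3 per spelling word with Google Text-to-Speech (gTTS).
    from pathlib import Path
    from gtts import gTTS

    words = ["additional", "event", "region"]   # placeholder list
    out_dir = Path("spelling_audio")
    out_dir.mkdir(exist_ok=True)

    for word in words:
        gTTS(text=word, lang="en").save(str(out_dir / f"{word}.mp3"))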

    I’ll have to keep quizzing Elder on my own now, she seems to do better with the one on one time anyways. I’ll be sure to share any updates.

    Storm watch

    Hurricane Isaias is making itself known. Wind gusts are pounding the house, making it shake like a freight train. The girls are up, Missus let them start a movie this morning despite my protests. She woke up early because of the storm and apparently isn’t planning on doing any work till later this morning.

    Alerts have been popping up on my phone all morning as our managed servers go dark across the board. Internet and power have been dropping throughout the region as the storm moves through. It’s not really that much more work for me, since there’s not much I can do about it. Hopefully I’ll be able to get some work done on my two main goals at work: converting a client over to Microsoft’s mobile device management, and building a C++ build pipeline for some embedded controller software.

    The RMM vendor that we work with integrated IBM’s MaaS360 product into their offerings two years ago, and we signed on one of our clients for it. It was a bit more involved than we expected for such a small deployment. We had to get a management certificate issued from Apple, which wasn’t too bad, but then we had to manage eleven Apple IDs, one for each user, before we could even enroll the phones, which involved downloading a special management app and profile. The client wanted content filtering on the phones, which meant deploying MaaS360’s Secure Browser and several more steps. We thought we were done, and I just ignored the deployment until about a month ago.

    The client contacted me about installing a new service app on the phones, and after figuring out how to log in to the management portal I found that nine of the eleven mobile devices hadn’t checked in, some in over eighteen months. After contacting my RMM vendor for support and getting frustrated at their lack of knowledge, I started searching for solutions. I knew Microsoft had been offering some options through O365, and since almost all of our clients are 365 clients, I figured any solution that can be managed through it would be a plus. What I found is that the latest MDM offerings, included free with O365, actually give us a lot of what we need: security profiles on the device itself and the ability to control the software installed on it. I did a quick test with our O365 tenant and my personal device, and I’ve been holding on to a client phone for about a week to test and document procedures so that they can set up the rest of the devices. I’ve been talking to other MSPs in our network, and let me say there’s a lot of interest in the fact that I’ve been able to set up federation between O365 and Apple Business Manager.

    The other project I’m trying to work on involves setting up automated deployments for a development project. The developer workstations are based on an Ubuntu 16 VirtualBox image with a custom IDE and hardware libraries installed. The setup process runs about five or six pages and hasn’t been replicated by the client, so I’m hoping to go through the document and create a full script that can be rerun to set things up for new employees, or whenever the developer config changes. I’d like to get them up to Ubuntu 18 at a minimum, but the eventual goal is to make sure we have a build process that exists outside of the IDE and can be automated via a build job as part of the version control process.

    The problem I was running into is that my own computing resources are kind of limited right now. I already run my Windows workstation in an Ubuntu KVM instance, so running another VirtualBox VM wasn’t really an option. So I decided to use some of the Azure credits I get from my Microsoft service provider benefits. I recently used an Azure VM to stage an on-prem domain deployment, scripting it out using Desired State Configuration (DSC). I was able to validate my AD and DHCP scripts on the Azure server, then copy the files down to the on-prem server, run them, and have my deployment up and running in about an hour. The scripts will need some improvements before they’re really useful, but it’s a start.

    So before I got started yesterday, I decided to explore deploying my VM via the Azure CLI. I went through a couple exercises yesterday to practice, and today I’m ready to get started with the actual projects.

    A couple of days ago, a marketing employee at Zombie mentioned to me that she was thinking about becoming a technician, and I told her to look at cloud engineering tracks, cause AWS and Azure jobs are among the highest paying and most in demand, aside from data scientists. Spurred by my own comments, I started exploring the training options for AWS and began going through the AWS Cloud Practitioner track. The exam is only $120, so why not. I actually prefer AWS over Azure cause of the pricing — good luck finding a $15 a month Azure VM! — and want to really have a handle on it since that’s where I’ll probably be focusing my own entrepreneurial projects. I’m still locked into Microsoft at work, so learning Azure is going to help me, but everything Microsoft does is convoluted and complicated.

    Will having a handle on both AWS and Azure make me a double threat? Doubtful, since I wager most large shops will use one or the other, not both, but that’s just my situation now. So I’m stuck between the two. Jack of all trades, master of none.

    Fast, good, cheap

    Pick any two

    It’s been a little over a year since I started blogging in earnest. I’ve been taking a look at the archives from last July to see what I was writing about back then. When I started, I think I gave myself a three-hundred-word target, just to get in the habit. Today, these posts routinely run two to three times that length, with some in excess of fifteen hundred words. The content of those early posts was more focused; I had the habit of writing a post for every book or magazine that I read, but today these are mostly journal exercises.

    My most popular posts have been on technical issues; two of them, about a WordPress hack and a Windows server issue, seem to drive most of the traffic here. My exploration into Facebook’s Prophet machine learning tools gets another trickle. I’ve yet to find a focus for this blog beyond whatever strikes my fancy for the day, and I’m content to continue with it as is, making small adjustments as necessary. However, they say that no one ever got where they wanted to go without a plan, so some facsimile of a plan might have to come together at some point if I want this to be part of a long-term career strategy.

    For now, it serves enough for it to be a place where I practice my writing muscle. If I write, I am therefore a writer, so it goes, and every day that I write the better I get. I’m closing in on three hundred posts here, including ones older than a year old. (This count doesn’t consider the archival posts that are monthly roll-ups from the previous incarnation of my WP database.) I’m hoping that by the time I reach five hundred I’ll be even better. We’ll see if the traffic to this blog increases along with it. Time will tell.


    The kids have been incredibly difficult this morning. We all got up at pretty much the same time, and I was unable to get much done until after they left for their grandmother’s house. Younger has been especially sensitive, but both of the girls seem intent on making a sport out of disobeying me. I couldn’t get either of them to do their studies, and at one point I had them both taking timeouts in the kitchen, which they turned into a game of trying to make each other laugh while I made lunch. I shouldn’t be mad, but I did lose my temper briefly from having to repeat myself while being ignored. Hopefully they’ll be better behaved when they come back.


    I’ll admit that part of the reason for the discord in the house is a text I got from my WordPress client basically firing me from the project. When we set out, I thought I had made it perfectly clear that this was not going to be done quickly. I believe my exact words were something to the effect that we were sitting in the cheap and good corners of the project triangle, and that if we needed to move to fast, they should let me know. As we entered the third month of our engagement, they let it be known that they were frustrated with the pace and that I had expressed some doubts about my ability to deliver the project. I had expressed some frustrations about the work I had inherited, mostly due to the amicable arrangement we had started out on.

    I think one of the major mistakes I made taking on this project was not properly scoping it and setting expectations. Another WP developer in my area charges twelve hundred dollars for a basic, four or five page WP site, and this project involved a major redesign and restructuring of an existing site. Easily a six month project at the rates I was charging. That obviously wouldn’t have flown if I had proposed that at the beginning.

    I did identify several aspects of the redesign that I wasn’t going to be able to deliver on my own, mainly image assets. I was having a hard time gathering stock photography to match what they were asking me for. When I made this clear to the client, and told them that delivering everything I felt needed to be done within the accelerated timeline was going to be difficult, they told me that they had other developer resources that we could bring in. I said by all means.

    This hasn’t gone quite the way I hoped it would. In anticipation, I wrote up a project summary, invited the outside dev to my Basecamp, where I had all of the project notes and tasks, and spent several sessions building out a backlog of things that needed to be done. I told the dev, a PHP and Laravel developer from Pakistan, that I needed their assistance with one particular task: setting up the MemberPress plugin for us.

    It doesn’t seem that any of that has even been considered. When I got the text, to the effect that development would proceed from scratch due to the difficulty in determining what I had done, I checked logs for the staging site and saw that no one besides myself had even logged into it. So something else appears to be going on. I suspect that besides the English language barrier, the outside dev might be more of a Laravel developer than a WordPress one. And I find it highly ironic they’re starting from scratch, when I literally spent two months trying to figure out what the last dev did.

    I’m trying to tread a fine line here, given that this engagement is with someone I consider a friend. We had gotten into some heated discussions about this, and you know the old saw about mixing business with pleasure. Still, my friend is enough of an intrepid entrepreneur that I considered this a baby step into what should be the start of a mutually profitable enterprise for both of us. When they broached the subject of terminating the arrangement a few weeks ago, I was so bound by a sense of honor that I basically volunteered to finish the work for free. That’s why this morning’s message stung so much.

    I replied with as much tact as was possible given the cortisol flowing. I told them that the outside dev hadn’t even given a cursory look at what I had done, and I asked that they take another look at the progress I had made in the past few days before they pulled the trigger on a redesign. Further, I said, even if they did insist on moving forward with a new project, I intended to continue my development on the staging site until I was satisfied that I had fulfilled my promise to deliver the redesign and the membership features by the end of next week.

    This project has taught me a lot already, both about WordPress development and about managing client expectations. I have got to spend more time focusing on the business side of the relationship and establish some formal contracts and work blueprints so that expectations are better managed up front. For now, I’ve got about twenty hours of work left in the month in which to deliver and salvage this project. Failure is not an option, and neither is ruining this friendship.