Running Solana

Well we are off to a good start this weekend. I went to bed at a decent hour, got a great night’s sleep, then woke up and ran three miles, meditated, and am doing my morning pages. There’s a lot to do today, it’s the first weekend of the month, so I need to do the family business: transferring funds from cash to crypto accounts, balancing the house accounts, and making sure I deposit my share of the rent.

Yesterday wasn’t a total disaster, in spite of my efforts from Thursday night. I managed to fix an issue with one of the SAIA token accounts that prevented me from making a transfer. I had a call with our VC team, and I managed to work through the Solana Hello World program on Figment.io.

I figure I’ve spent enough time messing around with Rust tutorials and manually trying to parse transactions via the explorer, and that it’s time to really delve in. I spent the last two days digging into the Solana web3 and Serum DEX libraries, reading Rust code and trying to figure it out from there. I’ve studied enough; the only way for me to learn more is by doing.

It only took me a few hours to get through it. The Figment system is pretty nice; it’s basically a huge web app that has modules for various chains like Solana, Polkadot, and many others. Strangely enough they haven’t added an Ethereum one, but I suppose those are easy enough to find. I’m not a total noob though, so I had many of the prerequisites in place, and I already knew at a high level how most of the technical details were supposed to work.

Figment takes the Hello World program and removes certain functions or parameter calls, forcing you to figure out which ones to use and how to parse them. I might have been trying to go through them too fast, because I cheated a bit and looked ahead at the solutions to see how things worked. Unlike my work with Exercism, where I would sometimes take days to figure out a solution and make something that worked before peeking at the community solutions.

So I got through the track last night. Some of the things aren’t quite clear to me. Compiling and deploying a Rust program to the blockchain is straightforward enough, but this is a web3 tutorial after all. The way that program accounts are created is something I’ll need to go back over. The way program instructions are parsed is still a bit incomplete in my head, especially when it comes to figuring out which accounts, pubkeys and secrets need to be passed along where.
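To pin that down for myself, this is roughly the shape of the client call from the hello world example: the instruction carries no data at all, and everything interesting is in the account list, i.e. which pubkeys get passed, who signs, and what’s writable. The connection, payer, and pubkeys here are placeholders.

```typescript
import {
  Connection,
  Keypair,
  PublicKey,
  Transaction,
  TransactionInstruction,
  sendAndConfirmTransaction,
} from '@solana/web3.js';

// Rough sketch of the hello world client call. The program takes no arguments,
// so the instruction data is empty; the only thing passed is the account list.
async function sayHello(
  connection: Connection,
  payer: Keypair,            // signs and pays the transaction fee
  programId: PublicKey,      // the deployed hello world program
  greetedPubkey: PublicKey,  // the program-owned account whose counter gets bumped
): Promise<void> {
  const instruction = new TransactionInstruction({
    keys: [{ pubkey: greetedPubkey, isSigner: false, isWritable: true }],
    programId,
    data: Buffer.alloc(0), // no instruction data at all
  });
  await sendAndConfirmTransaction(connection, new Transaction().add(instruction), [payer]);
}
```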

But I finally know what borsh is, and a bit more about how Typescript works. I’m definitely not an expert at any rate, so I’ll probably head back to the Solana developer resources and work through another starter tutorial later.

I’ve been reading through the spl-token Rust code to get used to the type of operations that are done there. I’ve been using the multisig CLI and I want to see if there are ways we can call functions on chain using web3. Anchor has a multisig implementation also that allows signing ‘arbitrary’ transactions, but up until today I really had no idea how I might create one of those. But after digging into the Serum Rust contracts, as well as the various web3 libraries that I’ve been working with for that, I’m starting to get a bit of an idea as to how that might work. I also took a deep dive into the Solana Explorer code, and learned more about how we might retrieve and parse market data events going back in time to perform cost basis calculations or fill in the gaps in our indexer history.

It might be slow going, but just like running, I’ve just got to keep putting one foot in front of the other.

Challenged

I have been a bit out of control lately. I’ve been drinking a lot of beer. Missus and I went through two cases, thirty total in three days, and I wound up going to the store at eight-thirty last night to stock back up. I’ve been in hangover city this morning and have finally decided it’s time to get up. Missus is in bed, she called out.

My first mistake, as always, is buying beer in the first place. I haven’t worked out at all this week except for a run on Monday. I almost went for a jog yesterday afternoon since the weather was cool, but I had eaten a big lunch and was worried I wouldn’t be able to finish. I haven’t lifted in a week probably.

I don’t know why I drink. I know I get triggered if Missus drinks, but I’m not really sure why. FOMO, I guess. I don’t think I get stressed out and say man, I really need a drink, I just do it, mechanically. I know I’m poisoning myself, slowly, and the effect on my pocketbook is probably significant as well. I’ve probably spent seventy dollars this week and we’ve still got the weekend.

Maybe I am stressed out more than I tell myself. Programming is hard, and trying to work on coding projects with a beer in my hand, gulp gulp gulp. My Perpetual funding is flat this week, glug glug glug. Kids are being PITAs, you get it.

I’m not quite sure what I’m going to do today. I started working through some Solana developer tutorials, so I’ll probably work on one of those. I keep banging my head against this problem of transaction history, and I’m not even sure I’m looking at it properly.

The problem I’m looking at is ultimately about Serum market history, but I’ve chunked down to something more basic, namely the state of a Solana program account at a particular point in time. The hello world contract is an example that I’m probably going to delve into. It’s basically a greeter contract. You call it, it responds with “hello”, and increments a counter variable that you can query to find the value. Ultimately, I want to be able to look back at a particular transaction in the past when the contract was called and determine what the state of the counter variable was. Ethereum explorers will show the internal state change, but I don’t think such a thing exists on Solana. I haven’t been able to find it.
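For reference, reading the current counter is simple enough; this is more or less what the tutorial’s client does: fetch the account and deserialize it with the borsh schema (a single u32). The catch, and the whole point above, is that this only ever gives you the value right now, not what it was at some past transaction.

```typescript
import * as borsh from 'borsh';
import { Connection, PublicKey } from '@solana/web3.js';

// The greeter's account state: a single u32 counter, borsh-encoded.
class GreetingAccount {
  counter = 0;
  constructor(fields?: { counter: number }) {
    if (fields) this.counter = fields.counter;
  }
}

const GreetingSchema = new Map([
  [GreetingAccount, { kind: 'struct', fields: [['counter', 'u32']] }],
]);

async function readCounter(connection: Connection, greetedPubkey: PublicKey): Promise<number> {
  const accountInfo = await connection.getAccountInfo(greetedPubkey);
  if (!accountInfo) throw new Error('greeting account not found');
  const greeting = borsh.deserialize(GreetingSchema, GreetingAccount, accountInfo.data);
  return greeting.counter; // only the current value; past values aren't queryable this way
}
```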

But maybe I’m overthinking it. Even with this limitation, I may still be able to determine the state of a particular contract by going back and replaying its history from genesis. I just don’t have the tools yet. I’m just scratching the surface of the Solana API; I’m nowhere near the point of being able to go back and retrieve every transaction that touched a pubkey, let alone parsing Serum’s instructions. So I’ve got to attack the problem on several fronts. It’s a huge technical problem, and I understand so little of it right now.
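The one piece I do understand is how to walk a pubkey’s signature history, which would be step one of any replay. Something like this, where the RPC returns newest-first and you page backwards with the before cursor:

```typescript
import { Connection, PublicKey, ConfirmedSignatureInfo } from '@solana/web3.js';

// Page through every transaction signature that touched an address.
// Results come back newest-first, max 1,000 per page, so keep passing
// the last signature as the `before` cursor until nothing comes back.
async function allSignatures(
  connection: Connection,
  address: PublicKey,
): Promise<ConfirmedSignatureInfo[]> {
  const all: ConfirmedSignatureInfo[] = [];
  let before: string | undefined;
  while (true) {
    const page = await connection.getSignaturesForAddress(address, { before, limit: 1000 });
    if (page.length === 0) break;
    all.push(...page);
    before = page[page.length - 1].signature;
  }
  return all; // each signature can then be fed to getTransaction() and decoded
}
```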

It’s no wonder I’m stressed.

Black hole

Yesterday was a decent day. I worked on our automation scripts that we’ll use to update our Serum indexer and front ends. I haven’t figured out quite how to fully automate it yet, but now I’m generating JSON and TypeScript files directly instead of using print statements to output them on the screen. I’ve got two more functions left to do and I’ll be happy. I’ll just need to copy the files into the respective repos and push them up to make the changes stick.

There are a bunch of hoops that I have to go through though. The JSON feed that we’re using doesn’t have all the markets that we want, so I have a prelude of sorts that I’m appending the dynamically generated content to. On top of that I have to modify the dynamic content as well to correct a perplexing design decision: the same symbol used for two different items. I’m trying not to get caught up in too many optimizations at this point, but I will need to prepare for some changes in the base currency that’s being used. That will be a challenge since it’s so far outside of what the reference implementation will do.
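The generation side is nothing fancy; it’s basically this kind of thing, where the feed shape and file names are just stand-ins for whatever our repos actually use:

```typescript
import * as fs from 'fs';

// Hypothetical shape of an entry in our market feed.
interface MarketEntry {
  name: string;      // e.g. "FLEET/USDC"
  marketId: string;  // market account address
  deprecated: boolean;
}

// Placeholder for the part that parses the upstream JSON feed.
function buildMarketsFromFeed(feedPath: string): MarketEntry[] {
  return JSON.parse(fs.readFileSync(feedPath, 'utf8'));
}

// Merge the hand-maintained prelude with the dynamically generated markets,
// then emit both a JSON copy (for the indexer) and a TypeScript copy (for the front end).
const prelude: MarketEntry[] = JSON.parse(fs.readFileSync('prelude.json', 'utf8'));
const generated = buildMarketsFromFeed('feed.json');
const merged = [...prelude, ...generated];

fs.writeFileSync('markets.json', JSON.stringify(merged, null, 2));
fs.writeFileSync('markets.ts', `export const MARKETS = ${JSON.stringify(merged, null, 2)} as const;\n`);
```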

Right now the current markets are all tied to USDC, but Atlas Co. is planning on creating ATLAS ones as well. We currently have the markets identified only by the NFT assets, but when the ATLAS markets go live we’ll need to distinguish between the two. I already know how to update the tickers in our Serum indexer; I just need to update the Redis key rename script that I used previously and I can gangload the changes. Updating the market itself can be done relatively easily once the feed has been updated, but the question is how to create the best user experience for distinguishing between the two.
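The rename itself is the easy part; here is a sketch of what that script does, with a made-up key pattern standing in for whatever the indexer actually uses:

```typescript
import Redis from 'ioredis';

// Rename ticker keys like "ticker:FLEET" to "ticker:FLEET/USDC" so the
// ATLAS-quoted markets can live alongside them later. The key pattern
// here is an assumption, not the indexer's real schema.
async function renameTickers(redisUrl: string): Promise<void> {
  const redis = new Redis(redisUrl);
  const keys = await redis.keys('ticker:*');
  for (const key of keys) {
    if (key.endsWith('/USDC') || key.endsWith('/ATLAS')) continue; // already migrated
    await redis.rename(key, `${key}/USDC`);
  }
  await redis.quit();
}
```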

The current market list is already really crowded. We’ve got at least 78 items in the list already. Doubling everything would just be too much. Splitting the market lists with some sort of top-level selector makes more sense, but I’m already dreading the program logic required to do such a thing.
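The data side of it would be simple enough; something like this, with a hypothetical MarketEntry shape, would bucket the list by quote currency for a top-level selector. It’s wiring it through the UI that I’m dreading.

```typescript
// Group the market list by quote currency so the UI can show one
// top-level tab per quote instead of a single giant list.
interface MarketEntry {
  name: string;
  marketId: string;
  quote: 'USDC' | 'ATLAS'; // hypothetical field for the quote currency
}

function groupByQuote(markets: MarketEntry[]): Map<string, MarketEntry[]> {
  const groups = new Map<string, MarketEntry[]>();
  for (const market of markets) {
    const bucket = groups.get(market.quote) ?? [];
    bucket.push(market);
    groups.set(market.quote, bucket);
  }
  return groups;
}
```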

I also need to update the DEX code to make sure that we can even take fees in another base currency. It’s currently coded for USDC and USDT, so I need to add some parameters for ATLAS as the base currency. It shouldn’t be too bad; Raydium currently has several markets for their token, so I know it’s not impossible.

I have a lot of things to add to the exchange project’s Kanban board, that’s for sure. I’m not sure how much I’ll prioritize these things as I have a couple other Solana-related projects that I’ll be looking at. One is to really delve into the RPC API and start looking at accounts and transactions to see if I can recreate history. The other is to explore the Metaplex monorepo and figure out how Candy Machine and Fair Launch work.

I’m really getting pulled in now.

Undecided

Hopefully today can be a bit of a reset from yesterday, which just wasn’t that great of a day. I got some good sleep — Younger slept in her bed all night yay! — and didn’t waste an hour at the bus stop this morning. I meditated and have my tea, and am ready to sit down and start cranking out some work.

The question at this point is on what?

StarAtlas.exchange is up, but is not quite racking up the referral fees. We did the math: we need $3.2m of post orders to clear on there if we’re going to turn a profit on it, which could take a while. We don’t really have a marketing plan, so unless I’m going to spend all day pumping it in the official SA market channels, it’s not going to generate the needed traffic. There’s a list of things that we can improve on; my main concern is getting the cost down on our EC2 instance, but that doesn’t really matter if we still need to keep a $500/month RPC endpoint. I spun up a new instance last night and tested using Elasticache instead of Redis, but I need to figure out how to migrate the production data set to it and switch everything over. Plus I still need to figure out how to automatically update the markets when Atlas Co. adds them.
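For the migration itself, my rough plan is a DUMP/RESTORE copy between the two endpoints, something like this (endpoints are placeholders, and I haven’t actually run this against production yet):

```typescript
import Redis from 'ioredis';

// Copy every key from the old Redis box to the Elasticache instance,
// preserving TTLs. Assumes the target is empty; endpoints are placeholders.
async function migrate(sourceUrl: string, targetUrl: string): Promise<void> {
  const src = new Redis(sourceUrl);
  const dst = new Redis(targetUrl);
  let cursor = '0';
  do {
    const [next, keys] = await src.scan(cursor, 'COUNT', 500);
    cursor = next;
    for (const key of keys) {
      const dump = await src.dumpBuffer(key); // serialized value, binary-safe
      if (!dump) continue;
      const ttl = await src.pttl(key);        // -1 means no expiry
      await dst.restore(key, ttl > 0 ? ttl : 0, dump);
    }
  } while (cursor !== '0');
  await src.quit();
  await dst.quit();
}
```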

I could spend a lot more time working on it. The charts are acting weird, they’re not showing gaps if there are no sales, and are showing strange volume activity. I’m not sure how accurate they are. And there are a lot of cosmetic enhancements that we could add by updating descriptions and making other things more user friendly, but I’m not sure how much time to spend toward that.

One thing that I do think would be helpful long term would be for me to work on pulling historical data from Solana using Quiknode’s new endpoint. I have one from a friend that I can use, but I’d likely be starting from first principles to connect to the API, pull and collate data. I’d probably want to do that using Python, which would pull me away from the JS work that I’ve been doing. Still, working on this could potentially solve several problems: rebuilding trade data for Serum markets and providing cost basis for wallet transactions, just to start. So even if I couldn’t figure out how to integrate it into the exchange, it could be beneficial for PnL calculations, which would help me as dao manager. I would like to be able to rebuild Serum market trade data though, and who knows what else we could figure out with onchain data.

There’s a lot of administrative stuff that I need to do with wallet management. I alone control the keys for the ALPHA hotwallet and need to get a Shamir’s secret sharing scheme set up so that I don’t disappear and leave everyone high and dry. I’ve also got the wallet acquisition funds that need to be transferred over to the multisig vaults. I need to convert the privkey to a format that I can load in the CLI and transfer over everything. I’m delaying on that because of the cost of creating new token mints for each of the assets.
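The privkey conversion at least is mechanical; the CLI wants a JSON array of the raw secret key bytes, so something like this should do it (file names are placeholders, and obviously not to be run on a machine I don’t trust):

```typescript
import * as fs from 'fs';
import bs58 from 'bs58';
import { Keypair } from '@solana/web3.js';

// Convert a base58-encoded secret key (the format wallet apps export)
// into the JSON byte-array file that the solana CLI's --keypair flag expects.
const base58Secret = fs.readFileSync('hotwallet.key', 'utf8').trim();
const secretBytes = bs58.decode(base58Secret);

// Sanity check that the bytes form a valid keypair before writing anything out.
const keypair = Keypair.fromSecretKey(secretBytes);
console.log('pubkey:', keypair.publicKey.toBase58());

fs.writeFileSync('hotwallet.json', JSON.stringify(Array.from(secretBytes)));
```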

And to top it off, the most recent new shiny thing: Metaplex. I’ve been asked to assist in a new top secret NFT project that’s related to it. Now this thing pushes all sorts of buttons for me, but it’s very complicated. It involves Solana contract code that I might have to deploy, new frameworks like Next.js, and deployment vendors. There’s new architecture involved with it as well, and the synergy with everything else is quite tempting. Man, is it tempting.

And we don’t even have the Star Atlas mini-game ready. There’s so much chatter in my various groups about how we’re going to build tooling (read: bots) for that, but I think I’m going to leave that to others for now. They’re trying to figure out how to use Selenium with the web3 library to keep fleets running round the clock. There’s only so much we know about that one, so we’ll probably put it on the back burner.

Well, writing this out has made me realize the most important thing I can be doing right now. I need to get to it.

Wasted

Today has gotten off to a bad start.

I went to bed on time and got a decent sleep. I woke up to get Younger ready for school and she gave me a hard time, but I got her down to the bus stop. Yesterday I got a text message from the school that the bus would be late, so I wound up taking her in. There was a similar message this morning but I decided to wait it out as it had only been about fifteen minutes late yesterday. Not so today.

There had been a story in the local newspaper this weekend about restaurants closing because they couldn’t find workers. I’ve been hearing anecdotes and news reports all over about it. People don’t want to work at bullshit jobs anymore. Partially because of the stimulus, COVID fears, and whatever other factors, the Great Resignation is making it hard to fill these jobs. And yes, let’s not forget that I myself have opted out to do... well, what is it I’m doing now?

So I don’t know why I just decided to write a long Tweetstorm on this instead of writing here, but I felt like putting some thoughts down about this supply-chain and labor shortage. I didn’t want to say the word Weimar, but that’s what I’m thinking of, of course.

Anyways, my day was made worse by the fact that I forgot Elder and I have a dentist appointment this morning and that I was supposed to drive Missus into the office. Anyways, that’s in an hour, then I need to drop her off at school, try and get some work done in two hours, pick up Younger at the bus stop, come home, work for two hours, then pickup Missus and Elder.

Yea, I’m not getting a lot done today.

Confidence

I’m feeling great this morning, mostly because I went to bed at a decent hour and slept in the big bed all by myself. Woke up at 5:30 because I told myself I wanted to go running this morning, but lay in bed for an hour anyways before getting Younger up for school. I had a message on my phone from the school that the bus would be late, so I drove her in and was back while people were still standing at the stop.

I decided I’d go for a run anyways. I was a little worried about my left foot, which had given me problems on my last run and was still a bit off, so I tried to cushion it by hitting with more of a mid-strike. I had listened to someone on Profit Maximalist who ran a lot of hundred-mile races, and he’d mentioned that mid- or fore-foot strikes were the way to go long, so we’ll see how it works out. I got back to an empty house, so I did my morning meditation and now here I am.

My brain is pretty much thinking about Star Atlas or Solana now. I was making some changes to the exchange code last night, talking with people about pulling data from Solana, figuring out how we’re going to automate the minigame for profit. Even while I was trying to be still this morning, my brain was just pounding along trying to figure out various things. The problem now isn’t lack of ideas, it’s figuring out which one is the most important and needs my immediate focus. That’s tough.

I finally finished the deck yesterday, spraying the last of it. It looks pretty good, but I can’t say I’m proud of the job. I’m just glad it’s done. Looking back, I’m not sure whether I should have done the project myself or paid someone to do it. I have no idea how many hours I spent on it. And while I might have saved some cash doing it myself, I’m not sure it was the best use of my time. I just have to think of it as exercise, or homesteading. I’ve still got some cleanup to do to get some stain off of the siding, but I’m ready to move on to my next home project, which will be cutting down a tree and some overgrown bushes in the front yard so Missus can redo the landscaping. I still want to build out the side yard for a bigger garden, but I’ve got a lot of stuff to cut down first.

It’s Monday, which means I need to do the weekly dao status report. That usually takes me till lunch, then I need to look at a UI bug on the exchange that has to do with a new design we pushed to dev last night. I’ve got lots of ideas about how to integrate some of the asset metadata into the site. I need to work on our Serum market indexer; it needs backing up, or balancing; I really want to get the database into Elasticache. And there’s a whole list of stuff I need to figure out with Solana with regard to historical data, and long term I need to be able to write my own contracts. There’s a lot going on, but I feel like I can do anything at this point.

Solanian

Well I will see how well I can write this morning given the early morning slumber party going on in the den right now. Yesterday was Eldest’s birthday, and instead of a birthday party we let her BFF stay over. The two of them and Younger have been up the last two hours having a dance party listening to songs, singing along loudly and stomping around the room. It’s cute, but very distracting.

I spent last night delving into the internals of Redis, trying to figure out how to clear out some market data that got entered incorrectly due to an error on my part. I got that done while I was trying to figure out how to get historical data for the Serum event queue. It’s a circular buffer, so once it clears out it seems the only way to get the data is via the Solana explorer, feeding it into the Serum JS library to replay and decode the instructions. I think anyways. I’m waiting for confirmation, but I have no idea whether such a tool exists or whether we’d have to build one from scratch.
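What I can do today is peek at whatever is still sitting in the queue. If I’m reading the serum-ts code right, it’s something like this, and anything already consumed out of the buffer is simply gone, which is the whole problem:

```typescript
import { Connection, PublicKey } from '@solana/web3.js';
import { Market } from '@project-serum/serum';

// Load a market and look at the events still sitting in its event queue.
// Addresses are placeholders; once the circular buffer wraps, older events
// can only be recovered by replaying old transactions.
async function recentFills(
  connection: Connection,
  marketAddress: PublicKey,
  dexProgramId: PublicKey,
) {
  const market = await Market.load(connection, marketAddress, {}, dexProgramId);
  const events = await market.loadEventQueue(connection);
  return events.filter((event) => event.eventFlags.fill); // fills only, skip outs
}
```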

I went to bed reading over the Solana framework docs. I managed to skim through last night and learned a bit. I really don’t care much about most of it, but it did confirm some of my hunches about the ephemeral nature of Solana. A validator node cannot be expected to store a complete copy of the Solana blockchain the way a Bitcoin or Ethereum node does. Apparently they’re going to use Google’s BigTable as a storage medium for data after six months or so.

I woke up and immediately picked up my iPad and started reading the developer docs again, and took a look at the spl-token code again. It made a lot more sense this time; I guess I’ve started picking up some Rust skills after all. One thing that has finally made sense is that the data in Solana program accounts is opaque, meaning that it has to be interpreted by the program that owns it. Looking at the data in the Explorer or SolScan is only going to get you so far in figuring out what the hell is going on.

Coming from Python, I’ve been struggling with ways to make the NodeJS REPL work in a similar way. Import statements aren’t really allowed in the REPL, although requires are. Trying to load functions in real time and use them to debug live, on-chain data is a bit of a challenge. I made a couple of inroads with that last night, and was able to do a bit of playing around in the serum-history module to see exactly what it was doing, but it still confirmed my hunch that Serum event data is ephemeral.
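For anyone fighting the same thing: a bare node REPL chokes on import statements, but require() works, and recent versions let you await right at the prompt, which is enough to poke at live accounts line by line. The endpoint here is just the public one.

```js
// Inside a plain `node` REPL session:
const { Connection, PublicKey } = require('@solana/web3.js');
const conn = new Connection('https://api.mainnet-beta.solana.com');

// Top-level await works in recent Node REPLs, so this runs line by line.
const info = await conn.getAccountInfo(new PublicKey('So11111111111111111111111111111111111111112'));
info.owner.toBase58(); // which program owns the account
info.data.length;      // raw bytes; meaningless until the owning program's layout decodes them
```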

I see a lot of opportunity here if we can build a module that can recreate event data from Serum marketplaces; tracking cost basis is an obvious example. I can tell you that tax season for Solana is going to be almost impossible for people who aren’t taking good notes.

StarAtlas.Exchange

So yesterday was a really proud day for me as we shipped StarAtlas.Exchange to the public. It was the culmination of over three weeks of work by SAIAdao/IA members and myself, and I couldn’t be more proud. I wanted to get it live before the town hall with the Star Atlas team, and was a bit worried that we were going to demo the site with some broken functionality, but we had a couple of fortunate breakthroughs and were able to ship the site completely working and bug free!

The idea came together over the past few weeks as I started putting some tooling in place for my own personal use. We had started off trying to track the price of SAIAdao’s NFT assets and discovered that there wasn’t any simple way to get the price data into our spreadsheet. So we stood up our own Serum price history module in an EC2 instance and used that API to feed into a Google Sheet in which we had been collecting asset market and mint addresses. I figured out that I could build some charts with it as well and that this would be valuable to some people, and by then we had figured that we would just roll our own Serum front end for it and make trade fees off of it.

I had to pick up a lot of stuff to get this up and running. I’d done some React tutorials a few weeks ago so that helped, and I’ve used EC2 before as well. Wrapping my head around how Solana and Serum work has been a huge challenge though, and the long build times of our codebase have made testing changes and troubleshooting very difficult.

Solana’s public RPC issues have also been very problematic. Half the time we couldn’t tell whether our code had a bug or if there were network issues. Settlement in particular wasn’t working during nearly the entire development process, and then just magically started working early yesterday morning. We’ve had inconsistencies with trade execution and wallet operations as well, but some careful testing yesterday told us that our problems were likely due to indexing slowness via the Project Serum API. Thankfully we got the hookup from GenesysGo yesterday and now have our own private API endpoint that we can use.

And let me just say how amazing AWS Amplify is. Webhooks on our GitHub repositories trigger automatic rebuilds that are deployed to CloudFront CDN endpoints within minutes. It’s really amazing.

I’m really impressed how quickly things came together yesterday. We got the domain set up, images and branding updated, and got a lot of support from IA and even some members of the IA community. One of our dao members was able to bring it to Star Atlas’s CEO during the town hall, and we got confirmation that our efforts aren’t going to be thwarted by the team.

I’ve been busy making improvements today as well, trying to shore up our backend infrastructure and get analytics running so we can tell who’s using the site. The good news is that we can see the referral fees rolling into our USDC wallet already. It’s going to take a lot more for us to reach profitability, but we’re in a good position to reap some serious rewards once the SA mini-game launches and people start earning in-game assets.

Mints and markets

I spent way too much time working yesterday. I was trying to get the exchange working, and had some irregularities with certain markets where the token mint wasn’t working properly. I about lost my damn mind.

The Serum web3 library has two files that are relevant: one contains market data such as the name and market ID, and the other is a mint list that maps symbols to the token mints. It seems that the Serum program is able to query the base and quote currency mints from the market on-chain, and the exchange code then does a reverse lookup to display the names.
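The lookup itself is straightforward; something like this is what the UI ends up doing, if I’m reading the library right:

```typescript
import { MARKETS, TOKEN_MINTS } from '@project-serum/serum';
import { PublicKey } from '@solana/web3.js';

// Reverse lookup: given the base mint pulled from the on-chain market account,
// find a human-readable symbol. If the mint isn't in TOKEN_MINTS there is
// nothing to find, which is where an "UNKNOWN" label comes from.
function symbolForMint(mint: PublicKey): string {
  const entry = TOKEN_MINTS.find((token) => token.address.equals(mint));
  return entry ? entry.name : 'UNKNOWN';
}

// MARKETS is the other list: market name, address, program ID, deprecated flag.
console.log(MARKETS.filter((market) => !market.deprecated).length, 'active markets in the default list');
```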

But for some reason, our implementation displays UNKNOWN for the bases. So my mission yesterday was to figure this out. And I failed. Problem number one was that my local development system is too unstable to run the DEX via yarn start. It caused my IDE to become very slow, and the entire system would lock up after an hour or so. So I started doing live edits on the EC2 instance where we run the API. This was less than ideal because I was having to edit through an ssh console, and it was very inefficient. Now for starters, yarn isn’t watching the serum dependency where we’re making the changes, so I’d have to restart the app each time, which takes about five or more minutes. I started getting more and more frustrated because it didn’t seem like anything I was doing was having any effect.

It wasn’t.

The original dex code uses @project-serum/serum as a dependency. We had been making changes directly within the node_modules version of this to update our mints, but this doesn’t work for source control. So the other dev I’m working with pulled it out into a root folder and included it in package.json using a file: designation. It worked great locally and in the Amplify build — mostly — so I thought all was well. The first problem I noticed was that Amplify had node_modules cached in the yml build file, and changes in our project path didn’t get updated. So I removed the cache and it was fine. But I missed something on the EC2 instance.
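For the record, the package.json side of that is just a file: path pointing at the vendored copy, roughly this (the folder name is whatever we called it):

```json
{
  "dependencies": {
    "@project-serum/serum": "file:./serum"
  }
}
```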

The EC2 instance was still looking at the node_modules directory during the build process. Now I’m not sure about the internal workings of yarn, but on my local dev instance, restarting yarn picks up changes in the embedded module. Not so on the EC2 instance. So late last night I ripped out the node_modules folder and tried to rebuild the app. It didn’t find our local version.

So that’s where I am at this morning. I either need to spin up a new EC2 instance to do this work or reset the password on this old server running under my desk. My VM is just not cutting it for this work. I’ve got several discrepancies between instances of the UI that are running, and I’m not quite sure why. I did spend a good chunk of time yesterday experimenting with different RPC servers and CORS issues, as I wasn’t quite sure whether there were network instability issues or whether I was getting rate limited, but I was having all kinds of failed calls. Serum sends a huge number of RPC calls. A good number of them are orderbook updates, but a lot of them have to do with spl-token accounts.

Solana is one complicated beast.

So now it’s back to work, with an awesome hangover, no less. There’s a Star Atlas town hall later today and I had really wanted to have this thing ready to go for it. I got the card validator all set, but I really need to eliminate any crashing bugs before I give out this URL.

Exchanging

I got so caught up with working on staratlas.exchange last night that I didn’t even think about writing until right after my head hit the pillow. I was just too exhausted to get back up, and I figured it was better for me to start writing in the mornings when I have the energy. So I’ll start prioritizing this in the AM now, before I hop on the computer and start working.

Younger is sick, she was complaining about a sore throat and headache, but I knew something was really wrong when she fell asleep before dinner; she never takes naps. We’re keeping her home from school today, but Missus is WFH Thursdays and Fridays now, so I won’t have to manage it all by myself.

Of course the big news that I want to talk about is the progress that I made with setting up the exchange. Two days ago we were able to get the front end to build in AWS Amplify, but the charts weren’t working. Yesterday I figured out that it was because the front end was behind TLS but the data feed wasn’t, and browsers won’t display mixed content. So we needed to figure out how to serve the data feed over TLS as well, and I went with NGINX’s Unit application server.

I’ve never used it before and had to deal with a bunch of issues to make it work. I struggled to get it working on my dev box because of some mismatched Ubuntu sources, which cost me time. It’s controlled by pushing JSON config files to a control socket via curl, which is new to me. I also had to reconfigure node because v10 was still installed as the system default (I override it with nvm), and I had to suss out that the API calls were failing because of the older version. Anyways, I got it to work locally and then had it changed around on our cloud server pretty quickly.

Then I just needed the cert, so I went with Let’s Encrypt and their certbot. I had setup issues. We’re using Route53 for DNS, and you’re supposed to be able to set up certbot to do the DNS validation so that things will auto-renew. I couldn’t get it to work, so I gave up and did it manually. Even that took longer than it should have because I mistyped startatlas in the request and couldn’t figure out why the validation wasn’t working. So I got the cert and loaded it without too much trouble, but the site was down. By this point it was getting late, and I had given up for the night when I realized that I had applied the cert to the listener on port 80. A small change and the API was secure, and the site was functional.

I also spent some time cleaning up the git repo. There’s a lot of, shall we say, experimentation and troubleshooting in those commits, and I didn’t want our idiocy immortalized in the index. So I reset some branches, cherry-picked some commits, and got everything looking nice and tidy. I’m surprised it worked as well as it did. I actually deleted the dev branch from GitHub and sent it back up with the pruned version, and CI/CD did the rest. I have yet to do the same with origin/master. That’s a bit riskier than I wanted to deal with last night.

Earlier in the morning I had attempted to move our market indexer to Amplify, and spun up a micro Elasticache instance to replace our Redis database. I couldn’t get it to build though, which is something I still need to sort out, since that setup should be cheaper to operate than our large EC2 instance. That’s one for the backlog.

So I’ve got a few more checks to do this morning, and some cosmetic changes that we’ll need to make before we’re ready to get user feedback. Right now the site is just a vanilla Serum clone with a few changes, and we need to strip out a bunch of things and customize it. I plan on really digging into the Serum code to figure out some bugs, and to look through any other exchange implementations I can find to figure out which problems other people are having and which are just ours.