Do nothing stupid

It’s been half a year since I published anything publicly. It got to the point that I was no longer comfortable posting, for fear of letting something slip about what was going on at work. It was just safer not to post.

I wish I could say that I kept up the journalling habit, but I didn’t really, not typing things out the hard way, by hand. Instead, I started using Obsidian, a note-taking app, to organize my life. Obsidian has a number of plugins: core ones that ship with the app, and a host of open source community plugins that provide additional capabilities. So instead of writing these journals, I’ve been dictating into Obsidian, using Whisper to transcribe the recordings, and then passing the transcripts to GPT as context for whatever purpose.
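
In case that sounds abstract, the pipeline is roughly this. It’s a minimal sketch rather than my actual setup; the file names and prompt wording are made up, and it assumes the openai-whisper package plus the completions API that was current at the time.

```python
# Rough sketch of the dictation pipeline: Whisper transcribes a voice memo,
# then the transcript is handed to GPT as context. File names and the prompt
# are placeholders, not my real Obsidian setup.
import openai
import whisper

openai.api_key = "sk-..."  # read from an environment variable in practice

# Transcribe the audio memo (any format ffmpeg understands).
model = whisper.load_model("base")
transcript = model.transcribe("voice-memo.m4a")["text"]

# Feed the transcript to GPT with an instruction prelude.
prompt = (
    "Below is a transcript of a dictated journal entry. "
    "Summarize it and extract any action items.\n\n"
    f"{transcript}\n\nSummary and action items:"
)
response = openai.Completion.create(
    model="text-davinci-003",
    prompt=prompt,
    max_tokens=500,
    temperature=0.3,
)
print(response["choices"][0]["text"].strip())
```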

I became quite obsessive about using it: I would transcribe work meetings and use them to create task lists, stakeholder notes, or whatever artifact I needed. The dream was to have a codex of company information that could be used to build an AI assistant. Things haven’t quite come together, for reasons I’ll explain later.

Of course the subtext here is that I was laid off from Star Atlas last week; my last paycheck was yesterday, so now I am officially unemployed. The news was not completely unanticipated. We hadn’t heard anything about VC funding in some time, so the writing was pretty clearly on the wall. I’d also been bothered by economic doomer news, and had been saving up as much cash as I could these last few months. So while I’ve got my short-term financial needs in the bank, I’m a bit lost when it comes to my long-term plans.

I’ve decided to put all the worry out of my head for the next week or two. The kids start school at the end of the month, so I’m telling myself to just be a dad and enjoy the time with them. I pulled them out of day camp this week; Missus is out of town, so it’s just the three of us at the house this week. My hope is to keep them on track with their studies and to return to some semblance of routine next week when they go back to camp, and then school. Last week was such a mess, it just didn’t make sense to pay $160 to keep them at camp while I was at home futzing about.

I’m just not ready to start looking for a new job. I don’t even know what I want to do yet; I’m not ready to quit what I was doing. I’ve put so much mental energy into the SA DAO over these past two years that I can’t just stop and pivot. There are still a lot of things I want to finish. I’ve got a lot invested in SA, and I still believe the team can do it. Now we know they have another year and a half of runway, which is more than I have at this point, but all I want to do is survive. I don’t want to update my resume, go network on LinkedIn, and fill out applications for jobs I don’t want just to qualify for unemployment. The last two years have been a dream, and I’m not ready to wake up.

If I ask myself what I have to show for the last two years of fat times, I think I’d be hard pressed to answer. We’ve travelled a bit: Costa Rica, Niagara Falls, Puerto Rico. I didn’t really buy a bunch of toys (maybe a bit much on video games), and we’ve actually been taking steps to declutter the house. I’d ask where it all went, but I know most of my budget lately has actually been groceries.

I’ve accumulated a fair bit of crypto. I’d have to go back and look at my cost basis before making any claims about whether I’m up or down, but I have a sizable stash of Solana that I’m not in a hurry to part with. That’s probably the main reason I’m not itching to look for jobs in the real world quite yet. I can’t cut my ties from Star Atlas completely anyway; I’m locked for five years, like 99% of people invested in the DAO. So I’m not going anywhere yet.

About 120 people were laid off last week; the contractors technically have another week per their agreements, but regardless. Last week we all moved into a new Discord server to commiserate and console each other. My initial reaction was that we had a great team of people left out in the cold. After some initial conversations, we determined there are a good number of us who don’t need to go back to work right away, and many of us have the personal runway and the desire not to return to the workforce. So while we all deal with the shock of our forced departures, we’re helping each other with resumes and LinkedIn connections, trying to figure out who needs to work and who wants to build something new.

I really haven’t the faintest clue what I’m doing, so right now I’m sticking to the Do Nothing Stupid plan: coast on inertia, don’t risk capital unnecessarily, don’t burn bridges. I’m taking some time to get back up to speed on the latest advancements in the Solana ecosystem, and work with anyone from the old team who wants to build. I’m not sure if anyone else has the stomach to work within the SA ecosystem, but it’s what we know, and there’s nothing saying that we can’t do it. Permissionless blockchains and all that…

Exploring Nature & Technology

Morning journal entry, January 17. Time is 8:40. The girls just left, since it’s Missus’s work-from-home day on Tuesday. I took Younger in this morning. She wanted to ride her bike. It wasn’t too bad, about 44 degrees or so. So I rode her in without too much trouble and made it back, meditated, and here I am.

So I got about 20 minutes just to collect my thoughts. I had a lot I was thinking about during meditation today. I’ve got a busy day at work. Not a lot of time to work on other things this morning. So I’m not going to talk about that in particular for this entry.

Yesterday we took our neighbor/cousins, the T’s, to a state park. We packed a picnic and I got the bikes out. We took a walking trail and it was a lot harder than I thought it was going to be. We had walkie-talkies so we could stay together and not get lost. We found this one beautiful little gorge where a large tree had fallen over, making a kind of bridge across the ravine. We played on it for a bit and then took a picture. Afterwards, we ate some more and went to the playground. I took the two little girls on a bike ride around the lake, which took us about half an hour. We had to push the bikes up the hill, but we got to ride down, which was pretty fast and fun.

Missus was ready to leave, so we packed it up and headed back. It was pretty fun, although I wish we would have stayed longer. My legs were surprisingly sore given that it wasn’t that big of a deal. I slept great last night.

I want to talk about Starseer this morning. There’s a CLI tool that integrates with GPT that I want to try. We also took another look over the weekend at the Discord bot that we have set up in our staging server, and it seems to be working right now. We’re hopefully going to make some progress on that, but I really want to spend some time planning what I do on the AI/ML side. We’re going to need a demo by the end of this sprint, so I need to figure out what I can show now and what we can build that we can actually ship and demo. Maybe we focus on the Whisper pipeline and integrate it with Discord to make recording easier: a channel in Discord where people can go, get the help commands, and hit a transcribe command. As they start talking it will record their audio and provide them with a text document.
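
To make that concrete, here is a rough sketch of what a transcribe command could look like, assuming the audio arrives as a file attachment rather than live voice-channel capture (which is a harder problem). The command name and file handling are placeholders, not the actual staging bot.

```python
# Sketch of a !transcribe command: a user attaches an audio file, the bot runs
# it through Whisper and replies with the transcript as a text document.
# Assumes discord.py and openai-whisper; live voice capture is not handled here.
import io

import discord
import whisper
from discord.ext import commands

intents = discord.Intents.default()
intents.message_content = True
bot = commands.Bot(command_prefix="!", intents=intents)
model = whisper.load_model("base")

@bot.command(name="transcribe")
async def transcribe(ctx: commands.Context):
    if not ctx.message.attachments:
        await ctx.send("Attach an audio file to transcribe.")
        return
    audio = ctx.message.attachments[0]
    await audio.save(audio.filename)                 # save locally for Whisper
    text = model.transcribe(audio.filename)["text"]  # blocking; fine for a demo
    await ctx.send(file=discord.File(io.BytesIO(text.encode("utf-8")),
                                     filename="transcript.txt"))

bot.run("DISCORD_BOT_TOKEN")
```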

The house is a mess. We’ve just destroyed it after we got home last night.

I am trying to get the girls to do some more video stuff. We started recording one of their science experiment kits over the weekend and gave them cameras for the trip to the park. Just trying to figure out how we want to operate, like with all these PCs and Macs and iPhones and Androids. It’s kind of a mess.

I think I’m gonna break here and get to work on work stuff, because we’ve got ten minutes till nine.

A Sunday Morning Reflection

It’s Sunday morning, January 15th, and not even eight o’clock yet. It’s very unusual for me to be up this early on a Sunday, let alone doing a journal entry, but I woke up pretty early this morning and just kind of laid in bed.

Saturday went well. My mom came down Friday, and after the last recording we had a celebration of Star Atlas’s two-year anniversary. After my mom got here, we went to pick up the girls and then we ate and drank. We actually played Dungeons and Dragons Saturday night. It was a very, very condensed version. I didn’t want to subject Nana and Missus to sixteen-plus rounds of the two-hour Dungeons and Dragons adventure game, but we managed to just do the beholder 1-4 and it still took us an hour. It still proves my point that the Star Atlas RPG adventure is a good idea.

There was a free game called Gamedec (short for Game Detective) on Epic Games. I started playing it, and it’s a wonderful isometric adventure game in Unreal Engine; I guess you would call it 2.5D.

We did additional tests with Starseer and GPT index over the weekend. GPT index had a new release that adds a PDF reader, plus some other functionality; I think it handles simple web pages now. We managed to load up the master agreement between the union and the VA, and I was able to query it. It actually worked pretty well. It did use a lot of tokens, since it’s a long document, but I think my API usage was still less than a dollar yesterday.
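
The test itself was only a few lines. I’m hedging on the exact imports since the library was moving fast at the time (it has since been renamed and reorganized); this follows the early pattern of pointing SimpleDirectoryReader at a folder containing the PDF, and the folder name and question are just examples.

```python
# Rough sketch of the master-agreement test with gpt-index (early-2023 API;
# later releases renamed most of this). Folder and question are placeholders.
from gpt_index import GPTSimpleVectorIndex, SimpleDirectoryReader

# Load the PDF dropped into the directory (the new release parses PDFs).
documents = SimpleDirectoryReader("agreements/").load_data()

# Build a simple vector index over the document chunks.
index = GPTSimpleVectorIndex(documents)

# Each query embeds the question, pulls the nearest chunks, and sends them to
# GPT, which is where the token cost on a long document comes from.
response = index.query("How many days does an employee have to file a grievance?")
print(response)
```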

We need to focus on the command line interface, and on building a GUI to manage some of this stuff. I’m still creating these indexes manually, and I still haven’t been able to compose a recursive index yet, so maybe I can work on that today.
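
For the record, the composed-index pattern I’m chasing looks something like this in the gpt-index docs of the time: a summary text on each sub-index, then a parent index over them, queried recursively. This is untested on my end (that’s the point), so treat it as the shape of the thing rather than working code; the folder names and summaries are made up, and the recursive query may need extra query configs.

```python
# Sketch of the recursive/composed index I haven't managed to build yet.
# Untested; follows the early gpt-index composability docs, details may differ.
from gpt_index import GPTListIndex, GPTTreeIndex, SimpleDirectoryReader

meeting_docs = SimpleDirectoryReader("transcripts/meetings/").load_data()
journal_docs = SimpleDirectoryReader("transcripts/journal/").load_data()

meetings = GPTTreeIndex(meeting_docs)
meetings.set_text("Transcripts of work meetings and standups.")

journal = GPTTreeIndex(journal_docs)
journal.set_text("Personal journal dictations.")

# The parent index routes a query down to whichever sub-index looks relevant.
root = GPTListIndex([meetings, journal])
response = root.query("What did we decide about the Discord bot?", mode="recursive")
print(response)
```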

We went roller skating yesterday. We took our friends, whom I’ll start referring to as my cousins because Dan and I have that kind of relationship. We took the cousins down to the roller skating rink and had fun. We were mainly out there trying to figure out how to do like hockey turnarounds or whatever you call it, just trying to get comfortable spinning and stuff.

After that, Nana left about one o’clock. The kids went out to play with their cousins and we just kind of chilled. The house kind of needs some cleaning up, but it’s not too bad. I am gonna do some work today. I’ve had some thoughts about the indexing process and the summaries. One thing I did do (I’m not sure I’ve talked about this before) is make file documents from the indexer that carry the original path and the original hash of the doc. I really want to focus on indexing GitHub repos and file systems. If you’re indexing a file folder, it should check to see whether it’s in a GitHub repo, and if it is, it should also grab the hash of the repo.

I don’t want to call it directly from the command line, obviously, but we should be able to get some sort of information: file hashes, whether anything has changed, and then maybe even some kind of file-watching system that automatically re-indexes data when it changes. But that use case might be a little much for what we’re doing.
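
The bookkeeping I have in mind is roughly this: hash each file, and if the folder lives inside a git repo, record the repo’s HEAD commit too, so re-indexing can be skipped when nothing has changed. This is a sketch only; it shells out to git for brevity, and the paths and metadata keys are made up.

```python
# Sketch of the re-indexing bookkeeping: per-file content hashes plus the git
# commit of the enclosing repo, if any. Paths and metadata keys are illustrative.
import hashlib
import subprocess
from pathlib import Path
from typing import Optional

def file_hash(path: Path) -> str:
    """SHA-256 of the file contents, used to detect changes between runs."""
    return hashlib.sha256(path.read_bytes()).hexdigest()

def repo_head(folder: Path) -> Optional[str]:
    """HEAD commit if `folder` is inside a git repo, else None."""
    try:
        out = subprocess.run(
            ["git", "-C", str(folder), "rev-parse", "HEAD"],
            capture_output=True, text=True, check=True,
        )
        return out.stdout.strip()
    except (subprocess.CalledProcessError, FileNotFoundError):
        return None

def snapshot(folder: Path) -> dict:
    """Metadata the indexer could store alongside each indexed document."""
    return {
        "repo_commit": repo_head(folder),
        "files": {str(p): file_hash(p) for p in folder.rglob("*") if p.is_file()},
    }

if __name__ == "__main__":
    print(snapshot(Path("transcripts/")))
```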

The test with the union agreement proves the thesis is valid; we’re gonna be alright. If we can create a user interface that’s as user-friendly as GPT, we should be able to create a grievance intake form, or an intake process where the website actually prompts the user for information, guiding them with questions until we’ve refined the data the grievance form needs, and then examines the user’s problem against the master agreement.
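
Something like the loop below is what I have in mind. The field names and prompt wording are hypothetical, and it leans on the document index from earlier: GPT keeps asking until it has the fields the form needs, then checks the refined statement against the master agreement.

```python
# Sketch of a guided grievance intake: GPT asks follow-up questions until the
# required fields are filled, then the refined statement is run against the
# master-agreement index. Field names and prompts are hypothetical.
import openai

REQUIRED_FIELDS = ["employee name", "date of incident", "supervisor involved",
                   "article of the agreement believed violated"]

def ask_gpt(prompt: str) -> str:
    resp = openai.Completion.create(
        model="text-davinci-003", prompt=prompt, max_tokens=300, temperature=0.2)
    return resp["choices"][0]["text"].strip()

def intake(agreement_index) -> str:
    conversation = "The user wants to file a grievance.\n"
    for field in REQUIRED_FIELDS:
        question = ask_gpt(
            f"{conversation}\nAsk the user one short question to learn: {field}.")
        answer = input(question + "\n> ")
        conversation += f"Q: {question}\nA: {answer}\n"
    # Summarize the intake, then check it against the master agreement.
    statement = ask_gpt(f"{conversation}\nRestate the grievance in one paragraph:")
    return str(agreement_index.query(
        f"Does the master agreement support this grievance? {statement}"))
```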

Yeah, we start there, but there’s still a lot of work to do. I want to see if I can get these transcripts through some kind of process. The thing I’m trying to figure out is how accurately we can regurgitate a document once a transcript runs over 2,000 or 4,000 tokens. If the transcript stays under 2,000 tokens, it’s really easy to just keep it in the window. I don’t think the indexer is gonna be the end-all-be-all for this type of stuff. There needs to be some other kind of workflow, a kind of task passing: here’s a file, it’s been transcribed, run this query on it, run that on it, and once that’s done, then you can index it.
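
In code, that task-passing idea could be as dumb as the sketch below: count tokens first, and if the transcript won’t fit in a prompt window, run it through an ordered chain of steps before it ever touches the indexer. The step functions are placeholders.

```python
# Sketch of the task-passing workflow: measure the transcript, and only index
# it after every processing step has run. Step functions are placeholders.
import tiktoken

enc = tiktoken.get_encoding("gpt2")  # close enough for the davinci-era models

def n_tokens(text: str) -> int:
    return len(enc.encode(text))

def process(transcript: str, steps) -> str:
    """Run the chain of steps; whatever comes out is what finally gets indexed."""
    if n_tokens(transcript) <= 2000:
        return transcript      # small enough to just keep in the prompt window
    doc = transcript
    for step in steps:         # e.g. [strip_filler, add_context, summarize]
        doc = step(doc)
    return doc                 # hand this to the indexer only after every step ran
```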

Tomorrow is a holiday, and we’re gonna try to go back up to the state park where I almost died from over-exercising. There’s free admission to the parks, and Missus wants to do something. I talked to the cousins about coming down with us, doing some bike riding, some hiking, packing a cooler with sandwiches and snacks and stuff, and just kind of hanging out there all day in the park, which I think will be fun.

The girls are going to church here in about an hour and a half. I’ll see if my brother wants to play some video games and we’ll just chill till they get back. Maybe go to brunch with Missus; we don’t really have to do anything today. Just relax and kind of chill. It should be chill.

Taking Breaks: A Reflection

Yesterday was productive, although I was a little bit impatient with myself. I imagined progress would be faster than it has been, considering I now have GPT index at my disposal. I got a little frustrated, so I stopped working and just let my brain do its thing.

I had to stay off the computer all afternoon, because there was so much information and so much going on that I’d reached maximum cognitive load for the day, for lack of a better term. I just needed to offload some of that during sleep. Hopefully Starseer will help with some of that today.

One good piece of work we did yesterday was getting the console running against an index. The way this thing is supposed to work in the future is that you’ll install the source code, run the application, and basically start with a blank buffer waiting for its first command. We do need to do a little bit of prompt engineering, because the indexer has a default query prompt that’s something like “use the context provided, answer the question, do not use prior knowledge.”

So the way I have it working now is that I turn it on, it loads up all the previous console commands in a list vector, and it’s basically just me giving commands to the computer. No responses are being recorded.

This morning I want it to recursively load its main file, which is going to be just main.py at this point. I’ll give it some directions, a little bit of prompting as to what Starseer is, and directions to read the source code and provide the next step. I should probably provide it with unit tests, because the unit tests will actually teach it what to write. So basically the loop is going to be something like this: okay, here’s the source code and the unit tests, here’s what we’re building or here’s a particular user story, write a unit test that can be incorporated into the code, and then, after it does that, write the function. Then we’ll just keep iterating like that.
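
A bare-bones version of that loop might look like the following. It is a sketch of the idea, not the actual Starseer code: the file names, prompts, and the assumption that the completions API is doing the writing are all mine for illustration.

```python
# Sketch of the test-first loop: show GPT the current source and tests, have it
# write a new unit test for the user story, then the function that makes it
# pass, and append both. Prompts and file names are placeholders.
from pathlib import Path

import openai

def complete(prompt: str) -> str:
    resp = openai.Completion.create(
        model="text-davinci-003", prompt=prompt, max_tokens=800, temperature=0)
    return resp["choices"][0]["text"]

def iterate(story: str) -> None:
    source = Path("main.py").read_text()
    tests = Path("test_main.py").read_text()
    context = (f"You are helping build Starseer.\n\nSource:\n{source}\n\n"
               f"Tests:\n{tests}\n\nUser story: {story}\n\n")
    new_test = complete(context + "Write one new unit test for this story:")
    new_func = complete(context + f"New test:\n{new_test}\n\n"
                        "Write the function that makes this test pass:")
    Path("test_main.py").write_text(tests + "\n" + new_test)
    Path("main.py").write_text(source + "\n" + new_func)
    # A real loop would run pytest here and feed failures back into the prompt.
```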

A little personal anecdote: it was like 30 degrees this morning when I woke up my youngest, and the car wouldn’t start. I tried to charge it with the jumper cables, but it wouldn’t turn over. So we rode our bikes to her school; it was super cold, and my back tire was flat. She was a trooper though. So I woke up, meditated, and Berkeley woke up before I started. She was trying to shop on Amazon for furniture for her spy school. We argued a little bit about that, and I’m still thinking a lot about Dave’s funeral and how I acted. I’m trying to put it out of my mind, as meditation would suggest. I don’t feel good about breaking dry January, and I don’t feel good about driving home, so I consider myself lucky.

I got into a fight with my dad, apparently. That’s probably what’s bugging me the most: how to resolve that. So I sent a text message to my dad, my way of apologizing. We’ll see what happens.

Exploring gpt-index

Good morning! It’s Monday, January 9th, and I’m up nice and early this morning.

I attended the funeral for my childhood neighbor Dave on Saturday. It was a bittersweet experience. It was great to reconnect with people I hadn’t seen in 20 years, but it was an emotional roller coaster. After the funeral, I broke my dry January and had a few drinks. Despite this, I still managed to get home safely.

Sunday was a busy day. The kids went to church and my wife and I did some cleaning around the house. I also worked on Starseer and we had the chance to play a Dungeons and Dragons adventure game with the kids. It was great to see them getting the hang of it and enjoying the game.

What I’m really excited about today is what I discovered over the weekend: the GPT index repo. Since the release of ChatGPT, I’ve been trying to build a context management system, but GPT index has pretty much got it covered. I’ve got to build a system that can index a directory and figure out how to arrange all the directories. I built the first test around the console version of this and I’m looking forward to seeing how it pans out. I’m also working on an electronic module to convert the Dungeons and Dragons game into a Star Atlas-themed adventure. For now, I’m going to keep this information close to the chest.

Finally, I’m going to document the analytics code I used for the DAO, prepare it for public release, and also work on open sourcing the player profile. I’m also looking into incorporating a pause button into my whisper-mic fork so that I don’t have to kill it every time somebody walks in the room.

To kick off the day, I’m using Whisper and GPT to dictate and write this blog post. I’m not using the raw response today; this post was lightly edited from the GPT response to my ‘blog helper’ prompt in Playground.

Using Whisper to Summarize Meetings After a Hectic Day

Today has been an interesting day. I woke up at 4 am and couldn’t get back to sleep, so I got up at 6 and went for a run. Then I completed my power routine, meditated and didn’t eat anything till late in the morning.

The experiments with Whisper continued today. I recorded a meeting and fed it to GPT with a prelude based on the audience or the end goal. After curating the chunks, I threw them through the summarizer and had a quick TLDR.

Now I’m trying to figure out the best scaling solution to capture a conference room full of voice activity, feed it into contextualization or summarization, and chunk it down. I’m also looking for a good design to manage the prompts via command line.

I’m also working on abstracting the functions from my Discord bot and sharing them between libraries. And I’ve had a realization that to judge the relevance of a transcript, I need more context. So my current strategy is to crawl transcripts from whatever audio/video source, add a context layer, and break them into chunks of 2,000 characters or less.
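
The chunking itself can stay simple. Here is a rough sketch under those assumptions: prepend a short context layer (source, date, audience) to each chunk and split on sentence-ish boundaries so no chunk exceeds 2,000 characters. The context string is illustrative.

```python
# Sketch of transcript chunking: every chunk carries a short context layer and
# stays under the character limit. Splitting on ". " is crude; very long
# run-on sentences can still overflow a chunk.
def chunk_transcript(text: str, context: str, limit: int = 2000) -> list[str]:
    chunks, current = [], context + "\n"
    for sentence in text.replace("\n", " ").split(". "):
        piece = sentence.strip() + ". "
        if len(current) + len(piece) > limit:
            chunks.append(current.strip())
            current = context + "\n" + piece   # start a new chunk, context included
        else:
            current += piece
    chunks.append(current.strip())
    return chunks

# Example usage with a made-up context layer:
# chunks = chunk_transcript(raw_text, "Standup, Jan 9: designer update, for task extraction")
```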

I’m exploring the concept of semantic search, which is the vectorization of a string into an array of weights. That array can be plotted as a point in a multi-dimensional space, and the semantic search looks for its nearest neighbors.
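
A toy version of that nearest-neighbor search fits in a dozen lines: embed the corpus and the query, then rank by cosine similarity. This assumes the ada-002 embedding model that was current at the time, and the corpus strings are made up.

```python
# Toy semantic search: embed everything, then rank by cosine similarity
# (dot product of unit-normalized vectors). Corpus strings are examples.
import numpy as np
import openai

def embed(texts: list[str]) -> np.ndarray:
    resp = openai.Embedding.create(model="text-embedding-ada-002", input=texts)
    return np.array([d["embedding"] for d in resp["data"]])

corpus = ["standup notes from Tuesday", "grocery list", "DAO proposal draft"]
vectors = embed(corpus)

query_vec = embed(["what did the designer say at standup?"])[0]
scores = vectors @ query_vec / (
    np.linalg.norm(vectors, axis=1) * np.linalg.norm(query_vec))
print(corpus[int(np.argmax(scores))])  # nearest neighbor to the query
```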

It’s been a busy day and I’m having fun moving forward.

Voice First Experiments: Adventures in AI and Transcription

Okay, it is Wednesday, January 4th, and this is my second day trying to do voice first on my computer. It’s been some interesting experiments. Yesterday, I tried to do a blog post just by speaking into my iPhone memo program. It wasn’t quite the experience I was looking for. First off, getting the memo off the iPhone onto the iPad, or the MacBook rather, was a little bit difficult. The app itself on the MacBook isn’t great either; the way the file storage works is not user-friendly at all.

What I wound up doing was recording the audio with the iPhone memo app and then transferring the memo to an iCloud folder, which synced to my MacBook, where I was able to run the file through Whisper. The transcription was actually pretty good. I tried running it through GPT a couple of times to pare it down or clean it up, but I wasn’t happy with the results, mainly because my original post was about 3,000 characters, a little too long to get a real balance between the prompt and the result from GPT. So yeah, I didn’t have too good of a time with that. I did generate a title out of it; it’s really good at summarizing things, we all know that. I tried GPT through Playground, I tried ChatGPT, tried several things, and wasn’t really happy. I get the sense that it wasn’t quite as frictionless as just typing it out myself, which is kind of why I’ve always typed to begin with, but we’ll try it again today and we’ll see how things go.

So hopefully we’ll have some better luck today. One thing I have also been playing around with is this thing called Whisper Mic. It’s a GitHub program that uses your computer’s microphone to basically just listen. It’s running right now; it’s been running all night, actually. I didn’t realize it, but we turned it on last night just to kind of capture things. It’s kind of clunky, but it does what it’s supposed to do if you’re giving it good quality audio. If I’m speaking right in front of my MacBook, like right now, it seems to be doing a pretty good transcription, but while I left it running in the background this morning, with people walking around the house a couple feet away or across the room, it wasn’t picking things up very well. I’m not sure how it’s going to handle the coughing either, excuse me. I also noticed it didn’t do a very good job with Elder’s speech. She’s 10, but it did not seem to be picking up her words as well as it has for me. There’s some testing to do there to figure out how that works.

I also want to play around with some of these multi-speaker models. I’ve seen some demos where you specify the number of speakers in a clip, feed it in alongside Whisper, and it’s able to basically tell you speaker one, speaker two.

One thing that I did have some success with yesterday was using this to summarize a meeting. We had a quick standup yesterday; our designer came in and gave us an update on his work over the break. He went into work mode, and I managed to hit the record button in ClickUp while he was speaking. I did have a video, but I was not able to get it from there to where I wanted it to go very easily. I figured I could go straight to ClickUp and pull the video down, but it was in some kind of WebM wrapper and I was getting an error trying to pull it into FFmpeg. What I ended up doing was just letting it play while my computer’s voice memo program ran again, then pulling the recording into Whisper and feeding the transcript to GPT, asking it to pull out requirements for the next task and things like that. It did summarize it. Again, it was very short, two or three minutes I guess, so the context wasn’t overloading the ChatGPT buffer, which is one of the main problems we have with this, obviously.
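
For next time, the conversion I was fumbling with should just be a one-liner through ffmpeg, assuming the file itself isn’t corrupted: unwrap the WebM into a plain WAV and hand that to Whisper. File names here are examples.

```python
# Unwrap a WebM recording into a mono 16 kHz WAV via ffmpeg, then transcribe it.
# File names are examples; Whisper can often read the WebM directly too.
import subprocess

import whisper

subprocess.run(
    ["ffmpeg", "-y", "-i", "standup.webm", "-ar", "16000", "-ac", "1", "standup.wav"],
    check=True,
)
print(whisper.load_model("base").transcribe("standup.wav")["text"])
```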

I had some interesting conversations with my friend Chris about AI. He’s in the machine vision industry, so he and I had a long discussion around dinnertime last night. There are some interesting developments at work that I can’t really talk about, other than to say that I’m trying to build an AI department, so we’ll see if any of the founders bite. That’s what I’m going to present to them once I put some of this information together. Hopefully having more of a record of things will help, and it will be interesting to figure out how to use these transcripts in a way that’s actually useful. Maybe some text-embeddings work, building some search around audio: how expensive would it be to run whisper-mic 24/7, log everything it catches into a database, and just do semantic search on that? I think it would be pretty interesting to have your entire life stored in that way.

Speaking of which, I have been trying something called Rewind AI. Basically, the way it works is that it records a screenshot of your computer screen every few seconds, crawls all the text off the screen, and then loads it into some kind of database with a proprietary compression algorithm. Then, if you’re looking for something, you push command-shift and two swipes up on the mouse, and it gives you a search over anything you’ve seen, said, or heard. Well, heard, that’s interesting. I’m going to have to experiment with that to see how it works, but it also has an interesting little history function where you can slide back and see what things looked like.

So that’s probably it for right now. We’re going to do a test: I guess I’m going to use the iPhone memo again today just to see how it compares to what Whisper is pulling out. That will be my next task, and yeah, that’s it for now.

Experimenting with Voice First Blogging

I’m trying to do something new this year. Instead of just doing the same old bloggy blog stuff that I’ve been doing, I’m trying to mix it up a little bit. So I want to try doing something called voice first, which basically means not being so much of a keyboard jockey and instead just using natural language. I think it’s important to develop the skill of talking about things off the cuff in a natural, flowing way, similarly to how I can type. I have this mental block about being able to speak extemporaneously the same way that I write on my blog. So I don’t know quite what I’m doing yet, but what I was attempting to do this morning was to use Whisper to transcribe audio.

Usually I have a special laptop that I use that’s separate from my work laptop, and a special seat in the house where I sit when I type my blogs. I don’t know why I wanted to do things differently this time, but I attempted to set up Whisper on that laptop, a freshly wiped Windows 10 Alienware machine that I ran Linux on for years. What I wanted to do was run a Whisper program that would transcribe to a file using the computer’s microphone, but Python wasn’t set up on the machine, and after trying to deal with PowerShell again, I just said no.

I’ve been a huge Windows person ever since Windows came out. I had a brief experience with Apple back in the Apple IIe days; that technology was pretty much my first introduction to computers and programming. But after getting sucked into being a Windows 95 power user and taking all those Microsoft courses and certifications, anytime I would touch a Mac it was pretty much with a kind of revulsion, just because everything was so different and I didn’t want to have to figure it out, basically. So I just said no Macs, no Apple. All that changed when the iPhone came out. It’s been a bit of a shift recently, having iPads and all the Apple devices, because Microsoft’s phone never took off; they never had anything I even wanted to touch, and I didn’t want a Zune. So yeah, I avoided Macs despite being an iPhone user for ten years now. I really have to hand it to my friend Mike Kane, who works with me at Star Atlas; he bought me a MacBook Pro, I guess as a recruitment bonus for getting him a job at Star Atlas. I appreciate that, and I fucking love my MacBook. I still have a gaming PC upstairs that I play all the Steam games on and do some web surfing with, but I set a limit: I was not going to install Python on it, I wasn’t even going to install any command-line tools. It was strictly a gaming PC and I was going to leave it at that. And what happened was that the Alienware machine I reformatted was being used this weekend for the girls to shoot some video, so I decided to pick it up.

Anyways, I’m getting off on a tangent. The point is that we’re moving to voice first. Whether that means releasing these (this is a memo on my phone right now) as an actual podcast, who knows, but I do want to practice with the transcription. So basically what I’m planning on doing is running this through Whisper on the MacBook and then passing it through GPT for editing.
So you’re hearing my voice one way or the other: speech to text, text to text, and then text to blog post, I guess.

So what else are we planning on doing this year? There are a lot of things we have planned around GPT and AI systems in general. I’ve been playing Dungeons and Dragons Adventures with my kids, and this idea of a GPT DM is coming together quite nicely in my head so far. My main priority right now, as far as work for Star Atlas goes, is obviously to continue development on the DAO. We’ve got design working on that now; they’re going to go crazy, and we’re going to have to bring it back down to Earth, which is kind of interesting for me, being the one who normally has his head in the clouds.

So we’ll be working on that process, but in the meantime I’m going to be integrating GPT into my workflow as much as possible, making it interact with other APIs. Notion has their API, for example; how can we teach GPT to build its own programs, or to help me build programs that interact with these various APIs? Hey, GPT (or, as I’ve been calling it, Jarvis, after Tony Stark’s computer-generated AI in Iron Man), what can we build and what do we need to borrow? I’m going to be playing around with a bunch of tools, but my moonshot right now is using GPT as the moderator or facilitator for some sort of adventure game, where the goal is for the adventure game to feel like people talking the way you would at a regular Dungeons and Dragons session. So we want to provide context for the adventure, whether that’s some kind of schema in a NoSQL database or, probably for the demo, just a flat file, as simple as possible, just to show what a scenario would play out like, and then go from there. I think it’s going to be very interesting. I need to put a proposal together for the company to get a little bit of funding and a little bit of founder synchronization. I’ve been very lucky to just be able to do what I want, and I’d like things to continue to be that way.

So it’s time to have fun. That’s pretty much my goal every day when I sit down at my desk: just to have fun. Obviously there’s work to do, but I’m enjoying it, and I enjoy the people I work with and the opportunity that I have. Hashtag blessed, as much as it pains me to say that.

The other thing we’re going to do in 2023 is stay fit. I went mountain biking yesterday and over-exerted myself on what I thought was a four-and-a-half mile trail; actually, I think it was more like six miles. My Whoop says I did nine and a quarter miles, which is quite a lot, but I think it was closer to six and a half. We’re probably going to do a video at that place.

So yeah, without doxing myself again by saying my name: we do have an LLC that we’re probably going to take advantage of in order to actually do some things with the girls. I think it’s time for them to really get into the business and entrepreneurial spirit, so I think we’re going to crank things up a little bit and start operating as an LLC. Missus is doing all this stuff with the Southwest companion pass, trying to get free travel and stuff like that. I’m just going to leave the whole Southwest Christmas debacle aside for the moment.
I think I still have a blog post to finish writing about that, actually. We had a bit of an issue coming back from Costa Rica and got stuck in Houston for about four days over the break, but I’m not going to get into that right now.

Basically, I just want the girls involved in creativity as a family: being able to take pictures and video as we go out on these little adventures, and getting the girls familiar with video editing. How do you make videos? How do you edit them? What does production look like? Music production, video production, all types of media, basically. Those are all skills that I think are still handy, despite what our AI overlords are doing with Stable Diffusion and the like. I think integrating all this stuff together is going to be even easier now.

So yeah: staying fit, and trying to ramp up some sort of project work with the kids. We actually tried to get them to put a little thing together for New Year’s. They found a song and were doing some dancing to it, and we took a couple little bits of video, but four days wasn’t enough time to really put something solid together. So I think we’ll have a January project, which will maybe be to go to the park I went to yesterday, take some video, run around having fun, and do some video stuff.

I am thinking about getting a new iPhone just for the camera, and a GoPro for when I do biking stuff, but I’m trying not to go crazy with the spending considering how December went and the pay cuts that went across the company. So I’m trying to actually get rid of stuff and clean things up before I go crazy, but I’m probably going to buy a $1,000 iPhone and a hangboard for the garage. The girls have started converting the garage into a fitness center. Younger (my youngest daughter; I don’t know how to say that out loud) got a punching bag. So we’ve got a hangboard, a two-by-four we can hang from and try to do pull-ups on, a punching bag hanging from the ceiling, and mountain bikes and scooters galore. The girls have really been into roller skating and rollerblading, so we went to our first skating rink outing a couple days ago; I think that was Saturday, maybe.

So that was fun, and I’m going to start wrapping this up, I guess, so I can actually get to work and do some of this cool stuff that I’m so excited about. We will... scratch that. It is definitely a bit easier than typing; I can talk a lot faster than I type, so I’m able to dump that knowledge, or that experience, out of my brain into words. It comes easier to speak, though it is kind of weird actually talking to myself in this empty room with my phone in front of me. It’s definitely a different experience from trying to type a blog post, which is a completely different thing altogether. So we will see if I can set up some workflow. My ideal scenario here is basically a Siri on steroids that I can talk to, give commands to, and have it actually do stuff in a smart way: open this, draft an email to someone telling them blah, blah, blah. That’s one particular example, but another scenario would be: hey, I’ve got a project idea for the company, here it is, this is what we’re going to do, write this up as a project brief.
I think if I can get something that flows quickly from voice to GPT, it will be a game changer. I don’t know if we’re going to be able to use ChatGPT or just the API, but we’re going to figure something out. The end goal for this project (I say “again” as if I’ve actually put it into good words before) is the demo video of the guy talking to the MetaHuman that receives his voice commands and responds to them. So we’ll see if we can recreate that demo, because I think that’s what the future is going to look like, and it’s my turn to bring it into reality.

Winter Vacation Part I: Costa Rica

We’ve been on vacation the last nine days, two days longer than we had intended, and we still have two more days to go. We’ve been caught up in one of the worst airline travel debacles in recent memory, as a winter cold spell wreaked havoc across most of the northern and northeastern United States.

We left on the nineteenth to fly to Costa Rica. We had a layover in Houston, then got on our flight to Costa Rica’s capital, San José. We were surprised to find ourselves in what I can only describe as the most luxurious flight experience I’ve ever had: first-class seating that was spacious and reclined into a bed. It was fantastic. We had TV screens and got served a hot meal. I’ve never had anything like it; it was wonderful.

I had already booked a shuttle to take us from the SJO airport to the Fiesta Resort in Puntarenas, which was about an hour-and-a-half drive. I was dumbstruck by the beauty of the mountains, which we saw off in the distance as we wound our way down the highway.

One thing that struck me was the anti-landslide measures that had been put in place. High cliffs bounded the road, and for miles the cliffs were covered by chain-link fences bolted into the granite. In other places, concrete had been sprayed onto the cliffs, with small pipes drilled into the wall to let water escape. There were drainage ditches dug into some of them as well, pulling water down gentle slopes instead of over the sheer drops. It was fascinating.

Our resort was nice, but not the best; we’d budgeted for a more modest stay instead of something like a Sandals or Beaches resort. Most of the guests were Costa Rican. One of the managers told me only a few guests were from out of the country, and I ran into just a handful of Americans and a European couple over the week.

We had a beachfront view and could spy across the bay to Tortuga Island, but the actual beach itself was meh. The sand was black, and driftwood washed up everywhere. Younger enjoyed it, but we wound up spending most of our time at the pool.

We took an excursion the second day, our only one of the trip: an all-day visit to the Poás volcano. The 90-minute drive included a stop at a coffee farm, where we had a few cups and bought some trinkets. After refreshing ourselves, it was on to the volcano.

Costa Rica has microclimates, little islands of weather that change as you climb the mountains. Poás was very cold, rainy, and cloudy that day, and the 300-meter walk to the overlook was miserable. We were very underdressed for the weather, and I basically had to drag Elder, who was practically crying because of the cold and wetness. It was a bust; we couldn’t even see the caldera through the clouds. So we packed it up quickly and headed out to the last stop on our day trip, the La Paz waterfall gardens.

This was much more fun: the park had a butterfly garden, which the kids really enjoyed, as well as a hummingbird garden. We stopped for lunch, which included a buffet, and it was here that we lost our guide, Guillermo, a very nice gentleman in his seventies. We had taken a table on the opposite side of the cafeteria from him, and when we finished eating he was nowhere to be found. I sent a message to the travel company via WhatsApp while we went for a walk on the nature trail.

The trail was a hike through the woods, following a river. It was beautiful; the path was marked off with stone slabs up and down the wandering hills. It was amazing, really. By the time we got out, I had gotten a call from the travel manager, who said Guillermo was out looking for us, and we managed to catch up with him a few minutes later. He was distraught about losing us, but I told him it was alright.

We explored the rest of the park, which included large cats, a replica of a traditional cabin, and a snake exhibit. Guillermo showed us the pit viper, the most venomous snake in Latin America, and told us his brother had been killed by one. Then we went on to the actual waterfalls.

This was my favorite part of the park. Steel steps had been driven into the cliffs, offering observation posts at several points along the river’s deep waterfalls. It was gorgeous. We climbed down to the riverbank and then back up and down as we followed the drops along the falls. After a short break back at the gift shop, we made our way back to the van and resumed our long ride home, which took almost three hours.

We didn’t go on any more excursions the whole trip, instead just lounging around the resort. I felt like the resort’s excursions were probably overpriced, but I didn’t have it in me to deal with the firm that waited right outside the resort on the beach. I really wanted to check out the adventure park down in Monteverde, but wasn’t sure that Younger would have been able to do the fun stuff that I wanted to do.

Food at the resort was not that great, with a few exceptions. There was a cafeteria that served various fare, every meal had rice and beans, plantains, a white cheese that resembled mozzarella, and tortillas. There were several stations that had chefs preparing various foods like omelets for breakfast, or fish and chicken dishes. It was traditional Costa Rican fare, but it seemed pretty bland for the most part. The girls didn’t eat much of it, despite wanting to eat there every day. There was a nice selection of desserts and sugary cereals that they liked, of course.

There were several other places to get food, including a couple of grill stations where one could get hamburgers, pizza, or chorizo, and two additional sit-down restaurants with à la carte menus. As ‘exclusive’ level guests we were able to eat breakfast and lunch there without reservations, but we needed to reserve seats for dinner. This is where I preferred to eat. My best meal was when they transformed the fancier restaurant into a steakhouse, and I had a wonderful plate with a trio of meats: a bacon-wrapped chicken breast, a pork chop, and a beef tenderloin.

Of course the alcohol was included. There was a swim-up bar in the exclusive pool near our room, serving the locally brewed lager as well as a number of mixed drinks. You could get beer anywhere, and I probably woke up every morning with a hangover, except the day after we went on our trip.

At night there were tons of entertainment options. The entertainment started with a movie (League of Super-Pets was playing the two nights we watched), followed by an act that brought the kids up on stage to interact. After that was a dance and comedy show, with a troupe of four or five couples dancing to Disney songs or whatever, interspersed with a clown-slash-jester who juggled and joked and rode a unicycle and did other physical humor. Everything was in Spanish, of course, save for the emcee, who repeated everything in Spanish and English; he was a Canadian expat who had lived in CR for twenty-five years.

I felt bad about not making any attempt to learn Spanish before our trip. I started doing Duolingo one or two days into the trip, and after a few days I was able to make simple requests en español. It wasn’t quite enough, though. Despite everything I’d heard about ‘everyone’ in CR speaking English, it wasn’t quite the case: many of the guides and staff were able to talk to us, but most of the random staff I talked to didn’t know any. I was able to bridge the gap by using the translate app on my iPhone; it actually worked pretty well despite the app’s clunkiness.

The weather was great the entire trip. There were one or two days where it got so hot that we had to hide out in our room for a siesta, but we wound up spending mornings and evenings out in the pool.

Missus and I didn’t really spend much alone time together. We put the girls in the resort’s kids’ club a few times, but they didn’t really enjoy it, so we only left them there a few hours. In hindsight, we should have just left the girls in the room by themselves and snuck off to eat.

In all we spent seven days at the resort, which in hindsight, again, was a bit too much. Four days would have been ideal, and we could have spent a couple of days wandering around the country, adventuring. Maybe next year.

We enjoyed a bit of schadenfreude while we were enjoying the tropical weather. A massive cold front had swept through the US, bringing temperatures at home down into the teens; two days before Christmas, the temperature at the house didn’t get above twenty degrees. Our hubris, however, was about to catch up with us upon our return to the United States.

Morning update

I’m behind this morning because I was cleaning the kitchen. I went to the climbing gym last night with someone I met there who lives nearby. He almost got in a wreck on the way home, merging onto the freeway, but never mind that. I got home at 9:30 and stayed up too late. Anyways. Shoulders felt great this morning after I stretched the sleep out of them. Ready to rock.

Something popped up for the DAO last night, so I need to provide some updata (that’s updated data), hehe. I’m being silly today because I got into an argument with Missus as soon as I woke up this morning. The gas light was on in the car and it’s my fault. Anyways.

I’ve got so many ideas of things to do with GPT.

The Discord bot is coming along, but the prompt engineering and training is going a little slower than I wanted. I had an idea this morning, that I should just go ahead and build a demo of this right now.

… ok, back, I literally got sucked into a rabbit hole about text embeddings for semantic search. Stuff is mindblowing.