Exploring Omnivore: A Powerful Open-Source Alternative for Content Aggregation

[Image: streams of articles, newsletters, and notes funneling into Omnivore, with Obsidian among the integrations.]

Streamlining Content Consumption with Omnivore – Integrations, Features, and Personal Experience

In the quest for efficient content management and aggregation, especially for those of us embedded in the tech and AI space, finding the right tools can be a game-changer. Today, I want to delve into one such tool that has recently caught my attention: Omnivore.

A Glimpse into the Landscape

Before Omnivore, there were several alternatives for content aggregation, notably Readwise. However, as a family man and solopreneur deeply involved in blockchain and AI, I sought something more aligned with my workflow, particularly integration with Obsidian, my go-to for managing information. Here’s a quick overview of what I considered:

  1. Zotero2Readwise: A Python library, perfect for bibliographic data management.
  2. Obsidian-Readwise: A TypeScript plugin syncing Readwise highlights into Obsidian.
  3. Omnivore: An open-source app with a compelling Obsidian integration.
  4. Wallabag: A self-hosted read-it-later app focusing on privacy and offline reading.
  5. Instapaper: A freemium model for saving web pages, albeit not entirely open-source.

Each of these tools offers something unique, but Omnivore stood out for its alignment with my needs.

Why Omnivore?

Omnivore appealed to me for several reasons. Its open-source nature means I can delve into its workings, ensuring it meshes seamlessly with my other tools. More importantly, its integration with Obsidian—a cornerstone of my workflow—made it an attractive option.

Personal Journey with Omnivore

My journey with Omnivore wasn’t without hiccups. Initially, I faced issues setting it up and turned to Wallabag as an interim solution. However, Wallabag’s lack of annotation features, crucial for my work, led me back to Omnivore.

After overcoming initial setup challenges and integrating it with Obsidian, the true potential of Omnivore began to shine. Its ability to pull in newsletters and possibly handle RSS feeds aligns perfectly with my need to sift through vast amounts of information efficiently.

Current State and Future Plans

I’m currently in the process of refining my use of Omnivore. The aim is to fully integrate it into my morning routine, replacing the manual trawl through news feeds with a more streamlined, Omnivore-mediated process. This change promises to enhance my competitive edge by allowing me to process and act on information more effectively.


In conclusion, for those in the AI and blockchain space, tools like Omnivore can significantly impact our daily productivity and information management. It’s not just about consuming content; it’s about integrating it into our workflow in the most efficient way possible. Omnivore, with its open-source nature and integration capabilities, is a tool that deserves attention for anyone looking to streamline their content consumption.

2024-01-09 Reading List

[Image: collage of the six reading-list themes: home workouts, Bitcoin ETFs, a Portal-style game, GitHub certifications, remote vs. office work, and the crypto market.]

6 equipment free exercises for sculpting muscle over the holidays, according to a triple Olympian

So I did this to mix it up today. I’m well rested and need a pretty intense day to hit my goals, so I wanted to get a quick workout in this morning at home instead of going to the gym. One of the goals I’ve failed to hit this year is doing five minutes of activity soon after waking; it’s good for sleep. And while I only intended to do this for five minutes, it took me well over twenty.

The target here is three sets of 20-25 reps with 30 seconds rest, but I cannot do that many pushups or pikes (yet!) so I only did half.

  • Plank ups
  • Knees to ISO squat
  • Pike ups
  • Split Squat
  • Push up
  • Side Lunge

I actually didn’t see side lunges, so I guess I cheated a bit. My plan was to go out running afterward, but it’s raining and only getting worse out there, so I’m going to need to be creative to get it done today.

SEC Hustles to Answer Latest Bitcoin ETF Filings: Source
No comment here, will have more about the crazy week another day.

A fan-made, 7-hour Portal 2 prequel just hit Steam for free and it’s so good that I’m sad Valve stopped making Portal all over again

If you have Portal 2 it’s a no-brainer.

GitHub Certifications are generally available

I’m not sure if this is going to be a desired trait for job seekers, but if I were a few decades younger I would probably go for this. It’s definitely geared at enterprises, and seems to be similar to the CompTIA and Microsoft tracks. Exam costs are $200 each. The training materials might be worth bookmarking; it’s all on Microsoft Learn.

  • GitHub Actions
  • GitHub Administration
  • GitHub Advanced Security
  • GitHub Foundations

I’d recommend that students and junior-level job seekers pick these up.

The data is in: RTO policies don’t improve employee performance or company value, but controlling bosses don’t care

The headline says it all. My partner is dealing with this right now, and I’d like to find a way to help her fight it. Collecting these notes is the first step.

8 Best Altcoins in January 2024: Reviewing the Top Altcoins Including Celestia, Solana, Sei, Corgi Ai, ApeMax, Injective, Bonk, and Arbitrum
Best Crypto to Buy Now January 8 – Injective, Stacks, Axelar
No recommendations here, DYOR, as I am.

Portkey : Control Panel for AI Apps

Adding this to the list of LLM observability tools. Hosted/OSS. Something I will be evaluating against Chainlit/LiteralAI.

LangChain v0.1.0
They’ve come a long way and were one of the first ‘agentic workload’ implementations. I’ve yet to really use them because of Steamship and OAI Assistants, but I should probably go back and rebuild an app using their stack for comparison’s sake.

Unveiling DPHA: My Journey in Creating a GPT-Based Developer Assistant

[Image: an abstract, glowing AI assistant amid a network of digital nodes and connections.]

Harnessing LinkedIn’s Open-Source Framework for Enhanced Developer Productivity

I’m excited to share with you a fascinating project I’ve been working on recently – the creation of a GPT Assistant, which I’ve affectionately named the Developer Productivity and Happiness Assistant (DPHA).

LinkedIn’s Open-Source Framework: A Catalyst for Innovation

Recently, LinkedIn made headlines by sharing its Developer Productivity Framework (The New Stack article) and opening its doors to the broader tech community through their GitHub repository. Inspired by this, I embarked on an intriguing journey to transform this framework into a practical AI tool.

The Birth of DPHA: From Concept to Reality in Fifteen Minutes

Believe it or not, converting this repository into a GPT Assistant was a remarkably swift process, taking about fifteen minutes. You can check out the DPHA here. This quick turnaround was an eye-opener, showcasing the immense potential and efficiency of AI in modern software development.

Exploring the GPT Store: A New Avenue for Monetization

As someone keenly interested in the evolving landscape of AI, I’ve been exploring the idea of creating GPTs as a potential revenue stream, especially with the emergence of OpenAI’s app store (Forbes article). DPHA is my first foray into this domain.

Spreading the Word: LinkedIn and GitHub Discussions

To gauge interest and gather feedback, I’ve shared DPHA with my LinkedIn network and also posted a link to the GitHub discussion for the repository (GitHub Discussion). These platforms are excellent for attracting traffic and engaging with potential users, especially developers who might benefit from this tool.

Test Driving DPHA: A Glimpse into the AI’s Capabilities

My first interaction with DPHA was simple yet revealing. I started with a “let’s get started” prompt and followed the assistant’s lead. The conversation was insightful, with DPHA adeptly handling queries and offering valuable suggestions based on the Developer Productivity Framework. Here’s a snippet of our conversation, tailored to my context.

Business Context:

  • DAO Operating Model: Unique collaborative and decision-making processes.
  • Focus on LLM-Based Tools: High engagement with cutting-edge AI technologies.
  • Participation in Tech Sprint: Opportunity for recognition and scaling.

Team and Tools:

  • Small Team with Diverse Experience: Your extensive tech experience combined with Laren’s current academic perspective.
  • Current Tools: GitHub, IntelliJ/PyCharm, OpenAI, various FOSS projects.
  • Pain Point: Transitioning from Replit to GCS with proper CI/CD pipelines using GitHub Actions.

Goals and Challenges:

  • 2024 Goals: Enhancing personal automation projects, succeeding in the challenge.org tech sprint, launching GPT assistants, and building tools for personal branding.
  • Pain Points: Deployment issues, desire for smoother CI/CD processes.
  • Metrics and Feedback: Basic time tracking, concerns about code quality and build integrity.


Workflow Optimization:

  • Transition to GCS: Plan a phased migration to Google Cloud Services, ensuring minimal disruption. Start with less critical components to gain familiarity.
  • CI/CD Pipeline: Implement GitHub Actions for continuous integration and delivery. Start with basic build and test automation and gradually incorporate more complex workflows.
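A minimal workflow for that first step might look something like this. This is a sketch assuming a Python project with a requirements.txt and a pytest suite; file names and versions are illustrative:

```yaml
# .github/workflows/ci.yml: run tests on every push and pull request
name: CI
on:
  push:
    branches: [main]
  pull_request:

jobs:
  test:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-python@v5
        with:
          python-version: "3.11"
      - run: pip install -r requirements.txt
      - run: pytest
```

Once this is green, deploy steps and matrix builds can be layered on top.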

Metrics and Feedback Systems:

  • Code Quality Metrics: Integrate tools like SonarQube or CodeClimate with your GitHub repository to automatically review code quality and maintain clean code.
  • Build and Deployment Metrics: Track build success rates, deployment frequency, and time-to-recovery in case of failures.
  • Time Tracking Review: Schedule regular reviews of time tracking data to identify bottlenecks or inefficiencies in your development process.

Team Collaboration and Growth:

  • Regular Check-ins: Have weekly sync-ups with your team to discuss progress, challenges, and brainstorm solutions.
  • Learning and Development: Encourage Laren to bring in new ideas from her academic background and allocate time for learning new technologies and best practices.

Project Management:

  • Agile Practices: Implement agile methodologies like Scrum or Kanban to manage your projects. This can help in better handling your diverse project portfolio.
  • Task Management Tool: Use tools like Trello or Jira for better task visibility and prioritization.

Risk Management:

  • Version Control Best Practices: Ensure that you’re using feature branches, pull requests, and code reviews to maintain code quality and prevent breaking the build.
  • Backup and Recovery Strategies: Regularly back up your code and have a disaster recovery plan, especially important when working with cloud services.

Innovation and Experimentation:

  • Blockchain and AI Integration: Keep experimenting with blockchain and AI in your personal projects, as this aligns with your interest and business model.

Participation in Challenge.org Tech Sprint:

  • Focus on MVP: Concentrate on building a Minimum Viable Product (MVP) for the tech sprint, leveraging your strengths in AI and blockchain.

Building Personal Brand:

  • GPT Assistants Deployment: Progress with deploying GPT assistants on the OAI store, potentially using this as a case study or showcase for your brand.

Navigating the Ups and Downs of AI and Blockchain Development: A Real-World Saga

[Image: a futuristic workspace with screens of code and digital interfaces, evoking AI and blockchain development.]

From Technical Hurdles to Unexpected Downtimes – Lessons Learned in the Trenches of Tech Innovation

In the ever-evolving world of AI and blockchain, we often encounter a myriad of challenges, some anticipated and others completely unforeseen. Our recent experience serves as a testament to this unpredictable journey. Last Friday, we started encountering issues with our OpenAI assistant, deployed through a Chainlit app. For those unfamiliar, Chainlit is akin to Streamlit but with a Python backend and a React frontend, a blend of technological sophistication.

Our initial task seemed straightforward – uploading a PDF and several markdown files as source documents to enhance our app’s user experience. However, the challenge lay in properly displaying these citations in the UX. The complexities of this task led us down a rabbit hole of technical intricacies. We faced peculiar issues with the model’s response, especially when it came to retrieving information from the database. An instance that stood out was when a query about taking a day off on a birthday yielded no relevant results from the documents, leading to confusing model responses.

Determined to resolve this, we dived deeper, fine-tuning our approach. We realized that when no results were found, it was best to assume the documents didn’t cover the queried topic. Yet, this was just the tip of the iceberg. As we delved further, testing and tweaking, more bizarre responses emerged. The model reported technical difficulties in reading documents, yet contradictorily, it displayed search results.
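That fallback reduces to a small guard around the retrieval call. This is a simplified sketch rather than our actual Chainlit code; `search_fn` and `generate_fn` are hypothetical stand-ins for the real retrieval and model calls:

```python
NOT_COVERED = ("The uploaded documents don't appear to cover this topic, "
               "so I can't answer from them.")

def answer_with_guard(query, search_fn, generate_fn):
    """Call the model only when retrieval actually found something."""
    results = search_fn(query)
    if not results:
        # No hits: assume the docs simply don't cover the topic,
        # rather than letting the model improvise an answer.
        return NOT_COVERED
    context = "\n\n".join(results)
    return generate_fn(query, context)

# Hypothetical stand-ins for the real search and LLM calls:
hit = answer_with_guard("vacation policy",
                        lambda q: ["PTO: 20 days per year"],
                        lambda q, ctx: f"From the docs: {ctx}")
miss = answer_with_guard("day off on my birthday?",
                         lambda q: [],
                         lambda q, ctx: "unused")
print(hit)                  # From the docs: PTO: 20 days per year
print(miss == NOT_COVERED)  # True
```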

Amidst this chaos, we discovered another glaring issue – the redundancy of file uploads. We had inadvertently uploaded the same file numerous times, a clear oversight in our workflow. This necessitated a thorough cleanup and recreation of our assistants, ensuring everything functioned seamlessly in our playground tests and in the app.

The real twist came when we recognized that the root of our troubles lay not within our code or approach but with OpenAI’s retrieval runs. Their models were not performing as expected, leading us to some prompt engineering adjustments. Frustrated and exhausted, we decided to pause and revisit the problem with a fresh perspective on Monday.

Monday brought its own set of surprises. Our return was greeted by a non-functional app, thanks to Chainlit Cloud being down. This downtime was a significant blow as Chainlit Cloud is integral to our data persistence layer, storing user maps and conversation databases. It’s akin to how ChatGPT displays threads of conversations. This persistence is crucial for our app’s functionality.

Rushing to the Chainlit Discord server, which I hadn’t joined until then, I learned that their cloud service was undergoing an update. This revelation was both horrifying and enlightening. In response, we disabled data persistence on our production server as a temporary fix and spent the day refactoring our code to adapt to the new Chainlit Cloud version.

This refactoring journey was not just about code. It was about understanding and integrating changes from the Chainlit cookbook, separating our custom code, and preserving the essence of third-party contributions. We faced dilemmas about merging different code histories and ensuring our customizations, especially around the assistant’s citation returns, were seamlessly integrated.

Our approach was to move our customizations into a separate file and import Chainlit’s updated cookbook as a utility. This method, while effective, brought its own set of challenges. Chainlit functions uniquely, with Python on the backend and a React frontend, a system that allows for deployment on platforms like Replit. Our goal was to customize this frontend without bloating the repository.

The downtime also highlighted a critical migration issue with Chainlit Cloud, now rebranded as Literal AI. They had changed key determinations and OAuth methods, leading to a temporary loss of our historical data. This situation underlined the importance of staying attuned to dependencies and the risks involved in relying on external services.

Despite these challenges, we emerged with valuable insights and a stronger, more resilient application. Our journey with Chainlit and OpenAI continues, marked by both triumphs and tribulations. As we progress, we remain committed to exploring and harnessing the immense potential of AI and blockchain technology, ready to tackle whatever hurdles come our way.

2024-01-08 Reading List

[Image: a tidy desk with crypto charts on a screen, handwritten notes on AI and trading strategies, finance books, and a goal board in the background.]

I’ve decided that if I’m going to spend the time to go through my news feed in the morning, I should at least start posting what’s relevant to me (and hopefully, to you as well).

SEC reissues crypto ‘FOMO’ warning amid hope for spot Bitcoin ETFs

Just say no to drugs, kids.

11 Ways To Earn Money on TaskRabbit With ChatGPT

TL;DR: Basically anything that you can do on TR
– Offer Landscaping and Design Services
– Turn Car-washing into Auto Detailing
– Offer Home Design Services
– Provide Eco-friendly Cleaning Services
– Create Customized Grocery Lists and Recipes
– Get Tips and Suggestions for Baby Proofing
– Create Resumes and Help People Find Jobs
– Schedule Appointments
– Manage Social Media for Companies
– Send Marketing Emails
– Use ChatGPT to Market Your Services

This Paper Introduces LARP: An Artificial Intelligence Framework for Role-Playing Language Agents Tailored for Open-World Games

Saving for later, I have plans for stuff like this.

Project: https://miao-ai-lab.github.io/LARP/
Paper: https://arxiv.org/abs/2312.17653
GH: https://github.com/MiAO-AI-Lab/LARP

Your Personal Brand Is The Key To Building A Successful Career

This is the stuff Google feeds me now to keep me pumped up about being a solopreneur.

5 Passive Income Ideas For 2024

Looks like I should take blogging/writing seriously again. I’m not buying property this year, and dividend stocks aren’t going to do it. I don’t really have the desire to start a course, but we’ll see what happens.

Investors hedge bets on Bitcoin with $50K call options before ETF decision

I have been telling people that the ETFs won’t get their BTC for less than $60k, but given that the supposed approval will go through on Tuesday, I am willing to admit that I could be wrong. Still, part of my hedging plan for this year is to take advantage of call and put options.
Right now I can make a $19 premium by selling the BTC-Mini-12JAN2024-45000-Call option, but I’ll lose out on any rise over the strike price. I’ve done it a couple of times when I thought there was no chance of the price running high, but we’re only 4 days out and there aren’t any higher calls available unless I go further out. The BTC-Mini-26JAN2024-75000-Call is only going for about a dollar right now.
Generally speaking, the open interest on Deribit indicates short-term consensus sentiment around the $50k mark.
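The trade-off in selling that call is easy to quantify. Here is a rough sketch of the payoff at expiry relative to just holding the coin; the 0.1 BTC per mini contract is my assumption, so check your venue’s contract spec:

```python
def covered_call_vs_hold(spot_at_expiry, strike, premium, size=0.1):
    """P&L of the sold call at expiry, relative to simply holding.

    Below the strike the seller keeps the whole premium; above it,
    every dollar of upside past the strike is forfeited.
    NOTE: size=0.1 BTC per mini contract is an assumption.
    """
    forfeited_upside = max(0.0, spot_at_expiry - strike) * size
    return premium - forfeited_upside

print(covered_call_vs_hold(44_000, 45_000, 19))  # 19.0 (keep the premium)
print(covered_call_vs_hold(47_000, 45_000, 19))  # -181.0 (upside forfeited)
```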

Bitcoin Stock to Flow Model is Back With $532K BTC Price Prediction

Ah, PlanB, he of the eternal bull. There’s no doubt in my mind that BTC will hit this price, but if you had to pin me down I’d say this is a very long-term bet. I can’t say whether we’ll see it this cycle. The S2F model is a useful one regarding BTC’s design and long-term scarcity, but the 30% error in PlanB’s predictions from last cycle is too far off to make any bets. I’m still holding a profit-taking level (a Mayer Multiple of 3.6), which is currently around $117k.
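For reference, the Mayer Multiple is just price divided by the 200-day moving average, so the profit-taking level falls straight out of it. The $32,500 moving average below is illustrative, chosen so the target matches the ~$117k figure:

```python
def mayer_multiple(price, ma_200d):
    """Price relative to the 200-day moving average."""
    return price / ma_200d

def profit_target(ma_200d, multiple=3.6):
    """Price at which the Mayer Multiple reaches `multiple`."""
    return multiple * ma_200d

# Illustrative: a $32,500 200-day MA puts the 3.6x level near $117k.
print(profit_target(32_500))                     # 117000.0
print(round(mayer_multiple(45_000, 32_500), 2))  # 1.38
```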

3 Altcoins That Could Replicate Solana’s 975% Annual Growth

Borroe Finance ($ROE): Pioneering Hassle-Free Fundraising
WEMIX (WEMIX): A Steady Growth and Diverse Ecosystem
Conflux (CFX): Blockchain with Hybrid Consensus and Tree-Graph Technology

AlgosOne AI Trading Solution
Put this in the bucket of things I’m willing to put money in but haven’t. Their front end is slick as hell, and it doesn’t have any schemey vibes at all. Unfortunately I’m unable to complete KYC due to non-US participation, but I’m still gonna shill my reflink:

A Messari Report: Crypto Theses for 2024

I usually read these religiously, but the report was restricted to paying members back in December. Now it’s available for freebie account holders. I’ll be putting this into a vector database for sure and using it to inform my 2024 strategy.

Navigating the Evolving Landscape of LLM Apps and Agent Frameworks

Exploring the Convergence of Workflow Management and AI in the Age of Automation

In the dynamic and rapidly evolving field of LLM applications and agent frameworks, staying ahead of the curve is a necessity. This blog post delves into our journey and insights as we navigate through various platforms and technologies, aiming to integrate advanced AI capabilities into our workflow and products.

Steamship to Super Agent: A Journey of Exploration

Our exploration began with the Steamship framework, offering a comprehensive environment for single-agent application development. However, as our needs evolved, we found ourselves seeking more control over the systems we run. This led us to Super Agent, a self-hosted solution offering a similar range of capabilities but with greater autonomy.

Integrating OpenAI’s Latest Innovations

A significant focus has been on integrating OpenAI’s latest offerings. We’re currently working on incorporating the Whisper model into Discord voice channels, aiming to enable the model to transcribe and respond using text-to-speech modules. This integration represents a step towards a fully multimodal business intelligence system.

The Promise of ChainLit and TaskWeaver

ChainLit has emerged as a compelling option for building Python LLM applications, akin to Streamlit but tailored for LLM apps. Simultaneously, we’re examining Microsoft’s TaskWeaver, a code-first agent framework that appears to merge workflow management with LLM capabilities seamlessly.

Navigating the Challenges of Observability and Management

Observability remains a significant challenge, especially given the limitations of OpenAI’s organizational structure. Tools like Pezzo Labs and Langfuse offer promising solutions, but the decision on which platform to commit to remains open.

Looking Forward: An Eye on the Future

Our journey is characterized by continuous learning and adaptation. We recognize the importance of not committing prematurely to any single framework, keeping our options open as the landscape of LLM apps and agent frameworks continues to evolve.


Embracing the Future of Personal Finance Management with Firefly III

As we head into 2024, I find myself reflecting on the past six months of unemployment and planning for the future. My strategy has been to live off savings, with plans to liquidate some cryptocurrency holdings to sustain myself and focus on my new software development venture.

Transition from Traditional to Modern Finance Management

For years, my wife and I have used GNU Cash to manage our household finances. This tool helped us track individual contributions, manage shared bills, and even handle larger expenses like home repairs. However, GNU Cash, with its roots in a previous era of software, often felt cumbersome, especially during the monthly reconciliation process.

Seeking a more efficient and modern solution, I turned to Firefly III, a free and open-source personal finance manager. Its robust features and intuitive design immediately caught my attention.

Firefly III: A New Era of Financial Management

Developed by James Cole, Firefly III stands out with its comprehensive feature set and ease of use. Setting it up was straightforward, thanks to the well-documented Docker container option. Firefly III not only facilitated the import of banking data through CSV files but also allowed for asset tracking and rule-based categorization of expenses.
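For anyone curious, a minimal Compose file is roughly all it takes to try it out. This is a sketch rather than the official recipe; `fireflyiii/core` is the published image, but consult the docs for the full set of environment variables:

```yaml
# docker-compose.yml: a minimal single-user Firefly III setup
services:
  firefly:
    image: fireflyiii/core:latest
    ports:
      - "8080:8080"
    environment:
      - APP_KEY=CHANGEME_CHANGEME_CHANGEME_32CHR  # must be exactly 32 characters
      - DB_CONNECTION=sqlite                      # fine for a single household
    volumes:
      - firefly_upload:/var/www/html/storage/upload

volumes:
  firefly_upload:
```

A `docker compose up -d` and a visit to localhost:8080 should get you to the setup screen.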

However, I encountered challenges with categorizing transfers accurately. Despite this, the ability to import two years of credit card data and 90 days of banking history with ease was a game changer compared to the laborious task of manual entry in GNU Cash.
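The kind of rule I ended up wanting is simple to express. This sketch only illustrates the keyword-matching idea, it isn’t Firefly III’s actual rule engine, and the merchant strings are made up:

```python
# (keyword, category) pairs, checked in order; first match wins.
RULES = [
    ("COSTCO", "Groceries"),
    ("SHELL", "Fuel"),
    ("XFER", "Transfer"),  # transfers were the case that tripped me up
]

def categorize(description, rules=RULES, default="Uncategorized"):
    """Assign a category based on substrings of the bank's description."""
    desc = description.upper()
    for needle, category in rules:
        if needle in desc:
            return category
    return default

print(categorize("COSTCO WHOLESALE #123"))   # Groceries
print(categorize("ONLINE XFER TO SAVINGS"))  # Transfer
print(categorize("COFFEE SHOP"))             # Uncategorized
```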

Looking Ahead: Integration and Automation

My next goal is to explore automated data import options, potentially through Plaid, to streamline the process further. This integration could pave the way for a more hands-off approach to finance management and budgeting for the coming year.

Firefly III has transformed how I approach personal finance management. It offers a modern, efficient, and flexible solution that aligns with our lifestyle and financial goals. I commend James Cole for his exceptional work on this project and highly recommend Firefly III to those seeking a contemporary finance management tool.

Notes on Culture DAO

Since getting laid off I’ve been tossing the idea of a new DAO around. I’ve been reading Iain Banks’s Culture series for the last few months, and the AI Minds from the series are what brought me to it. There was a tweet asking which sci-fi series was most likely, given the advances of GPT and other LLMs, and one of the answers was the Culture. So I picked up Consider Phlebas and now I’m halfway through the series.

Building an AI assistant has been one of the projects I’ve been working on — I named mine Zephyr — it named itself, actually, but I digress. The Westworld example of multiple LLM agents running around independently has fascinated me, and I want to build my own little world of these things. Some gaming projects are putting them to use, not only in player interactions but in other ways as well. A fishbowl, perhaps, that we can peer into and play god.

Seeing as I’ve spent the last year and a half working on the Star Atlas DAO, building a new one was the only thing I could think to do. The tech industry is notorious for its lack of unions, and I figured this could be a place for us to build a new group, a way for us to stick together and work toward some new network state.

And so, Culture DAO. If we want to take the optimistic path in our new AI-assisted future, then the Culture is probably the best vision of utopia that I’ve seen put to media. Of course we’ll have to be careful about how we associate ourselves with Banks’s vision — his estate has a deal with Amazon to develop a series — so we can’t take directly from it. But this idea of super-intelligent AIs that function as drones, ships, and habitats, as conscious individuals, should be general enough.

I put some of my thoughts about the project into GPT and published this What is Culture DAO post.