Serge Toarca

Startups, AI, and macroeconomics.

  1. I spent millions building a product with no revenue

    For the last decade, I've neglected my personal relationships. I was heads-down working on my company, and it always felt like taking time to socialize and talk to others was "not getting any work done". That mindset was... incredibly stupid, and it became obvious in an exchange I had on Facebook. First, some context:

    My oldest son just turned 4, and I was thoroughly disappointed with the schools available. So I decided to purchase a building[1] in the US and open a charter school with an accelerated curriculum[2], starting with teaching 3-year-olds to read.

    [Image: Future best school in Michigan]

    I didn't have enough money to buy it outright, so I went looking for a bank to finance the deal. The building I found was an office building with a high vacancy rate, but discounted enough that it was still cash-flow positive after debt service - this would allow me

    read more ⇁

  2. My adventures in entrepreneurship

    I built my business blind. Not "Ray Charles" blind, but "pilot flying in fog" blind. Except I didn't have any instruments.

    I incorporated right out of university. The business was Debuggex Inc., and the first product was a regex debugger. This was an ultra-niche (I didn't realize just how niche at the time) tool to help you understand regexes. A regex is like ctrl+F on steroids - it lets you find not just specific pieces of text, but complex patterns.

    Suppose you wanted to find all the dates in a document. You might want to match anything of the form YYYY-MM-DD. To do that, you can use a regex like this: \d{4}-\d{2}-\d{2}. A few squigglies, but overall, not terrible.
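
    To see it in action, here's a minimal sketch using Python's re module (the sample text is made up):

    import re

    text = "Launched on 2013-05-22, last updated 2014-01-07."

    # \d{4}-\d{2}-\d{2}: four digits, a dash, two digits, a dash, two digits.
    print(re.findall(r"\d{4}-\d{2}-\d{2}", text))
    # -> ['2013-05-22', '2014-01-07']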

    Now, let's say you wanted to match an email address. It's slightly more compli...

    (?:[a-z0-9!#$%&'*+/=?^_`{|}~-]+(?:\.[a-z0-9!#$%&'*+/=?^_`{|}~-]
    read more ⇁

  3. Content-defined chunking: unreasonably effective compression

    Content-defined chunking runs approximately as fast as gzip and can achieve compression ratios better than 100:1 on certain classes of data.[1]

    Suppose we have several webpages that frequently get updated, and we want to store every update of every page. The most naive way to do it would be to save a full copy of the page every time we fetch it, and run a standard compression algorithm like zstd on each copy. For simplicity, let's say that each page is about 100kB and that the compression ratio of zstd on the typical page is 5:1.[2]
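
    As a sketch of that naive baseline (my own illustration, assuming the python-zstandard package; the helper name is not from the article):

    import zstandard as zstd

    snapshots: list[bytes] = []

    def store_snapshot(page: bytes) -> None:
        # Naive baseline: compress each snapshot from scratch with zstd.
        snapshots.append(zstd.ZstdCompressor().compress(page))

    store_snapshot(b"<html>... ~100kB of page ...</html>")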

    To store n updates, we would require n * 20kB of storage. Can we do better? We notice that occasionally, one of the updates is identical to a previous one. That sounds like an opportunity to improve. Before saving, we hash the content to check for duplication. We only save the content and

    read more ⇁

  4. BitCollapse: Bitcoin's hard problem of latency

    High latency between miners gives an exponential advantage to those closest to the highest concentration of compute. Unless this problem is solved, it is not possible to mine Bitcoin on multiple planets at the same time. Mining "collapses" to the single largest low-latency mining cluster.

    Consider the following thought experiment.

    There are exactly 2 computers in the universe, sitting 60 light-minutes apart. Each is exactly as powerful as the other, and both are spending all of their compute mining bitcoin. Suppose they both start mining the same block at the same time ("same time" according to an observer sitting halfway between them). After about 10 minutes, one of them (call it computer A) will have mined the first block. A sends it over to B.
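
    Here's a toy simulation of that race - my own sketch, not from the post. It assumes block times are exponentially distributed (mining is memoryless), with each computer averaging one block per 20 minutes so that together they match the 10-minute pace above:

    import random

    MEAN = 20.0     # minutes per block for one computer; the pair averages ~10
    LATENCY = 60.0  # one-way signal delay between A and B, in minutes

    def blocks_found(until: float) -> list[float]:
        # One computer's block finds form a Poisson process.
        times, t = [], 0.0
        while True:
            t += random.expovariate(1.0 / MEAN)
            if t > until:
                return times
            times.append(t)

    def blocks_in_the_dark() -> int:
        a = blocks_found(500.0)
        b = blocks_found(500.0)
        winner_time = min(a[0], b[0])
        loser = b if a[0] <= b[0] else a
        # The slower computer keeps mining its own chain until the
        # winner's block arrives, LATENCY minutes after it was found.
        return sum(1 for t in loser if t <= winner_time + LATENCY)

    rounds = 10_000
    print(sum(blocks_in_the_dark() for _ in range(rounds)) / rounds)  # ~3.0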

    But it takes 60 minutes for the block to arrive at computer B. In that time, B will have mined an

    read more ⇁

  5. Why the hell am I building a product with a tiny market?

    This article was originally published on debuggex.com

    Two months ago, I launched a regex tester.

    Why would I ever build a product around helping people with their regular expressions? The market is tiny. There are dozens of free alternatives, and only a small percentage of people I've asked said they would pay for my product.

    There are a few big advantages to be had competing in a smaller market.

    In a smaller market, everything happens on a smaller scale. Successes and failures are smaller in magnitude and take less time to pan out. Less effort is required to build a competitive product, since the existing ones are not as well-developed. The result is a tighter feedback loop for your learning. Debuggex is my first product, so I want to optimize for learning, not just profit.

    To date, I've spent less than

    read more ⇁