Posts

Showing posts from January, 2021

Don't use gross churn to compare SaaS businesses

Whether you're optimizing your ad spend or in talks with investors about what your SaaS business is worth, you need to calculate your customer lifetime value right. And while there are good shortcuts, using them can easily lead to a completely wrong number, resulting in really bad decisions. In my role I've come to realize that the shortcuts rarely work well enough. In this article I'll cover the following challenges in calculating churn:

- Fluctuating / seasonal churn
- Churn-and-return customers
- Churn of different price tiers is asymmetric
- Churn is non-linear over time
- Saturated markets

And finally, the best way to calculate lifetime value that accounts for these challenges.

Calculating lifetime from retention

Typically, lifetime value is calculated by dividing your average monthly revenue per account (ARPA) by your monthly churn. You can get your ARPA by dividing your monthly recurring revenue (MRR) by the number of paying customers you have. And so, if your ARPA is 10 € and y
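As a quick illustration of that shortcut (a minimal sketch, not the article's full method), ARPA and a naive lifetime value could be computed in SQL roughly like this; the table and column names (workspace.subscriptions, mrr_amount, customer_id, billing_month) and the 3 % monthly churn figure are hypothetical:

-- Sketch only: ARPA = MRR / paying customers, naive LTV = ARPA / monthly churn.
-- All table/column names and the churn rate below are assumptions, not from the article.
WITH monthly AS (
    SELECT
        SUM(mrr_amount)             AS mrr,               -- total monthly recurring revenue
        COUNT(DISTINCT customer_id) AS paying_customers   -- number of paying accounts
    FROM workspace.subscriptions
    WHERE billing_month = '2021-01'
)
SELECT
    mrr / paying_customers          AS arpa,
    (mrr / paying_customers) / 0.03 AS naive_ltv          -- assumes a 3 % monthly churn rate
FROM monthly;

As the excerpt notes, this kind of naive division can easily give a completely wrong number when churn fluctuates, varies by price tier, or is non-linear over time.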

Snowflake UPSERT operation (aka MERGE)

You want to insert data into a table, but if a corresponding row already exists (by some rule, e.g. a unique key) you want to update it instead of adding a new row, keeping the dataset's uniqueness requirements intact. That's an "UPDATE or INSERT" operation, or UPSERT. Some SQL dialects have native support for it: PostgreSQL supports UPSERT natively, and MySQL supports it with INSERT ... ON DUPLICATE KEY UPDATE. How do you do UPSERT on Snowflake? Here's how:

Snowflake UPSERT, i.e. the MERGE operation

Snowflake's UPSERT is called MERGE, and it works just as conveniently; it just has a different name. Here's the simple usage:

MERGE INTO workspace.destination_table d
USING workspace.source_table s
    ON d.id = s.id AND d.val1 = s.val1
WHEN MATCHED THEN
    UPDATE SET d.val2 = s.val2, d.val3 = s.val3
WHEN NOT MATCHED THEN
    INSERT (id, val1, val2, val3) VALUES (s.id, s.val1, s.val2, s.val3);

Here the destination_table and source_table are of similar form,
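For comparison, here are minimal sketches of the same upsert in the other dialects mentioned above. These are assumptions for illustration: they reuse the hypothetical destination_table / source_table columns and assume a unique constraint covering (id, val1).

-- PostgreSQL: native UPSERT via INSERT ... ON CONFLICT
-- (assumes a unique index on (id, val1))
INSERT INTO destination_table (id, val1, val2, val3)
SELECT id, val1, val2, val3 FROM source_table
ON CONFLICT (id, val1)
DO UPDATE SET val2 = EXCLUDED.val2, val3 = EXCLUDED.val3;

-- MySQL: INSERT ... ON DUPLICATE KEY UPDATE
-- (matches on any unique key defined on destination_table)
INSERT INTO destination_table (id, val1, val2, val3)
SELECT id, val1, val2, val3 FROM source_table
ON DUPLICATE KEY UPDATE val2 = VALUES(val2), val3 = VALUES(val3);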
