Showing posts from 2018

Using survival plot to analyze churn in Power BI

I didn't expect to be working with Kaplan-Meier survival plots so soon. Analyzing churn and figuring out which kinds of users are most likely to churn is not easy. To calculate churn, you need a good volume of users you can follow month over month: the number of active users per month and the share of them lost each month. But what if you want to select a different set of users, another segment? Producing an analytics cube with the necessary dimensions takes time, and if you end up with a segment that doesn't have high volumes every month, interpreting the results can be quite tricky.
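The post itself works in Power BI; purely as an illustration of what the Kaplan-Meier estimator computes, here's a minimal pure-Python sketch. The cohort data is made up, and "churned" vs. "still active" stands in for the event/censoring distinction:

```python
from collections import Counter

def kaplan_meier(observations):
    """Kaplan-Meier survival estimate from (duration, churned) pairs.

    duration = months the user was observed; churned = True if the user
    churned at that time, False if they were still active (censored).
    Returns a list of (time, survival_probability) points.
    """
    churn_events = Counter(t for t, churned in observations if churned)
    survival = 1.0
    curve = []
    for t in sorted({t for t, _ in observations}):
        at_risk = sum(1 for u, _ in observations if u >= t)  # still observed at t
        d = churn_events.get(t, 0)                           # churn events at t
        if at_risk > 0 and d > 0:
            survival *= 1 - d / at_risk
        curve.append((t, survival))
    return curve

# Toy cohort: (months observed, churned?)
cohort = [(1, True), (2, False), (3, True), (3, True), (5, False), (6, True)]
for t, s in kaplan_meier(cohort):
    print(t, round(s, 3))
```

The key property, and the reason the plot handles small segments gracefully, is that users who leave the observation window without churning still contribute to the at-risk denominator for as long as they were observed.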

Create a funnel analysis tool with Redshift and Power BI in 5 minutes

If you're not collecting events from your product, get started right away! Events are a great way to collect behavioral data on how your users use your product: what paths they take, what errors they encounter, how long something takes, etc. Once you have events, there isn't much you can't analyze.
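The post builds the funnel on Redshift and Power BI; the core counting logic is the same anywhere, so here's a small illustrative sketch in Python. The event names and user IDs are invented for the example:

```python
def funnel_counts(user_events, steps):
    """Count how many users reach each funnel step, in order.

    user_events: dict of user_id -> list of event names in time order.
    steps: ordered list of funnel step names.
    A user counts for step i only after matching steps 0..i-1 first.
    """
    counts = [0] * len(steps)
    for events in user_events.values():
        stage = 0
        for e in events:
            if stage < len(steps) and e == steps[stage]:
                stage += 1
        for i in range(stage):
            counts[i] += 1
    return dict(zip(steps, counts))

# Hypothetical event streams per user
events = {
    "u1": ["signup", "create_invoice", "send_invoice"],
    "u2": ["signup", "create_invoice"],
    "u3": ["signup"],
    "u4": ["create_invoice"],  # never signed up, so never enters the funnel
}
print(funnel_counts(events, ["signup", "create_invoice", "send_invoice"]))
# → {'signup': 3, 'create_invoice': 2, 'send_invoice': 1}
```

In SQL you'd typically express the same thing with window functions over the events table; the point is only that ordered per-user event data is all a funnel needs.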

How to ETL in Amazon AWS? AWS Glue for dummies

You can do ETL in AWS in a few different ways: Glue, Data Pipeline, or a custom solution, e.g. a Docker

An insights strategy for winning companies

An executive summary: companies struggle to gain maximum benefit from analytics and insights because:
- Analytics is seen as a support function, not a business partner, and is therefore not prioritized highly enough
- Analytics is separated from business processes, and insights are produced away from execution
- The analytics stack is under-resourced and inflexible, which doesn't enable the speed to react to changing needs

A more holistic purpose of analytics

Why do people make bad decisions? Mostly because of insufficient or wrong information, but not always. So what should be done about it?

Simple Big Data setup on Amazon AWS

Everyone wants to do some big data stuff, right? In all honesty, no-one cares if your data is big or small - size doesn't matter. What matters is your ability to take any size of data and generate understanding from it. At some point the data you are gathering might become inconvenient to process with more traditional tools. Some big data tools might help you there - or not. The bottom line is, it's a tool you want to have in your toolbox.

Simple way to query Amazon Athena in python with boto3

ETL takes time and it's a lot to maintain. Sometimes it breaks when you didn't expect a string to contain emojis. You might decide the transformation needs to be changed, which means you need to refresh all your data. So what can you do to avoid this?
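A minimal sketch of the boto3 pattern the post describes: submit the query, poll until it finishes, then read the results. The database name, bucket, region, and query here are examples, not anything from the post:

```python
import time

def run_athena_query(sql, database, output_location, region="eu-west-1"):
    """Submit a query to Athena and block until it finishes."""
    import boto3  # imported here so the pure helper below works without boto3
    client = boto3.client("athena", region_name=region)
    qid = client.start_query_execution(
        QueryString=sql,
        QueryExecutionContext={"Database": database},
        ResultConfiguration={"OutputLocation": output_location},
    )["QueryExecutionId"]
    while True:
        status = client.get_query_execution(QueryExecutionId=qid)
        state = status["QueryExecution"]["Status"]["State"]
        if state in ("SUCCEEDED", "FAILED", "CANCELLED"):
            break
        time.sleep(1)
    if state != "SUCCEEDED":
        raise RuntimeError(f"Query {qid} ended in state {state}")
    return client.get_query_results(QueryExecutionId=qid)

def rows_to_dicts(result_set):
    """Turn an Athena ResultSet into a list of dicts (first row = header)."""
    rows = result_set["ResultSet"]["Rows"]
    header = [c.get("VarCharValue") for c in rows[0]["Data"]]
    return [
        dict(zip(header, (c.get("VarCharValue") for c in row["Data"])))
        for row in rows[1:]
    ]

if __name__ == "__main__":
    results = run_athena_query(
        "SELECT event_name, count(*) AS n FROM events GROUP BY 1",
        database="analytics",                       # example database name
        output_location="s3://my-athena-results/",  # example bucket
    )
    print(rows_to_dicts(results))
```

Because Athena reads straight from S3, a transformation change means editing the SQL and re-running it, not re-loading the data - which is exactly the maintenance burden the post is trying to avoid.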

Know exactly how much you pay to acquire any user: Python with Google API

So you've read about how to optimize your marketing efforts through data. With that, you should know the kind of users different marketing campaigns are bringing in. Some campaigns might be bringing in more high-quality users than others. Can you do that now?

What is the easiest way to train a neural network?

We've been surveying our customers since forever, but recently we've become more and more hungry for data. As an invoicing service provider for micro-entrepreneurs, Zervant's customers come in all varieties. One interesting snippet we've found is that roughly half of our customers are part-time entrepreneurs. But which half is it?
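The post's actual tooling isn't shown in this teaser; as a rough illustration of what "training" means at its simplest, here's a single logistic neuron fitted with gradient descent in pure Python. The features and labels are entirely made up:

```python
import math

def train_neuron(samples, epochs=500, lr=0.5):
    """Train one logistic neuron with per-sample gradient descent.

    samples: list of (features, label) with label 0 or 1.
    Returns (weights, bias).
    """
    n = len(samples[0][0])
    w = [0.0] * n
    b = 0.0
    for _ in range(epochs):
        for x, y in samples:
            z = sum(wi * xi for wi, xi in zip(w, x)) + b
            p = 1 / (1 + math.exp(-z))  # sigmoid activation
            g = p - y                   # gradient of log-loss w.r.t. z
            w = [wi - lr * g * xi for wi, xi in zip(w, x)]
            b -= lr * g
    return w, b

def predict(w, b, x):
    z = sum(wi * xi for wi, xi in zip(w, x)) + b
    return 1 / (1 + math.exp(-z))

# Toy data: classify "part-time" (1) vs "full-time" (0) from two made-up features
data = [([0.0, 0.1], 0), ([0.2, 0.0], 0), ([0.9, 0.8], 1), ([1.0, 0.9], 1)]
w, b = train_neuron(data)
print([round(predict(w, b, x)) for x, _ in data])
```

A real network adds hidden layers and a framework does the gradients for you, but the loop - forward pass, loss gradient, weight update - is the same.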

Prioritize your features with the Kano method

Prioritizing new features requires intuition, but it can definitely be made better by using data. As there's always a limited amount of bandwidth to do stuff, you need to relentlessly prioritize what to do next. An easy way to fail is to be too busy doing the wrong things.
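The Kano method classifies a feature from paired survey questions: how would you feel if the feature were present (functional), and if it were absent (dysfunctional)? A small sketch of the standard evaluation table, with hypothetical feature and answer data:

```python
from collections import Counter

def kano_category(functional, dysfunctional):
    """Classify one answer pair using the standard Kano evaluation table.

    Answers: "like", "must_be", "neutral", "live_with", "dislike".
    functional = reaction if the feature IS present,
    dysfunctional = reaction if it is ABSENT.
    """
    middling = {"must_be", "neutral", "live_with"}
    if functional == dysfunctional and functional in ("like", "dislike"):
        return "questionable"  # contradictory answers
    if functional == "like":
        return "performance" if dysfunctional == "dislike" else "attractive"
    if functional in middling:
        if dysfunctional == "dislike":
            return "must_be"
        if dysfunctional == "like":
            return "reverse"
        return "indifferent"
    return "reverse"  # functional == "dislike"

def classify_feature(answer_pairs):
    """Majority Kano category across respondents for one feature."""
    votes = Counter(kano_category(f, d) for f, d in answer_pairs)
    return votes.most_common(1)[0][0]

# Hypothetical survey answers for an "export to PDF" feature
answers = [("like", "dislike"), ("like", "neutral"), ("like", "dislike")]
print(classify_feature(answers))  # → performance
```

Must-be features never delight but their absence hurts; attractive features delight but nobody misses them yet - which is exactly the distinction a raw feature-request count can't give you.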
