1. Introduction 🏁

Main Dashboard: https://public.tableau.com/app/profile/rodolfo/viz/Uniswapv3TraderBehavious/Dashboard?publish=yes

GitHub Repo: https://github.com/LimaRods/Uniswap-User-Behaviour/tree/main

1.1 Protocol Introduction

Uniswap was launched in November 2018 and has since become one of the most popular decentralized exchanges in the cryptocurrency space. It is known for its simple and user-friendly interface, low fees, and support for a wide range of tokens. As of July 2023, Uniswap is the largest decentralized exchange by trading volume, averaging over $1 billion in daily trading volume, and has processed over $1.5 trillion in cumulative volume since its launch.

1.2 Objective

As a data consultant for Uniswap, I aim to deliver a comprehensive business growth report. This report will offer insights into Uniswap v3's customers, focusing on enhancing profitability, revenue, retention, and usage metrics. The project will involve:

  1. Crafting a dataset for analytical purposes.
  2. Detailing the dataset creation methodology.
  3. Examining pivotal customer metrics across various cohorts over time.
  4. Proposing actionable strategies to augment key business performance indicators.

2. Methodology 🔬

It's important to mention that the scope of the project is restricted to Uniswap v3 pools, since it is easier to find reliable, structured data sources that provide the fee percentage for hundreds of different pools.

2.1 Data Sources & Tools 🛠️

**Flipside** is the data provider for this project. We retrieved data from tables such as `ethereum.core.ez_dex_swap`, `ethereum.uniswapv3.ez_pools`, and `ethereum.core.fact_transactions`. Using Flipside's new LiveQuery feature, we were able to query external APIs and combine off-chain sources with Flipside data; in particular, we pulled the FEDFUNDS series from the FRED API.
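As a rough sketch of the FEDFUNDS pull, the snippet below builds a request against FRED's documented `series/observations` JSON endpoint and parses the response; the helper names are illustrative, a real API key is required, and this is not the report's actual LiveQuery code:

```python
# Hedged sketch: fetching the FEDFUNDS series from the FRED API.
# Endpoint and response shape follow FRED's public JSON format;
# function names and the stubbed sample below are illustrative.
import json
from urllib.parse import urlencode
from urllib.request import urlopen

FRED_URL = "https://api.stlouisfed.org/fred/series/observations"

def fedfunds_url(api_key: str) -> str:
    """Build the request URL for the FEDFUNDS series."""
    params = {"series_id": "FEDFUNDS", "api_key": api_key, "file_type": "json"}
    return f"{FRED_URL}?{urlencode(params)}"

def parse_observations(payload: dict) -> list:
    """Extract (date, rate) pairs, skipping FRED's '.' placeholder for missing values."""
    return [
        (obs["date"], float(obs["value"]))
        for obs in payload["observations"]
        if obs["value"] != "."
    ]

def fetch_fedfunds(api_key: str) -> list:
    """Network call: download and parse the full FEDFUNDS series."""
    with urlopen(fedfunds_url(api_key)) as resp:
        return parse_observations(json.load(resp))

# Example with a stubbed response (no network call):
sample = {"observations": [
    {"date": "2023-06-01", "value": "5.08"},
    {"date": "2023-07-01", "value": "."},
]}
print(parse_observations(sample))  # [('2023-06-01', 5.08)]
```

The parsed series can then be joined to the on-chain swap data by month, which is essentially what LiveQuery lets us do directly in SQL.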

The main language used in this project was SQL, followed by Python, which helped us build data pipelines and run some advanced algorithms. We chose Tableau to provide comprehensive, clear cohort charts. The dataset that feeds the dashboard is stored in **Google Sheets** and can be updated by running the Python notebook `main.ipynb`.
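The cohort tables behind charts like these are typically built by assigning each trader to the month of their first swap and counting how many remain active in later months. The sketch below shows that shape with pandas on toy data; the column names are illustrative, not the report's actual schema:

```python
import pandas as pd

# Toy swap-level data: one row per (trader, swap date).
# Column names are illustrative assumptions, not the project's schema.
swaps = pd.DataFrame({
    "trader": ["a", "a", "b", "b", "c"],
    "date": pd.to_datetime(
        ["2023-01-05", "2023-02-10", "2023-01-20", "2023-03-01", "2023-02-15"]
    ),
})

swaps["month"] = swaps["date"].dt.to_period("M")
# Cohort = month of a trader's first swap.
swaps["cohort"] = swaps.groupby("trader")["month"].transform("min")
# Periods elapsed since the cohort month (Period subtraction yields an offset).
swaps["period"] = (swaps["month"] - swaps["cohort"]).apply(lambda d: d.n)

# Distinct active traders per cohort per period -> retention matrix for Tableau.
retention = (
    swaps.groupby(["cohort", "period"])["trader"]
    .nunique()
    .unstack(fill_value=0)
)
print(retention)
```

Exporting a long-format version of this matrix to Google Sheets is all Tableau needs to render the cohort heatmaps.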

All code is available in the GitHub repo.

2.2 Data Transformation & Decisions 🧐

Three datasets were crafted to run the analysis.