Analytics Engineer

$70k - $100k + Stock

Argyle is a fast-growing, remote-first Series B startup solving a systemic data problem.

Underneath the consumer finance industry’s decisions and processes is static, analog documentation—things like credit reports and paystubs—designed decades ago for a world that no longer exists. Meanwhile, credit bureaus buy, move, store, and sell consumers’ data without their knowledge or consent.

The result? A labyrinth of manual workflows and shortsighted underwriting models that obstructs financial access, compounds operational costs, and impedes innovation.

The solution is Argyle. We’re a real-time income data platform that lets our end-users instantly connect their employment records to apps and websites, so they can access and qualify for the financial resources they need to get ahead. Providers benefit from streamlined workflows and enhanced visibility that reduce costs and risk across the user journey.

Our mission is to give consumers the means to exercise ownership over their income, employment, and identity data in order to create a more equal, efficient, and effective financial system for everybody.

If you’re looking to join a fun and ambitious group of people working remotely across dozens of countries, apply today.

About the team

The Analytics team is focused on enabling and nurturing the data-driven aspects of Argyle, answering questions from "How are Argyle clients using the API?" to "What do user journeys look like in Argyle Link?" and beyond. We dig through terabytes of data, shape it into a standard structure ready for exploration, generate deep insights, and build intuitive dashboards for everyone to see - these are some of the ways our Analytics team makes that happen.

What will you do?

  • Provide clean and documented datasets ready for analysis
  • Manage the overall data pipeline orchestration
  • Extract data from different sources using data extraction platforms, by consuming Kafka topics, or by writing custom integrations
  • Develop tests and monitoring to ensure overall pipeline stability and data quality
  • Collaborate with data analysts, product, and engineering to scope data analysis needs as part of key decision flows
  • Support general data-related projects as required across Argyle's product suite
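To give a flavor of the testing and data-quality work described above, here is a minimal Python sketch in the spirit of dbt's `not_null` and `unique` tests. The function names, column names, and sample rows are illustrative only, not Argyle's actual checks or schema:

```python
# Hypothetical data-quality checks, loosely modeled on dbt's
# not_null and unique tests. All names and data are made up.

def check_not_null(rows, column):
    """Return the rows where `column` is missing or None."""
    return [r for r in rows if r.get(column) is None]

def check_unique(rows, column):
    """Return values of `column` that appear more than once."""
    seen, dupes = set(), set()
    for r in rows:
        value = r.get(column)
        if value in seen:
            dupes.add(value)
        seen.add(value)
    return sorted(dupes)

# Sample rows, as a pipeline test might receive them.
rows = [
    {"user_id": "u1", "income": 4200},
    {"user_id": "u2", "income": None},
    {"user_id": "u1", "income": 3100},
]

print(check_not_null(rows, "income"))  # one row with a missing income
print(check_unique(rows, "user_id"))   # ['u1'] appears twice
```

In practice these rules would live in the warehouse layer (e.g. as dbt tests over BigQuery tables) rather than in ad-hoc Python, but the underlying idea is the same: assert the invariants, surface the violating rows.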

What are we looking for?

  • Excellent knowledge of SQL
  • Experience developing and maintaining Data Warehouses (we use BigQuery + dbt)
  • Proven Python development skills to accomplish various data-related tasks like writing Kafka consumers, Airflow DAGs, etc.
  • Working knowledge of software deployment and CI/CD processes (familiarity with Kubernetes and the GCP ecosystem is a bonus)
  • Ability to run multiple data analytics projects independently
  • Ability to work with various parts of Argyle (Product, Engineering, Finance, etc.)
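As a taste of the day-to-day SQL work, here is a toy funnel query over Link-style events. It uses an in-memory SQLite database for self-containment; the table and column names are invented for illustration and are not Argyle's schema:

```python
import sqlite3

# Toy example: count distinct users reaching each step of a
# hypothetical Argyle Link funnel. Names and data are made up.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE link_events (user_id TEXT, step TEXT);
    INSERT INTO link_events VALUES
        ('u1', 'opened'), ('u1', 'connected'),
        ('u2', 'opened'),
        ('u3', 'opened'), ('u3', 'connected');
""")

funnel = conn.execute("""
    SELECT step, COUNT(DISTINCT user_id) AS users
    FROM link_events
    GROUP BY step
    ORDER BY users DESC
""").fetchall()

print(funnel)  # [('opened', 3), ('connected', 2)]
```

The same shape of query runs against BigQuery in production settings; SQLite just keeps the sketch runnable anywhere.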

Why Argyle?

  • Remote-first company
  • International environment
  • Health Insurance (US residents only)
  • Flexible working hours
  • Stock Options
  • Flexible vacation leave
  • $1,000 after a month of employment to set up your home office
  • MacBook
  • Annual Company Performance Bonus

Argyle embraces diversity and equal opportunity. We are committed to building a team that represents a variety of backgrounds, perspectives, and skills. The more inclusive we are, the better our company will be.