About Our Client

Our client builds marketing intelligence platforms for e-commerce brands. Through attribution modeling and customizable dashboards, the platform gives brands a comprehensive view of their data, enabling them to make informed decisions for profitable growth.

As our client's team and customer base expand rapidly, they are seeking talented individuals to join their dynamic team. This is an exceptional opportunity for an experienced engineer to contribute to a fast-growing company and advance their career.

Our client fosters a culture of collaboration, personal growth, and technical excellence, and they welcome passionate individuals to join their team.

Job Description

As a Data Engineer with our client, you will collaborate with cross-functional teams comprising product managers, engineers, and business leaders to transform customer feedback into scalable data pipelines and products.

Working on marketing attribution for direct-to-consumer (DTC) markets, you will develop, enhance, and maintain complex data transformations across a diverse network of touchpoints. This involves ingesting data from advertising platforms, order management systems (e.g., Shopify, Amazon), and real-time event streams to keep operations running smoothly.

Success in this role hinges on your curiosity, expertise, and dedication to building robust data pipelines and applications at scale.

About the Role

Key Responsibilities:

  • Collaborate with customers, product teams, and customer support to conceptualize, develop, and refine high-value solutions.
  • Provide technical leadership to software engineers and shape long-term technical strategies with scalable architecture and best practices.
  • Design and maintain data ingestion pathways spanning APIs, file processing, and configurable inputs.
  • Lead end-to-end implementation of data and infrastructure engineering aspects within cross-functional initiatives.
  • Architect and optimize data pipeline infrastructure for readability, maintainability, and cost efficiency.
  • Interface with machine learning and data science systems to deliver actionable insights to customers.
  • Enhance and scale platform infrastructure as needed.
  • Develop comprehensive technical documentation for internal and external stakeholders.

About You

Requirements:

  • Ability to understand data processing requirements for both real-time and batch systems, including transactional and analytical workloads.
  • Experience with SQL, Java, Scala, Spark, and Python.
  • Proven track record in designing and deploying high-performance systems with robust monitoring and logging practices.
  • Familiarity with data pipeline orchestration tools and best practices.
  • Six to eight or more years of relevant experience.

Preferred Qualifications:

  • Experience with data engineering solutions such as BigQuery, Airflow, dbt, Kafka, and Pinot.
  • Previous involvement in marketing, e-commerce, or ad-tech domains.
  • Familiarity with cloud infrastructure, including Google Cloud Platform, Azure, or AWS.
  • Experience with or interest in leveraging infrastructure-as-code practices.

Values

Our client's team embodies the following values:

  • Growth Mindset: We embrace continuous learning and personal development.
  • Customer Focus: Our primary goal is to ensure customer satisfaction through our products.
  • Ownership Mentality: We take ownership of our work and think like business owners.
  • Radical Candor: Transparency and direct feedback are integral to our communication.

Benefits

  • Equity package
  • Competitive base salary
  • Healthcare benefits (medical, dental, vision)
  • Opportunities for travel to team meetings
  • Flexible PTO policy
  • 12 company-paid holidays