About the Job

A cutting-edge Web3 startup is leveraging NFTs to tokenize real estate data on an Avalanche subnet. The company is transforming how municipalities track and measure code enforcement for real estate properties.

Responsibilities:

  • Design and implement data pipelines for ingesting and processing real estate and code enforcement data.
  • Develop ETL processes to integrate data from various sources into the PostgreSQL database.
  • Create and maintain data models to support efficient querying and analysis.
  • Implement data quality checks and monitoring systems.
  • Collaborate with backend developers and data analysts to ensure data availability and integrity.
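To give a flavor of the day-to-day work in the first two bullets, here is a minimal extract-transform-load sketch with a data-quality gate. All field names and sample records are illustrative assumptions, and sqlite3 stands in for PostgreSQL so the example is self-contained:

```python
import sqlite3

# Hypothetical raw code-enforcement records, as they might arrive from a
# municipal data feed (field names are illustrative, not from this posting).
RAW_RECORDS = [
    {"parcel_id": "AX-1001", "violation": "Overgrown Lot", "fine_usd": "150"},
    {"parcel_id": "AX-1002", "violation": "Unpermitted Fence", "fine_usd": "300"},
    {"parcel_id": "", "violation": "missing id", "fine_usd": "50"},  # fails QC
]

def transform(record):
    """Normalize one raw record: strip text fields, cast the fine to int."""
    return {
        "parcel_id": record["parcel_id"].strip(),
        "violation": record["violation"].strip().lower(),
        "fine_usd": int(record["fine_usd"]),
    }

def passes_quality_checks(record):
    """Basic data-quality gate: parcel id present, fine non-negative."""
    return bool(record["parcel_id"]) and record["fine_usd"] >= 0

def load(records, conn):
    """Insert validated records into the target table; return the row count."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS code_enforcement ("
        "parcel_id TEXT, violation TEXT, fine_usd INTEGER)"
    )
    rows = [(r["parcel_id"], r["violation"], r["fine_usd"]) for r in records]
    conn.executemany("INSERT INTO code_enforcement VALUES (?, ?, ?)", rows)
    conn.commit()
    return len(rows)

def run_pipeline(raw_records, conn):
    """Extract -> transform -> quality-check -> load; return rows loaded."""
    clean = [transform(r) for r in raw_records]
    valid = [r for r in clean if passes_quality_checks(r)]
    return load(valid, conn)

conn = sqlite3.connect(":memory:")
loaded = run_pipeline(RAW_RECORDS, conn)
```

In production this logic would typically be split into orchestrated tasks (e.g., Airflow operators) writing to PostgreSQL, with the quality gate reporting rejected rows to a monitoring system rather than silently dropping them.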

Requirements:

  • 3+ years of data engineering experience, primarily in Python.
  • Experience with relational databases (e.g., PostgreSQL), including performance tuning and optimization.
  • Proficiency in designing and implementing ETL processes.
  • Familiarity with data modeling techniques and best practices.
  • Familiarity with data pipeline orchestration (e.g., Airflow, GitHub Actions).

Nice-to-Have:

  • Experience with or interest in smart contract development using Solidity.
  • Understanding of blockchain technologies, particularly Avalanche.
  • Knowledge of tokenization concepts and NFTs in the context of real estate data.