Job Description

Responsibilities:

  • Develop and maintain robust data pipelines and architectures.
  • Collaborate with cross-functional teams to understand data requirements and implement solutions.
  • Optimize and scale data systems to improve performance and efficiency.
  • Manage and integrate data from various sources using technologies such as Kafka and RabbitMQ.
  • Monitor data quality and implement strategies for data governance.
  • Use Python, Postgres, Redis, the ELK stack, QuestDB, and other tools to build reliable data solutions.
  • Ensure the security and integrity of data platforms.

Requirements:

  • Proven experience as a Data Engineer or in a similar role.
  • Strong proficiency in Python programming.
  • Hands-on experience with databases like Postgres and Redis.
  • Familiarity with data storage and analytics tools such as the ELK stack and QuestDB.
  • Experience with messaging systems like Kafka and RabbitMQ.
  • Ability to use workflow orchestration platforms such as Airflow.
  • Excellent problem-solving skills and attention to detail.