Senior Data Engineer (#2130)

Europe, Ukraine, Colombia
Work type:
Office/Remote
Technical Level:
Senior
Job Category:
Software Development
Project:
American brand for home crafters

Position Overview:

As a Senior Data Engineer, you will play a crucial role in building and optimizing our data pipeline, which is central to our platform’s reliability and performance. You will be responsible for architecting and maintaining efficient data streams and integrating the latest technologies, with a particular focus on complex event processing (CEP), to support our strategic goals and build a robust data engineering ecosystem.

Data Engineering & Stream Processing:

  • Architect, develop, and maintain scalable data pipelines using AWS services such as Kinesis, Firehose, Glue, Lambda, and Redshift.
  • Implement and optimize stream processing solutions with frameworks like Apache Flink and Kafka to support real-time data ingestion, analytics, and complex event processing (CEP).
  • Design and build CEP systems to detect and respond to patterns of interest in real-time data streams.
  • Develop, optimize, and maintain the streams pipeline to ensure efficient, reliable, and scalable data processing.
  • Ensure data quality and reliability by implementing robust data validation, error handling, and monitoring frameworks.
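To make the CEP responsibility above concrete, here is a minimal sketch of the idea in plain Python: detecting a "failure followed by success within a time window" pattern across an event stream. In this role the equivalent logic would run inside Apache Flink/Kafka pipelines rather than in-memory, and the event names and fields below are purely hypothetical.

```python
from dataclasses import dataclass


@dataclass
class Event:
    user_id: str
    type: str
    timestamp_ms: int


def detect_pattern(events, first, second, window_ms):
    """Return user_ids that emit `first` and then `second` within `window_ms`."""
    last_first = {}   # user_id -> timestamp of most recent `first` event
    matches = []
    for e in sorted(events, key=lambda e: e.timestamp_ms):
        if e.type == first:
            last_first[e.user_id] = e.timestamp_ms
        elif e.type == second:
            t = last_first.get(e.user_id)
            if t is not None and e.timestamp_ms - t <= window_ms:
                matches.append(e.user_id)
                del last_first[e.user_id]
    return matches


# Hypothetical events: alice retries within the window, bob does not.
events = [
    Event("alice", "payment_failed", 1_000),
    Event("alice", "payment_succeeded", 3_000),
    Event("bob", "payment_failed", 1_000),
    Event("bob", "payment_succeeded", 20_000),  # outside the 5s window
]
print(detect_pattern(events, "payment_failed", "payment_succeeded", 5_000))
# prints ['alice']
```

A production CEP engine such as Flink's adds what this sketch omits: event-time semantics, out-of-order handling via watermarks, and checkpointed state.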

Monitoring, Observability & Optimization:

  • Utilize Apache Flink for real-time stream processing and monitoring to handle complex event processing and data transformations effectively.
  • Use tools like Prometheus and Grafana for observability to ensure the health and performance of the data pipeline.
  • Continuously monitor, troubleshoot, and optimize data pipeline performance to handle billions of events per month.

Collaboration & Integration:

  • Work closely with cross-functional teams, including front-end developers and data scientists, to build a robust data platform that meets current and future business needs.
  • Participate in the design and architecture of scalable data systems, integrate new data sources, and optimize existing data processes.
  • Write production-level code in Kotlin and Python to build data processing applications, automation scripts, and CEP logic.

Qualifications:

  • Bachelor’s degree in Computer Science or a related field, or equivalent experience.
  • 3+ years of experience in data engineering or a similar role.
  • Strong proficiency with AWS services such as Kinesis, Glue, Firehose, Lambda, and Redshift.
  • Expertise in stream processing frameworks like Apache Flink and Kafka.
  • Experience in designing and implementing Complex Event Processing (CEP) solutions for real-time data streams.
  • Solid experience in programming with Kotlin; familiarity with Python or other programming languages is a plus.
  • Demonstrated experience in developing, optimizing, and maintaining streaming data pipelines in large-scale environments.
  • Proven ability to build, optimize, and maintain complex data pipelines that handle billions of events per month.
  • Strong analytical and problem-solving skills with the ability to manage multiple complex projects and deadlines.
  • Good communication skills, with the ability to work effectively within cross-functional teams, and experience with Agile project management methodologies.

Preferred Skills:

  • Experience with cloud data infrastructure solutions in AWS.
  • Expertise in distributed data processing frameworks such as Apache Spark or similar.
  • Familiarity with OLAP databases such as Snowflake or Redshift.
  • Knowledge of C#/.NET Core for developing and maintaining data processing applications.
  • Familiarity with data integration tools such as Matillion and experience with large-scale data environments.
  • Ability to work collaboratively and adapt to dynamic project needs.

We offer:

  • Flexible working format - remote, office-based, or hybrid
  • A competitive salary and good compensation package
  • Personalized career growth
  • Professional development tools (mentorship program, tech talks and training sessions, centers of excellence, and more)
  • Active tech communities with regular knowledge sharing
  • Education reimbursement
  • Memorable anniversary presents
  • Corporate events and team buildings
  • Other location-specific benefits