Lead/Senior Data Engineer (#1771)

REFERRAL BONUS
$1000
Ukraine, Europe
Work type:
Office/Remote
Technical Level:
Leader
Job Category:
Software Development
Project:
Intralogistics innovator

Currently, our customer is gearing up to revolutionise its data landscape by building a cutting-edge Enterprise Data Lakehouse Platform. We are forming multiple teams that will spearhead the creation of the platform's foundational components. These teams go beyond traditional data ingestion; they are architects of a microservices-driven platform, providing abstractions that empower other teams to extend the platform seamlessly.

We are seeking a dynamic and highly skilled Data Engineer with extensive experience building self-service, enterprise-scale data platforms on a microservices architecture to lead these foundational efforts. This role demands someone who not only possesses a profound understanding of the data engineering landscape but also has a very strong software engineering background, especially in building microservices frameworks and architectures. The ideal candidate will act as both an individual contributor and the technical lead, contributing significantly to platform development and actively shaping our data ecosystem.

Requirements:

  • Prior experience architecting and building successful self-service, enterprise-scale data platforms in a greenfield environment with a microservices-based architecture.
  • Proficiency in building end-to-end data platforms and data services in GCP is a must.
  • Proficiency in tools and technologies: BigQuery, Cloud Functions, Cloud Run, Dataform, Dataflow, Dataproc, SQL, Python, Airflow, PubSub.
  • Experience with microservices architectures: Kubernetes, Docker. Our microservices are built on a TypeScript, NestJS, and Node.js stack; candidates with this experience are preferred.
  • Experience building semantic layers.
  • Proficiency in architecting, designing, and developing batch and real-time streaming infrastructure and workloads.
  • Solid experience architecting and implementing metadata management, including data catalogues, data lineage, data quality, and data observability for big data workflows.
  • Hands-on experience with GCP ecosystem and data lakehouse architectures.
  • Strong understanding of data modeling, data architecture, and data governance principles.
  • Excellent experience with DataOps principles and test automation.
  • Excellent experience with observability tooling: Grafana, Datadog.

Responsibilities: 

  • Design and build self-service enterprise-scale data platforms using a microservices-based architecture from scratch in a greenfield environment.
  • Architect and implement microservices architectures. Create microservices frameworks and components that provide abstractions for seamless extension by other teams.
  • Build and integrate semantic layers within the data platform to enhance functionality and efficiency.
  • Design and develop infrastructure for batch and real-time streaming data workloads, ensuring efficient and scalable data processing.
  • Implement comprehensive metadata management solutions, including data catalogues, data lineage, data quality, and data observability for big data workflows.
  • Utilize hands-on experience with the GCP ecosystem to build and maintain data lakehouse architectures that support the company’s data strategy.
  • Implement DataOps principles and automate testing processes to ensure continuous integration, delivery, and deployment of data solutions.
  • Develop and use observability tools like Grafana and Datadog to monitor platform performance and ensure system reliability.
  • Act as both an individual contributor and a technical lead, guiding the development of the platform and shaping the data ecosystem. Provide mentorship and direction to other engineers in the team.
  • Work closely with multiple teams to align platform development with business goals and technical requirements. Facilitate seamless integration and extension of platform components by other teams.

We offer:

  • Flexible working format: remote, office-based, or hybrid
  • A competitive salary and good compensation package
  • Personalized career growth
  • Professional development tools (mentorship program, tech talks and trainings, centers of excellence, and more)
  • Active tech communities with regular knowledge sharing
  • Education reimbursement
  • Memorable anniversary presents
  • Corporate events and team buildings
  • Other location-specific benefits