Senior Data Engineer (#2275)

Ukraine, Europe
Work type:
Office/Remote
Technical Level:
Senior
Job Category:
Software Development

N-iX is looking for a Senior Data Platform Engineer to join the team.

About the project

Our client is the leading retail chain in Uzbekistan, with about a million regular customers and more than 125 stores across 11 regions. The company operates supermarkets, neighborhood stores, convenience stores, a wholesale store, and an online supermarket. The client also runs two warehouses (dry and temperature-controlled) and a fruit-and-vegetable hub.

We aim to deploy a robust, flexible, and scalable Data Warehouse, BI platform, and analytics layer, along with data processing tools (e.g., Python workbooks) and integrations with the client's IT systems. The project will strengthen data governance practices, support significant company growth, and lay the groundwork for advanced AI/ML solutions.

Responsibilities:

  • Design and build scalable data pipelines to ingest, process, and transform large volumes of data from various sources.
  • Implement ETL/ELT processes to integrate data into the Data Warehouse.
  • Develop and optimize queries for performance and scalability.
  • Maintain and support the Data Warehouse, ensuring high availability and reliability.
  • Collaborate with data scientists, analysts, and other stakeholders to understand data requirements and deliver solutions.
  • Monitor and troubleshoot data pipeline issues, ensuring data integrity and quality.
  • Implement best practices for data engineering, including code reviews, testing, and version control.
  • Work in an Agile, collaborative environment to build, deploy, and maintain data systems.
  • Ensure data solutions are compliant with security, privacy, and governance standards.
  • Develop and maintain comprehensive documentation for all data engineering processes.

Requirements:

  • Bachelor’s degree in Computer Science, Information Systems, or a related field.
  • Proven experience (5+ years) as a Data Engineer, Data Architect, or similar role.
  • Proficiency with ETL tools and frameworks such as Apache Airflow, dbt, or AWS Glue.
  • Strong knowledge of Data Warehouse concepts and experience with platforms like Snowflake, AWS Redshift, or similar.
  • Experience in designing and managing scalable Data Lakes using cloud platforms (e.g., AWS S3, Azure Data Lake), optimized for large-scale data storage and retrieval.
  • Expertise in columnar and row-based file formats (Avro, Parquet, ORC), ensuring efficient storage, schema evolution, and high-performance processing in distributed environments.
  • Solid experience with SQL and relational databases (e.g., PostgreSQL, MySQL, SQL Server).
  • Familiarity with big data technologies (e.g., Hadoop, Spark).
  • Strong programming skills in Python or other relevant languages.
  • Experience with cloud platforms, particularly AWS.
  • Understanding of data governance, data quality, and data security best practices.
  • Excellent problem-solving skills and attention to detail.
  • Strong communication skills, both verbal and written.

Nice to Have:

  • Experience with SAP BW/4HANA, SAP S/4HANA.
  • Knowledge of data visualization and BI tools, such as Tableau or Power BI.
  • Understanding of AI/ML concepts and integration within data pipelines.
  • Certification in cloud services (AWS, Azure, GCP).

We offer:

  • Flexible working format: remote, office-based, or hybrid
  • A competitive salary and strong compensation package
  • Personalized career growth
  • Professional development tools (mentorship program, tech talks and trainings, centers of excellence, and more)
  • Active tech communities with regular knowledge sharing
  • Education reimbursement
  • Memorable anniversary presents
  • Corporate events and team buildings
  • Other location-specific benefits