Data Engineer (#14748265)

SPECIAL REFERRAL BONUS $1500
Work type:
Flexible (Office/Remote)
Technical Level:
Middle
Job Category:
Big Data
Project:
Moove

N-iX is looking for an experienced Data Engineer to join the global team behind a mobility company that aims to democratize vehicle ownership and financing across the entire continent.

About the role:
You will join a team of five data professionals and build a Data Lake platform from scratch using cutting-edge technologies. You will also be involved in developing new data pipelines, data ingestion, and data orchestration.
The Data Team is a brand-new function, created to put data at the heart of growing the business into a data-led company. We want to hear from experienced and creative data professionals who want to use their skills to improve people's lives and help us democratize mobility through fintech.
 
Responsibilities:

  • Design, develop, test and maintain highly scalable data infrastructure
  • Build high-performance integrated data pipelines
  • Ensure standardized metadata, access protocols, and discovery mechanisms
  • Build data quality checks
  • Explore and integrate new big data technologies and software engineering tools into current infrastructures
  • Research opportunities for data acquisition and new uses for existing data
  • Collaborate with Data and IT team members in enhancing the data platform and data standards
  • Monitor data collection, storage, and retrieval processes

Requirements:

  • Advanced knowledge of a high-level programming language commonly used in ETL pipelines, such as Python or Scala
  • Experience in building ETL pipelines in AWS Glue
  • Experience in developing RESTful web services for data APIs
  • Strong proficiency in SQL, including a solid understanding of how to optimize query performance
  • Experience creating resilient ETL pipelines with workflow management tools such as Airflow
  • Working knowledge of data modelling best practices with modern data warehouses such as Redshift and Snowflake
  • Experience with cloud computing platforms (AWS, GCP, Azure) and UNIX environments
  • Experience in designing, implementing, and monitoring big data analytics solutions
  • Fast learner with a natural curiosity about big data
  • BS or MS in Computer Science or related fields
  • English level – upper-intermediate or higher

Nice to have:

  • Knowledge of distributed systems: Spark, Hadoop, Presto, Hive, etc.
  • Message queueing systems: Kafka, NSQ, SQS, etc.
  • Databases (relational & NoSQL): PostgreSQL, MySQL, Cassandra, etc.
  • Experience productionizing open-source data tools, including setting up CI/CD pipelines, IaC (Terraform), version control, and authoring technical documentation
  • Experience working with dbt
  • Familiarity with Docker, Kubernetes

We offer:

  • Flexible working hours
  • A competitive salary and good compensation package
  • Best hardware
  • A masseur and a corporate doctor
  • Healthcare & sport benefits
  • An inspiring and comfy office

Professional growth:

  • Challenging tasks and innovative projects
  • Meetups and events for professional development
  • An individual development plan
  • Mentorship program

Fun:

  • Corporate events and outstanding parties
  • Exciting team buildings
  • Memorable anniversary presents
  • A fun zone where you can play video games, foosball, ping pong, and more