Senior Cloud Data Engineer (#699)

Colombia
Work type: Office/Remote
Technical Level: Senior
Job Category: Software Development
Project: Canada’s leading online broker

N-iX is a software development service company that helps businesses across the globe develop successful software products. Over 20 years on the market, and by leveraging the talent of Eastern Europe, the company has grown to 2,000+ professionals with a broad portfolio of customers ranging from Fortune 500 companies to technological start-ups. Headquartered in Lviv, Ukraine, the company also has multiple development offices across Eastern Europe and representative entities in the United States, Sweden, and Malta.

Over its years of operation, the company has developed strong expertise in fields such as turn-key digital solutions engineering, cloud services, big data & analytics, user experience design, engineering excellence, and digital platform integration, along with its own R&D in domains such as financial services & banking, telecommunications, e-commerce, automotive, and manufacturing.

Global trends in the IT industry have led the company to consider expanding into the Latin American market to complement its expertise with high-end talent in the region.

Our client is a Financial Group of Companies committed to helping Canadians become more financially successful and secure. We are everything a traditional financial institution is not. Our vision is to revolutionize financial services for the benefit of Canadians by providing the most innovative and cost-efficient financial services online.


The Cloud Data Engineer will focus primarily on delivering our data platform footprint as an expansion of our cloud presence, enabling key initiatives such as our Digital Banking. The role involves working collaboratively with several teams across the organization, helping deliver data-related components that enable productivity across QTG by leveraging automation to provide self-service components wherever possible. This key individual will join a team that drives the implementation of data components such as pipelines, applications, and data sanitization, supporting our data scientists and working with the rest of the product teams, while collaborating with the architecture team to influence the design and delivery of our data footprint and strive for greater functionality in our data systems.

Responsibilities:

  • Create and maintain optimal data pipeline architecture (an illustrative orchestration sketch follows this list)
  • Assemble large, complex data sets that meet functional and non-functional business requirements
  • Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
  • Efficiently extract, transform, and load data from a wide variety of sources using SQL and Google Cloud data technologies
  • Collaborate with the team to decide which tools and strategies to use in specific data integration scenarios
  • Work with stakeholders to assist with data-related technical issues and support their data infrastructure needs
  • Develop and maintain code and documentation for ETL and other data integration projects and procedures
  • Monitor and anticipate trends in data engineering, and propose changes in alignment with organizational goals and needs
  • Share knowledge with other teams on various data engineering or project related topics
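
The orchestration sketch referenced above gives a flavour of the pipeline work involved. It is a minimal, hypothetical Cloud Composer (managed Apache Airflow) DAG; the DAG id, schedule, and task body are illustrative placeholders, not part of the actual project:

    from datetime import datetime

    from airflow import DAG
    from airflow.operators.python import PythonOperator

    def extract_transform_load():
        # Placeholder for real ETL logic: pull from a source system,
        # clean and reshape the records, then load them into the warehouse.
        ...

    # The DAG id and daily schedule are illustrative only.
    with DAG(
        dag_id="daily_trades_etl",
        start_date=datetime(2024, 1, 1),
        schedule_interval="@daily",
        catchup=False,
    ) as dag:
        PythonOperator(task_id="etl", python_callable=extract_transform_load)

Cloud Composer runs standard Airflow DAGs like this one, so Airflow experience maps directly onto the GCP tooling listed under the requirements below.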

Requirements:

  • Proficiency in SQL
  • Strong knowledge of Python
  • Experience optimizing high-volume ETL processes
  • Experience with any of the major cloud platforms (GCP, AWS, Azure)
  • Good knowledge of message broker systems (e.g., Kafka, Pub/Sub)
  • Knowledge of and experience with the GCP data platform (Dataflow, Dataprep, Cloud Composer, BigQuery, Cloud SQL) is an asset, as is knowledge of the equivalent open-source toolsets behind those products (a minimal BigQuery loading sketch follows this list)
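
As a rough illustration of the SQL-plus-Python work implied by these requirements, the sketch below loads a CSV file into BigQuery and aggregates it with SQL via the google-cloud-bigquery client. The project, dataset, table, and file names are hypothetical:

    from google.cloud import bigquery

    client = bigquery.Client(project="my-project")  # hypothetical project id

    # Load a local CSV into a table, letting BigQuery infer the schema.
    job_config = bigquery.LoadJobConfig(
        source_format=bigquery.SourceFormat.CSV,
        skip_leading_rows=1,
        autodetect=True,
        write_disposition=bigquery.WriteDisposition.WRITE_TRUNCATE,
    )
    with open("trades.csv", "rb") as f:  # hypothetical input file
        client.load_table_from_file(f, "my_dataset.trades", job_config=job_config).result()

    # Aggregate with plain SQL; rows come back as attribute-accessible objects.
    query = """
        SELECT account_id, SUM(amount) AS total_amount
        FROM `my-project.my_dataset.trades`
        GROUP BY account_id
    """
    for row in client.query(query).result():
        print(row.account_id, row.total_amount)

The same pattern scales from ad-hoc loads to the high-volume ETL mentioned above, with Dataflow or Composer taking over scheduling and parallelism.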

Nice to have:

  • Good knowledge of popular data standards and formats (e.g., JSON, XML, Proto, Parquet, Avro, ORC); a short Parquet round-trip example follows this list
  • Data modelling skills
  • GCP Data Engineering certification
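
For the formats mentioned in the first item, the Parquet round trip below (using pyarrow) shows the kind of columnar data handling involved; the schema and values are purely illustrative:

    import pyarrow as pa
    import pyarrow.parquet as pq

    # Build a tiny in-memory table (columns and values are made up).
    table = pa.table({
        "account_id": pa.array([101, 102, 103], type=pa.int64()),
        "balance": pa.array([2500.00, 180.75, 990.10], type=pa.float64()),
    })

    # Parquet stores data column-by-column and compresses well, which is
    # why it is a common interchange format for BigQuery and Dataflow jobs.
    pq.write_table(table, "accounts.parquet", compression="snappy")

    # Read it back to verify the round trip.
    print(pq.read_table("accounts.parquet").to_pydict())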

We offer:

  • Flexible working format: remote, office-based, or a mix of both
  • A competitive salary and good compensation package
  • Personalized career growth
  • Professional development tools (mentorship program, tech talks and trainings, centers of excellence, and more)
  • Active tech communities with regular knowledge sharing
  • Education reimbursement
  • Paid vacation days, sick leaves, and days off
  • Healthcare & Sport program
  • Medical insurance
  • Memorable anniversary presents
  • Corporate events and team buildings