Remote, Dnipro, Vinnytsia, Zaporizhzhia, Lviv, Kyiv
The Questrade Technology Group (QTG) is home to a unique environment where our culture thrives and, most importantly, we get stuff done! Questrade is continuing its digital transformation initiative, and our infrastructure footprint is growing beyond our data centres into the Google Cloud Platform. We are working towards an exciting strategy driven by business value. Join us and help solve complex challenges such as handling low-latency, high-traffic market data, event streams, and messaging in a hybrid cloud environment, within an industry that has so much room for disruption.
The Cloud Data Engineer's primary focus is delivering our data platform footprint as an expansion of our cloud presence, enabling key initiatives such as Digital Banking. This role collaborates with several teams across the organization, helping deliver data-related components that enable productivity across QTG, and leveraging automation to provide self-service components wherever possible. This key individual will be part of a team that drives the implementation of data components such as pipelines, applications, and data sanitization, supporting our data scientists and working with the rest of the product teams. They will also collaborate with the architecture team, influencing the design and delivery of our data footprint and striving for greater functionality in our data systems.
- Create and maintain optimal data pipeline architecture.
- Assemble large, complex data sets that meet functional / non-functional business requirements.
- Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
- Efficiently extract, transform, and load data from a wide variety of data sources using SQL and Google Cloud data technologies.
- Collaborate with the team to decide which tools and strategies to use in specific data integration scenarios.
- Work with stakeholders to assist with data-related technical issues and support their data infrastructure needs.
- Develop and maintain code and documentation for ETL and other data integration projects and procedures.
- Monitor and anticipate trends in data engineering, and propose changes aligned with organizational goals and needs.
- Share knowledge with other teams on data engineering and project-related topics.
- 3+ years of experience in the data engineering field.
- Proficiency in SQL.
- Strong knowledge of Python.
- Experience optimizing high-volume ETL processes.
- Experience with any of the major cloud platforms (GCP, AWS, Azure).
- Good knowledge of message broker systems (e.g., Kafka, Pub/Sub).
- Data modelling skills.
- Good knowledge of popular data standards and formats (e.g., JSON, XML, Protobuf, Parquet, Avro, ORC).
- Experience in the financial industry is an asset.
- Knowledge of and experience with the GCP data platform (Dataflow, Dataprep, Cloud Composer, BigQuery, Cloud SQL), or with the equivalent open-source toolsets behind those products, is an asset.
- Flexible working hours
- A competitive salary and good compensation package
- The best hardware
- A masseur and a corporate doctor
- Healthcare & sport benefits
- An inspiring, comfy, clean, and safe office
- Challenging tasks and innovative projects
- Meet-ups and events for professional development
- An individual development plan
- Mentorship program
- Corporate events and outstanding parties
- Exciting team-building activities
- Memorable anniversary presents
- A fun zone where you can play video games, foosball, ping pong, and more