Remote, Dnipro, Vinnytsia, Zaporizhzhia, Lviv, Kyiv
N-iX is looking to hire a Data Architect who will be involved in the development of a functionally rich product part, following through all stages of the Agile software development life cycle from inception to implementation.
Our customer is the world’s first regulated Digital Asset Bank, founded on Swiss and Singapore heritage and operating globally. We make digital assets bankable, secure, and convenient, empowering our clients to invest in the digital asset economy with complete trust. Major currencies and digital assets, including Bitcoin, Ethereum, digital CHF, and asset tokens, are seamlessly integrated into one account.
- You will serve a pivotal role in the oversight, architecture, and implementation of our new Data Lake Project.
- You will also collaborate with our Data Development Team to understand current-state data schemas and data models, then contribute to defining the future-state data schema, model, and flow.
- You will provide input in each cycle of the development phase (develop, test, and release) to ensure we produce a leading-edge solution and that we continually learn and evolve, adapting to changing business landscapes.
- You will verify the stability, interoperability, portability, security, or scalability of existing and new data architectures.
- You will establish data quality check approaches, tools, and data governance wherever necessary, to ensure the delivery of trusted and accurate data.
- You will provide mentorship, technical expertise, and recommendations on current and emerging data strategies and platform trends to our team of data engineers and scientists.
- Strong familiarity with AWS managed services for data engineering/analytics, such as EMR, Athena, MSK, and Kinesis;
- Familiarity with Apache/CNCF self-hosted services for data analytics, such as Cassandra, CockroachDB, Druid, and Pig;
- Familiarity with the Hadoop ecosystem;
- Strong familiarity with event sourcing, domain-driven design and Complex Event Processing;
- Strong familiarity with Apache Kafka, Kafka connectors, and stream processors such as Kafka Streams, Spark Streaming, Flink, or Storm;
- Familiarity with data pipeline schedulers such as Luigi or Airflow is a plus;
- Familiarity with visualization tools such as Power BI or Tableau is a plus;
- Familiarity with Apache Beam is a plus;
- Flexible working hours
- A competitive salary and good compensation package
- Best hardware
- A masseur and a corporate doctor
- Healthcare & sport benefits
- An inspiring and comfy office
- Challenging tasks and innovative projects
- Meetups and events for professional development
- An individual development plan
- Mentorship program
- Corporate events and outstanding parties
- Exciting team buildings
- Memorable anniversary presents
- A fun zone where you can play video games, foosball, ping pong, and more