
Hadoop Software Engineer


We are looking for a Hadoop Software Engineer for our Information Management practice. In this role, you will design, implement, and manage data and analysis platforms and infrastructure, working closely with Architects, Data Analysts, and others to determine which data management systems are appropriate for specific solutions. You will build and maintain data management systems that combine core data sources into data lakes and other accessible structures to support integration, reporting, and analytical systems.

Your focus will be on delivery and support for our current big data platform, and you will be involved in growing and developing the Hadoop platform, both on premises and in the AWS cloud. The ideal candidate has prior experience working in big data, hands-on experience with Hadoop, and deep experience with traditional relational databases and data integration.
Our customer is a Fortune 500 company and a leading business-to-business organization. More than 3.2 million customers rely on its products in categories such as safety, material handling, and metalworking, along with services like inventory management and technical support.

Core Responsibilities:
  • Build and maintain data management systems
  • Perform data conversion, data imports, data exports and transformations
  • Support a Hadoop installation leveraged primarily by Data Scientists and the Marketing organization
  • Develop and maintain a data dictionary
  • Design and implement processes to ensure data integrity and standardization
  • Continuously improve the Hadoop environment to ensure it is highly performant and provides an optimal end-user experience
  • Implement data reliability, efficiency, and quality improvements
  • Bring complex concepts into the organization and mentor others
  • Learn independently as well as from others, and push for improvement by bringing new ideas into the organization
Position Requirements:
  • Proven experience with big data technologies (Hadoop, Sqoop, Hue, MapReduce, Kafka, Hive, Spark, Ranger)
  • 2+ years of experience in Data Warehousing and Analytics, including demonstrable work on data integration and data warehouse projects
Would be a plus:
  • Prior experience building on cloud providers (preference for AWS, but open to Azure or GCP)
 
We offer:
  • Competitive salary and compensation package
  • Possibility to join a team of professionals with 10+ years of experience
  • Interesting project
  • Challenging tasks
  • Possibility to take part in creating and designing hardware and firmware from scratch
  • Flexible work schedule
  • 18 business days of paid annual leave
  • 10 days of paid sick leave
  • Gym, Masseur, and Doctor
  • An inspiring and comfy office
  • Regular office fruit delivery
Professional growth:
  • Challenging tasks and projects
  • An individual development plan
  • A personal education budget
  • A regular performance appraisal
  • Mentorship opportunities
  • Business trips
Fun:
  • Corporate events and outstanding parties
  • Exciting team buildings
  • Memorable anniversary presents
  • A fun zone where you can play video games, foosball, ping pong, and more
Apply now

Know someone who is a perfect fit for this position? Refer a friend and get a bonus!