AWS/Big Data Developer (#12724348)

Work type:
Flexible (Office/Remote)
Technical Level:
Job Category:
Big Data
W. W. Grainger

Company Overview:
Our client is a leading global broad-line supplier of facilities maintenance products serving businesses and institutions. Its 18,000 employees are driven to serve customers and the community in exceptional ways, focusing on delivering the highest level of service. The team works closely with customers to better understand their challenges and provide cost-saving solutions. The company's employees serve customers more than 115,000 times every day through multiple channels. As part of a high-performing team, you'll be able to develop your talents and make a difference. The client is a Fortune 500 company and a perennial member of Fortune magazine's Most Admired Companies list.

Position Description:
We are seeking an AWS Big Data Engineer for our Information Management practice. The role involves designing and building the AWS big data platform from scratch, and building data management systems that combine core data sources into data lakes or other accessible structures to support integration, reporting, and analytical systems. The ideal candidate has prior experience working with big data on AWS, experience with Hadoop, and deep experience with traditional relational databases and data integration.

Duties and Responsibilities:

  • Build and maintain the AWS big data solution.
  • Perform data conversion, data imports, data exports and transformations.
  • Design and implement processes to ensure data integrity and standardization.
  • Continuously improve the Hadoop environment to ensure it is highly performant and provides an optimal end-user experience.
  • Implement data reliability, efficiency, and quality improvements.
  • Bring complex concepts into the organization and mentor others.
  • Learn independently as well as from others; push for improvement by bringing new ideas into the organization.

Position Requirements:

  • Hadoop experience, familiarity with HDFS/Hive in particular
  • Python (PySpark a plus) and Java
  • AWS experience, especially EC2, S3, and EMR (Cloudformation/other infrastructure scripting a plus)
  • Bash scripting
  • Git
  • SQL
  • Test-driven development (unit testing)
  • RDBMS/ETL experience
  • Streaming data (Kafka/Kinesis/etc.)
  • CI/CD, Docker
  • ETL orchestration (Airflow/Luigi)
We offer:
  • Flexible hours approach
  • Competitive salary and compensation package
  • 18 business days of paid annual leave
  • 10 days of paid sick leave
  • Personal education budget
  • IT-BPO club membership card and loyalty program
  • Gym, masseur, and doctor
  • Relax zones
  • Corporate events and outstanding parties
  • Team-building events
  • Anniversary presents