Hadoop & ETL Engineer

You will be part of the Automation Team, which is responsible for creating automated tests to ensure SQream DB integrates smoothly with the HDFS environment. You will own the HDFS domain within the automation team, analyzing and testing data flows, configurations, performance, and high-availability solutions.

Major Responsibilities

  • Perform detailed analysis/design of functional and technical requirements and translate them into automated tests in the Big Data environment
  • Test SQream DB in an HDFS-integrated environment
  • Maintain data platforms and environments (Hadoop, private cloud instances, public cloud services, GPU-based development server for advanced computing)
  • Provide subject matter expertise on the Hadoop environment
  • Clearly articulate the pros and cons of various technologies
  • Document use cases, solutions and recommendations
  • Perform detailed analysis of business problems and technical environments

Requirements


  • 3+ years of experience with HDFS / Spark / Kafka / Hive
  • Deep hands-on experience with Spark performance tuning
  • ETL/ELT experience is a must
  • Strong Python coding skills
  • Strong data modeling skills
  • Knowledge of Data Warehousing, Data Governance, OLAP and Big Data
  • Extremely proficient with SQL
  • Strong experience with Linux OS
  • Familiarity with technologies such as HBase, MapReduce, and Google Bigtable – an advantage
  • Excellent written and verbal communication skills
  • Bachelor’s Degree in Computer Science or a related area

We're always looking for great people

Got what it takes to work with us? Great! Send us a link to your CV or résumé.

