Big Data Engineer

TLV · Full-time · Experienced

About The Position

As a Big Data Engineer, you will be part of our Customer Delivery Team, which executes POCs and implements SQream while interacting closely with SQream's customers around the world.

You will lead the analysis and implementation of SQream's big data technology stack, applying best practices from architectural design through administration and creating highly scalable, extensible cloud-based and on-premises database solutions integrated with customers' existing ecosystems of huge data sets.

You’ll work closely with our Product, Sales, and R&D teams, as well as our customers, on a wide variety of business solutions. Together, we aspire to deliver complete customer satisfaction across our current and upcoming projects.

This job requires travel.

  • Contribute to overall data warehouse architecture, framework, and design patterns to store and process high data volumes
  • Execute POCs for potential opportunities as part of the product’s pre-sales process, either at customer sites or remotely
  • Design, implement, and fine-tune SQream’s big data solutions according to customers’ needs as part of SQream DB implementations, either at customer sites or remotely
  • Conduct technical training for customers, partners, and integrators of SQream products
  • Integrate the big data tools and frameworks required to deliver the desired solution
  • Implement ETL tools for customized big data solutions
  • Maintain ongoing communication with prospects, customers, and partners throughout their engagement with SQream
  • Maintain ongoing communication with Product and R&D to ensure continuous improvement of SQream products

Requirements

  • 2–5 years of experience in data engineering, data ingestion methodologies, advanced SQL programming, and data modeling on RDBMS
  • Willingness to travel approximately 30% of the time
  • Solid experience with Linux and database performance testing
  • Excellent knowledge of SQL
  • Experience with performance tuning for big data platforms and data modeling (optimizing/rewriting SQL queries)
  • Experience with integration of data from multiple data sources
  • Excellent written and oral communication skills in English

Nice to Have

  • Experience with different storage systems
  • Java/Python programming experience
  • Experience with cloud-based data warehousing applications
  • Familiarity with different ETL tools (such as Talend or Informatica) and experience implementing complex ETL processes
  • Experience with big data tools such as Kafka, Spark, Hadoop, Snowflake, etc.
  • Experience with different BI tools (e.g., Tableau, Qlik)

Apply for this position