Skills: Data Engineer | Location: Sydney, New South Wales, Australia
Role: Data Engineer
Location: Remote
Years of Experience: 10-18

Mandatory Skills: Unix, Jenkins/GitLab, SQL, Python (PySpark/Scala), Databricks & Spark
Mandatory Responsibilities: Delta Tables, merging & design
Desired Skills: IDE
Good to Have: Python API, Terraform & AWS (IAM & S3)

Basic Unix knowledge (Mandatory):
- Must be able to work on macOS
- Resolve environment setup issues with minimal help from others

Basic DevOps skills (Mandatory):
- Understanding of source control and feature branches; able to resolve code conflicts
- Experience working with build pipelines using Jenkins or GitLab
- Must be comfortable working with an IDE

Basic SQL skills (Mandatory):
- Inner joins, left joins, etc.

Basic AWS skills (Secondary):
- Understanding of IAM and S3
- Resolve access issues related to IAM policies, trust relationships, etc.

Additional requirements:
- At least 5 years of development experience using Python (Mandatory)
- At least 1 year of experience working with PySpark or Scala
- Basic understanding of ETL (Mandatory)
- Basic knowledge of Databricks clusters (Mandatory)
- Basic knowledge of Spark (Mandatory)
- Must have worked with Delta Tables and merging (Mandatory)
- Must have used a Python API to call Elasticsearch (Secondary)
- Basic design skills; must be able to discuss design work done in earlier projects (Mandatory)
- Must be willing to work with fluid requirements in a fast-paced environment (Mandatory)
- Must be comfortable working with 5-6 code bases at a time (Mandatory)
- Basic Terraform knowledge (Secondary)

Reference: Lead Data Engineer jobs