Data Engineer
We’re looking for Data Engineers to join a global digital consultancy with a turnover in the billions and an impressive portfolio of clients. You’ll be working on a number of exciting, high-priority projects, with plenty to get stuck into from day one.
GCP experience is advantageous; however, experience with other cloud platforms will be strongly considered.
What you’ll be doing:
- Collaborate with client stakeholders to understand product and technical requirements
- Build data pipelines to ingest and transform data into the data platform
- Support large-scale data movement, capture data changes, and apply incremental data load strategies
- Develop, implement and tune large-scale distributed systems and pipelines that process large volumes of data
- Set up data pipelines and monitor daily jobs
- Develop and test ETL components to high standards of data quality and act as hands-on development lead
- Oversee and contribute to the creation and maintenance of relevant data artifacts (data lineages, source to target mappings, high level designs, interface agreements, etc.)
- Contribute to the team through mentorship and improvements to ways of working: review code and test plans, verify adherence to design best practices and to coding and architectural guidelines, standards, and frameworks, communicate risk, and address roadblocks as they arise
What you can bring:
- Master’s or Bachelor’s degree in Data Analytics, Computer Engineering, Math, Statistics, Economics, or a related analytics field from a top-tier university, with a strong record of achievement
- 4+ years of experience in an ETL or data engineering role in an analytics environment
- Comfortable with Python (Pandas and NumPy), Java, and writing complex SQL queries
- Working knowledge of one or more cloud platforms (GCP, AWS, Azure) and related managed services
- Expertise in building data pipelines on big data platforms
- Good understanding of data warehousing concepts
- Strong verbal and business communication skills
- Strong business acumen and a demonstrated aptitude for analytics that drive action
What would be advantageous to have:
- Exposure to data pipeline tools such as Apache Airflow and Apache Beam
- Experience working with the Spark framework and the Hadoop ecosystem
- Experience in Scala, Spark with Scala, and PySpark
- Knowledge of Shell scripting
What you can expect:
- Bonus – 10%
- 6% matched pension
- Hybrid working – 2 days office/3 days remote
- Private healthcare