Principal Data Engineer
Introduction
Our client, who has built a leading rebate management platform, is looking for a Principal Data Engineer to take ownership of the development and operations of the data platform.
The Job
Principal Data Engineer
We are on the lookout for a Principal Data Engineer. This is a key role, accountable for the development and operations of the company's Data Platform, driving maximum value from data for business users in line with company best practices. You will work as part of a cross-functional agile delivery team that includes front-end and back-end engineers, customer support, product managers and infrastructure engineers.
You will have the opportunity to work on complex problems, implementing high-performance solutions that run on top of a cloud-based big data platform in Azure.
Accountabilities
- Work as part of the Data Engineering team to uphold and evolve common standards and best practices, collaborating to ensure that data solutions are complementary rather than duplicative.
- Build and maintain a high-performance, fault-tolerant, secure, and scalable data platform that supports multiple data solution use cases.
- Interface with other technology teams to design and implement robust products, services and capabilities for the data platform, making use of infrastructure as code and automation.
- Build and support the platforms that enable data engineers and product engineers to build out the cloud-based big data platform.
- Create patterns, common ways of working, and standardised guidelines to ensure consistency across the organisation.
- Help engineer platform ingestion, data warehouse/data lake and API strategies for the data management ecosystem.
- Work with the DevOps team to provide high-availability data solutions that scale geographically.
Experience
- Strong experience with cloud architecture and administration in production environments.
- Experience with object-oriented and functional design, coding, and testing patterns, as well as with engineering software platforms and large-scale data infrastructures.
- Experience writing production-quality code in C#, Python, Bash, PowerShell, Go, etc.
- Experience building and maintaining distributed platforms that handle high volumes of data.
- Strong platform-level design, architecture, implementation and troubleshooting skills.
- Outstanding knowledge of MS SQL.
- Good understanding of enterprise patterns and best practices applied to data engineering and data science use cases at scale.
- Good understanding of cloud storage, orchestration and compute platforms (especially document/blob stores, GraphDB, Kafka, Airflow, Elastic, Spark).
- Good understanding of DevOps/DataOps in an agile environment, and familiarity with Jira and Confluence.
- Experience with Docker/Kubernetes would be beneficial.
- Expertise in databases (Postgres, MySQL, etc.).
- Solid experience of networking and security in cloud-based environments, specifically with cloud services such as VPCs, security groups, NACLs and IAM roles.
- Deep understanding of CI/CD using tools like Jenkins, CircleCI or Azure Pipelines, along with solid experience with source control tools such as Git.
Competencies
- Great problem-solving skills, and the ability and confidence to hack your way out of tight corners.
- Ability to prioritise and meet deadlines.
- Conscientious, self-motivated, and goal-orientated.
- Excellent attention to detail and solid written and verbal English communication skills.
- Willingness and an enthusiastic attitude to work within existing processes and methodologies.