DevOps Data Engineer
DevOps Data Engineer required by Whitehall Resources on an initial 3 month contract based in Berkshire.
Role & Responsibilities:
– The bulk of the data engineer’s work would be in building, managing and optimizing data pipelines and then moving these data pipelines effectively into production for key data and analytics consumers (for example business/data analysts, data scientists or any user that needs curated data for data and analytics use cases).
– Data engineers need to guarantee compliance with data governance and data security requirements while creating, improving and operationalizing these integrated and reusable data pipelines.
– The data engineers will be measured on their ability to integrate analytics and/or data science results.
– The newly hired data engineers will be the key interface in operationalizing data and analytics on behalf of the business unit(s).
– This role will require both creative and collaborative working with the wider business. It will involve evangelizing effective data management practices and promoting better understanding of data and analytics.
– The data engineers will also be tasked with working with key business stakeholders, IT experts and subject-matter experts to plan and deliver optimal analytics and data science solutions.
– Data engineers will also be expected to collaborate with data scientists, data analysts and other data consumers, working on the models and algorithms they develop in order to optimize them for data quality, security and governance, and putting them into production, potentially leading to large productivity gains.
Candidates' required/essential experience:
– Experience in DevOps on Azure and VSTS for analytics platforms
– Extensive experience of developing using Azure analytics components, including Data Lake Store, Power BI, Microsoft Visual Studio, Data Factory, HDInsight, SQL DB/DW and SSIS
– Fully conversant with Agile and DevOps development methodologies and concepts as applied to data-driven analytics projects, including CI/CD, coding and security testing best practices and standards
– Experience with designing, building, and operating analytics solutions using Azure cloud technologies
– Data Management experience e.g. data profiling, large volume data handling
– Experience in automated, data-driven testing
– Experience in ETL Tooling
– Scripting languages: at least one of R, Scala or Python
– DB programming: relational databases (SQL) and non-relational/NoSQL databases (e.g. MongoDB, Cassandra)
– Proactive self-starter who can work independently and in a team
– Highly organised, with attention to detail
– Results-orientated
– Adaptable to changing requirements
– Good communication skills
All of our opportunities require that applicants are eligible to work in the specified country/location, unless otherwise stated in the job description.