Our people work differently depending on their jobs and needs, from home working to job sharing. This role is based in India, and as such all normal working days must be carried out in India.

Join us as a Data Engineer

This is an exciting opportunity to use your technical expertise to collaborate with colleagues and build effortless, digital-first customer experiences. You'll be simplifying the bank by developing innovative data-driven solutions, using insight to be commercially successful, and keeping our customers' and the bank's data safe and secure. Participating actively in the data engineering community, you'll deliver opportunities to support the bank's strategic direction while building your network across the bank.

What you'll do

As a Data Engineer, you'll play a key role in delivering value for our customers by building data solutions. You'll be carrying out data engineering tasks to build a scalable data architecture, including carrying out data extractions, transforming data to make it usable to analysts and data scientists, and loading data into data platforms.

You'll also be:
- Developing comprehensive knowledge of the bank's data structures and metrics, advocating change where needed for product development
- Building automated data engineering pipelines through the removal of manual stages
- Working closely with core technology and architecture teams in the bank to build data knowledge and data solutions
- Developing a clear understanding of data platform cost levers to build cost-effective and strategic solutions

The skills you'll need

To be successful in this role, you'll need to be an entry-level programmer and Data Engineer with a qualification in computer science or software engineering. You'll also need a good understanding of data usage and dependencies with wider teams and the end customer, as well as a proven track record in extracting value and features from large-scale data.

It's essential that you have experience creating and managing Kubernetes clusters, as well as experience working with AWS Cloud services, including networking such as VPCs. It'll be ideal if you have experience in creating and managing Spark and Elastic clusters.

You'll also demonstrate:
- Knowledge of Terraform scripting
- Experience of ETL technical design, automated data quality testing, QA and documentation, data warehousing, data modelling and data wrangling
- Extensive experience using RDBMS, ETL pipelines, Python, Hadoop and SQL
- A good understanding of modern code development practices
- Good critical thinking and proven problem-solving capabilities
- Effective written and verbal communication skills