Senior Data Engineering Manager
London - Hybrid (2 days/week, usually Tues & Fri)
12-month contract
Day rate: £700 (umbrella)
About the Role
Join a critical data remediation programme within the Surveillance Team/Lab, part of the Markets Platform area of a major financial services business.
You will lead design and delivery of stable, scalable, and performant data solutions within a complex GCP architecture, driving simplification and improving pipeline automation to enable timely, secure data delivery to surveillance systems.
Key Responsibilities
Engineer stable, scalable, performant, accessible, testable, and secure data products aligned with endorsed technologies and best practices
Lead the design, build, and optimisation of GCP-based data pipelines for critical surveillance and remediation projects
Apply common build patterns to minimise technical debt and adhere to group policies and frameworks for build and release
Participate actively in design and implementation reviews, Agile ceremonies, and planning to set clear goals and priorities
Identify opportunities to automate repetitive tasks and promote reuse of components
Engage with technical communities to share knowledge and advance shared capabilities
Lead incident root-cause analysis and promote active application custodianship to drive continuous improvements
Invest in your ongoing development of technical and Agile skills
Collaborate closely with architecture, security, and compliance teams to ensure solutions meet regulatory requirements
Manage and enhance CI/CD automated pipelines supporting data engineering workflows
Support potential transition planning to vendor surveillance products
Core Skills & Experience
Proven hands-on experience with Google Cloud Platform (GCP) and components such as Dataflow, Pub/Sub, and Cloud Storage
Deep expertise in BigQuery and Big Data processing
Strong programming skills in Python or Java used extensively in surveillance data integration
Solid SQL expertise for querying, transforming, and troubleshooting data
Experience building robust ETL pipelines from diverse source systems
Familiarity with CI/CD pipeline automation and enhancement
Understanding of Agile delivery frameworks, including Jira and sprint planning
Knowledge of Terraform for infrastructure as code provisioning
Experience with containerisation technologies to deploy and manage environments
Ability to simplify complex architectures and communicate effectively across teams
Strong stakeholder management and influencing skills
Experience in financial services or surveillance system data engineering
Understanding of data governance and data quality best practices
Exposure to vendor-based surveillance solutions
Pontoon is an employment consultancy. We put expertise, energy, and enthusiasm into improving everyone's chance of being part of the workplace. We respect and appreciate people of all ethnicities, generations, religious beliefs, sexual orientations, gender identities, and more. We do this by showcasing their talents, skills, and unique experience in an inclusive environment that helps them thrive. If you require reasonable adjustments at any stage, please let us know and we will be happy to support you.