Data & Analytics Engineer
About Us
BACB is a UK bank that offers trade finance and complementary products to clients in specialist markets, especially Africa and the Middle East.
For over half a century, we have supported businesses' trade flows to and from Africa and the Middle East, as well as real estate in the UK.
Our in-depth knowledge of the countries and practices where our clients operate ensures that we put them first.
Additional Info
- Hybrid Working: 3 days onsite, 2 from home
- Location: City of London
- Contract Type: Permanent
Job Summary
We are seeking a driven and technically proficient Data & Analytics Engineer with a minimum of 7 years' experience to join a small, agile technology team within the Software Engineering Department of BACB. This is an exciting opportunity for a hands-on engineer with a passion for data engineering and a track record of delivering innovative data & analytics solutions.
Your Impact
As a key member of our growing data engineering team, you’ll be instrumental in designing, building, and deploying scalable data pipelines and end-to-end analytics platforms. Leveraging technologies such as Databricks, Power BI, Informatica, Python, SQL and SQL Server, you’ll help lay the foundation for a cloud-first data strategy that supports business-critical decision-making.
Why Join Us?
Our Bank is embarking on a cutting-edge digital transformation and tech modernization initiative, focused on building next-generation data and analytics capabilities from the ground up. You'll work in a collaborative and forward-thinking environment, with the opportunity to shape strategic data infrastructure and contribute to truly impactful data & analytics initiatives.
What We’re Looking For
- Hands-on expertise in modern data engineering tools and platforms
- A proactive mindset and enthusiasm for tackling complex data challenges
- Excellent communication and articulation skills
- Experience working in cloud environments and developing robust, scalable ETL pipelines
- The ability to translate technical solutions into real business value
Key Work Outputs and Accountabilities
- Design and develop robust ETL pipelines to enable seamless data integration and migration from diverse sources into a centralized data lake
- Implement and maintain the Medallion Architecture, ensuring alignment and consistency across raw, enriched, and curated data layers
- Enforce data governance and compliance through advanced validation and cataloguing tools such as Azure Purview, Unity Catalog, and equivalent frameworks
- Optimize data storage solutions—including data lakes, warehouses, and SQL databases—to support scalable analytics and business intelligence initiatives
- Build and maintain interactive dashboards and reports using Power BI, empowering stakeholders with real-time access to actionable insights
- Maintain and modernize legacy systems, built with SQL, Stored Procedures, SQL Server, Data Cubes and Analysis Services
- Leverage DevOps and CI/CD best practices, utilizing Git and Azure DevOps to streamline development, testing, and deployment
- Contribute across the full SDLC, from requirements gathering and analysis to design, build, testing, and production rollout
- Collaborate with business analysts and stakeholders to translate business needs into scalable, data-driven technical solutions
- Communicate data insights effectively to non-technical audiences, providing analytical support and clarity across business functions
- Manage end-to-end project delivery, covering requirement analysis, technical design, development, testing, and deployment phases
- Document the data ecosystem thoroughly and lead knowledge transfer to stakeholders, ensuring sustainability and supportability
Required Qualifications and Experience
- Extensive hands-on expertise with Databricks, Power BI, Azure technologies, and Informatica
- Proficient in Python and advanced SQL, with a strong command of data manipulation and transformation techniques
- Demonstrated experience with Informatica and the wider Informatica ecosystem
- Proficiency in Azure Data Analytics tools including Synapse, Data Factory, Databricks, ADLS, and Power BI
- Working knowledge of SQL, Stored Procedures, and SQL Server technologies such as Analysis Services (SSAS) and Reporting Services (SSRS); advanced Excel is considered a plus
- Familiar with data governance, lineage, and access control tools to support data security and compliance
- Understanding of GDPR and data security best practices, including robust access control policies
- Comfortable working in Agile practices, with hands-on experience in CI/CD pipelines and source control using Git and Azure Repos
- Strong analytical and problem-solving skills, with the ability to diagnose and optimise complex data processes
- Excellent interpersonal and communication skills, enabling effective collaboration across technical and non-technical teams
- Adept at translating business requirements into scalable, high-impact technical solutions
- Meticulous attention to detail with a clear commitment to data quality, reliability, and accuracy
- Knowledge of T24 Core Banking systems and familiarity with BACB’s core business domains—including Trade Finance, Treasury, and Real Estate—would be advantageous
- Department: Information Technology
- Role: Associate
- Locations: London
- Remote status: Hybrid