Data Platform Team Lead
Remote (UK timezone preferred, GMT+0)
Dwelly is a UK-based, AI-enabled lettings and property management platform growing through a roll-up strategy of acquiring estate agencies. The business has two arms: i) acquiring existing letting agencies, effectively buying their highly sticky, recurring-revenue landlord portfolios, and ii) building top-notch technology to automate tenant management, payments, and post-rental property maintenance. Dwelly integrates AI services to automate the business processes of brick-and-mortar real estate agencies, migrating each acquired agency onto its tech-enabled digital lettings platform within two months to radically improve the user experience and the efficiency of the business.
We’re a fast-growing, product-focused company, backed by top-tier investors and led by a team with deep experience in real estate, technology, and operations.
The Role
We’re looking for a Data Platform Team Lead to own and build our data engineering function. You will be the second hire on the team, with two additional engineers to be hired under your leadership. As the team is still small, we expect you to be hands-on and spend around 40–50% of your time writing code and designing systems.
This role is mission-critical and covers three core areas of the business:
Key Responsibilities
1. CRM Integration & Data Ingestion
Each business we acquire uses a different real estate CRM system (one of the top 10 most widely used in the UK). Your team will be responsible for building custom data extraction tools for each CRM, extracting lease contracts, landlord and tenant data, financial records, and more, and piping it all into our systems (a brief connector sketch follows the list below). This will involve:
Reverse-engineering or working with APIs/databases of major CRM vendors
Designing modular, maintainable ingestion services
Automating “one-click” data exports for seamless onboarding
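To give a flavour of what "modular, maintainable ingestion services" can mean in practice, here is a minimal, hedged sketch of one possible shape: a shared connector interface that each vendor-specific extractor implements. Every name in it (CRMConnector, Lease, fetch_leases, list_contracts) is hypothetical, not Dwelly's actual code.

    # Hypothetical sketch of a modular CRM connector interface.
    # All names are illustrative; real connectors would handle auth,
    # pagination, retries, and far richer schemas.
    from abc import ABC, abstractmethod
    from dataclasses import dataclass
    from typing import Iterator

    @dataclass
    class Lease:
        # Minimal normalized record shared across all CRM vendors.
        lease_id: str
        landlord_id: str
        tenant_id: str
        monthly_rent_pence: int

    class CRMConnector(ABC):
        """One subclass per acquired agency's CRM vendor."""

        @abstractmethod
        def fetch_leases(self) -> Iterator[Lease]:
            """Yield normalized lease records from the vendor's API or database."""

    class ExampleVendorConnector(CRMConnector):
        def __init__(self, api_client):
            self.api_client = api_client  # vendor SDK or raw HTTP client

        def fetch_leases(self) -> Iterator[Lease]:
            for raw in self.api_client.list_contracts():  # hypothetical vendor call
                yield Lease(
                    lease_id=raw["id"],
                    landlord_id=raw["landlord"]["id"],
                    tenant_id=raw["tenant"]["id"],
                    monthly_rent_pence=int(raw["rent"] * 100),
                )

The point of the interface is that downstream ingestion and onboarding automation only ever see normalized records, so adding the next acquired agency's CRM means writing one new subclass rather than touching the pipeline.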
2. Data Architecture, Optimization & Compliance
Design and maintain a unified data architecture: database schemas, data models, and micro-architecture solutions to ensure scalability and reliability.
Optimize database performance at all levels: indexing, partitioning, clustering, and tuning configuration parameters.
Ensure full compliance with GDPR, the UK Data Protection Act, and other relevant regulations: data masking, consent management, retention policies, and privacy impact assessments (a short masking sketch follows).
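As one illustration of the data-masking point, here is a hedged sketch of a common approach: pseudonymising direct identifiers with a keyed hash before records leave the ingestion layer. The field list, key handling, and the choice of HMAC-SHA256 are assumptions for the example, not a prescription.

    # Hypothetical sketch: pseudonymise PII with a keyed hash (HMAC-SHA256)
    # before loading records downstream. Key rotation, consent checks, and
    # retention enforcement would live elsewhere; this shows only masking.
    import hashlib
    import hmac

    SECRET_KEY = b"rotate-me-via-a-secret-manager"  # placeholder, never hard-code
    PII_FIELDS = {"email", "phone", "full_name"}    # illustrative field list

    def mask_record(record: dict) -> dict:
        masked = {}
        for field, value in record.items():
            if field in PII_FIELDS and value is not None:
                digest = hmac.new(SECRET_KEY, str(value).encode(), hashlib.sha256)
                masked[field] = digest.hexdigest()
            else:
                masked[field] = value
        return masked

    print(mask_record({"email": "tenant@example.com", "monthly_rent_pence": 120000}))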
3. Data Infrastructure, Analytics Platform & Ops Automations
You’ll also lead the buildout of our central analytics infrastructure, making it easy for our operations, finance, and data science (DS) teams to get the insights they need. This includes:
Building production processes with complex logic: for example, using computer-vision (CV) libraries and elastic cloud capacity for fast recalculation of historical data. DS will provide the calculation logic and Python notebooks; your responsibility is to wrap them up as production pipelines (a minimal pipeline sketch follows this list).
Designing and maintaining robust ETL/ELT pipelines for different sources and types of data, including phone calls, emails, scraped public reviews, etc.
Building and optimizing our data warehouse on BigQuery
Setting up foundational data governance practices (lineage, quality checks, cost monitoring, etc.)
Choosing the right tools from the Google Cloud stack and applying them effectively to balance performance, cost, and maintainability.
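To make the "wrap DS notebooks as production pipelines" point concrete, here is a minimal, hedged sketch assuming the notebook logic has been extracted into an importable function. The module ds_logic, the function recalculate_scores, and all table names are hypothetical; in practice a step like this would run under an orchestrator such as Cloud Composer/Airflow.

    # Hypothetical sketch: run DS-provided calculation logic as a batch step
    # that writes results back to a date-partitioned BigQuery table.
    import pandas as pd
    from google.cloud import bigquery

    from ds_logic import recalculate_scores  # extracted from the DS notebook (hypothetical)

    def run(snapshot_date: str) -> None:
        client = bigquery.Client()

        # 1. Pull the day's raw slice from the warehouse.
        raw = client.query(
            "SELECT * FROM `analytics.raw_events` WHERE event_date = @d",
            job_config=bigquery.QueryJobConfig(
                query_parameters=[
                    bigquery.ScalarQueryParameter("d", "DATE", snapshot_date)
                ]
            ),
        ).to_dataframe()

        # 2. Apply the DS-owned calculation, unchanged from the notebook.
        scores: pd.DataFrame = recalculate_scores(raw)

        # 3. Overwrite only that date's partition with the recalculated results.
        client.load_table_from_dataframe(
            scores,
            f"analytics.daily_scores${snapshot_date.replace('-', '')}",
            job_config=bigquery.LoadJobConfig(write_disposition="WRITE_TRUNCATE"),
        ).result()

Writing to a single partition with WRITE_TRUNCATE is what makes the "quick recalculation of historical data" pattern cheap to re-run: each date can be recomputed independently and in parallel.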
Requirements
You’re a great fit if you
Turn ideas into code that’s clean, structured, and elegant
Have designed systems that you and your teammates later implemented, not just written code for isolated components
Have worked with a range of data pipeline technologies (message queues, distributed systems, data warehouses, orchestration tools) and understand their strengths and trade-offs
Have solved more than one tricky performance-optimization challenge in data pipelines
Know how to choose a practical and reliable way to monitor data quality
Have a solid grasp of core data warehouse design patterns and can “speak SA language” fluently
You’re a perfect fit if you
Worked in a large company, led a team of 10–15 engineers, got tired of legacy systems and bureaucracy, and now want to build everything from scratch, properly and without legacy baggage
Have hands-on experience with at least one cloud provider (Google Cloud, AWS, Azure) across storage, compute, and security
Built truly complex pipelines at the crossroads of very different technologies (e.g., image or audio analysis with auto-scaling compute, cascading ML libraries, and calls to LLM APIs)
Can set up CI/CD for the point above
Conditions
Remote (UK timezone preferred)
Market compensation
What it’s like being a Dwell-er
Feel free to check out the Dwelly Core Principles; they cover what we believe in and how we operate and make decisions.
Customer obsession rather than competitive focus
Passion for invention
Operational excellence
Long-term thinking
Published on: 11/28/2025
