
Data Platform Team Lead (Data Engineer)
Dwelly is a UK-based, AI-enabled lettings and property management platform growing through a roll-up strategy of acquiring estate agencies. The company operates through two arms: i) acquiring existing letting agencies, effectively buying their highly sticky, recurring-revenue landlord portfolios, and then ii) building top-notch technology to automate tenant management, payments, and post-rental property maintenance. Dwelly integrates AI services to automate business processes within brick-and-mortar estate agencies, migrating each one onto its tech-enabled digital lettings platform within two months to radically improve the user experience and the efficiency of the business.
We’re a fast-growing, product-focused company, backed by top-tier investors and led by a team with deep experience in real estate, technology, and operations.
The Role
We’re looking for a Data Platform Team Lead to own and build our data engineering function. You will be the second hire on the team, with two additional engineers to be hired under your leadership. As the team is still small, we expect you to be hands-on, spending around 40–50% of your time writing code and designing systems.
This role is mission-critical and covers three core areas of the business:
Key Responsibilities
1. CRM Integration & Data Ingestion
Each business we acquire uses a different real estate CRM system (one of the top 10 most widely used in the UK). Your team will be responsible for building custom data extraction tools for each CRM (extracting lease contracts, landlord and tenant data, financial records, and more) and piping that data into our systems. This will involve:
- Reverse-engineering or working with APIs/databases of major CRM vendors
- Designing modular, maintainable ingestion services
- Automating “one-click” data exports for seamless onboarding
2. Data Architecture, Optimization & Compliance
- Design and maintain a unified data architecture: database schemas, data models, and micro-architecture solutions to ensure scalability and reliability.
- Optimize database performance at all levels: indexing, partitioning, clustering, and tuning configuration parameters.
- Ensure full compliance with GDPR, UK Data Protection Act, and other relevant regulations: data masking, consent management, retention policies, and privacy impact assessments.
3. Data Infrastructure, Analytics Platform & Ops Automations
You’ll also lead the buildout of our central analytics infrastructure, making it easy for our operations, finance, and DS teams to get the insights they need. This includes:
- Building production pipelines with complex logic, for example using computer-vision libraries and elastic cloud compute for fast recalculation of historical data. The calculation logic and Python notebooks will be provided by the DS team; your responsibility is to wrap them into production pipelines.
- Designing and maintaining robust ETL/ELT pipelines for diverse sources and data types, including phone calls, emails, scraped public reviews, etc.
- Building and optimizing our data warehouse on BigQuery
- Setting up foundational data governance practices (lineage, quality checks, cost monitoring, etc.)
- Choosing the right tools from the Google Cloud stack and applying them effectively to balance performance, cost, and maintainability.
Requirements
You’re a great fit if you
- Turn ideas into code thatโs clean, structured, and elegant
- Have not only written code for isolated components but also designed systems that were later implemented by you and your teammates
- Worked with different data pipeline technologies and understand their strengths and trade-offs, including message queues, distributed systems, data warehouses, and orchestration tools
- Solved more than one tricky performance optimization challenge in data pipelines
- Know how to choose a practical and reliable way to monitor data quality
- Have a solid grasp of core data warehouse design patterns and can “speak SA language” fluently
You’re a perfect fit if you
- Worked in a large company, led a team of 10–15 engineers, got tired of legacy systems or bureaucracy, and now want to build everything from scratch, properly, and without illogical legacy
- Have hands-on experience with at least one cloud provider (Google Cloud, AWS, Azure) across storage, compute, and security
- Built truly complex pipelines at the crossroads of very different technologies (e.g., image or audio analysis with auto-scaling compute, cascading ML libraries, and calls to LLM APIs)
- Can set up CI/CD for pipelines like the ones described above
Conditions
- Remote (UK timezone preferred)
- Market compensation
What it’s like being a Dwell-er
Feel free to check out the Dwelly Core Principles. They describe what we believe in, how we operate, and how we make decisions.
- Customer obsession rather than competitive focus
- Passion for invention
- Operational excellence
- Long-term thinking
How to apply
Send your resume or LinkedIn profile via Telegram (faster) or email.