About Teamwork
Teamwork.com's mission is to help teams who deliver client work become efficient, organised, profitable and happy! Our platform has revolutionised how companies keep their client projects on track, their resourcing in check and their profits on point. By combining powerful project management with streamlined operations, we're the only platform built for managing client projects profitably. Our relentless customer focus has been rewarded with thousands of amazing customers across the globe and millions of users who sign in every day. We pride ourselves on creating market-leading software, working with outstanding people, and going above and beyond for our customers. Trusted by more than 20,000 teams across 170 countries, Teamwork.com is in acceleration mode as we set our sights on becoming the undisputed platform for teams who deliver client work.
We believe in hiring great people and look to ensure everyone has the best possible experience of work, every day. We strive to be open and transparent, humble and customer-focused. And we thrive on curiosity, getting results and working together relentlessly to deliver excellence. We are a company of action, full of triers and doers: we try things, we make mistakes, and we learn from them.
Our personality is unmistakable: we work hard, take joy in our wins and in each other's successes and important life events, and we care for and support each other when life throws us lemons. More than anything, we embrace a straightforward approach to getting things done. We are fanatical about our customers: when talent meets passion, success happens.
This is a remote role but you must be based in Poland.
The opportunity
We are seeking a Senior Data Engineer to bridge the critical gap between our product engineering teams and data infrastructure.
This role combines strong data engineering and infrastructure capabilities with a solid backend background, ensuring our data systems are built for long-term performance, reliability, and analytical value.
The ideal candidate will own our reporting and analytical infrastructure, drive our data roadmap, and act as a technical gatekeeper, ensuring data models are designed with both immediate product needs and future scalability in mind.
This position is crucial for preventing the accumulation of technical debt in our data layer while enabling sophisticated BI integrations and maintaining AI-driven services.
Responsibilities:
Data Architecture & Strategy
- Act as the primary data gatekeeper between product teams and infrastructure, reviewing schema changes and query designs before implementation.
- Design and implement scalable data strategies that balance short-term business requirements with long-term analytical and reporting needs.
- Lead data modeling initiatives to ensure performance, maintainability, and extensibility.
- Drive adoption of CDC (Change Data Capture) tools such as Debezium to support real-time data flows.
- Architect solutions that effectively serve both transactional and analytical workloads.
BI Integration & Data Pipelines
- Integrate Business Intelligence tools (AWS QuickSight, Luzmo) into our data ecosystem to deliver reliable insights and dashboards.
- Design and maintain ETL pipelines for data transformation and consumption across internal and external platforms.
- Map complex reporting schemas to BI-friendly models for analytical performance.
- Build robust data connectors and automation for seamless data flow across systems.
- Explore and implement analytical database technologies (e.g., ClickHouse) to support advanced reporting and data analysis at scale.
AI/ML Services & Innovation
- Support the integration of AI-driven features into our data and reporting stack.
- Apply AI-first thinking to data quality, transformation, and automation challenges.
- Experiment with emerging AI tools to enhance engineering productivity and data insights.
Backend Development & Reporting (supporting role: a working understanding of the backend, with hands-on help primarily on the SQL side)
- Optimize SQL queries and schema designs for performance and scalability.
- Help build and maintain REST APIs for reporting and data consumption features.
- Help implement backend performance optimizations such as caching (Redis), query tuning, and distributed locking mechanisms.
- Collaborate with product engineers to ensure the data layer meets both performance and reliability goals.
What good looks like
Essential Technical Skills
- 5+ years of complex SQL experience (MySQL or similar), including query optimization, indexing, and schema design.
- 3+ years of data engineering experience with ETL pipelines, data modeling, and performance tuning at scale.
- Strong understanding of database internals – ability to analyze execution plans, design efficient indexes, and optimize for both OLTP and OLAP workloads.
- Experience with cloud-based data services such as AWS RDS, Redshift, DynamoDB, or similar.
- Experience with distributed systems, including caching (Redis), message queues, and distributed locking.
- Strong proficiency in at least one major backend language (Python, TypeScript, or Go).
Data & Infrastructure Skills
- Experience with CDC tools and real-time data streaming.
- Understanding of data warehouse concepts and analytical database design.
- Knowledge of BI tool integration patterns and requirements.
- Ability to design for long-term data retention and migration strategies.
- Experience acting as a technical reviewer/gatekeeper for data-related changes.
Soft Skills
- Strong technical vision and the ability to think long-term about data architecture.
- Excellent communication skills to collaborate effectively across infrastructure and product teams.
- Pragmatic, outcome-oriented approach — able to deliver solutions, not just concepts.
- Growth mindset with curiosity about AI/ML technologies.
- Ability to mentor and guide other engineers on data best practices.
Nice-to-Have Skills
- Experience with ClickHouse or similar columnar databases (Redshift, BigQuery, Snowflake).
- Familiarity with specific BI platforms (QuickSight, PowerBI, Tableau, Luzmo).
- Hands-on experience with Debezium or other CDC tools.
- Experience with Kubernetes and containerized data services.
- Awareness of data governance and compliance requirements.
Core Benefits and Perks
- Employee Share Options (ESOP) — we mean what we say when we say, “act like an owner”!
- Competitive salary & OTE (as applicable)
- Up to 30 days vacation from day 1
- Pension benefit (specific to region)
- Health plans and wellbeing programs
- Give Back program
- Ministry of Happiness social club
- Educational resources and generous allowance to support development
- Inclusive policies - maternity, paternity & parent leave, as well as a focus on flexible working
- Recognition programs
Teamwork is an Equal Opportunity Employer, and qualified applicants will receive consideration for employment without regard to race, colour, religion, sex, sexual orientation, gender identity, or national origin.
#LI-TL1