A future with Welltech
Welltech’s purpose is to move each and every body to start and stay well for life! Every trend, every piece of feedback and data, and every experiment and innovative solution we introduce fuels the way we unlock wellness for and with our users. While appreciating and celebrating who they are, we strive to help them turn small choices into lasting, joyful habits. And that makes us proud of every step they take on their journey to living well, every day.
Disrupting the industry
We are an international tech company with origins in Ukraine, founded in 2015, and a fast-growing global player with five hubs across Europe. With more than 220 million installs of our apps, Welltech offers disruptive, engaging fitness and wellness solutions. We are genuinely committed to creating value and ensuring a workplace where our talented, diverse individuals feel they belong as we shape the future of the health and fitness industry. Everything we do is driven by our purpose and guided by integrity, a testament to the trust we build, today and tomorrow.
Spread across five dynamic hubs in Cyprus, Poland, Spain, Ukraine and the UK, with a team of 700+ #wellmakers, we provide an opportunity for everyone who is passionate about making an impact in the wellness industry.
As a Senior Data Engineer, you will play a crucial role in building and maintaining the foundation of our data ecosystem. You’ll work alongside data engineers, analysts, and product teams to create robust, scalable, and high-performance data pipelines and models. Your work will directly impact how we deliver insights, power product features, and enable data-driven decision-making across the company.
This role is perfect for someone who combines deep technical skills with a proactive mindset and thrives on solving complex data challenges in a collaborative environment.
Challenges You’ll Meet:
Pipeline Development and Optimization: Build and maintain reliable, scalable ETL/ELT pipelines using modern tools and best practices, ensuring efficient data flow for analytics and insights.
Data Modeling and Transformation: Design and implement effective data models that support business needs, enabling high-quality reporting and downstream analytics.
Collaboration Across Teams: Work closely with data analysts, product managers, and other engineers to understand data requirements and deliver solutions that meet the needs of the business.
Ensuring Data Quality: Develop and apply data quality checks, validation frameworks, and monitoring to ensure the consistency, accuracy, and reliability of data.
Performance and Efficiency: Identify and address performance issues in pipelines, queries, and data storage. Suggest and implement optimizations that enhance speed and reliability.
Security and Compliance: Follow data security best practices and ensure pipelines are built to meet data privacy and compliance standards.
Innovation and Continuous Improvement: Test new tools and approaches by building Proof of Concepts (PoCs) and conducting performance benchmarks to find the best solutions.
Automation and CI/CD Practices: Contribute to the development of robust CI/CD pipelines (GitLab CI or similar) for data workflows, supporting automated testing and deployment.
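To give a concrete flavor of the data-quality work described above, here is a minimal, self-contained Python sketch of row-level validation in a pipeline. The field names and rules are illustrative assumptions, not Welltech's actual validation framework:

```python
# Minimal sketch of row-level data-quality checks in a pipeline step.
# Field names ("user_id", "event_ts") and rules are illustrative only.

def validate_rows(rows, required_fields=("user_id", "event_ts")):
    """Split rows into (valid, rejected) based on simple presence checks."""
    valid, rejected = [], []
    for row in rows:
        missing = [f for f in required_fields if not row.get(f)]
        if missing:
            # Rejected rows keep a reason so they can be monitored or replayed.
            rejected.append({"row": row, "reason": f"missing: {missing}"})
        else:
            valid.append(row)
    return valid, rejected

rows = [
    {"user_id": 1, "event_ts": "2024-01-01T00:00:00Z"},
    {"user_id": None, "event_ts": "2024-01-01T00:01:00Z"},
]
valid, rejected = validate_rows(rows)
```

In a production pipeline, checks like these typically run as a dedicated task before the load step, with rejection counts exported to monitoring (e.g., Datadog) so quality regressions surface quickly.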
Tech Stack You’ll Work With:
Cloud: AWS (Redshift, Spectrum, S3, RDS, Lambda, Kinesis, SQS, Glue, MWAA)
Languages: Python, SQL
Orchestration: Airflow (MWAA)
Modeling: dbt
CI/CD: GitLab CI (including GitLab administration)
Monitoring: Datadog, Grafana, Graylog
Event validation: Iglu Schema Registry
APIs & Integrations: REST, OAuth, webhook ingestion
Infra-as-code (optional): Terraform
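As an illustration of the micro-batching pattern this stack supports (for example, draining a stream into fixed-size loads), here is a hand-rolled sketch. It is a simplified stand-in, not how Kinesis or MWAA are actually wired:

```python
from itertools import islice

def micro_batches(events, batch_size=500):
    """Yield fixed-size batches from an event iterator (last batch may be smaller)."""
    it = iter(events)
    while True:
        batch = list(islice(it, batch_size))
        if not batch:
            return
        yield batch

# Example: 1050 events split into batches of 500, 500, and 50
# before handing each batch to a downstream load step.
batches = list(micro_batches(range(1050), batch_size=500))
```

Batch size is the main tuning knob here: larger batches amortize per-load overhead, smaller batches reduce end-to-end latency.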
What You’ll Bring:
5+ years of experience in data engineering or backend development, with a strong focus on building production-grade data pipelines.
2-3+ years of solid, hands-on experience with AWS services (Spectrum, S3, RDS, Glue, Lambda, Kinesis, SQS); Redshift administration is a must.
Proficient in Python and SQL for data transformation and automation.
Experience with dbt for data modeling and transformation.
Good understanding of streaming architectures and micro-batching for real-time data needs.
Experience with CI/CD pipelines for data workflows (preferably GitLab CI).
Familiarity with event schema validation tools/solutions (e.g., Snowplow, Schema Registry).
Excellent communication and collaboration skills.
Strong problem-solving skills—able to dig into data issues, propose solutions, and deliver clean, reliable outcomes.
A growth mindset—enthusiastic about learning new tools, sharing knowledge, and improving team practices.
Nice to Have:
Experience with additional AWS services: EMR, EKS, Athena, EC2.
Hands-on knowledge of alternative data warehouses like Snowflake or others.
Experience with PySpark for big data processing.
Familiarity with event data collection tools (Snowplow, Rudderstack, etc.).
Interest in or exposure to customer data platforms (CDPs) and real-time data workflows.
Why Welltech?
Grow Together: Join a culture that champions both personal and professional growth. Here, you’ll thrive as we learn, evolve, and succeed together.
Lead by Example: No matter your role, your leadership matters. Every team member is empowered to inspire and make an impact.
Results-Driven: We’re all about achieving meaningful outcomes. It’s not just about the effort, but the difference we make every day.
We Are Well-Makers: Be part of a movement that’s creating a healthier, happier world. Together, we make well-being a reality!
Candidate journey: Recruiter call ➔ Technical call with the hiring manager ➔ Meet the future stakeholders
Check out some of our products
Muscle Booster — https://musclebooster.fitness/
Yoga-Go — https://yoga-go.io/
WalkFit — http://walkfit.pro