Relocate. We are from Ukraine
Principal Data Engineer

Dublin, Ireland

Optum

Basic relocation package
Adaptation tips
Visa services

About Optum

Careers with Optum.
Here's the idea. We built an entire organization around one giant objective: make health care work better for everyone. So when it comes to how we use the world's largest accumulation of health-related information, guide health and lifestyle choices, or manage pharmacy benefits for millions, our first goal is to leap beyond the status quo and uncover new ways to serve.
Optum, part of the UnitedHealth Group family of businesses, brings together some of the greatest minds and most advanced ideas on where health care has to go in order to reach its fullest potential. For you, that means working on high-performance teams against sophisticated challenges that matter. Optum: incredible ideas in one incredible company, and a singular opportunity to do your life's best work.

Position

As a Principal Data Engineer, you will be part of a team working at the intersection of financial services and health care to launch new cash flow solutions that drive broader, more equitable access to capital. You will focus on developing cutting-edge data pipelines that form the foundation of our big data analytics platform. As a technical leader focused on innovation, you will develop proofs of concept that leverage data in new and exciting ways. At Optum, you'll find an organization that will recognize your talents and provide growth opportunities. Join us!

Primary Responsibilities

  • Develop data pipelines to ingest and transform data using clean coding principles
  • Contribute to common frameworks and best practices in code development, deployment and automation/orchestration of data pipelines
  • Partner with Data Science and Product leaders to design best practices and standards for developing and productionizing analytic pipelines
  • Partner with Infrastructure leaders on architecture approaches to advance the data and analytics platform, including exploring new tools and techniques that leverage the cloud environment (Azure, Snowflake, others)
  • Design conceptual and logical data models, architecture diagrams and flowcharts
  • Mentor and train junior members of the team
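
As a rough illustration of the first responsibility, here is a minimal Python sketch of an ingest-and-transform step written with small, single-purpose functions. The field names (`member_id`, `claim_amount`) are invented for the example; the actual Azure/Snowflake stack is not shown.

```python
# Minimal batch ingest-and-transform sketch using only the standard library.
# Field names are hypothetical; real pipelines would read from cloud storage.
import csv
import io

def ingest(raw_csv: str) -> list[dict]:
    """Parse raw CSV text into a list of row dicts."""
    return list(csv.DictReader(io.StringIO(raw_csv)))

def transform(rows: list[dict]) -> list[dict]:
    """Normalize types and drop records missing the key field."""
    out = []
    for row in rows:
        if not row.get("member_id"):
            continue  # skip incomplete records
        out.append({
            "member_id": row["member_id"],
            "claim_amount": float(row["claim_amount"]),
        })
    return out

raw = "member_id,claim_amount\nA1,120.50\n,99.00\nB2,75.25\n"
print(transform(ingest(raw)))
```

Keeping ingest and transform as separate pure functions is one reading of "clean coding principles": each step can be unit-tested and reused independently.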

Your qualifications

  • Extensive hands-on experience utilizing best practices to develop and deploy data pipelines to a production environment
  • Experience working with both real-time and batch data, knowing the strengths and weaknesses of each and when to apply one over the other
  • Ability to debug complex data issues while working on very large data sets with billions of records
  • Fluent in SQL (any flavor), with experience using window functions and other advanced features
  • Understanding of DevOps tools, Git workflow and building CI/CD pipelines
  • Ability to work with business and technical audiences in business requirement meetings, technical whiteboarding exercises, and SQL coding/debugging sessions
  • Familiarity with Airflow or a similar orchestration tool
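
The window-function requirement above can be illustrated with a small, self-contained example run against an in-memory SQLite database (SQLite 3.25+, bundled with modern Python). The `claims` table and its columns are invented for the demo.

```python
# Ranking each member's claims with a SQL window function.
# Runs against in-memory SQLite; table/column names are hypothetical.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE claims (member_id TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO claims VALUES (?, ?)",
    [("A1", 120.5), ("A1", 80.0), ("B2", 75.25)],
)

# ROW_NUMBER() numbers rows within each member's partition,
# largest claim first.
rows = conn.execute("""
    SELECT member_id, amount,
           ROW_NUMBER() OVER (
               PARTITION BY member_id
               ORDER BY amount DESC
           ) AS rnk
    FROM claims
    ORDER BY member_id, rnk
""").fetchall()
print(rows)
```

The same `PARTITION BY ... ORDER BY` pattern carries over to Snowflake and Spark SQL with minor dialect differences.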

Nice to have

  • Experience working in projects with agile/scrum methodologies
  • Experience with data de-identification and encryption
  • Experience with shell scripting languages
  • Familiar with Virtual Data Warehousing (e.g. Snowflake)
  • Well versed in Python across multiple general-purpose use cases, including but not limited to developing data APIs and pipelines
  • Experience with Apache Spark and the related big data stack and technologies, preferably using Scala, otherwise PySpark
  • Experience working with Apache Kafka, building appropriate producer/consumer apps
  • Familiarity with production quality ML and/or AI model development and deployment
  • Experience working with Kubernetes and Docker, and knowledgeable about cloud infrastructure automation and management (e.g., Terraform)
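
One common approach to the data de-identification point above is pseudonymization: replacing a direct identifier with a salted hash. The sketch below is illustrative only; the salt value and field names are invented, and a real deployment would manage the secret outside the code (e.g., a key vault).

```python
# Sketch of pseudonymization via salted SHA-256.
# SALT and field names are hypothetical; store real secrets in a vault.
import hashlib

SALT = b"example-salt"  # illustrative; never hard-code in production

def pseudonymize(member_id: str) -> str:
    """Deterministic pseudonym: same input always yields the same token."""
    return hashlib.sha256(SALT + member_id.encode()).hexdigest()[:16]

record = {"member_id": "A1", "claim_amount": 120.5}
record["member_id"] = pseudonymize(record["member_id"])
print(record)
```

Determinism preserves joinability across datasets while hiding the raw identifier; note that salted hashing alone is pseudonymization, not full anonymization.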

Additional details

A relocation package is discussed individually.


Facts about Dublin

  • Cost of Living Index: 85/100
  • Median rent, city-centre apartment (1–3 bedroom): $1,936 – $3,679
  • Safety Index: 50/100
© 2024 Relocate.me | All Rights Reserved

Proudly built by Ukrainians 🇺🇦
