This is a fully remote job; the offer is available from: United States
Job Role: Sr. Data Engineer
Location: Remote (USA)
Years of Experience: 5+ years
Our client is seeking a Senior Data Engineer to help modernize and scale our Business Intelligence data infrastructure. This role will be critical in transitioning our existing on-prem SQL Server–based dimensional data mart and SSIS pipelines into a cloud-native solution on Google Cloud Platform (GCP) using tools like DBT, BigQuery, and other GCP-native services.
This is a hands-on engineering role focused on building robust, scalable data pipelines and enabling performant data models that power Tableau dashboards used throughout the organization.
Responsibilities:
• Lead the migration of the existing SSIS-based ETL workflows to cloud-native pipelines using DBT and/or GCP tools such as Dataflow, Dataform, or Cloud Composer (Airflow).
• Design and implement scalable, efficient data models in BigQuery, following best practices for dimensional modeling.
• Optimize and maintain existing SQL transformations, ensuring correctness and performance in the cloud.
• Collaborate with BI developers and analysts to ensure data marts align with Tableau reporting needs.
• Ensure data integrity, security, and lineage through testing, documentation, and observability practices.
• Work with on-prem teams to phase out legacy systems and design transitional architectures where needed.
• Establish best practices and mentor junior engineers on cloud-native engineering patterns.
Qualifications:
Required:
• 5+ years of experience in data engineering with strong SQL and ETL skills.
• Experience with SSIS and legacy SQL Server–based data marts.
• Proficiency in Google BigQuery and/or similar cloud data warehouses.
• Hands-on experience with DBT or modern transformation frameworks.
• Strong knowledge of dimensional modeling and data warehousing principles.
• Experience migrating on-prem pipelines to cloud platforms.
• Familiarity with GCP-native services such as Cloud Storage, Pub/Sub, Dataflow, Composer, and IAM.
• Strong knowledge of Healthcare Information Systems.
Preferred:
• Experience supporting or integrating with Tableau-based BI solutions.
• Exposure to infrastructure-as-code tools like Terraform for GCP.
• Knowledge of data observability tools and practices.
• Comfortable with Git-based CI/CD for data pipeline deployments.
Nice to Have:
• Familiarity with GCP networking and cost optimization strategies.
• Experience with data validation or automated testing frameworks for pipelines.
• Knowledge of metadata management or cataloging tools (e.g., Data Catalog, Dataplex).
What You'll Bring:
• A builder's mindset with a bias for simplification and automation.
• A collaborative approach to working with BI and application teams.
• The ability to balance long-term platform scalability with short-term deliverables.
• A passion for cloud innovation and data platform modernization.
This offer from "Georgia IT, Inc." has been enriched by Jobgether.com and received an 82% flex score.
Apply Now