Job Title: Senior GCP Data Engineer
Experience Level: Minimum 7 Years
Location: Remote
Job Overview:
We are seeking a highly experienced Senior GCP Data Engineer with a strong background in Google Cloud Platform (GCP) services and data engineering tools. The ideal candidate has extensive hands-on experience with GCP services, including Dataflow, and is proficient in DBT (Data Build Tool). Familiarity with SQL, Python, and Terraform is a plus. Strong communication skills are critical for effective collaboration with teams and stakeholders.
Key Responsibilities:
- Architect, develop, and maintain scalable data pipelines using GCP services, focusing on Dataflow and other relevant tools like BigQuery, Cloud Run, Cloud Functions, Pub/Sub, and Cloud Composer.
- Work extensively with DBT for data modeling, transformations, and optimization of data workflows.
- Collaborate with cross-functional teams to define and deliver data-driven solutions for business challenges.
- Ensure high performance and reliability of data operations by continuously optimizing systems and processes.
- Implement and manage infrastructure as code using Terraform (nice to have).
- Write and maintain clean, efficient code in SQL and Python for data manipulation and analysis.
- Deliver solutions that align with security, compliance, and operational excellence standards.
- Communicate effectively with teams and stakeholders to ensure smooth project delivery and issue resolution.
Required Skills & Experience:
- Minimum of 7 years of experience in data engineering, including work with GCP services.
- Strong hands-on experience with GCP services, which must include Dataflow.
- Proficiency in DBT (Data Build Tool) – MUST HAVE.
- Familiarity with SQL and Python – NICE TO HAVE.
- Experience with Terraform – NICE TO HAVE.
- Excellent problem-solving skills and ability to work in a fast-paced environment.
- Strong communication and collaboration skills.