Data Engineer IV
Location: Charlotte, NC (Hybrid)
Pay: $100/hr (W2 ONLY)
Contract: 24+ Months
About the Role:
We are seeking a highly skilled and knowledgeable Subject Matter Expert (SME) to join our team in advancing Data Fabric. This role involves developing and optimizing our interconnected data capabilities and products to deliver data efficiently and at scale. The ideal candidate has extensive experience in data engineering and platform development, with a strong foundation in Terraform, AWS, Apache-ecosystem tools (such as Hudi and Flink), and related technologies to help the team navigate obstacles and accelerate delivery to production.
Key Responsibilities
• Technical Direction: Provide technical guidance to the team on feature development and delivery.
• Product Collaboration: Work closely with Product Owners and Architects to align on technical goals, delivery timelines, and key decisions for data and solution design.
• Data Quality & Security: Lead the design and implementation of data quality checks, user access controls, data encryption, and logging practices.
• Innovation: Approach challenges with unconventional solutions, research to resolve issues, and propose alternative approaches as needed.
• Agile Engagement: Thrive in an Agile development environment, participating in regular reviews, mini Proofs of Concept (PoCs), and iterative improvements.
• Customer Interaction: Provide troubleshooting support, engage in stakeholder discussions, and guide other teams in using our products effectively.
Required Skills & Experience
• Experience: 5+ years in Data Engineering or Software Engineering, with the capability to support and advise other engineers.
• Core Technical Skills:
  • Version Control: Git
  • AWS: IAM, API Gateway, Lambda, Step Functions, Lake Formation, EKS, Glue, Athena, S3
  • Programming: Python, Java
  • Data Management: Apache Hudi, Flink, PostgreSQL, SQL, RDS
  • Infrastructure as Code: Terraform Enterprise (expertise in modules, providers, debugging)
• Additional Preferred Skills:
  • Event-Driven & Streaming: Kafka, Kafka Schema Registry, and big data tools (EMR, EKS, Hadoop, Spark)
  • AWS Tools: CloudTrail, SNS, SQS, CloudWatch, Aurora, Redshift
  • Containers & Distributed Systems: Kubernetes, Microservices, Concourse
  • Compliance & Standards: Knowledge of Data Lake, Data Warehouse, data security standards, and compliance needs.