Job Title: Data Engineer - Python/Spark Specialist
Experience: 4-10 years
Location: Bengaluru or Pune
Notice Period: Must be able to join within 0-30 days
Work Mode: Work from Office Only
We're seeking an exceptional Data Engineer to join our client's innovative team in Bengaluru or Pune. This role is ideal for candidates with strong Python or Spark skills who can hit the ground running and make an immediate impact.
Key Responsibilities:
• Develop and optimize data pipelines using Python and/or Spark.
• Architect solutions leveraging Snowflake, AWS, and other cutting-edge data technologies.
• Translate complex business requirements into efficient, scalable data models.
• Debug and enhance the existing codebase to improve performance and functionality.
• Collaborate with cross-functional teams in an Agile environment.
• Implement best practices for data engineering and foster a culture of continuous improvement.
Required Skills and Experience:
• 4-10 years of hands-on experience in data engineering.
• Advanced proficiency in Python or Spark programming.
• Strong SQL skills and experience with Snowflake or similar cloud data platforms.
• Familiarity with AWS services, Airflow, and GitLab.
• Bachelor's degree in Computer Science, Engineering, or a related field.
• Proven track record of delivering complex data projects.
• Excellent problem-solving and analytical skills.
Ideal Candidate:
• Demonstrates deep expertise in PySpark and data optimization techniques.
• Has experience with real-time data processing and streaming architectures.
• Shows enthusiasm for learning and adapting to new technologies.
• Possesses strong communication skills and can explain technical concepts clearly.
Note: This position is for immediate joiners only. Candidates must be available to start within the specified notice period.
If you're a data engineering powerhouse with Python or Spark expertise and ready to dive into challenging projects immediately, we want to hear from you. Join us in shaping the future of data-driven solutions!