Job Posting: Spark Engineer
Location: Bangalore / Pune (5-day work from office)
Experience: 4 to 6 years
Notice Period: 15-20 days only
Are you a Spark enthusiast ready to ignite your career? We're seeking a talented Spark Engineer to join our dynamic team and drive our data processing capabilities to new heights.
Key Responsibilities:
• Develop and optimize data pipelines using Apache Spark
• Implement Spark SQL for efficient data transformations
• Design and maintain workflows using Apache Airflow
• Collaborate on cloud-based solutions (AWS, Azure, or Google Cloud)
• Contribute to continuous integration and deployment processes
Required Skills and Experience:
• 4-6 years of hands-on experience with Apache Spark, including Spark SQL and PySpark
• Strong proficiency in Python and/or Scala programming
• Proven track record in developing and maintaining Spark applications
• Experience with Apache Airflow for workflow orchestration
• Solid understanding of SQL and relational databases
• Familiarity with data modeling and data warehousing concepts
• Experience working with cloud platforms (AWS, Azure, or Google Cloud)
Preferred Qualifications:
• Knowledge of containerization (Docker) and orchestration (Kubernetes)
• Understanding of DevOps practices and CI/CD pipelines
• Experience with Big Data technologies and frameworks
Ideal Candidate:
• Demonstrates strong problem-solving skills and attention to detail
• Thrives in a fast-paced, agile environment
• Possesses excellent communication and collaboration abilities
• Shows enthusiasm for staying current with emerging technologies
We offer an exciting opportunity to work on cutting-edge data processing projects in a collaborative environment. This position requires working on-site five days a week from our office in Bangalore or Pune.
If you're passionate about Spark development and ready to make an immediate impact, we want to hear from you!
#SparkEngineer #ApacheSpark #DataEngineering #BigData #PySpark #SparkSQL #CloudComputing #AgileMethodology #ImmediateOpening