As an Azure BDE Senior Data Engineer, you will
- Lead the design and implementation of data management (Data Lake / Lakehouse / Data Mesh) solutions on the Azure cloud using PaaS services.
- Collaborate with Architects on brainstorming, design patterns, and POCs, and implement solutions that adhere to the defined architecture and standards.
- Collaborate closely with a team of data engineers in designing frameworks and pipelines for ingestion, integration, and data quality, and in deriving metrics for the serving layer.
- Manage requirements analysis, design, work delegation, code templates, troubleshooting, code reviews, and deployments.
- Collaborate with business domain experts, data scientists, and application developers to develop solutions.
- Explore and learn new technologies for creative business problem solving, and mentor a team of Data Engineers.
Skillset expectations:
- Strong hands-on experience in implementing data lakes using technologies such as Databricks (Spark), Microsoft Fabric, Azure Data Factory, and Azure Data Lake.
- Strong programming skills in Python/Scala, PySpark, and SQL.
- Expertise in implementing complex designs for data ingestion and transformation: handling batch, micro-batch, streaming, and event-driven pipelines; deriving metrics; applying data harmonization; and building data marts and aggregate layers.
- Ability to understand data (attributes, metrics), relate it at a high level to the reporting/analytical use cases of various business functions, and reflect those nuances in designs for scalability and availability.
- Expertise in code analysis, modularized design, coding best practices, code reviews, test strategies, troubleshooting, and performance optimization.
- Experience collaborating with Architects to implement a governance layer for data quality, data cataloging, data lineage, and data security.
- Experience in setting up a consumption layer and serving data assets securely for BI reporting, ad hoc analytics, DS/ML use cases, and external users.
- Ability to upskill and experiment quickly, creating POCs/prototypes when evaluating new features and design patterns, adopting open-source frameworks, and building custom solutions.
- Exposure to DevOps integration: CI/CD branching strategies and deployments for code components developed in Data Factory, Databricks, Functions, etc.
- 6+ years of technical experience, with at least 4 years on Microsoft Azure and big data technologies.
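
For illustration, the ingestion / data-quality / metric-derivation flow named above can be sketched in a few lines of plain Python (a toy stand-in for a PySpark pipeline; the `Order` record, field names, and quality rule are all invented for this example, not part of the role description):

```python
from dataclasses import dataclass
from typing import Iterable

# Hypothetical raw record as it might land in an ingestion (bronze) layer.
@dataclass
class Order:
    order_id: str
    region: str
    amount: float

def validate(records: Iterable[Order]) -> list[Order]:
    """Toy data-quality gate: keep only records with an id and a positive amount."""
    return [r for r in records if r.order_id and r.amount > 0]

def derive_metrics(records: Iterable[Order]) -> dict[str, float]:
    """Toy aggregate layer: total order amount per region."""
    totals: dict[str, float] = {}
    for r in records:
        totals[r.region] = totals.get(r.region, 0.0) + r.amount
    return totals

raw = [
    Order("o1", "EMEA", 120.0),
    Order("o2", "APAC", -5.0),   # fails the quality check
    Order("o3", "EMEA", 80.0),
]
clean = validate(raw)            # quality layer
metrics = derive_metrics(clean)  # serving/aggregate layer
```

In a real engagement the same shape would be expressed as PySpark DataFrame transformations over Azure Data Lake storage, with the quality rules and aggregations factored into reusable framework components.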