Job Overview
Our client is currently looking for a Data Engineering Lead/Architect – ETL/Snowflake DB.
Key Responsibilities :
Architect and Design Data Solutions :
– Collaborate with cross-functional teams to understand business requirements and design scalable and efficient data solutions.
– Define data architectures, data models, and data integration strategies to meet business objectives.
– Provide technical leadership in designing and implementing data pipelines, ETL processes, and data warehousing solutions.
Snowflake Expertise :
– Leverage your in-depth knowledge of Snowflake technologies to build and optimize data warehouses.
– Develop and maintain Snowflake data models and schemas to support reporting and analytics needs.
– Apply Snowflake best practices, such as cost analysis, resource allocation, data loading strategies (e.g., bulk loading, incremental loading), and security configurations.
Azure and Databricks Proficiency :
– Utilize Azure cloud services and the Databricks platform to manage and process large datasets efficiently.
– Build, deploy, and maintain data pipelines on Azure Data Factory, Azure Databricks, and other Azure services.
Data Warehousing and Integration :
– Implement best practices for data warehousing, ensuring data quality, consistency, and reliability.
– Create and manage data integration processes, including real-time and batch data movement between systems.
SQL and PL/SQL Mastery :
– Write complex SQL and PL/SQL queries to extract, transform, and load data effectively.
– Optimize SQL queries and database performance for high-volume data processing.
Performance Tuning and Optimization :
– Continuously monitor and enhance the performance of data pipelines and data storage systems.
– Troubleshoot and resolve data-related issues to minimize downtime.
Documentation and Collaboration :
– Document data engineering processes, data flows, and architectural decisions.
– Collaborate with data scientists, analysts, and other stakeholders to ensure data availability and usability.
Security and Compliance :
– Implement data security measures and adhere to compliance standards (e.g., GDPR, HIPAA) to protect sensitive data.
Additional Expectations :
Apart from 10+ years of experience and technical skills such as Snowflake and Azure Databricks, candidates must also demonstrate the following abilities:
– Initiate and drive data engineering strategies.
– Assist and engage in data engineering sales and proposal activities.
– Develop strong customer relationships for ongoing business.
– Experience with cloud-based data solution architectures, especially Snowflake.
– Experience with client engagement & aligning others to your ideas/solutions.
– Experience leading a technical team.
– Ability to clarify/translate customer requirements into Epics/Stories & remove ambiguity.
– Ability to mentor other DE team members.
Qualifications :
– Bachelor’s or Master’s degree in Computer Science, Information Technology, or a related field.
– Over 10 years of experience in Data Engineering, with a strong focus on architecture.
– Proven expertise in Snowflake, Azure, and Databricks technologies.
– Comprehensive knowledge of data warehousing concepts, ETL processes, and data integration techniques.
– Exceptional SQL and PL/SQL skills.
– Experience with performance tuning and optimization of data pipelines.
– Strong problem-solving skills and the ability to work in a fast-paced, collaborative environment.
– Excellent communication skills to convey technical concepts to non-technical stakeholders.
– Certifications in relevant technologies (e.g., Snowflake, Azure) are a plus.