JOB DETAILS
REQUIREMENTS
- Proven experience with data orchestration tools such as Airflow, cron, or Prefect, including dependency mapping.
- Strong skills in data analysis, data modeling, and schema design.
- Proficiency in Python, SQL, and shell scripting.
- Experience working with APIs, SFTP, and cloud storage platforms like Google Cloud Storage (GCS) or Amazon S3.
- Excellent analytical and problem-solving skills, including the ability to interpret application logs and debug data issues.
- Familiarity with cloud computing platforms, preferably Google Cloud Platform (GCP).
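The dependency-mapping skill listed above boils down to ordering tasks so that every step runs after its upstream inputs are ready. A minimal sketch of that idea using Python's standard-library `graphlib` (the task names are illustrative, not from this posting; an orchestrator such as Airflow or Prefect expresses the same edges as DAG task dependencies):

```python
from graphlib import TopologicalSorter

# Hypothetical pipeline tasks mapped to their upstream dependencies.
deps = {
    "load_warehouse": {"transform"},
    "transform": {"extract_api", "extract_sftp"},
    "extract_api": set(),
    "extract_sftp": set(),
}

# Resolve a valid execution order: upstream tasks come first.
order = list(TopologicalSorter(deps).static_order())
print(order)
```

Here both extracts run before the transform, and the warehouse load runs last, exactly the ordering an orchestrator derives from declared task dependencies.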
RESPONSIBILITIES
- Design, build, and maintain robust data pipelines using Airflow and Cloud Functions to meet evolving business requirements.
- Optimize and manage table schemas, views, and queries in our data warehouse and databases for performance and scalability.
- Conduct ad-hoc analyses to troubleshoot data-related issues and provide actionable insights into product and feature usage.
- Document data architecture, integration processes, and pipeline designs to ensure clarity and maintainability across teams.
- Collaborate with cross-functional teams to advise on data best practices for new product development.
- Mentor and support junior engineers, fostering a culture of learning and technical excellence.
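The schema, view, and index work described above can be sketched in miniature with SQLite as a stand-in for the warehouse (table and column names here are invented for illustration): an index speeds up filtered lookups on a hot column, while a view packages a common aggregation for downstream consumers.

```python
import sqlite3

# In-memory database standing in for a warehouse; schema is illustrative.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (user_id INTEGER, event_type TEXT, ts TEXT)")
conn.executemany(
    "INSERT INTO events VALUES (?, ?, ?)",
    [(1, "click", "2024-01-01"), (1, "view", "2024-01-02"), (2, "click", "2024-01-03")],
)

# Index a frequently filtered column so lookups avoid full scans.
conn.execute("CREATE INDEX idx_events_user ON events (user_id)")

# Encapsulate a common aggregation as a view for downstream queries.
conn.execute(
    "CREATE VIEW events_per_user AS "
    "SELECT user_id, COUNT(*) AS n_events FROM events GROUP BY user_id"
)

rows = conn.execute(
    "SELECT user_id, n_events FROM events_per_user ORDER BY user_id"
).fetchall()
print(rows)  # -> [(1, 2), (2, 1)]
```

The same pattern (indexes for scan-heavy filters, views for shared aggregations) carries over to warehouse engines, though each engine has its own performance tooling.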
Are you interested in this position?
Apply by clicking on the “Apply Now” button below!