JOB DETAILS
REQUIREMENTS
- Experience with the Scaled Agile Framework (SAFe); certification preferred.
- Proficiency in CLI-based Linux environments for system-level scripting and troubleshooting.
- Familiarity with Visual Studio Code or other integrated development environments (IDEs).
- Hands-on experience with job scheduling and process automation tools (e.g., Airflow, Jenkins); see the Airflow sketch after this list.
- Knowledge of rules-based models and rules engines such as IBM ODM or Drools; see the pattern sketch after this list.
- Experience working with Centers for Medicare & Medicaid Services (CMS) systems.
- Understanding of Medicare/Medicaid claims processing and related compliance requirements.
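
For candidates unfamiliar with Airflow, the following is a minimal sketch of the scheduling work the requirement above refers to. It assumes Airflow 2.x; the DAG name, task names, and shell commands are invented for illustration.

```python
# A minimal Airflow 2.x DAG sketch; dag_id, task names, and the
# commands are illustrative placeholders, not a real pipeline.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="nightly_claims_ingest",      # hypothetical name
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",          # run once per day
    catchup=False,
) as dag:
    extract = BashOperator(
        task_id="extract",
        bash_command="echo 'pull source files'",
    )
    load = BashOperator(
        task_id="load",
        bash_command="echo 'load into warehouse'",
    )
    extract >> load                      # extract runs before load
```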
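Drools and IBM ODM are Java-based engines; as a language-neutral illustration, the Python sketch below shows the condition-action pattern such engines implement at scale. The claim fields and thresholds are invented for the example.

```python
# A toy illustration of the rules-engine pattern (condition -> action).
# Claim fields and thresholds are hypothetical.
from dataclasses import dataclass, field
from typing import Callable

@dataclass
class Claim:
    amount: float
    provider_enrolled: bool
    flags: list = field(default_factory=list)

# Each rule pairs a condition with an action on the matched fact.
RULES: list[tuple[Callable[[Claim], bool], Callable[[Claim], None]]] = [
    (lambda c: c.amount > 10_000,
     lambda c: c.flags.append("manual-review")),
    (lambda c: not c.provider_enrolled,
     lambda c: c.flags.append("reject-unenrolled")),
]

def evaluate(claim: Claim) -> Claim:
    for condition, action in RULES:
        if condition(claim):
            action(claim)
    return claim

print(evaluate(Claim(amount=12_500, provider_enrolled=True)).flags)
# ['manual-review']
```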
RESPONSIBILITIES
- Design, develop, and maintain scalable data processing pipelines using Apache Spark on Hadoop or Databricks (a PySpark sketch follows this list).
- Build and optimize middleware services using Java that integrate with Python-based data workflows and APIs.
- Develop and maintain complex SQL queries for ETL processes; experience with Snowflake is a plus.
- Leverage AWS services (e.g., EC2, S3, EMR, Lambda, Glue) to deploy and manage cloud-native data solutions (see the boto3 sketch after this list).
- Automate operational tasks using shell scripting and Python for job scheduling, data ingestion, and system monitoring (see the automation sketch after this list).
- Collaborate with cross-functional teams to ensure data integrity, security, and compliance across platforms.
- Communicate effectively with technical and non-technical stakeholders, including executive leadership.
- Participate in Agile ceremonies and contribute to sprint planning, backlog grooming, and release cycles.
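
A minimal PySpark sketch of the pipeline work described in the first responsibility; the schema and aggregation are illustrative, and the same code runs unchanged on a Hadoop/YARN cluster or a Databricks cluster.

```python
# A minimal PySpark sketch (assumes pyspark is installed); the column
# names and aggregation are illustrative, not a real claims schema.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("claims-rollup").getOrCreate()

df = spark.createDataFrame(
    [("A", 120.0), ("A", 80.0), ("B", 50.0)],
    ["provider_id", "paid_amount"],
)

# Aggregate paid amounts per provider.
rollup = df.groupBy("provider_id").agg(
    F.sum("paid_amount").alias("total_paid")
)
rollup.show()
```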
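A short boto3 sketch of staging data to S3, one of the AWS services named above. It assumes AWS credentials are already configured; the bucket and object names are placeholders.

```python
# A boto3 sketch for staging a file to S3; bucket and key names
# are hypothetical placeholders.
import boto3

s3 = boto3.client("s3")
s3.upload_file(
    Filename="claims_extract.csv",        # local file to stage
    Bucket="example-data-lake",           # hypothetical bucket
    Key="raw/claims/claims_extract.csv",  # destination object key
)
```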
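Finally, a small sketch of the shell-plus-Python automation pattern mentioned above: wrapping a shell step in Python so its exit status can be checked and logged. The command shown is a placeholder.

```python
# Wrap a shell step in Python to capture its output and exit status.
# The command is a placeholder for a real ingest or monitoring step.
import logging
import subprocess

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("ingest")

result = subprocess.run(
    ["sh", "-c", "echo 'ingest step'"],  # placeholder shell step
    capture_output=True,
    text=True,
)
if result.returncode == 0:
    log.info("step succeeded: %s", result.stdout.strip())
else:
    log.error("step failed (rc=%d): %s", result.returncode, result.stderr)
```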
Are you interested in this position?
Apply by clicking on the “Apply Now” button below!