DevOps Engineer
Motion Recruitment - Auburn, WA
Job Description
Our client, one of the world's largest automotive manufacturers, is hiring a DevOps Engineer to join their Enterprise Data and Advanced Analytics team in Auburn Hills, MI. This team supports data ingestion and processing. As an Engineer on this team, you will be responsible for orchestrating data pipelines using Python and Informatica and supporting data ingestion and processing using cloud technologies like GitHub Actions or Argo CD.

Responsibilities:

Design & Develop Data Pipelines
- Orchestrate complex data pipelines using data integration tools like Informatica and Python, ensuring seamless data flow from various sources.
- Leverage GCP Dataflow, Cloud Functions, and other cloud technologies to build scalable and resilient data ingestion and processing pipelines.
- Implement robust CI/CD workflows using GitHub Actions and Argo CD to automate pipeline deployments and ensure consistency.
- Monitor and manage production solutions; optimize and fine-tune models for performance, accuracy, and scalability.
- Document best practices and quality standards to be adhered to during development of data science solutions.
- Conduct reviews and provide feedback on data science applications.

Manage & Analyze Data
- Work with diverse data sources, including relational databases (Oracle, SQL Server, MySQL, Postgres, Snowflake), big data platforms (Hadoop, Parquet files, BigQuery, BigLake managed Iceberg), and streaming data (Kafka, GCP Dataflow/Dataproc).
- Employ powerful compute engines like Hive, Impala, and Spark to analyze massive datasets and derive valuable insights.

Deliver Actionable Insights
- Collaborate with business stakeholders to understand their challenges and requirements.
- Translate business problems into analytical frameworks and identify opportunities to address complex problems.
- Build APIs and user-friendly interfaces to present data results and empower informed decision-making.

Drive Machine Learning Innovation
- Explore and implement Vertex AI models to generate quick insights and support business requirements.

Stay at the Forefront
- Continuously learn and adapt to emerging data technologies and best practices.
- Contribute to the ongoing improvement of data infrastructure and processes.

Requirements:
- Bachelor's Degree in STEM
- 4+ years of Python development experience
- Experience building machine learning and data pipelines
- Experience leveraging cloud technologies to scale data ingestion using Informatica
- Experience implementing CI/CD processes using GitHub Actions and Argo CD for automation
- Experience working with diverse data sources, including relational databases (Oracle, SQL Server, MySQL, Postgres, Snowflake), big data platforms (Hadoop, Parquet files, BigQuery, BigLake managed Iceberg), and streaming data (Kafka, GCP Dataflow/Dataproc)
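To give a flavor of the pipeline-orchestration work described above, here is a minimal sketch in plain Python. Everything in it is hypothetical (the `Pipeline` class, the step names, and the sample records are invented for illustration); a production pipeline on this team would use Informatica, GCP Dataflow, or similar tooling rather than hand-rolled code:

```python
# Hypothetical sketch: chaining extract -> clean -> load stages in order,
# the basic pattern behind pipeline orchestration. Not the client's stack.
from typing import Callable


class Pipeline:
    """Runs registered steps in order, passing each step's output to the next."""

    def __init__(self) -> None:
        self.steps: list[tuple[str, Callable]] = []

    def step(self, name: str):
        def register(fn: Callable) -> Callable:
            self.steps.append((name, fn))
            return fn
        return register

    def run(self, data):
        for _name, fn in self.steps:
            data = fn(data)  # each stage transforms the payload
        return data


pipeline = Pipeline()


@pipeline.step("extract")
def extract(_):
    # Stand-in for reading from a source such as Kafka or Postgres.
    return [{"vin": "1C4...", "miles": 12000}, {"vin": "2C3...", "miles": None}]


@pipeline.step("clean")
def clean(rows):
    # Drop records with missing mileage before loading.
    return [r for r in rows if r["miles"] is not None]


@pipeline.step("load")
def load(rows):
    # Stand-in for writing to a warehouse such as BigQuery or Snowflake;
    # here we simply return the number of rows that would be loaded.
    return len(rows)


result = pipeline.run(None)
```

In a real deployment, a GitHub Actions or Argo CD workflow (as the posting requires) would test and ship a pipeline like this on every change, rather than it being run by hand.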
Created: 2024-10-19