AWS Cloud Engineer w/ Machine Learning Ops
Qualitative Financials - Boston, MA
Job Description
Python AWS Engineer with CI/CD and SageMaker

Location: Boston, MA. Onsite 1 week per month (the manager picks the week); onsite 2 weeks per month starting Sept. 2024. Local candidates only.
Visa: Any
Experience: 8+ years
Contract: W2 (no C2C)

Hiring Manager Notes: The team asks that candidates include URLs to any certifications listed on their resumes. This person will be deploying ML models via AWS SageMaker. There will NOT be any new development from scratch. We DO NOT need a Data Scientist or ML Engineer for this role; the team will not review those profiles. This is more of an AWS Cloud Engineer with some ML Operations in their background.

Domain Knowledge: ML experience in the background.

Job Description: AWS Cloud Engineer w/ Machine Learning Ops

As a Cloud Engineer, you will build and maintain large-scale ML infrastructure and ML pipelines; contribute to building advanced analytics and a machine learning platform and tools that enable both prediction and optimization of models; extend the existing ML platform and frameworks for scaling model training and deployment; and partner closely with various business and engineering teams to drive the adoption and integration of model outputs. This role is a critical element in using the power of Data Science to deliver Fidelity's promise of creating the best customer experiences in financial services.

The Team
The PI Data Engineering team (part of the Personal Investing Technology BU) is focused on delivering data and ML solutions for the organization. As part of this team, you will be responsible for building advanced analytics solutions using various cloud technologies and collaborating with Data Scientists to robustly scale ML models to large volumes in production.

The Expertise You Have
- Bachelor's or Master's degree in a technology-related field (e.g.
Engineering, Computer Science, etc.).
- Experience in object-oriented programming (Java, Scala, Python), SQL, Unix scripting, or related programming languages, and exposure to some of Python's ML ecosystem (NumPy, pandas, scikit-learn, TensorFlow, etc.).
- Experience building cloud-native applications using AWS services such as S3, RDS, CFT, SNS, SQS, Step Functions, EventBridge, CloudWatch, etc.
- Experience building data pipelines to get the data required to build, deploy, and evaluate ML models, using tools such as Apache Spark, AWS Glue, or other distributed data processing frameworks.
- Data movement technologies (ETL/ELT), messaging/streaming technologies (AWS SQS, Kinesis/Kafka), relational and NoSQL databases (DynamoDB, EKS, graph databases), and API and in-memory technologies.
- Strong knowledge of developing highly scalable distributed systems using open-source technologies.
- 5+ years of proven experience implementing Big Data solutions in the data analytics space.
- Experience developing ML infrastructure and MLOps in the cloud using AWS SageMaker.
- Extensive experience working with machine learning models with respect to deployment, inference, tuning, and measurement (required).
- Experience with CI/CD tools (e.g., Jenkins or equivalent), version control (Git), and orchestration/DAG tools (AWS Step Functions, Airflow, Luigi, Kubeflow, or equivalent).
- Solid experience with Agile methodologies (Kanban and Scrum).

The Skills You Bring
- You have strong technical design and analysis skills.
- You have the ability to deal with ambiguity and work in a fast-paced environment.
- You have experience supporting critical applications.
- You are familiar with applied data science methods, feature engineering, and machine learning algorithms.
- You have data wrangling experience with structured, semi-structured, and unstructured data.
- You have experience building ML infrastructure, with an eye toward software engineering.
- You have excellent communication skills, both through written and verbal channels.
- You have excellent collaboration skills for working with multiple teams in the organization.
- You have the ability to understand and adapt to changing business priorities and technology advancements in the Big Data and Data Science ecosystem.

The Value You Deliver
- Designing and developing a feature generation and store framework that promotes sharing of data/features among different ML models.
- Partnering with Data Scientists to help them use the foundational platform upon which models can be built and trained.
- Operationalizing ML models at scale (e.g., serving predictions for tens of millions of customers).
- Building tools to detect shifts in the data/features used by ML models, to identify issues ahead of deteriorating prediction quality; monitoring the uncertainty of model outputs; and automating prediction explanations for model diagnostics.
- Exploring new technology trends and leveraging them to simplify our data and ML ecosystem.
- Driving innovation and implementing solutions with forward thinking.
- Guiding teams to improve development agility and productivity.
- Resolving technical roadblocks and mitigating potential risks.
- Delivering system automation by setting up continuous integration/continuous delivery pipelines.
Created: 2024-11-19