AWS Data Engineer
Brooksource - Charlotte, NC
Job Description
Location: Charlotte, NC (hybrid, 2-3 days/week in office)
3-year contract with opportunity for extension or full-time hire
W-2 only (cannot accommodate Corp-to-Corp or 1099)

Brooksource is searching for an AWS Data Engineer with expertise in data warehousing using AWS Redshift to join our Fortune 500 Energy & Utilities client in Charlotte, NC.

RESPONSIBILITIES:
- Provides technical direction, guides the team on key technical aspects, and is responsible for product tech delivery
- Leads the design, build, test, and deployment of components, where applicable in collaboration with Lead Developers (Data Engineer, Software Engineer, Data Scientist, Technical Test Lead)
- Understands the requirements and use case to outline technical scope and lead delivery of the technical solution
- Confirms the required developers and skill sets specific to the product
- Provides leadership, direction, peer review, and accountability to developers on the product (key responsibility)
- Works closely with the Product Owner to align on delivery goals and timing
- Assists the Product Owner with prioritizing and managing the team backlog
- Collaborates with Data and Solution Architects on key technical decisions, including the architecture and design to deliver the requirements and functionality
- Skilled in developing data pipelines, focusing on long-term reliability and maintaining high data quality
- Designs data warehousing solutions with the end user in mind, ensuring ease of use without compromising performance
- Manages and resolves issues in production data warehouse environments on AWS

REQUIRED SKILLS:
- 5+ years of AWS experience, specifically including AWS Redshift
- AWS services: S3, EMR, Glue Jobs, Lambda, Athena, CloudTrail, SNS, SQS, CloudWatch, Step Functions, QuickSight
- Experience with Kafka messaging, preferably Confluent Kafka
- Experience with EMR databases such as Glue Catalog, Lake Formation, Redshift, DynamoDB, and Aurora
- Experience with Amazon Redshift for AWS data warehousing
- Proven track record in the design and implementation of data warehouse solutions using AWS
- Skilled in data modeling and executing ETL processes tailored for data warehousing
- Competence in developing and refining data pipelines within AWS
- Proficiency in handling both real-time and batch data processing tasks
- Extensive understanding of database management fundamentals
- Expertise in creating alerts and automated solutions for handling production problems
- Tools and languages: Python, Spark, PySpark, and Pandas
- Infrastructure as Code technology: Terraform, CloudFormation
- Experience with a secrets management platform such as Vault or AWS Secrets Manager
- Experience with event-driven architecture
- DevOps pipelines (CI/CD); Bitbucket; Concourse
- Experience with RDBMS platforms and strong proficiency with SQL
- Experience with REST APIs and API Gateway
- Deep knowledge of IAM roles and policies
- Experience using AWS monitoring services such as CloudWatch, CloudTrail, and CloudWatch Events
- Deep understanding of networking: DNS, TCP/IP, and VPN
- Experience with an AWS workflow orchestration tool such as Airflow or Step Functions

PREFERRED SKILLS:
- Experience with native AWS technologies for data and analytics, such as Kinesis and OpenSearch
- Databases: DocumentDB, MongoDB
- Hadoop platform (Hive; HBase; Druid)
- Java, Scala, Node.js
- Workflow automation
- Experience transitioning on-premises big data platforms to cloud-based platforms such as AWS
- Strong background in Kubernetes, distributed systems, microservice architecture, and containers

ADDITIONAL REQUIREMENTS:
- Ability to perform hands-on development and peer review for certain components/tech stack on the product
- Standing up development instances and a migration path (with required security, access/roles)
- Develop components and related processes (e.g., data pipelines and associated ETL processes, workflows)
- Lead implementation of an integrated data quality framework
- Ensures optimal framework design and load-testing scope to optimize performance (specifically for Big Data)
- Supports data scientists with testing and validation of models
- Performs impact analysis and identifies risks of design changes
- Ability to build new data pipelines, identify existing data gaps, and provide automated solutions to deliver analytical capabilities and enriched data to applications
- Ability to implement data pipelines with the right attentiveness to durability and data quality
- Implements data warehousing products with the end user's experience in mind (ease of use with the right performance)
- Ensures test-driven development
- 5+ years of experience leading teams to deliver complex products
- Strong technical and communication skills
- Strong skills in business stakeholder interactions
- Strong solutioning and architecture skills
- 5+ years of experience building real-time, event-driven data ingestion streams
- Ensure data security and permissions solutions, including data encryption, user access controls, and logging

Eight Eleven Group provides equal employment opportunities (EEO) to all employees and applicants for employment without regard to race, color, religion, national origin, age, sex, citizenship, disability, genetic information, gender, sexual orientation, gender identity, marital status, amnesty, or status as a covered veteran, in accordance with applicable federal, state, and local laws.
Created: 2024-11-05