Lead Data Product Engineer
Robert Half - Atlanta, GA
Job Description
LOCATION: ATLANTA, GA (HYBRID)

Robert Half is searching for a Lead Data Product Engineer, Data and Analytic Platforms and Products. In this lead technical role, you will design, build, and manage complex data pipelines and infrastructure to create scalable, reusable data products, while collaborating closely with data product leads, data modelers, data platform engineers, and architects to ensure data products meet the company's data strategy, align with business objectives, and promote data governance, data quality, and accessibility standards. You will oversee the entire data product development lifecycle, from conception to deployment, while mentoring junior engineers and driving innovation within the data engineering teams. This Lead Data Product Engineer plays a pivotal role in the company's Data Strategy, helping to deliver continuous value across the organization.

What You Will Do

Principle Based Management: Promote and work within a framework that aligns platform decisions and work efforts with our Principle Based Management culture. Foster a vision based on value opportunities and a culture of collaboration, innovation, and continuous improvement.

Data Pipeline Architecture: Design, implement, and maintain robust data pipelines for data ingestion, transformation, and loading in the creation of reusable data products, ensuring data integrity and scalability.

Data Product Development: Lead the design and data architecture of data products, aligned to business process areas and data domains, to facilitate broad data reusability across use cases including BI reporting, visualizations, dashboards, AI/ML, APIs, and custom analytics solutions.
Lead teams in data product best practices, standards, and data preparation for traditional and advanced use cases.

Team Leadership: Mentor and guide junior data engineers by creating and sharing design principles, assigning tasks, reviewing code, and fostering a collaborative team environment.

Stakeholder Management: Collaborate across Data and Analytics and business teams, including data product leads, portfolio leaders, product owners, delivery leads, architects, platform engineers, data analysts, and business leaders, to understand data requirements, translate business needs into technical solutions, and communicate data insights.

Data Quality Assurance: Establish and enforce data quality metrics and standards, including data validation checks and cleansing processes, to maintain data accuracy.

Data Governance: Contribute to the development and implementation of data governance policies, ensuring data processing capabilities support data security and compliance.

Technology Selection: Evaluate and implement new data technologies, tools, and frameworks to optimize data processing and analysis capabilities for performance and cost.

TCO and Performance Optimization: Establish monitoring of data pipeline performance, identify bottlenecks, and implement optimizations to ensure efficient data processing and the best value-cost trade-offs.

Backlog Management: Oversee and prioritize data product pipeline backlogs based on business value, technical efficiency, and alignment with Data & Analytics strategy and priorities, ensuring user stories clearly articulate technical requirements and acceptance criteria.

Collaboration with Platform Teams: Work closely with data engineers and architects to ensure technical feasibility and efficient implementation of data platform features.

Communication and Transparency: Develop documentation and artifacts that clearly communicate data platform goals, progress, and challenges to stakeholders through regular updates and presentations. Understand and represent the data and analytics strategy and platform capability in meetings. Socialize key concepts to enable broader understanding and adoption of data platform technology.

Risk Management and FinOps: Identify potential operational and financial risks. Use Economic Thinking to assess data processing and usage costs and optimizations.

Agile Practices: Participate in agile ceremonies such as sprint planning, backlog grooming, sprint reviews, and retrospectives to effectively manage the product development process.

Who You Are (Basic Qualifications)

At least 10 years of experience as a technical lead in data architecture and data engineering practices.
Hands-on expertise in Python, including data engineering libraries, SQL, and other data manipulation languages. Proficiency in applying DevOps practices.
Proven experience developing software products that are configuration-driven, automated, and self-healing.
Proficiency in modern AWS cloud data engineering practices and services (e.g., Spark, Glue, Lambda, Step Functions, S3, Delta Lake, Redshift, Snowflake, Elasticsearch, DynamoDB, Firehose, Kafka).
Extensive experience creating and using data architecture and modeling to design and implement efficient data models for data lakes and data products.
Proven experience leading and mentoring data engineering teams, with a track record of innovative design and build practices.

What Will Put You Ahead

Proven experience designing and implementing data products, architectures, and cloud data processing frameworks using AWS services, with advanced expertise in Python data engineering and Data Vault modeling concepts, along with a strong understanding of software engineering best practices.
A clear understanding of, and experience delivering, broad, business-based, reusable data products.
Experience with the full software development lifecycle, including coding standards, unit testing, CI/CD, and multi-environment development.
Certifications in cloud technologies or data engineering.
Created: 2025-03-01