Chief Data Engineer
SAIC - McLean, VA
Job Description
SAIC's Space & Intel Business Group, USG Mission and Information Technology Division, is seeking a Chief Data Engineer to support a program that leverages integrated discrete technologies to provide massive data processing, storage, modeling, and analytics across several thousand unique data sources, to perform threat identification and analysis, and to support efforts to meet tactical and strategic goals. This position is in McLean, VA and requires an active TS/SCI clearance with polygraph.

Responsibilities include, but are not limited to:
- Develop, validate, and use methodologies to support analytic requirements in clustered computing environments.
- Support downstream systems and capabilities of external organizations that depend on the data platform via various approaches, including application programming interfaces (APIs).
- Develop integration plans that capitalize on new data processing, modeling, and storage technologies, including cloud environments.
- Evaluate data collections to assess their potential value-add and provide recommendations.
- Generate assessments about data, support data acquisition and engineering activities, and enable the processing of data so it is integrated into data platform systems for maximum value.
- Perform and support data modeling and engineering activities to integrate new data into the data platform's data corpus, refine existing and intermediate models to address deficiencies and defects, and create new models and data feeds to support existing and new analytic methodologies.

Qualifications:
- Active TS/SCI clearance with polygraph.
- Bachelor's degree in mathematics, computer science, engineering, or a similar scientific or technical discipline and 18 or more years of experience; Master's degree and 16 or more years of experience; or PhD or JD and 13 or more years of experience.
- Experience with: Python, Spark, Java, SQL, Jenkins, PyPI, Terraform, Cloudera, Elasticsearch, Pentaho, Apache NiFi, and Apache Hop.

Desired Skills:
- Graduate degree in computer science, information systems, engineering, or another scientific or technical discipline.
- Demonstrated experience using Enterprise Control Language (ECL) and the LexisNexis High Performance Computing Cluster (HPCC) platform.
- Experience performing all-source data analysis.
- Experience developing custom algorithms to support analytic requirements against massive data stores.
- Ability to perform technical analysis support using massive data processing systems.
- Demonstrated experience writing cables.
- Experience planning and coordinating program activities such as installation and upgrading of hardware and software, utilization of cloud services, programming or systems design and development, modification of IT networks, or implementation of Internet and intranet sites.
- Ability to deploy web applications to a cloud-managed environment, including DevOps and security configuration management.
- Experience developing, implementing, and maintaining cloud infrastructure services such as EC2, ELB, RDS, S3, and VPC.
- Ability to meet documentation data compliance requirements.
Created: 2024-11-06