Data Pipeline Engineer

Position Location(s):  Charleston, SC

Position Description: We are looking for a talented, forward-thinking Data Pipeline Engineer to be a critical part of a fast-moving Marine Corps product development team. This team is responsible for the design, development, testing, implementation, and support of a wide range of cloud-based data integrations, data pipelines, and data warehousing solutions, and for the delivery of Business Intelligence (BI) services to Marines globally.

What you will be doing:

  • You will identify and deliver data pipeline solutions every day!
  • In this role you will work on a cross-functional team to build data pipeline solutions and processes that enable data warehousing, analytics, and reporting for the USMC Logistics community.
  • You will own all aspects of our existing and future data pipelines!
  • Work with the Product Owner team and data analysts to develop efficient ETL processes using Apache NiFi, Kafka, and other data pipeline tools.
  • Build performant data transformations for analysis and reporting.
  • Utilize modern ETL/ELT practices to move data into our Data Lake (AWS S3) and Warehouse (AWS Redshift).
  • Own the design, code development, code reviews, testing, data quality monitoring, deployment activities, and operations support for all NiFi flows.
  • Work alongside the infrastructure, data ingestion, data warehouse, and analytics teams to manage and optimize ETL and data pipeline parameters, scheduling, monitoring, and alerting.
  • Support development of the data dictionary, connection guides, and operations and user support SLAs.

What you bring along:

  • Proven analytical, troubleshooting, and problem-solving skills.
  • Ability to communicate well cross-functionally, including non-technical users.
  • Expert-level ability with Apache NiFi to design, test, deploy, and manage pipeline solutions and flows.
  • AWS Glue experience highly desired.
  • Self-starting, responsive, and collaborative in a remote environment, working with a distributed team.
  • Commitment to staying current with the latest design and technology trends and adopting modern techniques as appropriate.

Bonus points:

  • Experience with open source software
  • Experience with AWS GovCloud
  • Familiarity with AWS Glue and AWS Redshift
  • Familiarity with BI tools, e.g., Microsoft Power BI and Amazon QuickSight

Submit Resume (Please include the position title in the subject line when submitting):  Data Pipeline Engineer