Job Title: Data Engineer (Senior / Mid-Level) – ETL / Cloud / AWS – Top Secret Clearance
Location: Huntsville, Alabama (100% Onsite)
Clearance Required: Active Top Secret Clearance (U.S. Citizenship Required)
Employment Type: Full-Time Permanent or 1099 Consultant
Education: Bachelor’s Degree (B.S.) in Computer Science, Information Systems, Engineering, or related field
Experience:
Senior: 8+ years of hands-on data engineering experience
Mid-Level: 5+ years of hands-on data engineering experience
Job Description
We are seeking Data Engineers (Senior and Mid-Level) to support secure, mission-critical data environments within a classified cloud infrastructure.
These roles are fully onsite in Huntsville, AL and require an active Top Secret clearance.
The ideal candidates will have strong experience with ETL development, data migration, Java-based data pipelines, and relational and NoSQL databases (Oracle, PostgreSQL, MongoDB), along with exposure to AWS cloud services and Agile/Scrum methodologies.
Key Responsibilities
Design, develop, and maintain ETL workflows to extract, transform, and load large-scale structured and unstructured datasets.
Develop data migration solutions between legacy and modern systems using SQL, Java, and cloud-native tools.
Implement data integration frameworks leveraging AWS services such as Glue, Lambda, S3, RDS, Redshift, and Kinesis.
Develop automation scripts using Python, Shell, or Bash for deployment, data validation, and maintenance tasks.
Maintain and enhance data pipelines for real-time and batch data processing.
Support data quality, metadata management, and governance activities.
Participate in Agile/Scrum sprints, contributing to design, code reviews, testing, and documentation.
Troubleshoot and resolve data-related issues across on-premises and AWS environments.
Desired Qualifications
Bachelor’s degree in Computer Science, Engineering, or a related technical field.
5+ (Mid) or 8+ (Senior) years of experience in data engineering or database development.
Strong hands-on experience with ETL tools (e.g., Informatica, Talend, Pentaho, AWS Glue, or custom Java ETL frameworks).
Proficiency in SQL and at least one major RDBMS (Oracle or PostgreSQL).
Experience with data migration projects and data quality validation.
Proficient in Java or Python for data processing and automation.
Experience with cloud technologies, preferably AWS (RDS, S3, Lambda, Redshift, Glue).
Working knowledge of Linux/Unix environments and shell scripting.
Experience in an Agile/Scrum development environment.
Excellent problem-solving, analytical, and communication skills.