Senior Data Engineer
Job Description
Role Summary
540 is seeking a Senior Data Engineer to design, build, and optimize cloud-based data pipelines supporting high-impact federal missions. In this role, you’ll contribute hands-on to the development of scalable data pipelines in AWS while collaborating closely with technical leads and stakeholders to deliver robust solutions that drive meaningful business insights. This position is ideal for an experienced data engineer who enjoys solving complex data challenges, writing production-quality code, and mentoring junior team members while continuing to grow their own technical impact.
Location: Remote within the continental United States. Core collaboration hours align with Eastern Time, with flexibility. Occasional travel (3-4 times annually) is required.
Citizenship & Clearance Requirement: Per client requirements, candidates must be U.S. citizens with the ability to obtain a DoD Secret clearance.
Education Requirement: Bachelor’s Degree in Computer Science or related engineering field (preferred)
How You’ll Drive Impact
- Design, develop, and maintain scalable data pipelines and ETL processes using Python and AWS services
- Build and deploy serverless functions using AWS Lambda for data processing and automation
- Implement data integration solutions connecting diverse source systems to analytics platforms
- Mentor junior data engineers through pair programming, code reviews, and technical coaching
- Collaborate with data analysts and business stakeholders to understand requirements and deliver solutions
- Optimize data pipeline performance, reliability, and cost efficiency in AWS environments
- Develop and maintain comprehensive documentation for data systems and processes
- Participate in on-call rotation to support production data infrastructure
- Contribute to the establishment of team coding standards and engineering best practices
- Work with Git for version control and participate in collaborative development workflows
Required Skills & Experience
- 6+ years of experience in data engineering or related technical roles
- Strong proficiency in Python programming with experience building production data applications
- Solid understanding of AWS architecture and services (Lambda, S3, Glue, RDS, Redshift, etc.)
- Experience with Git version control and branching strategies
- Knowledge of SQL and relational database design principles
- Familiarity with data modeling techniques and data warehouse concepts
- Experience with RESTful APIs and data integration patterns
- Strong problem-solving skills and attention to detail
- Excellent communication skills and demonstrated ability to mentor others
- Currently possess, or be able to obtain within 30 days of hire:
  - AWS Cloud Practitioner certification (or higher)
  - CompTIA Security+ certification (or higher)
Nice To Have
- Familiarity with X12/MILS processing
- Experience with Apache Spark, Kafka, or other distributed processing frameworks
- Knowledge of data orchestration tools (Apache Airflow, AWS Step Functions)
- Exposure to Infrastructure-as-Code tools (Terraform, CloudFormation)
- Understanding of DevOps practices and CI/CD pipelines
- AWS certifications (Solutions Architect, Data Analytics, or similar)
Benefits & Perks
- Flexible PTO + all Federal holidays off
- Health, dental and vision insurance plans
- Flexible Spending Account (FSA)
- 401k with employer match
- Company-sponsored life insurance, short- and long-term disability
- Professional development (training, certifications, conferences)
- Paid cloud developer accounts
- Referral bonuses
- HQ office perks (parking / metro reimbursement, nitro coffee & lunches)
- Annual social events (540 Week, hackathon, charity golf tournament, etc.)
- Access to 540’s Washington Capitals & Nationals tickets