Clarity Innovations · Development
Principal Software Engineer
United States · Full-time
Compensation: Not Disclosed
Job Description
Role Overview: The Principal Software Engineer is a senior individual contributor responsible for designing and implementing data transformation pipelines and driving the technical evolution of the Unified Data Model (UDM) team's analysis infrastructure. This role requires deep Java expertise, with a specific focus on transitioning legacy data pipelines to modern NiFi-based workflows and integrating with enterprise data services.
Responsibilities
- Serve as the team's Java subject matter expert, providing technical guidance, code reviews, and architectural input on Java-based components.
- Design and implement custom Apache NiFi processors to support UDM data processing and routing requirements.
- Lead the migration of existing analysis pipelines to NiFi-based workflows, ensuring continuity, correctness, and performance parity.
- Collaborate with data transport, platform, and infrastructure engineers to align pipeline design with enterprise standards.
- Troubleshoot complex integration issues across data pipeline stages, including format conversion, schema validation, and service connectivity.
- Contribute to technical documentation, architecture decision records, and pipeline design artifacts.
- Mentor junior and mid-level engineers on Java best practices, NiFi patterns, and integration design.
- Deliver within an Agile/Scrum framework, actively participating in sprint planning, backlog refinement, and technical reviews.
Clearance & Compliance
- Active Secret security clearance required; must be clearable to TS/SCI.
- Ability to operate in classified and constrained environments in accordance with all applicable security protocols.
Technical Skills & Experience
- 10+ years of professional software engineering experience with a strong emphasis on Java development and data pipeline engineering.
- Expert-level proficiency in Java, including design patterns, concurrency, performance tuning, and enterprise integration patterns.
- Strong understanding of data integration patterns, schema validation, and format transformation (JSON, XML, Avro, Protobuf, or similar).
- Experience integrating data pipelines with enterprise services, including APIs, message brokers, or data warehouses.
- Familiarity with version control, CI/CD pipelines, and DevSecOps practices.
- Prior experience in DoD, defense contracting, or other classified program environments strongly preferred.
- Track record of leading technical migrations or modernization efforts on production data systems.
- Experience deploying Java applications into containerized environments.
Preferred Qualifications
- Experience deploying Java applications to Kubernetes and working with GitOps methodologies (e.g., Flux, ArgoCD).
- Experience integrating with Intelligence Community data flow systems.
- Experience working in cross-functional Agile teams as a technical lead or senior individual contributor.
- Hands-on experience designing and implementing Apache NiFi flows, custom processors, and controller services.
- Demonstrated experience migrating or re-architecting legacy data pipelines to modern workflow frameworks.