Join ClearEdge and be part of a mission-focused team solving some of the DoD’s most complex technical challenges. Every day, ClearEdge supports government and industry customers by delivering innovative solutions that enable critical operations and mission success.
ClearEdge offers an extremely competitive benefits package—including a $10k annual training and education benefit, a 10% 401(k) contribution fully vested on day one, annual health and technology allowances, and access to a state-of-the-art technology lab. Learn more at
www.clearedgeit.com/careers/
Your Mission:
As a Software Engineer, you will create, sustain, and troubleshoot complex operational data flows, including data storage, transport, management, security, compliance, and knowledge store management. You will work closely with the team to perform exploratory data analysis, clean, enrich, transform, and convert raw data into required formats. Additionally, you will devise methods to improve operational data flow processing, distribution, and reliability, while supporting automation, monitoring, and infrastructure-as-code principles.
This position requires a polygraph completed within the last 7 years.
You Will Excel in This Role If You Are:
- Experienced in designing, managing, and troubleshooting complex operational data flows
- Skilled in using Linux command line interfaces (CLI) for development and operations
- Knowledgeable in data processing tools, including Apache NiFi, for processing and distributing data
- Experienced with monitoring and observability platforms such as Grafana and Prometheus
- Familiar with automation and Infrastructure-as-Code (IaC) tools, including Ansible
- Comfortable collaborating with cross-functional teams using the Atlassian Tool Suite (JIRA, Confluence)
- Capable of performing exploratory data analysis and transforming raw data into mission-ready formats
- Interested in improving the efficiency, reliability, and security of operational data flows
A Day in the Life:
- Creating, sustaining, and troubleshooting operational data flows across storage, transport, and management systems
- Performing exploratory data analysis to clean, enrich, and transform raw mission data
- Using Apache NiFi to process and distribute data efficiently
- Monitoring data flow performance using Grafana, Prometheus, or similar observability tools
- Automating data flow deployment and system configurations using Ansible
- Collaborating with mission stakeholders and cross-functional teams to optimize data pipelines
- Following corporate data security and compliance procedures in operational workflows
- Writing Bash and Python scripts to support data processing and automation
What We Are Expecting from You:
- TS/SCI with Polygraph (within the last 7 years)
- Nine (9) years of software engineering experience OR Bachelor’s degree + seven (7) years of experience OR Master’s degree + five (5) years of experience
- Experience using Linux CLI
- Experience creating, managing, and troubleshooting operational data flows
- Experience with Apache NiFi for data processing and distribution
- Experience with monitoring and observability tools such as Grafana and Prometheus
- Experience with automation and IaC principles using tools like Ansible
- Experience collaborating with teams using the Atlassian Tool Suite (JIRA, Confluence)
Nice to Have:
- General HPC technical knowledge (compute, networks, memory, storage components)
- Experience with corporate data flow processes, security, and compliance procedures
- Proficiency in scripting with Bash and Python