
Senior Data Engineer
Barbaricum is a rapidly growing government contractor providing leading-edge support to federal customers, with a particular focus on Defense and National Security mission sets. We bring more than 17 years of experience supporting stakeholders across the federal government, with established and growing capabilities across the Intelligence, Analytics, Engineering, Mission Support, and Communications disciplines. Founded in 2008, our mission is to transform the way our customers approach constantly changing and complex problem sets by bringing to bear the latest in technology and the highest caliber of talent.
Headquartered in Washington, DC's historic Dupont Circle neighborhood, Barbaricum also has a corporate presence in Tampa, FL, Bedford, IN, and Dayton, OH, with team members across the United States and around the world. As a leader in our space, we partner with firms in the private sector, academic institutions, and industry associations with a goal of continually building our expertise and capabilities for the benefit of our employees and the customers we support. Through all of this, we have built a vibrant corporate culture diverse in expertise and perspectives with a focus on collaboration and innovation. Our teams are at the frontier of the Nation's most complex and rewarding challenges. Join our team.
The Data Engineer will build scalable pipelines and data models, implement ETL workflows, and help ensure enterprise data is reliable, accessible, and secure. You will work closely with data scientists, analysts, engineers, and stakeholders to translate mission and business needs into high-quality data solutions and actionable insights.
Responsibilities:
- Build and maintain scalable, reliable data pipelines to collect, process, and store data from multiple sources
- Design and implement ETL processes to support analytics, reporting, and operational needs
- Develop and maintain data models, schemas, and standards to support enterprise data usage
- Collaborate with data scientists, analysts, and stakeholders to understand requirements and deliver solutions
- Analyze large datasets to identify trends, patterns, and actionable insights
- Present findings and recommendations through dashboards, reports, and visualizations
- Optimize database and pipeline performance for scalability and reliability across large datasets
- Monitor and troubleshoot pipeline issues to minimize downtime and improve system resilience
- Implement data quality checks, validation routines, and integrity controls
- Implement security measures to protect data and systems from unauthorized access
- Ensure compliance with data governance policies, security standards, and applicable regulatory requirements (e.g., GDPR, HIPAA)
- Establish best practices for data management, governance, and secure handling of sensitive information
- Stay current on relevant tools and emerging technologies to strengthen engineering and analytical capabilities
- Identify opportunities to improve workflows for data ingestion, processing, and analysis
- Evaluate and recommend tools, platforms, and data management solutions aligned to organizational goals
Required Qualifications:
- Active DoD TS/SCI clearance (or pending verification)
- Bachelor’s degree in Computer Science, Data Science, Engineering, or related field (or equivalent experience) OR CSSLP / CISSP-ISSAP
- Strong programming skills in Python, Java, or Scala
- Strong SQL skills; familiarity with analytics languages/tools such as R
- Experience with data processing frameworks (e.g., Apache Spark, Hadoop) and orchestration tools (e.g., Airflow)
- Familiarity with cloud-based data services (e.g., AWS Redshift, Google BigQuery, Azure Data Factory)
- Experience with data modeling, database design, and data architecture concepts
- Strong analytical and problem-solving skills with attention to detail
- Strong written and verbal communication skills; ability to collaborate across technical and non-technical teams
Desired Qualifications:
- Master’s degree and/or certifications (e.g., AWS Certified Data Analytics, DAMA CDMP)
- Experience with big data and streaming platforms (e.g., Kafka, Spark, Flink)
- Experience with containerization (Docker, Kubernetes)
- Familiarity with ML pipelines and data science workflows
- Knowledge of enterprise architecture frameworks (e.g., TOGAF)
EEO Commitment
All qualified applicants will receive consideration for employment without regard to sex, race, ethnicity, age, national origin, citizenship, religion, physical or mental disability, medical condition, genetic information, pregnancy, family structure, marital status, ancestry, domestic partner status, sexual orientation, gender identity or expression, veteran or military status, or any other basis prohibited by law.