Pyramid Systems, Inc.

Data Engineer II

Job Locations: US
Posted Date: 7/22/2025 9:51 PM
Job ID: 2025-2115
# of Openings: 1

Overview

Pyramid Systems is looking for a mid-level Data Engineer who is passionate about bringing creative architecture solutions to end customers.


Key Skills:

  • 5+ years of IT experience focusing on enterprise data architecture and management
  • Experience with Databricks, Structured Streaming, Delta Lake concepts, and Delta Live Tables required
  • Experience with ETL and ELT tools such as SSIS, Pentaho, and/or Data Migration Services
  • Advanced-level SQL experience (joins, aggregation, windowing functions, common table expressions, RDBMS schema design, Postgres performance optimization); a short SQL sketch follows this list
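
To make the SQL expectations above concrete, here is a minimal sketch combining a common table expression with window functions. The table, columns, and data are hypothetical, and Python's built-in SQLite (3.25+ for window function support) stands in for Postgres only so the snippet is self-contained; the same query pattern carries over to Postgres unchanged.

```python
# A minimal sketch of a CTE plus window functions, the SQL features
# named in the skills list. Table, columns, and data are hypothetical;
# SQLite is used only to keep the example self-contained.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE orders (customer_id INTEGER, order_date TEXT, amount REAL);
    INSERT INTO orders VALUES
        (1, '2025-01-05', 120.0),
        (1, '2025-02-10',  80.0),
        (2, '2025-01-20', 200.0);
""")

# Rank each customer's orders by recency and keep a per-customer running total.
rows = conn.execute("""
    WITH ranked AS (
        SELECT customer_id,
               order_date,
               amount,
               ROW_NUMBER() OVER (
                   PARTITION BY customer_id ORDER BY order_date DESC
               ) AS recency_rank,
               SUM(amount) OVER (
                   PARTITION BY customer_id ORDER BY order_date
               ) AS running_total
        FROM orders
    )
    SELECT * FROM ranked WHERE recency_rank <= 2
""").fetchall()

for row in rows:
    print(row)
```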

Responsibilities

    • Plan, create, and maintain data architectures, ensuring alignment with business requirements
    • Obtain data, formulate dataset processes, and store optimized data
    • Identify problems and inefficiencies and apply solutions
    • Identify manual tasks that can be eliminated through automation
    • Identify and optimize data bottlenecks, leveraging automation where possible
    • Create and manage data lifecycle policies (retention, backups/restore, etc.)
    • Apply in-depth knowledge to create, maintain, and manage ETL/ELT pipelines
    • Create, maintain, and manage data transformations
    • Maintain/update documentation
    • Create, maintain, and manage data pipeline schedules
    • Monitor data pipelines
    • Create, maintain, and manage data quality gates (Great Expectations) to ensure high data quality; a sketch follows this list
    • Support AI/ML teams with optimizing feature engineering code
    • Apply expertise in Spark, Python, Databricks, data lakes, and SQL
    • Create, maintain, and manage Spark Structured Streaming jobs, including using the newer Delta Live Tables and/or dbt (sketched below)
    • Research existing data in the data lake to determine the best sources for data
    • Create, manage, and maintain ksqlDB and Kafka Streams queries/code (a ksqlDB sketch follows this list)
    • Perform data-driven testing for data quality
    • Maintain and update Python-based data processing scripts executed on AWS Lambda (a handler sketch follows this list)
    • Write unit tests for all Spark, Python data processing, and Lambda code (a pytest sketch follows this list)
    • Maintain and optimize the PCIS Reporting Database data lake (performance tuning, etc.)
    • Streamline data processing, including formalizing concepts of how to handle late data, defining windows, and how window definitions impact data freshness
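
For the data-quality-gate bullet above, here is a minimal sketch using Great Expectations' classic `from_pandas` interface; note the API differs across GE/GX versions, and the DataFrame, columns, and thresholds here are hypothetical.

```python
# A minimal data quality gate, assuming Great Expectations' classic
# PandasDataset API (ge.from_pandas); names and thresholds are hypothetical.
import pandas as pd
import great_expectations as ge

df = pd.DataFrame({"order_id": [1, 2, 3], "amount": [10.0, None, 25.0]})
ge_df = ge.from_pandas(df)

# Gate 1: the key column must never be null; fail the stage otherwise.
result = ge_df.expect_column_values_to_not_be_null("order_id")
assert result.success, "quality gate failed: null order_id"

# Gate 2: amounts must fall within a plausible range.
result = ge_df.expect_column_values_to_be_between(
    "amount", min_value=0, max_value=10_000
)
print(result.success)
```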
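
For the Structured Streaming / Delta Live Tables bullet, a sketch of a DLT streaming table with an inline expectation serving as the quality gate. This runs only inside a Databricks Delta Live Tables pipeline; the upstream table `events_raw` and its columns are hypothetical.

```python
# A Delta Live Tables sketch: a streaming table fed from an upstream
# table, with an expectation that drops rows missing a timestamp.
# "events_raw" and its columns are hypothetical names.
import dlt

@dlt.table(comment="Cleaned streaming events")
@dlt.expect_or_drop("valid_timestamp", "event_ts IS NOT NULL")
def events_clean():
    # Read the upstream DLT table incrementally as a stream.
    return dlt.read_stream("events_raw").select("event_id", "event_ts", "payload")
```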
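
For the ksqlDB/Kafka Streams bullet, a sketch of registering a stream by POSTing a statement to ksqlDB's REST endpoint (`/ksql`). The server URL, stream name, and topic are hypothetical.

```python
# Submit a ksqlDB statement through the server's REST API. The address,
# stream definition, and topic name are hypothetical placeholders.
import json
import urllib.request

statement = (
    "CREATE STREAM clicks_raw (user_id VARCHAR, url VARCHAR) "
    "WITH (KAFKA_TOPIC='clicks', VALUE_FORMAT='JSON');"
)

req = urllib.request.Request(
    "http://localhost:8088/ksql",  # hypothetical ksqlDB server address
    data=json.dumps({"ksql": statement, "streamsProperties": {}}).encode("utf-8"),
    headers={"Content-Type": "application/vnd.ksql.v1+json; charset=utf-8"},
)
with urllib.request.urlopen(req) as resp:
    print(resp.status, resp.read().decode("utf-8"))
```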
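
For the AWS Lambda bullet, a minimal Python handler sketch that decodes a batch of Kinesis records; the downstream processing step is a hypothetical placeholder.

```python
# A minimal Lambda handler for a Kinesis event batch. Kinesis delivers
# each payload base64-encoded inside the invocation event.
import base64
import json

def handler(event, context):
    """Decode and process each Kinesis record in the invocation batch."""
    processed = 0
    for record in event.get("Records", []):
        payload = json.loads(base64.b64decode(record["kinesis"]["data"]))
        # ... transform / enrich / write the payload downstream here ...
        processed += 1
    return {"processed": processed}
```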
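
And for the unit-testing bullet, a pytest sketch against a pure transformation function; the function and its rules are hypothetical. The same pattern extends to Spark code by running transformations through a local SparkSession fixture.

```python
# Keep transformation logic in pure functions so it is testable without
# a cluster. The normalization rules below are hypothetical.
def normalize_amount(raw: str) -> float:
    """Strip currency formatting and return a float amount."""
    return float(raw.replace("$", "").replace(",", ""))

def test_normalize_amount_formatted():
    assert normalize_amount("$1,234.50") == 1234.50

def test_normalize_amount_plain():
    assert normalize_amount("80") == 80.0
```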

Qualifications

  • 5+ years of IT experience focusing on enterprise data architecture and management
  • Must be able to obtain a Public Trust security clearance
  • Must be a US citizen
  • Bachelor's degree required
  • Experience in Conceptual/Logical/Physical Data Modeling & expertise in Relational and Dimensional Data Modeling
  • Experience with Databricks, Structured Streaming, Delta Lake concepts, and Delta Live Tables required
  • Additional experience with Spark, Spark SQL, Spark DataFrames and DataSets, and PySpark
  • Experience with data lake concepts such as time travel, schema evolution, and optimization
  • Experience leading and architecting enterprise-wide initiatives, specifically system integration, data migration, transformation, data warehouse builds, data mart builds, and data lake implementation/support
  • Advanced-level understanding of streaming data pipelines and how they differ from batch systems
  • Ability to formalize concepts of how to handle late data, define windows, and manage data freshness (a windowing sketch follows this list)
  • Advanced understanding of ETL and ELT and of ETL/ELT tools such as SSIS, Pentaho, Data Migration Services, etc.
  • Understanding of concepts and implementation strategies for different incremental data loads, such as tumbling window, sliding window, high watermark, etc. (a high-watermark sketch follows this list)
  • Familiarity and/or expertise with Great Expectations or other data quality/data validation frameworks a bonus
  • Advanced-level SQL experience (joins, aggregation, windowing functions, common table expressions, RDBMS schema design, Postgres performance optimization)
  • Indexing and partitioning strategy experience
  • Ability to debug, troubleshoot, design, and implement solutions to complex technical issues
  • Experience with large-scale, high-performance enterprise big data application deployment and solutions
  • Understanding of how to create DAGs to define workflows (an Airflow sketch follows this list)
  • Familiarity with CI/CD pipelines, containerization, and pipeline orchestration tools such as Airflow, Prefect, etc. a bonus but not required
  • Architecture experience in AWS environment a bonus
  • Familiarity with Kinesis and/or Lambda a bonus, specifically how to push and pull data, how to use AWS tools to view data in Kinesis streams, and how to process massive data at scale
  • Experience with Docker, Jenkins, and CloudWatch
  • Ability to write and maintain Jenkinsfiles for supporting CI/CD pipelines
  • Experience configuring and optimizing AWS Lambda functions
  • Experience working with DynamoDB to query and write data
  • Experience with S3
  • Knowledge of Python (Python 3 desired) for CI/CD pipelines a bonus
  • Familiarity with pytest and unittest a bonus
  • Experience working with JSON and defining JSON Schemas a bonus
  • Experience setting up and managing Confluent/Kafka topics and ensuring performance using Kafka a bonus
  • Familiarity with Schema Registry and message formats such as Avro, ORC, etc.
  • Understanding of how to manage ksqlDB SQL files and migrations, and Kafka Streams
  • Ability to thrive in a team-based environment
  • Experience briefing the benefits and constraints of technology solutions to technology partners, stakeholders, team members, and senior levels of management
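
To ground the streaming items above (windows, late data, freshness), here is a Spark Structured Streaming sketch of a tumbling-window count with a watermark, which bounds how long late events are accepted before they are dropped. The built-in rate source keeps the snippet self-contained; the window and watermark sizes are hypothetical choices.

```python
# A tumbling-window aggregation with a watermark in Spark Structured
# Streaming. The rate source generates synthetic rows; the 10-minute
# watermark and 5-minute window are hypothetical values that trade
# completeness against data freshness.
from pyspark.sql import SparkSession
from pyspark.sql.functions import col, window

spark = SparkSession.builder.appName("tumbling-window-sketch").getOrCreate()

events = (
    spark.readStream.format("rate").load()   # built-in test source
    .withColumnRenamed("timestamp", "event_ts")
)

counts = (
    events
    .withWatermark("event_ts", "10 minutes")        # accept events up to 10 min late
    .groupBy(window(col("event_ts"), "5 minutes"))  # non-overlapping (tumbling) windows
    .count()
)

query = counts.writeStream.outputMode("append").format("console").start()
# query.awaitTermination()  # block until the stream is stopped
```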
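
The high-watermark incremental-load strategy named above reduces to: persist the newest modification timestamp already loaded, then pull only rows beyond it on each run. A self-contained sketch, with hypothetical table and column names and in-memory SQLite standing in for the real source:

```python
# High-watermark incremental load: keep the max modification timestamp
# seen so far and fetch only newer rows next time. All names are
# hypothetical; SQLite keeps the sketch self-contained.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE source_rows (id INTEGER, updated_at TEXT);
    CREATE TABLE load_state (watermark TEXT);
    INSERT INTO load_state VALUES ('1970-01-01T00:00:00');
    INSERT INTO source_rows VALUES (1, '2025-07-01T12:00:00'),
                                   (2, '2025-07-02T09:30:00');
""")

def incremental_load(conn):
    (watermark,) = conn.execute("SELECT watermark FROM load_state").fetchone()
    new_rows = conn.execute(
        "SELECT id, updated_at FROM source_rows "
        "WHERE updated_at > ? ORDER BY updated_at",
        (watermark,),
    ).fetchall()
    if new_rows:
        # ... write new_rows to the target here, then advance the watermark ...
        conn.execute("UPDATE load_state SET watermark = ?", (new_rows[-1][1],))
    return new_rows

print(incremental_load(conn))  # both rows on the first run
print(incremental_load(conn))  # empty on the second run: watermark advanced
```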
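
And for the DAG/orchestration items, a minimal Apache Airflow (2.4+) sketch of a two-task workflow; the DAG id, schedule, and task bodies are hypothetical.

```python
# A minimal Airflow DAG defining a two-step workflow where the second
# task depends on the first. Names and schedule are hypothetical.
from datetime import datetime
from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    print("pull new rows from the source")

def transform():
    print("clean and reshape the extracted rows")

with DAG(
    dag_id="nightly_ingest_sketch",
    start_date=datetime(2025, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    t_extract = PythonOperator(task_id="extract", python_callable=extract)
    t_transform = PythonOperator(task_id="transform", python_callable=transform)
    t_extract >> t_transform  # extract runs before transform
```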

Target Pay Range

The pay range listed below for this position is not a guarantee of compensation or salary. The final offered salary will be influenced by a host of factors including, but not limited to, geographic location, Federal Government contract labor categories and contract wage rates, relevant prior work experience, specific skills and competencies, education, and certifications. Our employees value the flexibility at Pyramid Systems that allows them to balance quality work and their personal lives. We offer competitive compensation and benefits, including our Employee Stock Ownership Program, FlexPTO, and learning and development opportunities.

Pyramid Min

USD $81,305.00/Yr.

Pyramid Max

USD $121,957.00/Yr.

Why Pyramid?

Pyramid Systems, Inc. is an award-winning technology leader driving digital transformation across federal agencies. We empower forward-thinking innovations, accelerate production-ready software, and deliver secure solutions so federal agencies can meet their mission goals. Voted a Top Workplace both regionally (Washington, DC) and nationally (USA) for the past two years (2023 and 2024) based on feedback from our employees, we are headquartered in Fairfax, VA, and have a growing national footprint. We value and promote our Flexible Workplace approach because of the positive impact it has on work-life integration. We remain committed to ensuring every employee’s voice is heard, performance and results are recognized and rewarded, development and advancement are a focus, and diversity, equity, and inclusion is a company priority. We offer competitive compensation and benefits (including a recently launched Employee Stock Ownership Plan - ESOP), a robust performance-based rewards program, and we know how to have fun! Our people and culture have endured and delivered for our clients for nearly three decades.

EEO Statement

Pyramid Systems, Inc. is an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, or protected veteran status and will not be discriminated against on the basis of disability.
