#195872

Data Engineer

Denver, CO or Remote (Virtual)
Overview

Placement Type: Temporary

Salary (USD): $48.90 to $54.33 hourly

Start Date: 06.24.2024

In this role, you will be contracted to build and maintain scalable data pipelines on Google Cloud Platform (GCP) that serve our cyber security data mart customers in building an asset inventory. You will work closely with cross-functional teams to ensure data integrity, reliability, and scalability. Your expertise in Google BigQuery, Google Cloud Storage, Cloud Composer, Python, and SQL will be crucial in developing effective data solutions that support our security analytics and reporting efforts.
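For illustration, here is a minimal sketch of the kind of pipeline this role involves: a Cloud Composer (Airflow) DAG that loads daily CSV exports from Cloud Storage into a BigQuery table. Every project, bucket, dataset, and table name below is a hypothetical placeholder, not something specified in this posting.

```python
# Minimal, hypothetical Cloud Composer (Airflow 2) DAG: load daily CSV
# asset exports from a Cloud Storage bucket into a BigQuery table.
# All resource names are placeholders for illustration only.
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.transfers.gcs_to_bigquery import (
    GCSToBigQueryOperator,
)

with DAG(
    dag_id="asset_inventory_load",
    start_date=datetime(2024, 6, 24),
    schedule_interval="@daily",  # one run per day
    catchup=False,
) as dag:
    load_assets = GCSToBigQueryOperator(
        task_id="load_assets_to_bq",
        bucket="example-security-exports",          # hypothetical bucket
        source_objects=["assets/{{ ds }}/*.csv"],   # folder keyed by run date
        destination_project_dataset_table="example-project.security_mart.assets",
        source_format="CSV",
        skip_leading_rows=1,                        # skip the header row
        write_disposition="WRITE_TRUNCATE",         # replace each day's load
        autodetect=True,                            # infer schema from the files
    )
```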

What you’re good at
• Design, build, and maintain scalable data pipelines using Google Cloud Platform tools such as BigQuery, Cloud Storage, and Cloud Composer.
• Develop and optimize SQL queries to support data extraction, transformation, and loading (ETL) processes.
• Collaborate with cross-functional teams, including business customers and subject matter experts, to understand data requirements and deliver effective solutions.
• Implement best practices for data quality, data governance, and data security.
• Monitor and troubleshoot data pipeline issues, ensuring high availability and performance (see the sketch following this list).
• Contribute to data architecture decisions and provide recommendations for improving the data pipeline.
• Stay up to date with emerging trends and technologies in cloud-based data engineering and cyber security.
• Exceptional communication skills, including the ability to gather relevant data and information, actively listen, dialogue freely, and verbalize ideas effectively.
• Ability to work in an Agile environment, delivering incremental value to customers by managing and prioritizing tasks.
• Proactively lead investigation and resolution efforts when data issues are identified, taking ownership to resolve them in a timely manner.
• Ability to create and document processes and procedures for producing metrics.
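To make the monitoring and data-quality bullets above concrete, here is a minimal sketch of a post-load freshness check using the BigQuery Python client. The project, table, and load_date column are assumptions for illustration; a real check would match the mart's actual schema.

```python
# Hypothetical data-quality check: confirm that yesterday's asset load
# actually landed in BigQuery, and fail loudly if it did not.
from datetime import date, timedelta

from google.cloud import bigquery

client = bigquery.Client(project="example-project")  # hypothetical project

query = """
    SELECT COUNT(*) AS row_count
    FROM `example-project.security_mart.assets`
    WHERE load_date = @load_date
"""
job_config = bigquery.QueryJobConfig(
    query_parameters=[
        bigquery.ScalarQueryParameter(
            "load_date", "DATE", date.today() - timedelta(days=1)
        )
    ]
)

rows = list(client.query(query, job_config=job_config).result())
if rows[0].row_count == 0:
    raise ValueError("No asset rows loaded for yesterday; investigate the pipeline.")
```

In practice, a check like this would typically run as its own Airflow task downstream of the load, so failures surface in the same pipeline that produced the data.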

Must Have
• Bachelor’s or master’s degree in Computer Science, Information Systems, Engineering, or a related field.
• 3–5 years of hands-on experience with data management: gathering data from multiple sources, consolidating it into a single centralized location, and transforming it with business logic into a consumable form for visualization and data analysis.
• Strong expertise in Google BigQuery, Google Cloud Storage, Cloud Composer, and related Google Cloud Platform (GCP) services.
• Proficiency in Python and SQL for data processing and automation.
• Experience with ETL processes and data pipeline design.
• Excellent problem-solving skills and attention to detail.
• Strong communication and collaboration skills.

Nice to Have
• Experience with other GCP services such as Dataflow, Pub/Sub, or Data Studio.
• Knowledge of DevOps practices and tools such as Terraform.
• Familiarity with data visualization tools such as Tableau, Grafana, and/or Looker.
• Understanding of cyber security data tools and analysis methodologies.
• One or more of the following certifications preferred: CompTIA Security+, CISM, CISSP, CRISC, and/or CISA.
• Understanding of security control frameworks such as NIST, CIS Controls, COBIT, ISO, etc.

The target hiring compensation range for this role is the equivalent of $48.90 to $54.33 an hour. Compensation is based on several factors including, but not limited to, education, relevant work experience, relevant certifications, and location.
Additional benefits offered may include medical and dental insurance, life insurance, and eligibility to participate in a 401(k) plan with company match.