29th July, 2025
Senior Software Engineer #1032781
Job Description:
- We are seeking a highly skilled and experienced Senior DataOps Engineer to join our EPEO DataOps team.
- This role will be pivotal in designing, building, and maintaining robust, scalable, and secure telemetry data pipelines on Google Cloud Platform (GCP).
- The ideal candidate will have a strong background in DataOps principles, deep expertise in GCP data services, and a solid understanding of IT operations, especially within the security and network domains.
- You will enable real-time visibility and actionable insights for our security and network operations centers, contributing directly to our operational excellence and threat detection capabilities.
Skills Required:
- Code Assessment
- Google Cloud Platform (GCP)
- Data Architecture
- Endpoint Security
- Data Governance
- Cloud Infrastructure
- Extract, Transform, Load (ETL)
- BigQuery
- Network Security
- Python
Skills Preferred:
- Problem Solving
- Critical Thinking
- Communications
- Cross-functional
- Technologies
- Cloud Computing
Experience Required:
Core DataOps & Engineering Skills:
- Proven experience as a DataOps Engineer, Data Engineer, or similar role, with a strong focus on operationalizing data pipelines.
- Expertise in designing, building, and optimizing large-scale data pipelines for both batch and real-time processing.
- Strong understanding of DataOps principles, including CI/CD, automation, data quality, data governance, and monitoring.
- Proficiency in programming languages commonly used in data engineering, such as Python.
- Experience with Infrastructure as Code (IaC) tools (e.g., Terraform) for managing cloud resources.
- Solid understanding of data modeling, schema design, and data warehousing concepts (e.g., star schema).
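To ground the star-schema item above, here is a minimal sketch of a fact/dimension pair defined through the google-cloud-bigquery Python client. The project, dataset, table, and field names are hypothetical placeholders, not anything specified by this posting.

```python
# Minimal star-schema sketch using the google-cloud-bigquery client.
# All project/dataset/table and field names are hypothetical.
from google.cloud import bigquery

client = bigquery.Client(project="my-telemetry-project")  # hypothetical project

# Dimension table: one row per monitored host.
dim_host = bigquery.Table(
    "my-telemetry-project.ops.dim_host",
    schema=[
        bigquery.SchemaField("host_key", "STRING", mode="REQUIRED"),
        bigquery.SchemaField("hostname", "STRING"),
        bigquery.SchemaField("environment", "STRING"),
    ],
)

# Fact table: one row per network flow, joined to the dimension via host_key.
fact_flow = bigquery.Table(
    "my-telemetry-project.ops.fact_network_flow",
    schema=[
        bigquery.SchemaField("flow_ts", "TIMESTAMP", mode="REQUIRED"),
        bigquery.SchemaField("host_key", "STRING", mode="REQUIRED"),
        bigquery.SchemaField("dest_ip", "STRING"),
        bigquery.SchemaField("bytes_sent", "INT64"),
    ],
)
# Partition the fact table by event time to keep query scans cheap.
fact_flow.time_partitioning = bigquery.TimePartitioning(field="flow_ts")

for table in (dim_host, fact_flow):
    client.create_table(table, exists_ok=True)
```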
Key Responsibilities:
- Design & Development: Lead the design, development, and implementation of high-performance, fault-tolerant telemetry data pipelines for ingesting, processing, and transforming large volumes of IT operational data (logs, metrics, traces) from diverse sources, with a focus on security and network telemetry.
- GCP Ecosystem Management: Architect and manage data solutions using a comprehensive suite of GCP services, ensuring optimal performance, cost-efficiency, and scalability. This includes leveraging services such as Cloud Pub/Sub for messaging, Dataflow for real-time and batch processing, BigQuery for analytics, Cloud Logging for log management, and Cloud Monitoring for observability (a minimal pipeline sketch follows this list).
- DataOps Implementation: Drive the adoption and implementation of DataOps best practices, including automation, CI/CD for data pipelines, version control (e.g., Git), automated testing, data quality checks, and robust monitoring and alerting (see the data-quality sketch after this list).
- Security & Network Focus: Develop specialized pipelines for critical security and network data sources such as VPC Flow Logs, firewall logs, intrusion detection system (IDS) logs, endpoint detection and response (EDR) data, and Security Information and Event Management (SIEM) data (e.g., Google Security Operations / Chronicle); a flow-log parsing sketch follows this list.
- Data Governance & Security: Implement and enforce data governance, compliance, and security measures, including data encryption (at rest and in transit), access controls (RBAC), data masking, and audit logging to protect sensitive operational data (a masking sketch follows this list).
- Performance Optimization: Continuously monitor, optimize, and troubleshoot data pipelines for performance, reliability, and cost-effectiveness, identifying and resolving bottlenecks.
- Collaboration & Mentorship: Collaborate closely with IT operations, security analysts, network engineers, and other data stakeholders to understand data requirements and deliver solutions that meet business needs. Mentor junior engineers and contribute to the team's technical growth.
- Documentation: Create and maintain comprehensive documentation for data pipelines, data models, and operational procedures.
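As a concrete, deliberately simplified illustration of the Pub/Sub-to-Dataflow-to-BigQuery flow named under GCP Ecosystem Management, the Apache Beam (Python) sketch below reads JSON telemetry from a Pub/Sub topic and streams it into BigQuery. The topic, table, and schema are assumed placeholders; a production pipeline would add windowing, dead-lettering, and error handling.

```python
# Streaming telemetry sketch: Pub/Sub -> parse -> BigQuery.
# Topic, table, and schema are hypothetical. To run on Dataflow, pass
# --runner=DataflowRunner plus project/region/temp_location options.
import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

options = PipelineOptions(streaming=True)  # the Pub/Sub source requires streaming

with beam.Pipeline(options=options) as pipeline:
    (
        pipeline
        | "ReadTelemetry" >> beam.io.ReadFromPubSub(
            topic="projects/my-telemetry-project/topics/network-logs")
        | "ParseJson" >> beam.Map(json.loads)  # bytes -> dict
        | "WriteEvents" >> beam.io.WriteToBigQuery(
            "my-telemetry-project:ops.raw_network_events",
            schema="event_ts:TIMESTAMP,src_ip:STRING,bytes_sent:INTEGER",
            write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
            create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
        )
    )
```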
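For the automated data-quality checks called out under DataOps Implementation, a validation gate might look like the following sketch; the required fields and the non-negativity rule are illustrative assumptions.

```python
# Sketch of a lightweight data-quality gate for telemetry records.
# Required fields and rules are illustrative, not from the posting.
REQUIRED_FIELDS = ("event_ts", "src_ip", "bytes_sent")

def is_valid(record: dict) -> bool:
    """Reject records with missing fields or impossible values."""
    if any(record.get(field) is None for field in REQUIRED_FIELDS):
        return False
    if record["bytes_sent"] < 0:  # byte counts can never be negative
        return False
    return True

# In the Beam pipeline above, this could sit before the BigQuery write:
#   | "QualityGate" >> beam.Filter(is_valid)
```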
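For the security and network telemetry focus, this sketch extracts a few fields from a GCP VPC Flow Log entry as exported through Cloud Logging. The field paths follow the published flow-log record layout, but verify them against your own log sink before relying on them.

```python
# Sketch: summarize a VPC Flow Log entry exported from Cloud Logging.
# Field paths follow the documented flow-log layout; treat them as
# assumptions and confirm against a real log sink.
import json

def summarize_flow(entry_json: str) -> dict:
    entry = json.loads(entry_json)
    payload = entry.get("jsonPayload", {})
    conn = payload.get("connection", {})
    return {
        "src_ip": conn.get("src_ip"),
        "dest_ip": conn.get("dest_ip"),
        "dest_port": conn.get("dest_port"),
        "bytes_sent": payload.get("bytes_sent"),
    }
```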
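And for the data-masking measure under Data Governance & Security, one common approach is salted hashing of identifiers before they reach analyst-facing tables. The snippet below is only a sketch; in particular, the salt handling is oversimplified.

```python
# Sketch: irreversibly mask an IP address before it lands in
# analyst-facing tables. In practice the salt would come from a secret
# manager, never be hard-coded or passed around in plain text.
import hashlib

def mask_ip(ip: str, salt: str) -> str:
    """Return a stable, non-reversible token for the given IP."""
    return hashlib.sha256((salt + ip).encode()).hexdigest()[:16]

print(mask_ip("10.0.0.8", salt="example-salt"))  # same input -> same token
```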
Education & Experience: - Bachelor's or Master's degree in Computer Science, Data Engineering, Information Technology, or a related quantitative field.
- Typically, 8+ years of experience in data engineering, with at least 4 years in a Senior or Lead role focused on DataOps or cloud-native data platforms.
Additional Info:
At FastTek Global, Our Purpose is Our People and Our Planet. We come to work each day and are reminded that we are helping people find their success stories. Doing the right thing is our mantra: we act responsibly, give back to the communities we serve, and have a little fun along the way. We have been doing this with pride, dedication, and plain, old-fashioned hard work for 24 years!
FastTek Global is a financially strong, privately held company that is 100% consultant and client focused. We've differentiated ourselves by being fast, flexible, creative, and honest. Throw out everything you've heard, seen, or felt about every other IT consulting company. We do unique things, and we do them for Fortune 10, Fortune 500, and technology start-up companies.
Our benefits are second to none, and thanks to our flexible benefit options you can choose the benefits you need or want. Options include:
- Medical and Dental (FastTek pays the majority of the medical program)
- Vision
- Personal Time Off (PTO) Program
- Long Term Disability (100% paid)
- Life Insurance (100% paid)
- 401(k) with immediate vesting and a dollar-for-dollar match of up to 3% of salary
Plus, we have a lucrative employee referral program and an employee recognition culture.
FastTek Global was named one of the Top Workplaces in Michigan by the Detroit Free Press every year from 2013 through 2023!
To view all of our open positions, go to: https://www.fasttek.com/fastswitch/findwork
Follow us on Twitter: https://twitter.com/fasttekglobal
Follow us on Instagram: https://www.instagram.com/fasttekglobal
Find us on LinkedIn: https://www.linkedin.com/company/fasttek
You can become a fan of FastTek on Facebook: https://www.facebook.com/fasttekglobal/