Reach Security (https://reach.security) builds self-driving cybersecurity. Reach employs the first generative AI for cybersecurity to oversee an organization's current security stack, aiming to achieve the best possible security posture with the products already in use.
We are seeking Data Engineers at all levels to design, build, and manage robust data pipelines that support analytics use cases within a Lakehouse architecture. You will play a critical role in developing scalable solutions with Apache Airflow or Dagster to ingest, transform, and manage large volumes of data efficiently and reliably for analytic workloads.
The ideal candidate is a motivated problem solver who prioritizes high-quality solutions and excels at navigating ambiguity. As an early team member, you will have the opportunity to take ownership of various aspects of our backend from day one. Your role will be pivotal in establishing engineering best practices, balancing engineering priorities with business needs, and identifying innovative approaches to deliver outstanding value to our users. You will apply your engineering knowledge by designing top-notch architectures, offering insightful feedback on technical designs, solving difficult problems, and conducting thorough code reviews, all to ensure the software we build is both maintainable and dependable.
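As a rough illustration only, not part of the role description, a small pipeline of the kind this role owns might be sketched with Dagster's asset API as follows; the asset names, source path, column names, and pandas-based transform are hypothetical assumptions, not details of our stack.

```python
import pandas as pd
from dagster import Definitions, asset


@asset
def raw_events() -> pd.DataFrame:
    # Ingest: in practice this might read from an API, a queue, or object
    # storage. The path and column names below are placeholders.
    return pd.read_json("s3://example-bucket/raw/events.json")


@asset
def daily_event_counts(raw_events: pd.DataFrame) -> pd.DataFrame:
    # Transform: aggregate raw events into a model analytical queries can use.
    raw_events["day"] = pd.to_datetime(raw_events["ts"]).dt.date
    return raw_events.groupby("day").size().reset_index(name="event_count")


# Register the assets so a Dagster deployment can schedule and run them.
defs = Definitions(assets=[raw_events, daily_event_counts])
```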
Responsibilities:
Design, implement, and maintain scalable and reliable data pipelines using Apache Airflow or Dagster (an Airflow counterpart to the sketch above follows this list).
Work closely with Platform and Product teams to define efficient data ingestion, transformation, and storage strategies.
Develop and optimize data models and schemas that power analytical queries and reporting.
Ensure data integrity, quality, and consistency across Data Warehouse, Data Lake, and Lakehouse environments.
Troubleshoot and optimize performance bottlenecks in complex data processing workflows.
Collaborate in defining engineering best practices, standards, and processes to enhance team productivity and quality.
Proactively identify opportunities to enhance pipeline efficiency, scalability, and reliability.
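As a hedged companion to the Dagster sketch above, the same extract/transform/load shape can be expressed as an Airflow DAG; the DAG id, schedule, and empty task bodies below are placeholders, not specifics of this role.

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract():
    """Pull raw records from a source system (placeholder body)."""


def transform():
    """Clean and reshape the extracted records (placeholder body)."""


def load():
    """Write modeled tables to the lakehouse (placeholder body)."""


with DAG(
    dag_id="example_lakehouse_pipeline",  # hypothetical name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    transform_task = PythonOperator(task_id="transform", python_callable=transform)
    load_task = PythonOperator(task_id="load", python_callable=load)

    # Linear dependency chain: extract -> transform -> load.
    extract_task >> transform_task >> load_task
```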
Requirements:
3+ years of experience in data engineering with a specific focus on building and managing data pipelines.
Strong proficiency in Python and experience with Apache Airflow or Dagster.
Expertise in developing solutions within Data Warehouse, Data Lake, and Lakehouse architectures.
Deep understanding of ETL/ELT processes, data transformation techniques, and workflow orchestration.
Experience working with cloud-based data platforms and services (AWS, Azure, GCP, etc.).
Solid foundation in data modeling, schema design, and optimization techniques.
Excellent problem-solving skills, with the ability to address challenges around data consistency, performance, and scalability.
Strong communication skills with the ability to articulate complex data engineering concepts clearly.
A proactive and collaborative mindset, comfortable working independently and within fast-paced teams.
Must be a US citizen or Green Card holder.
Nice to have:
Experience with both batch and streaming data pipelines.
Demonstrated expertise in advanced database schema design, query optimization, and database scaling.
Familiarity with Infrastructure as Code (IaC) tools such as Terraform, Pulumi, or AWS CDK (a brief Pulumi sketch follows this list).
Proven ability to align data engineering solutions closely with strategic business objectives.
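On the IaC point above, a minimal Pulumi sketch in Python might provision the object storage that commonly backs a data lake; the AWS provider choice, resource name, and tags are illustrative assumptions, not requirements of the role.

```python
import pulumi
import pulumi_aws as aws

# A data lake landing zone is often just object storage; the resource
# name and tags here are hypothetical.
raw_zone = aws.s3.Bucket(
    "raw-landing-zone",
    tags={"team": "data-engineering", "layer": "raw"},
)

# Expose the bucket name as a stack output for downstream pipelines.
pulumi.export("raw_zone_bucket_name", raw_zone.bucket)
```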
Benefits:
Competitive salary and equity.
Comprehensive health, dental, and vision insurance.
Remote work flexibility.