airflow Remote Jobs

143 Results

+30d

Senior Data Engineer

CLEAR - Corporate, New York, New York, United States (Hybrid)
tableau, airflow, sql, Design, jenkins, python, AWS

CLEAR - Corporate is hiring a Remote Senior Data Engineer

Today, CLEAR is well-known as a leader in digital and biometric identification, reducing friction for our members wherever an ID check is needed. We’re looking for an experienced Senior Data Engineer to help us build the next generation of products, which will go beyond just ID and enable our members to leverage the power of a networked digital identity. As a Senior Data Engineer at CLEAR, you will participate in the design, implementation, testing, and deployment of applications to build and enhance our platform: one that interconnects dozens of attributes and qualifications while keeping member privacy and security at the core.


A brief highlight of our tech stack:

  • SQL / Python / Looker / Snowflake / Airflow / Databricks / Spark / dbt

What you'll do:

  • Build a scalable data system in which Analysts and Engineers can self-service changes in an automated, tested, secure, and high-quality manner 
  • Build processes supporting data transformation, data structures, metadata, dependency and workload management
  • Develop and maintain data pipelines to collect, clean, and transform data, owning the end-to-end data product from ingestion to visualization
  • Develop and implement data analytics models
  • Partner with product and other stakeholders to uncover requirements, to innovate, and to solve complex problems
  • Take strong ownership: be responsible for architectural decision-making and strive for continuous improvement in technology and processes at CLEAR

What you're great at:

  • 6+ years of data engineering experience
  • Working with cloud-based application development, with fluency in at least a few of:
    • Cloud service providers like AWS
    • Data pipeline orchestration tools like Airflow, Dagster, Luigi, etc.
    • Big data tools like Spark, Kafka, Snowflake, Databricks, etc.
    • Collaboration, integration, and deployment tools like GitHub, Argo, and Jenkins
    • Data visualization tools like Looker, Tableau, etc.
  • Articulating technical concepts to a mixed audience of technical and non-technical stakeholders
  • Collaborating with and mentoring less experienced members of the team
  • Comfort with ambiguity
  • Curiosity about technology, a belief in constant learning, and the autonomy to figure out what's important

How You'll be Rewarded:

At CLEAR we help YOU move forward - because when you’re at your best, we’re at our best. You’ll work with talented team members who are motivated by our mission of making experiences safer and easier. Our hybrid work environment provides flexibility. In our offices, you’ll enjoy benefits like meals and snacks. We invest in your well-being and learning & development with our stipend and reimbursement programs. 

We offer holistic total rewards, including comprehensive healthcare plans, family building benefits (fertility and adoption/surrogacy support), flexible time off, free OneMedical memberships for you and your dependents, and a 401(k) retirement plan with employer match. The base salary range for this role is $175,000 - $215,000, depending on levels of skills and experience.

The base salary range represents the low and high end of CLEAR’s salary range for this position. Salaries will vary depending on various factors which include, but are not limited to, location, education, skills, experience and performance. The range listed is just one component of CLEAR’s total compensation package for employees; other rewards may include annual bonuses, commissions, and Restricted Stock Units.

About CLEAR

Have you ever had that green-light feeling? When you hit every green light and the day just feels like magic. CLEAR's mission is to create frictionless experiences where every day has that feeling. With more than 25 million passionate members and hundreds of partners around the world, CLEAR’s identity platform is transforming the way people live, work, and travel. Whether it’s at the airport, stadium, or right on your phone, CLEAR connects you to the things that make you, you - unlocking easier, more secure, and more seamless experiences - making them all feel like magic.

CLEAR provides reasonable accommodation to qualified individuals with disabilities or protected needs. Please let us know if you require a reasonable accommodation to apply for a job or perform your job. Examples of reasonable accommodation include, but are not limited to, time off, extra breaks, making a change to the application process or work procedures, policy exceptions, providing documents in an alternative format, live captioning or using a sign language interpreter, or using specialized equipment.

See more jobs at CLEAR - Corporate

Apply for this job

+30d

Senior Analytics Engineer

CLEAR - Corporate, New York, New York, United States (Hybrid)
tableau, airflow, sql, Design, jenkins, python, AWS

CLEAR - Corporate is hiring a Remote Senior Analytics Engineer

Today, CLEAR is well-known as a leader in digital and biometric identification, reducing friction for our members wherever an ID check is needed. We’re looking for an experienced Senior Analytics Engineer to help us build the next generation of products, which will go beyond just ID and enable our members to leverage the power of a networked digital identity. As a Senior Analytics Engineer at CLEAR, you will participate in the design, implementation, testing, and deployment of applications to build and enhance our platform: one that interconnects dozens of attributes and qualifications while keeping member privacy and security at the core.


A brief highlight of our tech stack:

  • SQL / Python / Looker / Snowflake / Airflow / Databricks / Spark / dbt

What you'll do:

  • Build a scalable data system in which Analysts and Engineers can self-service changes in an automated, tested, secure, and high-quality manner 
  • Build processes supporting data transformation, data structures, metadata, dependency and workload management
  • Develop and maintain data pipelines to collect, clean, and transform data, owning the end-to-end data product from ingestion to visualization
  • Develop and implement data analytics models
  • Partner with product and other stakeholders to uncover requirements, to innovate, and to solve complex problems
  • Take strong ownership: be responsible for architectural decision-making and strive for continuous improvement in technology and processes at CLEAR

What you're great at:

  • 6+ years of data engineering experience
  • Working with cloud-based application development, with fluency in at least a few of:
    • Cloud service providers like AWS
    • Data pipeline orchestration tools like Airflow, Dagster, Luigi, etc.
    • Big data tools like Spark, Kafka, Snowflake, Databricks, etc.
    • Collaboration, integration, and deployment tools like GitHub, Argo, and Jenkins
    • Data visualization tools like Looker, Tableau, etc.
  • Articulating technical concepts to a mixed audience of technical and non-technical stakeholders
  • Collaborating with and mentoring less experienced members of the team
  • Comfort with ambiguity
  • Curiosity about technology, a belief in constant learning, and the autonomy to figure out what's important

How You'll be Rewarded:

At CLEAR we help YOU move forward - because when you’re at your best, we’re at our best. You’ll work with talented team members who are motivated by our mission of making experiences safer and easier. Our hybrid work environment provides flexibility. In our offices, you’ll enjoy benefits like meals and snacks. We invest in your well-being and learning & development with our stipend and reimbursement programs. 

We offer holistic total rewards, including comprehensive healthcare plans, family building benefits (fertility and adoption/surrogacy support), flexible time off, free OneMedical memberships for you and your dependents, and a 401(k) retirement plan with employer match. The base salary range for this role is $175,000 - $215,000, depending on levels of skills and experience.

The base salary range represents the low and high end of CLEAR’s salary range for this position. Salaries will vary depending on various factors which include, but are not limited to, location, education, skills, experience and performance. The range listed is just one component of CLEAR’s total compensation package for employees; other rewards may include annual bonuses, commissions, and Restricted Stock Units.

About CLEAR

Have you ever had that green-light feeling? When you hit every green light and the day just feels like magic. CLEAR's mission is to create frictionless experiences where every day has that feeling. With more than 25 million passionate members and hundreds of partners around the world, CLEAR’s identity platform is transforming the way people live, work, and travel. Whether it’s at the airport, stadium, or right on your phone, CLEAR connects you to the things that make you, you - unlocking easier, more secure, and more seamless experiences - making them all feel like magic.

CLEAR provides reasonable accommodation to qualified individuals with disabilities or protected needs. Please let us know if you require a reasonable accommodation to apply for a job or perform your job. Examples of reasonable accommodation include, but are not limited to, time off, extra breaks, making a change to the application process or work procedures, policy exceptions, providing documents in an alternative format, live captioning or using a sign language interpreter, or using specialized equipment.

See more jobs at CLEAR - Corporate

Apply for this job

+30d

Data Engineer Apprentice (F/H)

ASI, Nantes, France, Remote
S3, agile, scala, nosql, airflow, mongodb, azure, scrum, java

ASI is hiring a Remote Data Engineer Apprentice (F/H)

Job Description

For accessibility and clarity, terms used in the masculine refer to both the feminine and masculine genders.

We are looking for a Data Engineer to join our Data team in Nantes under a work-study (alternance) contract.

Under the supervision of Simon, a dedicated mentor who will follow your progress, you will gradually take on the following activities:

 

  • Contributing to the development of an end-to-end data processing chain
  • Working on descriptive, inferential, or predictive analysis
  • Contributing to technical specifications
  • Learning the Agile Scrum and V-model methodologies
  • Building skills in one or more of the following technology environments:
    • Languages: Java, Scala…
    • The data ecosystem: Spark, Kafka…
    • NoSQL databases: MongoDB, Cassandra…
    • Cloud storage: S3, Azure…
    • Mainstream ETL/orchestration tools: Airflow, Data Factory, Talend...

 

Qualifications

Currently pursuing a higher-education degree in computer science or mathematics, or one specializing in Big Data, you are looking for a 12-month work-study placement (professionalization or apprenticeship contract) starting in January 2025.

Passionate about data, you bring rigor and organization to your work.

You are a team player, and your interpersonal skills allow you to integrate easily into the team.

The work-study placement may lead to a concrete offer of permanent employment (CDI).

Eager to join a company that reflects who you are, you will recognize yourself in our values of trust, listening, enjoyment, and commitment.

With equal qualifications, this position is open to people with disabilities.

See more jobs at ASI

Apply for this job

+30d

Lead Product Architect - Qubole

Sales, Full Time, DevOps, Master’s Degree, airflow, Design, ruby, java, c++, python, javascript

Idera, Inc. is hiring a Remote Lead Product Architect - Qubole

The Lead Product Architect is a key member of the core team tasked with defining and achieving product release results that achieve defined business goals. This role owns all technical and delivery matters r…

See more jobs at Idera, Inc.

Apply for this job

+30d

Middle/Senior Data Engineer (Social Shopping Platform)

Sigma Software, Warsaw, Poland, Remote
airflow, Design, git, python, AWS

Sigma Software is hiring a Remote Middle/Senior Data Engineer (Social Shopping Platform)

Job Description

  • Contributing to new technology investigations and complex solution design, supporting a culture of innovation by considering matters of security, scalability, and reliability, with a focus on building out our ETL processes 
  • Working with a modern data stack, coming up with well-designed technical solutions and robust code, and implementing data governance processes 
  • Working and professionally communicating with the customer’s team 
  • Taking responsibility for delivering major solution features 
  • Participating in the requirements gathering and clarification process, proposing optimal architecture strategies, and leading the data architecture implementation 
  • Developing core modules and functions, designing scalable and cost-effective solutions 
  • Performing code reviews, writing unit and integration tests 
  • Scaling the distributed system and infrastructure to the next level 
  • Building the data platform using the power of the AWS cloud provider

Qualifications

  • 3+ years of strong experience with Python as a programming language for data pipelines and related tools
  • Proven strong track record of building data platforms and managing infrastructure for Airflow and Databricks
  • Familiarity and understanding of distributed data processing with Spark for data pipeline optimization and monitoring workloads
  • Proven strong track record of building data transformations using data build tools (e.g. dbt)
  • Excellent implementation of data modeling and data warehousing best practices
  • Good written and spoken English communication skills 
  • Familiarity with software engineering best practices: testing, PRs, Git, code reviews, code design, releasing 

WOULD BE A PLUS

  • Strong Data Domain background – understanding of how data engineers, data scientists, analytics engineers, and analysts work to be able to work closely with them and understand their needs 
  • Experience with DAGs and orchestration tools 
  • Experience in developing event-driven data pipelines 

 

See more jobs at Sigma Software

Apply for this job

+30d

Sales & Partnerships - Data Engineer

lastminute.com, Lisbon, Portugal, Remote
Sales, 2 years of experience, tableau, scala, airflow, sql, Design, mobile, python, AWS

lastminute.com is hiring a Remote Sales & Partnerships - Data Engineer

Job Description

lastminute.com is looking for a Data Engineer for its Sales & Partnerships team inside the Data & Analytics department.

The activities of the Sales & Partnerships domain team are focused on reports, tables, analysis and, more generally, all sorts of deliverables related to the company's sales data, in order to create real value in supporting business decision-making. Significant emphasis will be placed on partnerships data preparation and analysis, helping our business find the best solutions with partners, monitoring performance and evaluating the effectiveness of sales campaigns, agreements and initiatives over time.

The candidate will have the opportunity to become a key member of the team leveraging their engineering skills to acquire, manipulate, orchestrate and monitor data.

Data is at our core, and its reliability and effectiveness have a direct impact on producing actionable insights and improving business performance.

* Please note that this is a remote working model position; remote possibilities can be evaluated within Portuguese territory only.

Qualifications

Key Responsibilities

  • Understand and analyse functional needs and raw data, and develop dimensional data models
  • Design, build and deploy data pipelines with a focus on automation, performance optimization, scalability, and reliability aspects
  • Help the business understand the data and find insights that enable the company to make data-driven decisions
  • Leverage data and business principles to solve large-scale web, mobile and data infrastructure problems
  • Build data expertise and own data quality for your area

 

Skills and Experience

Essentials

  • At least 2 years of experience in a similar role in a fast-paced environment
  • SQL advanced knowledge
  • Experience in Data Modelling
  • Experience in ETL design, implementation and maintenance
  • Experience with workflow management engines (e.g. Airflow, Google Cloud Composer, Talend)
  • Experience with data quality and validation
  • Fluent in English both written and spoken


Desirable 

  • Bachelor's or master's degree in Statistics, Mathematics, Engineering, Physics or similar fields
  • Experience working with cloud or on-prem Big Data/MPP analytics platforms (e.g. AWS Redshift, Google BigQuery or similar)
  • Programming language knowledge (e.g. Python, R, Scala)
  • Experience in analysing data to discover opportunities, address gaps and detect anomalies/outliers
  • Experience with analytics tools (e.g. QlikView, Tableau, Spotfire)
  • Familiarity with digital and e-commerce business

 

Abilities/qualities 

  • Problem-solving and decision-making skills and innovative thinking
  • Proactivity and a strategic approach
  • Ability to interface with business stakeholders, presenting and negotiating your solutions
  • Passionate about the digital world, ambitious and motivated with a can-do attitude
  • High attention to detail and the ability to effectively manage multiple projects at a time while meeting deadlines
  • Strong team player with a willingness to challenge existing processes and applications

See more jobs at lastminute.com

Apply for this job

+30d

Senior AI Infra Engineer, Caper

Instacart, United States - Remote
S3, Master’s Degree, airflow, Design, docker, elasticsearch, kubernetes, AWS

Instacart is hiring a Remote Senior AI Infra Engineer, Caper

We're transforming the grocery industry

At Instacart, we invite the world to share love through food because we believe everyone should have access to the food they love and more time to enjoy it together. Where others see a simple need for grocery delivery, we see exciting complexity and endless opportunity to serve the varied needs of our community. We work to deliver an essential service that customers rely on to get their groceries and household goods, while also offering safe and flexible earnings opportunities to Instacart Personal Shoppers.

Instacart has become a lifeline for millions of people, and we’re building the team to help push our shopping cart forward. If you’re ready to do the best work of your life, come join our table.

Instacart is a Flex First team

There’s no one-size-fits-all approach to how we do our best work. Our employees have the flexibility to choose where they do their best work—whether it’s from home, an office, or your favorite coffee shop—while staying connected and building community through regular in-person events. Learn more about our flexible approach to where we work.

 

Overview

About the Role

We are seeking a highly skilled and motivated AI Infra Engineer to design, develop, and maintain our data platform specifically tailored for deep learning and computer vision applications. You will be responsible for building and optimizing our data infrastructure to support large-scale data collection, labeling, management, model training, evaluation, and continuous deployment. Your work will be critical in enabling our AI and computer vision teams to build and deploy state-of-the-art models efficiently and reliably.

About the Team

The AI and CV team at Caper (Instacart) innovates at the industry frontier across cloud and edge computing. The systems and algorithms built enable a magical shopping and checkout process in grocery stores. Our enthusiastic researchers and engineers are spread across different time zones but collaborate effectively on multiple exciting projects.

About the Job 

Your responsibilities will include one or more of the following:

  • Design, build, and maintain scalable and efficient data pipelines for collecting, processing, and storing large volumes of structured and unstructured data, specifically image and video streams and relevant metadata.
  • Develop and integrate tools for data labeling and annotation, ensuring high-quality training datasets for deep learning and computer vision models.
  • Collaborate with data scientists and machine learning engineers to build and optimize the infrastructure required for training and evaluating deep learning models at scale.
  • Build and maintain CI/CD pipelines to seamlessly deploy machine learning models into production environments.

About You

Minimum Qualifications

  • Bachelor’s or Master’s degree in Computer Science, Engineering, or a related field.
  • 5+ years of experience in data engineering, full-stack and/or infrastructure development.
  • Proven experience with building and maintaining large-scale data pipelines (batching or streaming) for computer vision and/or machine learning applications.
  • Familiarity with observability and monitoring tools (e.g. Datadog) and best practices. 
  • Familiarity with frameworks for large-scale data processing (e.g., Kafka, Spark, Airflow, Ray), storage (e.g., S3, Delta Lake), indexing and search (e.g. Elasticsearch).
  • Experience with cloud platforms (e.g., AWS, GCP) and containerization technologies (e.g., Docker, Kubernetes).
  • Strong problem-solving skills to work in a fast-paced, dynamic environment.
  • Excellent communication skills to work collaboratively in a cross-functional team

Preferred Qualifications

  • Experience building and/or integrating computer vision data collection, labeling and management systems.
  • Experience in edge inference and optimization on Nvidia chipsets. 
  • Experience with deep learning frameworks (e.g., TensorFlow, PyTorch) and model management platforms (e.g., Kubeflow, MLflow, TensorBoard).
  • Knowledge of computer vision and machine learning algorithms and models.
  • Experience with frameworks and best practices for data security or compliance.

Instacart provides highly market-competitive compensation and benefits in each location where our employees work. This role is remote and the base pay range for a successful candidate is dependent on their permanent work location. Please review our Flex First remote work policy here.

Offers may vary based on many factors, such as candidate experience and skills required for the role. Additionally, this role is eligible for a new hire equity grant as well as annual refresh grants. Please read more about our benefits offerings here.

For US based candidates, the base pay ranges for a successful candidate are listed below.

CA, NY, CT, NJ
$198,000 - $220,000 USD
WA
$190,000 - $211,000 USD
OR, DE, ME, MA, MD, NH, RI, VT, DC, PA, VA, CO, TX, IL, HI
$182,000 - $202,000 USD
All other states
$165,000 - $183,000 USD

See more jobs at Instacart

Apply for this job

+30d

Data Operations Engineer

Pleo, Remote - India
Sales, agile, airflow, Design, scrum, postgresql, python

Pleo is hiring a Remote Data Operations Engineer

Let’s face it: out-of-pocket expenses suck. And manual expense spreadsheets are old-school. No one wants to wait until payday to be reimbursed for something they bought for work, and finance teams have better things to do than spend hours tapping away on Excel. At Pleo, we’re on a mission to change this. We’re here to make spend management surprisingly effective and empowering – for finance teams and employees. But… we need your help!

What do we need?

As part of our growing Revenue Operations domain, we have established a team called Business Architecture & Technology. This team will focus on optimising processes, data, and tools for the commercial part of the business (Marketing, Sales, Partnerships & Customer Experience). For this team, we are looking for a best-in-class Data Operations Engineer to join our new Hub in Chennai, India on a hybrid working arrangement.

Your role in this will be to help us design and implement new data structures and integrate new tools to support this. Think of it as this - we need to make sure the commercial aspect of Pleo is happening as smoothly and efficiently as possible. We achieve this by designing and maintaining the tech stack, extensions, and related processes. We are a leading design customer of many vendors and always aim to push our tools above and beyond. 

In Pleo, Revenue Operations is a cross-functional discipline that goes across many areas, such as the demand and opportunity management process, sales planning and forecasting, and monitoring business performance. Hence, this is a unique opportunity to learn more about the many different aspects of running a business, the decision processes and prioritisation needed to execute strategic projects, and insights into where the company is going - almost like a front-row seat!

What to expect in the role

In numbers: 75% of your time will be spent working with our data models, data pipelines and creating easy-to-understand and readable documentation, while the other 25% of the time will be discussing priorities with our stakeholders.

  • You will design, develop and maintain high-quality, scalable data pipelines and datasets for commercial teams.
  • Own the ETL and rETL (reverse ETL) processes and enable data activation in your stakeholders' projects (see the sketch after this list).
  • Optimise performance and scalability of existing data pipelines and datasets through code refactoring and infrastructure improvements.
  • Collaborate with internal teams to understand business requirements and translate them into technical specifications.
  • Troubleshoot and resolve issues related to HubSpot, Vitally and Iterable configuration, resolving bugs and tickets that might arise.
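
The bullet on reverse ETL above is the most code-adjacent part of this role, so here is a minimal, hypothetical sketch of what a reverse-ETL step can look like: reading activated rows from a warehouse model and pushing them to a downstream SaaS tool. The table, endpoint, and token are invented placeholders, not Pleo's actual systems or vendor APIs.

```python
import json
from urllib import request


def fetch_activated_accounts():
    """Stand-in for a warehouse query (e.g. against a dbt-modelled table)."""
    return [{"account_id": "a-123", "health_score": 0.87}]


def push_to_crm(rows, endpoint="https://example.com/api/accounts", token="PLACEHOLDER"):
    """POST each row to the downstream tool; real syncs batch, retry, and dedupe."""
    for row in rows:
        req = request.Request(
            endpoint,
            data=json.dumps(row).encode(),
            headers={"Authorization": f"Bearer {token}",
                     "Content-Type": "application/json"},
            method="POST",
        )
        # request.urlopen(req)  # left commented out: the endpoint is a placeholder
        print("Would sync:", row)


push_to_crm(fetch_activated_accounts())
```

In practice, tools like Census or Fivetran (both named in the qualifications below) handle this sync declaratively; the sketch only illustrates the direction of the data flow.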

What we need from you:

  • Minimum of 1 year of professional experience as a data engineer, data analyst or a proven track record working as an analyst in a similar operations position.
  • Business understanding of what data activation means and how your work impacts commercial teams in the front line (such as Sales, Marketing and Customer Success).
  • Good understanding and hands-on experience with SQL. Understanding of dbt.
  • Experience writing digestible and accessible documentation for your stakeholders.
  • Experience building or maintaining ETL pipelines with Python; Node.js is a plus.
  • Familiarity with Agile development methodologies and tools (e.g., Scrum, Kanban).
  • Willingness to learn new technologies and languages.
  • Ability to work independently and contribute to multiple projects simultaneously, delivering high-quality results within deadlines.
  • Familiarity with database systems and structures (e.g. PostgreSQL, BigQuery).
  • Excellent problem-solving skills and the ability to debug and resolve complex technical issues efficiently.
  • Strong communication skills, both written and verbal in English (our company language), with the ability to effectively collaborate with cross-functional teams.
  • Familiarity with: Census, Fivetran, dbt, Airflow, Castor, HubSpot, Vitally.

Show me the Benefits!

  • Your own Pleo card (no more out-of-pocket spending!)
  • A monthly allowance of €55 (INR equivalent) towards your lunch
  • Hybrid, flexible working arrangement
  • 25 days of PTO + public holidays
  • Option to purchase 5 additional days of holiday through a salary sacrifice
  • Wellbeing days - fully paid days off designed for a slower pace, allowing you to take time to recharge and prioritise self-care
  • We’re trialling MyndUp to give our employees access to free mental health and wellbeing support, with great success so far
  • Access to LinkedIn Learning - acquire new skills, stay abreast of industry trends and fuel your personal and professional development continuously
  • Paid parental leave - we want to make sure that we're supportive of families and help you feel that you don't have to compromise your family due to work

Why join us?

Working at Pleo means you're working on something very exciting: the future of work. Our mission is to help every company go beyond the books. Pleo itself means ‘more than you’d expect’, and it’s been the secret to our success over the last 8 years. So it’s only fitting that we’d pass this philosophy onto our customers to help them make the most of their finances.

We think company spending should be delegated to all employees and teams, that it should be as automated as possible, and that it should drive a culture of responsible spending. Finance teams shouldn’t be siloed from the rest of the organisation – they should work in unity with marketing, sales, IT and everyone else.

Speaking of working in unity, our values tell the story of how we work at Pleo. We have four core values, the first of which is ‘champion the customer’, which means we address real pain points that businesses face. Next up is ‘succeed as a team’, which highlights how our strength lies in our diversity and trust in each other. We also ‘make it happen’ by taking bold decisions and following through to deliver results. Last but not least, we ‘build to scale’, creating lasting solutions that address today’s challenges and anticipate tomorrow’s needs.

So, in a nutshell, that's Pleo. Today we are an 850+ team, from over 100 nations, sitting in our Copenhagen HQ, London, Stockholm, Berlin, Madrid, Montreal and Lisbon offices, and quite a few full-time remotes in 35 other countries! Being HQ'd out of Copenhagen means we're inspired by things like a good work-life balance. If you don't work in the office with us, we'll help you set up the best remote setup possible and make sure you still have time to connect with your team.

About your application

  • Please submit your application in English; it’s our company language so you’ll be speaking lots of it if you join
  • We treat all candidates equally: if you are interested, please apply through our application system - any correspondence should come from there! Our lovely support team isn't able to pass on any calls/emails our way - and this makes sure that the candidate experience is smooth and fair to everyone
  • We’re on a mission to make everyone feel valued at work. That’s only achievable if our team reflects the diversity of the world around us - and that starts with you, hitting apply, even if you are worried you might not tick all the boxes! We embrace and encourage people from all backgrounds to apply - regardless of race/ethnicity, colour, religion, nationality, gender, sex, sexual orientation, age, marital status, disability, neurodiversity, socio-economic status, culture or beliefs.
  • When you submit an application we process your personal data as a data processor. Find out more about how your data is used in the FAQs section at the bottom of our jobs page.

See more jobs at Pleo

Apply for this job

+30d

Engineering Manager, Data Infrastructure

Grammarly, Germany; Hybrid
remote-first, airflow, Design, azure, AWS

Grammarly is hiring a Remote Engineering Manager, Data Infrastructure

Grammarly is excited to offer a remote-first hybrid working model. Grammarly team members in this role must be based in Germany, and, depending on business needs, they must meet in person for collaboration weeks, traveling if necessary to the hub(s) where their team is based.

This flexible approach gives team members the best of both worlds: plenty of focus time along with in-person collaboration that fosters trust and unlocks creativity.

About Grammarly

Grammarly is the world’s leading AI writing assistance company trusted by over 30 million people and 70,000 teams. From instantly creating a first draft to perfecting every message, Grammarly helps people at 96% of the Fortune 500 and teams at companies like Atlassian, Databricks, and Zoom get their point across—and get results—with best-in-class security practices that keep data private and protected. Founded in 2009, Grammarly is No. 14 on the Forbes Cloud 100, one of TIME’s 100 Most Influential Companies, one of Fast Company’s Most Innovative Companies in AI, and one of Inc.’s Best Workplaces.

The Opportunity

To achieve our ambitious objectives, we are seeking a highly skilled and experienced Manager for our Data Infrastructure team. This role is crucial in managing and evolving our data ingestion processes and tooling to support self-serve analytics and policy management across the organization. The ideal candidate will possess strong technical expertise, exceptional leadership abilities, and the capability to mentor and develop a high-performing team.

This person will be an integral part of the larger data organization, reporting directly to the Director of Data Engineering based in the US, and they’ll have the opportunity to influence decisions and the direction of our overall data platform, including infrastructure, data processing and analytics engineering.

Grammarly’s engineers and researchers have the freedom to innovate and uncover breakthroughs—and, in turn, influence our product roadmap. The complexity of our technical challenges is growing rapidly as we scale our interfaces, algorithms, and infrastructure. You can hear more from our team on our technical blog.

As the Manager of the Data Infrastructure team, you will lead and mentor a team of data & software engineers, fostering a collaborative and innovative environment focused on professional growth. You will oversee the design, implementation, and maintenance of secure, scalable, and optimized data infrastructure, ensuring high performance and reliability. Your role includes developing and executing strategic roadmaps aligned with business objectives and collaborating closely with cross-functional teams and the larger data organization to ensure seamless data integration and access. Additionally, you will provide technical leadership and be pivotal in resource management and recruiting efforts, driving the team’s success and aligning with the organization’s long-term data strategy.

In this role, you will:

  • Build a highly specialized engineering team to support the growing needs and complexity of our product and business organizations. 
  • Oversee the design, implementation, and maintenance of a robust data ingestion framework that ensures high availability and reliability.
  • Develop and manage tooling that enables self-serve analytics and policy management across the organization.
  • Ensure data is collected, transformed, and stored efficiently to support real-time and batch processing needs.
  • Act as a liaison between the local team and the broader organization to ensure seamless communication and collaboration.
  • Participate in cross-functional meetings and initiatives to represent the Data Infrastructure team’s interests and contribute to the organization’s overall data strategy.
  • Drive the evaluation, selection, and implementation of new technologies and tools that enhance the team’s capabilities and the organization’s data infrastructure.
  • Implement and enforce data governance policies and practices to ensure data quality, security, and compliance with organizational standards.
  • Collaborate with stakeholders to define and refine data policies that align with business objectives.
  • Monitor and assess the performance of the data infrastructure to identify areas for optimization and improvement.
  • Foster a collaborative and high-performance culture within the team.
  • Cultivate an ownership mindset and culture on your team and across product teams: provide the necessary metrics to help us understand what is working, what is not, and how to fix it.
  • Set high performance and quality standards, coach team members to meet them; mentor and grow junior and senior IC talent.

Qualifications

  • 7+ years of experience in data engineering or data infrastructure, with at least 2-3 years in a leadership or management role.
  • Proven experience in building and managing large-scale data ingestion pipelines and infrastructure.
  • Experience with one or more data platforms (e.g., AWS, GCP, Azure Databricks) 
  • Familiarity with modern data engineering tools and frameworks (e.g., Apache Kafka, Airflow, DBT)
  • Strong understanding of data governance, policy management, and self-serve analytics.
  • Excellent leadership and people management skills, with a track record of mentoring and developing high-performing teams.
  • Experience working with geographically distributed teams and aligning with global data strategies.
  • Strong problem-solving skills, with the ability to navigate and resolve complex technical challenges.
  • Excellent communication and collaboration skills, with the ability to work effectively with stakeholders across different locations and time zones.
  • Has the ability and desire to operate in a fast-paced, dynamic environment where things change quickly.
  • Leads by setting well-understood goals and sharing the appropriate level of context for maximum autonomy but is also deeply technical and can dive in to help when necessary.
  • Embodies our EAGER values—is ethical, adaptable, gritty, empathetic, and remarkable.
  • Is inspired by our MOVE principles: move fast and learn faster; obsess about creating customer value; value impact over activity; and embrace healthy disagreement rooted in trust.
  • Is able to meet in person for their team’s scheduled collaboration weeks, traveling if necessary to the hub where their team is based.

Support for you, professionally and personally

  • Professional growth:We believe that autonomy and trust are key to empowering our team members to do their best, most innovative work in a way that aligns with their interests, talents, and well-being. We also support professional development and advancement with training, coaching, and regular feedback.
  • A connected team: Grammarly builds a product that helps people connect, and we apply this mindset to our own team. Our remote-first hybrid model enables a highly collaborative culture supported by our EAGER (ethical, adaptable, gritty, empathetic, and remarkable) values. We work to foster belonging among team members in a variety of ways. This includes our employee resource groups, Grammarly Circles, which promote connection among those with shared identities including BIPOC and LGBTQIA+ team members, women, and parents. We also celebrate our colleagues and accomplishments with global, local, and team-specific programs. 
  • Comprehensive benefits for candidates based in Germany:Grammarly offers all team members competitive pay along with a benefits package encompassing life care (including mental health care and risk benefits) and ample and defined time off. We also offer support to set up a home office, wellness and pet care stipends, learning and development opportunities, and more.

We encourage you to apply

At Grammarly, we value our differences, and we encourage all to apply. Grammarly is an equal-opportunity company. We do not discriminate on the basis of race or ethnic origin, religion or belief, gender, disability, sexual identity, or age.

For more details about the personal data Grammarly collects during the recruitment process, for what purposes, and how you can address your rights, please see the Grammarly Data Privacy Notice for Candidates here.

#LI-Hybrid

 

Apply for this job

+30d

Senior Software Engineer (Data Platforms)

carsales, Melbourne, Australia, Remote
SQS, EC2, Lambda, scala, airflow, Design, .net, docker, python, AWS

carsales is hiring a Remote Senior Software Engineer (Data Platforms)

Job Description

What you’ll do

  • Contributing to the delivery of scalable data architectures, and data product development & design best practices
  • Leading collaborations across data disciplines to develop, optimise and maintain data pipelines and solutions
  • Engaging actively in facilitating team-based problem-solving sessions and contributing to the development of best practices
  • Initiating and nurturing effective working relationships, acting as a trusted advisor on product analytics and commercial data solutions
  • Leading technical recommendations and decision-making while mentoring early-career engineers, playing a key role in growing the team's capabilities
  • Owning the delivery of your allocated initiatives within the specified scope, time and budget

Qualifications

What we are looking for?

Critical to success in the role is the ability to operate in the liminal space between business, data and technical practice.

  • An all-of-business mindset over siloed success; leading with high levels of personal integrity and accountability
  • Skilled at distilling business and analytics requirements into well-defined engineering problems
  • Skilled at applying appropriate software engineering methods (e.g. modularisations, abstractions) that make data assets tractable
  • A track record of architecting, building and integrating with RESTful APIs and microservices architectures
  • Proficient in containerisation platforms (e.g. Docker) and CI/CD pipelines
  • Data engineering experience (e.g. transformations, modelling) grounded in the basics of an analytical discipline
  • Skilled in designing and building software/pipelines using cloud services such as AWS EC2, Glue, Lambda, SNS, SQS, IAM, ECS or equivalent
  • Demonstrated experience with distributed technologies and data-intensive processing frameworks -e.g. Airflow, HDFS, Spark, EMR
  • Proficient in at least two programming languages (e.g. Python, Spark, Scala). Experience with .Net frameworks highly favourable

See more jobs at carsales

Apply for this job

+30d

Data Engineer (Intermediate)

SecurityScorecard, Remote (Canada)
redis, agile, Bachelor's degree, scala, nosql, airflow, postgres, sql, Design, c++, jenkins, AWS

SecurityScorecard is hiring a Remote Data Engineer (Intermediate)

About SecurityScorecard:

SecurityScorecard is the global leader in cybersecurity ratings, with over 12 million companies continuously rated, operating in 64 countries. Founded in 2013 by security and risk experts Dr. Alex Yampolskiy and Sam Kassoumeh and funded by world-class investors, SecurityScorecard’s patented rating technology is used by over 25,000 organizations for self-monitoring, third-party risk management, board reporting, and cyber insurance underwriting, making all organizations more resilient by allowing them to easily find and fix cybersecurity risks across their digital footprint.

Headquartered in New York City, our culture has been recognized by Inc Magazine as a "Best Workplace,” by Crain’s NY as a "Best Places to Work in NYC," and as one of the 10 hottest SaaS startups in New York for two years in a row. Most recently, SecurityScorecard was named to Fast Company’s annual list of the World’s Most Innovative Companies for 2023 and to the Achievers 50 Most Engaged Workplaces in 2023 award recognizing “forward-thinking employers for their unwavering commitment to employee engagement.” SecurityScorecard is proud to be funded by world-class investors including Silver Lake Waterman, Moody’s, Sequoia Capital, GV and Riverwood Capital.

About the Team

The Data Analytics Engineering team is responsible for developing and managing the core data platform for ratings infrastructure, architecting and implementing business-critical data solutions and pipelines, and enabling data-driven decisions within the organization and for our customers.

About the Role

As a Senior Data Engineer, you will work alongside outstanding engineers to implement new products and features focused on meeting the evolving needs of our customers, while refining requirements with product management and collaborating cross-team. All our team members actively participate in product definition, technical architecture review, iterative development, code review, and operations. Along with this, you’ll have the opportunity to interact with customers to ensure their needs are met. You will be working in a high-performance, fast-paced environment and contributing to an inclusive work environment.

Responsibilities:

  • Lead and collaborate with engineers to deliver projects from inception to successful execution
  • Write well-crafted, well-tested, readable, maintainable code
  • Participate in code reviews to ensure code quality and distribute knowledge
  • Share engineering support, release, and on-call responsibilities for an always-on 24x7 site
  • Participate in Technical Design Review sessions, and have the ability to explain the various trade-offs made in decisions
  • Maintain existing APIs and data pipelines, and contribute to increasing code coverage
  • Understand requirements, build business logic, and demonstrate the ability to learn and adopt quickly
  • Automate and improve existing processes to sustainably maintain the current features and pipelines
  • Analyze our internal systems and processes, and locate areas for improvement/automation

Requirements

  • BS/MS in computer science or equivalent technical experience, and must have worked in the data engineering space for 3+ years
  • Must have experience in full software development life cycle, including coding standards, code reviews, source control management, build processes, testing, and operations 
  • Technical requirements:
    • Must have 2+ years experience in building and maintaining big data pipelines using Scala with Spark, Airflow, Hive, Presto, Redis
    • Must have 2+ years experience with NoSQL databases, preferably Cassandra / Scylla and Clickhouse; and SQL databases, preferably Postgres
    • Must have 2+ years experience in developing batch/real-time data streams 
    • Worked with CI/CD pipelines using Jenkins
    • Experience with cloud environments, preferably AWS
    • Worked with a variety of data (structured/unstructured) and data formats (flat files, XML, JSON, relational, parquet)
  • Worked in Agile methodology

Benefits

We offer a competitive salary, stock options, a comprehensive benefits package, including health and dental insurance, unlimited PTO, parental leave, tuition reimbursements, and much more!

SecurityScorecard is committed to Equal Employment Opportunity and embraces diversity. We believe that our team is strengthened through hiring and retaining employees with diverse backgrounds, skill sets, ideas, and perspectives. We make hiring decisions based on merit and do not discriminate based on race, color, religion, national origin, sex or gender (including pregnancy) gender identity or expression (including transgender status), sexual orientation, age, marital, veteran, disability status or any other protected category in accordance with applicable law. 

We also consider qualified applicants regardless of criminal histories, in accordance with applicable law. We are committed to providing reasonable accommodations for qualified individuals with disabilities in our job application procedures. If you need assistance or accommodation due to a disability, please contact talentacquisitionoperations@securityscorecard.io.

Any information you submit to SecurityScorecard as part of your application will be processed in accordance with the Company’s privacy policy and applicable law. 

SecurityScorecard does not accept unsolicited resumes from employment agencies.  Please note that we do not provide immigration sponsorship for this position. #LI-DNI

See more jobs at SecurityScorecard

Apply for this job

+30d

Senior Data Engineer

StyleSeat, 100% Remote (U.S. Based Only, Select States)
scala, nosql, airflow, sql, Design, c++, docker, MySQL, python

StyleSeat is hiring a Remote Senior Data Engineer

Senior Data Engineer

100% Remote (U.S. Based Only, Select States - See Below)

About the role

StyleSeat is looking to add a Senior Data Engineer to its cross-functional Search product team. This team of data scientists, analysts, data engineers, software engineers and SDETs is focused on improving our search capability and customer search experience. The Senior Data Engineer will use frameworks and tools to perform the ETL and propose abstractions of those methods to aid in solving the problems associated with data ingestion (a brief illustrative sketch follows).
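
To make the "abstractions of those methods" idea concrete, here is a small, purely illustrative Python sketch of one such abstraction: a pipeline wrapper where each new ingestion source only supplies extract/transform/load callables. All names are invented for the example and do not describe StyleSeat's internal code.

```python
from dataclasses import dataclass
from typing import Any, Callable, Iterable


@dataclass
class EtlPipeline:
    """Bundle the three ETL stages so every new source reuses one run loop."""
    extract: Callable[[], Iterable[Any]]
    transform: Callable[[Any], Any]
    load: Callable[[list], None]

    def run(self) -> None:
        # Pull raw records, apply the row-level transform, then load in bulk.
        self.load([self.transform(row) for row in self.extract()])


# Usage: wire up a trivial source to show the abstraction end to end.
pipeline = EtlPipeline(
    extract=lambda: [{"pro_id": 1, "service": "haircut"}],
    transform=lambda row: {**row, "service": row["service"].title()},
    load=lambda rows: print(f"Loaded {len(rows)} rows"),
)
pipeline.run()
```

The design choice the role hints at is exactly this separation: once the stages are factored out, concerns like retries, testing, and monitoring can live in the shared `run` path rather than in every pipeline.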

What you’ll do

  • Handle data engineering tasks in a team focused on improving search functionality and customer search experience.
  • Design, develop, and own ETL pipelines that deliver data with measurable quality.
  • Scope, architect, build, release, and maintain data-oriented projects, considering performance, stability, and error-free operation.
  • Identify and resolve pipeline issues while discovering opportunities for improvement.
  • Architect scalable and reliable solutions to move data across systems from multiple products in near real-time.
  • Continuously improve our data platform and keep the technology stack current.
  • Solve critical issues in complex designs or coding schemes.
  • Monitor metrics, analyze data, and partner with other internal teams to solve difficult problems creating a better customer experience.

Who you are 

Successful candidates can come from a variety of backgrounds, yet here are some of the must-have and nice-to-have experiences we’re looking for:

Must-Have:

  • Expert SQL skills.
  • 4+ years of experience with:
    • Scaling and optimizing schemas.
    • Performance tuning ETL pipelines.
    • Building pipelines for processing large amounts of data.
  • Proficiency with Python, Scala and other scripting languages.
  • Experience with:
    • MySQL and Redshift.
    • NoSQL data stores, methods and approaches.
    • Kinesis or other data streaming services. 
    • Airflow or other pipeline workflow management tools. 
    • EMR,  Spark and ElasticSearch.
    • Docker or other container management tools. 
    • Developing infrastructure as code (IAC).
  • Ability to effectively work and communicate with cross-departmental partners and non-technical teams.

Nice to Have:

  • Experience with:
    • Segment customer data platform with integration to Braze.
    • Terraform. 
    • Tableau.
    • Django.
    • Flask.

Salary Range

Our job titles may span more than one career level. The career level we are targeting for this role has a base pay between $136,900 and $184,600. The actual base pay is dependent upon many factors, such as: training, transferable skills, work experience, business needs and market demands. Base pay ranges are subject to change and may be modified in the future. 

Who we are

StyleSeat is the premier business platform for SMBs in the beauty and wellness industry to run and grow their business, and the destination for consumers to discover, book and pay. To date, StyleSeat has powered more than 200 million appointments totaling over $12 billion in revenue for small businesses. StyleSeat is a platform and marketplace designed to support and promote the beauty and personal care community.

Today, StyleSeat connects consumers with top-rated beauty professionals in their area for a variety of services, including hair styling, barbering, massage, waxing, and nail care, among others. Our platform ensures that Pros maximize their schedules and earnings by minimizing gaps and cancellations, effectively attracting and retaining clientele.

StyleSeat Culture & Values 

At StyleSeat, our team is committed to fostering a positive and inclusive work environment. We respect and value the unique perspectives, experiences, and skills of our team members and work to create opportunities for all to grow and succeed. 

  • Diversity - We celebrate and welcome diversity in backgrounds, experiences, and perspectives. We believe in the importance of creating an inclusive work environment where everyone can thrive. 
  • Curiosity- We are committed to fostering a culture of learning and growth. We ask questions, challenge assumptions, and explore new ideas. 
  • Community - We are committed to making a positive impact on each other, even when win-win-win scenarios are not always clear or possible in every decision. We strive to find solutions that benefit the community as a whole and drive our shared success.
  • Transparency - We are committed to open, honest, and clear communication. We hold ourselves accountable for maintaining the trust of our customers and team.
  • Entrepreneurship - We are self-driven big-picture thinkers - we move fast and pivot when necessary to achieve our goals. 

Applicant Note: 

StyleSeat is a fully remote, distributed workforce; however, we only have business entities established in the below list of states and, thus, are unable to consider candidates who live in states not on this list for the time being.
Applicants must be authorized to work for ANY employer in the U.S. We are unable to sponsor or take over sponsorship of an employment visa at this time.

 

  • Arizona
  • Alabama
  • California
  • Colorado
  • Florida
  • Georgia
  • Illinois
  • Indiana
  • Massachusetts
  • Maryland
  • Michigan
  • Nebraska
  • New York
  • New Jersey
  • Ohio
  • Oregon
  • Pennsylvania
  • Virginia
  • Washington

 

See more jobs at StyleSeat

Apply for this job

+30d

Technical Lead - Data Engineering

carwow, London, England, United Kingdom, Remote
DevOps, tableau, terraform, airflow, sql, Design, ruby, postgresql, python, AWS

carwow is hiring a Remote Technical Lead - Data Engineering

THE CARWOW GROUP

Carwow Group is driven by a passion for getting people into cars. But not just any car, the right car. That’s why we are building the go-to destination for car-changing, designed to reach drivers everywhere with our trail-blazing portfolio of personality-rich automotive brands: Carwow, Auto Express, evo, Driving Electric and Car Buyer.

What started as a simple reviews site is now one of the largest online car-changing destinations in Europe - over 10m customers have used Carwow to help them buy and sell cars since its inception. Last year we grew over 50%, with nearly £3bn worth of cars bought on site, while £1.8bn of cars were listed for sale through our Sell My Car service.

In 2024 we went big and acquired Autovia, doubling our audience overnight. Together we now have one of the biggest YouTube channels in the world with over 1.1 billion annual views, sell 1.2 million print copies of our magazines and have an annual web content reach of over 350 million.

WHY JOIN US?

We are winners of the prestigious Culture 100 award that recognises the most loved and happiest tech companies to work for! We have just raised $52m in funding led by global venture capital firm Bessemer Venture Partners (an early backer of LinkedIn and Shopify) to accelerate our growth plans!

As pioneers, we’re always driving for new territory and positive change, so our work as a group is never done. Where others see difficulty, it’s our responsibility to see possibility – building new experiences, launching new titles and listening to drivers.

Being a part of Carwow Group means championing drivers and the automotive industry, acting as a disrupter and never being afraid to fail (but learning fast when we do!).

Our team of 500 employees across the UK, Germany, Spain and Portugal are revolutionising car-changing and we are fast expanding our mission across every single brand and country we operate in, so jump in!

ABOUT THE TEAM

As a team our mission is to empower all Carwow teams with access to reliable, actionable data and enable informed decision-making through scalable and secure data platforms.

We’re investing heavily in our Data Engineering capabilities at carwow, ensuring we have the expertise to deliver infrastructure and data-driven best practices that support the growing needs of our Data Analytics team. We build our ETL pipelines in Python and SQL, using Airflow to automate workflows, with data ingested into a Snowflake warehouse from various in-house and third-party sources, and Terraform to manage infrastructure.
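
Since the paragraph above describes the pipeline stack concretely (Python + SQL, Airflow orchestration, a Snowflake warehouse, Terraform-managed infrastructure), here is a minimal, hypothetical Airflow DAG in that spirit. DAG, task, and table names are invented for illustration; this assumes Airflow 2.4+ (for the `schedule` argument) and is not carwow's actual pipeline code.

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract_listings():
    """Pull raw rows from an in-house or third-party source (stubbed here)."""
    return [{"listing_id": 1, "price_gbp": 18995}]


def load_to_warehouse(ti):
    """Load the extracted rows into the warehouse (a print stands in for Snowflake)."""
    rows = ti.xcom_pull(task_ids="extract_listings")
    print(f"Would load {len(rows)} rows into Snowflake")


with DAG(
    dag_id="example_listings_etl",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract = PythonOperator(task_id="extract_listings",
                             python_callable=extract_listings)
    load = PythonOperator(task_id="load_to_warehouse",
                          python_callable=load_to_warehouse)
    extract >> load  # ingestion runs before the warehouse load
```

A real version would swap the stubs for warehouse hooks (e.g. Airflow's Snowflake provider) and keep the surrounding infrastructure defined in Terraform, as the team describes.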

Our tech stack is primarily Ruby on Rails using PostgreSQL, all hosted on Heroku. We use Terraform to manage our infrastructure and encourage teams to be involved in how we deploy and run our code in Production.

You'll be supported by one of our Engineering Managers, who are there to help you with your career progression and ensure that the team you are on is working as well as it can - continuously improving!

We have a career progression framework that means you know what is expected of you and how you can progress through our career ladder and your EM will work with you to make sure you are happy, fulfilled, and doing the best work you can be doing.

WHAT YOU’LL DO

  • You will be the Technical Lead within the Data Engineering squad, working simultaneously with the Principal Engineer, Product Manager and Engineering Manager whilst partnering closely with our Analytics & Data Science team to help enhance their productivity
  • You will help to shape the technical strategy for our approach to building data products and the future of Data Engineering at carwow by providing knowledge and expertise in best practices and implementation. This includes developing and testing product improvements
  • You will lead the Data Engineering team in scoping how new features can be built and how Stakeholder needs can be met while making pragmatic technical trade-offs
  • Develop and enhance our data architecture and pipelines to enable us to deliver against our business goals
  • Collaborate with and mentor your team by reviewing code and by monitoring and guiding your team's software in production. Collaborate with other Stakeholders, helping them to do their finest work and deliver projects according to our values and brand
  • Define a roadmap of initiatives to both improve our current infrastructure and assist other teams in creating new data products.
  • Contribute to a diverse Engineering culture based on customer-centricity, high-quality code, data-driven outcomes, technical innovation and business impact
  • Identify the important technical differentiating factors of our data product, and leverage those in our architecture

WHAT YOU’LL NEED

  • Previous experience as a Senior Data Engineer or Technical Lead
  • Knowledge of data modelling and schema design with a focus on efficiency, accuracy and scalability
  • Experience with SQL, Python and data infrastructure; ETL/ELT knowledge; and experience using DAGs to manage script dependencies with tools like dbt, Airflow, Snowflake and Terraform
  • Experience in Ruby, data visualisation tools (e.g. Looker, Tableau, Power BI), Amplitude, DevOps, Circle CI/CD or AWS is preferred but not essential
  • Experience guiding and mentoring distributed teams of Engineers to success, working cross-functionally with Product Managers, Stakeholders, and other Developers
  • A desire to learn continuously, share your knowledge, communicate effectively and build a product in close collaboration with others. You raise the bar for technical quality and share your expertise
  • Experience delivering work that has measurable impact and value to stakeholders – early and often. You actively scope work and drive projects forward
  • You are fully engaged in the product development life cycle, helping to shape your team’s roadmap

You’re not expected to be an expert in all of these technologies and tools; we are happy to support your learning journey. If you’re unsure about any of the above, please apply.

WHAT’S IN IT FOR YOU

  • Fully remote working role, with offices in London, Munich, Madrid, and Porto that you can work from
  • 4-5 trips to the London office for social and team bonding events
  • Competitive salary to fund that dream holiday to Bali
  • Matched pension contributions for a peaceful retirement
  • Share options - when we thrive, so do you!
  • Vitality Private Healthcare, for peace of mind, plus eyecare vouchers
  • Life Assurance for (even more) peace of mind
  • Monthly coaching sessions with Spill - our mental wellbeing partner
  • Enhanced holiday package, plus Bank Holidays
    • 28 days annual leave
    • 1 day for your wedding
    • 1 day off when you move house - because moving is hard enough without work!
    • On your third anniversary, your allowance rises to 30 days of annual leave per year
    • On your tenth anniversary, your allowance rises to 35 days of annual leave per year
    • Option to buy 3 extra days of holiday per year
  • Work from abroad for a month
  • Inclusive parental, partner and shared parental leave, fertility treatment and pregnancy loss policies
  • Bubble childcare support and discounted nanny fees for little ones
  • The latest tech (Macbook or Surface) to power your gif-sending talents
  • Up to £500/€550 home office allowance for that massage chair you’ve been talking about
  • Generous learning and development budget to help you master your craft
  • Regular social events: tech lunches, coffee-with-the-exec sessions, book clubs, and anything else you pester us for
  • Refer a friend, get paid. Repeat for infinite money
  • Lunch & learns and Carwow Classrooms with expert speakers who are here for a free lunch

Diversity and inclusion is an integral part of our culture. We know that diverse teams are strong teams, so we welcome those with alternative identities, backgrounds, and experiences to apply for this position. We make recruiting decisions based on experience, skills and potential, so all our applicants are treated fairly and equally.

See more jobs at carwow

Apply for this job

+30d

Senior Data Engineer

EquipmentShare - Remote; Chicago; Denver; Kansas City; Columbia MO
Lambda, Agile, Airflow, SQL, Design, C++, PostgreSQL, Python, AWS

EquipmentShare is hiring a Remote Senior Data Engineer

EquipmentShare is hiring a Senior Data Engineer.

Your role in our team

At EquipmentShare, we believe it’s more than just a job. We invest in our people and encourage you to choose the best path for your career. It’s truly about you, your future, and where you want to go.

We are looking for a Senior Data Engineer to help us continue to build the next evolution of our data platform in a scalable, performant, and customer-centric architecture.

Our main tech stack includes Snowflake, Apache Airflow, AWS cloud infrastructure (e.g., Kinesis, Kubernetes/EKS, Lambda, Aurora RDS PostgreSQL), Python, and TypeScript.

What you'll be doing

We are typically organized into agile cross-functional teams composed of Engineering, Product, and Design, which allows us to develop deep expertise and rapidly deliver high-value features and functionality to our next-generation T3 Platform.

You’ll be part of a close-knit team of data engineers developing and maintaining a data platform built with automation and self-service in mind, supporting analytics and machine learning data products for the next generation of our T3 Fleet that enable end-users to track, monitor and manage the health of their connected vehicles and deployed assets.

We'll be there to support you as you become familiar with our teams, product domains, tech stack and processes — generally how we all work together.

Primary responsibilities for a Senior Data Engineer

  • Collaborate with Product Managers, Designers, Engineers, Data Scientists and Data Analysts to take ideas from concept to production at scale.
  • Design, build and maintain our data platform to enable automation and self-service for data scientists, machine learning engineers and analysts.
  • Design, build and maintain data product framework to support EquipmentShare application data science and analytics features.
  • Design, build and maintain CI/CD pipelines and automated data and machine learning deployment processes.
  • Develop data monitoring and alerting capabilities (see the sketch after this list).
  • Document architecture, processes and procedures for knowledge sharing and cross-team collaboration.
  • Mentor peers to help them build their skills.
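
As a flavour of that monitoring and alerting work (a sketch under assumptions, not EquipmentShare's implementation), the snippet below wires an Airflow failure callback to a webhook. The endpoint, DAG id and schedule are hypothetical, and it assumes Airflow 2.4+.

```python
# A minimal sketch of task-failure alerting for a data platform; the webhook
# URL and DAG/task names are hypothetical placeholders.
from datetime import datetime

import requests
from airflow import DAG
from airflow.operators.empty import EmptyOperator

ALERT_WEBHOOK = "https://hooks.example.com/data-alerts"  # placeholder endpoint


def alert_on_failure(context):
    # Airflow passes the task-instance context to this callback on failure.
    ti = context["task_instance"]
    requests.post(
        ALERT_WEBHOOK,
        json={"text": f"Task {ti.task_id} failed in DAG {ti.dag_id}"},
        timeout=10,
    )


with DAG(
    dag_id="telemetry_ingest",
    schedule="@hourly",
    start_date=datetime(2024, 1, 1),
    catchup=False,
    default_args={"on_failure_callback": alert_on_failure},
):
    EmptyOperator(task_id="ingest_placeholder")
```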

Why We’re a Better Place to Work

We can promise that every day will be a little different with new ideas, challenges and rewards.

We’ve been growing as a team and we are not finished just yet - there is plenty of opportunity to shape how we deliver together.

Our mission is to enable the construction industry with tools that unlock substantial increases to productivity. Together with our team and customers, we are building the future of construction.

T3 is the only cloud-based operating system that brings together construction workflows & data from constantly moving elements in one place.

  • Competitive base salary and market leading equity package.
  • Unlimited PTO.
  • Remote first.
  • True work/life balance.
  • Medical, Dental, Vision and Life Insurance coverage.
  • 401(k) + match.
  • Opportunities for career and professional development with conferences, events, seminars and continued education.
  • On-site fitness center at the Home Office in Columbia, Missouri, complete with weightlifting machines, cardio equipment, group fitness space, racquetball courts, a climbing wall, and much more!
  • Volunteering and local charity support that help you nurture and grow the communities you call home through our Giving Back initiative.
  • Stocked breakroom and full kitchen with breakfast and lunch provided daily by our chef and kitchen crew.

About You

You're a hands-on developer who enjoys solving complex problems and building impactful solutions. Most importantly, you care about making a difference.

  • Take the initiative to own outcomes from start to finish — knowing what needs to be accomplished within your domain and how we work together to deliver the best solution.
  • You are passionate about developing your craft — you understand what it takes to build quality, robust and scalable solutions.
  • You’ll see the learning opportunity when things don’t quite go to plan — not only for you but for how we continuously improve as a team.
  • You take a hypothesis-driven approach — knowing how to source, create and leverage data to inform decision making, using data to drive how we improve, to shape how we evaluate and make platform recommendations.

So, what is important to us?

Above all, you’ll get stuff done. More importantly, you’ll collaborate to do the right things in the right way to achieve the right outcomes.

  • 7+ years of relevant data platform development experience building production-grade solutions.
  • Proficient with SQL and a high-level object-oriented language (e.g., Python).
  • Experience with designing and building distributed data architecture.
  • Experience building and managing production-grade data pipelines using tools such as Airflow, dbt, DataHub and MLflow.
  • Experience building and managing production-grade data platforms using distributed systems such as Kafka, Spark, Flink and/or others.
  • Familiarity with event data streaming at scale.
  • Proven track record learning new technologies and applying that learning quickly.
  • Experience building observability and monitoring into data products. 
  • Motivated to identify opportunities for automation to reduce manual toil.

EquipmentShare is committed to a diverse and inclusive workplace. EquipmentShare is an equal opportunity employer and does not discriminate on the basis of race, national origin, gender, gender identity, sexual orientation, protected veteran status, disability, age, or other legally protected status.

 

#LI-Remote

 

See more jobs at EquipmentShare

Apply for this job

+30d

Senior Data Engineer

Medalogix - United States, Remote
Airflow, Design

Medalogix is hiring a Remote Senior Data Engineer

We are looking for an experienced Senior Data Engineer to join our growing team of data experts. As a data engineer at Medalogix, you will be responsible for developing, maintaining, and optimizing our data warehouse and data pipelines. The data engineer will support multiple stakeholders, including software developers, database architects, data analysts, and data scientists, to ensure an optimal data delivery architecture. The ideal candidate should possess strong technical abilities to solve complex problems with data, a willingness to learn new technologies and tools if necessary, and be comfortable supporting the data needs of multiple teams, stakeholders, and products.

  • Design, build, and maintain batch or real-time data pipelines in production while adhering to architectural requirements of maintainability and scalability.
  • Build scalable data pipelines (Python/dbt) leveraging the Airflow scheduler/executor framework (see the sketch after this list).
  • Ensure that large and/or more complex engineering projects are delivered in alignment with the appropriate business outcomes.
  • Monitor data systems performance and implement optimization strategies.
  • Stay current with emerging trends and technologies in data engineering and HealthTech industry standards.
  • Perform on-call off-hours support for critical systems.
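
As a rough sketch of that Python/dbt-on-Airflow pattern (illustrative only, not Medalogix's code; the project directory and schedule are hypothetical):

```python
# A minimal dbt-on-Airflow pipeline: run transformations, then data tests.
# A failing test fails the task and surfaces in Airflow's alerting.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="dbt_daily_build",
    schedule="@daily",
    start_date=datetime(2024, 1, 1),
    catchup=False,
):
    dbt_run = BashOperator(task_id="dbt_run", bash_command="dbt run --project-dir /opt/dbt")
    dbt_test = BashOperator(task_id="dbt_test", bash_command="dbt test --project-dir /opt/dbt")
    dbt_run >> dbt_test
```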

  • 7+ years as a Data Engineer or in a related role, with a focus on designing and developing performant data pipelines.
  • Intermediate/expert-level knowledge of Kubernetes fundamentals (nodes, pods, services, deployments, etc.) and their interactions with the underlying infrastructure.
  • 2+ years of hands-on experience with Docker containerization technology to package applications for use in a distributed system managed by Kubernetes.
  • 3+ years of experience with the orchestration platform Airflow. Experience with Azure Data Factory is a plus but not required.
  • Expertise in using dbt and Apache Airflow for orchestration and data transformation.
  • Strong programming skills in Python and SQL. Experience with Scala is a plus but not required.
  • Strong experience with at least one cloud platform such as AWS, Azure, or Google Cloud.
  • Experience working with cloud data warehouse solutions (Snowflake).
  • Excellent problem-solving, communication, and organizational skills.
  • Proven ability to work independently and with a team.
  • An approachable, personable team player comfortable working in an Agile environment.
  • Experience working with large data sets and distributed computing.
  • Knowledge of EMR systems like HCHB/MatrixCare.
  • Prior experience as a Senior Data Engineer within a healthcare SaaS group.

  • Health Care Plan (Medical, Dental & Vision)
  • Retirement Plan (401k, IRA)
  • Life Insurance (Basic, Voluntary & AD&D)
  • Paid Time Off (Vacation, Sick & Public Holidays)
  • Family Leave (Maternity, Paternity)

See more jobs at Medalogix

Apply for this job

+30d

HVAC Airflow Loop Design Release Engineer

Segula Technologies - Mexico City, Mexico, Remote
Bachelor's degree, 5 years of experience, Airflow, Design

Segula Technologies is hiring a Remote HVAC Airflow Loop Design Release Engineer.

Job Description

Responsibilities

The HVAC Release Engineer is responsible for the development and validation of the automotive HVAC unit and airflow loop components. The candidate will ensure the HVAC unit and airflow loop components meet performance objectives and will coordinate development with other functional areas such as IP, hard trim, BIW, Aero/Thermal, Engine Cooling, PDO and suppliers. Duties will include writing source packages, specifications, requirements and change notices, and overall supplier management to meet program timing and deliverables. The candidate will maintain a component and sub-system DVP&R; support and keep up to date with component bench testing, simulation, drive-cell and static-cell test results; and perform data analysis and establish proper corrective actions as required. The candidate will also need to lead a virtual team in designing ducts and performing CFD analysis so that they meet airflow performance targets, and will support the pilot build process at the assembly plant for development and installation of HVAC components.

Qualifications

Education

Bachelor of Science (Engineering) from an ABET accredited university

Basic qualifications, experience, and knowledge

  • Bachelor's Degree in Mechanical or Electrical Engineering (or equivalent) from an ABET accredited university.
  • 1-5 years of experience in HVAC and duct design/development. Other related design experience (Engine cooling, Aero-Thermal) also a plus.
  • Strong PC + MS Office skills required. Excellent written, verbal and listening skills along with strong interpersonal skills.
  • Self-starter and motivated to produce results.
  • Experience with Design Failure Mode & Effects Analysis (DFMEA).
  • Experience with Design Verification Plan & Report (DVP&R).
  • Basic thermodynamics knowledge.

Preferred qualifications            

  • Experience in HVAC, PTCs, ducts and outlets is a plus.
  • Experience using CAD software (NX, Catia, etc.)
  • Experience with Design for Six Sigma (DFSS).
  • Basic knowledge in Thermodynamics.
  • Automotive assembly plant experience.

Duration

  • As specified in contract

See more jobs at Segula Technologies

Apply for this job

+30d

Senior Analytics Engineer

Flywire - Spain, Remote
Bachelor's degree, Airflow, SQL, Git, Docker, Python

Flywire is hiring a Remote Senior Analytics Engineer

Job Description

The Opportunity:

We, at Flywire, are seeking a Senior Analytics Engineer to join our Analytics team and further scale our data driven company.

As a Senior Analytics Engineer, you will bridge the gap between Data Engineering and Reporting to enable our end users to make the most of our data. You will handle data from all areas of the company and succeed through the following responsibilities:

  • Own, manage, and further our full dbt environment and related tooling
  • Manage efficient materializations of streaming data to balance efficiency and latency
  • Explore and implement additional tools to support the wider Analytics team
  • Productionalize complex logic and queries into “gold standard” datasets
  • Provide dbt support throughout the company
  • Optimize dbt models for efficiency, clarity, and scalability
  • Remove and reduce technical debt and automate manual operations
  • Treat data as a product and enforce testing and CI to catch errors early (see the CI sketch after this list)
  • Work with and support company-wide data science/AI initiatives
  • Become a subject matter expert to support all business segments
  • Work with engineering and product teams to understand and model our event data
  • Drive change by identifying areas for improvement while analyzing data
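
As a hedged illustration of that testing-and-CI point (not Flywire's actual pipeline), the snippet below is a tiny CI gate built on dbt's programmatic runner; it assumes dbt-core 1.5+ and a hypothetical project directory named "analytics".

```python
# A tiny CI gate: build the dbt project (models + tests) and fail the job on
# any error. Assumes dbt-core >= 1.5; "analytics" is a hypothetical project dir.
import sys

from dbt.cli.main import dbtRunner

result = dbtRunner().invoke(["build", "--project-dir", "analytics"])
if not result.success:
    sys.exit(1)  # non-zero exit fails the CI pipeline before bad data ships
```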

Qualifications

Here’s What We’re Looking For:

  • Bachelor's degree in Economics, Mathematics, or Computer Science
  • 5+ years of experience in an Analytics role
  • Strong proficiency in SQL
  • Strong proficiency with dbt
  • Experience with a cloud data warehouse (BigQuery, Snowflake, Redshift)
  • Experience with git and CI/CD deployments
  • Experience with Python and Docker
  • Experience with Elementary or Great Expectations is preferred
  • Ability to communicate and follow up with internal stakeholders in a timely manner, and excellent attention to detail
  • Experience with BI Tools (Looker or Preset) is a plus

Technologies We Use:

  • GCP/AWS
  • Fivetran/Apache Beam -> BigQuery -> dbt -> Looker/Preset (see the sketch after this list)
  • Airflow (GCP Composer)
  • Flink and Apache Beam
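
To make the Beam -> BigQuery leg concrete, here is a toy Apache Beam pipeline; the project, dataset, table and schema are invented for illustration and are not Flywire's:

```python
# A toy Apache Beam pipeline writing a hard-coded record to BigQuery.
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

with beam.Pipeline(options=PipelineOptions()) as p:
    (
        p
        | "CreateEvents" >> beam.Create([{"payment_id": "p1", "amount_usd": 125.0}])
        | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
            table="my-project:payments.events",  # hypothetical table
            schema="payment_id:STRING,amount_usd:FLOAT",
            create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
            write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
        )
    )
```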

See more jobs at Flywire

Apply for this job

+30d

Senior Software Engineer

LivePerson - Toronto, Canada, Remote
Bachelor's degree, NoSQL, Airflow, SQL, Design, Git, Java, C++, MySQL, Kubernetes, Jenkins, Python, Backend

LivePerson is hiring a Remote Senior Software Engineer

 LivePerson (NASDAQ: LPSN) is the global leader in enterprise conversations. Hundreds of the world’s leading brands — including HSBC, Chipotle, and Virgin Media — use our award-winning Conversational Cloud platform to connect with millions of consumers. We power nearly a billion conversational interactions every month, providing a uniquely rich data set and safety tools to unlock the power of Conversational AI for better customer experiences.  

At LivePerson, we foster an inclusive workplace culture that encourages meaningful connection, collaboration, and innovation. Everyone is invited to ask questions, actively seek new ways to achieve success and reach their full potential. We are continually looking for ways to improve our products and make things better. This means spotting opportunities, solving ambiguities, and seeking effective solutions to the problems our customers care about. 

 

Overview:

The successful candidate has an opportunity to join our AI & Automation team within a fast-paced and successful organization.

 

You will: 

  • Design and develop high-volume, low-latency applications for mission-critical systems, delivering high availability and performance
  • Design REST-based backend services
  • Debug production issues and help maintain existing code
  • Develop technical specifications and documentation
  • Participate in on-call rotations
  • Work with the Bots & Automation team to build our next-generation bot runtime platform

 

 

You have:

  • Bachelor's degree in Computer Science or a related field
  • 7+ years of experience building successful production software systems
  • Solid understanding of Data Structures and Algorithm Design
  • Strong programming skills in Java with good knowledge of multi-threading.
  • Expert-level knowledge of Databases (SQL, NoSQL) like Cassandra, MySQL
  • Experience with Data Processing tools like Kafka, Airflow, Apache Spark, Hadoop
  • Experience building REST APIs & debugging distributed microservice-based applications
  • Experience with Git, Jenkins, and other Development tools
  • Experience integrating with third-party APIs
  • Experience in Kubernetes 
  • Experience with NodeJS & Python is a plus

 

 

Benefits: 

  • Health: medical, dental, vision and wellbeing.
  • Time away: 15 days PTO, public holidays, as well as 5 care days and 10 sick days.
  • Financial: ESPP, Basic life and AD&D insurance, long-term and short-term disability
  • Family: parental leave, maternity support, fertility services.
  • Development: Generous tuition reimbursement and access to internal professional development resources.
  • Additional: Health Service Navigator, Counseling Services & resources to help you and your family maintain overall good health and wellness
  • #LI-Remote

Why you’ll love working here: 

As leaders in enterprise customer conversations, we celebrate diversity, empowering our team to forge impactful conversations globally. LivePerson is a place where uniqueness is embraced, growth is constant, and everyone is empowered to create their own success. And, we're very proud to have earned recognition from Fast Company, Newsweek, and BuiltIn for being a top innovative, beloved, and remote-friendly workplace. 

Belonging at LivePerson:

We are proud to be an equal opportunity employer. All qualified applicants will receive consideration for employment without regard to age, ancestry, color, family or medical care leave, gender identity or expression, genetic information, marital status, medical condition, national origin, physical or mental disability, protected veteran status, race, religion, sex (including pregnancy), sexual orientation, or any other characteristic protected by applicable laws, regulations and ordinances. We also consider qualified applicants with criminal histories, consistent with applicable federal, state, and local law.

We are committed to the accessibility needs of applicants and employees. We provide reasonable accommodations to job applicants with physical or mental disabilities. Applicants with a disability who require reasonable accommodation for any part of the application or hiring process should inform their recruiting contact upon initial connection.



 

Apply for this job

+30d

Staff Data Infrastructure Engineer

Webflow - U.S. Remote
DevOps, S3, Webflow, remote-first, Terraform, Airflow, Design, C++, Docker, AWS, Backend

Webflow is hiring a Remote Staff Data Infrastructure Engineer

At Webflow, our mission is to bring development superpowers to everyone. Webflow is the leading visual development platform for building powerful websites without writing code. By combining modern web development technologies into one platform, Webflow enables people to build websites visually, saving engineering time, while clean code seamlessly generates in the background. From independent designers and creative agencies to Fortune 500 companies, millions worldwide use Webflow to be more nimble, creative, and collaborative. It’s the web, made better. 

We’re looking for a Staff Data Infrastructure Engineer to join our Data Platform team. In this pivotal role, you'll lead efforts to build and manage robust, secure, and scalable infrastructure that powers our data operations. Your responsibilities will include provisioning, deploying, configuring, managing, and scaling key components such as Kafka, Spark, Airflow, and query engines like Athena. You'll work extensively with AWS, containerization technologies, and infrastructure as code tools to ensure our systems run smoothly and reliably. Your expertise will be crucial in integrating these components, providing a solid foundation for our data-driven products. Additionally, you'll have the opportunity to mentor junior engineers and drive best practices across the team. If you're passionate about leveraging cutting-edge technologies to make a real impact, we’d love to connect with you!

About the role:

  • Location: Remote-first (United States; BC & ON, Canada)
  • Full-time
  • Permanent
  • Exempt
  • The cash compensation for this role is tailored to align with the cost of labor in different geographic markets. We've structured the base pay ranges for this role into zones for our geographic markets, and the specific base pay within the range will be determined by the candidate’s geographic location, job-related experience, knowledge, qualifications, and skills.
    • United States  (all figures cited below in USD and pertain to workers in the United States)
      • Zone A: $187,000 - $263,500
      • Zone B: $175,000 - $247,000
      • Zone C: $164,000 - $231,500
    • Canada  (All figures cited below in CAD and pertain to workers in ON & BC, Canada)
      • CAD $212,000 - CAD $299,000
  • Please visit our Careers page for more information on which locations are included in each of our geographic pay zones. However, please confirm the zone for your specific location with your recruiter.
  • Reporting to the Senior Engineering Manager.

As a Staff Data Infrastructure Engineer, you'll

  • Oversee the provisioning and deployment of infrastructure using Pulumi, ensuring seamless deployment of Kafka, Spark, Airflow, Athena, and other critical systems on AWS (a brief sketch follows this list).
  • Design and implement strategies for scaling Airflow, Kafka, and Spark clusters to accommodate increasing workloads and user demands.
  • Lead efforts in optimizing performance, capacity planning, ensuring fault tolerance, and implementing failure recovery strategies across all infrastructure components.
  • Configure and manage VPCs, load balancers, and VPC endpoints for secure communication between internal and external services.
  • Manage IAM roles, apply security patches, plan and execute version upgrades, and ensure compliance with regulations such as GDPR.
  • Architect and implement high-availability solutions across multiple zones and regions, including backups, multi-region replication, and disaster recovery plans.
  • Oversee S3 data lake management, including file size management, compaction, encryption, and compression to maximize storage efficiency.
  • Implement caching strategies, indexing, and query optimization to ensure efficient data retrieval and processing.
  • Implement monitoring and logging using tools like Datadog, CloudWatch and OpenSearch.
  • Lead efforts to develop services, tools and automation to simplify infrastructure complexity for other engineering teams, enabling them to focus on building great products.
  • Participate in all engineering activities including incident response, interviewing, designing and reviewing technical specifications, code review, and releasing new functionality.
  • Mentor, coach, and inspire a team of engineers of various levels.
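
For flavour, here is a minimal Pulumi (Python) sketch of that provisioning style (not Webflow's actual stack), declaring a single versioned S3 data-lake bucket; the resource and output names are hypothetical, and a real program would also declare the Kafka, Spark, Airflow and Athena pieces plus networking and IAM.

```python
# A minimal Pulumi (Python) program declaring a versioned S3 data-lake bucket.
# Names are hypothetical placeholders.
import pulumi
import pulumi_aws as aws

# Versioned bucket backing the S3 data lake.
lake = aws.s3.Bucket("data-lake", versioning={"enabled": True})

# Export the bucket name so dependent stacks and teams can reference it.
pulumi.export("lake_bucket", lake.id)
```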

In addition to the responsibilities outlined above, at Webflow we will support you in identifying where your interests and development opportunities lie and we'll help you incorporate them into your role.

About you:

You'll thrive as a Staff Data Infrastructure Engineer if you have:

  • 8+ years of experience as a Data Infrastructure Engineer or related roles like Platform Engineer, SRE, DevOps or Backend Engineer.
  • Deep expertise in provisioning and managing data infrastructure components like Kafka, Spark, and Airflow.
  • Extensive experience with cloud services and environments (compute, storage, networking, identity management, infrastructure as code, etc.).
  • Strong experience with containerization technologies like Docker and Kubernetes.
  • Advanced knowledge of infrastructure as code tools like Terraform and Pulumi.
  • Strong understanding of networking concepts and configurations, including VPCs, load balancers, and endpoints.
  • Extensive experience with monitoring and logging tools.
  • Strong problem-solving skills and attention to detail.
  • Excellent leadership, communication, and mentoring skills.

You get extra points if you have:

  • AWS certifications (e.g., AWS Certified Solutions Architect, AWS Certified DevOps Engineer).
  • Proven track record of designing and implementing multi-zone and multi-region high availability and disaster recovery strategies.
  • Prior experience managing data lake storage using Apache Iceberg.
  • Knowledge of compliance standards (GDPR, CCPA) and security best practices.

Our Core Behaviors:

  • Obsess over customer experience. We deeply understand what we’re building and who we’re building for and serving. We define the leading edge of what’s possible in our industry and deliver the future for our customers
  • Move with heartfelt urgency. We have a healthy relationship with impatience, channeling it thoughtfully to show up better and faster for our customers and for each other. Time is the most limited thing we have, and we make the most of every moment
  • Say the hard thing with care. Our best work often comes from intelligent debate, critique, and even difficult conversations. We speak our minds and don’t sugarcoat things — and we do so with respect, maturity, and care
  • Make your mark. We seek out new and unique ways to create meaningful impact, and we champion the same from our colleagues. We work as a team to get the job done, and we go out of our way to celebrate and reward those going above and beyond for our customers and our teammates

Benefits & wellness

  • Equity ownership (RSUs) in a growing, privately-owned company.
  • 100% employer-paid healthcare, vision, and dental insurance coverage for employees and dependents (full-time employees working 30+ hours per week), as well as Health Savings Account/Health Reimbursement Account, dependent care Flexible Spending Account (US only), dependent on insurance plan selection where applicable in the respective country of employment; Employees may also have voluntary insurance options, such as life, disability, hospital protection, accident, and critical illness where applicable in the respective country of employment
  • 12 weeks of paid parental leave for both birthing and non-birthing caregivers, as well as an additional 6-8 weeks of pregnancy disability for birthing parents to be used before child bonding leave (where local requirements are more generous employees receive the greater benefit); Employees also have access to family planning care and reimbursement
  • Flexible PTO with a mandatory annual minimum of 10 days paid time off for all locations (where local requirements are more generous employees receive the greater benefit), and sabbatical program
  • Access to mental wellness and professional coaching, therapy, and Employee Assistance Program
  • Monthly stipends to support health and wellness, smart work, and professional growth
  • Professional career coaching, internal learning & development programs
  • 401k plan and pension schemes (in countries where statutorily required) financial wellness benefits, like CPA or financial advisor coverage
  • Discounted Pet Insurance offering (US only)
  • Commuter benefits for in-office employees

Temporary employees are not eligible for paid holiday time off, accrued paid time off, paid leaves of absence, or company-sponsored perks unless otherwise required by law.

Remote, together

At Webflow, equality is a core tenet of our culture. We are an Equal Opportunity (EEO)/Veterans/Disabled Employer and are committed to building an inclusive global team that represents a variety of backgrounds, perspectives, beliefs, and experiences. Employment decisions are made on the basis of job-related criteria without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability, veteran status, or any other classification protected by applicable law. Pursuant to the San Francisco Fair Chance Ordinance, Webflow will consider for employment qualified applicants with arrest and conviction records.

Stay connected

Not ready to apply, but want to be part of the Webflow community? Consider following our story on our Webflow Blog, LinkedIn, X (Twitter), and/or Glassdoor

Please note:

We will ensure that individuals with disabilities are provided reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Upon interview scheduling, instructions for confidential accommodation requests will be administered.

To join Webflow, you'll need a valid right to work authorization depending on the country of employment.

If you are extended an offer, that offer may be contingent upon your successful completion of a background check, which will be conducted in accordance with applicable laws. We may obtain one or more background screening reports about you, solely for employment purposes.

For information about how Webflow processes your personal information, please review Webflow’s Applicant Privacy Notice.

 

See more jobs at Webflow

Apply for this job