airflow Remote Jobs

143 Results

16d

Senior Java Engineer (Cloud Native)

Experian (Heredia, Costa Rica, Remote)
S3, EC2, Lambda, agile, nosql, airflow, sql, Design, mongodb, api, java, postgresql, python, AWS

Experian is hiring a Remote Senior Java Engineer (Cloud Native)

Job Description

You will be involved in projects using modern technologies as part of a senior software engineering team. You will help design and implement product features. This is a technical role requiring excellent coding skills.

You will develop core functionality and processing for a powerful new enterprise-level data platform built in Java using leading mainstream open-source technologies.

  • Hands-on collaboration as a primary member of a software engineering team focused on building event-driven services that deliver secure, efficient solutions on an ambitious timeline.
  • Deliver highly available, scalable data streaming application functionality on an AWS cloud-based platform.
  • Diligently observe and maintain standards for regulatory compliance and information security.
  • Deliver and maintain accurate, complete, and current documentation.
  • Participate in full Agile cycle engagements, including meetings, iterative development, estimations, code reviews, and design sessions.
  • Contribute to team architecture, engineering, and product discussions to ensure the team delivers quality software.
  • Work with the service quality engineering team to ensure that only thoroughly tested code makes it to production.
  • Oversee deliverables from design through production operationalization.
  • Provide engineering support to the customer support team to resolve critical customer issues.
  • You will report to the Senior Software Development Director.

Qualifications

  • 5+ years of software development experience building and testing applications following secure coding practices
  • Experience collaborating as a hands-on team member on a significant commercial software project in Java with the Spring Framework.
  • Proficiency in developing server-side Java applications using mainstream tools including the Spring framework and AWS SDK
  • Experience with event driven architectures using pub/sub message brokers such as Kafka, Kinesis, and NATS.io
  • Current cloud technology experience, preferably AWS (Fargate, EC2, S3, RDS PostgreSQL, Lambda, API Gateway, Airflow)
  • Experience developing web applications with reactive Spring libraries such as WebFlux and Project Reactor, as well as standard Spring Web
  • Proficiency in SQL and NoSQL data access and management on PostgreSQL and MongoDB or AWS DocumentDB.
  • Recent hands-on experience building and supporting commercial systems managing data and transactions including server-side development of Data Flow processes
  • Experience with Continuous Integration/Continuous Delivery (CI/CD) process and practices (CodeCommit, CodeDeploy, CodePipeline/Harness/Jenkins/Github Actions, CLI, BitBucket/Git)
  • Experience with observability technologies including Splunk, Datadog, and CloudWatch
  • Familiarity creating and using Docker/Kubernetes applications

Additional Preferred Experience

  • Proficiency in developing server-side Python using mainstream tools including Pandas, SciPy, PySpark, and Pydantic
  • Experience building systems for financial services or tightly regulated businesses.
  • Security and privacy compliance (GDPR, CCPA, ISO 27001, PCI, HIPAA, etc.) experience is a plus.
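
The event-driven, pub/sub requirement above can be sketched in miniature. This is an illustrative in-process stand-in (in Python for brevity, although the role itself is Java-focused) using only the standard library; a production system would use a real broker such as Kafka, Kinesis, or NATS, and the topic name and payload here are invented.

```python
import queue
import threading

# Minimal in-process pub/sub: each subscriber gets its own queue,
# and every published event is fanned out to all subscribers.
class Broker:
    def __init__(self):
        self._topics = {}  # topic name -> list of subscriber queues
        self._lock = threading.Lock()

    def subscribe(self, topic):
        q = queue.Queue()
        with self._lock:
            self._topics.setdefault(topic, []).append(q)
        return q

    def publish(self, topic, event):
        with self._lock:
            subscribers = list(self._topics.get(topic, []))
        for q in subscribers:
            q.put(event)

broker = Broker()
audit = broker.subscribe("transactions")   # hypothetical topic
fraud = broker.subscribe("transactions")

broker.publish("transactions", {"id": 1, "amount": 42.0})

print(audit.get(timeout=1))  # each subscriber receives its own copy
print(fraud.get(timeout=1))
```

The decoupling shown here is the point of the pattern: publishers never know which services consume an event, so new consumers can be added without touching producer code.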

See more jobs at Experian

Apply for this job

17d

Junior Data Engineer

Accesa - Ratiodata (Employee can work remotely, Romania, Remote)
agile, airflow, postgres, sql, Design, java, python, backend

Accesa - Ratiodata is hiring a Remote Junior Data Engineer

Job Description

One of our clients operates prominently in the financial sector, where we enhance operations across their extensive network of 150,000 workstations and support a workforce of 4,500 employees. As part of our commitment to optimizing data management strategies, we are migrating data warehouse (DWH) models into data products within the Data Integration Platform (DIP). 

Responsibilities: 

  • Drive Data Efficiency: Create and maintain optimal data transformation pipelines. 

  • Master Complex Data Handling: Work with large, complex financial data sets to generate outputs that meet functional and non-functional business requirements. 

  • Lead Innovation and Process Optimization: Identify, design, and implement process improvements such as automating manual processes, optimizing data delivery, and re-designing infrastructure for higher scalability. 

  • Architect Scalable Data Infrastructure: Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using open-source technologies. 

  • Unlock Actionable Insights: Build/use analytics tools that utilize the data pipeline to provide actionable insights into customer acquisition, operational efficiency, and other key business performance metrics. 

  • Collaborate with Cross-Functional Teams: Work with stakeholders, including Senior Management, Department Heads, Product, Data, and Design teams, to assist with data-related technical issues and support their data infrastructure needs. 

Qualifications

Must have: 

  • 1+ years of experience in a similar role, preferably within Agile teams. 

  • Skilled in SQL and relational databases for data manipulation. 

  • Experience in building and optimizing Big Data pipelines and architectures. 

  • Knowledge of Big Data tools such as Spark, and object-oriented programming in Java; experience with Spark using Python is a plus.  

  • Proven experience in performing data analysis and root cause analysis on diverse datasets to identify opportunities for improvement. 

  • Strong analytical skills in working with both structured and unstructured data. 

 

Nice to have: 

  • Experience in engaging with customer stakeholders  

  • Expertise in manipulating and processing large, disconnected datasets to extract actionable insights 

  • Familiarity with innovative technologies in message queuing, stream processing, and scalable big data storage solutions  

  • Technical skills in the following areas are a plus: relational databases (e.g., Postgres), Big Data tools (e.g., Databricks), workflow management (e.g., Airflow), and backend development using Spring Boot.

  • Experience with ETL processes, including scheduling and orchestration using tools like Apache Airflow (or similar). 
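
The ETL work described above can be reduced to a minimal sketch using only the Python standard library. In a scheduler like Apache Airflow, each of these functions would typically become its own task; the sample data, table name, and column names below are invented for illustration.

```python
import csv
import io
import sqlite3

# Toy extract-transform-load: parse CSV text, clean and cast values,
# then load them into a SQLite table (standing in for a warehouse).
RAW = "account,balance\nA-1, 100.50 \nA-2, 25.00 \n"

def extract(raw):
    return list(csv.DictReader(io.StringIO(raw)))

def transform(rows):
    # Strip stray whitespace and cast balances to floats.
    return [(r["account"], float(r["balance"].strip())) for r in rows]

def load(rows, conn):
    conn.execute("CREATE TABLE IF NOT EXISTS balances (account TEXT, balance REAL)")
    conn.executemany("INSERT INTO balances VALUES (?, ?)", rows)

conn = sqlite3.connect(":memory:")
load(transform(extract(RAW)), conn)
total = conn.execute("SELECT SUM(balance) FROM balances").fetchone()[0]
print(total)  # 125.5
```

Keeping extract, transform, and load as separate functions mirrors how orchestration tools let each stage be retried, monitored, and scaled independently.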

Apply for this job

18d

Head of Data

Gemini (Remote, USA)
remote-first, airflow, sql, Design, azure, python, AWS

Gemini is hiring a Remote Head of Data

About the Company

Gemini is a global crypto and Web3 platform founded by Tyler Winklevoss and Cameron Winklevoss in 2014. Gemini offers a wide range of crypto products and services for individuals and institutions in over 70 countries.

Crypto is about giving you greater choice, independence, and opportunity. We are here to help you on your journey. We build crypto products that are simple, elegant, and secure. Whether you are an individual or an institution, we help you buy, sell, and store your bitcoin and cryptocurrency. 

At Gemini, our mission is to unlock the next era of financial, creative, and personal freedom.

In the United States, we have a flexible hybrid work policy for employees who live within 30 miles of our office headquartered in New York City and our office in Seattle. Employees within the New York and Seattle metropolitan areas are expected to work from the designated office twice a week, unless there is a job-specific requirement to be in the office every workday. Employees outside of these areas are considered part of our remote-first workforce. We believe our hybrid approach for those near our NYC and Seattle offices increases productivity through more in-person collaboration where possible.

The Department: Data

The Role: Head of Data

As the leader, you’ll shape the way we approach data at Gemini by creating a strategic vision for how data can help drive our business growth. You will build and manage a high-performance machine learning, data engineering, platform, and analytics team and leverage your experience and communication skills to work across business teams to develop innovative data solutions.  You will inspire and mentor a strong data team through your passion for data and its ability to transform decision-making and generate solutions for this new, exciting asset class. Communicating your insights and driving new product development across the organization is paramount to success. You will also be looked upon to share Gemini’s data vision and products externally.

Responsibilities:

  • Lead team responsible for scaling our data infrastructure and optimizing our warehouse’s performance
  • Lead design, architecture and implementation of best-in-class Data Warehousing and reporting solutions
  • Define and drive the vision for data management, analytics, and data culture
  • Collaborate with executive leadership to integrate data initiatives into business strategies
  • Oversee the design, development, and maintenance of the company’s data platform, ensuring scalability, reliability, and security
  • Lead and participate in design discussions and meetings
  • Oversee the end-to-end management of our data ecosystem, ensuring data integrity, and driving data-driven decision-making across the organization including our products
  • Design, automate, build, and launch scalable, efficient and reliable data pipelines into production
  • Drive the development of advanced analytics, reporting solutions, and dashboards to provide actionable insights to stakeholders
  • Oversee design, development, and maintenance of ETL processes, data warehouses, and data lakes
  • Research new tools and technologies to improve existing processes
  • Implement best practices for data modeling, architecture, and integration across various data sources
  • Develop new systems and tools to enable the teams to consume and understand data more intuitively
  • Partner with engineers, project managers, and analysts to deliver insights to the business
  • Own root cause analysis of production and data issues, including data validation

Minimum Qualifications:

  • 12-20+ years experience in data engineering with data warehouse technologies
  • 5-7+ years experience bringing a data infrastructure to the next level including the overhaul of data, pipelines, architecture and modeling
  • 10+ years experience in custom ETL design, implementation and maintenance
  • 10+ years experience with schema design and dimensional data modeling
  • Experience building real-time data solutions and processes
  • Experience building and integrating web analytics solutions
  • Experience with AI/ML tools and techniques, including experience deploying machine learning models in production
  • Strong knowledge of traditional and modern data tools and technologies, including SQL, Python, cloud platforms (AWS, Azure, GCP), and big data frameworks
  • Strong understanding of data governance, compliance, and security best practices
  • Experience with one or more MPP databases (Redshift, BigQuery, Snowflake, etc.)
  • Experience with one or more ETL tools (Airflow, AWS Glue, Informatica, Pentaho, SSIS, Alooma, etc.)
  • Experience building cross-functional teams across different departments
  • Exceptional leadership, communication, and stakeholder management skills
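
The schema design and dimensional data modeling requirements above boil down to structures like a star schema: a central fact table keyed to descriptive dimension tables. The sketch below shows the idea with SQLite; the table and column names are illustrative, not from the posting.

```python
import sqlite3

# A toy star schema: fact_trade references two dimension tables,
# and analytics queries join facts to dimensions and aggregate.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE dim_asset (asset_id INTEGER PRIMARY KEY, symbol TEXT);
CREATE TABLE dim_date  (date_id  INTEGER PRIMARY KEY, day TEXT);
CREATE TABLE fact_trade (
    asset_id INTEGER REFERENCES dim_asset(asset_id),
    date_id  INTEGER REFERENCES dim_date(date_id),
    quantity REAL
);
INSERT INTO dim_asset VALUES (1, 'BTC'), (2, 'ETH');
INSERT INTO dim_date  VALUES (10, '2024-01-01');
INSERT INTO fact_trade VALUES (1, 10, 0.5), (2, 10, 3.0), (1, 10, 0.25);
""")

# A typical analytical query: aggregate facts grouped by a dimension.
rows = conn.execute("""
    SELECT a.symbol, SUM(f.quantity)
    FROM fact_trade f JOIN dim_asset a USING (asset_id)
    GROUP BY a.symbol ORDER BY a.symbol
""").fetchall()
print(rows)  # [('BTC', 0.75), ('ETH', 3.0)]
```

Separating slowly-changing descriptive attributes (dimensions) from high-volume events (facts) is what keeps such warehouses both compact and fast to query.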

Preferred Qualifications:

  • Experience in a fast-paced, high-growth environment, particularly in tech or a data-driven industry
  • Hands-on experience with AI/ML frameworks like TensorFlow, PyTorch, or similar

It Pays to Work Here
 
The compensation & benefits package for this role includes:
  • Competitive starting salary
  • A discretionary annual bonus
  • Long-term incentive in the form of a new hire equity grant
  • Comprehensive health plans
  • 401K with company matching
  • Paid Parental Leave
  • Flexible time off

Salary Range: The base salary range for this role is between $269,000 and $336,000 in the State of New York, the State of California and the State of Washington. This range is not inclusive of our discretionary bonus or equity package. When determining a candidate’s compensation, we consider a number of factors including skillset, experience, job scope, and current market data.

At Gemini, we strive to build diverse teams that reflect the people we want to empower through our products, and we are committed to equal employment opportunity regardless of race, color, ancestry, religion, sex, national origin, sexual orientation, age, citizenship, marital status, disability, gender identity, or Veteran status. Equal Opportunity is the Law, and Gemini is proud to be an equal opportunity workplace. If you have a specific need that requires accommodation, please let a member of the People Team know.

#LI-SM1

Apply for this job

20d

Data Engineer (Stage Janvier 2025) H/F

Showroomprive.com (Saint-Denis, France, Remote)
airflow, sql, c++

Showroomprive.com is hiring a Remote Data Engineer (Stage Janvier 2025) H/F

Job Description

At the heart of Showroomprive's Data division, you will join the "Data Engineering" team.
Your missions will focus on extracting, processing, and storing data through the maintenance and evolution of a data warehouse used by the rest of the Data teams (BI, Data Science, Marketing Analysts). 

 

Your missions will be split into two parts: 

  • A main project to carry out end to end around data: its processing, its quality control, and its accessibility. 
  • The team's day-to-day tasks (developing new data flows, business data exports, ad hoc queries, access management, etc.). 

To carry out these missions, our team uses market-leading data processing tools, with Airflow for pipeline orchestration and a leading cloud platform. 

You will join a Data Engineering team of 3 people who will support you day to day, within a wider Data department of 30 people with diverse and deep expertise in their fields. 

Qualifications

You are completing a Master's-level degree (Bac+5) at an engineering school, in a track related to Data or Software Engineering. 

Through your studies or previous experience, you have built solid foundations in SQL and Python. You have also developed a real appetite for learning on your own and are very curious when it comes to data. 

Your rigor and drive will be key assets for succeeding in the missions entrusted to you. 

See more jobs at Showroomprive.com

Apply for this job

23d

Senior Data Engineer - Pacific or Central Time Only

Experian (Costa Mesa, CA, Remote)
S3, 2 years of experience, agile, 5 years of experience, 3 years of experience, tableau, airflow, sql, api, python, AWS

Experian is hiring a Remote Senior Data Engineer - Pacific or Central Time Only

Job Description

The Senior Data Engineer reports to the Data Engineer Manager and designs, develops and supports ETL data pipeline solutions in the AWS environment.

  • You will help build a semantic layer by developing ETL and virtualized views.
  • Collaborate with engineering teams to discover and use new data that is being introduced into the environment.
  • Work as part of a team to build and support a data warehouse and implement solutions using Python to process structured and unstructured data.
  • Support existing ETL processes written in SQL, troubleshoot and resolve production issues.
  • You will create report specifications and process documentation for the required data deliverables.
  • Be a liaison between business and technical teams to achieve project goals, delivering cross-functional reporting solutions.
  • Troubleshoot and resolve data, system, and performance issues.
  • Communicate with partners, other technical teams, and management to collect requirements, articulate data deliverables, and provide technical designs.
  • Provide engineering support to the customer support team to resolve critical customer issues in an Agile environment.

Qualifications

  • Experience communicating updates and resolutions to customers and other partners; as a Data Engineer, you will collaborate with partners and technical teams.
  • Minimum 5 years of experience as an ETL Data Engineer, with intermediate knowledge of SQL and data
  • Experience approaching a problem from different angles and analyzing the pros and cons of different solutions
  • Minimum 5 years of experience in Python scripting
  • Minimum 2 years of experience with AWS data ecosystem (Redshift, EMR, S3, MWAA, etc.)
  • Minimum 3 years of experience working in an Agile environment.
  • Experience with Tableau is a plus.
  • Experience with DBT.
  • Hands-on experience with Apache Airflow or equivalent tools (AWS MWAA) for the orchestration of data pipelines.
  • Hands-on experience working and building with Python API-based data pipelines.
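
The Airflow orchestration experience called for above rests on one core idea: tasks form a directed acyclic graph and must run in dependency order. The standard-library sketch below shows that idea without the Airflow API itself; the task names are hypothetical, not Airflow calls.

```python
from graphlib import TopologicalSorter

# Each key lists the tasks it depends on (which must run first),
# the same shape an orchestrator's DAG ultimately reduces to.
dag = {
    "extract": set(),
    "transform": {"extract"},
    "validate": {"transform"},
    "load": {"validate"},
    "report": {"load"},
}

order = list(TopologicalSorter(dag).static_order())
print(order)  # dependencies always precede dependents
```

Airflow adds scheduling, retries, and monitoring on top, but a valid topological order like this is what its scheduler computes before running anything.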

See more jobs at Experian

Apply for this job

27d

Data Driven | Data Engineer

Devoteam (Lisboa, Portugal, Remote)
Master’s Degree, 3 years of experience, airflow, sql, azure, python, AWS

Devoteam is hiring a Remote Data Driven | Data Engineer

Job Description

We are currently looking for a Data Engineer to work with us.

Qualifications

  • Bachelor’s or Master’s degree in IT or equivalent;
  • At least 3 years of experience as a Data Engineer;
  • High level of experience with the following programming languages: Python and SQL;
  • Working experience with AWS or Azure;
  • Proficient Level of English (spoken and written);
  • Good communication skills;
  • Knowledge in Airflow will be a plus.

 

See more jobs at Devoteam

Apply for this job

27d

Data Driven | Python Developer

Devoteam (Lisboa, Portugal, Remote)
Master’s Degree, 3 years of experience, airflow, sql, azure, python, AWS

Devoteam is hiring a Remote Data Driven | Python Developer

Job Description

We are currently looking for a Data Engineer to work with us.

Qualifications

  • Bachelor’s or Master’s degree in IT or equivalent;
  • At least 3 years of experience as a Data Engineer;
  • High level of experience with the following programming languages: Python and SQL;
  • Working experience with AWS or Azure;
  • Proficient Level of English (spoken and written);
  • Good communication skills;
  • Knowledge in Airflow will be a plus.

 

See more jobs at Devoteam

Apply for this job

28d

Lead, Data Engineer (Client Deployment) (United States)

DemystData (United States, Remote)
remote-first, airflow, Design

DemystData is hiring a Remote Lead, Data Engineer (Client Deployment) (United States)

OUR SOLUTION

At Demyst, we're transforming the way enterprises manage data, eliminating key challenges and driving significant improvements in business outcomes through data workflow automation. Due to growing demand, we're expanding our team and seeking talented individuals to help us scale.

Our platform simplifies workflows, eliminating the need for complicated platforms and expensive consultants. With top-tier security and global reach, we're helping businesses in banking and insurance achieve digital transformation. If you're passionate about data and affecting change, Demyst is the place for you.

THE CHALLENGE

Demyst is seeking a Lead Engineer with a strong data engineering focus to play a pivotal role in delivering our next-generation data platform to leading enterprises across North America. In this role, you will lead a team of data engineers with a primary focus on data integration and solution deployment. You will oversee the development and management of data pipelines, ensuring they are robust, scalable, and reliable. This is an ideal opportunity for a hands-on data engineering leader to apply technical, leadership, and problem-solving skills to deliver high-quality solutions for our clients.

Your role will involve not only technical leadership and mentoring but also actively contributing to coding, architectural decisions, and data engineering strategy. You will guide your team through complex client deployments, from planning to execution, ensuring that data solutions are effectively integrated and aligned with client goals.

Demyst is a remote-first company. The candidate must be based in the United States.

RESPONSIBILITIES

  • Lead the configuration, deployment, and maintenance of data solutions on the Demyst platform to support client use cases.
  • Supervise and mentor the local and distributed data engineering team, ensuring best practices in data architecture, pipeline development, and deployment.
  • Recruit, train, and evaluate technical talent, fostering a high-performing, collaborative team culture.
  • Contribute hands-on to coding, code reviews, and technical decision-making, ensuring scalability and performance.
  • Design, build, and optimize data pipelines, leveraging tools like Apache Airflow, to automate workflows and manage large datasets effectively.
  • Work closely with clients to advise on data engineering best practices, including data cleansing, transformation, and storage strategies.
  • Implement solutions for data ingestion from various sources, ensuring the consistency, accuracy, and availability of data.
  • Lead critical client projects, managing engineering resources, project timelines, and client engagement.
  • Provide technical guidance and support for complex enterprise data integrations with third-party systems (e.g., AI platforms, data providers, decision engines).
  • Ensure compliance with data governance and security protocols when handling sensitive client data.
  • Develop and maintain documentation for solutions and business processes related to data engineering workflows.
  • Other duties as required.
QUALIFICATIONS

  • Bachelor's degree or higher in Computer Science, Data Engineering, or related fields. Equivalent work experience is also highly valued.
  • 5-10 years of experience in data engineering, software engineering, or client deployment roles, with at least 3 years in a leadership capacity.
  • Strong leadership skills, including the ability to mentor and motivate a team, lead through change, and drive outcomes.
  • Expertise in designing, building, and optimizing ETL/ELT data pipelines using Python, JavaScript, Golang, Scala, or similar languages.
  • Experience in managing large-scale data processing environments, including Databricks and Spark.
  • Proven experience with Apache Airflow to orchestrate data pipelines and manage workflow automation.
  • Deep knowledge of cloud services, particularly AWS (EC2/ECS, Lambda, S3), and their role in data engineering.
  • Hands-on experience with both SQL and NoSQL databases, with a deep understanding of data modeling and architecture.
  • Strong ability to collaborate with clients and cross-functional teams, delivering technical solutions that meet business needs.
  • Proven experience in unit testing, integration testing, and engineering best practices to ensure high-quality code.
  • Familiarity with agile project management tools (JIRA, Confluence, etc.) and methodologies.
  • Experience with data visualization and analytics tools such as Jupyter Lab, Metabase, Tableau.
  • Strong communicator and problem solver, comfortable working in distributed teams.
BENEFITS

  • Operate at the forefront of data management innovation, and work with the largest industry players in an emerging field that is fueling growth and technological advancement globally
  • Have an outsized impact in a rapidly growing team, offering real autonomy and responsibility for client outcomes
  • Stretch yourself to help define and support something entirely new
  • Distributed team and culture, with fully flexible working hours and location
  • Collaborative, inclusive, and dynamic culture
  • Generous benefits and compensation plans
  • ESOP awards available for tenured staff
  • Join an established, and scaling data technology business

Demyst is committed to creating a diverse, rewarding career environment and is proud to be an equal opportunity employer. We strongly encourage individuals from all walks of life to apply.

See more jobs at DemystData

Apply for this job

29d

Lead Data Analyst, Product

tableau, airflow, sql, Design, c++, python

hims & hers is hiring a Remote Lead Data Analyst, Product

Hims & Hers Health, Inc. (better known as Hims & Hers) is the leading health and wellness platform, on a mission to help the world feel great through the power of better health. We are revolutionizing telehealth for providers and their patients alike. Making personalized solutions accessible is of paramount importance to Hims & Hers and we are focused on continued innovation in this space. Hims & Hers offers nonprescription products and access to highly personalized prescription solutions for a variety of conditions related to mental health, sexual health, hair care, skincare, heart health, and more.

Hims & Hers is a public company, traded on the NYSE under the ticker symbol “HIMS”. To learn more about the brand and offerings, you can visit hims.com and forhers.com, or visit our investor site. For information on the company’s outstanding benefits, culture, and its talent-first flexible/remote work approach, see below and visit www.hims.com/careers-professionals.

​​About the Role:

As the Lead Data Analyst of Product Analytics, you and your team will shape the customer experience through high-quality experimental design and hypothesis testing. You will work cross-functionally with product managers, growth leads, designers, and engineers in a fast-paced collaborative environment. Your knowledge of A/B testing and digital analytics combined with your background in experimental design will allow Hims and Hers to build best-in-class customer experiences. This position will report to the Senior Manager of Product Analytics.

You Will:

  • Design experiments and provide actionable and scalable recommendations from the results
  • Deliver in-depth analyses that are statistically sound and easily understood by non-technical audiences
  • Work with your team to curate the experimentation roadmap for the product and growth teams
  • Enable data self-service by designing templates that are easy to understand using relevant KPIs
  • Collaborate across analytics, engineering, and growth teams to improve the customer experience
  • Distill your knowledge of tests into playbooks that can be implemented and utilized to help us transform our digital experience
  • Identify causal relationships in our data using advanced statistical modeling
  • Segment users based on demographic, behavioral, and psychographic attributes to tailor product experiences and lifecycle communications
  • Align analytics initiatives with broad business objectives to build long-term value
  • Conduct deep-dive analyses to answer specific business questions and provide actionable recommendations to product and growth team

You Have:

  • 8+ years of analytics experience
  • 5+ years of experience in A/B testing
  • Experience working with subscription metrics
  • A strong work ethic and the drive to learn more and understand a problem in detail
  • Strong organizational skills with an aptitude to manage long-term projects from end to end
  • Expert SQL skills
  • Extensive experience working with data engineering teams and production data pipelines
  • Experience programming in Python, SAS, or R 
  • Experience in data modeling and statistics with a strong knowledge of experimental design and statistical inference 
  • Development and training of predictive models
  • Advanced knowledge of data visualization and BI in Looker or Tableau
  • Ability to explain technical analyses to a non-technical audience
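
The A/B testing and statistical-inference skills listed above usually come down to tests like the two-proportion z-test sketched below, using only the standard library. The conversion counts are invented for illustration.

```python
from math import sqrt, erf

# Two-proportion z-test: did variant B convert better than A?
def two_proportion_z(conv_a, n_a, conv_b, n_b):
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal CDF.
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Hypothetical experiment: 10% vs 13% conversion on 2,000 users each.
z, p = two_proportion_z(conv_a=200, n_a=2000, conv_b=260, n_b=2000)
print(round(z, 2), round(p, 4))  # z is about 2.97; p is well below 0.05
```

In practice an analyst would also fix the sample size in advance and correct for multiple comparisons, but this is the inference step behind a basic test readout.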

A Big Plus If You Have:

  • Advanced degree in Statistics, Mathematics, or a related field
  • Experience with price testing and modeling price elasticity
  • Experience with telehealth concepts
  • Project management experience 
  • DBT, airflow, and Databricks experience

Our Benefits (there are more but here are some highlights):

  • Competitive salary & equity compensation for full-time roles
  • Unlimited PTO, company holidays, and quarterly mental health days
  • Comprehensive health benefits including medical, dental & vision, and parental leave
  • Employee Stock Purchase Program (ESPP)
  • Employee discounts on hims & hers & Apostrophe online products
  • 401k benefits with employer matching contribution
  • Offsite team retreats

#LI-Remote

Outlined below is a reasonable estimate of H&H’s compensation range for this role for US-based candidates. If you're based outside of the US, your recruiter will be able to provide you with an estimated salary range for your location.

The actual amount will take into account a range of factors that are considered in making compensation decisions, including but not limited to skill sets, experience and training, licensure and certifications, and location. H&H also offers a comprehensive Total Rewards package that may include an equity grant.

Consult with your Recruiter during any potential screening to determine a more targeted range based on location and job-related factors.

An estimate of the current salary range is
$160,000 - $190,000 USD

We are focused on building a diverse and inclusive workforce. If you’re excited about this role, but do not meet 100% of the qualifications listed above, we encourage you to apply.

Hims considers all qualified applicants for employment, including applicants with arrest or conviction records, in accordance with the San Francisco Fair Chance Ordinance, the Los Angeles County Fair Chance Ordinance, the California Fair Chance Act, and any similar state or local fair chance laws.

Hims & Hers is committed to providing reasonable accommodations for qualified individuals with disabilities and disabled veterans in our job application procedures. If you need assistance or an accommodation due to a disability, please contact us at accommodations@forhims.com and describe the needed accommodation. Your privacy is important to us, and any information you share will only be used for the legitimate purpose of considering your request for accommodation. Hims & Hers gives consideration to all qualified applicants without regard to any protected status, including disability. Please do not send resumes to this email address.

For our California-based applicants – Please see our California Employment Candidate Privacy Policy to learn more about how we collect, use, retain, and disclose Personal Information. 

See more jobs at hims & hers

Apply for this job

29d

Senior Analytics Engineer, MarTech

CLEAR - Corporate (New York, New York, United States, Hybrid)
tableau, airflow, sql, Design, jenkins, python, AWS

CLEAR - Corporate is hiring a Remote Senior Analytics Engineer, MarTech

Today, CLEAR is well-known as a leader in digital and biometric identification, reducing friction for our members wherever an ID check is needed. We’re looking for an experienced Senior Analytics Engineer to help us build the next generation of products, which will go beyond just ID and enable our members to leverage the power of a networked digital identity. As a Senior Analytics Engineer at CLEAR, you will participate in the design and implementation of our MarTech products, leveraging your expertise to drive technical innovation and ensure seamless integration of marketing technologies.


A brief highlight of our tech stack:

  • SQL / Python / Looker / Snowflake / Airflow / Databricks / Spark / dbt

What you'll do:

  • Build a scalable data system in which Analysts and Engineers can self-service changes in an automated, tested, secure, and high-quality manner 
  • Build processes supporting data transformation, data structures, metadata, dependency and workload management
  • Develop and maintain data pipelines to collect, clean, and transform data, owning the end-to-end data product from ingestion to visualization
  • Develop and implement data analytics models
  • Partner with product and other stakeholders to uncover requirements, to innovate, and to solve complex problems
  • Have a strong sense of ownership, responsible for architectural decision-making and striving for continuous improvement in technology and processes at CLEAR
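As a rough sketch of the transformation work described above (pure Python, with hypothetical field names rather than CLEAR's actual schema), a task that cleans raw signup events before they reach the warehouse might look like:

```python
from datetime import datetime

def transform_signups(raw_rows):
    """Clean raw signup events: drop malformed rows, normalize types,
    deduplicate on user_id, and derive a signup_date column."""
    seen, out = set(), []
    for row in raw_rows:
        user_id = row.get("user_id")
        ts = row.get("signed_up_at")
        if not user_id or not ts or user_id in seen:
            continue  # skip malformed or duplicate records
        seen.add(user_id)
        out.append({
            "user_id": str(user_id),
            "signup_date": datetime.fromisoformat(ts).date().isoformat(),
            "channel": (row.get("channel") or "unknown").lower(),
        })
    return out

raw = [
    {"user_id": 1, "signed_up_at": "2024-05-01T12:30:00", "channel": "Email"},
    {"user_id": 1, "signed_up_at": "2024-05-01T12:31:00", "channel": "Email"},  # duplicate
    {"user_id": None, "signed_up_at": "2024-05-02T09:00:00"},                   # malformed
    {"user_id": 2, "signed_up_at": "2024-05-03T08:15:00"},
]
clean = transform_signups(raw)
```

In a stack like the one above, a step of this shape would typically live as a dbt model or an Airflow task rather than a standalone script.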

 What you're great at:

  • 6+ years of data engineering experience
  • Working with cloud-based application development, fluent in at least a few of: 
    • Cloud services providers like AWS
    • Data pipeline orchestration tools like Airflow, Dagster, Luigi, etc
    • Big data tools like Spark, Kafka, Snowflake, Databricks, etc
    • Collaboration, integration, and deployment tools like Github, Argo, and Jenkins 
    • Data visualization tools like Looker, Tableau, etc
  • Articulating technical concepts to a mixed audience of technical and non-technical stakeholders
  • Collaborating and mentoring less experienced members of the team
  • Comfort with ambiguity 
  • Curiosity about technology, a belief in constant learning, and the autonomy to figure out what's important

How You'll be Rewarded:

At CLEAR we help YOU move forward - because when you’re at your best, we’re at our best. You’ll work with talented team members who are motivated by our mission of making experiences safer and easier. Our hybrid work environment provides flexibility. In our offices, you’ll enjoy benefits like meals and snacks. We invest in your well-being and learning & development with our stipend and reimbursement programs. 

We offer holistic total rewards, including comprehensive healthcare plans, family building benefits (fertility and adoption/surrogacy support), flexible time off, free OneMedical memberships for you and your dependents, and a 401(k) retirement plan with employer match. The base salary range for this role is $175,000 - $215,000, depending on levels of skills and experience.

The base salary range represents the low and high end of CLEAR’s salary range for this position. Salaries will vary depending on various factors which include, but are not limited to location, education, skills, experience and performance. The range listed is just one component of CLEAR’s total compensation package for employees and other rewards may include annual bonuses, commission, Restricted Stock Units.

About CLEAR

Have you ever had that green-light feeling? When you hit every green light and the day just feels like magic. CLEAR's mission is to create frictionless experiences where every day has that feeling. With more than 25 million passionate members and hundreds of partners around the world, CLEAR’s identity platform is transforming the way people live, work, and travel. Whether it’s at the airport, stadium, or right on your phone, CLEAR connects you to the things that make you, you - unlocking easier, more secure, and more seamless experiences - making them all feel like magic.

CLEAR provides reasonable accommodation to qualified individuals with disabilities or protected needs. Please let us know if you require a reasonable accommodation to apply for a job or perform your job. Examples of reasonable accommodation include, but are not limited to, time off, extra breaks, making a change to the application process or work procedures, policy exceptions, providing documents in an alternative format, live captioning or using a sign language interpreter, or using specialized equipment.

See more jobs at CLEAR - Corporate

Apply for this job

29d

Engineering Manager, Data Platform

GrammarlySan Francisco; Hybrid
MLremote-firstairflowDesignazurec++AWS

Grammarly is hiring a Remote Engineering Manager, Data Platform

Grammarly is excited to offer a remote-first hybrid working model. Grammarly team members in this role must be based in San Francisco. They must meet in person for collaboration weeks, traveling if necessary to the hub(s) where their team is based.

This flexible approach gives team members the best of both worlds: plenty of focus time along with in-person collaboration that fosters trust and unlocks creativity.

About Grammarly

Grammarly is the world’s leading AI writing assistance company trusted by over 30 million people and 70,000 teams. From instantly creating a first draft to perfecting every message, Grammarly helps people at 96% of the Fortune 500 and teams at companies like Atlassian, Databricks, and Zoom get their point across—and get results—with best-in-class security practices that keep data private and protected. Founded in 2009, Grammarly is No. 14 on the Forbes Cloud 100, one of TIME’s 100 Most Influential Companies, one of Fast Company’s Most Innovative Companies in AI, and one of Inc.’s Best Workplaces.

The Opportunity

To achieve our ambitious goals, we’re looking for an Engineering Manager to join our Data Platform team and help us build a world-class data platform. Grammarly’s success depends on its ability to efficiently ingest over 60 billion daily events while using our systems to improve our product. This role is a unique opportunity to experience all aspects of building complex software systems: contributing to the strategy, defining the architecture, and building and shipping to production.

Grammarly’s engineers and researchers have the freedom to innovate and uncover breakthroughs—and, in turn, influence our product roadmap. The complexity of our technical challenges is growing rapidly as we scale our interfaces, algorithms, and infrastructure. You can hear more from our team on our technical blog.

We are seeking a highly skilled and experienced Manager for our Data Platform team to achieve our ambitious objectives. This role is crucial in managing and evolving our data infrastructure, engineering, and governance processes to support modern machine learning (ML) use cases, self-serve analytics, and data policy management across the organization. The ideal candidate will possess strong technical expertise, exceptional leadership abilities, and the capability to mentor and develop a high-performing team that operates across data infrastructure, engineering, and governance.

This person will be integral to the larger data organization, reporting directly to the Director of Data Platform. They will have the opportunity to influence decisions and the direction of our overall data platform, including data processing, infrastructure, data governance, and analytics engineering.

As the Data Platform team manager, you will lead and mentor a team of data engineers, infrastructure engineers, and data governance specialists, fostering a collaborative and innovative environment focused on professional growth. You will oversee the design, implementation, and maintenance of secure, scalable, and optimized data platforms, ensuring high performance and reliability. Your role includes developing and executing strategic roadmaps aligned with business objectives and collaborating closely with cross-functional teams and the larger data organization to ensure seamless data integration, governance, and access. Additionally, you will provide technical leadership and play a pivotal role in resource management and recruiting efforts, driving the team’s success and aligning with the organization’s long-term data strategy.

In this role, you will:

  • Build a highly specialized data platform team to support the growing needs and complexity of our product, business, and ML organizations.
  • Oversee the design, implementation, and maintenance of a robust data infrastructure, ensuring high availability and reliability across ingestion, processing, and storage layers.
  • Lead the development of frameworks and tooling that enable self-serve analytics, policy management, and seamless data governance across the organization.
  • Ensure data is collected, transformed, and stored efficiently to support real-time, batch processing, and machine learning needs.
  • Act as a liaison between the Data Platform team and the broader organization, ensuring seamless communication, collaboration, and alignment with global data strategies.
  • Drive cross-functional meetings and initiatives to represent the Data Platform team’s interests and contribute to the organization’s overall data strategy, ensuring ML and analytics use cases are adequately supported.
  • Drive the evaluation, selection, and implementation of new technologies and tools that enhance the team’s capabilities and improve the organization’s overall data infrastructure and governance processes.
  • Implement and enforce data governance policies and practices to ensure data quality, privacy, security, and compliance with organizational standards.
  • Collaborate with stakeholders to define and refine data governance policies that align with business objectives and facilitate discoverability and accessibility of high-quality data.
  • Monitor and assess the data platform's performance to identify areas for optimization, cost management, and continuous improvement.
  • Foster a collaborative and high-performance culture within the team, emphasizing ownership and innovation.
  • Cultivate an ownership mindset and culture across product and platform teams by providing necessary metrics to drive informed decisions and continuous improvement.
  • Set high performance and quality standards, coach team members to meet them, and mentor and grow junior and senior IC talent.
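At the ingestion scale described above, a governance-minded data platform typically validates events against a schema before they reach the warehouse, routing failures to a dead-letter store. A minimal sketch of that gate (the event schema and field names are hypothetical, not Grammarly's):

```python
# Required fields and their accepted types for an incoming event.
REQUIRED = {"event_name": str, "user_id": str, "ts": (int, float)}

def validate(event):
    """Return True if the event has every required field with the right type."""
    return all(
        isinstance(event.get(field), types) for field, types in REQUIRED.items()
    )

def ingest(events):
    """Split a batch into accepted events and a dead-letter list."""
    accepted, dead_letter = [], []
    for e in events:
        (accepted if validate(e) else dead_letter).append(e)
    return accepted, dead_letter

ok, bad = ingest([
    {"event_name": "doc_opened", "user_id": "u1", "ts": 1718000000},
    {"event_name": "doc_opened", "user_id": 42, "ts": 1718000001},  # wrong type
])
```

In practice this kind of check would sit in a Kafka consumer or Spark job rather than plain Python, but the contract is the same: no event reaches analytics without passing the schema gate.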

Qualifications:

  • 7+ years of experience in data engineering, infrastructure & governance, with at least 2-3 years in a leadership or management role.
  • Proven experience in building and managing large-scale data platforms, including data ingestion pipelines and infrastructure.
  • Experience with cloud platforms and data ecosystems such as AWS, GCP, Azure, and Databricks.
  • Familiarity with modern data engineering and orchestration tools and frameworks (e.g., Apache Kafka, Airflow, DBT, Spark).
  • Strong understanding of data governance frameworks, policy management, and self-serve analytics platforms.
  • Excellent leadership and people management skills, with a track record of mentoring and developing high-performing teams.
  • Experience working with geographically distributed teams and aligning with global data and governance strategies.
  • Strong problem-solving skills, with the ability to navigate and resolve complex technical challenges.
  • Excellent communication and collaboration skills, with the ability to work effectively with stakeholders across different locations and time zones.
  • Proven ability to operate in a fast-paced, dynamic environment where things change quickly.
  • Leads by setting well-understood goals and sharing the appropriate level of context for maximum autonomy, but is also profoundly technical and can dive in to help when necessary.
  • Embodies our EAGER values—ethical, adaptable, gritty, empathetic, and remarkable.
  • Is inspired by our MOVE principles: move fast and learn faster; obsess about creating customer value; value impact over activity; and embrace healthy disagreement rooted in trust.
  • Willingness to meet in person for scheduled team collaboration weeks and travel, if necessary, to the hub where the team is based.

Compensation and Benefits

Grammarly offers all team members competitive pay along with a benefits package encompassing the following and more: 

  • Excellent health care (including a wide range of medical, dental, vision, mental health, and fertility benefits)
  • Disability and life insurance options
  • 401(k) and RRSP matching 
  • Paid parental leave
  • 20 days of paid time off per year, 12 days of paid holidays per year, two floating holidays per year, and flexible sick time
  • Generous stipends (including those for caregiving, pet care, wellness, your home office, and more)
  • Annual professional development budget and opportunities

Grammarly takes a market-based approach to compensation, which means base pay may vary depending on your location. Our US locations are categorized into two compensation zones based on proximity to our hub locations.

Base pay may vary considerably depending on job-related knowledge, skills, and experience. The expected salary ranges for this position are outlined below by compensation zone and may be modified in the future. 

San Francisco: 
Zone 1: $285,000 – $325,000/year (USD)

For more information about our compensation zones and locations where we currently support employment, please refer to this page. If a location of interest is not listed, please speak with a recruiter for additional information. 

We encourage you to apply

At Grammarly, we value our differences, and we encourage all to apply. Grammarly is an equal opportunity company. We do not discriminate on the basis of race or ethnic origin, religion or belief, gender, disability, sexual identity, or age.

For more details about the personal data Grammarly collects during the recruitment process, for what purposes, and how you can address your rights, please see the Grammarly Data Privacy Notice for Candidates here

#LI-Hybrid

 

Apply for this job

+30d

Senior Principal Architect - Cloud Engineering

IFSBengaluru, India, Remote
gRPCgolangagileairfloworacleDesignmobileazuregraphqljavac++.netdockerpostgresqlkubernetesangularjenkinspythonjavascript

IFS is hiring a Remote Senior Principal Architect - Cloud Engineering

Job Description

The Senior Principal Architect (“SPA”) will own the overall architecture accountability for one or more portfolios within IFS Technology. The role of the SPA is to build and develop the technology strategy, while growing, leading, and energising multi-faceted technical teams to design and deliver technical solutions that meet IFS technology needs and are supported by excellent data, methodology, systems and processes. The role will work with a broad set of stakeholders including product managers, engineers, and various R&D and business leaders. The occupant of this role diagnoses and solves significant, complex and non-routine problems; translates practices from other markets, countries and industries; provides authoritative, technical recommendations which have a significant impact on business performance in the short and medium term; and contributes to company standards and procedures, including the IFS Technical Reference Architecture. This role actively identifies new approaches that enhance and, where possible, simplify complexities in the IFS suite. The SPA represents IFS as the authority in one or more technology areas or portfolios and acts as a role model to develop experts within this area.

What is the role?

  • Build, nurture and grow high performance engineering teams using Agile Engineering principles.
  • Provide technical leadership for the design and development of software meeting functional and non-functional requirements.
  • Provide multi-horizon technology thinking to broad portfolios and platforms in line with desired business needs.
  • Adopt a hands-on approach to develop the architecture runway for teams.
  • Set the technical agenda in close collaboration with the Product and Program Managers.
  • Ensure maintainability, security and performance in software components developed using well-established engineering/architectural principles.
  • Ensure software quality in line with shift-left quality principles.
  • Conduct peer reviews & provide feedback ensuring quality standards.
  • Engage with requirement owners and liaise with other stakeholders.
  • Contribute to improvements in IFS products & services.

Qualifications

What do we need from you? 

It’s your excellent influencing and communication skills that will really make the difference. Entrepreneurship and resilience will be required, to help drive and shape the technology strategy. You will need technical, operational, and commercial breadth to deliver a strategic technical vision alongside a robust, secure and cost-effective delivery platform and operational model.

  • Seasoned Leader with 15+ years of hands-on experience in Design, Development and Implementation of scalable cloud-based web and mobile applications.
  • Have strong software architectural, technical design and programming skills.
  • Experience in Application Security, Scalability and Performance.
  • Ability to envision the big picture and work on details. 
  • Can articulate technology vision and delivery strategy in a way that is understandable to technical and non-technical audiences.
  • Willingness to learn and adapt to different technologies/work environments.
  • Knowledge of and skilled in various tools, languages, frameworks and cloud technologies with the ability to be hands-on where needed:
    • Programming languages - C++, C#, GoLang, Python, JavaScript and Java
    • JavaScript frameworks - Angular, Node, React, etc.
    • Back-end frameworks - .NET, GoLang, etc.
    • Middleware – REST, GraphQL, gRPC
    • Databases - Oracle, MongoDB, Cassandra, PostgreSQL, etc.
    • Azure and Amazon cloud services. Proven experience in building cloud-native apps on either or both cloud platforms
    • Kubernetes and Docker containerization
    • CI/CD tools - Circle CI, GitHub, GitLab, Jenkins, Tekton
  • Hands-on experience with OOP concepts and design principles.
  • Good to have:
    • Knowledge of cloud-native big data tools (Hadoop, Spark, Argo, Airflow) and data science frameworks (PyTorch, Scikit-learn, Keras, TensorFlow, NumPy)
    • Exposure to ERP application development is advantageous.
  • Excellent communication and multi-tasking skills along with an innovative mindset.

See more jobs at IFS

Apply for this job

+30d

Lead Data Engineer (F/H)

ASINantes, France, Remote
S3agilenosqlairflowsqlazureapijava

ASI is hiring a Remote Lead Data Engineer (F/H)

Job Description

With Simon GRIFFON, manager of the Nantes Data team, we are looking for a Lead Data Engineer to set up, integrate, develop, and optimize data pipeline solutions in Cloud and on-premise environments for our client projects. 

Within a dedicated team, mostly in an agile context, your assignments may include: 

  • Help understand business needs and run scoping workshops with the client 

  • Help write the functional and technical specifications for data flows 

  • Master structured and unstructured data formats and know how to manipulate them 

  • Model and implement decision-support systems 

  • Install and connect an ETL / ELT solution to a data source, taking the client's constraints and environment into account 

  • Design and build a data transformation and enrichment pipeline and schedule its execution 

  • Ensure data pipelines are secured 

  • Design and build APIs that use the enriched data 

  • Define test and integration plans 

  • Handle corrective and evolutionary maintenance 

  • Support junior engineers as they grow their skills 
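Designing a transformation pipeline and scheduling its execution, as the assignments above describe, comes down to running tasks in dependency order. A minimal sketch using only the Python standard library (the task names are hypothetical), in the spirit of how an orchestrator like Airflow resolves a DAG before scheduling runs:

```python
from graphlib import TopologicalSorter

# Hypothetical pipeline: each task maps to the set of tasks it depends on.
pipeline = {
    "extract_orders": set(),
    "extract_customers": set(),
    "clean": {"extract_orders", "extract_customers"},
    "aggregate": {"clean"},
    "publish_api": {"aggregate"},
}

# static_order() yields a valid execution order; a scheduler would run
# each task (or dispatch independent ones in parallel) in this sequence.
order = list(TopologicalSorter(pipeline).static_order())
```

Real orchestrators add retries, backfills, and parallelism on top, but dependency resolution is the core of the scheduling problem.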

 

Depending on your skills and interests, you will work with one or more of the following technologies: 

  • The data ecosystem, notably Microsoft Azure 

  • Languages: SQL, Java 

  • SQL and NoSQL databases 

  • Cloud storage: S3, Azure Blob Storage… 

  • ETL/ESB and other tools: Talend, Spark, Kafka, NiFi, Matillion, Airflow, Data Factory, Glue... 

 

By joining ASI: 

  • You will work in a company with flexible internal working practices, guaranteed by an attentive HR policy (remote-work agreement of 3 days/week, "parenthesis" leave agreement…)  

  • You will join ASI's expert communities to share best practices and take part in continuous-improvement initiatives. 

  • You will work in a company soon to be recognized as a mission-driven company ("société à mission"), Team GreenCaring rather than GreenWashing, with a CSR approach it has embodied and driven for more than 10 years (dedicated CSR team, sustainable-mobility allowance agreement…)  

Qualifications

With a higher-education background in computer science or mathematics, or specialized in Big Data, you have at least 10 years of experience in data engineering and a successful operational track record building structured and unstructured data pipelines. 

The salary offered for this position is between €40,000 and €45,000, depending on experience and skills, while respecting pay equity within the team. 

Committed to the quality of what you deliver, you bring rigor and organization to your work. 

With a solid technology culture, you regularly keep up with new developments to refresh your knowledge. 

A good level of English, both written and spoken, is recommended. 

A true team player, your leadership lets you guide the team with care and pedagogy to help it grow. 

Eager to join a company that reflects who you are, you recognize yourself in our values of trust, listening, enjoyment, and commitment.

 

With equal qualifications, this position is open to people with disabilities.  

See more jobs at ASI

Apply for this job

+30d

Senior Data Scientist

redisBachelor's degreeterraformairflowsqlansibledockerkubernetespython

ReCharge Payments is hiring a Remote Senior Data Scientist

Who we are

In a world where acquisition costs are skyrocketing, funding is scarce, and ecommerce merchants are forced to do more with less, the most innovative DTC brands understand that subscription strategy is business strategy.

Recharge is simplifying retention and growth for innovative ecommerce brands. As the #1 subscription platform, Recharge is dedicated to empowering brands to easily set up and manage subscriptions, create dynamic experiences at every customer touchpoint, and continuously evaluate business performance. Powering everything from no-code customer portals, personalized offers, and dynamic bundles, Recharge helps merchants seamlessly manage, grow, and delight their subscribers while reducing operating costs and churn. Today, Recharge powers more than 20,000 merchants serving 100 million subscribers, including brands such as Blueland, Hello Bello, LOLA, Chamberlain Coffee, and Bobbie—Recharge doesn’t just help you sell products, we help build buyer routines that last.

Recharge is recognized on the Technology Fast 500, awarded by Deloitte, (3rd consecutive year) and is Great Place to Work Certified.

Senior Data Analyst, Product Analytics

Recharge is positioned to support the best Direct-To-Consumer ecommerce brands in the world. We are building multiple AI-based analytic products that revolutionize how our merchants leverage insight to retain and grow their business. 


We are looking for a data scientist who is value driven and passionate about providing actionable insights and helping to create data products that our product and growth teams can leverage. As a data scientist you will work on both product analytics and advanced analytics projects, collaborating closely with data engineering and product to deliver value to our merchants through analytics and insights.


You will be responsible for preparing data for prescriptive and predictive modeling, driving hypotheses, applying stats, and developing architecture for algorithms. 


What you’ll do

  • Live by and champion all of our core values (#ownership, #empathy, #day-one, and #humility).

  • Collaborate with stakeholders in cross-projects and team settings to identify and clarify business or product questions to answer. Provide feedback to translate and refine business questions into tractable analysis, evaluation metrics, or mathematical models.

  • Perform analysis utilizing relevant tools (e.g., SQL, Python). Provide analytical thought leadership through proactive and strategic contributions (e.g., suggests new analyses, infrastructure or experiments to drive improvements in the business).

  • Own outcomes for projects by covering problem definition, metrics development, data extraction and manipulation, visualization, creation, and implementation of analytical/statistical models, and presentation to stakeholders.

  • Develop solutions, lead, and manage problems that may be ambiguous and lacking clear precedent by framing problems, generating hypotheses, and making recommendations from a perspective that combines both analytical and product-specific expertise.

  • Work independently to find creative solutions to difficult problems.

  • Effectively communicate analyses and experimental outcomes to business stakeholders, ensuring insights are presented with clear business context and relevance.

  • Write and maintain technical documentation for the data models and analytics solutions.
     

What you'll bring

  • Bachelor's degree, or equivalent work experience, in Statistics, Mathematics, Data Science, Engineering, Physics, Economics, or a related quantitative field.

  • 5+ years of work experience using analytics to solve product or business problems, performing statistical analysis, and coding (e.g., Python, R, SQL) 

  • Preferred experience in leveraging LLMs to address business challenges, and familiarity with frameworks such as Langchain.

  • Experience developing and operating within Snowflake

  • Expert in translating data findings to broader audiences including non-data stakeholders, engineering, and executive leadership to maximize impact

  • Preferred experience in dimensional modeling in dbt 

  • Experience working on advanced analytics models (machine learning or learning based models) that accomplish tasks such as making recommendations or scoring users.

  • Ability to demonstrate high self-sufficiency to take on complex problems in a timely manner

  • Consistently navigates ambiguous technical and business requirements while making flexible technical decisions

  • Consistently delivers technically challenging tasks efficiently with quality, speed, and simplicity

  • Payments and/or Ecommerce experience preferred


Our Stack

Vertex AI, Google Colab, Looker, dbt, Snowflake, Airflow, Fivetran, CloudSQL/MySQL, Python (pandas, NumPy, scikit-learn), GitLab, Flask, Jinja, ES6, Vue.js, Sass, Webpack, Redis, Docker, GCP, Kubernetes, Helmfile, Terraform, Ansible, Nginx

Recharge | Instagram | Twitter | Facebook

Recharge Payments is an equal opportunity employer. In addition to EEO being the law, it is a policy that is fully consistent with our principles. All qualified applicants will receive consideration for employment without regard to status as a protected veteran or a qualified individual with a disability, or other protected status such as race, religion, color, national origin, sex, sexual orientation, gender identity, genetic information, pregnancy or age. Recharge Payments prohibits any form of workplace harassment. 

Transparency in Coverage

This link leads to the Anthem Blue Cross machine-readable files that are made available in response to the federal Transparency in Coverage Rule and includes network negotiated rates for all items and services; allowed amounts for OON items, services and prescription drugs; and negotiated rates and historical prices for network prescription drugs (delayed). EIN 80-6245138. This link leads to the Kaiser machine-readable files.

#LI-Remote

See more jobs at ReCharge Payments

Apply for this job

+30d

Product Data Architect - Xray

SalesFull TimejiraairflowDesignmongodbqaAWSNode.js

Idera, Inc. is hiring a Remote Product Data Architect - Xray

Product Data Architect - Xray - Idera, Inc.

See more jobs at Idera, Inc.

Apply for this job

+30d

Senior Data Engineer (Taiwan)

GOGOXRemote
Full TimeairflowsqlazureapijavapythonAWS

GOGOX is hiring a Remote Senior Data Engineer (Taiwan)

Senior Data Engineer (Taiwan) - GoGoX

See more jobs at GOGOX

Apply for this job

+30d

Data and Analytics Engineer

airflowsqlDesignpython

Cloudflare is hiring a Remote Data and Analytics Engineer

About Us

At Cloudflare, we are on a mission to help build a better Internet. Today the company runs one of the world’s largest networks that powers millions of websites and other Internet properties for customers ranging from individual bloggers to SMBs to Fortune 500 companies. Cloudflare protects and accelerates any Internet application online without adding hardware, installing software, or changing a line of code. Internet properties powered by Cloudflare all have web traffic routed through its intelligent global network, which gets smarter with every request. As a result, they see significant improvement in performance and a decrease in spam and other attacks. Cloudflare was named to Entrepreneur Magazine’s Top Company Cultures list and ranked among the World’s Most Innovative Companies by Fast Company. 

We realize people do not fit into neat boxes. We are looking for curious and empathetic individuals who are committed to developing themselves and learning new skills, and we are ready to help you do that. We cannot complete our mission without building a diverse and inclusive team. We hire the best people based on an evaluation of their potential and support them throughout their time at Cloudflare. Come join us! 

Available Locations: Lisbon or Remote Portugal

About the team

You will be part of the Network Strategy team within Cloudflare’s Infrastructure Engineering department. The Network Strategy team focuses on building both external and internal relationships that allow Cloudflare to scale and reach user populations around the world. Our group takes a long term and technical approach to forging mutually beneficial and sustainable relationships with all of our network partners. 

About the role

We are looking for an experienced Data and Analytics Engineer to join our team to scale our data insights initiatives. You will work with a wide array of data sources about network traffic, performance, and cost. You’ll be responsible for building data pipelines, doing ad-hoc analytics based on the data, and automating our analysis. Important projects include understanding the resource consumption and cost of Cloudflare’s broad product portfolio.

A candidate will be successful in this role if they’re flexible and able to match the right solution to the right problem. Flexibility is key: Cloudflare is a fast-paced environment and requirements change frequently.

What you'll do

  • Design and implement data pipelines that take unprocessed data and make it usable for advanced analytics
  • Work closely with other product and engineering teams to ensure our products and services collect the right data for our analytics
  • Work closely with a cross functional team of data scientists and analysts and internal stakeholders on strategic initiatives 
  • Build tooling, automation, and visualizations around our analytics for consumption by other Cloudflare teams
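As a small illustration of the first bullet above, turning unprocessed records into something usable for analytics often starts with a rollup like the one below (pure Python; the field names are hypothetical, not Cloudflare's actual log schema):

```python
from collections import defaultdict

def summarize_traffic(raw_logs):
    """Aggregate raw request records into per-datacenter totals, the kind
    of rollup that turns raw logs into an analytics-ready table."""
    summary = defaultdict(lambda: {"requests": 0, "bytes": 0})
    for rec in raw_logs:
        colo = rec.get("colo")
        if colo is None:
            continue  # skip records missing the grouping key
        summary[colo]["requests"] += 1
        summary[colo]["bytes"] += int(rec.get("bytes", 0))
    return dict(summary)

logs = [
    {"colo": "LIS", "bytes": 512},
    {"colo": "LIS", "bytes": 2048},
    {"colo": "MAD", "bytes": 1024},
    {"bytes": 9999},  # malformed: no datacenter code
]
report = summarize_traffic(logs)
```

At production scale the same aggregation would be expressed in SQL over a columnar store and scheduled in a pipeline (e.g. via Airflow), but the transform logic is what the pipeline exists to run.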

Examples of desirable skills, knowledge and experience

  • Excellent Python and SQL (one of the interviews will be a code review)
  • B.S. or M.S in Computer Science, Statistics, Mathematics, or other quantitative fields, or equivalent experience
  • Minimum 3 years of industry experience in software engineering, data engineering, data science or related field with a track record of extracting, transforming and loading large datasets 
  • Knowledge of data management fundamentals and data storage/computing principles
  • Excellent communication & problem solving skills 
  • Ability to collaborate with cross functional teams and work through ambiguous business requirements

Bonus Points

  • Familiarity with Airflow 
  • Familiarity with Google Cloud Platform or other analytics databases

What Makes Cloudflare Special?

We’re not just a highly ambitious, large-scale technology company. We’re a highly ambitious, large-scale technology company with a soul. Fundamental to our mission to help build a better Internet is protecting the free and open Internet.

Project Galileo: We equip politically and artistically important organizations and journalists with powerful tools to defend themselves against attacks that would otherwise censor their work, using technology already deployed by Cloudflare’s enterprise customers, at no cost.

Athenian Project: We created Athenian Project to ensure that state and local governments have the highest level of protection and reliability for free, so that their constituents have access to election information and voter registration.

1.1.1.1: We released 1.1.1.1 to help fix the foundation of the Internet by building a faster, more secure and privacy-centric public DNS resolver. It is available publicly for everyone to use; it is the first consumer-focused service Cloudflare has ever released. Here’s the deal: we don’t store client IP addresses. Never, ever. We will continue to abide by our privacy commitment and ensure that no user data is sold to advertisers or used to target consumers.

Sound like something you’d like to be a part of? We’d love to hear from you!

This position may require access to information protected under U.S. export control laws, including the U.S. Export Administration Regulations. Please note that any offer of employment may be conditioned on your authorization to receive software or technology controlled under these U.S. export laws without sponsorship for an export license.

Cloudflare is proud to be an equal opportunity employer.  We are committed to providing equal employment opportunity for all people and place great value in both diversity and inclusiveness.  All qualified applicants will be considered for employment without regard to their, or any other person's, perceived or actual race, color, religion, sex, gender, gender identity, gender expression, sexual orientation, national origin, ancestry, citizenship, age, physical or mental disability, medical condition, family care status, or any other basis protected by law. We are an AA/Veterans/Disabled Employer.

Cloudflare provides reasonable accommodations to qualified individuals with disabilities.  Please tell us if you require a reasonable accommodation to apply for a job. Examples of reasonable accommodations include, but are not limited to, changing the application process, providing documents in an alternate format, using a sign language interpreter, or using specialized equipment.  If you require a reasonable accommodation to apply for a job, please contact us via e-mail at hr@cloudflare.com or via mail at 101 Townsend St. San Francisco, CA 94107.

See more jobs at Cloudflare

Apply for this job

+30d

Data Driven | ETL Developer

DevoteamLisboa, Portugal, Remote
Master’s Degree3 years of experienceairflowsqlazurepythonAWS

Devoteam is hiring a Remote Data Driven | ETL Developer

Job Description

We are currently looking for a Data Engineer to work with us.

Qualifications

  • Bachelor’s or Master’s degree in IT or equivalent;
  • At least 3 years of experience as a Data Engineer;
  • High level of experience with the following programming languages: Python and SQL;
  • Working experience with AWS or Azure;
  • Proficient Level of English (spoken and written);
  • Good communication skills;
  • Knowledge in Airflow will be a plus.

 

See more jobs at Devoteam

Apply for this job

+30d

DevOps Engineer, Data Platform

GrammarlyUnited States; Hybrid
MLDevOPSremote-firstterraformairflowDesignansiblec++kubernetesAWS

Grammarly is hiring a Remote DevOps Engineer, Data Platform

Grammarly is excited to offer a remote-first hybrid working model. Grammarly team members in this role must be based in the United States and, depending on business needs, they must meet in person for collaboration weeks, traveling if necessary to the hub(s) where their team is based.

This flexible approach gives team members the best of both worlds: plenty of focus time along with in-person collaboration that fosters trust and unlocks creativity.

About Grammarly

Grammarly is the world’s leading AI writing assistance company trusted by over 30 million people and 70,000 teams. From instantly creating a first draft to perfecting every message, Grammarly helps people at 96% of the Fortune 500 and teams at companies like Atlassian, Databricks, and Zoom get their point across—and get results—with best-in-class security practices that keep data private and protected. Founded in 2009, Grammarly is No. 14 on the Forbes Cloud 100, one of TIME’s 100 Most Influential Companies, one of Fast Company’s Most Innovative Companies in AI, and one of Inc.’s Best Workplaces.

The Opportunity

To achieve our ambitious goals, we’re looking for a Software Engineer to join our Data Platform team and help us build a world-class data platform. Grammarly’s success depends on its ability to efficiently ingest over 60 billion daily events while using our systems to improve our product. This role is a unique opportunity to experience all aspects of building complex software systems: contributing to the strategy, defining the architecture, and building and shipping to production.

Grammarly’s engineers and researchers have the freedom to innovate and uncover breakthroughs—and, in turn, influence our product roadmap. The complexity of our technical challenges is growing rapidly as we scale our interfaces, algorithms, and infrastructure. You can hear more from our team on our technical blog.

As a Senior DevOps Engineer on the Data Platform team, you will lead cloud infrastructure design, automation, and management, focusing on Databricks and AWS, ensuring the platform is scalable, secure, and cost-effective. You will be critical in optimizing data processing and machine learning workflows, guaranteeing high performance and uptime for critical systems. Your expertise in Databricks and AWS infrastructure and a focus on cost management will be vital to driving platform efficiency.

You will work closely with data engineers, ML teams, and infrastructure engineers to automate and optimize workflows, manage infrastructure, and ensure the platform supports real-time analytics and machine learning at scale.

  • Design, implement, and manage scalable infrastructure focusing on Databricks, AWS, and Spark clusters to support data pipelines and lakes.
  • Build and optimize CI/CD pipelines for deploying and managing Databricks clusters, data pipelines, and real-time data workflows.
  • Use tools like Terraform, Kubernetes, and Ansible to orchestrate and automate infrastructure tasks, including provisioning, scaling, and monitoring.
  • Implement and manage monitoring and logging systems to ensure the data infrastructure's high availability, security, and performance, focusing on Databricks and AWS environments.
  • Implement cloud security best practices, manage access control and encryption, and ensure the platform complies with data privacy regulations to ensure security and compliance.
  • Collaborate with cross-functional teams to maintain cost management strategies, optimizing resource utilization across Databricks, AWS, and other data infrastructure components.
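As a hedged illustration of the cost-management bullet above (the record shape, cluster names, and rates are all invented, not Grammarly's actual data), per-cluster cloud spend might be rolled up from usage records like this:

```python
# Hypothetical roll-up of cloud spend per cluster from usage records;
# cluster names and hourly rates are invented for illustration.

USAGE = [
    {"cluster": "etl-prod", "hours": 10.0, "rate_usd": 2.5},
    {"cluster": "ml-train", "hours": 4.0, "rate_usd": 8.0},
    {"cluster": "etl-prod", "hours": 6.0, "rate_usd": 2.5},
]

def cost_by_cluster(usage):
    """Sum hours * hourly rate for each cluster."""
    totals = {}
    for u in usage:
        totals[u["cluster"]] = totals.get(u["cluster"], 0.0) + u["hours"] * u["rate_usd"]
    return totals

print(cost_by_cluster(USAGE))  # {'etl-prod': 40.0, 'ml-train': 32.0}
```

A report like this is the kind of signal that drives the resource-utilization optimization the role describes across Databricks and AWS.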

Qualifications

  • Has 5+ years of experience in DevOps with expertise in managing AWS infrastructure and Databricks environments.
  • Has strong experience with Spark clusters, data lakes, and managing large-scale data pipelines in a cloud environment.
  • Is skilled in orchestration, monitoring, and logging, using tools like Terraform, Kubernetes, and Airflow for managing cloud resources and data workflows.
  • Understands security and compliance best practices, ensuring data infrastructure is secure, compliant, and aligned with privacy regulations.
  • Has experience with cost management and optimization of cloud resources, with a focus on AWS and Databricks.
  • Is a problem solver who enjoys building scalable, automated solutions that improve platform reliability and efficiency.
  • Embodies our EAGER values—is ethical, adaptable, gritty, empathetic, and remarkable.
  • Is inspired by our MOVE principles: move fast and learn faster; obsess about creating customer value; value impact over activity; and embrace healthy disagreement rooted in trust.
  • Is able to meet in person for their team’s scheduled collaboration weeks, traveling if necessary to the hub where their team is based.

Compensation and Benefits

Grammarly offers all team members competitive pay along with a benefits package encompassing the following and more: 

  • Excellent health care (including a wide range of medical, dental, vision, mental health, and fertility benefits)
  • Disability and life insurance options
  • 401(k) and RRSP matching 
  • Paid parental leave
  • 20 days of paid time off per year, 12 days of paid holidays per year, two floating holidays per year, and flexible sick time
  • Generous stipends (including those for caregiving, pet care, wellness, your home office, and more)
  • Annual professional development budget and opportunities

Grammarly takes a market-based approach to compensation, which means base pay may vary depending on your location. Our US locations are categorized into two compensation zones based on proximity to our hub locations.

Base pay may vary considerably depending on job-related knowledge, skills, and experience. The expected salary ranges for this position are outlined below by compensation zone and may be modified in the future.

United States: 
Zone 1: $250,000 - $284,000/year (USD)
Zone 2: $225,000 - $255,000/year (USD)

For more information about our compensation zones and locations where we currently support employment, please refer to this page. If a location of interest is not listed, please speak with a recruiter for additional information. 

We encourage you to apply

At Grammarly, we value our differences, and we encourage all to apply—especially those whose identities are traditionally underrepresented in tech organizations. We do not discriminate on the basis of race, religion, color, gender expression or identity, sexual orientation, ancestry, national origin, citizenship, age, marital status, veteran status, disability status, political belief, or any other characteristic protected by law. Grammarly is an equal opportunity employer and a participant in the US federal E-Verify program (US). We also abide by the Employment Equity Act (Canada).

#LI-Hybrid

 

Apply for this job

+30d

Data Engineer (Sales Domain)

lastminute.comMadrid, Spain, Remote
Sales2 years of experiencetableauscalaairflowsqlDesignmobilepythonAWS

lastminute.com is hiring a Remote Data Engineer (Sales Domain)

Job Description

lastminute.com is looking for a Data Engineer for its Sales & Partnerships team inside the Data & Analytics department.

The activities of the Sales & Partnerships domain team focus on reports, tables, analysis and, more generally, all sorts of deliverables related to the company's sales data, in order to create important value in supporting business decision-making. Significant emphasis will be placed on partnerships data preparation and analysis, helping our business find the best solutions with partners, monitoring performance and evaluating the effectiveness of sales campaigns, agreements and initiatives over time.

The candidate will have the opportunity to become a key member of the team leveraging their engineering skills to acquire, manipulate, orchestrate and monitor data.

Data is at our core, and its reliability and effectiveness have a direct impact on producing actionable insights and improving business performance.

* Please note that this is a hybrid working model position; remote work can be considered within Spanish territory only.

Qualifications

Key Responsibilities

  • Understand and analyse functional needs and raw data, and develop dimensional data models
  • Design, build and deploy data pipelines with a focus on automation, performance optimization, scalability, and reliability aspects
  • Help the business understand the data and find insights that enable the company to make data-driven decisions
  • Leverage data and business principles to solve large-scale web, mobile and data infrastructure problems
  • Build data expertise and own data quality for your area

 

Skills and Experience

Essentials

  • At least 2 years of experience in similar role in a fast-paced environment
  • SQL advanced knowledge
  • Experience in Data Modelling
  • Experience in ETL design, implementation and maintenance
  • Experience with workflow management engines (e.g. Airflow, Google Cloud Composer, Talend)
  • Experience with data quality and validation
  • Fluent in English both written and spoken
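A minimal illustration of the "data quality and validation" requirement above (the schema and rules here are invented for a hypothetical sales feed, not lastminute.com's actual data):

```python
# Hypothetical row-level validation for a sales feed; required fields
# and rules are invented for illustration.

REQUIRED = {"booking_id", "partner", "amount_eur"}

def validate(row):
    """Return a list of human-readable problems; an empty list means the row passes."""
    problems = [f"missing field: {f}" for f in sorted(REQUIRED - row.keys())]
    if "amount_eur" in row and row["amount_eur"] < 0:
        problems.append("negative amount")
    return problems

good = {"booking_id": "B1", "partner": "acme", "amount_eur": 120.0}
bad = {"booking_id": "B2", "amount_eur": -5.0}
print(validate(good))  # []
print(validate(bad))   # ['missing field: partner', 'negative amount']
```

In a pipeline, checks like these would typically run as a task in a workflow engine such as Airflow, quarantining failing rows before they reach reporting tables.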


Desirable 

  • Bachelor's or master's degree in Statistics, Mathematics, Engineering, Physics or similar fields
  • Experience working with cloud or on-prem Big Data/MPP analytics platform (e.g. AWS Redshift, Google BigQuery or similar)
  • Programming languages knowledge (e.g. Python, R, Scala)
  • Experience in analysing data to discover opportunities, address gaps and anomaly/outlier detection
  • Experience with analytics tools (e.g. QlikView, Tableau, Spotfire)
  • Familiarity with digital and e-commerce business

 

Abilities/qualities 

  • Problem solving and decision making skills and innovative thinking 
  • Proactivity and strategic approach
  • Ability to interface with business stakeholders by presenting and negotiating their own solutions
  • Passionate about digital world, ambitious and motivated with a can-do attitude
  • High attention to detail and ability to effectively manage multiple projects at a time, successfully able to meet deadlines
  • Strong team player with a willingness to challenge existing processes and applications

See more jobs at lastminute.com

Apply for this job