Data Engineer Remote Jobs

113 Results

+30d

Data Engineer

In All Media Inc - Argentina, Remote
agile, sql, AWS

In All Media Inc is hiring a Remote Data Engineer

Data Analyst Engineer

The objective of this project and role is to deliver the solutions your business partners need to grow the business, e.g. an application, an API, a rules engine, or a data pipeline. You know what it takes to deliver the best possible result within a given deadline.

Deliverables
Conversion tool - deliver recommendations on this tool, resolve technical debt, and maintain and add net-new recommendations; these recommendations can be measured directly by count and impact (for example, how many more features were adopted)
CS - simplify data on active points and deliver the best recommendations, with more net-new ones


    Requirements:

    Technologies:
    • Backend: Python, Flask, SQLAlchemy, PyMySQL, MongoDB, internal SOA libraries, healthcheck tools, tracing tools
    • DevOps: GitLab CI/CD, Docker, Kubernetes, AWS

    We're seeking a talented Software Engineer Level 2 to join our dynamic team responsible for developing a suite of innovative tools. These tools are essential in automating and streamlining communication processes with our clients. If you are passionate about solving complex problems and improving user experiences, we want you on our team.

    See more jobs at In All Media Inc

    Apply for this job

    +30d

    Staff Data Engineer

    Life36 - Remote, Canada
    agile, remote-first, terraform, sql, Design, mobile, c++, python, AWS

    Life36 is hiring a Remote Staff Data Engineer

    About Life360

    Life360’s mission is to keep people close to the ones they love. Our category-leading mobile app and Tile tracking devices empower members to protect the people, pets, and things they care about most with a range of services, including location sharing, safe driver reports, and crash detection with emergency dispatch. Life360 serves approximately 66 million monthly active users (MAU) across more than 150 countries.

    Life360 delivers peace of mind and enhances everyday family life with seamless coordination for all the moments that matter, big and small. By continuing to innovate and deliver for our customers, we have become a household name and the must-have mobile-based membership for families (and those friends that basically are family). 

    Life360 has more than 500 (and growing!) remote-first employees. For more information, please visit life360.com.

    Life360 is a Remote First company, which means a remote work environment will be the primary experience for all employees. All positions, unless otherwise specified, can be performed remotely (within Canada) regardless of any specified location above. 

    About the Job

    At Life360, we collect a lot of data: 60 billion unique location points, 12 billion user actions, 8 billion miles driven every single month, and so much more. As a Staff Data Engineer, you will contribute to enhancing and maintaining our data processing and storage pipelines/workflows for a robust and secure finance data lake. You should have a strong engineering background and even more importantly a desire to take ownership of our data systems to make them world class.

    The Canada-based salary range for this position is $190,000 to $240,000 CAD. We take into consideration an individual's background and experience in determining final salary; therefore, base pay offered may vary considerably depending on geographic location, job-related knowledge, skills, and experience. The compensation package includes a wide range of medical, dental, vision, financial, and other benefits, as well as equity.

    What You’ll Do

    Primary responsibilities include, but are not limited to:

    • Design, implement, and manage scalable data processing platforms used for real-time analytics and exploratory data analysis.
    • Manage our financial data from ingestion through ETL to storage and batch processing.
    • Automate, test and harden all data workflows.
    • Architect logical and physical data models to ensure the needs of the business are met.
    • Collaborate with finance and analytics teams, while applying best practices
    • Architect and develop systems and algorithms for distributed real-time analytics and data processing
    • Implement strategies for acquiring data to develop new insights
    • Mentor junior engineers, imparting best practices and institutionalizing efficient processes to foster growth and innovation within the team

     

    What We’re Looking For

    • 5+ years of experience working with high-volume data infrastructure.
    • Experience with Databricks, AWS, dbt, ETL and Job orchestration tooling.
    • Extensive experience programming in one of the following languages: Python / Java.
    • Experience in data modeling, optimizing SQL queries, and system performance tuning.
    • You are proficient with SQL, AWS, Databases, Apache Spark, Spark Streaming, EMR, and Kinesis/Kafka
    • Experience working with freemium models, tiered subscriptions, Subscriptions Billing Systems (Apple, Google, Recurly, Chargebee), integrations with systems like NetSuite Financials, and tax district and currency conversion providers.
    • Experience in modern development lifecycle including Agile methodology, CI/CD, automated deployments using Terraform, GitHub Actions etc.
    • Knowledge and proficiency in the latest open source and data frameworks, modern data platform tech stacks and tools.
    • Always be learning and staying up to speed with the fast moving data world.
    • You have good communication skills and can work independently
    • BS in Computer Science, Software Engineering, Mathematics, or equivalent experience

     

    Our Benefits

    • Competitive pay and benefits
    • Medical, dental, vision, life and disability insurance plans 
    • RRSP plan with DPSP company matching program
    • Employee Assistance Program (EAP) for mental well-being
    • Flexible PTO, several company-wide days off throughout the year
    • Winter and Summer Week-long Synchronized Company Shutdowns
    • Learning & Development programs
    • Equipment, tools, and reimbursement support for a productive remote environment
    • Free Life360 Platinum Membership for your preferred circle
    • Free Tile Products

    Life360 Values

    Our company’s mission-driven culture is guided by our shared values to create a trusted work environment where you can bring your authentic self to work and make a positive difference.

    • Be a Good Person - We have a team of high integrity people you can trust. 
    • Be Direct With Respect - We communicate directly, even when it’s hard.
    • Members Before Metrics - We focus on building an exceptional experience for families. 
    • High Intensity High Impact - We do whatever it takes to get the job done. 

    Our Commitment to Diversity

    We believe that different ideas, perspectives and backgrounds create a stronger and more creative work environment that delivers better results. Together, we continue to build an inclusive culture that encourages, supports, and celebrates the diverse voices of our employees. It fuels our innovation and connects us closer to our customers and the communities we serve. We strive to create a workplace that reflects the communities we serve and where everyone feels empowered to bring their authentic best selves to work.

    We are an equal opportunity employer and value diversity at Life360. We do not discriminate on the basis of race, religion, color, national origin, gender, sexual orientation, age, marital status, veteran status, disability status or any legally protected status.  

    We encourage people of all backgrounds to apply. We believe that a diversity of perspectives and experiences creates a foundation for the best ideas. Come join us in building something meaningful. Even if you don’t meet 100% of the qualifications listed above, you should still seriously consider applying!

     

    #LI-Remote

    ____________________________________________________________________________

    See more jobs at Life36

    Apply for this job

    +30d

    Principal Data Engineer

    slice - Belfast or remote UK
    scala, nosql, sql, Design, java, python

    slice is hiring a Remote Principal Data Engineer

    UK Remote or Belfast

    Serial tech entrepreneur Ilir Sela started Slice in 2010 with the belief that local pizzerias deserve all of the advantages of major franchises without compromising their independence. Starting with his family’s pizzerias, we now empower over 18,000 restaurants (that’s nearly triple Domino’s U.S. network!) with the technology, services, and collective power that owners need to better serve their digitally minded customers and build lasting businesses. We’re growing and adding more talent to help fulfil this valuable mission. That’s where you come in.

     

    The Challenge to Solve

    Provide Slice with up-to-date data to grow the business and to empower independent pizzeria owners to make the best data-driven decisions through insights that ensure future success.

     

    The Role

    You will be responsible for leading data modelling and dataset development across the team. You’ll be at the forefront of our data strategy, partnering closely with business and product teams to fuel data-driven decisions throughout the company. Your leadership will guide our data architecture expansion, ensuring smooth data delivery and maintaining top-notch data quality. Drawing on your expertise, you’ll steer our tech choices and keep us at the cutting edge of the field. You’ll get to code daily and provide your insights into best practices to the rest of the team.

     

    The Team

    You’ll work with a team of skilled data engineers daily, providing your expertise to their reviews, as well as working on your own exciting projects with teams across the business. You’ll have a high degree of autonomy and the chance to impact many areas of the business. You will optimise data flow and collection for cross-functional teams and support software developers, business intelligence, and data scientists on data initiatives, using this work to help support product launches and marketing efforts to grow the business. This role reports to the Director of Data Engineering.

     

    The Winning Recipe

    We’re looking for creative, entrepreneurial engineers who are excited to build world-class products for small business counters. These are the core competencies this role calls for:

    • Strong track record of designing and implementing modern cloud data processing architectures using programming languages such as Java, Scala, or Python and technologies like Spark
    • Expert-level SQL skills
    • Extensive experience in data modelling and design, building out the right structures to deliver for various business and product domains
    • Strong analytical abilities and a history of using data to identify opportunities for improvement and where data can help drive the business towards its goals
    • Experience with message queuing, stream processing using frameworks such as Flink or KStreams and highly scalable big data data stores, as well as storage and query pattern design with NoSQL stores
    • Proven leadership skills, with a track record of successfully leading complex engineering projects and mentoring junior engineers, as well as working with cross-functional teams and external stakeholders in a dynamic environment
    • Familiarity with serverless technologies and the ability to design and implement scalable and cost-effective data processing architectures

     

    The Extras

    Working at Slice comes with a comprehensive set of benefits, but here are some of the unexpected highlights:

    • Access to medical, dental, and vision plans
    • Flexible working hours
    • Generous time off policies
    • Annual conference attendance and training/development budget
    • Market leading maternity and paternity schemes
    • Discounts for local pizzerias (of course)

     

    The Hiring Process

    Here’s what we expect the hiring process for this role to be, should all go well with your candidacy. This entire process is expected to take 1-3 weeks to complete and you’d be expected to start on a specific date.

    1. 30 minute introductory meeting
    2. 30 minute hiring manager meeting
    3. 60 minute pairing interview
    4. 60 minute technical interview
    5. 30 minute CTO interview
    6. Offer!

    Pizza brings people together. Slice is no different. We’re an Equal Opportunity Employer and embrace a diversity of backgrounds, cultures, and perspectives. We do not discriminate on the basis of race, colour, gender, sexual orientation, gender identity or expression, religion, disability, national origin, protected veteran status, age, or any other status protected by applicable national, federal, state, or local law. We are also proud members of the Diversity Mark NI initiative as a Bronze Member.

    Privacy Notice Statement of Acknowledgment

    When you apply for a job on this site, the personal data contained in your application will be collected by Slice. Slice is keeping your data safe and secure. Once we have received your personal data, we put in place reasonable and appropriate measures and controls to prevent any accidental or unlawful destruction, loss, alteration, or unauthorised access. If selected, we will process your personal data for hiring/employment processes, as well as our legal obligations. If you are not selected for the job position and you have given consent on the question below (by selecting "Give consent"), we will store and process your personal data and submitted documents (CV) to consider eligibility for employment for up to 365 days (one year). You have the right to withdraw your previously given consent for storing your personal data and CV in the Slice database for consideration of eligibility for employment for a year. You have the right to withdraw your consent at any time. For additional information and/or to exercise your rights to the protection of personal data, you can contact our Data Protection Officer by e-mail: privacy@slicelife.com

    See more jobs at slice

    Apply for this job

    +30d

    Senior Data Engineer

    Nile Bits - Cairo, Egypt, Remote
    agile, airflow, sql, Design, docker, linux, python, AWS

    Nile Bits is hiring a Remote Senior Data Engineer

    Job Description

    • Designing and implementing core functionality within our data pipeline in order to support key business processes
    • Shaping the technical direction of the data engineering team
    • Supporting our Data Warehousing approach and strategy
    • Maintaining our data infrastructure so that our jobs run reliably and at scale
    • Taking responsibility for all parts of the data ecosystem, including data governance, monitoring and alerting, data validation, and documentation
    • Mentoring and upskilling other members of the team

    Qualifications

    • Experience building data pipelines and/or ETL processes
    • Experience working in a Data Engineering role
    • Confident writing performant and readable code in Python, building upon the rich Python ecosystem wherever it makes sense to do so.
    • Good software engineering knowledge & skills: OO programming, design patterns, SOLID design principles and clean code
    • Confident writing SQL and good understanding of database design.
    • Experience working with web APIs.
    • Experience leading projects from a technical perspective
    • Knowledge of Docker, shell scripting, working with Linux
    • Experience with a cloud data warehouse
    • Experience in managing deployments and implementing observability and fault tolerance in cloud-based infrastructure (e.g. CI/CD, Infrastructure as Code, container-based infrastructure, auto-scaling, monitoring and alerting)
    • Pro-active with a self-starter mindset; able to identify elegant solutions to difficult problems and able to suggest new and creative approaches.
    • Analytical, problem-solving and an effective communicator; leveraging technology and subject matter expertise in the business to accelerate our roadmap.
    • Able to lead technical discussions, shape the direction of the team, identify opportunities for innovation and improvement
    • Able to lead and deliver projects, ensuring stakeholders are kept up-to-date through regular communication
    • Willing to support the rest of the team when necessary, sharing knowledge and best practices, documenting design decisions, etc.
    • Willing to step outside your comfort zone to broaden your skills and learn new technologies.
    • Experience working with open source orchestration frameworks like Airflow or data analytics tools such as dbt
    • Experience with AWS services or those of another cloud provider
    • Experience with Snowflake
    • Good understanding of Agile

    See more jobs at Nile Bits

    Apply for this job

    +30d

    Data Engineer

    Metiora - Madrid, Spain, Remote
    nosql, sql, azure, git, c++, docker, linux, python, AWS

    Metiora is hiring a Remote Data Engineer

    Job Description

    We are looking for an #exceptional Data Engineer who can understand our clients' challenges, make them their own, and help us build long-term relationships with them, guaranteeing the success and delivery of our projects. Your responsibilities will include:

    • Develop integration processes with our clients so their data can be exploited from our MINEO platform and the client's cloud platforms (Azure, AWS, GCP)
    • Help improve our current integration (ETL) tools
    • Understand the client's data and anticipate potential problems that may arise
    • Analyse changes and new features to be developed
    • Carry out code reviews of the work done by teammates

    What do we expect from your professional profile?

    • Degree in a STEM field such as Mathematics, Statistics, Telecommunications Engineering, or Computer Science
    • Additional training in data science
    • At least 2-5 years of experience on real projects
    • Proactivity and a passion for technology
    • Eagerness to work in a team
    • Intellectual curiosity and persistence in solving problems
       

    Requirements

    • Bachelor's or Master's degree in Computer Engineering or a similar qualification

    • Background in software development. You must be able to understand and develop code, especially Python

    • Advanced command of SQL and NoSQL databases

    • Advanced knowledge of algorithms

    • Knowledge of Git

    Soft skills:

    • Eagerness to work in a team

    • Focus on quality, scalability, and clean code

    • Intellectual curiosity and persistence in solving problems

    • Good level of English

    • Proactivity and a passion for technology

    Highly valued:

    • Containerization solutions (Docker)

    • Knowledge of algorithms for developing processes over large volumes of data

    • Knowledge of Linux systems

    See more jobs at Metiora

    Apply for this job

    +30d

    Data Services Engineer

    Devoteam - Warszawa, Poland, Remote
    Sales, agile, nosql, Design, azure, java, python, AWS

    Devoteam is hiring a Remote Data Services Engineer

    Job Description

    We are looking for a highly motivated Data Engineer to use their experience in the public cloud to interpret our customers’ needs and extract value from their data using GCP and its suite of tools. You will primarily work as part of the data team.

    You will be involved in pre-sales activities, building upon our customers’ cloud infrastructure and creating analytics in the cloud. You will be involved in designing and constructing data pipelines and architecture, using GCP products. You will help turn big data into valuable business insights using Machine Learning. Programming and preparing custom solutions optimised for each client will be an essential part of your job. We believe working with data is a passion you will want to share with others.

    We want you to know and advocate for Google Cloud Platform and its products. Knowledge of equivalent platforms such as AWS or Azure would be a valuable asset and make you a strong candidate.

    It’s a customer-facing role, so you must be outgoing and confident in your ability to interact with clients, manage workshops and execute necessary training on your own.

    Qualifications

    • BA/BS in Computer Science or related technical field, or equivalent experience
    • Experience in one or more development languages, with a strong preference for Python, Java
    • Good knowledge of Power BI 
    • Basic knowledge of Bash 
    • Experience with programming in agile methodology
    • Knowledge of database and data analysis technologies, including relational and NoSQL databases, data warehouse design, and ETL pipelines.
    • Fluency in English (both written and spoken)

    Nice to have:

    • Experience in drawing UML diagrams
    • Experience in Hadoop and using platforms such as Apache Spark, Pig, or Hive
    • Fluency in Polish (both written and spoken)

    See more jobs at Devoteam

    Apply for this job

    +30d

    Data Engineer

    Legalist - Remote
    agile, nosql, sql, Design, c++, docker, kubernetes, AWS

    Legalist is hiring a Remote Data Engineer

    Intro description:

    Legalist is an institutional alternative asset management firm. Founded in 2016 and incubated at Y Combinator, the firm uses data-driven technology to invest in credit assets at scale. We are always looking for talented people to join our team.

    As a highly collaborative organization, our data engineers work cross-functionally with software engineering, data science, and product management to optimize growth and strategy of our data pipeline. In this position, you will be joining the data engineering team in an effort to take our data pipeline to the next level.

    Where you come in:

    • Design and develop scalable data pipelines to collect, process, and analyze large volumes of data efficiently.
    • Collaborate with cross-functional teams including data scientists, software engineers, and product managers to understand data requirements and deliver solutions that meet business needs.
    • Develop ELT processes to transform raw data into actionable insights, leveraging tools and frameworks such as Airbyte, BigQuery, Dagster, DBT or similar technologies.
    • Participate in agile development processes, including sprint planning, daily stand-ups, and retrospective meetings, to deliver iterative improvements and drive continuous innovation.
    • Apply best practices in data modeling and schema design to ensure data integrity, consistency, and efficiency.
    • Continuously monitor and optimize data pipelines and systems for performance, availability, scalability, and cost-effectiveness.

    What you’ll be bringing to the team:

    • Bachelor’s degree (BA or BS) or equivalent.
    • A minimum of 2 years of work experience in data engineering or similar role.
    • Advanced SQL knowledge and experience working with a variety of databases (SQL, NoSQL, Graph, Multi-model).
    • A minimum of 2 years of professional experience with ETL/ELT, data modeling and Python.
    • Familiarity with cloud environments like GCP, AWS, as well as cloud solutions like Kubernetes, Docker, BigQuery, etc.
    • You have a pragmatic, data-driven mindset and are not dogmatic or overly idealistic about technology choices and trade-offs.
    • You have an aptitude for learning new things quickly and have the confidence and humility to ask clarifying questions.

    Even better if you have, but not necessary:

    • Experience with one or more of the following: data processing automation, data quality, data warehousing, data governance, business intelligence, data visualization.
    • Experience working with TB scale data.

    See more jobs at Legalist

    Apply for this job

    +30d

    Senior Data Engineer

    Indigo - Remote with Frequent Travel
    ML, Sales, EC2, Lambda, sql, Design, python, AWS, javascript, Node.js

    Indigo is hiring a Remote Senior Data Engineer

    Company Description

    Healthcare providers spend roughly $20B annually on premiums for medical professional liability (“MPL”) insurance, $5-6B of which is spent on physicians. Incumbent carriers utilize outdated risk selection, underwriting, and sales processes. For the first time in 10 years, the MPL market is currently in a hardening cycle. While incumbent carriers are increasing rates to make up for underwriting losses, the environment is ripe for an innovative disruptor to capture market share by deploying artificial intelligence.

    Rubicon Founders, an entrepreneurial firm focused on building and growing transformational healthcare companies, has launched Indigo to address this issue. Backed by Oak HC/FT, this company will disrupt the medical professional liability market by offering more competitive products to providers. Benefitting from significant potential cost savings as a result, providers may reallocate resources to invest more directly in patient or staff care. This company intends to fulfill Rubicon Founders’ mission of creating enduring value by impacting people in a measurable way.

    Position Description

    Are you an expert data engineer with a passion for making a difference in the healthcare industry? If so, we want you to join our team at Indigo, where we're using technology to transform medical professional liability insurance. Our AI-driven underwriting process and modern technology streamline the insurance process and equip quality physicians and groups with the coverage they need.

    As a senior data engineer you will own the various data components of our system, which includes our data model architecture, our data pipeline / ETL, and any ML data engineering needs. You will help us leverage existing data platforms and databases with best practices to ensure that the data strategy is coherent, that our data systems are highly available, secure, and scalable. You will work with cross-functional teams including business stakeholders, product managers, and data scientists to ensure our data pipeline system meets business requirements and adheres to best practices.  

     

    We are a remote distributed team who gathers with intention. You will thrive here if you find energy working from home and getting together to build relationships. At Indigo, you'll have the opportunity to contribute to a meaningful mission that makes a difference in the lives of healthcare providers and their patients.

     

    Responsibilities:

    • Design and build the Indigo data pipeline and ETL process.
    • Design and build the Indigo application datastore architecture and implementation 
    • Define a governance and lineage strategy.
    • Define and implement security across our data systems.
    • Participate in code reviews and ensure the code is of high quality.
    • Collaborate with cross-functional teams including product and engineering to understand business requirements and translate them into technical requirements.
    • Keep up to date with emerging trends in data engineering and contribute to improving our development processes.

    Requirements:

    • Bachelor’s degree in CS or similar; master’s advantageous
    • 5+ years of data engineering experience
    • Expert SQL and Python or JavaScript (Node.js)
    • Experience with AWS services such as EC2, Lambda, RDS, DynamoDB, and S3.
    • Experience with Snowflake, dbt
    • Strong understanding of data design principles
    • Experience designing and building data products from concept to productization.
    • Experience with automation in the processes of data definition, migration.
    • Excellent design and problem-solving skills.
    • Self starter, highly autonomous, thrives in a remote distributed environment



    See more jobs at Indigo

    Apply for this job

    +30d

    Senior Data Engineer

    ML, airflow, postgres, sql, oracle, Design, docker, MySQL, kubernetes, python, AWS

    ReCharge Payments is hiring a Remote Senior Data Engineer

    Who we are

    In a world where acquisition costs are skyrocketing, funding is scarce, and ecommerce merchants are forced to do more with less, the most innovative DTC brands understand that subscription strategy is business strategy.

    Recharge is simplifying retention and growth for innovative ecommerce brands. As the #1 subscription platform, Recharge is dedicated to empowering brands to easily set up and manage subscriptions, create dynamic experiences at every customer touchpoint, and continuously evaluate business performance. Powering everything from no-code customer portals, personalized offers, and dynamic bundles, Recharge helps merchants seamlessly manage, grow, and delight their subscribers while reducing operating costs and churn. Today, Recharge powers more than 20,000 merchants serving 100 million subscribers, including brands such as Blueland, Hello Bello, LOLA, Chamberlain Coffee, and Bobbie—Recharge doesn’t just help you sell products, we help build buyer routines that last.

    Recharge is recognized on the Technology Fast 500 awarded by Deloitte (3rd consecutive year) and is Great Place to Work Certified.

    Overview

    The centralized Data and Analytics team at Recharge delivers critical analytic capabilities and insights for Recharge’s business and customers. 

    As a Senior Data Engineer, you will build scalable data pipelines and infrastructure that power internal business analytics and customer-facing data products.  Your work will empower data analysts to derive deeper strategic insights from our data, and will  enable developers to build applications that surface data insights directly to our merchants. 

    What you’ll do

    • Build data pipeline, ELT and infrastructure solutions to power internal data analytics/science and external, customer-facing data products.

    • Create automated monitoring, auditing and alerting processes that ensure data quality and consistency.

    • Work with business and application/solution teams to implement data strategies, build data flows, and develop conceptual/logical/physical data models

    • Design, develop, implement, and optimize existing ETL processes that merge data from disparate sources for consumption by data analysts, business owners, and customers

    • Seek ways to continually improve the operations, monitoring and performance of the data warehouse

    • Influence and communicate with all levels of stakeholders including analysts, developers, business users, and executives.

    • Live by and champion our values: #day-one, #ownership, #empathy, #humility.

    What you’ll bring

    • Typically, 5+ years of experience in a data engineering-related role (Data Engineer, Data Platform Engineer, Analytics Engineer, etc.) with a track record of building scalable data pipeline, transformation, and platform solutions.

    • 3+ years of hands-on experience designing and building data pipelines and models to ingest, transform, and deliver large amounts of data from multiple sources into a dimensional (star schema) data warehouse or data lake.

    • Experience with a variety of data warehouse, data lake and enterprise data management platforms (Snowflake (preferred), Redshift, Databricks, MySQL, Postgres, Oracle, RDS, AWS, GCP)

    • Experience building data pipelines, models and infrastructure powering external, customer-facing (in addition to internal business facing) analytics applications.

    • Solid grasp of data warehousing methodologies like Kimball and Inmon

    • Experience working with a variety of ETL tools (Fivetran, dbt, Python, etc.)

    • Experience with workflow orchestration management engines such as Airflow & Cloud Composer

    • Hands on experience with Data Infra tools like Kubernetes, Docker

    • Expert proficiency in SQL

    • Strong Python proficiency

    • Experience with ML Operations is a plus.

    Recharge | Instagram | Twitter | Facebook

    Recharge Payments is an equal opportunity employer. In addition to EEO being the law, it is a policy that is fully consistent with our principles. All qualified applicants will receive consideration for employment without regard to status as a protected veteran or a qualified individual with a disability, or other protected status such as race, religion, color, national origin, sex, sexual orientation, gender identity, genetic information, pregnancy or age. Recharge Payments prohibits any form of workplace harassment. 

    Transparency in Coverage

    This link leads to the Anthem Blue Cross machine-readable files that are made available in response to the federal Transparency in Coverage Rule and includes network negotiated rates for all items and services; allowed amounts for OON items, services and prescription drugs; and negotiated rates and historical prices for network prescription drugs (delayed). EIN 80-6245138. This link leads to the Kaiser machine-readable files.

    #LI-Remote

    See more jobs at ReCharge Payments

    Apply for this job

    +30d

    Senior Data Engineer

    Invoca - Remote
    Sales, Bachelor's degree, sql, salesforce, Design, swift, azure, api, c++, postgresql, MySQL, python, AWS

    Invoca is hiring a Remote Senior Data Engineer

    About Invoca:

    Invoca is the industry leader and innovator in AI and machine learning-powered Conversation Intelligence. With over 300 employees, 2,000+ customers, and $100M in revenue, there are tremendous opportunities to continue growing the business. We are building a world-class SaaS company and have raised over $184M from leading venture capitalists including Upfront Ventures, Accel, Silver Lake Waterman, H.I.G. Growth Partners, and Salesforce Ventures.

    About the Engineering Team:

    You’ll join a team where everyone, including you, is striving to constantly improve their knowledge of software development tools, practices, and processes. We are an incredibly supportive team. We swarm when problems arise and give excellent feedback to help each other grow. Working on our close-knit, multi-functional teams is a chance to share and grow your knowledge of different domains from databases to front ends to telephony and everything in between.

    We are passionate about many things: continuous improvement, working at a brisk but sustainable pace, writing resilient code, maintaining production reliability, paying down technical debt, hiring fantastic teammates; and we love to share these passions with each other.

    Learn more about the Invoca development team on our blog and check out our open source projects.

    You Will:

    Invoca offers a unique opportunity to make massive contributions to machine learning and data science as it applies to conversation intelligence, marketing, sales and user experience optimization.

    You are excited about this opportunity because you get to:

    • Design and develop highly performant and scalable data storage solutions
    • Extend and enhance the architecture of Invoca’s data infrastructure and pipelines
    • Deploy and fine-tune machine learning models within an API-driven environment, ensuring scalability, efficiency, and optimal performance.
    • Expand and optimize our Extract, Transform, and Load (ETL) processes to include various structured and unstructured data sources within the Invoca Platform.
    • Evaluate and implement new technologies as needed and work with technical leadership to drive adoption.
    • Collaborate with data scientists, engineering teams, analysts, and other stakeholders to understand data requirements and deliver solutions on behalf of our customers
    • Support diversity, equity and inclusion at Invoca

    At Invoca, our Senior Data Engineers benefit from mentorship provided by experts spanning our data science, engineering, and architecture teams. Our dedicated data science team is at the forefront of leveraging a blend of cutting-edge technology, including our proprietary and patented solutions, along with tools from leading vendors, to develop an exceptionally scalable data modeling platform.

    Our overarching objective is to seamlessly deliver models through our robust API platform, catering to both internal stakeholders and external clients. Your pivotal role will focus on optimizing model accessibility and usability, thereby expediting model integration within our feature engineering teams. Ultimately, this streamlined process ensures swift model adoption, translating to enhanced value for our customers.

     

    You Have:

    We are excited about you because you have:

    • 3+ years of professional experience in Data Engineering or a related area of data science or software engineering
    • Advanced proficiency in Python, including expertise in data processing libraries (e.g.,  spaCy, Pandas), data visualization libraries (e.g., Matplotlib, Plotly), and familiarity with machine learning frameworks
    • Advanced proficiency using Python API frameworks (e.g., FastAPI, Ray/AnyScale, AWS SageMaker) to build, host, and optimize machine learning model inference APIs.
    • Intermediate proficiency working with the Databricks platform (e.g. Unity Catalog, Jobs/Compute, Delta Lake) or a similar platform for data engineering and analytics tasks
    • Intermediate proficiency working with the Machine Learning and Large Language Model (LLM) tools from AWS (e.g., SageMaker, Bedrock) or other cloud vendors such as Azure or Google Cloud Platform
    • Intermediate proficiency with big data technologies and frameworks  (e.g., Spark, Hadoop)
    • Intermediate proficiency with SQL and relational databases (e.g., MySQL, PostgreSQL)
    • Basic proficiency in several areas apart from pure coding, such as monitoring, performance optimization, integration testing, security and more
    • Basic proficiency with Kafka (or similar stream-processing software) is a plus
    • Bachelor's Degree or equivalent experience preferred

    Salary, Benefits & Perks:

    Teammates begin receiving benefits on the first day of the month following or coinciding with one month of employment. Offerings include:

    • Paid Time Off - Invoca encourages a work-life balance for our employees. We have an outstanding PTO policy starting at 20 days off for all full-time employees. We also offer 16 paid holidays, 10 days of Compassionate Leave, days of volunteer time, and more.
    • Healthcare - Invoca offers a healthcare program that includes medical, dental, and vision coverage. There are multiple plan options to choose from. You can make the best choice for yourself, your partner, and your family.
    • Retirement - Invoca offers a 401(k) plan through Fidelity with a company match of up to 4%.
    • Stock options - All employees are invited to ownership in Invoca through stock options.
    • Employee Assistance Program - Invoca offers well-being support on issues ranging from personal matters to everyday-life topics through the WorkLifeMatters program.
    • Paid Family Leave - Invoca offers up to 6 weeks of 100% paid leave for baby bonding, adoption, and caring for family members.
    • Paid Medical Leave - Invoca offers up to 12 weeks of 100% paid leave for childbirth and medical needs.
    • Sabbatical - We thank our long-term team members with an additional week of PTO and a bonus after 7 years of service.
    • Wellness Subsidy - Invoca provides a wellness subsidy applicable to a gym membership, fitness classes, and more.
    • Position Base Range - $139,500 to $175,000 base salary, plus bonus potential
    • Please note, per Invoca's COVID-19 policy, depending on your vaccine verification status, you may be required to work only from home / remotely. At this time, travel and in-person meetings will require verification. This policy is regularly reviewed and subject to change at any time.

    Recently, we’ve noticed a rise in phishing attempts targeting individuals who are applying to our job postings. These fraudulent emails, posing as official communications from Invoca, aim to deceive individuals into sharing sensitive information. These attacks have attempted to use our name and logo, and have tried to impersonate individuals from our HR team by claiming to represent Invoca.

    We will never ask you to send financial information or other sensitive information via email. 

     

    DEI Statement

    We are committed to equal employment opportunity regardless of race, color, ancestry, religion, sex, national origin, sexual orientation, age, citizenship, marital status, disability, gender, gender identity or expression, or veteran status. We are proud to be an equal opportunity workplace.

    #LI-Remote

    See more jobs at Invoca

    Apply for this job

    +30d

    Senior Data Engineer

    Handshake - Remote (USA)
    ML, sql, Design, c++, docker

    Handshake is hiring a Remote Senior Data Engineer

    Everyone is welcome at Handshake. We know diverse teams build better products and we are committed to creating an inclusive culture built on a foundation of respect for all individuals. We strongly encourage candidates from non-traditional backgrounds, historically marginalized or underrepresented groups to apply.

    Your impact

    At Handshake, we are assembling a team of dynamic engineers who are passionate about creating high-quality, impactful products. As a Senior Data Engineer, you will play a key role in driving the architecture, implementation, and evolution of our cutting-edge data platform. Your technical expertise will be instrumental in helping millions of students discover meaningful careers, irrespective of their educational background, network, or financial resources.

    Our primary focus is on building a robust data platform that empowers all teams to develop data-driven features while ensuring that every facet of the business has access to the right data for drawing informed conclusions. While this individual will work closely with our ML teams, they will also be supporting our business’s data needs as a whole.

    Your role

    • Technical leadership: Taking ownership of the data engineering function and providing technical guidance to the data engineering team. Mentoring junior data engineers, fostering a culture of learning, and promoting best practices in data engineering.
    • Collaborating with cross-functional teams: Working closely with product managers, product engineers, and other stakeholders to define data requirements, design data solutions, and deliver high-quality, data-driven features.
    • Data architecture and design: Designing and implementing scalable and robust data pipelines, data services, and data products that meet business needs and adhere to best practices. Staying abreast of emerging technologies and tools in the data engineering space, evaluating their potential impact on the data platform, and making strategic recommendations.
    • Performance optimization: Identifying performance bottlenecks in data processes and implementing solutions to enhance data processing efficiency.
    • Data quality and governance: Ensuring data integrity, reliability, and security through the implementation of data governance policies and data quality monitoring.
    • Advancing our Generative AI strategy: Leveraging your Data Engineering knowledge to design and implement data pipelines that support our Generative AI initiatives, advising and working in collaboration with our ML teams.

    Your experience

    • Extensive data engineering experience: A proven track record in designing and implementing large-scale, complex data pipelines, data warehousing solutions, and data services. Deep knowledge of data engineering technologies, tools, and frameworks.
    • Cloud platform proficiency: Hands-on experience with cloud-based data technologies, preferably Google Cloud Platform (GCP), including BigQuery, Dataflow, Bigtable, and more
    • Advanced SQL skills: Strong expertise in SQL and experience with data modeling and database design conventions.
    • Problem-solving abilities: Exceptional problem-solving skills, with the ability to tackle complex data engineering challenges and propose innovative solutions.
    • Collaborative mindset: A collaborative and team-oriented approach to work, with the ability to communicate effectively with both technical and non-technical stakeholders.

    Bonus areas of expertise

    • Machine learning for data enrichment: Experience in applying machine learning techniques to data engineering tasks for data enrichment and augmentation.
    • End to end data service deployment, comfortable with product alignment of data-driven initiatives
    • Containerization and orchestration: Familiarity with containerization technologies like Docker and container orchestration platforms like Kubernetes.
    • dbt: Experience with dbt as a data transformation tool for orchestrating and organizing data pipelines.

    Compensation range

    $173,000-$213,580

    For cash compensation, we set standard ranges for all U.S.-based roles based on function, level, and geographic location, benchmarked against similar stage growth companies. In order to be compliant with local legislation, as well as to provide greater transparency to candidates, we share salary ranges on all job postings regardless of desired hiring location. Final offer amounts are determined by multiple factors, including geographic location as well as candidate experience and expertise, and may vary from the amounts listed above.

    About us

    Handshake is the #1 place to launch a career with no connections, experience, or luck required. The platform connects up-and-coming talent with 750,000+ employers - from Fortune 500 companies like Google, Nike, and Target to thousands of public school districts, healthcare systems, and nonprofits. In 2022 we announced our $200M Series F funding round. This Series F fundraise and valuation of $3.5B will fuel Handshake’s next phase of growth and propel our mission to help more people start, restart, and jumpstart their careers.

    When it comes to our workforce strategy, we’ve thought deeply about how work-life should look here at Handshake. With our Hub-Based Remote Working strategy, employees can enjoy the flexibility of remote work, whilst ensuring collaboration and team experiences in a shared space remains possible. Handshake is headquartered in San Francisco with offices in Denver, New York, London, and Berlin and teammates working globally. 

    Check out our careers site to find a hub near you!

    What we offer

    At Handshake, we'll give you the tools to feel healthy, happy and secure.

    Benefits below apply to employees in full-time positions.

    • Equity and ownership in a fast-growing company.
    • 16 weeks of paid parental leave for birth-giving parents & 10 weeks of paid parental leave for non-birth-giving parents.
    • Comprehensive medical, dental, and vision policies including LGBTQ+ coverage. We also provide resources for Mental Health Assistance, Employee Assistance Programs and counseling support.
    • Handshake offers a $500/£360 home office stipend for you to spend during your first 3 months to create a productive and comfortable workspace at home.
    • Generous learning & development opportunities and an annual $2,000/£1,500/€1,850 stipend for you to grow your skills and career.
    • Financial coaching through Origin to help you through your financial journey.
    • Monthly internet stipend and a brand new MacBook to allow you to do your best work.
    • Monthly commuter stipend for you to expense your travel to the office (for office-based employees).
    • Free lunch provided twice a week across all offices.
    • Referral bonus to reward you when you bring great talent to Handshake.

    (US-specific benefits, in addition to the first section)

    • 401k Match: Handshake offers a dollar-for-dollar match on 1% of deferred salary, up to a maximum of $1,200 per year.
    • All full-time US-based Handshakers are eligible for our flexible time off policy to get out and see the world. In addition, we offer 8 standardized holidays, and 2 additional days of flexible holiday time off. Lastly, we have a Winter #ShakeBreak, a one-week period of Collective Time Off.
    • Lactation support: Handshake partners with Milk Stork to provide comprehensive, 100% employer-sponsored lactation support to traveling parents and guardians.

    (UK-specific benefits, in addition to the first section) 

    • Pension Scheme: Handshake will provide you with a workplace pension, where you will make contributions based on 5% of your salary. Handshake will pay the equivalent of 3% towards your pension plan, subject to qualifying earnings limits.
    • Up to 25 days of vacation to encourage people to reset, recharge, and refresh, in addition to 8 bank holidays throughout the year.
    • Regular offsites each year to bring the team together + opportunity to travel to our HQ in San Francisco.
    • Discounts across various high street retailers, cinemas and other social activities exclusively for Handshake UK employees.

    (Germany-specific benefits, in addition to the first section)

    • 25 days of annual leave + we have a Winter #ShakeBreak, a one-week period of Collective Time Off across the company.
    • Regular offsites each year to bring the team together + opportunity to travel to our HQ in San Francisco once a year.
    • Urban sports club membership offering access to a diverse network of fitness and wellness facilities.
    • Discounts across various high street retailers, cinemas and other social activities exclusively for Handshake Germany employees.

    For roles based in Romania: Please ask your recruiter about region specific benefits.

    Looking for more? Explore our mission, values and comprehensive US benefits at joinhandshake.com/careers.

    Handshake is committed to providing reasonable accommodations in our recruitment processes for candidates with disabilities, sincerely held religious beliefs or other reasons protected by applicable laws. If you need assistance or reasonable accommodation, please reach out to us at people-hr@joinhandshake.com.

    See more jobs at Handshake

    Apply for this job

    +30d

    Senior Data Engineer

    Devoteam - Tunis, Tunisia, Remote
    airflow, sql, scrum

    Devoteam is hiring a Remote Senior Data Engineer

    Job Description

    Within the "Data Platform" division, the consultant will join a SCRUM team and focus on a specific functional scope.

    Your role will be to contribute to data projects by bringing your expertise to the following tasks:

    • Design, develop, and maintain robust, scalable data pipelines on GCP, using tools such as BigQuery, Airflow, Looker, and DBT.
    • Collaborate with business teams to understand data requirements and design suitable solutions.
    • Optimise the performance of SQL queries and ETL processes to guarantee fast response times and scalability.
    • Implement data quality processes to guarantee data integrity and consistency.
    • Work closely with engineering teams to integrate data pipelines into existing applications and services.
    • Stay up to date with new technologies and best practices in data processing and analytics.

    Qualifications

    • Master's-level degree (Bac+5) from an engineering school or university equivalent, with a specialisation in computer science.
    • At least 3 years of experience in data engineering, with significant experience in a GCP cloud-based environment.
    • GCP (Google Cloud Platform) certification is a plus.
    • Very good written and oral communication (quality deliverables and reporting)

    See more jobs at Devoteam

    Apply for this job

    +30d

    Data Engineer

    phData - LATAM, Remote
    scala, sql, azure, java, python, AWS

    phData is hiring a Remote Data Engineer


    See more jobs at phData

    Apply for this job