Data Engineer Remote Jobs

113 Results

19d

Data Engineer

Blend36 | Edinburgh, United Kingdom, Remote
terraform, sql, design, azure, python

Blend36 is hiring a Remote Data Engineer

Job Description

Life as a Data Engineer at Blend

We are looking for someone who is ready for the next step in their career and is excited by the idea of solving problems and designing best-in-class solutions.

However, they also need to be aware of the practicalities of making a difference in the real world – whilst we love innovative advanced solutions, we also believe that sometimes a simple solution can have the most impact.   

Our Data Engineer is someone who feels the most comfortable around solving problems, answering questions and proposing solutions. We place a high value on the ability to communicate and translate complex analytical thinking into non-technical and commercially oriented concepts, and experience working on difficult projects and/or with demanding stakeholders is always appreciated. 

Reporting to a Senior Data Engineer and working closely with the Data Science and Business Development teams, this role will be responsible for driving high delivery standards and innovation in the company. Typically, this involves delivering data solutions to support the provision of actionable insights for stakeholders. 

What can you expect from the role? 

  • Prepare and present data-driven solutions to stakeholders.
  • Analyse and organise raw data.
  • Design, develop, deploy and maintain ingestion, transformation and storage solutions.
  • Use a variety of Data Engineering tools and methods to deliver.
  • Own projects end-to-end.
  • Contribute to solutions design and proposal submissions.
  • Support the development of the data engineering team within Blend.
  • Maintain in-depth knowledge of data ecosystems and trends.
  • Mentor junior colleagues.

Qualifications

What do you need to have?

  • Proven track record of building analytical production pipelines using Python and SQL programming.
  • Working knowledge of large-scale data systems such as data warehouses, and of the best practices and principles for managing them.
  • Experience with development, test and production environments and knowledge and experience of using CI/CD.
  • ETL technical design, development and support.
  • Knowledge of Data Warehousing and database operation, management & design.

Nice to have 

  • Knowledge in container deployment.
  • Experience of creating ARM template design and production (or other IaC, e.g., CloudFormation, Terraform).
  • Experience in cloud infrastructure management.
  • Experience of Machine Learning deployment.
  • Experience in Azure tools and services such as Azure ADFv2, Azure Databricks, Storage, Azure SQL, Synapse and Azure IoT.
  • Experience of leveraging data out of SAP or S/4HANA.

See more jobs at Blend36

Apply for this job

20d

Staff Data Security Engineer

Gemini | Remote (USA)
remote-first, design, kubernetes, linux, python, AWS

Gemini is hiring a Remote Staff Data Security Engineer

About the Company

Gemini is a global crypto and Web3 platform founded by Tyler Winklevoss and Cameron Winklevoss in 2014. Gemini offers a wide range of crypto products and services for individuals and institutions in over 70 countries.

Crypto is about giving you greater choice, independence, and opportunity. We are here to help you on your journey. We build crypto products that are simple, elegant, and secure. Whether you are an individual or an institution, we help you buy, sell, and store your bitcoin and cryptocurrency. 

At Gemini, our mission is to unlock the next era of financial, creative, and personal freedom.

In the United States, we have a flexible hybrid work policy for employees who live within 30 miles of our office headquartered in New York City and our office in Seattle. Employees within the New York and Seattle metropolitan areas are expected to work from the designated office twice a week, unless there is a job-specific requirement to be in the office every workday. Employees outside of these areas are considered part of our remote-first workforce. We believe our hybrid approach for those near our NYC and Seattle offices increases productivity through more in-person collaboration where possible.

The Department: Platform Security

The Role: Staff Data Security Engineer

The Platform Security team secures Gemini’s infrastructure through service hardening and by developing and supporting a suite of foundational tools. We provide secure-by-default infrastructure, consumable security services, and expert consultation to engineering teams for secure cloud and non-cloud infrastructure.

The Platform Security team covers a broad problem space that includes all areas of Gemini’s platform infrastructure. In the past, this team has focused specifically on cloud security, and we continue to invest heavily in this area. This role will bring additional depth and specialization in database design and security. We also value expertise in neighboring areas of infrastructure and platform security engineering, including PKI, core cryptography, identity management, and network security.

Responsibilities:

  • Design, deploy, and maintain databases and relevant security controls for security and engineering teams.
  • Build and improve security controls capturing data in transit and data at rest. 
  • Partner with engineering teams on security architecture and implementation decisions.
  • Own our database security roadmap and act as relevant SME within Gemini.
  • Collaborate with AppSec, Threat Detection, Incident Response, GRC and similar security functions to identify, understand, and reduce security risk.

Minimum Qualifications:

  • 6+ years of experience in the field.
  • Extensive knowledge of database architecture and security principles.
  • Significant experience with container orchestration technologies and relevant security considerations. We often use Kubernetes and EKS.
  • Experience in SRE, systems engineering, or network engineering.
  • Experience with distributed systems or cloud computing. We often use AWS.
  • Significant software development experience. We often use Python or Go.
  • Experience building and owning high-availability critical systems or cloud-based services.
  • Able to self-scope, define, and manage short and long term technical goals.

Preferred Qualifications:

  • Proven track record securing databases and ensuring data integrity.
  • Experience securing AWS and Linux environments, both native and third-party.
  • Experience designing and implementing cryptographic infrastructure such as PKI, secrets management, authentication, or secure data storage/transmission.
  • Experience designing and implementing systems for identity and access management.
  • Experience with configuration management and infrastructure as code. We often use Terraform.
It Pays to Work Here
 
The compensation & benefits package for this role includes:
  • Competitive starting salary
  • A discretionary annual bonus
  • Long-term incentive in the form of a new hire equity grant
  • Comprehensive health plans
  • 401K with company matching
  • Paid Parental Leave
  • Flexible time off

Salary Range: The base salary range for this role is between $172,000 and $215,000 in the State of New York, the State of California and the State of Washington. This range is not inclusive of our discretionary bonus or equity package. When determining a candidate’s compensation, we consider a number of factors including skillset, experience, job scope, and current market data.

At Gemini, we strive to build diverse teams that reflect the people we want to empower through our products, and we are committed to equal employment opportunity regardless of race, color, ancestry, religion, sex, national origin, sexual orientation, age, citizenship, marital status, disability, gender identity, or Veteran status. Equal Opportunity is the Law, and Gemini is proud to be an equal opportunity workplace. If you have a specific need that requires accommodation, please let a member of the People Team know.

#LI-AH1

Apply for this job

26d

Data Engineer

Zone IT | Sydney, New South Wales, Australia, Remote Hybrid

Zone IT is hiring a Remote Data Engineer

We are currently seeking a highly motivated and experienced Data Engineer for a full-time position. You will be responsible for designing and implementing data architectures, integrating data from various sources, and optimizing data pipelines to ensure efficient and accurate data processing.

Key responsibilities:

  • Design and implement data architectures, including databases and processing systems
  • Integrate data from various sources and ensure data quality and reliability
  • Optimize data pipelines for scalability and performance
  • Develop and maintain ETL processes and data transformation solutions
  • Apply data security measures and ensure compliance with data privacy regulations
  • Create and maintain documentation related to data systems design and maintenance
  • Collaborate with cross-functional teams to understand data requirements and provide effective data solutions

Key skills and qualifications:

  • Bachelor's degree or higher in Computer Science, Data Science, or a related field
  • Strong proficiency in SQL, Python, and/or Java
  • Experience with ETL processes and data integration
  • Working knowledge of data modeling and database design principles
  • Familiarity with big data technologies such as Hadoop, Spark, or Kafka is a plus
  • Experience with cloud platforms such as AWS, Azure, or GCP is a plus
  • Strong problem-solving and analytical skills
  • Excellent communication and collaboration abilities

About Us

Zone IT Solutions is an Australia-based recruitment company. We specialize in Digital, ERP and larger IT services. We offer flexible, efficient and collaborative solutions to any organization that requires IT experts. Our agile, agnostic and flexible solutions will help you source the IT expertise you need. Our delivery offices are in Melbourne, Sydney and India. If you are looking for new opportunities, please share your profile at Careers@zoneitsolutions.com or contact us at 0434189909.

Also follow our LinkedIn page for new job opportunities and more.

Zone IT Solutions is an equal opportunity employer and our recruitment process focuses on essential skills and abilities. We welcome applicants from a diverse range of backgrounds, including Aboriginal and Torres Strait Islander peoples, people from culturally and linguistically diverse (CALD) backgrounds and people with disabilities.

See more jobs at Zone IT

Apply for this job

26d

Data Engineer

Tech9 | Remote
ML, Full Time, DevOps, agile, terraform, sql, design, azure, python

Tech9 is hiring a Remote Data Engineer


See more jobs at Tech9

Apply for this job

30d

Sr. Data Engineer - Remote

Trace3 | Remote
DevOps, agile, nosql, sql, design, azure, graphql, api, java, c++, c#, python, backend

Trace3 is hiring a Remote Sr. Data Engineer - Remote


Who is Trace3?

Trace3 is a leading Transformative IT Authority, providing unique technology solutions and consulting services to our clients. Equipped with elite engineering and dynamic innovation, we empower IT executives and their organizations to achieve competitive advantage through a process of Integrate, Automate, Innovate.

Our culture at Trace3 embodies the spirit of a startup with the advantage of a scalable business. Employees can grow their career and have fun while doing it!

Trace3 is headquartered in Irvine, California. We employ more than 1,200 people all over the United States. Our major field office locations include Denver, Indianapolis, Grand Rapids, Lexington, Los Angeles, Louisville, Texas, San Francisco.  

Ready to discover the possibilities that live in technology?

 

Come Join Us!

Street-Smart - Thriving in Dynamic Times

We are flexible and resilient in a fast-changing environment. We continuously innovate and drive constructive change while keeping a focus on the “big picture.” We exercise sound business judgment in making high-quality decisions in a timely and cost-effective manner. We are highly creative and can dig deep within ourselves to find positive solutions to different problems.

Juice - The “Stuff” it takes to be a Needle Mover

We get things done and drive results. We lead without a title, empowering others through a can-do attitude. We look forward to the goal, mentally mapping out every checkpoint on the pathway to success, and visualizing what the final destination looks and feels like.

Teamwork - Humble, Hungry and Smart

We are humble individuals who understand how our job impacts the company's mission. We treat others with respect, admit mistakes, give credit where it’s due and demonstrate transparency. We “bring the weather” by exhibiting positive leadership and solution-focused thinking. We hug people in their trials, struggles, and failures – not just their success. We appreciate the individuality of the people around us.


 

Who We’re Looking For:

We’re looking to add a Senior Data Integration Engineer with a strong background in data engineering and development.  You will work with a team of software and data engineers to build client-facing data-first solutions utilizing data technologies such as SQL Server and MongoDB. You will develop data pipelines to transform/wrangle/integrate the data into different data zones.

To be successful in this role, you will need to hold extensive knowledge of SQL, relational databases, ETL pipelines, and big data fundamentals.  You will also need to possess strong experience in the development and consumption of RESTful APIs.  The ideal candidate will also be a strong independent worker and learner.

 

What You’ll Be Doing

  • Develop processes and data models for consuming large quantities of 3rd party vendor data via RESTful APIs.
  • Develop data processing pipelines to analyze, transform, and migrate data between applications and systems.
  • Analyze data from multiple sources and negotiate differences in storage schema using the ETL process.
  • Develop APIs for external consumption by partners and customers.
  • Develop and support our ETL environment by recommending improvements, monitoring, and deploying quality and validation processes to ensure accuracy and integrity of data.
  • Design, develop, test, deploy, maintain, and improve data integration pipelines.
  • Create technical solutions that solve business problems and are well engineered, operable, and maintainable.
  • Design and implement tools to detect data anomalies (observability). Ensure that data is accurate, complete, and high quality across all platforms.
  • Troubleshoot data issues and perform root cause analysis to proactively resolve product and operational issues.
  • Assemble large and complex data sets; develop data models based on specifications using structured data sets.
  • Develop familiarity with emerging and complex automations and technologies that support business processes.
  • Develop scalable and re-usable frameworks for ingestion and transformation of large datasets.
  • Work within an Agile delivery / DevOps methodology to deliver product increments in iterative sprints.
  • Work with stakeholders including the Executive, Product, Data and Design teams to support their data infrastructure needs while assisting with data-related technical issues.
  • Develop data models and mappings and build new data assets required by users. Perform exploratory data analysis on existing products and datasets.
  • Identify, design, and implement internal process improvements, including re-designing infrastructure for greater scalability, optimizing data delivery, and automating manual processes.
  • Engage in logical and physical design of databases, table creation, script creation, views, procedures, packages, and other database objects.
  • Create documentation for solutions and processes implemented or updated to ensure team members and stakeholders can correctly interpret it.
  • Design and implement processes and/or process improvements to help the development of technology solutions.

 

Your Skills and Experience (In Order of Importance):

  • 5+ years of relational database development experience; including SQL query generation and tuning, database design, and data concepts.
  • 5+ years of backend and Restful API development experience in NodeJS (experience with GraphQL a plus).
  • 5+ years of development experience with the following languages: Python, Java, C#/.NET.
  • 5+ years of experience with SQL and NoSQL databases; including MS SQL and MongoDB.
  • 5+ years consuming RESTful APIs with data ingestion and storage.
  • 5+ years developing RESTful APIs for use by customers and 3rd parties.
  • 3+ years of professional work experience designing and implementing data pipelines in a cloud environment.
  • 3+ years of experience working within Azure cloud.
  • Experience in integrating and ingesting data from external data sources.
  • Strong diagnostic skills and ability to research, troubleshoot, and logically determine solutions.
  • Ability to effectively prioritize tasks in a fast-paced, high-volume, and evolving work environment.
  • Comfortable managing multiple and changing priorities, and meeting deadlines.
  • Highly organized, detail-oriented, excellent time management skills.
  • Excellent written and verbal communication skills.
Actual salary will be based on a variety of factors, including location, experience, skill set, performance, licensure and certification, and business needs. The range for this position in other geographic locations may differ. Certain positions may also be eligible for variable incentive compensation, such as bonuses or commissions, that is not included in the base salary.
Estimated Pay Range
$142,500 - $168,700 USD

The Perks:

  • Comprehensive medical, dental and vision plans for you and your dependents
  • 401(k) Retirement Plan with Employer Match, 529 College Savings Plan, Health Savings Account, Life Insurance, and Long-Term Disability
  • Competitive Compensation
  • Training and development programs
  • Stocked kitchen with snacks and beverages
  • Collaborative and cool culture
  • Work-life balance and generous paid time off

 

***To all recruitment agencies: Trace3 does not accept unsolicited agency resumes/CVs. Please do not forward resumes/CVs to our careers email addresses, Trace3 employees or any other company location. Trace3 is not responsible for any fees related to unsolicited resumes/CVs.

See more jobs at Trace3

Apply for this job

+30d

Process Engineer

Tessenderlo Group | Phoenix, AZ, Remote
design, api

Tessenderlo Group is hiring a Remote Process Engineer

Job Description

Are you an experienced Chemical Engineer passionate about process optimization and hands-on work? Do you thrive in environments where you're given the autonomy to lead, innovate, and solve complex problems? If so, we have an exciting opportunity for you!

As a Process Engineer III with Tessenderlo Kerley, Inc., you will be pivotal in troubleshooting, designing, and implementing different processes at multiple sites. You will collaborate closely with plant operations, HS&E, and project teams to achieve company production, quality control, and compliance goals. In addition, you will work with the Process Engineering Manager and other engineers to learn company tools and standard practices. Tessenderlo Kerley has multiple facilities in the U.S. and abroad, offering countless opportunities for professional growth and development.

The ideal candidate for this role will have a sharp eye for detail, strong organizational skills and the ability to balance multiple projects. You’ll also need a solid technical background in chemical plant operations, an interest in analyzing process data, and the drive to find practical solutions for engineering challenges.

Key Responsibilities:

  • Chemical Engineering – Understanding piping and instrumentation diagrams, mass and energy balances, chemical compatibility, and product quality controls.
  • Process Safety Management – Participation in or leadership of PHA/HAZOPs, assisting with change management.
  • Design – P&ID redlines, equipment/instrument specifications, and calculations (line sizing, PSV sizing per API codes, rotating equipment sizing).
  • Project Execution – Scope of work development, gathering and review of vendor bids, and collaboration with other engineering disciplines.
  • Field Work – Provide technical support for troubleshooting, turnarounds and project commissioning efforts at 2-4 sites, with approximately 30-40% travel.

    Qualifications

    What We’re Looking For:

    • A Bachelor of Science degree in Chemical Engineering.
    • At least five years of hands-on process engineering experience, ideally with some exposure to Sulfur Recovery Units.
    • Strong, independent decision-making skills to drive projects with minimal oversight.
    • Technical skills such as P&ID design, equipment/instrument sizing and selection, review of procedures and operating manuals.
    • A knack for balancing multiple projects and sites while maintaining safety and productivity standards.
    • A motivated, safety-conscious individual who inspires others through professionalism and effective communication.

    What we can offer you:

    • Independence: You will have the freedom to make impactful decisions and optimize processes with minimal supervision.
    • Continuous Learning: You will participate in seminars and gain exposure to various subjects, processes and cutting-edge technology.
    • Diverse Experiences: With both office and fieldwork, you'll collaborate with cross-functional teams, travel to multiple sites (domestic and minimal international), and tackle unique challenges.
    • Flexibility: Tessenderlo Kerley values professional growth and allows engineers to explore their interests related to company projects and assignments.
    • Safety First: You will join a company with an outstanding safety record and where your well-being is a top priority.

    Physical Requirements:

    • Ability to lift 50 pounds, climb stairs and use a variety of safety equipment, including respirators and SCBAs.

    If you’re a problem solver, project executor, and passionate about pushing the boundaries of process engineering, this is the role for you!

    Join our team and take your career to the next level by applying your skills to real-world challenges in a dynamic and rewarding environment.

    See more jobs at Tessenderlo Group

    Apply for this job

    +30d

    Data Engineer

    Phocas Software | Christchurch, Canterbury, New Zealand, Remote Hybrid
    sql, postgresql, python

    Phocas Software is hiring a Remote Data Engineer

    We're a business planning and analytics company on a mission to make people feel good about data. Since 2001, we’ve helped thousands of companies turn complex business data into performance boosting results. Despite our now global status of 300 world-class humans, we’ve held on to our start-up roots. The result is a workplace that’s fast, exciting and designed for fun.

    As long as you’re happy, the rest falls into place. Think less stress, higher performance, more energy and all-round nicer human. Your friends and family will be delighted.

    As the Internal Data Specialist, you'll ensure the business can leverage our internal data sources, allowing us to make better decisions, react faster to changes and build confidence in our data and decisions. Your work will be split between support and project deliverables working with the Phocas IT and Finance teams and the wider business.

    What will you be doing?

    • Supporting internal reporting systems and data transformation processes.
    • Implementing new dashboards, reports, data sources and improvements to existing data sources.
    • Creating scalable and robust pipelines to ingest data from multiple structured and unstructured sources (APIs, databases, flat files, etc.) into our data platform.
    • Generating answers to the business’ questions by working with our internal data assets.
    • Improving business understanding of our data including where it comes from, how it fits together, and the meaning of the data.

    What are we looking for?

    • A degree in data science/computer science or similar, and solid (5+ years) experience in similar roles, working with data analytics products (Phocas, Power BI, etc.). 
    • Strong database experience (SQL Server, PostgreSQL) and experience with scripting languages (Python).
    • A general understanding of finance basics: terms, systems, processes, and best practices. 
    • Strong experience designing, developing, and supporting complex data import and transformation processes.
    • Experience creating technical and non-technical documentation and user guides, and a natural tendency to produce strong documentation (both comments within the code, and externally).
    • Proven critical thinking skills; able to proactively problem solve and develop out of the box solutions. 
    • A growth mindset: a willingness to embrace new challenges and opportunities to grow.
    • Someone who can develop strong relationships and work collaboratively and supportively with a diverse global team.
    • Bonus points for experience building financial reporting solutions, working with third-party APIs to extract data in an automated manner, and/or experience working in internal customer facing support roles.

    Why work at Phocas? 

    • People – when we ask what people like about working here, 'the people’ is the single most common answer 
    • Social/fun stuff – opportunities to get together, sometimes (optional) silly games, & food. We all really like food. 
    • Our office – spacious, conveniently located in sunny Sydenham, plenty of parking for four-, two- or even single-wheeled vehicles.
    • Southern Cross, Life, TPD and Income Protection Insurance 
    • Extra paid parental leave 
    • Flexible/hybrid working policy  

    Phocas is an Accredited Employer and typically we are strong supporters of international talent, but due to current visa settings and processing times, we can only consider applicants with current NZ working rights.

    We are a 2024 Circle Back Initiative Employer – we commit to respond to every applicant.

    To all recruitment agencies: Phocas does not accept agency resumes. Please do not forward resumes to our jobs alias, Phocas employees or any other company location. Phocas will not be responsible for any fees related to unsolicited resumes.

    Phocas is an Equal Employment Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, colour, religion, sex, national origin, sexual orientation, gender identity, disability and protected veterans status or any other characteristic protected by law.

    #LI-NG1 #LI-HYBRID

    See more jobs at Phocas Software

    Apply for this job

    +30d

    Senior Data Engineer

    Bloomreach | Remote CEE, Czechia, Slovakia
    redis, remote-first, c++, kubernetes, python

    Bloomreach is hiring a Remote Senior Data Engineer

    Bloomreach is the world’s #1 Commerce Experience Cloud, empowering brands to deliver customer journeys so personalized, they feel like magic. It offers a suite of products that drive true personalization and digital commerce growth, including:

    • Discovery, offering AI-driven search and merchandising
    • Content, offering a headless CMS
    • Engagement, offering a leading CDP and marketing automation solutions

    Together, these solutions combine the power of unified customer and product data with the speed and scale of AI optimization, enabling revenue-driving digital commerce experiences that convert on any channel and every journey. Bloomreach serves over 850 global brands including Albertsons, Bosch, Puma, FC Bayern München, and Marks & Spencer. Bloomreach recently raised $175 million in a Series F funding round, bringing its total valuation to $2.2 billion. The investment was led by Goldman Sachs Asset Management with participation from Bain Capital Ventures and Sixth Street Growth. For more information, visit Bloomreach.com.

     

    We want you to join us as a full-time Senior Data Engineer on our Data Pipeline team. We work remote-first, but we are more than happy to meet you in our nice office in Bratislava or Brno. And if you are interested in who will be your engineering manager, check out Vaclav's LinkedIn.

    Intrigued? Read on…

    Your responsibilities

    • You will develop and maintain a Data LakeHouse on top of the GCP platform using Apache Iceberg, BigQuery, BigLake tables, DataPlex and DataProc in the form of Apache Spark/Flink, with open file formats like AVRO and Parquet
    • You will help to maintain a streaming mechanism on how data from Apache Kafka gets into the Data LakeHouse
    • You will optimise the Data LakeHouse for near-real-time and non-real-time analytical use-cases primarily for customer activation and scenarios/campaign evaluation 
    • You should help with areas like data discovery and managed access to data through the data governance layer and data catalog using DataPlex, so our engineering teams can leverage this unified Data LakeHouse
    • You feel responsible for Data Modeling and schema evolution
    • You should help us with adopting the concepts from Data Fabrics and Data Mesh to run data as a product to unlock the potential the data can unleash for our clients
    • You should bring expertise from similar previous projects into the team, to influence how we adopt and evolve the concepts mentioned above, and in addition, topics like Zero-copy or reverse ETL, to increase the ease of integration with clients’ platforms
    • You will also help to maintain the existing data exports to Google’s BigQuery using Google’s Dataflow and Apache Beam
    • You will help us run and support our services in production handling high-volume traffic using Google Cloud Platform and Kubernetes.
    • You will review the code of your peers and they'll review yours. We have high code quality standards and the four-eyes principle is a must!

    Your qualifications

    • You have production experience with building and operating a Data Lake, Data Warehouse or Data LakeHouse
    • You have a taste for big data streaming, storage and processing using open source technologies
    • You can demonstrate your understanding of what it means to treat data as a product
    • You know what Data Meshes and Data Fabrics are, and what is critical to ensure for them to bring value
    • You are able to learn and adapt. It'll be handy while exploring new tech, navigating our not-so-small code base, or when iterating on our team processes.
    • You know data structures, and you know Python and (optionally) Go.

    Our tech stack

    • Google Cloud Platform, DataFlow, Apache Beam, BigQuery, BigLake Table
    • Open formats IceBerg, Avro, Parquet
    • DataProc, Spark, Flink, Presto
    • Python, GO
    • Apache Kafka, Kubernetes, GitLab
    • BigTable, Mongo, Redis
    • … and much more

    Compensations

    • Salary range starting from 4300 EUR gross per month, going up depending on your experience and skills
    • There's a bonus based on company performance and your salary.
    • You will be entitled to restricted stock options that will truly make you a part of Bloomreach.
    • You can spend 1500 USD per year on the education of your choice (books, conferences, courses, ...).
    • You can count on free access to Udemy courses.
    • We have 4 company-wide disconnect days throughout the year during which you will be encouraged not to work and spend a day with your friends and family "disconnected".
    • You will have an extra 5 days of paid vacation. Extra days off for extra work-life balance.
    • Food allowance!
    • Sweet referral bonus up to 3000 USD based on the position.

    Your success story

    • During the first 30 days, you will get to know the team, the company, and the most important processes. You'll work on your first tasks. We will help you get familiar with our codebase and our product.
    • During the first 90 days, you will participate in your first, more complex projects. You will help the team find solutions to various problems, break the solutions down into smaller tasks and participate in implementation. You will learn how we identify problems, how we prioritize our efforts, and how we deliver value to our customers.
    • During the first 180 days, you'll become an integral part of the team. You will achieve the first goals we set together to help you grow and explore new and interesting things. You will help us deliver multi-milestone projects bringing great value to our customers. You will help us mitigate your first incidents and eventually even join the on-call rotation. You will get a sense of where the team is heading and you'll help us shape our future.
    • Finally, you'll find out that our values are truly lived by us. We are dreamers and builders. Join us!

     

    More things you'll like about Bloomreach:

    Culture:

    • A great deal of freedom and trust. At Bloomreach we don’t clock in and out, and we have neither corporate rules nor long approval processes. This freedom goes hand in hand with responsibility. We are interested in results from day one. 

    • We have defined our 5 values and the 10 underlying key behaviors that we strongly believe in. We can only succeed if everyone lives these behaviors day to day. We've embedded them in our processes like recruitment, onboarding, feedback, personal development, performance review and internal communication. 

    • We believe in flexible working hours to accommodate your working style.

    • We work remote-first with several Bloomreach Hubs available across three continents.

    • We organize company events to experience the global spirit of the company and get excited about what's ahead.

    • We encourage and support our employees to engage in volunteering activities - every Bloomreacher can take 5 paid days off to volunteer*.
    • The Bloomreach Glassdoor page elaborates on our stellar 4.6/5 rating. The Bloomreach Comparably page culture score is even higher, at 4.9/5

    Personal Development:

    • We have a People Development Program -- participating in personal development workshops on various topics run by experts from inside the company. We are continuously developing & updating competency maps for select functions.

    • Our resident communication coach Ivo Večeřa is available to help navigate work-related communications & decision-making challenges.*
    • Our managers are strongly encouraged to participate in the Leader Development Program to develop in the areas we consider essential for any leader. The program includes regular comprehensive feedback, consultations with a coach and follow-up check-ins.

    • Bloomreachers utilize the $1,500 professional education budget on an annual basis to purchase education products (books, courses, certifications, etc.)*

    Well-being:

    • The Employee Assistance Program -- with counselors -- is available for non-work-related challenges.*

    • Subscription to Calm - sleep and meditation app.*

    • We organize ‘DisConnect’ days where Bloomreachers globally enjoy one additional day off each quarter, allowing us to unwind together and focus on activities away from the screen with our loved ones.

    • We facilitate sports, yoga, and meditation opportunities for each other.

    • Extended parental leave up to 26 calendar weeks for Primary Caregivers.*

    Compensation:

    • Restricted Stock Units or Stock Options are granted depending on a team member’s role, seniority, and location.*

    • Everyone gets to participate in the company's success through the company performance bonus.*

    • We offer an employee referral bonus of up to $3,000 paid out immediately after the new hire starts.

    • We reward & celebrate work anniversaries -- Bloomversaries!*

    (*Subject to employment type. Interns are exempt from marked benefits, usually for the first 6 months.)

    Excited? Join us and transform the future of commerce experiences!

    If this position doesn't suit you, but you know someone who might be a great fit, share it - we will be very grateful!


    Any unsolicited resumes/candidate profiles submitted through our website or to personal email accounts of employees of Bloomreach are considered property of Bloomreach and are not subject to payment of agency fees.

     #LI-Remote

    See more jobs at Bloomreach

    Apply for this job

    +30d

    Data Engineer II - (Remote - US)

    Mediavine, Atlanta, Georgia, United States, Remote
    sql, Design, python, AWS

    Mediavine is hiring a Remote Data Engineer II - (Remote - US)

    Mediavine is seeking an experienced Data Engineer to join our engineering team. We are looking for someone who enjoys solving interesting problems and wants to work with a small team of talented engineers on a product used by thousands of publishers. Applicants must be based in the United States.

    About Mediavine

    Mediavine is a fast-growing advertising management company representing over 10,000 websites in the food, lifestyle, DIY, and entertainment space. Founded by content creators, for content creators, Mediavine is a Top 20 Comscore property, exclusively reaching over 125 million monthly unique visitors. With best-in-class technology and a commitment to traffic quality and brand safety, we ensure optimal performance for our creators.

    Mission & Culture

    We are striving to build an inclusive and diverse team of highly talented individuals that reflects the industries we serve and the world we live in. The unique experiences and perspectives of our team members are encouraged and valued. If you are talented, driven, and enjoy the pace of a start-up-like environment, let's talk!

    Position Title & Overview:

    The Data & Analytics team consists of data analysts, data engineers and analytics engineers working to build the most effective platform and tools to help uncover opportunities and make decisions with data here at Mediavine. We partner with Product, Support, Ad Operations and other teams within the Engineering department to understand behavior, develop accurate predictors and build solutions that provide the best internal and external experience possible.

    A Data Engineer at Mediavine will help build and maintain our data infrastructure: building scalable data pipelines, managing transformation processes, and ensuring data quality and security at every step along the way. This includes writing and maintaining code in Python and SQL, developing on AWS, and selecting and using third-party tools like Rundeck, Metabase, and others to round out the environment. You will be involved in decisions around tool selection and coding standards.

     Our current data engineering toolkit consists of custom Python data pipelines, AWS infrastructure including Kinesis pipelines, Rundeck scheduling, dbt for transformation and Snowflake as our data warehouse platform. We are open to new tools and expect this position to be a part of deciding the direction we take. 

    Essential Responsibilities:

    • Create data pipelines that make data available for analytic and application use cases
    • Develop self-healing, resilient processes that do not require constant care and feeding to run smoothly
    • Create meaningful data quality notifications with clear actions for interested parties including other internal teams and other members of the data and analytics team
    • Lead projects from a technical standpoint, creating project Technical Design Documents
    • Support data analysts' and analytics engineers' ability to meet the needs of the organization
    • Participate in code reviews, understanding coding standards, ensuring test coverage and being aware of best practices
    • Build or implement tooling around data quality, governance and lineage, in the dbt framework and Snowflake but external to that as needed
    • Provide next level support when data issues are discovered and communicated by the data analysts
    • Work with data analysts and analytics engineers to standardize transformation logic in the dbt layer for consistency and ease of exploration by end users
    • Enable analytics engineers and data analysts by providing data modeling guidance, query optimization and aggregation advice
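    Responsibilities like "self-healing, resilient processes" usually come down to retry-with-backoff around flaky pipeline steps. A minimal Python sketch (the `flaky_extract` step and its failure pattern are hypothetical):

```python
import time

def with_retries(fn, attempts=3, base_delay=1.0):
    """Call fn(), retrying with exponential backoff before giving up."""
    for attempt in range(attempts):
        try:
            return fn()
        except Exception:
            if attempt == attempts - 1:
                raise  # out of retries: surface the failure for alerting
            time.sleep(base_delay * 2 ** attempt)  # 1s, 2s, 4s, ...

# Hypothetical usage: wrap a flaky extraction step that succeeds
# on the third call.
calls = {"n": 0}
def flaky_extract():
    calls["n"] += 1
    if calls["n"] < 3:
        raise ConnectionError("transient failure")
    return ["row1", "row2"]

rows = with_retries(flaky_extract, attempts=5, base_delay=0.01)
```

    Real pipelines would layer on dead-letter handling and notifications, but the retry loop is the core of "does not require constant care and feeding".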

    Location: 

    • Applicants must be based in the United States

    You Have: 

    • 3+ years of experience in a data engineering role
    • Strong Python skills (Understands tradeoffs, optimization, etc)
    • Strong SQL skills (CTEs, window functions, optimization)
    • Experience working in cloud environments (AWS preferred; GCP, Azure)
    • An understanding of how to best structure data to enable internal and external facing analytics
    • Familiarity with calling APIs to retrieve data (Authentication flows, filters, limits, pagination)
    • Experience working with DevOps to deploy, scale and monitor data infrastructure
    • Scheduler experience either traditional or DAG based
    • Comfortable working with multi-TB cloud data warehouses (Snowflake preferred, Redshift, BigQuery)
    • Experience with other DBMS systems (Postgres in particular)
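    The SQL bullet above names CTEs and window functions specifically; a self-contained illustration using Python's built-in sqlite3 (the table and data are invented for the example):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE pageviews (site TEXT, day TEXT, views INTEGER);
INSERT INTO pageviews VALUES
  ('a', '2024-01-01', 100), ('a', '2024-01-02', 150),
  ('b', '2024-01-01', 80),  ('b', '2024-01-02', 60);
""")

query = """
WITH daily AS (                      -- CTE: per-site daily totals
    SELECT site, day, SUM(views) AS views
    FROM pageviews GROUP BY site, day
)
SELECT site, day, views,
       SUM(views) OVER (             -- window function: running total per site
           PARTITION BY site ORDER BY day
       ) AS running_views
FROM daily ORDER BY site, day;
"""
rows = conn.execute(query).fetchall()
```

    The same shape (CTE feeding a windowed aggregate) carries over directly to Snowflake or Redshift.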

    Nice to haves:

    • Experience with web analysis, such as creating data structures that support product funnels, user behavior, and decision path analysis
    • Understanding of Snowflake external stages, file formats and Snowpipe
    • Experience with orchestration tools particularly across different technologies and stacks
    • Experience with dbt
    • Knowledge of Ad Tech, Google Ad Manager and all of its fun quirks (so fun)
    • The ability to make your teammates laugh (it wouldn’t hurt if you were fun to work with is what I’m saying)
    • Familiarity with event tracking systems (NewRelic, Snowplow, etc)
    • Experience with one or more major BI tools (Domo, Looker, PowerBI, etc.)
    Benefits:

    • 100% remote 
    • Comprehensive benefits including Health, Dental, Vision and 401k match
    • Generous paid time off 
    • Wellness and Home Office Perks 
    • Up to 12 weeks of paid Parental Leave 
    • Inclusive Family Forming Benefits 
    • Professional development opportunities 
    • Travel opportunities for teams, our annual All Hands retreat as well as industry events

    Mediavine provides equal employment opportunities to applicants and employees. All aspects of employment will be based on merit, competence, performance, and business needs. We do not discriminate on the basis of race, color, religion, marital status, age, national origin, ancestry, physical or mental disability, medical condition, pregnancy, genetic information, gender, sexual orientation, gender identity or expression, veteran status, or any other status protected under federal, state, or local law.

    We strongly encourage minorities and individuals from underrepresented groups in technology to apply for this position.

    At Mediavine, base salary is one part of our competitive total compensation and benefits package and is determined using a salary range.  Individual compensation varies based on job-related factors, including business needs, experience, level of responsibility and qualifications. The base salary range for this role at the time of posting is $115,000 - $130,000 USD/yr.

    See more jobs at Mediavine

    Apply for this job

    +30d

    Senior Azure Data Engineer

    DevOps, agile, Bachelor's degree, 5 years of experience, jira, scala, nosql, sql, Design, azure, scrum, java, python

    FuseMachines is hiring a Remote Senior Azure Data Engineer

  • See more jobs at FuseMachines

    Apply for this job

  • +30d

    Data and Analytics Engineer

    airflow, sql, Design, python

    Cloudflare is hiring a Remote Data and Analytics Engineer

    About Us

    At Cloudflare, we are on a mission to help build a better Internet. Today the company runs one of the world’s largest networks that powers millions of websites and other Internet properties for customers ranging from individual bloggers to SMBs to Fortune 500 companies. Cloudflare protects and accelerates any Internet application online without adding hardware, installing software, or changing a line of code. Internet properties powered by Cloudflare all have web traffic routed through its intelligent global network, which gets smarter with every request. As a result, they see significant improvement in performance and a decrease in spam and other attacks. Cloudflare was named to Entrepreneur Magazine’s Top Company Cultures list and ranked among the World’s Most Innovative Companies by Fast Company. 

    We realize people do not fit into neat boxes. We are looking for curious and empathetic individuals who are committed to developing themselves and learning new skills, and we are ready to help you do that. We cannot complete our mission without building a diverse and inclusive team. We hire the best people based on an evaluation of their potential and support them throughout their time at Cloudflare. Come join us! 

    Available Locations: Lisbon or Remote Portugal

    About the team

    You will be part of the Network Strategy team within Cloudflare’s Infrastructure Engineering department. The Network Strategy team focuses on building both external and internal relationships that allow for Cloudflare to scale and reach user populations around the world. Our group takes a long term and technical approach to forging mutually beneficial and sustainable relationships with all of our network partners. 

    About the role

    We are looking for an experienced Data and Analytics Engineer to join our team to scale our data insights initiatives. You will work with a wide array of data sources about network traffic, performance, and cost. You’ll be responsible for building data pipelines, doing ad-hoc analytics based on the data, and automating our analysis. Important projects include understanding the resource consumption and cost of Cloudflare’s broad product portfolio.

    A candidate will be successful in this role if they’re flexible and able to match the right solution to the right problem. Flexibility is key. Cloudflare is a fast-paced environment and requirements change frequently.

    What you'll do

    • Design and implement data pipelines that take unprocessed data and make it usable for advanced analytics
    • Work closely with other product and engineering teams to ensure our products and services collect the right data for our analytics
    • Work closely with a cross functional team of data scientists and analysts and internal stakeholders on strategic initiatives 
    • Build tooling, automation, and visualizations around our analytics for consumption by other Cloudflare teams
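    The first bullet - taking unprocessed data and making it usable for analytics - can be sketched as a parse/clean/aggregate pass. The log format below is invented purely for illustration:

```python
from collections import Counter

# Hypothetical raw edge-log lines: "timestamp,colo,status,bytes"
raw = [
    "2024-05-01T00:00:00,LIS,200,512",
    "2024-05-01T00:00:01,LIS,503,0",
    "not,a,valid,record,extra",
    "2024-05-01T00:00:02,MAD,200,1024",
]

def parse(line):
    parts = line.split(",")
    if len(parts) != 4:
        return None              # drop malformed records
    ts, colo, status, nbytes = parts
    return {"colo": colo, "status": int(status), "bytes": int(nbytes)}

# Clean: keep only well-formed records
records = [r for r in (parse(line) for line in raw) if r is not None]

# Aggregate: server errors per colo, ready for downstream analytics
errors_by_colo = Counter(r["colo"] for r in records if r["status"] >= 500)
```

    A production pipeline would route the malformed records to a dead-letter sink rather than silently dropping them, but the shape is the same.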

    Examples of desirable skills, knowledge and experience

    • Excellent Python and SQL (one of the interviews will be a code review)
    • B.S. or M.S in Computer Science, Statistics, Mathematics, or other quantitative fields, or equivalent experience
    • Minimum 3 years of industry experience in software engineering, data engineering, data science or related field with a track record of extracting, transforming and loading large datasets 
    • Knowledge of data management fundamentals and data storage/computing principles
    • Excellent communication & problem solving skills 
    • Ability to collaborate with cross functional teams and work through ambiguous business requirements

    Bonus Points

    • Familiarity with Airflow 
    • Familiarity with Google Cloud Platform or other analytics databases

    What Makes Cloudflare Special?

    We’re not just a highly ambitious, large-scale technology company. We’re a highly ambitious, large-scale technology company with a soul. Fundamental to our mission to help build a better Internet is protecting the free and open Internet.

    Project Galileo: We equip politically and artistically important organizations and journalists with powerful tools to defend themselves against attacks that would otherwise censor their work, technology already used by Cloudflare’s enterprise customers--at no cost.

    Athenian Project: We created Athenian Project to ensure that state and local governments have the highest level of protection and reliability for free, so that their constituents have access to election information and voter registration.

    1.1.1.1: We released 1.1.1.1 to help fix the foundation of the Internet by building a faster, more secure and privacy-centric public DNS resolver. This is available publicly for everyone to use - it is the first consumer-focused service Cloudflare has ever released. Here's the deal - we never, ever store client IP addresses. We will continue to abide by our privacy commitment and ensure that no user data is sold to advertisers or used to target consumers.

    Sound like something you’d like to be a part of? We’d love to hear from you!

    This position may require access to information protected under U.S. export control laws, including the U.S. Export Administration Regulations. Please note that any offer of employment may be conditioned on your authorization to receive software or technology controlled under these U.S. export laws without sponsorship for an export license.

    Cloudflare is proud to be an equal opportunity employer. We are committed to providing equal employment opportunity for all people and place great value in both diversity and inclusiveness. All qualified applicants will be considered for employment without regard to their, or any other person's, perceived or actual race, color, religion, sex, gender, gender identity, gender expression, sexual orientation, national origin, ancestry, citizenship, age, physical or mental disability, medical condition, family care status, or any other basis protected by law. We are an AA/Veterans/Disabled Employer.

    Cloudflare provides reasonable accommodations to qualified individuals with disabilities. Please tell us if you require a reasonable accommodation to apply for a job. Examples of reasonable accommodations include, but are not limited to, changing the application process, providing documents in an alternate format, using a sign language interpreter, or using specialized equipment. If you require a reasonable accommodation to apply for a job, please contact us via e-mail at hr@cloudflare.com or via mail at 101 Townsend St., San Francisco, CA 94107.

    See more jobs at Cloudflare

    Apply for this job

    +30d

    Azure Data Engineer

    ProArch, Hyderabad, Telangana, India, Remote
    Design, azure

    ProArch is hiring a Remote Azure Data Engineer

    ProArch is hiring a skilled Azure Data Engineer to join our team. As an Azure Data Engineer, you will be responsible for designing, implementing, and maintaining data processing systems using Azure technologies. Additionally, you will collaborate with cross-functional teams to understand business requirements, identify opportunities for data-driven improvements, and deliver high-quality solutions. If you have a strong background in Azure data tools and technologies, excellent problem-solving skills, and a passion for data engineering, we want to hear from you!

    Responsibilities:

    • Design, develop, and implement data engineering solutions on the Azure platform
    • Create and maintain data pipelines and ETL processes
    • Optimize data storage and retrieval for performance and scalability
    • Collaborate with data scientists and analysts to build data models and enable data-driven insights
    • Ensure data quality and integrity through data validation and cleansing
    • Monitor and troubleshoot data pipelines and resolve any issues
    • Stay up-to-date with the latest Azure data engineering best practices and technologies
    Requirements:

    • Excellent communication skills
    • Strong experience in Python/PySpark
    • The ability to understand business concepts and work with customers to process data accurately.
    • A solid understanding of Azure Data Lake, Spark for Synapse (or Azure Databricks), Synapse Pipelines (or Azure Data Factory), Mapping Data Flows, SQL Server, Synapse Serverless/Pools (or SQL Data Warehouse).
    • Experience with source control, version control and moving data artifacts from Dev to Test to Prod.
    • A proactive self-starter who likes to deliver value, solve challenges and make progress.
    • Comfortable working in a team or as an individual contributor
    • Good data modelling skills (e.g., relationships, entities, facts, and dimensions)
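    The data modelling bullet (facts and dimensions) refers to dimensional modelling; here is a toy star schema sketched with Python's built-in sqlite3, with made-up tables and values:

```python
import sqlite3

# Minimal star schema: one fact table keyed to two dimension tables.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE dim_product (product_id INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE dim_date    (date_id INTEGER PRIMARY KEY, day TEXT);
CREATE TABLE fact_sales  (
    product_id INTEGER REFERENCES dim_product(product_id),
    date_id    INTEGER REFERENCES dim_date(date_id),
    amount     REAL
);
INSERT INTO dim_product VALUES (1, 'widget'), (2, 'gadget');
INSERT INTO dim_date    VALUES (1, '2024-01-01'), (2, '2024-01-02');
INSERT INTO fact_sales  VALUES (1, 1, 10.0), (1, 2, 20.0), (2, 1, 5.0);
""")

# Slice the fact table by a dimension attribute
rows = conn.execute("""
    SELECT p.name, SUM(f.amount)
    FROM fact_sales f JOIN dim_product p USING (product_id)
    GROUP BY p.name ORDER BY p.name;
""").fetchall()
```

    The same pattern scales up: facts hold measures and foreign keys; dimensions hold the descriptive attributes you group and filter by.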

    See more jobs at ProArch

    Apply for this job

    +30d

    Senior Data Engineer

    CLEAR - Corporate, New York, New York, United States (Hybrid)
    tableau, airflow, sql, Design, jenkins, python, AWS

    CLEAR - Corporate is hiring a Remote Senior Data Engineer

    Today, CLEAR is well-known as a leader in digital and biometric identification, reducing friction for our members wherever an ID check is needed. We’re looking for an experienced Senior Data Engineer to help us build the next generation of products which will go beyond just ID and enable our members to leverage the power of a networked digital identity. As a Senior Data Engineer at CLEAR, you will participate in the design, implementation, testing, and deployment of applications to build and enhance our platform- one that interconnects dozens of attributes and qualifications while keeping member privacy and security at the core. 


    A brief highlight of our tech stack:

    • SQL / Python / Looker / Snowflake / Airflow / Databricks / Spark / dbt

    What you'll do:

    • Build a scalable data system in which Analysts and Engineers can self-service changes in an automated, tested, secure, and high-quality manner 
    • Build processes supporting data transformation, data structures, metadata, dependency and workload management
    • Develop and maintain data pipelines to collect, clean, and transform data, owning the end-to-end data product from ingestion to visualization
    • Develop and implement data analytics models
    • Partner with product and other stakeholders to uncover requirements, to innovate, and to solve complex problems
    • Have a strong sense of ownership, responsible for architectural decision-making and striving for continuous improvement in technology and processes at CLEAR

     What you're great at:

    • 6+ years of data engineering experience
    • Working with cloud-based application development, and fluency in at least a few of: 
      • Cloud services providers like AWS
      • Data pipeline orchestration tools like Airflow, Dagster, Luigi, etc
      • Big data tools like Spark, Kafka, Snowflake, Databricks, etc
      • Collaboration, integration, and deployment tools like Github, Argo, and Jenkins 
      • Data visualization tools like Looker, Tableau, etc
    • Articulating technical concepts to a mixed audience of technical and non-technical stakeholders
    • Collaborating and mentoring less experienced members of the team
    • Comfort with ambiguity 
    • Curiosity about technology, belief in constant learning, and the autonomy to figure out what's important
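    Orchestration tools like Airflow and Dagster model a pipeline as a DAG of tasks and run them in dependency order; Python's standard-library graphlib can sketch the core idea (the task names here are hypothetical):

```python
from graphlib import TopologicalSorter

# Hypothetical pipeline DAG: each task maps to the set of tasks it
# depends on, the same model Airflow-style schedulers use.
dag = {
    "extract_users": set(),
    "extract_events": set(),
    "transform": {"extract_users", "extract_events"},
    "load_warehouse": {"transform"},
    "refresh_dashboard": {"load_warehouse"},
}

# static_order() yields every task after all of its dependencies.
order = list(TopologicalSorter(dag).static_order())
```

    Real orchestrators add scheduling, retries, and backfills on top, but the dependency-ordered execution is the foundation.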

    How You'll be Rewarded:

    At CLEAR we help YOU move forward - because when you’re at your best, we’re at our best. You’ll work with talented team members who are motivated by our mission of making experiences safer and easier. Our hybrid work environment provides flexibility. In our offices, you’ll enjoy benefits like meals and snacks. We invest in your well-being and learning & development with our stipend and reimbursement programs. 

    We offer holistic total rewards, including comprehensive healthcare plans, family building benefits (fertility and adoption/surrogacy support), flexible time off, free OneMedical memberships for you and your dependents, and a 401(k) retirement plan with employer match. The base salary range for this role is $175,000 - $215,000, depending on levels of skills and experience.

    The base salary range represents the low and high end of CLEAR’s salary range for this position. Salaries will vary depending on various factors which include, but are not limited to location, education, skills, experience and performance. The range listed is just one component of CLEAR’s total compensation package for employees and other rewards may include annual bonuses, commission, Restricted Stock Units.

    About CLEAR

    Have you ever had that green-light feeling? When you hit every green light and the day just feels like magic. CLEAR's mission is to create frictionless experiences where every day has that feeling. With more than 25 million passionate members and hundreds of partners around the world, CLEAR's identity platform is transforming the way people live, work, and travel. Whether it's at the airport, stadium, or right on your phone, CLEAR connects you to the things that make you, you - unlocking easier, more secure, and more seamless experiences - making them all feel like magic.

    CLEAR provides reasonable accommodation to qualified individuals with disabilities or protected needs. Please let us know if you require a reasonable accommodation to apply for a job or perform your job. Examples of reasonable accommodation include, but are not limited to, time off, extra breaks, making a change to the application process or work procedures, policy exceptions, providing documents in an alternative format, live captioning or using a sign language interpreter, or using specialized equipment.

    See more jobs at CLEAR - Corporate

    Apply for this job

    +30d

    Snowflake Data Engineer

    Onebridge, Indianapolis, IN - Remote - Hybrid
    sql, Design, git

    Onebridge is hiring a Remote Snowflake Data Engineer

    Onebridge is a Consulting firm with an HQ in Indianapolis, and clients dispersed throughout the United States and beyond. We have an exciting opportunity for a highly skilled Snowflake Data Engineer to join an innovative and dynamic group of professionals at a company rated among the top “Best Places to Work” in Indianapolis since 2015.

    Snowflake Data Engineer | About You

    As a Snowflake Data Engineer, you are responsible for defining data requirements, developing technical specifications, and architecting scalable and efficient data pipelines. You have a strong background in cloud-based data platforms and services, along with proven leadership skills to manage a team of data engineers. You will optimize ETL architectures and ensure adherence to best practices, security, and coding guidelines. You will also work closely with cross-functional teams, offering strategic insights and reporting project status, risks, and issues.

    Snowflake Data Engineer | Day-to-Day

    • Lead a team of data engineers in the design, development, and implementation of cloud-based data solutions using Snowflake, Fivetran, and Azure services.
    • Collaborate with cross-functional teams to define data requirements, develop technical specifications, and architect scalable, efficient data pipelines.
    • Design and implement data models, ETL processes, and data integration solutions to support business objectives and ensure data quality and integrity.
    • Optimize data architecture for performance, scalability, and cost-effectiveness, leveraging cloud-native technologies and best practices.
    • Provide technical leadership and mentorship, guiding team members in the adoption of best practices, tools, and methodologies.

    Snowflake Data Engineer | Skills & Experience

    • 8+ years of experience as a Data Engineer with a focus on cloud-based data platforms and services such as AWS, Azure, or GCP.
    • Extensive hands-on experience designing and implementing data solutions using Snowflake, Fivetran, and Azure cloud environments.
    • Strong proficiency in SQL and Python, with advanced knowledge of data modelling techniques, dimensional modelling, and data warehousing concepts.
    • In-depth understanding of data governance, security, and compliance frameworks, with experience in implementing security controls and encryption in cloud environments.
    • Excellent leadership and communication skills, with the ability to lead cross-functional teams, communicate technical strategies, and achieve goals in a fast-paced environment.

      A Best Place to Work in Indiana, since 2015.

      See more jobs at Onebridge

      Apply for this job

      +30d

      Principal Data Engineer

      Transcarent API, US - Remote
      ML, Sales, EC2, Bachelor's degree, scala, sql, Design, api, java, c++, python, AWS

      Transcarent API is hiring a Remote Principal Data Engineer

      Who we are  

      Transcarent is the One Place for Health and Care. We cut through complexity, making it easy for people to access high-quality, affordable health and care. We create a personalized experience tailored for each Member, including an on-demand care team, and a connected ecosystem of high-quality, in-person care and virtual point solutions. Transcarent eliminates the guesswork and empowers Members to make better decisions about their health and care.

      Transcarent is aligned with those who pay for healthcare and takes accountability for results – offering at-risk pricing models and transparent impact reporting to ensure incentives support a measurably better experience, better health, and lower costs. 

      At Transcarent, you will be part of a world-class team, supported by top-tier investors like 7wireVentures and General Catalyst, and founded by a mission-driven team committed to transforming the health and care experience for all. In May 2024, we closed our Series D with $126 million, propelling our total funding to $450 million and fueling accelerated AI capabilities and strategic growth opportunities. 

      We are looking for teammates to join us in building our company, culture, and Member experience who:  

      • Put people first, and make decisions with the Member’s best interests in mind 
      • Are active learners, constantly looking to improve and grow 
      • Are driven by our mission to measurably improve health and care each day 
      • Bring the energy needed to transform health and care, and move and adapt rapidly 
      • Are laser focused on delivering results for Members, and proactively problem solving to get there 

      What you’ll do  

      • Lead the Design and Implementation: Using modern data architecture principles, architect and implement cutting-edge data processing platforms and enterprise-wide data solutions. 
      • Scale Data Platform: Develop a scalable Platform for optimal data extraction, transformation, and loading from various sources, ensuring data integrity and accessibility. 
      • AI / ML platform: Design and build scalable AI and ML platforms to support Transcarent use cases.  
      • Collaborate Across Teams: Partner with Executive, Product, Clinical, Data, and Design teams to meet their data infrastructure needs, supporting them with technical expertise. 
      • Optimize Data Pipelines: Build and optimize complex data pipelines, ensuring high performance, reliability, and scalability. 
      • Innovate and Automate: Create and maintain data tools and pipelines that empower analytics, data science, and other teams to drive innovation and operational excellence. 
      • Mentor and Lead: Provide technical leadership and mentorship to the data engineering team, fostering a culture of continuous learning and improvement. 

      What we’re looking for:  

      • Experienced: 10+ years of experience in data engineering with a strong background in building and scaling data architectures in complex environments. Healthcare experience is a plus. 
      • Technical Expertise: Advanced working knowledge of SQL, relational databases, and big data tools (e.g., Spark, Kafka). Proficient in cloud-based data warehousing (e.g., Snowflake), cloud services (e.g., AWS), and AI/ML workflows. 
      • Architectural Visionary: Demonstrated experience in service-oriented and event-based architecture with strong API development skills. 
      • Problem Solver: Ability to manage and optimize processes supporting data transformation, metadata management, and workload management. 
      • Collaborative Leader: Strong communication skills with the ability to present ideas clearly and lead cross-functional teams effectively. 
      • Project Management: Strong project management and organizational skills, capable of leading multiple projects simultaneously. 

      Nice to have: 

      • Preferred tech stack: 
        • Data Platforms: Experience with building Data-as-a-Service platforms and API development. 
        • Programming Languages: Proficient in object-oriented scripting languages such as Python, Java, C++, Scala, etc. 
        • Big Data Tools: Expertise with tools like Spark, Kafka, and stream-processing systems like Storm or Spark-Streaming. 
        • Cloud Services: In-depth experience with AWS services (EC2, EMR, RDS, Redshift) and workflow management tools like Airflow. 
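Workflow management tools like the Airflow mentioned above exist to run tasks in dependency order. A toy sketch of that core idea, using only the standard library (the task names and dependencies are invented):

```python
from graphlib import TopologicalSorter

def run_dag(tasks, deps):
    """tasks: name -> callable(results); deps: name -> set of upstream names.
    Runs every task after its dependencies, like a minimal scheduler."""
    results = {}
    order = list(TopologicalSorter(deps).static_order())
    for name in order:
        results[name] = tasks[name](results)
    return order, results

# Illustrative three-step pipeline: extract -> transform -> load.
tasks = {
    "extract":   lambda r: [1, 2, 3],
    "transform": lambda r: [x * 10 for x in r["extract"]],
    "load":      lambda r: sum(r["transform"]),
}
deps = {"transform": {"extract"}, "load": {"transform"}}

order, results = run_dag(tasks, deps)
print(order)            # ['extract', 'transform', 'load']
print(results["load"])  # 60
```

Airflow adds scheduling, retries, and distributed executors on top of exactly this topological-ordering idea.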
      As this is a remote position, the salary range for this role is:
      $210,000 - $220,000 USD

      Total Rewards 

      Individual compensation packages are based on a few different factors unique to each candidate, including primary work location and an evaluation of a candidate’s skills, experience, market demands, and internal equity.  

      Salary is just one component of Transcarent's total package. All regular employees are also eligible for the corporate bonus program or a sales incentive (target included in OTE) as well as stock options.  

      Our benefits and perks programs include, but are not limited to:  

      • Competitive medical, dental, and vision coverage  
      • Competitive 401(k) Plan with a generous company match  
      • Flexible Time Off/Paid Time Off, 12 paid holidays  
      • Protection Plans including Life Insurance, Disability Insurance, and Supplemental Insurance 
      • Mental Health and Wellness benefits  

      Location  

      You must be authorized to work in the United States. Depending on the position we may have a preference to a specific location, but are generally open to remote work anywhere in the US.  

      Transcarent is an equal opportunity employer. We celebrate diversity and are committed to creating an inclusive environment for all employees. If you are a person with a disability and require assistance during the application process, please don’t hesitate to reach out!  

      Research shows that candidates from underrepresented backgrounds often don’t apply unless they meet 100% of the job criteria. While we have worked to consolidate the minimum qualifications for each role, we aren’t looking for someone who checks each box on a page; we’re looking for active learners and people who care about disrupting the current state of health and care with their unique experiences. 

       

      Apply for this job

      +30d

      Data Engineer

      ClickView, Pyrmont, New South Wales, Australia, Remote Hybrid
      S3, Lambda, SQL, Design, MySQL, AWS

      ClickView is hiring a Remote Data Engineer

      • Australia’s leading educational video platform as we scale globally
      • Join a high-performing, passionate and motivated engineering team with a culture of best practice
      • Permanent role, full-time, working hybrid from home and in the Sydney office

      Why Join Us?

      Do you want to make a positive impact on the education of future generations? If the answer’s yes, then we want you here at ClickView. We believe in using the power of video to transform traditional education, making video learning easy by building innovative tools, systems and services to provide the education sector with an intuitive platform to view and share high quality videos.

      At ClickView, we look for passionate professionals who are seeking a hands-on role in a dynamic organisation. In turn, we invest in our staff to enhance overall team performance and achieve growth together. You can expect support and the benefits of a flexible, open and vibrant work environment.

      Are you ready to take your first step with us?

      The role:

      We are looking for a talented Data Engineer to join our fast-moving Product Team that ships code to production many times a week. We’ve built a fully cloud, best-of-breed data pipeline on top of AWS services and Snowflake, DBT, FiveTran, HighTouch, and Looker.

      Our product is used by millions of users around the world and we stream millions of events from our products and services that are transformed, hydrated, and stored in our data lake. We are looking for someone to join the team and help make our data useful to teams.

      This role resides within the Product Team, so there is no expectation to spend time in cross-departmental meetings or interfacing with internal stakeholders, leaving you with more time for big projects and deep work. The ideal candidate is a smart, personable, and friendly Data Engineer who communicates in a clear and concise manner. You live and breathe best-practice engineering and are passionate about learning new technologies.

      Responsibilities:

      • End-to-end responsibility for data projects and work to enable our engineering teams by making data useful:
      • Design, build, and maintain efficient, reusable, and reliable data pipelines and ETL processes.
      • Develop and manage the data architecture to ensure data quality, integrity, and security.
      • Implement best practices for data management, including data governance and data lifecycle management.
      • Collaborate with product and engineering teams to understand data requirements and provide scalable solutions.
      • Monitor and troubleshoot data issues, ensuring timely resolution and minimal disruption.
      • Work closely within the team to support data visualization and reporting solutions using Looker.
      • Create and maintain documentation related to data engineering processes and workflows.
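Monitoring and troubleshooting data issues, as listed above, usually starts with automated quality checks on each batch. A hedged sketch of what such checks might look like - the column names and thresholds here are invented, not ClickView's actual schema:

```python
def check_batch(rows, required=("user_id", "event_ts")):
    """Return simple quality metrics for one batch of event records:
    rows missing required fields and exact-duplicate (id, timestamp) pairs."""
    issues = {"missing_required": 0, "duplicate_ids": 0}
    seen = set()
    for row in rows:
        if any(row.get(col) in (None, "") for col in required):
            issues["missing_required"] += 1
        key = (row.get("user_id"), row.get("event_ts"))
        if key in seen:
            issues["duplicate_ids"] += 1
        seen.add(key)
    issues["row_count"] = len(rows)
    return issues

batch = [
    {"user_id": "u1", "event_ts": "2024-01-01T00:00:00"},
    {"user_id": "u1", "event_ts": "2024-01-01T00:00:00"},  # duplicate
    {"user_id": None, "event_ts": "2024-01-01T00:05:00"},  # missing id
]
print(check_batch(batch))
```

In a DBT-based stack these checks would typically live as schema tests (`not_null`, `unique`) rather than hand-rolled code; the sketch just shows the underlying idea.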

      Requirements:

      • Solid experience working in data engineering or software engineering
      • Experience with cloud Data Warehousing (Snowflake) and Data Lake experience
      • A strong understanding of ELT/ETL processes and data architecture
      • Nice to have, experience with data structures and query languages in relational DBs such as MySQL and MS SQL Server
      • Experience with DBT and familiarity with data transformation best practices.
      • Familiar with AWS native services (S3, DMS, Lambda, Glue, etc)
      • A passion for data engineering and analytics
      • Understanding of statistical models
      • Ability to visualize data with the knowledge of BI tools (we use Looker)
      • Ability to maintain data infrastructure, and data pipelines and ensure data integrity
      • Ability to communicate ideas, processes, and practices clearly

      Benefits:

      • Extra paid Wellbeing and Volunteering leave - to care for yourself and others
      • Flexible working hours and arrangements - to accommodate different working preferences and personal situations
      • 100 days working from anywhere - work remotely from a different location for up to 100 calendar days per year
      • Employee discounts - we offer all employees access to a wide range of discounts through FlareHR to support their wellbeing and financial health
      • Learning and Development budgets - access to LinkedIn Learning, along with professional opportunities made available to all our teams, so you can continue growing to be the best you
      • Wellbeing Policy - with access to EAP and wellbeing apps, we put your mental health and wellbeing at the forefront of what we do
      • Generous parental leave policy - we offer an additional 16 weeks’ full pay
      • Regular social events and conferences - we celebrate the hard work of our team with regular catered social events, conferences across all offices, amazing harbour front office views, and free snacks daily

      See more jobs at ClickView

      Apply for this job

      +30d

      Data Engineer II

      Life36, Remote, USA
      agile, 3 years of experience, remote-first, terraform, sql, Design, mobile, c++, python, AWS

      Life36 is hiring a Remote Data Engineer II

      About Life360

      Life360’s mission is to keep people close to the ones they love. Our category-leading mobile app and Tile tracking devices empower members to protect the people, pets, and things they care about most with a range of services, including location sharing, safe driver reports, and crash detection with emergency dispatch. Life360 serves approximately 66 million monthly active users (MAU) across more than 150 countries. 

      Life360 delivers peace of mind and enhances everyday family life with seamless coordination for all the moments that matter, big and small. By continuing to innovate and deliver for our customers, we have become a household name and the must-have mobile-based membership for families (and those friends that basically are family). 

      Life360 has more than 500 (and growing!) remote-first employees. For more information, please visit life360.com.

      Life360 is a Remote First company, which means a remote work environment will be the primary experience for all employees. All positions, unless otherwise specified, can be performed remotely (within the US) regardless of any specified location above. 

      About the Team

      The Data and Analytics team is looking for a high intensity individual who is passionate about driving our in-app ads experience forward. Our mission is to envision, design, and build high impact ads data products while maintaining a commitment to putting our members before metrics. You will be expected to become the go-to member in our team for ads related data products. We want open-minded individuals that are highly collaborative and communicative. We work together and celebrate our wins as a team and are committed to building a welcoming team where everyone can perform their best.

      About the Job

      At Life360, we collect a lot of data: tens of billions of unique location points, billions of user actions, billions of miles driven every single month, and so much more. As a Data Engineer II, you will contribute to enhancing and maintaining our data processing and storage pipelines/workflows for a robust and secure data lake. You should have a strong engineering background and even more importantly a desire to take ownership of our data systems to make them world class.

      The US-based salary range for this role is $135,000 - $185,000. We take into consideration an individual's background and experience in determining final salary - therefore, base pay offered may vary considerably depending on geographic location, job-related knowledge, skills, and experience. The compensation package includes a wide range of medical, dental, vision, financial, and other benefits, as well as equity.

      What You’ll Do

      Primary responsibilities include, but are not limited to:

      • Design, implement, and maintain scalable data processing platforms used for real-time analytics and exploratory data analysis.
      • Manage our ads data from ingestion through ETL to storage and batch processing.
      • Automate, test and harden all data workflows.
      • Architect logical and physical data models to ensure the needs of the business are met.
      • Collaborate with our ads analytics teams, while applying best practices.
      • Architect and develop systems and algorithms for distributed real-time analytics and data processing.
      • Implement strategies for acquiring and transforming our data to develop new insights.
      • Champion data engineering best practices and institutionalizing efficient processes to foster growth and innovation within the team.
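The ads data responsibilities above - managing ingestion through ETL to batch processing, and architecting logical data models - often boil down to rolling raw events up into fact tables. A minimal illustrative sketch, where the event fields and ad-unit names are invented rather than Life360's actual model:

```python
from collections import defaultdict

def daily_ad_facts(events):
    """Aggregate raw ad events into a (date, ad_unit) -> counts fact table,
    the kind of rollup a downstream analytics team would query."""
    facts = defaultdict(lambda: {"impressions": 0, "clicks": 0})
    for e in events:
        key = (e["ts"][:10], e["ad_unit"])   # truncate ISO timestamp to date
        facts[key][e["type"] + "s"] += 1
    return dict(facts)

events = [
    {"ts": "2024-06-01T08:00:00", "ad_unit": "banner", "type": "impression"},
    {"ts": "2024-06-01T08:00:05", "ad_unit": "banner", "type": "click"},
    {"ts": "2024-06-01T09:30:00", "ad_unit": "banner", "type": "impression"},
]
facts = daily_ad_facts(events)
print(facts[("2024-06-01", "banner")])  # {'impressions': 2, 'clicks': 1}
```

At the volumes described (tens of billions of points per month) the same aggregation would run in Spark or Trino over partitioned data rather than in-process Python.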

      What We’re Looking For

      • Minimum of 3 years of experience working with high volume data infrastructure.
      • Experience with cloud computing platforms like Databricks and AWS, and with data transformation frameworks like dbt.
      • Experience with job orchestration tooling like Airflow.
      • Proficient programming in either Python or Java.
      • Proficiency with SQL and ability to optimize queries.
      • Experience in data modeling and database design.
      • Experience with large-scale data processing using Spark and/or Presto/Trino.
      • Experience working with an ads-related data system like Google Ad Manager, Adbutler, Kevel, Acxiom, Fantix, LiveRamp, etc.
      • Experience in modern development lifecycle including Agile methodology, CI/CD, automated deployments using Terraform, GitHub Actions etc.
      • Knowledge and proficiency in the latest open source and data frameworks, modern data platform tech stacks and tools.
      • Always learning and staying up to speed with the fast moving data world.
      • You have good communication skills and can work independently.
      • BS in Computer Science, Software Engineering, Mathematics, or equivalent experience.

      Our Benefits

      • Competitive pay and benefits
      • Medical, dental, vision, life and disability insurance plans (100% paid for employees)
      • 401(k) plan with company matching program
      • Mental Wellness Program & Employee Assistance Program (EAP) for mental well being
      • Flexible PTO, 13 company wide days off throughout the year
      • Winter and Summer Week-long Synchronized Company Shutdowns
      • Learning & Development programs
      • Equipment, tools, and reimbursement support for a productive remote environment
      • Free Life360 Platinum Membership for your preferred circle
      • Free Tile Products

      Life360 Values

      Our company’s mission driven culture is guided by our shared values to create a trusted work environment where you can bring your authentic self to work and make a positive difference.

      • Be a Good Person - We have a team of high integrity people you can trust. 
      • Be Direct With Respect - We communicate directly, even when it’s hard.
      • Members Before Metrics - We focus on building an exceptional experience for families. 
      • High Intensity, High Impact - We do whatever it takes to get the job done. 

      Our Commitment to Diversity

      We believe that different ideas, perspectives and backgrounds create a stronger and more creative work environment that delivers better results. Together, we continue to build an inclusive culture that encourages, supports, and celebrates the diverse voices of our employees. It fuels our innovation and connects us closer to our customers and the communities we serve. We strive to create a workplace that reflects the communities we serve and where everyone feels empowered to bring their authentic best selves to work.

      We are an equal opportunity employer and value diversity at Life360. We do not discriminate on the basis of race, religion, color, national origin, gender, sexual orientation, age, marital status, veteran status, disability status or any legally protected status.

      We encourage people of all backgrounds to apply. We believe that a diversity of perspectives and experiences creates a foundation for the best ideas. Come join us in building something meaningful. Even if you don’t meet 100% of the qualifications, you should still seriously consider applying!

      #LI-Remote

      ____________________________________________________________________________

      See more jobs at Life36

      Apply for this job

      +30d

      Senior Data Engineer

      StyleSeat, 100% Remote (U.S. Based Only, Select States)
      scala, nosql, airflow, sql, Design, c++, docker, MySQL, python

      StyleSeat is hiring a Remote Senior Data Engineer

      Senior Data Engineer

      100% Remote (U.S. Based Only, Select States - See Below)

      About the role

      StyleSeat is looking to add a Senior Data Engineer to its cross-functional Search product team. This team of data scientists, analysts, data engineers, software engineers and SDETs is focused on improving our search capability and customer search experience. The Senior Data Engineer will use frameworks and tools to perform ETL, and will propose abstractions of those methods to help solve data ingestion problems. 

      What you’ll do

      • Handle data engineering tasks in a team focused on improving search functionality and customer search experience.
      • Design, develop, and own ETL pipelines that deliver data with measurable quality.
      • Scope, architect, build, release, and maintain data oriented projects, considering performance, stability, and an error-free operation.
      • Identify and resolve pipeline issues while discovering opportunities for improvement.
      • Architect scalable and reliable solutions to move data across systems from multiple products in nearly real-time.
      • Continuously improve our data platform and keep the technology stack current.
      • Solve critical issues in complex designs or coding schemes.
      • Monitor metrics, analyze data, and partner with other internal teams to solve difficult problems creating a better customer experience.

      Who you are 

      Successful candidates can come from a variety of backgrounds, yet here are some of the must-have and nice-to-have experiences we’re looking for:

      Must-Have:

      • Expert SQL skills.
      • 4+ years of experience with:
        • Scaling and optimizing schemas.
        • Performance tuning ETL pipelines.
        • Building pipelines for processing large amounts of data.
      • Proficiency with Python, Scala and other scripting languages.
      • Experience with:
        • MySQL and Redshift.
        • NoSQL data stores, methods and approaches.
        • Kinesis or other data streaming services. 
        • Airflow or other pipeline workflow management tools. 
        • EMR, Spark, and Elasticsearch.
        • Docker or other container management tools. 
        • Developing infrastructure as code (IAC).
      • Ability to effectively work and communicate with cross-departmental partners and non-technical teams.
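"Building pipelines for processing large amounts of data", as required above, usually means never holding the full dataset in memory. A small sketch of the chunked-processing pattern (the chunk size and the stand-in transform are illustrative assumptions, not StyleSeat specifics):

```python
from itertools import islice

def chunked(iterable, size):
    """Yield lists of at most `size` items without materializing the input,
    keeping memory bounded regardless of total volume."""
    it = iter(iterable)
    while chunk := list(islice(it, size)):
        yield chunk

def process(chunk):
    # Stand-in transform: in a real pipeline this might be a bulk INSERT,
    # a COPY into Redshift, or a batched API call.
    return sum(chunk)

totals = [process(c) for c in chunked(range(10), size=4)]
print(totals)  # [6, 22, 17]
```

The same idea underlies cursor-based pagination against MySQL and micro-batching into Kinesis consumers.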

      Nice to Have:

      • Experience with:
        • Segment customer data platform with integration to Braze.
        • Terraform. 
        • Tableau.
        • Django.
        • Flask.

      Salary Range

      Our job titles may span more than one career level. The career level we are targeting for this role has a base pay between $136,900 and $184,600. The actual base pay is dependent upon many factors, such as: training, transferable skills, work experience, business needs and market demands. Base pay ranges are subject to change and may be modified in the future. 

      Who we are

      StyleSeat is the premier business platform for SMBs in the beauty and wellness industry to run and grow their business, and the destination for consumers to discover, book, and pay. To date, StyleSeat has powered more than 200 million appointments totaling over $12 billion in revenue for small businesses. StyleSeat is a platform and marketplace designed to support and promote the beauty and personal care community. 

      Today, StyleSeat connects consumers with top-rated beauty professionals in their area for a variety of services, including hair styling, barbering, massage, waxing, and nail care, among others. Our platform ensures that Pros maximize their schedules and earnings by minimizing gaps and cancellations, effectively attracting and retaining clientele.

      StyleSeat Culture & Values 

      At StyleSeat, our team is committed to fostering a positive and inclusive work environment. We respect and value the unique perspectives, experiences, and skills of our team members and work to create opportunities for all to grow and succeed. 

      • Diversity - We celebrate and welcome diversity in backgrounds, experiences, and perspectives. We believe in the importance of creating an inclusive work environment where everyone can thrive. 
      • Curiosity- We are committed to fostering a culture of learning and growth. We ask questions, challenge assumptions, and explore new ideas. 
      • Community - We are committed to making a positive impact on each other, even when win-win-win scenarios are not always clear or possible in every decision. We strive to find solutions that benefit the community as a whole and drive our shared success.
      • Transparency - We are committed to open, honest, and clear communication. We hold ourselves accountable for maintaining the trust of our customers and team.
      • Entrepreneurship - We are self-driven big-picture thinkers - we move fast and pivot when necessary to achieve our goals. 

      Applicant Note: 

      StyleSeat is a fully remote, distributed workforce, however, we only have business entities established in the below list of states and, thus, are unable to consider candidates who live in states not on this list for the time being.
      Applicants must be authorized to work for ANY employer in the U.S. We are unable to sponsor or take over sponsorship of an employment visa at this time.

       

      * Alabama
      * Arizona
      * California
      * Colorado
      * Florida
      * Georgia
      * Illinois
      * Indiana
      * Maryland
      * Massachusetts
      * Michigan
      * Nebraska
      * New Jersey
      * New York
      * Ohio
      * Oregon
      * Pennsylvania
      * Virginia
      * Washington

       

      See more jobs at StyleSeat

      Apply for this job

      +30d

      Data Engineer

      Creative CFO (Pty) Ltd, Cape Town, ZA - Remote
      Lambda, nosql, postgres, sql, Design, azure, api, AWS

      Creative CFO (Pty) Ltd is hiring a Remote Data Engineer

      Become part of a vibrant, quality-focused team that leverages trust and autonomy to deliver exceptional services to diverse, high-growth clients. Receive recognition for your committed, results-producing approach to problem-solving, and opportunities for learning to realise your own passion for personal growth. All while working with some of the country’s most exciting growing businesses - from local entertainers, gin distilleries and ice-cream parlours, to enterprises revolutionising traditional spaces like retail, property and advertising or treading on the cutting edge of fintech.

      About the company

      At Creative CFO (Pty) Ltd, we provide companies with the best financial tools and services to plan, structure, invest and grow. We believe that innovative solutions are an interconnected web of small problems solved brilliantly. By walking through all the financial processes for each company and solving problems along the way, we have developed a full-service solution that business owners really appreciate.

      We are committed to solving the challenges that small business owners face. Accounting and tax is one part of this, but we also cover business process analysis, financial systems implementation and investment support. We unlock value by creating a platform through which business owners can manage and focus their creativity, energy and financial resources.

      Position Summary

      As a Data Engineer at Creative CFO, you will be at the forefront of architecting, building, and maintaining our data infrastructure, supporting data-driven decision-making processes.

      We are dedicated to optimising data flow and collection to enhance our financial clarity services for high-growth businesses. Join our dynamic and rapidly expanding team, committed to building a world where more SMEs thrive.

      The Ideal Candidate

      To be successful in the role you will need to:

      Build and optimise data systems:

        • Design, construct, install, test, and maintain highly scalable data management systems.
        • Ensure systems meet business requirements and industry practices.

        Expertly process data:

        • Develop batch processing and real-time data streaming capabilities.
        • Read, extract, transform, stage, and load data to selected tools and frameworks as required.

        Build data integrations:

        • Work closely with data analysts, data scientists, and other stakeholders to assist with data-related technical issues and support their data infrastructure needs.
        • Collaborate with data analytics and BI teams to improve data models that feed business intelligence tools, increasing data accessibility and fostering data-driven decision-making across the organisation.

        Be versatile with technology:

        • Exhibit proficiency in ETL tools such as Apache NiFi or Talend and a deep understanding of SQL and NoSQL databases, including Postgres and Cassandra.
        • Demonstrate expertise in at least one cloud services platform, such as Microsoft Azure, Google Cloud Platform, or AWS.

        Focus on quality assurance:

        • Implement systems for monitoring data quality, ensuring production data is always accurate and available for key stakeholders and business processes that depend on it.

        Have a growth mindset:

        • Stay informed of the latest developments in data engineering, and adopt best practices to continuously improve the robustness and performance of data processing and storage systems.
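The batch and real-time streaming capabilities described above differ mainly in when aggregation happens. As a tiny streaming-style sketch - grouping events into fixed one-minute windows as they arrive, with invented timestamps and account names:

```python
from collections import defaultdict

def window_counts(events, window_secs=60):
    """Count events per (time window, account). Timestamps are epoch seconds;
    ts // window_secs assigns each event to a fixed tumbling window."""
    counts = defaultdict(int)
    for ts, account in events:
        counts[(ts // window_secs, account)] += 1
    return dict(counts)

# Illustrative event stream: (epoch_seconds, account).
events = [(0, "acme"), (30, "acme"), (61, "acme"), (65, "globex")]
print(window_counts(events))
# {(0, 'acme'): 2, (1, 'acme'): 1, (1, 'globex'): 1}
```

A batch job would compute the same result once over a full day's data; a streaming job emits each window incrementally as events land.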

        Requirements

        Key skills & qualifications:

        • Bachelor’s degree in Statistics, Data Science, Computer Science, Information Technology, or Engineering.
        • Minimum of 2 years of professional experience in a Data Engineering role, with a proven track record of successful data infrastructure projects.
        • Proficiency in data analysis tools and languages such as SQL, R, and Python.
        • Strong knowledge of data modeling, data access, and data storage techniques.
        • Proficiency in at least one of Microsoft Azure, Google Cloud Platform/Engine, or AWS Lambda environments.
        • Familiarity with data visualisation tools such as PowerBI, Pyramid, and Google Looker Studio.
        • Excellent analytical and problem-solving skills.
        • Effective communication skills to convey technical findings to non-technical stakeholders.
        • Eagerness to learn and adapt to new technologies and methodologies.

        Relevant experience required:

        • Previous roles might include Data Engineer, Data Systems Developer, or similar positions with a focus on building and maintaining scalable, reliable data infrastructures.
        • Strong experience in API development, integration, and management. Proficient in RESTful and SOAP services, with a solid understanding of API security best practices
        • Experience in a fast-paced, high-growth environment, demonstrating the ability to adapt to changing needs and technologies.


        Why Apply

        Vibrant Community

        • Be part of a close-knit, vibrant community that fosters creativity, wellness, and regular team-building events.
        • Celebrate individual contributions to team wins, fostering a culture of recognition.

        Innovative Leadership

        • Work under forward-thinking leadership shaping the future of data analytics.
        • Receive intentional mentorship for your professional and personal development.

        Education and Growth

        • Receive matched pay on education spend and extra leave days for ongoing education.
        • Enjoy a day's paid leave on your birthday - a celebration of you!

        Hybrid Work Setup

        • Experience the flexibility of a hybrid work setup, with currently one in-office day per month.
        • Choose to work in a great office space, if preferred.

        Professional and Personal Resources

        • Use the best technology, provided by the company
        • Benefit from Parental and Maternity Leave policies designed for our team members.

        See more jobs at Creative CFO (Pty) Ltd

        Apply for this job

        +30d

        Data Engineer

        Brushfire, Fort Worth, TX, Remote
        DevOps, Bachelor's degree, tableau, sql, Firebase, azure, c++, typescript, kubernetes, python

        Brushfire is hiring a Remote Data Engineer

        Job Description

        The primary responsibility of this position is to manage our data warehouse and the pipelines that feed to and from that warehouse. This requires advanced knowledge of our systems, processes, data structures, and existing tooling. The secondary responsibility is administering our production OLTP database to ensure it runs smoothly and follows standard best practices.
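Pipelines that feed *from* a warehouse back into operational tools (what the qualifications below call Reverse ETL) are often delta syncs: push only rows that are new or changed. A hedged sketch of that idea, with invented table contents:

```python
def delta(warehouse, destination):
    """Both inputs map primary key -> row dict. Return the rows that must be
    upserted so the destination matches the warehouse snapshot."""
    return {
        pk: row
        for pk, row in warehouse.items()
        if destination.get(pk) != row
    }

# Illustrative snapshots (not a real schema).
warehouse = {
    1: {"email": "a@example.com", "plan": "pro"},
    2: {"email": "b@example.com", "plan": "free"},
}
destination = {
    1: {"email": "a@example.com", "plan": "free"},  # stale: plan changed
}
print(delta(warehouse, destination))  # rows 1 (changed) and 2 (new)
```

Tools like Census industrialize this with change tracking and batched API writes, but the diff-then-upsert core is the same.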

        The ideal candidate for this position is someone who is extremely comfortable with the latest technology and trends, and who favors concrete execution over abstract ideation. We are proudly impartial when it comes to solutions – we like to try new things, and the best idea is always the winning idea, regardless of the way we’ve done it previously.

        This is a full-time work-from-home position.

        Qualifications

        The following characteristics are necessary for a qualified applicant to be considered:

        • 3+ years of experience working with data warehouses (BigQuery preferred, Redshift, Snowflake, etc)

        • 3+ years of experience working with dbt, ETL (Fivetran, Airbyte, etc), and Reverse ETL (Census) solutions 

        • 3+ years of experience with Database Administration (Azure SQL, Query Profiler, Redgate, etc)

        • 1+ years of experience with BI/Visualization tools (Google Data/Looker Studio, Tableau, etc)

        • Experience with PubSub databases, specifically Google Cloud Firestore and Firebase Realtime Database

        • Experience with Github (or other similar version control systems) and CI/CD pipeline tools like Azure Devops and Github actions

        • Ability to communicate fluently, pleasantly, and effectively—both orally and in writing, in the English language—with customers and co-workers.

        • Concrete examples and evidence of work product and what role the applicant played in it

        The following characteristics are not necessary but are highly desired:

        • Experience with Kubernetes, C#, TypeScript, Python

        • Bachelor's degree or higher in computer science or related technical field

        • Ability to contribute to strategic and planning sessions around architecture and implementation

        See more jobs at Brushfire

        Apply for this job