Senior Data and Software Engineer

Post date
December 18, 2025
Work location
Remote
Job type
Full Time

Cloud Employee

Cloud Employee is a UK-owned business established 8 years ago. We connect high-performing software engineering talent worldwide with some of the world’s leading and most innovative tech companies. Developers join to work as part of international engineering teams and grow their CVs and skill sets.


We pride ourselves on being a supportive and cutting-edge workplace that continuously invests in staff development, engagement, and well-being. We provide security and career paths, along with individual training programs and mentoring.

Client Overview

A fast-growing US-based tech startup revolutionizing the dental supply chain through a modern e-commerce platform. Their solution consolidates product catalogs from multiple vendors, giving dental practices a streamlined interface to manage their supply ordering in one place.


With a strong technical foundation and a lean, collaborative team, they’ve built proprietary automation and scraping systems to power real-time product visibility and purchasing for each individual user. As the company scales, they are expanding their engineering team to strengthen their automation infrastructure and accelerate platform growth.

Job Overview

We’re hiring a Senior Data and Software Engineer with deep expertise in Python to design and own complex data automation workflows, scraping pipelines, and backend integrations, while also supporting backend development efforts in Node.js.


You’ll be working closely with the CTO and CEO to reverse-engineer APIs, optimize scraping speed, and maintain fault-tolerant headless browser systems. This is a fullstack role but heavily weighted towards backend engineering and data automation. It’s ideal for someone who thrives in fast-paced startup or scale-up environments, has strong problem-solving instincts, and values clean, scalable architecture. You’ll be collaborating directly with an experienced founder who has previously exited a business, making this a unique opportunity to help build a high-impact product alongside someone who understands what it takes to scale successfully.

Key Responsibilities

  • Build and maintain advanced headless browser automations (e.g. using Playwright) to extract product data from vendor websites behind authentication
  • Solve complex challenges involving CAPTCHA solvers, Cloudflare bypassing, and session management
  • Reverse-engineer internal APIs and implement resilient scraping strategies
  • Develop fullstack applications using Node.js and Python to power automation pipelines
  • Deploy and monitor scrapers using AWS infrastructure, rotating proxies, and fail-safe logic
  • Support the migration from MongoDB to PostgreSQL and ensure proper data integrity
  • Collaborate with a lean, hands-on technical team to continuously improve scraping performance and latency
  • Contribute to overall product development and ensure integration of scraping layers into the core web app

Required Skills

  • Minimum 5 years of experience in software engineering
  • Experience working in startup or scale-up environments
  • Strong experience with Node.js in production environments
  • Advanced experience with Python, data engineering, and automation
  • Experience with API reverse engineering
  • Previous experience in data engineering, ETL pipelines, or crawling systems
  • Proficiency with Playwright or similar headless browser automation tools
  • Experience scraping authenticated websites, including solving CAPTCHAs and bypassing anti-bot protections like Cloudflare
  • Strong Python scripting experience, especially in automation contexts
  • Experience deploying scrapers at scale using AWS (e.g. EC2, proxies, remote infrastructure)
  • Hands-on experience with both MongoDB and PostgreSQL
  • Familiarity with Git/GitHub for version control and collaboration
  • Solid understanding of web performance and asynchronous workflows

Nice to Have

  • Familiarity with Vite.js or modern frontend frameworks (React/Vue)
  • Understanding of MCP protocols, sitemaps, and modern scraping best practices
  • Basic DevOps skills to handle containerized deployments and script scheduling

Soft Skills & Candidate Characteristics

The ideal candidate will be:

  • Highly autonomous and capable of working independently without needing step-by-step direction
  • Reliable and responsive, especially when handling live data tasks or urgent fixes
  • Able to communicate clearly in English, both written and verbal
  • Genuinely interested in solving performance, latency, and data scaling challenges
  • Comfortable operating in a startup environment where priorities shift and systems evolve
  • Someone who values ethical access and security, especially when working with user credentials
  • Motivated by real product impact and eager to help shape scalable, modern architecture

What we offer

  • Competitive compensation package in USD
  • 15 PTO days per year
  • Health insurance allowance
  • 1,000 USD per year budget for learning benefits