Data & AI Engineer - Google Cloud

Product Management
Full-time

About WebLife Labs

WebLife Labs is the innovation arm of WebLife Stores LLC, an e-commerce management and operations company founded in 2008. With over $250 million in all-time revenue and a position as the largest distributor of mailboxes in the U.S., WebLife has built a legacy of e-commerce excellence spanning 16 years. WebLife Labs represents the next chapter in that journey: an evolution focused on leveraging AI and advanced digital capabilities to drive growth, innovation, and operational excellence across WebLife's global ecosystem. Our mission is to transform potential into performance, equipping our teams with cutting-edge tools, insights, and support to excel in a fast-changing digital world. Combining the strength of a proven e-commerce business with the agility of an innovation hub, we offer a collaborative, growth-focused environment where your ideas matter and your career can thrive.

About the Role

We are looking for a Data & AI Engineer who goes well beyond traditional pipeline work. In this role, you will own WebLife's data warehouse end to end on Google Cloud Platform, architecting a centralized data platform that serves multiple ventures across the WebLife portfolio. You will design and build ELT/ETL pipelines using GCP-native services, integrate third-party data sources, and work at the cutting edge of agentic AI within the Google ecosystem, including agent building with BigQuery and the Google Agent Development Kit (ADK). This is a high-ownership role in which you will operate as a trusted internal consultant, translating broad business needs into well-architected data solutions and driving them from concept to delivery with genuine autonomy. If you thrive on ownership, are passionate about GCP, and want to push into the agentic AI frontier within a fast-moving e-commerce environment, this role is for you.

Key Responsibilities

Data Warehouse & Pipeline Engineering

  • Own and manage the end-to-end data warehouse architecture on BigQuery, including schema design, optimization, data modeling, and governance
  • Architect scalable data warehouse solutions that centralize and unify data from multiple ventures, ensuring standardized schemas, cross-venture reporting capabilities, and a single source of truth
  • Design, build, and maintain scalable ELT/ETL pipelines using GCP-native tools (Cloud Functions, Cloud Run, Dataflow, Cloud Composer, Pub/Sub)
  • Integrate and extract data from third-party platforms, APIs, and SaaS tools to enrich the central data warehouse
  • Implement data quality checks, monitoring, and alerting to ensure reliability and accuracy across all pipelines
  • Create and maintain runbooks for production pipeline operations, failure recovery, and backfill procedures

GCP Cloud Architecture

  • Serve as the in-house expert on Google Cloud Platform data services, advising teams on best practices and architecture decisions
  • Manage and optimize GCP resources including BigQuery, Cloud Storage, Cloud Functions, Cloud Run, and related services
  • Design cost-effective, performant cloud data solutions that scale with business growth across multiple ventures
  • Configure and manage VPC networking, private connectivity, and firewall rules to ensure secure data flows across GCP services
  • Implement security best practices for data access, encryption, and compliance within GCP

Agentic AI & Intelligent Data Solutions

  • Build and deploy AI agents using BigQuery ML, Google ADK (Agent Development Kit), and related GCP AI/ML services
  • Design agentic workflows that automate data processing, insight generation, and decision-support tasks
  • Collaborate with data science and BI teams to operationalize machine learning models within the data platform
  • Stay current with rapidly evolving GCP AI capabilities and proactively identify opportunities to apply them

Cross-Functional Collaboration & Consultancy

  • Work directly with marketing, operations, and product teams across multiple ventures to understand data needs and translate them into technical solutions
  • Operate as an internal data consultant — proactively identifying data gaps, inefficiencies, and opportunities without waiting for direction
  • Document data architecture, pipelines, and processes to ensure knowledge sharing and team scalability
  • Support the BI team with clean, well-structured data models that power dashboards and analytics

Requirements

  • 3–4 years of overall experience in data engineering or a closely related role
  • Minimum 2 years of hands-on experience working with Google Cloud Platform (BigQuery, Cloud Functions, Cloud Run, Dataflow, Cloud Storage, Pub/Sub)
  • Proven experience designing and building ELT/ETL pipelines using GCP-native services
  • Experience scaling and centralizing data warehouse architecture across multiple business units or ventures
  • Experience with Cloud Run for containerized pipeline workloads and microservices
  • Demonstrable experience working with third-party data extraction tools and API integrations
  • Hands-on experience with agentic AI workflows in GCP, including agent building with BigQuery and/or Google ADK
  • Strong proficiency in SQL (BigQuery SQL specifically) and Python
  • Experience independently managing and optimizing a data warehouse (schema design, partitioning, clustering, cost management)
  • Practical experience creating and maintaining runbooks for production pipeline operations, failure recovery, and backfill procedures
  • Experience with GCP networking fundamentals including VPC setup, configuration, and private connectivity
  • Familiarity with data orchestration tools (e.g., Cloud Composer/Airflow)
  • Understanding of data modeling principles (dimensional modeling, star/snowflake schemas)
  • Strong English communication skills — ability to articulate technical concepts to non-technical stakeholders
  • Experience working with international teams, preferably at US, UK, or Australian companies
  • Bachelor’s degree in Computer Science, Information Technology, Data Science, or a related field
  • Self-motivated with a consultant mindset — comfortable working independently, making decisions, and driving projects forward

Technical Competencies

  • Google Cloud Platform: BigQuery, Cloud Functions, Cloud Run, Dataflow, Cloud Composer, Pub/Sub, Cloud Storage, IAM, VPC & Networking
  • AI/ML on GCP: BigQuery ML, Google ADK, Vertex AI (familiarity is a plus)
  • Languages: SQL (advanced), Python
  • Data Integration: REST APIs, third-party connector tools, webhook-based data extraction
  • Data Warehousing: Dimensional modeling, data governance, multi-venture data centralization, optimization techniques
  • Operations: Production runbooks, failure recovery procedures, backfill management
  • DevOps/Version Control: Git, CI/CD for data pipelines (a plus)

What We Offer

  • Competitive USD-based compensation packages
  • AI-driven, innovation-focused projects
  • A culture that encourages curiosity and lifelong learning
  • Clear career paths and personal development opportunities
  • A flexible, work-from-home setup as part of a globally connected team

Apply for the Role

We’d love to hear from you. Please fill out this form.
Apply now