About WebLife
WebLife is a multi-venture holding company that builds and operates multiple businesses simultaneously. Founded in 2008, with over 16 years of operational excellence and more than $250 million in all-time revenue, we have evolved from an e-commerce leader into a portfolio of AI-focused ventures spanning e-commerce, SaaS, education, and recruitment.
What sets us apart is our operational foundation — disciplined operations, data-driven decision-making, and deep AI integration across everything we do.
Backed by WebLife's long-term stability and operational maturity, we operate with the speed and ownership mindset of a product-first startup: tight feedback loops, high accountability, and an expectation that every hire meaningfully influences strategy, execution, and how our ventures evolve. Combining the strength of a proven e-commerce business with the agility of an innovation hub, we offer a collaborative, growth-focused environment where your ideas matter and your career can thrive.
The Opportunity
We are looking for a Data & AI Engineer to own WebLife’s data warehouse end-to-end on Google Cloud Platform. You will design and build ELT/ETL pipelines, integrate third-party data sources, and work at the cutting edge of agentic AI within the Google ecosystem — including agent building with BigQuery and Google ADK. This is a high-ownership role. You will operate as a trusted internal consultant across multiple ventures, translating business needs into well-architected data solutions with genuine autonomy.
Key Responsibilities
Data Platform & Engineering
- Own the BigQuery data warehouse (schema design, data modelling, optimisation, governance)
- Design and build scalable ELT/ETL pipelines using GCP-native tools
- Integrate data from APIs, SaaS platforms, and third-party sources
- Ensure data quality, monitoring, alerting, and documentation
- Optimise performance and cost as data scales
Cloud & Architecture
- Act as internal expert on the GCP data stack (BigQuery, Cloud Functions, Cloud Run, Cloud Storage, etc.)
- Design scalable, secure, and cost-efficient data architectures
- Implement best practices for data security and access control
AI & Advanced Use Cases
- Build and deploy AI/agentic workflows using BigQuery ML and Google ADK
- Automate data processing, insights, and decision-support systems
- Support ML model operationalisation within the data platform
Collaboration
- Partner with product, marketing, and operations teams to deliver data solutions
- Identify data gaps and opportunities proactively
- Enable BI/reporting through clean, structured data models
Requirements
- Bachelor's degree in Computer Science, Information Technology, Data Science, Engineering, or a related field
- 2–4 years in data engineering or a related role
- Strong SQL and Python
- Experience with cloud data warehouses (BigQuery, Snowflake, Redshift, etc.)
- Hands-on experience building ELT/ETL pipelines
- Strong communication skills and ability to link data to business outcomes
Nice to Have
- Experience with the GCP ecosystem
- Exposure to AI/ML or agentic workflows
- Experience with Airflow, Cloud Composer, or other orchestration tools
- Experience with e-commerce or multi-source data environments
- Understanding of data modelling (star/snowflake schemas)
What We Offer
- Fully remote — work from the comfort of your home
- Competitive salary with payment in USD
- Be part of a culturally diverse, inclusive, and innovative team
- Opportunities for professional development and growth within the company
- AI-driven, innovation-focused projects at the frontier of e-commerce technology