DoiT International

Cloud Data Architect - GCP Data & Analytics

Sep 04, 2023

Remote US West


Our Cloud Data Architect will be an integral part of our Cloud Reliability Engineering team in North America. This remote-based role is required to sit in either the Pacific (PT) or Mountain (MT) time zone.

Who We Are

DoiT helps fast-growing, digital native companies globally to harness public cloud technology and services to drive business growth. A full-service provider of multi-cloud technology and expertise, DoiT combines the power of intelligent software with deep expertise in Kubernetes, artificial intelligence, and more to deliver the true promise of the cloud at peak efficiency - with ease, not cost.

An award-winning strategic partner of AWS, Google Cloud, and Microsoft Azure with $2B cloud spend under management, DoiT works alongside more than 3,000 customers in 70 countries. At DoiT, you’ll join a growing team of committed, experienced, and collaborative “Do’ers” who are passionate about solving the most complex cloud challenges. 

The Opportunity

As a Cloud Data Architect, you will be part of our global CRE team, working with rapidly growing companies in North America and around the world. This role offers you the chance to:

  • Apply your hands-on experience & skills in a consultative manner to address our customers’ strategic and tactical needs around cloud technologies
  • Grow your technical and interpersonal skills by addressing customer challenges in your daily work and leveraging dedicated time allocated by the company for learning new technologies and engaging in internal initiatives
  • Strengthen your personal brand through thought leadership activities such as blogging, public speaking, and participation in technology events

What You'll Bring

  • Good verbal and written communication skills.
  • The expertise to architect, develop, and troubleshoot large production-grade distributed systems on GCP, and to select the appropriate tools to tackle business problems at the right scale.
  • Experience working with cloud data storage, data warehouse/data lake architectures, ELT/ETL, and reporting/analytics frameworks such as BigQuery, Dataflow, Looker Studio, and Dataproc.
  • Experience designing highly available systems that serve transactional, web-scale, low-latency traffic using both RDBMS and NoSQL technologies such as GCP Cloud SQL, Bigtable, and Firestore.
  • A programming background with shell scripting and one or more of the following: JavaScript, Java, Python, Go, or Rust.
  • Familiarity with debugging, refactoring, and optimizing code, and the judgment to know when and how to automate tasks.
  • Experience as a data pipeline builder and data wrangler who enjoys optimizing data systems and building them from the ground up.
  • Hands-on experience developing and deploying production data pipelines using orchestration tools such as GCP Cloud Composer/Airflow.

Bonus Points 

  • Experience with AI/ML technologies like Vertex AI and BigQuery ML.
  • Experience with Looker or other BI visualization tools.
  • Openness to studying and working with GCP data products.

Are you a Do’er?

Be your truest self. Work on your terms. Make a difference. 

We are home to a global team of incredible talent who work remotely with the flexibility to set a schedule that balances work and home life. We embrace and support leveling up your skills professionally and personally.

What does being a Do’er mean? We’re all about being entrepreneurial, pursuing knowledge and having fun! 

Sounds too good to be true? Check out our Glassdoor Page.

We thought so too, but we’re here and happy we hit that ‘apply’ button. 

  • Unlimited PTO
  • Flexible Working Options
  • Health Insurance
  • Parental Leave
  • Employee Stock Option Plan
  • Home Office Allowance
  • Professional Development Stipend 
  • Peer Recognition Program
