Senior Data Engineer

Details

United States (Remote)

About Pantheon

Pantheon’s WebOps Platform powers the open web, running sites in the cloud for customers including Stitch Fix, Okta, Home Depot, Pernod Ricard, and The Barack Obama Foundation. Every day, thousands of developers and marketers create, iterate, and scale websites on the open web to reach billions of people globally. Pantheon’s SaaS model puts large and small web and digital teams in control of increasing the performance of their teams, websites, and marketing programs. Pantheon’s cloud-native software includes governance, security, and collaboration tools that make it easy to securely manage a single website or thousands of websites across multiple teams in one platform. The built-in ability to simultaneously create, test, deploy, and run live sites with unrivaled hosting speed, scalability, and uptime gives marketing teams the agility to win in the dynamic world of digital marketing.

With 35% of the web running on open source and a $200 billion total addressable market, we are growing aggressively into a huge market opportunity and looking to expand our organization.

The Role

Pantheon is looking for a Senior Data Engineer to join our team, either remote (on U.S. hours) or onsite at our San Francisco headquarters. Proudly based in San Francisco, Pantheon is a platform where marketers and developers build, host, and manage their high-value Drupal and WordPress websites. The secret sauce of Pantheon engineering is not just our innovative scaling and performance tooling but our passionate, creative, collaborative team.

The Data & Analytics team at Pantheon is focused on delivering actionable, trusted, and meaningful data products to the business. These data products will allow Pantheon to continue to drive growth, retain customers, understand our platform usage, and refine efficiencies in our business and platform. We are looking for a data engineer who recognizes how important data is as an asset to the company. The data engineer will work closely with Data Ops, Analytics Engineers, Analysts, and Data Scientists to help identify, source, ingest, and cleanse data so it can be made available for further refinement.

Pantheon’s core company values are Trust, Teamwork, Passion, and Customers First. Within Pantheon engineering, we value collaboration, character, autonomy, and a no-blame culture. We're enthusiastic participants in several open-source communities and have real relationships with many of our most active customers. If all of this sounds interesting to you, read on!

Cool Things You'll Do

  • Work with the Data Ops team to define Ingest as a Service, enabling consistent data extraction and loading into the data lake
  • Work with technologies like Snowflake (including Snowpark with Python), Google Cloud Platform, Airflow 2, Fivetran, Census, Docker, and Kubernetes
  • Assist Analytics Engineers and Product Owners in understanding data requirements and how to source and gather that data
  • Engage with platform engineers to continue refining meaningful events from the Pantheon platform with Segment
  • Develop large data pipelines with efficiency and speed
  • Design batch, event, and real-time data pipelines
  • Work with business owners and data providers to create data contract SLAs
  • Define a data validation testing strategy and tests for each component of a data pipeline using Great Expectations
  • Continuously improve our standard of engineering excellence by implementing best practices for coding, testing, deployment, and communication
  • Support the Data Org as a member of the on-call engineering rotation, contributing to the stability, reliability, and performance of the data and data infrastructure that drive Pantheon's success

What You Bring to the Table

  • You enjoy and have experience with large-volume data pipelines, cloud databases, and real-time data events
  • You are passionate about data
  • You have focused on sourcing, cleansing, deduplicating, and enriching data
  • You have experience programming in Python, specifically Python 3
  • You have worked with cloud databases such as Snowflake, BigQuery, Firebolt, or Redshift
  • Terms like Kimball Dimensional Model, Data Lake, Delta Lake, and/or Data Mesh are second nature to you
  • You are a clear communicator, able to represent your contributions and ideas with clarity while remaining open and giving space to the contributions and ideas of others
  • You take pride in what you can do as part of a team

What We Offer

We have all the usual perks and benefits, but what we can really offer you is a fantastic work environment powered by an amazing team.

  • Industry competitive compensation and equity plan
  • Flexible time off and sick days
  • Full medical coverage (medical, dental, vision)
  • Top-of-the-line equipment
  • Fun at WordPress and Drupal community events
  • Extra benefits like stipends for books and workouts
  • Events and activities, both team-based and company-wide, that inspire, educate, and cultivate

Pantheon is an equal opportunity/affirmative action employer and we welcome applications from all backgrounds regardless of race, color, religion, sex, national origin, ancestry, age, marital status, sexual orientation, gender identity, veteran status, disability, or any other classification protected by law.  Pantheon complies with federal and local disability laws and makes reasonable accommodations for applicants and employees with disabilities. If you need a reasonable accommodation due to a disability for any part of the interview process, please contact talent@pantheon.io. Pursuant to local and federal regulations, Pantheon will consider qualified applicants with arrest and conviction records for employment.

After an offer is made and accepted, E-Verify will be used to establish your identity and employment eligibility, as required by the U.S. Department of Homeland Security.

To review the Employee and Applicant Privacy Policy, click here.

Visa Sponsorship is not available at this time.


Information

Skill Level: Intermediate / Proficient; Expert / Advanced
Job Category: DevOps
Job Type: Full Time; Telecommute / Remote