Data Engineer

At Wag!, we build products that help dogs and their parents live more joyful lives. Working with our technology and engineering team is part art, part skill—and all heart. We like to surround ourselves with other smart people who will challenge us and bring new thinking to the team to help us build better products. We're all about automation, and we're constantly pushing the limits on what can be delivered auto-magically—from QA to deploying code in production. Our patent-pending Woof Pack Scrum structure allows us to work more independently, faster, and with the right level of autonomy and accountability for producing results. If there's a way to integrate it into Slack, we've probably done it. We love music, food, and drinks—anything that helps us be the best team possible. And maybe most important—we don't like to leave our dogs home alone, so be ready to spend your days in the office hanging with the pups.

We are moving a mile a minute to make sure that Wag! customers, walkers, and partners can experience the magic of Wag!. We're looking for a talented dog lover to help our Data Engineering team build out our Trust & Safety data pipeline. You will help us take our data systems to the next level. You will not only work with data but also help define how Trust and Safety performance is calibrated and what questions should be asked. You should be excited to contribute new ideas and articulate them to a variety of stakeholders. You should also be committed to learning, seeking to innovate, and raising the bar on how we use data for our Trust and Safety efforts. Specifically, you will be responsible for the following:

What you'll be doing
  • Building out a robust T&S data platform, from defining T&S issues to developing reporting mechanisms that share that data with key stakeholders
  • Monitoring and managing incident rates globally, and working proactively to identify opportunities to improve data quality and/or Trust and Safety efforts
  • Collaborating with cross-functional agile teams, including product, marketing, analytics, and data science, to understand, design, and develop data models that support business reporting and analysis
  • Assisting the Data Services Team Lead in maintaining optimal data pipeline architecture and performance
  • Researching, designing, and sharing your ideas in technical design and architecture discussions
  • Identifying, designing, and implementing internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
  • Being the subject matter expert for our data warehouse stack: MySQL, Alooma, Snowflake, dbt, and Tableau
  • Ensuring compliance with international data storage regulations (e.g., GDPR)
  • Shipping code!
  • Coaching junior data engineers and data modellers, and sharing with and learning from your peers
  • Developing your craft and building your expertise in data engineering
  • Representing Trust and Safety in company-wide data efforts with engineering, business intelligence, and others
  • Designing analytical and reporting frameworks that make complex data easy to understand and drive decision-making for stakeholders

What you’ll bring
  • Advanced working knowledge of and experience with relational databases (MySQL preferred)
  • Experience modelling, building, and optimizing data pipelines, architectures, and data sets
  • Experience performing root cause analysis on internal and external data and processes to answer specific business questions and identify opportunities for improvement
  • Strong refactoring skills, and the ability to impart them to other developers
  • Experience working with large codebases and writing robust, testable code
  • Experience supporting and working with cross-functional teams in a dynamic environment
  • Master's degree or higher required

Bonus points for
  • Experience with safety or risk data
  • A desire to stay at the forefront of data pipeline technology
  • Experience with MySQL, Alooma, Snowflake, and dbt (getdbt.com), or a willingness to learn them
  • Working knowledge of message queuing, stream processing, and highly scalable 'big data' data stores
  • Experience with big data tools: Hadoop, Spark, Kafka, etc.
  • Experience with AWS cloud services (S3, RDS, EMR) and Databricks
  • Experience with Docker, Kubernetes, Ansible, Terraform, and other DevOps and infrastructure-as-code technologies
  • Experience with Python
  • Tolerance for questionable dog puns and a sense of fun

Why Work Here:
  • Competitive salary
  • Medical, dental, vision, and life insurance
  • Unlimited PTO
  • Catered daily lunches, snacks galore, and endless coffee
  • Office dogs!
  • Get in on the ground floor of a fast-growing startup!

About Us
At Wag! we're crazy about dogs and the people who love them, and everything we do is intended to bring them joy and keep them safe. Our company was founded in 2015 and born from a love of dogs and an entrepreneurial spirit to help make pet parenthood just a little bit easier, so dogs and their humans can share a life full of joyful moments.

We invented on-demand dog walking by connecting an already passionate community of local dog walkers with pet parents. Launching in Los Angeles in 2015, Wag! services are now available in 43 states and more than 100 cities nationwide. Our walkers and sitters are vetted and pass a robust screening process—and our services are bonded and insured. We know your dogs are members of your family and taking care of them is the highest honor you can give Wag! walkers and sitters.
