Backend Engineer / Data Architect

About Autopilot
Autopilot is simple and visual marketing automation software trusted and loved by over 3,000 teams around the world. It helps marketers capture and convert new leads, connect with customers and create loyal repeat buyers. The best teams at Lyft, Atlassian, Microsoft, Instapage, LiveChat, Greenpeace, Patreon and Crunchbase all use Autopilot to automate their marketing.
To date our software has helped Australia promote the YES vote on gay marriage and say YES to equality. It’s given Greenpeace the data and tools they need to win activist campaigns. It’s helped Patreon connect and educate creators to give them new revenue streams. And it’s helped hundreds of startups tell their story and grow their businesses.
We’re a successful, fast growing and global company with offices in Sydney and Minneapolis. We have thousands of remarkable customers, an extraordinary team and have wonderful investors like Blackbird, Rembrandt and Salesforce Ventures.

We are currently searching for a Data Engineer to join our rapidly growing development team in Sydney. 

Responsibilities include but are not limited to:

  • Work closely with our Data Science team to implement, test, and deploy RESTful interfaces for information flow to and from databases, caches, and other APIs (Golang), with a focus on data access and availability
  • Help author and deploy ETL pipelines that collect and store data from a variety of sources (Python - Airflow) and make that data accessible to the relevant stakeholders
  • Improve the data architecture behind Autopilot’s data products, anticipating the risks and limitations of different approaches, with a proactive attitude and a strong drive to move quickly from idea to proof of concept to production deployment
  • Contribute to projects such as adaptive anomaly detection of user activity, user engagement analysis, customer segmentation, and product-usage monitoring, among others
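To illustrate the kind of ETL work described above, here is a minimal extract–transform–load sketch in Python. It is purely hypothetical (the event fields, function names, and in-memory "store" are assumptions for illustration, not Autopilot's actual pipeline):

```python
import json

# Hypothetical raw event records, as they might arrive from a source API.
raw_events = [
    '{"user": "a@example.com", "action": "open", "ts": 1}',
    '{"user": "b@example.com", "action": "click", "ts": 2}',
    '{"user": "a@example.com", "action": "click", "ts": 3}',
]

def extract(lines):
    """Parse raw JSON lines into dictionaries."""
    return [json.loads(line) for line in lines]

def transform(events):
    """Aggregate click counts per user (an illustrative transformation)."""
    counts = {}
    for event in events:
        if event["action"] == "click":
            counts[event["user"]] = counts.get(event["user"], 0) + 1
    return counts

def load(counts, store):
    """Write the aggregates into a destination store.

    A plain dict stands in here for a real database or warehouse table.
    """
    store.update(counts)
    return store

store = {}
load(transform(extract(raw_events)), store)
print(store)  # {'b@example.com': 1, 'a@example.com': 1}
```

In an Airflow deployment, each of these steps would typically become a separate task in a DAG, with the intermediate data passed through a staging store rather than in memory.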

Skills - Essential

  • Highly technical, fluent across a broad range of languages and data tools, with demonstrated experience working with a variety of SQL and NoSQL databases
  • Proficient in Python (our ETL), Golang (our new backend), and Node.js (our legacy backend), with in-depth exposure to both AWS and GCP solutions
  • Insatiable learner and great team player who values knowledge sharing and collaboration
  • Outstanding communication skills and the ability to work independently

Skills - Preferable

  • Good understanding of statistical and machine learning principles 
  • Experience in Big Data technologies for batch and stream processing
  • Demonstrated ability to deploy fault-tolerant distributed computation systems
