In this position, you will help build out data infrastructure and implement new machine learning methods to enhance Tagup's core competencies of anomaly detection and survival modelling. The tools you build will impact model development, deployment, model performance reporting, and data monitoring. You will work closely with the founders and the field engineering team to deliver cutting-edge solutions to our industrial clients.
We welcome applicants just out of graduate programs; please feel free to include references to any academic articles and/or relevant open-source projects you have contributed to.
- Design and implement new distributed machine learning methods
- Implement data warehouses, real-time ETL, and batch data processing to support modelling needs
- Work with engineers to define and manage data sources
- Build and maintain internal data processing and visualization tools
- 3+ years of data science work and/or academic experience
- Deep experience with Python
- Experience shipping products and features early/often
- Experience working with very large datasets, especially using Dask and Amazon Kinesis
- Background in time series analysis, survival modelling, or anomaly detection
- Experience working with hidden Markov models, Gaussian processes, and variational inference
Work with the founding team to solve some of the hardest problems in heavy industry (using lots of unique data!). We work with energy companies and utilities around the world to increase infrastructure reliability, reduce costs, and improve safety.