Utility Warehouse is seeking a Data Engineer who will play a key role in connecting our people with data and insight.
We're a supplier of home services (Energy, Telephony, Insurance & Broadband) with around 2% of the UK market, a unique business model and huge potential to grow. As an established, 20-year-old British company, our challenge is to rebuild while continuing to innovate and compete, developing a platform that enables the future growth of the business.
At the beginning of 2016 we started a fundamental transformation here at Utility Warehouse, driven by our board: there's a strong desire to become a modern, agile, UX- and tech-led business.
Our data, product and technology teams are a mix of new joiners and company veterans. Many of the usual practices and processes still need establishing, so it's definitely a good time to join and influence how things shape up in the future.
The Data Platform Team is responsible for providing the infrastructure, data pipelines and best practices that let teams integrate their data into the analytics platform. The aim of the team is to ensure data consumers, whether developers, analysts or business users, have the tools, data and insights to do their jobs. We are establishing an internal consultancy for data and insight, sharing our knowledge and empowering our data community.
Technologies in the stack:
AWS, GCP, Kubernetes, Talend, Airflow, Kafka, BigQuery, Postgres, Python and more...
What you will do:
You will work in a cross-cutting team that supports our analytics stack from an engineering perspective. You will build and support data integration pipelines, and help our data community adopt this infrastructure and its best practices. The role requires an appreciation of reporting and analytics, data modelling, and extending and architecting logical data models to facilitate self-service data analysis.
You will ensure our data warehouse and databases are populated daily and data quality is maintained, balancing solution development and improvement against the operational support needs of users. Your customer-first attitude will be exemplary: helping everyone win with data, owning multiple data solutions and adapting to changing business requirements.
The successful candidate will be a data evangelist with strong problem-solving skills and the ability to dive into detail to logically analyse and solve complex problems. The role will suit candidates from eclectic backgrounds, as we are primarily seeking an analytical and inquisitive mind. You will be a self-starter, a good communicator and a team player who manages their own time effectively.
You will need:
- A passion for Data Engineering, Software Development and DevOps work
- Expertise with ETL and orchestration tools such as Talend and Airflow
- Expert SQL scripting
- Exposure to cloud data warehousing, e.g. BigQuery, Redshift, Snowflake, Athena, AWS RDS
- Experience of working with structured and semi-structured data sets
- Strong data modelling skills
- Use and knowledge of version control technologies e.g. git
Bonus points, although not essential:
- Use of cloud analytics tools, e.g. Looker, Qlik
- Knowledge of Kubernetes, Terraform or Ansible
- Experience in monitoring and improving data quality
- Knowledge of data streaming architectures and microservices, e.g. Kafka, NATS