Data Engineer

Responsibilities:

• Build and maintain multiple data pipelines to ingest new data sources (APIs, files, streaming, databases, email, etc.) and support products used by both external users and internal teams
• Optimize infrastructure and pipelines by building DataOps tools to evaluate and automatically monitor data quality, auto-scale serverless infrastructure, and develop data-driven pipelines
• Work with our data science and product management teams to design, rapidly prototype, and productize new internal and external data product ideas and capabilities
• Work with the data engineering team to migrate and enhance our existing Pentaho-based ETL pipeline to a new Python-based system, and develop a serverless cloud data lake to augment our existing Snowflake Data Warehouse
• Conquer complex problems by finding simple, efficient solutions, with a focus on the reliability, scalability, quality, and cost of our platforms
• Build processes supporting data transformation, data structures, metadata, and workload management
• Collaborate with the team to perform root cause analysis and audit internal and external data and processes to help answer specific business questions

Requirements:
• Bachelor’s degree in Computer Science or a related field of study
• 2+ years of data engineering experience or equivalent education
• SQL and Python skills, including knowledge of Python libraries and frameworks
• Experience with AWS tools such as EMR, S3, Kinesis, API Gateway, Athena, Lambda, etc.
• Excellent troubleshooting and problem-solving skills
• Excellent communication and teamwork, and a passion to learn
• Familiarity with distributed computing platforms (e.g., Hadoop, Spark, Storm)
