Delphia's mission is to develop an online marketplace in which people and companies can invest their data as assets, control how those assets are used, and realize the economic returns that their data generates. Through our various apps and services, we're building a new kind of company that completely rethinks the value of privacy, data, and accountability.
Delphia is a small team of people located near Queen and Bathurst in downtown Toronto. We started in January and have already graduated from Y Combinator and raised a sizeable seed round, with interest from additional investors.
Whether we're tweaking the most minute interface details or analyzing giant data sets, people at Delphia are creatives at heart. Our teams are continually iterating, solving problems, and working together to build new ways to generate economic returns for our users. Together, we can help people realize the actual value of their data - we're just getting started.
At Delphia, data is our lifeblood. Would you like to work with data and build some of the tools essential to moving and transforming it into valuable, insightful information? If so, this is the right job for you.
You're a Data Engineer with software engineering chops. You're used to crafting data pipelines that move data efficiently and reliably across systems. You're also passionate about building the next generation of data tools that let us take full advantage of that data. In this role, you'll have opportunities to expand our toolkit, enrich the signal in our data, and work closely with our infrastructure team. You'll also help grow the team and use these core assets to have a big impact on our business.
- Build data expertise and own data quality for pipelines
- Design, build, and launch highly efficient and reliable data pipelines to move data (in both large and small volumes) into our Data Warehouse
- Design and develop new systems and tools to enable faster data consumption
- Use your expert coding skills in Python
- Work with engineers, product managers, and data scientists to understand data needs
- Build a framework for auditing, error logging, and master data management for your pipelines
- Work across multiple teams in high-transparency roles
- 3+ years of Python development experience
- 3+ years of SQL (Hive, PostgreSQL, etc.) experience
- Experience analyzing data to identify gaps and inconsistencies
- 3+ years of experience in ETL design, implementation, and maintenance
- Experience working with a MapReduce or MPP system
- Experience leading data-driven discussions
- BS in CS/EE/CSE or computational sciences
- Experience with more than one coding language
- MS in CS/EE/CSE or computational sciences
We are an equal opportunity employer and value diversity at our company. We do not discriminate on the basis of race, religion, color, national origin, gender, sexual orientation, age, marital status, veteran status, or disability status.