Packlink is a fast-growing global provider of shipping services and technology.
We are building the best shipping platform for consumers and businesses. Shipping is complex and small to medium-sized companies need a simple solution. With our technology, they can ship with the carrier of their choice and deliver a premium service to their customers.
We are actively seeking a talented and experienced Data Engineer to work on improving our current GCP ETL (Google case study) and on other data-related needs such as optimizations and new requirements. You will work with our partner and in-house teams to advance our data pipeline to meet our needs. You will bring a problem-solving, data-oriented skill set with deep technical knowledge of data-processing tools. We are looking for someone with experience working with GCP tools and other data-related tools and techniques.
What you’ll find:
- A growing business with international expansion
- EU leaders in our sector
- Impressive GCP ETL written in Java
- Data pipelines with Python (Airflow)
- Rapid data volume growth
- Data driven company
- Possibility to have great impact from day 1
- A company with more than 15 nationalities
- And much more!
What you will be doing:
- Work with internal technology teams delivering key BI capabilities.
- Improve our ETL and data pipelines.
- Add new information sources and/or databases to be processed.
- Provide data (processing, modeling, transforming) as needed by internal BI-related teams.
- Create and test ETLs, and tools to automate ETLs, for data migration into the platform.
- Ensure good data governance and data quality.
What we are looking for:
- Work permit for the EU.
- Bachelor’s degree in engineering, computer science or a related field, or equivalent work experience.
- Excellent organization and communication skills.
- Fluent Spanish and English, both verbal and written.
- Advanced knowledge of Python and/or Java.
- Knowledge of Spark or Apache Beam.
- Experience with job orchestration tools such as Apache Airflow.
- Experience working with and designing ETL pipelines.
- Experience with SQL-like idioms.
- High degree of initiative and ability to work collaboratively.
It would be great if you have:
- Proficiency with at least one other programming language
- Experience with GCP data technologies (Apache Beam, Jupyter, etc.)
- Solid software development skills and tooling
- Experience with data testing and data quality
What we offer:
- Competitive salary package. We’re looking for the right person. Annual salary range: 55–70K€; your skills and experience will define your salary ;)
- Up to 2000€/year training budget (certification, conferences attendance...)
- Opportunity to attend and speak at local user groups as well as international events.
- Flexible working hours.
- Remote work 2 days per week.
- Paid vacation leave.
- Amazing location, amazing city: Madrid center (next to Atocha station).
- Free English/Spanish/French classes.
- We care about you: gym membership or private health insurance, 100% free.
- Take your birthday off and celebrate it the way you want.