At geoFluxus, we're committed to helping businesses make smart, sustainable choices as they transition to a circular economy. Our platform turns complex waste data into actionable insights, empowering companies to enhance their waste management and sustainability efforts. As a Data Pipeline Engineer, you'll play a crucial role in shaping and optimizing our products and bringing them to life.
Join us in paving the way towards a greener future with geoFluxus!
Overview
Are you a problem solver who thrives on designing and optimizing data pipelines? Do you enjoy working with large datasets, automation, and scalable infrastructure? We're looking for a Data Engineer / Data Pipeline Engineer to join our Tech team, where your contributions will directly impact the development and performance of our data-driven platform.
In this role, you'll focus on operationalizing data analysis pipelines, ensuring they are robust, efficient, and well-integrated into our platform. You'll work closely with our R&D and backend teams to transform research scripts into production-ready workflows, automate data ingestion and processing, and enhance system reliability. You'll also be involved in integrating AI/ML capabilities into our pipelines to drive deeper insights and smarter decision-making.
On a typical day, you'll participate in team standups, solve puzzles related to data analysis automation, implement and test scalable architectures, and collaborate with colleagues to push the boundaries of waste analytics.
If you're passionate about problem solving, automation, data pipelines, and AI-driven analytics, we'd love to hear from you!
Key responsibilities
- Design, develop, and maintain scalable data pipelines and infrastructure.
- Operationalize data analysis and structuring workflows, ensuring efficiency and reliability.
- Automate data ingestion, transformation, and validation processes.
- Build and maintain CI/CD pipelines for data workflows and services.
- Implement robust testing strategies to ensure data integrity and pipeline reliability.
- Optimize database queries and storage solutions to support high-performance data operations.
- Integrate AI/ML models into data pipelines for enhanced analytics and automation.
- Document data workflows, API endpoints, and best practices to support collaboration across teams.
- Work closely with backend and frontend developers to enable seamless data access and visualization.
Skills & experience
We're looking for an analytical and solution-oriented Data Engineer who thrives on building and optimizing scalable data pipelines in an innovative environment and a supportive, cross-functional team. You'll be a great fit if you bring the following skills:
- Proven experience as a Data Engineer or Data Pipeline Engineer with a strong portfolio of scalable data solutions.
- Proficiency in Python for data processing, automation, and backend services.
- Experience with data pipeline orchestration tools (e.g., Apache Airflow).
- Strong knowledge of CI/CD best practices and experience implementing automated deployment workflows.
- Experience with containerization and orchestration tools such as Docker and Kubernetes.
- Proficiency in working with relational databases like PostgreSQL.
- Strong testing mindset with experience in writing unit and integration tests for data workflows.
- Excellent analytical and problem-solving skills.
- Strong communication and teamwork abilities.
Bonus qualifications
- Experience with AWS for hosting and scaling applications.
- Familiarity with integrating AI/ML models into production pipelines.
- Knowledge of monitoring and logging tools for pipeline performance tracking.
- Ability to explain technical concepts to non-technical team members.
What you can expect from us
We want to create a work environment where you enjoy your work and can make an impact. This is what you can expect from us:
- A 36-hour work week, with every other Friday off and no impact on your salary.
- A hybrid work structure: Wednesday through Friday at our office in Rotterdam, with the remaining days remote or at the office.
- An NS Business Card for your commute to work.
- Good and comprehensive employment conditions.
- At least 4 weeks of vacation, with the flexibility to take more if you need it.
- A quarterly offsite, in the Netherlands or abroad, where we plan, brainstorm, and get to know each other better.
- A dynamic and fast-growing work environment where you can help build the company and make a real impact.
Application process
Ready to apply? Here's what the process looks like:
- Apply: Send us your resume and a short cover letter explaining your experience and why you're excited about this role.
- Introduction: If your profile matches what we're looking for, we'll schedule a 30-minute introductory meeting.
- Case assignment: You'll complete a short coding assignment demonstrating how you approach data pipeline design and problem solving.
- Final interview: You'll present your case to our tech lead and CTO, and we'll discuss your fit for the role in more detail.
- Offer: If all goes well, we'll make you an offer and welcome you to the team!
Language & office
Fluency in English is required for effective communication within our international team; Dutch is a great added bonus. Our team combines working from home with working from our Rotterdam office, and ideally we'd like to see you in the Rotterdam office each week.