BI Data Engineer

Destinus

Data Science

Madrid, Spain

Posted on Apr 15, 2026

About the role

Imagine stepping into a role where messy spreadsheets evolve into a powerful, scalable data platform. As a Data Engineer at Destinus, you build the backbone that turns scattered data into reliable insights powering production, supply chain, and company-wide decisions. You take ownership of our Microsoft Fabric Lakehouse, the central data hub used by over 200 people across five European sites. Your mission is to migrate Excel-based reporting into a structured environment, integrate new data sources, and ensure seamless connection with ERP systems, creating a foundation that scales as we grow.

At Destinus, we are revolutionizing the defense industry with cutting-edge Unmanned Aerial Vehicles (UAVs). Our innovative technologies are designed to meet the unique demands of modern defense operations, delivering unparalleled speed, precision, and cost-effectiveness. Destinus partners with government agencies and defense organizations worldwide to provide advanced solutions for mission-critical operations, enabling a new era of efficiency and technological superiority. Join us in shaping the future of defense with groundbreaking aerospace innovations.

What You’ll Do

  • Own and maintain the Microsoft Fabric Lakehouse, ensuring data availability, integrity, and performance across all connected domains.
  • Migrate existing Excel-based data sources into the Lakehouse through automated pipelines (Dataflow Gen 2), eliminating manual data handling.
  • Integrate new data sources as reporting demand grows across the organization, including Excel and SharePoint files today and additional systems, such as quality management tools, as they are introduced.
  • Ensure operational data integrates cleanly with ERP data (Microsoft Dynamics D365 F&O) already present in the Lakehouse, enabling unified reporting.
  • Collaborate closely with the Operations Data Specialist and BI users to ensure the data layer meets downstream dashboard and analytics needs.
  • Implement data quality checks, monitoring, and alerting to proactively catch pipeline failures or data drift.
  • Document data models, pipeline logic, and transformation rules to ensure long-term maintainability.

What You’ll Need

  • Bachelor's or Master's degree in Computer Science, Data Engineering, Information Systems, or a related technical field.
  • 2–4 years of experience with data engineering, ETL/ELT pipeline development, or data platform management.
  • Strong proficiency in SQL and experience working with relational and analytical data models.
  • Experience with Microsoft Fabric, Azure Synapse, or comparable cloud data platforms.
  • Familiarity with Power BI at the data model and semantic layer level (understanding how the BI layer consumes data, not necessarily building dashboards).
  • Understanding of data governance, access control, and working in sensitive or classified data environments.
  • Fluency in English; additional European languages are a plus.
  • Experience in a start-up or scale-up environment is preferred.

Who You Are

You thrive where infrastructure meets impact. You understand that a well-designed pipeline is not just a technical achievement; it's what allows hundreds of people to trust the numbers on their screens and make better decisions. You're methodical, reliable, and energized by the challenge of turning fragmented, manual data processes into something robust and scalable. You take ownership of the systems you build, document what you do, and think ahead to how things will need to evolve. You see the Lakehouse not as a project with an end date, but as a living platform that grows with the company. That responsibility drives you.